2024/11/10 09:28:08 - mmengine - INFO - ------------------------------------------------------------ System environment: sys.platform: linux Python: 3.10.6 (main, Mar 10 2023, 10:55:28) [GCC 11.3.0] CUDA available: True MUSA available: False numpy_random_seed: 624982218 GPU 0,1,2,3,4,5,6,7: NVIDIA L20 CUDA_HOME: /usr/local/cuda NVCC: Cuda compilation tools, release 12.1, V12.1.105 GCC: x86_64-linux-gnu-gcc (Ubuntu 11.3.0-1ubuntu1~22.04) 11.3.0 PyTorch: 2.2.1+cu121 PyTorch compiling details: PyTorch built with: - GCC 9.3 - C++ Version: 201703 - Intel(R) oneAPI Math Kernel Library Version 2022.2-Product Build 20220804 for Intel(R) 64 architecture applications - Intel(R) MKL-DNN v3.3.2 (Git Hash 2dc95a2ad0841e29db8b22fbccaf3e5da7992b01) - OpenMP 201511 (a.k.a. OpenMP 4.5) - LAPACK is enabled (usually provided by MKL) - NNPACK is enabled - CPU capability usage: AVX512 - CUDA Runtime 12.1 - NVCC architecture flags: -gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_90,code=sm_90 - CuDNN 8.9.2 - Magma 2.6.1 - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=12.1, CUDNN_VERSION=8.9.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -D_GLIBCXX_USE_CXX11_ABI=0 -fabi-version=11 -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -DNDEBUG -DUSE_KINETO -DLIBKINETO_NOROCTRACER -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -Wall -Wextra -Werror=return-type -Werror=non-virtual-dtor -Werror=bool-operation -Wnarrowing -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-strict-overflow -Wno-strict-aliasing -Wno-stringop-overflow -Wsuggest-override -Wno-psabi -Wno-error=pedantic -Wno-error=old-style-cast -Wno-missing-braces -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=2.2.1, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=1, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF, USE_ROCM_KERNEL_ASSERT=OFF, TorchVision: 0.17.1+cu121 OpenCV: 4.6.0 MMEngine: 0.10.5 Runtime environment: cudnn_benchmark: False mp_cfg: {'mp_start_method': 'fork', 'opencv_num_threads': 0} dist_cfg: {'backend': 'nccl', 'timeout': 36000} seed: 624982218 Distributed launcher: pytorch Distributed training: True GPU number: 8 ------------------------------------------------------------ 2024/11/10 09:28:10 - mmengine - INFO - Config: auto_scale_lr = dict(base_batch_size=16, enable=True) backend_args = None caption_dataset = dict( actual_dataset_mode='OD', ann_file='LLaVA-ReCap-558K_tag_box_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img='images/'), data_root='../grounding_data/llava_cap/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], 
type='RandomChoiceResize'), dict(min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True) clean_caption = True coco2017_train_dataset = dict( actual_dataset_mode='OD', ann_file='annotations/instances_train2017_vg_merged6.json', backend_args=None, clean_caption=True, data_prefix=dict(img='train2017'), data_root='../grounding_data/coco/', filter_cfg=dict(filter_empty_gt=False), pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict(min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True) data_root = '../grounding_data/coco/' dataset_type = 'LVISV1Dataset' default_hooks = dict( checkpoint=dict( by_epoch=False, interval=30000, max_keep_ckpts=30, type='CheckpointHook'), logger=dict(interval=100, type='LoggerHook'), param_scheduler=dict(type='ParamSchedulerHook'), sampler_seed=dict(type='DistSamplerSeedHook'), timer=dict(type='IterTimerHook'), visualization=dict(type='GroundingVisualizationHook')) default_scope = 'mmdet' env_cfg = dict( cudnn_benchmark=False, dist_cfg=dict(backend='nccl', timeout=36000), mp_cfg=dict(mp_start_method='fork', opencv_num_threads=0)) flickr30k_dataset = dict( actual_dataset_mode='VG', ann_file='flickr_train_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img='flickr30k_images/'), data_root='../grounding_data/flickr30k_entities/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict(min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 
'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True) gqa_dataset = dict( actual_dataset_mode='VG', ann_file='gqa_train_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img='images/'), data_root='../grounding_data/gqa/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict(min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True) lang_model_name = '../huggingface/bert-base-uncased/' launcher = 'pytorch' lmm_max_token_length = 1100 lmm_path = '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/' load_from = '../huggingface/mm_grounding_dino/grounding_dino_swin-l_pretrain_obj365_goldg-34dcdc53.pth' log_level = 'INFO' log_processor = dict(by_epoch=False, type='LogProcessor', window_size=50) max_iter = 150000 model = dict( as_two_stage=True, backbone=dict( attn_drop_rate=0.0, convert_weights=True, depths=[ 2, 2, 18, 2, ], drop_path_rate=0.2, drop_rate=0.0, embed_dims=192, frozen_stages=-1, init_cfg=None, mlp_ratio=4, num_heads=[ 6, 12, 24, 48, ], out_indices=( 0, 1, 2, 3, ), patch_norm=True, pretrain_img_size=384, qk_scale=None, qkv_bias=True, type='SwinTransformer', window_size=12, with_cp=True), bbox_head=dict( contrastive_cfg=dict(bias=True, log_scale='auto', max_text_len=256), loss_bbox=dict(loss_weight=5.0, type='L1Loss'), loss_cls=dict( alpha=0.25, gamma=2.0, loss_weight=1.0, type='FocalLoss', use_sigmoid=True), num_classes=256, sync_cls_avg_factor=True, type='GroundingDINOHead'), data_preprocessor=dict( bgr_to_rgb=True, mean=[ 123.675, 116.28, 103.53, ], pad_mask=False, std=[ 58.395, 57.12, 57.375, ], type='DetDataPreprocessor'), decoder=dict( layer_cfg=dict( cross_attn_cfg=dict( dropout=0.0, embed_dims=256, num_heads=8, num_levels=5), cross_attn_text_cfg=dict(dropout=0.0, embed_dims=256, num_heads=8), ffn_cfg=dict( embed_dims=256, feedforward_channels=2048, ffn_drop=0.0), self_attn_cfg=dict(dropout=0.0, embed_dims=256, num_heads=8)), num_layers=6, post_norm_cfg=None, return_intermediate=True), dn_cfg=dict( box_noise_scale=1.0, group_cfg=dict(dynamic=True, num_dn_queries=100, num_groups=None), label_noise_scale=0.5), encoder=dict( fusion_layer_cfg=dict( embed_dim=1024, init_values=0.0001, l_dim=256, num_heads=4, v_dim=256), layer_cfg=dict( ffn_cfg=dict( embed_dims=256, feedforward_channels=2048, ffn_drop=0.0), self_attn_cfg=dict(dropout=0.0, embed_dims=256, 
num_levels=5)), num_cp=6, num_layers=6, text_layer_cfg=dict( ffn_cfg=dict( embed_dims=256, feedforward_channels=1024, ffn_drop=0.0), self_attn_cfg=dict(dropout=0.0, embed_dims=256, num_heads=4))), feature_map_size=27, freeze_backbone=True, freeze_lm=False, language_model=dict( add_pooling_layer=False, max_tokens=256, name='../huggingface/bert-base-uncased/', pad_to_max=False, special_tokens_list=[ '[CLS]', '[SEP]', '.', '?', ], type='BertModel', use_sub_sentence_represent=True), lmm='../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', lmm_connector= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/mm_projector2.bin', lmm_connector_prefix='mm_projector', lmm_image_loss_weight=1.0, lmm_max_token_length=1100, lmm_new_layer_insert_type='all', lmm_region_loss_weight=1.0, lora_alpha=256, lora_dropout=0, lora_r=64, mini_query=False, neck=dict( act_cfg=None, bias=True, in_channels=[ 192, 384, 768, 1536, ], kernel_size=1, norm_cfg=dict(num_groups=32, type='GN'), num_outs=5, out_channels=256, type='ChannelMapper'), num_feature_levels=5, num_lmm_new_layers=6, num_queries=900, num_region_caption=16, positional_encoding=dict( normalize=True, num_feats=128, offset=0.0, temperature=20), test_cfg=dict(chunked_size=40, max_per_img=300), train_cfg=dict( assigner=dict( match_costs=[ dict(type='BinaryFocalLossCost', weight=2.0), dict(box_format='xywh', type='BBoxL1Cost', weight=5.0), dict(iou_mode='giou', type='IoUCost', weight=2.0), ], type='HungarianAssigner')), type='GroundingDINO', use_autocast=True, use_lmm_cross_attn=True, use_lora=True, use_p4_input=False, use_p5_input=True, use_plora=False, use_query_input=False, use_query_input_cross_attn=False, with_box_refine=True) num_levels = 5 num_region_caption = 16 optim_wrapper = dict( clip_grad=dict(max_norm=0.1, norm_type=2), loss_scale='dynamic', optimizer=dict(lr=0.0001, type='AdamW', weight_decay=0.0001), paramwise_cfg=dict( custom_keys=dict( absolute_pos_embed=dict(decay_mult=0.0), backbone=dict(lr_mult=0.1), language_model=dict(lr_mult=0.1))), type='AmpOptimWrapper') param_scheduler = [ dict( begin=0, by_epoch=False, end=1000, start_factor=0.001, type='LinearLR'), dict( begin=0, by_epoch=False, end=150000, gamma=0.1, milestones=[ 110000, 140000, ], type='MultiStepLR'), ] pretrained = None randomness = dict(seed=624982218) resume = False test_cfg = dict(type='TestLoop') test_dataloader = dict( batch_size=1, dataset=dict( ann_file='annotations/lvis_v1_minival_inserted_image_name.json', backend_args=None, data_prefix=dict(img=''), data_root='../grounding_data/coco/', pipeline=[ dict( backend_args=None, imdecode_backend='pillow', type='LoadImageFromFile'), dict( backend='pillow', keep_ratio=True, scale=( 800, 1333, ), type='FixScaleResize'), dict(type='LoadAnnotations', with_bbox=True), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'text', 'custom_entities', 'tokens_positive', ), type='PackDetInputs'), ], return_classes=True, test_mode=True, type='LVISV1Dataset'), drop_last=False, num_workers=2, persistent_workers=True, sampler=dict(shuffle=False, type='DefaultSampler')) test_evaluator = dict( ann_file= '../grounding_data/coco/annotations/lvis_v1_minival_inserted_image_name.json', type='LVISFixedAPMetric') test_pipeline = [ dict( backend_args=None, imdecode_backend='pillow', type='LoadImageFromFile'), dict( backend='pillow', keep_ratio=True, scale=( 800, 1333, ), type='FixScaleResize'), dict(type='LoadAnnotations', with_bbox=True), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 
'text', 'custom_entities', 'tokens_positive', ), type='PackDetInputs'), ] train_cfg = dict( max_iters=150000, type='IterBasedTrainLoop', val_interval=30000) train_dataloader = dict( batch_sampler=dict(type='AspectRatioBatchSampler'), batch_size=2, dataset=dict( datasets=[ dict( actual_dataset_mode='OD', ann_file='annotations/instances_train2017_vg_merged6.json', backend_args=None, clean_caption=True, data_prefix=dict(img='train2017'), data_root='../grounding_data/coco/', filter_cfg=dict(filter_empty_gt=False), pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict( min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True), dict( actual_dataset_mode='VG', ann_file='flickr_train_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img='flickr30k_images/'), data_root='../grounding_data/flickr30k_entities/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict( min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True), dict( actual_dataset_mode='VG', ann_file='gqa_train_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img='images/'), data_root='../grounding_data/gqa/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict( min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, 
tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True), dict( actual_dataset_mode='OD', ann_file='LLaVA-ReCap-558K_tag_box_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img='images/'), data_root='../grounding_data/llava_cap/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict( min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True), dict( actual_dataset_mode='OD', ann_file='annotations/v3det_2023_v1_train_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img=''), data_root='../grounding_data/v3det/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict( min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True), ], type='ConcatDataset'), num_workers=4, persistent_workers=True, sampler=dict(shuffle=True, type='DefaultSampler')) train_pipeline = [ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict(min_gt_bbox_wh=( 0.01, 0.01, ), 
type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2='../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ] use_short_cap = False use_uniform_prompt = True v3det_dataset = dict( actual_dataset_mode='OD', ann_file='annotations/v3det_2023_v1_train_vg7.json', backend_args=None, clean_caption=True, data_prefix=dict(img=''), data_root='../grounding_data/v3det/', filter_cfg=dict(filter_empty_gt=False), label_map_file=None, pipeline=[ dict(backend_args=None, type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True), dict( keep_ratio=True, scales=[ ( 480, 1100, ), ( 512, 1100, ), ( 544, 1100, ), ( 576, 1100, ), ( 608, 1100, ), ( 640, 1100, ), ( 672, 1100, ), ( 704, 1100, ), ( 736, 1100, ), ( 768, 1100, ), ( 800, 1100, ), ], type='RandomChoiceResize'), dict(min_gt_bbox_wh=( 0.01, 0.01, ), type='FilterAnnotations'), dict( lmm_max_token_length=1100, max_tokens=256, num_region_caption=16, num_sample_negative=85, tokenizer_name='../huggingface/bert-base-uncased/', tokenizer_name2= '../huggingface/my_llava-onevision-qwen2-0.5b-ov-2/', type='RandomSamplingNegPos2'), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'flip', 'flip_direction', 'text', 'tags', 'contrast_conv', 'custom_entities', 'tokens_positive', 'dataset_mode', 'conversations', 'region_conversations', ), type='PackDetInputs'), ], return_classes=True, type='ODVGDataset', use_short_cap=False, use_uniform_prompt=True) val_cfg = dict(type='ValLoop') val_dataloader = dict( batch_size=1, dataset=dict( ann_file='annotations/lvis_v1_minival_inserted_image_name.json', backend_args=None, data_prefix=dict(img=''), data_root='../grounding_data/coco/', pipeline=[ dict( backend_args=None, imdecode_backend='pillow', type='LoadImageFromFile'), dict( backend='pillow', keep_ratio=True, scale=( 800, 1333, ), type='FixScaleResize'), dict(type='LoadAnnotations', with_bbox=True), dict( meta_keys=( 'img_id', 'img_path', 'ori_shape', 'img_shape', 'scale_factor', 'text', 'custom_entities', 'tokens_positive', ), type='PackDetInputs'), ], return_classes=True, test_mode=True, type='LVISV1Dataset'), drop_last=False, num_workers=2, persistent_workers=True, sampler=dict(shuffle=False, type='DefaultSampler')) val_evaluator = dict( ann_file= '../grounding_data/coco/annotations/lvis_v1_minival_inserted_image_name.json', type='LVISFixedAPMetric') vis_backends = [ dict(type='LocalVisBackend'), ] visualizer = dict( name='visualizer', type='DetLocalVisualizer', vis_backends=[ dict(type='LocalVisBackend'), ]) work_dir = './work_dirs/grounding_dino_swin_l' 2024/11/10 09:31:50 - mmengine - INFO - Hooks will be executed in the following order: before_run: (VERY_HIGH ) RuntimeInfoHook (BELOW_NORMAL) LoggerHook -------------------- before_train: (VERY_HIGH ) RuntimeInfoHook (NORMAL ) IterTimerHook (VERY_LOW ) CheckpointHook -------------------- before_train_epoch: (VERY_HIGH ) RuntimeInfoHook (NORMAL ) IterTimerHook (NORMAL ) DistSamplerSeedHook -------------------- before_train_iter: (VERY_HIGH ) RuntimeInfoHook (NORMAL ) IterTimerHook -------------------- after_train_iter: (VERY_HIGH ) RuntimeInfoHook (NORMAL ) 
IterTimerHook (BELOW_NORMAL) LoggerHook (LOW ) ParamSchedulerHook (VERY_LOW ) CheckpointHook -------------------- after_train_epoch: (NORMAL ) IterTimerHook (LOW ) ParamSchedulerHook (VERY_LOW ) CheckpointHook -------------------- before_val: (VERY_HIGH ) RuntimeInfoHook -------------------- before_val_epoch: (NORMAL ) IterTimerHook -------------------- before_val_iter: (NORMAL ) IterTimerHook -------------------- after_val_iter: (NORMAL ) IterTimerHook (NORMAL ) GroundingVisualizationHook (BELOW_NORMAL) LoggerHook -------------------- after_val_epoch: (VERY_HIGH ) RuntimeInfoHook (NORMAL ) IterTimerHook (BELOW_NORMAL) LoggerHook (LOW ) ParamSchedulerHook (VERY_LOW ) CheckpointHook -------------------- after_val: (VERY_HIGH ) RuntimeInfoHook -------------------- after_train: (VERY_HIGH ) RuntimeInfoHook (VERY_LOW ) CheckpointHook -------------------- before_test: (VERY_HIGH ) RuntimeInfoHook -------------------- before_test_epoch: (NORMAL ) IterTimerHook -------------------- before_test_iter: (NORMAL ) IterTimerHook -------------------- after_test_iter: (NORMAL ) IterTimerHook (NORMAL ) GroundingVisualizationHook (BELOW_NORMAL) LoggerHook -------------------- after_test_epoch: (VERY_HIGH ) RuntimeInfoHook (NORMAL ) IterTimerHook (BELOW_NORMAL) LoggerHook -------------------- after_test: (VERY_HIGH ) RuntimeInfoHook -------------------- after_run: (BELOW_NORMAL) LoggerHook -------------------- 2024/11/10 09:40:21 - mmengine - WARNING - backbone.patch_embed.projection.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.patch_embed.projection.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.patch_embed.norm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.patch_embed.norm.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.0.ffn.layers.1.bias is 
skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.blocks.1.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.downsample.norm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.downsample.norm.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.0.downsample.reduction.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.ffn.layers.0.0.bias is skipped since its requires_grad=False 
2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.0.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.blocks.1.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.downsample.norm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.downsample.norm.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.1.downsample.reduction.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.2.blocks.0.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.0.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.1.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.2.blocks.2.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.2.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.3.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.2.blocks.4.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.4.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.5.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.6.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.2.blocks.6.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.7.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.8.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.2.blocks.9.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.9.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.10.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.2.blocks.11.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.11.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.12.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.13.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.14.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - 
mmengine - WARNING - backbone.stages.2.blocks.15.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.15.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.16.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - 
mmengine - WARNING - backbone.stages.2.blocks.17.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.blocks.17.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.downsample.norm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.downsample.norm.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.2.downsample.reduction.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.0.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.attn.w_msa.qkv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.attn.w_msa.qkv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.attn.w_msa.proj.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.attn.w_msa.proj.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.ffn.layers.0.0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.ffn.layers.0.0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.ffn.layers.1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.stages.3.blocks.1.ffn.layers.1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm0.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm0.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm1.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm1.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm2.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm2.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm3.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - backbone.norm3.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.0.conv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.0.conv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.0.gn.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.0.gn.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.1.conv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.1.conv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.1.gn.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.1.gn.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.2.conv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.2.conv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.2.gn.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.2.gn.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - 
mmengine - WARNING - neck.convs.3.conv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.3.conv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.3.gn.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.convs.3.gn.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.extra_convs.0.conv.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.extra_convs.0.conv.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.extra_convs.0.gn.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - neck.extra_convs.0.gn.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.word_embeddings.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.word_embeddings.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.word_embeddings.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.position_embeddings.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.position_embeddings.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.position_embeddings.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.token_type_embeddings.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.token_type_embeddings.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.token_type_embeddings.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.embeddings.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
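The WARNING entries above record that every parameter of the Swin-style backbone (stages, downsample layers and the norm0-norm3 output norms) and of the neck (convs and extra_convs) has requires_grad=False, i.e. these modules are frozen and are left out of optimization. Below is a minimal, self-contained sketch of that mechanism, assuming illustrative module names; it is not the mmengine implementation, only the general pattern that produces this kind of message.

    import torch
    import torch.nn as nn

    # Illustrative stand-ins for the detector's frozen backbone/neck and a trainable head.
    model = nn.ModuleDict({
        'backbone': nn.Linear(8, 8),
        'neck': nn.Linear(8, 8),
        'bbox_head': nn.Linear(8, 4),
    })

    # Freezing: parameters of these modules keep requires_grad=False from here on.
    for module in (model['backbone'], model['neck']):
        for p in module.parameters():
            p.requires_grad_(False)

    # When parameter groups are built, frozen parameters are excluded from the
    # optimizer, which is what the "... is skipped since its requires_grad=False"
    # warnings above record for each frozen weight and bias.
    trainable = []
    for name, p in model.named_parameters():
        if p.requires_grad:
            trainable.append(p)
        else:
            print(f'{name} is skipped since its requires_grad=False')

    # Placeholder hyper-parameters; the real values come from the training config.
    optimizer = torch.optim.AdamW(trainable, lr=1e-4, weight_decay=1e-4)

In this sketch only the bbox_head parameters reach the optimizer; in the actual run the trainable set includes at least the language model, as the paramwise_options entries that follow show.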
language_model.language_backbone.body.model.encoder.layer.0.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.0.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.0.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.0.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.1.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.1.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.1.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.2.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - 
INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.2.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.3.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - 
mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.3.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.4.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - 
mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.4.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.5.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - 
paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.5.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.6.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.6.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.6.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.6.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.7.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options 
-- language_model.language_backbone.body.model.encoder.layer.7.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.7.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.8.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - 
mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.8.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.9.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 
09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.9.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- 
language_model.language_backbone.body.model.encoder.layer.10.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 
09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.10.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.query.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.query.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - 
paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.query.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.query.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.query.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.query.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.key.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.key.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.key.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.key.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.key.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.key.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.value.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.value.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.value.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.value.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.value.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.self.value.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 
- mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.dense.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.dense.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.dense.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.dense.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.dense.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.dense.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.weight:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.weight:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.weight:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.bias:lr=1e-05 2024/11/10 09:40:21 - mmengine - INFO 
- paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.bias:weight_decay=0.0001 2024/11/10 09:40:21 - mmengine - INFO - paramwise_options -- language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.bias:lr_mult=0.1 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.embed_tokens.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.0.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.1.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.1.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.2.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.3.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.self_attn.q_proj.ori_model.bias is 
skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.4.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.5.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 
09:40:21 - mmengine - WARNING - lmm.model.layers.6.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.6.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.7.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.8.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.8.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.9.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.10.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.10.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.11.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.12.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.12.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.13.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.14.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.15.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.15.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.16.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.17.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.17.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.18.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.19.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.19.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.20.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.21.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.21.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.22.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.self_attn.q_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.self_attn.q_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.self_attn.k_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.self_attn.k_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.self_attn.v_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.self_attn.v_proj.ori_model.bias is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.self_attn.o_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.mlp.gate_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - 
lmm.model.layers.23.mlp.up_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.mlp.down_proj.ori_model.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.input_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.layers.23.post_attention_layernorm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.model.norm.weight is skipped since its requires_grad=False 2024/11/10 09:40:21 - mmengine - WARNING - lmm.lm_head.weight is skipped since its requires_grad=False
2024/11/10 09:40:22 - mmengine - INFO - LR is set based on batch size of 16 and the current batch size is 16. Scaling the original LR by 1.0.
Name of parameter - Initialization information
level_embed - torch.Size([5, 256]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.patch_embed.projection.weight - torch.Size([192, 3, 4, 4]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.patch_embed.projection.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.patch_embed.norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.patch_embed.norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 6]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.attn.w_msa.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.attn.w_msa.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.attn.w_msa.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.attn.w_msa.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.ffn.layers.0.0.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.ffn.layers.0.0.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.ffn.layers.1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.0.ffn.layers.1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.norm1.weight - torch.Size([192]): The value is
the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 6]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.attn.w_msa.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.attn.w_msa.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.attn.w_msa.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.attn.w_msa.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.ffn.layers.0.0.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.ffn.layers.0.0.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.ffn.layers.1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.blocks.1.ffn.layers.1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.downsample.norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.downsample.norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.0.downsample.reduction.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 12]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.attn.w_msa.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.attn.w_msa.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.attn.w_msa.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.attn.w_msa.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.norm2.bias - torch.Size([384]): The value is 
the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.ffn.layers.0.0.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.ffn.layers.0.0.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.ffn.layers.1.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.0.ffn.layers.1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 12]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.attn.w_msa.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.attn.w_msa.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.attn.w_msa.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.attn.w_msa.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.ffn.layers.0.0.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.ffn.layers.0.0.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.ffn.layers.1.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.blocks.1.ffn.layers.1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.downsample.norm.weight - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.downsample.norm.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.1.downsample.reduction.weight - torch.Size([768, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.attn.w_msa.qkv.weight - torch.Size([2304, 
768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.0.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.1.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.norm1.bias - torch.Size([768]): The 
value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.2.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.3.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO 
backbone.stages.2.blocks.3.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.4.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of 
GroundingDINO backbone.stages.2.blocks.5.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.5.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.6.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.norm2.weight - torch.Size([768]): The value is the same before and after 
calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.7.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.8.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is 
the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.9.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.10.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.attn.w_msa.qkv.weight - 
torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.11.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.12.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO 
backbone.stages.2.blocks.13.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.13.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same 
before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.14.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.15.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.ffn.layers.0.0.weight - 
torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.16.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.attn.w_msa.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.attn.w_msa.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.attn.w_msa.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.attn.w_msa.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.ffn.layers.0.0.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.ffn.layers.0.0.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.ffn.layers.1.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.blocks.17.ffn.layers.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.downsample.norm.weight - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.downsample.norm.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.2.downsample.reduction.weight - torch.Size([1536, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.norm1.weight - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.norm1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 48]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.attn.w_msa.qkv.weight - torch.Size([4608, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO 
backbone.stages.3.blocks.0.attn.w_msa.qkv.bias - torch.Size([4608]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.attn.w_msa.proj.weight - torch.Size([1536, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.attn.w_msa.proj.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.norm2.weight - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.norm2.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.ffn.layers.0.0.weight - torch.Size([6144, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.ffn.layers.0.0.bias - torch.Size([6144]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.ffn.layers.1.weight - torch.Size([1536, 6144]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.0.ffn.layers.1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.norm1.weight - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.norm1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 48]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.attn.w_msa.qkv.weight - torch.Size([4608, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.attn.w_msa.qkv.bias - torch.Size([4608]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.attn.w_msa.proj.weight - torch.Size([1536, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.attn.w_msa.proj.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.norm2.weight - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.norm2.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.ffn.layers.0.0.weight - torch.Size([6144, 1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.ffn.layers.0.0.bias - torch.Size([6144]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.ffn.layers.1.weight - torch.Size([1536, 6144]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.stages.3.blocks.1.ffn.layers.1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm0.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm0.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm1.weight - 
torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm3.weight - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO backbone.norm3.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.0.conv.weight - torch.Size([256, 192, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.0.gn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.0.gn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.1.conv.weight - torch.Size([256, 384, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.1.gn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.1.gn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.2.conv.weight - torch.Size([256, 768, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.2.gn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.2.gn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.3.conv.weight - torch.Size([256, 1536, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.3.gn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.convs.3.gn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.extra_convs.0.conv.weight - torch.Size([256, 1536, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.extra_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.extra_convs.0.gn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO neck.extra_convs.0.gn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.cls_branches.0.bias - torch.Size([1]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.cls_branches.1.bias - torch.Size([1]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.cls_branches.2.bias - torch.Size([1]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.cls_branches.3.bias - torch.Size([1]): The value is the same 
before and after calling `init_weights` of GroundingDINO bbox_head.cls_branches.4.bias - torch.Size([1]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.cls_branches.5.bias - torch.Size([1]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.cls_branches.6.bias - torch.Size([1]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.0.0.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.0.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.0.2.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.0.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.0.4.weight - torch.Size([4, 256]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.0.4.bias - torch.Size([4]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.1.0.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.1.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.1.2.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.1.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.1.4.weight - torch.Size([4, 256]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.1.4.bias - torch.Size([4]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.2.0.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.2.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.2.2.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.2.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.2.4.weight - torch.Size([4, 256]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.2.4.bias - torch.Size([4]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.3.0.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.3.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.3.2.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.3.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.3.4.weight - torch.Size([4, 256]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.3.4.bias - torch.Size([4]): Initialized by user-defined `init_weights` in 
GroundingDINOHead bbox_head.reg_branches.4.0.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.4.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.4.2.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.4.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.4.4.weight - torch.Size([4, 256]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.4.4.bias - torch.Size([4]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.5.0.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.5.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.5.2.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.5.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.5.4.weight - torch.Size([4, 256]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.5.4.bias - torch.Size([4]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.6.0.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.6.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.6.2.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.6.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO bbox_head.reg_branches.6.4.weight - torch.Size([4, 256]): Initialized by user-defined `init_weights` in GroundingDINOHead bbox_head.reg_branches.6.4.bias - torch.Size([4]): Initialized by user-defined `init_weights` in GroundingDINOHead encoder.layers.0.self_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.self_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.self_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.self_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.self_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.self_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.self_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.self_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO 
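Two kinds of explicit initialization do appear above: the neck's 1x1 and 3x3 convs report "XavierInit: gain=1, distribution=uniform, bias=0", and the last linear layer of every reg branch is "Initialized by user-defined `init_weights` in GroundingDINOHead". The cls_branches entries carry only a single scalar bias each, which is consistent with a contrastive, text-conditioned classification head rather than a per-category linear layer. The snippet below is a rough plain-PyTorch equivalent of the Xavier part only; the helper name and the example init_cfg in the comment are assumptions, not copied from this run's config.

    # Rough PyTorch equivalent of the neck init reported above
    # ("XavierInit: gain=1, distribution=uniform, bias=0"). A config-level
    # spelling would look roughly like
    #   init_cfg=dict(type='Xavier', layer='Conv2d', distribution='uniform', bias=0)
    # (an assumption, not read from this log).
    import torch.nn as nn

    def xavier_uniform_convs(module):
        for m in module.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.xavier_uniform_(m.weight, gain=1.0)
                if m.bias is not None:
                    nn.init.constant_(m.bias, 0.0)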
encoder.layers.0.ffn.layers.0.0.weight - torch.Size([2048, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.ffn.layers.0.0.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.ffn.layers.1.weight - torch.Size([256, 2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.0.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.self_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.ffn.layers.0.0.weight - torch.Size([2048, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.ffn.layers.0.0.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.ffn.layers.1.weight - torch.Size([256, 2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.1.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.self_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO 
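The deformable self-attention shapes above decompose cleanly: sampling_offsets maps 256-d queries to 320 = 8 x 5 x 4 x 2 outputs and attention_weights to 160 = 8 x 5 x 4, consistent with 8 heads, 5 feature levels (the four neck outputs plus one extra conv) and 4 sampling points per level. The head/level/point counts are inferred from the shapes rather than read out of the config; the check below only restates the arithmetic.

    # Consistency check for the logged multi-scale deformable attention shapes.
    # Offsets are 2-D per (head, level, point); weights are scalar per (head, level, point).
    embed_dims, num_heads, num_levels, num_points = 256, 8, 5, 4
    assert num_heads * num_levels * num_points * 2 == 320   # sampling_offsets output dim
    assert num_heads * num_levels * num_points == 160        # attention_weights output dim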
encoder.layers.2.self_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.self_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.self_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.self_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.self_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.self_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.self_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.ffn.layers.0.0.weight - torch.Size([2048, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.ffn.layers.0.0.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.ffn.layers.1.weight - torch.Size([256, 2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.2.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.self_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.ffn.layers.0.0.weight - torch.Size([2048, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.ffn.layers.0.0.bias - torch.Size([2048]): The value is the same 
before and after calling `init_weights` of GroundingDINO encoder.layers.3.ffn.layers.1.weight - torch.Size([256, 2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.3.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.self_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.ffn.layers.0.0.weight - torch.Size([2048, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.ffn.layers.0.0.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.ffn.layers.1.weight - torch.Size([256, 2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.4.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same 
before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.self_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.ffn.layers.0.0.weight - torch.Size([2048, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.ffn.layers.0.0.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.ffn.layers.1.weight - torch.Size([256, 2048]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.layers.5.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.self_attn.attn.in_proj_weight - torch.Size([768, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.self_attn.attn.in_proj_bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.self_attn.attn.out_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.self_attn.attn.out_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.ffn.layers.0.0.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.ffn.layers.0.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.ffn.layers.1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.0.norms.1.bias - torch.Size([256]): The value is 
the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.self_attn.attn.in_proj_weight - torch.Size([768, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.self_attn.attn.in_proj_bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.self_attn.attn.out_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.self_attn.attn.out_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.ffn.layers.0.0.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.ffn.layers.0.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.ffn.layers.1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.1.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.self_attn.attn.in_proj_weight - torch.Size([768, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.self_attn.attn.in_proj_bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.self_attn.attn.out_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.self_attn.attn.out_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.ffn.layers.0.0.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.ffn.layers.0.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.ffn.layers.1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.2.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO 
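The text-enhancer layers above use a packed-projection attention: an in_proj_weight of [768, 256] is the 3 x 256 stacked q/k/v projection over 256-d text tokens, exactly what torch.nn.MultiheadAttention allocates. The snippet below only confirms those shapes; num_heads=8 is an assumption and does not affect the packed projection size.

    # The logged text-layer attention shapes match a packed-projection
    # torch.nn.MultiheadAttention over 256-d embeddings.
    import torch.nn as nn

    attn = nn.MultiheadAttention(embed_dim=256, num_heads=8)  # num_heads is an assumption
    assert tuple(attn.in_proj_weight.shape) == (768, 256)     # stacked q/k/v projections
    assert tuple(attn.out_proj.weight.shape) == (256, 256)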
encoder.text_layers.3.self_attn.attn.in_proj_weight - torch.Size([768, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.self_attn.attn.in_proj_bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.self_attn.attn.out_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.self_attn.attn.out_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.ffn.layers.0.0.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.ffn.layers.0.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.ffn.layers.1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.3.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.self_attn.attn.in_proj_weight - torch.Size([768, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.self_attn.attn.in_proj_bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.self_attn.attn.out_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.self_attn.attn.out_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.ffn.layers.0.0.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.ffn.layers.0.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.ffn.layers.1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.4.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.self_attn.attn.in_proj_weight - torch.Size([768, 
256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.self_attn.attn.in_proj_bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.self_attn.attn.out_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.self_attn.attn.out_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.ffn.layers.0.0.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.ffn.layers.0.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.ffn.layers.1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.ffn.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.norms.0.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.norms.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.norms.1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.text_layers.5.norms.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.gamma_v - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.gamma_l - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.layer_norm_v.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.layer_norm_v.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.layer_norm_l.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.layer_norm_l.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.v_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.v_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.l_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.l_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.values_v_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.values_v_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.values_l_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO 
encoder.fusion_layers.0.attn.values_l_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.out_v_proj.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.out_v_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.out_l_proj.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.0.attn.out_l_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.gamma_v - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.gamma_l - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.layer_norm_v.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.layer_norm_v.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.layer_norm_l.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.layer_norm_l.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.v_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.v_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.l_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.l_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.values_v_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.values_v_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.values_l_proj.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.values_l_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.out_v_proj.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.out_v_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.out_l_proj.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.1.attn.out_l_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.2.gamma_v - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO encoder.fusion_layers.2.gamma_l - torch.Size([256]): 
The value is the same before and after calling `init_weights` of GroundingDINO

The same message ("The value is the same before and after calling `init_weights` of GroundingDINO") is reported for each of the parameters grouped below, with the logged shapes kept as-is.

encoder.fusion_layers.2 through encoder.fusion_layers.5 (the layer-2 gamma_v/gamma_l entries appear just above), each with:
    gamma_v, gamma_l                                          - torch.Size([256])
    layer_norm_v.weight/.bias, layer_norm_l.weight/.bias      - torch.Size([256])
    attn.v_proj / l_proj / values_v_proj / values_l_proj .weight - torch.Size([1024, 256])
    attn.v_proj / l_proj / values_v_proj / values_l_proj .bias   - torch.Size([1024])
    attn.out_v_proj / out_l_proj .weight                      - torch.Size([256, 1024])
    attn.out_v_proj / out_l_proj .bias                        - torch.Size([256])
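The fusion-layer shapes above match a GLIP-style bi-directional image-text attention block: the 256-d visual and 256-d text features are each projected into a wider 1024-d attention space (v_proj/l_proj for queries and keys, values_*_proj for values), attended in both directions, projected back to 256-d (out_*_proj), and added to the residual through the per-channel scales gamma_v/gamma_l. A minimal sketch with the same parameter shapes follows; only the shapes are taken from this log, while the layer-scale init value and the single-head simplification are assumptions.

```python
import torch
import torch.nn as nn


class BiAttentionSketch(nn.Module):
    """Bi-directional image<->text attention with the parameter shapes logged above.

    Only the tensor shapes come from the log. Attention is done single-head over
    the full 1024-d space for brevity (the real module splits it into several
    heads), and the 1e-4 layer-scale init is an assumption.
    """

    def __init__(self, v_dim: int = 256, l_dim: int = 256, embed_dim: int = 1024):
        super().__init__()
        self.scale = embed_dim ** -0.5
        # 256 -> 1024 projections: weight [1024, 256], bias [1024]
        self.v_proj = nn.Linear(v_dim, embed_dim)
        self.l_proj = nn.Linear(l_dim, embed_dim)
        self.values_v_proj = nn.Linear(v_dim, embed_dim)
        self.values_l_proj = nn.Linear(l_dim, embed_dim)
        # 1024 -> 256 output projections: weight [256, 1024], bias [256]
        self.out_v_proj = nn.Linear(embed_dim, v_dim)
        self.out_l_proj = nn.Linear(embed_dim, l_dim)
        self.layer_norm_v = nn.LayerNorm(v_dim)
        self.layer_norm_l = nn.LayerNorm(l_dim)
        # per-channel residual scales: shape [256]
        self.gamma_v = nn.Parameter(1e-4 * torch.ones(v_dim))
        self.gamma_l = nn.Parameter(1e-4 * torch.ones(l_dim))

    def forward(self, vis: torch.Tensor, txt: torch.Tensor):
        # vis: [B, Nv, 256] visual tokens, txt: [B, Nt, 256] text tokens
        q = self.v_proj(self.layer_norm_v(vis)) * self.scale
        k = self.l_proj(self.layer_norm_l(txt))
        attn = q @ k.transpose(-1, -2)                                   # [B, Nv, Nt]
        v_from_text = attn.softmax(-1) @ self.values_l_proj(txt)         # text -> vision
        l_from_vis = attn.transpose(-1, -2).softmax(-1) @ self.values_v_proj(vis)  # vision -> text
        vis = vis + self.gamma_v * self.out_v_proj(v_from_text)
        txt = txt + self.gamma_l * self.out_l_proj(l_from_vis)
        return vis, txt


fusion = BiAttentionSketch()
print(fusion.v_proj.weight.shape, fusion.out_v_proj.weight.shape, fusion.gamma_v.shape)
# torch.Size([1024, 256]) torch.Size([256, 1024]) torch.Size([256])
```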
decoder.layers.0 through decoder.layers.5, each with:
    self_attn.attn.in_proj_weight              - torch.Size([768, 256])
    self_attn.attn.in_proj_bias                - torch.Size([768])
    self_attn.attn.out_proj.weight             - torch.Size([256, 256])
    self_attn.attn.out_proj.bias               - torch.Size([256])
    cross_attn_text.attn.in_proj_weight        - torch.Size([768, 256])
    cross_attn_text.attn.in_proj_bias          - torch.Size([768])
    cross_attn_text.attn.out_proj.weight       - torch.Size([256, 256])
    cross_attn_text.attn.out_proj.bias         - torch.Size([256])
    cross_attn.sampling_offsets.weight         - torch.Size([320, 256])
    cross_attn.sampling_offsets.bias           - torch.Size([320])
    cross_attn.attention_weights.weight        - torch.Size([160, 256])
    cross_attn.attention_weights.bias          - torch.Size([160])
    cross_attn.value_proj.weight               - torch.Size([256, 256])
    cross_attn.value_proj.bias                 - torch.Size([256])
    cross_attn.output_proj.weight              - torch.Size([256, 256])
    cross_attn.output_proj.bias                - torch.Size([256])
    ffn.layers.0.0.weight                      - torch.Size([2048, 256])
    ffn.layers.0.0.bias                        - torch.Size([2048])
    ffn.layers.1.weight                        - torch.Size([256, 2048])
    ffn.layers.1.bias                          - torch.Size([256])
    norms.0 / norms.1 / norms.2 / norms.3 .weight and .bias - torch.Size([256])
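The decoder shapes pin down the attention hyper-parameters: self_attn and cross_attn_text are standard qkv-packed attentions over 256-d embeddings, so in_proj_weight is [3 x 256, 256] = [768, 256], while the deformable cross_attn predicts heads x levels x points x 2 sampling offsets (320 outputs) and heads x levels x points attention weights (160 outputs) from each 256-d query. The shapes alone fix heads x levels x points = 160 but not the individual factors; the 8-head value below is an assumption, not something read from this log.

```python
import torch.nn as nn

embed_dims = 256

# self_attn / cross_attn_text: qkv-packed attention over 256-d embeddings, so
# in_proj_weight is [3 * 256, 256] = [768, 256] (the head count does not show
# up in the packed projection shape).
mha = nn.MultiheadAttention(embed_dims, num_heads=8, batch_first=True)
assert tuple(mha.in_proj_weight.shape) == (768, 256)
assert tuple(mha.out_proj.weight.shape) == (256, 256)

# cross_attn (multi-scale deformable attention): per query it predicts
#   heads * levels * points * 2 sampling offsets  -> 320 outputs
#   heads * levels * points     attention weights -> 160 outputs
offsets_out, weights_out = 320, 160
assert offsets_out == 2 * weights_out
# heads * levels * points = 160; with the common choice of 8 heads that leaves
# levels * points = 20 (e.g. 4 levels x 5 points or 5 levels x 4 points) --
# the exact split cannot be read off the shapes alone.
heads = 8  # assumption
print("levels * points =", weights_out // heads)
```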
decoder.ref_point_head.layers.0.weight         - torch.Size([256, 512])
decoder.ref_point_head.layers.0.bias           - torch.Size([256])
decoder.ref_point_head.layers.1.weight         - torch.Size([256, 256])
decoder.ref_point_head.layers.1.bias           - torch.Size([256])
decoder.norm.weight / decoder.norm.bias        - torch.Size([256])
query_embedding.weight                         - torch.Size([900, 256])
memory_trans_fc.weight                         - torch.Size([256, 256])
memory_trans_fc.bias                           - torch.Size([256])
memory_trans_norm.weight / memory_trans_norm.bias - torch.Size([256])
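ref_point_head is a two-layer MLP (Linear 512->256, Linear 256->256) that, in DINO-style decoders, maps the sine/cosine embedding of each query's 4-d reference box to a 256-d query position embedding. The 512-d input is consistent with 128 sine features per box coordinate, though that split is an assumption rather than something recoverable from this log; a sketch under that assumption:

```python
import math
import torch
import torch.nn as nn


def coord_sine_embed(refs: torch.Tensor, num_feats: int = 128, temperature: int = 10000) -> torch.Tensor:
    """Sine/cosine embedding of normalized (cx, cy, w, h) reference boxes.

    refs: [..., 4] in [0, 1]  ->  [..., 4 * num_feats] = [..., 512].
    128 features per coordinate is an assumption consistent with the logged
    [256, 512] shape of ref_point_head.layers.0.weight.
    """
    dim_t = torch.arange(num_feats, dtype=torch.float32, device=refs.device)
    dim_t = temperature ** (2 * torch.div(dim_t, 2, rounding_mode="floor") / num_feats)
    pos = refs[..., None] * 2 * math.pi / dim_t                           # [..., 4, 128]
    pos = torch.stack((pos[..., 0::2].sin(), pos[..., 1::2].cos()), dim=-1).flatten(-2)
    return pos.flatten(-2)                                                # [..., 512]


# layers.0: weight [256, 512]; layers.1: weight [256, 256]
ref_point_head = nn.Sequential(nn.Linear(4 * 128, 256), nn.ReLU(inplace=True), nn.Linear(256, 256))

refs = torch.rand(2, 900, 4)                   # 900 queries per image, matching query_embedding
query_pos = ref_point_head(coord_sine_embed(refs))
print(query_pos.shape)                         # torch.Size([2, 900, 256])
```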
language_model.language_backbone.body.model.embeddings:
    word_embeddings.weight                     - torch.Size([30522, 768])
    position_embeddings.weight                 - torch.Size([512, 768])
    token_type_embeddings.weight               - torch.Size([2, 768])
    LayerNorm.weight / LayerNorm.bias          - torch.Size([768])
language_model.language_backbone.body.model.encoder.layer.0 through layer.11, each with:
    attention.self.query / key / value .weight - torch.Size([768, 768])
    attention.self.query / key / value .bias   - torch.Size([768])
    attention.output.dense.weight              - torch.Size([768, 768])
    attention.output.dense.bias                - torch.Size([768])
    attention.output.LayerNorm.weight / .bias  - torch.Size([768])
    intermediate.dense.weight                  - torch.Size([3072, 768])
    intermediate.dense.bias                    - torch.Size([3072])
    output.dense.weight                        - torch.Size([768, 3072])
    output.dense.bias                          - torch.Size([768])
    output.LayerNorm.weight / .bias            - torch.Size([768])
value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.attention.self.value.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.attention.output.dense.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.attention.output.LayerNorm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.intermediate.dense.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.output.dense.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.output.dense.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO language_model.language_backbone.body.model.encoder.layer.11.output.LayerNorm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of GroundingDINO text_feat_map.weight - torch.Size([256, 768]): The value is the same before and after calling `init_weights` of GroundingDINO text_feat_map.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO dn_query_generator.label_embedding.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.embed_tokens.weight - torch.Size([151647, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before 
and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.down_proj.weight - torch.Size([256, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.down_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.up_proj.weight - torch.Size([896, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.up_proj.bias - 
torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.corss_attn_norm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.0.cross_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value 
is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.1.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after 
calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.2.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO 
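Aside: at this point the log has moved past the frozen BERT text backbone, the 768-to-256 text_feat_map projection and the denoising label embedding, and is walking the LMM branch, where every Qwen2-0.5B projection is wrapped with a rank-64 LoRA: each lora_A is [64, in_features], each lora_B is [out_features, 64], and ori_model keeps the frozen base weight. The 128-dim k_proj/v_proj outputs are consistent with Qwen2-0.5B's grouped-query attention (hidden size 896, 2 key/value heads of dim 64). The snippet below is only a minimal sketch that reproduces those shapes; the class name LoRALinear, the zero-init of lora_B and the scaling factor are assumptions, not this repository's actual wrapper.

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Hypothetical rank-r LoRA wrapper mirroring the logged parameter layout."""

    def __init__(self, in_features: int, out_features: int, rank: int = 64,
                 scale: float = 1.0) -> None:
        super().__init__()
        # Frozen base projection -> logged as `ori_model.weight` / `ori_model.bias`.
        self.ori_model = nn.Linear(in_features, out_features)
        self.ori_model.requires_grad_(False)
        # Trainable factors -> logged as `lora_A.weight` ([rank, in]) and
        # `lora_B.weight` ([out, rank]).
        self.lora_A = nn.Linear(in_features, rank, bias=False)
        self.lora_B = nn.Linear(rank, out_features, bias=False)
        nn.init.zeros_(self.lora_B.weight)  # assumption: start as a zero update
        self.scale = scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.ori_model(x) + self.scale * self.lora_B(self.lora_A(x))


# Shapes consistent with the log (Qwen2-0.5B: hidden 896, 14 query heads,
# 2 key/value heads of dim 64, so the k/v projections are 896 -> 128).
q_proj = LoRALinear(896, 896)
k_proj = LoRALinear(896, 128)
x = torch.randn(2, 7, 896)
print(q_proj(x).shape, k_proj(x).shape)  # (2, 7, 896) (2, 7, 128)
```

Because `init_weights` reports these LoRA and ori_model tensors as unchanged, their values come entirely from the loaded pretrained weights rather than from any re-initialization.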
lmm.model.layers.3.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.3.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.4.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.down_proj.weight - torch.Size([256, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.down_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.up_proj.weight - torch.Size([896, 256]): The value 
is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.up_proj.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.corss_attn_norm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.4.cross_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` 
of GroundingDINO lmm.model.layers.5.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.5.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO 
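Aside: interleaved with the LoRA-wrapped layers, layers 0, 4 and 8 (and 12 further down) also log an adapter group: down_proj [256, 896] and up_proj [896, 256] that map between the 896-dim LMM hidden states and the 256-dim detector-side feature width, a corss_attn_norm weight of [896] (only a weight is logged, so it is presumably an RMSNorm; the misspelling is the module's registered name), and a cross_attn block whose sampling_offsets [320, 256] and attention_weights [160, 256] match a multi-scale deformable attention head. The sketch below only reproduces those shapes, assuming mmcv's MultiScaleDeformableAttention and one possible 8-heads x 4-levels x 5-points factorization of the 320/160 outputs; the actual split cannot be recovered from shapes alone.

```python
import torch.nn as nn
from mmcv.ops import MultiScaleDeformableAttention  # assumed available (mmcv 2.x)

lmm_dims, embed_dims = 896, 256

# Adapter pieces logged at layers 0 / 4 / 8 / 12 (names follow the log; LayerNorm
# is a stand-in here, since the log shows only a weight for corss_attn_norm).
down_proj = nn.Linear(lmm_dims, embed_dims)    # weight [256, 896], bias [256]
up_proj = nn.Linear(embed_dims, lmm_dims)      # weight [896, 256], bias [896]
corss_attn_norm = nn.LayerNorm(lmm_dims)       # logged weight [896]

# One factorization that reproduces sampling_offsets [320, 256] and
# attention_weights [160, 256]: 8 heads x 4 levels x 5 points (x2 for offsets).
cross_attn = MultiScaleDeformableAttention(
    embed_dims=embed_dims, num_heads=8, num_levels=4, num_points=5)

for name, p in cross_attn.named_parameters():
    print(name, tuple(p.shape))
# sampling_offsets.weight (320, 256)  attention_weights.weight (160, 256)
# value_proj.weight (256, 256)        output_proj.weight (256, 256)  (+ biases)
```

Any combination with num_heads * num_levels * num_points = 160 would produce the same logged shapes.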
lmm.model.layers.6.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.6.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.7.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.7.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO 
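Aside: for a rough sense of scale, the logged shapes of one decoder layer (four attention projections plus the three MLP projections) are enough to count how many trainable weights the rank-64 factors add next to the frozen base matrices they wrap; biases are ignored since they belong to the frozen ori_model modules.

```python
# Trainable LoRA weights implied by the logged shapes of one decoder layer
# (rank r = 64; each wrapped projection adds r*in for lora_A and out*r for lora_B).
r = 64
proj_shapes = {                      # (in_features, out_features)
    "self_attn.q_proj": (896, 896),
    "self_attn.k_proj": (896, 128),
    "self_attn.v_proj": (896, 128),
    "self_attn.o_proj": (896, 896),
    "mlp.gate_proj":    (896, 4864),
    "mlp.up_proj":      (896, 4864),
    "mlp.down_proj":    (4864, 896),
}
lora = sum(r * i + o * r for i, o in proj_shapes.values())
frozen = sum(i * o for i, o in proj_shapes.values())
print(lora, frozen, f"{lora / frozen:.1%}")   # 1466368 14909440 9.8%
```

So the rank-64 factors add roughly 1.47M trainable weights per decoder layer, about 10% of the ~14.9M frozen projection weights they wrap.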
lmm.model.layers.8.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.down_proj.weight - torch.Size([256, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.down_proj.bias - 
torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.up_proj.weight - torch.Size([896, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.up_proj.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.corss_attn_norm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.8.cross_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling 
`init_weights` of GroundingDINO lmm.model.layers.9.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.9.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of 
GroundingDINO lmm.model.layers.10.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.10.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.11.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.11.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.12.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.12.down_proj.weight - torch.Size([256, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.down_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.up_proj.weight - torch.Size([896, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.up_proj.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.corss_attn_norm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.12.cross_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.v_proj.ori_model.weight - torch.Size([128, 
896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.13.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value 
is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.14.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same 
before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.15.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and 
after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.input_layernorm.weight - torch.Size([896]): The value is the same before and after 
calling `init_weights` of GroundingDINO lmm.model.layers.16.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.down_proj.weight - torch.Size([256, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.down_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.up_proj.weight - torch.Size([896, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.up_proj.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.corss_attn_norm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.16.cross_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.17.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.17.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.18.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.18.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.19.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.19.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.20.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO 
lmm.model.layers.20.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.down_proj.weight - torch.Size([256, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.down_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.up_proj.weight - torch.Size([896, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.up_proj.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.corss_attn_norm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.sampling_offsets.weight - torch.Size([320, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.sampling_offsets.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.attention_weights.weight - torch.Size([160, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.attention_weights.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.value_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.value_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.output_proj.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.20.cross_attn.output_proj.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is 
the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.21.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same 
before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.input_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.22.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.q_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.q_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and 
after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.q_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.q_proj.ori_model.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.k_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.k_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.k_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.k_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.v_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.v_proj.lora_B.weight - torch.Size([128, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.v_proj.ori_model.weight - torch.Size([128, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.v_proj.ori_model.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.o_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.o_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.self_attn.o_proj.ori_model.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.gate_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.gate_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.gate_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.up_proj.lora_A.weight - torch.Size([64, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.up_proj.lora_B.weight - torch.Size([4864, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.up_proj.ori_model.weight - torch.Size([4864, 896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.down_proj.lora_A.weight - torch.Size([64, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.down_proj.lora_B.weight - torch.Size([896, 64]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.mlp.down_proj.ori_model.weight - torch.Size([896, 4864]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.layers.23.input_layernorm.weight - torch.Size([896]): The value is the same before and after 
calling `init_weights` of GroundingDINO lmm.model.layers.23.post_attention_layernorm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO lmm.model.norm.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO ref_point_head.layers.0.weight - torch.Size([256, 512]): The value is the same before and after calling `init_weights` of GroundingDINO ref_point_head.layers.0.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO ref_point_head.layers.1.weight - torch.Size([256, 256]): The value is the same before and after calling `init_weights` of GroundingDINO ref_point_head.layers.1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of GroundingDINO connector.vision_projector.0.0.weight - torch.Size([896, 256]): The value is the same before and after calling `init_weights` of GroundingDINO connector.vision_projector.0.0.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO connector.vision_projector.0.2.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO connector.vision_projector.0.2.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO connector.pos_proj.weight - torch.Size([896, 512]): The value is the same before and after calling `init_weights` of GroundingDINO connector.pos_proj.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO region_connector.vision_projector.0.0.weight - torch.Size([896, 256]): The value is the same before and after calling `init_weights` of GroundingDINO region_connector.vision_projector.0.0.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO region_connector.vision_projector.0.2.weight - torch.Size([896, 896]): The value is the same before and after calling `init_weights` of GroundingDINO region_connector.vision_projector.0.2.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO region_connector.pos_proj.weight - torch.Size([896, 512]): The value is the same before and after calling `init_weights` of GroundingDINO region_connector.pos_proj.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of GroundingDINO
2024/11/10 09:40:55 - mmengine - INFO - Load checkpoint from ../huggingface/mm_grounding_dino/grounding_dino_swin-l_pretrain_obj365_goldg-34dcdc53.pth
2024/11/10 09:40:55 - mmengine - WARNING - "FileClient" will be deprecated in future. Please use io functions in https://mmengine.readthedocs.io/en/latest/api/fileio.html#file-io
2024/11/10 09:40:55 - mmengine - WARNING - "HardDiskBackend" is the alias of "LocalBackend" and the former will be deprecated in future.
2024/11/10 09:40:55 - mmengine - INFO - Checkpoints will be saved to /mnt/data/qize.yqz/code/VideoVLM/LMMDet/myllava/work_dirs/grounding_dino_swin_l.
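The repeated `lora_A`/`lora_B`/`ori_model` entries reported above come from LoRA adapters wrapped around the frozen projection layers of the LMM branch's language model (hidden size 896, key/value projection width 128, LoRA rank 64, as the logged shapes show). The wrapper below is a minimal sketch written only to show how those parameter names and shapes arise; it is not the project's actual implementation.

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Hypothetical LoRA wrapper matching the parameter names in the log:
    a frozen `ori_model` linear plus trainable `lora_A`/`lora_B` factors."""

    def __init__(self, in_features: int, out_features: int, rank: int = 64,
                 bias: bool = True):
        super().__init__()
        self.ori_model = nn.Linear(in_features, out_features, bias=bias)
        self.ori_model.requires_grad_(False)  # base projection stays frozen
        self.lora_A = nn.Parameter(torch.zeros(rank, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        nn.init.kaiming_uniform_(self.lora_A, a=5 ** 0.5)  # lora_B stays zero

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # frozen projection + low-rank update (A then B)
        return self.ori_model(x) + x @ self.lora_A.t() @ self.lora_B.t()


# Shapes as logged for one attention block of the LMM (rank 64):
q_proj = LoRALinear(896, 896)               # lora_A [64, 896], lora_B [896, 64]
k_proj = LoRALinear(896, 128)               # lora_B [128, 64], ori_model.weight [128, 896]
o_proj = LoRALinear(896, 896, bias=False)   # o_proj has no ori_model.bias in the log
```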
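The `FileClient` deprecation warning above points at mmengine's functional file I/O helpers. A minimal sketch of the replacement pattern follows; the paths are placeholders, not taken from this run.

```python
from mmengine.fileio import get, get_text, load

# Functional replacements for the deprecated FileClient / HardDiskBackend usage:
raw_bytes = get('path/to/checkpoint.pth')     # bytes, LocalBackend by default
config_text = get_text('path/to/config.py')   # str
annotations = load('path/to/annotations.json')  # parsed according to file suffix
```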
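The iteration records that follow report `grad_norm: nan` at a few steps (iterations 200 and 700) while the losses stay finite, which is worth monitoring over a long run. Below is a small log-parsing sketch, assuming the record layout shown in those lines; the file path is illustrative only.

```python
import re

# Matches mmengine train records of the form shown below, e.g.
#   "... Iter(train) [  200/150000] ... grad_norm: nan loss: 13.3723 ..."
RECORD = re.compile(
    r'Iter\(train\)\s+\[\s*(\d+)/\d+\].*?grad_norm:\s+(\S+)\s+loss:\s+(\S+)')


def flag_nonfinite_grads(log_path: str) -> None:
    """Print every training step whose reported grad_norm is nan/inf."""
    with open(log_path) as f:
        for line in f:
            match = RECORD.search(line)
            if match is None:
                continue
            step, grad_norm, loss = match.groups()
            if grad_norm in ('nan', 'inf'):
                print(f'iter {step}: grad_norm={grad_norm}, loss={loss}')


# Illustrative path; mmengine writes the text log under the work_dir noted above.
flag_nonfinite_grads('path/to/train.log')
```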
2024/11/10 09:44:21 - mmengine - INFO - Iter(train) [ 100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 3 days, 14:01:07 time: 2.0169 data_time: 0.0188 memory: 34533 grad_norm: 51.1539 loss: 15.2381 loss_cls: 0.5859 loss_bbox: 0.1574 loss_iou: 0.2678 d0.loss_cls: 0.6258 d0.loss_bbox: 0.1635 d0.loss_iou: 0.2760 d1.loss_cls: 0.5978 d1.loss_bbox: 0.1594 d1.loss_iou: 0.2752 d2.loss_cls: 0.5949 d2.loss_bbox: 0.1522 d2.loss_iou: 0.2669 d3.loss_cls: 0.5827 d3.loss_bbox: 0.1572 d3.loss_iou: 0.2670 d4.loss_cls: 0.5860 d4.loss_bbox: 0.1549 d4.loss_iou: 0.2671 enc_loss_cls: 0.6122 enc_loss_bbox: 0.1745 enc_loss_iou: 0.2978 dn_loss_cls: 0.4139 dn_loss_bbox: 0.2403 dn_loss_iou: 0.2477 d0.dn_loss_cls: 0.4870 d0.dn_loss_bbox: 0.4000 d0.dn_loss_iou: 0.4027 d1.dn_loss_cls: 0.4578 d1.dn_loss_bbox: 0.2686 d1.dn_loss_iou: 0.2833 d2.dn_loss_cls: 0.4314 d2.dn_loss_bbox: 0.2500 d2.dn_loss_iou: 0.2596 d3.dn_loss_cls: 0.4167 d3.dn_loss_bbox: 0.2428 d3.dn_loss_iou: 0.2507 d4.dn_loss_cls: 0.4123 d4.dn_loss_bbox: 0.2403 d4.dn_loss_iou: 0.2478 d1.loss_lmm_region: 0.6365 loss_lmm_image: 1.4265
2024/11/10 09:47:41 - mmengine - INFO - Iter(train) [ 200/150000] base_lr: 2.0000e-05 lr: 2.0000e-05 eta: 3 days, 12:27:31 time: 1.9844 data_time: 0.0180 memory: 32972 grad_norm: nan loss: 13.3723 loss_cls: 0.4843 loss_bbox: 0.1574 loss_iou: 0.2546 d0.loss_cls: 0.5222 d0.loss_bbox: 0.1683 d0.loss_iou: 0.2745 d1.loss_cls: 0.5015 d1.loss_bbox: 0.1619 d1.loss_iou: 0.2646 d2.loss_cls: 0.4956 d2.loss_bbox: 0.1557 d2.loss_iou: 0.2576 d3.loss_cls: 0.4898 d3.loss_bbox: 0.1499 d3.loss_iou: 0.2529 d4.loss_cls: 0.4875 d4.loss_bbox: 0.1583 d4.loss_iou: 0.2548 enc_loss_cls: 0.5140 enc_loss_bbox: 0.1763 enc_loss_iou: 0.2862 dn_loss_cls: 0.3255 dn_loss_bbox: 0.1976 dn_loss_iou: 0.2303 d0.dn_loss_cls: 0.4041 d0.dn_loss_bbox: 0.3582 d0.dn_loss_iou: 0.3877 d1.dn_loss_cls: 0.3629 d1.dn_loss_bbox: 0.2312 d1.dn_loss_iou: 0.2650 d2.dn_loss_cls: 0.3424 d2.dn_loss_bbox: 0.2091 d2.dn_loss_iou: 0.2417 d3.dn_loss_cls: 0.3311 d3.dn_loss_bbox: 0.2003 d3.dn_loss_iou: 0.2339 d4.dn_loss_cls: 0.3259 d4.dn_loss_bbox: 0.1976 d4.dn_loss_iou: 0.2303 d1.loss_lmm_region: 0.5939 loss_lmm_image: 1.2356
2024/11/10 09:51:02 - mmengine - INFO - Iter(train) [ 300/150000] base_lr: 3.0000e-05 lr: 3.0000e-05 eta: 3 days, 12:13:28 time: 2.0147 data_time: 0.0184 memory: 33468 grad_norm: 31.2586 loss: 12.9213 loss_cls: 0.4786 loss_bbox: 0.1536 loss_iou: 0.2729 d0.loss_cls: 0.5231 d0.loss_bbox: 0.1648 d0.loss_iou: 0.2923 d1.loss_cls: 0.4892 d1.loss_bbox: 0.1590 d1.loss_iou: 0.2814 d2.loss_cls: 0.4828 d2.loss_bbox: 0.1554 d2.loss_iou: 0.2761 d3.loss_cls: 0.4791 d3.loss_bbox: 0.1533 d3.loss_iou: 0.2741 d4.loss_cls: 0.4770 d4.loss_bbox: 0.1531 d4.loss_iou: 0.2730 enc_loss_cls: 0.5076 enc_loss_bbox: 0.1801 enc_loss_iou: 0.3133 dn_loss_cls: 0.2220 dn_loss_bbox: 0.2109 dn_loss_iou: 0.2400 d0.dn_loss_cls: 0.3016 d0.dn_loss_bbox: 0.3841 d0.dn_loss_iou: 0.3956 d1.dn_loss_cls: 0.2527 d1.dn_loss_bbox: 0.2517 d1.dn_loss_iou: 0.2755 d2.dn_loss_cls: 0.2318 d2.dn_loss_bbox: 0.2217 d2.dn_loss_iou: 0.2502 d3.dn_loss_cls: 0.2240 d3.dn_loss_bbox: 0.2125 d3.dn_loss_iou: 0.2423 d4.dn_loss_cls: 0.2210 d4.dn_loss_bbox: 0.2108 d4.dn_loss_iou: 0.2399 d1.loss_lmm_region: 0.5999 loss_lmm_image: 1.1931
2024/11/10 09:54:22 - mmengine - INFO - Iter(train) [ 400/150000] base_lr: 4.0000e-05 lr: 4.0000e-05 eta: 3 days, 11:50:42 time: 2.0094 data_time: 0.0183 memory: 34777 grad_norm: 28.9738 loss: 12.2970 loss_cls: 0.4054 loss_bbox: 0.1368 loss_iou: 0.2585 d0.loss_cls: 0.4603 d0.loss_bbox: 0.1517 d0.loss_iou: 0.2744 d1.loss_cls: 0.4222 d1.loss_bbox: 0.1421 d1.loss_iou: 0.2648 d2.loss_cls: 0.4167 d2.loss_bbox: 0.1359 d2.loss_iou: 0.2569 d3.loss_cls: 0.4098 d3.loss_bbox: 0.1381 d3.loss_iou: 0.2595 d4.loss_cls: 0.4069 d4.loss_bbox: 0.1365 d4.loss_iou: 0.2583 enc_loss_cls: 0.4497 enc_loss_bbox: 0.1637 enc_loss_iou: 0.2908 dn_loss_cls: 0.2515 dn_loss_bbox: 0.1994 dn_loss_iou: 0.2447 d0.dn_loss_cls: 0.3241 d0.dn_loss_bbox: 0.3748 d0.dn_loss_iou: 0.4071 d1.dn_loss_cls: 0.2797 d1.dn_loss_bbox: 0.2423 d1.dn_loss_iou: 0.2832 d2.dn_loss_cls: 0.2614 d2.dn_loss_bbox: 0.2112 d2.dn_loss_iou: 0.2560 d3.dn_loss_cls: 0.2563 d3.dn_loss_bbox: 0.2016 d3.dn_loss_iou: 0.2475 d4.dn_loss_cls: 0.2533 d4.dn_loss_bbox: 0.1993 d4.dn_loss_iou: 0.2446 d1.loss_lmm_region: 0.5711 loss_lmm_image: 1.1489
2024/11/10 09:57:42 - mmengine - INFO - Iter(train) [ 500/150000] base_lr: 5.0000e-05 lr: 5.0000e-05 eta: 3 days, 11:38:49 time: 1.9798 data_time: 0.0183 memory: 33972 grad_norm: 25.8791 loss: 12.3433 loss_cls: 0.3987 loss_bbox: 0.1740 loss_iou: 0.2675 d0.loss_cls: 0.4500 d0.loss_bbox: 0.1863 d0.loss_iou: 0.2784 d1.loss_cls: 0.4143 d1.loss_bbox: 0.1799 d1.loss_iou: 0.2719 d2.loss_cls: 0.4072 d2.loss_bbox: 0.1766 d2.loss_iou: 0.2709 d3.loss_cls: 0.4047 d3.loss_bbox: 0.1767 d3.loss_iou: 0.2684 d4.loss_cls: 0.3981 d4.loss_bbox: 0.1773 d4.loss_iou: 0.2703 enc_loss_cls: 0.4381 enc_loss_bbox: 0.1986 enc_loss_iou: 0.3001 dn_loss_cls: 0.1997 dn_loss_bbox: 0.2186 dn_loss_iou: 0.2491 d0.dn_loss_cls: 0.2865 d0.dn_loss_bbox: 0.3718 d0.dn_loss_iou: 0.4011 d1.dn_loss_cls: 0.2323 d1.dn_loss_bbox: 0.2535 d1.dn_loss_iou: 0.2846 d2.dn_loss_cls: 0.2084 d2.dn_loss_bbox: 0.2289 d2.dn_loss_iou: 0.2603 d3.dn_loss_cls: 0.2020 d3.dn_loss_bbox: 0.2201 d3.dn_loss_iou: 0.2520 d4.dn_loss_cls: 0.1987 d4.dn_loss_bbox: 0.2186 d4.dn_loss_iou: 0.2490 d1.loss_lmm_region: 0.5341 loss_lmm_image: 1.1658
2024/11/10 10:01:04 - mmengine - INFO - Iter(train) [ 600/150000] base_lr: 6.0000e-05 lr: 6.0000e-05 eta: 3 days, 11:38:32 time: 2.0273 data_time: 0.0182 memory: 32503 grad_norm: 28.2289 loss: 13.3477 loss_cls: 0.4645 loss_bbox: 0.1718 loss_iou: 0.2714 d0.loss_cls: 0.5070 d0.loss_bbox: 0.1935 d0.loss_iou: 0.2899 d1.loss_cls: 0.4786 d1.loss_bbox: 0.1787 d1.loss_iou: 0.2776 d2.loss_cls: 0.4728 d2.loss_bbox: 0.1750 d2.loss_iou: 0.2737 d3.loss_cls: 0.4630 d3.loss_bbox: 0.1732 d3.loss_iou: 0.2723 d4.loss_cls: 0.4630 d4.loss_bbox: 0.1728 d4.loss_iou: 0.2739 enc_loss_cls: 0.5074 enc_loss_bbox: 0.1968 enc_loss_iou: 0.3036 dn_loss_cls: 0.2758 dn_loss_bbox: 0.2248 dn_loss_iou: 0.2540 d0.dn_loss_cls: 0.3473 d0.dn_loss_bbox: 0.4276 d0.dn_loss_iou: 0.4110 d1.dn_loss_cls: 0.3015 d1.dn_loss_bbox: 0.2696 d1.dn_loss_iou: 0.2887 d2.dn_loss_cls: 0.2867 d2.dn_loss_bbox: 0.2390 d2.dn_loss_iou: 0.2658 d3.dn_loss_cls: 0.2781 d3.dn_loss_bbox: 0.2280 d3.dn_loss_iou: 0.2571 d4.dn_loss_cls: 0.2750 d4.dn_loss_bbox: 0.2250 d4.dn_loss_iou: 0.2541 d1.loss_lmm_region: 0.5333 loss_lmm_image: 1.1247
2024/11/10 10:04:26 - mmengine - INFO - Iter(train) [ 700/150000] base_lr: 7.0000e-05 lr: 7.0000e-05 eta: 3 days, 11:35:33 time: 1.9974 data_time: 0.0181 memory: 33779 grad_norm: nan loss: 11.8927 loss_cls: 0.3783 loss_bbox: 0.1555 loss_iou: 0.2325 d0.loss_cls: 0.4163 d0.loss_bbox: 0.1768 d0.loss_iou: 0.2531 d1.loss_cls: 0.3942 d1.loss_bbox: 0.1646 d1.loss_iou: 0.2390 d2.loss_cls: 0.3810 d2.loss_bbox: 0.1635 d2.loss_iou: 0.2374 d3.loss_cls: 0.3790 d3.loss_bbox: 0.1545 d3.loss_iou: 0.2317 d4.loss_cls: 0.3760 d4.loss_bbox: 0.1542 d4.loss_iou: 0.2311 enc_loss_cls: 0.4072 enc_loss_bbox: 0.1895 enc_loss_iou: 0.2673 dn_loss_cls: 0.2369 dn_loss_bbox: 0.2161 dn_loss_iou: 0.2363 d0.dn_loss_cls: 0.3114 d0.dn_loss_bbox: 0.4055 d0.dn_loss_iou: 0.4022 d1.dn_loss_cls: 0.2610 d1.dn_loss_bbox: 0.2573 d1.dn_loss_iou: 0.2746 d2.dn_loss_cls: 0.2426 d2.dn_loss_bbox: 0.2285 d2.dn_loss_iou: 0.2475 d3.dn_loss_cls: 0.2375 d3.dn_loss_bbox: 0.2182 d3.dn_loss_iou: 0.2383 d4.dn_loss_cls: 0.2340 d4.dn_loss_bbox: 0.2161 d4.dn_loss_iou: 0.2362 d1.loss_lmm_region: 0.4572 loss_lmm_image: 1.1525
2024/11/10 10:07:49 - mmengine - INFO - Iter(train) [ 800/150000] base_lr: 8.0000e-05 lr: 8.0000e-05 eta: 3 days, 11:37:24 time: 2.0461 data_time: 0.0181 memory: 34125 grad_norm: 27.5186 loss: 10.9050 loss_cls: 0.3927 loss_bbox: 0.1203 loss_iou: 0.2109 d0.loss_cls: 0.4354 d0.loss_bbox: 0.1314 d0.loss_iou: 0.2261 d1.loss_cls: 0.4083 d1.loss_bbox: 0.1203 d1.loss_iou: 0.2156 d2.loss_cls: 0.4003 d2.loss_bbox: 0.1208 d2.loss_iou: 0.2138 d3.loss_cls: 0.3956 d3.loss_bbox: 0.1190 d3.loss_iou: 0.2103 d4.loss_cls: 0.3902 d4.loss_bbox: 0.1212 d4.loss_iou: 0.2111 enc_loss_cls: 0.4237 enc_loss_bbox: 0.1469 enc_loss_iou: 0.2529 dn_loss_cls: 0.2142 dn_loss_bbox: 0.1662 dn_loss_iou: 0.2069 d0.dn_loss_cls: 0.2945 d0.dn_loss_bbox: 0.3298 d0.dn_loss_iou: 0.3580 d1.dn_loss_cls: 0.2415 d1.dn_loss_bbox: 0.2010 d1.dn_loss_iou: 0.2422 d2.dn_loss_cls: 0.2220 d2.dn_loss_bbox: 0.1778 d2.dn_loss_iou: 0.2190 d3.dn_loss_cls: 0.2163 d3.dn_loss_bbox: 0.1684 d3.dn_loss_iou: 0.2099 d4.dn_loss_cls: 0.2129 d4.dn_loss_bbox: 0.1662 d4.dn_loss_iou: 0.2068 d1.loss_lmm_region: 0.4524 loss_lmm_image: 1.1319
2024/11/10 10:11:09 - mmengine - INFO - Iter(train) [ 900/150000] base_lr: 9.0000e-05 lr: 9.0000e-05 eta: 3 days, 11:28:49 time: 1.9812 data_time: 0.0180 memory: 33254 grad_norm: 25.4385 loss: 10.3031 loss_cls: 0.3229 loss_bbox: 0.1135 loss_iou: 0.2120 d0.loss_cls: 0.3671 d0.loss_bbox: 0.1208 d0.loss_iou: 0.2248 d1.loss_cls: 0.3293 d1.loss_bbox: 0.1229 d1.loss_iou: 0.2187 d2.loss_cls: 0.3297 d2.loss_bbox: 0.1150 d2.loss_iou: 0.2144 d3.loss_cls: 0.3244 d3.loss_bbox: 0.1142 d3.loss_iou: 0.2117 d4.loss_cls: 0.3245 d4.loss_bbox: 0.1128 d4.loss_iou: 0.2110 enc_loss_cls: 0.3553 enc_loss_bbox: 0.1388 enc_loss_iou: 0.2508 dn_loss_cls: 0.1808 dn_loss_bbox: 0.1720 dn_loss_iou: 0.2117 d0.dn_loss_cls: 0.2731 d0.dn_loss_bbox: 0.3469 d0.dn_loss_iou: 0.3696 d1.dn_loss_cls: 0.2200 d1.dn_loss_bbox: 0.2145 d1.dn_loss_iou: 0.2471 d2.dn_loss_cls: 0.1962 d2.dn_loss_bbox: 0.1855 d2.dn_loss_iou: 0.2225 d3.dn_loss_cls: 0.1838 d3.dn_loss_bbox: 0.1749 d3.dn_loss_iou: 0.2146 d4.dn_loss_cls: 0.1807 d4.dn_loss_bbox: 0.1720 d4.dn_loss_iou: 0.2116 d1.loss_lmm_region: 0.4365 loss_lmm_image: 1.1545
2024/11/10 10:14:33 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759
2024/11/10 10:14:33 - mmengine - INFO - Iter(train) [ 1000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 11:32:14 time: 2.0253 data_time: 0.0191 memory: 35133 grad_norm: 26.9837 loss: 11.3224 loss_cls: 0.3759 loss_bbox: 0.1529 loss_iou: 0.2387 d0.loss_cls: 0.4292 d0.loss_bbox: 0.1615 d0.loss_iou: 0.2517 d1.loss_cls: 0.3877 d1.loss_bbox: 0.1609 d1.loss_iou: 0.2489 d2.loss_cls: 0.3874 d2.loss_bbox: 0.1518 d2.loss_iou: 0.2398 d3.loss_cls: 0.3788 d3.loss_bbox: 0.1550 d3.loss_iou: 0.2422 d4.loss_cls: 0.3752 d4.loss_bbox: 0.1534 d4.loss_iou: 0.2397 enc_loss_cls: 0.4193 enc_loss_bbox: 0.1742 enc_loss_iou: 0.2684 dn_loss_cls: 0.1894 dn_loss_bbox: 0.1977 dn_loss_iou: 0.2263 d0.dn_loss_cls: 0.2603 d0.dn_loss_bbox: 0.3680 d0.dn_loss_iou: 0.3848 d1.dn_loss_cls: 0.2172 d1.dn_loss_bbox: 0.2365 d1.dn_loss_iou: 0.2632
d2.dn_loss_cls: 0.2002 d2.dn_loss_bbox: 0.2078 d2.dn_loss_iou: 0.2365 d3.dn_loss_cls: 0.1922 d3.dn_loss_bbox: 0.2005 d3.dn_loss_iou: 0.2296 d4.dn_loss_cls: 0.1888 d4.dn_loss_bbox: 0.1977 d4.dn_loss_iou: 0.2263 d1.loss_lmm_region: 0.4021 loss_lmm_image: 1.1047 2024/11/10 10:17:53 - mmengine - INFO - Iter(train) [ 1100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 11:25:46 time: 1.9945 data_time: 0.0169 memory: 33679 grad_norm: 26.2734 loss: 11.8727 loss_cls: 0.4107 loss_bbox: 0.1432 loss_iou: 0.2638 d0.loss_cls: 0.4595 d0.loss_bbox: 0.1548 d0.loss_iou: 0.2779 d1.loss_cls: 0.4256 d1.loss_bbox: 0.1483 d1.loss_iou: 0.2685 d2.loss_cls: 0.4168 d2.loss_bbox: 0.1453 d2.loss_iou: 0.2642 d3.loss_cls: 0.4158 d3.loss_bbox: 0.1416 d3.loss_iou: 0.2625 d4.loss_cls: 0.4096 d4.loss_bbox: 0.1421 d4.loss_iou: 0.2637 enc_loss_cls: 0.4564 enc_loss_bbox: 0.1745 enc_loss_iou: 0.3047 dn_loss_cls: 0.2080 dn_loss_bbox: 0.1926 dn_loss_iou: 0.2452 d0.dn_loss_cls: 0.2849 d0.dn_loss_bbox: 0.3419 d0.dn_loss_iou: 0.3994 d1.dn_loss_cls: 0.2393 d1.dn_loss_bbox: 0.2277 d1.dn_loss_iou: 0.2806 d2.dn_loss_cls: 0.2204 d2.dn_loss_bbox: 0.2026 d2.dn_loss_iou: 0.2558 d3.dn_loss_cls: 0.2130 d3.dn_loss_bbox: 0.1930 d3.dn_loss_iou: 0.2476 d4.dn_loss_cls: 0.2067 d4.dn_loss_bbox: 0.1924 d4.dn_loss_iou: 0.2451 d1.loss_lmm_region: 0.4126 loss_lmm_image: 1.1144 2024/11/10 10:21:16 - mmengine - INFO - Iter(train) [ 1200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 11:23:29 time: 2.0110 data_time: 0.0166 memory: 34615 grad_norm: 27.8818 loss: 11.8943 loss_cls: 0.4341 loss_bbox: 0.1426 loss_iou: 0.2599 d0.loss_cls: 0.4787 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2801 d1.loss_cls: 0.4461 d1.loss_bbox: 0.1523 d1.loss_iou: 0.2712 d2.loss_cls: 0.4400 d2.loss_bbox: 0.1468 d2.loss_iou: 0.2650 d3.loss_cls: 0.4354 d3.loss_bbox: 0.1442 d3.loss_iou: 0.2640 d4.loss_cls: 0.4345 d4.loss_bbox: 0.1436 d4.loss_iou: 0.2605 enc_loss_cls: 0.4685 enc_loss_bbox: 0.1765 enc_loss_iou: 0.3108 dn_loss_cls: 0.2223 dn_loss_bbox: 0.1829 dn_loss_iou: 0.2218 d0.dn_loss_cls: 0.3028 d0.dn_loss_bbox: 0.3397 d0.dn_loss_iou: 0.3705 d1.dn_loss_cls: 0.2581 d1.dn_loss_bbox: 0.2114 d1.dn_loss_iou: 0.2542 d2.dn_loss_cls: 0.2372 d2.dn_loss_bbox: 0.1895 d2.dn_loss_iou: 0.2315 d3.dn_loss_cls: 0.2298 d3.dn_loss_bbox: 0.1827 d3.dn_loss_iou: 0.2235 d4.dn_loss_cls: 0.2250 d4.dn_loss_bbox: 0.1829 d4.dn_loss_iou: 0.2218 d1.loss_lmm_region: 0.3671 loss_lmm_image: 1.1258 2024/11/10 10:24:33 - mmengine - INFO - Iter(train) [ 1300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 11:11:31 time: 2.0051 data_time: 0.0167 memory: 34070 grad_norm: 26.9068 loss: 11.2617 loss_cls: 0.3582 loss_bbox: 0.1381 loss_iou: 0.2227 d0.loss_cls: 0.4122 d0.loss_bbox: 0.1584 d0.loss_iou: 0.2366 d1.loss_cls: 0.3733 d1.loss_bbox: 0.1468 d1.loss_iou: 0.2309 d2.loss_cls: 0.3679 d2.loss_bbox: 0.1439 d2.loss_iou: 0.2280 d3.loss_cls: 0.3607 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2240 d4.loss_cls: 0.3588 d4.loss_bbox: 0.1360 d4.loss_iou: 0.2221 enc_loss_cls: 0.3947 enc_loss_bbox: 0.1701 enc_loss_iou: 0.2619 dn_loss_cls: 0.2201 dn_loss_bbox: 0.2051 dn_loss_iou: 0.2180 d0.dn_loss_cls: 0.3050 d0.dn_loss_bbox: 0.3739 d0.dn_loss_iou: 0.3725 d1.dn_loss_cls: 0.2519 d1.dn_loss_bbox: 0.2460 d1.dn_loss_iou: 0.2574 d2.dn_loss_cls: 0.2342 d2.dn_loss_bbox: 0.2165 d2.dn_loss_iou: 0.2304 d3.dn_loss_cls: 0.2231 d3.dn_loss_bbox: 0.2083 d3.dn_loss_iou: 0.2213 d4.dn_loss_cls: 0.2214 d4.dn_loss_bbox: 0.2051 d4.dn_loss_iou: 0.2179 d1.loss_lmm_region: 0.3770 loss_lmm_image: 1.1705 2024/11/10 10:27:54 - mmengine - INFO - 
Iter(train) [ 1400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 11:06:52 time: 2.0221 data_time: 0.0164 memory: 34897 grad_norm: 25.4814 loss: 10.4646 loss_cls: 0.3835 loss_bbox: 0.1178 loss_iou: 0.2080 d0.loss_cls: 0.4314 d0.loss_bbox: 0.1306 d0.loss_iou: 0.2215 d1.loss_cls: 0.3981 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2183 d2.loss_cls: 0.3922 d2.loss_bbox: 0.1195 d2.loss_iou: 0.2108 d3.loss_cls: 0.3870 d3.loss_bbox: 0.1179 d3.loss_iou: 0.2070 d4.loss_cls: 0.3815 d4.loss_bbox: 0.1177 d4.loss_iou: 0.2077 enc_loss_cls: 0.4168 enc_loss_bbox: 0.1492 enc_loss_iou: 0.2484 dn_loss_cls: 0.2200 dn_loss_bbox: 0.1483 dn_loss_iou: 0.1876 d0.dn_loss_cls: 0.2893 d0.dn_loss_bbox: 0.2931 d0.dn_loss_iou: 0.3268 d1.dn_loss_cls: 0.2477 d1.dn_loss_bbox: 0.1839 d1.dn_loss_iou: 0.2191 d2.dn_loss_cls: 0.2337 d2.dn_loss_bbox: 0.1647 d2.dn_loss_iou: 0.1991 d3.dn_loss_cls: 0.2280 d3.dn_loss_bbox: 0.1518 d3.dn_loss_iou: 0.1907 d4.dn_loss_cls: 0.2231 d4.dn_loss_bbox: 0.1483 d4.dn_loss_iou: 0.1877 d1.loss_lmm_region: 0.3442 loss_lmm_image: 1.0858 2024/11/10 10:31:14 - mmengine - INFO - Iter(train) [ 1500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 11:01:05 time: 2.0145 data_time: 0.0164 memory: 34190 grad_norm: 24.9868 loss: 10.9792 loss_cls: 0.3499 loss_bbox: 0.1476 loss_iou: 0.2622 d0.loss_cls: 0.4063 d0.loss_bbox: 0.1535 d0.loss_iou: 0.2751 d1.loss_cls: 0.3786 d1.loss_bbox: 0.1484 d1.loss_iou: 0.2664 d2.loss_cls: 0.3662 d2.loss_bbox: 0.1434 d2.loss_iou: 0.2639 d3.loss_cls: 0.3578 d3.loss_bbox: 0.1440 d3.loss_iou: 0.2621 d4.loss_cls: 0.3529 d4.loss_bbox: 0.1469 d4.loss_iou: 0.2628 enc_loss_cls: 0.4039 enc_loss_bbox: 0.1704 enc_loss_iou: 0.3026 dn_loss_cls: 0.1525 dn_loss_bbox: 0.1915 dn_loss_iou: 0.2384 d0.dn_loss_cls: 0.2303 d0.dn_loss_bbox: 0.3540 d0.dn_loss_iou: 0.3998 d1.dn_loss_cls: 0.1767 d1.dn_loss_bbox: 0.2249 d1.dn_loss_iou: 0.2737 d2.dn_loss_cls: 0.1612 d2.dn_loss_bbox: 0.2008 d2.dn_loss_iou: 0.2488 d3.dn_loss_cls: 0.1553 d3.dn_loss_bbox: 0.1929 d3.dn_loss_iou: 0.2407 d4.dn_loss_cls: 0.1529 d4.dn_loss_bbox: 0.1916 d4.dn_loss_iou: 0.2384 d1.loss_lmm_region: 0.3194 loss_lmm_image: 1.0705 2024/11/10 10:34:33 - mmengine - INFO - Iter(train) [ 1600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:55:13 time: 1.9968 data_time: 0.0166 memory: 34543 grad_norm: 28.0744 loss: 9.9871 loss_cls: 0.3577 loss_bbox: 0.0925 loss_iou: 0.1864 d0.loss_cls: 0.4053 d0.loss_bbox: 0.1102 d0.loss_iou: 0.2048 d1.loss_cls: 0.3777 d1.loss_bbox: 0.0999 d1.loss_iou: 0.1916 d2.loss_cls: 0.3717 d2.loss_bbox: 0.0981 d2.loss_iou: 0.1866 d3.loss_cls: 0.3633 d3.loss_bbox: 0.0933 d3.loss_iou: 0.1875 d4.loss_cls: 0.3612 d4.loss_bbox: 0.0924 d4.loss_iou: 0.1869 enc_loss_cls: 0.4043 enc_loss_bbox: 0.1161 enc_loss_iou: 0.2231 dn_loss_cls: 0.2120 dn_loss_bbox: 0.1445 dn_loss_iou: 0.1963 d0.dn_loss_cls: 0.2959 d0.dn_loss_bbox: 0.3000 d0.dn_loss_iou: 0.3472 d1.dn_loss_cls: 0.2469 d1.dn_loss_bbox: 0.1815 d1.dn_loss_iou: 0.2328 d2.dn_loss_cls: 0.2236 d2.dn_loss_bbox: 0.1552 d2.dn_loss_iou: 0.2064 d3.dn_loss_cls: 0.2176 d3.dn_loss_bbox: 0.1466 d3.dn_loss_iou: 0.1993 d4.dn_loss_cls: 0.2148 d4.dn_loss_bbox: 0.1445 d4.dn_loss_iou: 0.1962 d1.loss_lmm_region: 0.3173 loss_lmm_image: 1.0980 2024/11/10 10:37:53 - mmengine - INFO - Iter(train) [ 1700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:49:56 time: 1.9672 data_time: 0.0169 memory: 34414 grad_norm: 28.0535 loss: 10.4232 loss_cls: 0.3233 loss_bbox: 0.1268 loss_iou: 0.2249 d0.loss_cls: 0.3712 d0.loss_bbox: 0.1428 d0.loss_iou: 0.2469 d1.loss_cls: 0.3380 
d1.loss_bbox: 0.1280 d1.loss_iou: 0.2335 d2.loss_cls: 0.3248 d2.loss_bbox: 0.1311 d2.loss_iou: 0.2336 d3.loss_cls: 0.3246 d3.loss_bbox: 0.1279 d3.loss_iou: 0.2285 d4.loss_cls: 0.3251 d4.loss_bbox: 0.1253 d4.loss_iou: 0.2242 enc_loss_cls: 0.3670 enc_loss_bbox: 0.1529 enc_loss_iou: 0.2639 dn_loss_cls: 0.1823 dn_loss_bbox: 0.1911 dn_loss_iou: 0.2200 d0.dn_loss_cls: 0.2403 d0.dn_loss_bbox: 0.3532 d0.dn_loss_iou: 0.3713 d1.dn_loss_cls: 0.2027 d1.dn_loss_bbox: 0.2312 d1.dn_loss_iou: 0.2551 d2.dn_loss_cls: 0.1904 d2.dn_loss_bbox: 0.2031 d2.dn_loss_iou: 0.2294 d3.dn_loss_cls: 0.1859 d3.dn_loss_bbox: 0.1935 d3.dn_loss_iou: 0.2223 d4.dn_loss_cls: 0.1846 d4.dn_loss_bbox: 0.1911 d4.dn_loss_iou: 0.2199 d1.loss_lmm_region: 0.3021 loss_lmm_image: 1.0893 2024/11/10 10:41:13 - mmengine - INFO - Iter(train) [ 1800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:45:17 time: 1.9760 data_time: 0.0164 memory: 34190 grad_norm: 25.2449 loss: 12.4726 loss_cls: 0.4489 loss_bbox: 0.1552 loss_iou: 0.2734 d0.loss_cls: 0.5152 d0.loss_bbox: 0.1750 d0.loss_iou: 0.2940 d1.loss_cls: 0.4715 d1.loss_bbox: 0.1653 d1.loss_iou: 0.2860 d2.loss_cls: 0.4594 d2.loss_bbox: 0.1601 d2.loss_iou: 0.2742 d3.loss_cls: 0.4533 d3.loss_bbox: 0.1533 d3.loss_iou: 0.2719 d4.loss_cls: 0.4469 d4.loss_bbox: 0.1578 d4.loss_iou: 0.2747 enc_loss_cls: 0.4996 enc_loss_bbox: 0.1879 enc_loss_iou: 0.3179 dn_loss_cls: 0.2312 dn_loss_bbox: 0.2075 dn_loss_iou: 0.2350 d0.dn_loss_cls: 0.3089 d0.dn_loss_bbox: 0.3812 d0.dn_loss_iou: 0.3934 d1.dn_loss_cls: 0.2639 d1.dn_loss_bbox: 0.2463 d1.dn_loss_iou: 0.2735 d2.dn_loss_cls: 0.2453 d2.dn_loss_bbox: 0.2183 d2.dn_loss_iou: 0.2469 d3.dn_loss_cls: 0.2367 d3.dn_loss_bbox: 0.2083 d3.dn_loss_iou: 0.2380 d4.dn_loss_cls: 0.2311 d4.dn_loss_bbox: 0.2076 d4.dn_loss_iou: 0.2350 d1.loss_lmm_region: 0.3283 loss_lmm_image: 1.0949 2024/11/10 10:44:33 - mmengine - INFO - Iter(train) [ 1900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:39:53 time: 1.9932 data_time: 0.0166 memory: 34695 grad_norm: 24.0360 loss: 10.9094 loss_cls: 0.3531 loss_bbox: 0.1402 loss_iou: 0.2511 d0.loss_cls: 0.4145 d0.loss_bbox: 0.1492 d0.loss_iou: 0.2652 d1.loss_cls: 0.3746 d1.loss_bbox: 0.1454 d1.loss_iou: 0.2600 d2.loss_cls: 0.3659 d2.loss_bbox: 0.1389 d2.loss_iou: 0.2500 d3.loss_cls: 0.3576 d3.loss_bbox: 0.1383 d3.loss_iou: 0.2509 d4.loss_cls: 0.3524 d4.loss_bbox: 0.1402 d4.loss_iou: 0.2531 enc_loss_cls: 0.4014 enc_loss_bbox: 0.1675 enc_loss_iou: 0.2888 dn_loss_cls: 0.1540 dn_loss_bbox: 0.1893 dn_loss_iou: 0.2296 d0.dn_loss_cls: 0.2494 d0.dn_loss_bbox: 0.3610 d0.dn_loss_iou: 0.3980 d1.dn_loss_cls: 0.1964 d1.dn_loss_bbox: 0.2322 d1.dn_loss_iou: 0.2712 d2.dn_loss_cls: 0.1719 d2.dn_loss_bbox: 0.2042 d2.dn_loss_iou: 0.2425 d3.dn_loss_cls: 0.1610 d3.dn_loss_bbox: 0.1916 d3.dn_loss_iou: 0.2332 d4.dn_loss_cls: 0.1557 d4.dn_loss_bbox: 0.1891 d4.dn_loss_iou: 0.2296 d1.loss_lmm_region: 0.3100 loss_lmm_image: 1.0812 2024/11/10 10:47:49 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 10:47:49 - mmengine - INFO - Iter(train) [ 2000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:31:14 time: 1.9763 data_time: 0.0164 memory: 34586 grad_norm: 23.5537 loss: 11.2691 loss_cls: 0.3799 loss_bbox: 0.1447 loss_iou: 0.2706 d0.loss_cls: 0.4298 d0.loss_bbox: 0.1517 d0.loss_iou: 0.2778 d1.loss_cls: 0.3946 d1.loss_bbox: 0.1534 d1.loss_iou: 0.2814 d2.loss_cls: 0.3823 d2.loss_bbox: 0.1477 d2.loss_iou: 0.2742 d3.loss_cls: 0.3816 d3.loss_bbox: 0.1456 d3.loss_iou: 0.2710 d4.loss_cls: 0.3759 d4.loss_bbox: 0.1459 
d4.loss_iou: 0.2714 enc_loss_cls: 0.4130 enc_loss_bbox: 0.1674 enc_loss_iou: 0.3003 dn_loss_cls: 0.2055 dn_loss_bbox: 0.1513 dn_loss_iou: 0.2349 d0.dn_loss_cls: 0.2971 d0.dn_loss_bbox: 0.3069 d0.dn_loss_iou: 0.3849 d1.dn_loss_cls: 0.2383 d1.dn_loss_bbox: 0.1838 d1.dn_loss_iou: 0.2671 d2.dn_loss_cls: 0.2183 d2.dn_loss_bbox: 0.1614 d2.dn_loss_iou: 0.2450 d3.dn_loss_cls: 0.2104 d3.dn_loss_bbox: 0.1537 d3.dn_loss_iou: 0.2377 d4.dn_loss_cls: 0.2031 d4.dn_loss_bbox: 0.1513 d4.dn_loss_iou: 0.2349 d1.loss_lmm_region: 0.3229 loss_lmm_image: 1.1005 2024/11/10 10:51:06 - mmengine - INFO - Iter(train) [ 2100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:23:27 time: 1.9383 data_time: 0.0166 memory: 33506 grad_norm: 26.4876 loss: 11.7211 loss_cls: 0.4122 loss_bbox: 0.1515 loss_iou: 0.2831 d0.loss_cls: 0.4658 d0.loss_bbox: 0.1669 d0.loss_iou: 0.3008 d1.loss_cls: 0.4312 d1.loss_bbox: 0.1549 d1.loss_iou: 0.2903 d2.loss_cls: 0.4199 d2.loss_bbox: 0.1541 d2.loss_iou: 0.2868 d3.loss_cls: 0.4168 d3.loss_bbox: 0.1517 d3.loss_iou: 0.2831 d4.loss_cls: 0.4117 d4.loss_bbox: 0.1529 d4.loss_iou: 0.2836 enc_loss_cls: 0.4520 enc_loss_bbox: 0.1842 enc_loss_iou: 0.3260 dn_loss_cls: 0.1951 dn_loss_bbox: 0.1660 dn_loss_iou: 0.2330 d0.dn_loss_cls: 0.2802 d0.dn_loss_bbox: 0.3228 d0.dn_loss_iou: 0.3950 d1.dn_loss_cls: 0.2255 d1.dn_loss_bbox: 0.2024 d1.dn_loss_iou: 0.2681 d2.dn_loss_cls: 0.2044 d2.dn_loss_bbox: 0.1781 d2.dn_loss_iou: 0.2435 d3.dn_loss_cls: 0.1978 d3.dn_loss_bbox: 0.1678 d3.dn_loss_iou: 0.2358 d4.dn_loss_cls: 0.1947 d4.dn_loss_bbox: 0.1661 d4.dn_loss_iou: 0.2331 d1.loss_lmm_region: 0.3311 loss_lmm_image: 1.1014 2024/11/10 10:54:26 - mmengine - INFO - Iter(train) [ 2200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:19:18 time: 2.0044 data_time: 0.0166 memory: 34202 grad_norm: 26.4487 loss: 11.4798 loss_cls: 0.4072 loss_bbox: 0.1484 loss_iou: 0.2824 d0.loss_cls: 0.4695 d0.loss_bbox: 0.1593 d0.loss_iou: 0.2967 d1.loss_cls: 0.4356 d1.loss_bbox: 0.1536 d1.loss_iou: 0.2826 d2.loss_cls: 0.4227 d2.loss_bbox: 0.1471 d2.loss_iou: 0.2780 d3.loss_cls: 0.4101 d3.loss_bbox: 0.1498 d3.loss_iou: 0.2820 d4.loss_cls: 0.4077 d4.loss_bbox: 0.1485 d4.loss_iou: 0.2812 enc_loss_cls: 0.4484 enc_loss_bbox: 0.1796 enc_loss_iou: 0.3247 dn_loss_cls: 0.1928 dn_loss_bbox: 0.1585 dn_loss_iou: 0.2228 d0.dn_loss_cls: 0.2693 d0.dn_loss_bbox: 0.3241 d0.dn_loss_iou: 0.3820 d1.dn_loss_cls: 0.2208 d1.dn_loss_bbox: 0.1975 d1.dn_loss_iou: 0.2592 d2.dn_loss_cls: 0.2017 d2.dn_loss_bbox: 0.1699 d2.dn_loss_iou: 0.2343 d3.dn_loss_cls: 0.1964 d3.dn_loss_bbox: 0.1609 d3.dn_loss_iou: 0.2259 d4.dn_loss_cls: 0.1935 d4.dn_loss_bbox: 0.1585 d4.dn_loss_iou: 0.2228 d1.loss_lmm_region: 0.2972 loss_lmm_image: 1.0763 2024/11/10 10:57:41 - mmengine - INFO - Iter(train) [ 2300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:10:36 time: 1.9514 data_time: 0.0165 memory: 33810 grad_norm: 26.8801 loss: 10.6212 loss_cls: 0.3395 loss_bbox: 0.1339 loss_iou: 0.2304 d0.loss_cls: 0.3959 d0.loss_bbox: 0.1496 d0.loss_iou: 0.2439 d1.loss_cls: 0.3600 d1.loss_bbox: 0.1374 d1.loss_iou: 0.2344 d2.loss_cls: 0.3503 d2.loss_bbox: 0.1342 d2.loss_iou: 0.2323 d3.loss_cls: 0.3417 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2308 d4.loss_cls: 0.3388 d4.loss_bbox: 0.1336 d4.loss_iou: 0.2296 enc_loss_cls: 0.3778 enc_loss_bbox: 0.1661 enc_loss_iou: 0.2655 dn_loss_cls: 0.2008 dn_loss_bbox: 0.1725 dn_loss_iou: 0.2174 d0.dn_loss_cls: 0.2892 d0.dn_loss_bbox: 0.3453 d0.dn_loss_iou: 0.3753 d1.dn_loss_cls: 0.2373 d1.dn_loss_bbox: 0.2111 d1.dn_loss_iou: 0.2505 d2.dn_loss_cls: 
0.2148 d2.dn_loss_bbox: 0.1843 d2.dn_loss_iou: 0.2282 d3.dn_loss_cls: 0.2058 d3.dn_loss_bbox: 0.1747 d3.dn_loss_iou: 0.2201 d4.dn_loss_cls: 0.2021 d4.dn_loss_bbox: 0.1725 d4.dn_loss_iou: 0.2174 d1.loss_lmm_region: 0.3066 loss_lmm_image: 1.0342 2024/11/10 11:00:59 - mmengine - INFO - Iter(train) [ 2400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:04:43 time: 1.9792 data_time: 0.0166 memory: 34215 grad_norm: 27.1951 loss: 11.3301 loss_cls: 0.3920 loss_bbox: 0.1256 loss_iou: 0.2659 d0.loss_cls: 0.4403 d0.loss_bbox: 0.1402 d0.loss_iou: 0.2843 d1.loss_cls: 0.4153 d1.loss_bbox: 0.1323 d1.loss_iou: 0.2726 d2.loss_cls: 0.3999 d2.loss_bbox: 0.1349 d2.loss_iou: 0.2706 d3.loss_cls: 0.3944 d3.loss_bbox: 0.1280 d3.loss_iou: 0.2676 d4.loss_cls: 0.3927 d4.loss_bbox: 0.1257 d4.loss_iou: 0.2662 enc_loss_cls: 0.4300 enc_loss_bbox: 0.1618 enc_loss_iou: 0.3087 dn_loss_cls: 0.2191 dn_loss_bbox: 0.1674 dn_loss_iou: 0.2287 d0.dn_loss_cls: 0.2942 d0.dn_loss_bbox: 0.3146 d0.dn_loss_iou: 0.3794 d1.dn_loss_cls: 0.2537 d1.dn_loss_bbox: 0.2018 d1.dn_loss_iou: 0.2653 d2.dn_loss_cls: 0.2315 d2.dn_loss_bbox: 0.1762 d2.dn_loss_iou: 0.2391 d3.dn_loss_cls: 0.2231 d3.dn_loss_bbox: 0.1692 d3.dn_loss_iou: 0.2309 d4.dn_loss_cls: 0.2186 d4.dn_loss_bbox: 0.1672 d4.dn_loss_iou: 0.2286 d1.loss_lmm_region: 0.3178 loss_lmm_image: 1.0544 2024/11/10 11:04:18 - mmengine - INFO - Iter(train) [ 2500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 10:00:23 time: 1.9954 data_time: 0.0166 memory: 33628 grad_norm: 27.1482 loss: 11.1207 loss_cls: 0.3454 loss_bbox: 0.1317 loss_iou: 0.2324 d0.loss_cls: 0.3985 d0.loss_bbox: 0.1378 d0.loss_iou: 0.2427 d1.loss_cls: 0.3629 d1.loss_bbox: 0.1385 d1.loss_iou: 0.2371 d2.loss_cls: 0.3596 d2.loss_bbox: 0.1309 d2.loss_iou: 0.2321 d3.loss_cls: 0.3511 d3.loss_bbox: 0.1309 d3.loss_iou: 0.2331 d4.loss_cls: 0.3482 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2326 enc_loss_cls: 0.3864 enc_loss_bbox: 0.1568 enc_loss_iou: 0.2686 dn_loss_cls: 0.2164 dn_loss_bbox: 0.2047 dn_loss_iou: 0.2343 d0.dn_loss_cls: 0.3034 d0.dn_loss_bbox: 0.3866 d0.dn_loss_iou: 0.4012 d1.dn_loss_cls: 0.2534 d1.dn_loss_bbox: 0.2419 d1.dn_loss_iou: 0.2725 d2.dn_loss_cls: 0.2309 d2.dn_loss_bbox: 0.2209 d2.dn_loss_iou: 0.2485 d3.dn_loss_cls: 0.2209 d3.dn_loss_bbox: 0.2068 d3.dn_loss_iou: 0.2376 d4.dn_loss_cls: 0.2183 d4.dn_loss_bbox: 0.2047 d4.dn_loss_iou: 0.2343 d1.loss_lmm_region: 0.3141 loss_lmm_image: 1.0813 2024/11/10 11:07:39 - mmengine - INFO - Iter(train) [ 2600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:57:50 time: 2.0172 data_time: 0.0166 memory: 34388 grad_norm: 25.6469 loss: 10.4601 loss_cls: 0.3431 loss_bbox: 0.1338 loss_iou: 0.2555 d0.loss_cls: 0.4043 d0.loss_bbox: 0.1456 d0.loss_iou: 0.2724 d1.loss_cls: 0.3675 d1.loss_bbox: 0.1409 d1.loss_iou: 0.2654 d2.loss_cls: 0.3546 d2.loss_bbox: 0.1357 d2.loss_iou: 0.2588 d3.loss_cls: 0.3478 d3.loss_bbox: 0.1326 d3.loss_iou: 0.2573 d4.loss_cls: 0.3473 d4.loss_bbox: 0.1340 d4.loss_iou: 0.2558 enc_loss_cls: 0.3918 enc_loss_bbox: 0.1630 enc_loss_iou: 0.3014 dn_loss_cls: 0.1455 dn_loss_bbox: 0.1650 dn_loss_iou: 0.2309 d0.dn_loss_cls: 0.2207 d0.dn_loss_bbox: 0.3225 d0.dn_loss_iou: 0.3883 d1.dn_loss_cls: 0.1730 d1.dn_loss_bbox: 0.2078 d1.dn_loss_iou: 0.2693 d2.dn_loss_cls: 0.1556 d2.dn_loss_bbox: 0.1786 d2.dn_loss_iou: 0.2439 d3.dn_loss_cls: 0.1489 d3.dn_loss_bbox: 0.1672 d3.dn_loss_iou: 0.2339 d4.dn_loss_cls: 0.1450 d4.dn_loss_bbox: 0.1650 d4.dn_loss_iou: 0.2309 d1.loss_lmm_region: 0.2315 loss_lmm_image: 1.0280 2024/11/10 11:10:59 - mmengine - INFO - Iter(train) [ 
2700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:54:15 time: 2.0152 data_time: 0.0164 memory: 34194 grad_norm: 24.6290 loss: 11.1565 loss_cls: 0.3630 loss_bbox: 0.1491 loss_iou: 0.2836 d0.loss_cls: 0.4187 d0.loss_bbox: 0.1638 d0.loss_iou: 0.2982 d1.loss_cls: 0.3846 d1.loss_bbox: 0.1549 d1.loss_iou: 0.2915 d2.loss_cls: 0.3735 d2.loss_bbox: 0.1530 d2.loss_iou: 0.2838 d3.loss_cls: 0.3667 d3.loss_bbox: 0.1509 d3.loss_iou: 0.2822 d4.loss_cls: 0.3642 d4.loss_bbox: 0.1481 d4.loss_iou: 0.2828 enc_loss_cls: 0.4188 enc_loss_bbox: 0.1813 enc_loss_iou: 0.3230 dn_loss_cls: 0.1580 dn_loss_bbox: 0.1745 dn_loss_iou: 0.2313 d0.dn_loss_cls: 0.2290 d0.dn_loss_bbox: 0.3493 d0.dn_loss_iou: 0.3894 d1.dn_loss_cls: 0.1845 d1.dn_loss_bbox: 0.2168 d1.dn_loss_iou: 0.2683 d2.dn_loss_cls: 0.1674 d2.dn_loss_bbox: 0.1865 d2.dn_loss_iou: 0.2421 d3.dn_loss_cls: 0.1622 d3.dn_loss_bbox: 0.1781 d3.dn_loss_iou: 0.2349 d4.dn_loss_cls: 0.1596 d4.dn_loss_bbox: 0.1745 d4.dn_loss_iou: 0.2313 d1.loss_lmm_region: 0.2675 loss_lmm_image: 1.1158 2024/11/10 11:14:17 - mmengine - INFO - Iter(train) [ 2800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:48:28 time: 1.9690 data_time: 0.0163 memory: 35274 grad_norm: 26.4086 loss: 10.6222 loss_cls: 0.3354 loss_bbox: 0.1360 loss_iou: 0.2611 d0.loss_cls: 0.3946 d0.loss_bbox: 0.1496 d0.loss_iou: 0.2707 d1.loss_cls: 0.3544 d1.loss_bbox: 0.1446 d1.loss_iou: 0.2651 d2.loss_cls: 0.3445 d2.loss_bbox: 0.1377 d2.loss_iou: 0.2608 d3.loss_cls: 0.3386 d3.loss_bbox: 0.1382 d3.loss_iou: 0.2612 d4.loss_cls: 0.3360 d4.loss_bbox: 0.1360 d4.loss_iou: 0.2612 enc_loss_cls: 0.3896 enc_loss_bbox: 0.1569 enc_loss_iou: 0.2905 dn_loss_cls: 0.1666 dn_loss_bbox: 0.1697 dn_loss_iou: 0.2300 d0.dn_loss_cls: 0.2605 d0.dn_loss_bbox: 0.3063 d0.dn_loss_iou: 0.3812 d1.dn_loss_cls: 0.2027 d1.dn_loss_bbox: 0.2001 d1.dn_loss_iou: 0.2631 d2.dn_loss_cls: 0.1802 d2.dn_loss_bbox: 0.1809 d2.dn_loss_iou: 0.2409 d3.dn_loss_cls: 0.1730 d3.dn_loss_bbox: 0.1718 d3.dn_loss_iou: 0.2328 d4.dn_loss_cls: 0.1684 d4.dn_loss_bbox: 0.1697 d4.dn_loss_iou: 0.2300 d1.loss_lmm_region: 0.3035 loss_lmm_image: 1.0283 2024/11/10 11:17:34 - mmengine - INFO - Iter(train) [ 2900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:42:31 time: 1.9617 data_time: 0.0164 memory: 33364 grad_norm: 27.7129 loss: 11.2002 loss_cls: 0.3798 loss_bbox: 0.1390 loss_iou: 0.2620 d0.loss_cls: 0.4425 d0.loss_bbox: 0.1522 d0.loss_iou: 0.2774 d1.loss_cls: 0.4009 d1.loss_bbox: 0.1456 d1.loss_iou: 0.2673 d2.loss_cls: 0.3912 d2.loss_bbox: 0.1387 d2.loss_iou: 0.2628 d3.loss_cls: 0.3816 d3.loss_bbox: 0.1395 d3.loss_iou: 0.2632 d4.loss_cls: 0.3818 d4.loss_bbox: 0.1376 d4.loss_iou: 0.2601 enc_loss_cls: 0.4291 enc_loss_bbox: 0.1737 enc_loss_iou: 0.3084 dn_loss_cls: 0.1846 dn_loss_bbox: 0.1846 dn_loss_iou: 0.2345 d0.dn_loss_cls: 0.2691 d0.dn_loss_bbox: 0.3509 d0.dn_loss_iou: 0.3985 d1.dn_loss_cls: 0.2160 d1.dn_loss_bbox: 0.2243 d1.dn_loss_iou: 0.2721 d2.dn_loss_cls: 0.1961 d2.dn_loss_bbox: 0.1991 d2.dn_loss_iou: 0.2472 d3.dn_loss_cls: 0.1878 d3.dn_loss_bbox: 0.1892 d3.dn_loss_iou: 0.2383 d4.dn_loss_cls: 0.1849 d4.dn_loss_bbox: 0.1847 d4.dn_loss_iou: 0.2345 d1.loss_lmm_region: 0.2899 loss_lmm_image: 0.9793 2024/11/10 11:20:54 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 11:20:54 - mmengine - INFO - Iter(train) [ 3000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:39:48 time: 1.9981 data_time: 0.0166 memory: 34565 grad_norm: 26.6689 loss: 10.7180 loss_cls: 0.3321 loss_bbox: 0.1487 loss_iou: 0.2358 d0.loss_cls: 
0.3956 d0.loss_bbox: 0.1554 d0.loss_iou: 0.2468 d1.loss_cls: 0.3600 d1.loss_bbox: 0.1532 d1.loss_iou: 0.2380 d2.loss_cls: 0.3524 d2.loss_bbox: 0.1472 d2.loss_iou: 0.2333 d3.loss_cls: 0.3394 d3.loss_bbox: 0.1479 d3.loss_iou: 0.2347 d4.loss_cls: 0.3335 d4.loss_bbox: 0.1490 d4.loss_iou: 0.2356 enc_loss_cls: 0.3879 enc_loss_bbox: 0.1751 enc_loss_iou: 0.2683 dn_loss_cls: 0.1527 dn_loss_bbox: 0.2013 dn_loss_iou: 0.2271 d0.dn_loss_cls: 0.2473 d0.dn_loss_bbox: 0.3915 d0.dn_loss_iou: 0.3936 d1.dn_loss_cls: 0.1882 d1.dn_loss_bbox: 0.2465 d1.dn_loss_iou: 0.2650 d2.dn_loss_cls: 0.1659 d2.dn_loss_bbox: 0.2157 d2.dn_loss_iou: 0.2391 d3.dn_loss_cls: 0.1583 d3.dn_loss_bbox: 0.2039 d3.dn_loss_iou: 0.2304 d4.dn_loss_cls: 0.1531 d4.dn_loss_bbox: 0.2013 d4.dn_loss_iou: 0.2271 d1.loss_lmm_region: 0.2769 loss_lmm_image: 1.0632 2024/11/10 11:24:13 - mmengine - INFO - Iter(train) [ 3100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:35:37 time: 1.9911 data_time: 0.0169 memory: 34770 grad_norm: 25.1237 loss: 11.6100 loss_cls: 0.3785 loss_bbox: 0.1751 loss_iou: 0.2983 d0.loss_cls: 0.4353 d0.loss_bbox: 0.1827 d0.loss_iou: 0.3108 d1.loss_cls: 0.4063 d1.loss_bbox: 0.1775 d1.loss_iou: 0.3000 d2.loss_cls: 0.3899 d2.loss_bbox: 0.1755 d2.loss_iou: 0.2976 d3.loss_cls: 0.3867 d3.loss_bbox: 0.1716 d3.loss_iou: 0.2971 d4.loss_cls: 0.3790 d4.loss_bbox: 0.1729 d4.loss_iou: 0.2969 enc_loss_cls: 0.4346 enc_loss_bbox: 0.1898 enc_loss_iou: 0.3289 dn_loss_cls: 0.1591 dn_loss_bbox: 0.1905 dn_loss_iou: 0.2402 d0.dn_loss_cls: 0.2456 d0.dn_loss_bbox: 0.3477 d0.dn_loss_iou: 0.3965 d1.dn_loss_cls: 0.1918 d1.dn_loss_bbox: 0.2185 d1.dn_loss_iou: 0.2723 d2.dn_loss_cls: 0.1725 d2.dn_loss_bbox: 0.1995 d2.dn_loss_iou: 0.2502 d3.dn_loss_cls: 0.1630 d3.dn_loss_bbox: 0.1920 d3.dn_loss_iou: 0.2426 d4.dn_loss_cls: 0.1587 d4.dn_loss_bbox: 0.1905 d4.dn_loss_iou: 0.2401 d1.loss_lmm_region: 0.2840 loss_lmm_image: 1.0696 2024/11/10 11:27:30 - mmengine - INFO - Iter(train) [ 3200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:29:58 time: 1.9818 data_time: 0.0168 memory: 31263 grad_norm: 28.5844 loss: 10.7818 loss_cls: 0.3257 loss_bbox: 0.1366 loss_iou: 0.2258 d0.loss_cls: 0.3779 d0.loss_bbox: 0.1467 d0.loss_iou: 0.2349 d1.loss_cls: 0.3454 d1.loss_bbox: 0.1437 d1.loss_iou: 0.2299 d2.loss_cls: 0.3303 d2.loss_bbox: 0.1417 d2.loss_iou: 0.2294 d3.loss_cls: 0.3298 d3.loss_bbox: 0.1351 d3.loss_iou: 0.2246 d4.loss_cls: 0.3284 d4.loss_bbox: 0.1341 d4.loss_iou: 0.2242 enc_loss_cls: 0.3677 enc_loss_bbox: 0.1640 enc_loss_iou: 0.2585 dn_loss_cls: 0.2308 dn_loss_bbox: 0.1897 dn_loss_iou: 0.2162 d0.dn_loss_cls: 0.3083 d0.dn_loss_bbox: 0.3562 d0.dn_loss_iou: 0.3642 d1.dn_loss_cls: 0.2620 d1.dn_loss_bbox: 0.2272 d1.dn_loss_iou: 0.2505 d2.dn_loss_cls: 0.2425 d2.dn_loss_bbox: 0.2003 d2.dn_loss_iou: 0.2267 d3.dn_loss_cls: 0.2348 d3.dn_loss_bbox: 0.1922 d3.dn_loss_iou: 0.2189 d4.dn_loss_cls: 0.2310 d4.dn_loss_bbox: 0.1898 d4.dn_loss_iou: 0.2163 d1.loss_lmm_region: 0.3013 loss_lmm_image: 1.0881 2024/11/10 11:30:51 - mmengine - INFO - Iter(train) [ 3300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:27:18 time: 1.9836 data_time: 0.0167 memory: 33810 grad_norm: 25.3738 loss: 11.3682 loss_cls: 0.4551 loss_bbox: 0.1357 loss_iou: 0.2429 d0.loss_cls: 0.5206 d0.loss_bbox: 0.1379 d0.loss_iou: 0.2611 d1.loss_cls: 0.4844 d1.loss_bbox: 0.1375 d1.loss_iou: 0.2508 d2.loss_cls: 0.4694 d2.loss_bbox: 0.1299 d2.loss_iou: 0.2435 d3.loss_cls: 0.4640 d3.loss_bbox: 0.1303 d3.loss_iou: 0.2424 d4.loss_cls: 0.4577 d4.loss_bbox: 0.1344 d4.loss_iou: 0.2425 enc_loss_cls: 
0.5108 enc_loss_bbox: 0.1535 enc_loss_iou: 0.2848 dn_loss_cls: 0.2043 dn_loss_bbox: 0.1608 dn_loss_iou: 0.1900 d0.dn_loss_cls: 0.2839 d0.dn_loss_bbox: 0.3505 d0.dn_loss_iou: 0.3472 d1.dn_loss_cls: 0.2357 d1.dn_loss_bbox: 0.2014 d1.dn_loss_iou: 0.2251 d2.dn_loss_cls: 0.2141 d2.dn_loss_bbox: 0.1724 d2.dn_loss_iou: 0.1999 d3.dn_loss_cls: 0.2083 d3.dn_loss_bbox: 0.1628 d3.dn_loss_iou: 0.1921 d4.dn_loss_cls: 0.2035 d4.dn_loss_bbox: 0.1609 d4.dn_loss_iou: 0.1901 d1.loss_lmm_region: 0.3050 loss_lmm_image: 1.0714 2024/11/10 11:34:12 - mmengine - INFO - Iter(train) [ 3400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:24:34 time: 2.0104 data_time: 0.0166 memory: 33711 grad_norm: 27.2256 loss: 11.8665 loss_cls: 0.3531 loss_bbox: 0.1499 loss_iou: 0.2486 d0.loss_cls: 0.4055 d0.loss_bbox: 0.1614 d0.loss_iou: 0.2603 d1.loss_cls: 0.3680 d1.loss_bbox: 0.1567 d1.loss_iou: 0.2591 d2.loss_cls: 0.3591 d2.loss_bbox: 0.1561 d2.loss_iou: 0.2519 d3.loss_cls: 0.3565 d3.loss_bbox: 0.1525 d3.loss_iou: 0.2508 d4.loss_cls: 0.3555 d4.loss_bbox: 0.1495 d4.loss_iou: 0.2488 enc_loss_cls: 0.3943 enc_loss_bbox: 0.1843 enc_loss_iou: 0.2879 dn_loss_cls: 0.2802 dn_loss_bbox: 0.2126 dn_loss_iou: 0.2432 d0.dn_loss_cls: 0.3721 d0.dn_loss_bbox: 0.3997 d0.dn_loss_iou: 0.4057 d1.dn_loss_cls: 0.3254 d1.dn_loss_bbox: 0.2536 d1.dn_loss_iou: 0.2786 d2.dn_loss_cls: 0.3009 d2.dn_loss_bbox: 0.2232 d2.dn_loss_iou: 0.2531 d3.dn_loss_cls: 0.2905 d3.dn_loss_bbox: 0.2148 d3.dn_loss_iou: 0.2459 d4.dn_loss_cls: 0.2833 d4.dn_loss_bbox: 0.2125 d4.dn_loss_iou: 0.2432 d1.loss_lmm_region: 0.3091 loss_lmm_image: 1.0089 2024/11/10 11:37:31 - mmengine - INFO - Iter(train) [ 3500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:20:37 time: 1.9959 data_time: 0.0164 memory: 33885 grad_norm: 26.1764 loss: 9.2899 loss_cls: 0.3238 loss_bbox: 0.0940 loss_iou: 0.2074 d0.loss_cls: 0.3761 d0.loss_bbox: 0.1049 d0.loss_iou: 0.2250 d1.loss_cls: 0.3458 d1.loss_bbox: 0.0986 d1.loss_iou: 0.2130 d2.loss_cls: 0.3301 d2.loss_bbox: 0.0955 d2.loss_iou: 0.2098 d3.loss_cls: 0.3240 d3.loss_bbox: 0.0938 d3.loss_iou: 0.2074 d4.loss_cls: 0.3224 d4.loss_bbox: 0.0938 d4.loss_iou: 0.2061 enc_loss_cls: 0.3709 enc_loss_bbox: 0.1212 enc_loss_iou: 0.2501 dn_loss_cls: 0.1665 dn_loss_bbox: 0.1260 dn_loss_iou: 0.1863 d0.dn_loss_cls: 0.2350 d0.dn_loss_bbox: 0.2659 d0.dn_loss_iou: 0.3285 d1.dn_loss_cls: 0.1900 d1.dn_loss_bbox: 0.1615 d1.dn_loss_iou: 0.2222 d2.dn_loss_cls: 0.1747 d2.dn_loss_bbox: 0.1374 d2.dn_loss_iou: 0.1975 d3.dn_loss_cls: 0.1666 d3.dn_loss_bbox: 0.1293 d3.dn_loss_iou: 0.1894 d4.dn_loss_cls: 0.1673 d4.dn_loss_bbox: 0.1261 d4.dn_loss_iou: 0.1863 d1.loss_lmm_region: 0.2473 loss_lmm_image: 1.0723 2024/11/10 11:40:49 - mmengine - INFO - Iter(train) [ 3600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:16:27 time: 1.9898 data_time: 0.0166 memory: 34584 grad_norm: 29.6661 loss: 11.4517 loss_cls: 0.3930 loss_bbox: 0.1441 loss_iou: 0.2450 d0.loss_cls: 0.4393 d0.loss_bbox: 0.1505 d0.loss_iou: 0.2562 d1.loss_cls: 0.4090 d1.loss_bbox: 0.1522 d1.loss_iou: 0.2532 d2.loss_cls: 0.3987 d2.loss_bbox: 0.1462 d2.loss_iou: 0.2495 d3.loss_cls: 0.3931 d3.loss_bbox: 0.1463 d3.loss_iou: 0.2473 d4.loss_cls: 0.3924 d4.loss_bbox: 0.1443 d4.loss_iou: 0.2447 enc_loss_cls: 0.4362 enc_loss_bbox: 0.1648 enc_loss_iou: 0.2760 dn_loss_cls: 0.2389 dn_loss_bbox: 0.1951 dn_loss_iou: 0.2274 d0.dn_loss_cls: 0.3174 d0.dn_loss_bbox: 0.3562 d0.dn_loss_iou: 0.3792 d1.dn_loss_cls: 0.2671 d1.dn_loss_bbox: 0.2292 d1.dn_loss_iou: 0.2609 d2.dn_loss_cls: 0.2497 d2.dn_loss_bbox: 0.2052 
d2.dn_loss_iou: 0.2373 d3.dn_loss_cls: 0.2401 d3.dn_loss_bbox: 0.1969 d3.dn_loss_iou: 0.2302 d4.dn_loss_cls: 0.2381 d4.dn_loss_bbox: 0.1951 d4.dn_loss_iou: 0.2274 d1.loss_lmm_region: 0.2822 loss_lmm_image: 0.9963 2024/11/10 11:44:08 - mmengine - INFO - Iter(train) [ 3700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:12:21 time: 1.9885 data_time: 0.0167 memory: 34989 grad_norm: 23.9762 loss: 10.0708 loss_cls: 0.3277 loss_bbox: 0.1243 loss_iou: 0.2202 d0.loss_cls: 0.3833 d0.loss_bbox: 0.1367 d0.loss_iou: 0.2346 d1.loss_cls: 0.3489 d1.loss_bbox: 0.1347 d1.loss_iou: 0.2302 d2.loss_cls: 0.3373 d2.loss_bbox: 0.1277 d2.loss_iou: 0.2235 d3.loss_cls: 0.3329 d3.loss_bbox: 0.1228 d3.loss_iou: 0.2192 d4.loss_cls: 0.3295 d4.loss_bbox: 0.1247 d4.loss_iou: 0.2203 enc_loss_cls: 0.3632 enc_loss_bbox: 0.1554 enc_loss_iou: 0.2618 dn_loss_cls: 0.1488 dn_loss_bbox: 0.1737 dn_loss_iou: 0.2158 d0.dn_loss_cls: 0.2393 d0.dn_loss_bbox: 0.3482 d0.dn_loss_iou: 0.3819 d1.dn_loss_cls: 0.1864 d1.dn_loss_bbox: 0.2123 d1.dn_loss_iou: 0.2549 d2.dn_loss_cls: 0.1644 d2.dn_loss_bbox: 0.1881 d2.dn_loss_iou: 0.2297 d3.dn_loss_cls: 0.1545 d3.dn_loss_bbox: 0.1772 d3.dn_loss_iou: 0.2197 d4.dn_loss_cls: 0.1506 d4.dn_loss_bbox: 0.1738 d4.dn_loss_iou: 0.2159 d1.loss_lmm_region: 0.2332 loss_lmm_image: 1.0435 2024/11/10 11:47:27 - mmengine - INFO - Iter(train) [ 3800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:08:36 time: 1.9792 data_time: 0.0165 memory: 33738 grad_norm: 25.4440 loss: 12.7076 loss_cls: 0.4227 loss_bbox: 0.1666 loss_iou: 0.3201 d0.loss_cls: 0.4798 d0.loss_bbox: 0.1750 d0.loss_iou: 0.3335 d1.loss_cls: 0.4412 d1.loss_bbox: 0.1735 d1.loss_iou: 0.3330 d2.loss_cls: 0.4361 d2.loss_bbox: 0.1699 d2.loss_iou: 0.3223 d3.loss_cls: 0.4291 d3.loss_bbox: 0.1669 d3.loss_iou: 0.3190 d4.loss_cls: 0.4227 d4.loss_bbox: 0.1671 d4.loss_iou: 0.3189 enc_loss_cls: 0.4692 enc_loss_bbox: 0.1938 enc_loss_iou: 0.3614 dn_loss_cls: 0.2491 dn_loss_bbox: 0.1881 dn_loss_iou: 0.2614 d0.dn_loss_cls: 0.3311 d0.dn_loss_bbox: 0.3576 d0.dn_loss_iou: 0.4179 d1.dn_loss_cls: 0.2819 d1.dn_loss_bbox: 0.2266 d1.dn_loss_iou: 0.2976 d2.dn_loss_cls: 0.2647 d2.dn_loss_bbox: 0.2002 d2.dn_loss_iou: 0.2719 d3.dn_loss_cls: 0.2588 d3.dn_loss_bbox: 0.1896 d3.dn_loss_iou: 0.2640 d4.dn_loss_cls: 0.2511 d4.dn_loss_bbox: 0.1882 d4.dn_loss_iou: 0.2614 d1.loss_lmm_region: 0.2916 loss_lmm_image: 1.0335 2024/11/10 11:50:47 - mmengine - INFO - Iter(train) [ 3900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:05:26 time: 1.9961 data_time: 0.0167 memory: 35204 grad_norm: 28.3786 loss: 12.0597 loss_cls: 0.4007 loss_bbox: 0.1529 loss_iou: 0.2646 d0.loss_cls: 0.4438 d0.loss_bbox: 0.1723 d0.loss_iou: 0.2831 d1.loss_cls: 0.4115 d1.loss_bbox: 0.1601 d1.loss_iou: 0.2705 d2.loss_cls: 0.4033 d2.loss_bbox: 0.1593 d2.loss_iou: 0.2679 d3.loss_cls: 0.3962 d3.loss_bbox: 0.1598 d3.loss_iou: 0.2676 d4.loss_cls: 0.3941 d4.loss_bbox: 0.1585 d4.loss_iou: 0.2678 enc_loss_cls: 0.4286 enc_loss_bbox: 0.1812 enc_loss_iou: 0.2977 dn_loss_cls: 0.2499 dn_loss_bbox: 0.2017 dn_loss_iou: 0.2582 d0.dn_loss_cls: 0.3188 d0.dn_loss_bbox: 0.3593 d0.dn_loss_iou: 0.4140 d1.dn_loss_cls: 0.2742 d1.dn_loss_bbox: 0.2340 d1.dn_loss_iou: 0.2907 d2.dn_loss_cls: 0.2596 d2.dn_loss_bbox: 0.2126 d2.dn_loss_iou: 0.2674 d3.dn_loss_cls: 0.2518 d3.dn_loss_bbox: 0.2039 d3.dn_loss_iou: 0.2606 d4.dn_loss_cls: 0.2487 d4.dn_loss_bbox: 0.2018 d4.dn_loss_iou: 0.2582 d1.loss_lmm_region: 0.3248 loss_lmm_image: 1.0278 2024/11/10 11:54:07 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 
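
In each Iter(train) record above, the scalar `loss` is, in the standard MMEngine setup, simply the sum of all the individual terms that follow it (the per-decoder-layer `d*.loss_*` terms, the encoder `enc_loss_*` terms, the denoising `dn_loss_*`/`d*.dn_loss_*` terms, and the two LMM terms), since `BaseModel.parse_losses` adds up every logged entry whose key contains 'loss'. A minimal sketch of that reduction, with placeholder values rather than numbers copied from any iteration:

import torch

# Placeholder loss dict in the same shape as the log entries above; the
# values are made up for illustration only.
losses = {
    'loss_cls': torch.tensor(0.38),
    'loss_bbox': torch.tensor(0.15),
    'loss_iou': torch.tensor(0.24),
    'd0.loss_cls': torch.tensor(0.43),
    'dn_loss_bbox': torch.tensor(0.20),
    'd1.loss_lmm_region': torch.tensor(0.40),
    'loss_lmm_image': torch.tensor(1.10),
    'acc': torch.tensor(0.91),  # keys without 'loss' are logged but not summed
}

# parse_losses-style reduction: everything whose key contains 'loss' is summed
# into the single scalar that is back-propagated and printed as `loss: ...`.
total_loss = sum(v for k, v in losses.items() if 'loss' in k)
print(float(total_loss))  # ~2.90
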
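
The `grad_norm` field in the same records is the global L2 norm over all parameter gradients, typically reported by the optimizer wrapper's gradient-clipping step; the `nan` values at iterations 200 and 700 earlier in this log indicate the gradients were not finite on those steps. A small sketch of how such a norm can be recomputed and checked, assuming a generic `model`; this is illustrative, not the exact code the runner uses:

import torch

def global_grad_norm(model: torch.nn.Module) -> torch.Tensor:
    """L2 norm over all parameter gradients, analogous to the logged grad_norm."""
    grads = [p.grad.detach().flatten()
             for p in model.parameters() if p.grad is not None]
    if not grads:
        return torch.tensor(0.0)
    return torch.linalg.vector_norm(torch.cat(grads), ord=2)

# Example check after loss.backward() and before optimizer.step():
# norm = global_grad_norm(model)
# if not torch.isfinite(norm):
#     pass  # e.g. skip the step, as a nan norm like iters 200/700 suggests
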
2024/11/10 11:54:07 - mmengine - INFO - Iter(train) [ 4000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 9:01:55 time: 1.9862 data_time: 0.0167 memory: 32287 grad_norm: 25.7166 loss: 11.0142 loss_cls: 0.3886 loss_bbox: 0.1387 loss_iou: 0.2705 d0.loss_cls: 0.4484 d0.loss_bbox: 0.1583 d0.loss_iou: 0.2947 d1.loss_cls: 0.4116 d1.loss_bbox: 0.1499 d1.loss_iou: 0.2808 d2.loss_cls: 0.3981 d2.loss_bbox: 0.1438 d2.loss_iou: 0.2756 d3.loss_cls: 0.3892 d3.loss_bbox: 0.1427 d3.loss_iou: 0.2728 d4.loss_cls: 0.3874 d4.loss_bbox: 0.1417 d4.loss_iou: 0.2727 enc_loss_cls: 0.4357 enc_loss_bbox: 0.1753 enc_loss_iou: 0.3196 dn_loss_cls: 0.1594 dn_loss_bbox: 0.1627 dn_loss_iou: 0.2182 d0.dn_loss_cls: 0.2497 d0.dn_loss_bbox: 0.3139 d0.dn_loss_iou: 0.3726 d1.dn_loss_cls: 0.1940 d1.dn_loss_bbox: 0.1978 d1.dn_loss_iou: 0.2553 d2.dn_loss_cls: 0.1735 d2.dn_loss_bbox: 0.1748 d2.dn_loss_iou: 0.2302 d3.dn_loss_cls: 0.1652 d3.dn_loss_bbox: 0.1653 d3.dn_loss_iou: 0.2216 d4.dn_loss_cls: 0.1582 d4.dn_loss_bbox: 0.1627 d4.dn_loss_iou: 0.2182 d1.loss_lmm_region: 0.2705 loss_lmm_image: 1.0544 2024/11/10 11:57:26 - mmengine - INFO - Iter(train) [ 4100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:58:17 time: 1.9826 data_time: 0.0165 memory: 34675 grad_norm: 25.9645 loss: 10.7739 loss_cls: 0.3563 loss_bbox: 0.1140 loss_iou: 0.1879 d0.loss_cls: 0.4050 d0.loss_bbox: 0.1387 d0.loss_iou: 0.2196 d1.loss_cls: 0.3708 d1.loss_bbox: 0.1248 d1.loss_iou: 0.2017 d2.loss_cls: 0.3684 d2.loss_bbox: 0.1156 d2.loss_iou: 0.1889 d3.loss_cls: 0.3569 d3.loss_bbox: 0.1171 d3.loss_iou: 0.1886 d4.loss_cls: 0.3541 d4.loss_bbox: 0.1158 d4.loss_iou: 0.1884 enc_loss_cls: 0.3931 enc_loss_bbox: 0.1479 enc_loss_iou: 0.2339 dn_loss_cls: 0.2589 dn_loss_bbox: 0.1915 dn_loss_iou: 0.2092 d0.dn_loss_cls: 0.3278 d0.dn_loss_bbox: 0.3743 d0.dn_loss_iou: 0.3728 d1.dn_loss_cls: 0.2871 d1.dn_loss_bbox: 0.2346 d1.dn_loss_iou: 0.2499 d2.dn_loss_cls: 0.2714 d2.dn_loss_bbox: 0.2049 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.2646 d3.dn_loss_bbox: 0.1932 d3.dn_loss_iou: 0.2118 d4.dn_loss_cls: 0.2586 d4.dn_loss_bbox: 0.1915 d4.dn_loss_iou: 0.2091 d1.loss_lmm_region: 0.2635 loss_lmm_image: 1.0898 2024/11/10 12:00:47 - mmengine - INFO - Iter(train) [ 4200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:55:25 time: 1.9785 data_time: 0.0163 memory: 33766 grad_norm: 24.6504 loss: 10.5925 loss_cls: 0.3382 loss_bbox: 0.1316 loss_iou: 0.2191 d0.loss_cls: 0.3880 d0.loss_bbox: 0.1415 d0.loss_iou: 0.2307 d1.loss_cls: 0.3548 d1.loss_bbox: 0.1405 d1.loss_iou: 0.2271 d2.loss_cls: 0.3483 d2.loss_bbox: 0.1334 d2.loss_iou: 0.2197 d3.loss_cls: 0.3430 d3.loss_bbox: 0.1309 d3.loss_iou: 0.2199 d4.loss_cls: 0.3420 d4.loss_bbox: 0.1302 d4.loss_iou: 0.2184 enc_loss_cls: 0.3781 enc_loss_bbox: 0.1577 enc_loss_iou: 0.2525 dn_loss_cls: 0.1900 dn_loss_bbox: 0.1833 dn_loss_iou: 0.2258 d0.dn_loss_cls: 0.2822 d0.dn_loss_bbox: 0.3743 d0.dn_loss_iou: 0.3892 d1.dn_loss_cls: 0.2276 d1.dn_loss_bbox: 0.2233 d1.dn_loss_iou: 0.2634 d2.dn_loss_cls: 0.2024 d2.dn_loss_bbox: 0.1962 d2.dn_loss_iou: 0.2377 d3.dn_loss_cls: 0.1944 d3.dn_loss_bbox: 0.1863 d3.dn_loss_iou: 0.2290 d4.dn_loss_cls: 0.1899 d4.dn_loss_bbox: 0.1835 d4.dn_loss_iou: 0.2258 d1.loss_lmm_region: 0.2797 loss_lmm_image: 1.0629 2024/11/10 12:04:06 - mmengine - INFO - Iter(train) [ 4300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:51:50 time: 1.9989 data_time: 0.0165 memory: 31838 grad_norm: 24.4746 loss: 9.7477 loss_cls: 0.3149 loss_bbox: 0.1153 loss_iou: 0.2036 d0.loss_cls: 0.3593 d0.loss_bbox: 0.1311 d0.loss_iou: 
0.2229 d1.loss_cls: 0.3288 d1.loss_bbox: 0.1213 d1.loss_iou: 0.2139 d2.loss_cls: 0.3208 d2.loss_bbox: 0.1159 d2.loss_iou: 0.2057 d3.loss_cls: 0.3147 d3.loss_bbox: 0.1174 d3.loss_iou: 0.2041 d4.loss_cls: 0.3157 d4.loss_bbox: 0.1169 d4.loss_iou: 0.2041 enc_loss_cls: 0.3499 enc_loss_bbox: 0.1427 enc_loss_iou: 0.2479 dn_loss_cls: 0.1635 dn_loss_bbox: 0.1677 dn_loss_iou: 0.2099 d0.dn_loss_cls: 0.2395 d0.dn_loss_bbox: 0.3300 d0.dn_loss_iou: 0.3637 d1.dn_loss_cls: 0.1968 d1.dn_loss_bbox: 0.2081 d1.dn_loss_iou: 0.2472 d2.dn_loss_cls: 0.1762 d2.dn_loss_bbox: 0.1790 d2.dn_loss_iou: 0.2217 d3.dn_loss_cls: 0.1665 d3.dn_loss_bbox: 0.1700 d3.dn_loss_iou: 0.2129 d4.dn_loss_cls: 0.1644 d4.dn_loss_bbox: 0.1677 d4.dn_loss_iou: 0.2099 d1.loss_lmm_region: 0.2278 loss_lmm_image: 1.0583 2024/11/10 12:07:23 - mmengine - INFO - Iter(train) [ 4400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:46:52 time: 1.9713 data_time: 0.0164 memory: 34627 grad_norm: 27.0103 loss: 10.9030 loss_cls: 0.3507 loss_bbox: 0.1528 loss_iou: 0.2745 d0.loss_cls: 0.4108 d0.loss_bbox: 0.1635 d0.loss_iou: 0.2894 d1.loss_cls: 0.3651 d1.loss_bbox: 0.1624 d1.loss_iou: 0.2847 d2.loss_cls: 0.3546 d2.loss_bbox: 0.1598 d2.loss_iou: 0.2824 d3.loss_cls: 0.3527 d3.loss_bbox: 0.1534 d3.loss_iou: 0.2756 d4.loss_cls: 0.3498 d4.loss_bbox: 0.1549 d4.loss_iou: 0.2760 enc_loss_cls: 0.3903 enc_loss_bbox: 0.1853 enc_loss_iou: 0.3202 dn_loss_cls: 0.1117 dn_loss_bbox: 0.1897 dn_loss_iou: 0.2485 d0.dn_loss_cls: 0.2070 d0.dn_loss_bbox: 0.3743 d0.dn_loss_iou: 0.4120 d1.dn_loss_cls: 0.1505 d1.dn_loss_bbox: 0.2305 d1.dn_loss_iou: 0.2849 d2.dn_loss_cls: 0.1282 d2.dn_loss_bbox: 0.2032 d2.dn_loss_iou: 0.2607 d3.dn_loss_cls: 0.1182 d3.dn_loss_bbox: 0.1927 d3.dn_loss_iou: 0.2513 d4.dn_loss_cls: 0.1125 d4.dn_loss_bbox: 0.1897 d4.dn_loss_iou: 0.2484 d1.loss_lmm_region: 0.2372 loss_lmm_image: 1.0427 2024/11/10 12:10:42 - mmengine - INFO - Iter(train) [ 4500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:43:18 time: 1.9850 data_time: 0.0163 memory: 33106 grad_norm: 25.5049 loss: 10.3928 loss_cls: 0.3239 loss_bbox: 0.1316 loss_iou: 0.2214 d0.loss_cls: 0.3800 d0.loss_bbox: 0.1413 d0.loss_iou: 0.2315 d1.loss_cls: 0.3419 d1.loss_bbox: 0.1388 d1.loss_iou: 0.2260 d2.loss_cls: 0.3330 d2.loss_bbox: 0.1327 d2.loss_iou: 0.2209 d3.loss_cls: 0.3272 d3.loss_bbox: 0.1323 d3.loss_iou: 0.2239 d4.loss_cls: 0.3265 d4.loss_bbox: 0.1303 d4.loss_iou: 0.2218 enc_loss_cls: 0.3714 enc_loss_bbox: 0.1561 enc_loss_iou: 0.2541 dn_loss_cls: 0.1874 dn_loss_bbox: 0.1827 dn_loss_iou: 0.2238 d0.dn_loss_cls: 0.2660 d0.dn_loss_bbox: 0.3559 d0.dn_loss_iou: 0.3821 d1.dn_loss_cls: 0.2165 d1.dn_loss_bbox: 0.2242 d1.dn_loss_iou: 0.2607 d2.dn_loss_cls: 0.1956 d2.dn_loss_bbox: 0.1957 d2.dn_loss_iou: 0.2355 d3.dn_loss_cls: 0.1888 d3.dn_loss_bbox: 0.1853 d3.dn_loss_iou: 0.2269 d4.dn_loss_cls: 0.1871 d4.dn_loss_bbox: 0.1827 d4.dn_loss_iou: 0.2238 d1.loss_lmm_region: 0.2584 loss_lmm_image: 1.0472 2024/11/10 12:14:01 - mmengine - INFO - Iter(train) [ 4600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:39:29 time: 1.9994 data_time: 0.0165 memory: 34133 grad_norm: 25.2106 loss: 10.3811 loss_cls: 0.3436 loss_bbox: 0.1361 loss_iou: 0.2431 d0.loss_cls: 0.3912 d0.loss_bbox: 0.1523 d0.loss_iou: 0.2602 d1.loss_cls: 0.3653 d1.loss_bbox: 0.1408 d1.loss_iou: 0.2475 d2.loss_cls: 0.3575 d2.loss_bbox: 0.1341 d2.loss_iou: 0.2418 d3.loss_cls: 0.3464 d3.loss_bbox: 0.1362 d3.loss_iou: 0.2427 d4.loss_cls: 0.3433 d4.loss_bbox: 0.1361 d4.loss_iou: 0.2429 enc_loss_cls: 0.3807 enc_loss_bbox: 0.1636 
enc_loss_iou: 0.2827 dn_loss_cls: 0.1687 dn_loss_bbox: 0.1685 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.2451 d0.dn_loss_bbox: 0.3168 d0.dn_loss_iou: 0.3622 d1.dn_loss_cls: 0.1963 d1.dn_loss_bbox: 0.2056 d1.dn_loss_iou: 0.2501 d2.dn_loss_cls: 0.1791 d2.dn_loss_bbox: 0.1802 d2.dn_loss_iou: 0.2258 d3.dn_loss_cls: 0.1734 d3.dn_loss_bbox: 0.1717 d3.dn_loss_iou: 0.2190 d4.dn_loss_cls: 0.1683 d4.dn_loss_bbox: 0.1686 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.2499 loss_lmm_image: 1.0121 2024/11/10 12:17:21 - mmengine - INFO - Iter(train) [ 4700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:36:16 time: 2.0077 data_time: 0.0165 memory: 34311 grad_norm: 24.7865 loss: 10.1970 loss_cls: 0.3706 loss_bbox: 0.1034 loss_iou: 0.2129 d0.loss_cls: 0.4132 d0.loss_bbox: 0.1115 d0.loss_iou: 0.2201 d1.loss_cls: 0.3912 d1.loss_bbox: 0.1071 d1.loss_iou: 0.2173 d2.loss_cls: 0.3867 d2.loss_bbox: 0.1018 d2.loss_iou: 0.2109 d3.loss_cls: 0.3787 d3.loss_bbox: 0.1043 d3.loss_iou: 0.2073 d4.loss_cls: 0.3696 d4.loss_bbox: 0.1038 d4.loss_iou: 0.2130 enc_loss_cls: 0.4161 enc_loss_bbox: 0.1224 enc_loss_iou: 0.2396 dn_loss_cls: 0.2328 dn_loss_bbox: 0.1316 dn_loss_iou: 0.1993 d0.dn_loss_cls: 0.3098 d0.dn_loss_bbox: 0.2878 d0.dn_loss_iou: 0.3530 d1.dn_loss_cls: 0.2639 d1.dn_loss_bbox: 0.1674 d1.dn_loss_iou: 0.2346 d2.dn_loss_cls: 0.2490 d2.dn_loss_bbox: 0.1435 d2.dn_loss_iou: 0.2094 d3.dn_loss_cls: 0.2378 d3.dn_loss_bbox: 0.1338 d3.dn_loss_iou: 0.2019 d4.dn_loss_cls: 0.2321 d4.dn_loss_bbox: 0.1316 d4.dn_loss_iou: 0.1993 d1.loss_lmm_region: 0.2493 loss_lmm_image: 1.0275 2024/11/10 12:20:40 - mmengine - INFO - Iter(train) [ 4800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:32:49 time: 1.9912 data_time: 0.0166 memory: 34509 grad_norm: 23.6095 loss: 10.5544 loss_cls: 0.3843 loss_bbox: 0.1389 loss_iou: 0.2634 d0.loss_cls: 0.4446 d0.loss_bbox: 0.1477 d0.loss_iou: 0.2747 d1.loss_cls: 0.4085 d1.loss_bbox: 0.1425 d1.loss_iou: 0.2706 d2.loss_cls: 0.4001 d2.loss_bbox: 0.1362 d2.loss_iou: 0.2631 d3.loss_cls: 0.3882 d3.loss_bbox: 0.1398 d3.loss_iou: 0.2621 d4.loss_cls: 0.3873 d4.loss_bbox: 0.1375 d4.loss_iou: 0.2614 enc_loss_cls: 0.4384 enc_loss_bbox: 0.1603 enc_loss_iou: 0.2977 dn_loss_cls: 0.1683 dn_loss_bbox: 0.1511 dn_loss_iou: 0.2007 d0.dn_loss_cls: 0.2406 d0.dn_loss_bbox: 0.2777 d0.dn_loss_iou: 0.3305 d1.dn_loss_cls: 0.1987 d1.dn_loss_bbox: 0.1821 d1.dn_loss_iou: 0.2291 d2.dn_loss_cls: 0.1835 d2.dn_loss_bbox: 0.1580 d2.dn_loss_iou: 0.2080 d3.dn_loss_cls: 0.1747 d3.dn_loss_bbox: 0.1528 d3.dn_loss_iou: 0.2023 d4.dn_loss_cls: 0.1697 d4.dn_loss_bbox: 0.1512 d4.dn_loss_iou: 0.2007 d1.loss_lmm_region: 0.2330 loss_lmm_image: 0.9948 2024/11/10 12:24:01 - mmengine - INFO - Iter(train) [ 4900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:29:58 time: 1.9969 data_time: 0.0166 memory: 34762 grad_norm: 25.4460 loss: 10.2851 loss_cls: 0.3744 loss_bbox: 0.1308 loss_iou: 0.2478 d0.loss_cls: 0.4311 d0.loss_bbox: 0.1410 d0.loss_iou: 0.2577 d1.loss_cls: 0.3979 d1.loss_bbox: 0.1294 d1.loss_iou: 0.2523 d2.loss_cls: 0.3906 d2.loss_bbox: 0.1252 d2.loss_iou: 0.2449 d3.loss_cls: 0.3800 d3.loss_bbox: 0.1267 d3.loss_iou: 0.2457 d4.loss_cls: 0.3763 d4.loss_bbox: 0.1299 d4.loss_iou: 0.2472 enc_loss_cls: 0.4150 enc_loss_bbox: 0.1599 enc_loss_iou: 0.2888 dn_loss_cls: 0.1555 dn_loss_bbox: 0.1421 dn_loss_iou: 0.1938 d0.dn_loss_cls: 0.2356 d0.dn_loss_bbox: 0.2932 d0.dn_loss_iou: 0.3416 d1.dn_loss_cls: 0.1888 d1.dn_loss_bbox: 0.1757 d1.dn_loss_iou: 0.2263 d2.dn_loss_cls: 0.1672 d2.dn_loss_bbox: 0.1495 d2.dn_loss_iou: 0.2019 
d3.dn_loss_cls: 0.1618 d3.dn_loss_bbox: 0.1430 d3.dn_loss_iou: 0.1960 d4.dn_loss_cls: 0.1589 d4.dn_loss_bbox: 0.1422 d4.dn_loss_iou: 0.1938 d1.loss_lmm_region: 0.2869 loss_lmm_image: 1.0387 2024/11/10 12:27:20 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 12:27:20 - mmengine - INFO - Iter(train) [ 5000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:26:19 time: 2.0180 data_time: 0.0169 memory: 32860 grad_norm: 27.9688 loss: 10.9270 loss_cls: 0.3533 loss_bbox: 0.1464 loss_iou: 0.2310 d0.loss_cls: 0.4092 d0.loss_bbox: 0.1540 d0.loss_iou: 0.2407 d1.loss_cls: 0.3752 d1.loss_bbox: 0.1515 d1.loss_iou: 0.2375 d2.loss_cls: 0.3651 d2.loss_bbox: 0.1454 d2.loss_iou: 0.2337 d3.loss_cls: 0.3601 d3.loss_bbox: 0.1433 d3.loss_iou: 0.2312 d4.loss_cls: 0.3585 d4.loss_bbox: 0.1457 d4.loss_iou: 0.2311 enc_loss_cls: 0.3979 enc_loss_bbox: 0.1686 enc_loss_iou: 0.2638 dn_loss_cls: 0.1788 dn_loss_bbox: 0.2097 dn_loss_iou: 0.2420 d0.dn_loss_cls: 0.2627 d0.dn_loss_bbox: 0.3797 d0.dn_loss_iou: 0.4014 d1.dn_loss_cls: 0.2053 d1.dn_loss_bbox: 0.2489 d1.dn_loss_iou: 0.2791 d2.dn_loss_cls: 0.1876 d2.dn_loss_bbox: 0.2213 d2.dn_loss_iou: 0.2524 d3.dn_loss_cls: 0.1830 d3.dn_loss_bbox: 0.2123 d3.dn_loss_iou: 0.2448 d4.dn_loss_cls: 0.1798 d4.dn_loss_bbox: 0.2096 d4.dn_loss_iou: 0.2419 d1.loss_lmm_region: 0.2466 loss_lmm_image: 0.9968 2024/11/10 12:30:38 - mmengine - INFO - Iter(train) [ 5100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:22:06 time: 1.9640 data_time: 0.0164 memory: 33426 grad_norm: 26.3008 loss: 10.5244 loss_cls: 0.3477 loss_bbox: 0.1512 loss_iou: 0.2491 d0.loss_cls: 0.4046 d0.loss_bbox: 0.1575 d0.loss_iou: 0.2611 d1.loss_cls: 0.3582 d1.loss_bbox: 0.1575 d1.loss_iou: 0.2573 d2.loss_cls: 0.3521 d2.loss_bbox: 0.1559 d2.loss_iou: 0.2531 d3.loss_cls: 0.3490 d3.loss_bbox: 0.1505 d3.loss_iou: 0.2494 d4.loss_cls: 0.3507 d4.loss_bbox: 0.1505 d4.loss_iou: 0.2473 enc_loss_cls: 0.3963 enc_loss_bbox: 0.1786 enc_loss_iou: 0.2842 dn_loss_cls: 0.1459 dn_loss_bbox: 0.1826 dn_loss_iou: 0.2196 d0.dn_loss_cls: 0.2274 d0.dn_loss_bbox: 0.3501 d0.dn_loss_iou: 0.3678 d1.dn_loss_cls: 0.1813 d1.dn_loss_bbox: 0.2160 d1.dn_loss_iou: 0.2502 d2.dn_loss_cls: 0.1614 d2.dn_loss_bbox: 0.1916 d2.dn_loss_iou: 0.2289 d3.dn_loss_cls: 0.1524 d3.dn_loss_bbox: 0.1856 d3.dn_loss_iou: 0.2226 d4.dn_loss_cls: 0.1468 d4.dn_loss_bbox: 0.1827 d4.dn_loss_iou: 0.2197 d1.loss_lmm_region: 0.2148 loss_lmm_image: 1.0152 2024/11/10 12:33:57 - mmengine - INFO - Iter(train) [ 5200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:18:27 time: 1.9955 data_time: 0.0166 memory: 35925 grad_norm: 26.2009 loss: 10.6123 loss_cls: 0.3701 loss_bbox: 0.1418 loss_iou: 0.2534 d0.loss_cls: 0.4281 d0.loss_bbox: 0.1473 d0.loss_iou: 0.2683 d1.loss_cls: 0.3903 d1.loss_bbox: 0.1429 d1.loss_iou: 0.2584 d2.loss_cls: 0.3810 d2.loss_bbox: 0.1408 d2.loss_iou: 0.2534 d3.loss_cls: 0.3731 d3.loss_bbox: 0.1390 d3.loss_iou: 0.2521 d4.loss_cls: 0.3722 d4.loss_bbox: 0.1411 d4.loss_iou: 0.2529 enc_loss_cls: 0.4138 enc_loss_bbox: 0.1683 enc_loss_iou: 0.2978 dn_loss_cls: 0.1508 dn_loss_bbox: 0.1661 dn_loss_iou: 0.2229 d0.dn_loss_cls: 0.2321 d0.dn_loss_bbox: 0.3077 d0.dn_loss_iou: 0.3778 d1.dn_loss_cls: 0.1830 d1.dn_loss_bbox: 0.2021 d1.dn_loss_iou: 0.2608 d2.dn_loss_cls: 0.1629 d2.dn_loss_bbox: 0.1758 d2.dn_loss_iou: 0.2343 d3.dn_loss_cls: 0.1547 d3.dn_loss_bbox: 0.1684 d3.dn_loss_iou: 0.2259 d4.dn_loss_cls: 0.1514 d4.dn_loss_bbox: 0.1660 d4.dn_loss_iou: 0.2230 d1.loss_lmm_region: 0.2376 loss_lmm_image: 1.0229 2024/11/10 12:37:17 - 
mmengine - INFO - Iter(train) [ 5300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:15:09 time: 1.9974 data_time: 0.0165 memory: 34275 grad_norm: 27.8470 loss: 10.0994 loss_cls: 0.3470 loss_bbox: 0.1319 loss_iou: 0.2712 d0.loss_cls: 0.4112 d0.loss_bbox: 0.1370 d0.loss_iou: 0.2800 d1.loss_cls: 0.3716 d1.loss_bbox: 0.1347 d1.loss_iou: 0.2745 d2.loss_cls: 0.3597 d2.loss_bbox: 0.1298 d2.loss_iou: 0.2693 d3.loss_cls: 0.3560 d3.loss_bbox: 0.1285 d3.loss_iou: 0.2692 d4.loss_cls: 0.3489 d4.loss_bbox: 0.1325 d4.loss_iou: 0.2720 enc_loss_cls: 0.4025 enc_loss_bbox: 0.1576 enc_loss_iou: 0.3107 dn_loss_cls: 0.1237 dn_loss_bbox: 0.1415 dn_loss_iou: 0.2161 d0.dn_loss_cls: 0.1999 d0.dn_loss_bbox: 0.2859 d0.dn_loss_iou: 0.3653 d1.dn_loss_cls: 0.1564 d1.dn_loss_bbox: 0.1754 d1.dn_loss_iou: 0.2522 d2.dn_loss_cls: 0.1346 d2.dn_loss_bbox: 0.1517 d2.dn_loss_iou: 0.2262 d3.dn_loss_cls: 0.1285 d3.dn_loss_bbox: 0.1437 d3.dn_loss_iou: 0.2194 d4.dn_loss_cls: 0.1253 d4.dn_loss_bbox: 0.1415 d4.dn_loss_iou: 0.2162 d1.loss_lmm_region: 0.2281 loss_lmm_image: 0.9720 2024/11/10 12:40:36 - mmengine - INFO - Iter(train) [ 5400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:11:36 time: 1.9865 data_time: 0.0165 memory: 35024 grad_norm: 27.2925 loss: 12.7394 loss_cls: 0.4793 loss_bbox: 0.1774 loss_iou: 0.2988 d0.loss_cls: 0.5251 d0.loss_bbox: 0.1916 d0.loss_iou: 0.3131 d1.loss_cls: 0.4863 d1.loss_bbox: 0.1870 d1.loss_iou: 0.3065 d2.loss_cls: 0.4768 d2.loss_bbox: 0.1827 d2.loss_iou: 0.3025 d3.loss_cls: 0.4802 d3.loss_bbox: 0.1744 d3.loss_iou: 0.2968 d4.loss_cls: 0.4849 d4.loss_bbox: 0.1744 d4.loss_iou: 0.2947 enc_loss_cls: 0.5123 enc_loss_bbox: 0.1989 enc_loss_iou: 0.3310 dn_loss_cls: 0.2233 dn_loss_bbox: 0.1938 dn_loss_iou: 0.2558 d0.dn_loss_cls: 0.3159 d0.dn_loss_bbox: 0.3348 d0.dn_loss_iou: 0.3971 d1.dn_loss_cls: 0.2600 d1.dn_loss_bbox: 0.2286 d1.dn_loss_iou: 0.2890 d2.dn_loss_cls: 0.2338 d2.dn_loss_bbox: 0.2059 d2.dn_loss_iou: 0.2688 d3.dn_loss_cls: 0.2293 d3.dn_loss_bbox: 0.1965 d3.dn_loss_iou: 0.2586 d4.dn_loss_cls: 0.2248 d4.dn_loss_bbox: 0.1939 d4.dn_loss_iou: 0.2558 d1.loss_lmm_region: 0.2540 loss_lmm_image: 1.0449 2024/11/10 12:43:56 - mmengine - INFO - Iter(train) [ 5500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:08:22 time: 1.9986 data_time: 0.0167 memory: 34247 grad_norm: 25.4603 loss: 12.5198 loss_cls: 0.4236 loss_bbox: 0.1793 loss_iou: 0.3143 d0.loss_cls: 0.4920 d0.loss_bbox: 0.1974 d0.loss_iou: 0.3269 d1.loss_cls: 0.4516 d1.loss_bbox: 0.1875 d1.loss_iou: 0.3177 d2.loss_cls: 0.4379 d2.loss_bbox: 0.1856 d2.loss_iou: 0.3201 d3.loss_cls: 0.4291 d3.loss_bbox: 0.1839 d3.loss_iou: 0.3163 d4.loss_cls: 0.4250 d4.loss_bbox: 0.1790 d4.loss_iou: 0.3134 enc_loss_cls: 0.4858 enc_loss_bbox: 0.2196 enc_loss_iou: 0.3560 dn_loss_cls: 0.2162 dn_loss_bbox: 0.1987 dn_loss_iou: 0.2377 d0.dn_loss_cls: 0.2917 d0.dn_loss_bbox: 0.3654 d0.dn_loss_iou: 0.3882 d1.dn_loss_cls: 0.2480 d1.dn_loss_bbox: 0.2377 d1.dn_loss_iou: 0.2709 d2.dn_loss_cls: 0.2270 d2.dn_loss_bbox: 0.2117 d2.dn_loss_iou: 0.2485 d3.dn_loss_cls: 0.2190 d3.dn_loss_bbox: 0.2010 d3.dn_loss_iou: 0.2403 d4.dn_loss_cls: 0.2157 d4.dn_loss_bbox: 0.1987 d4.dn_loss_iou: 0.2377 d1.loss_lmm_region: 0.2828 loss_lmm_image: 1.0411 2024/11/10 12:47:16 - mmengine - INFO - Iter(train) [ 5600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:05:18 time: 2.0055 data_time: 0.0167 memory: 32925 grad_norm: 27.9201 loss: 10.7334 loss_cls: 0.3962 loss_bbox: 0.1217 loss_iou: 0.2098 d0.loss_cls: 0.4554 d0.loss_bbox: 0.1327 d0.loss_iou: 0.2254 d1.loss_cls: 
0.4169 d1.loss_bbox: 0.1234 d1.loss_iou: 0.2148 d2.loss_cls: 0.4089 d2.loss_bbox: 0.1203 d2.loss_iou: 0.2097 d3.loss_cls: 0.4023 d3.loss_bbox: 0.1189 d3.loss_iou: 0.2092 d4.loss_cls: 0.3992 d4.loss_bbox: 0.1191 d4.loss_iou: 0.2088 enc_loss_cls: 0.4428 enc_loss_bbox: 0.1463 enc_loss_iou: 0.2518 dn_loss_cls: 0.2081 dn_loss_bbox: 0.1786 dn_loss_iou: 0.2162 d0.dn_loss_cls: 0.2977 d0.dn_loss_bbox: 0.3436 d0.dn_loss_iou: 0.3676 d1.dn_loss_cls: 0.2477 d1.dn_loss_bbox: 0.2156 d1.dn_loss_iou: 0.2501 d2.dn_loss_cls: 0.2269 d2.dn_loss_bbox: 0.1894 d2.dn_loss_iou: 0.2253 d3.dn_loss_cls: 0.2146 d3.dn_loss_bbox: 0.1809 d3.dn_loss_iou: 0.2187 d4.dn_loss_cls: 0.2085 d4.dn_loss_bbox: 0.1786 d4.dn_loss_iou: 0.2160 d1.loss_lmm_region: 0.2561 loss_lmm_image: 0.9595 2024/11/10 12:50:36 - mmengine - INFO - Iter(train) [ 5700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 8:02:03 time: 2.0123 data_time: 0.0165 memory: 33035 grad_norm: 24.6521 loss: 9.2770 loss_cls: 0.2807 loss_bbox: 0.1107 loss_iou: 0.1955 d0.loss_cls: 0.3219 d0.loss_bbox: 0.1275 d0.loss_iou: 0.2129 d1.loss_cls: 0.2952 d1.loss_bbox: 0.1217 d1.loss_iou: 0.2033 d2.loss_cls: 0.2857 d2.loss_bbox: 0.1152 d2.loss_iou: 0.1991 d3.loss_cls: 0.2834 d3.loss_bbox: 0.1134 d3.loss_iou: 0.1962 d4.loss_cls: 0.2809 d4.loss_bbox: 0.1125 d4.loss_iou: 0.1966 enc_loss_cls: 0.3201 enc_loss_bbox: 0.1449 enc_loss_iou: 0.2399 dn_loss_cls: 0.1393 dn_loss_bbox: 0.1767 dn_loss_iou: 0.2156 d0.dn_loss_cls: 0.2197 d0.dn_loss_bbox: 0.3555 d0.dn_loss_iou: 0.3743 d1.dn_loss_cls: 0.1735 d1.dn_loss_bbox: 0.2210 d1.dn_loss_iou: 0.2530 d2.dn_loss_cls: 0.1530 d2.dn_loss_bbox: 0.1907 d2.dn_loss_iou: 0.2273 d3.dn_loss_cls: 0.1466 d3.dn_loss_bbox: 0.1794 d3.dn_loss_iou: 0.2185 d4.dn_loss_cls: 0.1413 d4.dn_loss_bbox: 0.1767 d4.dn_loss_iou: 0.2156 d1.loss_lmm_region: 0.1874 loss_lmm_image: 0.9546 2024/11/10 12:53:54 - mmengine - INFO - Iter(train) [ 5800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:58:14 time: 2.0016 data_time: 0.0165 memory: 35401 grad_norm: 23.9004 loss: 11.5380 loss_cls: 0.4077 loss_bbox: 0.1711 loss_iou: 0.2935 d0.loss_cls: 0.4621 d0.loss_bbox: 0.1760 d0.loss_iou: 0.3003 d1.loss_cls: 0.4250 d1.loss_bbox: 0.1730 d1.loss_iou: 0.2953 d2.loss_cls: 0.4171 d2.loss_bbox: 0.1702 d2.loss_iou: 0.2929 d3.loss_cls: 0.4110 d3.loss_bbox: 0.1715 d3.loss_iou: 0.2928 d4.loss_cls: 0.4072 d4.loss_bbox: 0.1707 d4.loss_iou: 0.2943 enc_loss_cls: 0.4553 enc_loss_bbox: 0.1932 enc_loss_iou: 0.3306 dn_loss_cls: 0.1538 dn_loss_bbox: 0.1923 dn_loss_iou: 0.2350 d0.dn_loss_cls: 0.2374 d0.dn_loss_bbox: 0.3381 d0.dn_loss_iou: 0.3779 d1.dn_loss_cls: 0.1880 d1.dn_loss_bbox: 0.2216 d1.dn_loss_iou: 0.2666 d2.dn_loss_cls: 0.1704 d2.dn_loss_bbox: 0.2016 d2.dn_loss_iou: 0.2443 d3.dn_loss_cls: 0.1592 d3.dn_loss_bbox: 0.1938 d3.dn_loss_iou: 0.2371 d4.dn_loss_cls: 0.1552 d4.dn_loss_bbox: 0.1921 d4.dn_loss_iou: 0.2349 d1.loss_lmm_region: 0.2327 loss_lmm_image: 0.9949 2024/11/10 12:57:16 - mmengine - INFO - Iter(train) [ 5900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:55:39 time: 2.0230 data_time: 0.0168 memory: 35071 grad_norm: 25.9943 loss: 11.9515 loss_cls: 0.4557 loss_bbox: 0.1379 loss_iou: 0.2555 d0.loss_cls: 0.5122 d0.loss_bbox: 0.1539 d0.loss_iou: 0.2762 d1.loss_cls: 0.4891 d1.loss_bbox: 0.1453 d1.loss_iou: 0.2638 d2.loss_cls: 0.4642 d2.loss_bbox: 0.1409 d2.loss_iou: 0.2579 d3.loss_cls: 0.4561 d3.loss_bbox: 0.1405 d3.loss_iou: 0.2573 d4.loss_cls: 0.4568 d4.loss_bbox: 0.1403 d4.loss_iou: 0.2579 enc_loss_cls: 0.4977 enc_loss_bbox: 0.1689 enc_loss_iou: 0.3045 
dn_loss_cls: 0.2240 dn_loss_bbox: 0.1856 dn_loss_iou: 0.2306 d0.dn_loss_cls: 0.3054 d0.dn_loss_bbox: 0.3657 d0.dn_loss_iou: 0.3932 d1.dn_loss_cls: 0.2555 d1.dn_loss_bbox: 0.2351 d1.dn_loss_iou: 0.2734 d2.dn_loss_cls: 0.2359 d2.dn_loss_bbox: 0.2048 d2.dn_loss_iou: 0.2462 d3.dn_loss_cls: 0.2291 d3.dn_loss_bbox: 0.1895 d3.dn_loss_iou: 0.2340 d4.dn_loss_cls: 0.2246 d4.dn_loss_bbox: 0.1856 d4.dn_loss_iou: 0.2306 d1.loss_lmm_region: 0.2525 loss_lmm_image: 1.0176 2024/11/10 13:00:35 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 13:00:35 - mmengine - INFO - Iter(train) [ 6000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:52:03 time: 1.9805 data_time: 0.0165 memory: 33667 grad_norm: 29.8210 loss: 11.1803 loss_cls: 0.3734 loss_bbox: 0.1338 loss_iou: 0.2284 d0.loss_cls: 0.4276 d0.loss_bbox: 0.1413 d0.loss_iou: 0.2431 d1.loss_cls: 0.3912 d1.loss_bbox: 0.1408 d1.loss_iou: 0.2367 d2.loss_cls: 0.3815 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2310 d3.loss_cls: 0.3759 d3.loss_bbox: 0.1345 d3.loss_iou: 0.2287 d4.loss_cls: 0.3716 d4.loss_bbox: 0.1364 d4.loss_iou: 0.2298 enc_loss_cls: 0.4224 enc_loss_bbox: 0.1556 enc_loss_iou: 0.2637 dn_loss_cls: 0.2405 dn_loss_bbox: 0.1942 dn_loss_iou: 0.2295 d0.dn_loss_cls: 0.3141 d0.dn_loss_bbox: 0.3654 d0.dn_loss_iou: 0.3796 d1.dn_loss_cls: 0.2594 d1.dn_loss_bbox: 0.2247 d1.dn_loss_iou: 0.2617 d2.dn_loss_cls: 0.2438 d2.dn_loss_bbox: 0.2006 d2.dn_loss_iou: 0.2387 d3.dn_loss_cls: 0.2377 d3.dn_loss_bbox: 0.1948 d3.dn_loss_iou: 0.2319 d4.dn_loss_cls: 0.2374 d4.dn_loss_bbox: 0.1943 d4.dn_loss_iou: 0.2294 d1.loss_lmm_region: 0.2584 loss_lmm_image: 1.0619 2024/11/10 13:03:52 - mmengine - INFO - Iter(train) [ 6100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:47:47 time: 1.9535 data_time: 0.0170 memory: 34360 grad_norm: 29.1253 loss: 11.1305 loss_cls: 0.3869 loss_bbox: 0.1486 loss_iou: 0.2530 d0.loss_cls: 0.4516 d0.loss_bbox: 0.1553 d0.loss_iou: 0.2642 d1.loss_cls: 0.4101 d1.loss_bbox: 0.1545 d1.loss_iou: 0.2618 d2.loss_cls: 0.4043 d2.loss_bbox: 0.1435 d2.loss_iou: 0.2495 d3.loss_cls: 0.3922 d3.loss_bbox: 0.1479 d3.loss_iou: 0.2523 d4.loss_cls: 0.3878 d4.loss_bbox: 0.1500 d4.loss_iou: 0.2532 enc_loss_cls: 0.4343 enc_loss_bbox: 0.1758 enc_loss_iou: 0.2965 dn_loss_cls: 0.2061 dn_loss_bbox: 0.1667 dn_loss_iou: 0.2175 d0.dn_loss_cls: 0.2859 d0.dn_loss_bbox: 0.3224 d0.dn_loss_iou: 0.3748 d1.dn_loss_cls: 0.2426 d1.dn_loss_bbox: 0.1973 d1.dn_loss_iou: 0.2529 d2.dn_loss_cls: 0.2202 d2.dn_loss_bbox: 0.1753 d2.dn_loss_iou: 0.2279 d3.dn_loss_cls: 0.2119 d3.dn_loss_bbox: 0.1674 d3.dn_loss_iou: 0.2197 d4.dn_loss_cls: 0.2063 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2174 d1.loss_lmm_region: 0.2396 loss_lmm_image: 1.0383 2024/11/10 13:07:11 - mmengine - INFO - Iter(train) [ 6200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:44:01 time: 1.9841 data_time: 0.0166 memory: 32893 grad_norm: 26.4604 loss: 10.3741 loss_cls: 0.3011 loss_bbox: 0.1463 loss_iou: 0.2626 d0.loss_cls: 0.3678 d0.loss_bbox: 0.1518 d0.loss_iou: 0.2706 d1.loss_cls: 0.3300 d1.loss_bbox: 0.1433 d1.loss_iou: 0.2632 d2.loss_cls: 0.3187 d2.loss_bbox: 0.1383 d2.loss_iou: 0.2583 d3.loss_cls: 0.3065 d3.loss_bbox: 0.1448 d3.loss_iou: 0.2615 d4.loss_cls: 0.3042 d4.loss_bbox: 0.1441 d4.loss_iou: 0.2590 enc_loss_cls: 0.3560 enc_loss_bbox: 0.1666 enc_loss_iou: 0.3002 dn_loss_cls: 0.1376 dn_loss_bbox: 0.1924 dn_loss_iou: 0.2392 d0.dn_loss_cls: 0.2278 d0.dn_loss_bbox: 0.3556 d0.dn_loss_iou: 0.3952 d1.dn_loss_cls: 0.1712 d1.dn_loss_bbox: 0.2241 d1.dn_loss_iou: 0.2727 d2.dn_loss_cls: 
0.1503 d2.dn_loss_bbox: 0.2022 d2.dn_loss_iou: 0.2492 d3.dn_loss_cls: 0.1434 d3.dn_loss_bbox: 0.1943 d3.dn_loss_iou: 0.2418 d4.dn_loss_cls: 0.1366 d4.dn_loss_bbox: 0.1925 d4.dn_loss_iou: 0.2392 d1.loss_lmm_region: 0.2291 loss_lmm_image: 0.9848 2024/11/10 13:10:28 - mmengine - INFO - Iter(train) [ 6300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:39:58 time: 1.9904 data_time: 0.0166 memory: 34209 grad_norm: 26.7393 loss: 10.1058 loss_cls: 0.3143 loss_bbox: 0.1329 loss_iou: 0.2065 d0.loss_cls: 0.3623 d0.loss_bbox: 0.1504 d0.loss_iou: 0.2280 d1.loss_cls: 0.3418 d1.loss_bbox: 0.1379 d1.loss_iou: 0.2133 d2.loss_cls: 0.3303 d2.loss_bbox: 0.1329 d2.loss_iou: 0.2070 d3.loss_cls: 0.3232 d3.loss_bbox: 0.1336 d3.loss_iou: 0.2065 d4.loss_cls: 0.3161 d4.loss_bbox: 0.1332 d4.loss_iou: 0.2070 enc_loss_cls: 0.3614 enc_loss_bbox: 0.1585 enc_loss_iou: 0.2489 dn_loss_cls: 0.1552 dn_loss_bbox: 0.1986 dn_loss_iou: 0.2270 d0.dn_loss_cls: 0.2451 d0.dn_loss_bbox: 0.3454 d0.dn_loss_iou: 0.3771 d1.dn_loss_cls: 0.1920 d1.dn_loss_bbox: 0.2293 d1.dn_loss_iou: 0.2612 d2.dn_loss_cls: 0.1702 d2.dn_loss_bbox: 0.2079 d2.dn_loss_iou: 0.2377 d3.dn_loss_cls: 0.1607 d3.dn_loss_bbox: 0.2004 d3.dn_loss_iou: 0.2300 d4.dn_loss_cls: 0.1562 d4.dn_loss_bbox: 0.1984 d4.dn_loss_iou: 0.2269 d1.loss_lmm_region: 0.2360 loss_lmm_image: 1.0046 2024/11/10 13:13:50 - mmengine - INFO - Iter(train) [ 6400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:37:15 time: 2.0293 data_time: 0.0165 memory: 33230 grad_norm: 25.8640 loss: 11.0163 loss_cls: 0.3692 loss_bbox: 0.1572 loss_iou: 0.2765 d0.loss_cls: 0.4282 d0.loss_bbox: 0.1709 d0.loss_iou: 0.2967 d1.loss_cls: 0.3867 d1.loss_bbox: 0.1647 d1.loss_iou: 0.2877 d2.loss_cls: 0.3802 d2.loss_bbox: 0.1592 d2.loss_iou: 0.2838 d3.loss_cls: 0.3744 d3.loss_bbox: 0.1575 d3.loss_iou: 0.2771 d4.loss_cls: 0.3720 d4.loss_bbox: 0.1575 d4.loss_iou: 0.2769 enc_loss_cls: 0.4164 enc_loss_bbox: 0.1914 enc_loss_iou: 0.3256 dn_loss_cls: 0.1479 dn_loss_bbox: 0.1782 dn_loss_iou: 0.2321 d0.dn_loss_cls: 0.2269 d0.dn_loss_bbox: 0.3333 d0.dn_loss_iou: 0.3777 d1.dn_loss_cls: 0.1764 d1.dn_loss_bbox: 0.2107 d1.dn_loss_iou: 0.2644 d2.dn_loss_cls: 0.1619 d2.dn_loss_bbox: 0.1914 d2.dn_loss_iou: 0.2438 d3.dn_loss_cls: 0.1527 d3.dn_loss_bbox: 0.1809 d3.dn_loss_iou: 0.2349 d4.dn_loss_cls: 0.1494 d4.dn_loss_bbox: 0.1781 d4.dn_loss_iou: 0.2321 d1.loss_lmm_region: 0.2253 loss_lmm_image: 1.0087 2024/11/10 13:17:09 - mmengine - INFO - Iter(train) [ 6500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:33:42 time: 2.0059 data_time: 0.0165 memory: 34233 grad_norm: 25.5611 loss: 11.4123 loss_cls: 0.4383 loss_bbox: 0.1425 loss_iou: 0.3107 d0.loss_cls: 0.4973 d0.loss_bbox: 0.1552 d0.loss_iou: 0.3283 d1.loss_cls: 0.4528 d1.loss_bbox: 0.1554 d1.loss_iou: 0.3217 d2.loss_cls: 0.4472 d2.loss_bbox: 0.1448 d2.loss_iou: 0.3130 d3.loss_cls: 0.4419 d3.loss_bbox: 0.1420 d3.loss_iou: 0.3124 d4.loss_cls: 0.4340 d4.loss_bbox: 0.1443 d4.loss_iou: 0.3136 enc_loss_cls: 0.4857 enc_loss_bbox: 0.1696 enc_loss_iou: 0.3561 dn_loss_cls: 0.1663 dn_loss_bbox: 0.1439 dn_loss_iou: 0.2227 d0.dn_loss_cls: 0.2455 d0.dn_loss_bbox: 0.3027 d0.dn_loss_iou: 0.3734 d1.dn_loss_cls: 0.1950 d1.dn_loss_bbox: 0.1763 d1.dn_loss_iou: 0.2578 d2.dn_loss_cls: 0.1756 d2.dn_loss_bbox: 0.1547 d2.dn_loss_iou: 0.2343 d3.dn_loss_cls: 0.1711 d3.dn_loss_bbox: 0.1459 d3.dn_loss_iou: 0.2256 d4.dn_loss_cls: 0.1671 d4.dn_loss_bbox: 0.1439 d4.dn_loss_iou: 0.2228 d1.loss_lmm_region: 0.2238 loss_lmm_image: 0.9571 2024/11/10 13:20:26 - mmengine - INFO - Iter(train) [ 
6600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:29:25 time: 1.9878 data_time: 0.0168 memory: 32184 grad_norm: 25.2604 loss: 10.1474 loss_cls: 0.3565 loss_bbox: 0.1411 loss_iou: 0.2333 d0.loss_cls: 0.4203 d0.loss_bbox: 0.1446 d0.loss_iou: 0.2465 d1.loss_cls: 0.3832 d1.loss_bbox: 0.1385 d1.loss_iou: 0.2367 d2.loss_cls: 0.3666 d2.loss_bbox: 0.1423 d2.loss_iou: 0.2327 d3.loss_cls: 0.3576 d3.loss_bbox: 0.1413 d3.loss_iou: 0.2340 d4.loss_cls: 0.3586 d4.loss_bbox: 0.1410 d4.loss_iou: 0.2328 enc_loss_cls: 0.4000 enc_loss_bbox: 0.1618 enc_loss_iou: 0.2736 dn_loss_cls: 0.1438 dn_loss_bbox: 0.1636 dn_loss_iou: 0.1980 d0.dn_loss_cls: 0.2183 d0.dn_loss_bbox: 0.3080 d0.dn_loss_iou: 0.3408 d1.dn_loss_cls: 0.1711 d1.dn_loss_bbox: 0.1926 d1.dn_loss_iou: 0.2307 d2.dn_loss_cls: 0.1496 d2.dn_loss_bbox: 0.1735 d2.dn_loss_iou: 0.2078 d3.dn_loss_cls: 0.1441 d3.dn_loss_bbox: 0.1660 d3.dn_loss_iou: 0.2008 d4.dn_loss_cls: 0.1429 d4.dn_loss_bbox: 0.1636 d4.dn_loss_iou: 0.1981 d1.loss_lmm_region: 0.2287 loss_lmm_image: 1.0624 2024/11/10 13:23:46 - mmengine - INFO - Iter(train) [ 6700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:26:22 time: 2.0160 data_time: 0.0166 memory: 35689 grad_norm: 27.8797 loss: 10.1676 loss_cls: 0.3148 loss_bbox: 0.1262 loss_iou: 0.2327 d0.loss_cls: 0.3662 d0.loss_bbox: 0.1462 d0.loss_iou: 0.2569 d1.loss_cls: 0.3384 d1.loss_bbox: 0.1295 d1.loss_iou: 0.2409 d2.loss_cls: 0.3217 d2.loss_bbox: 0.1321 d2.loss_iou: 0.2374 d3.loss_cls: 0.3204 d3.loss_bbox: 0.1270 d3.loss_iou: 0.2323 d4.loss_cls: 0.3144 d4.loss_bbox: 0.1291 d4.loss_iou: 0.2352 enc_loss_cls: 0.3594 enc_loss_bbox: 0.1545 enc_loss_iou: 0.2773 dn_loss_cls: 0.1561 dn_loss_bbox: 0.1717 dn_loss_iou: 0.2286 d0.dn_loss_cls: 0.2446 d0.dn_loss_bbox: 0.3680 d0.dn_loss_iou: 0.4089 d1.dn_loss_cls: 0.1870 d1.dn_loss_bbox: 0.2142 d1.dn_loss_iou: 0.2681 d2.dn_loss_cls: 0.1690 d2.dn_loss_bbox: 0.1839 d2.dn_loss_iou: 0.2396 d3.dn_loss_cls: 0.1605 d3.dn_loss_bbox: 0.1743 d3.dn_loss_iou: 0.2314 d4.dn_loss_cls: 0.1559 d4.dn_loss_bbox: 0.1716 d4.dn_loss_iou: 0.2286 d1.loss_lmm_region: 0.2230 loss_lmm_image: 0.9898 2024/11/10 13:27:07 - mmengine - INFO - Iter(train) [ 6800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:23:31 time: 2.0117 data_time: 0.0166 memory: 34671 grad_norm: 28.9181 loss: 11.3944 loss_cls: 0.4509 loss_bbox: 0.1349 loss_iou: 0.2464 d0.loss_cls: 0.5153 d0.loss_bbox: 0.1423 d0.loss_iou: 0.2557 d1.loss_cls: 0.4748 d1.loss_bbox: 0.1368 d1.loss_iou: 0.2522 d2.loss_cls: 0.4608 d2.loss_bbox: 0.1371 d2.loss_iou: 0.2512 d3.loss_cls: 0.4500 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2495 d4.loss_cls: 0.4501 d4.loss_bbox: 0.1358 d4.loss_iou: 0.2499 enc_loss_cls: 0.4961 enc_loss_bbox: 0.1603 enc_loss_iou: 0.2869 dn_loss_cls: 0.2089 dn_loss_bbox: 0.1649 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.2864 d0.dn_loss_bbox: 0.3236 d0.dn_loss_iou: 0.3657 d1.dn_loss_cls: 0.2413 d1.dn_loss_bbox: 0.2000 d1.dn_loss_iou: 0.2476 d2.dn_loss_cls: 0.2224 d2.dn_loss_bbox: 0.1744 d2.dn_loss_iou: 0.2244 d3.dn_loss_cls: 0.2151 d3.dn_loss_bbox: 0.1670 d3.dn_loss_iou: 0.2185 d4.dn_loss_cls: 0.2110 d4.dn_loss_bbox: 0.1649 d4.dn_loss_iou: 0.2156 d1.loss_lmm_region: 0.2450 loss_lmm_image: 1.0093 2024/11/10 13:30:26 - mmengine - INFO - Iter(train) [ 6900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:20:12 time: 1.9795 data_time: 0.0166 memory: 33260 grad_norm: 28.7348 loss: 10.4570 loss_cls: 0.3474 loss_bbox: 0.1131 loss_iou: 0.2002 d0.loss_cls: 0.4017 d0.loss_bbox: 0.1382 d0.loss_iou: 0.2179 d1.loss_cls: 0.3701 d1.loss_bbox: 0.1267 
d1.loss_iou: 0.2103 d2.loss_cls: 0.3613 d2.loss_bbox: 0.1173 d2.loss_iou: 0.2036 d3.loss_cls: 0.3496 d3.loss_bbox: 0.1161 d3.loss_iou: 0.2017 d4.loss_cls: 0.3442 d4.loss_bbox: 0.1133 d4.loss_iou: 0.2007 enc_loss_cls: 0.4008 enc_loss_bbox: 0.1408 enc_loss_iou: 0.2352 dn_loss_cls: 0.2582 dn_loss_bbox: 0.1583 dn_loss_iou: 0.2073 d0.dn_loss_cls: 0.3403 d0.dn_loss_bbox: 0.3077 d0.dn_loss_iou: 0.3583 d1.dn_loss_cls: 0.2902 d1.dn_loss_bbox: 0.1945 d1.dn_loss_iou: 0.2418 d2.dn_loss_cls: 0.2721 d2.dn_loss_bbox: 0.1701 d2.dn_loss_iou: 0.2173 d3.dn_loss_cls: 0.2606 d3.dn_loss_bbox: 0.1604 d3.dn_loss_iou: 0.2101 d4.dn_loss_cls: 0.2563 d4.dn_loss_bbox: 0.1583 d4.dn_loss_iou: 0.2073 d1.loss_lmm_region: 0.3070 loss_lmm_image: 0.9706 2024/11/10 13:33:45 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 13:33:45 - mmengine - INFO - Iter(train) [ 7000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:16:31 time: 1.9530 data_time: 0.0168 memory: 34675 grad_norm: 26.5569 loss: 10.6090 loss_cls: 0.3794 loss_bbox: 0.1344 loss_iou: 0.2538 d0.loss_cls: 0.4482 d0.loss_bbox: 0.1398 d0.loss_iou: 0.2659 d1.loss_cls: 0.4052 d1.loss_bbox: 0.1352 d1.loss_iou: 0.2587 d2.loss_cls: 0.3945 d2.loss_bbox: 0.1316 d2.loss_iou: 0.2535 d3.loss_cls: 0.3864 d3.loss_bbox: 0.1328 d3.loss_iou: 0.2526 d4.loss_cls: 0.3809 d4.loss_bbox: 0.1326 d4.loss_iou: 0.2532 enc_loss_cls: 0.4368 enc_loss_bbox: 0.1568 enc_loss_iou: 0.2954 dn_loss_cls: 0.1733 dn_loss_bbox: 0.1517 dn_loss_iou: 0.2090 d0.dn_loss_cls: 0.2596 d0.dn_loss_bbox: 0.3104 d0.dn_loss_iou: 0.3660 d1.dn_loss_cls: 0.2064 d1.dn_loss_bbox: 0.1917 d1.dn_loss_iou: 0.2474 d2.dn_loss_cls: 0.1825 d2.dn_loss_bbox: 0.1638 d2.dn_loss_iou: 0.2200 d3.dn_loss_cls: 0.1765 d3.dn_loss_bbox: 0.1550 d3.dn_loss_iou: 0.2125 d4.dn_loss_cls: 0.1742 d4.dn_loss_bbox: 0.1517 d4.dn_loss_iou: 0.2090 d1.loss_lmm_region: 0.2315 loss_lmm_image: 0.9891 2024/11/10 13:37:06 - mmengine - INFO - Iter(train) [ 7100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:13:37 time: 1.9936 data_time: 0.0166 memory: 34035 grad_norm: 25.8039 loss: 11.8612 loss_cls: 0.4297 loss_bbox: 0.1530 loss_iou: 0.2812 d0.loss_cls: 0.4881 d0.loss_bbox: 0.1711 d0.loss_iou: 0.3019 d1.loss_cls: 0.4481 d1.loss_bbox: 0.1609 d1.loss_iou: 0.2905 d2.loss_cls: 0.4397 d2.loss_bbox: 0.1561 d2.loss_iou: 0.2830 d3.loss_cls: 0.4341 d3.loss_bbox: 0.1541 d3.loss_iou: 0.2804 d4.loss_cls: 0.4300 d4.loss_bbox: 0.1555 d4.loss_iou: 0.2816 enc_loss_cls: 0.4737 enc_loss_bbox: 0.1884 enc_loss_iou: 0.3226 dn_loss_cls: 0.2167 dn_loss_bbox: 0.1863 dn_loss_iou: 0.2288 d0.dn_loss_cls: 0.2967 d0.dn_loss_bbox: 0.3455 d0.dn_loss_iou: 0.3769 d1.dn_loss_cls: 0.2483 d1.dn_loss_bbox: 0.2211 d1.dn_loss_iou: 0.2622 d2.dn_loss_cls: 0.2278 d2.dn_loss_bbox: 0.1969 d2.dn_loss_iou: 0.2395 d3.dn_loss_cls: 0.2189 d3.dn_loss_bbox: 0.1887 d3.dn_loss_iou: 0.2314 d4.dn_loss_cls: 0.2178 d4.dn_loss_bbox: 0.1863 d4.dn_loss_iou: 0.2287 d1.loss_lmm_region: 0.2378 loss_lmm_image: 0.9811 2024/11/10 13:40:25 - mmengine - INFO - Iter(train) [ 7200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:10:17 time: 2.0000 data_time: 0.0164 memory: 33756 grad_norm: 27.4830 loss: 11.7987 loss_cls: 0.4069 loss_bbox: 0.1714 loss_iou: 0.3191 d0.loss_cls: 0.4584 d0.loss_bbox: 0.1784 d0.loss_iou: 0.3312 d1.loss_cls: 0.4241 d1.loss_bbox: 0.1754 d1.loss_iou: 0.3278 d2.loss_cls: 0.4135 d2.loss_bbox: 0.1728 d2.loss_iou: 0.3201 d3.loss_cls: 0.4151 d3.loss_bbox: 0.1704 d3.loss_iou: 0.3167 d4.loss_cls: 0.4067 d4.loss_bbox: 0.1710 d4.loss_iou: 0.3194 enc_loss_cls: 
0.4483 enc_loss_bbox: 0.1933 enc_loss_iou: 0.3580 dn_loss_cls: 0.1600 dn_loss_bbox: 0.1965 dn_loss_iou: 0.2519 d0.dn_loss_cls: 0.2361 d0.dn_loss_bbox: 0.3443 d0.dn_loss_iou: 0.3984 d1.dn_loss_cls: 0.1899 d1.dn_loss_bbox: 0.2326 d1.dn_loss_iou: 0.2852 d2.dn_loss_cls: 0.1681 d2.dn_loss_bbox: 0.2081 d2.dn_loss_iou: 0.2630 d3.dn_loss_cls: 0.1637 d3.dn_loss_bbox: 0.1990 d3.dn_loss_iou: 0.2544 d4.dn_loss_cls: 0.1598 d4.dn_loss_bbox: 0.1965 d4.dn_loss_iou: 0.2516 d1.loss_lmm_region: 0.2015 loss_lmm_image: 0.9401 2024/11/10 13:43:46 - mmengine - INFO - Iter(train) [ 7300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:07:15 time: 2.0058 data_time: 0.0166 memory: 32478 grad_norm: 26.2227 loss: 9.7433 loss_cls: 0.3247 loss_bbox: 0.1131 loss_iou: 0.2131 d0.loss_cls: 0.3818 d0.loss_bbox: 0.1222 d0.loss_iou: 0.2266 d1.loss_cls: 0.3453 d1.loss_bbox: 0.1179 d1.loss_iou: 0.2207 d2.loss_cls: 0.3368 d2.loss_bbox: 0.1141 d2.loss_iou: 0.2191 d3.loss_cls: 0.3285 d3.loss_bbox: 0.1121 d3.loss_iou: 0.2166 d4.loss_cls: 0.3260 d4.loss_bbox: 0.1136 d4.loss_iou: 0.2151 enc_loss_cls: 0.3702 enc_loss_bbox: 0.1441 enc_loss_iou: 0.2518 dn_loss_cls: 0.1656 dn_loss_bbox: 0.1586 dn_loss_iou: 0.2105 d0.dn_loss_cls: 0.2533 d0.dn_loss_bbox: 0.3169 d0.dn_loss_iou: 0.3552 d1.dn_loss_cls: 0.2028 d1.dn_loss_bbox: 0.2006 d1.dn_loss_iou: 0.2439 d2.dn_loss_cls: 0.1803 d2.dn_loss_bbox: 0.1708 d2.dn_loss_iou: 0.2200 d3.dn_loss_cls: 0.1734 d3.dn_loss_bbox: 0.1618 d3.dn_loss_iou: 0.2132 d4.dn_loss_cls: 0.1679 d4.dn_loss_bbox: 0.1586 d4.dn_loss_iou: 0.2104 d1.loss_lmm_region: 0.2236 loss_lmm_image: 0.9424 2024/11/10 13:47:05 - mmengine - INFO - Iter(train) [ 7400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:03:55 time: 1.9739 data_time: 0.0167 memory: 34914 grad_norm: 26.7456 loss: 10.5749 loss_cls: 0.3768 loss_bbox: 0.1195 loss_iou: 0.2306 d0.loss_cls: 0.4336 d0.loss_bbox: 0.1330 d0.loss_iou: 0.2459 d1.loss_cls: 0.4006 d1.loss_bbox: 0.1269 d1.loss_iou: 0.2388 d2.loss_cls: 0.3881 d2.loss_bbox: 0.1218 d2.loss_iou: 0.2341 d3.loss_cls: 0.3836 d3.loss_bbox: 0.1192 d3.loss_iou: 0.2287 d4.loss_cls: 0.3795 d4.loss_bbox: 0.1194 d4.loss_iou: 0.2310 enc_loss_cls: 0.4245 enc_loss_bbox: 0.1489 enc_loss_iou: 0.2703 dn_loss_cls: 0.2253 dn_loss_bbox: 0.1529 dn_loss_iou: 0.2003 d0.dn_loss_cls: 0.3019 d0.dn_loss_bbox: 0.3019 d0.dn_loss_iou: 0.3472 d1.dn_loss_cls: 0.2638 d1.dn_loss_bbox: 0.1895 d1.dn_loss_iou: 0.2343 d2.dn_loss_cls: 0.2417 d2.dn_loss_bbox: 0.1661 d2.dn_loss_iou: 0.2111 d3.dn_loss_cls: 0.2329 d3.dn_loss_bbox: 0.1561 d3.dn_loss_iou: 0.2033 d4.dn_loss_cls: 0.2263 d4.dn_loss_bbox: 0.1530 d4.dn_loss_iou: 0.2004 d1.loss_lmm_region: 0.2373 loss_lmm_image: 0.9746 2024/11/10 13:50:27 - mmengine - INFO - Iter(train) [ 7500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 7:01:13 time: 2.0190 data_time: 0.0167 memory: 33832 grad_norm: 29.9779 loss: 9.6351 loss_cls: 0.3141 loss_bbox: 0.1326 loss_iou: 0.2060 d0.loss_cls: 0.3724 d0.loss_bbox: 0.1460 d0.loss_iou: 0.2219 d1.loss_cls: 0.3359 d1.loss_bbox: 0.1377 d1.loss_iou: 0.2100 d2.loss_cls: 0.3216 d2.loss_bbox: 0.1346 d2.loss_iou: 0.2073 d3.loss_cls: 0.3190 d3.loss_bbox: 0.1328 d3.loss_iou: 0.2044 d4.loss_cls: 0.3168 d4.loss_bbox: 0.1315 d4.loss_iou: 0.2038 enc_loss_cls: 0.3656 enc_loss_bbox: 0.1702 enc_loss_iou: 0.2504 dn_loss_cls: 0.1464 dn_loss_bbox: 0.1697 dn_loss_iou: 0.1924 d0.dn_loss_cls: 0.2266 d0.dn_loss_bbox: 0.3423 d0.dn_loss_iou: 0.3401 d1.dn_loss_cls: 0.1778 d1.dn_loss_bbox: 0.2082 d1.dn_loss_iou: 0.2273 d2.dn_loss_cls: 0.1568 d2.dn_loss_bbox: 0.1833 
d2.dn_loss_iou: 0.2035 d3.dn_loss_cls: 0.1501 d3.dn_loss_bbox: 0.1734 d3.dn_loss_iou: 0.1959 d4.dn_loss_cls: 0.1453 d4.dn_loss_bbox: 0.1697 d4.dn_loss_iou: 0.1924 d1.loss_lmm_region: 0.1978 loss_lmm_image: 1.0014 2024/11/10 13:53:45 - mmengine - INFO - Iter(train) [ 7600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:57:24 time: 2.0040 data_time: 0.0166 memory: 35370 grad_norm: 29.3172 loss: 10.4630 loss_cls: 0.3168 loss_bbox: 0.1296 loss_iou: 0.2228 d0.loss_cls: 0.3664 d0.loss_bbox: 0.1430 d0.loss_iou: 0.2386 d1.loss_cls: 0.3278 d1.loss_bbox: 0.1348 d1.loss_iou: 0.2292 d2.loss_cls: 0.3237 d2.loss_bbox: 0.1290 d2.loss_iou: 0.2252 d3.loss_cls: 0.3165 d3.loss_bbox: 0.1342 d3.loss_iou: 0.2259 d4.loss_cls: 0.3149 d4.loss_bbox: 0.1302 d4.loss_iou: 0.2241 enc_loss_cls: 0.3536 enc_loss_bbox: 0.1622 enc_loss_iou: 0.2672 dn_loss_cls: 0.1804 dn_loss_bbox: 0.1991 dn_loss_iou: 0.2431 d0.dn_loss_cls: 0.2656 d0.dn_loss_bbox: 0.3807 d0.dn_loss_iou: 0.4145 d1.dn_loss_cls: 0.2140 d1.dn_loss_bbox: 0.2397 d1.dn_loss_iou: 0.2820 d2.dn_loss_cls: 0.1940 d2.dn_loss_bbox: 0.2098 d2.dn_loss_iou: 0.2548 d3.dn_loss_cls: 0.1844 d3.dn_loss_bbox: 0.2016 d3.dn_loss_iou: 0.2465 d4.dn_loss_cls: 0.1812 d4.dn_loss_bbox: 0.1991 d4.dn_loss_iou: 0.2430 d1.loss_lmm_region: 0.2331 loss_lmm_image: 0.9808 2024/11/10 13:57:05 - mmengine - INFO - Iter(train) [ 7700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:54:02 time: 1.9834 data_time: 0.0165 memory: 35150 grad_norm: 26.7686 loss: 12.0600 loss_cls: 0.4127 loss_bbox: 0.1660 loss_iou: 0.2850 d0.loss_cls: 0.4844 d0.loss_bbox: 0.1784 d0.loss_iou: 0.3017 d1.loss_cls: 0.4442 d1.loss_bbox: 0.1718 d1.loss_iou: 0.2879 d2.loss_cls: 0.4264 d2.loss_bbox: 0.1677 d2.loss_iou: 0.2903 d3.loss_cls: 0.4217 d3.loss_bbox: 0.1636 d3.loss_iou: 0.2848 d4.loss_cls: 0.4148 d4.loss_bbox: 0.1667 d4.loss_iou: 0.2859 enc_loss_cls: 0.4651 enc_loss_bbox: 0.1965 enc_loss_iou: 0.3309 dn_loss_cls: 0.1866 dn_loss_bbox: 0.2107 dn_loss_iou: 0.2530 d0.dn_loss_cls: 0.2779 d0.dn_loss_bbox: 0.3761 d0.dn_loss_iou: 0.4108 d1.dn_loss_cls: 0.2241 d1.dn_loss_bbox: 0.2501 d1.dn_loss_iou: 0.2881 d2.dn_loss_cls: 0.2007 d2.dn_loss_bbox: 0.2246 d2.dn_loss_iou: 0.2637 d3.dn_loss_cls: 0.1920 d3.dn_loss_bbox: 0.2134 d3.dn_loss_iou: 0.2561 d4.dn_loss_cls: 0.1882 d4.dn_loss_bbox: 0.2107 d4.dn_loss_iou: 0.2530 d1.loss_lmm_region: 0.2454 loss_lmm_image: 0.9882 2024/11/10 14:00:26 - mmengine - INFO - Iter(train) [ 7800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:51:06 time: 2.0174 data_time: 0.0164 memory: 33999 grad_norm: 24.5555 loss: 11.2462 loss_cls: 0.3581 loss_bbox: 0.1595 loss_iou: 0.2791 d0.loss_cls: 0.4161 d0.loss_bbox: 0.1614 d0.loss_iou: 0.2937 d1.loss_cls: 0.3784 d1.loss_bbox: 0.1589 d1.loss_iou: 0.2836 d2.loss_cls: 0.3712 d2.loss_bbox: 0.1548 d2.loss_iou: 0.2794 d3.loss_cls: 0.3644 d3.loss_bbox: 0.1537 d3.loss_iou: 0.2794 d4.loss_cls: 0.3600 d4.loss_bbox: 0.1547 d4.loss_iou: 0.2796 enc_loss_cls: 0.4066 enc_loss_bbox: 0.1790 enc_loss_iou: 0.3182 dn_loss_cls: 0.1785 dn_loss_bbox: 0.1828 dn_loss_iou: 0.2555 d0.dn_loss_cls: 0.2595 d0.dn_loss_bbox: 0.3374 d0.dn_loss_iou: 0.4154 d1.dn_loss_cls: 0.2085 d1.dn_loss_bbox: 0.2214 d1.dn_loss_iou: 0.2937 d2.dn_loss_cls: 0.1923 d2.dn_loss_bbox: 0.1937 d2.dn_loss_iou: 0.2677 d3.dn_loss_cls: 0.1833 d3.dn_loss_bbox: 0.1850 d3.dn_loss_iou: 0.2582 d4.dn_loss_cls: 0.1794 d4.dn_loss_bbox: 0.1829 d4.dn_loss_iou: 0.2555 d1.loss_lmm_region: 0.2366 loss_lmm_image: 0.9691 2024/11/10 14:03:43 - mmengine - INFO - Iter(train) [ 7900/150000] base_lr: 1.0000e-04 lr: 
1.0000e-04 eta: 3 days, 6:47:09 time: 1.9809 data_time: 0.0166 memory: 34413 grad_norm: 27.0955 loss: 9.9070 loss_cls: 0.3177 loss_bbox: 0.1189 loss_iou: 0.2138 d0.loss_cls: 0.3731 d0.loss_bbox: 0.1362 d0.loss_iou: 0.2315 d1.loss_cls: 0.3387 d1.loss_bbox: 0.1258 d1.loss_iou: 0.2229 d2.loss_cls: 0.3237 d2.loss_bbox: 0.1233 d2.loss_iou: 0.2179 d3.loss_cls: 0.3237 d3.loss_bbox: 0.1170 d3.loss_iou: 0.2118 d4.loss_cls: 0.3193 d4.loss_bbox: 0.1185 d4.loss_iou: 0.2128 enc_loss_cls: 0.3591 enc_loss_bbox: 0.1555 enc_loss_iou: 0.2595 dn_loss_cls: 0.1639 dn_loss_bbox: 0.1747 dn_loss_iou: 0.2115 d0.dn_loss_cls: 0.2462 d0.dn_loss_bbox: 0.3308 d0.dn_loss_iou: 0.3588 d1.dn_loss_cls: 0.1983 d1.dn_loss_bbox: 0.2102 d1.dn_loss_iou: 0.2460 d2.dn_loss_cls: 0.1760 d2.dn_loss_bbox: 0.1860 d2.dn_loss_iou: 0.2212 d3.dn_loss_cls: 0.1686 d3.dn_loss_bbox: 0.1773 d3.dn_loss_iou: 0.2148 d4.dn_loss_cls: 0.1646 d4.dn_loss_bbox: 0.1748 d4.dn_loss_iou: 0.2116 d1.loss_lmm_region: 0.2255 loss_lmm_image: 1.0253 2024/11/10 14:07:02 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 14:07:02 - mmengine - INFO - Iter(train) [ 8000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:43:46 time: 1.9858 data_time: 0.0168 memory: 34209 grad_norm: nan loss: 10.5796 loss_cls: 0.3343 loss_bbox: 0.1412 loss_iou: 0.2508 d0.loss_cls: 0.3977 d0.loss_bbox: 0.1586 d0.loss_iou: 0.2635 d1.loss_cls: 0.3563 d1.loss_bbox: 0.1444 d1.loss_iou: 0.2527 d2.loss_cls: 0.3431 d2.loss_bbox: 0.1419 d2.loss_iou: 0.2490 d3.loss_cls: 0.3377 d3.loss_bbox: 0.1416 d3.loss_iou: 0.2515 d4.loss_cls: 0.3366 d4.loss_bbox: 0.1398 d4.loss_iou: 0.2482 enc_loss_cls: 0.3891 enc_loss_bbox: 0.1723 enc_loss_iou: 0.2895 dn_loss_cls: 0.1519 dn_loss_bbox: 0.1949 dn_loss_iou: 0.2322 d0.dn_loss_cls: 0.2281 d0.dn_loss_bbox: 0.3557 d0.dn_loss_iou: 0.3858 d1.dn_loss_cls: 0.1808 d1.dn_loss_bbox: 0.2297 d1.dn_loss_iou: 0.2660 d2.dn_loss_cls: 0.1597 d2.dn_loss_bbox: 0.2051 d2.dn_loss_iou: 0.2431 d3.dn_loss_cls: 0.1536 d3.dn_loss_bbox: 0.1974 d3.dn_loss_iou: 0.2352 d4.dn_loss_cls: 0.1511 d4.dn_loss_bbox: 0.1950 d4.dn_loss_iou: 0.2323 d1.loss_lmm_region: 0.2162 loss_lmm_image: 1.0262 2024/11/10 14:10:23 - mmengine - INFO - Iter(train) [ 8100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:40:41 time: 1.9991 data_time: 0.0166 memory: 34229 grad_norm: 32.0082 loss: 10.4992 loss_cls: 0.3596 loss_bbox: 0.1186 loss_iou: 0.2248 d0.loss_cls: 0.4158 d0.loss_bbox: 0.1321 d0.loss_iou: 0.2375 d1.loss_cls: 0.3811 d1.loss_bbox: 0.1267 d1.loss_iou: 0.2288 d2.loss_cls: 0.3661 d2.loss_bbox: 0.1273 d2.loss_iou: 0.2305 d3.loss_cls: 0.3644 d3.loss_bbox: 0.1211 d3.loss_iou: 0.2260 d4.loss_cls: 0.3599 d4.loss_bbox: 0.1172 d4.loss_iou: 0.2254 enc_loss_cls: 0.3963 enc_loss_bbox: 0.1461 enc_loss_iou: 0.2605 dn_loss_cls: 0.1972 dn_loss_bbox: 0.1776 dn_loss_iou: 0.2205 d0.dn_loss_cls: 0.2782 d0.dn_loss_bbox: 0.3347 d0.dn_loss_iou: 0.3627 d1.dn_loss_cls: 0.2286 d1.dn_loss_bbox: 0.2170 d1.dn_loss_iou: 0.2535 d2.dn_loss_cls: 0.2094 d2.dn_loss_bbox: 0.1932 d2.dn_loss_iou: 0.2311 d3.dn_loss_cls: 0.2007 d3.dn_loss_bbox: 0.1808 d3.dn_loss_iou: 0.2233 d4.dn_loss_cls: 0.1974 d4.dn_loss_bbox: 0.1777 d4.dn_loss_iou: 0.2205 d1.loss_lmm_region: 0.2348 loss_lmm_image: 0.9946 2024/11/10 14:13:44 - mmengine - INFO - Iter(train) [ 8200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:37:48 time: 2.0223 data_time: 0.0163 memory: 35822 grad_norm: 33.2186 loss: 10.5047 loss_cls: 0.3539 loss_bbox: 0.1243 loss_iou: 0.2350 d0.loss_cls: 0.4011 d0.loss_bbox: 0.1468 d0.loss_iou: 0.2603 
d1.loss_cls: 0.3651 d1.loss_bbox: 0.1415 d1.loss_iou: 0.2533 d2.loss_cls: 0.3630 d2.loss_bbox: 0.1322 d2.loss_iou: 0.2419 d3.loss_cls: 0.3525 d3.loss_bbox: 0.1296 d3.loss_iou: 0.2435 d4.loss_cls: 0.3481 d4.loss_bbox: 0.1304 d4.loss_iou: 0.2434 enc_loss_cls: 0.3899 enc_loss_bbox: 0.1701 enc_loss_iou: 0.2927 dn_loss_cls: 0.1525 dn_loss_bbox: 0.1765 dn_loss_iou: 0.2325 d0.dn_loss_cls: 0.2412 d0.dn_loss_bbox: 0.3357 d0.dn_loss_iou: 0.3938 d1.dn_loss_cls: 0.1917 d1.dn_loss_bbox: 0.2162 d1.dn_loss_iou: 0.2716 d2.dn_loss_cls: 0.1672 d2.dn_loss_bbox: 0.1905 d2.dn_loss_iou: 0.2458 d3.dn_loss_cls: 0.1599 d3.dn_loss_bbox: 0.1814 d3.dn_loss_iou: 0.2364 d4.dn_loss_cls: 0.1530 d4.dn_loss_bbox: 0.1764 d4.dn_loss_iou: 0.2326 d1.loss_lmm_region: 0.2350 loss_lmm_image: 0.9965 2024/11/10 14:17:02 - mmengine - INFO - Iter(train) [ 8300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:33:55 time: 1.9943 data_time: 0.0162 memory: 33480 grad_norm: 28.1081 loss: 10.1858 loss_cls: 0.2949 loss_bbox: 0.1342 loss_iou: 0.2372 d0.loss_cls: 0.3448 d0.loss_bbox: 0.1440 d0.loss_iou: 0.2511 d1.loss_cls: 0.3155 d1.loss_bbox: 0.1381 d1.loss_iou: 0.2398 d2.loss_cls: 0.3054 d2.loss_bbox: 0.1356 d2.loss_iou: 0.2379 d3.loss_cls: 0.2990 d3.loss_bbox: 0.1334 d3.loss_iou: 0.2357 d4.loss_cls: 0.2971 d4.loss_bbox: 0.1343 d4.loss_iou: 0.2377 enc_loss_cls: 0.3376 enc_loss_bbox: 0.1560 enc_loss_iou: 0.2665 dn_loss_cls: 0.1478 dn_loss_bbox: 0.2079 dn_loss_iou: 0.2371 d0.dn_loss_cls: 0.2334 d0.dn_loss_bbox: 0.3895 d0.dn_loss_iou: 0.3969 d1.dn_loss_cls: 0.1842 d1.dn_loss_bbox: 0.2515 d1.dn_loss_iou: 0.2729 d2.dn_loss_cls: 0.1653 d2.dn_loss_bbox: 0.2233 d2.dn_loss_iou: 0.2499 d3.dn_loss_cls: 0.1552 d3.dn_loss_bbox: 0.2109 d3.dn_loss_iou: 0.2403 d4.dn_loss_cls: 0.1489 d4.dn_loss_bbox: 0.2080 d4.dn_loss_iou: 0.2372 d1.loss_lmm_region: 0.2115 loss_lmm_image: 0.9383 2024/11/10 14:20:19 - mmengine - INFO - Iter(train) [ 8400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:29:49 time: 1.9713 data_time: 0.0162 memory: 33357 grad_norm: 23.5020 loss: 11.5512 loss_cls: 0.3774 loss_bbox: 0.1629 loss_iou: 0.2693 d0.loss_cls: 0.4407 d0.loss_bbox: 0.1794 d0.loss_iou: 0.2866 d1.loss_cls: 0.4049 d1.loss_bbox: 0.1700 d1.loss_iou: 0.2745 d2.loss_cls: 0.3910 d2.loss_bbox: 0.1668 d2.loss_iou: 0.2717 d3.loss_cls: 0.3825 d3.loss_bbox: 0.1650 d3.loss_iou: 0.2706 d4.loss_cls: 0.3767 d4.loss_bbox: 0.1641 d4.loss_iou: 0.2700 enc_loss_cls: 0.4300 enc_loss_bbox: 0.2052 enc_loss_iou: 0.3207 dn_loss_cls: 0.1886 dn_loss_bbox: 0.2033 dn_loss_iou: 0.2392 d0.dn_loss_cls: 0.2718 d0.dn_loss_bbox: 0.3703 d0.dn_loss_iou: 0.3844 d1.dn_loss_cls: 0.2226 d1.dn_loss_bbox: 0.2394 d1.dn_loss_iou: 0.2714 d2.dn_loss_cls: 0.2034 d2.dn_loss_bbox: 0.2130 d2.dn_loss_iou: 0.2493 d3.dn_loss_cls: 0.1927 d3.dn_loss_bbox: 0.2055 d3.dn_loss_iou: 0.2423 d4.dn_loss_cls: 0.1894 d4.dn_loss_bbox: 0.2033 d4.dn_loss_iou: 0.2392 d1.loss_lmm_region: 0.2460 loss_lmm_image: 0.9959 2024/11/10 14:23:38 - mmengine - INFO - Iter(train) [ 8500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:26:35 time: 1.9916 data_time: 0.0163 memory: 34146 grad_norm: 27.4760 loss: 10.1604 loss_cls: 0.3012 loss_bbox: 0.1482 loss_iou: 0.2476 d0.loss_cls: 0.3616 d0.loss_bbox: 0.1647 d0.loss_iou: 0.2671 d1.loss_cls: 0.3234 d1.loss_bbox: 0.1552 d1.loss_iou: 0.2560 d2.loss_cls: 0.3116 d2.loss_bbox: 0.1528 d2.loss_iou: 0.2512 d3.loss_cls: 0.3056 d3.loss_bbox: 0.1512 d3.loss_iou: 0.2495 d4.loss_cls: 0.3000 d4.loss_bbox: 0.1490 d4.loss_iou: 0.2486 enc_loss_cls: 0.3563 enc_loss_bbox: 0.1785 enc_loss_iou: 
0.2909 dn_loss_cls: 0.1140 dn_loss_bbox: 0.2046 dn_loss_iou: 0.2365 d0.dn_loss_cls: 0.1894 d0.dn_loss_bbox: 0.3581 d0.dn_loss_iou: 0.3821 d1.dn_loss_cls: 0.1471 d1.dn_loss_bbox: 0.2431 d1.dn_loss_iou: 0.2720 d2.dn_loss_cls: 0.1284 d2.dn_loss_bbox: 0.2169 d2.dn_loss_iou: 0.2484 d3.dn_loss_cls: 0.1184 d3.dn_loss_bbox: 0.2076 d3.dn_loss_iou: 0.2397 d4.dn_loss_cls: 0.1167 d4.dn_loss_bbox: 0.2044 d4.dn_loss_iou: 0.2365 d1.loss_lmm_region: 0.1901 loss_lmm_image: 0.9361 2024/11/10 14:26:58 - mmengine - INFO - Iter(train) [ 8600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:23:17 time: 1.9919 data_time: 0.0164 memory: 33852 grad_norm: 23.9470 loss: 9.2995 loss_cls: 0.3217 loss_bbox: 0.1137 loss_iou: 0.2107 d0.loss_cls: 0.3802 d0.loss_bbox: 0.1256 d0.loss_iou: 0.2227 d1.loss_cls: 0.3439 d1.loss_bbox: 0.1200 d1.loss_iou: 0.2132 d2.loss_cls: 0.3345 d2.loss_bbox: 0.1113 d2.loss_iou: 0.2071 d3.loss_cls: 0.3250 d3.loss_bbox: 0.1140 d3.loss_iou: 0.2111 d4.loss_cls: 0.3221 d4.loss_bbox: 0.1144 d4.loss_iou: 0.2110 enc_loss_cls: 0.3743 enc_loss_bbox: 0.1420 enc_loss_iou: 0.2455 dn_loss_cls: 0.1391 dn_loss_bbox: 0.1420 dn_loss_iou: 0.1913 d0.dn_loss_cls: 0.2238 d0.dn_loss_bbox: 0.2933 d0.dn_loss_iou: 0.3407 d1.dn_loss_cls: 0.1686 d1.dn_loss_bbox: 0.1714 d1.dn_loss_iou: 0.2245 d2.dn_loss_cls: 0.1501 d2.dn_loss_bbox: 0.1483 d2.dn_loss_iou: 0.2003 d3.dn_loss_cls: 0.1448 d3.dn_loss_bbox: 0.1438 d3.dn_loss_iou: 0.1940 d4.dn_loss_cls: 0.1395 d4.dn_loss_bbox: 0.1419 d4.dn_loss_iou: 0.1914 d1.loss_lmm_region: 0.1922 loss_lmm_image: 0.9945 2024/11/10 14:30:16 - mmengine - INFO - Iter(train) [ 8700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:19:29 time: 1.9794 data_time: 0.0163 memory: 32647 grad_norm: 31.2050 loss: 11.0883 loss_cls: 0.3432 loss_bbox: 0.1557 loss_iou: 0.2893 d0.loss_cls: 0.4006 d0.loss_bbox: 0.1724 d0.loss_iou: 0.3121 d1.loss_cls: 0.3679 d1.loss_bbox: 0.1599 d1.loss_iou: 0.2969 d2.loss_cls: 0.3570 d2.loss_bbox: 0.1503 d2.loss_iou: 0.2894 d3.loss_cls: 0.3515 d3.loss_bbox: 0.1512 d3.loss_iou: 0.2858 d4.loss_cls: 0.3463 d4.loss_bbox: 0.1516 d4.loss_iou: 0.2880 enc_loss_cls: 0.3906 enc_loss_bbox: 0.1863 enc_loss_iou: 0.3345 dn_loss_cls: 0.1499 dn_loss_bbox: 0.2004 dn_loss_iou: 0.2444 d0.dn_loss_cls: 0.2318 d0.dn_loss_bbox: 0.3529 d0.dn_loss_iou: 0.3942 d1.dn_loss_cls: 0.1814 d1.dn_loss_bbox: 0.2312 d1.dn_loss_iou: 0.2780 d2.dn_loss_cls: 0.1628 d2.dn_loss_bbox: 0.2092 d2.dn_loss_iou: 0.2546 d3.dn_loss_cls: 0.1576 d3.dn_loss_bbox: 0.2015 d3.dn_loss_iou: 0.2471 d4.dn_loss_cls: 0.1511 d4.dn_loss_bbox: 0.2004 d4.dn_loss_iou: 0.2445 d1.loss_lmm_region: 0.2330 loss_lmm_image: 0.9821 2024/11/10 14:33:32 - mmengine - INFO - Iter(train) [ 8800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:15:12 time: 1.9505 data_time: 0.0165 memory: 33473 grad_norm: 26.8216 loss: 10.3874 loss_cls: 0.3374 loss_bbox: 0.1489 loss_iou: 0.2619 d0.loss_cls: 0.3920 d0.loss_bbox: 0.1544 d0.loss_iou: 0.2733 d1.loss_cls: 0.3601 d1.loss_bbox: 0.1483 d1.loss_iou: 0.2677 d2.loss_cls: 0.3516 d2.loss_bbox: 0.1462 d2.loss_iou: 0.2620 d3.loss_cls: 0.3457 d3.loss_bbox: 0.1443 d3.loss_iou: 0.2599 d4.loss_cls: 0.3373 d4.loss_bbox: 0.1486 d4.loss_iou: 0.2624 enc_loss_cls: 0.3820 enc_loss_bbox: 0.1684 enc_loss_iou: 0.2973 dn_loss_cls: 0.1380 dn_loss_bbox: 0.1705 dn_loss_iou: 0.2255 d0.dn_loss_cls: 0.2291 d0.dn_loss_bbox: 0.3182 d0.dn_loss_iou: 0.3729 d1.dn_loss_cls: 0.1729 d1.dn_loss_bbox: 0.1994 d1.dn_loss_iou: 0.2574 d2.dn_loss_cls: 0.1515 d2.dn_loss_bbox: 0.1801 d2.dn_loss_iou: 0.2353 d3.dn_loss_cls: 0.1436 
d3.dn_loss_bbox: 0.1725 d3.dn_loss_iou: 0.2279 d4.dn_loss_cls: 0.1384 d4.dn_loss_bbox: 0.1705 d4.dn_loss_iou: 0.2255 d1.loss_lmm_region: 0.2264 loss_lmm_image: 0.9821 2024/11/10 14:36:52 - mmengine - INFO - Iter(train) [ 8900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:11:53 time: 1.9729 data_time: 0.0164 memory: 32796 grad_norm: 31.1591 loss: 11.6818 loss_cls: 0.4188 loss_bbox: 0.1690 loss_iou: 0.2781 d0.loss_cls: 0.4834 d0.loss_bbox: 0.1819 d0.loss_iou: 0.2879 d1.loss_cls: 0.4513 d1.loss_bbox: 0.1660 d1.loss_iou: 0.2805 d2.loss_cls: 0.4347 d2.loss_bbox: 0.1690 d2.loss_iou: 0.2797 d3.loss_cls: 0.4263 d3.loss_bbox: 0.1688 d3.loss_iou: 0.2783 d4.loss_cls: 0.4198 d4.loss_bbox: 0.1711 d4.loss_iou: 0.2776 enc_loss_cls: 0.4746 enc_loss_bbox: 0.1932 enc_loss_iou: 0.3110 dn_loss_cls: 0.2351 dn_loss_bbox: 0.1582 dn_loss_iou: 0.2122 d0.dn_loss_cls: 0.3072 d0.dn_loss_bbox: 0.2980 d0.dn_loss_iou: 0.3519 d1.dn_loss_cls: 0.2620 d1.dn_loss_bbox: 0.1907 d1.dn_loss_iou: 0.2454 d2.dn_loss_cls: 0.2397 d2.dn_loss_bbox: 0.1685 d2.dn_loss_iou: 0.2230 d3.dn_loss_cls: 0.2378 d3.dn_loss_bbox: 0.1607 d3.dn_loss_iou: 0.2152 d4.dn_loss_cls: 0.2366 d4.dn_loss_bbox: 0.1582 d4.dn_loss_iou: 0.2123 d1.loss_lmm_region: 0.2909 loss_lmm_image: 0.9570 2024/11/10 14:40:09 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 14:40:09 - mmengine - INFO - Iter(train) [ 9000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:08:05 time: 1.9641 data_time: 0.0163 memory: 34406 grad_norm: 24.3273 loss: 8.6991 loss_cls: 0.2659 loss_bbox: 0.1125 loss_iou: 0.2058 d0.loss_cls: 0.3175 d0.loss_bbox: 0.1279 d0.loss_iou: 0.2235 d1.loss_cls: 0.2802 d1.loss_bbox: 0.1183 d1.loss_iou: 0.2132 d2.loss_cls: 0.2730 d2.loss_bbox: 0.1153 d2.loss_iou: 0.2091 d3.loss_cls: 0.2694 d3.loss_bbox: 0.1135 d3.loss_iou: 0.2061 d4.loss_cls: 0.2656 d4.loss_bbox: 0.1131 d4.loss_iou: 0.2058 enc_loss_cls: 0.3132 enc_loss_bbox: 0.1423 enc_loss_iou: 0.2486 dn_loss_cls: 0.1084 dn_loss_bbox: 0.1433 dn_loss_iou: 0.1903 d0.dn_loss_cls: 0.1838 d0.dn_loss_bbox: 0.2907 d0.dn_loss_iou: 0.3375 d1.dn_loss_cls: 0.1360 d1.dn_loss_bbox: 0.1764 d1.dn_loss_iou: 0.2235 d2.dn_loss_cls: 0.1196 d2.dn_loss_bbox: 0.1545 d2.dn_loss_iou: 0.2003 d3.dn_loss_cls: 0.1126 d3.dn_loss_bbox: 0.1459 d3.dn_loss_iou: 0.1931 d4.dn_loss_cls: 0.1086 d4.dn_loss_bbox: 0.1432 d4.dn_loss_iou: 0.1901 d1.loss_lmm_region: 0.1995 loss_lmm_image: 1.0020 2024/11/10 14:43:32 - mmengine - INFO - Iter(train) [ 9100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:05:41 time: 2.0262 data_time: 0.0164 memory: 32678 grad_norm: nan loss: 12.9393 loss_cls: 0.5349 loss_bbox: 0.1525 loss_iou: 0.2503 d0.loss_cls: 0.5857 d0.loss_bbox: 0.1711 d0.loss_iou: 0.2720 d1.loss_cls: 0.5607 d1.loss_bbox: 0.1641 d1.loss_iou: 0.2620 d2.loss_cls: 0.5496 d2.loss_bbox: 0.1591 d2.loss_iou: 0.2585 d3.loss_cls: 0.5285 d3.loss_bbox: 0.1587 d3.loss_iou: 0.2566 d4.loss_cls: 0.5337 d4.loss_bbox: 0.1566 d4.loss_iou: 0.2529 enc_loss_cls: 0.5796 enc_loss_bbox: 0.1852 enc_loss_iou: 0.2888 dn_loss_cls: 0.2681 dn_loss_bbox: 0.1985 dn_loss_iou: 0.2313 d0.dn_loss_cls: 0.3647 d0.dn_loss_bbox: 0.3640 d0.dn_loss_iou: 0.3922 d1.dn_loss_cls: 0.3070 d1.dn_loss_bbox: 0.2341 d1.dn_loss_iou: 0.2678 d2.dn_loss_cls: 0.2841 d2.dn_loss_bbox: 0.2091 d2.dn_loss_iou: 0.2429 d3.dn_loss_cls: 0.2756 d3.dn_loss_bbox: 0.1997 d3.dn_loss_iou: 0.2339 d4.dn_loss_cls: 0.2727 d4.dn_loss_bbox: 0.1984 d4.dn_loss_iou: 0.2311 d1.loss_lmm_region: 0.2977 loss_lmm_image: 1.0050 2024/11/10 14:46:51 - mmengine - INFO - Iter(train) [ 
9200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 6:02:11 time: 1.9886 data_time: 0.0162 memory: 34227 grad_norm: 24.4182 loss: 9.6104 loss_cls: 0.3011 loss_bbox: 0.1184 loss_iou: 0.2161 d0.loss_cls: 0.3512 d0.loss_bbox: 0.1306 d0.loss_iou: 0.2345 d1.loss_cls: 0.3199 d1.loss_bbox: 0.1233 d1.loss_iou: 0.2267 d2.loss_cls: 0.3061 d2.loss_bbox: 0.1232 d2.loss_iou: 0.2225 d3.loss_cls: 0.3049 d3.loss_bbox: 0.1198 d3.loss_iou: 0.2192 d4.loss_cls: 0.3021 d4.loss_bbox: 0.1185 d4.loss_iou: 0.2168 enc_loss_cls: 0.3445 enc_loss_bbox: 0.1443 enc_loss_iou: 0.2579 dn_loss_cls: 0.1414 dn_loss_bbox: 0.1699 dn_loss_iou: 0.2129 d0.dn_loss_cls: 0.2250 d0.dn_loss_bbox: 0.3218 d0.dn_loss_iou: 0.3627 d1.dn_loss_cls: 0.1734 d1.dn_loss_bbox: 0.2068 d1.dn_loss_iou: 0.2482 d2.dn_loss_cls: 0.1529 d2.dn_loss_bbox: 0.1827 d2.dn_loss_iou: 0.2248 d3.dn_loss_cls: 0.1466 d3.dn_loss_bbox: 0.1734 d3.dn_loss_iou: 0.2168 d4.dn_loss_cls: 0.1418 d4.dn_loss_bbox: 0.1700 d4.dn_loss_iou: 0.2130 d1.loss_lmm_region: 0.2203 loss_lmm_image: 1.0041 2024/11/10 14:50:13 - mmengine - INFO - Iter(train) [ 9300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:59:22 time: 1.9994 data_time: 0.0164 memory: 34881 grad_norm: 29.0989 loss: 9.9069 loss_cls: 0.3033 loss_bbox: 0.1273 loss_iou: 0.2114 d0.loss_cls: 0.3564 d0.loss_bbox: 0.1335 d0.loss_iou: 0.2221 d1.loss_cls: 0.3230 d1.loss_bbox: 0.1310 d1.loss_iou: 0.2184 d2.loss_cls: 0.3176 d2.loss_bbox: 0.1286 d2.loss_iou: 0.2141 d3.loss_cls: 0.3106 d3.loss_bbox: 0.1254 d3.loss_iou: 0.2092 d4.loss_cls: 0.3056 d4.loss_bbox: 0.1264 d4.loss_iou: 0.2110 enc_loss_cls: 0.3374 enc_loss_bbox: 0.1511 enc_loss_iou: 0.2477 dn_loss_cls: 0.1742 dn_loss_bbox: 0.1801 dn_loss_iou: 0.2251 d0.dn_loss_cls: 0.2563 d0.dn_loss_bbox: 0.3336 d0.dn_loss_iou: 0.3770 d1.dn_loss_cls: 0.2090 d1.dn_loss_bbox: 0.2148 d1.dn_loss_iou: 0.2583 d2.dn_loss_cls: 0.1927 d2.dn_loss_bbox: 0.1892 d2.dn_loss_iou: 0.2345 d3.dn_loss_cls: 0.1854 d3.dn_loss_bbox: 0.1823 d3.dn_loss_iou: 0.2278 d4.dn_loss_cls: 0.1762 d4.dn_loss_bbox: 0.1800 d4.dn_loss_iou: 0.2251 d1.loss_lmm_region: 0.2175 loss_lmm_image: 0.9568 2024/11/10 14:53:34 - mmengine - INFO - Iter(train) [ 9400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:56:22 time: 2.0332 data_time: 0.0166 memory: 34001 grad_norm: 26.8691 loss: 10.5931 loss_cls: 0.3679 loss_bbox: 0.1299 loss_iou: 0.2416 d0.loss_cls: 0.4166 d0.loss_bbox: 0.1497 d0.loss_iou: 0.2608 d1.loss_cls: 0.3888 d1.loss_bbox: 0.1410 d1.loss_iou: 0.2489 d2.loss_cls: 0.3798 d2.loss_bbox: 0.1356 d2.loss_iou: 0.2452 d3.loss_cls: 0.3692 d3.loss_bbox: 0.1309 d3.loss_iou: 0.2442 d4.loss_cls: 0.3659 d4.loss_bbox: 0.1326 d4.loss_iou: 0.2453 enc_loss_cls: 0.4081 enc_loss_bbox: 0.1601 enc_loss_iou: 0.2774 dn_loss_cls: 0.1627 dn_loss_bbox: 0.1765 dn_loss_iou: 0.2124 d0.dn_loss_cls: 0.2505 d0.dn_loss_bbox: 0.3527 d0.dn_loss_iou: 0.3714 d1.dn_loss_cls: 0.1949 d1.dn_loss_bbox: 0.2191 d1.dn_loss_iou: 0.2500 d2.dn_loss_cls: 0.1738 d2.dn_loss_bbox: 0.1915 d2.dn_loss_iou: 0.2239 d3.dn_loss_cls: 0.1670 d3.dn_loss_bbox: 0.1802 d3.dn_loss_iou: 0.2157 d4.dn_loss_cls: 0.1630 d4.dn_loss_bbox: 0.1765 d4.dn_loss_iou: 0.2124 d1.loss_lmm_region: 0.2233 loss_lmm_image: 1.0362 2024/11/10 14:56:55 - mmengine - INFO - Iter(train) [ 9500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:53:26 time: 1.9971 data_time: 0.0164 memory: 34462 grad_norm: 25.7934 loss: 10.9397 loss_cls: 0.3454 loss_bbox: 0.1509 loss_iou: 0.2506 d0.loss_cls: 0.3999 d0.loss_bbox: 0.1679 d0.loss_iou: 0.2687 d1.loss_cls: 0.3634 d1.loss_bbox: 0.1573 
d1.loss_iou: 0.2606 d2.loss_cls: 0.3544 d2.loss_bbox: 0.1534 d2.loss_iou: 0.2542 d3.loss_cls: 0.3496 d3.loss_bbox: 0.1521 d3.loss_iou: 0.2517 d4.loss_cls: 0.3455 d4.loss_bbox: 0.1510 d4.loss_iou: 0.2501 enc_loss_cls: 0.3887 enc_loss_bbox: 0.1813 enc_loss_iou: 0.2903 dn_loss_cls: 0.1438 dn_loss_bbox: 0.2104 dn_loss_iou: 0.2465 d0.dn_loss_cls: 0.2391 d0.dn_loss_bbox: 0.3900 d0.dn_loss_iou: 0.4107 d1.dn_loss_cls: 0.1831 d1.dn_loss_bbox: 0.2460 d1.dn_loss_iou: 0.2830 d2.dn_loss_cls: 0.1617 d2.dn_loss_bbox: 0.2226 d2.dn_loss_iou: 0.2595 d3.dn_loss_cls: 0.1522 d3.dn_loss_bbox: 0.2135 d3.dn_loss_iou: 0.2495 d4.dn_loss_cls: 0.1454 d4.dn_loss_bbox: 0.2104 d4.dn_loss_iou: 0.2464 d1.loss_lmm_region: 0.2221 loss_lmm_image: 1.0169 2024/11/10 15:00:17 - mmengine - INFO - Iter(train) [ 9600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:50:42 time: 2.0177 data_time: 0.0166 memory: 35222 grad_norm: 27.9810 loss: 11.6499 loss_cls: 0.4376 loss_bbox: 0.1532 loss_iou: 0.3215 d0.loss_cls: 0.5191 d0.loss_bbox: 0.1669 d0.loss_iou: 0.3398 d1.loss_cls: 0.4735 d1.loss_bbox: 0.1577 d1.loss_iou: 0.3301 d2.loss_cls: 0.4519 d2.loss_bbox: 0.1538 d2.loss_iou: 0.3221 d3.loss_cls: 0.4393 d3.loss_bbox: 0.1539 d3.loss_iou: 0.3224 d4.loss_cls: 0.4379 d4.loss_bbox: 0.1544 d4.loss_iou: 0.3217 enc_loss_cls: 0.4978 enc_loss_bbox: 0.1908 enc_loss_iou: 0.3757 dn_loss_cls: 0.1499 dn_loss_bbox: 0.1571 dn_loss_iou: 0.2181 d0.dn_loss_cls: 0.2329 d0.dn_loss_bbox: 0.3127 d0.dn_loss_iou: 0.3712 d1.dn_loss_cls: 0.1820 d1.dn_loss_bbox: 0.1956 d1.dn_loss_iou: 0.2531 d2.dn_loss_cls: 0.1586 d2.dn_loss_bbox: 0.1691 d2.dn_loss_iou: 0.2297 d3.dn_loss_cls: 0.1521 d3.dn_loss_bbox: 0.1599 d3.dn_loss_iou: 0.2216 d4.dn_loss_cls: 0.1513 d4.dn_loss_bbox: 0.1571 d4.dn_loss_iou: 0.2181 d1.loss_lmm_region: 0.1973 loss_lmm_image: 1.0414 2024/11/10 15:03:35 - mmengine - INFO - Iter(train) [ 9700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:47:11 time: 2.0110 data_time: 0.0164 memory: 34176 grad_norm: 25.7342 loss: 9.6873 loss_cls: 0.2963 loss_bbox: 0.1507 loss_iou: 0.2727 d0.loss_cls: 0.3583 d0.loss_bbox: 0.1620 d0.loss_iou: 0.2944 d1.loss_cls: 0.3249 d1.loss_bbox: 0.1562 d1.loss_iou: 0.2834 d2.loss_cls: 0.3119 d2.loss_bbox: 0.1531 d2.loss_iou: 0.2754 d3.loss_cls: 0.3037 d3.loss_bbox: 0.1506 d3.loss_iou: 0.2718 d4.loss_cls: 0.2984 d4.loss_bbox: 0.1508 d4.loss_iou: 0.2724 enc_loss_cls: 0.3461 enc_loss_bbox: 0.1801 enc_loss_iou: 0.3214 dn_loss_cls: 0.0831 dn_loss_bbox: 0.1561 dn_loss_iou: 0.2092 d0.dn_loss_cls: 0.1672 d0.dn_loss_bbox: 0.3112 d0.dn_loss_iou: 0.3597 d1.dn_loss_cls: 0.1123 d1.dn_loss_bbox: 0.1897 d1.dn_loss_iou: 0.2417 d2.dn_loss_cls: 0.0946 d2.dn_loss_bbox: 0.1649 d2.dn_loss_iou: 0.2184 d3.dn_loss_cls: 0.0877 d3.dn_loss_bbox: 0.1576 d3.dn_loss_iou: 0.2114 d4.dn_loss_cls: 0.0842 d4.dn_loss_bbox: 0.1562 d4.dn_loss_iou: 0.2092 d1.loss_lmm_region: 0.1861 loss_lmm_image: 0.9523 2024/11/10 15:06:56 - mmengine - INFO - Iter(train) [ 9800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:44:05 time: 2.0019 data_time: 0.0164 memory: 35161 grad_norm: 24.7097 loss: 11.1135 loss_cls: 0.3501 loss_bbox: 0.1567 loss_iou: 0.2414 d0.loss_cls: 0.4154 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2534 d1.loss_cls: 0.3782 d1.loss_bbox: 0.1567 d1.loss_iou: 0.2500 d2.loss_cls: 0.3607 d2.loss_bbox: 0.1567 d2.loss_iou: 0.2468 d3.loss_cls: 0.3529 d3.loss_bbox: 0.1590 d3.loss_iou: 0.2459 d4.loss_cls: 0.3474 d4.loss_bbox: 0.1602 d4.loss_iou: 0.2445 enc_loss_cls: 0.3919 enc_loss_bbox: 0.1880 enc_loss_iou: 0.2824 dn_loss_cls: 0.1969 dn_loss_bbox: 
0.2066 dn_loss_iou: 0.2382 d0.dn_loss_cls: 0.2789 d0.dn_loss_bbox: 0.3746 d0.dn_loss_iou: 0.3886 d1.dn_loss_cls: 0.2307 d1.dn_loss_bbox: 0.2467 d1.dn_loss_iou: 0.2709 d2.dn_loss_cls: 0.2134 d2.dn_loss_bbox: 0.2176 d2.dn_loss_iou: 0.2474 d3.dn_loss_cls: 0.2049 d3.dn_loss_bbox: 0.2071 d3.dn_loss_iou: 0.2397 d4.dn_loss_cls: 0.1986 d4.dn_loss_bbox: 0.2067 d4.dn_loss_iou: 0.2383 d1.loss_lmm_region: 0.2328 loss_lmm_image: 0.9776 2024/11/10 15:10:15 - mmengine - INFO - Iter(train) [ 9900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:40:40 time: 1.9863 data_time: 0.0164 memory: 34557 grad_norm: 28.7953 loss: 10.0232 loss_cls: 0.3423 loss_bbox: 0.1281 loss_iou: 0.2152 d0.loss_cls: 0.3848 d0.loss_bbox: 0.1424 d0.loss_iou: 0.2290 d1.loss_cls: 0.3551 d1.loss_bbox: 0.1316 d1.loss_iou: 0.2197 d2.loss_cls: 0.3510 d2.loss_bbox: 0.1296 d2.loss_iou: 0.2160 d3.loss_cls: 0.3479 d3.loss_bbox: 0.1270 d3.loss_iou: 0.2145 d4.loss_cls: 0.3445 d4.loss_bbox: 0.1287 d4.loss_iou: 0.2156 enc_loss_cls: 0.3816 enc_loss_bbox: 0.1581 enc_loss_iou: 0.2549 dn_loss_cls: 0.1421 dn_loss_bbox: 0.1841 dn_loss_iou: 0.2183 d0.dn_loss_cls: 0.2267 d0.dn_loss_bbox: 0.3468 d0.dn_loss_iou: 0.3731 d1.dn_loss_cls: 0.1790 d1.dn_loss_bbox: 0.2122 d1.dn_loss_iou: 0.2500 d2.dn_loss_cls: 0.1551 d2.dn_loss_bbox: 0.1920 d2.dn_loss_iou: 0.2277 d3.dn_loss_cls: 0.1469 d3.dn_loss_bbox: 0.1863 d3.dn_loss_iou: 0.2209 d4.dn_loss_cls: 0.1432 d4.dn_loss_bbox: 0.1843 d4.dn_loss_iou: 0.2184 d1.loss_lmm_region: 0.2142 loss_lmm_image: 0.9841 2024/11/10 15:13:36 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 15:13:36 - mmengine - INFO - Iter(train) [ 10000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:37:37 time: 2.0095 data_time: 0.0168 memory: 35267 grad_norm: 28.9832 loss: 11.4678 loss_cls: 0.3772 loss_bbox: 0.1701 loss_iou: 0.2743 d0.loss_cls: 0.4453 d0.loss_bbox: 0.1796 d0.loss_iou: 0.2827 d1.loss_cls: 0.4028 d1.loss_bbox: 0.1780 d1.loss_iou: 0.2829 d2.loss_cls: 0.3909 d2.loss_bbox: 0.1750 d2.loss_iou: 0.2757 d3.loss_cls: 0.3806 d3.loss_bbox: 0.1727 d3.loss_iou: 0.2756 d4.loss_cls: 0.3769 d4.loss_bbox: 0.1716 d4.loss_iou: 0.2742 enc_loss_cls: 0.4409 enc_loss_bbox: 0.2011 enc_loss_iou: 0.3100 dn_loss_cls: 0.1785 dn_loss_bbox: 0.2051 dn_loss_iou: 0.2313 d0.dn_loss_cls: 0.2653 d0.dn_loss_bbox: 0.3653 d0.dn_loss_iou: 0.3818 d1.dn_loss_cls: 0.2141 d1.dn_loss_bbox: 0.2448 d1.dn_loss_iou: 0.2669 d2.dn_loss_cls: 0.1942 d2.dn_loss_bbox: 0.2196 d2.dn_loss_iou: 0.2436 d3.dn_loss_cls: 0.1820 d3.dn_loss_bbox: 0.2076 d3.dn_loss_iou: 0.2343 d4.dn_loss_cls: 0.1784 d4.dn_loss_bbox: 0.2049 d4.dn_loss_iou: 0.2311 d1.loss_lmm_region: 0.2202 loss_lmm_image: 0.9609 2024/11/10 15:16:56 - mmengine - INFO - Iter(train) [ 10100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:34:21 time: 2.0083 data_time: 0.0166 memory: 33867 grad_norm: 27.1653 loss: 11.0498 loss_cls: 0.3716 loss_bbox: 0.1467 loss_iou: 0.2744 d0.loss_cls: 0.4370 d0.loss_bbox: 0.1537 d0.loss_iou: 0.2883 d1.loss_cls: 0.3958 d1.loss_bbox: 0.1513 d1.loss_iou: 0.2823 d2.loss_cls: 0.3836 d2.loss_bbox: 0.1514 d2.loss_iou: 0.2786 d3.loss_cls: 0.3818 d3.loss_bbox: 0.1441 d3.loss_iou: 0.2742 d4.loss_cls: 0.3743 d4.loss_bbox: 0.1431 d4.loss_iou: 0.2732 enc_loss_cls: 0.4225 enc_loss_bbox: 0.1724 enc_loss_iou: 0.3103 dn_loss_cls: 0.1625 dn_loss_bbox: 0.1856 dn_loss_iou: 0.2300 d0.dn_loss_cls: 0.2433 d0.dn_loss_bbox: 0.3420 d0.dn_loss_iou: 0.3803 d1.dn_loss_cls: 0.1958 d1.dn_loss_bbox: 0.2206 d1.dn_loss_iou: 0.2616 d2.dn_loss_cls: 0.1776 d2.dn_loss_bbox: 0.1956 
d2.dn_loss_iou: 0.2405 d3.dn_loss_cls: 0.1708 d3.dn_loss_bbox: 0.1883 d3.dn_loss_iou: 0.2325 d4.dn_loss_cls: 0.1628 d4.dn_loss_bbox: 0.1857 d4.dn_loss_iou: 0.2300 d1.loss_lmm_region: 0.2259 loss_lmm_image: 1.0077 2024/11/10 15:20:15 - mmengine - INFO - Iter(train) [ 10200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:30:59 time: 2.0186 data_time: 0.0165 memory: 34796 grad_norm: 30.0786 loss: 10.1172 loss_cls: 0.3247 loss_bbox: 0.1400 loss_iou: 0.2416 d0.loss_cls: 0.3819 d0.loss_bbox: 0.1513 d0.loss_iou: 0.2579 d1.loss_cls: 0.3515 d1.loss_bbox: 0.1451 d1.loss_iou: 0.2497 d2.loss_cls: 0.3357 d2.loss_bbox: 0.1406 d2.loss_iou: 0.2439 d3.loss_cls: 0.3259 d3.loss_bbox: 0.1401 d3.loss_iou: 0.2408 d4.loss_cls: 0.3223 d4.loss_bbox: 0.1420 d4.loss_iou: 0.2417 enc_loss_cls: 0.3745 enc_loss_bbox: 0.1651 enc_loss_iou: 0.2811 dn_loss_cls: 0.1502 dn_loss_bbox: 0.1690 dn_loss_iou: 0.2098 d0.dn_loss_cls: 0.2295 d0.dn_loss_bbox: 0.3255 d0.dn_loss_iou: 0.3524 d1.dn_loss_cls: 0.1838 d1.dn_loss_bbox: 0.2052 d1.dn_loss_iou: 0.2415 d2.dn_loss_cls: 0.1620 d2.dn_loss_bbox: 0.1819 d2.dn_loss_iou: 0.2197 d3.dn_loss_cls: 0.1538 d3.dn_loss_bbox: 0.1722 d3.dn_loss_iou: 0.2124 d4.dn_loss_cls: 0.1518 d4.dn_loss_bbox: 0.1689 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.2259 loss_lmm_image: 0.9947 2024/11/10 15:23:37 - mmengine - INFO - Iter(train) [ 10300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:28:12 time: 2.0245 data_time: 0.0164 memory: 34335 grad_norm: 25.8424 loss: 10.8241 loss_cls: 0.3448 loss_bbox: 0.1610 loss_iou: 0.2607 d0.loss_cls: 0.4096 d0.loss_bbox: 0.1696 d0.loss_iou: 0.2749 d1.loss_cls: 0.3696 d1.loss_bbox: 0.1677 d1.loss_iou: 0.2678 d2.loss_cls: 0.3550 d2.loss_bbox: 0.1641 d2.loss_iou: 0.2634 d3.loss_cls: 0.3485 d3.loss_bbox: 0.1619 d3.loss_iou: 0.2616 d4.loss_cls: 0.3468 d4.loss_bbox: 0.1582 d4.loss_iou: 0.2596 enc_loss_cls: 0.3999 enc_loss_bbox: 0.1905 enc_loss_iou: 0.3025 dn_loss_cls: 0.1636 dn_loss_bbox: 0.1829 dn_loss_iou: 0.2281 d0.dn_loss_cls: 0.2472 d0.dn_loss_bbox: 0.3421 d0.dn_loss_iou: 0.3849 d1.dn_loss_cls: 0.1982 d1.dn_loss_bbox: 0.2173 d1.dn_loss_iou: 0.2641 d2.dn_loss_cls: 0.1759 d2.dn_loss_bbox: 0.1919 d2.dn_loss_iou: 0.2382 d3.dn_loss_cls: 0.1690 d3.dn_loss_bbox: 0.1848 d3.dn_loss_iou: 0.2313 d4.dn_loss_cls: 0.1631 d4.dn_loss_bbox: 0.1830 d4.dn_loss_iou: 0.2281 d1.loss_lmm_region: 0.2070 loss_lmm_image: 0.9857 2024/11/10 15:26:58 - mmengine - INFO - Iter(train) [ 10400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:25:06 time: 2.0354 data_time: 0.0167 memory: 33609 grad_norm: 27.0512 loss: 11.1207 loss_cls: 0.3677 loss_bbox: 0.1729 loss_iou: 0.2824 d0.loss_cls: 0.4291 d0.loss_bbox: 0.1847 d0.loss_iou: 0.2920 d1.loss_cls: 0.3938 d1.loss_bbox: 0.1750 d1.loss_iou: 0.2860 d2.loss_cls: 0.3789 d2.loss_bbox: 0.1741 d2.loss_iou: 0.2846 d3.loss_cls: 0.3727 d3.loss_bbox: 0.1733 d3.loss_iou: 0.2831 d4.loss_cls: 0.3710 d4.loss_bbox: 0.1721 d4.loss_iou: 0.2811 enc_loss_cls: 0.4249 enc_loss_bbox: 0.1971 enc_loss_iou: 0.3170 dn_loss_cls: 0.1773 dn_loss_bbox: 0.1640 dn_loss_iou: 0.2143 d0.dn_loss_cls: 0.2568 d0.dn_loss_bbox: 0.3292 d0.dn_loss_iou: 0.3708 d1.dn_loss_cls: 0.2091 d1.dn_loss_bbox: 0.2022 d1.dn_loss_iou: 0.2485 d2.dn_loss_cls: 0.1872 d2.dn_loss_bbox: 0.1800 d2.dn_loss_iou: 0.2265 d3.dn_loss_cls: 0.1813 d3.dn_loss_bbox: 0.1669 d3.dn_loss_iou: 0.2179 d4.dn_loss_cls: 0.1774 d4.dn_loss_bbox: 0.1641 d4.dn_loss_iou: 0.2143 d1.loss_lmm_region: 0.2253 loss_lmm_image: 0.9943 2024/11/10 15:30:18 - mmengine - INFO - Iter(train) [ 10500/150000] base_lr: 1.0000e-04 
lr: 1.0000e-04 eta: 3 days, 5:21:45 time: 1.9819 data_time: 0.0168 memory: 32975 grad_norm: 28.3523 loss: 10.2078 loss_cls: 0.3281 loss_bbox: 0.1346 loss_iou: 0.2322 d0.loss_cls: 0.3684 d0.loss_bbox: 0.1467 d0.loss_iou: 0.2466 d1.loss_cls: 0.3437 d1.loss_bbox: 0.1416 d1.loss_iou: 0.2391 d2.loss_cls: 0.3328 d2.loss_bbox: 0.1351 d2.loss_iou: 0.2346 d3.loss_cls: 0.3324 d3.loss_bbox: 0.1338 d3.loss_iou: 0.2342 d4.loss_cls: 0.3302 d4.loss_bbox: 0.1337 d4.loss_iou: 0.2304 enc_loss_cls: 0.3613 enc_loss_bbox: 0.1611 enc_loss_iou: 0.2675 dn_loss_cls: 0.1475 dn_loss_bbox: 0.1820 dn_loss_iou: 0.2267 d0.dn_loss_cls: 0.2300 d0.dn_loss_bbox: 0.3671 d0.dn_loss_iou: 0.3898 d1.dn_loss_cls: 0.1790 d1.dn_loss_bbox: 0.2233 d1.dn_loss_iou: 0.2623 d2.dn_loss_cls: 0.1581 d2.dn_loss_bbox: 0.1971 d2.dn_loss_iou: 0.2397 d3.dn_loss_cls: 0.1515 d3.dn_loss_bbox: 0.1852 d3.dn_loss_iou: 0.2299 d4.dn_loss_cls: 0.1484 d4.dn_loss_bbox: 0.1821 d4.dn_loss_iou: 0.2268 d1.loss_lmm_region: 0.2127 loss_lmm_image: 1.0002 2024/11/10 15:33:36 - mmengine - INFO - Iter(train) [ 10600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:18:12 time: 1.9682 data_time: 0.0167 memory: 33084 grad_norm: 26.6235 loss: 9.3458 loss_cls: 0.2889 loss_bbox: 0.1332 loss_iou: 0.2320 d0.loss_cls: 0.3335 d0.loss_bbox: 0.1548 d0.loss_iou: 0.2485 d1.loss_cls: 0.2980 d1.loss_bbox: 0.1499 d1.loss_iou: 0.2448 d2.loss_cls: 0.2960 d2.loss_bbox: 0.1426 d2.loss_iou: 0.2372 d3.loss_cls: 0.2981 d3.loss_bbox: 0.1314 d3.loss_iou: 0.2322 d4.loss_cls: 0.2905 d4.loss_bbox: 0.1307 d4.loss_iou: 0.2310 enc_loss_cls: 0.3284 enc_loss_bbox: 0.1704 enc_loss_iou: 0.2774 dn_loss_cls: 0.1031 dn_loss_bbox: 0.1677 dn_loss_iou: 0.2144 d0.dn_loss_cls: 0.1691 d0.dn_loss_bbox: 0.2986 d0.dn_loss_iou: 0.3553 d1.dn_loss_cls: 0.1266 d1.dn_loss_bbox: 0.1979 d1.dn_loss_iou: 0.2468 d2.dn_loss_cls: 0.1101 d2.dn_loss_bbox: 0.1760 d2.dn_loss_iou: 0.2233 d3.dn_loss_cls: 0.1072 d3.dn_loss_bbox: 0.1680 d3.dn_loss_iou: 0.2162 d4.dn_loss_cls: 0.1039 d4.dn_loss_bbox: 0.1676 d4.dn_loss_iou: 0.2142 d1.loss_lmm_region: 0.1672 loss_lmm_image: 0.9628 2024/11/10 15:36:56 - mmengine - INFO - Iter(train) [ 10700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:14:53 time: 1.9860 data_time: 0.0166 memory: 33787 grad_norm: 24.3338 loss: 10.7841 loss_cls: 0.3594 loss_bbox: 0.1510 loss_iou: 0.2604 d0.loss_cls: 0.4142 d0.loss_bbox: 0.1586 d0.loss_iou: 0.2735 d1.loss_cls: 0.3795 d1.loss_bbox: 0.1560 d1.loss_iou: 0.2657 d2.loss_cls: 0.3678 d2.loss_bbox: 0.1539 d2.loss_iou: 0.2630 d3.loss_cls: 0.3648 d3.loss_bbox: 0.1494 d3.loss_iou: 0.2591 d4.loss_cls: 0.3625 d4.loss_bbox: 0.1496 d4.loss_iou: 0.2589 enc_loss_cls: 0.4057 enc_loss_bbox: 0.1704 enc_loss_iou: 0.2948 dn_loss_cls: 0.1751 dn_loss_bbox: 0.1797 dn_loss_iou: 0.2275 d0.dn_loss_cls: 0.2401 d0.dn_loss_bbox: 0.3271 d0.dn_loss_iou: 0.3698 d1.dn_loss_cls: 0.2061 d1.dn_loss_bbox: 0.2171 d1.dn_loss_iou: 0.2585 d2.dn_loss_cls: 0.1871 d2.dn_loss_bbox: 0.1930 d2.dn_loss_iou: 0.2383 d3.dn_loss_cls: 0.1786 d3.dn_loss_bbox: 0.1826 d3.dn_loss_iou: 0.2297 d4.dn_loss_cls: 0.1731 d4.dn_loss_bbox: 0.1798 d4.dn_loss_iou: 0.2275 d1.loss_lmm_region: 0.1930 loss_lmm_image: 0.9821 2024/11/10 15:40:15 - mmengine - INFO - Iter(train) [ 10800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:11:26 time: 1.9987 data_time: 0.0165 memory: 32662 grad_norm: 26.1154 loss: 10.1970 loss_cls: 0.2998 loss_bbox: 0.1571 loss_iou: 0.2611 d0.loss_cls: 0.3612 d0.loss_bbox: 0.1604 d0.loss_iou: 0.2709 d1.loss_cls: 0.3235 d1.loss_bbox: 0.1599 d1.loss_iou: 0.2652 d2.loss_cls: 
0.3111 d2.loss_bbox: 0.1563 d2.loss_iou: 0.2616 d3.loss_cls: 0.3024 d3.loss_bbox: 0.1579 d3.loss_iou: 0.2612 d4.loss_cls: 0.3003 d4.loss_bbox: 0.1566 d4.loss_iou: 0.2609 enc_loss_cls: 0.3522 enc_loss_bbox: 0.1803 enc_loss_iou: 0.2932 dn_loss_cls: 0.1224 dn_loss_bbox: 0.1948 dn_loss_iou: 0.2426 d0.dn_loss_cls: 0.1996 d0.dn_loss_bbox: 0.3290 d0.dn_loss_iou: 0.3859 d1.dn_loss_cls: 0.1527 d1.dn_loss_bbox: 0.2254 d1.dn_loss_iou: 0.2744 d2.dn_loss_cls: 0.1323 d2.dn_loss_bbox: 0.2052 d2.dn_loss_iou: 0.2532 d3.dn_loss_cls: 0.1287 d3.dn_loss_bbox: 0.1966 d3.dn_loss_iou: 0.2453 d4.dn_loss_cls: 0.1230 d4.dn_loss_bbox: 0.1947 d4.dn_loss_iou: 0.2425 d1.loss_lmm_region: 0.1816 loss_lmm_image: 0.9138 2024/11/10 15:43:35 - mmengine - INFO - Iter(train) [ 10900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:08:14 time: 2.0077 data_time: 0.0165 memory: 34374 grad_norm: 30.2425 loss: 10.7444 loss_cls: 0.3788 loss_bbox: 0.1390 loss_iou: 0.2756 d0.loss_cls: 0.4470 d0.loss_bbox: 0.1485 d0.loss_iou: 0.2879 d1.loss_cls: 0.4043 d1.loss_bbox: 0.1437 d1.loss_iou: 0.2789 d2.loss_cls: 0.3877 d2.loss_bbox: 0.1402 d2.loss_iou: 0.2764 d3.loss_cls: 0.3818 d3.loss_bbox: 0.1399 d3.loss_iou: 0.2761 d4.loss_cls: 0.3803 d4.loss_bbox: 0.1388 d4.loss_iou: 0.2746 enc_loss_cls: 0.4359 enc_loss_bbox: 0.1635 enc_loss_iou: 0.3145 dn_loss_cls: 0.1645 dn_loss_bbox: 0.1573 dn_loss_iou: 0.2180 d0.dn_loss_cls: 0.2388 d0.dn_loss_bbox: 0.2959 d0.dn_loss_iou: 0.3653 d1.dn_loss_cls: 0.1936 d1.dn_loss_bbox: 0.1921 d1.dn_loss_iou: 0.2519 d2.dn_loss_cls: 0.1789 d2.dn_loss_bbox: 0.1690 d2.dn_loss_iou: 0.2289 d3.dn_loss_cls: 0.1702 d3.dn_loss_bbox: 0.1606 d3.dn_loss_iou: 0.2218 d4.dn_loss_cls: 0.1654 d4.dn_loss_bbox: 0.1575 d4.dn_loss_iou: 0.2180 d1.loss_lmm_region: 0.1979 loss_lmm_image: 0.9855 2024/11/10 15:46:54 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 15:46:54 - mmengine - INFO - Iter(train) [ 11000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:04:37 time: 1.9928 data_time: 0.0168 memory: 34793 grad_norm: 30.3052 loss: 10.6891 loss_cls: 0.2987 loss_bbox: 0.1657 loss_iou: 0.2651 d0.loss_cls: 0.3497 d0.loss_bbox: 0.1771 d0.loss_iou: 0.2741 d1.loss_cls: 0.3226 d1.loss_bbox: 0.1672 d1.loss_iou: 0.2661 d2.loss_cls: 0.3117 d2.loss_bbox: 0.1635 d2.loss_iou: 0.2620 d3.loss_cls: 0.3030 d3.loss_bbox: 0.1696 d3.loss_iou: 0.2653 d4.loss_cls: 0.2984 d4.loss_bbox: 0.1667 d4.loss_iou: 0.2646 enc_loss_cls: 0.3462 enc_loss_bbox: 0.1879 enc_loss_iou: 0.2931 dn_loss_cls: 0.1366 dn_loss_bbox: 0.2167 dn_loss_iou: 0.2476 d0.dn_loss_cls: 0.2172 d0.dn_loss_bbox: 0.3900 d0.dn_loss_iou: 0.4048 d1.dn_loss_cls: 0.1651 d1.dn_loss_bbox: 0.2523 d1.dn_loss_iou: 0.2790 d2.dn_loss_cls: 0.1503 d2.dn_loss_bbox: 0.2291 d2.dn_loss_iou: 0.2576 d3.dn_loss_cls: 0.1415 d3.dn_loss_bbox: 0.2191 d3.dn_loss_iou: 0.2499 d4.dn_loss_cls: 0.1379 d4.dn_loss_bbox: 0.2166 d4.dn_loss_iou: 0.2476 d1.loss_lmm_region: 0.2284 loss_lmm_image: 0.9834 2024/11/10 15:50:13 - mmengine - INFO - Iter(train) [ 11100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 5:01:13 time: 2.0082 data_time: 0.0164 memory: 34061 grad_norm: 27.7049 loss: 9.8279 loss_cls: 0.3242 loss_bbox: 0.1357 loss_iou: 0.2475 d0.loss_cls: 0.3821 d0.loss_bbox: 0.1491 d0.loss_iou: 0.2690 d1.loss_cls: 0.3488 d1.loss_bbox: 0.1408 d1.loss_iou: 0.2601 d2.loss_cls: 0.3366 d2.loss_bbox: 0.1376 d2.loss_iou: 0.2552 d3.loss_cls: 0.3281 d3.loss_bbox: 0.1359 d3.loss_iou: 0.2524 d4.loss_cls: 0.3242 d4.loss_bbox: 0.1356 d4.loss_iou: 0.2482 enc_loss_cls: 0.3699 enc_loss_bbox: 0.1654 
enc_loss_iou: 0.2983 dn_loss_cls: 0.1273 dn_loss_bbox: 0.1591 dn_loss_iou: 0.2051 d0.dn_loss_cls: 0.2024 d0.dn_loss_bbox: 0.3012 d0.dn_loss_iou: 0.3461 d1.dn_loss_cls: 0.1560 d1.dn_loss_bbox: 0.1931 d1.dn_loss_iou: 0.2364 d2.dn_loss_cls: 0.1382 d2.dn_loss_bbox: 0.1696 d2.dn_loss_iou: 0.2147 d3.dn_loss_cls: 0.1302 d3.dn_loss_bbox: 0.1612 d3.dn_loss_iou: 0.2076 d4.dn_loss_cls: 0.1264 d4.dn_loss_bbox: 0.1589 d4.dn_loss_iou: 0.2049 d1.loss_lmm_region: 0.1834 loss_lmm_image: 0.9614 2024/11/10 15:53:35 - mmengine - INFO - Iter(train) [ 11200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:58:26 time: 2.0350 data_time: 0.0166 memory: 33842 grad_norm: 28.7495 loss: 11.4102 loss_cls: 0.4364 loss_bbox: 0.1458 loss_iou: 0.2483 d0.loss_cls: 0.4814 d0.loss_bbox: 0.1627 d0.loss_iou: 0.2652 d1.loss_cls: 0.4447 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2540 d2.loss_cls: 0.4361 d2.loss_bbox: 0.1488 d2.loss_iou: 0.2530 d3.loss_cls: 0.4324 d3.loss_bbox: 0.1481 d3.loss_iou: 0.2511 d4.loss_cls: 0.4291 d4.loss_bbox: 0.1511 d4.loss_iou: 0.2512 enc_loss_cls: 0.4740 enc_loss_bbox: 0.1729 enc_loss_iou: 0.2844 dn_loss_cls: 0.2077 dn_loss_bbox: 0.1812 dn_loss_iou: 0.2170 d0.dn_loss_cls: 0.2820 d0.dn_loss_bbox: 0.3440 d0.dn_loss_iou: 0.3657 d1.dn_loss_cls: 0.2350 d1.dn_loss_bbox: 0.2142 d1.dn_loss_iou: 0.2504 d2.dn_loss_cls: 0.2186 d2.dn_loss_bbox: 0.1928 d2.dn_loss_iou: 0.2268 d3.dn_loss_cls: 0.2123 d3.dn_loss_bbox: 0.1834 d3.dn_loss_iou: 0.2194 d4.dn_loss_cls: 0.2117 d4.dn_loss_bbox: 0.1812 d4.dn_loss_iou: 0.2169 d1.loss_lmm_region: 0.2575 loss_lmm_image: 0.9721 2024/11/10 15:56:56 - mmengine - INFO - Iter(train) [ 11300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:55:28 time: 2.0223 data_time: 0.0166 memory: 33157 grad_norm: 28.6324 loss: 10.9323 loss_cls: 0.3998 loss_bbox: 0.1305 loss_iou: 0.2567 d0.loss_cls: 0.4596 d0.loss_bbox: 0.1519 d0.loss_iou: 0.2762 d1.loss_cls: 0.4274 d1.loss_bbox: 0.1388 d1.loss_iou: 0.2631 d2.loss_cls: 0.4124 d2.loss_bbox: 0.1325 d2.loss_iou: 0.2576 d3.loss_cls: 0.4037 d3.loss_bbox: 0.1310 d3.loss_iou: 0.2568 d4.loss_cls: 0.3985 d4.loss_bbox: 0.1293 d4.loss_iou: 0.2560 enc_loss_cls: 0.4524 enc_loss_bbox: 0.1678 enc_loss_iou: 0.3025 dn_loss_cls: 0.1891 dn_loss_bbox: 0.1610 dn_loss_iou: 0.2137 d0.dn_loss_cls: 0.2737 d0.dn_loss_bbox: 0.3115 d0.dn_loss_iou: 0.3644 d1.dn_loss_cls: 0.2225 d1.dn_loss_bbox: 0.1932 d1.dn_loss_iou: 0.2464 d2.dn_loss_cls: 0.2033 d2.dn_loss_bbox: 0.1732 d2.dn_loss_iou: 0.2249 d3.dn_loss_cls: 0.1946 d3.dn_loss_bbox: 0.1628 d3.dn_loss_iou: 0.2166 d4.dn_loss_cls: 0.1886 d4.dn_loss_bbox: 0.1610 d4.dn_loss_iou: 0.2137 d1.loss_lmm_region: 0.2268 loss_lmm_image: 0.9867 2024/11/10 16:00:15 - mmengine - INFO - Iter(train) [ 11400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:51:49 time: 1.9817 data_time: 0.0167 memory: 33730 grad_norm: 25.7459 loss: 9.3617 loss_cls: 0.2941 loss_bbox: 0.1268 loss_iou: 0.2196 d0.loss_cls: 0.3421 d0.loss_bbox: 0.1405 d0.loss_iou: 0.2298 d1.loss_cls: 0.3139 d1.loss_bbox: 0.1307 d1.loss_iou: 0.2180 d2.loss_cls: 0.3023 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2181 d3.loss_cls: 0.2969 d3.loss_bbox: 0.1294 d3.loss_iou: 0.2191 d4.loss_cls: 0.2932 d4.loss_bbox: 0.1296 d4.loss_iou: 0.2202 enc_loss_cls: 0.3335 enc_loss_bbox: 0.1522 enc_loss_iou: 0.2500 dn_loss_cls: 0.1408 dn_loss_bbox: 0.1583 dn_loss_iou: 0.2022 d0.dn_loss_cls: 0.2225 d0.dn_loss_bbox: 0.3028 d0.dn_loss_iou: 0.3514 d1.dn_loss_cls: 0.1698 d1.dn_loss_bbox: 0.1897 d1.dn_loss_iou: 0.2339 d2.dn_loss_cls: 0.1480 d2.dn_loss_bbox: 0.1668 d2.dn_loss_iou: 0.2113 
d3.dn_loss_cls: 0.1434 d3.dn_loss_bbox: 0.1609 d3.dn_loss_iou: 0.2044 d4.dn_loss_cls: 0.1408 d4.dn_loss_bbox: 0.1583 d4.dn_loss_iou: 0.2021 d1.loss_lmm_region: 0.2032 loss_lmm_image: 0.9624 2024/11/10 16:03:33 - mmengine - INFO - Iter(train) [ 11500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:48:15 time: 1.9942 data_time: 0.0167 memory: 33169 grad_norm: 25.7010 loss: 9.5119 loss_cls: 0.2773 loss_bbox: 0.1321 loss_iou: 0.2277 d0.loss_cls: 0.3294 d0.loss_bbox: 0.1450 d0.loss_iou: 0.2386 d1.loss_cls: 0.2944 d1.loss_bbox: 0.1430 d1.loss_iou: 0.2350 d2.loss_cls: 0.2893 d2.loss_bbox: 0.1348 d2.loss_iou: 0.2283 d3.loss_cls: 0.2798 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2288 d4.loss_cls: 0.2785 d4.loss_bbox: 0.1314 d4.loss_iou: 0.2267 enc_loss_cls: 0.3246 enc_loss_bbox: 0.1635 enc_loss_iou: 0.2661 dn_loss_cls: 0.1134 dn_loss_bbox: 0.1901 dn_loss_iou: 0.2228 d0.dn_loss_cls: 0.1881 d0.dn_loss_bbox: 0.3429 d0.dn_loss_iou: 0.3693 d1.dn_loss_cls: 0.1426 d1.dn_loss_bbox: 0.2259 d1.dn_loss_iou: 0.2559 d2.dn_loss_cls: 0.1231 d2.dn_loss_bbox: 0.2012 d2.dn_loss_iou: 0.2327 d3.dn_loss_cls: 0.1162 d3.dn_loss_bbox: 0.1930 d3.dn_loss_iou: 0.2258 d4.dn_loss_cls: 0.1129 d4.dn_loss_bbox: 0.1901 d4.dn_loss_iou: 0.2227 d1.loss_lmm_region: 0.1678 loss_lmm_image: 0.9658 2024/11/10 16:06:54 - mmengine - INFO - Iter(train) [ 11600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:45:12 time: 2.0119 data_time: 0.0166 memory: 35059 grad_norm: 29.0952 loss: 11.1385 loss_cls: 0.3537 loss_bbox: 0.1434 loss_iou: 0.2606 d0.loss_cls: 0.4026 d0.loss_bbox: 0.1520 d0.loss_iou: 0.2782 d1.loss_cls: 0.3690 d1.loss_bbox: 0.1502 d1.loss_iou: 0.2702 d2.loss_cls: 0.3578 d2.loss_bbox: 0.1456 d2.loss_iou: 0.2655 d3.loss_cls: 0.3573 d3.loss_bbox: 0.1434 d3.loss_iou: 0.2605 d4.loss_cls: 0.3528 d4.loss_bbox: 0.1428 d4.loss_iou: 0.2599 enc_loss_cls: 0.3943 enc_loss_bbox: 0.1698 enc_loss_iou: 0.2967 dn_loss_cls: 0.2311 dn_loss_bbox: 0.1827 dn_loss_iou: 0.2364 d0.dn_loss_cls: 0.3127 d0.dn_loss_bbox: 0.3383 d0.dn_loss_iou: 0.3839 d1.dn_loss_cls: 0.2650 d1.dn_loss_bbox: 0.2190 d1.dn_loss_iou: 0.2693 d2.dn_loss_cls: 0.2410 d2.dn_loss_bbox: 0.1933 d2.dn_loss_iou: 0.2465 d3.dn_loss_cls: 0.2362 d3.dn_loss_bbox: 0.1853 d3.dn_loss_iou: 0.2392 d4.dn_loss_cls: 0.2329 d4.dn_loss_bbox: 0.1829 d4.dn_loss_iou: 0.2365 d1.loss_lmm_region: 0.2549 loss_lmm_image: 0.9255 2024/11/10 16:10:13 - mmengine - INFO - Iter(train) [ 11700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:41:50 time: 2.0034 data_time: 0.0168 memory: 34930 grad_norm: 24.8213 loss: 10.2443 loss_cls: 0.3040 loss_bbox: 0.1357 loss_iou: 0.2545 d0.loss_cls: 0.3660 d0.loss_bbox: 0.1465 d0.loss_iou: 0.2666 d1.loss_cls: 0.3229 d1.loss_bbox: 0.1392 d1.loss_iou: 0.2594 d2.loss_cls: 0.3150 d2.loss_bbox: 0.1376 d2.loss_iou: 0.2580 d3.loss_cls: 0.3123 d3.loss_bbox: 0.1346 d3.loss_iou: 0.2519 d4.loss_cls: 0.3070 d4.loss_bbox: 0.1335 d4.loss_iou: 0.2531 enc_loss_cls: 0.3544 enc_loss_bbox: 0.1605 enc_loss_iou: 0.2913 dn_loss_cls: 0.1687 dn_loss_bbox: 0.1791 dn_loss_iou: 0.2249 d0.dn_loss_cls: 0.2465 d0.dn_loss_bbox: 0.3309 d0.dn_loss_iou: 0.3683 d1.dn_loss_cls: 0.1972 d1.dn_loss_bbox: 0.2131 d1.dn_loss_iou: 0.2568 d2.dn_loss_cls: 0.1809 d2.dn_loss_bbox: 0.1884 d2.dn_loss_iou: 0.2337 d3.dn_loss_cls: 0.1705 d3.dn_loss_bbox: 0.1814 d3.dn_loss_iou: 0.2269 d4.dn_loss_cls: 0.1673 d4.dn_loss_bbox: 0.1791 d4.dn_loss_iou: 0.2247 d1.loss_lmm_region: 0.2189 loss_lmm_image: 0.9830 2024/11/10 16:13:32 - mmengine - INFO - Iter(train) [ 11800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 
days, 4:38:22 time: 1.9993 data_time: 0.0164 memory: 34129 grad_norm: 26.9763 loss: 10.8821 loss_cls: 0.3502 loss_bbox: 0.1535 loss_iou: 0.2763 d0.loss_cls: 0.4106 d0.loss_bbox: 0.1663 d0.loss_iou: 0.2979 d1.loss_cls: 0.3802 d1.loss_bbox: 0.1560 d1.loss_iou: 0.2814 d2.loss_cls: 0.3658 d2.loss_bbox: 0.1535 d2.loss_iou: 0.2790 d3.loss_cls: 0.3523 d3.loss_bbox: 0.1523 d3.loss_iou: 0.2774 d4.loss_cls: 0.3471 d4.loss_bbox: 0.1546 d4.loss_iou: 0.2784 enc_loss_cls: 0.4011 enc_loss_bbox: 0.1883 enc_loss_iou: 0.3274 dn_loss_cls: 0.1584 dn_loss_bbox: 0.1837 dn_loss_iou: 0.2349 d0.dn_loss_cls: 0.2429 d0.dn_loss_bbox: 0.3207 d0.dn_loss_iou: 0.3792 d1.dn_loss_cls: 0.1911 d1.dn_loss_bbox: 0.2072 d1.dn_loss_iou: 0.2645 d2.dn_loss_cls: 0.1710 d2.dn_loss_bbox: 0.1902 d2.dn_loss_iou: 0.2435 d3.dn_loss_cls: 0.1638 d3.dn_loss_bbox: 0.1846 d3.dn_loss_iou: 0.2369 d4.dn_loss_cls: 0.1590 d4.dn_loss_bbox: 0.1836 d4.dn_loss_iou: 0.2347 d1.loss_lmm_region: 0.2095 loss_lmm_image: 0.9730 2024/11/10 16:16:52 - mmengine - INFO - Iter(train) [ 11900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:35:04 time: 1.9984 data_time: 0.0166 memory: 33989 grad_norm: 26.4793 loss: 10.6027 loss_cls: 0.3682 loss_bbox: 0.1348 loss_iou: 0.2410 d0.loss_cls: 0.4199 d0.loss_bbox: 0.1418 d0.loss_iou: 0.2510 d1.loss_cls: 0.3865 d1.loss_bbox: 0.1377 d1.loss_iou: 0.2428 d2.loss_cls: 0.3793 d2.loss_bbox: 0.1351 d2.loss_iou: 0.2414 d3.loss_cls: 0.3742 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2425 d4.loss_cls: 0.3720 d4.loss_bbox: 0.1320 d4.loss_iou: 0.2399 enc_loss_cls: 0.4136 enc_loss_bbox: 0.1566 enc_loss_iou: 0.2712 dn_loss_cls: 0.1968 dn_loss_bbox: 0.1684 dn_loss_iou: 0.2190 d0.dn_loss_cls: 0.2676 d0.dn_loss_bbox: 0.3282 d0.dn_loss_iou: 0.3645 d1.dn_loss_cls: 0.2201 d1.dn_loss_bbox: 0.1996 d1.dn_loss_iou: 0.2508 d2.dn_loss_cls: 0.2042 d2.dn_loss_bbox: 0.1786 d2.dn_loss_iou: 0.2289 d3.dn_loss_cls: 0.2020 d3.dn_loss_bbox: 0.1696 d3.dn_loss_iou: 0.2215 d4.dn_loss_cls: 0.1947 d4.dn_loss_bbox: 0.1684 d4.dn_loss_iou: 0.2190 d1.loss_lmm_region: 0.2050 loss_lmm_image: 0.9789 2024/11/10 16:20:11 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 16:20:11 - mmengine - INFO - Iter(train) [ 12000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:31:37 time: 1.9917 data_time: 0.0165 memory: 34274 grad_norm: 27.4353 loss: 9.0933 loss_cls: 0.2836 loss_bbox: 0.1235 loss_iou: 0.2118 d0.loss_cls: 0.3281 d0.loss_bbox: 0.1408 d0.loss_iou: 0.2308 d1.loss_cls: 0.2979 d1.loss_bbox: 0.1311 d1.loss_iou: 0.2202 d2.loss_cls: 0.2917 d2.loss_bbox: 0.1226 d2.loss_iou: 0.2134 d3.loss_cls: 0.2843 d3.loss_bbox: 0.1244 d3.loss_iou: 0.2130 d4.loss_cls: 0.2855 d4.loss_bbox: 0.1218 d4.loss_iou: 0.2102 enc_loss_cls: 0.3227 enc_loss_bbox: 0.1520 enc_loss_iou: 0.2534 dn_loss_cls: 0.1328 dn_loss_bbox: 0.1411 dn_loss_iou: 0.1998 d0.dn_loss_cls: 0.2061 d0.dn_loss_bbox: 0.2921 d0.dn_loss_iou: 0.3512 d1.dn_loss_cls: 0.1557 d1.dn_loss_bbox: 0.1711 d1.dn_loss_iou: 0.2320 d2.dn_loss_cls: 0.1412 d2.dn_loss_bbox: 0.1505 d2.dn_loss_iou: 0.2102 d3.dn_loss_cls: 0.1359 d3.dn_loss_bbox: 0.1434 d3.dn_loss_iou: 0.2023 d4.dn_loss_cls: 0.1347 d4.dn_loss_bbox: 0.1412 d4.dn_loss_iou: 0.1998 d1.loss_lmm_region: 0.1919 loss_lmm_image: 0.9977 2024/11/10 16:23:33 - mmengine - INFO - Iter(train) [ 12100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:28:45 time: 2.0182 data_time: 0.0165 memory: 34829 grad_norm: 26.4021 loss: 10.5619 loss_cls: 0.3353 loss_bbox: 0.1422 loss_iou: 0.2655 d0.loss_cls: 0.4008 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2760 
d1.loss_cls: 0.3599 d1.loss_bbox: 0.1489 d1.loss_iou: 0.2703 d2.loss_cls: 0.3495 d2.loss_bbox: 0.1410 d2.loss_iou: 0.2635 d3.loss_cls: 0.3402 d3.loss_bbox: 0.1425 d3.loss_iou: 0.2638 d4.loss_cls: 0.3365 d4.loss_bbox: 0.1408 d4.loss_iou: 0.2634 enc_loss_cls: 0.3882 enc_loss_bbox: 0.1710 enc_loss_iou: 0.3007 dn_loss_cls: 0.1364 dn_loss_bbox: 0.1813 dn_loss_iou: 0.2329 d0.dn_loss_cls: 0.2353 d0.dn_loss_bbox: 0.3575 d0.dn_loss_iou: 0.3999 d1.dn_loss_cls: 0.1768 d1.dn_loss_bbox: 0.2270 d1.dn_loss_iou: 0.2744 d2.dn_loss_cls: 0.1511 d2.dn_loss_bbox: 0.1976 d2.dn_loss_iou: 0.2467 d3.dn_loss_cls: 0.1401 d3.dn_loss_bbox: 0.1865 d3.dn_loss_iou: 0.2375 d4.dn_loss_cls: 0.1364 d4.dn_loss_bbox: 0.1813 d4.dn_loss_iou: 0.2328 d1.loss_lmm_region: 0.2182 loss_lmm_image: 0.9572 2024/11/10 16:26:52 - mmengine - INFO - Iter(train) [ 12200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:25:12 time: 1.9787 data_time: 0.0165 memory: 34366 grad_norm: 31.0048 loss: 11.1738 loss_cls: 0.4008 loss_bbox: 0.1346 loss_iou: 0.2584 d0.loss_cls: 0.4633 d0.loss_bbox: 0.1442 d0.loss_iou: 0.2688 d1.loss_cls: 0.4266 d1.loss_bbox: 0.1389 d1.loss_iou: 0.2597 d2.loss_cls: 0.4085 d2.loss_bbox: 0.1351 d2.loss_iou: 0.2600 d3.loss_cls: 0.4034 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2582 d4.loss_cls: 0.4020 d4.loss_bbox: 0.1359 d4.loss_iou: 0.2593 enc_loss_cls: 0.4503 enc_loss_bbox: 0.1550 enc_loss_iou: 0.2886 dn_loss_cls: 0.2347 dn_loss_bbox: 0.1615 dn_loss_iou: 0.2220 d0.dn_loss_cls: 0.3150 d0.dn_loss_bbox: 0.3185 d0.dn_loss_iou: 0.3684 d1.dn_loss_cls: 0.2687 d1.dn_loss_bbox: 0.1984 d1.dn_loss_iou: 0.2541 d2.dn_loss_cls: 0.2478 d2.dn_loss_bbox: 0.1716 d2.dn_loss_iou: 0.2315 d3.dn_loss_cls: 0.2399 d3.dn_loss_bbox: 0.1631 d3.dn_loss_iou: 0.2247 d4.dn_loss_cls: 0.2364 d4.dn_loss_bbox: 0.1615 d4.dn_loss_iou: 0.2219 d1.loss_lmm_region: 0.2345 loss_lmm_image: 0.9113 2024/11/10 16:30:12 - mmengine - INFO - Iter(train) [ 12300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:21:56 time: 2.0102 data_time: 0.0165 memory: 34064 grad_norm: 25.4387 loss: 8.3643 loss_cls: 0.2787 loss_bbox: 0.0846 loss_iou: 0.1766 d0.loss_cls: 0.3324 d0.loss_bbox: 0.0992 d0.loss_iou: 0.1966 d1.loss_cls: 0.3057 d1.loss_bbox: 0.0869 d1.loss_iou: 0.1807 d2.loss_cls: 0.2881 d2.loss_bbox: 0.0852 d2.loss_iou: 0.1784 d3.loss_cls: 0.2834 d3.loss_bbox: 0.0843 d3.loss_iou: 0.1769 d4.loss_cls: 0.2802 d4.loss_bbox: 0.0841 d4.loss_iou: 0.1757 enc_loss_cls: 0.3147 enc_loss_bbox: 0.1165 enc_loss_iou: 0.2179 dn_loss_cls: 0.1350 dn_loss_bbox: 0.1401 dn_loss_iou: 0.1855 d0.dn_loss_cls: 0.2077 d0.dn_loss_bbox: 0.2739 d0.dn_loss_iou: 0.3260 d1.dn_loss_cls: 0.1656 d1.dn_loss_bbox: 0.1704 d1.dn_loss_iou: 0.2178 d2.dn_loss_cls: 0.1493 d2.dn_loss_bbox: 0.1461 d2.dn_loss_iou: 0.1930 d3.dn_loss_cls: 0.1391 d3.dn_loss_bbox: 0.1412 d3.dn_loss_iou: 0.1878 d4.dn_loss_cls: 0.1347 d4.dn_loss_bbox: 0.1402 d4.dn_loss_iou: 0.1854 d1.loss_lmm_region: 0.1651 loss_lmm_image: 0.9336 2024/11/10 16:33:33 - mmengine - INFO - Iter(train) [ 12400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:18:52 time: 2.0138 data_time: 0.0165 memory: 34873 grad_norm: 27.0804 loss: 8.4694 loss_cls: 0.2557 loss_bbox: 0.0863 loss_iou: 0.1756 d0.loss_cls: 0.3065 d0.loss_bbox: 0.0937 d0.loss_iou: 0.1897 d1.loss_cls: 0.2758 d1.loss_bbox: 0.0859 d1.loss_iou: 0.1774 d2.loss_cls: 0.2654 d2.loss_bbox: 0.0849 d2.loss_iou: 0.1756 d3.loss_cls: 0.2592 d3.loss_bbox: 0.0848 d3.loss_iou: 0.1755 d4.loss_cls: 0.2567 d4.loss_bbox: 0.0846 d4.loss_iou: 0.1745 enc_loss_cls: 0.3027 enc_loss_bbox: 0.1073 enc_loss_iou: 
0.2084 dn_loss_cls: 0.1498 dn_loss_bbox: 0.1474 dn_loss_iou: 0.1880 d0.dn_loss_cls: 0.2346 d0.dn_loss_bbox: 0.3125 d0.dn_loss_iou: 0.3411 d1.dn_loss_cls: 0.1786 d1.dn_loss_bbox: 0.1852 d1.dn_loss_iou: 0.2231 d2.dn_loss_cls: 0.1649 d2.dn_loss_bbox: 0.1579 d2.dn_loss_iou: 0.1985 d3.dn_loss_cls: 0.1563 d3.dn_loss_bbox: 0.1495 d3.dn_loss_iou: 0.1913 d4.dn_loss_cls: 0.1519 d4.dn_loss_bbox: 0.1473 d4.dn_loss_iou: 0.1880 d1.loss_lmm_region: 0.1837 loss_lmm_image: 0.9934 2024/11/10 16:36:52 - mmengine - INFO - Iter(train) [ 12500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:15:28 time: 2.0005 data_time: 0.0165 memory: 34202 grad_norm: 24.7653 loss: 10.1798 loss_cls: 0.3068 loss_bbox: 0.1327 loss_iou: 0.2501 d0.loss_cls: 0.3590 d0.loss_bbox: 0.1537 d0.loss_iou: 0.2695 d1.loss_cls: 0.3244 d1.loss_bbox: 0.1431 d1.loss_iou: 0.2596 d2.loss_cls: 0.3192 d2.loss_bbox: 0.1338 d2.loss_iou: 0.2530 d3.loss_cls: 0.3115 d3.loss_bbox: 0.1353 d3.loss_iou: 0.2520 d4.loss_cls: 0.3096 d4.loss_bbox: 0.1322 d4.loss_iou: 0.2498 enc_loss_cls: 0.3585 enc_loss_bbox: 0.1624 enc_loss_iou: 0.2897 dn_loss_cls: 0.1403 dn_loss_bbox: 0.1954 dn_loss_iou: 0.2371 d0.dn_loss_cls: 0.2207 d0.dn_loss_bbox: 0.3472 d0.dn_loss_iou: 0.3839 d1.dn_loss_cls: 0.1693 d1.dn_loss_bbox: 0.2294 d1.dn_loss_iou: 0.2698 d2.dn_loss_cls: 0.1502 d2.dn_loss_bbox: 0.2055 d2.dn_loss_iou: 0.2464 d3.dn_loss_cls: 0.1439 d3.dn_loss_bbox: 0.1969 d3.dn_loss_iou: 0.2393 d4.dn_loss_cls: 0.1404 d4.dn_loss_bbox: 0.1953 d4.dn_loss_iou: 0.2370 d1.loss_lmm_region: 0.1960 loss_lmm_image: 0.9299 2024/11/10 16:40:10 - mmengine - INFO - Iter(train) [ 12600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:11:49 time: 1.9875 data_time: 0.0164 memory: 33674 grad_norm: 26.6972 loss: 12.2346 loss_cls: 0.4106 loss_bbox: 0.1560 loss_iou: 0.2661 d0.loss_cls: 0.4724 d0.loss_bbox: 0.1719 d0.loss_iou: 0.2852 d1.loss_cls: 0.4399 d1.loss_bbox: 0.1634 d1.loss_iou: 0.2757 d2.loss_cls: 0.4263 d2.loss_bbox: 0.1597 d2.loss_iou: 0.2696 d3.loss_cls: 0.4191 d3.loss_bbox: 0.1572 d3.loss_iou: 0.2679 d4.loss_cls: 0.4116 d4.loss_bbox: 0.1553 d4.loss_iou: 0.2651 enc_loss_cls: 0.4633 enc_loss_bbox: 0.1878 enc_loss_iou: 0.3076 dn_loss_cls: 0.2571 dn_loss_bbox: 0.2179 dn_loss_iou: 0.2571 d0.dn_loss_cls: 0.3433 d0.dn_loss_bbox: 0.3813 d0.dn_loss_iou: 0.4111 d1.dn_loss_cls: 0.2939 d1.dn_loss_bbox: 0.2542 d1.dn_loss_iou: 0.2895 d2.dn_loss_cls: 0.2687 d2.dn_loss_bbox: 0.2262 d2.dn_loss_iou: 0.2662 d3.dn_loss_cls: 0.2616 d3.dn_loss_bbox: 0.2209 d3.dn_loss_iou: 0.2601 d4.dn_loss_cls: 0.2555 d4.dn_loss_bbox: 0.2180 d4.dn_loss_iou: 0.2572 d1.loss_lmm_region: 0.2399 loss_lmm_image: 0.9230 2024/11/10 16:43:30 - mmengine - INFO - Iter(train) [ 12700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:08:38 time: 2.0042 data_time: 0.0164 memory: 34774 grad_norm: 27.6860 loss: 10.8046 loss_cls: 0.3599 loss_bbox: 0.1480 loss_iou: 0.2323 d0.loss_cls: 0.4067 d0.loss_bbox: 0.1695 d0.loss_iou: 0.2555 d1.loss_cls: 0.3844 d1.loss_bbox: 0.1506 d1.loss_iou: 0.2362 d2.loss_cls: 0.3686 d2.loss_bbox: 0.1490 d2.loss_iou: 0.2367 d3.loss_cls: 0.3669 d3.loss_bbox: 0.1479 d3.loss_iou: 0.2323 d4.loss_cls: 0.3581 d4.loss_bbox: 0.1509 d4.loss_iou: 0.2342 enc_loss_cls: 0.3968 enc_loss_bbox: 0.1823 enc_loss_iou: 0.2782 dn_loss_cls: 0.1535 dn_loss_bbox: 0.1932 dn_loss_iou: 0.2401 d0.dn_loss_cls: 0.2586 d0.dn_loss_bbox: 0.3806 d0.dn_loss_iou: 0.4085 d1.dn_loss_cls: 0.2051 d1.dn_loss_bbox: 0.2374 d1.dn_loss_iou: 0.2784 d2.dn_loss_cls: 0.1735 d2.dn_loss_bbox: 0.2074 d2.dn_loss_iou: 0.2522 d3.dn_loss_cls: 0.1637 
d3.dn_loss_bbox: 0.1958 d3.dn_loss_iou: 0.2430 d4.dn_loss_cls: 0.1551 d4.dn_loss_bbox: 0.1933 d4.dn_loss_iou: 0.2401 d1.loss_lmm_region: 0.2037 loss_lmm_image: 0.9764 2024/11/10 16:46:49 - mmengine - INFO - Iter(train) [ 12800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:05:11 time: 1.9905 data_time: 0.0164 memory: 34133 grad_norm: 26.9282 loss: 10.7072 loss_cls: 0.3434 loss_bbox: 0.1443 loss_iou: 0.2468 d0.loss_cls: 0.3839 d0.loss_bbox: 0.1573 d0.loss_iou: 0.2597 d1.loss_cls: 0.3509 d1.loss_bbox: 0.1537 d1.loss_iou: 0.2560 d2.loss_cls: 0.3503 d2.loss_bbox: 0.1466 d2.loss_iou: 0.2483 d3.loss_cls: 0.3454 d3.loss_bbox: 0.1435 d3.loss_iou: 0.2452 d4.loss_cls: 0.3421 d4.loss_bbox: 0.1450 d4.loss_iou: 0.2455 enc_loss_cls: 0.3757 enc_loss_bbox: 0.1717 enc_loss_iou: 0.2832 dn_loss_cls: 0.1702 dn_loss_bbox: 0.1928 dn_loss_iou: 0.2431 d0.dn_loss_cls: 0.2597 d0.dn_loss_bbox: 0.3488 d0.dn_loss_iou: 0.3980 d1.dn_loss_cls: 0.2073 d1.dn_loss_bbox: 0.2242 d1.dn_loss_iou: 0.2758 d2.dn_loss_cls: 0.1831 d2.dn_loss_bbox: 0.2008 d2.dn_loss_iou: 0.2522 d3.dn_loss_cls: 0.1730 d3.dn_loss_bbox: 0.1949 d3.dn_loss_iou: 0.2453 d4.dn_loss_cls: 0.1705 d4.dn_loss_bbox: 0.1928 d4.dn_loss_iou: 0.2431 d1.loss_lmm_region: 0.2131 loss_lmm_image: 0.9800 2024/11/10 16:50:08 - mmengine - INFO - Iter(train) [ 12900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 4:01:39 time: 1.9918 data_time: 0.0167 memory: 33982 grad_norm: 25.8118 loss: 10.0918 loss_cls: 0.3274 loss_bbox: 0.1398 loss_iou: 0.2467 d0.loss_cls: 0.3783 d0.loss_bbox: 0.1497 d0.loss_iou: 0.2598 d1.loss_cls: 0.3462 d1.loss_bbox: 0.1485 d1.loss_iou: 0.2587 d2.loss_cls: 0.3380 d2.loss_bbox: 0.1409 d2.loss_iou: 0.2476 d3.loss_cls: 0.3344 d3.loss_bbox: 0.1391 d3.loss_iou: 0.2454 d4.loss_cls: 0.3309 d4.loss_bbox: 0.1406 d4.loss_iou: 0.2463 enc_loss_cls: 0.3721 enc_loss_bbox: 0.1608 enc_loss_iou: 0.2767 dn_loss_cls: 0.1584 dn_loss_bbox: 0.1638 dn_loss_iou: 0.2129 d0.dn_loss_cls: 0.2340 d0.dn_loss_bbox: 0.3330 d0.dn_loss_iou: 0.3641 d1.dn_loss_cls: 0.1874 d1.dn_loss_bbox: 0.2016 d1.dn_loss_iou: 0.2477 d2.dn_loss_cls: 0.1690 d2.dn_loss_bbox: 0.1747 d2.dn_loss_iou: 0.2222 d3.dn_loss_cls: 0.1617 d3.dn_loss_bbox: 0.1658 d3.dn_loss_iou: 0.2148 d4.dn_loss_cls: 0.1602 d4.dn_loss_bbox: 0.1639 d4.dn_loss_iou: 0.2128 d1.loss_lmm_region: 0.2153 loss_lmm_image: 0.9009 2024/11/10 16:53:28 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 16:53:28 - mmengine - INFO - Iter(train) [ 13000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:58:27 time: 2.0164 data_time: 0.0164 memory: 33664 grad_norm: 30.8348 loss: 9.6984 loss_cls: 0.3234 loss_bbox: 0.1251 loss_iou: 0.2192 d0.loss_cls: 0.3616 d0.loss_bbox: 0.1375 d0.loss_iou: 0.2363 d1.loss_cls: 0.3384 d1.loss_bbox: 0.1254 d1.loss_iou: 0.2224 d2.loss_cls: 0.3317 d2.loss_bbox: 0.1265 d2.loss_iou: 0.2207 d3.loss_cls: 0.3272 d3.loss_bbox: 0.1239 d3.loss_iou: 0.2171 d4.loss_cls: 0.3239 d4.loss_bbox: 0.1256 d4.loss_iou: 0.2190 enc_loss_cls: 0.3641 enc_loss_bbox: 0.1499 enc_loss_iou: 0.2565 dn_loss_cls: 0.1586 dn_loss_bbox: 0.1516 dn_loss_iou: 0.2057 d0.dn_loss_cls: 0.2323 d0.dn_loss_bbox: 0.2974 d0.dn_loss_iou: 0.3476 d1.dn_loss_cls: 0.1860 d1.dn_loss_bbox: 0.1857 d1.dn_loss_iou: 0.2382 d2.dn_loss_cls: 0.1671 d2.dn_loss_bbox: 0.1604 d2.dn_loss_iou: 0.2157 d3.dn_loss_cls: 0.1613 d3.dn_loss_bbox: 0.1531 d3.dn_loss_iou: 0.2079 d4.dn_loss_cls: 0.1605 d4.dn_loss_bbox: 0.1516 d4.dn_loss_iou: 0.2057 d1.loss_lmm_region: 0.2286 loss_lmm_image: 1.0079 2024/11/10 16:56:48 - mmengine - INFO - 
Iter(train) [ 13100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:55:04 time: 2.0202 data_time: 0.0167 memory: 34484 grad_norm: 26.6127 loss: 11.6765 loss_cls: 0.3849 loss_bbox: 0.1703 loss_iou: 0.3088 d0.loss_cls: 0.4490 d0.loss_bbox: 0.1856 d0.loss_iou: 0.3288 d1.loss_cls: 0.4060 d1.loss_bbox: 0.1779 d1.loss_iou: 0.3179 d2.loss_cls: 0.4025 d2.loss_bbox: 0.1660 d2.loss_iou: 0.3081 d3.loss_cls: 0.3859 d3.loss_bbox: 0.1709 d3.loss_iou: 0.3077 d4.loss_cls: 0.3836 d4.loss_bbox: 0.1712 d4.loss_iou: 0.3093 enc_loss_cls: 0.4386 enc_loss_bbox: 0.1983 enc_loss_iou: 0.3566 dn_loss_cls: 0.1717 dn_loss_bbox: 0.1965 dn_loss_iou: 0.2488 d0.dn_loss_cls: 0.2456 d0.dn_loss_bbox: 0.3356 d0.dn_loss_iou: 0.3876 d1.dn_loss_cls: 0.2002 d1.dn_loss_bbox: 0.2260 d1.dn_loss_iou: 0.2785 d2.dn_loss_cls: 0.1855 d2.dn_loss_bbox: 0.2061 d2.dn_loss_iou: 0.2590 d3.dn_loss_cls: 0.1759 d3.dn_loss_bbox: 0.1977 d3.dn_loss_iou: 0.2515 d4.dn_loss_cls: 0.1731 d4.dn_loss_bbox: 0.1965 d4.dn_loss_iou: 0.2487 d1.loss_lmm_region: 0.1994 loss_lmm_image: 0.9645 2024/11/10 17:00:06 - mmengine - INFO - Iter(train) [ 13200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:51:30 time: 1.9701 data_time: 0.0166 memory: 34858 grad_norm: 25.3685 loss: 10.1638 loss_cls: 0.3059 loss_bbox: 0.1374 loss_iou: 0.2296 d0.loss_cls: 0.3626 d0.loss_bbox: 0.1579 d0.loss_iou: 0.2485 d1.loss_cls: 0.3378 d1.loss_bbox: 0.1399 d1.loss_iou: 0.2338 d2.loss_cls: 0.3202 d2.loss_bbox: 0.1363 d2.loss_iou: 0.2271 d3.loss_cls: 0.3116 d3.loss_bbox: 0.1347 d3.loss_iou: 0.2271 d4.loss_cls: 0.3061 d4.loss_bbox: 0.1361 d4.loss_iou: 0.2288 enc_loss_cls: 0.3515 enc_loss_bbox: 0.1709 enc_loss_iou: 0.2673 dn_loss_cls: 0.1389 dn_loss_bbox: 0.1914 dn_loss_iou: 0.2341 d0.dn_loss_cls: 0.2365 d0.dn_loss_bbox: 0.3617 d0.dn_loss_iou: 0.3958 d1.dn_loss_cls: 0.1817 d1.dn_loss_bbox: 0.2249 d1.dn_loss_iou: 0.2675 d2.dn_loss_cls: 0.1555 d2.dn_loss_bbox: 0.1990 d2.dn_loss_iou: 0.2430 d3.dn_loss_cls: 0.1462 d3.dn_loss_bbox: 0.1938 d3.dn_loss_iou: 0.2369 d4.dn_loss_cls: 0.1399 d4.dn_loss_bbox: 0.1913 d4.dn_loss_iou: 0.2341 d1.loss_lmm_region: 0.2147 loss_lmm_image: 1.0058 2024/11/10 17:03:24 - mmengine - INFO - Iter(train) [ 13300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:47:58 time: 1.9661 data_time: 0.0167 memory: 32355 grad_norm: 29.3986 loss: 10.5202 loss_cls: 0.3674 loss_bbox: 0.1517 loss_iou: 0.2663 d0.loss_cls: 0.4282 d0.loss_bbox: 0.1554 d0.loss_iou: 0.2750 d1.loss_cls: 0.3883 d1.loss_bbox: 0.1549 d1.loss_iou: 0.2709 d2.loss_cls: 0.3739 d2.loss_bbox: 0.1483 d2.loss_iou: 0.2631 d3.loss_cls: 0.3698 d3.loss_bbox: 0.1504 d3.loss_iou: 0.2633 d4.loss_cls: 0.3693 d4.loss_bbox: 0.1486 d4.loss_iou: 0.2633 enc_loss_cls: 0.4208 enc_loss_bbox: 0.1812 enc_loss_iou: 0.3085 dn_loss_cls: 0.1540 dn_loss_bbox: 0.1622 dn_loss_iou: 0.2080 d0.dn_loss_cls: 0.2329 d0.dn_loss_bbox: 0.3164 d0.dn_loss_iou: 0.3583 d1.dn_loss_cls: 0.1845 d1.dn_loss_bbox: 0.1948 d1.dn_loss_iou: 0.2405 d2.dn_loss_cls: 0.1681 d2.dn_loss_bbox: 0.1703 d2.dn_loss_iou: 0.2168 d3.dn_loss_cls: 0.1592 d3.dn_loss_bbox: 0.1635 d3.dn_loss_iou: 0.2099 d4.dn_loss_cls: 0.1540 d4.dn_loss_bbox: 0.1622 d4.dn_loss_iou: 0.2079 d1.loss_lmm_region: 0.2080 loss_lmm_image: 0.9304 2024/11/10 17:06:42 - mmengine - INFO - Iter(train) [ 13400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:44:21 time: 1.9897 data_time: 0.0166 memory: 34386 grad_norm: 26.0733 loss: 9.1811 loss_cls: 0.2686 loss_bbox: 0.1205 loss_iou: 0.2067 d0.loss_cls: 0.3262 d0.loss_bbox: 0.1291 d0.loss_iou: 0.2197 d1.loss_cls: 0.2872 
d1.loss_bbox: 0.1233 d1.loss_iou: 0.2118 d2.loss_cls: 0.2819 d2.loss_bbox: 0.1170 d2.loss_iou: 0.2056 d3.loss_cls: 0.2716 d3.loss_bbox: 0.1196 d3.loss_iou: 0.2057 d4.loss_cls: 0.2712 d4.loss_bbox: 0.1203 d4.loss_iou: 0.2066 enc_loss_cls: 0.3171 enc_loss_bbox: 0.1409 enc_loss_iou: 0.2369 dn_loss_cls: 0.1612 dn_loss_bbox: 0.1626 dn_loss_iou: 0.2055 d0.dn_loss_cls: 0.2448 d0.dn_loss_bbox: 0.3157 d0.dn_loss_iou: 0.3564 d1.dn_loss_cls: 0.1945 d1.dn_loss_bbox: 0.1969 d1.dn_loss_iou: 0.2366 d2.dn_loss_cls: 0.1723 d2.dn_loss_bbox: 0.1700 d2.dn_loss_iou: 0.2145 d3.dn_loss_cls: 0.1647 d3.dn_loss_bbox: 0.1634 d3.dn_loss_iou: 0.2076 d4.dn_loss_cls: 0.1643 d4.dn_loss_bbox: 0.1626 d4.dn_loss_iou: 0.2055 d1.loss_lmm_region: 0.1870 loss_lmm_image: 0.9076 2024/11/10 17:10:01 - mmengine - INFO - Iter(train) [ 13500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:40:55 time: 2.0046 data_time: 0.0167 memory: 35114 grad_norm: 26.9327 loss: 10.2600 loss_cls: 0.3254 loss_bbox: 0.1547 loss_iou: 0.2978 d0.loss_cls: 0.3837 d0.loss_bbox: 0.1638 d0.loss_iou: 0.3070 d1.loss_cls: 0.3491 d1.loss_bbox: 0.1581 d1.loss_iou: 0.3000 d2.loss_cls: 0.3387 d2.loss_bbox: 0.1543 d2.loss_iou: 0.2956 d3.loss_cls: 0.3303 d3.loss_bbox: 0.1545 d3.loss_iou: 0.2957 d4.loss_cls: 0.3299 d4.loss_bbox: 0.1531 d4.loss_iou: 0.2961 enc_loss_cls: 0.3738 enc_loss_bbox: 0.1784 enc_loss_iou: 0.3343 dn_loss_cls: 0.1168 dn_loss_bbox: 0.1440 dn_loss_iou: 0.2256 d0.dn_loss_cls: 0.1955 d0.dn_loss_bbox: 0.2906 d0.dn_loss_iou: 0.3768 d1.dn_loss_cls: 0.1485 d1.dn_loss_bbox: 0.1736 d1.dn_loss_iou: 0.2563 d2.dn_loss_cls: 0.1292 d2.dn_loss_bbox: 0.1536 d2.dn_loss_iou: 0.2356 d3.dn_loss_cls: 0.1224 d3.dn_loss_bbox: 0.1455 d3.dn_loss_iou: 0.2281 d4.dn_loss_cls: 0.1172 d4.dn_loss_bbox: 0.1439 d4.dn_loss_iou: 0.2254 d1.loss_lmm_region: 0.1790 loss_lmm_image: 0.9781 2024/11/10 17:13:22 - mmengine - INFO - Iter(train) [ 13600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:37:45 time: 2.0060 data_time: 0.0166 memory: 33792 grad_norm: 26.9554 loss: 9.7403 loss_cls: 0.3078 loss_bbox: 0.1409 loss_iou: 0.2432 d0.loss_cls: 0.3605 d0.loss_bbox: 0.1525 d0.loss_iou: 0.2606 d1.loss_cls: 0.3334 d1.loss_bbox: 0.1426 d1.loss_iou: 0.2492 d2.loss_cls: 0.3247 d2.loss_bbox: 0.1375 d2.loss_iou: 0.2424 d3.loss_cls: 0.3158 d3.loss_bbox: 0.1377 d3.loss_iou: 0.2411 d4.loss_cls: 0.3092 d4.loss_bbox: 0.1417 d4.loss_iou: 0.2449 enc_loss_cls: 0.3563 enc_loss_bbox: 0.1712 enc_loss_iou: 0.2885 dn_loss_cls: 0.1328 dn_loss_bbox: 0.1509 dn_loss_iou: 0.2058 d0.dn_loss_cls: 0.2213 d0.dn_loss_bbox: 0.2953 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1737 d1.dn_loss_bbox: 0.1864 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1494 d2.dn_loss_bbox: 0.1607 d2.dn_loss_iou: 0.2164 d3.dn_loss_cls: 0.1385 d3.dn_loss_bbox: 0.1532 d3.dn_loss_iou: 0.2089 d4.dn_loss_cls: 0.1332 d4.dn_loss_bbox: 0.1509 d4.dn_loss_iou: 0.2057 d1.loss_lmm_region: 0.1845 loss_lmm_image: 0.9754 2024/11/10 17:16:41 - mmengine - INFO - Iter(train) [ 13700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:34:23 time: 2.0003 data_time: 0.0167 memory: 34911 grad_norm: 28.3423 loss: 9.6841 loss_cls: 0.3170 loss_bbox: 0.1220 loss_iou: 0.2376 d0.loss_cls: 0.3692 d0.loss_bbox: 0.1333 d0.loss_iou: 0.2482 d1.loss_cls: 0.3327 d1.loss_bbox: 0.1283 d1.loss_iou: 0.2444 d2.loss_cls: 0.3219 d2.loss_bbox: 0.1248 d2.loss_iou: 0.2384 d3.loss_cls: 0.3159 d3.loss_bbox: 0.1263 d3.loss_iou: 0.2409 d4.loss_cls: 0.3163 d4.loss_bbox: 0.1245 d4.loss_iou: 0.2380 enc_loss_cls: 0.3583 enc_loss_bbox: 0.1457 enc_loss_iou: 0.2716 dn_loss_cls: 
0.1453 dn_loss_bbox: 0.1475 dn_loss_iou: 0.2128 d0.dn_loss_cls: 0.2361 d0.dn_loss_bbox: 0.2805 d0.dn_loss_iou: 0.3535 d1.dn_loss_cls: 0.1831 d1.dn_loss_bbox: 0.1777 d1.dn_loss_iou: 0.2422 d2.dn_loss_cls: 0.1564 d2.dn_loss_bbox: 0.1588 d2.dn_loss_iou: 0.2234 d3.dn_loss_cls: 0.1479 d3.dn_loss_bbox: 0.1496 d3.dn_loss_iou: 0.2155 d4.dn_loss_cls: 0.1446 d4.dn_loss_bbox: 0.1475 d4.dn_loss_iou: 0.2130 d1.loss_lmm_region: 0.2012 loss_lmm_image: 0.9922 2024/11/10 17:20:01 - mmengine - INFO - Iter(train) [ 13800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:31:03 time: 2.0158 data_time: 0.0167 memory: 34932 grad_norm: 26.6226 loss: 10.0288 loss_cls: 0.3335 loss_bbox: 0.1311 loss_iou: 0.2492 d0.loss_cls: 0.3898 d0.loss_bbox: 0.1411 d0.loss_iou: 0.2641 d1.loss_cls: 0.3564 d1.loss_bbox: 0.1352 d1.loss_iou: 0.2544 d2.loss_cls: 0.3483 d2.loss_bbox: 0.1307 d2.loss_iou: 0.2504 d3.loss_cls: 0.3418 d3.loss_bbox: 0.1276 d3.loss_iou: 0.2469 d4.loss_cls: 0.3361 d4.loss_bbox: 0.1299 d4.loss_iou: 0.2473 enc_loss_cls: 0.3759 enc_loss_bbox: 0.1621 enc_loss_iou: 0.2954 dn_loss_cls: 0.1532 dn_loss_bbox: 0.1584 dn_loss_iou: 0.2129 d0.dn_loss_cls: 0.2363 d0.dn_loss_bbox: 0.3109 d0.dn_loss_iou: 0.3627 d1.dn_loss_cls: 0.1865 d1.dn_loss_bbox: 0.1909 d1.dn_loss_iou: 0.2473 d2.dn_loss_cls: 0.1633 d2.dn_loss_bbox: 0.1691 d2.dn_loss_iou: 0.2234 d3.dn_loss_cls: 0.1576 d3.dn_loss_bbox: 0.1597 d3.dn_loss_iou: 0.2150 d4.dn_loss_cls: 0.1542 d4.dn_loss_bbox: 0.1584 d4.dn_loss_iou: 0.2129 d1.loss_lmm_region: 0.1951 loss_lmm_image: 0.9138 2024/11/10 17:23:20 - mmengine - INFO - Iter(train) [ 13900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:27:43 time: 1.9914 data_time: 0.0167 memory: 33911 grad_norm: 25.5573 loss: 10.0576 loss_cls: 0.3255 loss_bbox: 0.1390 loss_iou: 0.2567 d0.loss_cls: 0.3845 d0.loss_bbox: 0.1478 d0.loss_iou: 0.2705 d1.loss_cls: 0.3463 d1.loss_bbox: 0.1404 d1.loss_iou: 0.2622 d2.loss_cls: 0.3318 d2.loss_bbox: 0.1413 d2.loss_iou: 0.2591 d3.loss_cls: 0.3336 d3.loss_bbox: 0.1383 d3.loss_iou: 0.2557 d4.loss_cls: 0.3280 d4.loss_bbox: 0.1359 d4.loss_iou: 0.2541 enc_loss_cls: 0.3826 enc_loss_bbox: 0.1589 enc_loss_iou: 0.2930 dn_loss_cls: 0.1335 dn_loss_bbox: 0.1577 dn_loss_iou: 0.2135 d0.dn_loss_cls: 0.2222 d0.dn_loss_bbox: 0.3210 d0.dn_loss_iou: 0.3634 d1.dn_loss_cls: 0.1661 d1.dn_loss_bbox: 0.1972 d1.dn_loss_iou: 0.2494 d2.dn_loss_cls: 0.1460 d2.dn_loss_bbox: 0.1720 d2.dn_loss_iou: 0.2258 d3.dn_loss_cls: 0.1386 d3.dn_loss_bbox: 0.1611 d3.dn_loss_iou: 0.2169 d4.dn_loss_cls: 0.1345 d4.dn_loss_bbox: 0.1578 d4.dn_loss_iou: 0.2135 d1.loss_lmm_region: 0.1959 loss_lmm_image: 0.9862 2024/11/10 17:26:39 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 17:26:39 - mmengine - INFO - Iter(train) [ 14000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:24:19 time: 1.9922 data_time: 0.0168 memory: 35916 grad_norm: 26.9818 loss: 10.7588 loss_cls: 0.3755 loss_bbox: 0.1382 loss_iou: 0.2450 d0.loss_cls: 0.4202 d0.loss_bbox: 0.1490 d0.loss_iou: 0.2670 d1.loss_cls: 0.3827 d1.loss_bbox: 0.1460 d1.loss_iou: 0.2564 d2.loss_cls: 0.3833 d2.loss_bbox: 0.1405 d2.loss_iou: 0.2486 d3.loss_cls: 0.3769 d3.loss_bbox: 0.1389 d3.loss_iou: 0.2481 d4.loss_cls: 0.3757 d4.loss_bbox: 0.1369 d4.loss_iou: 0.2453 enc_loss_cls: 0.4091 enc_loss_bbox: 0.1659 enc_loss_iou: 0.2880 dn_loss_cls: 0.1852 dn_loss_bbox: 0.1650 dn_loss_iou: 0.2257 d0.dn_loss_cls: 0.2683 d0.dn_loss_bbox: 0.3105 d0.dn_loss_iou: 0.3738 d1.dn_loss_cls: 0.2168 d1.dn_loss_bbox: 0.2022 d1.dn_loss_iou: 0.2601 d2.dn_loss_cls: 0.1973 
d2.dn_loss_bbox: 0.1772 d2.dn_loss_iou: 0.2372 d3.dn_loss_cls: 0.1906 d3.dn_loss_bbox: 0.1684 d3.dn_loss_iou: 0.2289 d4.dn_loss_cls: 0.1835 d4.dn_loss_bbox: 0.1651 d4.dn_loss_iou: 0.2257 d1.loss_lmm_region: 0.2280 loss_lmm_image: 1.0122 2024/11/10 17:29:58 - mmengine - INFO - Iter(train) [ 14100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:20:53 time: 1.9877 data_time: 0.0166 memory: 33623 grad_norm: 28.2566 loss: 10.1263 loss_cls: 0.2842 loss_bbox: 0.1245 loss_iou: 0.2223 d0.loss_cls: 0.3241 d0.loss_bbox: 0.1368 d0.loss_iou: 0.2358 d1.loss_cls: 0.3028 d1.loss_bbox: 0.1338 d1.loss_iou: 0.2272 d2.loss_cls: 0.2907 d2.loss_bbox: 0.1299 d2.loss_iou: 0.2231 d3.loss_cls: 0.2883 d3.loss_bbox: 0.1246 d3.loss_iou: 0.2221 d4.loss_cls: 0.2845 d4.loss_bbox: 0.1253 d4.loss_iou: 0.2220 enc_loss_cls: 0.3166 enc_loss_bbox: 0.1539 enc_loss_iou: 0.2599 dn_loss_cls: 0.2306 dn_loss_bbox: 0.1763 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.3095 d0.dn_loss_bbox: 0.3389 d0.dn_loss_iou: 0.3784 d1.dn_loss_cls: 0.2612 d1.dn_loss_bbox: 0.2115 d1.dn_loss_iou: 0.2580 d2.dn_loss_cls: 0.2423 d2.dn_loss_bbox: 0.1857 d2.dn_loss_iou: 0.2341 d3.dn_loss_cls: 0.2341 d3.dn_loss_bbox: 0.1790 d3.dn_loss_iou: 0.2275 d4.dn_loss_cls: 0.2293 d4.dn_loss_bbox: 0.1764 d4.dn_loss_iou: 0.2248 d1.loss_lmm_region: 0.1962 loss_lmm_image: 0.9753 2024/11/10 17:33:19 - mmengine - INFO - Iter(train) [ 14200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:17:45 time: 2.0089 data_time: 0.0166 memory: 34897 grad_norm: 32.6170 loss: 9.5867 loss_cls: 0.3124 loss_bbox: 0.1209 loss_iou: 0.2344 d0.loss_cls: 0.3666 d0.loss_bbox: 0.1308 d0.loss_iou: 0.2483 d1.loss_cls: 0.3415 d1.loss_bbox: 0.1232 d1.loss_iou: 0.2418 d2.loss_cls: 0.3256 d2.loss_bbox: 0.1219 d2.loss_iou: 0.2372 d3.loss_cls: 0.3222 d3.loss_bbox: 0.1205 d3.loss_iou: 0.2331 d4.loss_cls: 0.3156 d4.loss_bbox: 0.1206 d4.loss_iou: 0.2333 enc_loss_cls: 0.3653 enc_loss_bbox: 0.1447 enc_loss_iou: 0.2732 dn_loss_cls: 0.1647 dn_loss_bbox: 0.1400 dn_loss_iou: 0.1877 d0.dn_loss_cls: 0.2439 d0.dn_loss_bbox: 0.2863 d0.dn_loss_iou: 0.3330 d1.dn_loss_cls: 0.1963 d1.dn_loss_bbox: 0.1704 d1.dn_loss_iou: 0.2199 d2.dn_loss_cls: 0.1771 d2.dn_loss_bbox: 0.1512 d2.dn_loss_iou: 0.1982 d3.dn_loss_cls: 0.1683 d3.dn_loss_bbox: 0.1428 d3.dn_loss_iou: 0.1902 d4.dn_loss_cls: 0.1647 d4.dn_loss_bbox: 0.1400 d4.dn_loss_iou: 0.1876 d1.loss_lmm_region: 0.2144 loss_lmm_image: 0.9769 2024/11/10 17:36:40 - mmengine - INFO - Iter(train) [ 14300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:14:34 time: 1.9939 data_time: 0.0186 memory: 34191 grad_norm: 23.0606 loss: 9.5422 loss_cls: 0.2852 loss_bbox: 0.1271 loss_iou: 0.2479 d0.loss_cls: 0.3293 d0.loss_bbox: 0.1419 d0.loss_iou: 0.2621 d1.loss_cls: 0.3036 d1.loss_bbox: 0.1360 d1.loss_iou: 0.2561 d2.loss_cls: 0.2942 d2.loss_bbox: 0.1308 d2.loss_iou: 0.2528 d3.loss_cls: 0.2881 d3.loss_bbox: 0.1302 d3.loss_iou: 0.2489 d4.loss_cls: 0.2847 d4.loss_bbox: 0.1273 d4.loss_iou: 0.2488 enc_loss_cls: 0.3190 enc_loss_bbox: 0.1602 enc_loss_iou: 0.2859 dn_loss_cls: 0.1109 dn_loss_bbox: 0.1661 dn_loss_iou: 0.2282 d0.dn_loss_cls: 0.1900 d0.dn_loss_bbox: 0.3116 d0.dn_loss_iou: 0.3786 d1.dn_loss_cls: 0.1412 d1.dn_loss_bbox: 0.1986 d1.dn_loss_iou: 0.2593 d2.dn_loss_cls: 0.1239 d2.dn_loss_bbox: 0.1761 d2.dn_loss_iou: 0.2387 d3.dn_loss_cls: 0.1150 d3.dn_loss_bbox: 0.1683 d3.dn_loss_iou: 0.2309 d4.dn_loss_cls: 0.1118 d4.dn_loss_bbox: 0.1661 d4.dn_loss_iou: 0.2280 d1.loss_lmm_region: 0.1764 loss_lmm_image: 0.9622 2024/11/10 17:40:00 - mmengine - INFO - Iter(train) [ 14400/150000] 
base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:11:19 time: 2.0073 data_time: 0.0169 memory: 34317 grad_norm: 30.0010 loss: 10.7465 loss_cls: 0.3860 loss_bbox: 0.1246 loss_iou: 0.2577 d0.loss_cls: 0.4419 d0.loss_bbox: 0.1346 d0.loss_iou: 0.2702 d1.loss_cls: 0.4125 d1.loss_bbox: 0.1248 d1.loss_iou: 0.2542 d2.loss_cls: 0.3966 d2.loss_bbox: 0.1232 d2.loss_iou: 0.2540 d3.loss_cls: 0.3906 d3.loss_bbox: 0.1233 d3.loss_iou: 0.2531 d4.loss_cls: 0.3874 d4.loss_bbox: 0.1241 d4.loss_iou: 0.2570 enc_loss_cls: 0.4336 enc_loss_bbox: 0.1495 enc_loss_iou: 0.2896 dn_loss_cls: 0.1814 dn_loss_bbox: 0.1657 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.2640 d0.dn_loss_bbox: 0.3264 d0.dn_loss_iou: 0.3699 d1.dn_loss_cls: 0.2124 d1.dn_loss_bbox: 0.2080 d1.dn_loss_iou: 0.2585 d2.dn_loss_cls: 0.1975 d2.dn_loss_bbox: 0.1811 d2.dn_loss_iou: 0.2338 d3.dn_loss_cls: 0.1861 d3.dn_loss_bbox: 0.1688 d3.dn_loss_iou: 0.2238 d4.dn_loss_cls: 0.1813 d4.dn_loss_bbox: 0.1657 d4.dn_loss_iou: 0.2204 d1.loss_lmm_region: 0.2451 loss_lmm_image: 0.9476 2024/11/10 17:43:20 - mmengine - INFO - Iter(train) [ 14500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:08:07 time: 1.9911 data_time: 0.0167 memory: 33294 grad_norm: 33.2692 loss: 11.3451 loss_cls: 0.3601 loss_bbox: 0.1662 loss_iou: 0.3135 d0.loss_cls: 0.4194 d0.loss_bbox: 0.1910 d0.loss_iou: 0.3327 d1.loss_cls: 0.3866 d1.loss_bbox: 0.1724 d1.loss_iou: 0.3176 d2.loss_cls: 0.3702 d2.loss_bbox: 0.1673 d2.loss_iou: 0.3140 d3.loss_cls: 0.3653 d3.loss_bbox: 0.1646 d3.loss_iou: 0.3102 d4.loss_cls: 0.3630 d4.loss_bbox: 0.1649 d4.loss_iou: 0.3126 enc_loss_cls: 0.4141 enc_loss_bbox: 0.2035 enc_loss_iou: 0.3603 dn_loss_cls: 0.1179 dn_loss_bbox: 0.2165 dn_loss_iou: 0.2501 d0.dn_loss_cls: 0.2005 d0.dn_loss_bbox: 0.3988 d0.dn_loss_iou: 0.4152 d1.dn_loss_cls: 0.1491 d1.dn_loss_bbox: 0.2560 d1.dn_loss_iou: 0.2877 d2.dn_loss_cls: 0.1287 d2.dn_loss_bbox: 0.2293 d2.dn_loss_iou: 0.2620 d3.dn_loss_cls: 0.1217 d3.dn_loss_bbox: 0.2189 d3.dn_loss_iou: 0.2528 d4.dn_loss_cls: 0.1182 d4.dn_loss_bbox: 0.2165 d4.dn_loss_iou: 0.2501 d1.loss_lmm_region: 0.1813 loss_lmm_image: 0.9045 2024/11/10 17:46:40 - mmengine - INFO - Iter(train) [ 14600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:04:49 time: 2.0031 data_time: 0.0172 memory: 34164 grad_norm: 29.1413 loss: 9.2279 loss_cls: 0.2727 loss_bbox: 0.1222 loss_iou: 0.2231 d0.loss_cls: 0.3120 d0.loss_bbox: 0.1342 d0.loss_iou: 0.2362 d1.loss_cls: 0.2951 d1.loss_bbox: 0.1210 d1.loss_iou: 0.2245 d2.loss_cls: 0.2877 d2.loss_bbox: 0.1154 d2.loss_iou: 0.2186 d3.loss_cls: 0.2820 d3.loss_bbox: 0.1159 d3.loss_iou: 0.2181 d4.loss_cls: 0.2772 d4.loss_bbox: 0.1211 d4.loss_iou: 0.2211 enc_loss_cls: 0.3147 enc_loss_bbox: 0.1478 enc_loss_iou: 0.2511 dn_loss_cls: 0.1232 dn_loss_bbox: 0.1706 dn_loss_iou: 0.2146 d0.dn_loss_cls: 0.1976 d0.dn_loss_bbox: 0.3340 d0.dn_loss_iou: 0.3675 d1.dn_loss_cls: 0.1483 d1.dn_loss_bbox: 0.2139 d1.dn_loss_iou: 0.2515 d2.dn_loss_cls: 0.1344 d2.dn_loss_bbox: 0.1849 d2.dn_loss_iou: 0.2259 d3.dn_loss_cls: 0.1270 d3.dn_loss_bbox: 0.1742 d3.dn_loss_iou: 0.2181 d4.dn_loss_cls: 0.1243 d4.dn_loss_bbox: 0.1707 d4.dn_loss_iou: 0.2146 d1.loss_lmm_region: 0.1863 loss_lmm_image: 0.9349 2024/11/10 17:50:02 - mmengine - INFO - Iter(train) [ 14700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 3:01:55 time: 2.0216 data_time: 0.0168 memory: 35507 grad_norm: 26.8590 loss: 10.5901 loss_cls: 0.3371 loss_bbox: 0.1455 loss_iou: 0.2371 d0.loss_cls: 0.3922 d0.loss_bbox: 0.1527 d0.loss_iou: 0.2468 d1.loss_cls: 0.3658 d1.loss_bbox: 0.1479 d1.loss_iou: 
0.2400 d2.loss_cls: 0.3519 d2.loss_bbox: 0.1464 d2.loss_iou: 0.2401 d3.loss_cls: 0.3412 d3.loss_bbox: 0.1506 d3.loss_iou: 0.2412 d4.loss_cls: 0.3415 d4.loss_bbox: 0.1466 d4.loss_iou: 0.2378 enc_loss_cls: 0.3789 enc_loss_bbox: 0.1634 enc_loss_iou: 0.2655 dn_loss_cls: 0.1754 dn_loss_bbox: 0.1871 dn_loss_iou: 0.2268 d0.dn_loss_cls: 0.2783 d0.dn_loss_bbox: 0.3387 d0.dn_loss_iou: 0.3753 d1.dn_loss_cls: 0.2154 d1.dn_loss_bbox: 0.2217 d1.dn_loss_iou: 0.2603 d2.dn_loss_cls: 0.1931 d2.dn_loss_bbox: 0.1988 d2.dn_loss_iou: 0.2366 d3.dn_loss_cls: 0.1827 d3.dn_loss_bbox: 0.1896 d3.dn_loss_iou: 0.2294 d4.dn_loss_cls: 0.1783 d4.dn_loss_bbox: 0.1873 d4.dn_loss_iou: 0.2268 d1.loss_lmm_region: 0.2275 loss_lmm_image: 0.9906 2024/11/10 17:53:21 - mmengine - INFO - Iter(train) [ 14800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:58:28 time: 1.9568 data_time: 0.0167 memory: 33211 grad_norm: 24.9493 loss: 8.8089 loss_cls: 0.2785 loss_bbox: 0.1097 loss_iou: 0.2164 d0.loss_cls: 0.3188 d0.loss_bbox: 0.1237 d0.loss_iou: 0.2330 d1.loss_cls: 0.2958 d1.loss_bbox: 0.1144 d1.loss_iou: 0.2224 d2.loss_cls: 0.2859 d2.loss_bbox: 0.1133 d2.loss_iou: 0.2211 d3.loss_cls: 0.2831 d3.loss_bbox: 0.1111 d3.loss_iou: 0.2173 d4.loss_cls: 0.2803 d4.loss_bbox: 0.1119 d4.loss_iou: 0.2176 enc_loss_cls: 0.3124 enc_loss_bbox: 0.1395 enc_loss_iou: 0.2576 dn_loss_cls: 0.1022 dn_loss_bbox: 0.1486 dn_loss_iou: 0.2113 d0.dn_loss_cls: 0.1832 d0.dn_loss_bbox: 0.2919 d0.dn_loss_iou: 0.3607 d1.dn_loss_cls: 0.1356 d1.dn_loss_bbox: 0.1781 d1.dn_loss_iou: 0.2438 d2.dn_loss_cls: 0.1195 d2.dn_loss_bbox: 0.1559 d2.dn_loss_iou: 0.2208 d3.dn_loss_cls: 0.1089 d3.dn_loss_bbox: 0.1498 d3.dn_loss_iou: 0.2133 d4.dn_loss_cls: 0.1046 d4.dn_loss_bbox: 0.1485 d4.dn_loss_iou: 0.2113 d1.loss_lmm_region: 0.1520 loss_lmm_image: 0.9049 2024/11/10 17:56:39 - mmengine - INFO - Iter(train) [ 14900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:54:52 time: 1.9854 data_time: 0.0167 memory: 31620 grad_norm: 26.8817 loss: 9.4659 loss_cls: 0.2936 loss_bbox: 0.1329 loss_iou: 0.2260 d0.loss_cls: 0.3580 d0.loss_bbox: 0.1238 d0.loss_iou: 0.2259 d1.loss_cls: 0.3266 d1.loss_bbox: 0.1250 d1.loss_iou: 0.2236 d2.loss_cls: 0.3105 d2.loss_bbox: 0.1227 d2.loss_iou: 0.2202 d3.loss_cls: 0.3050 d3.loss_bbox: 0.1243 d3.loss_iou: 0.2237 d4.loss_cls: 0.2973 d4.loss_bbox: 0.1280 d4.loss_iou: 0.2251 enc_loss_cls: 0.3462 enc_loss_bbox: 0.1433 enc_loss_iou: 0.2509 dn_loss_cls: 0.1391 dn_loss_bbox: 0.1600 dn_loss_iou: 0.2048 d0.dn_loss_cls: 0.2234 d0.dn_loss_bbox: 0.3329 d0.dn_loss_iou: 0.3594 d1.dn_loss_cls: 0.1748 d1.dn_loss_bbox: 0.2005 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1517 d2.dn_loss_bbox: 0.1722 d2.dn_loss_iou: 0.2166 d3.dn_loss_cls: 0.1441 d3.dn_loss_bbox: 0.1625 d3.dn_loss_iou: 0.2076 d4.dn_loss_cls: 0.1394 d4.dn_loss_bbox: 0.1600 d4.dn_loss_iou: 0.2048 d1.loss_lmm_region: 0.1998 loss_lmm_image: 0.9382 2024/11/10 17:59:57 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 17:59:57 - mmengine - INFO - Iter(train) [ 15000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:51:16 time: 1.9912 data_time: 0.0168 memory: 33112 grad_norm: 25.3923 loss: 9.7075 loss_cls: 0.3387 loss_bbox: 0.1265 loss_iou: 0.2085 d0.loss_cls: 0.4058 d0.loss_bbox: 0.1332 d0.loss_iou: 0.2180 d1.loss_cls: 0.3673 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2114 d2.loss_cls: 0.3521 d2.loss_bbox: 0.1266 d2.loss_iou: 0.2104 d3.loss_cls: 0.3488 d3.loss_bbox: 0.1240 d3.loss_iou: 0.2033 d4.loss_cls: 0.3423 d4.loss_bbox: 0.1243 d4.loss_iou: 0.2071 enc_loss_cls: 0.3920 
enc_loss_bbox: 0.1509 enc_loss_iou: 0.2419 dn_loss_cls: 0.1537 dn_loss_bbox: 0.1631 dn_loss_iou: 0.1944 d0.dn_loss_cls: 0.2359 d0.dn_loss_bbox: 0.3113 d0.dn_loss_iou: 0.3347 d1.dn_loss_cls: 0.1857 d1.dn_loss_bbox: 0.2019 d1.dn_loss_iou: 0.2292 d2.dn_loss_cls: 0.1655 d2.dn_loss_bbox: 0.1765 d2.dn_loss_iou: 0.2058 d3.dn_loss_cls: 0.1604 d3.dn_loss_bbox: 0.1654 d3.dn_loss_iou: 0.1976 d4.dn_loss_cls: 0.1545 d4.dn_loss_bbox: 0.1631 d4.dn_loss_iou: 0.1944 d1.loss_lmm_region: 0.1981 loss_lmm_image: 0.9567 2024/11/10 18:03:16 - mmengine - INFO - Iter(train) [ 15100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:47:50 time: 1.9906 data_time: 0.0168 memory: 34583 grad_norm: 24.7740 loss: 10.5946 loss_cls: 0.3118 loss_bbox: 0.1713 loss_iou: 0.2729 d0.loss_cls: 0.3621 d0.loss_bbox: 0.1783 d0.loss_iou: 0.2779 d1.loss_cls: 0.3290 d1.loss_bbox: 0.1725 d1.loss_iou: 0.2765 d2.loss_cls: 0.3197 d2.loss_bbox: 0.1722 d2.loss_iou: 0.2757 d3.loss_cls: 0.3126 d3.loss_bbox: 0.1747 d3.loss_iou: 0.2744 d4.loss_cls: 0.3132 d4.loss_bbox: 0.1728 d4.loss_iou: 0.2737 enc_loss_cls: 0.3533 enc_loss_bbox: 0.2037 enc_loss_iou: 0.3052 dn_loss_cls: 0.1072 dn_loss_bbox: 0.2093 dn_loss_iou: 0.2542 d0.dn_loss_cls: 0.1978 d0.dn_loss_bbox: 0.3702 d0.dn_loss_iou: 0.4057 d1.dn_loss_cls: 0.1417 d1.dn_loss_bbox: 0.2402 d1.dn_loss_iou: 0.2846 d2.dn_loss_cls: 0.1224 d2.dn_loss_bbox: 0.2172 d2.dn_loss_iou: 0.2618 d3.dn_loss_cls: 0.1146 d3.dn_loss_bbox: 0.2106 d3.dn_loss_iou: 0.2561 d4.dn_loss_cls: 0.1085 d4.dn_loss_bbox: 0.2093 d4.dn_loss_iou: 0.2541 d1.loss_lmm_region: 0.1954 loss_lmm_image: 0.9302 2024/11/10 18:06:36 - mmengine - INFO - Iter(train) [ 15200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:44:40 time: 2.0015 data_time: 0.0168 memory: 32359 grad_norm: 24.8884 loss: 9.0994 loss_cls: 0.3039 loss_bbox: 0.1252 loss_iou: 0.2250 d0.loss_cls: 0.3483 d0.loss_bbox: 0.1376 d0.loss_iou: 0.2377 d1.loss_cls: 0.3243 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2315 d2.loss_cls: 0.3149 d2.loss_bbox: 0.1255 d2.loss_iou: 0.2261 d3.loss_cls: 0.3089 d3.loss_bbox: 0.1254 d3.loss_iou: 0.2263 d4.loss_cls: 0.3067 d4.loss_bbox: 0.1248 d4.loss_iou: 0.2239 enc_loss_cls: 0.3446 enc_loss_bbox: 0.1521 enc_loss_iou: 0.2638 dn_loss_cls: 0.1137 dn_loss_bbox: 0.1446 dn_loss_iou: 0.1972 d0.dn_loss_cls: 0.1808 d0.dn_loss_bbox: 0.2896 d0.dn_loss_iou: 0.3412 d1.dn_loss_cls: 0.1443 d1.dn_loss_bbox: 0.1735 d1.dn_loss_iou: 0.2289 d2.dn_loss_cls: 0.1236 d2.dn_loss_bbox: 0.1538 d2.dn_loss_iou: 0.2065 d3.dn_loss_cls: 0.1175 d3.dn_loss_bbox: 0.1465 d3.dn_loss_iou: 0.2001 d4.dn_loss_cls: 0.1138 d4.dn_loss_bbox: 0.1448 d4.dn_loss_iou: 0.1973 d1.loss_lmm_region: 0.1730 loss_lmm_image: 0.9044 2024/11/10 18:09:56 - mmengine - INFO - Iter(train) [ 15300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:41:17 time: 1.9785 data_time: 0.0170 memory: 32729 grad_norm: 25.7886 loss: 9.6208 loss_cls: 0.2980 loss_bbox: 0.1293 loss_iou: 0.2455 d0.loss_cls: 0.3554 d0.loss_bbox: 0.1483 d0.loss_iou: 0.2643 d1.loss_cls: 0.3217 d1.loss_bbox: 0.1402 d1.loss_iou: 0.2541 d2.loss_cls: 0.3127 d2.loss_bbox: 0.1325 d2.loss_iou: 0.2480 d3.loss_cls: 0.3025 d3.loss_bbox: 0.1301 d3.loss_iou: 0.2479 d4.loss_cls: 0.2999 d4.loss_bbox: 0.1300 d4.loss_iou: 0.2460 enc_loss_cls: 0.3575 enc_loss_bbox: 0.1606 enc_loss_iou: 0.2832 dn_loss_cls: 0.1286 dn_loss_bbox: 0.1539 dn_loss_iou: 0.2050 d0.dn_loss_cls: 0.2129 d0.dn_loss_bbox: 0.3084 d0.dn_loss_iou: 0.3522 d1.dn_loss_cls: 0.1617 d1.dn_loss_bbox: 0.1920 d1.dn_loss_iou: 0.2389 d2.dn_loss_cls: 0.1411 d2.dn_loss_bbox: 0.1666 
d2.dn_loss_iou: 0.2165 d3.dn_loss_cls: 0.1336 d3.dn_loss_bbox: 0.1563 d3.dn_loss_iou: 0.2085 d4.dn_loss_cls: 0.1295 d4.dn_loss_bbox: 0.1540 d4.dn_loss_iou: 0.2051 d1.loss_lmm_region: 0.1799 loss_lmm_image: 0.9689 2024/11/10 18:13:14 - mmengine - INFO - Iter(train) [ 15400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:37:47 time: 1.9766 data_time: 0.0167 memory: 33402 grad_norm: 29.1214 loss: 11.3424 loss_cls: 0.3648 loss_bbox: 0.1602 loss_iou: 0.2695 d0.loss_cls: 0.4252 d0.loss_bbox: 0.1695 d0.loss_iou: 0.2784 d1.loss_cls: 0.3776 d1.loss_bbox: 0.1661 d1.loss_iou: 0.2773 d2.loss_cls: 0.3672 d2.loss_bbox: 0.1658 d2.loss_iou: 0.2739 d3.loss_cls: 0.3645 d3.loss_bbox: 0.1630 d3.loss_iou: 0.2713 d4.loss_cls: 0.3618 d4.loss_bbox: 0.1604 d4.loss_iou: 0.2700 enc_loss_cls: 0.4104 enc_loss_bbox: 0.1855 enc_loss_iou: 0.3030 dn_loss_cls: 0.2074 dn_loss_bbox: 0.1933 dn_loss_iou: 0.2369 d0.dn_loss_cls: 0.2910 d0.dn_loss_bbox: 0.3717 d0.dn_loss_iou: 0.3967 d1.dn_loss_cls: 0.2432 d1.dn_loss_bbox: 0.2339 d1.dn_loss_iou: 0.2716 d2.dn_loss_cls: 0.2194 d2.dn_loss_bbox: 0.2046 d2.dn_loss_iou: 0.2480 d3.dn_loss_cls: 0.2135 d3.dn_loss_bbox: 0.1962 d3.dn_loss_iou: 0.2402 d4.dn_loss_cls: 0.2074 d4.dn_loss_bbox: 0.1933 d4.dn_loss_iou: 0.2371 d1.loss_lmm_region: 0.2182 loss_lmm_image: 0.9333 2024/11/10 18:16:34 - mmengine - INFO - Iter(train) [ 15500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:34:32 time: 2.0034 data_time: 0.0166 memory: 34649 grad_norm: 27.4654 loss: 9.6342 loss_cls: 0.2849 loss_bbox: 0.1321 loss_iou: 0.2481 d0.loss_cls: 0.3389 d0.loss_bbox: 0.1407 d0.loss_iou: 0.2624 d1.loss_cls: 0.3061 d1.loss_bbox: 0.1360 d1.loss_iou: 0.2552 d2.loss_cls: 0.2958 d2.loss_bbox: 0.1339 d2.loss_iou: 0.2505 d3.loss_cls: 0.2942 d3.loss_bbox: 0.1279 d3.loss_iou: 0.2468 d4.loss_cls: 0.2846 d4.loss_bbox: 0.1321 d4.loss_iou: 0.2483 enc_loss_cls: 0.3237 enc_loss_bbox: 0.1600 enc_loss_iou: 0.2853 dn_loss_cls: 0.1184 dn_loss_bbox: 0.1642 dn_loss_iou: 0.2374 d0.dn_loss_cls: 0.1962 d0.dn_loss_bbox: 0.3129 d0.dn_loss_iou: 0.3851 d1.dn_loss_cls: 0.1465 d1.dn_loss_bbox: 0.1981 d1.dn_loss_iou: 0.2715 d2.dn_loss_cls: 0.1287 d2.dn_loss_bbox: 0.1746 d2.dn_loss_iou: 0.2479 d3.dn_loss_cls: 0.1228 d3.dn_loss_bbox: 0.1662 d3.dn_loss_iou: 0.2400 d4.dn_loss_cls: 0.1185 d4.dn_loss_bbox: 0.1643 d4.dn_loss_iou: 0.2373 d1.loss_lmm_region: 0.2016 loss_lmm_image: 0.9145 2024/11/10 18:19:53 - mmengine - INFO - Iter(train) [ 15600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:31:08 time: 1.9961 data_time: 0.0167 memory: 33136 grad_norm: 26.4675 loss: 8.5778 loss_cls: 0.2636 loss_bbox: 0.1161 loss_iou: 0.2035 d0.loss_cls: 0.3132 d0.loss_bbox: 0.1272 d0.loss_iou: 0.2205 d1.loss_cls: 0.2852 d1.loss_bbox: 0.1161 d1.loss_iou: 0.2079 d2.loss_cls: 0.2713 d2.loss_bbox: 0.1176 d2.loss_iou: 0.2062 d3.loss_cls: 0.2613 d3.loss_bbox: 0.1207 d3.loss_iou: 0.2088 d4.loss_cls: 0.2620 d4.loss_bbox: 0.1179 d4.loss_iou: 0.2041 enc_loss_cls: 0.3045 enc_loss_bbox: 0.1450 enc_loss_iou: 0.2442 dn_loss_cls: 0.1049 dn_loss_bbox: 0.1487 dn_loss_iou: 0.1874 d0.dn_loss_cls: 0.1794 d0.dn_loss_bbox: 0.3131 d0.dn_loss_iou: 0.3409 d1.dn_loss_cls: 0.1357 d1.dn_loss_bbox: 0.1864 d1.dn_loss_iou: 0.2244 d2.dn_loss_cls: 0.1168 d2.dn_loss_bbox: 0.1600 d2.dn_loss_iou: 0.1982 d3.dn_loss_cls: 0.1090 d3.dn_loss_bbox: 0.1510 d3.dn_loss_iou: 0.1900 d4.dn_loss_cls: 0.1050 d4.dn_loss_bbox: 0.1487 d4.dn_loss_iou: 0.1873 d1.loss_lmm_region: 0.1583 loss_lmm_image: 0.9156 2024/11/10 18:23:13 - mmengine - INFO - Iter(train) [ 15700/150000] base_lr: 1.0000e-04 lr: 
1.0000e-04 eta: 3 days, 2:27:54 time: 1.9971 data_time: 0.0167 memory: 32567 grad_norm: 28.9555 loss: 10.8714 loss_cls: 0.3790 loss_bbox: 0.1450 loss_iou: 0.2651 d0.loss_cls: 0.4357 d0.loss_bbox: 0.1539 d0.loss_iou: 0.2760 d1.loss_cls: 0.3973 d1.loss_bbox: 0.1518 d1.loss_iou: 0.2723 d2.loss_cls: 0.3849 d2.loss_bbox: 0.1470 d2.loss_iou: 0.2677 d3.loss_cls: 0.3820 d3.loss_bbox: 0.1458 d3.loss_iou: 0.2657 d4.loss_cls: 0.3786 d4.loss_bbox: 0.1449 d4.loss_iou: 0.2650 enc_loss_cls: 0.4334 enc_loss_bbox: 0.1717 enc_loss_iou: 0.3068 dn_loss_cls: 0.1841 dn_loss_bbox: 0.1642 dn_loss_iou: 0.2212 d0.dn_loss_cls: 0.2545 d0.dn_loss_bbox: 0.3083 d0.dn_loss_iou: 0.3660 d1.dn_loss_cls: 0.2078 d1.dn_loss_bbox: 0.2012 d1.dn_loss_iou: 0.2555 d2.dn_loss_cls: 0.1913 d2.dn_loss_bbox: 0.1755 d2.dn_loss_iou: 0.2323 d3.dn_loss_cls: 0.1875 d3.dn_loss_bbox: 0.1669 d3.dn_loss_iou: 0.2246 d4.dn_loss_cls: 0.1847 d4.dn_loss_bbox: 0.1643 d4.dn_loss_iou: 0.2213 d1.loss_lmm_region: 0.1902 loss_lmm_image: 0.9999 2024/11/10 18:26:34 - mmengine - INFO - Iter(train) [ 15800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:24:42 time: 1.9961 data_time: 0.0167 memory: 35980 grad_norm: 31.1500 loss: 10.2707 loss_cls: 0.3219 loss_bbox: 0.1230 loss_iou: 0.2053 d0.loss_cls: 0.3734 d0.loss_bbox: 0.1356 d0.loss_iou: 0.2205 d1.loss_cls: 0.3399 d1.loss_bbox: 0.1283 d1.loss_iou: 0.2095 d2.loss_cls: 0.3301 d2.loss_bbox: 0.1256 d2.loss_iou: 0.2084 d3.loss_cls: 0.3271 d3.loss_bbox: 0.1227 d3.loss_iou: 0.2051 d4.loss_cls: 0.3224 d4.loss_bbox: 0.1226 d4.loss_iou: 0.2049 enc_loss_cls: 0.3686 enc_loss_bbox: 0.1556 enc_loss_iou: 0.2444 dn_loss_cls: 0.2190 dn_loss_bbox: 0.1946 dn_loss_iou: 0.2104 d0.dn_loss_cls: 0.3173 d0.dn_loss_bbox: 0.3703 d0.dn_loss_iou: 0.3684 d1.dn_loss_cls: 0.2595 d1.dn_loss_bbox: 0.2388 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 0.2322 d2.dn_loss_bbox: 0.2096 d2.dn_loss_iou: 0.2232 d3.dn_loss_cls: 0.2232 d3.dn_loss_bbox: 0.1980 d3.dn_loss_iou: 0.2137 d4.dn_loss_cls: 0.2194 d4.dn_loss_bbox: 0.1946 d4.dn_loss_iou: 0.2104 d1.loss_lmm_region: 0.2216 loss_lmm_image: 0.9036 2024/11/10 18:29:53 - mmengine - INFO - Iter(train) [ 15900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:21:21 time: 2.0033 data_time: 0.0168 memory: 33760 grad_norm: 33.1991 loss: 9.7519 loss_cls: 0.3306 loss_bbox: 0.1256 loss_iou: 0.2521 d0.loss_cls: 0.3715 d0.loss_bbox: 0.1382 d0.loss_iou: 0.2666 d1.loss_cls: 0.3405 d1.loss_bbox: 0.1317 d1.loss_iou: 0.2564 d2.loss_cls: 0.3314 d2.loss_bbox: 0.1282 d2.loss_iou: 0.2575 d3.loss_cls: 0.3310 d3.loss_bbox: 0.1277 d3.loss_iou: 0.2545 d4.loss_cls: 0.3274 d4.loss_bbox: 0.1264 d4.loss_iou: 0.2527 enc_loss_cls: 0.3625 enc_loss_bbox: 0.1585 enc_loss_iou: 0.2982 dn_loss_cls: 0.1144 dn_loss_bbox: 0.1569 dn_loss_iou: 0.2268 d0.dn_loss_cls: 0.1902 d0.dn_loss_bbox: 0.2972 d0.dn_loss_iou: 0.3668 d1.dn_loss_cls: 0.1433 d1.dn_loss_bbox: 0.1822 d1.dn_loss_iou: 0.2558 d2.dn_loss_cls: 0.1268 d2.dn_loss_bbox: 0.1636 d2.dn_loss_iou: 0.2341 d3.dn_loss_cls: 0.1199 d3.dn_loss_bbox: 0.1578 d3.dn_loss_iou: 0.2288 d4.dn_loss_cls: 0.1157 d4.dn_loss_bbox: 0.1569 d4.dn_loss_iou: 0.2269 d1.loss_lmm_region: 0.1686 loss_lmm_image: 0.9498 2024/11/10 18:33:14 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 18:33:14 - mmengine - INFO - Iter(train) [ 16000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:18:12 time: 2.0239 data_time: 0.0166 memory: 34139 grad_norm: 30.3487 loss: 9.4413 loss_cls: 0.3608 loss_bbox: 0.1058 loss_iou: 0.2271 d0.loss_cls: 0.4279 d0.loss_bbox: 0.1127 d0.loss_iou: 
0.2403 d1.loss_cls: 0.3877 d1.loss_bbox: 0.1084 d1.loss_iou: 0.2333 d2.loss_cls: 0.3702 d2.loss_bbox: 0.1049 d2.loss_iou: 0.2299 d3.loss_cls: 0.3644 d3.loss_bbox: 0.1023 d3.loss_iou: 0.2280 d4.loss_cls: 0.3624 d4.loss_bbox: 0.1027 d4.loss_iou: 0.2252 enc_loss_cls: 0.4173 enc_loss_bbox: 0.1300 enc_loss_iou: 0.2665 dn_loss_cls: 0.1477 dn_loss_bbox: 0.1274 dn_loss_iou: 0.1771 d0.dn_loss_cls: 0.2258 d0.dn_loss_bbox: 0.2722 d0.dn_loss_iou: 0.3212 d1.dn_loss_cls: 0.1776 d1.dn_loss_bbox: 0.1607 d1.dn_loss_iou: 0.2104 d2.dn_loss_cls: 0.1573 d2.dn_loss_bbox: 0.1346 d2.dn_loss_iou: 0.1871 d3.dn_loss_cls: 0.1494 d3.dn_loss_bbox: 0.1290 d3.dn_loss_iou: 0.1793 d4.dn_loss_cls: 0.1475 d4.dn_loss_bbox: 0.1274 d4.dn_loss_iou: 0.1769 d1.loss_lmm_region: 0.1746 loss_lmm_image: 0.9503 2024/11/10 18:36:32 - mmengine - INFO - Iter(train) [ 16100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:14:40 time: 1.9923 data_time: 0.0167 memory: 33781 grad_norm: 24.9872 loss: 9.3464 loss_cls: 0.2841 loss_bbox: 0.1299 loss_iou: 0.2071 d0.loss_cls: 0.3263 d0.loss_bbox: 0.1425 d0.loss_iou: 0.2197 d1.loss_cls: 0.2973 d1.loss_bbox: 0.1399 d1.loss_iou: 0.2177 d2.loss_cls: 0.2926 d2.loss_bbox: 0.1314 d2.loss_iou: 0.2088 d3.loss_cls: 0.2837 d3.loss_bbox: 0.1318 d3.loss_iou: 0.2076 d4.loss_cls: 0.2840 d4.loss_bbox: 0.1301 d4.loss_iou: 0.2073 enc_loss_cls: 0.3203 enc_loss_bbox: 0.1605 enc_loss_iou: 0.2461 dn_loss_cls: 0.1274 dn_loss_bbox: 0.1807 dn_loss_iou: 0.2110 d0.dn_loss_cls: 0.2125 d0.dn_loss_bbox: 0.3320 d0.dn_loss_iou: 0.3572 d1.dn_loss_cls: 0.1618 d1.dn_loss_bbox: 0.2144 d1.dn_loss_iou: 0.2446 d2.dn_loss_cls: 0.1438 d2.dn_loss_bbox: 0.1895 d2.dn_loss_iou: 0.2206 d3.dn_loss_cls: 0.1350 d3.dn_loss_bbox: 0.1813 d3.dn_loss_iou: 0.2128 d4.dn_loss_cls: 0.1302 d4.dn_loss_bbox: 0.1806 d4.dn_loss_iou: 0.2108 d1.loss_lmm_region: 0.1775 loss_lmm_image: 0.9540 2024/11/10 18:39:50 - mmengine - INFO - Iter(train) [ 16200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:11:00 time: 1.9645 data_time: 0.0168 memory: 34059 grad_norm: 28.4970 loss: 10.8810 loss_cls: 0.3512 loss_bbox: 0.1478 loss_iou: 0.3222 d0.loss_cls: 0.4139 d0.loss_bbox: 0.1592 d0.loss_iou: 0.3380 d1.loss_cls: 0.3778 d1.loss_bbox: 0.1564 d1.loss_iou: 0.3349 d2.loss_cls: 0.3598 d2.loss_bbox: 0.1515 d2.loss_iou: 0.3316 d3.loss_cls: 0.3574 d3.loss_bbox: 0.1484 d3.loss_iou: 0.3235 d4.loss_cls: 0.3497 d4.loss_bbox: 0.1491 d4.loss_iou: 0.3225 enc_loss_cls: 0.4041 enc_loss_bbox: 0.1754 enc_loss_iou: 0.3654 dn_loss_cls: 0.1399 dn_loss_bbox: 0.1634 dn_loss_iou: 0.2370 d0.dn_loss_cls: 0.2147 d0.dn_loss_bbox: 0.3007 d0.dn_loss_iou: 0.3779 d1.dn_loss_cls: 0.1698 d1.dn_loss_bbox: 0.1947 d1.dn_loss_iou: 0.2672 d2.dn_loss_cls: 0.1484 d2.dn_loss_bbox: 0.1741 d2.dn_loss_iou: 0.2463 d3.dn_loss_cls: 0.1419 d3.dn_loss_bbox: 0.1651 d3.dn_loss_iou: 0.2394 d4.dn_loss_cls: 0.1407 d4.dn_loss_bbox: 0.1635 d4.dn_loss_iou: 0.2371 d1.loss_lmm_region: 0.1849 loss_lmm_image: 0.9350 2024/11/10 18:43:10 - mmengine - INFO - Iter(train) [ 16300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:07:49 time: 2.0101 data_time: 0.0168 memory: 34101 grad_norm: 31.0972 loss: 10.5222 loss_cls: 0.3834 loss_bbox: 0.1334 loss_iou: 0.2636 d0.loss_cls: 0.4455 d0.loss_bbox: 0.1434 d0.loss_iou: 0.2813 d1.loss_cls: 0.4059 d1.loss_bbox: 0.1403 d1.loss_iou: 0.2741 d2.loss_cls: 0.3984 d2.loss_bbox: 0.1361 d2.loss_iou: 0.2647 d3.loss_cls: 0.3844 d3.loss_bbox: 0.1347 d3.loss_iou: 0.2647 d4.loss_cls: 0.3856 d4.loss_bbox: 0.1325 d4.loss_iou: 0.2639 enc_loss_cls: 0.4359 enc_loss_bbox: 0.1645 
enc_loss_iou: 0.3132 dn_loss_cls: 0.1415 dn_loss_bbox: 0.1534 dn_loss_iou: 0.2129 d0.dn_loss_cls: 0.2250 d0.dn_loss_bbox: 0.3018 d0.dn_loss_iou: 0.3722 d1.dn_loss_cls: 0.1785 d1.dn_loss_bbox: 0.1883 d1.dn_loss_iou: 0.2504 d2.dn_loss_cls: 0.1605 d2.dn_loss_bbox: 0.1652 d2.dn_loss_iou: 0.2244 d3.dn_loss_cls: 0.1472 d3.dn_loss_bbox: 0.1562 d3.dn_loss_iou: 0.2161 d4.dn_loss_cls: 0.1425 d4.dn_loss_bbox: 0.1535 d4.dn_loss_iou: 0.2128 d1.loss_lmm_region: 0.1918 loss_lmm_image: 0.9785 2024/11/10 18:46:28 - mmengine - INFO - Iter(train) [ 16400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:04:11 time: 1.9631 data_time: 0.0166 memory: 35005 grad_norm: 28.8125 loss: 10.1596 loss_cls: 0.3302 loss_bbox: 0.1345 loss_iou: 0.2359 d0.loss_cls: 0.3782 d0.loss_bbox: 0.1486 d0.loss_iou: 0.2483 d1.loss_cls: 0.3478 d1.loss_bbox: 0.1405 d1.loss_iou: 0.2394 d2.loss_cls: 0.3393 d2.loss_bbox: 0.1399 d2.loss_iou: 0.2394 d3.loss_cls: 0.3340 d3.loss_bbox: 0.1364 d3.loss_iou: 0.2380 d4.loss_cls: 0.3329 d4.loss_bbox: 0.1336 d4.loss_iou: 0.2346 enc_loss_cls: 0.3714 enc_loss_bbox: 0.1683 enc_loss_iou: 0.2692 dn_loss_cls: 0.1689 dn_loss_bbox: 0.1603 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.2616 d0.dn_loss_bbox: 0.3361 d0.dn_loss_iou: 0.3754 d1.dn_loss_cls: 0.2055 d1.dn_loss_bbox: 0.2045 d1.dn_loss_iou: 0.2549 d2.dn_loss_cls: 0.1823 d2.dn_loss_bbox: 0.1730 d2.dn_loss_iou: 0.2274 d3.dn_loss_cls: 0.1752 d3.dn_loss_bbox: 0.1626 d3.dn_loss_iou: 0.2185 d4.dn_loss_cls: 0.1706 d4.dn_loss_bbox: 0.1603 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1949 loss_lmm_image: 0.9556 2024/11/10 18:49:46 - mmengine - INFO - Iter(train) [ 16500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 2:00:43 time: 1.9464 data_time: 0.0166 memory: 35167 grad_norm: nan loss: 9.4434 loss_cls: 0.2889 loss_bbox: 0.1318 loss_iou: 0.2201 d0.loss_cls: 0.3483 d0.loss_bbox: 0.1390 d0.loss_iou: 0.2284 d1.loss_cls: 0.3167 d1.loss_bbox: 0.1326 d1.loss_iou: 0.2199 d2.loss_cls: 0.3016 d2.loss_bbox: 0.1314 d2.loss_iou: 0.2198 d3.loss_cls: 0.2952 d3.loss_bbox: 0.1321 d3.loss_iou: 0.2196 d4.loss_cls: 0.2868 d4.loss_bbox: 0.1345 d4.loss_iou: 0.2212 enc_loss_cls: 0.3426 enc_loss_bbox: 0.1506 enc_loss_iou: 0.2458 dn_loss_cls: 0.1518 dn_loss_bbox: 0.1649 dn_loss_iou: 0.2047 d0.dn_loss_cls: 0.2278 d0.dn_loss_bbox: 0.3049 d0.dn_loss_iou: 0.3383 d1.dn_loss_cls: 0.1791 d1.dn_loss_bbox: 0.1948 d1.dn_loss_iou: 0.2352 d2.dn_loss_cls: 0.1618 d2.dn_loss_bbox: 0.1742 d2.dn_loss_iou: 0.2134 d3.dn_loss_cls: 0.1571 d3.dn_loss_bbox: 0.1674 d3.dn_loss_iou: 0.2072 d4.dn_loss_cls: 0.1520 d4.dn_loss_bbox: 0.1650 d4.dn_loss_iou: 0.2047 d1.loss_lmm_region: 0.1799 loss_lmm_image: 0.9525 2024/11/10 18:53:05 - mmengine - INFO - Iter(train) [ 16600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:57:13 time: 1.9917 data_time: 0.0168 memory: 33052 grad_norm: 28.4883 loss: 12.4251 loss_cls: 0.4068 loss_bbox: 0.1836 loss_iou: 0.3312 d0.loss_cls: 0.4600 d0.loss_bbox: 0.2006 d0.loss_iou: 0.3462 d1.loss_cls: 0.4282 d1.loss_bbox: 0.1933 d1.loss_iou: 0.3380 d2.loss_cls: 0.4184 d2.loss_bbox: 0.1900 d2.loss_iou: 0.3328 d3.loss_cls: 0.4091 d3.loss_bbox: 0.1848 d3.loss_iou: 0.3304 d4.loss_cls: 0.4079 d4.loss_bbox: 0.1825 d4.loss_iou: 0.3298 enc_loss_cls: 0.4614 enc_loss_bbox: 0.2109 enc_loss_iou: 0.3629 dn_loss_cls: 0.1985 dn_loss_bbox: 0.2104 dn_loss_iou: 0.2593 d0.dn_loss_cls: 0.2814 d0.dn_loss_bbox: 0.3669 d0.dn_loss_iou: 0.4053 d1.dn_loss_cls: 0.2317 d1.dn_loss_bbox: 0.2439 d1.dn_loss_iou: 0.2913 d2.dn_loss_cls: 0.2100 d2.dn_loss_bbox: 0.2226 d2.dn_loss_iou: 0.2701 d3.dn_loss_cls: 
0.2017 d3.dn_loss_bbox: 0.2128 d3.dn_loss_iou: 0.2612 d4.dn_loss_cls: 0.1998 d4.dn_loss_bbox: 0.2105 d4.dn_loss_iou: 0.2593 d1.loss_lmm_region: 0.2381 loss_lmm_image: 0.9418 2024/11/10 18:56:23 - mmengine - INFO - Iter(train) [ 16700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:53:48 time: 1.9770 data_time: 0.0167 memory: 34642 grad_norm: 30.5602 loss: 10.2742 loss_cls: 0.3705 loss_bbox: 0.1151 loss_iou: 0.2122 d0.loss_cls: 0.4303 d0.loss_bbox: 0.1286 d0.loss_iou: 0.2342 d1.loss_cls: 0.3974 d1.loss_bbox: 0.1206 d1.loss_iou: 0.2198 d2.loss_cls: 0.3767 d2.loss_bbox: 0.1170 d2.loss_iou: 0.2145 d3.loss_cls: 0.3666 d3.loss_bbox: 0.1149 d3.loss_iou: 0.2121 d4.loss_cls: 0.3638 d4.loss_bbox: 0.1163 d4.loss_iou: 0.2114 enc_loss_cls: 0.4205 enc_loss_bbox: 0.1479 enc_loss_iou: 0.2652 dn_loss_cls: 0.1909 dn_loss_bbox: 0.1830 dn_loss_iou: 0.2038 d0.dn_loss_cls: 0.2407 d0.dn_loss_bbox: 0.3429 d0.dn_loss_iou: 0.3599 d1.dn_loss_cls: 0.2108 d1.dn_loss_bbox: 0.2196 d1.dn_loss_iou: 0.2397 d2.dn_loss_cls: 0.1979 d2.dn_loss_bbox: 0.1937 d2.dn_loss_iou: 0.2141 d3.dn_loss_cls: 0.2012 d3.dn_loss_bbox: 0.1864 d3.dn_loss_iou: 0.2065 d4.dn_loss_cls: 0.2008 d4.dn_loss_bbox: 0.1830 d4.dn_loss_iou: 0.2037 d1.loss_lmm_region: 0.1900 loss_lmm_image: 0.9502 2024/11/10 18:59:42 - mmengine - INFO - Iter(train) [ 16800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:50:20 time: 1.9888 data_time: 0.0169 memory: 34088 grad_norm: 28.5500 loss: 10.1630 loss_cls: 0.3291 loss_bbox: 0.1380 loss_iou: 0.2587 d0.loss_cls: 0.3845 d0.loss_bbox: 0.1459 d0.loss_iou: 0.2643 d1.loss_cls: 0.3467 d1.loss_bbox: 0.1390 d1.loss_iou: 0.2596 d2.loss_cls: 0.3384 d2.loss_bbox: 0.1370 d2.loss_iou: 0.2580 d3.loss_cls: 0.3341 d3.loss_bbox: 0.1370 d3.loss_iou: 0.2585 d4.loss_cls: 0.3299 d4.loss_bbox: 0.1370 d4.loss_iou: 0.2581 enc_loss_cls: 0.3810 enc_loss_bbox: 0.1586 enc_loss_iou: 0.2876 dn_loss_cls: 0.1336 dn_loss_bbox: 0.1825 dn_loss_iou: 0.2205 d0.dn_loss_cls: 0.2086 d0.dn_loss_bbox: 0.3259 d0.dn_loss_iou: 0.3603 d1.dn_loss_cls: 0.1638 d1.dn_loss_bbox: 0.2135 d1.dn_loss_iou: 0.2499 d2.dn_loss_cls: 0.1458 d2.dn_loss_bbox: 0.1928 d2.dn_loss_iou: 0.2297 d3.dn_loss_cls: 0.1390 d3.dn_loss_bbox: 0.1848 d3.dn_loss_iou: 0.2230 d4.dn_loss_cls: 0.1347 d4.dn_loss_bbox: 0.1824 d4.dn_loss_iou: 0.2204 d1.loss_lmm_region: 0.1958 loss_lmm_image: 0.9753 2024/11/10 19:02:59 - mmengine - INFO - Iter(train) [ 16900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:46:43 time: 1.9626 data_time: 0.0167 memory: 34029 grad_norm: 28.1627 loss: 9.7757 loss_cls: 0.3010 loss_bbox: 0.1269 loss_iou: 0.2237 d0.loss_cls: 0.3519 d0.loss_bbox: 0.1350 d0.loss_iou: 0.2401 d1.loss_cls: 0.3235 d1.loss_bbox: 0.1231 d1.loss_iou: 0.2277 d2.loss_cls: 0.3084 d2.loss_bbox: 0.1234 d2.loss_iou: 0.2245 d3.loss_cls: 0.3037 d3.loss_bbox: 0.1245 d3.loss_iou: 0.2252 d4.loss_cls: 0.3006 d4.loss_bbox: 0.1248 d4.loss_iou: 0.2265 enc_loss_cls: 0.3508 enc_loss_bbox: 0.1496 enc_loss_iou: 0.2617 dn_loss_cls: 0.1553 dn_loss_bbox: 0.1762 dn_loss_iou: 0.2323 d0.dn_loss_cls: 0.2423 d0.dn_loss_bbox: 0.3132 d0.dn_loss_iou: 0.3737 d1.dn_loss_cls: 0.1873 d1.dn_loss_bbox: 0.2131 d1.dn_loss_iou: 0.2640 d2.dn_loss_cls: 0.1669 d2.dn_loss_bbox: 0.1890 d2.dn_loss_iou: 0.2422 d3.dn_loss_cls: 0.1593 d3.dn_loss_bbox: 0.1788 d3.dn_loss_iou: 0.2352 d4.dn_loss_cls: 0.1559 d4.dn_loss_bbox: 0.1762 d4.dn_loss_iou: 0.2323 d1.loss_lmm_region: 0.2050 loss_lmm_image: 0.9007 2024/11/10 19:06:18 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 19:06:18 - mmengine - INFO - 
Iter(train) [ 17000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:43:19 time: 2.0157 data_time: 0.0169 memory: 35015 grad_norm: 35.0605 loss: 10.7534 loss_cls: 0.3427 loss_bbox: 0.1478 loss_iou: 0.2824 d0.loss_cls: 0.3923 d0.loss_bbox: 0.1622 d0.loss_iou: 0.3002 d1.loss_cls: 0.3621 d1.loss_bbox: 0.1541 d1.loss_iou: 0.2872 d2.loss_cls: 0.3440 d2.loss_bbox: 0.1534 d2.loss_iou: 0.2890 d3.loss_cls: 0.3450 d3.loss_bbox: 0.1520 d3.loss_iou: 0.2854 d4.loss_cls: 0.3397 d4.loss_bbox: 0.1492 d4.loss_iou: 0.2827 enc_loss_cls: 0.3869 enc_loss_bbox: 0.1737 enc_loss_iou: 0.3212 dn_loss_cls: 0.1385 dn_loss_bbox: 0.1785 dn_loss_iou: 0.2435 d0.dn_loss_cls: 0.2128 d0.dn_loss_bbox: 0.3482 d0.dn_loss_iou: 0.3985 d1.dn_loss_cls: 0.1693 d1.dn_loss_bbox: 0.2179 d1.dn_loss_iou: 0.2776 d2.dn_loss_cls: 0.1469 d2.dn_loss_bbox: 0.1949 d2.dn_loss_iou: 0.2559 d3.dn_loss_cls: 0.1413 d3.dn_loss_bbox: 0.1822 d3.dn_loss_iou: 0.2470 d4.dn_loss_cls: 0.1373 d4.dn_loss_bbox: 0.1787 d4.dn_loss_iou: 0.2437 d1.loss_lmm_region: 0.1967 loss_lmm_image: 0.9909 2024/11/10 19:09:38 - mmengine - INFO - Iter(train) [ 17100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:39:59 time: 1.9875 data_time: 0.0168 memory: 36187 grad_norm: 27.0077 loss: 9.9823 loss_cls: 0.2977 loss_bbox: 0.1384 loss_iou: 0.2338 d0.loss_cls: 0.3578 d0.loss_bbox: 0.1429 d0.loss_iou: 0.2416 d1.loss_cls: 0.3212 d1.loss_bbox: 0.1430 d1.loss_iou: 0.2388 d2.loss_cls: 0.3086 d2.loss_bbox: 0.1373 d2.loss_iou: 0.2343 d3.loss_cls: 0.3005 d3.loss_bbox: 0.1394 d3.loss_iou: 0.2341 d4.loss_cls: 0.3006 d4.loss_bbox: 0.1378 d4.loss_iou: 0.2327 enc_loss_cls: 0.3473 enc_loss_bbox: 0.1656 enc_loss_iou: 0.2724 dn_loss_cls: 0.1481 dn_loss_bbox: 0.1929 dn_loss_iou: 0.2196 d0.dn_loss_cls: 0.2222 d0.dn_loss_bbox: 0.3601 d0.dn_loss_iou: 0.3730 d1.dn_loss_cls: 0.1737 d1.dn_loss_bbox: 0.2290 d1.dn_loss_iou: 0.2531 d2.dn_loss_cls: 0.1554 d2.dn_loss_bbox: 0.2043 d2.dn_loss_iou: 0.2295 d3.dn_loss_cls: 0.1499 d3.dn_loss_bbox: 0.1952 d3.dn_loss_iou: 0.2224 d4.dn_loss_cls: 0.1461 d4.dn_loss_bbox: 0.1929 d4.dn_loss_iou: 0.2196 d1.loss_lmm_region: 0.1909 loss_lmm_image: 0.9789 2024/11/10 19:12:58 - mmengine - INFO - Iter(train) [ 17200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:36:43 time: 2.0195 data_time: 0.0168 memory: 33506 grad_norm: 26.2660 loss: 9.3023 loss_cls: 0.3017 loss_bbox: 0.1249 loss_iou: 0.2390 d0.loss_cls: 0.3611 d0.loss_bbox: 0.1371 d0.loss_iou: 0.2576 d1.loss_cls: 0.3235 d1.loss_bbox: 0.1266 d1.loss_iou: 0.2452 d2.loss_cls: 0.3087 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2424 d3.loss_cls: 0.3017 d3.loss_bbox: 0.1283 d3.loss_iou: 0.2417 d4.loss_cls: 0.3022 d4.loss_bbox: 0.1267 d4.loss_iou: 0.2392 enc_loss_cls: 0.3521 enc_loss_bbox: 0.1546 enc_loss_iou: 0.2857 dn_loss_cls: 0.1201 dn_loss_bbox: 0.1445 dn_loss_iou: 0.1995 d0.dn_loss_cls: 0.1910 d0.dn_loss_bbox: 0.2785 d0.dn_loss_iou: 0.3403 d1.dn_loss_cls: 0.1450 d1.dn_loss_bbox: 0.1761 d1.dn_loss_iou: 0.2309 d2.dn_loss_cls: 0.1290 d2.dn_loss_bbox: 0.1529 d2.dn_loss_iou: 0.2082 d3.dn_loss_cls: 0.1241 d3.dn_loss_bbox: 0.1462 d3.dn_loss_iou: 0.2020 d4.dn_loss_cls: 0.1213 d4.dn_loss_bbox: 0.1445 d4.dn_loss_iou: 0.1995 d1.loss_lmm_region: 0.1691 loss_lmm_image: 0.9508 2024/11/10 19:16:16 - mmengine - INFO - Iter(train) [ 17300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:33:13 time: 1.9866 data_time: 0.0169 memory: 34771 grad_norm: 24.2480 loss: 9.1747 loss_cls: 0.2827 loss_bbox: 0.1203 loss_iou: 0.2257 d0.loss_cls: 0.3366 d0.loss_bbox: 0.1357 d0.loss_iou: 0.2384 d1.loss_cls: 0.2980 
d1.loss_bbox: 0.1291 d1.loss_iou: 0.2318 d2.loss_cls: 0.2931 d2.loss_bbox: 0.1214 d2.loss_iou: 0.2274 d3.loss_cls: 0.2874 d3.loss_bbox: 0.1220 d3.loss_iou: 0.2266 d4.loss_cls: 0.2827 d4.loss_bbox: 0.1199 d4.loss_iou: 0.2255 enc_loss_cls: 0.3272 enc_loss_bbox: 0.1495 enc_loss_iou: 0.2631 dn_loss_cls: 0.1198 dn_loss_bbox: 0.1560 dn_loss_iou: 0.2058 d0.dn_loss_cls: 0.1994 d0.dn_loss_bbox: 0.3073 d0.dn_loss_iou: 0.3463 d1.dn_loss_cls: 0.1528 d1.dn_loss_bbox: 0.1898 d1.dn_loss_iou: 0.2365 d2.dn_loss_cls: 0.1319 d2.dn_loss_bbox: 0.1658 d2.dn_loss_iou: 0.2140 d3.dn_loss_cls: 0.1256 d3.dn_loss_bbox: 0.1590 d3.dn_loss_iou: 0.2086 d4.dn_loss_cls: 0.1205 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.2058 d1.loss_lmm_region: 0.1749 loss_lmm_image: 0.9549 2024/11/10 19:19:35 - mmengine - INFO - Iter(train) [ 17400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:29:52 time: 1.9857 data_time: 0.0169 memory: 34924 grad_norm: 26.5256 loss: 11.3745 loss_cls: 0.3576 loss_bbox: 0.1831 loss_iou: 0.3146 d0.loss_cls: 0.4203 d0.loss_bbox: 0.1920 d0.loss_iou: 0.3258 d1.loss_cls: 0.3776 d1.loss_bbox: 0.1868 d1.loss_iou: 0.3224 d2.loss_cls: 0.3702 d2.loss_bbox: 0.1817 d2.loss_iou: 0.3153 d3.loss_cls: 0.3637 d3.loss_bbox: 0.1806 d3.loss_iou: 0.3141 d4.loss_cls: 0.3559 d4.loss_bbox: 0.1840 d4.loss_iou: 0.3169 enc_loss_cls: 0.4168 enc_loss_bbox: 0.2121 enc_loss_iou: 0.3485 dn_loss_cls: 0.1576 dn_loss_bbox: 0.1871 dn_loss_iou: 0.2408 d0.dn_loss_cls: 0.2447 d0.dn_loss_bbox: 0.3425 d0.dn_loss_iou: 0.3893 d1.dn_loss_cls: 0.1939 d1.dn_loss_bbox: 0.2194 d1.dn_loss_iou: 0.2726 d2.dn_loss_cls: 0.1748 d2.dn_loss_bbox: 0.1982 d2.dn_loss_iou: 0.2510 d3.dn_loss_cls: 0.1645 d3.dn_loss_bbox: 0.1877 d3.dn_loss_iou: 0.2428 d4.dn_loss_cls: 0.1577 d4.dn_loss_bbox: 0.1871 d4.dn_loss_iou: 0.2408 d1.loss_lmm_region: 0.2010 loss_lmm_image: 0.8810 2024/11/10 19:22:55 - mmengine - INFO - Iter(train) [ 17500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:26:33 time: 2.0161 data_time: 0.0170 memory: 33915 grad_norm: 28.6255 loss: 10.7222 loss_cls: 0.3487 loss_bbox: 0.1609 loss_iou: 0.2756 d0.loss_cls: 0.4201 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2821 d1.loss_cls: 0.3842 d1.loss_bbox: 0.1498 d1.loss_iou: 0.2752 d2.loss_cls: 0.3658 d2.loss_bbox: 0.1592 d2.loss_iou: 0.2757 d3.loss_cls: 0.3560 d3.loss_bbox: 0.1588 d3.loss_iou: 0.2750 d4.loss_cls: 0.3505 d4.loss_bbox: 0.1594 d4.loss_iou: 0.2739 enc_loss_cls: 0.4144 enc_loss_bbox: 0.1860 enc_loss_iou: 0.3109 dn_loss_cls: 0.1344 dn_loss_bbox: 0.1945 dn_loss_iou: 0.2293 d0.dn_loss_cls: 0.2118 d0.dn_loss_bbox: 0.3553 d0.dn_loss_iou: 0.3825 d1.dn_loss_cls: 0.1639 d1.dn_loss_bbox: 0.2282 d1.dn_loss_iou: 0.2630 d2.dn_loss_cls: 0.1443 d2.dn_loss_bbox: 0.2043 d2.dn_loss_iou: 0.2397 d3.dn_loss_cls: 0.1395 d3.dn_loss_bbox: 0.1969 d3.dn_loss_iou: 0.2328 d4.dn_loss_cls: 0.1349 d4.dn_loss_bbox: 0.1946 d4.dn_loss_iou: 0.2291 d1.loss_lmm_region: 0.1982 loss_lmm_image: 0.9043 2024/11/10 19:26:13 - mmengine - INFO - Iter(train) [ 17600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:23:02 time: 1.9762 data_time: 0.0168 memory: 35129 grad_norm: 26.9273 loss: 10.6577 loss_cls: 0.3446 loss_bbox: 0.1358 loss_iou: 0.2480 d0.loss_cls: 0.3968 d0.loss_bbox: 0.1573 d0.loss_iou: 0.2699 d1.loss_cls: 0.3622 d1.loss_bbox: 0.1498 d1.loss_iou: 0.2619 d2.loss_cls: 0.3566 d2.loss_bbox: 0.1408 d2.loss_iou: 0.2546 d3.loss_cls: 0.3504 d3.loss_bbox: 0.1390 d3.loss_iou: 0.2503 d4.loss_cls: 0.3476 d4.loss_bbox: 0.1365 d4.loss_iou: 0.2493 enc_loss_cls: 0.3938 enc_loss_bbox: 0.1720 enc_loss_iou: 0.2890 dn_loss_cls: 
0.1785 dn_loss_bbox: 0.1858 dn_loss_iou: 0.2162 d0.dn_loss_cls: 0.2553 d0.dn_loss_bbox: 0.3754 d0.dn_loss_iou: 0.3855 d1.dn_loss_cls: 0.2060 d1.dn_loss_bbox: 0.2355 d1.dn_loss_iou: 0.2576 d2.dn_loss_cls: 0.1873 d2.dn_loss_bbox: 0.2032 d2.dn_loss_iou: 0.2301 d3.dn_loss_cls: 0.1829 d3.dn_loss_bbox: 0.1906 d3.dn_loss_iou: 0.2203 d4.dn_loss_cls: 0.1775 d4.dn_loss_bbox: 0.1858 d4.dn_loss_iou: 0.2164 d1.loss_lmm_region: 0.1901 loss_lmm_image: 0.9717 2024/11/10 19:29:32 - mmengine - INFO - Iter(train) [ 17700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:19:42 time: 1.9839 data_time: 0.0168 memory: 34547 grad_norm: 23.3160 loss: 10.4316 loss_cls: 0.3328 loss_bbox: 0.1607 loss_iou: 0.2761 d0.loss_cls: 0.3931 d0.loss_bbox: 0.1744 d0.loss_iou: 0.2953 d1.loss_cls: 0.3619 d1.loss_bbox: 0.1664 d1.loss_iou: 0.2863 d2.loss_cls: 0.3538 d2.loss_bbox: 0.1600 d2.loss_iou: 0.2766 d3.loss_cls: 0.3435 d3.loss_bbox: 0.1604 d3.loss_iou: 0.2767 d4.loss_cls: 0.3369 d4.loss_bbox: 0.1585 d4.loss_iou: 0.2749 enc_loss_cls: 0.3924 enc_loss_bbox: 0.1874 enc_loss_iou: 0.3156 dn_loss_cls: 0.0982 dn_loss_bbox: 0.1818 dn_loss_iou: 0.2368 d0.dn_loss_cls: 0.1858 d0.dn_loss_bbox: 0.3320 d0.dn_loss_iou: 0.3873 d1.dn_loss_cls: 0.1334 d1.dn_loss_bbox: 0.2133 d1.dn_loss_iou: 0.2685 d2.dn_loss_cls: 0.1096 d2.dn_loss_bbox: 0.1919 d2.dn_loss_iou: 0.2470 d3.dn_loss_cls: 0.1038 d3.dn_loss_bbox: 0.1838 d3.dn_loss_iou: 0.2394 d4.dn_loss_cls: 0.0986 d4.dn_loss_bbox: 0.1819 d4.dn_loss_iou: 0.2368 d1.loss_lmm_region: 0.1828 loss_lmm_image: 0.9351 2024/11/10 19:32:49 - mmengine - INFO - Iter(train) [ 17800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:15:58 time: 1.9481 data_time: 0.0170 memory: 33045 grad_norm: 27.3882 loss: 8.4013 loss_cls: 0.2625 loss_bbox: 0.1022 loss_iou: 0.1951 d0.loss_cls: 0.3065 d0.loss_bbox: 0.1164 d0.loss_iou: 0.2103 d1.loss_cls: 0.2819 d1.loss_bbox: 0.1093 d1.loss_iou: 0.2045 d2.loss_cls: 0.2701 d2.loss_bbox: 0.1041 d2.loss_iou: 0.1983 d3.loss_cls: 0.2607 d3.loss_bbox: 0.1066 d3.loss_iou: 0.1976 d4.loss_cls: 0.2611 d4.loss_bbox: 0.1041 d4.loss_iou: 0.1965 enc_loss_cls: 0.3075 enc_loss_bbox: 0.1262 enc_loss_iou: 0.2289 dn_loss_cls: 0.1046 dn_loss_bbox: 0.1418 dn_loss_iou: 0.1909 d0.dn_loss_cls: 0.1874 d0.dn_loss_bbox: 0.2888 d0.dn_loss_iou: 0.3358 d1.dn_loss_cls: 0.1361 d1.dn_loss_bbox: 0.1745 d1.dn_loss_iou: 0.2247 d2.dn_loss_cls: 0.1154 d2.dn_loss_bbox: 0.1538 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.1079 d3.dn_loss_bbox: 0.1441 d3.dn_loss_iou: 0.1938 d4.dn_loss_cls: 0.1045 d4.dn_loss_bbox: 0.1418 d4.dn_loss_iou: 0.1910 d1.loss_lmm_region: 0.1614 loss_lmm_image: 0.9510 2024/11/10 19:36:10 - mmengine - INFO - Iter(train) [ 17900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:12:55 time: 2.0027 data_time: 0.0169 memory: 34254 grad_norm: 24.7950 loss: 9.8705 loss_cls: 0.3368 loss_bbox: 0.1212 loss_iou: 0.2251 d0.loss_cls: 0.3967 d0.loss_bbox: 0.1351 d0.loss_iou: 0.2407 d1.loss_cls: 0.3627 d1.loss_bbox: 0.1271 d1.loss_iou: 0.2309 d2.loss_cls: 0.3464 d2.loss_bbox: 0.1265 d2.loss_iou: 0.2301 d3.loss_cls: 0.3380 d3.loss_bbox: 0.1230 d3.loss_iou: 0.2278 d4.loss_cls: 0.3371 d4.loss_bbox: 0.1197 d4.loss_iou: 0.2235 enc_loss_cls: 0.3892 enc_loss_bbox: 0.1476 enc_loss_iou: 0.2629 dn_loss_cls: 0.1818 dn_loss_bbox: 0.1555 dn_loss_iou: 0.1929 d0.dn_loss_cls: 0.2627 d0.dn_loss_bbox: 0.3138 d0.dn_loss_iou: 0.3419 d1.dn_loss_cls: 0.2162 d1.dn_loss_bbox: 0.1860 d1.dn_loss_iou: 0.2258 d2.dn_loss_cls: 0.1949 d2.dn_loss_bbox: 0.1639 d2.dn_loss_iou: 0.2026 d3.dn_loss_cls: 0.1855 d3.dn_loss_bbox: 0.1586 
d3.dn_loss_iou: 0.1961 d4.dn_loss_cls: 0.1808 d4.dn_loss_bbox: 0.1556 d4.dn_loss_iou: 0.1929 d1.loss_lmm_region: 0.2176 loss_lmm_image: 0.8971 2024/11/10 19:39:32 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 19:39:32 - mmengine - INFO - Iter(train) [ 18000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:09:48 time: 2.0135 data_time: 0.0170 memory: 33356 grad_norm: 26.4904 loss: 10.1485 loss_cls: 0.3265 loss_bbox: 0.1414 loss_iou: 0.2267 d0.loss_cls: 0.3785 d0.loss_bbox: 0.1553 d0.loss_iou: 0.2333 d1.loss_cls: 0.3454 d1.loss_bbox: 0.1490 d1.loss_iou: 0.2297 d2.loss_cls: 0.3344 d2.loss_bbox: 0.1433 d2.loss_iou: 0.2263 d3.loss_cls: 0.3293 d3.loss_bbox: 0.1432 d3.loss_iou: 0.2257 d4.loss_cls: 0.3277 d4.loss_bbox: 0.1427 d4.loss_iou: 0.2276 enc_loss_cls: 0.3813 enc_loss_bbox: 0.1713 enc_loss_iou: 0.2558 dn_loss_cls: 0.1432 dn_loss_bbox: 0.1881 dn_loss_iou: 0.2283 d0.dn_loss_cls: 0.2208 d0.dn_loss_bbox: 0.3581 d0.dn_loss_iou: 0.3796 d1.dn_loss_cls: 0.1738 d1.dn_loss_bbox: 0.2270 d1.dn_loss_iou: 0.2615 d2.dn_loss_cls: 0.1585 d2.dn_loss_bbox: 0.2031 d2.dn_loss_iou: 0.2406 d3.dn_loss_cls: 0.1502 d3.dn_loss_bbox: 0.1912 d3.dn_loss_iou: 0.2316 d4.dn_loss_cls: 0.1455 d4.dn_loss_bbox: 0.1881 d4.dn_loss_iou: 0.2282 d1.loss_lmm_region: 0.1849 loss_lmm_image: 0.9516 2024/11/10 19:42:50 - mmengine - INFO - Iter(train) [ 18100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:06:20 time: 1.9842 data_time: 0.0171 memory: 35197 grad_norm: 36.2767 loss: 9.8150 loss_cls: 0.3053 loss_bbox: 0.1253 loss_iou: 0.2394 d0.loss_cls: 0.3584 d0.loss_bbox: 0.1448 d0.loss_iou: 0.2578 d1.loss_cls: 0.3253 d1.loss_bbox: 0.1342 d1.loss_iou: 0.2501 d2.loss_cls: 0.3146 d2.loss_bbox: 0.1297 d2.loss_iou: 0.2443 d3.loss_cls: 0.3068 d3.loss_bbox: 0.1292 d3.loss_iou: 0.2427 d4.loss_cls: 0.3071 d4.loss_bbox: 0.1256 d4.loss_iou: 0.2395 enc_loss_cls: 0.3528 enc_loss_bbox: 0.1580 enc_loss_iou: 0.2812 dn_loss_cls: 0.1492 dn_loss_bbox: 0.1757 dn_loss_iou: 0.2189 d0.dn_loss_cls: 0.2275 d0.dn_loss_bbox: 0.3374 d0.dn_loss_iou: 0.3726 d1.dn_loss_cls: 0.1771 d1.dn_loss_bbox: 0.2144 d1.dn_loss_iou: 0.2530 d2.dn_loss_cls: 0.1584 d2.dn_loss_bbox: 0.1883 d2.dn_loss_iou: 0.2291 d3.dn_loss_cls: 0.1531 d3.dn_loss_bbox: 0.1784 d3.dn_loss_iou: 0.2214 d4.dn_loss_cls: 0.1493 d4.dn_loss_bbox: 0.1757 d4.dn_loss_iou: 0.2188 d1.loss_lmm_region: 0.1688 loss_lmm_image: 0.8757 2024/11/10 19:46:09 - mmengine - INFO - Iter(train) [ 18200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 1:02:53 time: 1.9702 data_time: 0.0172 memory: 34768 grad_norm: 28.1509 loss: 9.4807 loss_cls: 0.3101 loss_bbox: 0.1156 loss_iou: 0.1954 d0.loss_cls: 0.3649 d0.loss_bbox: 0.1281 d0.loss_iou: 0.2119 d1.loss_cls: 0.3366 d1.loss_bbox: 0.1202 d1.loss_iou: 0.2026 d2.loss_cls: 0.3247 d2.loss_bbox: 0.1166 d2.loss_iou: 0.1957 d3.loss_cls: 0.3153 d3.loss_bbox: 0.1164 d3.loss_iou: 0.1959 d4.loss_cls: 0.3114 d4.loss_bbox: 0.1144 d4.loss_iou: 0.1945 enc_loss_cls: 0.3569 enc_loss_bbox: 0.1445 enc_loss_iou: 0.2365 dn_loss_cls: 0.1652 dn_loss_bbox: 0.1717 dn_loss_iou: 0.1980 d0.dn_loss_cls: 0.2451 d0.dn_loss_bbox: 0.3404 d0.dn_loss_iou: 0.3515 d1.dn_loss_cls: 0.1971 d1.dn_loss_bbox: 0.2080 d1.dn_loss_iou: 0.2309 d2.dn_loss_cls: 0.1771 d2.dn_loss_bbox: 0.1806 d2.dn_loss_iou: 0.2071 d3.dn_loss_cls: 0.1690 d3.dn_loss_bbox: 0.1732 d3.dn_loss_iou: 0.2003 d4.dn_loss_cls: 0.1645 d4.dn_loss_bbox: 0.1718 d4.dn_loss_iou: 0.1980 d1.loss_lmm_region: 0.1758 loss_lmm_image: 0.9468 2024/11/10 19:49:27 - mmengine - INFO - Iter(train) [ 18300/150000] 
base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:59:23 time: 1.9842 data_time: 0.0170 memory: 34567 grad_norm: 31.6343 loss: 9.9318 loss_cls: 0.3072 loss_bbox: 0.1436 loss_iou: 0.2325 d0.loss_cls: 0.3534 d0.loss_bbox: 0.1577 d0.loss_iou: 0.2487 d1.loss_cls: 0.3307 d1.loss_bbox: 0.1467 d1.loss_iou: 0.2375 d2.loss_cls: 0.3238 d2.loss_bbox: 0.1445 d2.loss_iou: 0.2335 d3.loss_cls: 0.3113 d3.loss_bbox: 0.1477 d3.loss_iou: 0.2352 d4.loss_cls: 0.3083 d4.loss_bbox: 0.1470 d4.loss_iou: 0.2334 enc_loss_cls: 0.3482 enc_loss_bbox: 0.1702 enc_loss_iou: 0.2728 dn_loss_cls: 0.1374 dn_loss_bbox: 0.1956 dn_loss_iou: 0.2232 d0.dn_loss_cls: 0.2126 d0.dn_loss_bbox: 0.3323 d0.dn_loss_iou: 0.3629 d1.dn_loss_cls: 0.1645 d1.dn_loss_bbox: 0.2262 d1.dn_loss_iou: 0.2561 d2.dn_loss_cls: 0.1501 d2.dn_loss_bbox: 0.2000 d2.dn_loss_iou: 0.2316 d3.dn_loss_cls: 0.1419 d3.dn_loss_bbox: 0.1950 d3.dn_loss_iou: 0.2254 d4.dn_loss_cls: 0.1385 d4.dn_loss_bbox: 0.1957 d4.dn_loss_iou: 0.2232 d1.loss_lmm_region: 0.1873 loss_lmm_image: 0.8981 2024/11/10 19:52:45 - mmengine - INFO - Iter(train) [ 18400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:55:56 time: 2.0140 data_time: 0.0171 memory: 35549 grad_norm: 30.0152 loss: 10.5766 loss_cls: 0.3160 loss_bbox: 0.1486 loss_iou: 0.2602 d0.loss_cls: 0.3781 d0.loss_bbox: 0.1602 d0.loss_iou: 0.2734 d1.loss_cls: 0.3416 d1.loss_bbox: 0.1485 d1.loss_iou: 0.2642 d2.loss_cls: 0.3330 d2.loss_bbox: 0.1418 d2.loss_iou: 0.2555 d3.loss_cls: 0.3262 d3.loss_bbox: 0.1431 d3.loss_iou: 0.2553 d4.loss_cls: 0.3193 d4.loss_bbox: 0.1438 d4.loss_iou: 0.2585 enc_loss_cls: 0.3716 enc_loss_bbox: 0.1716 enc_loss_iou: 0.2969 dn_loss_cls: 0.1450 dn_loss_bbox: 0.2002 dn_loss_iou: 0.2550 d0.dn_loss_cls: 0.2316 d0.dn_loss_bbox: 0.3604 d0.dn_loss_iou: 0.4130 d1.dn_loss_cls: 0.1801 d1.dn_loss_bbox: 0.2333 d1.dn_loss_iou: 0.2881 d2.dn_loss_cls: 0.1561 d2.dn_loss_bbox: 0.2115 d2.dn_loss_iou: 0.2649 d3.dn_loss_cls: 0.1479 d3.dn_loss_bbox: 0.2023 d3.dn_loss_iou: 0.2577 d4.dn_loss_cls: 0.1445 d4.dn_loss_bbox: 0.2003 d4.dn_loss_iou: 0.2550 d1.loss_lmm_region: 0.2132 loss_lmm_image: 0.9093 2024/11/10 19:56:04 - mmengine - INFO - Iter(train) [ 18500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:52:33 time: 1.9819 data_time: 0.0169 memory: 35968 grad_norm: 25.6506 loss: 9.5339 loss_cls: 0.3253 loss_bbox: 0.1221 loss_iou: 0.2261 d0.loss_cls: 0.3630 d0.loss_bbox: 0.1425 d0.loss_iou: 0.2462 d1.loss_cls: 0.3377 d1.loss_bbox: 0.1351 d1.loss_iou: 0.2405 d2.loss_cls: 0.3319 d2.loss_bbox: 0.1267 d2.loss_iou: 0.2299 d3.loss_cls: 0.3292 d3.loss_bbox: 0.1215 d3.loss_iou: 0.2261 d4.loss_cls: 0.3269 d4.loss_bbox: 0.1217 d4.loss_iou: 0.2256 enc_loss_cls: 0.3597 enc_loss_bbox: 0.1542 enc_loss_iou: 0.2681 dn_loss_cls: 0.1174 dn_loss_bbox: 0.1593 dn_loss_iou: 0.2088 d0.dn_loss_cls: 0.1991 d0.dn_loss_bbox: 0.3126 d0.dn_loss_iou: 0.3568 d1.dn_loss_cls: 0.1499 d1.dn_loss_bbox: 0.1931 d1.dn_loss_iou: 0.2408 d2.dn_loss_cls: 0.1319 d2.dn_loss_bbox: 0.1719 d2.dn_loss_iou: 0.2193 d3.dn_loss_cls: 0.1217 d3.dn_loss_bbox: 0.1614 d3.dn_loss_iou: 0.2113 d4.dn_loss_cls: 0.1181 d4.dn_loss_bbox: 0.1593 d4.dn_loss_iou: 0.2090 d1.loss_lmm_region: 0.1717 loss_lmm_image: 0.9603 2024/11/10 19:59:25 - mmengine - INFO - Iter(train) [ 18600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:49:24 time: 2.0072 data_time: 0.0171 memory: 34457 grad_norm: 30.4607 loss: 9.1371 loss_cls: 0.2755 loss_bbox: 0.1225 loss_iou: 0.2012 d0.loss_cls: 0.3244 d0.loss_bbox: 0.1294 d0.loss_iou: 0.2111 d1.loss_cls: 0.2885 d1.loss_bbox: 0.1279 d1.loss_iou: 
0.2086 d2.loss_cls: 0.2851 d2.loss_bbox: 0.1230 d2.loss_iou: 0.2019 d3.loss_cls: 0.2736 d3.loss_bbox: 0.1240 d3.loss_iou: 0.2025 d4.loss_cls: 0.2748 d4.loss_bbox: 0.1236 d4.loss_iou: 0.2015 enc_loss_cls: 0.3170 enc_loss_bbox: 0.1430 enc_loss_iou: 0.2296 dn_loss_cls: 0.1146 dn_loss_bbox: 0.1762 dn_loss_iou: 0.2137 d0.dn_loss_cls: 0.2001 d0.dn_loss_bbox: 0.3557 d0.dn_loss_iou: 0.3738 d1.dn_loss_cls: 0.1475 d1.dn_loss_bbox: 0.2170 d1.dn_loss_iou: 0.2487 d2.dn_loss_cls: 0.1272 d2.dn_loss_bbox: 0.1910 d2.dn_loss_iou: 0.2251 d3.dn_loss_cls: 0.1186 d3.dn_loss_bbox: 0.1795 d3.dn_loss_iou: 0.2172 d4.dn_loss_cls: 0.1146 d4.dn_loss_bbox: 0.1763 d4.dn_loss_iou: 0.2138 d1.loss_lmm_region: 0.1720 loss_lmm_image: 0.9662 2024/11/10 20:02:44 - mmengine - INFO - Iter(train) [ 18700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:45:58 time: 1.9698 data_time: 0.0169 memory: 33013 grad_norm: 26.9181 loss: 10.6478 loss_cls: 0.3203 loss_bbox: 0.1415 loss_iou: 0.2083 d0.loss_cls: 0.3720 d0.loss_bbox: 0.1615 d0.loss_iou: 0.2220 d1.loss_cls: 0.3417 d1.loss_bbox: 0.1453 d1.loss_iou: 0.2123 d2.loss_cls: 0.3313 d2.loss_bbox: 0.1436 d2.loss_iou: 0.2095 d3.loss_cls: 0.3259 d3.loss_bbox: 0.1436 d3.loss_iou: 0.2082 d4.loss_cls: 0.3231 d4.loss_bbox: 0.1419 d4.loss_iou: 0.2085 enc_loss_cls: 0.3706 enc_loss_bbox: 0.1706 enc_loss_iou: 0.2389 dn_loss_cls: 0.2062 dn_loss_bbox: 0.2040 dn_loss_iou: 0.2284 d0.dn_loss_cls: 0.2990 d0.dn_loss_bbox: 0.4015 d0.dn_loss_iou: 0.3943 d1.dn_loss_cls: 0.2411 d1.dn_loss_bbox: 0.2534 d1.dn_loss_iou: 0.2677 d2.dn_loss_cls: 0.2207 d2.dn_loss_bbox: 0.2219 d2.dn_loss_iou: 0.2414 d3.dn_loss_cls: 0.2114 d3.dn_loss_bbox: 0.2065 d3.dn_loss_iou: 0.2312 d4.dn_loss_cls: 0.2078 d4.dn_loss_bbox: 0.2040 d4.dn_loss_iou: 0.2285 d1.loss_lmm_region: 0.2336 loss_lmm_image: 1.0043 2024/11/10 20:06:02 - mmengine - INFO - Iter(train) [ 18800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:42:33 time: 1.9740 data_time: 0.0170 memory: 33770 grad_norm: 25.8907 loss: 10.4105 loss_cls: 0.4001 loss_bbox: 0.1377 loss_iou: 0.2346 d0.loss_cls: 0.4522 d0.loss_bbox: 0.1548 d0.loss_iou: 0.2578 d1.loss_cls: 0.4064 d1.loss_bbox: 0.1488 d1.loss_iou: 0.2457 d2.loss_cls: 0.4105 d2.loss_bbox: 0.1397 d2.loss_iou: 0.2384 d3.loss_cls: 0.4020 d3.loss_bbox: 0.1418 d3.loss_iou: 0.2375 d4.loss_cls: 0.3980 d4.loss_bbox: 0.1399 d4.loss_iou: 0.2364 enc_loss_cls: 0.4453 enc_loss_bbox: 0.1644 enc_loss_iou: 0.2739 dn_loss_cls: 0.1297 dn_loss_bbox: 0.1802 dn_loss_iou: 0.2114 d0.dn_loss_cls: 0.2166 d0.dn_loss_bbox: 0.3195 d0.dn_loss_iou: 0.3470 d1.dn_loss_cls: 0.1635 d1.dn_loss_bbox: 0.2078 d1.dn_loss_iou: 0.2396 d2.dn_loss_cls: 0.1438 d2.dn_loss_bbox: 0.1907 d2.dn_loss_iou: 0.2208 d3.dn_loss_cls: 0.1364 d3.dn_loss_bbox: 0.1814 d3.dn_loss_iou: 0.2134 d4.dn_loss_cls: 0.1289 d4.dn_loss_bbox: 0.1802 d4.dn_loss_iou: 0.2113 d1.loss_lmm_region: 0.1842 loss_lmm_image: 0.9385 2024/11/10 20:09:19 - mmengine - INFO - Iter(train) [ 18900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:38:54 time: 1.9607 data_time: 0.0170 memory: 32969 grad_norm: 28.1344 loss: 9.9059 loss_cls: 0.3233 loss_bbox: 0.1378 loss_iou: 0.2254 d0.loss_cls: 0.3735 d0.loss_bbox: 0.1498 d0.loss_iou: 0.2393 d1.loss_cls: 0.3436 d1.loss_bbox: 0.1413 d1.loss_iou: 0.2263 d2.loss_cls: 0.3288 d2.loss_bbox: 0.1391 d2.loss_iou: 0.2258 d3.loss_cls: 0.3278 d3.loss_bbox: 0.1370 d3.loss_iou: 0.2230 d4.loss_cls: 0.3223 d4.loss_bbox: 0.1383 d4.loss_iou: 0.2260 enc_loss_cls: 0.3676 enc_loss_bbox: 0.1665 enc_loss_iou: 0.2630 dn_loss_cls: 0.1287 dn_loss_bbox: 0.1902 
dn_loss_iou: 0.2080 d0.dn_loss_cls: 0.2094 d0.dn_loss_bbox: 0.3602 d0.dn_loss_iou: 0.3585 d1.dn_loss_cls: 0.1589 d1.dn_loss_bbox: 0.2216 d1.dn_loss_iou: 0.2392 d2.dn_loss_cls: 0.1399 d2.dn_loss_bbox: 0.1991 d2.dn_loss_iou: 0.2173 d3.dn_loss_cls: 0.1330 d3.dn_loss_bbox: 0.1918 d3.dn_loss_iou: 0.2106 d4.dn_loss_cls: 0.1298 d4.dn_loss_bbox: 0.1903 d4.dn_loss_iou: 0.2080 d1.loss_lmm_region: 0.2025 loss_lmm_image: 0.9834 2024/11/10 20:12:39 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 20:12:39 - mmengine - INFO - Iter(train) [ 19000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:35:34 time: 1.9780 data_time: 0.0169 memory: 34291 grad_norm: 29.5621 loss: 9.6880 loss_cls: 0.3078 loss_bbox: 0.1349 loss_iou: 0.2397 d0.loss_cls: 0.3610 d0.loss_bbox: 0.1390 d0.loss_iou: 0.2520 d1.loss_cls: 0.3284 d1.loss_bbox: 0.1330 d1.loss_iou: 0.2478 d2.loss_cls: 0.3211 d2.loss_bbox: 0.1295 d2.loss_iou: 0.2370 d3.loss_cls: 0.3133 d3.loss_bbox: 0.1324 d3.loss_iou: 0.2380 d4.loss_cls: 0.3075 d4.loss_bbox: 0.1354 d4.loss_iou: 0.2406 enc_loss_cls: 0.3536 enc_loss_bbox: 0.1548 enc_loss_iou: 0.2721 dn_loss_cls: 0.1553 dn_loss_bbox: 0.1646 dn_loss_iou: 0.2203 d0.dn_loss_cls: 0.2109 d0.dn_loss_bbox: 0.2911 d0.dn_loss_iou: 0.3575 d1.dn_loss_cls: 0.1737 d1.dn_loss_bbox: 0.1882 d1.dn_loss_iou: 0.2489 d2.dn_loss_cls: 0.1599 d2.dn_loss_bbox: 0.1711 d2.dn_loss_iou: 0.2290 d3.dn_loss_cls: 0.1570 d3.dn_loss_bbox: 0.1657 d3.dn_loss_iou: 0.2218 d4.dn_loss_cls: 0.1549 d4.dn_loss_bbox: 0.1645 d4.dn_loss_iou: 0.2203 d1.loss_lmm_region: 0.1563 loss_lmm_image: 0.8980 2024/11/10 20:15:57 - mmengine - INFO - Iter(train) [ 19100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:32:10 time: 2.0160 data_time: 0.0172 memory: 34061 grad_norm: 26.5241 loss: 9.3337 loss_cls: 0.2718 loss_bbox: 0.1282 loss_iou: 0.2302 d0.loss_cls: 0.3297 d0.loss_bbox: 0.1288 d0.loss_iou: 0.2356 d1.loss_cls: 0.2948 d1.loss_bbox: 0.1298 d1.loss_iou: 0.2327 d2.loss_cls: 0.2833 d2.loss_bbox: 0.1293 d2.loss_iou: 0.2315 d3.loss_cls: 0.2758 d3.loss_bbox: 0.1302 d3.loss_iou: 0.2309 d4.loss_cls: 0.2734 d4.loss_bbox: 0.1276 d4.loss_iou: 0.2301 enc_loss_cls: 0.3158 enc_loss_bbox: 0.1508 enc_loss_iou: 0.2646 dn_loss_cls: 0.1509 dn_loss_bbox: 0.1602 dn_loss_iou: 0.2138 d0.dn_loss_cls: 0.2141 d0.dn_loss_bbox: 0.3014 d0.dn_loss_iou: 0.3545 d1.dn_loss_cls: 0.1768 d1.dn_loss_bbox: 0.1919 d1.dn_loss_iou: 0.2431 d2.dn_loss_cls: 0.1602 d2.dn_loss_bbox: 0.1682 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.1539 d3.dn_loss_bbox: 0.1611 d3.dn_loss_iou: 0.2157 d4.dn_loss_cls: 0.1515 d4.dn_loss_bbox: 0.1601 d4.dn_loss_iou: 0.2138 d1.loss_lmm_region: 0.1632 loss_lmm_image: 0.9325 2024/11/10 20:19:18 - mmengine - INFO - Iter(train) [ 19200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:28:55 time: 2.0219 data_time: 0.0169 memory: 34747 grad_norm: 29.8191 loss: 10.6237 loss_cls: 0.3406 loss_bbox: 0.1479 loss_iou: 0.2531 d0.loss_cls: 0.3886 d0.loss_bbox: 0.1537 d0.loss_iou: 0.2624 d1.loss_cls: 0.3653 d1.loss_bbox: 0.1446 d1.loss_iou: 0.2525 d2.loss_cls: 0.3539 d2.loss_bbox: 0.1483 d2.loss_iou: 0.2545 d3.loss_cls: 0.3417 d3.loss_bbox: 0.1514 d3.loss_iou: 0.2553 d4.loss_cls: 0.3390 d4.loss_bbox: 0.1503 d4.loss_iou: 0.2546 enc_loss_cls: 0.3835 enc_loss_bbox: 0.1612 enc_loss_iou: 0.2772 dn_loss_cls: 0.1968 dn_loss_bbox: 0.1816 dn_loss_iou: 0.2297 d0.dn_loss_cls: 0.2706 d0.dn_loss_bbox: 0.3234 d0.dn_loss_iou: 0.3724 d1.dn_loss_cls: 0.2220 d1.dn_loss_bbox: 0.2161 d1.dn_loss_iou: 0.2612 d2.dn_loss_cls: 0.2073 d2.dn_loss_bbox: 0.1919 
d2.dn_loss_iou: 0.2391 d3.dn_loss_cls: 0.1991 d3.dn_loss_bbox: 0.1834 d3.dn_loss_iou: 0.2322 d4.dn_loss_cls: 0.1945 d4.dn_loss_bbox: 0.1816 d4.dn_loss_iou: 0.2297 d1.loss_lmm_region: 0.2072 loss_lmm_image: 0.9039 2024/11/10 20:22:36 - mmengine - INFO - Iter(train) [ 19300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:25:30 time: 1.9801 data_time: 0.0171 memory: 34368 grad_norm: 28.0255 loss: 9.0187 loss_cls: 0.2697 loss_bbox: 0.1213 loss_iou: 0.2227 d0.loss_cls: 0.3145 d0.loss_bbox: 0.1314 d0.loss_iou: 0.2330 d1.loss_cls: 0.2907 d1.loss_bbox: 0.1252 d1.loss_iou: 0.2264 d2.loss_cls: 0.2815 d2.loss_bbox: 0.1213 d2.loss_iou: 0.2224 d3.loss_cls: 0.2753 d3.loss_bbox: 0.1210 d3.loss_iou: 0.2213 d4.loss_cls: 0.2686 d4.loss_bbox: 0.1218 d4.loss_iou: 0.2238 enc_loss_cls: 0.3058 enc_loss_bbox: 0.1547 enc_loss_iou: 0.2689 dn_loss_cls: 0.1315 dn_loss_bbox: 0.1491 dn_loss_iou: 0.2024 d0.dn_loss_cls: 0.2009 d0.dn_loss_bbox: 0.3011 d0.dn_loss_iou: 0.3481 d1.dn_loss_cls: 0.1589 d1.dn_loss_bbox: 0.1922 d1.dn_loss_iou: 0.2376 d2.dn_loss_cls: 0.1427 d2.dn_loss_bbox: 0.1595 d2.dn_loss_iou: 0.2117 d3.dn_loss_cls: 0.1369 d3.dn_loss_bbox: 0.1509 d3.dn_loss_iou: 0.2045 d4.dn_loss_cls: 0.1315 d4.dn_loss_bbox: 0.1490 d4.dn_loss_iou: 0.2023 d1.loss_lmm_region: 0.1542 loss_lmm_image: 0.9325 2024/11/10 20:25:56 - mmengine - INFO - Iter(train) [ 19400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:22:12 time: 2.0203 data_time: 0.0171 memory: 34647 grad_norm: 36.2585 loss: 10.4246 loss_cls: 0.3293 loss_bbox: 0.1292 loss_iou: 0.2208 d0.loss_cls: 0.3802 d0.loss_bbox: 0.1494 d0.loss_iou: 0.2404 d1.loss_cls: 0.3475 d1.loss_bbox: 0.1379 d1.loss_iou: 0.2314 d2.loss_cls: 0.3297 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2240 d3.loss_cls: 0.3303 d3.loss_bbox: 0.1330 d3.loss_iou: 0.2221 d4.loss_cls: 0.3288 d4.loss_bbox: 0.1313 d4.loss_iou: 0.2223 enc_loss_cls: 0.3693 enc_loss_bbox: 0.1622 enc_loss_iou: 0.2603 dn_loss_cls: 0.2056 dn_loss_bbox: 0.1856 dn_loss_iou: 0.2265 d0.dn_loss_cls: 0.2979 d0.dn_loss_bbox: 0.3570 d0.dn_loss_iou: 0.3844 d1.dn_loss_cls: 0.2414 d1.dn_loss_bbox: 0.2245 d1.dn_loss_iou: 0.2603 d2.dn_loss_cls: 0.2180 d2.dn_loss_bbox: 0.1956 d2.dn_loss_iou: 0.2357 d3.dn_loss_cls: 0.2100 d3.dn_loss_bbox: 0.1882 d3.dn_loss_iou: 0.2298 d4.dn_loss_cls: 0.2048 d4.dn_loss_bbox: 0.1854 d4.dn_loss_iou: 0.2264 d1.loss_lmm_region: 0.2068 loss_lmm_image: 0.9269 2024/11/10 20:29:16 - mmengine - INFO - Iter(train) [ 19500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:18:58 time: 1.9981 data_time: 0.0170 memory: 32512 grad_norm: 24.9248 loss: 10.2619 loss_cls: 0.2940 loss_bbox: 0.1453 loss_iou: 0.2234 d0.loss_cls: 0.3419 d0.loss_bbox: 0.1672 d0.loss_iou: 0.2449 d1.loss_cls: 0.3204 d1.loss_bbox: 0.1481 d1.loss_iou: 0.2269 d2.loss_cls: 0.3070 d2.loss_bbox: 0.1459 d2.loss_iou: 0.2234 d3.loss_cls: 0.2972 d3.loss_bbox: 0.1469 d3.loss_iou: 0.2250 d4.loss_cls: 0.2931 d4.loss_bbox: 0.1464 d4.loss_iou: 0.2239 enc_loss_cls: 0.3352 enc_loss_bbox: 0.1838 enc_loss_iou: 0.2678 dn_loss_cls: 0.1620 dn_loss_bbox: 0.2171 dn_loss_iou: 0.2316 d0.dn_loss_cls: 0.2516 d0.dn_loss_bbox: 0.4039 d0.dn_loss_iou: 0.3939 d1.dn_loss_cls: 0.1954 d1.dn_loss_bbox: 0.2584 d1.dn_loss_iou: 0.2661 d2.dn_loss_cls: 0.1768 d2.dn_loss_bbox: 0.2295 d2.dn_loss_iou: 0.2428 d3.dn_loss_cls: 0.1672 d3.dn_loss_bbox: 0.2200 d3.dn_loss_iou: 0.2344 d4.dn_loss_cls: 0.1643 d4.dn_loss_bbox: 0.2171 d4.dn_loss_iou: 0.2316 d1.loss_lmm_region: 0.2142 loss_lmm_image: 0.8763 2024/11/10 20:32:34 - mmengine - INFO - Iter(train) [ 19600/150000] base_lr: 1.0000e-04 
lr: 1.0000e-04 eta: 3 days, 0:15:28 time: 1.9933 data_time: 0.0170 memory: 32708 grad_norm: 26.5578 loss: 9.4041 loss_cls: 0.2817 loss_bbox: 0.1330 loss_iou: 0.2330 d0.loss_cls: 0.3256 d0.loss_bbox: 0.1405 d0.loss_iou: 0.2425 d1.loss_cls: 0.2982 d1.loss_bbox: 0.1374 d1.loss_iou: 0.2390 d2.loss_cls: 0.2905 d2.loss_bbox: 0.1365 d2.loss_iou: 0.2353 d3.loss_cls: 0.2809 d3.loss_bbox: 0.1389 d3.loss_iou: 0.2353 d4.loss_cls: 0.2811 d4.loss_bbox: 0.1326 d4.loss_iou: 0.2333 enc_loss_cls: 0.3226 enc_loss_bbox: 0.1554 enc_loss_iou: 0.2619 dn_loss_cls: 0.1188 dn_loss_bbox: 0.1751 dn_loss_iou: 0.2234 d0.dn_loss_cls: 0.1952 d0.dn_loss_bbox: 0.3213 d0.dn_loss_iou: 0.3639 d1.dn_loss_cls: 0.1496 d1.dn_loss_bbox: 0.2096 d1.dn_loss_iou: 0.2546 d2.dn_loss_cls: 0.1310 d2.dn_loss_bbox: 0.1845 d2.dn_loss_iou: 0.2328 d3.dn_loss_cls: 0.1250 d3.dn_loss_bbox: 0.1773 d3.dn_loss_iou: 0.2259 d4.dn_loss_cls: 0.1179 d4.dn_loss_bbox: 0.1751 d4.dn_loss_iou: 0.2234 d1.loss_lmm_region: 0.1633 loss_lmm_image: 0.9015 2024/11/10 20:35:53 - mmengine - INFO - Iter(train) [ 19700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:12:05 time: 1.9975 data_time: 0.0171 memory: 34975 grad_norm: 32.0782 loss: 10.7917 loss_cls: 0.3030 loss_bbox: 0.1868 loss_iou: 0.3070 d0.loss_cls: 0.3593 d0.loss_bbox: 0.1940 d0.loss_iou: 0.3188 d1.loss_cls: 0.3333 d1.loss_bbox: 0.1864 d1.loss_iou: 0.3073 d2.loss_cls: 0.3180 d2.loss_bbox: 0.1828 d2.loss_iou: 0.3021 d3.loss_cls: 0.3065 d3.loss_bbox: 0.1881 d3.loss_iou: 0.3063 d4.loss_cls: 0.3058 d4.loss_bbox: 0.1855 d4.loss_iou: 0.3042 enc_loss_cls: 0.3632 enc_loss_bbox: 0.2048 enc_loss_iou: 0.3380 dn_loss_cls: 0.1208 dn_loss_bbox: 0.1930 dn_loss_iou: 0.2395 d0.dn_loss_cls: 0.2051 d0.dn_loss_bbox: 0.3468 d0.dn_loss_iou: 0.3913 d1.dn_loss_cls: 0.1514 d1.dn_loss_bbox: 0.2305 d1.dn_loss_iou: 0.2725 d2.dn_loss_cls: 0.1324 d2.dn_loss_bbox: 0.2020 d2.dn_loss_iou: 0.2495 d3.dn_loss_cls: 0.1265 d3.dn_loss_bbox: 0.1955 d3.dn_loss_iou: 0.2428 d4.dn_loss_cls: 0.1222 d4.dn_loss_bbox: 0.1931 d4.dn_loss_iou: 0.2396 d1.loss_lmm_region: 0.1752 loss_lmm_image: 0.9608 2024/11/10 20:39:11 - mmengine - INFO - Iter(train) [ 19800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:08:36 time: 1.9893 data_time: 0.0171 memory: 34752 grad_norm: 31.5912 loss: 9.0286 loss_cls: 0.3050 loss_bbox: 0.1167 loss_iou: 0.2020 d0.loss_cls: 0.3514 d0.loss_bbox: 0.1322 d0.loss_iou: 0.2143 d1.loss_cls: 0.3231 d1.loss_bbox: 0.1212 d1.loss_iou: 0.2079 d2.loss_cls: 0.3163 d2.loss_bbox: 0.1178 d2.loss_iou: 0.2044 d3.loss_cls: 0.3072 d3.loss_bbox: 0.1177 d3.loss_iou: 0.2024 d4.loss_cls: 0.3034 d4.loss_bbox: 0.1167 d4.loss_iou: 0.2029 enc_loss_cls: 0.3552 enc_loss_bbox: 0.1388 enc_loss_iou: 0.2347 dn_loss_cls: 0.1227 dn_loss_bbox: 0.1560 dn_loss_iou: 0.1923 d0.dn_loss_cls: 0.2051 d0.dn_loss_bbox: 0.2960 d0.dn_loss_iou: 0.3292 d1.dn_loss_cls: 0.1545 d1.dn_loss_bbox: 0.1844 d1.dn_loss_iou: 0.2202 d2.dn_loss_cls: 0.1362 d2.dn_loss_bbox: 0.1646 d2.dn_loss_iou: 0.2001 d3.dn_loss_cls: 0.1281 d3.dn_loss_bbox: 0.1583 d3.dn_loss_iou: 0.1944 d4.dn_loss_cls: 0.1251 d4.dn_loss_bbox: 0.1560 d4.dn_loss_iou: 0.1922 d1.loss_lmm_region: 0.1891 loss_lmm_image: 0.9324 2024/11/10 20:42:31 - mmengine - INFO - Iter(train) [ 19900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:05:20 time: 1.9968 data_time: 0.0170 memory: 34264 grad_norm: 28.4410 loss: 9.6676 loss_cls: 0.3009 loss_bbox: 0.1347 loss_iou: 0.2404 d0.loss_cls: 0.3484 d0.loss_bbox: 0.1455 d0.loss_iou: 0.2541 d1.loss_cls: 0.3130 d1.loss_bbox: 0.1436 d1.loss_iou: 0.2457 d2.loss_cls: 
0.3111 d2.loss_bbox: 0.1332 d2.loss_iou: 0.2371 d3.loss_cls: 0.3039 d3.loss_bbox: 0.1353 d3.loss_iou: 0.2397 d4.loss_cls: 0.3013 d4.loss_bbox: 0.1347 d4.loss_iou: 0.2399 enc_loss_cls: 0.3437 enc_loss_bbox: 0.1640 enc_loss_iou: 0.2796 dn_loss_cls: 0.1382 dn_loss_bbox: 0.1663 dn_loss_iou: 0.2148 d0.dn_loss_cls: 0.2136 d0.dn_loss_bbox: 0.3277 d0.dn_loss_iou: 0.3658 d1.dn_loss_cls: 0.1653 d1.dn_loss_bbox: 0.2049 d1.dn_loss_iou: 0.2483 d2.dn_loss_cls: 0.1475 d2.dn_loss_bbox: 0.1784 d2.dn_loss_iou: 0.2242 d3.dn_loss_cls: 0.1430 d3.dn_loss_bbox: 0.1689 d3.dn_loss_iou: 0.2171 d4.dn_loss_cls: 0.1371 d4.dn_loss_bbox: 0.1663 d4.dn_loss_iou: 0.2145 d1.loss_lmm_region: 0.1694 loss_lmm_image: 0.9064 2024/11/10 20:45:52 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 20:45:52 - mmengine - INFO - Iter(train) [ 20000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 3 days, 0:02:08 time: 2.0072 data_time: 0.0171 memory: 34917 grad_norm: 29.1752 loss: 9.6787 loss_cls: 0.2989 loss_bbox: 0.1353 loss_iou: 0.2123 d0.loss_cls: 0.3591 d0.loss_bbox: 0.1426 d0.loss_iou: 0.2219 d1.loss_cls: 0.3188 d1.loss_bbox: 0.1338 d1.loss_iou: 0.2135 d2.loss_cls: 0.3093 d2.loss_bbox: 0.1325 d2.loss_iou: 0.2105 d3.loss_cls: 0.3019 d3.loss_bbox: 0.1355 d3.loss_iou: 0.2136 d4.loss_cls: 0.2978 d4.loss_bbox: 0.1356 d4.loss_iou: 0.2135 enc_loss_cls: 0.3555 enc_loss_bbox: 0.1571 enc_loss_iou: 0.2423 dn_loss_cls: 0.1629 dn_loss_bbox: 0.1860 dn_loss_iou: 0.2046 d0.dn_loss_cls: 0.2460 d0.dn_loss_bbox: 0.3371 d0.dn_loss_iou: 0.3410 d1.dn_loss_cls: 0.1939 d1.dn_loss_bbox: 0.2220 d1.dn_loss_iou: 0.2350 d2.dn_loss_cls: 0.1733 d2.dn_loss_bbox: 0.1955 d2.dn_loss_iou: 0.2133 d3.dn_loss_cls: 0.1656 d3.dn_loss_bbox: 0.1882 d3.dn_loss_iou: 0.2074 d4.dn_loss_cls: 0.1616 d4.dn_loss_bbox: 0.1860 d4.dn_loss_iou: 0.2048 d1.loss_lmm_region: 0.2018 loss_lmm_image: 0.9115 2024/11/10 20:49:13 - mmengine - INFO - Iter(train) [ 20100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:58:57 time: 1.9846 data_time: 0.0170 memory: 34268 grad_norm: 25.0609 loss: 9.8145 loss_cls: 0.2901 loss_bbox: 0.1378 loss_iou: 0.2583 d0.loss_cls: 0.3465 d0.loss_bbox: 0.1495 d0.loss_iou: 0.2766 d1.loss_cls: 0.3172 d1.loss_bbox: 0.1429 d1.loss_iou: 0.2685 d2.loss_cls: 0.3056 d2.loss_bbox: 0.1387 d2.loss_iou: 0.2590 d3.loss_cls: 0.2967 d3.loss_bbox: 0.1371 d3.loss_iou: 0.2588 d4.loss_cls: 0.2945 d4.loss_bbox: 0.1378 d4.loss_iou: 0.2580 enc_loss_cls: 0.3423 enc_loss_bbox: 0.1683 enc_loss_iou: 0.3058 dn_loss_cls: 0.1127 dn_loss_bbox: 0.1678 dn_loss_iou: 0.2303 d0.dn_loss_cls: 0.1909 d0.dn_loss_bbox: 0.3269 d0.dn_loss_iou: 0.3774 d1.dn_loss_cls: 0.1442 d1.dn_loss_bbox: 0.2060 d1.dn_loss_iou: 0.2642 d2.dn_loss_cls: 0.1248 d2.dn_loss_bbox: 0.1815 d2.dn_loss_iou: 0.2401 d3.dn_loss_cls: 0.1152 d3.dn_loss_bbox: 0.1707 d3.dn_loss_iou: 0.2329 d4.dn_loss_cls: 0.1137 d4.dn_loss_bbox: 0.1677 d4.dn_loss_iou: 0.2304 d1.loss_lmm_region: 0.1778 loss_lmm_image: 0.9492 2024/11/10 20:52:29 - mmengine - INFO - Iter(train) [ 20200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:55:20 time: 1.9748 data_time: 0.0172 memory: 34603 grad_norm: 28.5134 loss: 9.4995 loss_cls: 0.2724 loss_bbox: 0.1374 loss_iou: 0.2552 d0.loss_cls: 0.3188 d0.loss_bbox: 0.1520 d0.loss_iou: 0.2748 d1.loss_cls: 0.2911 d1.loss_bbox: 0.1422 d1.loss_iou: 0.2660 d2.loss_cls: 0.2830 d2.loss_bbox: 0.1398 d2.loss_iou: 0.2607 d3.loss_cls: 0.2761 d3.loss_bbox: 0.1383 d3.loss_iou: 0.2535 d4.loss_cls: 0.2711 d4.loss_bbox: 0.1418 d4.loss_iou: 0.2548 enc_loss_cls: 0.3149 enc_loss_bbox: 0.1639 
enc_loss_iou: 0.2957 dn_loss_cls: 0.0899 dn_loss_bbox: 0.1769 dn_loss_iou: 0.2311 d0.dn_loss_cls: 0.1709 d0.dn_loss_bbox: 0.3437 d0.dn_loss_iou: 0.3835 d1.dn_loss_cls: 0.1204 d1.dn_loss_bbox: 0.2163 d1.dn_loss_iou: 0.2639 d2.dn_loss_cls: 0.1056 d2.dn_loss_bbox: 0.1892 d2.dn_loss_iou: 0.2407 d3.dn_loss_cls: 0.0982 d3.dn_loss_bbox: 0.1785 d3.dn_loss_iou: 0.2336 d4.dn_loss_cls: 0.0923 d4.dn_loss_bbox: 0.1769 d4.dn_loss_iou: 0.2310 d1.loss_lmm_region: 0.1351 loss_lmm_image: 0.9184 2024/11/10 20:55:47 - mmengine - INFO - Iter(train) [ 20300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:51:51 time: 1.9688 data_time: 0.0172 memory: 36475 grad_norm: 32.0769 loss: 11.0737 loss_cls: 0.3928 loss_bbox: 0.1589 loss_iou: 0.2806 d0.loss_cls: 0.4369 d0.loss_bbox: 0.1748 d0.loss_iou: 0.2920 d1.loss_cls: 0.4093 d1.loss_bbox: 0.1673 d1.loss_iou: 0.2868 d2.loss_cls: 0.4003 d2.loss_bbox: 0.1617 d2.loss_iou: 0.2827 d3.loss_cls: 0.3972 d3.loss_bbox: 0.1607 d3.loss_iou: 0.2800 d4.loss_cls: 0.3926 d4.loss_bbox: 0.1592 d4.loss_iou: 0.2809 enc_loss_cls: 0.4363 enc_loss_bbox: 0.1830 enc_loss_iou: 0.3199 dn_loss_cls: 0.1812 dn_loss_bbox: 0.1732 dn_loss_iou: 0.2134 d0.dn_loss_cls: 0.2575 d0.dn_loss_bbox: 0.3069 d0.dn_loss_iou: 0.3434 d1.dn_loss_cls: 0.2092 d1.dn_loss_bbox: 0.1989 d1.dn_loss_iou: 0.2413 d2.dn_loss_cls: 0.1920 d2.dn_loss_bbox: 0.1818 d2.dn_loss_iou: 0.2223 d3.dn_loss_cls: 0.1853 d3.dn_loss_bbox: 0.1755 d3.dn_loss_iou: 0.2162 d4.dn_loss_cls: 0.1838 d4.dn_loss_bbox: 0.1732 d4.dn_loss_iou: 0.2134 d1.loss_lmm_region: 0.2082 loss_lmm_image: 0.9434 2024/11/10 20:59:07 - mmengine - INFO - Iter(train) [ 20400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:48:33 time: 1.9896 data_time: 0.0172 memory: 34038 grad_norm: 34.7891 loss: 10.0504 loss_cls: 0.3595 loss_bbox: 0.1347 loss_iou: 0.2351 d0.loss_cls: 0.4048 d0.loss_bbox: 0.1437 d0.loss_iou: 0.2468 d1.loss_cls: 0.3767 d1.loss_bbox: 0.1392 d1.loss_iou: 0.2409 d2.loss_cls: 0.3648 d2.loss_bbox: 0.1393 d2.loss_iou: 0.2407 d3.loss_cls: 0.3602 d3.loss_bbox: 0.1367 d3.loss_iou: 0.2384 d4.loss_cls: 0.3602 d4.loss_bbox: 0.1353 d4.loss_iou: 0.2380 enc_loss_cls: 0.4051 enc_loss_bbox: 0.1568 enc_loss_iou: 0.2693 dn_loss_cls: 0.1372 dn_loss_bbox: 0.1620 dn_loss_iou: 0.2076 d0.dn_loss_cls: 0.2271 d0.dn_loss_bbox: 0.3083 d0.dn_loss_iou: 0.3449 d1.dn_loss_cls: 0.1739 d1.dn_loss_bbox: 0.1985 d1.dn_loss_iou: 0.2386 d2.dn_loss_cls: 0.1514 d2.dn_loss_bbox: 0.1710 d2.dn_loss_iou: 0.2160 d3.dn_loss_cls: 0.1395 d3.dn_loss_bbox: 0.1638 d3.dn_loss_iou: 0.2099 d4.dn_loss_cls: 0.1383 d4.dn_loss_bbox: 0.1620 d4.dn_loss_iou: 0.2076 d1.loss_lmm_region: 0.2088 loss_lmm_image: 0.9577 2024/11/10 21:02:25 - mmengine - INFO - Iter(train) [ 20500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:45:05 time: 1.9857 data_time: 0.0171 memory: 33860 grad_norm: 29.9323 loss: 9.1746 loss_cls: 0.2854 loss_bbox: 0.1266 loss_iou: 0.2267 d0.loss_cls: 0.3317 d0.loss_bbox: 0.1442 d0.loss_iou: 0.2457 d1.loss_cls: 0.3064 d1.loss_bbox: 0.1297 d1.loss_iou: 0.2323 d2.loss_cls: 0.2934 d2.loss_bbox: 0.1290 d2.loss_iou: 0.2298 d3.loss_cls: 0.2906 d3.loss_bbox: 0.1273 d3.loss_iou: 0.2263 d4.loss_cls: 0.2875 d4.loss_bbox: 0.1257 d4.loss_iou: 0.2257 enc_loss_cls: 0.3370 enc_loss_bbox: 0.1555 enc_loss_iou: 0.2658 dn_loss_cls: 0.1123 dn_loss_bbox: 0.1619 dn_loss_iou: 0.2041 d0.dn_loss_cls: 0.1835 d0.dn_loss_bbox: 0.3081 d0.dn_loss_iou: 0.3403 d1.dn_loss_cls: 0.1395 d1.dn_loss_bbox: 0.1951 d1.dn_loss_iou: 0.2325 d2.dn_loss_cls: 0.1240 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2128 
d3.dn_loss_cls: 0.1170 d3.dn_loss_bbox: 0.1639 d3.dn_loss_iou: 0.2065 d4.dn_loss_cls: 0.1136 d4.dn_loss_bbox: 0.1619 d4.dn_loss_iou: 0.2040 d1.loss_lmm_region: 0.1520 loss_lmm_image: 0.9470 2024/11/10 21:05:46 - mmengine - INFO - Iter(train) [ 20600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:41:55 time: 2.0060 data_time: 0.0171 memory: 33291 grad_norm: 29.7140 loss: 9.1267 loss_cls: 0.2685 loss_bbox: 0.1267 loss_iou: 0.2061 d0.loss_cls: 0.3259 d0.loss_bbox: 0.1317 d0.loss_iou: 0.2163 d1.loss_cls: 0.2884 d1.loss_bbox: 0.1294 d1.loss_iou: 0.2095 d2.loss_cls: 0.2774 d2.loss_bbox: 0.1257 d2.loss_iou: 0.2058 d3.loss_cls: 0.2790 d3.loss_bbox: 0.1211 d3.loss_iou: 0.2031 d4.loss_cls: 0.2711 d4.loss_bbox: 0.1242 d4.loss_iou: 0.2054 enc_loss_cls: 0.3129 enc_loss_bbox: 0.1460 enc_loss_iou: 0.2373 dn_loss_cls: 0.1513 dn_loss_bbox: 0.1636 dn_loss_iou: 0.2123 d0.dn_loss_cls: 0.2221 d0.dn_loss_bbox: 0.3113 d0.dn_loss_iou: 0.3587 d1.dn_loss_cls: 0.1738 d1.dn_loss_bbox: 0.1985 d1.dn_loss_iou: 0.2450 d2.dn_loss_cls: 0.1579 d2.dn_loss_bbox: 0.1750 d2.dn_loss_iou: 0.2216 d3.dn_loss_cls: 0.1544 d3.dn_loss_bbox: 0.1662 d3.dn_loss_iou: 0.2144 d4.dn_loss_cls: 0.1511 d4.dn_loss_bbox: 0.1635 d4.dn_loss_iou: 0.2121 d1.loss_lmm_region: 0.1614 loss_lmm_image: 0.9010 2024/11/10 21:09:05 - mmengine - INFO - Iter(train) [ 20700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:38:31 time: 1.9991 data_time: 0.0170 memory: 34547 grad_norm: 34.1531 loss: 9.3656 loss_cls: 0.2991 loss_bbox: 0.1331 loss_iou: 0.2599 d0.loss_cls: 0.3576 d0.loss_bbox: 0.1347 d0.loss_iou: 0.2645 d1.loss_cls: 0.3181 d1.loss_bbox: 0.1321 d1.loss_iou: 0.2586 d2.loss_cls: 0.3057 d2.loss_bbox: 0.1369 d2.loss_iou: 0.2640 d3.loss_cls: 0.3039 d3.loss_bbox: 0.1325 d3.loss_iou: 0.2584 d4.loss_cls: 0.3043 d4.loss_bbox: 0.1290 d4.loss_iou: 0.2555 enc_loss_cls: 0.3487 enc_loss_bbox: 0.1527 enc_loss_iou: 0.2919 dn_loss_cls: 0.1174 dn_loss_bbox: 0.1387 dn_loss_iou: 0.2116 d0.dn_loss_cls: 0.1823 d0.dn_loss_bbox: 0.2680 d0.dn_loss_iou: 0.3509 d1.dn_loss_cls: 0.1371 d1.dn_loss_bbox: 0.1657 d1.dn_loss_iou: 0.2400 d2.dn_loss_cls: 0.1236 d2.dn_loss_bbox: 0.1470 d2.dn_loss_iou: 0.2201 d3.dn_loss_cls: 0.1190 d3.dn_loss_bbox: 0.1406 d3.dn_loss_iou: 0.2142 d4.dn_loss_cls: 0.1173 d4.dn_loss_bbox: 0.1387 d4.dn_loss_iou: 0.2115 d1.loss_lmm_region: 0.1592 loss_lmm_image: 0.9214 2024/11/10 21:12:23 - mmengine - INFO - Iter(train) [ 20800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:35:00 time: 1.9685 data_time: 0.0173 memory: 34083 grad_norm: 30.4139 loss: 12.4922 loss_cls: 0.4622 loss_bbox: 0.1802 loss_iou: 0.3379 d0.loss_cls: 0.5241 d0.loss_bbox: 0.1943 d0.loss_iou: 0.3589 d1.loss_cls: 0.4786 d1.loss_bbox: 0.1942 d1.loss_iou: 0.3519 d2.loss_cls: 0.4705 d2.loss_bbox: 0.1816 d2.loss_iou: 0.3393 d3.loss_cls: 0.4661 d3.loss_bbox: 0.1811 d3.loss_iou: 0.3393 d4.loss_cls: 0.4644 d4.loss_bbox: 0.1794 d4.loss_iou: 0.3362 enc_loss_cls: 0.5192 enc_loss_bbox: 0.2059 enc_loss_iou: 0.3792 dn_loss_cls: 0.1787 dn_loss_bbox: 0.1831 dn_loss_iou: 0.2531 d0.dn_loss_cls: 0.2614 d0.dn_loss_bbox: 0.3197 d0.dn_loss_iou: 0.3979 d1.dn_loss_cls: 0.2116 d1.dn_loss_bbox: 0.2100 d1.dn_loss_iou: 0.2827 d2.dn_loss_cls: 0.1953 d2.dn_loss_bbox: 0.1923 d2.dn_loss_iou: 0.2619 d3.dn_loss_cls: 0.1864 d3.dn_loss_bbox: 0.1847 d3.dn_loss_iou: 0.2557 d4.dn_loss_cls: 0.1806 d4.dn_loss_bbox: 0.1831 d4.dn_loss_iou: 0.2530 d1.loss_lmm_region: 0.2459 loss_lmm_image: 0.9104 2024/11/10 21:15:43 - mmengine - INFO - Iter(train) [ 20900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 
days, 23:31:48 time: 1.9909 data_time: 0.0171 memory: 33188 grad_norm: 28.1238 loss: 9.6149 loss_cls: 0.3041 loss_bbox: 0.1440 loss_iou: 0.2366 d0.loss_cls: 0.3695 d0.loss_bbox: 0.1488 d0.loss_iou: 0.2501 d1.loss_cls: 0.3307 d1.loss_bbox: 0.1419 d1.loss_iou: 0.2361 d2.loss_cls: 0.3140 d2.loss_bbox: 0.1429 d2.loss_iou: 0.2364 d3.loss_cls: 0.3083 d3.loss_bbox: 0.1429 d3.loss_iou: 0.2354 d4.loss_cls: 0.3058 d4.loss_bbox: 0.1430 d4.loss_iou: 0.2355 enc_loss_cls: 0.3693 enc_loss_bbox: 0.1628 enc_loss_iou: 0.2729 dn_loss_cls: 0.1096 dn_loss_bbox: 0.1707 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1862 d0.dn_loss_bbox: 0.3543 d0.dn_loss_iou: 0.3522 d1.dn_loss_cls: 0.1389 d1.dn_loss_bbox: 0.2129 d1.dn_loss_iou: 0.2372 d2.dn_loss_cls: 0.1214 d2.dn_loss_bbox: 0.1844 d2.dn_loss_iou: 0.2142 d3.dn_loss_cls: 0.1152 d3.dn_loss_bbox: 0.1737 d3.dn_loss_iou: 0.2063 d4.dn_loss_cls: 0.1106 d4.dn_loss_bbox: 0.1708 d4.dn_loss_iou: 0.2036 d1.loss_lmm_region: 0.1780 loss_lmm_image: 0.9403 2024/11/10 21:19:01 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 21:19:01 - mmengine - INFO - Iter(train) [ 21000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:28:21 time: 1.9857 data_time: 0.0170 memory: 33905 grad_norm: 28.6604 loss: 10.6374 loss_cls: 0.3669 loss_bbox: 0.1429 loss_iou: 0.2664 d0.loss_cls: 0.4351 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2784 d1.loss_cls: 0.3941 d1.loss_bbox: 0.1450 d1.loss_iou: 0.2686 d2.loss_cls: 0.3817 d2.loss_bbox: 0.1449 d2.loss_iou: 0.2674 d3.loss_cls: 0.3705 d3.loss_bbox: 0.1424 d3.loss_iou: 0.2662 d4.loss_cls: 0.3678 d4.loss_bbox: 0.1426 d4.loss_iou: 0.2649 enc_loss_cls: 0.4188 enc_loss_bbox: 0.1716 enc_loss_iou: 0.3076 dn_loss_cls: 0.1381 dn_loss_bbox: 0.1797 dn_loss_iou: 0.2419 d0.dn_loss_cls: 0.2236 d0.dn_loss_bbox: 0.3319 d0.dn_loss_iou: 0.3959 d1.dn_loss_cls: 0.1728 d1.dn_loss_bbox: 0.2188 d1.dn_loss_iou: 0.2773 d2.dn_loss_cls: 0.1484 d2.dn_loss_bbox: 0.1912 d2.dn_loss_iou: 0.2527 d3.dn_loss_cls: 0.1418 d3.dn_loss_bbox: 0.1807 d3.dn_loss_iou: 0.2441 d4.dn_loss_cls: 0.1392 d4.dn_loss_bbox: 0.1796 d4.dn_loss_iou: 0.2419 d1.loss_lmm_region: 0.1589 loss_lmm_image: 0.8802 2024/11/10 21:22:21 - mmengine - INFO - Iter(train) [ 21100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:25:02 time: 1.9916 data_time: 0.0170 memory: 33805 grad_norm: 25.0257 loss: 11.1578 loss_cls: 0.3333 loss_bbox: 0.1814 loss_iou: 0.3221 d0.loss_cls: 0.3877 d0.loss_bbox: 0.1935 d0.loss_iou: 0.3354 d1.loss_cls: 0.3569 d1.loss_bbox: 0.1854 d1.loss_iou: 0.3256 d2.loss_cls: 0.3428 d2.loss_bbox: 0.1826 d2.loss_iou: 0.3254 d3.loss_cls: 0.3363 d3.loss_bbox: 0.1856 d3.loss_iou: 0.3222 d4.loss_cls: 0.3348 d4.loss_bbox: 0.1817 d4.loss_iou: 0.3213 enc_loss_cls: 0.3848 enc_loss_bbox: 0.2057 enc_loss_iou: 0.3542 dn_loss_cls: 0.1198 dn_loss_bbox: 0.1950 dn_loss_iou: 0.2562 d0.dn_loss_cls: 0.2091 d0.dn_loss_bbox: 0.3570 d0.dn_loss_iou: 0.4206 d1.dn_loss_cls: 0.1533 d1.dn_loss_bbox: 0.2356 d1.dn_loss_iou: 0.2946 d2.dn_loss_cls: 0.1324 d2.dn_loss_bbox: 0.2058 d2.dn_loss_iou: 0.2668 d3.dn_loss_cls: 0.1235 d3.dn_loss_bbox: 0.1981 d3.dn_loss_iou: 0.2592 d4.dn_loss_cls: 0.1201 d4.dn_loss_bbox: 0.1950 d4.dn_loss_iou: 0.2561 d1.loss_lmm_region: 0.1664 loss_lmm_image: 0.8947 2024/11/10 21:25:42 - mmengine - INFO - Iter(train) [ 21200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:21:53 time: 1.9998 data_time: 0.0173 memory: 34451 grad_norm: 34.2675 loss: 11.1857 loss_cls: 0.4172 loss_bbox: 0.1501 loss_iou: 0.2712 d0.loss_cls: 0.4683 d0.loss_bbox: 0.1606 d0.loss_iou: 0.2878 
d1.loss_cls: 0.4348 d1.loss_bbox: 0.1537 d1.loss_iou: 0.2798 d2.loss_cls: 0.4207 d2.loss_bbox: 0.1520 d2.loss_iou: 0.2758 d3.loss_cls: 0.4241 d3.loss_bbox: 0.1480 d3.loss_iou: 0.2696 d4.loss_cls: 0.4125 d4.loss_bbox: 0.1526 d4.loss_iou: 0.2745 enc_loss_cls: 0.4575 enc_loss_bbox: 0.1725 enc_loss_iou: 0.3129 dn_loss_cls: 0.1525 dn_loss_bbox: 0.1798 dn_loss_iou: 0.2343 d0.dn_loss_cls: 0.2456 d0.dn_loss_bbox: 0.3236 d0.dn_loss_iou: 0.3903 d1.dn_loss_cls: 0.1903 d1.dn_loss_bbox: 0.2070 d1.dn_loss_iou: 0.2660 d2.dn_loss_cls: 0.1665 d2.dn_loss_bbox: 0.1869 d2.dn_loss_iou: 0.2416 d3.dn_loss_cls: 0.1595 d3.dn_loss_bbox: 0.1814 d3.dn_loss_iou: 0.2363 d4.dn_loss_cls: 0.1536 d4.dn_loss_bbox: 0.1799 d4.dn_loss_iou: 0.2342 d1.loss_lmm_region: 0.2058 loss_lmm_image: 0.9544 2024/11/10 21:29:02 - mmengine - INFO - Iter(train) [ 21300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:18:35 time: 1.9845 data_time: 0.0172 memory: 34985 grad_norm: 25.0174 loss: 9.0966 loss_cls: 0.2648 loss_bbox: 0.1216 loss_iou: 0.2102 d0.loss_cls: 0.3121 d0.loss_bbox: 0.1313 d0.loss_iou: 0.2158 d1.loss_cls: 0.2876 d1.loss_bbox: 0.1222 d1.loss_iou: 0.2103 d2.loss_cls: 0.2770 d2.loss_bbox: 0.1231 d2.loss_iou: 0.2115 d3.loss_cls: 0.2676 d3.loss_bbox: 0.1233 d3.loss_iou: 0.2099 d4.loss_cls: 0.2674 d4.loss_bbox: 0.1206 d4.loss_iou: 0.2100 enc_loss_cls: 0.3102 enc_loss_bbox: 0.1430 enc_loss_iou: 0.2371 dn_loss_cls: 0.1224 dn_loss_bbox: 0.1781 dn_loss_iou: 0.2166 d0.dn_loss_cls: 0.2002 d0.dn_loss_bbox: 0.3119 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1523 d1.dn_loss_bbox: 0.2108 d1.dn_loss_iou: 0.2457 d2.dn_loss_cls: 0.1362 d2.dn_loss_bbox: 0.1871 d2.dn_loss_iou: 0.2249 d3.dn_loss_cls: 0.1277 d3.dn_loss_bbox: 0.1803 d3.dn_loss_iou: 0.2188 d4.dn_loss_cls: 0.1255 d4.dn_loss_bbox: 0.1781 d4.dn_loss_iou: 0.2165 d1.loss_lmm_region: 0.1814 loss_lmm_image: 0.9512 2024/11/10 21:32:20 - mmengine - INFO - Iter(train) [ 21400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:15:10 time: 1.9680 data_time: 0.0170 memory: 32756 grad_norm: 28.5268 loss: 10.4662 loss_cls: 0.3244 loss_bbox: 0.1458 loss_iou: 0.2714 d0.loss_cls: 0.3809 d0.loss_bbox: 0.1489 d0.loss_iou: 0.2777 d1.loss_cls: 0.3541 d1.loss_bbox: 0.1475 d1.loss_iou: 0.2756 d2.loss_cls: 0.3392 d2.loss_bbox: 0.1439 d2.loss_iou: 0.2715 d3.loss_cls: 0.3268 d3.loss_bbox: 0.1436 d3.loss_iou: 0.2698 d4.loss_cls: 0.3239 d4.loss_bbox: 0.1456 d4.loss_iou: 0.2703 enc_loss_cls: 0.3756 enc_loss_bbox: 0.1625 enc_loss_iou: 0.3016 dn_loss_cls: 0.1434 dn_loss_bbox: 0.1914 dn_loss_iou: 0.2361 d0.dn_loss_cls: 0.2237 d0.dn_loss_bbox: 0.3538 d0.dn_loss_iou: 0.3929 d1.dn_loss_cls: 0.1814 d1.dn_loss_bbox: 0.2247 d1.dn_loss_iou: 0.2668 d2.dn_loss_cls: 0.1625 d2.dn_loss_bbox: 0.1981 d2.dn_loss_iou: 0.2434 d3.dn_loss_cls: 0.1503 d3.dn_loss_bbox: 0.1924 d3.dn_loss_iou: 0.2380 d4.dn_loss_cls: 0.1453 d4.dn_loss_bbox: 0.1914 d4.dn_loss_iou: 0.2361 d1.loss_lmm_region: 0.1895 loss_lmm_image: 0.9043 2024/11/10 21:35:41 - mmengine - INFO - Iter(train) [ 21500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:11:54 time: 1.9821 data_time: 0.0171 memory: 33111 grad_norm: 30.1317 loss: 10.3271 loss_cls: 0.3230 loss_bbox: 0.1507 loss_iou: 0.2635 d0.loss_cls: 0.3944 d0.loss_bbox: 0.1568 d0.loss_iou: 0.2725 d1.loss_cls: 0.3475 d1.loss_bbox: 0.1557 d1.loss_iou: 0.2659 d2.loss_cls: 0.3373 d2.loss_bbox: 0.1510 d2.loss_iou: 0.2623 d3.loss_cls: 0.3269 d3.loss_bbox: 0.1517 d3.loss_iou: 0.2629 d4.loss_cls: 0.3229 d4.loss_bbox: 0.1522 d4.loss_iou: 0.2642 enc_loss_cls: 0.3825 enc_loss_bbox: 0.1780 enc_loss_iou: 
0.3005 dn_loss_cls: 0.1366 dn_loss_bbox: 0.1708 dn_loss_iou: 0.2288 d0.dn_loss_cls: 0.2272 d0.dn_loss_bbox: 0.3323 d0.dn_loss_iou: 0.3830 d1.dn_loss_cls: 0.1717 d1.dn_loss_bbox: 0.2027 d1.dn_loss_iou: 0.2603 d2.dn_loss_cls: 0.1512 d2.dn_loss_bbox: 0.1798 d2.dn_loss_iou: 0.2383 d3.dn_loss_cls: 0.1414 d3.dn_loss_bbox: 0.1722 d3.dn_loss_iou: 0.2305 d4.dn_loss_cls: 0.1374 d4.dn_loss_bbox: 0.1708 d4.dn_loss_iou: 0.2288 d1.loss_lmm_region: 0.1959 loss_lmm_image: 0.9452 2024/11/10 21:38:59 - mmengine - INFO - Iter(train) [ 21600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:08:31 time: 1.9955 data_time: 0.0172 memory: 35327 grad_norm: 26.2224 loss: 9.7203 loss_cls: 0.2897 loss_bbox: 0.1366 loss_iou: 0.2305 d0.loss_cls: 0.3321 d0.loss_bbox: 0.1505 d0.loss_iou: 0.2504 d1.loss_cls: 0.3040 d1.loss_bbox: 0.1396 d1.loss_iou: 0.2388 d2.loss_cls: 0.3014 d2.loss_bbox: 0.1353 d2.loss_iou: 0.2349 d3.loss_cls: 0.2984 d3.loss_bbox: 0.1314 d3.loss_iou: 0.2310 d4.loss_cls: 0.2913 d4.loss_bbox: 0.1354 d4.loss_iou: 0.2339 enc_loss_cls: 0.3279 enc_loss_bbox: 0.1608 enc_loss_iou: 0.2728 dn_loss_cls: 0.1290 dn_loss_bbox: 0.1976 dn_loss_iou: 0.2232 d0.dn_loss_cls: 0.2052 d0.dn_loss_bbox: 0.3568 d0.dn_loss_iou: 0.3764 d1.dn_loss_cls: 0.1594 d1.dn_loss_bbox: 0.2305 d1.dn_loss_iou: 0.2576 d2.dn_loss_cls: 0.1394 d2.dn_loss_bbox: 0.2088 d2.dn_loss_iou: 0.2340 d3.dn_loss_cls: 0.1358 d3.dn_loss_bbox: 0.2006 d3.dn_loss_iou: 0.2270 d4.dn_loss_cls: 0.1298 d4.dn_loss_bbox: 0.1976 d4.dn_loss_iou: 0.2234 d1.loss_lmm_region: 0.1516 loss_lmm_image: 0.9101 2024/11/10 21:42:18 - mmengine - INFO - Iter(train) [ 21700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:05:09 time: 1.9805 data_time: 0.0170 memory: 34533 grad_norm: 28.5603 loss: 10.9823 loss_cls: 0.3541 loss_bbox: 0.1572 loss_iou: 0.2924 d0.loss_cls: 0.4129 d0.loss_bbox: 0.1804 d0.loss_iou: 0.3032 d1.loss_cls: 0.3792 d1.loss_bbox: 0.1608 d1.loss_iou: 0.2967 d2.loss_cls: 0.3599 d2.loss_bbox: 0.1598 d2.loss_iou: 0.2937 d3.loss_cls: 0.3563 d3.loss_bbox: 0.1578 d3.loss_iou: 0.2941 d4.loss_cls: 0.3570 d4.loss_bbox: 0.1551 d4.loss_iou: 0.2908 enc_loss_cls: 0.4116 enc_loss_bbox: 0.1906 enc_loss_iou: 0.3304 dn_loss_cls: 0.1351 dn_loss_bbox: 0.1920 dn_loss_iou: 0.2358 d0.dn_loss_cls: 0.2241 d0.dn_loss_bbox: 0.3674 d0.dn_loss_iou: 0.4035 d1.dn_loss_cls: 0.1696 d1.dn_loss_bbox: 0.2282 d1.dn_loss_iou: 0.2745 d2.dn_loss_cls: 0.1467 d2.dn_loss_bbox: 0.2015 d2.dn_loss_iou: 0.2474 d3.dn_loss_cls: 0.1393 d3.dn_loss_bbox: 0.1945 d3.dn_loss_iou: 0.2392 d4.dn_loss_cls: 0.1366 d4.dn_loss_bbox: 0.1919 d4.dn_loss_iou: 0.2358 d1.loss_lmm_region: 0.1805 loss_lmm_image: 0.9447 2024/11/10 21:45:38 - mmengine - INFO - Iter(train) [ 21800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 23:01:51 time: 2.0218 data_time: 0.0171 memory: 34961 grad_norm: 27.1246 loss: 11.1955 loss_cls: 0.4121 loss_bbox: 0.1535 loss_iou: 0.2848 d0.loss_cls: 0.4612 d0.loss_bbox: 0.1535 d0.loss_iou: 0.2963 d1.loss_cls: 0.4218 d1.loss_bbox: 0.1650 d1.loss_iou: 0.2989 d2.loss_cls: 0.4100 d2.loss_bbox: 0.1614 d2.loss_iou: 0.2897 d3.loss_cls: 0.4074 d3.loss_bbox: 0.1611 d3.loss_iou: 0.2905 d4.loss_cls: 0.4090 d4.loss_bbox: 0.1583 d4.loss_iou: 0.2858 enc_loss_cls: 0.4534 enc_loss_bbox: 0.1733 enc_loss_iou: 0.3166 dn_loss_cls: 0.1279 dn_loss_bbox: 0.1922 dn_loss_iou: 0.2437 d0.dn_loss_cls: 0.2127 d0.dn_loss_bbox: 0.3597 d0.dn_loss_iou: 0.4032 d1.dn_loss_cls: 0.1574 d1.dn_loss_bbox: 0.2294 d1.dn_loss_iou: 0.2794 d2.dn_loss_cls: 0.1383 d2.dn_loss_bbox: 0.2015 d2.dn_loss_iou: 0.2541 d3.dn_loss_cls: 0.1312 
d3.dn_loss_bbox: 0.1953 d3.dn_loss_iou: 0.2467 d4.dn_loss_cls: 0.1275 d4.dn_loss_bbox: 0.1922 d4.dn_loss_iou: 0.2437 d1.loss_lmm_region: 0.1707 loss_lmm_image: 0.9251 2024/11/10 21:48:55 - mmengine - INFO - Iter(train) [ 21900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:58:19 time: 1.9919 data_time: 0.0171 memory: 34547 grad_norm: 28.8943 loss: 10.6293 loss_cls: 0.3451 loss_bbox: 0.1747 loss_iou: 0.2772 d0.loss_cls: 0.4061 d0.loss_bbox: 0.1829 d0.loss_iou: 0.2942 d1.loss_cls: 0.3656 d1.loss_bbox: 0.1812 d1.loss_iou: 0.2875 d2.loss_cls: 0.3578 d2.loss_bbox: 0.1737 d2.loss_iou: 0.2779 d3.loss_cls: 0.3471 d3.loss_bbox: 0.1782 d3.loss_iou: 0.2790 d4.loss_cls: 0.3407 d4.loss_bbox: 0.1789 d4.loss_iou: 0.2786 enc_loss_cls: 0.3986 enc_loss_bbox: 0.1959 enc_loss_iou: 0.3125 dn_loss_cls: 0.1281 dn_loss_bbox: 0.1884 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.2108 d0.dn_loss_bbox: 0.3427 d0.dn_loss_iou: 0.3488 d1.dn_loss_cls: 0.1635 d1.dn_loss_bbox: 0.2208 d1.dn_loss_iou: 0.2417 d2.dn_loss_cls: 0.1435 d2.dn_loss_bbox: 0.1986 d2.dn_loss_iou: 0.2232 d3.dn_loss_cls: 0.1328 d3.dn_loss_bbox: 0.1905 d3.dn_loss_iou: 0.2169 d4.dn_loss_cls: 0.1288 d4.dn_loss_bbox: 0.1885 d4.dn_loss_iou: 0.2143 d1.loss_lmm_region: 0.1852 loss_lmm_image: 0.9141 2024/11/10 21:52:16 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 21:52:16 - mmengine - INFO - Iter(train) [ 22000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:55:03 time: 2.0111 data_time: 0.0171 memory: 34000 grad_norm: 32.0029 loss: 11.2932 loss_cls: 0.3797 loss_bbox: 0.1663 loss_iou: 0.2775 d0.loss_cls: 0.4434 d0.loss_bbox: 0.1747 d0.loss_iou: 0.2966 d1.loss_cls: 0.4034 d1.loss_bbox: 0.1700 d1.loss_iou: 0.2875 d2.loss_cls: 0.3945 d2.loss_bbox: 0.1646 d2.loss_iou: 0.2794 d3.loss_cls: 0.3896 d3.loss_bbox: 0.1620 d3.loss_iou: 0.2753 d4.loss_cls: 0.3798 d4.loss_bbox: 0.1676 d4.loss_iou: 0.2801 enc_loss_cls: 0.4370 enc_loss_bbox: 0.1895 enc_loss_iou: 0.3135 dn_loss_cls: 0.1485 dn_loss_bbox: 0.1979 dn_loss_iou: 0.2397 d0.dn_loss_cls: 0.2378 d0.dn_loss_bbox: 0.3779 d0.dn_loss_iou: 0.3994 d1.dn_loss_cls: 0.1843 d1.dn_loss_bbox: 0.2392 d1.dn_loss_iou: 0.2768 d2.dn_loss_cls: 0.1649 d2.dn_loss_bbox: 0.2133 d2.dn_loss_iou: 0.2526 d3.dn_loss_cls: 0.1535 d3.dn_loss_bbox: 0.2024 d3.dn_loss_iou: 0.2434 d4.dn_loss_cls: 0.1477 d4.dn_loss_bbox: 0.1980 d4.dn_loss_iou: 0.2396 d1.loss_lmm_region: 0.1989 loss_lmm_image: 0.9453 2024/11/10 21:55:34 - mmengine - INFO - Iter(train) [ 22100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:51:39 time: 1.9880 data_time: 0.0171 memory: 34402 grad_norm: 30.5269 loss: 9.9606 loss_cls: 0.3222 loss_bbox: 0.1382 loss_iou: 0.2135 d0.loss_cls: 0.3718 d0.loss_bbox: 0.1451 d0.loss_iou: 0.2186 d1.loss_cls: 0.3419 d1.loss_bbox: 0.1400 d1.loss_iou: 0.2144 d2.loss_cls: 0.3326 d2.loss_bbox: 0.1374 d2.loss_iou: 0.2113 d3.loss_cls: 0.3324 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2088 d4.loss_cls: 0.3234 d4.loss_bbox: 0.1370 d4.loss_iou: 0.2143 enc_loss_cls: 0.3666 enc_loss_bbox: 0.1603 enc_loss_iou: 0.2395 dn_loss_cls: 0.1650 dn_loss_bbox: 0.1869 dn_loss_iou: 0.2201 d0.dn_loss_cls: 0.2473 d0.dn_loss_bbox: 0.3346 d0.dn_loss_iou: 0.3685 d1.dn_loss_cls: 0.1946 d1.dn_loss_bbox: 0.2109 d1.dn_loss_iou: 0.2508 d2.dn_loss_cls: 0.1747 d2.dn_loss_bbox: 0.1909 d2.dn_loss_iou: 0.2291 d3.dn_loss_cls: 0.1690 d3.dn_loss_bbox: 0.1859 d3.dn_loss_iou: 0.2222 d4.dn_loss_cls: 0.1640 d4.dn_loss_bbox: 0.1868 d4.dn_loss_iou: 0.2199 d1.loss_lmm_region: 0.2073 loss_lmm_image: 0.9275 2024/11/10 21:58:52 - mmengine - INFO - 
Iter(train) [ 22200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:48:12 time: 1.9834 data_time: 0.0171 memory: 34740 grad_norm: 26.3414 loss: 11.0102 loss_cls: 0.3396 loss_bbox: 0.1686 loss_iou: 0.2927 d0.loss_cls: 0.3923 d0.loss_bbox: 0.1763 d0.loss_iou: 0.2993 d1.loss_cls: 0.3620 d1.loss_bbox: 0.1731 d1.loss_iou: 0.2941 d2.loss_cls: 0.3479 d2.loss_bbox: 0.1721 d2.loss_iou: 0.2947 d3.loss_cls: 0.3388 d3.loss_bbox: 0.1730 d3.loss_iou: 0.2948 d4.loss_cls: 0.3406 d4.loss_bbox: 0.1698 d4.loss_iou: 0.2913 enc_loss_cls: 0.3812 enc_loss_bbox: 0.1977 enc_loss_iou: 0.3259 dn_loss_cls: 0.1399 dn_loss_bbox: 0.1968 dn_loss_iou: 0.2492 d0.dn_loss_cls: 0.2158 d0.dn_loss_bbox: 0.3638 d0.dn_loss_iou: 0.4024 d1.dn_loss_cls: 0.1663 d1.dn_loss_bbox: 0.2351 d1.dn_loss_iou: 0.2832 d2.dn_loss_cls: 0.1543 d2.dn_loss_bbox: 0.2113 d2.dn_loss_iou: 0.2600 d3.dn_loss_cls: 0.1454 d3.dn_loss_bbox: 0.1993 d3.dn_loss_iou: 0.2519 d4.dn_loss_cls: 0.1413 d4.dn_loss_bbox: 0.1968 d4.dn_loss_iou: 0.2493 d1.loss_lmm_region: 0.1736 loss_lmm_image: 0.9488 2024/11/10 22:02:12 - mmengine - INFO - Iter(train) [ 22300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:44:55 time: 1.9990 data_time: 0.0170 memory: 33127 grad_norm: 28.4767 loss: 9.8076 loss_cls: 0.3055 loss_bbox: 0.1333 loss_iou: 0.2380 d0.loss_cls: 0.3525 d0.loss_bbox: 0.1609 d0.loss_iou: 0.2639 d1.loss_cls: 0.3263 d1.loss_bbox: 0.1469 d1.loss_iou: 0.2530 d2.loss_cls: 0.3147 d2.loss_bbox: 0.1349 d2.loss_iou: 0.2415 d3.loss_cls: 0.3109 d3.loss_bbox: 0.1328 d3.loss_iou: 0.2377 d4.loss_cls: 0.3073 d4.loss_bbox: 0.1326 d4.loss_iou: 0.2377 enc_loss_cls: 0.3332 enc_loss_bbox: 0.1790 enc_loss_iou: 0.2911 dn_loss_cls: 0.1354 dn_loss_bbox: 0.1713 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.2221 d0.dn_loss_bbox: 0.3310 d0.dn_loss_iou: 0.3637 d1.dn_loss_cls: 0.1739 d1.dn_loss_bbox: 0.2023 d1.dn_loss_iou: 0.2469 d2.dn_loss_cls: 0.1521 d2.dn_loss_bbox: 0.1796 d2.dn_loss_iou: 0.2237 d3.dn_loss_cls: 0.1436 d3.dn_loss_bbox: 0.1736 d3.dn_loss_iou: 0.2166 d4.dn_loss_cls: 0.1361 d4.dn_loss_bbox: 0.1715 d4.dn_loss_iou: 0.2143 d1.loss_lmm_region: 0.1715 loss_lmm_image: 0.9304 2024/11/10 22:05:31 - mmengine - INFO - Iter(train) [ 22400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:41:32 time: 1.9766 data_time: 0.0169 memory: 32370 grad_norm: 26.7253 loss: 9.6107 loss_cls: 0.2610 loss_bbox: 0.1352 loss_iou: 0.2167 d0.loss_cls: 0.3132 d0.loss_bbox: 0.1398 d0.loss_iou: 0.2215 d1.loss_cls: 0.2808 d1.loss_bbox: 0.1356 d1.loss_iou: 0.2165 d2.loss_cls: 0.2721 d2.loss_bbox: 0.1318 d2.loss_iou: 0.2152 d3.loss_cls: 0.2672 d3.loss_bbox: 0.1317 d3.loss_iou: 0.2142 d4.loss_cls: 0.2640 d4.loss_bbox: 0.1335 d4.loss_iou: 0.2163 enc_loss_cls: 0.3141 enc_loss_bbox: 0.1508 enc_loss_iou: 0.2425 dn_loss_cls: 0.1518 dn_loss_bbox: 0.2051 dn_loss_iou: 0.2224 d0.dn_loss_cls: 0.2370 d0.dn_loss_bbox: 0.3731 d0.dn_loss_iou: 0.3714 d1.dn_loss_cls: 0.1843 d1.dn_loss_bbox: 0.2371 d1.dn_loss_iou: 0.2529 d2.dn_loss_cls: 0.1659 d2.dn_loss_bbox: 0.2126 d2.dn_loss_iou: 0.2309 d3.dn_loss_cls: 0.1558 d3.dn_loss_bbox: 0.2070 d3.dn_loss_iou: 0.2248 d4.dn_loss_cls: 0.1529 d4.dn_loss_bbox: 0.2050 d4.dn_loss_iou: 0.2222 d1.loss_lmm_region: 0.1775 loss_lmm_image: 0.9474 2024/11/10 22:08:50 - mmengine - INFO - Iter(train) [ 22500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:38:12 time: 1.9685 data_time: 0.0169 memory: 33564 grad_norm: 30.7047 loss: 8.5911 loss_cls: 0.2396 loss_bbox: 0.1186 loss_iou: 0.1906 d0.loss_cls: 0.2788 d0.loss_bbox: 0.1264 d0.loss_iou: 0.2020 d1.loss_cls: 0.2544 
d1.loss_bbox: 0.1206 d1.loss_iou: 0.1966 d2.loss_cls: 0.2482 d2.loss_bbox: 0.1188 d2.loss_iou: 0.1925 d3.loss_cls: 0.2445 d3.loss_bbox: 0.1168 d3.loss_iou: 0.1897 d4.loss_cls: 0.2378 d4.loss_bbox: 0.1194 d4.loss_iou: 0.1932 enc_loss_cls: 0.2772 enc_loss_bbox: 0.1418 enc_loss_iou: 0.2203 dn_loss_cls: 0.1061 dn_loss_bbox: 0.1809 dn_loss_iou: 0.2029 d0.dn_loss_cls: 0.1805 d0.dn_loss_bbox: 0.3593 d0.dn_loss_iou: 0.3598 d1.dn_loss_cls: 0.1386 d1.dn_loss_bbox: 0.2249 d1.dn_loss_iou: 0.2425 d2.dn_loss_cls: 0.1173 d2.dn_loss_bbox: 0.1957 d2.dn_loss_iou: 0.2156 d3.dn_loss_cls: 0.1085 d3.dn_loss_bbox: 0.1837 d3.dn_loss_iou: 0.2062 d4.dn_loss_cls: 0.1061 d4.dn_loss_bbox: 0.1809 d4.dn_loss_iou: 0.2028 d1.loss_lmm_region: 0.1385 loss_lmm_image: 0.9127 2024/11/10 22:12:08 - mmengine - INFO - Iter(train) [ 22600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:34:42 time: 1.9900 data_time: 0.0170 memory: 34130 grad_norm: 27.6069 loss: 9.0877 loss_cls: 0.2783 loss_bbox: 0.1200 loss_iou: 0.2175 d0.loss_cls: 0.3134 d0.loss_bbox: 0.1412 d0.loss_iou: 0.2362 d1.loss_cls: 0.2913 d1.loss_bbox: 0.1296 d1.loss_iou: 0.2290 d2.loss_cls: 0.2854 d2.loss_bbox: 0.1207 d2.loss_iou: 0.2203 d3.loss_cls: 0.2782 d3.loss_bbox: 0.1220 d3.loss_iou: 0.2198 d4.loss_cls: 0.2799 d4.loss_bbox: 0.1207 d4.loss_iou: 0.2188 enc_loss_cls: 0.3169 enc_loss_bbox: 0.1494 enc_loss_iou: 0.2588 dn_loss_cls: 0.1130 dn_loss_bbox: 0.1728 dn_loss_iou: 0.2073 d0.dn_loss_cls: 0.1901 d0.dn_loss_bbox: 0.3368 d0.dn_loss_iou: 0.3630 d1.dn_loss_cls: 0.1385 d1.dn_loss_bbox: 0.2115 d1.dn_loss_iou: 0.2434 d2.dn_loss_cls: 0.1207 d2.dn_loss_bbox: 0.1839 d2.dn_loss_iou: 0.2192 d3.dn_loss_cls: 0.1156 d3.dn_loss_bbox: 0.1763 d3.dn_loss_iou: 0.2112 d4.dn_loss_cls: 0.1119 d4.dn_loss_bbox: 0.1729 d4.dn_loss_iou: 0.2072 d1.loss_lmm_region: 0.1679 loss_lmm_image: 0.8772 2024/11/10 22:15:27 - mmengine - INFO - Iter(train) [ 22700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:31:21 time: 2.0072 data_time: 0.0170 memory: 32945 grad_norm: 28.0967 loss: 10.2567 loss_cls: 0.3502 loss_bbox: 0.1420 loss_iou: 0.2664 d0.loss_cls: 0.4045 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2780 d1.loss_cls: 0.3708 d1.loss_bbox: 0.1506 d1.loss_iou: 0.2738 d2.loss_cls: 0.3576 d2.loss_bbox: 0.1492 d2.loss_iou: 0.2709 d3.loss_cls: 0.3544 d3.loss_bbox: 0.1425 d3.loss_iou: 0.2671 d4.loss_cls: 0.3499 d4.loss_bbox: 0.1423 d4.loss_iou: 0.2660 enc_loss_cls: 0.3981 enc_loss_bbox: 0.1704 enc_loss_iou: 0.3009 dn_loss_cls: 0.1429 dn_loss_bbox: 0.1683 dn_loss_iou: 0.2068 d0.dn_loss_cls: 0.2091 d0.dn_loss_bbox: 0.3196 d0.dn_loss_iou: 0.3459 d1.dn_loss_cls: 0.1671 d1.dn_loss_bbox: 0.1985 d1.dn_loss_iou: 0.2384 d2.dn_loss_cls: 0.1514 d2.dn_loss_bbox: 0.1766 d2.dn_loss_iou: 0.2156 d3.dn_loss_cls: 0.1455 d3.dn_loss_bbox: 0.1697 d3.dn_loss_iou: 0.2091 d4.dn_loss_cls: 0.1433 d4.dn_loss_bbox: 0.1683 d4.dn_loss_iou: 0.2068 d1.loss_lmm_region: 0.1789 loss_lmm_image: 0.9381 2024/11/10 22:18:45 - mmengine - INFO - Iter(train) [ 22800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:27:53 time: 1.9637 data_time: 0.0170 memory: 33940 grad_norm: 29.1212 loss: 10.1041 loss_cls: 0.3158 loss_bbox: 0.1358 loss_iou: 0.2505 d0.loss_cls: 0.3832 d0.loss_bbox: 0.1392 d0.loss_iou: 0.2581 d1.loss_cls: 0.3413 d1.loss_bbox: 0.1398 d1.loss_iou: 0.2572 d2.loss_cls: 0.3314 d2.loss_bbox: 0.1339 d2.loss_iou: 0.2464 d3.loss_cls: 0.3247 d3.loss_bbox: 0.1342 d3.loss_iou: 0.2474 d4.loss_cls: 0.3171 d4.loss_bbox: 0.1367 d4.loss_iou: 0.2483 enc_loss_cls: 0.3683 enc_loss_bbox: 0.1639 enc_loss_iou: 0.2890 dn_loss_cls: 
0.1817 dn_loss_bbox: 0.1652 dn_loss_iou: 0.2097 d0.dn_loss_cls: 0.2462 d0.dn_loss_bbox: 0.3232 d0.dn_loss_iou: 0.3564 d1.dn_loss_cls: 0.2017 d1.dn_loss_bbox: 0.1979 d1.dn_loss_iou: 0.2412 d2.dn_loss_cls: 0.1861 d2.dn_loss_bbox: 0.1754 d2.dn_loss_iou: 0.2201 d3.dn_loss_cls: 0.1822 d3.dn_loss_bbox: 0.1678 d3.dn_loss_iou: 0.2124 d4.dn_loss_cls: 0.1804 d4.dn_loss_bbox: 0.1653 d4.dn_loss_iou: 0.2096 d1.loss_lmm_region: 0.1843 loss_lmm_image: 0.9351 2024/11/10 22:22:04 - mmengine - INFO - Iter(train) [ 22900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:24:33 time: 1.9847 data_time: 0.0181 memory: 33941 grad_norm: 32.5818 loss: 9.6499 loss_cls: 0.2781 loss_bbox: 0.1358 loss_iou: 0.2336 d0.loss_cls: 0.3271 d0.loss_bbox: 0.1490 d0.loss_iou: 0.2465 d1.loss_cls: 0.2989 d1.loss_bbox: 0.1421 d1.loss_iou: 0.2394 d2.loss_cls: 0.2907 d2.loss_bbox: 0.1365 d2.loss_iou: 0.2333 d3.loss_cls: 0.2821 d3.loss_bbox: 0.1367 d3.loss_iou: 0.2334 d4.loss_cls: 0.2792 d4.loss_bbox: 0.1359 d4.loss_iou: 0.2327 enc_loss_cls: 0.3245 enc_loss_bbox: 0.1655 enc_loss_iou: 0.2709 dn_loss_cls: 0.1444 dn_loss_bbox: 0.1800 dn_loss_iou: 0.2249 d0.dn_loss_cls: 0.2184 d0.dn_loss_bbox: 0.3493 d0.dn_loss_iou: 0.3822 d1.dn_loss_cls: 0.1689 d1.dn_loss_bbox: 0.2144 d1.dn_loss_iou: 0.2572 d2.dn_loss_cls: 0.1537 d2.dn_loss_bbox: 0.1885 d2.dn_loss_iou: 0.2344 d3.dn_loss_cls: 0.1475 d3.dn_loss_bbox: 0.1811 d3.dn_loss_iou: 0.2272 d4.dn_loss_cls: 0.1450 d4.dn_loss_bbox: 0.1800 d4.dn_loss_iou: 0.2248 d1.loss_lmm_region: 0.1601 loss_lmm_image: 0.8961 2024/11/10 22:25:23 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 22:25:23 - mmengine - INFO - Iter(train) [ 23000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:21:10 time: 1.9881 data_time: 0.0170 memory: 35088 grad_norm: 29.1878 loss: 9.7528 loss_cls: 0.3076 loss_bbox: 0.1317 loss_iou: 0.2470 d0.loss_cls: 0.3535 d0.loss_bbox: 0.1479 d0.loss_iou: 0.2668 d1.loss_cls: 0.3268 d1.loss_bbox: 0.1379 d1.loss_iou: 0.2547 d2.loss_cls: 0.3214 d2.loss_bbox: 0.1319 d2.loss_iou: 0.2438 d3.loss_cls: 0.3151 d3.loss_bbox: 0.1288 d3.loss_iou: 0.2442 d4.loss_cls: 0.3131 d4.loss_bbox: 0.1289 d4.loss_iou: 0.2426 enc_loss_cls: 0.3544 enc_loss_bbox: 0.1665 enc_loss_iou: 0.2885 dn_loss_cls: 0.1334 dn_loss_bbox: 0.1576 dn_loss_iou: 0.2150 d0.dn_loss_cls: 0.2190 d0.dn_loss_bbox: 0.3082 d0.dn_loss_iou: 0.3700 d1.dn_loss_cls: 0.1702 d1.dn_loss_bbox: 0.1945 d1.dn_loss_iou: 0.2488 d2.dn_loss_cls: 0.1508 d2.dn_loss_bbox: 0.1713 d2.dn_loss_iou: 0.2261 d3.dn_loss_cls: 0.1386 d3.dn_loss_bbox: 0.1603 d3.dn_loss_iou: 0.2175 d4.dn_loss_cls: 0.1350 d4.dn_loss_bbox: 0.1575 d4.dn_loss_iou: 0.2148 d1.loss_lmm_region: 0.1741 loss_lmm_image: 0.9371 2024/11/10 22:28:44 - mmengine - INFO - Iter(train) [ 23100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:18:01 time: 2.0182 data_time: 0.0169 memory: 35434 grad_norm: 33.3754 loss: 9.9500 loss_cls: 0.3253 loss_bbox: 0.1269 loss_iou: 0.2051 d0.loss_cls: 0.3692 d0.loss_bbox: 0.1500 d0.loss_iou: 0.2294 d1.loss_cls: 0.3432 d1.loss_bbox: 0.1345 d1.loss_iou: 0.2167 d2.loss_cls: 0.3338 d2.loss_bbox: 0.1300 d2.loss_iou: 0.2085 d3.loss_cls: 0.3295 d3.loss_bbox: 0.1290 d3.loss_iou: 0.2053 d4.loss_cls: 0.3260 d4.loss_bbox: 0.1275 d4.loss_iou: 0.2066 enc_loss_cls: 0.3713 enc_loss_bbox: 0.1663 enc_loss_iou: 0.2548 dn_loss_cls: 0.1457 dn_loss_bbox: 0.1874 dn_loss_iou: 0.2168 d0.dn_loss_cls: 0.2265 d0.dn_loss_bbox: 0.3682 d0.dn_loss_iou: 0.3772 d1.dn_loss_cls: 0.1781 d1.dn_loss_bbox: 0.2291 d1.dn_loss_iou: 0.2545 d2.dn_loss_cls: 0.1600 
d2.dn_loss_bbox: 0.2003 d2.dn_loss_iou: 0.2286 d3.dn_loss_cls: 0.1533 d3.dn_loss_bbox: 0.1910 d3.dn_loss_iou: 0.2205 d4.dn_loss_cls: 0.1453 d4.dn_loss_bbox: 0.1873 d4.dn_loss_iou: 0.2168 d1.loss_lmm_region: 0.1923 loss_lmm_image: 0.9826 2024/11/10 22:32:02 - mmengine - INFO - Iter(train) [ 23200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:14:31 time: 1.9689 data_time: 0.0170 memory: 33336 grad_norm: 31.0048 loss: 9.6778 loss_cls: 0.2587 loss_bbox: 0.1476 loss_iou: 0.2351 d0.loss_cls: 0.3022 d0.loss_bbox: 0.1479 d0.loss_iou: 0.2418 d1.loss_cls: 0.2779 d1.loss_bbox: 0.1431 d1.loss_iou: 0.2346 d2.loss_cls: 0.2689 d2.loss_bbox: 0.1473 d2.loss_iou: 0.2340 d3.loss_cls: 0.2663 d3.loss_bbox: 0.1441 d3.loss_iou: 0.2363 d4.loss_cls: 0.2584 d4.loss_bbox: 0.1478 d4.loss_iou: 0.2349 enc_loss_cls: 0.2964 enc_loss_bbox: 0.1635 enc_loss_iou: 0.2624 dn_loss_cls: 0.1277 dn_loss_bbox: 0.2004 dn_loss_iou: 0.2345 d0.dn_loss_cls: 0.2085 d0.dn_loss_bbox: 0.3829 d0.dn_loss_iou: 0.3934 d1.dn_loss_cls: 0.1550 d1.dn_loss_bbox: 0.2466 d1.dn_loss_iou: 0.2705 d2.dn_loss_cls: 0.1390 d2.dn_loss_bbox: 0.2144 d2.dn_loss_iou: 0.2457 d3.dn_loss_cls: 0.1333 d3.dn_loss_bbox: 0.2041 d3.dn_loss_iou: 0.2375 d4.dn_loss_cls: 0.1292 d4.dn_loss_bbox: 0.2002 d4.dn_loss_iou: 0.2345 d1.loss_lmm_region: 0.1855 loss_lmm_image: 0.8856 2024/11/10 22:35:20 - mmengine - INFO - Iter(train) [ 23300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:11:04 time: 1.9889 data_time: 0.0170 memory: 32455 grad_norm: 31.9108 loss: 9.8183 loss_cls: 0.3028 loss_bbox: 0.1383 loss_iou: 0.2171 d0.loss_cls: 0.3564 d0.loss_bbox: 0.1501 d0.loss_iou: 0.2290 d1.loss_cls: 0.3296 d1.loss_bbox: 0.1391 d1.loss_iou: 0.2167 d2.loss_cls: 0.3200 d2.loss_bbox: 0.1338 d2.loss_iou: 0.2136 d3.loss_cls: 0.3078 d3.loss_bbox: 0.1394 d3.loss_iou: 0.2159 d4.loss_cls: 0.3063 d4.loss_bbox: 0.1380 d4.loss_iou: 0.2137 enc_loss_cls: 0.3515 enc_loss_bbox: 0.1686 enc_loss_iou: 0.2584 dn_loss_cls: 0.1704 dn_loss_bbox: 0.1760 dn_loss_iou: 0.2036 d0.dn_loss_cls: 0.2537 d0.dn_loss_bbox: 0.3550 d0.dn_loss_iou: 0.3569 d1.dn_loss_cls: 0.2063 d1.dn_loss_bbox: 0.2155 d1.dn_loss_iou: 0.2382 d2.dn_loss_cls: 0.1901 d2.dn_loss_bbox: 0.1878 d2.dn_loss_iou: 0.2140 d3.dn_loss_cls: 0.1811 d3.dn_loss_bbox: 0.1791 d3.dn_loss_iou: 0.2070 d4.dn_loss_cls: 0.1709 d4.dn_loss_bbox: 0.1759 d4.dn_loss_iou: 0.2035 d1.loss_lmm_region: 0.1603 loss_lmm_image: 0.9270 2024/11/10 22:38:38 - mmengine - INFO - Iter(train) [ 23400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:07:39 time: 1.9932 data_time: 0.0168 memory: 33437 grad_norm: 31.3921 loss: 9.2660 loss_cls: 0.3136 loss_bbox: 0.1110 loss_iou: 0.2202 d0.loss_cls: 0.3747 d0.loss_bbox: 0.1268 d0.loss_iou: 0.2372 d1.loss_cls: 0.3315 d1.loss_bbox: 0.1165 d1.loss_iou: 0.2242 d2.loss_cls: 0.3210 d2.loss_bbox: 0.1142 d2.loss_iou: 0.2195 d3.loss_cls: 0.3163 d3.loss_bbox: 0.1110 d3.loss_iou: 0.2195 d4.loss_cls: 0.3131 d4.loss_bbox: 0.1106 d4.loss_iou: 0.2200 enc_loss_cls: 0.3634 enc_loss_bbox: 0.1386 enc_loss_iou: 0.2589 dn_loss_cls: 0.1468 dn_loss_bbox: 0.1394 dn_loss_iou: 0.1896 d0.dn_loss_cls: 0.2226 d0.dn_loss_bbox: 0.2902 d0.dn_loss_iou: 0.3404 d1.dn_loss_cls: 0.1756 d1.dn_loss_bbox: 0.1742 d1.dn_loss_iou: 0.2235 d2.dn_loss_cls: 0.1559 d2.dn_loss_bbox: 0.1523 d2.dn_loss_iou: 0.2017 d3.dn_loss_cls: 0.1499 d3.dn_loss_bbox: 0.1415 d3.dn_loss_iou: 0.1929 d4.dn_loss_cls: 0.1460 d4.dn_loss_bbox: 0.1395 d4.dn_loss_iou: 0.1897 d1.loss_lmm_region: 0.1884 loss_lmm_image: 0.9442 2024/11/10 22:41:59 - mmengine - INFO - Iter(train) [ 
23500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:04:24 time: 2.0002 data_time: 0.0169 memory: 34616 grad_norm: 26.1910 loss: 10.8434 loss_cls: 0.3740 loss_bbox: 0.1589 loss_iou: 0.2998 d0.loss_cls: 0.4303 d0.loss_bbox: 0.1738 d0.loss_iou: 0.3151 d1.loss_cls: 0.3940 d1.loss_bbox: 0.1643 d1.loss_iou: 0.3043 d2.loss_cls: 0.3847 d2.loss_bbox: 0.1602 d2.loss_iou: 0.3013 d3.loss_cls: 0.3777 d3.loss_bbox: 0.1570 d3.loss_iou: 0.2979 d4.loss_cls: 0.3741 d4.loss_bbox: 0.1597 d4.loss_iou: 0.2994 enc_loss_cls: 0.4278 enc_loss_bbox: 0.1834 enc_loss_iou: 0.3376 dn_loss_cls: 0.1441 dn_loss_bbox: 0.1549 dn_loss_iou: 0.2206 d0.dn_loss_cls: 0.2225 d0.dn_loss_bbox: 0.3151 d0.dn_loss_iou: 0.3783 d1.dn_loss_cls: 0.1721 d1.dn_loss_bbox: 0.1918 d1.dn_loss_iou: 0.2545 d2.dn_loss_cls: 0.1542 d2.dn_loss_bbox: 0.1671 d2.dn_loss_iou: 0.2305 d3.dn_loss_cls: 0.1466 d3.dn_loss_bbox: 0.1572 d3.dn_loss_iou: 0.2235 d4.dn_loss_cls: 0.1458 d4.dn_loss_bbox: 0.1549 d4.dn_loss_iou: 0.2206 d1.loss_lmm_region: 0.1911 loss_lmm_image: 0.9229 2024/11/10 22:45:15 - mmengine - INFO - Iter(train) [ 23600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 22:00:51 time: 1.9750 data_time: 0.0171 memory: 33679 grad_norm: 31.2841 loss: 10.9392 loss_cls: 0.3913 loss_bbox: 0.1603 loss_iou: 0.3018 d0.loss_cls: 0.4444 d0.loss_bbox: 0.1840 d0.loss_iou: 0.3210 d1.loss_cls: 0.4128 d1.loss_bbox: 0.1717 d1.loss_iou: 0.3120 d2.loss_cls: 0.4000 d2.loss_bbox: 0.1695 d2.loss_iou: 0.3024 d3.loss_cls: 0.3963 d3.loss_bbox: 0.1618 d3.loss_iou: 0.2997 d4.loss_cls: 0.3914 d4.loss_bbox: 0.1636 d4.loss_iou: 0.3023 enc_loss_cls: 0.4422 enc_loss_bbox: 0.1990 enc_loss_iou: 0.3430 dn_loss_cls: 0.1408 dn_loss_bbox: 0.1603 dn_loss_iou: 0.2149 d0.dn_loss_cls: 0.2187 d0.dn_loss_bbox: 0.3138 d0.dn_loss_iou: 0.3613 d1.dn_loss_cls: 0.1715 d1.dn_loss_bbox: 0.1976 d1.dn_loss_iou: 0.2484 d2.dn_loss_cls: 0.1490 d2.dn_loss_bbox: 0.1709 d2.dn_loss_iou: 0.2249 d3.dn_loss_cls: 0.1429 d3.dn_loss_bbox: 0.1624 d3.dn_loss_iou: 0.2175 d4.dn_loss_cls: 0.1398 d4.dn_loss_bbox: 0.1603 d4.dn_loss_iou: 0.2150 d1.loss_lmm_region: 0.1772 loss_lmm_image: 0.8817 2024/11/10 22:48:34 - mmengine - INFO - Iter(train) [ 23700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:57:27 time: 1.9481 data_time: 0.0175 memory: 35178 grad_norm: 27.7132 loss: 10.6956 loss_cls: 0.3597 loss_bbox: 0.1575 loss_iou: 0.2718 d0.loss_cls: 0.4109 d0.loss_bbox: 0.1684 d0.loss_iou: 0.2867 d1.loss_cls: 0.3788 d1.loss_bbox: 0.1634 d1.loss_iou: 0.2768 d2.loss_cls: 0.3685 d2.loss_bbox: 0.1573 d2.loss_iou: 0.2724 d3.loss_cls: 0.3629 d3.loss_bbox: 0.1549 d3.loss_iou: 0.2703 d4.loss_cls: 0.3596 d4.loss_bbox: 0.1568 d4.loss_iou: 0.2710 enc_loss_cls: 0.3991 enc_loss_bbox: 0.1817 enc_loss_iou: 0.3051 dn_loss_cls: 0.1334 dn_loss_bbox: 0.2032 dn_loss_iou: 0.2342 d0.dn_loss_cls: 0.2102 d0.dn_loss_bbox: 0.3598 d0.dn_loss_iou: 0.3806 d1.dn_loss_cls: 0.1623 d1.dn_loss_bbox: 0.2336 d1.dn_loss_iou: 0.2649 d2.dn_loss_cls: 0.1454 d2.dn_loss_bbox: 0.2130 d2.dn_loss_iou: 0.2444 d3.dn_loss_cls: 0.1374 d3.dn_loss_bbox: 0.2049 d3.dn_loss_iou: 0.2374 d4.dn_loss_cls: 0.1349 d4.dn_loss_bbox: 0.2032 d4.dn_loss_iou: 0.2342 d1.loss_lmm_region: 0.1475 loss_lmm_image: 0.8772 2024/11/10 22:51:54 - mmengine - INFO - Iter(train) [ 23800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:54:09 time: 1.9993 data_time: 0.0172 memory: 32619 grad_norm: 27.4112 loss: 9.9459 loss_cls: 0.3169 loss_bbox: 0.1475 loss_iou: 0.2475 d0.loss_cls: 0.3726 d0.loss_bbox: 0.1576 d0.loss_iou: 0.2582 d1.loss_cls: 0.3427 d1.loss_bbox: 
0.1483 d1.loss_iou: 0.2501 d2.loss_cls: 0.3320 d2.loss_bbox: 0.1465 d2.loss_iou: 0.2462 d3.loss_cls: 0.3254 d3.loss_bbox: 0.1434 d3.loss_iou: 0.2455 d4.loss_cls: 0.3164 d4.loss_bbox: 0.1463 d4.loss_iou: 0.2454 enc_loss_cls: 0.3644 enc_loss_bbox: 0.1742 enc_loss_iou: 0.2854 dn_loss_cls: 0.1294 dn_loss_bbox: 0.1634 dn_loss_iou: 0.2242 d0.dn_loss_cls: 0.2171 d0.dn_loss_bbox: 0.3189 d0.dn_loss_iou: 0.3799 d1.dn_loss_cls: 0.1673 d1.dn_loss_bbox: 0.1944 d1.dn_loss_iou: 0.2553 d2.dn_loss_cls: 0.1500 d2.dn_loss_bbox: 0.1727 d2.dn_loss_iou: 0.2332 d3.dn_loss_cls: 0.1392 d3.dn_loss_bbox: 0.1660 d3.dn_loss_iou: 0.2268 d4.dn_loss_cls: 0.1299 d4.dn_loss_bbox: 0.1634 d4.dn_loss_iou: 0.2242 d1.loss_lmm_region: 0.1539 loss_lmm_image: 0.9239 2024/11/10 22:55:14 - mmengine - INFO - Iter(train) [ 23900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:50:54 time: 2.0100 data_time: 0.0172 memory: 34108 grad_norm: 30.4227 loss: 9.7815 loss_cls: 0.3039 loss_bbox: 0.1353 loss_iou: 0.2300 d0.loss_cls: 0.3599 d0.loss_bbox: 0.1452 d0.loss_iou: 0.2474 d1.loss_cls: 0.3261 d1.loss_bbox: 0.1335 d1.loss_iou: 0.2321 d2.loss_cls: 0.3167 d2.loss_bbox: 0.1320 d2.loss_iou: 0.2280 d3.loss_cls: 0.3075 d3.loss_bbox: 0.1335 d3.loss_iou: 0.2295 d4.loss_cls: 0.3021 d4.loss_bbox: 0.1344 d4.loss_iou: 0.2293 enc_loss_cls: 0.3537 enc_loss_bbox: 0.1600 enc_loss_iou: 0.2658 dn_loss_cls: 0.1457 dn_loss_bbox: 0.1688 dn_loss_iou: 0.2214 d0.dn_loss_cls: 0.2270 d0.dn_loss_bbox: 0.3279 d0.dn_loss_iou: 0.3721 d1.dn_loss_cls: 0.1759 d1.dn_loss_bbox: 0.2023 d1.dn_loss_iou: 0.2537 d2.dn_loss_cls: 0.1586 d2.dn_loss_bbox: 0.1779 d2.dn_loss_iou: 0.2313 d3.dn_loss_cls: 0.1507 d3.dn_loss_bbox: 0.1716 d3.dn_loss_iou: 0.2237 d4.dn_loss_cls: 0.1471 d4.dn_loss_bbox: 0.1688 d4.dn_loss_iou: 0.2214 d1.loss_lmm_region: 0.2008 loss_lmm_image: 0.9286 2024/11/10 22:58:34 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 22:58:34 - mmengine - INFO - Iter(train) [ 24000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:47:40 time: 2.0269 data_time: 0.0171 memory: 34829 grad_norm: 26.6749 loss: 10.4589 loss_cls: 0.3421 loss_bbox: 0.1446 loss_iou: 0.2802 d0.loss_cls: 0.3956 d0.loss_bbox: 0.1509 d0.loss_iou: 0.2951 d1.loss_cls: 0.3617 d1.loss_bbox: 0.1468 d1.loss_iou: 0.2869 d2.loss_cls: 0.3485 d2.loss_bbox: 0.1452 d2.loss_iou: 0.2833 d3.loss_cls: 0.3451 d3.loss_bbox: 0.1461 d3.loss_iou: 0.2820 d4.loss_cls: 0.3425 d4.loss_bbox: 0.1450 d4.loss_iou: 0.2809 enc_loss_cls: 0.3892 enc_loss_bbox: 0.1666 enc_loss_iou: 0.3164 dn_loss_cls: 0.1284 dn_loss_bbox: 0.1760 dn_loss_iou: 0.2329 d0.dn_loss_cls: 0.2069 d0.dn_loss_bbox: 0.3242 d0.dn_loss_iou: 0.3786 d1.dn_loss_cls: 0.1603 d1.dn_loss_bbox: 0.2115 d1.dn_loss_iou: 0.2652 d2.dn_loss_cls: 0.1432 d2.dn_loss_bbox: 0.1885 d2.dn_loss_iou: 0.2427 d3.dn_loss_cls: 0.1365 d3.dn_loss_bbox: 0.1789 d3.dn_loss_iou: 0.2349 d4.dn_loss_cls: 0.1312 d4.dn_loss_bbox: 0.1761 d4.dn_loss_iou: 0.2330 d1.loss_lmm_region: 0.1955 loss_lmm_image: 0.9197 2024/11/10 23:01:53 - mmengine - INFO - Iter(train) [ 24100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:44:18 time: 1.9696 data_time: 0.0170 memory: 32285 grad_norm: 26.4973 loss: 9.7204 loss_cls: 0.2871 loss_bbox: 0.1222 loss_iou: 0.2362 d0.loss_cls: 0.3475 d0.loss_bbox: 0.1397 d0.loss_iou: 0.2534 d1.loss_cls: 0.3006 d1.loss_bbox: 0.1362 d1.loss_iou: 0.2454 d2.loss_cls: 0.2989 d2.loss_bbox: 0.1244 d2.loss_iou: 0.2357 d3.loss_cls: 0.2932 d3.loss_bbox: 0.1190 d3.loss_iou: 0.2357 d4.loss_cls: 0.2856 d4.loss_bbox: 0.1213 d4.loss_iou: 0.2384 
enc_loss_cls: 0.3328 enc_loss_bbox: 0.1578 enc_loss_iou: 0.2750 dn_loss_cls: 0.1493 dn_loss_bbox: 0.1746 dn_loss_iou: 0.2173 d0.dn_loss_cls: 0.2228 d0.dn_loss_bbox: 0.3557 d0.dn_loss_iou: 0.3719 d1.dn_loss_cls: 0.1748 d1.dn_loss_bbox: 0.2159 d1.dn_loss_iou: 0.2530 d2.dn_loss_cls: 0.1590 d2.dn_loss_bbox: 0.1864 d2.dn_loss_iou: 0.2280 d3.dn_loss_cls: 0.1525 d3.dn_loss_bbox: 0.1771 d3.dn_loss_iou: 0.2201 d4.dn_loss_cls: 0.1503 d4.dn_loss_bbox: 0.1745 d4.dn_loss_iou: 0.2173 d1.loss_lmm_region: 0.1818 loss_lmm_image: 0.9519 2024/11/10 23:05:13 - mmengine - INFO - Iter(train) [ 24200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:41:01 time: 1.9800 data_time: 0.0172 memory: 34366 grad_norm: 36.4944 loss: 9.6588 loss_cls: 0.2857 loss_bbox: 0.1299 loss_iou: 0.2398 d0.loss_cls: 0.3342 d0.loss_bbox: 0.1417 d0.loss_iou: 0.2506 d1.loss_cls: 0.3057 d1.loss_bbox: 0.1313 d1.loss_iou: 0.2395 d2.loss_cls: 0.2940 d2.loss_bbox: 0.1323 d2.loss_iou: 0.2431 d3.loss_cls: 0.2875 d3.loss_bbox: 0.1295 d3.loss_iou: 0.2427 d4.loss_cls: 0.2885 d4.loss_bbox: 0.1294 d4.loss_iou: 0.2398 enc_loss_cls: 0.3239 enc_loss_bbox: 0.1610 enc_loss_iou: 0.2791 dn_loss_cls: 0.1255 dn_loss_bbox: 0.1789 dn_loss_iou: 0.2240 d0.dn_loss_cls: 0.2033 d0.dn_loss_bbox: 0.3473 d0.dn_loss_iou: 0.3805 d1.dn_loss_cls: 0.1569 d1.dn_loss_bbox: 0.2224 d1.dn_loss_iou: 0.2607 d2.dn_loss_cls: 0.1334 d2.dn_loss_bbox: 0.1959 d2.dn_loss_iou: 0.2378 d3.dn_loss_cls: 0.1270 d3.dn_loss_bbox: 0.1838 d3.dn_loss_iou: 0.2278 d4.dn_loss_cls: 0.1252 d4.dn_loss_bbox: 0.1790 d4.dn_loss_iou: 0.2241 d1.loss_lmm_region: 0.1751 loss_lmm_image: 0.9413 2024/11/10 23:08:32 - mmengine - INFO - Iter(train) [ 24300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:37:40 time: 1.9956 data_time: 0.0172 memory: 34943 grad_norm: 32.5160 loss: 10.4345 loss_cls: 0.3323 loss_bbox: 0.1423 loss_iou: 0.2495 d0.loss_cls: 0.3780 d0.loss_bbox: 0.1560 d0.loss_iou: 0.2698 d1.loss_cls: 0.3533 d1.loss_bbox: 0.1395 d1.loss_iou: 0.2543 d2.loss_cls: 0.3428 d2.loss_bbox: 0.1380 d2.loss_iou: 0.2504 d3.loss_cls: 0.3398 d3.loss_bbox: 0.1379 d3.loss_iou: 0.2489 d4.loss_cls: 0.3364 d4.loss_bbox: 0.1378 d4.loss_iou: 0.2480 enc_loss_cls: 0.3764 enc_loss_bbox: 0.1691 enc_loss_iou: 0.2907 dn_loss_cls: 0.1919 dn_loss_bbox: 0.1670 dn_loss_iou: 0.2255 d0.dn_loss_cls: 0.2784 d0.dn_loss_bbox: 0.3283 d0.dn_loss_iou: 0.3756 d1.dn_loss_cls: 0.2266 d1.dn_loss_bbox: 0.2027 d1.dn_loss_iou: 0.2587 d2.dn_loss_cls: 0.2049 d2.dn_loss_bbox: 0.1782 d2.dn_loss_iou: 0.2360 d3.dn_loss_cls: 0.1968 d3.dn_loss_bbox: 0.1694 d3.dn_loss_iou: 0.2282 d4.dn_loss_cls: 0.1919 d4.dn_loss_bbox: 0.1669 d4.dn_loss_iou: 0.2253 d1.loss_lmm_region: 0.1721 loss_lmm_image: 0.9188 2024/11/10 23:11:51 - mmengine - INFO - Iter(train) [ 24400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:34:19 time: 1.9625 data_time: 0.0171 memory: 33873 grad_norm: 26.3436 loss: 10.5404 loss_cls: 0.3658 loss_bbox: 0.1554 loss_iou: 0.2839 d0.loss_cls: 0.4203 d0.loss_bbox: 0.1718 d0.loss_iou: 0.3054 d1.loss_cls: 0.3822 d1.loss_bbox: 0.1715 d1.loss_iou: 0.3025 d2.loss_cls: 0.3741 d2.loss_bbox: 0.1616 d2.loss_iou: 0.2928 d3.loss_cls: 0.3661 d3.loss_bbox: 0.1590 d3.loss_iou: 0.2910 d4.loss_cls: 0.3670 d4.loss_bbox: 0.1563 d4.loss_iou: 0.2876 enc_loss_cls: 0.4098 enc_loss_bbox: 0.1871 enc_loss_iou: 0.3343 dn_loss_cls: 0.1127 dn_loss_bbox: 0.1607 dn_loss_iou: 0.2259 d0.dn_loss_cls: 0.1962 d0.dn_loss_bbox: 0.3122 d0.dn_loss_iou: 0.3744 d1.dn_loss_cls: 0.1468 d1.dn_loss_bbox: 0.1939 d1.dn_loss_iou: 0.2578 d2.dn_loss_cls: 0.1282 
d2.dn_loss_bbox: 0.1709 d2.dn_loss_iou: 0.2362 d3.dn_loss_cls: 0.1191 d3.dn_loss_bbox: 0.1636 d3.dn_loss_iou: 0.2290 d4.dn_loss_cls: 0.1151 d4.dn_loss_bbox: 0.1607 d4.dn_loss_iou: 0.2259 d1.loss_lmm_region: 0.1616 loss_lmm_image: 0.9039 2024/11/10 23:15:11 - mmengine - INFO - Iter(train) [ 24500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:31:00 time: 2.0147 data_time: 0.0172 memory: 34829 grad_norm: 31.4797 loss: 9.3315 loss_cls: 0.3055 loss_bbox: 0.1144 loss_iou: 0.2213 d0.loss_cls: 0.3612 d0.loss_bbox: 0.1283 d0.loss_iou: 0.2404 d1.loss_cls: 0.3233 d1.loss_bbox: 0.1205 d1.loss_iou: 0.2290 d2.loss_cls: 0.3131 d2.loss_bbox: 0.1185 d2.loss_iou: 0.2281 d3.loss_cls: 0.3060 d3.loss_bbox: 0.1160 d3.loss_iou: 0.2247 d4.loss_cls: 0.3026 d4.loss_bbox: 0.1155 d4.loss_iou: 0.2252 enc_loss_cls: 0.3603 enc_loss_bbox: 0.1461 enc_loss_iou: 0.2604 dn_loss_cls: 0.1403 dn_loss_bbox: 0.1556 dn_loss_iou: 0.2070 d0.dn_loss_cls: 0.2096 d0.dn_loss_bbox: 0.2975 d0.dn_loss_iou: 0.3518 d1.dn_loss_cls: 0.1643 d1.dn_loss_bbox: 0.1940 d1.dn_loss_iou: 0.2409 d2.dn_loss_cls: 0.1490 d2.dn_loss_bbox: 0.1662 d2.dn_loss_iou: 0.2174 d3.dn_loss_cls: 0.1443 d3.dn_loss_bbox: 0.1593 d3.dn_loss_iou: 0.2104 d4.dn_loss_cls: 0.1393 d4.dn_loss_bbox: 0.1556 d4.dn_loss_iou: 0.2070 d1.loss_lmm_region: 0.1634 loss_lmm_image: 0.8981 2024/11/10 23:18:30 - mmengine - INFO - Iter(train) [ 24600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:27:41 time: 2.0032 data_time: 0.0171 memory: 33574 grad_norm: 29.6826 loss: 9.6819 loss_cls: 0.2898 loss_bbox: 0.1468 loss_iou: 0.2827 d0.loss_cls: 0.3546 d0.loss_bbox: 0.1600 d0.loss_iou: 0.2925 d1.loss_cls: 0.3160 d1.loss_bbox: 0.1593 d1.loss_iou: 0.2893 d2.loss_cls: 0.3086 d2.loss_bbox: 0.1498 d2.loss_iou: 0.2794 d3.loss_cls: 0.2986 d3.loss_bbox: 0.1459 d3.loss_iou: 0.2810 d4.loss_cls: 0.2921 d4.loss_bbox: 0.1466 d4.loss_iou: 0.2819 enc_loss_cls: 0.3428 enc_loss_bbox: 0.1761 enc_loss_iou: 0.3208 dn_loss_cls: 0.0971 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2189 d0.dn_loss_cls: 0.1758 d0.dn_loss_bbox: 0.2831 d0.dn_loss_iou: 0.3530 d1.dn_loss_cls: 0.1276 d1.dn_loss_bbox: 0.1826 d1.dn_loss_iou: 0.2476 d2.dn_loss_cls: 0.1119 d2.dn_loss_bbox: 0.1645 d2.dn_loss_iou: 0.2274 d3.dn_loss_cls: 0.1046 d3.dn_loss_bbox: 0.1584 d3.dn_loss_iou: 0.2210 d4.dn_loss_cls: 0.0989 d4.dn_loss_bbox: 0.1560 d4.dn_loss_iou: 0.2188 d1.loss_lmm_region: 0.1493 loss_lmm_image: 0.9149 2024/11/10 23:21:52 - mmengine - INFO - Iter(train) [ 24700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:24:34 time: 2.0134 data_time: 0.0173 memory: 35075 grad_norm: 31.1644 loss: 10.1275 loss_cls: 0.3132 loss_bbox: 0.1501 loss_iou: 0.2410 d0.loss_cls: 0.3621 d0.loss_bbox: 0.1641 d0.loss_iou: 0.2528 d1.loss_cls: 0.3319 d1.loss_bbox: 0.1556 d1.loss_iou: 0.2457 d2.loss_cls: 0.3201 d2.loss_bbox: 0.1537 d2.loss_iou: 0.2427 d3.loss_cls: 0.3144 d3.loss_bbox: 0.1519 d3.loss_iou: 0.2423 d4.loss_cls: 0.3130 d4.loss_bbox: 0.1512 d4.loss_iou: 0.2409 enc_loss_cls: 0.3643 enc_loss_bbox: 0.1780 enc_loss_iou: 0.2752 dn_loss_cls: 0.1265 dn_loss_bbox: 0.1973 dn_loss_iou: 0.2310 d0.dn_loss_cls: 0.2121 d0.dn_loss_bbox: 0.3622 d0.dn_loss_iou: 0.3827 d1.dn_loss_cls: 0.1596 d1.dn_loss_bbox: 0.2346 d1.dn_loss_iou: 0.2648 d2.dn_loss_cls: 0.1385 d2.dn_loss_bbox: 0.2101 d2.dn_loss_iou: 0.2412 d3.dn_loss_cls: 0.1303 d3.dn_loss_bbox: 0.2007 d3.dn_loss_iou: 0.2337 d4.dn_loss_cls: 0.1269 d4.dn_loss_bbox: 0.1973 d4.dn_loss_iou: 0.2310 d1.loss_lmm_region: 0.1819 loss_lmm_image: 0.9008 2024/11/10 23:25:12 - mmengine - INFO - Iter(train) [ 
24800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:21:15 time: 2.0163 data_time: 0.0172 memory: 34918 grad_norm: nan loss: 11.5582 loss_cls: 0.3694 loss_bbox: 0.1791 loss_iou: 0.2828 d0.loss_cls: 0.4322 d0.loss_bbox: 0.1881 d0.loss_iou: 0.2876 d1.loss_cls: 0.3952 d1.loss_bbox: 0.1820 d1.loss_iou: 0.2823 d2.loss_cls: 0.3782 d2.loss_bbox: 0.1837 d2.loss_iou: 0.2812 d3.loss_cls: 0.3713 d3.loss_bbox: 0.1814 d3.loss_iou: 0.2796 d4.loss_cls: 0.3686 d4.loss_bbox: 0.1817 d4.loss_iou: 0.2814 enc_loss_cls: 0.4287 enc_loss_bbox: 0.2038 enc_loss_iou: 0.3105 dn_loss_cls: 0.1413 dn_loss_bbox: 0.2323 dn_loss_iou: 0.2569 d0.dn_loss_cls: 0.2312 d0.dn_loss_bbox: 0.4274 d0.dn_loss_iou: 0.4164 d1.dn_loss_cls: 0.1786 d1.dn_loss_bbox: 0.2716 d1.dn_loss_iou: 0.2921 d2.dn_loss_cls: 0.1568 d2.dn_loss_bbox: 0.2443 d2.dn_loss_iou: 0.2690 d3.dn_loss_cls: 0.1467 d3.dn_loss_bbox: 0.2344 d3.dn_loss_iou: 0.2597 d4.dn_loss_cls: 0.1430 d4.dn_loss_bbox: 0.2322 d4.dn_loss_iou: 0.2569 d1.loss_lmm_region: 0.2046 loss_lmm_image: 0.9141 2024/11/10 23:28:30 - mmengine - INFO - Iter(train) [ 24900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:17:49 time: 1.9805 data_time: 0.0171 memory: 34992 grad_norm: 33.1922 loss: 10.9934 loss_cls: 0.3624 loss_bbox: 0.1570 loss_iou: 0.3041 d0.loss_cls: 0.4159 d0.loss_bbox: 0.1629 d0.loss_iou: 0.3126 d1.loss_cls: 0.3772 d1.loss_bbox: 0.1598 d1.loss_iou: 0.3074 d2.loss_cls: 0.3695 d2.loss_bbox: 0.1565 d2.loss_iou: 0.3052 d3.loss_cls: 0.3659 d3.loss_bbox: 0.1573 d3.loss_iou: 0.3037 d4.loss_cls: 0.3644 d4.loss_bbox: 0.1569 d4.loss_iou: 0.3036 enc_loss_cls: 0.4136 enc_loss_bbox: 0.1783 enc_loss_iou: 0.3359 dn_loss_cls: 0.1713 dn_loss_bbox: 0.1659 dn_loss_iou: 0.2285 d0.dn_loss_cls: 0.2478 d0.dn_loss_bbox: 0.3112 d0.dn_loss_iou: 0.3784 d1.dn_loss_cls: 0.2007 d1.dn_loss_bbox: 0.1983 d1.dn_loss_iou: 0.2625 d2.dn_loss_cls: 0.1838 d2.dn_loss_bbox: 0.1768 d2.dn_loss_iou: 0.2395 d3.dn_loss_cls: 0.1758 d3.dn_loss_bbox: 0.1697 d3.dn_loss_iou: 0.2317 d4.dn_loss_cls: 0.1719 d4.dn_loss_bbox: 0.1660 d4.dn_loss_iou: 0.2285 d1.loss_lmm_region: 0.2036 loss_lmm_image: 0.9113 2024/11/10 23:31:49 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/10 23:31:49 - mmengine - INFO - Iter(train) [ 25000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:14:30 time: 1.9867 data_time: 0.0172 memory: 34509 grad_norm: 24.3495 loss: 10.1519 loss_cls: 0.3163 loss_bbox: 0.1619 loss_iou: 0.2725 d0.loss_cls: 0.3721 d0.loss_bbox: 0.1742 d0.loss_iou: 0.2854 d1.loss_cls: 0.3390 d1.loss_bbox: 0.1607 d1.loss_iou: 0.2743 d2.loss_cls: 0.3264 d2.loss_bbox: 0.1623 d2.loss_iou: 0.2721 d3.loss_cls: 0.3190 d3.loss_bbox: 0.1643 d3.loss_iou: 0.2741 d4.loss_cls: 0.3158 d4.loss_bbox: 0.1636 d4.loss_iou: 0.2732 enc_loss_cls: 0.3619 enc_loss_bbox: 0.1889 enc_loss_iou: 0.3123 dn_loss_cls: 0.1031 dn_loss_bbox: 0.1731 dn_loss_iou: 0.2277 d0.dn_loss_cls: 0.1951 d0.dn_loss_bbox: 0.3148 d0.dn_loss_iou: 0.3771 d1.dn_loss_cls: 0.1407 d1.dn_loss_bbox: 0.2032 d1.dn_loss_iou: 0.2580 d2.dn_loss_cls: 0.1170 d2.dn_loss_bbox: 0.1826 d2.dn_loss_iou: 0.2373 d3.dn_loss_cls: 0.1083 d3.dn_loss_bbox: 0.1747 d3.dn_loss_iou: 0.2301 d4.dn_loss_cls: 0.1039 d4.dn_loss_bbox: 0.1731 d4.dn_loss_iou: 0.2277 d1.loss_lmm_region: 0.1801 loss_lmm_image: 0.9344 2024/11/10 23:35:07 - mmengine - INFO - Iter(train) [ 25100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:11:02 time: 1.9656 data_time: 0.0171 memory: 34917 grad_norm: 31.1754 loss: 10.1444 loss_cls: 0.3176 loss_bbox: 0.1561 loss_iou: 0.2665 d0.loss_cls: 
0.3765 d0.loss_bbox: 0.1719 d0.loss_iou: 0.2867 d1.loss_cls: 0.3494 d1.loss_bbox: 0.1548 d1.loss_iou: 0.2727 d2.loss_cls: 0.3339 d2.loss_bbox: 0.1543 d2.loss_iou: 0.2702 d3.loss_cls: 0.3227 d3.loss_bbox: 0.1564 d3.loss_iou: 0.2690 d4.loss_cls: 0.3174 d4.loss_bbox: 0.1551 d4.loss_iou: 0.2664 enc_loss_cls: 0.3733 enc_loss_bbox: 0.1862 enc_loss_iou: 0.3102 dn_loss_cls: 0.1042 dn_loss_bbox: 0.1801 dn_loss_iou: 0.2240 d0.dn_loss_cls: 0.1870 d0.dn_loss_bbox: 0.3459 d0.dn_loss_iou: 0.3713 d1.dn_loss_cls: 0.1318 d1.dn_loss_bbox: 0.2162 d1.dn_loss_iou: 0.2589 d2.dn_loss_cls: 0.1136 d2.dn_loss_bbox: 0.1950 d2.dn_loss_iou: 0.2348 d3.dn_loss_cls: 0.1076 d3.dn_loss_bbox: 0.1824 d3.dn_loss_iou: 0.2263 d4.dn_loss_cls: 0.1038 d4.dn_loss_bbox: 0.1803 d4.dn_loss_iou: 0.2239 d1.loss_lmm_region: 0.1632 loss_lmm_image: 0.9268 2024/11/10 23:38:28 - mmengine - INFO - Iter(train) [ 25200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:07:49 time: 2.0068 data_time: 0.0172 memory: 34064 grad_norm: 28.2380 loss: 9.1762 loss_cls: 0.2888 loss_bbox: 0.1186 loss_iou: 0.2156 d0.loss_cls: 0.3268 d0.loss_bbox: 0.1334 d0.loss_iou: 0.2343 d1.loss_cls: 0.3055 d1.loss_bbox: 0.1253 d1.loss_iou: 0.2222 d2.loss_cls: 0.2970 d2.loss_bbox: 0.1209 d2.loss_iou: 0.2196 d3.loss_cls: 0.2893 d3.loss_bbox: 0.1220 d3.loss_iou: 0.2193 d4.loss_cls: 0.2873 d4.loss_bbox: 0.1200 d4.loss_iou: 0.2171 enc_loss_cls: 0.3247 enc_loss_bbox: 0.1492 enc_loss_iou: 0.2612 dn_loss_cls: 0.0990 dn_loss_bbox: 0.1669 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.1939 d0.dn_loss_bbox: 0.3234 d0.dn_loss_iou: 0.3714 d1.dn_loss_cls: 0.1352 d1.dn_loss_bbox: 0.2031 d1.dn_loss_iou: 0.2508 d2.dn_loss_cls: 0.1131 d2.dn_loss_bbox: 0.1776 d2.dn_loss_iou: 0.2275 d3.dn_loss_cls: 0.1047 d3.dn_loss_bbox: 0.1688 d3.dn_loss_iou: 0.2185 d4.dn_loss_cls: 0.1000 d4.dn_loss_bbox: 0.1668 d4.dn_loss_iou: 0.2157 d1.loss_lmm_region: 0.1773 loss_lmm_image: 0.9486 2024/11/10 23:41:46 - mmengine - INFO - Iter(train) [ 25300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:04:23 time: 1.9771 data_time: 0.0170 memory: 33864 grad_norm: 28.3974 loss: 9.7522 loss_cls: 0.3048 loss_bbox: 0.1357 loss_iou: 0.2452 d0.loss_cls: 0.3555 d0.loss_bbox: 0.1479 d0.loss_iou: 0.2598 d1.loss_cls: 0.3263 d1.loss_bbox: 0.1389 d1.loss_iou: 0.2512 d2.loss_cls: 0.3132 d2.loss_bbox: 0.1378 d2.loss_iou: 0.2488 d3.loss_cls: 0.3088 d3.loss_bbox: 0.1355 d3.loss_iou: 0.2455 d4.loss_cls: 0.3055 d4.loss_bbox: 0.1374 d4.loss_iou: 0.2468 enc_loss_cls: 0.3546 enc_loss_bbox: 0.1638 enc_loss_iou: 0.2835 dn_loss_cls: 0.1214 dn_loss_bbox: 0.1679 dn_loss_iou: 0.2339 d0.dn_loss_cls: 0.2090 d0.dn_loss_bbox: 0.3131 d0.dn_loss_iou: 0.3815 d1.dn_loss_cls: 0.1591 d1.dn_loss_bbox: 0.1967 d1.dn_loss_iou: 0.2657 d2.dn_loss_cls: 0.1355 d2.dn_loss_bbox: 0.1755 d2.dn_loss_iou: 0.2436 d3.dn_loss_cls: 0.1275 d3.dn_loss_bbox: 0.1696 d3.dn_loss_iou: 0.2364 d4.dn_loss_cls: 0.1224 d4.dn_loss_bbox: 0.1679 d4.dn_loss_iou: 0.2340 d1.loss_lmm_region: 0.1751 loss_lmm_image: 0.8701 2024/11/10 23:45:02 - mmengine - INFO - Iter(train) [ 25400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 21:00:48 time: 1.9495 data_time: 0.0171 memory: 33716 grad_norm: 28.0677 loss: 10.1634 loss_cls: 0.3376 loss_bbox: 0.1352 loss_iou: 0.2504 d0.loss_cls: 0.3914 d0.loss_bbox: 0.1376 d0.loss_iou: 0.2558 d1.loss_cls: 0.3577 d1.loss_bbox: 0.1387 d1.loss_iou: 0.2549 d2.loss_cls: 0.3439 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2510 d3.loss_cls: 0.3410 d3.loss_bbox: 0.1342 d3.loss_iou: 0.2507 d4.loss_cls: 0.3377 d4.loss_bbox: 0.1345 d4.loss_iou: 0.2504 
enc_loss_cls: 0.3803 enc_loss_bbox: 0.1551 enc_loss_iou: 0.2795 dn_loss_cls: 0.1681 dn_loss_bbox: 0.1646 dn_loss_iou: 0.2165 d0.dn_loss_cls: 0.2373 d0.dn_loss_bbox: 0.3159 d0.dn_loss_iou: 0.3604 d1.dn_loss_cls: 0.1910 d1.dn_loss_bbox: 0.1991 d1.dn_loss_iou: 0.2491 d2.dn_loss_cls: 0.1769 d2.dn_loss_bbox: 0.1701 d2.dn_loss_iou: 0.2248 d3.dn_loss_cls: 0.1706 d3.dn_loss_bbox: 0.1678 d3.dn_loss_iou: 0.2197 d4.dn_loss_cls: 0.1652 d4.dn_loss_bbox: 0.1645 d4.dn_loss_iou: 0.2167 d1.loss_lmm_region: 0.1981 loss_lmm_image: 0.9349 2024/11/10 23:48:21 - mmengine - INFO - Iter(train) [ 25500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:57:25 time: 1.9775 data_time: 0.0170 memory: 33013 grad_norm: 30.6877 loss: 10.3208 loss_cls: 0.3406 loss_bbox: 0.1439 loss_iou: 0.2452 d0.loss_cls: 0.3970 d0.loss_bbox: 0.1581 d0.loss_iou: 0.2631 d1.loss_cls: 0.3620 d1.loss_bbox: 0.1528 d1.loss_iou: 0.2592 d2.loss_cls: 0.3519 d2.loss_bbox: 0.1499 d2.loss_iou: 0.2509 d3.loss_cls: 0.3436 d3.loss_bbox: 0.1458 d3.loss_iou: 0.2474 d4.loss_cls: 0.3432 d4.loss_bbox: 0.1440 d4.loss_iou: 0.2455 enc_loss_cls: 0.3840 enc_loss_bbox: 0.1831 enc_loss_iou: 0.2945 dn_loss_cls: 0.1492 dn_loss_bbox: 0.1823 dn_loss_iou: 0.2205 d0.dn_loss_cls: 0.2235 d0.dn_loss_bbox: 0.3347 d0.dn_loss_iou: 0.3620 d1.dn_loss_cls: 0.1773 d1.dn_loss_bbox: 0.2191 d1.dn_loss_iou: 0.2518 d2.dn_loss_cls: 0.1599 d2.dn_loss_bbox: 0.1957 d2.dn_loss_iou: 0.2310 d3.dn_loss_cls: 0.1513 d3.dn_loss_bbox: 0.1852 d3.dn_loss_iou: 0.2232 d4.dn_loss_cls: 0.1506 d4.dn_loss_bbox: 0.1824 d4.dn_loss_iou: 0.2205 d1.loss_lmm_region: 0.1659 loss_lmm_image: 0.9290 2024/11/10 23:51:41 - mmengine - INFO - Iter(train) [ 25600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:54:09 time: 2.0114 data_time: 0.0172 memory: 33568 grad_norm: 29.3979 loss: 9.0493 loss_cls: 0.2835 loss_bbox: 0.1322 loss_iou: 0.2172 d0.loss_cls: 0.3312 d0.loss_bbox: 0.1433 d0.loss_iou: 0.2306 d1.loss_cls: 0.3049 d1.loss_bbox: 0.1362 d1.loss_iou: 0.2198 d2.loss_cls: 0.2925 d2.loss_bbox: 0.1308 d2.loss_iou: 0.2176 d3.loss_cls: 0.2847 d3.loss_bbox: 0.1314 d3.loss_iou: 0.2169 d4.loss_cls: 0.2828 d4.loss_bbox: 0.1338 d4.loss_iou: 0.2181 enc_loss_cls: 0.3265 enc_loss_bbox: 0.1555 enc_loss_iou: 0.2496 dn_loss_cls: 0.1359 dn_loss_bbox: 0.1493 dn_loss_iou: 0.1854 d0.dn_loss_cls: 0.2081 d0.dn_loss_bbox: 0.2899 d0.dn_loss_iou: 0.3223 d1.dn_loss_cls: 0.1657 d1.dn_loss_bbox: 0.1841 d1.dn_loss_iou: 0.2166 d2.dn_loss_cls: 0.1465 d2.dn_loss_bbox: 0.1580 d2.dn_loss_iou: 0.1940 d3.dn_loss_cls: 0.1407 d3.dn_loss_bbox: 0.1514 d3.dn_loss_iou: 0.1880 d4.dn_loss_cls: 0.1382 d4.dn_loss_bbox: 0.1493 d4.dn_loss_iou: 0.1854 d1.loss_lmm_region: 0.1881 loss_lmm_image: 0.9137 2024/11/10 23:54:59 - mmengine - INFO - Iter(train) [ 25700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:50:41 time: 1.9816 data_time: 0.0171 memory: 33992 grad_norm: 37.3810 loss: 11.0466 loss_cls: 0.3851 loss_bbox: 0.1601 loss_iou: 0.3016 d0.loss_cls: 0.4376 d0.loss_bbox: 0.1764 d0.loss_iou: 0.3250 d1.loss_cls: 0.4120 d1.loss_bbox: 0.1655 d1.loss_iou: 0.3153 d2.loss_cls: 0.3907 d2.loss_bbox: 0.1619 d2.loss_iou: 0.3103 d3.loss_cls: 0.3901 d3.loss_bbox: 0.1603 d3.loss_iou: 0.3037 d4.loss_cls: 0.3842 d4.loss_bbox: 0.1603 d4.loss_iou: 0.3014 enc_loss_cls: 0.4389 enc_loss_bbox: 0.1932 enc_loss_iou: 0.3525 dn_loss_cls: 0.1377 dn_loss_bbox: 0.1656 dn_loss_iou: 0.2233 d0.dn_loss_cls: 0.2159 d0.dn_loss_bbox: 0.3122 d0.dn_loss_iou: 0.3717 d1.dn_loss_cls: 0.1719 d1.dn_loss_bbox: 0.2041 d1.dn_loss_iou: 0.2585 d2.dn_loss_cls: 0.1505 
d2.dn_loss_bbox: 0.1779 d2.dn_loss_iou: 0.2350 d3.dn_loss_cls: 0.1413 d3.dn_loss_bbox: 0.1674 d3.dn_loss_iou: 0.2253 d4.dn_loss_cls: 0.1381 d4.dn_loss_bbox: 0.1656 d4.dn_loss_iou: 0.2234 d1.loss_lmm_region: 0.1726 loss_lmm_image: 0.9625 2024/11/10 23:58:16 - mmengine - INFO - Iter(train) [ 25800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:47:14 time: 1.9777 data_time: 0.0169 memory: 34243 grad_norm: 28.2203 loss: 9.0523 loss_cls: 0.2919 loss_bbox: 0.1105 loss_iou: 0.2156 d0.loss_cls: 0.3257 d0.loss_bbox: 0.1248 d0.loss_iou: 0.2326 d1.loss_cls: 0.3017 d1.loss_bbox: 0.1184 d1.loss_iou: 0.2232 d2.loss_cls: 0.3004 d2.loss_bbox: 0.1111 d2.loss_iou: 0.2148 d3.loss_cls: 0.2949 d3.loss_bbox: 0.1104 d3.loss_iou: 0.2139 d4.loss_cls: 0.2911 d4.loss_bbox: 0.1109 d4.loss_iou: 0.2152 enc_loss_cls: 0.3308 enc_loss_bbox: 0.1374 enc_loss_iou: 0.2529 dn_loss_cls: 0.1445 dn_loss_bbox: 0.1457 dn_loss_iou: 0.1979 d0.dn_loss_cls: 0.2173 d0.dn_loss_bbox: 0.2907 d0.dn_loss_iou: 0.3425 d1.dn_loss_cls: 0.1728 d1.dn_loss_bbox: 0.1763 d1.dn_loss_iou: 0.2294 d2.dn_loss_cls: 0.1590 d2.dn_loss_bbox: 0.1534 d2.dn_loss_iou: 0.2063 d3.dn_loss_cls: 0.1484 d3.dn_loss_bbox: 0.1462 d3.dn_loss_iou: 0.1999 d4.dn_loss_cls: 0.1453 d4.dn_loss_bbox: 0.1457 d4.dn_loss_iou: 0.1979 d1.loss_lmm_region: 0.1717 loss_lmm_image: 0.9337 2024/11/11 00:01:33 - mmengine - INFO - Iter(train) [ 25900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:43:40 time: 1.9740 data_time: 0.0169 memory: 35044 grad_norm: 31.7514 loss: 10.1491 loss_cls: 0.2918 loss_bbox: 0.1482 loss_iou: 0.2597 d0.loss_cls: 0.3525 d0.loss_bbox: 0.1615 d0.loss_iou: 0.2741 d1.loss_cls: 0.3129 d1.loss_bbox: 0.1561 d1.loss_iou: 0.2681 d2.loss_cls: 0.3013 d2.loss_bbox: 0.1518 d2.loss_iou: 0.2638 d3.loss_cls: 0.2951 d3.loss_bbox: 0.1483 d3.loss_iou: 0.2623 d4.loss_cls: 0.2938 d4.loss_bbox: 0.1482 d4.loss_iou: 0.2598 enc_loss_cls: 0.3386 enc_loss_bbox: 0.1826 enc_loss_iou: 0.3003 dn_loss_cls: 0.1216 dn_loss_bbox: 0.1903 dn_loss_iou: 0.2381 d0.dn_loss_cls: 0.2220 d0.dn_loss_bbox: 0.3627 d0.dn_loss_iou: 0.4028 d1.dn_loss_cls: 0.1674 d1.dn_loss_bbox: 0.2294 d1.dn_loss_iou: 0.2742 d2.dn_loss_cls: 0.1399 d2.dn_loss_bbox: 0.2038 d2.dn_loss_iou: 0.2507 d3.dn_loss_cls: 0.1272 d3.dn_loss_bbox: 0.1927 d3.dn_loss_iou: 0.2416 d4.dn_loss_cls: 0.1227 d4.dn_loss_bbox: 0.1904 d4.dn_loss_iou: 0.2381 d1.loss_lmm_region: 0.1647 loss_lmm_image: 0.8982 2024/11/11 00:04:53 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 00:04:53 - mmengine - INFO - Iter(train) [ 26000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:40:24 time: 2.0053 data_time: 0.0171 memory: 32068 grad_norm: 28.9159 loss: 10.5384 loss_cls: 0.3202 loss_bbox: 0.1599 loss_iou: 0.2698 d0.loss_cls: 0.3745 d0.loss_bbox: 0.1720 d0.loss_iou: 0.2852 d1.loss_cls: 0.3368 d1.loss_bbox: 0.1651 d1.loss_iou: 0.2763 d2.loss_cls: 0.3335 d2.loss_bbox: 0.1539 d2.loss_iou: 0.2659 d3.loss_cls: 0.3198 d3.loss_bbox: 0.1605 d3.loss_iou: 0.2703 d4.loss_cls: 0.3212 d4.loss_bbox: 0.1593 d4.loss_iou: 0.2702 enc_loss_cls: 0.3620 enc_loss_bbox: 0.1942 enc_loss_iou: 0.3183 dn_loss_cls: 0.1158 dn_loss_bbox: 0.1949 dn_loss_iou: 0.2548 d0.dn_loss_cls: 0.2111 d0.dn_loss_bbox: 0.3614 d0.dn_loss_iou: 0.4145 d1.dn_loss_cls: 0.1497 d1.dn_loss_bbox: 0.2318 d1.dn_loss_iou: 0.2892 d2.dn_loss_cls: 0.1291 d2.dn_loss_bbox: 0.2072 d2.dn_loss_iou: 0.2653 d3.dn_loss_cls: 0.1206 d3.dn_loss_bbox: 0.1978 d3.dn_loss_iou: 0.2573 d4.dn_loss_cls: 0.1164 d4.dn_loss_bbox: 0.1949 d4.dn_loss_iou: 0.2548 d1.loss_lmm_region: 
0.1636 loss_lmm_image: 0.9192 2024/11/11 00:08:09 - mmengine - INFO - Iter(train) [ 26100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:36:49 time: 1.9614 data_time: 0.0170 memory: 33050 grad_norm: 29.2609 loss: 9.6755 loss_cls: 0.3139 loss_bbox: 0.1331 loss_iou: 0.2332 d0.loss_cls: 0.3814 d0.loss_bbox: 0.1362 d0.loss_iou: 0.2414 d1.loss_cls: 0.3400 d1.loss_bbox: 0.1305 d1.loss_iou: 0.2353 d2.loss_cls: 0.3241 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2375 d3.loss_cls: 0.3215 d3.loss_bbox: 0.1309 d3.loss_iou: 0.2332 d4.loss_cls: 0.3164 d4.loss_bbox: 0.1324 d4.loss_iou: 0.2323 enc_loss_cls: 0.3642 enc_loss_bbox: 0.1594 enc_loss_iou: 0.2707 dn_loss_cls: 0.1331 dn_loss_bbox: 0.1609 dn_loss_iou: 0.2126 d0.dn_loss_cls: 0.2180 d0.dn_loss_bbox: 0.3051 d0.dn_loss_iou: 0.3502 d1.dn_loss_cls: 0.1717 d1.dn_loss_bbox: 0.1944 d1.dn_loss_iou: 0.2439 d2.dn_loss_cls: 0.1468 d2.dn_loss_bbox: 0.1704 d2.dn_loss_iou: 0.2216 d3.dn_loss_cls: 0.1380 d3.dn_loss_bbox: 0.1633 d3.dn_loss_iou: 0.2155 d4.dn_loss_cls: 0.1341 d4.dn_loss_bbox: 0.1609 d4.dn_loss_iou: 0.2126 d1.loss_lmm_region: 0.1992 loss_lmm_image: 0.9206 2024/11/11 00:11:27 - mmengine - INFO - Iter(train) [ 26200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:33:23 time: 1.9765 data_time: 0.0169 memory: 34634 grad_norm: 34.6519 loss: 11.9348 loss_cls: 0.4117 loss_bbox: 0.1767 loss_iou: 0.2963 d0.loss_cls: 0.4739 d0.loss_bbox: 0.1882 d0.loss_iou: 0.3128 d1.loss_cls: 0.4432 d1.loss_bbox: 0.1767 d1.loss_iou: 0.3038 d2.loss_cls: 0.4285 d2.loss_bbox: 0.1775 d2.loss_iou: 0.2993 d3.loss_cls: 0.4163 d3.loss_bbox: 0.1765 d3.loss_iou: 0.2957 d4.loss_cls: 0.4169 d4.loss_bbox: 0.1753 d4.loss_iou: 0.2971 enc_loss_cls: 0.4649 enc_loss_bbox: 0.2002 enc_loss_iou: 0.3363 dn_loss_cls: 0.1968 dn_loss_bbox: 0.1898 dn_loss_iou: 0.2467 d0.dn_loss_cls: 0.2737 d0.dn_loss_bbox: 0.3518 d0.dn_loss_iou: 0.4062 d1.dn_loss_cls: 0.2324 d1.dn_loss_bbox: 0.2267 d1.dn_loss_iou: 0.2823 d2.dn_loss_cls: 0.2160 d2.dn_loss_bbox: 0.2015 d2.dn_loss_iou: 0.2576 d3.dn_loss_cls: 0.2081 d3.dn_loss_bbox: 0.1928 d3.dn_loss_iou: 0.2495 d4.dn_loss_cls: 0.1989 d4.dn_loss_bbox: 0.1898 d4.dn_loss_iou: 0.2466 d1.loss_lmm_region: 0.1886 loss_lmm_image: 0.9111 2024/11/11 00:14:46 - mmengine - INFO - Iter(train) [ 26300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:30:01 time: 1.9979 data_time: 0.0171 memory: 33716 grad_norm: 30.5773 loss: 9.8251 loss_cls: 0.3063 loss_bbox: 0.1264 loss_iou: 0.2265 d0.loss_cls: 0.3547 d0.loss_bbox: 0.1391 d0.loss_iou: 0.2448 d1.loss_cls: 0.3258 d1.loss_bbox: 0.1349 d1.loss_iou: 0.2374 d2.loss_cls: 0.3187 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2309 d3.loss_cls: 0.3123 d3.loss_bbox: 0.1271 d3.loss_iou: 0.2292 d4.loss_cls: 0.3074 d4.loss_bbox: 0.1261 d4.loss_iou: 0.2270 enc_loss_cls: 0.3543 enc_loss_bbox: 0.1545 enc_loss_iou: 0.2673 dn_loss_cls: 0.1599 dn_loss_bbox: 0.1705 dn_loss_iou: 0.2200 d0.dn_loss_cls: 0.2517 d0.dn_loss_bbox: 0.3311 d0.dn_loss_iou: 0.3684 d1.dn_loss_cls: 0.1979 d1.dn_loss_bbox: 0.2052 d1.dn_loss_iou: 0.2537 d2.dn_loss_cls: 0.1772 d2.dn_loss_bbox: 0.1811 d2.dn_loss_iou: 0.2309 d3.dn_loss_cls: 0.1672 d3.dn_loss_bbox: 0.1727 d3.dn_loss_iou: 0.2232 d4.dn_loss_cls: 0.1619 d4.dn_loss_bbox: 0.1705 d4.dn_loss_iou: 0.2199 d1.loss_lmm_region: 0.1722 loss_lmm_image: 0.9109 2024/11/11 00:18:02 - mmengine - INFO - Iter(train) [ 26400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:26:26 time: 1.9674 data_time: 0.0170 memory: 32369 grad_norm: 27.7502 loss: 9.3770 loss_cls: 0.2712 loss_bbox: 0.1465 loss_iou: 0.2523 d0.loss_cls: 
0.3231 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2660 d1.loss_cls: 0.2968 d1.loss_bbox: 0.1502 d1.loss_iou: 0.2578 d2.loss_cls: 0.2875 d2.loss_bbox: 0.1427 d2.loss_iou: 0.2500 d3.loss_cls: 0.2768 d3.loss_bbox: 0.1446 d3.loss_iou: 0.2514 d4.loss_cls: 0.2728 d4.loss_bbox: 0.1454 d4.loss_iou: 0.2521 enc_loss_cls: 0.3258 enc_loss_bbox: 0.1694 enc_loss_iou: 0.2882 dn_loss_cls: 0.1133 dn_loss_bbox: 0.1535 dn_loss_iou: 0.2097 d0.dn_loss_cls: 0.1940 d0.dn_loss_bbox: 0.2960 d0.dn_loss_iou: 0.3512 d1.dn_loss_cls: 0.1439 d1.dn_loss_bbox: 0.1848 d1.dn_loss_iou: 0.2396 d2.dn_loss_cls: 0.1233 d2.dn_loss_bbox: 0.1640 d2.dn_loss_iou: 0.2179 d3.dn_loss_cls: 0.1175 d3.dn_loss_bbox: 0.1562 d3.dn_loss_iou: 0.2119 d4.dn_loss_cls: 0.1126 d4.dn_loss_bbox: 0.1536 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.1753 loss_lmm_image: 0.9238 2024/11/11 00:21:19 - mmengine - INFO - Iter(train) [ 26500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:22:58 time: 1.9838 data_time: 0.0172 memory: 34999 grad_norm: 29.3460 loss: 9.7020 loss_cls: 0.3007 loss_bbox: 0.1418 loss_iou: 0.2643 d0.loss_cls: 0.3566 d0.loss_bbox: 0.1498 d0.loss_iou: 0.2772 d1.loss_cls: 0.3230 d1.loss_bbox: 0.1428 d1.loss_iou: 0.2669 d2.loss_cls: 0.3092 d2.loss_bbox: 0.1399 d2.loss_iou: 0.2656 d3.loss_cls: 0.3050 d3.loss_bbox: 0.1373 d3.loss_iou: 0.2625 d4.loss_cls: 0.3006 d4.loss_bbox: 0.1398 d4.loss_iou: 0.2639 enc_loss_cls: 0.3518 enc_loss_bbox: 0.1684 enc_loss_iou: 0.3016 dn_loss_cls: 0.1125 dn_loss_bbox: 0.1581 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.1899 d0.dn_loss_bbox: 0.3056 d0.dn_loss_iou: 0.3586 d1.dn_loss_cls: 0.1424 d1.dn_loss_bbox: 0.1870 d1.dn_loss_iou: 0.2469 d2.dn_loss_cls: 0.1263 d2.dn_loss_bbox: 0.1696 d2.dn_loss_iou: 0.2297 d3.dn_loss_cls: 0.1174 d3.dn_loss_bbox: 0.1594 d3.dn_loss_iou: 0.2221 d4.dn_loss_cls: 0.1137 d4.dn_loss_bbox: 0.1582 d4.dn_loss_iou: 0.2204 d1.loss_lmm_region: 0.1889 loss_lmm_image: 0.9063 2024/11/11 00:24:38 - mmengine - INFO - Iter(train) [ 26600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:19:35 time: 1.9674 data_time: 0.0170 memory: 32635 grad_norm: 29.3939 loss: 9.2799 loss_cls: 0.2948 loss_bbox: 0.1208 loss_iou: 0.2125 d0.loss_cls: 0.3442 d0.loss_bbox: 0.1313 d0.loss_iou: 0.2299 d1.loss_cls: 0.3085 d1.loss_bbox: 0.1262 d1.loss_iou: 0.2218 d2.loss_cls: 0.3019 d2.loss_bbox: 0.1232 d2.loss_iou: 0.2168 d3.loss_cls: 0.2981 d3.loss_bbox: 0.1224 d3.loss_iou: 0.2144 d4.loss_cls: 0.2934 d4.loss_bbox: 0.1227 d4.loss_iou: 0.2126 enc_loss_cls: 0.3302 enc_loss_bbox: 0.1512 enc_loss_iou: 0.2554 dn_loss_cls: 0.1318 dn_loss_bbox: 0.1730 dn_loss_iou: 0.2182 d0.dn_loss_cls: 0.2153 d0.dn_loss_bbox: 0.3089 d0.dn_loss_iou: 0.3664 d1.dn_loss_cls: 0.1653 d1.dn_loss_bbox: 0.1992 d1.dn_loss_iou: 0.2513 d2.dn_loss_cls: 0.1431 d2.dn_loss_bbox: 0.1804 d2.dn_loss_iou: 0.2279 d3.dn_loss_cls: 0.1353 d3.dn_loss_bbox: 0.1735 d3.dn_loss_iou: 0.2204 d4.dn_loss_cls: 0.1312 d4.dn_loss_bbox: 0.1730 d4.dn_loss_iou: 0.2183 d1.loss_lmm_region: 0.1614 loss_lmm_image: 0.8537 2024/11/11 00:27:57 - mmengine - INFO - Iter(train) [ 26700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:16:17 time: 1.9954 data_time: 0.0173 memory: 31677 grad_norm: 28.2898 loss: 8.4278 loss_cls: 0.2288 loss_bbox: 0.1084 loss_iou: 0.1993 d0.loss_cls: 0.2725 d0.loss_bbox: 0.1231 d0.loss_iou: 0.2123 d1.loss_cls: 0.2405 d1.loss_bbox: 0.1151 d1.loss_iou: 0.2030 d2.loss_cls: 0.2343 d2.loss_bbox: 0.1120 d2.loss_iou: 0.1997 d3.loss_cls: 0.2324 d3.loss_bbox: 0.1078 d3.loss_iou: 0.1987 d4.loss_cls: 0.2308 d4.loss_bbox: 0.1066 d4.loss_iou: 0.1979 
enc_loss_cls: 0.2670 enc_loss_bbox: 0.1374 enc_loss_iou: 0.2353 dn_loss_cls: 0.1191 dn_loss_bbox: 0.1552 dn_loss_iou: 0.1977 d0.dn_loss_cls: 0.1958 d0.dn_loss_bbox: 0.3321 d0.dn_loss_iou: 0.3527 d1.dn_loss_cls: 0.1483 d1.dn_loss_bbox: 0.1898 d1.dn_loss_iou: 0.2290 d2.dn_loss_cls: 0.1332 d2.dn_loss_bbox: 0.1635 d2.dn_loss_iou: 0.2060 d3.dn_loss_cls: 0.1264 d3.dn_loss_bbox: 0.1563 d3.dn_loss_iou: 0.1994 d4.dn_loss_cls: 0.1208 d4.dn_loss_bbox: 0.1553 d4.dn_loss_iou: 0.1977 d1.loss_lmm_region: 0.1506 loss_lmm_image: 0.9357 2024/11/11 00:31:15 - mmengine - INFO - Iter(train) [ 26800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:12:53 time: 1.9777 data_time: 0.0171 memory: 34800 grad_norm: 28.1539 loss: 10.0061 loss_cls: 0.3317 loss_bbox: 0.1468 loss_iou: 0.2629 d0.loss_cls: 0.3829 d0.loss_bbox: 0.1537 d0.loss_iou: 0.2801 d1.loss_cls: 0.3511 d1.loss_bbox: 0.1498 d1.loss_iou: 0.2684 d2.loss_cls: 0.3447 d2.loss_bbox: 0.1463 d2.loss_iou: 0.2627 d3.loss_cls: 0.3335 d3.loss_bbox: 0.1478 d3.loss_iou: 0.2634 d4.loss_cls: 0.3294 d4.loss_bbox: 0.1492 d4.loss_iou: 0.2639 enc_loss_cls: 0.3883 enc_loss_bbox: 0.1661 enc_loss_iou: 0.2956 dn_loss_cls: 0.1263 dn_loss_bbox: 0.1595 dn_loss_iou: 0.2179 d0.dn_loss_cls: 0.2057 d0.dn_loss_bbox: 0.3033 d0.dn_loss_iou: 0.3705 d1.dn_loss_cls: 0.1583 d1.dn_loss_bbox: 0.1900 d1.dn_loss_iou: 0.2501 d2.dn_loss_cls: 0.1371 d2.dn_loss_bbox: 0.1680 d2.dn_loss_iou: 0.2270 d3.dn_loss_cls: 0.1310 d3.dn_loss_bbox: 0.1613 d3.dn_loss_iou: 0.2203 d4.dn_loss_cls: 0.1271 d4.dn_loss_bbox: 0.1595 d4.dn_loss_iou: 0.2179 d1.loss_lmm_region: 0.1718 loss_lmm_image: 0.8851 2024/11/11 00:34:34 - mmengine - INFO - Iter(train) [ 26900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:09:30 time: 1.9820 data_time: 0.0172 memory: 34853 grad_norm: 28.3174 loss: 10.4497 loss_cls: 0.3198 loss_bbox: 0.1428 loss_iou: 0.2331 d0.loss_cls: 0.3643 d0.loss_bbox: 0.1544 d0.loss_iou: 0.2490 d1.loss_cls: 0.3395 d1.loss_bbox: 0.1495 d1.loss_iou: 0.2391 d2.loss_cls: 0.3221 d2.loss_bbox: 0.1520 d2.loss_iou: 0.2370 d3.loss_cls: 0.3196 d3.loss_bbox: 0.1458 d3.loss_iou: 0.2371 d4.loss_cls: 0.3168 d4.loss_bbox: 0.1443 d4.loss_iou: 0.2348 enc_loss_cls: 0.3594 enc_loss_bbox: 0.1704 enc_loss_iou: 0.2673 dn_loss_cls: 0.1836 dn_loss_bbox: 0.1894 dn_loss_iou: 0.2312 d0.dn_loss_cls: 0.2648 d0.dn_loss_bbox: 0.3564 d0.dn_loss_iou: 0.3858 d1.dn_loss_cls: 0.2160 d1.dn_loss_bbox: 0.2259 d1.dn_loss_iou: 0.2658 d2.dn_loss_cls: 0.1984 d2.dn_loss_bbox: 0.2013 d2.dn_loss_iou: 0.2425 d3.dn_loss_cls: 0.1901 d3.dn_loss_bbox: 0.1916 d3.dn_loss_iou: 0.2342 d4.dn_loss_cls: 0.1856 d4.dn_loss_bbox: 0.1895 d4.dn_loss_iou: 0.2313 d1.loss_lmm_region: 0.2165 loss_lmm_image: 0.9515 2024/11/11 00:37:50 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 00:37:50 - mmengine - INFO - Iter(train) [ 27000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:05:55 time: 1.9464 data_time: 0.0172 memory: 34217 grad_norm: 28.6439 loss: 10.6838 loss_cls: 0.3949 loss_bbox: 0.1395 loss_iou: 0.2508 d0.loss_cls: 0.4483 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2661 d1.loss_cls: 0.4175 d1.loss_bbox: 0.1427 d1.loss_iou: 0.2573 d2.loss_cls: 0.4064 d2.loss_bbox: 0.1382 d2.loss_iou: 0.2500 d3.loss_cls: 0.3928 d3.loss_bbox: 0.1401 d3.loss_iou: 0.2511 d4.loss_cls: 0.3928 d4.loss_bbox: 0.1407 d4.loss_iou: 0.2510 enc_loss_cls: 0.4441 enc_loss_bbox: 0.1739 enc_loss_iou: 0.2960 dn_loss_cls: 0.1720 dn_loss_bbox: 0.1709 dn_loss_iou: 0.2160 d0.dn_loss_cls: 0.2497 d0.dn_loss_bbox: 0.3297 d0.dn_loss_iou: 0.3680 d1.dn_loss_cls: 
0.2010 d1.dn_loss_bbox: 0.2047 d1.dn_loss_iou: 0.2490 d2.dn_loss_cls: 0.1801 d2.dn_loss_bbox: 0.1804 d2.dn_loss_iou: 0.2264 d3.dn_loss_cls: 0.1736 d3.dn_loss_bbox: 0.1729 d3.dn_loss_iou: 0.2185 d4.dn_loss_cls: 0.1704 d4.dn_loss_bbox: 0.1709 d4.dn_loss_iou: 0.2159 d1.loss_lmm_region: 0.1689 loss_lmm_image: 0.8993 2024/11/11 00:41:07 - mmengine - INFO - Iter(train) [ 27100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 20:02:28 time: 1.9639 data_time: 0.0172 memory: 35576 grad_norm: 35.1872 loss: 10.3999 loss_cls: 0.3156 loss_bbox: 0.1508 loss_iou: 0.2712 d0.loss_cls: 0.3636 d0.loss_bbox: 0.1636 d0.loss_iou: 0.2848 d1.loss_cls: 0.3325 d1.loss_bbox: 0.1530 d1.loss_iou: 0.2732 d2.loss_cls: 0.3220 d2.loss_bbox: 0.1555 d2.loss_iou: 0.2704 d3.loss_cls: 0.3187 d3.loss_bbox: 0.1515 d3.loss_iou: 0.2699 d4.loss_cls: 0.3147 d4.loss_bbox: 0.1533 d4.loss_iou: 0.2729 enc_loss_cls: 0.3543 enc_loss_bbox: 0.1858 enc_loss_iou: 0.3177 dn_loss_cls: 0.1321 dn_loss_bbox: 0.1890 dn_loss_iou: 0.2507 d0.dn_loss_cls: 0.2174 d0.dn_loss_bbox: 0.3395 d0.dn_loss_iou: 0.3965 d1.dn_loss_cls: 0.1620 d1.dn_loss_bbox: 0.2198 d1.dn_loss_iou: 0.2816 d2.dn_loss_cls: 0.1460 d2.dn_loss_bbox: 0.1957 d2.dn_loss_iou: 0.2583 d3.dn_loss_cls: 0.1391 d3.dn_loss_bbox: 0.1906 d3.dn_loss_iou: 0.2530 d4.dn_loss_cls: 0.1327 d4.dn_loss_bbox: 0.1890 d4.dn_loss_iou: 0.2507 d1.loss_lmm_region: 0.1794 loss_lmm_image: 0.8820 2024/11/11 00:44:29 - mmengine - INFO - Iter(train) [ 27200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:59:18 time: 2.0173 data_time: 0.0172 memory: 34550 grad_norm: 35.3693 loss: 10.0399 loss_cls: 0.3351 loss_bbox: 0.1299 loss_iou: 0.2463 d0.loss_cls: 0.3960 d0.loss_bbox: 0.1339 d0.loss_iou: 0.2567 d1.loss_cls: 0.3605 d1.loss_bbox: 0.1294 d1.loss_iou: 0.2506 d2.loss_cls: 0.3443 d2.loss_bbox: 0.1285 d2.loss_iou: 0.2481 d3.loss_cls: 0.3382 d3.loss_bbox: 0.1265 d3.loss_iou: 0.2475 d4.loss_cls: 0.3379 d4.loss_bbox: 0.1262 d4.loss_iou: 0.2457 enc_loss_cls: 0.3818 enc_loss_bbox: 0.1564 enc_loss_iou: 0.2874 dn_loss_cls: 0.1451 dn_loss_bbox: 0.1751 dn_loss_iou: 0.2144 d0.dn_loss_cls: 0.2271 d0.dn_loss_bbox: 0.3248 d0.dn_loss_iou: 0.3600 d1.dn_loss_cls: 0.1756 d1.dn_loss_bbox: 0.2131 d1.dn_loss_iou: 0.2496 d2.dn_loss_cls: 0.1585 d2.dn_loss_bbox: 0.1863 d2.dn_loss_iou: 0.2262 d3.dn_loss_cls: 0.1506 d3.dn_loss_bbox: 0.1771 d3.dn_loss_iou: 0.2175 d4.dn_loss_cls: 0.1452 d4.dn_loss_bbox: 0.1751 d4.dn_loss_iou: 0.2144 d1.loss_lmm_region: 0.1934 loss_lmm_image: 0.9037 2024/11/11 00:47:48 - mmengine - INFO - Iter(train) [ 27300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:55:59 time: 1.9764 data_time: 0.0171 memory: 33081 grad_norm: 26.2233 loss: 9.8326 loss_cls: 0.2605 loss_bbox: 0.1536 loss_iou: 0.2635 d0.loss_cls: 0.3110 d0.loss_bbox: 0.1573 d0.loss_iou: 0.2742 d1.loss_cls: 0.2789 d1.loss_bbox: 0.1579 d1.loss_iou: 0.2696 d2.loss_cls: 0.2709 d2.loss_bbox: 0.1520 d2.loss_iou: 0.2630 d3.loss_cls: 0.2650 d3.loss_bbox: 0.1525 d3.loss_iou: 0.2641 d4.loss_cls: 0.2626 d4.loss_bbox: 0.1534 d4.loss_iou: 0.2633 enc_loss_cls: 0.3018 enc_loss_bbox: 0.1762 enc_loss_iou: 0.2966 dn_loss_cls: 0.0884 dn_loss_bbox: 0.2029 dn_loss_iou: 0.2428 d0.dn_loss_cls: 0.1761 d0.dn_loss_bbox: 0.3765 d0.dn_loss_iou: 0.3864 d1.dn_loss_cls: 0.1252 d1.dn_loss_bbox: 0.2451 d1.dn_loss_iou: 0.2727 d2.dn_loss_cls: 0.1017 d2.dn_loss_bbox: 0.2210 d2.dn_loss_iou: 0.2549 d3.dn_loss_cls: 0.0932 d3.dn_loss_bbox: 0.2078 d3.dn_loss_iou: 0.2463 d4.dn_loss_cls: 0.0905 d4.dn_loss_bbox: 0.2030 d4.dn_loss_iou: 0.2429 d1.loss_lmm_region: 0.1608 
loss_lmm_image: 0.9466 2024/11/11 00:51:08 - mmengine - INFO - Iter(train) [ 27400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:52:42 time: 1.9957 data_time: 0.0171 memory: 35778 grad_norm: 36.3482 loss: 11.0040 loss_cls: 0.3759 loss_bbox: 0.1532 loss_iou: 0.2818 d0.loss_cls: 0.4326 d0.loss_bbox: 0.1675 d0.loss_iou: 0.2968 d1.loss_cls: 0.3975 d1.loss_bbox: 0.1611 d1.loss_iou: 0.2907 d2.loss_cls: 0.3927 d2.loss_bbox: 0.1530 d2.loss_iou: 0.2845 d3.loss_cls: 0.3871 d3.loss_bbox: 0.1516 d3.loss_iou: 0.2807 d4.loss_cls: 0.3757 d4.loss_bbox: 0.1542 d4.loss_iou: 0.2822 enc_loss_cls: 0.4233 enc_loss_bbox: 0.1923 enc_loss_iou: 0.3313 dn_loss_cls: 0.1775 dn_loss_bbox: 0.1762 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.2451 d0.dn_loss_bbox: 0.3142 d0.dn_loss_iou: 0.3632 d1.dn_loss_cls: 0.2047 d1.dn_loss_bbox: 0.2094 d1.dn_loss_iou: 0.2551 d2.dn_loss_cls: 0.1877 d2.dn_loss_bbox: 0.1861 d2.dn_loss_iou: 0.2309 d3.dn_loss_cls: 0.1808 d3.dn_loss_bbox: 0.1789 d3.dn_loss_iou: 0.2238 d4.dn_loss_cls: 0.1763 d4.dn_loss_bbox: 0.1763 d4.dn_loss_iou: 0.2205 d1.loss_lmm_region: 0.1871 loss_lmm_image: 0.9242 2024/11/11 00:54:25 - mmengine - INFO - Iter(train) [ 27500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:49:12 time: 1.9678 data_time: 0.0170 memory: 34888 grad_norm: 31.7767 loss: 9.7431 loss_cls: 0.3409 loss_bbox: 0.1420 loss_iou: 0.2572 d0.loss_cls: 0.3860 d0.loss_bbox: 0.1590 d0.loss_iou: 0.2748 d1.loss_cls: 0.3617 d1.loss_bbox: 0.1531 d1.loss_iou: 0.2683 d2.loss_cls: 0.3476 d2.loss_bbox: 0.1480 d2.loss_iou: 0.2640 d3.loss_cls: 0.3465 d3.loss_bbox: 0.1432 d3.loss_iou: 0.2583 d4.loss_cls: 0.3440 d4.loss_bbox: 0.1413 d4.loss_iou: 0.2565 enc_loss_cls: 0.3829 enc_loss_bbox: 0.1770 enc_loss_iou: 0.3066 dn_loss_cls: 0.1025 dn_loss_bbox: 0.1488 dn_loss_iou: 0.2074 d0.dn_loss_cls: 0.1724 d0.dn_loss_bbox: 0.2770 d0.dn_loss_iou: 0.3512 d1.dn_loss_cls: 0.1284 d1.dn_loss_bbox: 0.1727 d1.dn_loss_iou: 0.2378 d2.dn_loss_cls: 0.1127 d2.dn_loss_bbox: 0.1559 d2.dn_loss_iou: 0.2170 d3.dn_loss_cls: 0.1061 d3.dn_loss_bbox: 0.1503 d3.dn_loss_iou: 0.2100 d4.dn_loss_cls: 0.1035 d4.dn_loss_bbox: 0.1488 d4.dn_loss_iou: 0.2073 d1.loss_lmm_region: 0.1453 loss_lmm_image: 0.9293 2024/11/11 00:57:45 - mmengine - INFO - Iter(train) [ 27600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:45:55 time: 2.0030 data_time: 0.0172 memory: 34185 grad_norm: 26.2090 loss: 10.0568 loss_cls: 0.2937 loss_bbox: 0.1508 loss_iou: 0.2807 d0.loss_cls: 0.3560 d0.loss_bbox: 0.1558 d0.loss_iou: 0.2907 d1.loss_cls: 0.3186 d1.loss_bbox: 0.1555 d1.loss_iou: 0.2844 d2.loss_cls: 0.3104 d2.loss_bbox: 0.1501 d2.loss_iou: 0.2811 d3.loss_cls: 0.3018 d3.loss_bbox: 0.1498 d3.loss_iou: 0.2782 d4.loss_cls: 0.2950 d4.loss_bbox: 0.1516 d4.loss_iou: 0.2783 enc_loss_cls: 0.3475 enc_loss_bbox: 0.1741 enc_loss_iou: 0.3106 dn_loss_cls: 0.1103 dn_loss_bbox: 0.1806 dn_loss_iou: 0.2358 d0.dn_loss_cls: 0.1913 d0.dn_loss_bbox: 0.3375 d0.dn_loss_iou: 0.3837 d1.dn_loss_cls: 0.1417 d1.dn_loss_bbox: 0.2159 d1.dn_loss_iou: 0.2691 d2.dn_loss_cls: 0.1228 d2.dn_loss_bbox: 0.1916 d2.dn_loss_iou: 0.2453 d3.dn_loss_cls: 0.1172 d3.dn_loss_bbox: 0.1821 d3.dn_loss_iou: 0.2380 d4.dn_loss_cls: 0.1112 d4.dn_loss_bbox: 0.1806 d4.dn_loss_iou: 0.2359 d1.loss_lmm_region: 0.1657 loss_lmm_image: 0.8857 2024/11/11 01:01:05 - mmengine - INFO - Iter(train) [ 27700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:42:38 time: 2.0095 data_time: 0.0172 memory: 32145 grad_norm: 28.5825 loss: 10.3223 loss_cls: 0.3114 loss_bbox: 0.1470 loss_iou: 0.2244 d0.loss_cls: 0.3516 
d0.loss_bbox: 0.1681 d0.loss_iou: 0.2446 d1.loss_cls: 0.3253 d1.loss_bbox: 0.1629 d1.loss_iou: 0.2358 d2.loss_cls: 0.3155 d2.loss_bbox: 0.1569 d2.loss_iou: 0.2294 d3.loss_cls: 0.3155 d3.loss_bbox: 0.1489 d3.loss_iou: 0.2228 d4.loss_cls: 0.3101 d4.loss_bbox: 0.1488 d4.loss_iou: 0.2272 enc_loss_cls: 0.3555 enc_loss_bbox: 0.1842 enc_loss_iou: 0.2686 dn_loss_cls: 0.1453 dn_loss_bbox: 0.2108 dn_loss_iou: 0.2242 d0.dn_loss_cls: 0.2374 d0.dn_loss_bbox: 0.3813 d0.dn_loss_iou: 0.3817 d1.dn_loss_cls: 0.1843 d1.dn_loss_bbox: 0.2523 d1.dn_loss_iou: 0.2608 d2.dn_loss_cls: 0.1621 d2.dn_loss_bbox: 0.2261 d2.dn_loss_iou: 0.2372 d3.dn_loss_cls: 0.1518 d3.dn_loss_bbox: 0.2137 d3.dn_loss_iou: 0.2278 d4.dn_loss_cls: 0.1470 d4.dn_loss_bbox: 0.2108 d4.dn_loss_iou: 0.2242 d1.loss_lmm_region: 0.1849 loss_lmm_image: 1.0040 2024/11/11 01:04:22 - mmengine - INFO - Iter(train) [ 27800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:39:10 time: 1.9678 data_time: 0.0171 memory: 33690 grad_norm: 27.6838 loss: 9.6486 loss_cls: 0.3026 loss_bbox: 0.1440 loss_iou: 0.2344 d0.loss_cls: 0.3505 d0.loss_bbox: 0.1670 d0.loss_iou: 0.2572 d1.loss_cls: 0.3182 d1.loss_bbox: 0.1519 d1.loss_iou: 0.2434 d2.loss_cls: 0.3086 d2.loss_bbox: 0.1470 d2.loss_iou: 0.2403 d3.loss_cls: 0.3043 d3.loss_bbox: 0.1483 d3.loss_iou: 0.2370 d4.loss_cls: 0.3048 d4.loss_bbox: 0.1457 d4.loss_iou: 0.2348 enc_loss_cls: 0.3453 enc_loss_bbox: 0.1787 enc_loss_iou: 0.2779 dn_loss_cls: 0.1183 dn_loss_bbox: 0.1707 dn_loss_iou: 0.2105 d0.dn_loss_cls: 0.1929 d0.dn_loss_bbox: 0.3251 d0.dn_loss_iou: 0.3593 d1.dn_loss_cls: 0.1449 d1.dn_loss_bbox: 0.2148 d1.dn_loss_iou: 0.2484 d2.dn_loss_cls: 0.1280 d2.dn_loss_bbox: 0.1853 d2.dn_loss_iou: 0.2231 d3.dn_loss_cls: 0.1208 d3.dn_loss_bbox: 0.1738 d3.dn_loss_iou: 0.2141 d4.dn_loss_cls: 0.1176 d4.dn_loss_bbox: 0.1706 d4.dn_loss_iou: 0.2105 d1.loss_lmm_region: 0.1628 loss_lmm_image: 0.9152 2024/11/11 01:07:41 - mmengine - INFO - Iter(train) [ 27900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:35:47 time: 1.9991 data_time: 0.0172 memory: 33880 grad_norm: 38.8368 loss: 9.4410 loss_cls: 0.2990 loss_bbox: 0.1394 loss_iou: 0.2490 d0.loss_cls: 0.3574 d0.loss_bbox: 0.1548 d0.loss_iou: 0.2658 d1.loss_cls: 0.3227 d1.loss_bbox: 0.1438 d1.loss_iou: 0.2521 d2.loss_cls: 0.3144 d2.loss_bbox: 0.1381 d2.loss_iou: 0.2474 d3.loss_cls: 0.3096 d3.loss_bbox: 0.1346 d3.loss_iou: 0.2456 d4.loss_cls: 0.3034 d4.loss_bbox: 0.1383 d4.loss_iou: 0.2482 enc_loss_cls: 0.3513 enc_loss_bbox: 0.1693 enc_loss_iou: 0.2914 dn_loss_cls: 0.1046 dn_loss_bbox: 0.1497 dn_loss_iou: 0.2112 d0.dn_loss_cls: 0.1760 d0.dn_loss_bbox: 0.3027 d0.dn_loss_iou: 0.3578 d1.dn_loss_cls: 0.1317 d1.dn_loss_bbox: 0.1852 d1.dn_loss_iou: 0.2441 d2.dn_loss_cls: 0.1142 d2.dn_loss_bbox: 0.1598 d2.dn_loss_iou: 0.2208 d3.dn_loss_cls: 0.1084 d3.dn_loss_bbox: 0.1509 d3.dn_loss_iou: 0.2130 d4.dn_loss_cls: 0.1047 d4.dn_loss_bbox: 0.1496 d4.dn_loss_iou: 0.2111 d1.loss_lmm_region: 0.1644 loss_lmm_image: 0.9054 2024/11/11 01:10:59 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 01:10:59 - mmengine - INFO - Iter(train) [ 28000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:32:22 time: 1.9706 data_time: 0.0172 memory: 34887 grad_norm: 31.1695 loss: 9.9881 loss_cls: 0.3042 loss_bbox: 0.1384 loss_iou: 0.2791 d0.loss_cls: 0.3484 d0.loss_bbox: 0.1477 d0.loss_iou: 0.2912 d1.loss_cls: 0.3225 d1.loss_bbox: 0.1467 d1.loss_iou: 0.2871 d2.loss_cls: 0.3156 d2.loss_bbox: 0.1394 d2.loss_iou: 0.2787 d3.loss_cls: 0.3109 d3.loss_bbox: 0.1352 
d3.loss_iou: 0.2787 d4.loss_cls: 0.3076 d4.loss_bbox: 0.1358 d4.loss_iou: 0.2780 enc_loss_cls: 0.3482 enc_loss_bbox: 0.1605 enc_loss_iou: 0.3139 dn_loss_cls: 0.1051 dn_loss_bbox: 0.1751 dn_loss_iou: 0.2436 d0.dn_loss_cls: 0.1851 d0.dn_loss_bbox: 0.3161 d0.dn_loss_iou: 0.3869 d1.dn_loss_cls: 0.1384 d1.dn_loss_bbox: 0.2063 d1.dn_loss_iou: 0.2732 d2.dn_loss_cls: 0.1216 d2.dn_loss_bbox: 0.1867 d2.dn_loss_iou: 0.2524 d3.dn_loss_cls: 0.1106 d3.dn_loss_bbox: 0.1778 d3.dn_loss_iou: 0.2460 d4.dn_loss_cls: 0.1074 d4.dn_loss_bbox: 0.1752 d4.dn_loss_iou: 0.2435 d1.loss_lmm_region: 0.1595 loss_lmm_image: 0.9099 2024/11/11 01:14:15 - mmengine - INFO - Iter(train) [ 28100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:28:50 time: 1.9636 data_time: 0.0175 memory: 33726 grad_norm: 33.6259 loss: 11.5481 loss_cls: 0.4056 loss_bbox: 0.1744 loss_iou: 0.3163 d0.loss_cls: 0.4617 d0.loss_bbox: 0.1831 d0.loss_iou: 0.3298 d1.loss_cls: 0.4295 d1.loss_bbox: 0.1773 d1.loss_iou: 0.3193 d2.loss_cls: 0.4160 d2.loss_bbox: 0.1771 d2.loss_iou: 0.3149 d3.loss_cls: 0.4154 d3.loss_bbox: 0.1696 d3.loss_iou: 0.3124 d4.loss_cls: 0.4102 d4.loss_bbox: 0.1724 d4.loss_iou: 0.3124 enc_loss_cls: 0.4556 enc_loss_bbox: 0.2045 enc_loss_iou: 0.3637 dn_loss_cls: 0.1464 dn_loss_bbox: 0.1767 dn_loss_iou: 0.2406 d0.dn_loss_cls: 0.2365 d0.dn_loss_bbox: 0.3258 d0.dn_loss_iou: 0.3907 d1.dn_loss_cls: 0.1823 d1.dn_loss_bbox: 0.2093 d1.dn_loss_iou: 0.2723 d2.dn_loss_cls: 0.1582 d2.dn_loss_bbox: 0.1867 d2.dn_loss_iou: 0.2503 d3.dn_loss_cls: 0.1494 d3.dn_loss_bbox: 0.1776 d3.dn_loss_iou: 0.2425 d4.dn_loss_cls: 0.1460 d4.dn_loss_bbox: 0.1767 d4.dn_loss_iou: 0.2406 d1.loss_lmm_region: 0.1779 loss_lmm_image: 0.9405 2024/11/11 01:17:33 - mmengine - INFO - Iter(train) [ 28200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:25:26 time: 1.9927 data_time: 0.0172 memory: 35208 grad_norm: 30.9563 loss: 10.7907 loss_cls: 0.3459 loss_bbox: 0.1525 loss_iou: 0.3051 d0.loss_cls: 0.3947 d0.loss_bbox: 0.1626 d0.loss_iou: 0.3178 d1.loss_cls: 0.3644 d1.loss_bbox: 0.1561 d1.loss_iou: 0.3098 d2.loss_cls: 0.3543 d2.loss_bbox: 0.1495 d2.loss_iou: 0.3028 d3.loss_cls: 0.3548 d3.loss_bbox: 0.1497 d3.loss_iou: 0.3015 d4.loss_cls: 0.3501 d4.loss_bbox: 0.1500 d4.loss_iou: 0.3016 enc_loss_cls: 0.3948 enc_loss_bbox: 0.1786 enc_loss_iou: 0.3392 dn_loss_cls: 0.1528 dn_loss_bbox: 0.1761 dn_loss_iou: 0.2316 d0.dn_loss_cls: 0.2225 d0.dn_loss_bbox: 0.3345 d0.dn_loss_iou: 0.3823 d1.dn_loss_cls: 0.1788 d1.dn_loss_bbox: 0.2122 d1.dn_loss_iou: 0.2652 d2.dn_loss_cls: 0.1619 d2.dn_loss_bbox: 0.1867 d2.dn_loss_iou: 0.2418 d3.dn_loss_cls: 0.1573 d3.dn_loss_bbox: 0.1797 d3.dn_loss_iou: 0.2347 d4.dn_loss_cls: 0.1540 d4.dn_loss_bbox: 0.1761 d4.dn_loss_iou: 0.2314 d1.loss_lmm_region: 0.1621 loss_lmm_image: 0.9132 2024/11/11 01:20:50 - mmengine - INFO - Iter(train) [ 28300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:21:58 time: 1.9744 data_time: 0.0171 memory: 33734 grad_norm: 30.1969 loss: 10.3838 loss_cls: 0.3295 loss_bbox: 0.1465 loss_iou: 0.2730 d0.loss_cls: 0.3841 d0.loss_bbox: 0.1552 d0.loss_iou: 0.2824 d1.loss_cls: 0.3549 d1.loss_bbox: 0.1490 d1.loss_iou: 0.2770 d2.loss_cls: 0.3434 d2.loss_bbox: 0.1469 d2.loss_iou: 0.2718 d3.loss_cls: 0.3329 d3.loss_bbox: 0.1478 d3.loss_iou: 0.2735 d4.loss_cls: 0.3333 d4.loss_bbox: 0.1458 d4.loss_iou: 0.2708 enc_loss_cls: 0.3780 enc_loss_bbox: 0.1718 enc_loss_iou: 0.3065 dn_loss_cls: 0.1694 dn_loss_bbox: 0.1684 dn_loss_iou: 0.2208 d0.dn_loss_cls: 0.2429 d0.dn_loss_bbox: 0.3177 d0.dn_loss_iou: 0.3604 d1.dn_loss_cls: 0.1985 
d1.dn_loss_bbox: 0.1994 d1.dn_loss_iou: 0.2522 d2.dn_loss_cls: 0.1808 d2.dn_loss_bbox: 0.1793 d2.dn_loss_iou: 0.2305 d3.dn_loss_cls: 0.1754 d3.dn_loss_bbox: 0.1701 d3.dn_loss_iou: 0.2232 d4.dn_loss_cls: 0.1710 d4.dn_loss_bbox: 0.1685 d4.dn_loss_iou: 0.2208 d1.loss_lmm_region: 0.1668 loss_lmm_image: 0.8937 2024/11/11 01:24:08 - mmengine - INFO - Iter(train) [ 28400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:18:30 time: 1.9667 data_time: 0.0174 memory: 33793 grad_norm: 27.3509 loss: 9.5830 loss_cls: 0.2865 loss_bbox: 0.1460 loss_iou: 0.2330 d0.loss_cls: 0.3291 d0.loss_bbox: 0.1616 d0.loss_iou: 0.2481 d1.loss_cls: 0.2972 d1.loss_bbox: 0.1607 d1.loss_iou: 0.2455 d2.loss_cls: 0.2914 d2.loss_bbox: 0.1543 d2.loss_iou: 0.2386 d3.loss_cls: 0.2901 d3.loss_bbox: 0.1481 d3.loss_iou: 0.2332 d4.loss_cls: 0.2868 d4.loss_bbox: 0.1481 d4.loss_iou: 0.2330 enc_loss_cls: 0.3249 enc_loss_bbox: 0.1798 enc_loss_iou: 0.2734 dn_loss_cls: 0.1256 dn_loss_bbox: 0.1836 dn_loss_iou: 0.2115 d0.dn_loss_cls: 0.2084 d0.dn_loss_bbox: 0.3307 d0.dn_loss_iou: 0.3517 d1.dn_loss_cls: 0.1574 d1.dn_loss_bbox: 0.2187 d1.dn_loss_iou: 0.2445 d2.dn_loss_cls: 0.1404 d2.dn_loss_bbox: 0.1951 d2.dn_loss_iou: 0.2220 d3.dn_loss_cls: 0.1325 d3.dn_loss_bbox: 0.1854 d3.dn_loss_iou: 0.2142 d4.dn_loss_cls: 0.1261 d4.dn_loss_bbox: 0.1836 d4.dn_loss_iou: 0.2116 d1.loss_lmm_region: 0.1541 loss_lmm_image: 0.8766 2024/11/11 01:27:26 - mmengine - INFO - Iter(train) [ 28500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:15:06 time: 1.9862 data_time: 0.0172 memory: 33444 grad_norm: 34.2245 loss: 10.2846 loss_cls: 0.3050 loss_bbox: 0.1487 loss_iou: 0.2674 d0.loss_cls: 0.3584 d0.loss_bbox: 0.1527 d0.loss_iou: 0.2858 d1.loss_cls: 0.3283 d1.loss_bbox: 0.1466 d1.loss_iou: 0.2763 d2.loss_cls: 0.3186 d2.loss_bbox: 0.1459 d2.loss_iou: 0.2701 d3.loss_cls: 0.3128 d3.loss_bbox: 0.1450 d3.loss_iou: 0.2669 d4.loss_cls: 0.3102 d4.loss_bbox: 0.1439 d4.loss_iou: 0.2651 enc_loss_cls: 0.3593 enc_loss_bbox: 0.1683 enc_loss_iou: 0.3058 dn_loss_cls: 0.1155 dn_loss_bbox: 0.1931 dn_loss_iou: 0.2467 d0.dn_loss_cls: 0.2036 d0.dn_loss_bbox: 0.3685 d0.dn_loss_iou: 0.4112 d1.dn_loss_cls: 0.1514 d1.dn_loss_bbox: 0.2316 d1.dn_loss_iou: 0.2807 d2.dn_loss_cls: 0.1294 d2.dn_loss_bbox: 0.2049 d2.dn_loss_iou: 0.2576 d3.dn_loss_cls: 0.1221 d3.dn_loss_bbox: 0.1953 d3.dn_loss_iou: 0.2493 d4.dn_loss_cls: 0.1171 d4.dn_loss_bbox: 0.1931 d4.dn_loss_iou: 0.2467 d1.loss_lmm_region: 0.1758 loss_lmm_image: 0.9099 2024/11/11 01:30:44 - mmengine - INFO - Iter(train) [ 28600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:11:42 time: 1.9737 data_time: 0.0173 memory: 34046 grad_norm: 30.1542 loss: 11.4231 loss_cls: 0.3817 loss_bbox: 0.1821 loss_iou: 0.3187 d0.loss_cls: 0.4478 d0.loss_bbox: 0.1877 d0.loss_iou: 0.3333 d1.loss_cls: 0.4093 d1.loss_bbox: 0.1770 d1.loss_iou: 0.3204 d2.loss_cls: 0.3915 d2.loss_bbox: 0.1754 d2.loss_iou: 0.3191 d3.loss_cls: 0.3837 d3.loss_bbox: 0.1799 d3.loss_iou: 0.3164 d4.loss_cls: 0.3784 d4.loss_bbox: 0.1816 d4.loss_iou: 0.3182 enc_loss_cls: 0.4413 enc_loss_bbox: 0.1942 enc_loss_iou: 0.3485 dn_loss_cls: 0.1505 dn_loss_bbox: 0.1842 dn_loss_iou: 0.2325 d0.dn_loss_cls: 0.2341 d0.dn_loss_bbox: 0.3252 d0.dn_loss_iou: 0.3641 d1.dn_loss_cls: 0.1866 d1.dn_loss_bbox: 0.2169 d1.dn_loss_iou: 0.2607 d2.dn_loss_cls: 0.1646 d2.dn_loss_bbox: 0.1953 d2.dn_loss_iou: 0.2411 d3.dn_loss_cls: 0.1567 d3.dn_loss_bbox: 0.1864 d3.dn_loss_iou: 0.2348 d4.dn_loss_cls: 0.1525 d4.dn_loss_bbox: 0.1841 d4.dn_loss_iou: 0.2325 d1.loss_lmm_region: 0.2168 loss_lmm_image: 
0.9176 2024/11/11 01:34:02 - mmengine - INFO - Iter(train) [ 28700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:08:19 time: 1.9914 data_time: 0.0172 memory: 33955 grad_norm: 32.1585 loss: 9.4372 loss_cls: 0.2839 loss_bbox: 0.1269 loss_iou: 0.2255 d0.loss_cls: 0.3313 d0.loss_bbox: 0.1356 d0.loss_iou: 0.2384 d1.loss_cls: 0.2993 d1.loss_bbox: 0.1324 d1.loss_iou: 0.2341 d2.loss_cls: 0.2917 d2.loss_bbox: 0.1243 d2.loss_iou: 0.2294 d3.loss_cls: 0.2863 d3.loss_bbox: 0.1252 d3.loss_iou: 0.2286 d4.loss_cls: 0.2810 d4.loss_bbox: 0.1289 d4.loss_iou: 0.2283 enc_loss_cls: 0.3212 enc_loss_bbox: 0.1496 enc_loss_iou: 0.2599 dn_loss_cls: 0.1557 dn_loss_bbox: 0.1612 dn_loss_iou: 0.2100 d0.dn_loss_cls: 0.2343 d0.dn_loss_bbox: 0.3037 d0.dn_loss_iou: 0.3473 d1.dn_loss_cls: 0.1804 d1.dn_loss_bbox: 0.1943 d1.dn_loss_iou: 0.2390 d2.dn_loss_cls: 0.1636 d2.dn_loss_bbox: 0.1707 d2.dn_loss_iou: 0.2182 d3.dn_loss_cls: 0.1548 d3.dn_loss_bbox: 0.1630 d3.dn_loss_iou: 0.2124 d4.dn_loss_cls: 0.1546 d4.dn_loss_bbox: 0.1612 d4.dn_loss_iou: 0.2100 d1.loss_lmm_region: 0.2003 loss_lmm_image: 0.9406 2024/11/11 01:37:20 - mmengine - INFO - Iter(train) [ 28800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:04:55 time: 1.9955 data_time: 0.0172 memory: 34062 grad_norm: 29.9309 loss: 8.6870 loss_cls: 0.2695 loss_bbox: 0.1025 loss_iou: 0.2197 d0.loss_cls: 0.3123 d0.loss_bbox: 0.1185 d0.loss_iou: 0.2377 d1.loss_cls: 0.2873 d1.loss_bbox: 0.1099 d1.loss_iou: 0.2305 d2.loss_cls: 0.2743 d2.loss_bbox: 0.1059 d2.loss_iou: 0.2244 d3.loss_cls: 0.2712 d3.loss_bbox: 0.1049 d3.loss_iou: 0.2221 d4.loss_cls: 0.2672 d4.loss_bbox: 0.1067 d4.loss_iou: 0.2232 enc_loss_cls: 0.3019 enc_loss_bbox: 0.1362 enc_loss_iou: 0.2672 dn_loss_cls: 0.1074 dn_loss_bbox: 0.1436 dn_loss_iou: 0.2056 d0.dn_loss_cls: 0.1758 d0.dn_loss_bbox: 0.2948 d0.dn_loss_iou: 0.3538 d1.dn_loss_cls: 0.1338 d1.dn_loss_bbox: 0.1738 d1.dn_loss_iou: 0.2372 d2.dn_loss_cls: 0.1169 d2.dn_loss_bbox: 0.1519 d2.dn_loss_iou: 0.2151 d3.dn_loss_cls: 0.1110 d3.dn_loss_bbox: 0.1453 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.1074 d4.dn_loss_bbox: 0.1436 d4.dn_loss_iou: 0.2057 d1.loss_lmm_region: 0.1338 loss_lmm_image: 0.9293 2024/11/11 01:40:41 - mmengine - INFO - Iter(train) [ 28900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 19:01:39 time: 2.0007 data_time: 0.0171 memory: 35250 grad_norm: 29.2227 loss: 10.8458 loss_cls: 0.3577 loss_bbox: 0.1641 loss_iou: 0.3036 d0.loss_cls: 0.4206 d0.loss_bbox: 0.1680 d0.loss_iou: 0.3212 d1.loss_cls: 0.3903 d1.loss_bbox: 0.1634 d1.loss_iou: 0.3125 d2.loss_cls: 0.3742 d2.loss_bbox: 0.1620 d2.loss_iou: 0.3047 d3.loss_cls: 0.3603 d3.loss_bbox: 0.1684 d3.loss_iou: 0.3072 d4.loss_cls: 0.3578 d4.loss_bbox: 0.1671 d4.loss_iou: 0.3059 enc_loss_cls: 0.4093 enc_loss_bbox: 0.1842 enc_loss_iou: 0.3488 dn_loss_cls: 0.1450 dn_loss_bbox: 0.1672 dn_loss_iou: 0.2325 d0.dn_loss_cls: 0.2141 d0.dn_loss_bbox: 0.3126 d0.dn_loss_iou: 0.3794 d1.dn_loss_cls: 0.1688 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2643 d2.dn_loss_cls: 0.1546 d2.dn_loss_bbox: 0.1759 d2.dn_loss_iou: 0.2409 d3.dn_loss_cls: 0.1496 d3.dn_loss_bbox: 0.1683 d3.dn_loss_iou: 0.2342 d4.dn_loss_cls: 0.1457 d4.dn_loss_bbox: 0.1671 d4.dn_loss_iou: 0.2324 d1.loss_lmm_region: 0.1618 loss_lmm_image: 0.8828 2024/11/11 01:44:00 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 01:44:00 - mmengine - INFO - Iter(train) [ 29000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:58:22 time: 2.0083 data_time: 0.0172 memory: 34342 grad_norm: 28.0714 loss: 9.8905 
loss_cls: 0.3003 loss_bbox: 0.1324 loss_iou: 0.2600 d0.loss_cls: 0.3575 d0.loss_bbox: 0.1421 d0.loss_iou: 0.2747 d1.loss_cls: 0.3263 d1.loss_bbox: 0.1326 d1.loss_iou: 0.2619 d2.loss_cls: 0.3113 d2.loss_bbox: 0.1327 d2.loss_iou: 0.2620 d3.loss_cls: 0.3045 d3.loss_bbox: 0.1312 d3.loss_iou: 0.2595 d4.loss_cls: 0.3002 d4.loss_bbox: 0.1333 d4.loss_iou: 0.2604 enc_loss_cls: 0.3614 enc_loss_bbox: 0.1579 enc_loss_iou: 0.2971 dn_loss_cls: 0.1226 dn_loss_bbox: 0.1683 dn_loss_iou: 0.2256 d0.dn_loss_cls: 0.2126 d0.dn_loss_bbox: 0.3292 d0.dn_loss_iou: 0.3714 d1.dn_loss_cls: 0.1576 d1.dn_loss_bbox: 0.2117 d1.dn_loss_iou: 0.2613 d2.dn_loss_cls: 0.1373 d2.dn_loss_bbox: 0.1842 d2.dn_loss_iou: 0.2376 d3.dn_loss_cls: 0.1280 d3.dn_loss_bbox: 0.1724 d3.dn_loss_iou: 0.2287 d4.dn_loss_cls: 0.1229 d4.dn_loss_bbox: 0.1684 d4.dn_loss_iou: 0.2255 d1.loss_lmm_region: 0.1753 loss_lmm_image: 0.9504 2024/11/11 01:47:18 - mmengine - INFO - Iter(train) [ 29100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:54:55 time: 1.9759 data_time: 0.0171 memory: 34785 grad_norm: 30.8134 loss: 8.7727 loss_cls: 0.2728 loss_bbox: 0.1108 loss_iou: 0.2036 d0.loss_cls: 0.3203 d0.loss_bbox: 0.1213 d0.loss_iou: 0.2160 d1.loss_cls: 0.2976 d1.loss_bbox: 0.1143 d1.loss_iou: 0.2097 d2.loss_cls: 0.2824 d2.loss_bbox: 0.1126 d2.loss_iou: 0.2080 d3.loss_cls: 0.2784 d3.loss_bbox: 0.1084 d3.loss_iou: 0.2015 d4.loss_cls: 0.2770 d4.loss_bbox: 0.1079 d4.loss_iou: 0.2001 enc_loss_cls: 0.3249 enc_loss_bbox: 0.1360 enc_loss_iou: 0.2333 dn_loss_cls: 0.1348 dn_loss_bbox: 0.1542 dn_loss_iou: 0.1857 d0.dn_loss_cls: 0.2137 d0.dn_loss_bbox: 0.3019 d0.dn_loss_iou: 0.3271 d1.dn_loss_cls: 0.1645 d1.dn_loss_bbox: 0.1923 d1.dn_loss_iou: 0.2183 d2.dn_loss_cls: 0.1469 d2.dn_loss_bbox: 0.1652 d2.dn_loss_iou: 0.1952 d3.dn_loss_cls: 0.1405 d3.dn_loss_bbox: 0.1570 d3.dn_loss_iou: 0.1884 d4.dn_loss_cls: 0.1364 d4.dn_loss_bbox: 0.1543 d4.dn_loss_iou: 0.1857 d1.loss_lmm_region: 0.1761 loss_lmm_image: 0.8973 2024/11/11 01:50:39 - mmengine - INFO - Iter(train) [ 29200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:51:44 time: 2.0155 data_time: 0.0173 memory: 34940 grad_norm: 31.4013 loss: 11.1837 loss_cls: 0.3955 loss_bbox: 0.1678 loss_iou: 0.2974 d0.loss_cls: 0.4470 d0.loss_bbox: 0.1784 d0.loss_iou: 0.3151 d1.loss_cls: 0.4194 d1.loss_bbox: 0.1722 d1.loss_iou: 0.3071 d2.loss_cls: 0.4024 d2.loss_bbox: 0.1667 d2.loss_iou: 0.3000 d3.loss_cls: 0.3995 d3.loss_bbox: 0.1642 d3.loss_iou: 0.2966 d4.loss_cls: 0.3953 d4.loss_bbox: 0.1665 d4.loss_iou: 0.2962 enc_loss_cls: 0.4498 enc_loss_bbox: 0.1927 enc_loss_iou: 0.3381 dn_loss_cls: 0.1469 dn_loss_bbox: 0.1670 dn_loss_iou: 0.2362 d0.dn_loss_cls: 0.2371 d0.dn_loss_bbox: 0.3037 d0.dn_loss_iou: 0.3837 d1.dn_loss_cls: 0.1864 d1.dn_loss_bbox: 0.1976 d1.dn_loss_iou: 0.2681 d2.dn_loss_cls: 0.1606 d2.dn_loss_bbox: 0.1772 d2.dn_loss_iou: 0.2462 d3.dn_loss_cls: 0.1536 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2390 d4.dn_loss_cls: 0.1497 d4.dn_loss_bbox: 0.1670 d4.dn_loss_iou: 0.2361 d1.loss_lmm_region: 0.1860 loss_lmm_image: 0.9049 2024/11/11 01:54:00 - mmengine - INFO - Iter(train) [ 29300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:48:31 time: 2.0230 data_time: 0.0172 memory: 34023 grad_norm: 28.1938 loss: 10.5623 loss_cls: 0.3590 loss_bbox: 0.1406 loss_iou: 0.2848 d0.loss_cls: 0.4143 d0.loss_bbox: 0.1578 d0.loss_iou: 0.3010 d1.loss_cls: 0.3746 d1.loss_bbox: 0.1497 d1.loss_iou: 0.2931 d2.loss_cls: 0.3701 d2.loss_bbox: 0.1430 d2.loss_iou: 0.2883 d3.loss_cls: 0.3669 d3.loss_bbox: 0.1419 d3.loss_iou: 0.2876 
d4.loss_cls: 0.3607 d4.loss_bbox: 0.1395 d4.loss_iou: 0.2836 enc_loss_cls: 0.4059 enc_loss_bbox: 0.1704 enc_loss_iou: 0.3239 dn_loss_cls: 0.1216 dn_loss_bbox: 0.1690 dn_loss_iou: 0.2400 d0.dn_loss_cls: 0.2102 d0.dn_loss_bbox: 0.3107 d0.dn_loss_iou: 0.3915 d1.dn_loss_cls: 0.1539 d1.dn_loss_bbox: 0.1957 d1.dn_loss_iou: 0.2720 d2.dn_loss_cls: 0.1366 d2.dn_loss_bbox: 0.1757 d2.dn_loss_iou: 0.2506 d3.dn_loss_cls: 0.1288 d3.dn_loss_bbox: 0.1704 d3.dn_loss_iou: 0.2425 d4.dn_loss_cls: 0.1235 d4.dn_loss_bbox: 0.1689 d4.dn_loss_iou: 0.2399 d1.loss_lmm_region: 0.1706 loss_lmm_image: 0.9336 2024/11/11 01:57:17 - mmengine - INFO - Iter(train) [ 29400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:45:05 time: 1.9632 data_time: 0.0172 memory: 34212 grad_norm: 27.6253 loss: 10.7425 loss_cls: 0.3172 loss_bbox: 0.1729 loss_iou: 0.2619 d0.loss_cls: 0.3605 d0.loss_bbox: 0.1856 d0.loss_iou: 0.2765 d1.loss_cls: 0.3244 d1.loss_bbox: 0.1864 d1.loss_iou: 0.2741 d2.loss_cls: 0.3290 d2.loss_bbox: 0.1697 d2.loss_iou: 0.2612 d3.loss_cls: 0.3214 d3.loss_bbox: 0.1723 d3.loss_iou: 0.2625 d4.loss_cls: 0.3190 d4.loss_bbox: 0.1716 d4.loss_iou: 0.2607 enc_loss_cls: 0.3610 enc_loss_bbox: 0.1996 enc_loss_iou: 0.2976 dn_loss_cls: 0.1230 dn_loss_bbox: 0.2254 dn_loss_iou: 0.2530 d0.dn_loss_cls: 0.2166 d0.dn_loss_bbox: 0.3843 d0.dn_loss_iou: 0.4039 d1.dn_loss_cls: 0.1579 d1.dn_loss_bbox: 0.2602 d1.dn_loss_iou: 0.2850 d2.dn_loss_cls: 0.1370 d2.dn_loss_bbox: 0.2344 d2.dn_loss_iou: 0.2619 d3.dn_loss_cls: 0.1281 d3.dn_loss_bbox: 0.2275 d3.dn_loss_iou: 0.2556 d4.dn_loss_cls: 0.1236 d4.dn_loss_bbox: 0.2253 d4.dn_loss_iou: 0.2529 d1.loss_lmm_region: 0.1919 loss_lmm_image: 0.9096 2024/11/11 02:00:38 - mmengine - INFO - Iter(train) [ 29500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:41:52 time: 1.9965 data_time: 0.0173 memory: 34413 grad_norm: 31.6327 loss: 10.6746 loss_cls: 0.3602 loss_bbox: 0.1362 loss_iou: 0.2307 d0.loss_cls: 0.4194 d0.loss_bbox: 0.1409 d0.loss_iou: 0.2430 d1.loss_cls: 0.3865 d1.loss_bbox: 0.1357 d1.loss_iou: 0.2353 d2.loss_cls: 0.3730 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2306 d3.loss_cls: 0.3664 d3.loss_bbox: 0.1333 d3.loss_iou: 0.2300 d4.loss_cls: 0.3630 d4.loss_bbox: 0.1331 d4.loss_iou: 0.2308 enc_loss_cls: 0.4130 enc_loss_bbox: 0.1581 enc_loss_iou: 0.2667 dn_loss_cls: 0.2695 dn_loss_bbox: 0.1710 dn_loss_iou: 0.1912 d0.dn_loss_cls: 0.3456 d0.dn_loss_bbox: 0.2918 d0.dn_loss_iou: 0.3174 d1.dn_loss_cls: 0.3116 d1.dn_loss_bbox: 0.2007 d1.dn_loss_iou: 0.2195 d2.dn_loss_cls: 0.2870 d2.dn_loss_bbox: 0.1806 d2.dn_loss_iou: 0.1994 d3.dn_loss_cls: 0.2744 d3.dn_loss_bbox: 0.1724 d3.dn_loss_iou: 0.1931 d4.dn_loss_cls: 0.2678 d4.dn_loss_bbox: 0.1710 d4.dn_loss_iou: 0.1911 d1.loss_lmm_region: 0.1840 loss_lmm_image: 0.9152 2024/11/11 02:03:57 - mmengine - INFO - Iter(train) [ 29600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:38:31 time: 1.9746 data_time: 0.0171 memory: 34215 grad_norm: 30.5815 loss: 9.1511 loss_cls: 0.2802 loss_bbox: 0.1289 loss_iou: 0.2173 d0.loss_cls: 0.3316 d0.loss_bbox: 0.1377 d0.loss_iou: 0.2253 d1.loss_cls: 0.3056 d1.loss_bbox: 0.1333 d1.loss_iou: 0.2190 d2.loss_cls: 0.2946 d2.loss_bbox: 0.1307 d2.loss_iou: 0.2161 d3.loss_cls: 0.2863 d3.loss_bbox: 0.1261 d3.loss_iou: 0.2161 d4.loss_cls: 0.2783 d4.loss_bbox: 0.1316 d4.loss_iou: 0.2198 enc_loss_cls: 0.3322 enc_loss_bbox: 0.1487 enc_loss_iou: 0.2431 dn_loss_cls: 0.1259 dn_loss_bbox: 0.1626 dn_loss_iou: 0.2007 d0.dn_loss_cls: 0.2129 d0.dn_loss_bbox: 0.3166 d0.dn_loss_iou: 0.3467 d1.dn_loss_cls: 0.1619 d1.dn_loss_bbox: 
0.1948 d1.dn_loss_iou: 0.2309 d2.dn_loss_cls: 0.1408 d2.dn_loss_bbox: 0.1730 d2.dn_loss_iou: 0.2096 d3.dn_loss_cls: 0.1317 d3.dn_loss_bbox: 0.1651 d3.dn_loss_iou: 0.2031 d4.dn_loss_cls: 0.1281 d4.dn_loss_bbox: 0.1627 d4.dn_loss_iou: 0.2007 d1.loss_lmm_region: 0.1600 loss_lmm_image: 0.9208 2024/11/11 02:07:16 - mmengine - INFO - Iter(train) [ 29700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:35:09 time: 1.9838 data_time: 0.0173 memory: 32956 grad_norm: 36.1398 loss: 10.0893 loss_cls: 0.2817 loss_bbox: 0.1473 loss_iou: 0.2388 d0.loss_cls: 0.3208 d0.loss_bbox: 0.1623 d0.loss_iou: 0.2530 d1.loss_cls: 0.2969 d1.loss_bbox: 0.1558 d1.loss_iou: 0.2477 d2.loss_cls: 0.2906 d2.loss_bbox: 0.1478 d2.loss_iou: 0.2413 d3.loss_cls: 0.2856 d3.loss_bbox: 0.1459 d3.loss_iou: 0.2400 d4.loss_cls: 0.2820 d4.loss_bbox: 0.1483 d4.loss_iou: 0.2398 enc_loss_cls: 0.3223 enc_loss_bbox: 0.1753 enc_loss_iou: 0.2756 dn_loss_cls: 0.1556 dn_loss_bbox: 0.1912 dn_loss_iou: 0.2390 d0.dn_loss_cls: 0.2347 d0.dn_loss_bbox: 0.3427 d0.dn_loss_iou: 0.3902 d1.dn_loss_cls: 0.1829 d1.dn_loss_bbox: 0.2263 d1.dn_loss_iou: 0.2727 d2.dn_loss_cls: 0.1652 d2.dn_loss_bbox: 0.2030 d2.dn_loss_iou: 0.2498 d3.dn_loss_cls: 0.1604 d3.dn_loss_bbox: 0.1940 d3.dn_loss_iou: 0.2423 d4.dn_loss_cls: 0.1586 d4.dn_loss_bbox: 0.1912 d4.dn_loss_iou: 0.2390 d1.loss_lmm_region: 0.1936 loss_lmm_image: 0.9581 2024/11/11 02:10:34 - mmengine - INFO - Iter(train) [ 29800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:31:48 time: 1.9887 data_time: 0.0170 memory: 34309 grad_norm: 37.1438 loss: 9.8365 loss_cls: 0.3114 loss_bbox: 0.1353 loss_iou: 0.2614 d0.loss_cls: 0.3611 d0.loss_bbox: 0.1539 d0.loss_iou: 0.2814 d1.loss_cls: 0.3364 d1.loss_bbox: 0.1447 d1.loss_iou: 0.2703 d2.loss_cls: 0.3227 d2.loss_bbox: 0.1381 d2.loss_iou: 0.2623 d3.loss_cls: 0.3125 d3.loss_bbox: 0.1397 d3.loss_iou: 0.2630 d4.loss_cls: 0.3090 d4.loss_bbox: 0.1375 d4.loss_iou: 0.2621 enc_loss_cls: 0.3670 enc_loss_bbox: 0.1645 enc_loss_iou: 0.3013 dn_loss_cls: 0.1230 dn_loss_bbox: 0.1655 dn_loss_iou: 0.2084 d0.dn_loss_cls: 0.1976 d0.dn_loss_bbox: 0.3146 d0.dn_loss_iou: 0.3559 d1.dn_loss_cls: 0.1514 d1.dn_loss_bbox: 0.2002 d1.dn_loss_iou: 0.2424 d2.dn_loss_cls: 0.1345 d2.dn_loss_bbox: 0.1749 d2.dn_loss_iou: 0.2182 d3.dn_loss_cls: 0.1261 d3.dn_loss_bbox: 0.1680 d3.dn_loss_iou: 0.2111 d4.dn_loss_cls: 0.1246 d4.dn_loss_bbox: 0.1654 d4.dn_loss_iou: 0.2083 d1.loss_lmm_region: 0.1633 loss_lmm_image: 0.9476 2024/11/11 02:13:51 - mmengine - INFO - Iter(train) [ 29900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:28:18 time: 1.9512 data_time: 0.0171 memory: 34598 grad_norm: 29.6278 loss: 9.3267 loss_cls: 0.2624 loss_bbox: 0.1366 loss_iou: 0.2361 d0.loss_cls: 0.2972 d0.loss_bbox: 0.1564 d0.loss_iou: 0.2471 d1.loss_cls: 0.2822 d1.loss_bbox: 0.1370 d1.loss_iou: 0.2380 d2.loss_cls: 0.2679 d2.loss_bbox: 0.1389 d2.loss_iou: 0.2378 d3.loss_cls: 0.2651 d3.loss_bbox: 0.1364 d3.loss_iou: 0.2354 d4.loss_cls: 0.2535 d4.loss_bbox: 0.1415 d4.loss_iou: 0.2388 enc_loss_cls: 0.2991 enc_loss_bbox: 0.1616 enc_loss_iou: 0.2670 dn_loss_cls: 0.1318 dn_loss_bbox: 0.1643 dn_loss_iou: 0.2118 d0.dn_loss_cls: 0.2109 d0.dn_loss_bbox: 0.3331 d0.dn_loss_iou: 0.3560 d1.dn_loss_cls: 0.1636 d1.dn_loss_bbox: 0.2025 d1.dn_loss_iou: 0.2441 d2.dn_loss_cls: 0.1427 d2.dn_loss_bbox: 0.1799 d2.dn_loss_iou: 0.2225 d3.dn_loss_cls: 0.1348 d3.dn_loss_bbox: 0.1672 d3.dn_loss_iou: 0.2144 d4.dn_loss_cls: 0.1311 d4.dn_loss_bbox: 0.1645 d4.dn_loss_iou: 0.2117 d1.loss_lmm_region: 0.1835 loss_lmm_image: 0.9208 2024/11/11 
02:17:09 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 02:17:09 - mmengine - INFO - Iter(train) [ 30000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:24:54 time: 1.9907 data_time: 0.0173 memory: 34844 grad_norm: 30.5122 loss: 9.3812 loss_cls: 0.2951 loss_bbox: 0.1376 loss_iou: 0.2349 d0.loss_cls: 0.3408 d0.loss_bbox: 0.1480 d0.loss_iou: 0.2474 d1.loss_cls: 0.3165 d1.loss_bbox: 0.1365 d1.loss_iou: 0.2358 d2.loss_cls: 0.3078 d2.loss_bbox: 0.1359 d2.loss_iou: 0.2348 d3.loss_cls: 0.3016 d3.loss_bbox: 0.1376 d3.loss_iou: 0.2352 d4.loss_cls: 0.2959 d4.loss_bbox: 0.1371 d4.loss_iou: 0.2352 enc_loss_cls: 0.3422 enc_loss_bbox: 0.1627 enc_loss_iou: 0.2706 dn_loss_cls: 0.1053 dn_loss_bbox: 0.1712 dn_loss_iou: 0.2155 d0.dn_loss_cls: 0.1927 d0.dn_loss_bbox: 0.3194 d0.dn_loss_iou: 0.3561 d1.dn_loss_cls: 0.1372 d1.dn_loss_bbox: 0.2022 d1.dn_loss_iou: 0.2460 d2.dn_loss_cls: 0.1214 d2.dn_loss_bbox: 0.1807 d2.dn_loss_iou: 0.2240 d3.dn_loss_cls: 0.1134 d3.dn_loss_bbox: 0.1735 d3.dn_loss_iou: 0.2177 d4.dn_loss_cls: 0.1073 d4.dn_loss_bbox: 0.1713 d4.dn_loss_iou: 0.2156 d1.loss_lmm_region: 0.1536 loss_lmm_image: 0.8680 2024/11/11 02:17:09 - mmengine - INFO - Saving checkpoint at 30000 iterations 2024/11/11 02:28:46 - mmengine - INFO - Iter(val) [100/602] eta: 0:49:17 time: 6.0342 data_time: 0.0018 memory: 7496 2024/11/11 02:39:09 - mmengine - INFO - Iter(val) [200/602] eta: 0:40:34 time: 6.3799 data_time: 0.0018 memory: 7528 2024/11/11 02:50:29 - mmengine - INFO - Iter(val) [300/602] eta: 0:31:43 time: 6.8916 data_time: 0.0019 memory: 7506 2024/11/11 03:02:43 - mmengine - INFO - Iter(val) [400/602] eta: 0:22:05 time: 7.4373 data_time: 0.0020 memory: 7528 2024/11/11 03:15:37 - mmengine - INFO - Iter(val) [500/602] eta: 0:11:33 time: 7.8340 data_time: 0.0020 memory: 7473 2024/11/11 03:29:16 - mmengine - INFO - Iter(val) [600/602] eta: 0:00:14 time: 8.3867 data_time: 0.0020 memory: 7522 2024/11/11 03:36:10 - mmengine - INFO - === 85 classes had less than 10000 detections! Outputting 10000 detections for each class will improve AP further. 
=== 2024/11/11 03:38:32 - mmengine - INFO - mAP_copypaste: {'AP': 0.468045773885989, 'AP50': 0.5761800651063914, 'AP75': 0.4982136611780151, 'APs': 0.41563498781503583, 'APm': 0.601956585879971, 'APl': 0.6860693164672527, 'APr': 0.35667245127869734, 'APc': 0.414069882712702, 'APf': 0.5359578948044834} 2024/11/11 03:38:56 - mmengine - INFO - Iter(val) [602/602] lvis_fixed_ap/AP: 0.4680 lvis_fixed_ap/AP50: 0.5762 lvis_fixed_ap/AP75: 0.4982 lvis_fixed_ap/APs: 0.4156 lvis_fixed_ap/APm: 0.6020 lvis_fixed_ap/APl: 0.6861 lvis_fixed_ap/APr: 0.3567 lvis_fixed_ap/APc: 0.4141 lvis_fixed_ap/APf: 0.5360 data_time: 0.0025 time: 7.0346 2024/11/11 03:42:15 - mmengine - INFO - Iter(train) [ 30100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:59:01 time: 1.9885 data_time: 0.0167 memory: 34143 grad_norm: 29.8675 loss: 9.6194 loss_cls: 0.3178 loss_bbox: 0.1402 loss_iou: 0.2563 d0.loss_cls: 0.3735 d0.loss_bbox: 0.1470 d0.loss_iou: 0.2669 d1.loss_cls: 0.3364 d1.loss_bbox: 0.1456 d1.loss_iou: 0.2626 d2.loss_cls: 0.3249 d2.loss_bbox: 0.1419 d2.loss_iou: 0.2574 d3.loss_cls: 0.3214 d3.loss_bbox: 0.1394 d3.loss_iou: 0.2568 d4.loss_cls: 0.3177 d4.loss_bbox: 0.1396 d4.loss_iou: 0.2559 enc_loss_cls: 0.3671 enc_loss_bbox: 0.1682 enc_loss_iou: 0.2976 dn_loss_cls: 0.1041 dn_loss_bbox: 0.1628 dn_loss_iou: 0.2116 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.3073 d0.dn_loss_iou: 0.3519 d1.dn_loss_cls: 0.1307 d1.dn_loss_bbox: 0.1955 d1.dn_loss_iou: 0.2427 d2.dn_loss_cls: 0.1119 d2.dn_loss_bbox: 0.1729 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.1069 d3.dn_loss_bbox: 0.1664 d3.dn_loss_iou: 0.2152 d4.dn_loss_cls: 0.1035 d4.dn_loss_bbox: 0.1629 d4.dn_loss_iou: 0.2118 d1.loss_lmm_region: 0.1473 loss_lmm_image: 0.8750 2024/11/11 03:45:36 - mmengine - INFO - Iter(train) [ 30200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:55:38 time: 2.0011 data_time: 0.0168 memory: 34370 grad_norm: 31.9187 loss: 9.6142 loss_cls: 0.3127 loss_bbox: 0.1215 loss_iou: 0.2175 d0.loss_cls: 0.3532 d0.loss_bbox: 0.1340 d0.loss_iou: 0.2306 d1.loss_cls: 0.3237 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2220 d2.loss_cls: 0.3107 d2.loss_bbox: 0.1258 d2.loss_iou: 0.2224 d3.loss_cls: 0.3069 d3.loss_bbox: 0.1245 d3.loss_iou: 0.2209 d4.loss_cls: 0.3148 d4.loss_bbox: 0.1221 d4.loss_iou: 0.2169 enc_loss_cls: 0.3502 enc_loss_bbox: 0.1541 enc_loss_iou: 0.2557 dn_loss_cls: 0.1378 dn_loss_bbox: 0.1869 dn_loss_iou: 0.2193 d0.dn_loss_cls: 0.2197 d0.dn_loss_bbox: 0.3390 d0.dn_loss_iou: 0.3654 d1.dn_loss_cls: 0.1696 d1.dn_loss_bbox: 0.2141 d1.dn_loss_iou: 0.2487 d2.dn_loss_cls: 0.1518 d2.dn_loss_bbox: 0.1952 d2.dn_loss_iou: 0.2279 d3.dn_loss_cls: 0.1432 d3.dn_loss_bbox: 0.1890 d3.dn_loss_iou: 0.2216 d4.dn_loss_cls: 0.1394 d4.dn_loss_bbox: 0.1869 d4.dn_loss_iou: 0.2193 d1.loss_lmm_region: 0.1875 loss_lmm_image: 0.8848 2024/11/11 03:48:55 - mmengine - INFO - Iter(train) [ 30300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:52:07 time: 1.9878 data_time: 0.0167 memory: 35666 grad_norm: 30.4463 loss: 10.7270 loss_cls: 0.4873 loss_bbox: 0.1158 loss_iou: 0.2450 d0.loss_cls: 0.5447 d0.loss_bbox: 0.1299 d0.loss_iou: 0.2615 d1.loss_cls: 0.5076 d1.loss_bbox: 0.1205 d1.loss_iou: 0.2523 d2.loss_cls: 0.4877 d2.loss_bbox: 0.1228 d2.loss_iou: 0.2507 d3.loss_cls: 0.4887 d3.loss_bbox: 0.1176 d3.loss_iou: 0.2462 d4.loss_cls: 0.4908 d4.loss_bbox: 0.1158 d4.loss_iou: 0.2441 enc_loss_cls: 0.5445 enc_loss_bbox: 0.1422 enc_loss_iou: 0.2855 dn_loss_cls: 0.1607 dn_loss_bbox: 0.1354 dn_loss_iou: 0.1927 d0.dn_loss_cls: 0.2395 d0.dn_loss_bbox: 0.2693 d0.dn_loss_iou: 0.3265 
d1.dn_loss_cls: 0.1985 d1.dn_loss_bbox: 0.1667 d1.dn_loss_iou: 0.2226 d2.dn_loss_cls: 0.1799 d2.dn_loss_bbox: 0.1448 d2.dn_loss_iou: 0.2021 d3.dn_loss_cls: 0.1689 d3.dn_loss_bbox: 0.1372 d3.dn_loss_iou: 0.1953 d4.dn_loss_cls: 0.1633 d4.dn_loss_bbox: 0.1353 d4.dn_loss_iou: 0.1928 d1.loss_lmm_region: 0.1789 loss_lmm_image: 0.9155 2024/11/11 03:52:13 - mmengine - INFO - Iter(train) [ 30400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:48:36 time: 1.9687 data_time: 0.0168 memory: 34540 grad_norm: 26.6763 loss: 8.5857 loss_cls: 0.2619 loss_bbox: 0.1054 loss_iou: 0.1695 d0.loss_cls: 0.3203 d0.loss_bbox: 0.1084 d0.loss_iou: 0.1771 d1.loss_cls: 0.2867 d1.loss_bbox: 0.1046 d1.loss_iou: 0.1707 d2.loss_cls: 0.2744 d2.loss_bbox: 0.1050 d2.loss_iou: 0.1692 d3.loss_cls: 0.2662 d3.loss_bbox: 0.1065 d3.loss_iou: 0.1694 d4.loss_cls: 0.2635 d4.loss_bbox: 0.1039 d4.loss_iou: 0.1685 enc_loss_cls: 0.3109 enc_loss_bbox: 0.1216 enc_loss_iou: 0.2001 dn_loss_cls: 0.1261 dn_loss_bbox: 0.1688 dn_loss_iou: 0.1963 d0.dn_loss_cls: 0.2142 d0.dn_loss_bbox: 0.3304 d0.dn_loss_iou: 0.3403 d1.dn_loss_cls: 0.1633 d1.dn_loss_bbox: 0.2009 d1.dn_loss_iou: 0.2258 d2.dn_loss_cls: 0.1424 d2.dn_loss_bbox: 0.1779 d2.dn_loss_iou: 0.2049 d3.dn_loss_cls: 0.1296 d3.dn_loss_bbox: 0.1697 d3.dn_loss_iou: 0.1982 d4.dn_loss_cls: 0.1268 d4.dn_loss_bbox: 0.1687 d4.dn_loss_iou: 0.1961 d1.loss_lmm_region: 0.1875 loss_lmm_image: 0.9541 2024/11/11 03:55:33 - mmengine - INFO - Iter(train) [ 30500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:45:09 time: 1.9914 data_time: 0.0167 memory: 35314 grad_norm: 37.8261 loss: 8.9453 loss_cls: 0.2817 loss_bbox: 0.1113 loss_iou: 0.1879 d0.loss_cls: 0.3282 d0.loss_bbox: 0.1195 d0.loss_iou: 0.2005 d1.loss_cls: 0.3027 d1.loss_bbox: 0.1120 d1.loss_iou: 0.1919 d2.loss_cls: 0.2933 d2.loss_bbox: 0.1076 d2.loss_iou: 0.1881 d3.loss_cls: 0.2911 d3.loss_bbox: 0.1067 d3.loss_iou: 0.1837 d4.loss_cls: 0.2853 d4.loss_bbox: 0.1102 d4.loss_iou: 0.1859 enc_loss_cls: 0.3149 enc_loss_bbox: 0.1383 enc_loss_iou: 0.2257 dn_loss_cls: 0.1711 dn_loss_bbox: 0.1517 dn_loss_iou: 0.1907 d0.dn_loss_cls: 0.2329 d0.dn_loss_bbox: 0.3051 d0.dn_loss_iou: 0.3349 d1.dn_loss_cls: 0.1924 d1.dn_loss_bbox: 0.1871 d1.dn_loss_iou: 0.2237 d2.dn_loss_cls: 0.1787 d2.dn_loss_bbox: 0.1622 d2.dn_loss_iou: 0.2008 d3.dn_loss_cls: 0.1740 d3.dn_loss_bbox: 0.1544 d3.dn_loss_iou: 0.1933 d4.dn_loss_cls: 0.1725 d4.dn_loss_bbox: 0.1518 d4.dn_loss_iou: 0.1906 d1.loss_lmm_region: 0.1959 loss_lmm_image: 0.9149 2024/11/11 03:58:51 - mmengine - INFO - Iter(train) [ 30600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:41:37 time: 1.9826 data_time: 0.0168 memory: 34906 grad_norm: 27.8507 loss: 8.5697 loss_cls: 0.2609 loss_bbox: 0.1174 loss_iou: 0.2257 d0.loss_cls: 0.3070 d0.loss_bbox: 0.1252 d0.loss_iou: 0.2391 d1.loss_cls: 0.2839 d1.loss_bbox: 0.1153 d1.loss_iou: 0.2283 d2.loss_cls: 0.2717 d2.loss_bbox: 0.1137 d2.loss_iou: 0.2259 d3.loss_cls: 0.2641 d3.loss_bbox: 0.1175 d3.loss_iou: 0.2259 d4.loss_cls: 0.2622 d4.loss_bbox: 0.1171 d4.loss_iou: 0.2249 enc_loss_cls: 0.3081 enc_loss_bbox: 0.1332 enc_loss_iou: 0.2623 dn_loss_cls: 0.1034 dn_loss_bbox: 0.1427 dn_loss_iou: 0.1896 d0.dn_loss_cls: 0.1717 d0.dn_loss_bbox: 0.2821 d0.dn_loss_iou: 0.3345 d1.dn_loss_cls: 0.1297 d1.dn_loss_bbox: 0.1656 d1.dn_loss_iou: 0.2178 d2.dn_loss_cls: 0.1123 d2.dn_loss_bbox: 0.1486 d2.dn_loss_iou: 0.1974 d3.dn_loss_cls: 0.1066 d3.dn_loss_bbox: 0.1452 d3.dn_loss_iou: 0.1916 d4.dn_loss_cls: 0.1050 d4.dn_loss_bbox: 0.1427 d4.dn_loss_iou: 0.1896 d1.loss_lmm_region: 
0.1504 loss_lmm_image: 0.9138 2024/11/11 04:02:12 - mmengine - INFO - Iter(train) [ 30700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:38:14 time: 1.9736 data_time: 0.0196 memory: 35222 grad_norm: nan loss: 10.4409 loss_cls: 0.3120 loss_bbox: 0.1578 loss_iou: 0.2627 d0.loss_cls: 0.3688 d0.loss_bbox: 0.1702 d0.loss_iou: 0.2807 d1.loss_cls: 0.3303 d1.loss_bbox: 0.1656 d1.loss_iou: 0.2711 d2.loss_cls: 0.3241 d2.loss_bbox: 0.1577 d2.loss_iou: 0.2646 d3.loss_cls: 0.3219 d3.loss_bbox: 0.1534 d3.loss_iou: 0.2617 d4.loss_cls: 0.3124 d4.loss_bbox: 0.1572 d4.loss_iou: 0.2628 enc_loss_cls: 0.3619 enc_loss_bbox: 0.1839 enc_loss_iou: 0.3018 dn_loss_cls: 0.1419 dn_loss_bbox: 0.1828 dn_loss_iou: 0.2383 d0.dn_loss_cls: 0.2223 d0.dn_loss_bbox: 0.3338 d0.dn_loss_iou: 0.3883 d1.dn_loss_cls: 0.1750 d1.dn_loss_bbox: 0.2176 d1.dn_loss_iou: 0.2701 d2.dn_loss_cls: 0.1574 d2.dn_loss_bbox: 0.1915 d2.dn_loss_iou: 0.2469 d3.dn_loss_cls: 0.1490 d3.dn_loss_bbox: 0.1847 d3.dn_loss_iou: 0.2408 d4.dn_loss_cls: 0.1442 d4.dn_loss_bbox: 0.1828 d4.dn_loss_iou: 0.2383 d1.loss_lmm_region: 0.1931 loss_lmm_image: 0.9594 2024/11/11 04:05:31 - mmengine - INFO - Iter(train) [ 30800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:34:46 time: 1.9880 data_time: 0.0195 memory: 33779 grad_norm: 27.1755 loss: 10.8783 loss_cls: 0.3398 loss_bbox: 0.1740 loss_iou: 0.2940 d0.loss_cls: 0.4041 d0.loss_bbox: 0.1811 d0.loss_iou: 0.3080 d1.loss_cls: 0.3684 d1.loss_bbox: 0.1762 d1.loss_iou: 0.3010 d2.loss_cls: 0.3547 d2.loss_bbox: 0.1755 d2.loss_iou: 0.2932 d3.loss_cls: 0.3442 d3.loss_bbox: 0.1728 d3.loss_iou: 0.2923 d4.loss_cls: 0.3411 d4.loss_bbox: 0.1730 d4.loss_iou: 0.2926 enc_loss_cls: 0.3977 enc_loss_bbox: 0.2038 enc_loss_iou: 0.3295 dn_loss_cls: 0.1441 dn_loss_bbox: 0.1765 dn_loss_iou: 0.2454 d0.dn_loss_cls: 0.2295 d0.dn_loss_bbox: 0.3243 d0.dn_loss_iou: 0.3900 d1.dn_loss_cls: 0.1755 d1.dn_loss_bbox: 0.2053 d1.dn_loss_iou: 0.2746 d2.dn_loss_cls: 0.1563 d2.dn_loss_bbox: 0.1880 d2.dn_loss_iou: 0.2548 d3.dn_loss_cls: 0.1466 d3.dn_loss_bbox: 0.1783 d3.dn_loss_iou: 0.2474 d4.dn_loss_cls: 0.1418 d4.dn_loss_bbox: 0.1767 d4.dn_loss_iou: 0.2455 d1.loss_lmm_region: 0.1715 loss_lmm_image: 0.8892 2024/11/11 04:08:50 - mmengine - INFO - Iter(train) [ 30900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:31:17 time: 1.9847 data_time: 0.0189 memory: 34044 grad_norm: 24.1819 loss: 10.0399 loss_cls: 0.2860 loss_bbox: 0.1549 loss_iou: 0.2721 d0.loss_cls: 0.3252 d0.loss_bbox: 0.1738 d0.loss_iou: 0.2923 d1.loss_cls: 0.2968 d1.loss_bbox: 0.1689 d1.loss_iou: 0.2851 d2.loss_cls: 0.2893 d2.loss_bbox: 0.1652 d2.loss_iou: 0.2799 d3.loss_cls: 0.2843 d3.loss_bbox: 0.1606 d3.loss_iou: 0.2776 d4.loss_cls: 0.2845 d4.loss_bbox: 0.1571 d4.loss_iou: 0.2751 enc_loss_cls: 0.3262 enc_loss_bbox: 0.1862 enc_loss_iou: 0.3116 dn_loss_cls: 0.1057 dn_loss_bbox: 0.1877 dn_loss_iou: 0.2416 d0.dn_loss_cls: 0.1882 d0.dn_loss_bbox: 0.3445 d0.dn_loss_iou: 0.3874 d1.dn_loss_cls: 0.1377 d1.dn_loss_bbox: 0.2313 d1.dn_loss_iou: 0.2769 d2.dn_loss_cls: 0.1181 d2.dn_loss_bbox: 0.1996 d2.dn_loss_iou: 0.2508 d3.dn_loss_cls: 0.1121 d3.dn_loss_bbox: 0.1908 d3.dn_loss_iou: 0.2444 d4.dn_loss_cls: 0.1075 d4.dn_loss_bbox: 0.1877 d4.dn_loss_iou: 0.2416 d1.loss_lmm_region: 0.1737 loss_lmm_image: 0.8599 2024/11/11 04:12:10 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 04:12:10 - mmengine - INFO - Iter(train) [ 31000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:27:49 time: 1.9750 data_time: 0.0189 memory: 33572 grad_norm: 
31.4289 loss: 10.2250 loss_cls: 0.3398 loss_bbox: 0.1434 loss_iou: 0.2542 d0.loss_cls: 0.3842 d0.loss_bbox: 0.1525 d0.loss_iou: 0.2635 d1.loss_cls: 0.3570 d1.loss_bbox: 0.1483 d1.loss_iou: 0.2581 d2.loss_cls: 0.3498 d2.loss_bbox: 0.1436 d2.loss_iou: 0.2540 d3.loss_cls: 0.3440 d3.loss_bbox: 0.1437 d3.loss_iou: 0.2534 d4.loss_cls: 0.3397 d4.loss_bbox: 0.1452 d4.loss_iou: 0.2558 enc_loss_cls: 0.3838 enc_loss_bbox: 0.1662 enc_loss_iou: 0.2868 dn_loss_cls: 0.1352 dn_loss_bbox: 0.1714 dn_loss_iou: 0.2366 d0.dn_loss_cls: 0.2202 d0.dn_loss_bbox: 0.3226 d0.dn_loss_iou: 0.3831 d1.dn_loss_cls: 0.1725 d1.dn_loss_bbox: 0.2041 d1.dn_loss_iou: 0.2668 d2.dn_loss_cls: 0.1500 d2.dn_loss_bbox: 0.1805 d2.dn_loss_iou: 0.2447 d3.dn_loss_cls: 0.1415 d3.dn_loss_bbox: 0.1736 d3.dn_loss_iou: 0.2389 d4.dn_loss_cls: 0.1369 d4.dn_loss_bbox: 0.1714 d4.dn_loss_iou: 0.2365 d1.loss_lmm_region: 0.1600 loss_lmm_image: 0.9114 2024/11/11 04:15:30 - mmengine - INFO - Iter(train) [ 31100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:24:24 time: 2.0060 data_time: 0.0186 memory: 33849 grad_norm: 33.6745 loss: 8.9387 loss_cls: 0.2978 loss_bbox: 0.1125 loss_iou: 0.1949 d0.loss_cls: 0.3470 d0.loss_bbox: 0.1276 d0.loss_iou: 0.2096 d1.loss_cls: 0.3172 d1.loss_bbox: 0.1186 d1.loss_iou: 0.1997 d2.loss_cls: 0.3056 d2.loss_bbox: 0.1184 d2.loss_iou: 0.1973 d3.loss_cls: 0.3008 d3.loss_bbox: 0.1138 d3.loss_iou: 0.1962 d4.loss_cls: 0.3002 d4.loss_bbox: 0.1118 d4.loss_iou: 0.1947 enc_loss_cls: 0.3425 enc_loss_bbox: 0.1490 enc_loss_iou: 0.2312 dn_loss_cls: 0.1491 dn_loss_bbox: 0.1462 dn_loss_iou: 0.1858 d0.dn_loss_cls: 0.2251 d0.dn_loss_bbox: 0.2916 d0.dn_loss_iou: 0.3277 d1.dn_loss_cls: 0.1779 d1.dn_loss_bbox: 0.1790 d1.dn_loss_iou: 0.2174 d2.dn_loss_cls: 0.1623 d2.dn_loss_bbox: 0.1542 d2.dn_loss_iou: 0.1949 d3.dn_loss_cls: 0.1515 d3.dn_loss_bbox: 0.1488 d3.dn_loss_iou: 0.1887 d4.dn_loss_cls: 0.1492 d4.dn_loss_bbox: 0.1461 d4.dn_loss_iou: 0.1857 d1.loss_lmm_region: 0.1663 loss_lmm_image: 0.9047 2024/11/11 04:18:50 - mmengine - INFO - Iter(train) [ 31200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:21:01 time: 2.0099 data_time: 0.0179 memory: 35247 grad_norm: 30.5285 loss: 10.7164 loss_cls: 0.3220 loss_bbox: 0.1787 loss_iou: 0.3009 d0.loss_cls: 0.3725 d0.loss_bbox: 0.2021 d0.loss_iou: 0.3205 d1.loss_cls: 0.3504 d1.loss_bbox: 0.1849 d1.loss_iou: 0.3065 d2.loss_cls: 0.3358 d2.loss_bbox: 0.1803 d2.loss_iou: 0.3018 d3.loss_cls: 0.3248 d3.loss_bbox: 0.1829 d3.loss_iou: 0.3015 d4.loss_cls: 0.3237 d4.loss_bbox: 0.1784 d4.loss_iou: 0.2987 enc_loss_cls: 0.3753 enc_loss_bbox: 0.2144 enc_loss_iou: 0.3443 dn_loss_cls: 0.1240 dn_loss_bbox: 0.1877 dn_loss_iou: 0.2399 d0.dn_loss_cls: 0.2009 d0.dn_loss_bbox: 0.3351 d0.dn_loss_iou: 0.3823 d1.dn_loss_cls: 0.1515 d1.dn_loss_bbox: 0.2179 d1.dn_loss_iou: 0.2692 d2.dn_loss_cls: 0.1354 d2.dn_loss_bbox: 0.1972 d2.dn_loss_iou: 0.2492 d3.dn_loss_cls: 0.1285 d3.dn_loss_bbox: 0.1892 d3.dn_loss_iou: 0.2417 d4.dn_loss_cls: 0.1251 d4.dn_loss_bbox: 0.1876 d4.dn_loss_iou: 0.2398 d1.loss_lmm_region: 0.1483 loss_lmm_image: 0.8655 2024/11/11 04:22:12 - mmengine - INFO - Iter(train) [ 31300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:17:43 time: 2.0227 data_time: 0.0174 memory: 33842 grad_norm: 28.3653 loss: 11.7404 loss_cls: 0.3600 loss_bbox: 0.1933 loss_iou: 0.2934 d0.loss_cls: 0.4124 d0.loss_bbox: 0.2110 d0.loss_iou: 0.3074 d1.loss_cls: 0.3828 d1.loss_bbox: 0.1957 d1.loss_iou: 0.2957 d2.loss_cls: 0.3731 d2.loss_bbox: 0.1895 d2.loss_iou: 0.2888 d3.loss_cls: 0.3632 d3.loss_bbox: 0.1903 
d3.loss_iou: 0.2913 d4.loss_cls: 0.3609 d4.loss_bbox: 0.1936 d4.loss_iou: 0.2915 enc_loss_cls: 0.4094 enc_loss_bbox: 0.2226 enc_loss_iou: 0.3240 dn_loss_cls: 0.1584 dn_loss_bbox: 0.2411 dn_loss_iou: 0.2649 d0.dn_loss_cls: 0.2456 d0.dn_loss_bbox: 0.4000 d0.dn_loss_iou: 0.4112 d1.dn_loss_cls: 0.1939 d1.dn_loss_bbox: 0.2782 d1.dn_loss_iou: 0.2972 d2.dn_loss_cls: 0.1727 d2.dn_loss_bbox: 0.2525 d2.dn_loss_iou: 0.2758 d3.dn_loss_cls: 0.1662 d3.dn_loss_bbox: 0.2439 d3.dn_loss_iou: 0.2675 d4.dn_loss_cls: 0.1592 d4.dn_loss_bbox: 0.2411 d4.dn_loss_iou: 0.2650 d1.loss_lmm_region: 0.1879 loss_lmm_image: 0.8685 2024/11/11 04:25:28 - mmengine - INFO - Iter(train) [ 31400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:14:04 time: 1.9626 data_time: 0.0178 memory: 33796 grad_norm: 31.9009 loss: 11.1826 loss_cls: 0.3907 loss_bbox: 0.1606 loss_iou: 0.2730 d0.loss_cls: 0.4428 d0.loss_bbox: 0.1696 d0.loss_iou: 0.2874 d1.loss_cls: 0.4107 d1.loss_bbox: 0.1681 d1.loss_iou: 0.2832 d2.loss_cls: 0.4061 d2.loss_bbox: 0.1635 d2.loss_iou: 0.2773 d3.loss_cls: 0.3919 d3.loss_bbox: 0.1638 d3.loss_iou: 0.2748 d4.loss_cls: 0.3951 d4.loss_bbox: 0.1588 d4.loss_iou: 0.2724 enc_loss_cls: 0.4468 enc_loss_bbox: 0.1875 enc_loss_iou: 0.3122 dn_loss_cls: 0.1724 dn_loss_bbox: 0.1810 dn_loss_iou: 0.2298 d0.dn_loss_cls: 0.2574 d0.dn_loss_bbox: 0.3352 d0.dn_loss_iou: 0.3737 d1.dn_loss_cls: 0.2096 d1.dn_loss_bbox: 0.2129 d1.dn_loss_iou: 0.2594 d2.dn_loss_cls: 0.1876 d2.dn_loss_bbox: 0.1911 d2.dn_loss_iou: 0.2389 d3.dn_loss_cls: 0.1804 d3.dn_loss_bbox: 0.1830 d3.dn_loss_iou: 0.2323 d4.dn_loss_cls: 0.1730 d4.dn_loss_bbox: 0.1810 d4.dn_loss_iou: 0.2299 d1.loss_lmm_region: 0.2004 loss_lmm_image: 0.9174 2024/11/11 04:28:49 - mmengine - INFO - Iter(train) [ 31500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:10:43 time: 1.9923 data_time: 0.0173 memory: 35108 grad_norm: 28.5904 loss: 9.0956 loss_cls: 0.2835 loss_bbox: 0.1385 loss_iou: 0.2178 d0.loss_cls: 0.3295 d0.loss_bbox: 0.1511 d0.loss_iou: 0.2404 d1.loss_cls: 0.2994 d1.loss_bbox: 0.1467 d1.loss_iou: 0.2310 d2.loss_cls: 0.2926 d2.loss_bbox: 0.1412 d2.loss_iou: 0.2216 d3.loss_cls: 0.2986 d3.loss_bbox: 0.1295 d3.loss_iou: 0.2134 d4.loss_cls: 0.2847 d4.loss_bbox: 0.1393 d4.loss_iou: 0.2192 enc_loss_cls: 0.3364 enc_loss_bbox: 0.1652 enc_loss_iou: 0.2593 dn_loss_cls: 0.0886 dn_loss_bbox: 0.1650 dn_loss_iou: 0.1946 d0.dn_loss_cls: 0.1697 d0.dn_loss_bbox: 0.3285 d0.dn_loss_iou: 0.3471 d1.dn_loss_cls: 0.1199 d1.dn_loss_bbox: 0.2033 d1.dn_loss_iou: 0.2304 d2.dn_loss_cls: 0.1008 d2.dn_loss_bbox: 0.1787 d2.dn_loss_iou: 0.2067 d3.dn_loss_cls: 0.0927 d3.dn_loss_bbox: 0.1695 d3.dn_loss_iou: 0.1985 d4.dn_loss_cls: 0.0898 d4.dn_loss_bbox: 0.1650 d4.dn_loss_iou: 0.1946 d1.loss_lmm_region: 0.1640 loss_lmm_image: 0.9489 2024/11/11 04:32:08 - mmengine - INFO - Iter(train) [ 31600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:07:14 time: 1.9844 data_time: 0.0173 memory: 33733 grad_norm: 27.8526 loss: 9.2645 loss_cls: 0.2641 loss_bbox: 0.1352 loss_iou: 0.2284 d0.loss_cls: 0.3145 d0.loss_bbox: 0.1484 d0.loss_iou: 0.2398 d1.loss_cls: 0.2837 d1.loss_bbox: 0.1390 d1.loss_iou: 0.2313 d2.loss_cls: 0.2731 d2.loss_bbox: 0.1380 d2.loss_iou: 0.2283 d3.loss_cls: 0.2688 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2276 d4.loss_cls: 0.2644 d4.loss_bbox: 0.1357 d4.loss_iou: 0.2294 enc_loss_cls: 0.3154 enc_loss_bbox: 0.1621 enc_loss_iou: 0.2647 dn_loss_cls: 0.1120 dn_loss_bbox: 0.1730 dn_loss_iou: 0.2163 d0.dn_loss_cls: 0.1901 d0.dn_loss_bbox: 0.3253 d0.dn_loss_iou: 0.3656 d1.dn_loss_cls: 0.1411 
d1.dn_loss_bbox: 0.2083 d1.dn_loss_iou: 0.2482 d2.dn_loss_cls: 0.1222 d2.dn_loss_bbox: 0.1841 d2.dn_loss_iou: 0.2262 d3.dn_loss_cls: 0.1157 d3.dn_loss_bbox: 0.1757 d3.dn_loss_iou: 0.2190 d4.dn_loss_cls: 0.1123 d4.dn_loss_bbox: 0.1730 d4.dn_loss_iou: 0.2163 d1.loss_lmm_region: 0.1631 loss_lmm_image: 0.9501 2024/11/11 04:35:25 - mmengine - INFO - Iter(train) [ 31700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:03:37 time: 1.9573 data_time: 0.0174 memory: 33567 grad_norm: 28.5580 loss: 10.6621 loss_cls: 0.3495 loss_bbox: 0.1510 loss_iou: 0.2627 d0.loss_cls: 0.4058 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2692 d1.loss_cls: 0.3642 d1.loss_bbox: 0.1548 d1.loss_iou: 0.2672 d2.loss_cls: 0.3603 d2.loss_bbox: 0.1500 d2.loss_iou: 0.2630 d3.loss_cls: 0.3562 d3.loss_bbox: 0.1490 d3.loss_iou: 0.2621 d4.loss_cls: 0.3532 d4.loss_bbox: 0.1500 d4.loss_iou: 0.2602 enc_loss_cls: 0.3956 enc_loss_bbox: 0.1645 enc_loss_iou: 0.2871 dn_loss_cls: 0.1547 dn_loss_bbox: 0.1860 dn_loss_iou: 0.2414 d0.dn_loss_cls: 0.2382 d0.dn_loss_bbox: 0.3440 d0.dn_loss_iou: 0.3926 d1.dn_loss_cls: 0.1836 d1.dn_loss_bbox: 0.2269 d1.dn_loss_iou: 0.2771 d2.dn_loss_cls: 0.1652 d2.dn_loss_bbox: 0.2000 d2.dn_loss_iou: 0.2534 d3.dn_loss_cls: 0.1617 d3.dn_loss_bbox: 0.1896 d3.dn_loss_iou: 0.2450 d4.dn_loss_cls: 0.1531 d4.dn_loss_bbox: 0.1861 d4.dn_loss_iou: 0.2415 d1.loss_lmm_region: 0.1908 loss_lmm_image: 0.8970 2024/11/11 04:38:45 - mmengine - INFO - Iter(train) [ 31800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 18:00:12 time: 1.9913 data_time: 0.0176 memory: 33949 grad_norm: 27.1877 loss: 10.2082 loss_cls: 0.3763 loss_bbox: 0.1366 loss_iou: 0.2245 d0.loss_cls: 0.4198 d0.loss_bbox: 0.1508 d0.loss_iou: 0.2429 d1.loss_cls: 0.3972 d1.loss_bbox: 0.1388 d1.loss_iou: 0.2313 d2.loss_cls: 0.3838 d2.loss_bbox: 0.1372 d2.loss_iou: 0.2274 d3.loss_cls: 0.3793 d3.loss_bbox: 0.1364 d3.loss_iou: 0.2266 d4.loss_cls: 0.3754 d4.loss_bbox: 0.1360 d4.loss_iou: 0.2256 enc_loss_cls: 0.4166 enc_loss_bbox: 0.1609 enc_loss_iou: 0.2657 dn_loss_cls: 0.1584 dn_loss_bbox: 0.1673 dn_loss_iou: 0.2091 d0.dn_loss_cls: 0.2330 d0.dn_loss_bbox: 0.3172 d0.dn_loss_iou: 0.3566 d1.dn_loss_cls: 0.1871 d1.dn_loss_bbox: 0.1969 d1.dn_loss_iou: 0.2404 d2.dn_loss_cls: 0.1685 d2.dn_loss_bbox: 0.1752 d2.dn_loss_iou: 0.2184 d3.dn_loss_cls: 0.1611 d3.dn_loss_bbox: 0.1686 d3.dn_loss_iou: 0.2112 d4.dn_loss_cls: 0.1593 d4.dn_loss_bbox: 0.1672 d4.dn_loss_iou: 0.2091 d1.loss_lmm_region: 0.1977 loss_lmm_image: 0.9167 2024/11/11 04:42:05 - mmengine - INFO - Iter(train) [ 31900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:56:48 time: 1.9905 data_time: 0.0173 memory: 32745 grad_norm: 28.6893 loss: 9.9357 loss_cls: 0.2742 loss_bbox: 0.1565 loss_iou: 0.2621 d0.loss_cls: 0.3141 d0.loss_bbox: 0.1747 d0.loss_iou: 0.2821 d1.loss_cls: 0.2901 d1.loss_bbox: 0.1638 d1.loss_iou: 0.2740 d2.loss_cls: 0.2815 d2.loss_bbox: 0.1592 d2.loss_iou: 0.2669 d3.loss_cls: 0.2767 d3.loss_bbox: 0.1573 d3.loss_iou: 0.2634 d4.loss_cls: 0.2750 d4.loss_bbox: 0.1568 d4.loss_iou: 0.2617 enc_loss_cls: 0.3194 enc_loss_bbox: 0.1882 enc_loss_iou: 0.3006 dn_loss_cls: 0.0923 dn_loss_bbox: 0.1975 dn_loss_iou: 0.2425 d0.dn_loss_cls: 0.1790 d0.dn_loss_bbox: 0.3646 d0.dn_loss_iou: 0.3919 d1.dn_loss_cls: 0.1328 d1.dn_loss_bbox: 0.2311 d1.dn_loss_iou: 0.2743 d2.dn_loss_cls: 0.1106 d2.dn_loss_bbox: 0.2107 d2.dn_loss_iou: 0.2528 d3.dn_loss_cls: 0.0994 d3.dn_loss_bbox: 0.1994 d3.dn_loss_iou: 0.2444 d4.dn_loss_cls: 0.0934 d4.dn_loss_bbox: 0.1974 d4.dn_loss_iou: 0.2424 d1.loss_lmm_region: 0.1527 loss_lmm_image: 
0.9284 2024/11/11 04:45:22 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 04:45:22 - mmengine - INFO - Iter(train) [ 32000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:53:12 time: 1.9882 data_time: 0.0174 memory: 31741 grad_norm: 29.8923 loss: 10.3253 loss_cls: 0.3272 loss_bbox: 0.1543 loss_iou: 0.2389 d0.loss_cls: 0.3770 d0.loss_bbox: 0.1593 d0.loss_iou: 0.2492 d1.loss_cls: 0.3442 d1.loss_bbox: 0.1546 d1.loss_iou: 0.2418 d2.loss_cls: 0.3337 d2.loss_bbox: 0.1508 d2.loss_iou: 0.2375 d3.loss_cls: 0.3277 d3.loss_bbox: 0.1549 d3.loss_iou: 0.2372 d4.loss_cls: 0.3292 d4.loss_bbox: 0.1514 d4.loss_iou: 0.2367 enc_loss_cls: 0.3790 enc_loss_bbox: 0.1741 enc_loss_iou: 0.2683 dn_loss_cls: 0.1643 dn_loss_bbox: 0.1918 dn_loss_iou: 0.2217 d0.dn_loss_cls: 0.2456 d0.dn_loss_bbox: 0.3535 d0.dn_loss_iou: 0.3719 d1.dn_loss_cls: 0.1987 d1.dn_loss_bbox: 0.2291 d1.dn_loss_iou: 0.2555 d2.dn_loss_cls: 0.1775 d2.dn_loss_bbox: 0.2038 d2.dn_loss_iou: 0.2327 d3.dn_loss_cls: 0.1703 d3.dn_loss_bbox: 0.1942 d3.dn_loss_iou: 0.2245 d4.dn_loss_cls: 0.1629 d4.dn_loss_bbox: 0.1917 d4.dn_loss_iou: 0.2219 d1.loss_lmm_region: 0.2097 loss_lmm_image: 0.8769 2024/11/11 04:48:42 - mmengine - INFO - Iter(train) [ 32100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:49:45 time: 1.9992 data_time: 0.0173 memory: 32474 grad_norm: 30.8860 loss: 11.0273 loss_cls: 0.3982 loss_bbox: 0.1560 loss_iou: 0.2755 d0.loss_cls: 0.4347 d0.loss_bbox: 0.1726 d0.loss_iou: 0.2902 d1.loss_cls: 0.4143 d1.loss_bbox: 0.1654 d1.loss_iou: 0.2857 d2.loss_cls: 0.4017 d2.loss_bbox: 0.1610 d2.loss_iou: 0.2822 d3.loss_cls: 0.3974 d3.loss_bbox: 0.1592 d3.loss_iou: 0.2768 d4.loss_cls: 0.3966 d4.loss_bbox: 0.1581 d4.loss_iou: 0.2754 enc_loss_cls: 0.4317 enc_loss_bbox: 0.1923 enc_loss_iou: 0.3202 dn_loss_cls: 0.1664 dn_loss_bbox: 0.1777 dn_loss_iou: 0.2212 d0.dn_loss_cls: 0.2452 d0.dn_loss_bbox: 0.3239 d0.dn_loss_iou: 0.3635 d1.dn_loss_cls: 0.1990 d1.dn_loss_bbox: 0.2088 d1.dn_loss_iou: 0.2521 d2.dn_loss_cls: 0.1810 d2.dn_loss_bbox: 0.1852 d2.dn_loss_iou: 0.2302 d3.dn_loss_cls: 0.1733 d3.dn_loss_bbox: 0.1796 d3.dn_loss_iou: 0.2239 d4.dn_loss_cls: 0.1686 d4.dn_loss_bbox: 0.1776 d4.dn_loss_iou: 0.2212 d1.loss_lmm_region: 0.1740 loss_lmm_image: 0.9097 2024/11/11 04:52:01 - mmengine - INFO - Iter(train) [ 32200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:46:18 time: 1.9837 data_time: 0.0173 memory: 34596 grad_norm: 30.4094 loss: 9.8519 loss_cls: 0.3132 loss_bbox: 0.1349 loss_iou: 0.2454 d0.loss_cls: 0.3636 d0.loss_bbox: 0.1420 d0.loss_iou: 0.2648 d1.loss_cls: 0.3339 d1.loss_bbox: 0.1365 d1.loss_iou: 0.2562 d2.loss_cls: 0.3250 d2.loss_bbox: 0.1336 d2.loss_iou: 0.2500 d3.loss_cls: 0.3190 d3.loss_bbox: 0.1330 d3.loss_iou: 0.2473 d4.loss_cls: 0.3140 d4.loss_bbox: 0.1348 d4.loss_iou: 0.2456 enc_loss_cls: 0.3580 enc_loss_bbox: 0.1612 enc_loss_iou: 0.2937 dn_loss_cls: 0.1346 dn_loss_bbox: 0.1644 dn_loss_iou: 0.2085 d0.dn_loss_cls: 0.2139 d0.dn_loss_bbox: 0.3209 d0.dn_loss_iou: 0.3593 d1.dn_loss_cls: 0.1638 d1.dn_loss_bbox: 0.2010 d1.dn_loss_iou: 0.2427 d2.dn_loss_cls: 0.1453 d2.dn_loss_bbox: 0.1758 d2.dn_loss_iou: 0.2204 d3.dn_loss_cls: 0.1409 d3.dn_loss_bbox: 0.1664 d3.dn_loss_iou: 0.2111 d4.dn_loss_cls: 0.1353 d4.dn_loss_bbox: 0.1644 d4.dn_loss_iou: 0.2086 d1.loss_lmm_region: 0.1718 loss_lmm_image: 0.9972 2024/11/11 04:55:19 - mmengine - INFO - Iter(train) [ 32300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:42:48 time: 1.9995 data_time: 0.0175 memory: 34932 grad_norm: 28.4568 loss: 9.7524 
loss_cls: 0.2954 loss_bbox: 0.1414 loss_iou: 0.2722 d0.loss_cls: 0.3337 d0.loss_bbox: 0.1516 d0.loss_iou: 0.2864 d1.loss_cls: 0.3067 d1.loss_bbox: 0.1474 d1.loss_iou: 0.2765 d2.loss_cls: 0.2989 d2.loss_bbox: 0.1425 d2.loss_iou: 0.2743 d3.loss_cls: 0.2969 d3.loss_bbox: 0.1396 d3.loss_iou: 0.2706 d4.loss_cls: 0.2951 d4.loss_bbox: 0.1415 d4.loss_iou: 0.2714 enc_loss_cls: 0.3386 enc_loss_bbox: 0.1609 enc_loss_iou: 0.3041 dn_loss_cls: 0.1369 dn_loss_bbox: 0.1586 dn_loss_iou: 0.2109 d0.dn_loss_cls: 0.1941 d0.dn_loss_bbox: 0.3096 d0.dn_loss_iou: 0.3518 d1.dn_loss_cls: 0.1578 d1.dn_loss_bbox: 0.1888 d1.dn_loss_iou: 0.2411 d2.dn_loss_cls: 0.1450 d2.dn_loss_bbox: 0.1675 d2.dn_loss_iou: 0.2200 d3.dn_loss_cls: 0.1406 d3.dn_loss_bbox: 0.1608 d3.dn_loss_iou: 0.2130 d4.dn_loss_cls: 0.1390 d4.dn_loss_bbox: 0.1586 d4.dn_loss_iou: 0.2109 d1.loss_lmm_region: 0.1463 loss_lmm_image: 0.9553 2024/11/11 04:58:39 - mmengine - INFO - Iter(train) [ 32400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:39:23 time: 2.0108 data_time: 0.0204 memory: 33674 grad_norm: 31.7033 loss: 10.5144 loss_cls: 0.3957 loss_bbox: 0.1353 loss_iou: 0.2519 d0.loss_cls: 0.4386 d0.loss_bbox: 0.1492 d0.loss_iou: 0.2674 d1.loss_cls: 0.4090 d1.loss_bbox: 0.1452 d1.loss_iou: 0.2624 d2.loss_cls: 0.4043 d2.loss_bbox: 0.1366 d2.loss_iou: 0.2545 d3.loss_cls: 0.3990 d3.loss_bbox: 0.1354 d3.loss_iou: 0.2522 d4.loss_cls: 0.3959 d4.loss_bbox: 0.1370 d4.loss_iou: 0.2529 enc_loss_cls: 0.4416 enc_loss_bbox: 0.1625 enc_loss_iou: 0.2896 dn_loss_cls: 0.1567 dn_loss_bbox: 0.1576 dn_loss_iou: 0.2105 d0.dn_loss_cls: 0.2321 d0.dn_loss_bbox: 0.3222 d0.dn_loss_iou: 0.3622 d1.dn_loss_cls: 0.1850 d1.dn_loss_bbox: 0.1920 d1.dn_loss_iou: 0.2443 d2.dn_loss_cls: 0.1659 d2.dn_loss_bbox: 0.1658 d2.dn_loss_iou: 0.2206 d3.dn_loss_cls: 0.1606 d3.dn_loss_bbox: 0.1584 d3.dn_loss_iou: 0.2131 d4.dn_loss_cls: 0.1568 d4.dn_loss_bbox: 0.1577 d4.dn_loss_iou: 0.2107 d1.loss_lmm_region: 0.1829 loss_lmm_image: 0.9431 2024/11/11 05:01:59 - mmengine - INFO - Iter(train) [ 32500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:35:57 time: 1.9978 data_time: 0.0184 memory: 33198 grad_norm: 28.7332 loss: 9.5728 loss_cls: 0.2925 loss_bbox: 0.1400 loss_iou: 0.2301 d0.loss_cls: 0.3393 d0.loss_bbox: 0.1540 d0.loss_iou: 0.2478 d1.loss_cls: 0.3062 d1.loss_bbox: 0.1472 d1.loss_iou: 0.2391 d2.loss_cls: 0.2975 d2.loss_bbox: 0.1459 d2.loss_iou: 0.2342 d3.loss_cls: 0.2946 d3.loss_bbox: 0.1429 d3.loss_iou: 0.2325 d4.loss_cls: 0.2902 d4.loss_bbox: 0.1416 d4.loss_iou: 0.2306 enc_loss_cls: 0.3350 enc_loss_bbox: 0.1684 enc_loss_iou: 0.2683 dn_loss_cls: 0.1257 dn_loss_bbox: 0.1716 dn_loss_iou: 0.2200 d0.dn_loss_cls: 0.2030 d0.dn_loss_bbox: 0.3364 d0.dn_loss_iou: 0.3769 d1.dn_loss_cls: 0.1512 d1.dn_loss_bbox: 0.2060 d1.dn_loss_iou: 0.2553 d2.dn_loss_cls: 0.1334 d2.dn_loss_bbox: 0.1808 d2.dn_loss_iou: 0.2308 d3.dn_loss_cls: 0.1282 d3.dn_loss_bbox: 0.1735 d3.dn_loss_iou: 0.2232 d4.dn_loss_cls: 0.1257 d4.dn_loss_bbox: 0.1715 d4.dn_loss_iou: 0.2200 d1.loss_lmm_region: 0.1724 loss_lmm_image: 0.8897 2024/11/11 05:05:18 - mmengine - INFO - Iter(train) [ 32600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:32:28 time: 1.9977 data_time: 0.0200 memory: 34383 grad_norm: 32.9659 loss: 9.7168 loss_cls: 0.3203 loss_bbox: 0.1291 loss_iou: 0.2573 d0.loss_cls: 0.3691 d0.loss_bbox: 0.1472 d0.loss_iou: 0.2768 d1.loss_cls: 0.3372 d1.loss_bbox: 0.1402 d1.loss_iou: 0.2665 d2.loss_cls: 0.3289 d2.loss_bbox: 0.1338 d2.loss_iou: 0.2617 d3.loss_cls: 0.3250 d3.loss_bbox: 0.1291 d3.loss_iou: 0.2588 
d4.loss_cls: 0.3222 d4.loss_bbox: 0.1289 d4.loss_iou: 0.2571 enc_loss_cls: 0.3740 enc_loss_bbox: 0.1575 enc_loss_iou: 0.2952 dn_loss_cls: 0.1233 dn_loss_bbox: 0.1583 dn_loss_iou: 0.2118 d0.dn_loss_cls: 0.1984 d0.dn_loss_bbox: 0.3108 d0.dn_loss_iou: 0.3625 d1.dn_loss_cls: 0.1533 d1.dn_loss_bbox: 0.1917 d1.dn_loss_iou: 0.2460 d2.dn_loss_cls: 0.1304 d2.dn_loss_bbox: 0.1685 d2.dn_loss_iou: 0.2228 d3.dn_loss_cls: 0.1265 d3.dn_loss_bbox: 0.1613 d3.dn_loss_iou: 0.2151 d4.dn_loss_cls: 0.1216 d4.dn_loss_bbox: 0.1584 d4.dn_loss_iou: 0.2119 d1.loss_lmm_region: 0.1489 loss_lmm_image: 0.8797 2024/11/11 05:08:37 - mmengine - INFO - Iter(train) [ 32700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:29:01 time: 2.0015 data_time: 0.0199 memory: 34801 grad_norm: 33.7794 loss: 10.5135 loss_cls: 0.3427 loss_bbox: 0.1528 loss_iou: 0.2641 d0.loss_cls: 0.3977 d0.loss_bbox: 0.1670 d0.loss_iou: 0.2785 d1.loss_cls: 0.3712 d1.loss_bbox: 0.1562 d1.loss_iou: 0.2701 d2.loss_cls: 0.3582 d2.loss_bbox: 0.1532 d2.loss_iou: 0.2670 d3.loss_cls: 0.3476 d3.loss_bbox: 0.1549 d3.loss_iou: 0.2686 d4.loss_cls: 0.3443 d4.loss_bbox: 0.1524 d4.loss_iou: 0.2641 enc_loss_cls: 0.3920 enc_loss_bbox: 0.1786 enc_loss_iou: 0.2971 dn_loss_cls: 0.1609 dn_loss_bbox: 0.1778 dn_loss_iou: 0.2138 d0.dn_loss_cls: 0.2519 d0.dn_loss_bbox: 0.3314 d0.dn_loss_iou: 0.3563 d1.dn_loss_cls: 0.1967 d1.dn_loss_bbox: 0.2151 d1.dn_loss_iou: 0.2480 d2.dn_loss_cls: 0.1753 d2.dn_loss_bbox: 0.1886 d2.dn_loss_iou: 0.2236 d3.dn_loss_cls: 0.1652 d3.dn_loss_bbox: 0.1806 d3.dn_loss_iou: 0.2170 d4.dn_loss_cls: 0.1612 d4.dn_loss_bbox: 0.1778 d4.dn_loss_iou: 0.2138 d1.loss_lmm_region: 0.1834 loss_lmm_image: 0.8972 2024/11/11 05:11:56 - mmengine - INFO - Iter(train) [ 32800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:25:31 time: 1.9635 data_time: 0.0182 memory: 34681 grad_norm: 33.8089 loss: 9.3822 loss_cls: 0.2912 loss_bbox: 0.1380 loss_iou: 0.2110 d0.loss_cls: 0.3304 d0.loss_bbox: 0.1549 d0.loss_iou: 0.2287 d1.loss_cls: 0.3017 d1.loss_bbox: 0.1514 d1.loss_iou: 0.2245 d2.loss_cls: 0.2984 d2.loss_bbox: 0.1450 d2.loss_iou: 0.2177 d3.loss_cls: 0.2954 d3.loss_bbox: 0.1408 d3.loss_iou: 0.2133 d4.loss_cls: 0.2900 d4.loss_bbox: 0.1403 d4.loss_iou: 0.2116 enc_loss_cls: 0.3287 enc_loss_bbox: 0.1750 enc_loss_iou: 0.2539 dn_loss_cls: 0.1257 dn_loss_bbox: 0.1797 dn_loss_iou: 0.2049 d0.dn_loss_cls: 0.2086 d0.dn_loss_bbox: 0.3326 d0.dn_loss_iou: 0.3472 d1.dn_loss_cls: 0.1561 d1.dn_loss_bbox: 0.2120 d1.dn_loss_iou: 0.2344 d2.dn_loss_cls: 0.1376 d2.dn_loss_bbox: 0.1884 d2.dn_loss_iou: 0.2132 d3.dn_loss_cls: 0.1318 d3.dn_loss_bbox: 0.1816 d3.dn_loss_iou: 0.2074 d4.dn_loss_cls: 0.1267 d4.dn_loss_bbox: 0.1796 d4.dn_loss_iou: 0.2049 d1.loss_lmm_region: 0.1696 loss_lmm_image: 0.8984 2024/11/11 05:15:15 - mmengine - INFO - Iter(train) [ 32900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:22:06 time: 1.9718 data_time: 0.0180 memory: 34445 grad_norm: 37.0530 loss: 9.3482 loss_cls: 0.2490 loss_bbox: 0.1401 loss_iou: 0.2279 d0.loss_cls: 0.3022 d0.loss_bbox: 0.1505 d0.loss_iou: 0.2393 d1.loss_cls: 0.2652 d1.loss_bbox: 0.1486 d1.loss_iou: 0.2356 d2.loss_cls: 0.2601 d2.loss_bbox: 0.1458 d2.loss_iou: 0.2325 d3.loss_cls: 0.2527 d3.loss_bbox: 0.1386 d3.loss_iou: 0.2274 d4.loss_cls: 0.2496 d4.loss_bbox: 0.1395 d4.loss_iou: 0.2281 enc_loss_cls: 0.3033 enc_loss_bbox: 0.1660 enc_loss_iou: 0.2626 dn_loss_cls: 0.0940 dn_loss_bbox: 0.2071 dn_loss_iou: 0.2375 d0.dn_loss_cls: 0.1834 d0.dn_loss_bbox: 0.3562 d0.dn_loss_iou: 0.3794 d1.dn_loss_cls: 0.1270 d1.dn_loss_bbox: 
0.2404 d1.dn_loss_iou: 0.2684 d2.dn_loss_cls: 0.1071 d2.dn_loss_bbox: 0.2175 d2.dn_loss_iou: 0.2474 d3.dn_loss_cls: 0.0988 d3.dn_loss_bbox: 0.2095 d3.dn_loss_iou: 0.2399 d4.dn_loss_cls: 0.0946 d4.dn_loss_bbox: 0.2070 d4.dn_loss_iou: 0.2375 d1.loss_lmm_region: 0.1739 loss_lmm_image: 0.8570 2024/11/11 05:18:33 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 05:18:33 - mmengine - INFO - Iter(train) [ 33000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:18:34 time: 1.9506 data_time: 0.0176 memory: 33864 grad_norm: nan loss: 9.6485 loss_cls: 0.3030 loss_bbox: 0.1436 loss_iou: 0.2543 d0.loss_cls: 0.3464 d0.loss_bbox: 0.1573 d0.loss_iou: 0.2707 d1.loss_cls: 0.3197 d1.loss_bbox: 0.1474 d1.loss_iou: 0.2596 d2.loss_cls: 0.3081 d2.loss_bbox: 0.1428 d2.loss_iou: 0.2546 d3.loss_cls: 0.3070 d3.loss_bbox: 0.1429 d3.loss_iou: 0.2531 d4.loss_cls: 0.3042 d4.loss_bbox: 0.1414 d4.loss_iou: 0.2521 enc_loss_cls: 0.3482 enc_loss_bbox: 0.1741 enc_loss_iou: 0.2936 dn_loss_cls: 0.0906 dn_loss_bbox: 0.1769 dn_loss_iou: 0.2206 d0.dn_loss_cls: 0.1808 d0.dn_loss_bbox: 0.3312 d0.dn_loss_iou: 0.3633 d1.dn_loss_cls: 0.1297 d1.dn_loss_bbox: 0.2105 d1.dn_loss_iou: 0.2511 d2.dn_loss_cls: 0.1056 d2.dn_loss_bbox: 0.1892 d2.dn_loss_iou: 0.2324 d3.dn_loss_cls: 0.0992 d3.dn_loss_bbox: 0.1801 d3.dn_loss_iou: 0.2240 d4.dn_loss_cls: 0.0918 d4.dn_loss_bbox: 0.1770 d4.dn_loss_iou: 0.2206 d1.loss_lmm_region: 0.1437 loss_lmm_image: 0.9060 2024/11/11 05:21:50 - mmengine - INFO - Iter(train) [ 33100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:14:59 time: 1.9840 data_time: 0.0174 memory: 33734 grad_norm: 28.0674 loss: 8.3237 loss_cls: 0.2481 loss_bbox: 0.1051 loss_iou: 0.2140 d0.loss_cls: 0.2883 d0.loss_bbox: 0.1112 d0.loss_iou: 0.2235 d1.loss_cls: 0.2628 d1.loss_bbox: 0.1081 d1.loss_iou: 0.2175 d2.loss_cls: 0.2566 d2.loss_bbox: 0.1058 d2.loss_iou: 0.2150 d3.loss_cls: 0.2538 d3.loss_bbox: 0.1042 d3.loss_iou: 0.2121 d4.loss_cls: 0.2473 d4.loss_bbox: 0.1050 d4.loss_iou: 0.2142 enc_loss_cls: 0.2904 enc_loss_bbox: 0.1277 enc_loss_iou: 0.2480 dn_loss_cls: 0.1173 dn_loss_bbox: 0.1368 dn_loss_iou: 0.1878 d0.dn_loss_cls: 0.1837 d0.dn_loss_bbox: 0.2737 d0.dn_loss_iou: 0.3245 d1.dn_loss_cls: 0.1391 d1.dn_loss_bbox: 0.1685 d1.dn_loss_iou: 0.2176 d2.dn_loss_cls: 0.1280 d2.dn_loss_bbox: 0.1466 d2.dn_loss_iou: 0.1973 d3.dn_loss_cls: 0.1218 d3.dn_loss_bbox: 0.1397 d3.dn_loss_iou: 0.1909 d4.dn_loss_cls: 0.1182 d4.dn_loss_bbox: 0.1368 d4.dn_loss_iou: 0.1877 d1.loss_lmm_region: 0.1532 loss_lmm_image: 0.8957 2024/11/11 05:25:09 - mmengine - INFO - Iter(train) [ 33200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:11:32 time: 1.9772 data_time: 0.0175 memory: 34089 grad_norm: 30.5498 loss: 9.8095 loss_cls: 0.3136 loss_bbox: 0.1401 loss_iou: 0.2424 d0.loss_cls: 0.3775 d0.loss_bbox: 0.1451 d0.loss_iou: 0.2512 d1.loss_cls: 0.3400 d1.loss_bbox: 0.1396 d1.loss_iou: 0.2436 d2.loss_cls: 0.3242 d2.loss_bbox: 0.1389 d2.loss_iou: 0.2427 d3.loss_cls: 0.3141 d3.loss_bbox: 0.1398 d3.loss_iou: 0.2447 d4.loss_cls: 0.3152 d4.loss_bbox: 0.1399 d4.loss_iou: 0.2421 enc_loss_cls: 0.3613 enc_loss_bbox: 0.1652 enc_loss_iou: 0.2790 dn_loss_cls: 0.1410 dn_loss_bbox: 0.1612 dn_loss_iou: 0.2173 d0.dn_loss_cls: 0.2247 d0.dn_loss_bbox: 0.3112 d0.dn_loss_iou: 0.3612 d1.dn_loss_cls: 0.1705 d1.dn_loss_bbox: 0.1939 d1.dn_loss_iou: 0.2479 d2.dn_loss_cls: 0.1521 d2.dn_loss_bbox: 0.1707 d2.dn_loss_iou: 0.2274 d3.dn_loss_cls: 0.1448 d3.dn_loss_bbox: 0.1629 d3.dn_loss_iou: 0.2199 d4.dn_loss_cls: 0.1439 d4.dn_loss_bbox: 0.1612 
d4.dn_loss_iou: 0.2173 d1.loss_lmm_region: 0.1776 loss_lmm_image: 0.9025 2024/11/11 05:28:29 - mmengine - INFO - Iter(train) [ 33300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:08:06 time: 2.0070 data_time: 0.0180 memory: 31745 grad_norm: 28.7538 loss: 8.5366 loss_cls: 0.2852 loss_bbox: 0.1130 loss_iou: 0.2186 d0.loss_cls: 0.3386 d0.loss_bbox: 0.1166 d0.loss_iou: 0.2308 d1.loss_cls: 0.3077 d1.loss_bbox: 0.1137 d1.loss_iou: 0.2232 d2.loss_cls: 0.2913 d2.loss_bbox: 0.1140 d2.loss_iou: 0.2209 d3.loss_cls: 0.2871 d3.loss_bbox: 0.1124 d3.loss_iou: 0.2180 d4.loss_cls: 0.2851 d4.loss_bbox: 0.1131 d4.loss_iou: 0.2185 enc_loss_cls: 0.3306 enc_loss_bbox: 0.1340 enc_loss_iou: 0.2586 dn_loss_cls: 0.1233 dn_loss_bbox: 0.1248 dn_loss_iou: 0.1792 d0.dn_loss_cls: 0.1767 d0.dn_loss_bbox: 0.2518 d0.dn_loss_iou: 0.3083 d1.dn_loss_cls: 0.1446 d1.dn_loss_bbox: 0.1531 d1.dn_loss_iou: 0.2073 d2.dn_loss_cls: 0.1302 d2.dn_loss_bbox: 0.1329 d2.dn_loss_iou: 0.1868 d3.dn_loss_cls: 0.1241 d3.dn_loss_bbox: 0.1273 d3.dn_loss_iou: 0.1816 d4.dn_loss_cls: 0.1233 d4.dn_loss_bbox: 0.1248 d4.dn_loss_iou: 0.1792 d1.loss_lmm_region: 0.1365 loss_lmm_image: 0.8900 2024/11/11 05:31:50 - mmengine - INFO - Iter(train) [ 33400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:04:46 time: 2.0149 data_time: 0.0175 memory: 33472 grad_norm: 30.5101 loss: 9.8526 loss_cls: 0.3144 loss_bbox: 0.1299 loss_iou: 0.2483 d0.loss_cls: 0.3634 d0.loss_bbox: 0.1420 d0.loss_iou: 0.2616 d1.loss_cls: 0.3350 d1.loss_bbox: 0.1326 d1.loss_iou: 0.2513 d2.loss_cls: 0.3194 d2.loss_bbox: 0.1359 d2.loss_iou: 0.2539 d3.loss_cls: 0.3174 d3.loss_bbox: 0.1302 d3.loss_iou: 0.2495 d4.loss_cls: 0.3153 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2486 enc_loss_cls: 0.3644 enc_loss_bbox: 0.1528 enc_loss_iou: 0.2819 dn_loss_cls: 0.1487 dn_loss_bbox: 0.1592 dn_loss_iou: 0.2263 d0.dn_loss_cls: 0.2267 d0.dn_loss_bbox: 0.3180 d0.dn_loss_iou: 0.3770 d1.dn_loss_cls: 0.1790 d1.dn_loss_bbox: 0.1941 d1.dn_loss_iou: 0.2573 d2.dn_loss_cls: 0.1608 d2.dn_loss_bbox: 0.1684 d2.dn_loss_iou: 0.2351 d3.dn_loss_cls: 0.1523 d3.dn_loss_bbox: 0.1609 d3.dn_loss_iou: 0.2288 d4.dn_loss_cls: 0.1488 d4.dn_loss_bbox: 0.1593 d4.dn_loss_iou: 0.2263 d1.loss_lmm_region: 0.1740 loss_lmm_image: 0.8731 2024/11/11 05:35:10 - mmengine - INFO - Iter(train) [ 33500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 17:01:21 time: 1.9897 data_time: 0.0175 memory: 34920 grad_norm: 29.2624 loss: 9.4962 loss_cls: 0.2711 loss_bbox: 0.1391 loss_iou: 0.2370 d0.loss_cls: 0.3127 d0.loss_bbox: 0.1468 d0.loss_iou: 0.2498 d1.loss_cls: 0.2923 d1.loss_bbox: 0.1435 d1.loss_iou: 0.2409 d2.loss_cls: 0.2828 d2.loss_bbox: 0.1402 d2.loss_iou: 0.2406 d3.loss_cls: 0.2761 d3.loss_bbox: 0.1405 d3.loss_iou: 0.2384 d4.loss_cls: 0.2721 d4.loss_bbox: 0.1395 d4.loss_iou: 0.2375 enc_loss_cls: 0.3129 enc_loss_bbox: 0.1606 enc_loss_iou: 0.2681 dn_loss_cls: 0.1085 dn_loss_bbox: 0.1949 dn_loss_iou: 0.2343 d0.dn_loss_cls: 0.1871 d0.dn_loss_bbox: 0.3405 d0.dn_loss_iou: 0.3828 d1.dn_loss_cls: 0.1368 d1.dn_loss_bbox: 0.2195 d1.dn_loss_iou: 0.2651 d2.dn_loss_cls: 0.1246 d2.dn_loss_bbox: 0.1992 d2.dn_loss_iou: 0.2434 d3.dn_loss_cls: 0.1195 d3.dn_loss_bbox: 0.1954 d3.dn_loss_iou: 0.2363 d4.dn_loss_cls: 0.1120 d4.dn_loss_bbox: 0.1948 d4.dn_loss_iou: 0.2344 d1.loss_lmm_region: 0.1490 loss_lmm_image: 0.8756 2024/11/11 05:38:28 - mmengine - INFO - Iter(train) [ 33600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:57:51 time: 1.9807 data_time: 0.0175 memory: 33112 grad_norm: 31.2802 loss: 9.9246 loss_cls: 0.3191 loss_bbox: 
0.1371 loss_iou: 0.2459 d0.loss_cls: 0.3686 d0.loss_bbox: 0.1537 d0.loss_iou: 0.2711 d1.loss_cls: 0.3412 d1.loss_bbox: 0.1413 d1.loss_iou: 0.2518 d2.loss_cls: 0.3304 d2.loss_bbox: 0.1392 d2.loss_iou: 0.2483 d3.loss_cls: 0.3238 d3.loss_bbox: 0.1364 d3.loss_iou: 0.2465 d4.loss_cls: 0.3208 d4.loss_bbox: 0.1354 d4.loss_iou: 0.2451 enc_loss_cls: 0.3788 enc_loss_bbox: 0.1597 enc_loss_iou: 0.2857 dn_loss_cls: 0.1356 dn_loss_bbox: 0.1688 dn_loss_iou: 0.2257 d0.dn_loss_cls: 0.2288 d0.dn_loss_bbox: 0.3110 d0.dn_loss_iou: 0.3713 d1.dn_loss_cls: 0.1712 d1.dn_loss_bbox: 0.2038 d1.dn_loss_iou: 0.2592 d2.dn_loss_cls: 0.1491 d2.dn_loss_bbox: 0.1804 d2.dn_loss_iou: 0.2364 d3.dn_loss_cls: 0.1397 d3.dn_loss_bbox: 0.1716 d3.dn_loss_iou: 0.2283 d4.dn_loss_cls: 0.1372 d4.dn_loss_bbox: 0.1689 d4.dn_loss_iou: 0.2258 d1.loss_lmm_region: 0.1647 loss_lmm_image: 0.8670 2024/11/11 05:41:48 - mmengine - INFO - Iter(train) [ 33700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:54:24 time: 1.9950 data_time: 0.0175 memory: 35334 grad_norm: 30.9386 loss: 9.5308 loss_cls: 0.3187 loss_bbox: 0.1343 loss_iou: 0.2413 d0.loss_cls: 0.3730 d0.loss_bbox: 0.1441 d0.loss_iou: 0.2526 d1.loss_cls: 0.3387 d1.loss_bbox: 0.1377 d1.loss_iou: 0.2462 d2.loss_cls: 0.3284 d2.loss_bbox: 0.1373 d2.loss_iou: 0.2430 d3.loss_cls: 0.3242 d3.loss_bbox: 0.1340 d3.loss_iou: 0.2395 d4.loss_cls: 0.3206 d4.loss_bbox: 0.1336 d4.loss_iou: 0.2395 enc_loss_cls: 0.3691 enc_loss_bbox: 0.1514 enc_loss_iou: 0.2688 dn_loss_cls: 0.1342 dn_loss_bbox: 0.1536 dn_loss_iou: 0.2046 d0.dn_loss_cls: 0.2178 d0.dn_loss_bbox: 0.2989 d0.dn_loss_iou: 0.3464 d1.dn_loss_cls: 0.1674 d1.dn_loss_bbox: 0.1844 d1.dn_loss_iou: 0.2336 d2.dn_loss_cls: 0.1477 d2.dn_loss_bbox: 0.1607 d2.dn_loss_iou: 0.2123 d3.dn_loss_cls: 0.1391 d3.dn_loss_bbox: 0.1560 d3.dn_loss_iou: 0.2069 d4.dn_loss_cls: 0.1349 d4.dn_loss_bbox: 0.1537 d4.dn_loss_iou: 0.2047 d1.loss_lmm_region: 0.1578 loss_lmm_image: 0.8397 2024/11/11 05:45:06 - mmengine - INFO - Iter(train) [ 33800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:50:54 time: 1.9658 data_time: 0.0177 memory: 33785 grad_norm: 29.7783 loss: 11.1791 loss_cls: 0.4399 loss_bbox: 0.1520 loss_iou: 0.2741 d0.loss_cls: 0.4952 d0.loss_bbox: 0.1611 d0.loss_iou: 0.2884 d1.loss_cls: 0.4654 d1.loss_bbox: 0.1589 d1.loss_iou: 0.2810 d2.loss_cls: 0.4502 d2.loss_bbox: 0.1535 d2.loss_iou: 0.2752 d3.loss_cls: 0.4414 d3.loss_bbox: 0.1528 d3.loss_iou: 0.2728 d4.loss_cls: 0.4414 d4.loss_bbox: 0.1514 d4.loss_iou: 0.2721 enc_loss_cls: 0.4921 enc_loss_bbox: 0.1795 enc_loss_iou: 0.3124 dn_loss_cls: 0.1560 dn_loss_bbox: 0.1682 dn_loss_iou: 0.2271 d0.dn_loss_cls: 0.2335 d0.dn_loss_bbox: 0.3085 d0.dn_loss_iou: 0.3696 d1.dn_loss_cls: 0.1811 d1.dn_loss_bbox: 0.2016 d1.dn_loss_iou: 0.2603 d2.dn_loss_cls: 0.1632 d2.dn_loss_bbox: 0.1769 d2.dn_loss_iou: 0.2363 d3.dn_loss_cls: 0.1582 d3.dn_loss_bbox: 0.1697 d3.dn_loss_iou: 0.2292 d4.dn_loss_cls: 0.1566 d4.dn_loss_bbox: 0.1682 d4.dn_loss_iou: 0.2271 d1.loss_lmm_region: 0.1727 loss_lmm_image: 0.9043 2024/11/11 05:48:23 - mmengine - INFO - Iter(train) [ 33900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:47:22 time: 1.9673 data_time: 0.0173 memory: 33637 grad_norm: 26.2775 loss: 9.9339 loss_cls: 0.3471 loss_bbox: 0.1352 loss_iou: 0.2439 d0.loss_cls: 0.4003 d0.loss_bbox: 0.1444 d0.loss_iou: 0.2585 d1.loss_cls: 0.3670 d1.loss_bbox: 0.1378 d1.loss_iou: 0.2477 d2.loss_cls: 0.3567 d2.loss_bbox: 0.1348 d2.loss_iou: 0.2458 d3.loss_cls: 0.3550 d3.loss_bbox: 0.1336 d3.loss_iou: 0.2432 d4.loss_cls: 0.3467 d4.loss_bbox: 
0.1335 d4.loss_iou: 0.2433 enc_loss_cls: 0.3987 enc_loss_bbox: 0.1623 enc_loss_iou: 0.2827 dn_loss_cls: 0.1351 dn_loss_bbox: 0.1607 dn_loss_iou: 0.2065 d0.dn_loss_cls: 0.2356 d0.dn_loss_bbox: 0.3050 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1773 d1.dn_loss_bbox: 0.1991 d1.dn_loss_iou: 0.2423 d2.dn_loss_cls: 0.1539 d2.dn_loss_bbox: 0.1738 d2.dn_loss_iou: 0.2180 d3.dn_loss_cls: 0.1423 d3.dn_loss_bbox: 0.1641 d3.dn_loss_iou: 0.2101 d4.dn_loss_cls: 0.1371 d4.dn_loss_bbox: 0.1608 d4.dn_loss_iou: 0.2066 d1.loss_lmm_region: 0.1622 loss_lmm_image: 0.8708 2024/11/11 05:51:42 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 05:51:42 - mmengine - INFO - Iter(train) [ 34000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:43:54 time: 1.9913 data_time: 0.0182 memory: 34967 grad_norm: 28.7173 loss: 11.2200 loss_cls: 0.3283 loss_bbox: 0.1865 loss_iou: 0.3250 d0.loss_cls: 0.3845 d0.loss_bbox: 0.2004 d0.loss_iou: 0.3410 d1.loss_cls: 0.3532 d1.loss_bbox: 0.1919 d1.loss_iou: 0.3310 d2.loss_cls: 0.3497 d2.loss_bbox: 0.1825 d2.loss_iou: 0.3190 d3.loss_cls: 0.3369 d3.loss_bbox: 0.1844 d3.loss_iou: 0.3230 d4.loss_cls: 0.3295 d4.loss_bbox: 0.1867 d4.loss_iou: 0.3254 enc_loss_cls: 0.3833 enc_loss_bbox: 0.2085 enc_loss_iou: 0.3590 dn_loss_cls: 0.1157 dn_loss_bbox: 0.2061 dn_loss_iou: 0.2577 d0.dn_loss_cls: 0.2039 d0.dn_loss_bbox: 0.3735 d0.dn_loss_iou: 0.4069 d1.dn_loss_cls: 0.1495 d1.dn_loss_bbox: 0.2413 d1.dn_loss_iou: 0.2881 d2.dn_loss_cls: 0.1299 d2.dn_loss_bbox: 0.2169 d2.dn_loss_iou: 0.2669 d3.dn_loss_cls: 0.1209 d3.dn_loss_bbox: 0.2083 d3.dn_loss_iou: 0.2599 d4.dn_loss_cls: 0.1160 d4.dn_loss_bbox: 0.2060 d4.dn_loss_iou: 0.2577 d1.loss_lmm_region: 0.1469 loss_lmm_image: 0.9177 2024/11/11 05:55:02 - mmengine - INFO - Iter(train) [ 34100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:40:29 time: 1.9802 data_time: 0.0175 memory: 34560 grad_norm: 27.0302 loss: 10.7090 loss_cls: 0.3251 loss_bbox: 0.1699 loss_iou: 0.2970 d0.loss_cls: 0.3838 d0.loss_bbox: 0.1848 d0.loss_iou: 0.3103 d1.loss_cls: 0.3494 d1.loss_bbox: 0.1796 d1.loss_iou: 0.3054 d2.loss_cls: 0.3420 d2.loss_bbox: 0.1684 d2.loss_iou: 0.2944 d3.loss_cls: 0.3294 d3.loss_bbox: 0.1712 d3.loss_iou: 0.2954 d4.loss_cls: 0.3266 d4.loss_bbox: 0.1694 d4.loss_iou: 0.2955 enc_loss_cls: 0.3831 enc_loss_bbox: 0.2032 enc_loss_iou: 0.3391 dn_loss_cls: 0.1093 dn_loss_bbox: 0.1944 dn_loss_iou: 0.2504 d0.dn_loss_cls: 0.1897 d0.dn_loss_bbox: 0.3618 d0.dn_loss_iou: 0.4051 d1.dn_loss_cls: 0.1412 d1.dn_loss_bbox: 0.2333 d1.dn_loss_iou: 0.2858 d2.dn_loss_cls: 0.1207 d2.dn_loss_bbox: 0.2063 d2.dn_loss_iou: 0.2611 d3.dn_loss_cls: 0.1158 d3.dn_loss_bbox: 0.1986 d3.dn_loss_iou: 0.2541 d4.dn_loss_cls: 0.1099 d4.dn_loss_bbox: 0.1944 d4.dn_loss_iou: 0.2503 d1.loss_lmm_region: 0.1361 loss_lmm_image: 0.8677 2024/11/11 05:58:19 - mmengine - INFO - Iter(train) [ 34200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:36:56 time: 1.9702 data_time: 0.0196 memory: 34264 grad_norm: 33.7478 loss: 10.0570 loss_cls: 0.3369 loss_bbox: 0.1293 loss_iou: 0.2413 d0.loss_cls: 0.3930 d0.loss_bbox: 0.1435 d0.loss_iou: 0.2609 d1.loss_cls: 0.3601 d1.loss_bbox: 0.1377 d1.loss_iou: 0.2519 d2.loss_cls: 0.3465 d2.loss_bbox: 0.1325 d2.loss_iou: 0.2441 d3.loss_cls: 0.3396 d3.loss_bbox: 0.1282 d3.loss_iou: 0.2391 d4.loss_cls: 0.3354 d4.loss_bbox: 0.1300 d4.loss_iou: 0.2420 enc_loss_cls: 0.3917 enc_loss_bbox: 0.1580 enc_loss_iou: 0.2879 dn_loss_cls: 0.1358 dn_loss_bbox: 0.1833 dn_loss_iou: 0.2146 d0.dn_loss_cls: 0.2232 d0.dn_loss_bbox: 0.3419 
d0.dn_loss_iou: 0.3682 d1.dn_loss_cls: 0.1690 d1.dn_loss_bbox: 0.2147 d1.dn_loss_iou: 0.2480 d2.dn_loss_cls: 0.1480 d2.dn_loss_bbox: 0.1932 d2.dn_loss_iou: 0.2248 d3.dn_loss_cls: 0.1410 d3.dn_loss_bbox: 0.1868 d3.dn_loss_iou: 0.2183 d4.dn_loss_cls: 0.1371 d4.dn_loss_bbox: 0.1833 d4.dn_loss_iou: 0.2146 d1.loss_lmm_region: 0.1645 loss_lmm_image: 0.9171 2024/11/11 06:01:39 - mmengine - INFO - Iter(train) [ 34300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:33:32 time: 1.9887 data_time: 0.0176 memory: 34960 grad_norm: 26.8341 loss: 10.9001 loss_cls: 0.3343 loss_bbox: 0.1747 loss_iou: 0.3297 d0.loss_cls: 0.3892 d0.loss_bbox: 0.1881 d0.loss_iou: 0.3469 d1.loss_cls: 0.3591 d1.loss_bbox: 0.1774 d1.loss_iou: 0.3379 d2.loss_cls: 0.3459 d2.loss_bbox: 0.1752 d2.loss_iou: 0.3325 d3.loss_cls: 0.3377 d3.loss_bbox: 0.1747 d3.loss_iou: 0.3310 d4.loss_cls: 0.3361 d4.loss_bbox: 0.1742 d4.loss_iou: 0.3289 enc_loss_cls: 0.3847 enc_loss_bbox: 0.2043 enc_loss_iou: 0.3709 dn_loss_cls: 0.1087 dn_loss_bbox: 0.1797 dn_loss_iou: 0.2375 d0.dn_loss_cls: 0.1886 d0.dn_loss_bbox: 0.3332 d0.dn_loss_iou: 0.3875 d1.dn_loss_cls: 0.1391 d1.dn_loss_bbox: 0.2156 d1.dn_loss_iou: 0.2734 d2.dn_loss_cls: 0.1237 d2.dn_loss_bbox: 0.1892 d2.dn_loss_iou: 0.2480 d3.dn_loss_cls: 0.1149 d3.dn_loss_bbox: 0.1826 d3.dn_loss_iou: 0.2404 d4.dn_loss_cls: 0.1097 d4.dn_loss_bbox: 0.1796 d4.dn_loss_iou: 0.2375 d1.loss_lmm_region: 0.1488 loss_lmm_image: 0.9291 2024/11/11 06:04:59 - mmengine - INFO - Iter(train) [ 34400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:30:09 time: 2.0341 data_time: 0.0179 memory: 33997 grad_norm: 35.4279 loss: 10.4219 loss_cls: 0.3361 loss_bbox: 0.1481 loss_iou: 0.2454 d0.loss_cls: 0.3977 d0.loss_bbox: 0.1588 d0.loss_iou: 0.2584 d1.loss_cls: 0.3679 d1.loss_bbox: 0.1538 d1.loss_iou: 0.2552 d2.loss_cls: 0.3516 d2.loss_bbox: 0.1523 d2.loss_iou: 0.2476 d3.loss_cls: 0.3443 d3.loss_bbox: 0.1490 d3.loss_iou: 0.2449 d4.loss_cls: 0.3385 d4.loss_bbox: 0.1486 d4.loss_iou: 0.2453 enc_loss_cls: 0.3896 enc_loss_bbox: 0.1835 enc_loss_iou: 0.2889 dn_loss_cls: 0.1932 dn_loss_bbox: 0.1634 dn_loss_iou: 0.2160 d0.dn_loss_cls: 0.2795 d0.dn_loss_bbox: 0.3032 d0.dn_loss_iou: 0.3509 d1.dn_loss_cls: 0.2316 d1.dn_loss_bbox: 0.1943 d1.dn_loss_iou: 0.2460 d2.dn_loss_cls: 0.2084 d2.dn_loss_bbox: 0.1731 d2.dn_loss_iou: 0.2254 d3.dn_loss_cls: 0.2018 d3.dn_loss_bbox: 0.1651 d3.dn_loss_iou: 0.2189 d4.dn_loss_cls: 0.1946 d4.dn_loss_bbox: 0.1634 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1567 loss_lmm_image: 0.9152 2024/11/11 06:08:18 - mmengine - INFO - Iter(train) [ 34500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:26:39 time: 1.9764 data_time: 0.0198 memory: 35170 grad_norm: 31.8359 loss: 10.2347 loss_cls: 0.3146 loss_bbox: 0.1522 loss_iou: 0.2734 d0.loss_cls: 0.3652 d0.loss_bbox: 0.1619 d0.loss_iou: 0.2889 d1.loss_cls: 0.3392 d1.loss_bbox: 0.1567 d1.loss_iou: 0.2824 d2.loss_cls: 0.3176 d2.loss_bbox: 0.1598 d2.loss_iou: 0.2811 d3.loss_cls: 0.3148 d3.loss_bbox: 0.1542 d3.loss_iou: 0.2770 d4.loss_cls: 0.3163 d4.loss_bbox: 0.1517 d4.loss_iou: 0.2722 enc_loss_cls: 0.3600 enc_loss_bbox: 0.1814 enc_loss_iou: 0.3188 dn_loss_cls: 0.1228 dn_loss_bbox: 0.1827 dn_loss_iou: 0.2298 d0.dn_loss_cls: 0.2088 d0.dn_loss_bbox: 0.3468 d0.dn_loss_iou: 0.3871 d1.dn_loss_cls: 0.1587 d1.dn_loss_bbox: 0.2235 d1.dn_loss_iou: 0.2663 d2.dn_loss_cls: 0.1351 d2.dn_loss_bbox: 0.1967 d2.dn_loss_iou: 0.2423 d3.dn_loss_cls: 0.1278 d3.dn_loss_bbox: 0.1869 d3.dn_loss_iou: 0.2338 d4.dn_loss_cls: 0.1231 d4.dn_loss_bbox: 0.1828 d4.dn_loss_iou: 
0.2299 d1.loss_lmm_region: 0.1513 loss_lmm_image: 0.8591 2024/11/11 06:11:39 - mmengine - INFO - Iter(train) [ 34600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:23:20 time: 2.0047 data_time: 0.0177 memory: 35354 grad_norm: 34.6446 loss: 10.7379 loss_cls: 0.3451 loss_bbox: 0.1536 loss_iou: 0.2568 d0.loss_cls: 0.3954 d0.loss_bbox: 0.1671 d0.loss_iou: 0.2729 d1.loss_cls: 0.3660 d1.loss_bbox: 0.1621 d1.loss_iou: 0.2645 d2.loss_cls: 0.3585 d2.loss_bbox: 0.1523 d2.loss_iou: 0.2584 d3.loss_cls: 0.3452 d3.loss_bbox: 0.1564 d3.loss_iou: 0.2589 d4.loss_cls: 0.3440 d4.loss_bbox: 0.1555 d4.loss_iou: 0.2583 enc_loss_cls: 0.3955 enc_loss_bbox: 0.1811 enc_loss_iou: 0.2931 dn_loss_cls: 0.1731 dn_loss_bbox: 0.1876 dn_loss_iou: 0.2327 d0.dn_loss_cls: 0.2583 d0.dn_loss_bbox: 0.3447 d0.dn_loss_iou: 0.3808 d1.dn_loss_cls: 0.2072 d1.dn_loss_bbox: 0.2263 d1.dn_loss_iou: 0.2667 d2.dn_loss_cls: 0.1875 d2.dn_loss_bbox: 0.2020 d2.dn_loss_iou: 0.2437 d3.dn_loss_cls: 0.1769 d3.dn_loss_bbox: 0.1903 d3.dn_loss_iou: 0.2351 d4.dn_loss_cls: 0.1741 d4.dn_loss_bbox: 0.1876 d4.dn_loss_iou: 0.2326 d1.loss_lmm_region: 0.1972 loss_lmm_image: 0.8929 2024/11/11 06:14:56 - mmengine - INFO - Iter(train) [ 34700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:19:46 time: 1.9787 data_time: 0.0176 memory: 33376 grad_norm: 33.0499 loss: 9.3601 loss_cls: 0.2723 loss_bbox: 0.1443 loss_iou: 0.2402 d0.loss_cls: 0.3145 d0.loss_bbox: 0.1585 d0.loss_iou: 0.2591 d1.loss_cls: 0.2905 d1.loss_bbox: 0.1531 d1.loss_iou: 0.2499 d2.loss_cls: 0.2851 d2.loss_bbox: 0.1465 d2.loss_iou: 0.2413 d3.loss_cls: 0.2773 d3.loss_bbox: 0.1451 d3.loss_iou: 0.2404 d4.loss_cls: 0.2740 d4.loss_bbox: 0.1437 d4.loss_iou: 0.2405 enc_loss_cls: 0.3165 enc_loss_bbox: 0.1728 enc_loss_iou: 0.2825 dn_loss_cls: 0.1138 dn_loss_bbox: 0.1719 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.1849 d0.dn_loss_bbox: 0.2980 d0.dn_loss_iou: 0.3516 d1.dn_loss_cls: 0.1421 d1.dn_loss_bbox: 0.2035 d1.dn_loss_iou: 0.2458 d2.dn_loss_cls: 0.1254 d2.dn_loss_bbox: 0.1820 d2.dn_loss_iou: 0.2242 d3.dn_loss_cls: 0.1179 d3.dn_loss_bbox: 0.1739 d3.dn_loss_iou: 0.2167 d4.dn_loss_cls: 0.1152 d4.dn_loss_bbox: 0.1718 d4.dn_loss_iou: 0.2142 d1.loss_lmm_region: 0.1522 loss_lmm_image: 0.8926 2024/11/11 06:18:13 - mmengine - INFO - Iter(train) [ 34800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:16:14 time: 1.9841 data_time: 0.0181 memory: 32327 grad_norm: 26.7180 loss: 10.5649 loss_cls: 0.3160 loss_bbox: 0.1735 loss_iou: 0.2604 d0.loss_cls: 0.3601 d0.loss_bbox: 0.1904 d0.loss_iou: 0.2790 d1.loss_cls: 0.3321 d1.loss_bbox: 0.1856 d1.loss_iou: 0.2742 d2.loss_cls: 0.3193 d2.loss_bbox: 0.1826 d2.loss_iou: 0.2678 d3.loss_cls: 0.3164 d3.loss_bbox: 0.1775 d3.loss_iou: 0.2648 d4.loss_cls: 0.3155 d4.loss_bbox: 0.1750 d4.loss_iou: 0.2615 enc_loss_cls: 0.3580 enc_loss_bbox: 0.2066 enc_loss_iou: 0.3019 dn_loss_cls: 0.1421 dn_loss_bbox: 0.2075 dn_loss_iou: 0.2271 d0.dn_loss_cls: 0.2159 d0.dn_loss_bbox: 0.3730 d0.dn_loss_iou: 0.3666 d1.dn_loss_cls: 0.1754 d1.dn_loss_bbox: 0.2426 d1.dn_loss_iou: 0.2553 d2.dn_loss_cls: 0.1539 d2.dn_loss_bbox: 0.2185 d2.dn_loss_iou: 0.2357 d3.dn_loss_cls: 0.1502 d3.dn_loss_bbox: 0.2098 d3.dn_loss_iou: 0.2292 d4.dn_loss_cls: 0.1453 d4.dn_loss_bbox: 0.2075 d4.dn_loss_iou: 0.2270 d1.loss_lmm_region: 0.1637 loss_lmm_image: 0.9003 2024/11/11 06:21:33 - mmengine - INFO - Iter(train) [ 34900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:12:49 time: 2.0208 data_time: 0.0176 memory: 34792 grad_norm: 32.3845 loss: 10.5899 loss_cls: 0.3809 loss_bbox: 0.1369 
loss_iou: 0.2906 d0.loss_cls: 0.4318 d0.loss_bbox: 0.1526 d0.loss_iou: 0.3062 d1.loss_cls: 0.4071 d1.loss_bbox: 0.1429 d1.loss_iou: 0.2934 d2.loss_cls: 0.3975 d2.loss_bbox: 0.1367 d2.loss_iou: 0.2905 d3.loss_cls: 0.3822 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2913 d4.loss_cls: 0.3822 d4.loss_bbox: 0.1357 d4.loss_iou: 0.2908 enc_loss_cls: 0.4245 enc_loss_bbox: 0.1675 enc_loss_iou: 0.3296 dn_loss_cls: 0.1724 dn_loss_bbox: 0.1481 dn_loss_iou: 0.2168 d0.dn_loss_cls: 0.2352 d0.dn_loss_bbox: 0.2737 d0.dn_loss_iou: 0.3566 d1.dn_loss_cls: 0.1924 d1.dn_loss_bbox: 0.1763 d1.dn_loss_iou: 0.2462 d2.dn_loss_cls: 0.1705 d2.dn_loss_bbox: 0.1602 d2.dn_loss_iou: 0.2281 d3.dn_loss_cls: 0.1778 d3.dn_loss_bbox: 0.1512 d3.dn_loss_iou: 0.2200 d4.dn_loss_cls: 0.1775 d4.dn_loss_bbox: 0.1481 d4.dn_loss_iou: 0.2168 d1.loss_lmm_region: 0.1466 loss_lmm_image: 0.8678 2024/11/11 06:24:49 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 06:24:49 - mmengine - INFO - Iter(train) [ 35000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:09:13 time: 1.9574 data_time: 0.0201 memory: 33747 grad_norm: 27.3150 loss: 9.1910 loss_cls: 0.3061 loss_bbox: 0.1169 loss_iou: 0.2461 d0.loss_cls: 0.3572 d0.loss_bbox: 0.1287 d0.loss_iou: 0.2596 d1.loss_cls: 0.3311 d1.loss_bbox: 0.1219 d1.loss_iou: 0.2540 d2.loss_cls: 0.3166 d2.loss_bbox: 0.1202 d2.loss_iou: 0.2513 d3.loss_cls: 0.3104 d3.loss_bbox: 0.1176 d3.loss_iou: 0.2492 d4.loss_cls: 0.3055 d4.loss_bbox: 0.1179 d4.loss_iou: 0.2486 enc_loss_cls: 0.3569 enc_loss_bbox: 0.1471 enc_loss_iou: 0.2938 dn_loss_cls: 0.1084 dn_loss_bbox: 0.1476 dn_loss_iou: 0.1965 d0.dn_loss_cls: 0.1914 d0.dn_loss_bbox: 0.3029 d0.dn_loss_iou: 0.3414 d1.dn_loss_cls: 0.1414 d1.dn_loss_bbox: 0.1800 d1.dn_loss_iou: 0.2281 d2.dn_loss_cls: 0.1203 d2.dn_loss_bbox: 0.1579 d2.dn_loss_iou: 0.2058 d3.dn_loss_cls: 0.1129 d3.dn_loss_bbox: 0.1487 d3.dn_loss_iou: 0.1990 d4.dn_loss_cls: 0.1085 d4.dn_loss_bbox: 0.1476 d4.dn_loss_iou: 0.1966 d1.loss_lmm_region: 0.1382 loss_lmm_image: 0.8608 2024/11/11 06:28:10 - mmengine - INFO - Iter(train) [ 35100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:05:51 time: 2.0187 data_time: 0.0205 memory: 34510 grad_norm: 33.7958 loss: 10.0601 loss_cls: 0.3282 loss_bbox: 0.1323 loss_iou: 0.2110 d0.loss_cls: 0.3699 d0.loss_bbox: 0.1454 d0.loss_iou: 0.2186 d1.loss_cls: 0.3445 d1.loss_bbox: 0.1364 d1.loss_iou: 0.2114 d2.loss_cls: 0.3335 d2.loss_bbox: 0.1338 d2.loss_iou: 0.2111 d3.loss_cls: 0.3267 d3.loss_bbox: 0.1346 d3.loss_iou: 0.2116 d4.loss_cls: 0.3283 d4.loss_bbox: 0.1311 d4.loss_iou: 0.2112 enc_loss_cls: 0.3596 enc_loss_bbox: 0.1629 enc_loss_iou: 0.2428 dn_loss_cls: 0.1698 dn_loss_bbox: 0.1974 dn_loss_iou: 0.2218 d0.dn_loss_cls: 0.2639 d0.dn_loss_bbox: 0.3644 d0.dn_loss_iou: 0.3717 d1.dn_loss_cls: 0.2085 d1.dn_loss_bbox: 0.2292 d1.dn_loss_iou: 0.2529 d2.dn_loss_cls: 0.1845 d2.dn_loss_bbox: 0.2051 d2.dn_loss_iou: 0.2323 d3.dn_loss_cls: 0.1738 d3.dn_loss_bbox: 0.1996 d3.dn_loss_iou: 0.2251 d4.dn_loss_cls: 0.1732 d4.dn_loss_bbox: 0.1973 d4.dn_loss_iou: 0.2217 d1.loss_lmm_region: 0.1756 loss_lmm_image: 0.9075 2024/11/11 06:31:28 - mmengine - INFO - Iter(train) [ 35200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 16:02:23 time: 1.9905 data_time: 0.0188 memory: 33264 grad_norm: 26.3499 loss: 9.1455 loss_cls: 0.2700 loss_bbox: 0.1415 loss_iou: 0.2318 d0.loss_cls: 0.3216 d0.loss_bbox: 0.1568 d0.loss_iou: 0.2444 d1.loss_cls: 0.2902 d1.loss_bbox: 0.1492 d1.loss_iou: 0.2413 d2.loss_cls: 0.2808 d2.loss_bbox: 0.1424 d2.loss_iou: 0.2362 d3.loss_cls: 
0.2739 d3.loss_bbox: 0.1421 d3.loss_iou: 0.2350 d4.loss_cls: 0.2697 d4.loss_bbox: 0.1423 d4.loss_iou: 0.2334 enc_loss_cls: 0.3253 enc_loss_bbox: 0.1749 enc_loss_iou: 0.2734 dn_loss_cls: 0.1001 dn_loss_bbox: 0.1622 dn_loss_iou: 0.2039 d0.dn_loss_cls: 0.1839 d0.dn_loss_bbox: 0.3033 d0.dn_loss_iou: 0.3486 d1.dn_loss_cls: 0.1316 d1.dn_loss_bbox: 0.1933 d1.dn_loss_iou: 0.2373 d2.dn_loss_cls: 0.1120 d2.dn_loss_bbox: 0.1714 d2.dn_loss_iou: 0.2148 d3.dn_loss_cls: 0.1034 d3.dn_loss_bbox: 0.1638 d3.dn_loss_iou: 0.2066 d4.dn_loss_cls: 0.1002 d4.dn_loss_bbox: 0.1622 d4.dn_loss_iou: 0.2039 d1.loss_lmm_region: 0.1713 loss_lmm_image: 0.8953 2024/11/11 06:34:48 - mmengine - INFO - Iter(train) [ 35300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:58:58 time: 2.0232 data_time: 0.0185 memory: 34782 grad_norm: 28.8309 loss: 10.7035 loss_cls: 0.3518 loss_bbox: 0.1528 loss_iou: 0.3002 d0.loss_cls: 0.4058 d0.loss_bbox: 0.1647 d0.loss_iou: 0.3189 d1.loss_cls: 0.3640 d1.loss_bbox: 0.1572 d1.loss_iou: 0.3065 d2.loss_cls: 0.3566 d2.loss_bbox: 0.1555 d2.loss_iou: 0.3036 d3.loss_cls: 0.3525 d3.loss_bbox: 0.1574 d3.loss_iou: 0.3030 d4.loss_cls: 0.3518 d4.loss_bbox: 0.1549 d4.loss_iou: 0.3010 enc_loss_cls: 0.4025 enc_loss_bbox: 0.1743 enc_loss_iou: 0.3349 dn_loss_cls: 0.1331 dn_loss_bbox: 0.1753 dn_loss_iou: 0.2346 d0.dn_loss_cls: 0.2236 d0.dn_loss_bbox: 0.3164 d0.dn_loss_iou: 0.3720 d1.dn_loss_cls: 0.1716 d1.dn_loss_bbox: 0.2029 d1.dn_loss_iou: 0.2639 d2.dn_loss_cls: 0.1456 d2.dn_loss_bbox: 0.1839 d2.dn_loss_iou: 0.2444 d3.dn_loss_cls: 0.1375 d3.dn_loss_bbox: 0.1771 d3.dn_loss_iou: 0.2372 d4.dn_loss_cls: 0.1334 d4.dn_loss_bbox: 0.1753 d4.dn_loss_iou: 0.2347 d1.loss_lmm_region: 0.1751 loss_lmm_image: 0.8961 2024/11/11 06:38:07 - mmengine - INFO - Iter(train) [ 35400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:55:31 time: 2.0147 data_time: 0.0179 memory: 33248 grad_norm: 29.3758 loss: 11.2949 loss_cls: 0.3436 loss_bbox: 0.1883 loss_iou: 0.3295 d0.loss_cls: 0.3980 d0.loss_bbox: 0.2175 d0.loss_iou: 0.3599 d1.loss_cls: 0.3671 d1.loss_bbox: 0.1984 d1.loss_iou: 0.3372 d2.loss_cls: 0.3564 d2.loss_bbox: 0.1897 d2.loss_iou: 0.3332 d3.loss_cls: 0.3475 d3.loss_bbox: 0.1858 d3.loss_iou: 0.3328 d4.loss_cls: 0.3448 d4.loss_bbox: 0.1884 d4.loss_iou: 0.3304 enc_loss_cls: 0.3949 enc_loss_bbox: 0.2251 enc_loss_iou: 0.3778 dn_loss_cls: 0.1157 dn_loss_bbox: 0.1877 dn_loss_iou: 0.2572 d0.dn_loss_cls: 0.2096 d0.dn_loss_bbox: 0.3326 d0.dn_loss_iou: 0.4044 d1.dn_loss_cls: 0.1547 d1.dn_loss_bbox: 0.2162 d1.dn_loss_iou: 0.2869 d2.dn_loss_cls: 0.1299 d2.dn_loss_bbox: 0.1966 d2.dn_loss_iou: 0.2664 d3.dn_loss_cls: 0.1249 d3.dn_loss_bbox: 0.1889 d3.dn_loss_iou: 0.2592 d4.dn_loss_cls: 0.1176 d4.dn_loss_bbox: 0.1876 d4.dn_loss_iou: 0.2572 d1.loss_lmm_region: 0.1697 loss_lmm_image: 0.8854 2024/11/11 06:41:26 - mmengine - INFO - Iter(train) [ 35500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:52:06 time: 1.9867 data_time: 0.0190 memory: 35520 grad_norm: 31.5736 loss: 10.1315 loss_cls: 0.3467 loss_bbox: 0.1439 loss_iou: 0.2567 d0.loss_cls: 0.4140 d0.loss_bbox: 0.1499 d0.loss_iou: 0.2728 d1.loss_cls: 0.3702 d1.loss_bbox: 0.1544 d1.loss_iou: 0.2643 d2.loss_cls: 0.3563 d2.loss_bbox: 0.1453 d2.loss_iou: 0.2605 d3.loss_cls: 0.3499 d3.loss_bbox: 0.1472 d3.loss_iou: 0.2596 d4.loss_cls: 0.3460 d4.loss_bbox: 0.1449 d4.loss_iou: 0.2575 enc_loss_cls: 0.4088 enc_loss_bbox: 0.1731 enc_loss_iou: 0.3033 dn_loss_cls: 0.1334 dn_loss_bbox: 0.1618 dn_loss_iou: 0.2111 d0.dn_loss_cls: 0.2082 d0.dn_loss_bbox: 0.3003 d0.dn_loss_iou: 
0.3506 d1.dn_loss_cls: 0.1640 d1.dn_loss_bbox: 0.1923 d1.dn_loss_iou: 0.2403 d2.dn_loss_cls: 0.1467 d2.dn_loss_bbox: 0.1706 d2.dn_loss_iou: 0.2196 d3.dn_loss_cls: 0.1400 d3.dn_loss_bbox: 0.1647 d3.dn_loss_iou: 0.2142 d4.dn_loss_cls: 0.1339 d4.dn_loss_bbox: 0.1619 d4.dn_loss_iou: 0.2113 d1.loss_lmm_region: 0.1527 loss_lmm_image: 0.9288 2024/11/11 06:44:46 - mmengine - INFO - Iter(train) [ 35600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:48:42 time: 1.9777 data_time: 0.0185 memory: 34499 grad_norm: 25.1631 loss: 10.5696 loss_cls: 0.3565 loss_bbox: 0.1518 loss_iou: 0.2876 d0.loss_cls: 0.4093 d0.loss_bbox: 0.1728 d0.loss_iou: 0.3084 d1.loss_cls: 0.3790 d1.loss_bbox: 0.1622 d1.loss_iou: 0.2961 d2.loss_cls: 0.3643 d2.loss_bbox: 0.1565 d2.loss_iou: 0.2901 d3.loss_cls: 0.3563 d3.loss_bbox: 0.1553 d3.loss_iou: 0.2909 d4.loss_cls: 0.3582 d4.loss_bbox: 0.1528 d4.loss_iou: 0.2885 enc_loss_cls: 0.4027 enc_loss_bbox: 0.1832 enc_loss_iou: 0.3308 dn_loss_cls: 0.1334 dn_loss_bbox: 0.1698 dn_loss_iou: 0.2320 d0.dn_loss_cls: 0.2090 d0.dn_loss_bbox: 0.3039 d0.dn_loss_iou: 0.3729 d1.dn_loss_cls: 0.1636 d1.dn_loss_bbox: 0.2068 d1.dn_loss_iou: 0.2671 d2.dn_loss_cls: 0.1422 d2.dn_loss_bbox: 0.1809 d2.dn_loss_iou: 0.2424 d3.dn_loss_cls: 0.1335 d3.dn_loss_bbox: 0.1709 d3.dn_loss_iou: 0.2336 d4.dn_loss_cls: 0.1333 d4.dn_loss_bbox: 0.1698 d4.dn_loss_iou: 0.2319 d1.loss_lmm_region: 0.1566 loss_lmm_image: 0.8627 2024/11/11 06:48:06 - mmengine - INFO - Iter(train) [ 35700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:45:18 time: 1.9935 data_time: 0.0179 memory: 33657 grad_norm: 25.2590 loss: 9.9487 loss_cls: 0.3030 loss_bbox: 0.1437 loss_iou: 0.2615 d0.loss_cls: 0.3396 d0.loss_bbox: 0.1653 d0.loss_iou: 0.2832 d1.loss_cls: 0.3168 d1.loss_bbox: 0.1550 d1.loss_iou: 0.2695 d2.loss_cls: 0.3071 d2.loss_bbox: 0.1491 d2.loss_iou: 0.2656 d3.loss_cls: 0.3052 d3.loss_bbox: 0.1452 d3.loss_iou: 0.2631 d4.loss_cls: 0.3023 d4.loss_bbox: 0.1442 d4.loss_iou: 0.2625 enc_loss_cls: 0.3481 enc_loss_bbox: 0.1764 enc_loss_iou: 0.3035 dn_loss_cls: 0.1092 dn_loss_bbox: 0.1798 dn_loss_iou: 0.2347 d0.dn_loss_cls: 0.1821 d0.dn_loss_bbox: 0.3380 d0.dn_loss_iou: 0.3919 d1.dn_loss_cls: 0.1350 d1.dn_loss_bbox: 0.2125 d1.dn_loss_iou: 0.2678 d2.dn_loss_cls: 0.1213 d2.dn_loss_bbox: 0.1904 d2.dn_loss_iou: 0.2445 d3.dn_loss_cls: 0.1125 d3.dn_loss_bbox: 0.1819 d3.dn_loss_iou: 0.2371 d4.dn_loss_cls: 0.1089 d4.dn_loss_bbox: 0.1798 d4.dn_loss_iou: 0.2347 d1.loss_lmm_region: 0.1401 loss_lmm_image: 0.9363 2024/11/11 06:51:26 - mmengine - INFO - Iter(train) [ 35800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:41:54 time: 1.9771 data_time: 0.0179 memory: 33343 grad_norm: 34.9961 loss: 9.9887 loss_cls: 0.3021 loss_bbox: 0.1428 loss_iou: 0.2360 d0.loss_cls: 0.3519 d0.loss_bbox: 0.1526 d0.loss_iou: 0.2462 d1.loss_cls: 0.3191 d1.loss_bbox: 0.1470 d1.loss_iou: 0.2400 d2.loss_cls: 0.3094 d2.loss_bbox: 0.1430 d2.loss_iou: 0.2356 d3.loss_cls: 0.3046 d3.loss_bbox: 0.1431 d3.loss_iou: 0.2352 d4.loss_cls: 0.3012 d4.loss_bbox: 0.1431 d4.loss_iou: 0.2372 enc_loss_cls: 0.3505 enc_loss_bbox: 0.1630 enc_loss_iou: 0.2672 dn_loss_cls: 0.1321 dn_loss_bbox: 0.2050 dn_loss_iou: 0.2205 d0.dn_loss_cls: 0.2227 d0.dn_loss_bbox: 0.3820 d0.dn_loss_iou: 0.3679 d1.dn_loss_cls: 0.1698 d1.dn_loss_bbox: 0.2491 d1.dn_loss_iou: 0.2555 d2.dn_loss_cls: 0.1470 d2.dn_loss_bbox: 0.2204 d2.dn_loss_iou: 0.2318 d3.dn_loss_cls: 0.1368 d3.dn_loss_bbox: 0.2091 d3.dn_loss_iou: 0.2243 d4.dn_loss_cls: 0.1328 d4.dn_loss_bbox: 0.2051 d4.dn_loss_iou: 0.2206 
d1.loss_lmm_region: 0.1695 loss_lmm_image: 0.9161 2024/11/11 06:54:44 - mmengine - INFO - Iter(train) [ 35900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:38:25 time: 1.9906 data_time: 0.0179 memory: 33959 grad_norm: 34.7536 loss: 9.5079 loss_cls: 0.2832 loss_bbox: 0.1313 loss_iou: 0.2576 d0.loss_cls: 0.3246 d0.loss_bbox: 0.1418 d0.loss_iou: 0.2686 d1.loss_cls: 0.2968 d1.loss_bbox: 0.1398 d1.loss_iou: 0.2688 d2.loss_cls: 0.2873 d2.loss_bbox: 0.1342 d2.loss_iou: 0.2620 d3.loss_cls: 0.2869 d3.loss_bbox: 0.1292 d3.loss_iou: 0.2562 d4.loss_cls: 0.2851 d4.loss_bbox: 0.1302 d4.loss_iou: 0.2569 enc_loss_cls: 0.3222 enc_loss_bbox: 0.1570 enc_loss_iou: 0.2910 dn_loss_cls: 0.1228 dn_loss_bbox: 0.1672 dn_loss_iou: 0.2169 d0.dn_loss_cls: 0.1873 d0.dn_loss_bbox: 0.3236 d0.dn_loss_iou: 0.3594 d1.dn_loss_cls: 0.1509 d1.dn_loss_bbox: 0.2006 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 0.1332 d2.dn_loss_bbox: 0.1771 d2.dn_loss_iou: 0.2271 d3.dn_loss_cls: 0.1285 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2193 d4.dn_loss_cls: 0.1240 d4.dn_loss_bbox: 0.1672 d4.dn_loss_iou: 0.2169 d1.loss_lmm_region: 0.1331 loss_lmm_image: 0.9250 2024/11/11 06:58:05 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 06:58:05 - mmengine - INFO - Iter(train) [ 36000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:35:04 time: 1.9949 data_time: 0.0179 memory: 34861 grad_norm: 32.0634 loss: 8.5160 loss_cls: 0.2430 loss_bbox: 0.0971 loss_iou: 0.1640 d0.loss_cls: 0.2806 d0.loss_bbox: 0.1102 d0.loss_iou: 0.1734 d1.loss_cls: 0.2556 d1.loss_bbox: 0.1039 d1.loss_iou: 0.1714 d2.loss_cls: 0.2484 d2.loss_bbox: 0.0996 d2.loss_iou: 0.1641 d3.loss_cls: 0.2451 d3.loss_bbox: 0.0984 d3.loss_iou: 0.1640 d4.loss_cls: 0.2408 d4.loss_bbox: 0.0976 d4.loss_iou: 0.1639 enc_loss_cls: 0.2784 enc_loss_bbox: 0.1210 enc_loss_iou: 0.2003 dn_loss_cls: 0.1265 dn_loss_bbox: 0.1952 dn_loss_iou: 0.2093 d0.dn_loss_cls: 0.2028 d0.dn_loss_bbox: 0.3718 d0.dn_loss_iou: 0.3572 d1.dn_loss_cls: 0.1559 d1.dn_loss_bbox: 0.2295 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1369 d2.dn_loss_bbox: 0.2058 d2.dn_loss_iou: 0.2189 d3.dn_loss_cls: 0.1307 d3.dn_loss_bbox: 0.1964 d3.dn_loss_iou: 0.2112 d4.dn_loss_cls: 0.1263 d4.dn_loss_bbox: 0.1951 d4.dn_loss_iou: 0.2092 d1.loss_lmm_region: 0.1625 loss_lmm_image: 0.9126 2024/11/11 07:01:24 - mmengine - INFO - Iter(train) [ 36100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:31:40 time: 1.9803 data_time: 0.0177 memory: 34047 grad_norm: 30.1469 loss: 9.6423 loss_cls: 0.3102 loss_bbox: 0.1283 loss_iou: 0.2511 d0.loss_cls: 0.3732 d0.loss_bbox: 0.1450 d0.loss_iou: 0.2662 d1.loss_cls: 0.3417 d1.loss_bbox: 0.1313 d1.loss_iou: 0.2517 d2.loss_cls: 0.3295 d2.loss_bbox: 0.1289 d2.loss_iou: 0.2518 d3.loss_cls: 0.3212 d3.loss_bbox: 0.1292 d3.loss_iou: 0.2495 d4.loss_cls: 0.3150 d4.loss_bbox: 0.1289 d4.loss_iou: 0.2517 enc_loss_cls: 0.3641 enc_loss_bbox: 0.1720 enc_loss_iou: 0.2996 dn_loss_cls: 0.1142 dn_loss_bbox: 0.1619 dn_loss_iou: 0.2159 d0.dn_loss_cls: 0.1929 d0.dn_loss_bbox: 0.3086 d0.dn_loss_iou: 0.3628 d1.dn_loss_cls: 0.1434 d1.dn_loss_bbox: 0.1932 d1.dn_loss_iou: 0.2469 d2.dn_loss_cls: 0.1260 d2.dn_loss_bbox: 0.1756 d2.dn_loss_iou: 0.2264 d3.dn_loss_cls: 0.1185 d3.dn_loss_bbox: 0.1659 d3.dn_loss_iou: 0.2187 d4.dn_loss_cls: 0.1150 d4.dn_loss_bbox: 0.1619 d4.dn_loss_iou: 0.2157 d1.loss_lmm_region: 0.1398 loss_lmm_image: 0.8991 2024/11/11 07:04:42 - mmengine - INFO - Iter(train) [ 36200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:28:09 time: 1.9687 data_time: 0.0178 memory: 
35237 grad_norm: 28.0135 loss: 9.7272 loss_cls: 0.3269 loss_bbox: 0.1322 loss_iou: 0.2691 d0.loss_cls: 0.3829 d0.loss_bbox: 0.1454 d0.loss_iou: 0.2831 d1.loss_cls: 0.3451 d1.loss_bbox: 0.1408 d1.loss_iou: 0.2782 d2.loss_cls: 0.3371 d2.loss_bbox: 0.1358 d2.loss_iou: 0.2727 d3.loss_cls: 0.3308 d3.loss_bbox: 0.1344 d3.loss_iou: 0.2688 d4.loss_cls: 0.3280 d4.loss_bbox: 0.1334 d4.loss_iou: 0.2684 enc_loss_cls: 0.3768 enc_loss_bbox: 0.1640 enc_loss_iou: 0.3151 dn_loss_cls: 0.1087 dn_loss_bbox: 0.1470 dn_loss_iou: 0.2094 d0.dn_loss_cls: 0.1851 d0.dn_loss_bbox: 0.2713 d0.dn_loss_iou: 0.3452 d1.dn_loss_cls: 0.1394 d1.dn_loss_bbox: 0.1760 d1.dn_loss_iou: 0.2393 d2.dn_loss_cls: 0.1209 d2.dn_loss_bbox: 0.1557 d2.dn_loss_iou: 0.2194 d3.dn_loss_cls: 0.1137 d3.dn_loss_bbox: 0.1491 d3.dn_loss_iou: 0.2115 d4.dn_loss_cls: 0.1079 d4.dn_loss_bbox: 0.1470 d4.dn_loss_iou: 0.2094 d1.loss_lmm_region: 0.1607 loss_lmm_image: 0.9418 2024/11/11 07:08:03 - mmengine - INFO - Iter(train) [ 36300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:24:48 time: 1.9830 data_time: 0.0179 memory: 32492 grad_norm: 27.4253 loss: 10.7651 loss_cls: 0.3489 loss_bbox: 0.1534 loss_iou: 0.2633 d0.loss_cls: 0.3989 d0.loss_bbox: 0.1644 d0.loss_iou: 0.2737 d1.loss_cls: 0.3613 d1.loss_bbox: 0.1579 d1.loss_iou: 0.2670 d2.loss_cls: 0.3545 d2.loss_bbox: 0.1517 d2.loss_iou: 0.2660 d3.loss_cls: 0.3458 d3.loss_bbox: 0.1557 d3.loss_iou: 0.2664 d4.loss_cls: 0.3427 d4.loss_bbox: 0.1540 d4.loss_iou: 0.2655 enc_loss_cls: 0.4041 enc_loss_bbox: 0.1768 enc_loss_iou: 0.2960 dn_loss_cls: 0.1465 dn_loss_bbox: 0.1998 dn_loss_iou: 0.2446 d0.dn_loss_cls: 0.2282 d0.dn_loss_bbox: 0.3649 d0.dn_loss_iou: 0.3944 d1.dn_loss_cls: 0.1796 d1.dn_loss_bbox: 0.2446 d1.dn_loss_iou: 0.2815 d2.dn_loss_cls: 0.1583 d2.dn_loss_bbox: 0.2171 d2.dn_loss_iou: 0.2559 d3.dn_loss_cls: 0.1501 d3.dn_loss_bbox: 0.2030 d3.dn_loss_iou: 0.2475 d4.dn_loss_cls: 0.1486 d4.dn_loss_bbox: 0.1999 d4.dn_loss_iou: 0.2447 d1.loss_lmm_region: 0.1849 loss_lmm_image: 0.9030 2024/11/11 07:11:24 - mmengine - INFO - Iter(train) [ 36400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:21:29 time: 2.0102 data_time: 0.0179 memory: 34241 grad_norm: 31.5597 loss: 10.6518 loss_cls: 0.3150 loss_bbox: 0.1665 loss_iou: 0.2751 d0.loss_cls: 0.3628 d0.loss_bbox: 0.1755 d0.loss_iou: 0.2838 d1.loss_cls: 0.3342 d1.loss_bbox: 0.1696 d1.loss_iou: 0.2814 d2.loss_cls: 0.3306 d2.loss_bbox: 0.1612 d2.loss_iou: 0.2710 d3.loss_cls: 0.3190 d3.loss_bbox: 0.1637 d3.loss_iou: 0.2727 d4.loss_cls: 0.3213 d4.loss_bbox: 0.1632 d4.loss_iou: 0.2729 enc_loss_cls: 0.3585 enc_loss_bbox: 0.1907 enc_loss_iou: 0.3100 dn_loss_cls: 0.1569 dn_loss_bbox: 0.1952 dn_loss_iou: 0.2340 d0.dn_loss_cls: 0.2320 d0.dn_loss_bbox: 0.3635 d0.dn_loss_iou: 0.3884 d1.dn_loss_cls: 0.1867 d1.dn_loss_bbox: 0.2359 d1.dn_loss_iou: 0.2686 d2.dn_loss_cls: 0.1673 d2.dn_loss_bbox: 0.2055 d2.dn_loss_iou: 0.2438 d3.dn_loss_cls: 0.1607 d3.dn_loss_bbox: 0.1985 d3.dn_loss_iou: 0.2367 d4.dn_loss_cls: 0.1575 d4.dn_loss_bbox: 0.1951 d4.dn_loss_iou: 0.2340 d1.loss_lmm_region: 0.1648 loss_lmm_image: 0.9278 2024/11/11 07:14:43 - mmengine - INFO - Iter(train) [ 36500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:18:01 time: 1.9827 data_time: 0.0180 memory: 31881 grad_norm: 30.4637 loss: 8.9783 loss_cls: 0.2821 loss_bbox: 0.1088 loss_iou: 0.1999 d0.loss_cls: 0.3203 d0.loss_bbox: 0.1184 d0.loss_iou: 0.2147 d1.loss_cls: 0.2961 d1.loss_bbox: 0.1151 d1.loss_iou: 0.2085 d2.loss_cls: 0.2886 d2.loss_bbox: 0.1119 d2.loss_iou: 0.2042 d3.loss_cls: 0.2813 
d3.loss_bbox: 0.1111 d3.loss_iou: 0.2049 d4.loss_cls: 0.2821 d4.loss_bbox: 0.1096 d4.loss_iou: 0.2015 enc_loss_cls: 0.3164 enc_loss_bbox: 0.1396 enc_loss_iou: 0.2380 dn_loss_cls: 0.1802 dn_loss_bbox: 0.1469 dn_loss_iou: 0.1911 d0.dn_loss_cls: 0.2500 d0.dn_loss_bbox: 0.2875 d0.dn_loss_iou: 0.3286 d1.dn_loss_cls: 0.2061 d1.dn_loss_bbox: 0.1782 d1.dn_loss_iou: 0.2208 d2.dn_loss_cls: 0.1938 d2.dn_loss_bbox: 0.1527 d2.dn_loss_iou: 0.1983 d3.dn_loss_cls: 0.1861 d3.dn_loss_bbox: 0.1481 d3.dn_loss_iou: 0.1930 d4.dn_loss_cls: 0.1818 d4.dn_loss_bbox: 0.1468 d4.dn_loss_iou: 0.1911 d1.loss_lmm_region: 0.1588 loss_lmm_image: 0.8852 2024/11/11 07:18:02 - mmengine - INFO - Iter(train) [ 36600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:14:35 time: 1.9949 data_time: 0.0181 memory: 34814 grad_norm: 29.0992 loss: 9.9813 loss_cls: 0.2982 loss_bbox: 0.1465 loss_iou: 0.2518 d0.loss_cls: 0.3466 d0.loss_bbox: 0.1617 d0.loss_iou: 0.2742 d1.loss_cls: 0.3171 d1.loss_bbox: 0.1556 d1.loss_iou: 0.2625 d2.loss_cls: 0.3097 d2.loss_bbox: 0.1497 d2.loss_iou: 0.2562 d3.loss_cls: 0.3026 d3.loss_bbox: 0.1488 d3.loss_iou: 0.2524 d4.loss_cls: 0.3009 d4.loss_bbox: 0.1466 d4.loss_iou: 0.2492 enc_loss_cls: 0.3463 enc_loss_bbox: 0.1823 enc_loss_iou: 0.2985 dn_loss_cls: 0.1326 dn_loss_bbox: 0.1785 dn_loss_iou: 0.2291 d0.dn_loss_cls: 0.2178 d0.dn_loss_bbox: 0.3344 d0.dn_loss_iou: 0.3783 d1.dn_loss_cls: 0.1621 d1.dn_loss_bbox: 0.2095 d1.dn_loss_iou: 0.2584 d2.dn_loss_cls: 0.1442 d2.dn_loss_bbox: 0.1877 d2.dn_loss_iou: 0.2361 d3.dn_loss_cls: 0.1362 d3.dn_loss_bbox: 0.1803 d3.dn_loss_iou: 0.2311 d4.dn_loss_cls: 0.1343 d4.dn_loss_bbox: 0.1784 d4.dn_loss_iou: 0.2290 d1.loss_lmm_region: 0.1586 loss_lmm_image: 0.9072 2024/11/11 07:21:21 - mmengine - INFO - Iter(train) [ 36700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:11:09 time: 1.9885 data_time: 0.0179 memory: 33963 grad_norm: 30.3740 loss: 8.9325 loss_cls: 0.2892 loss_bbox: 0.1144 loss_iou: 0.2257 d0.loss_cls: 0.3367 d0.loss_bbox: 0.1289 d0.loss_iou: 0.2440 d1.loss_cls: 0.2961 d1.loss_bbox: 0.1244 d1.loss_iou: 0.2355 d2.loss_cls: 0.2906 d2.loss_bbox: 0.1148 d2.loss_iou: 0.2290 d3.loss_cls: 0.2871 d3.loss_bbox: 0.1141 d3.loss_iou: 0.2251 d4.loss_cls: 0.2841 d4.loss_bbox: 0.1151 d4.loss_iou: 0.2278 enc_loss_cls: 0.3334 enc_loss_bbox: 0.1411 enc_loss_iou: 0.2632 dn_loss_cls: 0.1447 dn_loss_bbox: 0.1418 dn_loss_iou: 0.1943 d0.dn_loss_cls: 0.2103 d0.dn_loss_bbox: 0.2617 d0.dn_loss_iou: 0.3178 d1.dn_loss_cls: 0.1694 d1.dn_loss_bbox: 0.1627 d1.dn_loss_iou: 0.2171 d2.dn_loss_cls: 0.1508 d2.dn_loss_bbox: 0.1482 d2.dn_loss_iou: 0.2003 d3.dn_loss_cls: 0.1469 d3.dn_loss_bbox: 0.1433 d3.dn_loss_iou: 0.1959 d4.dn_loss_cls: 0.1439 d4.dn_loss_bbox: 0.1418 d4.dn_loss_iou: 0.1943 d1.loss_lmm_region: 0.1603 loss_lmm_image: 0.8669 2024/11/11 07:24:40 - mmengine - INFO - Iter(train) [ 36800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:07:44 time: 2.0045 data_time: 0.0178 memory: 34432 grad_norm: 30.9063 loss: 9.9225 loss_cls: 0.3164 loss_bbox: 0.1427 loss_iou: 0.2352 d0.loss_cls: 0.3723 d0.loss_bbox: 0.1528 d0.loss_iou: 0.2492 d1.loss_cls: 0.3446 d1.loss_bbox: 0.1419 d1.loss_iou: 0.2421 d2.loss_cls: 0.3328 d2.loss_bbox: 0.1428 d2.loss_iou: 0.2365 d3.loss_cls: 0.3182 d3.loss_bbox: 0.1442 d3.loss_iou: 0.2370 d4.loss_cls: 0.3165 d4.loss_bbox: 0.1433 d4.loss_iou: 0.2355 enc_loss_cls: 0.3717 enc_loss_bbox: 0.1709 enc_loss_iou: 0.2755 dn_loss_cls: 0.1221 dn_loss_bbox: 0.1818 dn_loss_iou: 0.2125 d0.dn_loss_cls: 0.2111 d0.dn_loss_bbox: 0.3619 d0.dn_loss_iou: 0.3645 
d1.dn_loss_cls: 0.1541 d1.dn_loss_bbox: 0.2296 d1.dn_loss_iou: 0.2486 d2.dn_loss_cls: 0.1310 d2.dn_loss_bbox: 0.1951 d2.dn_loss_iou: 0.2218 d3.dn_loss_cls: 0.1231 d3.dn_loss_bbox: 0.1854 d3.dn_loss_iou: 0.2157 d4.dn_loss_cls: 0.1226 d4.dn_loss_bbox: 0.1818 d4.dn_loss_iou: 0.2125 d1.loss_lmm_region: 0.1760 loss_lmm_image: 0.9495 2024/11/11 07:27:59 - mmengine - INFO - Iter(train) [ 36900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:04:16 time: 1.9668 data_time: 0.0180 memory: 32951 grad_norm: 27.5211 loss: 8.3131 loss_cls: 0.2249 loss_bbox: 0.1149 loss_iou: 0.2006 d0.loss_cls: 0.2645 d0.loss_bbox: 0.1274 d0.loss_iou: 0.2116 d1.loss_cls: 0.2445 d1.loss_bbox: 0.1241 d1.loss_iou: 0.2076 d2.loss_cls: 0.2381 d2.loss_bbox: 0.1207 d2.loss_iou: 0.2032 d3.loss_cls: 0.2336 d3.loss_bbox: 0.1120 d3.loss_iou: 0.1987 d4.loss_cls: 0.2245 d4.loss_bbox: 0.1144 d4.loss_iou: 0.1997 enc_loss_cls: 0.2649 enc_loss_bbox: 0.1425 enc_loss_iou: 0.2288 dn_loss_cls: 0.1003 dn_loss_bbox: 0.1686 dn_loss_iou: 0.2020 d0.dn_loss_cls: 0.1691 d0.dn_loss_bbox: 0.3070 d0.dn_loss_iou: 0.3346 d1.dn_loss_cls: 0.1243 d1.dn_loss_bbox: 0.1993 d1.dn_loss_iou: 0.2298 d2.dn_loss_cls: 0.1065 d2.dn_loss_bbox: 0.1778 d2.dn_loss_iou: 0.2100 d3.dn_loss_cls: 0.1021 d3.dn_loss_bbox: 0.1709 d3.dn_loss_iou: 0.2047 d4.dn_loss_cls: 0.0999 d4.dn_loss_bbox: 0.1686 d4.dn_loss_iou: 0.2021 d1.loss_lmm_region: 0.1333 loss_lmm_image: 0.9012 2024/11/11 07:31:18 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 07:31:18 - mmengine - INFO - Iter(train) [ 37000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 15:00:51 time: 1.9933 data_time: 0.0189 memory: 33441 grad_norm: 26.5288 loss: 10.1511 loss_cls: 0.3555 loss_bbox: 0.1353 loss_iou: 0.2505 d0.loss_cls: 0.4071 d0.loss_bbox: 0.1527 d0.loss_iou: 0.2702 d1.loss_cls: 0.3742 d1.loss_bbox: 0.1445 d1.loss_iou: 0.2627 d2.loss_cls: 0.3638 d2.loss_bbox: 0.1388 d2.loss_iou: 0.2568 d3.loss_cls: 0.3539 d3.loss_bbox: 0.1379 d3.loss_iou: 0.2545 d4.loss_cls: 0.3544 d4.loss_bbox: 0.1351 d4.loss_iou: 0.2511 enc_loss_cls: 0.3984 enc_loss_bbox: 0.1703 enc_loss_iou: 0.2956 dn_loss_cls: 0.1075 dn_loss_bbox: 0.1792 dn_loss_iou: 0.2284 d0.dn_loss_cls: 0.1893 d0.dn_loss_bbox: 0.3325 d0.dn_loss_iou: 0.3841 d1.dn_loss_cls: 0.1406 d1.dn_loss_bbox: 0.2173 d1.dn_loss_iou: 0.2650 d2.dn_loss_cls: 0.1206 d2.dn_loss_bbox: 0.1901 d2.dn_loss_iou: 0.2393 d3.dn_loss_cls: 0.1127 d3.dn_loss_bbox: 0.1811 d3.dn_loss_iou: 0.2314 d4.dn_loss_cls: 0.1081 d4.dn_loss_bbox: 0.1792 d4.dn_loss_iou: 0.2284 d1.loss_lmm_region: 0.1433 loss_lmm_image: 0.9096 2024/11/11 07:34:36 - mmengine - INFO - Iter(train) [ 37100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:57:21 time: 1.9915 data_time: 0.0192 memory: 34128 grad_norm: 43.6747 loss: 8.9967 loss_cls: 0.2433 loss_bbox: 0.1208 loss_iou: 0.2278 d0.loss_cls: 0.2921 d0.loss_bbox: 0.1281 d0.loss_iou: 0.2385 d1.loss_cls: 0.2597 d1.loss_bbox: 0.1263 d1.loss_iou: 0.2362 d2.loss_cls: 0.2538 d2.loss_bbox: 0.1214 d2.loss_iou: 0.2293 d3.loss_cls: 0.2514 d3.loss_bbox: 0.1206 d3.loss_iou: 0.2278 d4.loss_cls: 0.2450 d4.loss_bbox: 0.1214 d4.loss_iou: 0.2287 enc_loss_cls: 0.2790 enc_loss_bbox: 0.1448 enc_loss_iou: 0.2643 dn_loss_cls: 0.1454 dn_loss_bbox: 0.1600 dn_loss_iou: 0.2159 d0.dn_loss_cls: 0.2211 d0.dn_loss_bbox: 0.3190 d0.dn_loss_iou: 0.3638 d1.dn_loss_cls: 0.1684 d1.dn_loss_bbox: 0.1920 d1.dn_loss_iou: 0.2460 d2.dn_loss_cls: 0.1558 d2.dn_loss_bbox: 0.1685 d2.dn_loss_iou: 0.2245 d3.dn_loss_cls: 0.1519 d3.dn_loss_bbox: 0.1616 d3.dn_loss_iou: 0.2183 
d4.dn_loss_cls: 0.1477 d4.dn_loss_bbox: 0.1597 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1386 loss_lmm_image: 0.8625 2024/11/11 07:37:53 - mmengine - INFO - Iter(train) [ 37200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:53:51 time: 1.9884 data_time: 0.0189 memory: 34771 grad_norm: 26.4487 loss: 8.4648 loss_cls: 0.2541 loss_bbox: 0.1172 loss_iou: 0.1821 d0.loss_cls: 0.3030 d0.loss_bbox: 0.1259 d0.loss_iou: 0.1955 d1.loss_cls: 0.2708 d1.loss_bbox: 0.1207 d1.loss_iou: 0.1878 d2.loss_cls: 0.2630 d2.loss_bbox: 0.1176 d2.loss_iou: 0.1850 d3.loss_cls: 0.2602 d3.loss_bbox: 0.1165 d3.loss_iou: 0.1826 d4.loss_cls: 0.2529 d4.loss_bbox: 0.1204 d4.loss_iou: 0.1839 enc_loss_cls: 0.3057 enc_loss_bbox: 0.1428 enc_loss_iou: 0.2168 dn_loss_cls: 0.1237 dn_loss_bbox: 0.1576 dn_loss_iou: 0.1844 d0.dn_loss_cls: 0.2034 d0.dn_loss_bbox: 0.3077 d0.dn_loss_iou: 0.3198 d1.dn_loss_cls: 0.1543 d1.dn_loss_bbox: 0.1903 d1.dn_loss_iou: 0.2113 d2.dn_loss_cls: 0.1370 d2.dn_loss_bbox: 0.1656 d2.dn_loss_iou: 0.1916 d3.dn_loss_cls: 0.1301 d3.dn_loss_bbox: 0.1598 d3.dn_loss_iou: 0.1866 d4.dn_loss_cls: 0.1252 d4.dn_loss_bbox: 0.1576 d4.dn_loss_iou: 0.1844 d1.loss_lmm_region: 0.1535 loss_lmm_image: 0.9166 2024/11/11 07:41:15 - mmengine - INFO - Iter(train) [ 37300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:50:32 time: 1.9907 data_time: 0.0181 memory: 34411 grad_norm: 32.8667 loss: 10.1641 loss_cls: 0.3360 loss_bbox: 0.1650 loss_iou: 0.2634 d0.loss_cls: 0.3927 d0.loss_bbox: 0.1829 d0.loss_iou: 0.2769 d1.loss_cls: 0.3588 d1.loss_bbox: 0.1739 d1.loss_iou: 0.2655 d2.loss_cls: 0.3456 d2.loss_bbox: 0.1682 d2.loss_iou: 0.2622 d3.loss_cls: 0.3388 d3.loss_bbox: 0.1680 d3.loss_iou: 0.2598 d4.loss_cls: 0.3353 d4.loss_bbox: 0.1655 d4.loss_iou: 0.2632 enc_loss_cls: 0.3837 enc_loss_bbox: 0.1920 enc_loss_iou: 0.2997 dn_loss_cls: 0.1044 dn_loss_bbox: 0.1677 dn_loss_iou: 0.2197 d0.dn_loss_cls: 0.1902 d0.dn_loss_bbox: 0.3093 d0.dn_loss_iou: 0.3665 d1.dn_loss_cls: 0.1386 d1.dn_loss_bbox: 0.1945 d1.dn_loss_iou: 0.2521 d2.dn_loss_cls: 0.1184 d2.dn_loss_bbox: 0.1756 d2.dn_loss_iou: 0.2296 d3.dn_loss_cls: 0.1094 d3.dn_loss_bbox: 0.1702 d3.dn_loss_iou: 0.2224 d4.dn_loss_cls: 0.1059 d4.dn_loss_bbox: 0.1677 d4.dn_loss_iou: 0.2198 d1.loss_lmm_region: 0.1809 loss_lmm_image: 0.9243 2024/11/11 07:44:32 - mmengine - INFO - Iter(train) [ 37400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:47:02 time: 1.9822 data_time: 0.0178 memory: 32804 grad_norm: 30.7508 loss: 9.5912 loss_cls: 0.3269 loss_bbox: 0.1359 loss_iou: 0.2700 d0.loss_cls: 0.3700 d0.loss_bbox: 0.1545 d0.loss_iou: 0.2870 d1.loss_cls: 0.3405 d1.loss_bbox: 0.1499 d1.loss_iou: 0.2792 d2.loss_cls: 0.3363 d2.loss_bbox: 0.1396 d2.loss_iou: 0.2713 d3.loss_cls: 0.3244 d3.loss_bbox: 0.1383 d3.loss_iou: 0.2710 d4.loss_cls: 0.3246 d4.loss_bbox: 0.1366 d4.loss_iou: 0.2709 enc_loss_cls: 0.3699 enc_loss_bbox: 0.1678 enc_loss_iou: 0.3160 dn_loss_cls: 0.0991 dn_loss_bbox: 0.1404 dn_loss_iou: 0.2034 d0.dn_loss_cls: 0.1767 d0.dn_loss_bbox: 0.2920 d0.dn_loss_iou: 0.3529 d1.dn_loss_cls: 0.1290 d1.dn_loss_bbox: 0.1747 d1.dn_loss_iou: 0.2356 d2.dn_loss_cls: 0.1094 d2.dn_loss_bbox: 0.1522 d2.dn_loss_iou: 0.2139 d3.dn_loss_cls: 0.1022 d3.dn_loss_bbox: 0.1426 d3.dn_loss_iou: 0.2057 d4.dn_loss_cls: 0.0991 d4.dn_loss_bbox: 0.1404 d4.dn_loss_iou: 0.2034 d1.loss_lmm_region: 0.1394 loss_lmm_image: 0.8985 2024/11/11 07:47:52 - mmengine - INFO - Iter(train) [ 37500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:43:37 time: 2.0047 data_time: 0.0182 memory: 33279 grad_norm: 
27.4381 loss: 8.9963 loss_cls: 0.2603 loss_bbox: 0.1234 loss_iou: 0.2345 d0.loss_cls: 0.3039 d0.loss_bbox: 0.1355 d0.loss_iou: 0.2572 d1.loss_cls: 0.2772 d1.loss_bbox: 0.1319 d1.loss_iou: 0.2458 d2.loss_cls: 0.2699 d2.loss_bbox: 0.1259 d2.loss_iou: 0.2389 d3.loss_cls: 0.2631 d3.loss_bbox: 0.1245 d3.loss_iou: 0.2348 d4.loss_cls: 0.2612 d4.loss_bbox: 0.1252 d4.loss_iou: 0.2340 enc_loss_cls: 0.2997 enc_loss_bbox: 0.1532 enc_loss_iou: 0.2759 dn_loss_cls: 0.0888 dn_loss_bbox: 0.1627 dn_loss_iou: 0.2167 d0.dn_loss_cls: 0.1638 d0.dn_loss_bbox: 0.3394 d0.dn_loss_iou: 0.3806 d1.dn_loss_cls: 0.1152 d1.dn_loss_bbox: 0.2035 d1.dn_loss_iou: 0.2544 d2.dn_loss_cls: 0.0993 d2.dn_loss_bbox: 0.1729 d2.dn_loss_iou: 0.2269 d3.dn_loss_cls: 0.0914 d3.dn_loss_bbox: 0.1651 d3.dn_loss_iou: 0.2196 d4.dn_loss_cls: 0.0894 d4.dn_loss_bbox: 0.1628 d4.dn_loss_iou: 0.2167 d1.loss_lmm_region: 0.1143 loss_lmm_image: 0.9368 2024/11/11 07:51:09 - mmengine - INFO - Iter(train) [ 37600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:40:05 time: 1.9559 data_time: 0.0181 memory: 34359 grad_norm: 31.7821 loss: 8.9284 loss_cls: 0.2612 loss_bbox: 0.1086 loss_iou: 0.1968 d0.loss_cls: 0.3033 d0.loss_bbox: 0.1151 d0.loss_iou: 0.2049 d1.loss_cls: 0.2767 d1.loss_bbox: 0.1144 d1.loss_iou: 0.2011 d2.loss_cls: 0.2710 d2.loss_bbox: 0.1068 d2.loss_iou: 0.1958 d3.loss_cls: 0.2676 d3.loss_bbox: 0.1083 d3.loss_iou: 0.1957 d4.loss_cls: 0.2626 d4.loss_bbox: 0.1077 d4.loss_iou: 0.1961 enc_loss_cls: 0.3000 enc_loss_bbox: 0.1300 enc_loss_iou: 0.2250 dn_loss_cls: 0.1646 dn_loss_bbox: 0.1632 dn_loss_iou: 0.2059 d0.dn_loss_cls: 0.2494 d0.dn_loss_bbox: 0.3207 d0.dn_loss_iou: 0.3478 d1.dn_loss_cls: 0.1993 d1.dn_loss_bbox: 0.1920 d1.dn_loss_iou: 0.2356 d2.dn_loss_cls: 0.1794 d2.dn_loss_bbox: 0.1712 d2.dn_loss_iou: 0.2146 d3.dn_loss_cls: 0.1696 d3.dn_loss_bbox: 0.1655 d3.dn_loss_iou: 0.2087 d4.dn_loss_cls: 0.1643 d4.dn_loss_bbox: 0.1632 d4.dn_loss_iou: 0.2057 d1.loss_lmm_region: 0.1800 loss_lmm_image: 0.8788 2024/11/11 07:54:27 - mmengine - INFO - Iter(train) [ 37700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:36:36 time: 1.9683 data_time: 0.0181 memory: 34918 grad_norm: 27.1510 loss: 9.5509 loss_cls: 0.2832 loss_bbox: 0.1445 loss_iou: 0.2759 d0.loss_cls: 0.3293 d0.loss_bbox: 0.1549 d0.loss_iou: 0.2919 d1.loss_cls: 0.3114 d1.loss_bbox: 0.1466 d1.loss_iou: 0.2779 d2.loss_cls: 0.2968 d2.loss_bbox: 0.1441 d2.loss_iou: 0.2759 d3.loss_cls: 0.2873 d3.loss_bbox: 0.1438 d3.loss_iou: 0.2761 d4.loss_cls: 0.2840 d4.loss_bbox: 0.1445 d4.loss_iou: 0.2753 enc_loss_cls: 0.3323 enc_loss_bbox: 0.1666 enc_loss_iou: 0.3127 dn_loss_cls: 0.0800 dn_loss_bbox: 0.1752 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.1538 d0.dn_loss_bbox: 0.3273 d0.dn_loss_iou: 0.3667 d1.dn_loss_cls: 0.1104 d1.dn_loss_bbox: 0.2151 d1.dn_loss_iou: 0.2599 d2.dn_loss_cls: 0.0919 d2.dn_loss_bbox: 0.1878 d2.dn_loss_iou: 0.2357 d3.dn_loss_cls: 0.0851 d3.dn_loss_bbox: 0.1781 d3.dn_loss_iou: 0.2278 d4.dn_loss_cls: 0.0809 d4.dn_loss_bbox: 0.1752 d4.dn_loss_iou: 0.2248 d1.loss_lmm_region: 0.1383 loss_lmm_image: 0.8574 2024/11/11 07:57:45 - mmengine - INFO - Iter(train) [ 37800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:33:08 time: 1.9894 data_time: 0.0183 memory: 34884 grad_norm: 31.8017 loss: 10.4955 loss_cls: 0.3092 loss_bbox: 0.1515 loss_iou: 0.2536 d0.loss_cls: 0.3891 d0.loss_bbox: 0.1622 d0.loss_iou: 0.2636 d1.loss_cls: 0.3452 d1.loss_bbox: 0.1540 d1.loss_iou: 0.2561 d2.loss_cls: 0.3255 d2.loss_bbox: 0.1520 d2.loss_iou: 0.2529 d3.loss_cls: 0.3156 d3.loss_bbox: 0.1530 
d3.loss_iou: 0.2527 d4.loss_cls: 0.3100 d4.loss_bbox: 0.1512 d4.loss_iou: 0.2520 enc_loss_cls: 0.3761 enc_loss_bbox: 0.1763 enc_loss_iou: 0.2866 dn_loss_cls: 0.1416 dn_loss_bbox: 0.2054 dn_loss_iou: 0.2483 d0.dn_loss_cls: 0.2400 d0.dn_loss_bbox: 0.3785 d0.dn_loss_iou: 0.4043 d1.dn_loss_cls: 0.1769 d1.dn_loss_bbox: 0.2443 d1.dn_loss_iou: 0.2804 d2.dn_loss_cls: 0.1543 d2.dn_loss_bbox: 0.2192 d2.dn_loss_iou: 0.2586 d3.dn_loss_cls: 0.1469 d3.dn_loss_bbox: 0.2081 d3.dn_loss_iou: 0.2507 d4.dn_loss_cls: 0.1419 d4.dn_loss_bbox: 0.2054 d4.dn_loss_iou: 0.2482 d1.loss_lmm_region: 0.1913 loss_lmm_image: 0.8630 2024/11/11 08:01:03 - mmengine - INFO - Iter(train) [ 37900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:29:39 time: 1.9726 data_time: 0.0180 memory: 34834 grad_norm: 33.7492 loss: 9.0367 loss_cls: 0.3048 loss_bbox: 0.1128 loss_iou: 0.1934 d0.loss_cls: 0.3423 d0.loss_bbox: 0.1264 d0.loss_iou: 0.2150 d1.loss_cls: 0.3111 d1.loss_bbox: 0.1234 d1.loss_iou: 0.2104 d2.loss_cls: 0.3110 d2.loss_bbox: 0.1145 d2.loss_iou: 0.2009 d3.loss_cls: 0.3078 d3.loss_bbox: 0.1118 d3.loss_iou: 0.1987 d4.loss_cls: 0.3017 d4.loss_bbox: 0.1152 d4.loss_iou: 0.1956 enc_loss_cls: 0.3389 enc_loss_bbox: 0.1407 enc_loss_iou: 0.2345 dn_loss_cls: 0.1237 dn_loss_bbox: 0.1684 dn_loss_iou: 0.2016 d0.dn_loss_cls: 0.2067 d0.dn_loss_bbox: 0.3168 d0.dn_loss_iou: 0.3497 d1.dn_loss_cls: 0.1520 d1.dn_loss_bbox: 0.1990 d1.dn_loss_iou: 0.2325 d2.dn_loss_cls: 0.1315 d2.dn_loss_bbox: 0.1783 d2.dn_loss_iou: 0.2107 d3.dn_loss_cls: 0.1244 d3.dn_loss_bbox: 0.1716 d3.dn_loss_iou: 0.2053 d4.dn_loss_cls: 0.1238 d4.dn_loss_bbox: 0.1682 d4.dn_loss_iou: 0.2015 d1.loss_lmm_region: 0.1555 loss_lmm_image: 0.9048 2024/11/11 08:04:20 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 08:04:20 - mmengine - INFO - Iter(train) [ 38000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:26:09 time: 1.9832 data_time: 0.0180 memory: 33135 grad_norm: 29.9580 loss: 10.8854 loss_cls: 0.3187 loss_bbox: 0.1719 loss_iou: 0.3044 d0.loss_cls: 0.3789 d0.loss_bbox: 0.1896 d0.loss_iou: 0.3233 d1.loss_cls: 0.3413 d1.loss_bbox: 0.1770 d1.loss_iou: 0.3165 d2.loss_cls: 0.3332 d2.loss_bbox: 0.1712 d2.loss_iou: 0.3049 d3.loss_cls: 0.3281 d3.loss_bbox: 0.1694 d3.loss_iou: 0.3001 d4.loss_cls: 0.3196 d4.loss_bbox: 0.1717 d4.loss_iou: 0.3033 enc_loss_cls: 0.3745 enc_loss_bbox: 0.2050 enc_loss_iou: 0.3510 dn_loss_cls: 0.1164 dn_loss_bbox: 0.2009 dn_loss_iou: 0.2465 d0.dn_loss_cls: 0.2048 d0.dn_loss_bbox: 0.3687 d0.dn_loss_iou: 0.4057 d1.dn_loss_cls: 0.1498 d1.dn_loss_bbox: 0.2396 d1.dn_loss_iou: 0.2814 d2.dn_loss_cls: 0.1311 d2.dn_loss_bbox: 0.2131 d2.dn_loss_iou: 0.2583 d3.dn_loss_cls: 0.1220 d3.dn_loss_bbox: 0.2036 d3.dn_loss_iou: 0.2490 d4.dn_loss_cls: 0.1177 d4.dn_loss_bbox: 0.2010 d4.dn_loss_iou: 0.2465 d1.loss_lmm_region: 0.1561 loss_lmm_image: 0.9194 2024/11/11 08:07:39 - mmengine - INFO - Iter(train) [ 38100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:22:42 time: 2.0010 data_time: 0.0187 memory: 33881 grad_norm: 28.9971 loss: 9.6381 loss_cls: 0.3220 loss_bbox: 0.1484 loss_iou: 0.2618 d0.loss_cls: 0.3764 d0.loss_bbox: 0.1565 d0.loss_iou: 0.2748 d1.loss_cls: 0.3404 d1.loss_bbox: 0.1529 d1.loss_iou: 0.2684 d2.loss_cls: 0.3332 d2.loss_bbox: 0.1459 d2.loss_iou: 0.2615 d3.loss_cls: 0.3275 d3.loss_bbox: 0.1495 d3.loss_iou: 0.2630 d4.loss_cls: 0.3224 d4.loss_bbox: 0.1489 d4.loss_iou: 0.2623 enc_loss_cls: 0.3635 enc_loss_bbox: 0.1735 enc_loss_iou: 0.2955 dn_loss_cls: 0.0889 dn_loss_bbox: 0.1611 dn_loss_iou: 0.2051 
d0.dn_loss_cls: 0.1682 d0.dn_loss_bbox: 0.3118 d0.dn_loss_iou: 0.3474 d1.dn_loss_cls: 0.1224 d1.dn_loss_bbox: 0.1956 d1.dn_loss_iou: 0.2397 d2.dn_loss_cls: 0.1010 d2.dn_loss_bbox: 0.1713 d2.dn_loss_iou: 0.2164 d3.dn_loss_cls: 0.0930 d3.dn_loss_bbox: 0.1634 d3.dn_loss_iou: 0.2082 d4.dn_loss_cls: 0.0890 d4.dn_loss_bbox: 0.1612 d4.dn_loss_iou: 0.2051 d1.loss_lmm_region: 0.1303 loss_lmm_image: 0.9107 2024/11/11 08:10:57 - mmengine - INFO - Iter(train) [ 38200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:19:15 time: 1.9920 data_time: 0.0183 memory: 31349 grad_norm: 33.8850 loss: 8.8178 loss_cls: 0.2864 loss_bbox: 0.1126 loss_iou: 0.2046 d0.loss_cls: 0.3354 d0.loss_bbox: 0.1257 d0.loss_iou: 0.2220 d1.loss_cls: 0.3098 d1.loss_bbox: 0.1188 d1.loss_iou: 0.2119 d2.loss_cls: 0.2972 d2.loss_bbox: 0.1167 d2.loss_iou: 0.2092 d3.loss_cls: 0.2914 d3.loss_bbox: 0.1167 d3.loss_iou: 0.2066 d4.loss_cls: 0.2847 d4.loss_bbox: 0.1136 d4.loss_iou: 0.2058 enc_loss_cls: 0.3329 enc_loss_bbox: 0.1433 enc_loss_iou: 0.2445 dn_loss_cls: 0.1192 dn_loss_bbox: 0.1542 dn_loss_iou: 0.1890 d0.dn_loss_cls: 0.1937 d0.dn_loss_bbox: 0.2945 d0.dn_loss_iou: 0.3207 d1.dn_loss_cls: 0.1483 d1.dn_loss_bbox: 0.1832 d1.dn_loss_iou: 0.2162 d2.dn_loss_cls: 0.1327 d2.dn_loss_bbox: 0.1630 d2.dn_loss_iou: 0.1969 d3.dn_loss_cls: 0.1264 d3.dn_loss_bbox: 0.1551 d3.dn_loss_iou: 0.1909 d4.dn_loss_cls: 0.1204 d4.dn_loss_bbox: 0.1542 d4.dn_loss_iou: 0.1890 d1.loss_lmm_region: 0.1684 loss_lmm_image: 0.9119 2024/11/11 08:14:18 - mmengine - INFO - Iter(train) [ 38300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:15:53 time: 2.0115 data_time: 0.0185 memory: 35291 grad_norm: 31.5281 loss: 10.1884 loss_cls: 0.2992 loss_bbox: 0.1467 loss_iou: 0.2913 d0.loss_cls: 0.3494 d0.loss_bbox: 0.1556 d0.loss_iou: 0.2997 d1.loss_cls: 0.3181 d1.loss_bbox: 0.1502 d1.loss_iou: 0.2940 d2.loss_cls: 0.3071 d2.loss_bbox: 0.1434 d2.loss_iou: 0.2877 d3.loss_cls: 0.2980 d3.loss_bbox: 0.1455 d3.loss_iou: 0.2914 d4.loss_cls: 0.2983 d4.loss_bbox: 0.1460 d4.loss_iou: 0.2916 enc_loss_cls: 0.3443 enc_loss_bbox: 0.1730 enc_loss_iou: 0.3240 dn_loss_cls: 0.1332 dn_loss_bbox: 0.1738 dn_loss_iou: 0.2371 d0.dn_loss_cls: 0.2101 d0.dn_loss_bbox: 0.3138 d0.dn_loss_iou: 0.3784 d1.dn_loss_cls: 0.1616 d1.dn_loss_bbox: 0.2058 d1.dn_loss_iou: 0.2670 d2.dn_loss_cls: 0.1463 d2.dn_loss_bbox: 0.1839 d2.dn_loss_iou: 0.2457 d3.dn_loss_cls: 0.1377 d3.dn_loss_bbox: 0.1762 d3.dn_loss_iou: 0.2397 d4.dn_loss_cls: 0.1349 d4.dn_loss_bbox: 0.1738 d4.dn_loss_iou: 0.2370 d1.loss_lmm_region: 0.1691 loss_lmm_image: 0.9089 2024/11/11 08:17:35 - mmengine - INFO - Iter(train) [ 38400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:12:24 time: 1.9639 data_time: 0.0180 memory: 33502 grad_norm: 30.5526 loss: 9.4867 loss_cls: 0.3217 loss_bbox: 0.1301 loss_iou: 0.2714 d0.loss_cls: 0.3713 d0.loss_bbox: 0.1501 d0.loss_iou: 0.2896 d1.loss_cls: 0.3405 d1.loss_bbox: 0.1432 d1.loss_iou: 0.2772 d2.loss_cls: 0.3350 d2.loss_bbox: 0.1368 d2.loss_iou: 0.2742 d3.loss_cls: 0.3226 d3.loss_bbox: 0.1336 d3.loss_iou: 0.2735 d4.loss_cls: 0.3215 d4.loss_bbox: 0.1314 d4.loss_iou: 0.2717 enc_loss_cls: 0.3722 enc_loss_bbox: 0.1622 enc_loss_iou: 0.3187 dn_loss_cls: 0.0822 dn_loss_bbox: 0.1501 dn_loss_iou: 0.2089 d0.dn_loss_cls: 0.1624 d0.dn_loss_bbox: 0.2820 d0.dn_loss_iou: 0.3490 d1.dn_loss_cls: 0.1095 d1.dn_loss_bbox: 0.1790 d1.dn_loss_iou: 0.2398 d2.dn_loss_cls: 0.0954 d2.dn_loss_bbox: 0.1578 d2.dn_loss_iou: 0.2193 d3.dn_loss_cls: 0.0853 d3.dn_loss_bbox: 0.1514 d3.dn_loss_iou: 0.2112 d4.dn_loss_cls: 
0.0827 d4.dn_loss_bbox: 0.1501 d4.dn_loss_iou: 0.2088 d1.loss_lmm_region: 0.1459 loss_lmm_image: 0.8677 2024/11/11 08:20:54 - mmengine - INFO - Iter(train) [ 38500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:08:57 time: 1.9908 data_time: 0.0180 memory: 33398 grad_norm: 33.3218 loss: 10.1999 loss_cls: 0.3033 loss_bbox: 0.1392 loss_iou: 0.2649 d0.loss_cls: 0.3599 d0.loss_bbox: 0.1521 d0.loss_iou: 0.2806 d1.loss_cls: 0.3275 d1.loss_bbox: 0.1472 d1.loss_iou: 0.2702 d2.loss_cls: 0.3163 d2.loss_bbox: 0.1426 d2.loss_iou: 0.2681 d3.loss_cls: 0.3100 d3.loss_bbox: 0.1404 d3.loss_iou: 0.2646 d4.loss_cls: 0.3054 d4.loss_bbox: 0.1397 d4.loss_iou: 0.2635 enc_loss_cls: 0.3609 enc_loss_bbox: 0.1687 enc_loss_iou: 0.3086 dn_loss_cls: 0.1473 dn_loss_bbox: 0.1790 dn_loss_iou: 0.2304 d0.dn_loss_cls: 0.2359 d0.dn_loss_bbox: 0.3351 d0.dn_loss_iou: 0.3769 d1.dn_loss_cls: 0.1802 d1.dn_loss_bbox: 0.2135 d1.dn_loss_iou: 0.2642 d2.dn_loss_cls: 0.1615 d2.dn_loss_bbox: 0.1878 d2.dn_loss_iou: 0.2400 d3.dn_loss_cls: 0.1512 d3.dn_loss_bbox: 0.1802 d3.dn_loss_iou: 0.2325 d4.dn_loss_cls: 0.1486 d4.dn_loss_bbox: 0.1790 d4.dn_loss_iou: 0.2304 d1.loss_lmm_region: 0.1887 loss_lmm_image: 0.9040 2024/11/11 08:24:12 - mmengine - INFO - Iter(train) [ 38600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:05:28 time: 1.9781 data_time: 0.0181 memory: 33167 grad_norm: 30.8625 loss: 10.4147 loss_cls: 0.3146 loss_bbox: 0.1517 loss_iou: 0.2783 d0.loss_cls: 0.3747 d0.loss_bbox: 0.1635 d0.loss_iou: 0.2951 d1.loss_cls: 0.3379 d1.loss_bbox: 0.1590 d1.loss_iou: 0.2866 d2.loss_cls: 0.3299 d2.loss_bbox: 0.1539 d2.loss_iou: 0.2798 d3.loss_cls: 0.3178 d3.loss_bbox: 0.1538 d3.loss_iou: 0.2806 d4.loss_cls: 0.3156 d4.loss_bbox: 0.1522 d4.loss_iou: 0.2787 enc_loss_cls: 0.3762 enc_loss_bbox: 0.1796 enc_loss_iou: 0.3170 dn_loss_cls: 0.1235 dn_loss_bbox: 0.1826 dn_loss_iou: 0.2384 d0.dn_loss_cls: 0.2148 d0.dn_loss_bbox: 0.3405 d0.dn_loss_iou: 0.3850 d1.dn_loss_cls: 0.1628 d1.dn_loss_bbox: 0.2171 d1.dn_loss_iou: 0.2710 d2.dn_loss_cls: 0.1432 d2.dn_loss_bbox: 0.1911 d2.dn_loss_iou: 0.2471 d3.dn_loss_cls: 0.1310 d3.dn_loss_bbox: 0.1853 d3.dn_loss_iou: 0.2408 d4.dn_loss_cls: 0.1255 d4.dn_loss_bbox: 0.1826 d4.dn_loss_iou: 0.2384 d1.loss_lmm_region: 0.1756 loss_lmm_image: 0.9217 2024/11/11 08:27:32 - mmengine - INFO - Iter(train) [ 38700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 14:02:06 time: 2.0090 data_time: 0.0181 memory: 35019 grad_norm: 28.5323 loss: 10.8613 loss_cls: 0.3650 loss_bbox: 0.1633 loss_iou: 0.3166 d0.loss_cls: 0.4332 d0.loss_bbox: 0.1775 d0.loss_iou: 0.3322 d1.loss_cls: 0.3897 d1.loss_bbox: 0.1742 d1.loss_iou: 0.3262 d2.loss_cls: 0.3738 d2.loss_bbox: 0.1709 d2.loss_iou: 0.3228 d3.loss_cls: 0.3659 d3.loss_bbox: 0.1709 d3.loss_iou: 0.3194 d4.loss_cls: 0.3657 d4.loss_bbox: 0.1646 d4.loss_iou: 0.3184 enc_loss_cls: 0.4209 enc_loss_bbox: 0.1921 enc_loss_iou: 0.3588 dn_loss_cls: 0.1104 dn_loss_bbox: 0.1592 dn_loss_iou: 0.2373 d0.dn_loss_cls: 0.1915 d0.dn_loss_bbox: 0.2974 d0.dn_loss_iou: 0.3844 d1.dn_loss_cls: 0.1460 d1.dn_loss_bbox: 0.1898 d1.dn_loss_iou: 0.2720 d2.dn_loss_cls: 0.1228 d2.dn_loss_bbox: 0.1698 d2.dn_loss_iou: 0.2498 d3.dn_loss_cls: 0.1145 d3.dn_loss_bbox: 0.1620 d3.dn_loss_iou: 0.2405 d4.dn_loss_cls: 0.1115 d4.dn_loss_bbox: 0.1593 d4.dn_loss_iou: 0.2374 d1.loss_lmm_region: 0.1522 loss_lmm_image: 0.9313 2024/11/11 08:30:50 - mmengine - INFO - Iter(train) [ 38800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:58:39 time: 2.0123 data_time: 0.0181 memory: 34559 grad_norm: 28.7209 loss: 
10.1020 loss_cls: 0.3007 loss_bbox: 0.1528 loss_iou: 0.2486 d0.loss_cls: 0.3466 d0.loss_bbox: 0.1653 d0.loss_iou: 0.2619 d1.loss_cls: 0.3211 d1.loss_bbox: 0.1552 d1.loss_iou: 0.2528 d2.loss_cls: 0.3136 d2.loss_bbox: 0.1533 d2.loss_iou: 0.2483 d3.loss_cls: 0.3010 d3.loss_bbox: 0.1621 d3.loss_iou: 0.2521 d4.loss_cls: 0.3019 d4.loss_bbox: 0.1526 d4.loss_iou: 0.2496 enc_loss_cls: 0.3458 enc_loss_bbox: 0.1810 enc_loss_iou: 0.2819 dn_loss_cls: 0.1425 dn_loss_bbox: 0.1816 dn_loss_iou: 0.2272 d0.dn_loss_cls: 0.2435 d0.dn_loss_bbox: 0.3426 d0.dn_loss_iou: 0.3769 d1.dn_loss_cls: 0.1824 d1.dn_loss_bbox: 0.2194 d1.dn_loss_iou: 0.2605 d2.dn_loss_cls: 0.1574 d2.dn_loss_bbox: 0.1933 d2.dn_loss_iou: 0.2375 d3.dn_loss_cls: 0.1456 d3.dn_loss_bbox: 0.1858 d3.dn_loss_iou: 0.2304 d4.dn_loss_cls: 0.1434 d4.dn_loss_bbox: 0.1816 d4.dn_loss_iou: 0.2271 d1.loss_lmm_region: 0.1835 loss_lmm_image: 0.8916 2024/11/11 08:34:09 - mmengine - INFO - Iter(train) [ 38900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:55:12 time: 1.9901 data_time: 0.0181 memory: 33115 grad_norm: 32.0236 loss: 9.5721 loss_cls: 0.2949 loss_bbox: 0.1356 loss_iou: 0.2425 d0.loss_cls: 0.3390 d0.loss_bbox: 0.1449 d0.loss_iou: 0.2514 d1.loss_cls: 0.3139 d1.loss_bbox: 0.1416 d1.loss_iou: 0.2457 d2.loss_cls: 0.2993 d2.loss_bbox: 0.1406 d2.loss_iou: 0.2442 d3.loss_cls: 0.2960 d3.loss_bbox: 0.1338 d3.loss_iou: 0.2402 d4.loss_cls: 0.2933 d4.loss_bbox: 0.1351 d4.loss_iou: 0.2415 enc_loss_cls: 0.3390 enc_loss_bbox: 0.1651 enc_loss_iou: 0.2802 dn_loss_cls: 0.1253 dn_loss_bbox: 0.1730 dn_loss_iou: 0.2134 d0.dn_loss_cls: 0.2046 d0.dn_loss_bbox: 0.3377 d0.dn_loss_iou: 0.3592 d1.dn_loss_cls: 0.1547 d1.dn_loss_bbox: 0.2123 d1.dn_loss_iou: 0.2462 d2.dn_loss_cls: 0.1395 d2.dn_loss_bbox: 0.1857 d2.dn_loss_iou: 0.2234 d3.dn_loss_cls: 0.1299 d3.dn_loss_bbox: 0.1743 d3.dn_loss_iou: 0.2152 d4.dn_loss_cls: 0.1262 d4.dn_loss_bbox: 0.1729 d4.dn_loss_iou: 0.2133 d1.loss_lmm_region: 0.1635 loss_lmm_image: 0.8841 2024/11/11 08:37:29 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 08:37:29 - mmengine - INFO - Iter(train) [ 39000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:51:50 time: 2.0099 data_time: 0.0182 memory: 33896 grad_norm: 28.2020 loss: 9.3055 loss_cls: 0.2911 loss_bbox: 0.1242 loss_iou: 0.2224 d0.loss_cls: 0.3384 d0.loss_bbox: 0.1373 d0.loss_iou: 0.2373 d1.loss_cls: 0.3145 d1.loss_bbox: 0.1321 d1.loss_iou: 0.2310 d2.loss_cls: 0.3080 d2.loss_bbox: 0.1245 d2.loss_iou: 0.2244 d3.loss_cls: 0.2933 d3.loss_bbox: 0.1256 d3.loss_iou: 0.2228 d4.loss_cls: 0.2924 d4.loss_bbox: 0.1241 d4.loss_iou: 0.2222 enc_loss_cls: 0.3374 enc_loss_bbox: 0.1578 enc_loss_iou: 0.2641 dn_loss_cls: 0.1300 dn_loss_bbox: 0.1663 dn_loss_iou: 0.2050 d0.dn_loss_cls: 0.2148 d0.dn_loss_bbox: 0.3179 d0.dn_loss_iou: 0.3476 d1.dn_loss_cls: 0.1644 d1.dn_loss_bbox: 0.2048 d1.dn_loss_iou: 0.2374 d2.dn_loss_cls: 0.1446 d2.dn_loss_bbox: 0.1797 d2.dn_loss_iou: 0.2152 d3.dn_loss_cls: 0.1354 d3.dn_loss_bbox: 0.1683 d3.dn_loss_iou: 0.2074 d4.dn_loss_cls: 0.1317 d4.dn_loss_bbox: 0.1662 d4.dn_loss_iou: 0.2050 d1.loss_lmm_region: 0.1417 loss_lmm_image: 0.8972 2024/11/11 08:40:48 - mmengine - INFO - Iter(train) [ 39100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:48:24 time: 1.9993 data_time: 0.0183 memory: 33082 grad_norm: 28.8180 loss: 8.9045 loss_cls: 0.2714 loss_bbox: 0.1228 loss_iou: 0.2332 d0.loss_cls: 0.3173 d0.loss_bbox: 0.1334 d0.loss_iou: 0.2484 d1.loss_cls: 0.2875 d1.loss_bbox: 0.1296 d1.loss_iou: 0.2391 d2.loss_cls: 0.2783 
d2.loss_bbox: 0.1253 d2.loss_iou: 0.2355 d3.loss_cls: 0.2750 d3.loss_bbox: 0.1224 d3.loss_iou: 0.2330 d4.loss_cls: 0.2722 d4.loss_bbox: 0.1228 d4.loss_iou: 0.2330 enc_loss_cls: 0.3182 enc_loss_bbox: 0.1463 enc_loss_iou: 0.2722 dn_loss_cls: 0.1355 dn_loss_bbox: 0.1325 dn_loss_iou: 0.1823 d0.dn_loss_cls: 0.2098 d0.dn_loss_bbox: 0.2731 d0.dn_loss_iou: 0.3226 d1.dn_loss_cls: 0.1612 d1.dn_loss_bbox: 0.1640 d1.dn_loss_iou: 0.2142 d2.dn_loss_cls: 0.1445 d2.dn_loss_bbox: 0.1406 d2.dn_loss_iou: 0.1903 d3.dn_loss_cls: 0.1394 d3.dn_loss_bbox: 0.1336 d3.dn_loss_iou: 0.1845 d4.dn_loss_cls: 0.1359 d4.dn_loss_bbox: 0.1325 d4.dn_loss_iou: 0.1824 d1.loss_lmm_region: 0.1648 loss_lmm_image: 0.9437 2024/11/11 08:44:08 - mmengine - INFO - Iter(train) [ 39200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:45:02 time: 2.0127 data_time: 0.0182 memory: 34184 grad_norm: 25.3792 loss: 9.6195 loss_cls: 0.2900 loss_bbox: 0.1415 loss_iou: 0.2622 d0.loss_cls: 0.3403 d0.loss_bbox: 0.1549 d0.loss_iou: 0.2748 d1.loss_cls: 0.3161 d1.loss_bbox: 0.1459 d1.loss_iou: 0.2666 d2.loss_cls: 0.3017 d2.loss_bbox: 0.1460 d2.loss_iou: 0.2642 d3.loss_cls: 0.2942 d3.loss_bbox: 0.1419 d3.loss_iou: 0.2628 d4.loss_cls: 0.2864 d4.loss_bbox: 0.1437 d4.loss_iou: 0.2636 enc_loss_cls: 0.3381 enc_loss_bbox: 0.1712 enc_loss_iou: 0.3000 dn_loss_cls: 0.0913 dn_loss_bbox: 0.1835 dn_loss_iou: 0.2251 d0.dn_loss_cls: 0.1734 d0.dn_loss_bbox: 0.3240 d0.dn_loss_iou: 0.3577 d1.dn_loss_cls: 0.1237 d1.dn_loss_bbox: 0.2191 d1.dn_loss_iou: 0.2568 d2.dn_loss_cls: 0.1026 d2.dn_loss_bbox: 0.1935 d2.dn_loss_iou: 0.2345 d3.dn_loss_cls: 0.0968 d3.dn_loss_bbox: 0.1853 d3.dn_loss_iou: 0.2275 d4.dn_loss_cls: 0.0929 d4.dn_loss_bbox: 0.1834 d4.dn_loss_iou: 0.2250 d1.loss_lmm_region: 0.1362 loss_lmm_image: 0.8812 2024/11/11 08:47:25 - mmengine - INFO - Iter(train) [ 39300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:41:31 time: 1.9937 data_time: 0.0183 memory: 35281 grad_norm: 32.7730 loss: 9.1486 loss_cls: 0.2873 loss_bbox: 0.1239 loss_iou: 0.2249 d0.loss_cls: 0.3466 d0.loss_bbox: 0.1317 d0.loss_iou: 0.2342 d1.loss_cls: 0.3190 d1.loss_bbox: 0.1206 d1.loss_iou: 0.2225 d2.loss_cls: 0.3033 d2.loss_bbox: 0.1202 d2.loss_iou: 0.2212 d3.loss_cls: 0.2961 d3.loss_bbox: 0.1205 d3.loss_iou: 0.2218 d4.loss_cls: 0.2901 d4.loss_bbox: 0.1227 d4.loss_iou: 0.2234 enc_loss_cls: 0.3389 enc_loss_bbox: 0.1437 enc_loss_iou: 0.2545 dn_loss_cls: 0.1004 dn_loss_bbox: 0.1779 dn_loss_iou: 0.2082 d0.dn_loss_cls: 0.1815 d0.dn_loss_bbox: 0.3299 d0.dn_loss_iou: 0.3606 d1.dn_loss_cls: 0.1297 d1.dn_loss_bbox: 0.2144 d1.dn_loss_iou: 0.2418 d2.dn_loss_cls: 0.1097 d2.dn_loss_bbox: 0.1913 d2.dn_loss_iou: 0.2191 d3.dn_loss_cls: 0.1036 d3.dn_loss_bbox: 0.1806 d3.dn_loss_iou: 0.2108 d4.dn_loss_cls: 0.1017 d4.dn_loss_bbox: 0.1780 d4.dn_loss_iou: 0.2083 d1.loss_lmm_region: 0.1433 loss_lmm_image: 0.8904 2024/11/11 08:50:42 - mmengine - INFO - Iter(train) [ 39400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:38:02 time: 1.9871 data_time: 0.0182 memory: 34337 grad_norm: 34.3514 loss: 9.3233 loss_cls: 0.3263 loss_bbox: 0.1268 loss_iou: 0.2321 d0.loss_cls: 0.3794 d0.loss_bbox: 0.1318 d0.loss_iou: 0.2470 d1.loss_cls: 0.3511 d1.loss_bbox: 0.1280 d1.loss_iou: 0.2391 d2.loss_cls: 0.3377 d2.loss_bbox: 0.1273 d2.loss_iou: 0.2370 d3.loss_cls: 0.3297 d3.loss_bbox: 0.1274 d3.loss_iou: 0.2357 d4.loss_cls: 0.3274 d4.loss_bbox: 0.1269 d4.loss_iou: 0.2325 enc_loss_cls: 0.3667 enc_loss_bbox: 0.1485 enc_loss_iou: 0.2742 dn_loss_cls: 0.1349 dn_loss_bbox: 0.1385 dn_loss_iou: 0.1963 
d0.dn_loss_cls: 0.2029 d0.dn_loss_bbox: 0.2667 d0.dn_loss_iou: 0.3296 d1.dn_loss_cls: 0.1648 d1.dn_loss_bbox: 0.1674 d1.dn_loss_iou: 0.2253 d2.dn_loss_cls: 0.1465 d2.dn_loss_bbox: 0.1491 d2.dn_loss_iou: 0.2049 d3.dn_loss_cls: 0.1395 d3.dn_loss_bbox: 0.1404 d3.dn_loss_iou: 0.1986 d4.dn_loss_cls: 0.1353 d4.dn_loss_bbox: 0.1386 d4.dn_loss_iou: 0.1963 d1.loss_lmm_region: 0.1371 loss_lmm_image: 0.8782 2024/11/11 08:54:00 - mmengine - INFO - Iter(train) [ 39500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:34:33 time: 1.9540 data_time: 0.0183 memory: 33812 grad_norm: 30.4475 loss: 10.2802 loss_cls: 0.2873 loss_bbox: 0.1610 loss_iou: 0.2515 d0.loss_cls: 0.3453 d0.loss_bbox: 0.1750 d0.loss_iou: 0.2631 d1.loss_cls: 0.3113 d1.loss_bbox: 0.1641 d1.loss_iou: 0.2504 d2.loss_cls: 0.3003 d2.loss_bbox: 0.1587 d2.loss_iou: 0.2483 d3.loss_cls: 0.2928 d3.loss_bbox: 0.1591 d3.loss_iou: 0.2486 d4.loss_cls: 0.2895 d4.loss_bbox: 0.1602 d4.loss_iou: 0.2498 enc_loss_cls: 0.3381 enc_loss_bbox: 0.1943 enc_loss_iou: 0.2895 dn_loss_cls: 0.1143 dn_loss_bbox: 0.2256 dn_loss_iou: 0.2492 d0.dn_loss_cls: 0.2087 d0.dn_loss_bbox: 0.4005 d0.dn_loss_iou: 0.4054 d1.dn_loss_cls: 0.1542 d1.dn_loss_bbox: 0.2636 d1.dn_loss_iou: 0.2838 d2.dn_loss_cls: 0.1317 d2.dn_loss_bbox: 0.2363 d2.dn_loss_iou: 0.2585 d3.dn_loss_cls: 0.1224 d3.dn_loss_bbox: 0.2289 d3.dn_loss_iou: 0.2518 d4.dn_loss_cls: 0.1159 d4.dn_loss_bbox: 0.2257 d4.dn_loss_iou: 0.2493 d1.loss_lmm_region: 0.1584 loss_lmm_image: 0.8579 2024/11/11 08:57:21 - mmengine - INFO - Iter(train) [ 39600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:31:14 time: 2.0135 data_time: 0.0186 memory: 33437 grad_norm: 25.3485 loss: 11.6202 loss_cls: 0.3602 loss_bbox: 0.1882 loss_iou: 0.2845 d0.loss_cls: 0.4228 d0.loss_bbox: 0.1945 d0.loss_iou: 0.2943 d1.loss_cls: 0.3831 d1.loss_bbox: 0.1901 d1.loss_iou: 0.2873 d2.loss_cls: 0.3789 d2.loss_bbox: 0.1839 d2.loss_iou: 0.2825 d3.loss_cls: 0.3642 d3.loss_bbox: 0.1862 d3.loss_iou: 0.2818 d4.loss_cls: 0.3587 d4.loss_bbox: 0.1887 d4.loss_iou: 0.2855 enc_loss_cls: 0.4156 enc_loss_bbox: 0.2133 enc_loss_iou: 0.3222 dn_loss_cls: 0.1717 dn_loss_bbox: 0.2128 dn_loss_iou: 0.2674 d0.dn_loss_cls: 0.2576 d0.dn_loss_bbox: 0.3932 d0.dn_loss_iou: 0.4359 d1.dn_loss_cls: 0.2040 d1.dn_loss_bbox: 0.2556 d1.dn_loss_iou: 0.3061 d2.dn_loss_cls: 0.1845 d2.dn_loss_bbox: 0.2252 d2.dn_loss_iou: 0.2795 d3.dn_loss_cls: 0.1769 d3.dn_loss_bbox: 0.2152 d3.dn_loss_iou: 0.2700 d4.dn_loss_cls: 0.1732 d4.dn_loss_bbox: 0.2128 d4.dn_loss_iou: 0.2674 d1.loss_lmm_region: 0.1706 loss_lmm_image: 0.8739 2024/11/11 09:00:41 - mmengine - INFO - Iter(train) [ 39700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:27:51 time: 1.9998 data_time: 0.0182 memory: 34947 grad_norm: 27.8011 loss: 10.1243 loss_cls: 0.3123 loss_bbox: 0.1732 loss_iou: 0.2754 d0.loss_cls: 0.3807 d0.loss_bbox: 0.1759 d0.loss_iou: 0.2889 d1.loss_cls: 0.3466 d1.loss_bbox: 0.1683 d1.loss_iou: 0.2791 d2.loss_cls: 0.3349 d2.loss_bbox: 0.1675 d2.loss_iou: 0.2726 d3.loss_cls: 0.3224 d3.loss_bbox: 0.1682 d3.loss_iou: 0.2744 d4.loss_cls: 0.3132 d4.loss_bbox: 0.1736 d4.loss_iou: 0.2764 enc_loss_cls: 0.3742 enc_loss_bbox: 0.1927 enc_loss_iou: 0.3143 dn_loss_cls: 0.1103 dn_loss_bbox: 0.1752 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.1915 d0.dn_loss_bbox: 0.3232 d0.dn_loss_iou: 0.3725 d1.dn_loss_cls: 0.1372 d1.dn_loss_bbox: 0.2097 d1.dn_loss_iou: 0.2570 d2.dn_loss_cls: 0.1185 d2.dn_loss_bbox: 0.1877 d2.dn_loss_iou: 0.2356 d3.dn_loss_cls: 0.1122 d3.dn_loss_bbox: 0.1771 d3.dn_loss_iou: 0.2276 d4.dn_loss_cls: 
0.1116 d4.dn_loss_bbox: 0.1753 d4.dn_loss_iou: 0.2249 d1.loss_lmm_region: 0.1485 loss_lmm_image: 0.8189 2024/11/11 09:03:59 - mmengine - INFO - Iter(train) [ 39800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:24:24 time: 1.9992 data_time: 0.0205 memory: 33222 grad_norm: 29.4069 loss: 10.1631 loss_cls: 0.3744 loss_bbox: 0.1315 loss_iou: 0.2514 d0.loss_cls: 0.4454 d0.loss_bbox: 0.1350 d0.loss_iou: 0.2615 d1.loss_cls: 0.4092 d1.loss_bbox: 0.1324 d1.loss_iou: 0.2546 d2.loss_cls: 0.3992 d2.loss_bbox: 0.1244 d2.loss_iou: 0.2505 d3.loss_cls: 0.3840 d3.loss_bbox: 0.1266 d3.loss_iou: 0.2515 d4.loss_cls: 0.3773 d4.loss_bbox: 0.1287 d4.loss_iou: 0.2499 enc_loss_cls: 0.4364 enc_loss_bbox: 0.1614 enc_loss_iou: 0.2927 dn_loss_cls: 0.1537 dn_loss_bbox: 0.1485 dn_loss_iou: 0.2009 d0.dn_loss_cls: 0.2442 d0.dn_loss_bbox: 0.2921 d0.dn_loss_iou: 0.3450 d1.dn_loss_cls: 0.1923 d1.dn_loss_bbox: 0.1806 d1.dn_loss_iou: 0.2332 d2.dn_loss_cls: 0.1684 d2.dn_loss_bbox: 0.1576 d2.dn_loss_iou: 0.2107 d3.dn_loss_cls: 0.1610 d3.dn_loss_bbox: 0.1504 d3.dn_loss_iou: 0.2034 d4.dn_loss_cls: 0.1548 d4.dn_loss_bbox: 0.1486 d4.dn_loss_iou: 0.2009 d1.loss_lmm_region: 0.1589 loss_lmm_image: 0.8798 2024/11/11 09:07:19 - mmengine - INFO - Iter(train) [ 39900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:21:00 time: 1.9797 data_time: 0.0186 memory: 33598 grad_norm: 27.7541 loss: 10.3899 loss_cls: 0.3390 loss_bbox: 0.1538 loss_iou: 0.2665 d0.loss_cls: 0.3855 d0.loss_bbox: 0.1754 d0.loss_iou: 0.2841 d1.loss_cls: 0.3598 d1.loss_bbox: 0.1581 d1.loss_iou: 0.2699 d2.loss_cls: 0.3504 d2.loss_bbox: 0.1525 d2.loss_iou: 0.2675 d3.loss_cls: 0.3432 d3.loss_bbox: 0.1530 d3.loss_iou: 0.2664 d4.loss_cls: 0.3399 d4.loss_bbox: 0.1541 d4.loss_iou: 0.2665 enc_loss_cls: 0.3852 enc_loss_bbox: 0.1956 enc_loss_iou: 0.3070 dn_loss_cls: 0.1189 dn_loss_bbox: 0.1827 dn_loss_iou: 0.2277 d0.dn_loss_cls: 0.2075 d0.dn_loss_bbox: 0.3519 d0.dn_loss_iou: 0.3824 d1.dn_loss_cls: 0.1587 d1.dn_loss_bbox: 0.2246 d1.dn_loss_iou: 0.2636 d2.dn_loss_cls: 0.1358 d2.dn_loss_bbox: 0.1970 d2.dn_loss_iou: 0.2389 d3.dn_loss_cls: 0.1263 d3.dn_loss_bbox: 0.1859 d3.dn_loss_iou: 0.2307 d4.dn_loss_cls: 0.1197 d4.dn_loss_bbox: 0.1827 d4.dn_loss_iou: 0.2277 d1.loss_lmm_region: 0.1745 loss_lmm_image: 0.8791 2024/11/11 09:10:39 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 09:10:39 - mmengine - INFO - Iter(train) [ 40000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:17:37 time: 2.0118 data_time: 0.0184 memory: 34253 grad_norm: 25.3986 loss: 8.4879 loss_cls: 0.2541 loss_bbox: 0.1184 loss_iou: 0.2098 d0.loss_cls: 0.2953 d0.loss_bbox: 0.1295 d0.loss_iou: 0.2210 d1.loss_cls: 0.2664 d1.loss_bbox: 0.1236 d1.loss_iou: 0.2145 d2.loss_cls: 0.2617 d2.loss_bbox: 0.1207 d2.loss_iou: 0.2118 d3.loss_cls: 0.2590 d3.loss_bbox: 0.1172 d3.loss_iou: 0.2103 d4.loss_cls: 0.2564 d4.loss_bbox: 0.1182 d4.loss_iou: 0.2094 enc_loss_cls: 0.2980 enc_loss_bbox: 0.1405 enc_loss_iou: 0.2434 dn_loss_cls: 0.0922 dn_loss_bbox: 0.1585 dn_loss_iou: 0.2041 d0.dn_loss_cls: 0.1724 d0.dn_loss_bbox: 0.3044 d0.dn_loss_iou: 0.3474 d1.dn_loss_cls: 0.1240 d1.dn_loss_bbox: 0.1876 d1.dn_loss_iou: 0.2336 d2.dn_loss_cls: 0.1022 d2.dn_loss_bbox: 0.1681 d2.dn_loss_iou: 0.2133 d3.dn_loss_cls: 0.0961 d3.dn_loss_bbox: 0.1614 d3.dn_loss_iou: 0.2070 d4.dn_loss_cls: 0.0909 d4.dn_loss_bbox: 0.1585 d4.dn_loss_iou: 0.2043 d1.loss_lmm_region: 0.1098 loss_lmm_image: 0.8731 2024/11/11 09:13:57 - mmengine - INFO - Iter(train) [ 40100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 
2 days, 13:14:11 time: 1.9810 data_time: 0.0182 memory: 34012 grad_norm: 32.4324 loss: 11.2598 loss_cls: 0.3961 loss_bbox: 0.1647 loss_iou: 0.2892 d0.loss_cls: 0.4472 d0.loss_bbox: 0.1717 d0.loss_iou: 0.3030 d1.loss_cls: 0.4218 d1.loss_bbox: 0.1655 d1.loss_iou: 0.2964 d2.loss_cls: 0.4084 d2.loss_bbox: 0.1654 d2.loss_iou: 0.2918 d3.loss_cls: 0.4070 d3.loss_bbox: 0.1650 d3.loss_iou: 0.2875 d4.loss_cls: 0.3989 d4.loss_bbox: 0.1646 d4.loss_iou: 0.2898 enc_loss_cls: 0.4537 enc_loss_bbox: 0.1830 enc_loss_iou: 0.3242 dn_loss_cls: 0.1564 dn_loss_bbox: 0.1850 dn_loss_iou: 0.2343 d0.dn_loss_cls: 0.2420 d0.dn_loss_bbox: 0.3330 d0.dn_loss_iou: 0.3810 d1.dn_loss_cls: 0.1911 d1.dn_loss_bbox: 0.2164 d1.dn_loss_iou: 0.2663 d2.dn_loss_cls: 0.1692 d2.dn_loss_bbox: 0.1954 d2.dn_loss_iou: 0.2444 d3.dn_loss_cls: 0.1614 d3.dn_loss_bbox: 0.1863 d3.dn_loss_iou: 0.2367 d4.dn_loss_cls: 0.1569 d4.dn_loss_bbox: 0.1850 d4.dn_loss_iou: 0.2344 d1.loss_lmm_region: 0.1901 loss_lmm_image: 0.8995 2024/11/11 09:17:13 - mmengine - INFO - Iter(train) [ 40200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:10:38 time: 1.9555 data_time: 0.0183 memory: 33543 grad_norm: 26.7920 loss: 8.6442 loss_cls: 0.2641 loss_bbox: 0.1192 loss_iou: 0.1991 d0.loss_cls: 0.2975 d0.loss_bbox: 0.1335 d0.loss_iou: 0.2130 d1.loss_cls: 0.2759 d1.loss_bbox: 0.1263 d1.loss_iou: 0.2055 d2.loss_cls: 0.2682 d2.loss_bbox: 0.1228 d2.loss_iou: 0.2011 d3.loss_cls: 0.2644 d3.loss_bbox: 0.1217 d3.loss_iou: 0.2005 d4.loss_cls: 0.2630 d4.loss_bbox: 0.1203 d4.loss_iou: 0.1996 enc_loss_cls: 0.2966 enc_loss_bbox: 0.1496 enc_loss_iou: 0.2363 dn_loss_cls: 0.1361 dn_loss_bbox: 0.1460 dn_loss_iou: 0.1889 d0.dn_loss_cls: 0.2117 d0.dn_loss_bbox: 0.2964 d0.dn_loss_iou: 0.3313 d1.dn_loss_cls: 0.1661 d1.dn_loss_bbox: 0.1788 d1.dn_loss_iou: 0.2186 d2.dn_loss_cls: 0.1476 d2.dn_loss_bbox: 0.1557 d2.dn_loss_iou: 0.1978 d3.dn_loss_cls: 0.1419 d3.dn_loss_bbox: 0.1478 d3.dn_loss_iou: 0.1911 d4.dn_loss_cls: 0.1379 d4.dn_loss_bbox: 0.1462 d4.dn_loss_iou: 0.1889 d1.loss_lmm_region: 0.1711 loss_lmm_image: 0.8659 2024/11/11 09:20:34 - mmengine - INFO - Iter(train) [ 40300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:07:18 time: 1.9792 data_time: 0.0183 memory: 34094 grad_norm: 28.6134 loss: 9.3028 loss_cls: 0.2931 loss_bbox: 0.1381 loss_iou: 0.2742 d0.loss_cls: 0.3428 d0.loss_bbox: 0.1487 d0.loss_iou: 0.2886 d1.loss_cls: 0.3130 d1.loss_bbox: 0.1441 d1.loss_iou: 0.2804 d2.loss_cls: 0.3034 d2.loss_bbox: 0.1390 d2.loss_iou: 0.2754 d3.loss_cls: 0.2998 d3.loss_bbox: 0.1365 d3.loss_iou: 0.2742 d4.loss_cls: 0.2940 d4.loss_bbox: 0.1365 d4.loss_iou: 0.2720 enc_loss_cls: 0.3490 enc_loss_bbox: 0.1652 enc_loss_iou: 0.3128 dn_loss_cls: 0.0844 dn_loss_bbox: 0.1503 dn_loss_iou: 0.2056 d0.dn_loss_cls: 0.1584 d0.dn_loss_bbox: 0.2821 d0.dn_loss_iou: 0.3427 d1.dn_loss_cls: 0.1130 d1.dn_loss_bbox: 0.1765 d1.dn_loss_iou: 0.2339 d2.dn_loss_cls: 0.0948 d2.dn_loss_bbox: 0.1571 d2.dn_loss_iou: 0.2136 d3.dn_loss_cls: 0.0891 d3.dn_loss_bbox: 0.1518 d3.dn_loss_iou: 0.2079 d4.dn_loss_cls: 0.0853 d4.dn_loss_bbox: 0.1504 d4.dn_loss_iou: 0.2058 d1.loss_lmm_region: 0.1165 loss_lmm_image: 0.9030 2024/11/11 09:23:53 - mmengine - INFO - Iter(train) [ 40400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:03:54 time: 2.0142 data_time: 0.0181 memory: 34409 grad_norm: 29.2338 loss: 8.8499 loss_cls: 0.2701 loss_bbox: 0.1233 loss_iou: 0.2206 d0.loss_cls: 0.3083 d0.loss_bbox: 0.1311 d0.loss_iou: 0.2314 d1.loss_cls: 0.2806 d1.loss_bbox: 0.1268 d1.loss_iou: 0.2235 d2.loss_cls: 0.2727 d2.loss_bbox: 
0.1239 d2.loss_iou: 0.2211 d3.loss_cls: 0.2718 d3.loss_bbox: 0.1226 d3.loss_iou: 0.2187 d4.loss_cls: 0.2725 d4.loss_bbox: 0.1233 d4.loss_iou: 0.2209 enc_loss_cls: 0.3064 enc_loss_bbox: 0.1428 enc_loss_iou: 0.2495 dn_loss_cls: 0.1230 dn_loss_bbox: 0.1489 dn_loss_iou: 0.2011 d0.dn_loss_cls: 0.1978 d0.dn_loss_bbox: 0.3005 d0.dn_loss_iou: 0.3488 d1.dn_loss_cls: 0.1524 d1.dn_loss_bbox: 0.1844 d1.dn_loss_iou: 0.2333 d2.dn_loss_cls: 0.1331 d2.dn_loss_bbox: 0.1616 d2.dn_loss_iou: 0.2110 d3.dn_loss_cls: 0.1278 d3.dn_loss_bbox: 0.1517 d3.dn_loss_iou: 0.2037 d4.dn_loss_cls: 0.1233 d4.dn_loss_bbox: 0.1490 d4.dn_loss_iou: 0.2011 d1.loss_lmm_region: 0.1481 loss_lmm_image: 0.8875 2024/11/11 09:27:13 - mmengine - INFO - Iter(train) [ 40500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 13:00:31 time: 1.9869 data_time: 0.0204 memory: 33046 grad_norm: 25.3857 loss: 8.6563 loss_cls: 0.2344 loss_bbox: 0.1137 loss_iou: 0.1894 d0.loss_cls: 0.2748 d0.loss_bbox: 0.1258 d0.loss_iou: 0.2005 d1.loss_cls: 0.2510 d1.loss_bbox: 0.1198 d1.loss_iou: 0.1953 d2.loss_cls: 0.2456 d2.loss_bbox: 0.1124 d2.loss_iou: 0.1905 d3.loss_cls: 0.2390 d3.loss_bbox: 0.1130 d3.loss_iou: 0.1900 d4.loss_cls: 0.2371 d4.loss_bbox: 0.1135 d4.loss_iou: 0.1902 enc_loss_cls: 0.2817 enc_loss_bbox: 0.1334 enc_loss_iou: 0.2203 dn_loss_cls: 0.1137 dn_loss_bbox: 0.1883 dn_loss_iou: 0.2184 d0.dn_loss_cls: 0.1967 d0.dn_loss_bbox: 0.3611 d0.dn_loss_iou: 0.3737 d1.dn_loss_cls: 0.1456 d1.dn_loss_bbox: 0.2282 d1.dn_loss_iou: 0.2504 d2.dn_loss_cls: 0.1253 d2.dn_loss_bbox: 0.1999 d2.dn_loss_iou: 0.2276 d3.dn_loss_cls: 0.1169 d3.dn_loss_bbox: 0.1905 d3.dn_loss_iou: 0.2206 d4.dn_loss_cls: 0.1146 d4.dn_loss_bbox: 0.1882 d4.dn_loss_iou: 0.2183 d1.loss_lmm_region: 0.1440 loss_lmm_image: 0.8629 2024/11/11 09:30:32 - mmengine - INFO - Iter(train) [ 40600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:57:07 time: 1.9758 data_time: 0.0181 memory: 36190 grad_norm: 26.9939 loss: 8.7366 loss_cls: 0.2480 loss_bbox: 0.1163 loss_iou: 0.1819 d0.loss_cls: 0.2753 d0.loss_bbox: 0.1373 d0.loss_iou: 0.1988 d1.loss_cls: 0.2591 d1.loss_bbox: 0.1230 d1.loss_iou: 0.1924 d2.loss_cls: 0.2540 d2.loss_bbox: 0.1179 d2.loss_iou: 0.1864 d3.loss_cls: 0.2515 d3.loss_bbox: 0.1142 d3.loss_iou: 0.1820 d4.loss_cls: 0.2502 d4.loss_bbox: 0.1146 d4.loss_iou: 0.1813 enc_loss_cls: 0.2798 enc_loss_bbox: 0.1460 enc_loss_iou: 0.2194 dn_loss_cls: 0.1475 dn_loss_bbox: 0.1671 dn_loss_iou: 0.2039 d0.dn_loss_cls: 0.2311 d0.dn_loss_bbox: 0.3399 d0.dn_loss_iou: 0.3569 d1.dn_loss_cls: 0.1840 d1.dn_loss_bbox: 0.2021 d1.dn_loss_iou: 0.2361 d2.dn_loss_cls: 0.1617 d2.dn_loss_bbox: 0.1775 d2.dn_loss_iou: 0.2140 d3.dn_loss_cls: 0.1530 d3.dn_loss_bbox: 0.1691 d3.dn_loss_iou: 0.2069 d4.dn_loss_cls: 0.1487 d4.dn_loss_bbox: 0.1671 d4.dn_loss_iou: 0.2039 d1.loss_lmm_region: 0.1592 loss_lmm_image: 0.8774 2024/11/11 09:33:51 - mmengine - INFO - Iter(train) [ 40700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:53:42 time: 1.9896 data_time: 0.0185 memory: 35339 grad_norm: 26.1718 loss: 10.5002 loss_cls: 0.3196 loss_bbox: 0.1531 loss_iou: 0.2776 d0.loss_cls: 0.3743 d0.loss_bbox: 0.1553 d0.loss_iou: 0.2872 d1.loss_cls: 0.3439 d1.loss_bbox: 0.1574 d1.loss_iou: 0.2853 d2.loss_cls: 0.3248 d2.loss_bbox: 0.1581 d2.loss_iou: 0.2824 d3.loss_cls: 0.3165 d3.loss_bbox: 0.1561 d3.loss_iou: 0.2825 d4.loss_cls: 0.3197 d4.loss_bbox: 0.1519 d4.loss_iou: 0.2766 enc_loss_cls: 0.3734 enc_loss_bbox: 0.1716 enc_loss_iou: 0.3129 dn_loss_cls: 0.1234 dn_loss_bbox: 0.1940 dn_loss_iou: 0.2459 d0.dn_loss_cls: 0.2207 
d0.dn_loss_bbox: 0.3567 d0.dn_loss_iou: 0.3931 d1.dn_loss_cls: 0.1632 d1.dn_loss_bbox: 0.2352 d1.dn_loss_iou: 0.2816 d2.dn_loss_cls: 0.1398 d2.dn_loss_bbox: 0.2068 d2.dn_loss_iou: 0.2575 d3.dn_loss_cls: 0.1308 d3.dn_loss_bbox: 0.1965 d3.dn_loss_iou: 0.2490 d4.dn_loss_cls: 0.1233 d4.dn_loss_bbox: 0.1941 d4.dn_loss_iou: 0.2460 d1.loss_lmm_region: 0.1764 loss_lmm_image: 0.8858 2024/11/11 09:37:08 - mmengine - INFO - Iter(train) [ 40800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:50:10 time: 1.9682 data_time: 0.0192 memory: 33099 grad_norm: 31.6080 loss: 9.8848 loss_cls: 0.2938 loss_bbox: 0.1353 loss_iou: 0.2219 d0.loss_cls: 0.3323 d0.loss_bbox: 0.1464 d0.loss_iou: 0.2360 d1.loss_cls: 0.3099 d1.loss_bbox: 0.1413 d1.loss_iou: 0.2273 d2.loss_cls: 0.2998 d2.loss_bbox: 0.1399 d2.loss_iou: 0.2256 d3.loss_cls: 0.3011 d3.loss_bbox: 0.1347 d3.loss_iou: 0.2209 d4.loss_cls: 0.2981 d4.loss_bbox: 0.1326 d4.loss_iou: 0.2200 enc_loss_cls: 0.3333 enc_loss_bbox: 0.1625 enc_loss_iou: 0.2570 dn_loss_cls: 0.1427 dn_loss_bbox: 0.1987 dn_loss_iou: 0.2274 d0.dn_loss_cls: 0.2342 d0.dn_loss_bbox: 0.3841 d0.dn_loss_iou: 0.3866 d1.dn_loss_cls: 0.1788 d1.dn_loss_bbox: 0.2381 d1.dn_loss_iou: 0.2603 d2.dn_loss_cls: 0.1602 d2.dn_loss_bbox: 0.2100 d2.dn_loss_iou: 0.2379 d3.dn_loss_cls: 0.1510 d3.dn_loss_bbox: 0.2016 d3.dn_loss_iou: 0.2303 d4.dn_loss_cls: 0.1445 d4.dn_loss_bbox: 0.1987 d4.dn_loss_iou: 0.2274 d1.loss_lmm_region: 0.1892 loss_lmm_image: 0.9135 2024/11/11 09:40:28 - mmengine - INFO - Iter(train) [ 40900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:46:50 time: 2.0003 data_time: 0.0185 memory: 33797 grad_norm: 28.0025 loss: 10.8478 loss_cls: 0.3481 loss_bbox: 0.1719 loss_iou: 0.2832 d0.loss_cls: 0.4039 d0.loss_bbox: 0.1811 d0.loss_iou: 0.2947 d1.loss_cls: 0.3806 d1.loss_bbox: 0.1720 d1.loss_iou: 0.2842 d2.loss_cls: 0.3659 d2.loss_bbox: 0.1692 d2.loss_iou: 0.2787 d3.loss_cls: 0.3537 d3.loss_bbox: 0.1699 d3.loss_iou: 0.2808 d4.loss_cls: 0.3482 d4.loss_bbox: 0.1727 d4.loss_iou: 0.2820 enc_loss_cls: 0.4049 enc_loss_bbox: 0.1985 enc_loss_iou: 0.3196 dn_loss_cls: 0.1437 dn_loss_bbox: 0.1886 dn_loss_iou: 0.2329 d0.dn_loss_cls: 0.2288 d0.dn_loss_bbox: 0.3467 d0.dn_loss_iou: 0.3803 d1.dn_loss_cls: 0.1825 d1.dn_loss_bbox: 0.2269 d1.dn_loss_iou: 0.2659 d2.dn_loss_cls: 0.1597 d2.dn_loss_bbox: 0.2002 d2.dn_loss_iou: 0.2431 d3.dn_loss_cls: 0.1506 d3.dn_loss_bbox: 0.1903 d3.dn_loss_iou: 0.2352 d4.dn_loss_cls: 0.1437 d4.dn_loss_bbox: 0.1885 d4.dn_loss_iou: 0.2330 d1.loss_lmm_region: 0.1769 loss_lmm_image: 0.8666 2024/11/11 09:43:47 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 09:43:47 - mmengine - INFO - Iter(train) [ 41000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:43:26 time: 2.0250 data_time: 0.0187 memory: 33886 grad_norm: 26.9134 loss: 11.6722 loss_cls: 0.3960 loss_bbox: 0.1819 loss_iou: 0.3358 d0.loss_cls: 0.4492 d0.loss_bbox: 0.1923 d0.loss_iou: 0.3527 d1.loss_cls: 0.4242 d1.loss_bbox: 0.1855 d1.loss_iou: 0.3449 d2.loss_cls: 0.4101 d2.loss_bbox: 0.1852 d2.loss_iou: 0.3376 d3.loss_cls: 0.4023 d3.loss_bbox: 0.1811 d3.loss_iou: 0.3364 d4.loss_cls: 0.3996 d4.loss_bbox: 0.1800 d4.loss_iou: 0.3351 enc_loss_cls: 0.4526 enc_loss_bbox: 0.2125 enc_loss_iou: 0.3755 dn_loss_cls: 0.1405 dn_loss_bbox: 0.1869 dn_loss_iou: 0.2512 d0.dn_loss_cls: 0.2215 d0.dn_loss_bbox: 0.3404 d0.dn_loss_iou: 0.3933 d1.dn_loss_cls: 0.1782 d1.dn_loss_bbox: 0.2181 d1.dn_loss_iou: 0.2813 d2.dn_loss_cls: 0.1539 d2.dn_loss_bbox: 0.1969 d2.dn_loss_iou: 0.2606 d3.dn_loss_cls: 0.1449 
d3.dn_loss_bbox: 0.1890 d3.dn_loss_iou: 0.2531 d4.dn_loss_cls: 0.1420 d4.dn_loss_bbox: 0.1869 d4.dn_loss_iou: 0.2512 d1.loss_lmm_region: 0.1683 loss_lmm_image: 0.8436 2024/11/11 09:47:07 - mmengine - INFO - Iter(train) [ 41100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:40:03 time: 2.0051 data_time: 0.0192 memory: 34993 grad_norm: 28.8932 loss: 9.7293 loss_cls: 0.2757 loss_bbox: 0.1623 loss_iou: 0.2806 d0.loss_cls: 0.3256 d0.loss_bbox: 0.1690 d0.loss_iou: 0.2921 d1.loss_cls: 0.2989 d1.loss_bbox: 0.1644 d1.loss_iou: 0.2848 d2.loss_cls: 0.2876 d2.loss_bbox: 0.1610 d2.loss_iou: 0.2818 d3.loss_cls: 0.2789 d3.loss_bbox: 0.1641 d3.loss_iou: 0.2821 d4.loss_cls: 0.2761 d4.loss_bbox: 0.1642 d4.loss_iou: 0.2818 enc_loss_cls: 0.3276 enc_loss_bbox: 0.1844 enc_loss_iou: 0.3111 dn_loss_cls: 0.0874 dn_loss_bbox: 0.1711 dn_loss_iou: 0.2260 d0.dn_loss_cls: 0.1740 d0.dn_loss_bbox: 0.3138 d0.dn_loss_iou: 0.3659 d1.dn_loss_cls: 0.1216 d1.dn_loss_bbox: 0.1999 d1.dn_loss_iou: 0.2531 d2.dn_loss_cls: 0.1015 d2.dn_loss_bbox: 0.1796 d2.dn_loss_iou: 0.2336 d3.dn_loss_cls: 0.0916 d3.dn_loss_bbox: 0.1731 d3.dn_loss_iou: 0.2282 d4.dn_loss_cls: 0.0877 d4.dn_loss_bbox: 0.1711 d4.dn_loss_iou: 0.2261 d1.loss_lmm_region: 0.1650 loss_lmm_image: 0.9051 2024/11/11 09:50:26 - mmengine - INFO - Iter(train) [ 41200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:36:37 time: 2.0037 data_time: 0.0195 memory: 34901 grad_norm: 28.3277 loss: 9.7119 loss_cls: 0.2810 loss_bbox: 0.1618 loss_iou: 0.2591 d0.loss_cls: 0.3386 d0.loss_bbox: 0.1717 d0.loss_iou: 0.2782 d1.loss_cls: 0.2994 d1.loss_bbox: 0.1734 d1.loss_iou: 0.2754 d2.loss_cls: 0.2895 d2.loss_bbox: 0.1671 d2.loss_iou: 0.2648 d3.loss_cls: 0.2867 d3.loss_bbox: 0.1623 d3.loss_iou: 0.2598 d4.loss_cls: 0.2808 d4.loss_bbox: 0.1631 d4.loss_iou: 0.2597 enc_loss_cls: 0.3351 enc_loss_bbox: 0.1836 enc_loss_iou: 0.2981 dn_loss_cls: 0.1208 dn_loss_bbox: 0.1751 dn_loss_iou: 0.2172 d0.dn_loss_cls: 0.1900 d0.dn_loss_bbox: 0.3169 d0.dn_loss_iou: 0.3638 d1.dn_loss_cls: 0.1477 d1.dn_loss_bbox: 0.2036 d1.dn_loss_iou: 0.2478 d2.dn_loss_cls: 0.1309 d2.dn_loss_bbox: 0.1843 d2.dn_loss_iou: 0.2275 d3.dn_loss_cls: 0.1250 d3.dn_loss_bbox: 0.1772 d3.dn_loss_iou: 0.2204 d4.dn_loss_cls: 0.1205 d4.dn_loss_bbox: 0.1750 d4.dn_loss_iou: 0.2173 d1.loss_lmm_region: 0.1351 loss_lmm_image: 0.8266 2024/11/11 09:53:43 - mmengine - INFO - Iter(train) [ 41300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:33:09 time: 1.9936 data_time: 0.0192 memory: 35039 grad_norm: 26.2789 loss: 9.8313 loss_cls: 0.3312 loss_bbox: 0.1359 loss_iou: 0.2592 d0.loss_cls: 0.3859 d0.loss_bbox: 0.1454 d0.loss_iou: 0.2691 d1.loss_cls: 0.3467 d1.loss_bbox: 0.1443 d1.loss_iou: 0.2666 d2.loss_cls: 0.3376 d2.loss_bbox: 0.1363 d2.loss_iou: 0.2610 d3.loss_cls: 0.3352 d3.loss_bbox: 0.1353 d3.loss_iou: 0.2602 d4.loss_cls: 0.3325 d4.loss_bbox: 0.1354 d4.loss_iou: 0.2587 enc_loss_cls: 0.3772 enc_loss_bbox: 0.1586 enc_loss_iou: 0.2957 dn_loss_cls: 0.1373 dn_loss_bbox: 0.1461 dn_loss_iou: 0.2034 d0.dn_loss_cls: 0.2168 d0.dn_loss_bbox: 0.2902 d0.dn_loss_iou: 0.3485 d1.dn_loss_cls: 0.1699 d1.dn_loss_bbox: 0.1788 d1.dn_loss_iou: 0.2357 d2.dn_loss_cls: 0.1497 d2.dn_loss_bbox: 0.1572 d2.dn_loss_iou: 0.2137 d3.dn_loss_cls: 0.1435 d3.dn_loss_bbox: 0.1485 d3.dn_loss_iou: 0.2066 d4.dn_loss_cls: 0.1397 d4.dn_loss_bbox: 0.1461 d4.dn_loss_iou: 0.2033 d1.loss_lmm_region: 0.1641 loss_lmm_image: 0.9243 2024/11/11 09:57:03 - mmengine - INFO - Iter(train) [ 41400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:29:47 time: 
1.9733 data_time: 0.0192 memory: 34095 grad_norm: 30.8891 loss: 9.1265 loss_cls: 0.3006 loss_bbox: 0.1106 loss_iou: 0.2055 d0.loss_cls: 0.3384 d0.loss_bbox: 0.1262 d0.loss_iou: 0.2187 d1.loss_cls: 0.3127 d1.loss_bbox: 0.1170 d1.loss_iou: 0.2103 d2.loss_cls: 0.3055 d2.loss_bbox: 0.1145 d2.loss_iou: 0.2079 d3.loss_cls: 0.3025 d3.loss_bbox: 0.1118 d3.loss_iou: 0.2066 d4.loss_cls: 0.3005 d4.loss_bbox: 0.1106 d4.loss_iou: 0.2049 enc_loss_cls: 0.3360 enc_loss_bbox: 0.1373 enc_loss_iou: 0.2358 dn_loss_cls: 0.1381 dn_loss_bbox: 0.1509 dn_loss_iou: 0.2161 d0.dn_loss_cls: 0.2356 d0.dn_loss_bbox: 0.2937 d0.dn_loss_iou: 0.3631 d1.dn_loss_cls: 0.1796 d1.dn_loss_bbox: 0.1836 d1.dn_loss_iou: 0.2480 d2.dn_loss_cls: 0.1557 d2.dn_loss_bbox: 0.1568 d2.dn_loss_iou: 0.2243 d3.dn_loss_cls: 0.1438 d3.dn_loss_bbox: 0.1531 d3.dn_loss_iou: 0.2188 d4.dn_loss_cls: 0.1395 d4.dn_loss_bbox: 0.1508 d4.dn_loss_iou: 0.2162 d1.loss_lmm_region: 0.1757 loss_lmm_image: 0.8693 2024/11/11 10:00:24 - mmengine - INFO - Iter(train) [ 41500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:26:27 time: 2.0154 data_time: 0.0195 memory: 34548 grad_norm: 25.9255 loss: 9.6905 loss_cls: 0.3153 loss_bbox: 0.1239 loss_iou: 0.2243 d0.loss_cls: 0.3486 d0.loss_bbox: 0.1405 d0.loss_iou: 0.2442 d1.loss_cls: 0.3301 d1.loss_bbox: 0.1313 d1.loss_iou: 0.2324 d2.loss_cls: 0.3213 d2.loss_bbox: 0.1288 d2.loss_iou: 0.2279 d3.loss_cls: 0.3158 d3.loss_bbox: 0.1264 d3.loss_iou: 0.2273 d4.loss_cls: 0.3129 d4.loss_bbox: 0.1256 d4.loss_iou: 0.2265 enc_loss_cls: 0.3577 enc_loss_bbox: 0.1530 enc_loss_iou: 0.2667 dn_loss_cls: 0.1494 dn_loss_bbox: 0.1698 dn_loss_iou: 0.2167 d0.dn_loss_cls: 0.2218 d0.dn_loss_bbox: 0.3039 d0.dn_loss_iou: 0.3573 d1.dn_loss_cls: 0.1790 d1.dn_loss_bbox: 0.2045 d1.dn_loss_iou: 0.2493 d2.dn_loss_cls: 0.1607 d2.dn_loss_bbox: 0.1821 d2.dn_loss_iou: 0.2280 d3.dn_loss_cls: 0.1531 d3.dn_loss_bbox: 0.1730 d3.dn_loss_iou: 0.2198 d4.dn_loss_cls: 0.1481 d4.dn_loss_bbox: 0.1698 d4.dn_loss_iou: 0.2166 d1.loss_lmm_region: 0.1455 loss_lmm_image: 0.9613 2024/11/11 10:03:43 - mmengine - INFO - Iter(train) [ 41600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:23:01 time: 1.9915 data_time: 0.0195 memory: 33722 grad_norm: 28.0183 loss: 9.6952 loss_cls: 0.2831 loss_bbox: 0.1517 loss_iou: 0.2514 d0.loss_cls: 0.3346 d0.loss_bbox: 0.1569 d0.loss_iou: 0.2654 d1.loss_cls: 0.3051 d1.loss_bbox: 0.1534 d1.loss_iou: 0.2588 d2.loss_cls: 0.2959 d2.loss_bbox: 0.1481 d2.loss_iou: 0.2533 d3.loss_cls: 0.2904 d3.loss_bbox: 0.1465 d3.loss_iou: 0.2511 d4.loss_cls: 0.2880 d4.loss_bbox: 0.1462 d4.loss_iou: 0.2517 enc_loss_cls: 0.3371 enc_loss_bbox: 0.1685 enc_loss_iou: 0.2859 dn_loss_cls: 0.0898 dn_loss_bbox: 0.1839 dn_loss_iou: 0.2253 d0.dn_loss_cls: 0.1825 d0.dn_loss_bbox: 0.3390 d0.dn_loss_iou: 0.3710 d1.dn_loss_cls: 0.1273 d1.dn_loss_bbox: 0.2207 d1.dn_loss_iou: 0.2561 d2.dn_loss_cls: 0.1060 d2.dn_loss_bbox: 0.1979 d2.dn_loss_iou: 0.2347 d3.dn_loss_cls: 0.0972 d3.dn_loss_bbox: 0.1862 d3.dn_loss_iou: 0.2275 d4.dn_loss_cls: 0.0912 d4.dn_loss_bbox: 0.1838 d4.dn_loss_iou: 0.2252 d1.loss_lmm_region: 0.1628 loss_lmm_image: 0.9640 2024/11/11 10:07:02 - mmengine - INFO - Iter(train) [ 41700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:19:37 time: 1.9951 data_time: 0.0195 memory: 32392 grad_norm: 27.6455 loss: 10.7948 loss_cls: 0.3308 loss_bbox: 0.1766 loss_iou: 0.2982 d0.loss_cls: 0.3724 d0.loss_bbox: 0.1884 d0.loss_iou: 0.3111 d1.loss_cls: 0.3462 d1.loss_bbox: 0.1784 d1.loss_iou: 0.3016 d2.loss_cls: 0.3388 d2.loss_bbox: 0.1776 d2.loss_iou: 
0.3011 d3.loss_cls: 0.3331 d3.loss_bbox: 0.1794 d3.loss_iou: 0.3001 d4.loss_cls: 0.3298 d4.loss_bbox: 0.1772 d4.loss_iou: 0.2995 enc_loss_cls: 0.3743 enc_loss_bbox: 0.2022 enc_loss_iou: 0.3366 dn_loss_cls: 0.1133 dn_loss_bbox: 0.2087 dn_loss_iou: 0.2492 d0.dn_loss_cls: 0.1937 d0.dn_loss_bbox: 0.3491 d0.dn_loss_iou: 0.3893 d1.dn_loss_cls: 0.1475 d1.dn_loss_bbox: 0.2416 d1.dn_loss_iou: 0.2800 d2.dn_loss_cls: 0.1273 d2.dn_loss_bbox: 0.2173 d2.dn_loss_iou: 0.2580 d3.dn_loss_cls: 0.1204 d3.dn_loss_bbox: 0.2106 d3.dn_loss_iou: 0.2515 d4.dn_loss_cls: 0.1158 d4.dn_loss_bbox: 0.2087 d4.dn_loss_iou: 0.2491 d1.loss_lmm_region: 0.1704 loss_lmm_image: 0.8401 2024/11/11 10:10:21 - mmengine - INFO - Iter(train) [ 41800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:16:13 time: 1.9536 data_time: 0.0194 memory: 33972 grad_norm: 32.1353 loss: 9.9547 loss_cls: 0.3270 loss_bbox: 0.1399 loss_iou: 0.2471 d0.loss_cls: 0.3701 d0.loss_bbox: 0.1606 d0.loss_iou: 0.2617 d1.loss_cls: 0.3452 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2534 d2.loss_cls: 0.3378 d2.loss_bbox: 0.1440 d2.loss_iou: 0.2474 d3.loss_cls: 0.3297 d3.loss_bbox: 0.1453 d3.loss_iou: 0.2487 d4.loss_cls: 0.3289 d4.loss_bbox: 0.1392 d4.loss_iou: 0.2462 enc_loss_cls: 0.3746 enc_loss_bbox: 0.1727 enc_loss_iou: 0.2874 dn_loss_cls: 0.1492 dn_loss_bbox: 0.1716 dn_loss_iou: 0.2098 d0.dn_loss_cls: 0.2249 d0.dn_loss_bbox: 0.3172 d0.dn_loss_iou: 0.3458 d1.dn_loss_cls: 0.1768 d1.dn_loss_bbox: 0.2013 d1.dn_loss_iou: 0.2411 d2.dn_loss_cls: 0.1604 d2.dn_loss_bbox: 0.1828 d2.dn_loss_iou: 0.2198 d3.dn_loss_cls: 0.1556 d3.dn_loss_bbox: 0.1746 d3.dn_loss_iou: 0.2121 d4.dn_loss_cls: 0.1483 d4.dn_loss_bbox: 0.1715 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.1701 loss_lmm_image: 0.8559 2024/11/11 10:13:40 - mmengine - INFO - Iter(train) [ 41900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:12:49 time: 1.9866 data_time: 0.0195 memory: 34694 grad_norm: 28.0716 loss: 9.7097 loss_cls: 0.2877 loss_bbox: 0.1449 loss_iou: 0.2515 d0.loss_cls: 0.3337 d0.loss_bbox: 0.1528 d0.loss_iou: 0.2631 d1.loss_cls: 0.3073 d1.loss_bbox: 0.1484 d1.loss_iou: 0.2586 d2.loss_cls: 0.2994 d2.loss_bbox: 0.1439 d2.loss_iou: 0.2547 d3.loss_cls: 0.2939 d3.loss_bbox: 0.1448 d3.loss_iou: 0.2529 d4.loss_cls: 0.2903 d4.loss_bbox: 0.1444 d4.loss_iou: 0.2506 enc_loss_cls: 0.3366 enc_loss_bbox: 0.1630 enc_loss_iou: 0.2832 dn_loss_cls: 0.1062 dn_loss_bbox: 0.1888 dn_loss_iou: 0.2314 d0.dn_loss_cls: 0.1854 d0.dn_loss_bbox: 0.3538 d0.dn_loss_iou: 0.3797 d1.dn_loss_cls: 0.1368 d1.dn_loss_bbox: 0.2316 d1.dn_loss_iou: 0.2669 d2.dn_loss_cls: 0.1177 d2.dn_loss_bbox: 0.2027 d2.dn_loss_iou: 0.2434 d3.dn_loss_cls: 0.1112 d3.dn_loss_bbox: 0.1916 d3.dn_loss_iou: 0.2346 d4.dn_loss_cls: 0.1072 d4.dn_loss_bbox: 0.1889 d4.dn_loss_iou: 0.2314 d1.loss_lmm_region: 0.1341 loss_lmm_image: 0.8605 2024/11/11 10:17:00 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 10:17:00 - mmengine - INFO - Iter(train) [ 42000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:09:26 time: 2.0077 data_time: 0.0194 memory: 34808 grad_norm: 27.8435 loss: 10.9747 loss_cls: 0.3250 loss_bbox: 0.1813 loss_iou: 0.3192 d0.loss_cls: 0.3854 d0.loss_bbox: 0.1816 d0.loss_iou: 0.3280 d1.loss_cls: 0.3501 d1.loss_bbox: 0.1769 d1.loss_iou: 0.3179 d2.loss_cls: 0.3371 d2.loss_bbox: 0.1781 d2.loss_iou: 0.3137 d3.loss_cls: 0.3358 d3.loss_bbox: 0.1772 d3.loss_iou: 0.3120 d4.loss_cls: 0.3279 d4.loss_bbox: 0.1772 d4.loss_iou: 0.3180 enc_loss_cls: 0.3837 enc_loss_bbox: 0.2046 enc_loss_iou: 0.3579 dn_loss_cls: 0.1183 
dn_loss_bbox: 0.1914 dn_loss_iou: 0.2539 d0.dn_loss_cls: 0.2056 d0.dn_loss_bbox: 0.3636 d0.dn_loss_iou: 0.4123 d1.dn_loss_cls: 0.1530 d1.dn_loss_bbox: 0.2256 d1.dn_loss_iou: 0.2867 d2.dn_loss_cls: 0.1314 d2.dn_loss_bbox: 0.2031 d2.dn_loss_iou: 0.2637 d3.dn_loss_cls: 0.1229 d3.dn_loss_bbox: 0.1923 d3.dn_loss_iou: 0.2561 d4.dn_loss_cls: 0.1188 d4.dn_loss_bbox: 0.1914 d4.dn_loss_iou: 0.2539 d1.loss_lmm_region: 0.1426 loss_lmm_image: 0.8994 2024/11/11 10:20:20 - mmengine - INFO - Iter(train) [ 42100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:06:04 time: 2.0210 data_time: 0.0192 memory: 33420 grad_norm: 30.4594 loss: 9.2757 loss_cls: 0.3010 loss_bbox: 0.1346 loss_iou: 0.2202 d0.loss_cls: 0.3422 d0.loss_bbox: 0.1489 d0.loss_iou: 0.2295 d1.loss_cls: 0.3136 d1.loss_bbox: 0.1430 d1.loss_iou: 0.2247 d2.loss_cls: 0.3075 d2.loss_bbox: 0.1414 d2.loss_iou: 0.2236 d3.loss_cls: 0.3038 d3.loss_bbox: 0.1391 d3.loss_iou: 0.2212 d4.loss_cls: 0.3044 d4.loss_bbox: 0.1348 d4.loss_iou: 0.2182 enc_loss_cls: 0.3444 enc_loss_bbox: 0.1671 enc_loss_iou: 0.2610 dn_loss_cls: 0.1303 dn_loss_bbox: 0.1549 dn_loss_iou: 0.1973 d0.dn_loss_cls: 0.2119 d0.dn_loss_bbox: 0.2998 d0.dn_loss_iou: 0.3419 d1.dn_loss_cls: 0.1623 d1.dn_loss_bbox: 0.1828 d1.dn_loss_iou: 0.2279 d2.dn_loss_cls: 0.1409 d2.dn_loss_bbox: 0.1627 d2.dn_loss_iou: 0.2064 d3.dn_loss_cls: 0.1342 d3.dn_loss_bbox: 0.1566 d3.dn_loss_iou: 0.2002 d4.dn_loss_cls: 0.1311 d4.dn_loss_bbox: 0.1549 d4.dn_loss_iou: 0.1974 d1.loss_lmm_region: 0.1505 loss_lmm_image: 0.9076 2024/11/11 10:23:38 - mmengine - INFO - Iter(train) [ 42200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 12:02:38 time: 1.9684 data_time: 0.0194 memory: 34864 grad_norm: 25.5248 loss: 11.0765 loss_cls: 0.3225 loss_bbox: 0.1902 loss_iou: 0.3051 d0.loss_cls: 0.3861 d0.loss_bbox: 0.2017 d0.loss_iou: 0.3246 d1.loss_cls: 0.3554 d1.loss_bbox: 0.1965 d1.loss_iou: 0.3100 d2.loss_cls: 0.3352 d2.loss_bbox: 0.1952 d2.loss_iou: 0.3123 d3.loss_cls: 0.3250 d3.loss_bbox: 0.1971 d3.loss_iou: 0.3068 d4.loss_cls: 0.3254 d4.loss_bbox: 0.1896 d4.loss_iou: 0.3067 enc_loss_cls: 0.3798 enc_loss_bbox: 0.2215 enc_loss_iou: 0.3445 dn_loss_cls: 0.0895 dn_loss_bbox: 0.2261 dn_loss_iou: 0.2581 d0.dn_loss_cls: 0.1869 d0.dn_loss_bbox: 0.3810 d0.dn_loss_iou: 0.4108 d1.dn_loss_cls: 0.1266 d1.dn_loss_bbox: 0.2634 d1.dn_loss_iou: 0.2915 d2.dn_loss_cls: 0.1059 d2.dn_loss_bbox: 0.2381 d2.dn_loss_iou: 0.2695 d3.dn_loss_cls: 0.0942 d3.dn_loss_bbox: 0.2286 d3.dn_loss_iou: 0.2609 d4.dn_loss_cls: 0.0899 d4.dn_loss_bbox: 0.2261 d4.dn_loss_iou: 0.2581 d1.loss_lmm_region: 0.1279 loss_lmm_image: 0.9121 2024/11/11 10:26:56 - mmengine - INFO - Iter(train) [ 42300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:59:12 time: 1.9779 data_time: 0.0193 memory: 32493 grad_norm: 30.3370 loss: 10.0813 loss_cls: 0.3276 loss_bbox: 0.1444 loss_iou: 0.2473 d0.loss_cls: 0.3745 d0.loss_bbox: 0.1702 d0.loss_iou: 0.2706 d1.loss_cls: 0.3523 d1.loss_bbox: 0.1510 d1.loss_iou: 0.2526 d2.loss_cls: 0.3313 d2.loss_bbox: 0.1493 d2.loss_iou: 0.2518 d3.loss_cls: 0.3247 d3.loss_bbox: 0.1499 d3.loss_iou: 0.2517 d4.loss_cls: 0.3238 d4.loss_bbox: 0.1492 d4.loss_iou: 0.2501 enc_loss_cls: 0.3737 enc_loss_bbox: 0.1904 enc_loss_iou: 0.2993 dn_loss_cls: 0.1327 dn_loss_bbox: 0.1799 dn_loss_iou: 0.2158 d0.dn_loss_cls: 0.2210 d0.dn_loss_bbox: 0.3332 d0.dn_loss_iou: 0.3550 d1.dn_loss_cls: 0.1657 d1.dn_loss_bbox: 0.2105 d1.dn_loss_iou: 0.2450 d2.dn_loss_cls: 0.1455 d2.dn_loss_bbox: 0.1921 d2.dn_loss_iou: 0.2262 d3.dn_loss_cls: 0.1380 d3.dn_loss_bbox: 0.1822 
d3.dn_loss_iou: 0.2180 d4.dn_loss_cls: 0.1332 d4.dn_loss_bbox: 0.1799 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1718 loss_lmm_image: 0.8840 2024/11/11 10:30:16 - mmengine - INFO - Iter(train) [ 42400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:55:48 time: 1.9839 data_time: 0.0195 memory: 33117 grad_norm: 30.0505 loss: 10.3461 loss_cls: 0.3608 loss_bbox: 0.1518 loss_iou: 0.2506 d0.loss_cls: 0.4044 d0.loss_bbox: 0.1629 d0.loss_iou: 0.2639 d1.loss_cls: 0.3807 d1.loss_bbox: 0.1539 d1.loss_iou: 0.2560 d2.loss_cls: 0.3674 d2.loss_bbox: 0.1545 d2.loss_iou: 0.2534 d3.loss_cls: 0.3615 d3.loss_bbox: 0.1536 d3.loss_iou: 0.2516 d4.loss_cls: 0.3577 d4.loss_bbox: 0.1548 d4.loss_iou: 0.2551 enc_loss_cls: 0.4045 enc_loss_bbox: 0.1837 enc_loss_iou: 0.2951 dn_loss_cls: 0.1194 dn_loss_bbox: 0.1865 dn_loss_iou: 0.2164 d0.dn_loss_cls: 0.2138 d0.dn_loss_bbox: 0.3502 d0.dn_loss_iou: 0.3662 d1.dn_loss_cls: 0.1645 d1.dn_loss_bbox: 0.2237 d1.dn_loss_iou: 0.2498 d2.dn_loss_cls: 0.1359 d2.dn_loss_bbox: 0.1988 d2.dn_loss_iou: 0.2265 d3.dn_loss_cls: 0.1261 d3.dn_loss_bbox: 0.1893 d3.dn_loss_iou: 0.2188 d4.dn_loss_cls: 0.1212 d4.dn_loss_bbox: 0.1866 d4.dn_loss_iou: 0.2164 d1.loss_lmm_region: 0.1666 loss_lmm_image: 0.8916 2024/11/11 10:33:34 - mmengine - INFO - Iter(train) [ 42500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:52:21 time: 1.9636 data_time: 0.0186 memory: 31951 grad_norm: 26.7353 loss: 10.2454 loss_cls: 0.2941 loss_bbox: 0.1576 loss_iou: 0.2785 d0.loss_cls: 0.3411 d0.loss_bbox: 0.1718 d0.loss_iou: 0.2892 d1.loss_cls: 0.3175 d1.loss_bbox: 0.1590 d1.loss_iou: 0.2824 d2.loss_cls: 0.3020 d2.loss_bbox: 0.1608 d2.loss_iou: 0.2820 d3.loss_cls: 0.2962 d3.loss_bbox: 0.1549 d3.loss_iou: 0.2792 d4.loss_cls: 0.2952 d4.loss_bbox: 0.1564 d4.loss_iou: 0.2779 enc_loss_cls: 0.3392 enc_loss_bbox: 0.1843 enc_loss_iou: 0.3102 dn_loss_cls: 0.0985 dn_loss_bbox: 0.2093 dn_loss_iou: 0.2474 d0.dn_loss_cls: 0.1718 d0.dn_loss_bbox: 0.3767 d0.dn_loss_iou: 0.3952 d1.dn_loss_cls: 0.1298 d1.dn_loss_bbox: 0.2484 d1.dn_loss_iou: 0.2784 d2.dn_loss_cls: 0.1118 d2.dn_loss_bbox: 0.2230 d2.dn_loss_iou: 0.2589 d3.dn_loss_cls: 0.1047 d3.dn_loss_bbox: 0.2117 d3.dn_loss_iou: 0.2503 d4.dn_loss_cls: 0.1001 d4.dn_loss_bbox: 0.2093 d4.dn_loss_iou: 0.2473 d1.loss_lmm_region: 0.1429 loss_lmm_image: 0.9002 2024/11/11 10:36:53 - mmengine - INFO - Iter(train) [ 42600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:48:59 time: 1.9985 data_time: 0.0185 memory: 35273 grad_norm: 30.9747 loss: 9.2420 loss_cls: 0.2937 loss_bbox: 0.1285 loss_iou: 0.2278 d0.loss_cls: 0.3507 d0.loss_bbox: 0.1427 d0.loss_iou: 0.2488 d1.loss_cls: 0.3182 d1.loss_bbox: 0.1312 d1.loss_iou: 0.2358 d2.loss_cls: 0.3057 d2.loss_bbox: 0.1274 d2.loss_iou: 0.2288 d3.loss_cls: 0.2986 d3.loss_bbox: 0.1276 d3.loss_iou: 0.2278 d4.loss_cls: 0.2932 d4.loss_bbox: 0.1283 d4.loss_iou: 0.2281 enc_loss_cls: 0.3442 enc_loss_bbox: 0.1509 enc_loss_iou: 0.2698 dn_loss_cls: 0.1627 dn_loss_bbox: 0.1305 dn_loss_iou: 0.1895 d0.dn_loss_cls: 0.2533 d0.dn_loss_bbox: 0.2706 d0.dn_loss_iou: 0.3310 d1.dn_loss_cls: 0.1968 d1.dn_loss_bbox: 0.1611 d1.dn_loss_iou: 0.2203 d2.dn_loss_cls: 0.1731 d2.dn_loss_bbox: 0.1407 d2.dn_loss_iou: 0.1996 d3.dn_loss_cls: 0.1663 d3.dn_loss_bbox: 0.1338 d3.dn_loss_iou: 0.1931 d4.dn_loss_cls: 0.1646 d4.dn_loss_bbox: 0.1306 d4.dn_loss_iou: 0.1895 d1.loss_lmm_region: 0.1561 loss_lmm_image: 0.8710 2024/11/11 10:40:09 - mmengine - INFO - Iter(train) [ 42700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:45:25 time: 1.9617 data_time: 0.0180 
memory: 34571 grad_norm: 35.3079 loss: 10.1093 loss_cls: 0.3130 loss_bbox: 0.1213 loss_iou: 0.2195 d0.loss_cls: 0.3557 d0.loss_bbox: 0.1346 d0.loss_iou: 0.2305 d1.loss_cls: 0.3233 d1.loss_bbox: 0.1303 d1.loss_iou: 0.2252 d2.loss_cls: 0.3226 d2.loss_bbox: 0.1211 d2.loss_iou: 0.2170 d3.loss_cls: 0.3144 d3.loss_bbox: 0.1189 d3.loss_iou: 0.2186 d4.loss_cls: 0.3120 d4.loss_bbox: 0.1193 d4.loss_iou: 0.2197 enc_loss_cls: 0.3513 enc_loss_bbox: 0.1512 enc_loss_iou: 0.2541 dn_loss_cls: 0.1889 dn_loss_bbox: 0.1993 dn_loss_iou: 0.2287 d0.dn_loss_cls: 0.2605 d0.dn_loss_bbox: 0.3631 d0.dn_loss_iou: 0.3739 d1.dn_loss_cls: 0.2370 d1.dn_loss_bbox: 0.2470 d1.dn_loss_iou: 0.2651 d2.dn_loss_cls: 0.2226 d2.dn_loss_bbox: 0.2131 d2.dn_loss_iou: 0.2394 d3.dn_loss_cls: 0.2056 d3.dn_loss_bbox: 0.2021 d3.dn_loss_iou: 0.2310 d4.dn_loss_cls: 0.1942 d4.dn_loss_bbox: 0.1994 d4.dn_loss_iou: 0.2287 d1.loss_lmm_region: 0.1642 loss_lmm_image: 0.8716 2024/11/11 10:43:28 - mmengine - INFO - Iter(train) [ 42800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:42:02 time: 1.9909 data_time: 0.0179 memory: 33967 grad_norm: 29.7919 loss: 11.4579 loss_cls: 0.3661 loss_bbox: 0.1781 loss_iou: 0.3035 d0.loss_cls: 0.4199 d0.loss_bbox: 0.1965 d0.loss_iou: 0.3235 d1.loss_cls: 0.3874 d1.loss_bbox: 0.1892 d1.loss_iou: 0.3160 d2.loss_cls: 0.3779 d2.loss_bbox: 0.1829 d2.loss_iou: 0.3077 d3.loss_cls: 0.3709 d3.loss_bbox: 0.1799 d3.loss_iou: 0.3080 d4.loss_cls: 0.3693 d4.loss_bbox: 0.1779 d4.loss_iou: 0.3042 enc_loss_cls: 0.4155 enc_loss_bbox: 0.2055 enc_loss_iou: 0.3462 dn_loss_cls: 0.1494 dn_loss_bbox: 0.1984 dn_loss_iou: 0.2569 d0.dn_loss_cls: 0.2419 d0.dn_loss_bbox: 0.3366 d0.dn_loss_iou: 0.4060 d1.dn_loss_cls: 0.1867 d1.dn_loss_bbox: 0.2286 d1.dn_loss_iou: 0.2875 d2.dn_loss_cls: 0.1648 d2.dn_loss_bbox: 0.2073 d2.dn_loss_iou: 0.2664 d3.dn_loss_cls: 0.1548 d3.dn_loss_bbox: 0.2004 d3.dn_loss_iou: 0.2594 d4.dn_loss_cls: 0.1500 d4.dn_loss_bbox: 0.1983 d4.dn_loss_iou: 0.2570 d1.loss_lmm_region: 0.1829 loss_lmm_image: 0.8984 2024/11/11 10:46:47 - mmengine - INFO - Iter(train) [ 42900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:38:38 time: 1.9903 data_time: 0.0178 memory: 34405 grad_norm: 31.6968 loss: 9.8059 loss_cls: 0.3030 loss_bbox: 0.1389 loss_iou: 0.2623 d0.loss_cls: 0.3476 d0.loss_bbox: 0.1540 d0.loss_iou: 0.2750 d1.loss_cls: 0.3188 d1.loss_bbox: 0.1508 d1.loss_iou: 0.2729 d2.loss_cls: 0.3156 d2.loss_bbox: 0.1400 d2.loss_iou: 0.2641 d3.loss_cls: 0.3070 d3.loss_bbox: 0.1395 d3.loss_iou: 0.2638 d4.loss_cls: 0.3052 d4.loss_bbox: 0.1378 d4.loss_iou: 0.2612 enc_loss_cls: 0.3446 enc_loss_bbox: 0.1675 enc_loss_iou: 0.2975 dn_loss_cls: 0.1266 dn_loss_bbox: 0.1627 dn_loss_iou: 0.2198 d0.dn_loss_cls: 0.2106 d0.dn_loss_bbox: 0.3112 d0.dn_loss_iou: 0.3590 d1.dn_loss_cls: 0.1600 d1.dn_loss_bbox: 0.1957 d1.dn_loss_iou: 0.2505 d2.dn_loss_cls: 0.1383 d2.dn_loss_bbox: 0.1737 d2.dn_loss_iou: 0.2290 d3.dn_loss_cls: 0.1287 d3.dn_loss_bbox: 0.1641 d3.dn_loss_iou: 0.2215 d4.dn_loss_cls: 0.1263 d4.dn_loss_bbox: 0.1626 d4.dn_loss_iou: 0.2197 d1.loss_lmm_region: 0.1839 loss_lmm_image: 0.8949 2024/11/11 10:50:05 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 10:50:05 - mmengine - INFO - Iter(train) [ 43000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:35:10 time: 1.9818 data_time: 0.0179 memory: 34623 grad_norm: 27.3530 loss: 10.0115 loss_cls: 0.3179 loss_bbox: 0.1530 loss_iou: 0.2840 d0.loss_cls: 0.3630 d0.loss_bbox: 0.1600 d0.loss_iou: 0.2911 d1.loss_cls: 0.3346 d1.loss_bbox: 0.1576 
d1.loss_iou: 0.2871 d2.loss_cls: 0.3259 d2.loss_bbox: 0.1548 d2.loss_iou: 0.2834 d3.loss_cls: 0.3232 d3.loss_bbox: 0.1535 d3.loss_iou: 0.2830 d4.loss_cls: 0.3217 d4.loss_bbox: 0.1520 d4.loss_iou: 0.2831 enc_loss_cls: 0.3611 enc_loss_bbox: 0.1769 enc_loss_iou: 0.3147 dn_loss_cls: 0.0937 dn_loss_bbox: 0.1809 dn_loss_iou: 0.2330 d0.dn_loss_cls: 0.1752 d0.dn_loss_bbox: 0.3130 d0.dn_loss_iou: 0.3725 d1.dn_loss_cls: 0.1248 d1.dn_loss_bbox: 0.2104 d1.dn_loss_iou: 0.2639 d2.dn_loss_cls: 0.1050 d2.dn_loss_bbox: 0.1925 d2.dn_loss_iou: 0.2434 d3.dn_loss_cls: 0.0980 d3.dn_loss_bbox: 0.1836 d3.dn_loss_iou: 0.2354 d4.dn_loss_cls: 0.0951 d4.dn_loss_bbox: 0.1809 d4.dn_loss_iou: 0.2328 d1.loss_lmm_region: 0.1409 loss_lmm_image: 0.8551 2024/11/11 10:53:21 - mmengine - INFO - Iter(train) [ 43100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:31:40 time: 1.9741 data_time: 0.0179 memory: 33569 grad_norm: 31.5163 loss: 9.2826 loss_cls: 0.2765 loss_bbox: 0.1264 loss_iou: 0.2356 d0.loss_cls: 0.3248 d0.loss_bbox: 0.1353 d0.loss_iou: 0.2454 d1.loss_cls: 0.2992 d1.loss_bbox: 0.1299 d1.loss_iou: 0.2390 d2.loss_cls: 0.2849 d2.loss_bbox: 0.1294 d2.loss_iou: 0.2388 d3.loss_cls: 0.2764 d3.loss_bbox: 0.1311 d3.loss_iou: 0.2371 d4.loss_cls: 0.2769 d4.loss_bbox: 0.1276 d4.loss_iou: 0.2345 enc_loss_cls: 0.3206 enc_loss_bbox: 0.1455 enc_loss_iou: 0.2658 dn_loss_cls: 0.1230 dn_loss_bbox: 0.1651 dn_loss_iou: 0.2175 d0.dn_loss_cls: 0.2097 d0.dn_loss_bbox: 0.3078 d0.dn_loss_iou: 0.3705 d1.dn_loss_cls: 0.1558 d1.dn_loss_bbox: 0.1968 d1.dn_loss_iou: 0.2499 d2.dn_loss_cls: 0.1340 d2.dn_loss_bbox: 0.1735 d2.dn_loss_iou: 0.2278 d3.dn_loss_cls: 0.1280 d3.dn_loss_bbox: 0.1659 d3.dn_loss_iou: 0.2196 d4.dn_loss_cls: 0.1228 d4.dn_loss_bbox: 0.1649 d4.dn_loss_iou: 0.2174 d1.loss_lmm_region: 0.1700 loss_lmm_image: 0.8818 2024/11/11 10:56:43 - mmengine - INFO - Iter(train) [ 43200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:28:23 time: 2.0214 data_time: 0.0178 memory: 34034 grad_norm: 28.6575 loss: 9.6297 loss_cls: 0.2604 loss_bbox: 0.1661 loss_iou: 0.2523 d0.loss_cls: 0.3040 d0.loss_bbox: 0.1831 d0.loss_iou: 0.2637 d1.loss_cls: 0.2767 d1.loss_bbox: 0.1703 d1.loss_iou: 0.2559 d2.loss_cls: 0.2624 d2.loss_bbox: 0.1746 d2.loss_iou: 0.2565 d3.loss_cls: 0.2728 d3.loss_bbox: 0.1580 d3.loss_iou: 0.2471 d4.loss_cls: 0.2641 d4.loss_bbox: 0.1650 d4.loss_iou: 0.2511 enc_loss_cls: 0.3071 enc_loss_bbox: 0.1896 enc_loss_iou: 0.2801 dn_loss_cls: 0.0873 dn_loss_bbox: 0.1986 dn_loss_iou: 0.2420 d0.dn_loss_cls: 0.1788 d0.dn_loss_bbox: 0.3550 d0.dn_loss_iou: 0.3897 d1.dn_loss_cls: 0.1196 d1.dn_loss_bbox: 0.2274 d1.dn_loss_iou: 0.2714 d2.dn_loss_cls: 0.0998 d2.dn_loss_bbox: 0.2071 d2.dn_loss_iou: 0.2508 d3.dn_loss_cls: 0.0922 d3.dn_loss_bbox: 0.2002 d3.dn_loss_iou: 0.2444 d4.dn_loss_cls: 0.0885 d4.dn_loss_bbox: 0.1985 d4.dn_loss_iou: 0.2419 d1.loss_lmm_region: 0.1387 loss_lmm_image: 0.8372 2024/11/11 11:00:02 - mmengine - INFO - Iter(train) [ 43300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:24:58 time: 1.9978 data_time: 0.0179 memory: 31660 grad_norm: 25.9280 loss: 8.7494 loss_cls: 0.2701 loss_bbox: 0.1115 loss_iou: 0.2040 d0.loss_cls: 0.3127 d0.loss_bbox: 0.1256 d0.loss_iou: 0.2259 d1.loss_cls: 0.2848 d1.loss_bbox: 0.1163 d1.loss_iou: 0.2132 d2.loss_cls: 0.2772 d2.loss_bbox: 0.1113 d2.loss_iou: 0.2035 d3.loss_cls: 0.2755 d3.loss_bbox: 0.1103 d3.loss_iou: 0.2023 d4.loss_cls: 0.2695 d4.loss_bbox: 0.1108 d4.loss_iou: 0.2028 enc_loss_cls: 0.3142 enc_loss_bbox: 0.1373 enc_loss_iou: 0.2430 dn_loss_cls: 0.1200 dn_loss_bbox: 
0.1625 dn_loss_iou: 0.1974 d0.dn_loss_cls: 0.2013 d0.dn_loss_bbox: 0.2999 d0.dn_loss_iou: 0.3350 d1.dn_loss_cls: 0.1532 d1.dn_loss_bbox: 0.1898 d1.dn_loss_iou: 0.2279 d2.dn_loss_cls: 0.1327 d2.dn_loss_bbox: 0.1720 d2.dn_loss_iou: 0.2071 d3.dn_loss_cls: 0.1237 d3.dn_loss_bbox: 0.1638 d3.dn_loss_iou: 0.1999 d4.dn_loss_cls: 0.1214 d4.dn_loss_bbox: 0.1625 d4.dn_loss_iou: 0.1972 d1.loss_lmm_region: 0.1372 loss_lmm_image: 0.9231 2024/11/11 11:03:21 - mmengine - INFO - Iter(train) [ 43400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:21:34 time: 1.9938 data_time: 0.0179 memory: 35048 grad_norm: 33.6567 loss: 9.7007 loss_cls: 0.2951 loss_bbox: 0.1494 loss_iou: 0.2392 d0.loss_cls: 0.3456 d0.loss_bbox: 0.1609 d0.loss_iou: 0.2499 d1.loss_cls: 0.3148 d1.loss_bbox: 0.1484 d1.loss_iou: 0.2391 d2.loss_cls: 0.3068 d2.loss_bbox: 0.1469 d2.loss_iou: 0.2358 d3.loss_cls: 0.2998 d3.loss_bbox: 0.1495 d3.loss_iou: 0.2394 d4.loss_cls: 0.2978 d4.loss_bbox: 0.1473 d4.loss_iou: 0.2384 enc_loss_cls: 0.3391 enc_loss_bbox: 0.1802 enc_loss_iou: 0.2768 dn_loss_cls: 0.1142 dn_loss_bbox: 0.1873 dn_loss_iou: 0.2215 d0.dn_loss_cls: 0.1858 d0.dn_loss_bbox: 0.3377 d0.dn_loss_iou: 0.3639 d1.dn_loss_cls: 0.1363 d1.dn_loss_bbox: 0.2178 d1.dn_loss_iou: 0.2503 d2.dn_loss_cls: 0.1246 d2.dn_loss_bbox: 0.1979 d2.dn_loss_iou: 0.2311 d3.dn_loss_cls: 0.1177 d3.dn_loss_bbox: 0.1888 d3.dn_loss_iou: 0.2236 d4.dn_loss_cls: 0.1148 d4.dn_loss_bbox: 0.1872 d4.dn_loss_iou: 0.2216 d1.loss_lmm_region: 0.1596 loss_lmm_image: 0.9189 2024/11/11 11:06:39 - mmengine - INFO - Iter(train) [ 43500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:18:09 time: 2.0110 data_time: 0.0179 memory: 32675 grad_norm: 35.3382 loss: 10.4956 loss_cls: 0.3290 loss_bbox: 0.1823 loss_iou: 0.2785 d0.loss_cls: 0.3786 d0.loss_bbox: 0.1900 d0.loss_iou: 0.2937 d1.loss_cls: 0.3485 d1.loss_bbox: 0.1842 d1.loss_iou: 0.2855 d2.loss_cls: 0.3454 d2.loss_bbox: 0.1772 d2.loss_iou: 0.2773 d3.loss_cls: 0.3377 d3.loss_bbox: 0.1765 d3.loss_iou: 0.2761 d4.loss_cls: 0.3324 d4.loss_bbox: 0.1785 d4.loss_iou: 0.2779 enc_loss_cls: 0.3719 enc_loss_bbox: 0.2034 enc_loss_iou: 0.3108 dn_loss_cls: 0.1083 dn_loss_bbox: 0.1923 dn_loss_iou: 0.2341 d0.dn_loss_cls: 0.1928 d0.dn_loss_bbox: 0.3440 d0.dn_loss_iou: 0.3746 d1.dn_loss_cls: 0.1442 d1.dn_loss_bbox: 0.2248 d1.dn_loss_iou: 0.2626 d2.dn_loss_cls: 0.1221 d2.dn_loss_bbox: 0.2025 d2.dn_loss_iou: 0.2437 d3.dn_loss_cls: 0.1122 d3.dn_loss_bbox: 0.1930 d3.dn_loss_iou: 0.2357 d4.dn_loss_cls: 0.1083 d4.dn_loss_bbox: 0.1925 d4.dn_loss_iou: 0.2342 d1.loss_lmm_region: 0.1564 loss_lmm_image: 0.8823 2024/11/11 11:09:56 - mmengine - INFO - Iter(train) [ 43600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:14:40 time: 1.9448 data_time: 0.0178 memory: 33470 grad_norm: 32.7684 loss: 10.7031 loss_cls: 0.3121 loss_bbox: 0.1806 loss_iou: 0.2741 d0.loss_cls: 0.3633 d0.loss_bbox: 0.1904 d0.loss_iou: 0.2913 d1.loss_cls: 0.3346 d1.loss_bbox: 0.1853 d1.loss_iou: 0.2782 d2.loss_cls: 0.3213 d2.loss_bbox: 0.1834 d2.loss_iou: 0.2793 d3.loss_cls: 0.3153 d3.loss_bbox: 0.1790 d3.loss_iou: 0.2734 d4.loss_cls: 0.3114 d4.loss_bbox: 0.1809 d4.loss_iou: 0.2751 enc_loss_cls: 0.3602 enc_loss_bbox: 0.2068 enc_loss_iou: 0.3163 dn_loss_cls: 0.1284 dn_loss_bbox: 0.2070 dn_loss_iou: 0.2456 d0.dn_loss_cls: 0.2198 d0.dn_loss_bbox: 0.3704 d0.dn_loss_iou: 0.3920 d1.dn_loss_cls: 0.1655 d1.dn_loss_bbox: 0.2466 d1.dn_loss_iou: 0.2822 d2.dn_loss_cls: 0.1453 d2.dn_loss_bbox: 0.2190 d2.dn_loss_iou: 0.2573 d3.dn_loss_cls: 0.1345 d3.dn_loss_bbox: 0.2094 d3.dn_loss_iou: 
0.2484 d4.dn_loss_cls: 0.1299 d4.dn_loss_bbox: 0.2069 d4.dn_loss_iou: 0.2455 d1.loss_lmm_region: 0.1652 loss_lmm_image: 0.8720 2024/11/11 11:13:13 - mmengine - INFO - Iter(train) [ 43700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:11:11 time: 1.9543 data_time: 0.0178 memory: 32946 grad_norm: 30.5274 loss: 8.5438 loss_cls: 0.2599 loss_bbox: 0.1106 loss_iou: 0.2044 d0.loss_cls: 0.2893 d0.loss_bbox: 0.1291 d0.loss_iou: 0.2194 d1.loss_cls: 0.2665 d1.loss_bbox: 0.1209 d1.loss_iou: 0.2123 d2.loss_cls: 0.2663 d2.loss_bbox: 0.1136 d2.loss_iou: 0.2081 d3.loss_cls: 0.2638 d3.loss_bbox: 0.1114 d3.loss_iou: 0.2050 d4.loss_cls: 0.2633 d4.loss_bbox: 0.1101 d4.loss_iou: 0.2041 enc_loss_cls: 0.2953 enc_loss_bbox: 0.1428 enc_loss_iou: 0.2412 dn_loss_cls: 0.1144 dn_loss_bbox: 0.1446 dn_loss_iou: 0.1962 d0.dn_loss_cls: 0.1933 d0.dn_loss_bbox: 0.2971 d0.dn_loss_iou: 0.3506 d1.dn_loss_cls: 0.1439 d1.dn_loss_bbox: 0.1786 d1.dn_loss_iou: 0.2299 d2.dn_loss_cls: 0.1280 d2.dn_loss_bbox: 0.1526 d2.dn_loss_iou: 0.2063 d3.dn_loss_cls: 0.1187 d3.dn_loss_bbox: 0.1468 d3.dn_loss_iou: 0.1994 d4.dn_loss_cls: 0.1163 d4.dn_loss_bbox: 0.1446 d4.dn_loss_iou: 0.1962 d1.loss_lmm_region: 0.1438 loss_lmm_image: 0.9051 2024/11/11 11:16:32 - mmengine - INFO - Iter(train) [ 43800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:07:46 time: 1.9789 data_time: 0.0178 memory: 34610 grad_norm: 33.5296 loss: 10.4727 loss_cls: 0.3130 loss_bbox: 0.1553 loss_iou: 0.2920 d0.loss_cls: 0.3493 d0.loss_bbox: 0.1756 d0.loss_iou: 0.3077 d1.loss_cls: 0.3270 d1.loss_bbox: 0.1655 d1.loss_iou: 0.2927 d2.loss_cls: 0.3192 d2.loss_bbox: 0.1592 d2.loss_iou: 0.2889 d3.loss_cls: 0.3156 d3.loss_bbox: 0.1598 d3.loss_iou: 0.2923 d4.loss_cls: 0.3147 d4.loss_bbox: 0.1553 d4.loss_iou: 0.2914 enc_loss_cls: 0.3529 enc_loss_bbox: 0.1844 enc_loss_iou: 0.3237 dn_loss_cls: 0.1592 dn_loss_bbox: 0.1754 dn_loss_iou: 0.2383 d0.dn_loss_cls: 0.2261 d0.dn_loss_bbox: 0.3257 d0.dn_loss_iou: 0.3858 d1.dn_loss_cls: 0.1838 d1.dn_loss_bbox: 0.2064 d1.dn_loss_iou: 0.2686 d2.dn_loss_cls: 0.1707 d2.dn_loss_bbox: 0.1860 d2.dn_loss_iou: 0.2478 d3.dn_loss_cls: 0.1636 d3.dn_loss_bbox: 0.1781 d3.dn_loss_iou: 0.2408 d4.dn_loss_cls: 0.1600 d4.dn_loss_bbox: 0.1755 d4.dn_loss_iou: 0.2383 d1.loss_lmm_region: 0.1692 loss_lmm_image: 0.8380 2024/11/11 11:19:52 - mmengine - INFO - Iter(train) [ 43900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:04:24 time: 2.0052 data_time: 0.0178 memory: 33342 grad_norm: 32.9215 loss: 10.2561 loss_cls: 0.3184 loss_bbox: 0.1647 loss_iou: 0.2872 d0.loss_cls: 0.3748 d0.loss_bbox: 0.1709 d0.loss_iou: 0.2991 d1.loss_cls: 0.3434 d1.loss_bbox: 0.1626 d1.loss_iou: 0.2909 d2.loss_cls: 0.3326 d2.loss_bbox: 0.1633 d2.loss_iou: 0.2858 d3.loss_cls: 0.3233 d3.loss_bbox: 0.1641 d3.loss_iou: 0.2859 d4.loss_cls: 0.3189 d4.loss_bbox: 0.1663 d4.loss_iou: 0.2881 enc_loss_cls: 0.3755 enc_loss_bbox: 0.1876 enc_loss_iou: 0.3253 dn_loss_cls: 0.0994 dn_loss_bbox: 0.1766 dn_loss_iou: 0.2256 d0.dn_loss_cls: 0.1897 d0.dn_loss_bbox: 0.3319 d0.dn_loss_iou: 0.3770 d1.dn_loss_cls: 0.1366 d1.dn_loss_bbox: 0.2154 d1.dn_loss_iou: 0.2616 d2.dn_loss_cls: 0.1137 d2.dn_loss_bbox: 0.1867 d2.dn_loss_iou: 0.2360 d3.dn_loss_cls: 0.1061 d3.dn_loss_bbox: 0.1798 d3.dn_loss_iou: 0.2282 d4.dn_loss_cls: 0.1008 d4.dn_loss_bbox: 0.1766 d4.dn_loss_iou: 0.2256 d1.loss_lmm_region: 0.1543 loss_lmm_image: 0.9057 2024/11/11 11:23:12 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 11:23:12 - mmengine - INFO - Iter(train) [ 44000/150000] base_lr: 
1.0000e-04 lr: 1.0000e-04 eta: 2 days, 11:01:04 time: 1.9844 data_time: 0.0178 memory: 35143 grad_norm: 29.8451 loss: 10.2700 loss_cls: 0.3630 loss_bbox: 0.1506 loss_iou: 0.2790 d0.loss_cls: 0.4089 d0.loss_bbox: 0.1656 d0.loss_iou: 0.2958 d1.loss_cls: 0.3804 d1.loss_bbox: 0.1576 d1.loss_iou: 0.2837 d2.loss_cls: 0.3759 d2.loss_bbox: 0.1499 d2.loss_iou: 0.2733 d3.loss_cls: 0.3647 d3.loss_bbox: 0.1508 d3.loss_iou: 0.2783 d4.loss_cls: 0.3647 d4.loss_bbox: 0.1506 d4.loss_iou: 0.2789 enc_loss_cls: 0.4099 enc_loss_bbox: 0.1819 enc_loss_iou: 0.3214 dn_loss_cls: 0.1126 dn_loss_bbox: 0.1650 dn_loss_iou: 0.2151 d0.dn_loss_cls: 0.2025 d0.dn_loss_bbox: 0.2995 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1466 d1.dn_loss_bbox: 0.1957 d1.dn_loss_iou: 0.2468 d2.dn_loss_cls: 0.1249 d2.dn_loss_bbox: 0.1739 d2.dn_loss_iou: 0.2249 d3.dn_loss_cls: 0.1142 d3.dn_loss_bbox: 0.1670 d3.dn_loss_iou: 0.2180 d4.dn_loss_cls: 0.1133 d4.dn_loss_bbox: 0.1650 d4.dn_loss_iou: 0.2152 d1.loss_lmm_region: 0.1547 loss_lmm_image: 0.8757 2024/11/11 11:26:33 - mmengine - INFO - Iter(train) [ 44100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:57:43 time: 1.9990 data_time: 0.0177 memory: 34172 grad_norm: 31.1581 loss: 9.9791 loss_cls: 0.2989 loss_bbox: 0.1359 loss_iou: 0.2462 d0.loss_cls: 0.3523 d0.loss_bbox: 0.1564 d0.loss_iou: 0.2663 d1.loss_cls: 0.3186 d1.loss_bbox: 0.1552 d1.loss_iou: 0.2588 d2.loss_cls: 0.3129 d2.loss_bbox: 0.1430 d2.loss_iou: 0.2486 d3.loss_cls: 0.3063 d3.loss_bbox: 0.1377 d3.loss_iou: 0.2465 d4.loss_cls: 0.2969 d4.loss_bbox: 0.1379 d4.loss_iou: 0.2477 enc_loss_cls: 0.3486 enc_loss_bbox: 0.1697 enc_loss_iou: 0.2903 dn_loss_cls: 0.1439 dn_loss_bbox: 0.1762 dn_loss_iou: 0.2327 d0.dn_loss_cls: 0.2284 d0.dn_loss_bbox: 0.3172 d0.dn_loss_iou: 0.3722 d1.dn_loss_cls: 0.1790 d1.dn_loss_bbox: 0.2077 d1.dn_loss_iou: 0.2620 d2.dn_loss_cls: 0.1579 d2.dn_loss_bbox: 0.1858 d2.dn_loss_iou: 0.2409 d3.dn_loss_cls: 0.1484 d3.dn_loss_bbox: 0.1781 d3.dn_loss_iou: 0.2348 d4.dn_loss_cls: 0.1453 d4.dn_loss_bbox: 0.1762 d4.dn_loss_iou: 0.2327 d1.loss_lmm_region: 0.1985 loss_lmm_image: 0.8863 2024/11/11 11:29:51 - mmengine - INFO - Iter(train) [ 44200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:54:18 time: 1.9620 data_time: 0.0174 memory: 34495 grad_norm: 26.0708 loss: 9.9980 loss_cls: 0.3183 loss_bbox: 0.1533 loss_iou: 0.2545 d0.loss_cls: 0.3709 d0.loss_bbox: 0.1703 d0.loss_iou: 0.2745 d1.loss_cls: 0.3453 d1.loss_bbox: 0.1601 d1.loss_iou: 0.2626 d2.loss_cls: 0.3261 d2.loss_bbox: 0.1588 d2.loss_iou: 0.2598 d3.loss_cls: 0.3199 d3.loss_bbox: 0.1539 d3.loss_iou: 0.2567 d4.loss_cls: 0.3199 d4.loss_bbox: 0.1519 d4.loss_iou: 0.2533 enc_loss_cls: 0.3752 enc_loss_bbox: 0.1760 enc_loss_iou: 0.2951 dn_loss_cls: 0.1153 dn_loss_bbox: 0.1699 dn_loss_iou: 0.2177 d0.dn_loss_cls: 0.2040 d0.dn_loss_bbox: 0.3327 d0.dn_loss_iou: 0.3633 d1.dn_loss_cls: 0.1546 d1.dn_loss_bbox: 0.2062 d1.dn_loss_iou: 0.2494 d2.dn_loss_cls: 0.1332 d2.dn_loss_bbox: 0.1823 d2.dn_loss_iou: 0.2274 d3.dn_loss_cls: 0.1229 d3.dn_loss_bbox: 0.1722 d3.dn_loss_iou: 0.2203 d4.dn_loss_cls: 0.1160 d4.dn_loss_bbox: 0.1699 d4.dn_loss_iou: 0.2177 d1.loss_lmm_region: 0.1602 loss_lmm_image: 0.9066 2024/11/11 11:33:09 - mmengine - INFO - Iter(train) [ 44300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:50:50 time: 1.9644 data_time: 0.0176 memory: 33427 grad_norm: 29.3169 loss: 9.4302 loss_cls: 0.2659 loss_bbox: 0.1373 loss_iou: 0.2271 d0.loss_cls: 0.3065 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2447 d1.loss_cls: 0.2817 d1.loss_bbox: 0.1462 d1.loss_iou: 0.2348 
d2.loss_cls: 0.2769 d2.loss_bbox: 0.1408 d2.loss_iou: 0.2294 d3.loss_cls: 0.2689 d3.loss_bbox: 0.1393 d3.loss_iou: 0.2289 d4.loss_cls: 0.2669 d4.loss_bbox: 0.1388 d4.loss_iou: 0.2281 enc_loss_cls: 0.3168 enc_loss_bbox: 0.1638 enc_loss_iou: 0.2602 dn_loss_cls: 0.1332 dn_loss_bbox: 0.1703 dn_loss_iou: 0.2294 d0.dn_loss_cls: 0.2219 d0.dn_loss_bbox: 0.3212 d0.dn_loss_iou: 0.3774 d1.dn_loss_cls: 0.1677 d1.dn_loss_bbox: 0.2028 d1.dn_loss_iou: 0.2594 d2.dn_loss_cls: 0.1508 d2.dn_loss_bbox: 0.1793 d2.dn_loss_iou: 0.2378 d3.dn_loss_cls: 0.1402 d3.dn_loss_bbox: 0.1730 d3.dn_loss_iou: 0.2318 d4.dn_loss_cls: 0.1334 d4.dn_loss_bbox: 0.1702 d4.dn_loss_iou: 0.2293 d1.loss_lmm_region: 0.1603 loss_lmm_image: 0.8868 2024/11/11 11:36:28 - mmengine - INFO - Iter(train) [ 44400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:47:27 time: 1.9824 data_time: 0.0176 memory: 35277 grad_norm: 28.2865 loss: 8.6321 loss_cls: 0.2533 loss_bbox: 0.1153 loss_iou: 0.2152 d0.loss_cls: 0.2960 d0.loss_bbox: 0.1277 d0.loss_iou: 0.2281 d1.loss_cls: 0.2774 d1.loss_bbox: 0.1184 d1.loss_iou: 0.2187 d2.loss_cls: 0.2657 d2.loss_bbox: 0.1143 d2.loss_iou: 0.2175 d3.loss_cls: 0.2553 d3.loss_bbox: 0.1194 d3.loss_iou: 0.2163 d4.loss_cls: 0.2538 d4.loss_bbox: 0.1173 d4.loss_iou: 0.2156 enc_loss_cls: 0.3059 enc_loss_bbox: 0.1348 enc_loss_iou: 0.2458 dn_loss_cls: 0.1142 dn_loss_bbox: 0.1503 dn_loss_iou: 0.2053 d0.dn_loss_cls: 0.1928 d0.dn_loss_bbox: 0.2969 d0.dn_loss_iou: 0.3461 d1.dn_loss_cls: 0.1412 d1.dn_loss_bbox: 0.1828 d1.dn_loss_iou: 0.2355 d2.dn_loss_cls: 0.1217 d2.dn_loss_bbox: 0.1617 d2.dn_loss_iou: 0.2154 d3.dn_loss_cls: 0.1170 d3.dn_loss_bbox: 0.1528 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.1147 d4.dn_loss_bbox: 0.1502 d4.dn_loss_iou: 0.2054 d1.loss_lmm_region: 0.1500 loss_lmm_image: 0.8579 2024/11/11 11:39:44 - mmengine - INFO - Iter(train) [ 44500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:43:56 time: 1.9574 data_time: 0.0177 memory: 32614 grad_norm: 26.9801 loss: 9.3644 loss_cls: 0.2896 loss_bbox: 0.1367 loss_iou: 0.2291 d0.loss_cls: 0.3418 d0.loss_bbox: 0.1486 d0.loss_iou: 0.2442 d1.loss_cls: 0.3076 d1.loss_bbox: 0.1449 d1.loss_iou: 0.2393 d2.loss_cls: 0.2984 d2.loss_bbox: 0.1397 d2.loss_iou: 0.2313 d3.loss_cls: 0.2916 d3.loss_bbox: 0.1377 d3.loss_iou: 0.2299 d4.loss_cls: 0.2893 d4.loss_bbox: 0.1373 d4.loss_iou: 0.2291 enc_loss_cls: 0.3463 enc_loss_bbox: 0.1594 enc_loss_iou: 0.2668 dn_loss_cls: 0.1086 dn_loss_bbox: 0.1689 dn_loss_iou: 0.2193 d0.dn_loss_cls: 0.2050 d0.dn_loss_bbox: 0.3359 d0.dn_loss_iou: 0.3665 d1.dn_loss_cls: 0.1448 d1.dn_loss_bbox: 0.2052 d1.dn_loss_iou: 0.2496 d2.dn_loss_cls: 0.1230 d2.dn_loss_bbox: 0.1783 d2.dn_loss_iou: 0.2280 d3.dn_loss_cls: 0.1171 d3.dn_loss_bbox: 0.1713 d3.dn_loss_iou: 0.2213 d4.dn_loss_cls: 0.1097 d4.dn_loss_bbox: 0.1690 d4.dn_loss_iou: 0.2193 d1.loss_lmm_region: 0.1533 loss_lmm_image: 0.8316 2024/11/11 11:43:00 - mmengine - INFO - Iter(train) [ 44600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:40:27 time: 1.9793 data_time: 0.0178 memory: 33616 grad_norm: 26.0400 loss: 10.7342 loss_cls: 0.3401 loss_bbox: 0.1701 loss_iou: 0.2781 d0.loss_cls: 0.3872 d0.loss_bbox: 0.1903 d0.loss_iou: 0.2989 d1.loss_cls: 0.3669 d1.loss_bbox: 0.1758 d1.loss_iou: 0.2855 d2.loss_cls: 0.3555 d2.loss_bbox: 0.1739 d2.loss_iou: 0.2810 d3.loss_cls: 0.3451 d3.loss_bbox: 0.1706 d3.loss_iou: 0.2799 d4.loss_cls: 0.3390 d4.loss_bbox: 0.1755 d4.loss_iou: 0.2814 enc_loss_cls: 0.3860 enc_loss_bbox: 0.2060 enc_loss_iou: 0.3239 dn_loss_cls: 0.1406 dn_loss_bbox: 0.1873 dn_loss_iou: 
0.2311 d0.dn_loss_cls: 0.2177 d0.dn_loss_bbox: 0.3342 d0.dn_loss_iou: 0.3697 d1.dn_loss_cls: 0.1686 d1.dn_loss_bbox: 0.2256 d1.dn_loss_iou: 0.2611 d2.dn_loss_cls: 0.1518 d2.dn_loss_bbox: 0.2006 d2.dn_loss_iou: 0.2415 d3.dn_loss_cls: 0.1434 d3.dn_loss_bbox: 0.1897 d3.dn_loss_iou: 0.2337 d4.dn_loss_cls: 0.1408 d4.dn_loss_bbox: 0.1873 d4.dn_loss_iou: 0.2310 d1.loss_lmm_region: 0.1666 loss_lmm_image: 0.9011 2024/11/11 11:46:17 - mmengine - INFO - Iter(train) [ 44700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:36:58 time: 1.9638 data_time: 0.0177 memory: 32180 grad_norm: nan loss: 8.9801 loss_cls: 0.2806 loss_bbox: 0.1254 loss_iou: 0.2156 d0.loss_cls: 0.3318 d0.loss_bbox: 0.1346 d0.loss_iou: 0.2277 d1.loss_cls: 0.3023 d1.loss_bbox: 0.1272 d1.loss_iou: 0.2176 d2.loss_cls: 0.2927 d2.loss_bbox: 0.1247 d2.loss_iou: 0.2146 d3.loss_cls: 0.2843 d3.loss_bbox: 0.1258 d3.loss_iou: 0.2151 d4.loss_cls: 0.2807 d4.loss_bbox: 0.1251 d4.loss_iou: 0.2149 enc_loss_cls: 0.3362 enc_loss_bbox: 0.1510 enc_loss_iou: 0.2522 dn_loss_cls: 0.1113 dn_loss_bbox: 0.1603 dn_loss_iou: 0.1956 d0.dn_loss_cls: 0.1967 d0.dn_loss_bbox: 0.3189 d0.dn_loss_iou: 0.3500 d1.dn_loss_cls: 0.1481 d1.dn_loss_bbox: 0.1944 d1.dn_loss_iou: 0.2289 d2.dn_loss_cls: 0.1240 d2.dn_loss_bbox: 0.1715 d2.dn_loss_iou: 0.2064 d3.dn_loss_cls: 0.1193 d3.dn_loss_bbox: 0.1638 d3.dn_loss_iou: 0.1991 d4.dn_loss_cls: 0.1139 d4.dn_loss_bbox: 0.1602 d4.dn_loss_iou: 0.1955 d1.loss_lmm_region: 0.1670 loss_lmm_image: 0.8751 2024/11/11 11:49:39 - mmengine - INFO - Iter(train) [ 44800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:33:41 time: 2.0033 data_time: 0.0176 memory: 34337 grad_norm: 28.8480 loss: 9.4401 loss_cls: 0.2820 loss_bbox: 0.1412 loss_iou: 0.2499 d0.loss_cls: 0.3231 d0.loss_bbox: 0.1508 d0.loss_iou: 0.2693 d1.loss_cls: 0.2999 d1.loss_bbox: 0.1448 d1.loss_iou: 0.2587 d2.loss_cls: 0.2896 d2.loss_bbox: 0.1438 d2.loss_iou: 0.2533 d3.loss_cls: 0.2849 d3.loss_bbox: 0.1414 d3.loss_iou: 0.2533 d4.loss_cls: 0.2841 d4.loss_bbox: 0.1390 d4.loss_iou: 0.2486 enc_loss_cls: 0.3272 enc_loss_bbox: 0.1608 enc_loss_iou: 0.2870 dn_loss_cls: 0.1009 dn_loss_bbox: 0.1787 dn_loss_iou: 0.2270 d0.dn_loss_cls: 0.1866 d0.dn_loss_bbox: 0.3109 d0.dn_loss_iou: 0.3592 d1.dn_loss_cls: 0.1324 d1.dn_loss_bbox: 0.2086 d1.dn_loss_iou: 0.2574 d2.dn_loss_cls: 0.1139 d2.dn_loss_bbox: 0.1884 d2.dn_loss_iou: 0.2376 d3.dn_loss_cls: 0.1063 d3.dn_loss_bbox: 0.1813 d3.dn_loss_iou: 0.2304 d4.dn_loss_cls: 0.1028 d4.dn_loss_bbox: 0.1787 d4.dn_loss_iou: 0.2270 d1.loss_lmm_region: 0.1471 loss_lmm_image: 0.8325 2024/11/11 11:52:58 - mmengine - INFO - Iter(train) [ 44900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:30:17 time: 1.9829 data_time: 0.0178 memory: 33940 grad_norm: 31.2584 loss: 9.6463 loss_cls: 0.3193 loss_bbox: 0.1294 loss_iou: 0.2104 d0.loss_cls: 0.3597 d0.loss_bbox: 0.1539 d0.loss_iou: 0.2339 d1.loss_cls: 0.3373 d1.loss_bbox: 0.1448 d1.loss_iou: 0.2229 d2.loss_cls: 0.3283 d2.loss_bbox: 0.1352 d2.loss_iou: 0.2139 d3.loss_cls: 0.3233 d3.loss_bbox: 0.1308 d3.loss_iou: 0.2114 d4.loss_cls: 0.3198 d4.loss_bbox: 0.1300 d4.loss_iou: 0.2105 enc_loss_cls: 0.3629 enc_loss_bbox: 0.1696 enc_loss_iou: 0.2594 dn_loss_cls: 0.1106 dn_loss_bbox: 0.2092 dn_loss_iou: 0.2122 d0.dn_loss_cls: 0.1935 d0.dn_loss_bbox: 0.3644 d0.dn_loss_iou: 0.3496 d1.dn_loss_cls: 0.1459 d1.dn_loss_bbox: 0.2452 d1.dn_loss_iou: 0.2439 d2.dn_loss_cls: 0.1232 d2.dn_loss_bbox: 0.2198 d2.dn_loss_iou: 0.2208 d3.dn_loss_cls: 0.1141 d3.dn_loss_bbox: 0.2111 d3.dn_loss_iou: 0.2148 d4.dn_loss_cls: 
0.1107 d4.dn_loss_bbox: 0.2092 d4.dn_loss_iou: 0.2122 d1.loss_lmm_region: 0.1469 loss_lmm_image: 0.8820 2024/11/11 11:56:17 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 11:56:17 - mmengine - INFO - Iter(train) [ 45000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:26:54 time: 2.0022 data_time: 0.0174 memory: 34404 grad_norm: 36.8948 loss: 9.7200 loss_cls: 0.2939 loss_bbox: 0.1214 loss_iou: 0.2418 d0.loss_cls: 0.3349 d0.loss_bbox: 0.1344 d0.loss_iou: 0.2566 d1.loss_cls: 0.3044 d1.loss_bbox: 0.1307 d1.loss_iou: 0.2473 d2.loss_cls: 0.2967 d2.loss_bbox: 0.1256 d2.loss_iou: 0.2447 d3.loss_cls: 0.2948 d3.loss_bbox: 0.1230 d3.loss_iou: 0.2424 d4.loss_cls: 0.2909 d4.loss_bbox: 0.1243 d4.loss_iou: 0.2440 enc_loss_cls: 0.3290 enc_loss_bbox: 0.1556 enc_loss_iou: 0.2832 dn_loss_cls: 0.1640 dn_loss_bbox: 0.1644 dn_loss_iou: 0.2304 d0.dn_loss_cls: 0.2408 d0.dn_loss_bbox: 0.3043 d0.dn_loss_iou: 0.3721 d1.dn_loss_cls: 0.1925 d1.dn_loss_bbox: 0.1957 d1.dn_loss_iou: 0.2607 d2.dn_loss_cls: 0.1751 d2.dn_loss_bbox: 0.1741 d2.dn_loss_iou: 0.2394 d3.dn_loss_cls: 0.1677 d3.dn_loss_bbox: 0.1663 d3.dn_loss_iou: 0.2326 d4.dn_loss_cls: 0.1641 d4.dn_loss_bbox: 0.1645 d4.dn_loss_iou: 0.2306 d1.loss_lmm_region: 0.1704 loss_lmm_image: 0.8910 2024/11/11 11:59:35 - mmengine - INFO - Iter(train) [ 45100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:23:28 time: 1.9632 data_time: 0.0175 memory: 34347 grad_norm: 33.2293 loss: 10.0765 loss_cls: 0.3322 loss_bbox: 0.1434 loss_iou: 0.2464 d0.loss_cls: 0.3734 d0.loss_bbox: 0.1564 d0.loss_iou: 0.2657 d1.loss_cls: 0.3470 d1.loss_bbox: 0.1488 d1.loss_iou: 0.2557 d2.loss_cls: 0.3398 d2.loss_bbox: 0.1433 d2.loss_iou: 0.2479 d3.loss_cls: 0.3361 d3.loss_bbox: 0.1436 d3.loss_iou: 0.2470 d4.loss_cls: 0.3351 d4.loss_bbox: 0.1429 d4.loss_iou: 0.2468 enc_loss_cls: 0.3761 enc_loss_bbox: 0.1684 enc_loss_iou: 0.2855 dn_loss_cls: 0.1456 dn_loss_bbox: 0.1710 dn_loss_iou: 0.2189 d0.dn_loss_cls: 0.2219 d0.dn_loss_bbox: 0.3099 d0.dn_loss_iou: 0.3612 d1.dn_loss_cls: 0.1766 d1.dn_loss_bbox: 0.2025 d1.dn_loss_iou: 0.2517 d2.dn_loss_cls: 0.1614 d2.dn_loss_bbox: 0.1824 d2.dn_loss_iou: 0.2293 d3.dn_loss_cls: 0.1533 d3.dn_loss_bbox: 0.1729 d3.dn_loss_iou: 0.2216 d4.dn_loss_cls: 0.1485 d4.dn_loss_bbox: 0.1709 d4.dn_loss_iou: 0.2189 d1.loss_lmm_region: 0.1714 loss_lmm_image: 0.9053 2024/11/11 12:02:54 - mmengine - INFO - Iter(train) [ 45200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:20:03 time: 1.9954 data_time: 0.0175 memory: 34258 grad_norm: 31.6481 loss: 8.9747 loss_cls: 0.3061 loss_bbox: 0.1141 loss_iou: 0.2380 d0.loss_cls: 0.3518 d0.loss_bbox: 0.1252 d0.loss_iou: 0.2496 d1.loss_cls: 0.3210 d1.loss_bbox: 0.1163 d1.loss_iou: 0.2414 d2.loss_cls: 0.3151 d2.loss_bbox: 0.1144 d2.loss_iou: 0.2375 d3.loss_cls: 0.3061 d3.loss_bbox: 0.1172 d3.loss_iou: 0.2382 d4.loss_cls: 0.3044 d4.loss_bbox: 0.1179 d4.loss_iou: 0.2400 enc_loss_cls: 0.3612 enc_loss_bbox: 0.1404 enc_loss_iou: 0.2740 dn_loss_cls: 0.0989 dn_loss_bbox: 0.1377 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1734 d0.dn_loss_bbox: 0.2549 d0.dn_loss_iou: 0.3372 d1.dn_loss_cls: 0.1310 d1.dn_loss_bbox: 0.1643 d1.dn_loss_iou: 0.2338 d2.dn_loss_cls: 0.1132 d2.dn_loss_bbox: 0.1459 d2.dn_loss_iou: 0.2126 d3.dn_loss_cls: 0.1043 d3.dn_loss_bbox: 0.1386 d3.dn_loss_iou: 0.2061 d4.dn_loss_cls: 0.1006 d4.dn_loss_bbox: 0.1376 d4.dn_loss_iou: 0.2035 d1.loss_lmm_region: 0.1402 loss_lmm_image: 0.9076 2024/11/11 12:06:12 - mmengine - INFO - Iter(train) [ 45300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 
days, 10:16:39 time: 1.9648 data_time: 0.0175 memory: 34004 grad_norm: 28.4132 loss: 9.3074 loss_cls: 0.2721 loss_bbox: 0.1392 loss_iou: 0.2334 d0.loss_cls: 0.3159 d0.loss_bbox: 0.1475 d0.loss_iou: 0.2484 d1.loss_cls: 0.2853 d1.loss_bbox: 0.1421 d1.loss_iou: 0.2415 d2.loss_cls: 0.2819 d2.loss_bbox: 0.1404 d2.loss_iou: 0.2361 d3.loss_cls: 0.2770 d3.loss_bbox: 0.1389 d3.loss_iou: 0.2333 d4.loss_cls: 0.2740 d4.loss_bbox: 0.1393 d4.loss_iou: 0.2338 enc_loss_cls: 0.3192 enc_loss_bbox: 0.1550 enc_loss_iou: 0.2647 dn_loss_cls: 0.1109 dn_loss_bbox: 0.1767 dn_loss_iou: 0.2122 d0.dn_loss_cls: 0.1894 d0.dn_loss_bbox: 0.3315 d0.dn_loss_iou: 0.3550 d1.dn_loss_cls: 0.1418 d1.dn_loss_bbox: 0.2105 d1.dn_loss_iou: 0.2447 d2.dn_loss_cls: 0.1235 d2.dn_loss_bbox: 0.1885 d2.dn_loss_iou: 0.2228 d3.dn_loss_cls: 0.1177 d3.dn_loss_bbox: 0.1788 d3.dn_loss_iou: 0.2147 d4.dn_loss_cls: 0.1122 d4.dn_loss_bbox: 0.1767 d4.dn_loss_iou: 0.2122 d1.loss_lmm_region: 0.1551 loss_lmm_image: 0.9137 2024/11/11 12:09:32 - mmengine - INFO - Iter(train) [ 45400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:13:16 time: 2.0088 data_time: 0.0175 memory: 35069 grad_norm: 30.0710 loss: 10.7325 loss_cls: 0.3565 loss_bbox: 0.1609 loss_iou: 0.2968 d0.loss_cls: 0.4150 d0.loss_bbox: 0.1736 d0.loss_iou: 0.3143 d1.loss_cls: 0.3852 d1.loss_bbox: 0.1616 d1.loss_iou: 0.3020 d2.loss_cls: 0.3721 d2.loss_bbox: 0.1561 d2.loss_iou: 0.2963 d3.loss_cls: 0.3641 d3.loss_bbox: 0.1571 d3.loss_iou: 0.2928 d4.loss_cls: 0.3577 d4.loss_bbox: 0.1603 d4.loss_iou: 0.2927 enc_loss_cls: 0.4201 enc_loss_bbox: 0.1876 enc_loss_iou: 0.3415 dn_loss_cls: 0.1249 dn_loss_bbox: 0.1805 dn_loss_iou: 0.2272 d0.dn_loss_cls: 0.2060 d0.dn_loss_bbox: 0.3351 d0.dn_loss_iou: 0.3797 d1.dn_loss_cls: 0.1545 d1.dn_loss_bbox: 0.2136 d1.dn_loss_iou: 0.2602 d2.dn_loss_cls: 0.1372 d2.dn_loss_bbox: 0.1914 d2.dn_loss_iou: 0.2379 d3.dn_loss_cls: 0.1318 d3.dn_loss_bbox: 0.1830 d3.dn_loss_iou: 0.2300 d4.dn_loss_cls: 0.1257 d4.dn_loss_bbox: 0.1805 d4.dn_loss_iou: 0.2271 d1.loss_lmm_region: 0.1693 loss_lmm_image: 0.8725 2024/11/11 12:12:51 - mmengine - INFO - Iter(train) [ 45500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:09:53 time: 2.0024 data_time: 0.0176 memory: 34234 grad_norm: 25.2233 loss: 9.9957 loss_cls: 0.3045 loss_bbox: 0.1471 loss_iou: 0.2773 d0.loss_cls: 0.3449 d0.loss_bbox: 0.1676 d0.loss_iou: 0.2954 d1.loss_cls: 0.3261 d1.loss_bbox: 0.1538 d1.loss_iou: 0.2860 d2.loss_cls: 0.3157 d2.loss_bbox: 0.1495 d2.loss_iou: 0.2812 d3.loss_cls: 0.3112 d3.loss_bbox: 0.1468 d3.loss_iou: 0.2785 d4.loss_cls: 0.3074 d4.loss_bbox: 0.1468 d4.loss_iou: 0.2782 enc_loss_cls: 0.3537 enc_loss_bbox: 0.1749 enc_loss_iou: 0.3169 dn_loss_cls: 0.1013 dn_loss_bbox: 0.1678 dn_loss_iou: 0.2439 d0.dn_loss_cls: 0.1795 d0.dn_loss_bbox: 0.3349 d0.dn_loss_iou: 0.4023 d1.dn_loss_cls: 0.1306 d1.dn_loss_bbox: 0.2057 d1.dn_loss_iou: 0.2791 d2.dn_loss_cls: 0.1132 d2.dn_loss_bbox: 0.1805 d2.dn_loss_iou: 0.2554 d3.dn_loss_cls: 0.1045 d3.dn_loss_bbox: 0.1709 d3.dn_loss_iou: 0.2469 d4.dn_loss_cls: 0.1026 d4.dn_loss_bbox: 0.1680 d4.dn_loss_iou: 0.2437 d1.loss_lmm_region: 0.1482 loss_lmm_image: 0.8528 2024/11/11 12:16:10 - mmengine - INFO - Iter(train) [ 45600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:06:29 time: 1.9529 data_time: 0.0177 memory: 34883 grad_norm: 40.1071 loss: 11.4065 loss_cls: 0.3582 loss_bbox: 0.1909 loss_iou: 0.3012 d0.loss_cls: 0.4061 d0.loss_bbox: 0.1941 d0.loss_iou: 0.3110 d1.loss_cls: 0.3840 d1.loss_bbox: 0.1902 d1.loss_iou: 0.3046 d2.loss_cls: 0.3773 d2.loss_bbox: 
0.1864 d2.loss_iou: 0.3009 d3.loss_cls: 0.3677 d3.loss_bbox: 0.1849 d3.loss_iou: 0.2982 d4.loss_cls: 0.3626 d4.loss_bbox: 0.1895 d4.loss_iou: 0.2989 enc_loss_cls: 0.4118 enc_loss_bbox: 0.2127 enc_loss_iou: 0.3375 dn_loss_cls: 0.1348 dn_loss_bbox: 0.2059 dn_loss_iou: 0.2589 d0.dn_loss_cls: 0.2267 d0.dn_loss_bbox: 0.3754 d0.dn_loss_iou: 0.4152 d1.dn_loss_cls: 0.1712 d1.dn_loss_bbox: 0.2399 d1.dn_loss_iou: 0.2921 d2.dn_loss_cls: 0.1493 d2.dn_loss_bbox: 0.2180 d2.dn_loss_iou: 0.2700 d3.dn_loss_cls: 0.1433 d3.dn_loss_bbox: 0.2080 d3.dn_loss_iou: 0.2610 d4.dn_loss_cls: 0.1352 d4.dn_loss_bbox: 0.2059 d4.dn_loss_iou: 0.2587 d1.loss_lmm_region: 0.1833 loss_lmm_image: 0.8853 2024/11/11 12:19:27 - mmengine - INFO - Iter(train) [ 45700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 10:03:01 time: 1.9484 data_time: 0.0174 memory: 32131 grad_norm: 32.5276 loss: 9.0106 loss_cls: 0.2653 loss_bbox: 0.1393 loss_iou: 0.2412 d0.loss_cls: 0.3222 d0.loss_bbox: 0.1463 d0.loss_iou: 0.2512 d1.loss_cls: 0.3014 d1.loss_bbox: 0.1372 d1.loss_iou: 0.2406 d2.loss_cls: 0.2879 d2.loss_bbox: 0.1343 d2.loss_iou: 0.2398 d3.loss_cls: 0.2746 d3.loss_bbox: 0.1327 d3.loss_iou: 0.2383 d4.loss_cls: 0.2687 d4.loss_bbox: 0.1370 d4.loss_iou: 0.2384 enc_loss_cls: 0.3295 enc_loss_bbox: 0.1609 enc_loss_iou: 0.2740 dn_loss_cls: 0.0918 dn_loss_bbox: 0.1694 dn_loss_iou: 0.1991 d0.dn_loss_cls: 0.1699 d0.dn_loss_bbox: 0.3206 d0.dn_loss_iou: 0.3355 d1.dn_loss_cls: 0.1257 d1.dn_loss_bbox: 0.1987 d1.dn_loss_iou: 0.2284 d2.dn_loss_cls: 0.1076 d2.dn_loss_bbox: 0.1763 d2.dn_loss_iou: 0.2073 d3.dn_loss_cls: 0.0973 d3.dn_loss_bbox: 0.1711 d3.dn_loss_iou: 0.2017 d4.dn_loss_cls: 0.0912 d4.dn_loss_bbox: 0.1694 d4.dn_loss_iou: 0.1991 d1.loss_lmm_region: 0.1459 loss_lmm_image: 0.8438 2024/11/11 12:22:44 - mmengine - INFO - Iter(train) [ 45800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:59:34 time: 1.9725 data_time: 0.0174 memory: 35657 grad_norm: 31.4862 loss: 11.2008 loss_cls: 0.3619 loss_bbox: 0.1468 loss_iou: 0.2376 d0.loss_cls: 0.4188 d0.loss_bbox: 0.1516 d0.loss_iou: 0.2505 d1.loss_cls: 0.3824 d1.loss_bbox: 0.1526 d1.loss_iou: 0.2485 d2.loss_cls: 0.3745 d2.loss_bbox: 0.1474 d2.loss_iou: 0.2390 d3.loss_cls: 0.3705 d3.loss_bbox: 0.1450 d3.loss_iou: 0.2370 d4.loss_cls: 0.3658 d4.loss_bbox: 0.1451 d4.loss_iou: 0.2353 enc_loss_cls: 0.4161 enc_loss_bbox: 0.1671 enc_loss_iou: 0.2778 dn_loss_cls: 0.2288 dn_loss_bbox: 0.2172 dn_loss_iou: 0.2350 d0.dn_loss_cls: 0.3252 d0.dn_loss_bbox: 0.3671 d0.dn_loss_iou: 0.3776 d1.dn_loss_cls: 0.2813 d1.dn_loss_bbox: 0.2481 d1.dn_loss_iou: 0.2673 d2.dn_loss_cls: 0.2524 d2.dn_loss_bbox: 0.2279 d2.dn_loss_iou: 0.2455 d3.dn_loss_cls: 0.2362 d3.dn_loss_bbox: 0.2194 d3.dn_loss_iou: 0.2381 d4.dn_loss_cls: 0.2295 d4.dn_loss_bbox: 0.2173 d4.dn_loss_iou: 0.2352 d1.loss_lmm_region: 0.1883 loss_lmm_image: 0.8922 2024/11/11 12:26:01 - mmengine - INFO - Iter(train) [ 45900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:56:05 time: 1.9587 data_time: 0.0175 memory: 33814 grad_norm: 34.1035 loss: 9.6919 loss_cls: 0.2805 loss_bbox: 0.1406 loss_iou: 0.2366 d0.loss_cls: 0.3188 d0.loss_bbox: 0.1570 d0.loss_iou: 0.2530 d1.loss_cls: 0.3001 d1.loss_bbox: 0.1471 d1.loss_iou: 0.2412 d2.loss_cls: 0.2883 d2.loss_bbox: 0.1451 d2.loss_iou: 0.2392 d3.loss_cls: 0.2832 d3.loss_bbox: 0.1416 d3.loss_iou: 0.2380 d4.loss_cls: 0.2790 d4.loss_bbox: 0.1407 d4.loss_iou: 0.2359 enc_loss_cls: 0.3203 enc_loss_bbox: 0.1709 enc_loss_iou: 0.2715 dn_loss_cls: 0.1325 dn_loss_bbox: 0.1917 dn_loss_iou: 0.2179 d0.dn_loss_cls: 0.2190 
d0.dn_loss_bbox: 0.3592 d0.dn_loss_iou: 0.3761 d1.dn_loss_cls: 0.1695 d1.dn_loss_bbox: 0.2259 d1.dn_loss_iou: 0.2505 d2.dn_loss_cls: 0.1459 d2.dn_loss_bbox: 0.2027 d2.dn_loss_iou: 0.2293 d3.dn_loss_cls: 0.1384 d3.dn_loss_bbox: 0.1940 d3.dn_loss_iou: 0.2208 d4.dn_loss_cls: 0.1319 d4.dn_loss_bbox: 0.1918 d4.dn_loss_iou: 0.2178 d1.loss_lmm_region: 0.1788 loss_lmm_image: 0.8696 2024/11/11 12:29:18 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 12:29:18 - mmengine - INFO - Iter(train) [ 46000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:52:38 time: 2.0017 data_time: 0.0174 memory: 33029 grad_norm: 30.3704 loss: 10.7357 loss_cls: 0.3432 loss_bbox: 0.1721 loss_iou: 0.3037 d0.loss_cls: 0.4034 d0.loss_bbox: 0.1796 d0.loss_iou: 0.3176 d1.loss_cls: 0.3762 d1.loss_bbox: 0.1676 d1.loss_iou: 0.3048 d2.loss_cls: 0.3637 d2.loss_bbox: 0.1695 d2.loss_iou: 0.3074 d3.loss_cls: 0.3535 d3.loss_bbox: 0.1680 d3.loss_iou: 0.3055 d4.loss_cls: 0.3471 d4.loss_bbox: 0.1704 d4.loss_iou: 0.3057 enc_loss_cls: 0.3982 enc_loss_bbox: 0.1951 enc_loss_iou: 0.3435 dn_loss_cls: 0.1088 dn_loss_bbox: 0.1825 dn_loss_iou: 0.2369 d0.dn_loss_cls: 0.1909 d0.dn_loss_bbox: 0.3430 d0.dn_loss_iou: 0.3860 d1.dn_loss_cls: 0.1424 d1.dn_loss_bbox: 0.2218 d1.dn_loss_iou: 0.2712 d2.dn_loss_cls: 0.1190 d2.dn_loss_bbox: 0.1939 d2.dn_loss_iou: 0.2464 d3.dn_loss_cls: 0.1126 d3.dn_loss_bbox: 0.1847 d3.dn_loss_iou: 0.2397 d4.dn_loss_cls: 0.1109 d4.dn_loss_bbox: 0.1825 d4.dn_loss_iou: 0.2369 d1.loss_lmm_region: 0.1358 loss_lmm_image: 0.8939 2024/11/11 12:32:35 - mmengine - INFO - Iter(train) [ 46100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:49:10 time: 1.9654 data_time: 0.0176 memory: 33842 grad_norm: 27.4484 loss: 9.2149 loss_cls: 0.2651 loss_bbox: 0.1368 loss_iou: 0.2456 d0.loss_cls: 0.3141 d0.loss_bbox: 0.1536 d0.loss_iou: 0.2614 d1.loss_cls: 0.2854 d1.loss_bbox: 0.1471 d1.loss_iou: 0.2523 d2.loss_cls: 0.2750 d2.loss_bbox: 0.1409 d2.loss_iou: 0.2471 d3.loss_cls: 0.2693 d3.loss_bbox: 0.1387 d3.loss_iou: 0.2449 d4.loss_cls: 0.2693 d4.loss_bbox: 0.1362 d4.loss_iou: 0.2439 enc_loss_cls: 0.3219 enc_loss_bbox: 0.1652 enc_loss_iou: 0.2781 dn_loss_cls: 0.1077 dn_loss_bbox: 0.1602 dn_loss_iou: 0.2128 d0.dn_loss_cls: 0.1791 d0.dn_loss_bbox: 0.2999 d0.dn_loss_iou: 0.3541 d1.dn_loss_cls: 0.1341 d1.dn_loss_bbox: 0.1924 d1.dn_loss_iou: 0.2450 d2.dn_loss_cls: 0.1186 d2.dn_loss_bbox: 0.1698 d2.dn_loss_iou: 0.2212 d3.dn_loss_cls: 0.1111 d3.dn_loss_bbox: 0.1623 d3.dn_loss_iou: 0.2154 d4.dn_loss_cls: 0.1076 d4.dn_loss_bbox: 0.1603 d4.dn_loss_iou: 0.2129 d1.loss_lmm_region: 0.1371 loss_lmm_image: 0.9213 2024/11/11 12:35:54 - mmengine - INFO - Iter(train) [ 46200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:45:47 time: 1.9973 data_time: 0.0176 memory: 34680 grad_norm: 32.2877 loss: 9.7025 loss_cls: 0.2748 loss_bbox: 0.1482 loss_iou: 0.2571 d0.loss_cls: 0.3186 d0.loss_bbox: 0.1603 d0.loss_iou: 0.2703 d1.loss_cls: 0.2914 d1.loss_bbox: 0.1561 d1.loss_iou: 0.2664 d2.loss_cls: 0.2843 d2.loss_bbox: 0.1485 d2.loss_iou: 0.2567 d3.loss_cls: 0.2760 d3.loss_bbox: 0.1482 d3.loss_iou: 0.2569 d4.loss_cls: 0.2760 d4.loss_bbox: 0.1474 d4.loss_iou: 0.2581 enc_loss_cls: 0.3184 enc_loss_bbox: 0.1731 enc_loss_iou: 0.2900 dn_loss_cls: 0.1005 dn_loss_bbox: 0.1968 dn_loss_iou: 0.2373 d0.dn_loss_cls: 0.1806 d0.dn_loss_bbox: 0.3468 d0.dn_loss_iou: 0.3823 d1.dn_loss_cls: 0.1375 d1.dn_loss_bbox: 0.2288 d1.dn_loss_iou: 0.2697 d2.dn_loss_cls: 0.1162 d2.dn_loss_bbox: 0.2061 d2.dn_loss_iou: 0.2466 d3.dn_loss_cls: 0.1063 
d3.dn_loss_bbox: 0.1979 d3.dn_loss_iou: 0.2395 d4.dn_loss_cls: 0.1021 d4.dn_loss_bbox: 0.1968 d4.dn_loss_iou: 0.2371 d1.loss_lmm_region: 0.1475 loss_lmm_image: 0.8492 2024/11/11 12:39:14 - mmengine - INFO - Iter(train) [ 46300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:42:25 time: 2.0128 data_time: 0.0173 memory: 33792 grad_norm: 33.0723 loss: 10.1903 loss_cls: 0.3445 loss_bbox: 0.1398 loss_iou: 0.2372 d0.loss_cls: 0.3829 d0.loss_bbox: 0.1540 d0.loss_iou: 0.2542 d1.loss_cls: 0.3534 d1.loss_bbox: 0.1522 d1.loss_iou: 0.2487 d2.loss_cls: 0.3489 d2.loss_bbox: 0.1435 d2.loss_iou: 0.2420 d3.loss_cls: 0.3428 d3.loss_bbox: 0.1424 d3.loss_iou: 0.2405 d4.loss_cls: 0.3440 d4.loss_bbox: 0.1414 d4.loss_iou: 0.2373 enc_loss_cls: 0.3772 enc_loss_bbox: 0.1631 enc_loss_iou: 0.2716 dn_loss_cls: 0.1519 dn_loss_bbox: 0.1951 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.2246 d0.dn_loss_bbox: 0.3394 d0.dn_loss_iou: 0.3599 d1.dn_loss_cls: 0.1799 d1.dn_loss_bbox: 0.2254 d1.dn_loss_iou: 0.2534 d2.dn_loss_cls: 0.1621 d2.dn_loss_bbox: 0.2069 d2.dn_loss_iou: 0.2342 d3.dn_loss_cls: 0.1562 d3.dn_loss_bbox: 0.1969 d3.dn_loss_iou: 0.2266 d4.dn_loss_cls: 0.1503 d4.dn_loss_bbox: 0.1951 d4.dn_loss_iou: 0.2247 d1.loss_lmm_region: 0.1622 loss_lmm_image: 0.8591 2024/11/11 12:42:33 - mmengine - INFO - Iter(train) [ 46400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:39:03 time: 1.9954 data_time: 0.0172 memory: 34964 grad_norm: 35.1188 loss: 10.3775 loss_cls: 0.3547 loss_bbox: 0.1424 loss_iou: 0.2553 d0.loss_cls: 0.3979 d0.loss_bbox: 0.1549 d0.loss_iou: 0.2728 d1.loss_cls: 0.3629 d1.loss_bbox: 0.1510 d1.loss_iou: 0.2655 d2.loss_cls: 0.3609 d2.loss_bbox: 0.1438 d2.loss_iou: 0.2572 d3.loss_cls: 0.3602 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2545 d4.loss_cls: 0.3567 d4.loss_bbox: 0.1404 d4.loss_iou: 0.2531 enc_loss_cls: 0.3946 enc_loss_bbox: 0.1724 enc_loss_iou: 0.2972 dn_loss_cls: 0.1893 dn_loss_bbox: 0.1583 dn_loss_iou: 0.2164 d0.dn_loss_cls: 0.2453 d0.dn_loss_bbox: 0.2890 d0.dn_loss_iou: 0.3584 d1.dn_loss_cls: 0.2046 d1.dn_loss_bbox: 0.1881 d1.dn_loss_iou: 0.2480 d2.dn_loss_cls: 0.1937 d2.dn_loss_bbox: 0.1663 d2.dn_loss_iou: 0.2256 d3.dn_loss_cls: 0.1881 d3.dn_loss_bbox: 0.1595 d3.dn_loss_iou: 0.2193 d4.dn_loss_cls: 0.1908 d4.dn_loss_bbox: 0.1584 d4.dn_loss_iou: 0.2164 d1.loss_lmm_region: 0.1596 loss_lmm_image: 0.9136 2024/11/11 12:45:53 - mmengine - INFO - Iter(train) [ 46500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:35:41 time: 2.0160 data_time: 0.0173 memory: 32633 grad_norm: 31.3266 loss: 10.1391 loss_cls: 0.2735 loss_bbox: 0.1599 loss_iou: 0.2358 d0.loss_cls: 0.3196 d0.loss_bbox: 0.1618 d0.loss_iou: 0.2401 d1.loss_cls: 0.2992 d1.loss_bbox: 0.1546 d1.loss_iou: 0.2352 d2.loss_cls: 0.2857 d2.loss_bbox: 0.1589 d2.loss_iou: 0.2359 d3.loss_cls: 0.2782 d3.loss_bbox: 0.1580 d3.loss_iou: 0.2349 d4.loss_cls: 0.2731 d4.loss_bbox: 0.1600 d4.loss_iou: 0.2366 enc_loss_cls: 0.3250 enc_loss_bbox: 0.1796 enc_loss_iou: 0.2640 dn_loss_cls: 0.1436 dn_loss_bbox: 0.2208 dn_loss_iou: 0.2463 d0.dn_loss_cls: 0.2251 d0.dn_loss_bbox: 0.4028 d0.dn_loss_iou: 0.4009 d1.dn_loss_cls: 0.1758 d1.dn_loss_bbox: 0.2584 d1.dn_loss_iou: 0.2782 d2.dn_loss_cls: 0.1554 d2.dn_loss_bbox: 0.2298 d2.dn_loss_iou: 0.2545 d3.dn_loss_cls: 0.1493 d3.dn_loss_bbox: 0.2230 d3.dn_loss_iou: 0.2486 d4.dn_loss_cls: 0.1450 d4.dn_loss_bbox: 0.2208 d4.dn_loss_iou: 0.2461 d1.loss_lmm_region: 0.1610 loss_lmm_image: 0.8841 2024/11/11 12:49:09 - mmengine - INFO - Iter(train) [ 46600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:32:11 time: 
1.9590 data_time: 0.0174 memory: 34761 grad_norm: 36.6292 loss: 9.5648 loss_cls: 0.2792 loss_bbox: 0.1571 loss_iou: 0.2550 d0.loss_cls: 0.3189 d0.loss_bbox: 0.1714 d0.loss_iou: 0.2662 d1.loss_cls: 0.3010 d1.loss_bbox: 0.1632 d1.loss_iou: 0.2594 d2.loss_cls: 0.2933 d2.loss_bbox: 0.1590 d2.loss_iou: 0.2533 d3.loss_cls: 0.2862 d3.loss_bbox: 0.1571 d3.loss_iou: 0.2530 d4.loss_cls: 0.2772 d4.loss_bbox: 0.1582 d4.loss_iou: 0.2553 enc_loss_cls: 0.3211 enc_loss_bbox: 0.1868 enc_loss_iou: 0.2903 dn_loss_cls: 0.1154 dn_loss_bbox: 0.1659 dn_loss_iou: 0.2133 d0.dn_loss_cls: 0.1851 d0.dn_loss_bbox: 0.3218 d0.dn_loss_iou: 0.3568 d1.dn_loss_cls: 0.1387 d1.dn_loss_bbox: 0.2031 d1.dn_loss_iou: 0.2453 d2.dn_loss_cls: 0.1231 d2.dn_loss_bbox: 0.1758 d2.dn_loss_iou: 0.2219 d3.dn_loss_cls: 0.1192 d3.dn_loss_bbox: 0.1672 d3.dn_loss_iou: 0.2151 d4.dn_loss_cls: 0.1152 d4.dn_loss_bbox: 0.1658 d4.dn_loss_iou: 0.2134 d1.loss_lmm_region: 0.1521 loss_lmm_image: 0.8884 2024/11/11 12:52:28 - mmengine - INFO - Iter(train) [ 46700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:28:48 time: 1.9851 data_time: 0.0172 memory: 34007 grad_norm: 31.8909 loss: 9.8802 loss_cls: 0.2960 loss_bbox: 0.1346 loss_iou: 0.2484 d0.loss_cls: 0.3433 d0.loss_bbox: 0.1441 d0.loss_iou: 0.2627 d1.loss_cls: 0.3154 d1.loss_bbox: 0.1348 d1.loss_iou: 0.2522 d2.loss_cls: 0.3088 d2.loss_bbox: 0.1305 d2.loss_iou: 0.2477 d3.loss_cls: 0.3036 d3.loss_bbox: 0.1329 d3.loss_iou: 0.2474 d4.loss_cls: 0.2990 d4.loss_bbox: 0.1335 d4.loss_iou: 0.2478 enc_loss_cls: 0.3459 enc_loss_bbox: 0.1504 enc_loss_iou: 0.2805 dn_loss_cls: 0.1276 dn_loss_bbox: 0.1923 dn_loss_iou: 0.2328 d0.dn_loss_cls: 0.2136 d0.dn_loss_bbox: 0.3441 d0.dn_loss_iou: 0.3823 d1.dn_loss_cls: 0.1599 d1.dn_loss_bbox: 0.2252 d1.dn_loss_iou: 0.2660 d2.dn_loss_cls: 0.1394 d2.dn_loss_bbox: 0.2023 d2.dn_loss_iou: 0.2430 d3.dn_loss_cls: 0.1312 d3.dn_loss_bbox: 0.1949 d3.dn_loss_iou: 0.2350 d4.dn_loss_cls: 0.1260 d4.dn_loss_bbox: 0.1925 d4.dn_loss_iou: 0.2330 d1.loss_lmm_region: 0.1721 loss_lmm_image: 0.9075 2024/11/11 12:55:48 - mmengine - INFO - Iter(train) [ 46800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:25:27 time: 2.0085 data_time: 0.0173 memory: 34304 grad_norm: 28.7805 loss: 9.8204 loss_cls: 0.3140 loss_bbox: 0.1443 loss_iou: 0.2661 d0.loss_cls: 0.3664 d0.loss_bbox: 0.1632 d0.loss_iou: 0.2871 d1.loss_cls: 0.3355 d1.loss_bbox: 0.1536 d1.loss_iou: 0.2769 d2.loss_cls: 0.3237 d2.loss_bbox: 0.1492 d2.loss_iou: 0.2714 d3.loss_cls: 0.3182 d3.loss_bbox: 0.1460 d3.loss_iou: 0.2680 d4.loss_cls: 0.3146 d4.loss_bbox: 0.1442 d4.loss_iou: 0.2652 enc_loss_cls: 0.3620 enc_loss_bbox: 0.1763 enc_loss_iou: 0.3087 dn_loss_cls: 0.1177 dn_loss_bbox: 0.1639 dn_loss_iou: 0.2167 d0.dn_loss_cls: 0.1841 d0.dn_loss_bbox: 0.2916 d0.dn_loss_iou: 0.3464 d1.dn_loss_cls: 0.1421 d1.dn_loss_bbox: 0.1929 d1.dn_loss_iou: 0.2458 d2.dn_loss_cls: 0.1260 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2248 d3.dn_loss_cls: 0.1210 d3.dn_loss_bbox: 0.1657 d3.dn_loss_iou: 0.2192 d4.dn_loss_cls: 0.1184 d4.dn_loss_bbox: 0.1639 d4.dn_loss_iou: 0.2166 d1.loss_lmm_region: 0.1731 loss_lmm_image: 0.8636 2024/11/11 12:59:06 - mmengine - INFO - Iter(train) [ 46900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:22:01 time: 1.9822 data_time: 0.0171 memory: 32674 grad_norm: 35.3988 loss: 9.6160 loss_cls: 0.2821 loss_bbox: 0.1357 loss_iou: 0.2300 d0.loss_cls: 0.3369 d0.loss_bbox: 0.1500 d0.loss_iou: 0.2453 d1.loss_cls: 0.3098 d1.loss_bbox: 0.1363 d1.loss_iou: 0.2312 d2.loss_cls: 0.2930 d2.loss_bbox: 0.1383 d2.loss_iou: 0.2307 
d3.loss_cls: 0.2865 d3.loss_bbox: 0.1361 d3.loss_iou: 0.2301 d4.loss_cls: 0.2815 d4.loss_bbox: 0.1360 d4.loss_iou: 0.2299 enc_loss_cls: 0.3371 enc_loss_bbox: 0.1640 enc_loss_iou: 0.2657 dn_loss_cls: 0.1230 dn_loss_bbox: 0.1902 dn_loss_iou: 0.2262 d0.dn_loss_cls: 0.2098 d0.dn_loss_bbox: 0.3501 d0.dn_loss_iou: 0.3775 d1.dn_loss_cls: 0.1556 d1.dn_loss_bbox: 0.2268 d1.dn_loss_iou: 0.2614 d2.dn_loss_cls: 0.1355 d2.dn_loss_bbox: 0.2023 d2.dn_loss_iou: 0.2375 d3.dn_loss_cls: 0.1273 d3.dn_loss_bbox: 0.1939 d3.dn_loss_iou: 0.2294 d4.dn_loss_cls: 0.1234 d4.dn_loss_bbox: 0.1903 d4.dn_loss_iou: 0.2263 d1.loss_lmm_region: 0.1557 loss_lmm_image: 0.8875 2024/11/11 13:02:24 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 13:02:24 - mmengine - INFO - Iter(train) [ 47000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:18:36 time: 1.9695 data_time: 0.0172 memory: 34028 grad_norm: nan loss: 9.8340 loss_cls: 0.3271 loss_bbox: 0.1402 loss_iou: 0.2325 d0.loss_cls: 0.3700 d0.loss_bbox: 0.1467 d0.loss_iou: 0.2457 d1.loss_cls: 0.3388 d1.loss_bbox: 0.1412 d1.loss_iou: 0.2371 d2.loss_cls: 0.3333 d2.loss_bbox: 0.1424 d2.loss_iou: 0.2333 d3.loss_cls: 0.3303 d3.loss_bbox: 0.1414 d3.loss_iou: 0.2318 d4.loss_cls: 0.3267 d4.loss_bbox: 0.1414 d4.loss_iou: 0.2332 enc_loss_cls: 0.3639 enc_loss_bbox: 0.1601 enc_loss_iou: 0.2681 dn_loss_cls: 0.1653 dn_loss_bbox: 0.1686 dn_loss_iou: 0.2025 d0.dn_loss_cls: 0.2283 d0.dn_loss_bbox: 0.3149 d0.dn_loss_iou: 0.3505 d1.dn_loss_cls: 0.1855 d1.dn_loss_bbox: 0.2056 d1.dn_loss_iou: 0.2365 d2.dn_loss_cls: 0.1738 d2.dn_loss_bbox: 0.1803 d2.dn_loss_iou: 0.2138 d3.dn_loss_cls: 0.1660 d3.dn_loss_bbox: 0.1706 d3.dn_loss_iou: 0.2056 d4.dn_loss_cls: 0.1629 d4.dn_loss_bbox: 0.1685 d4.dn_loss_iou: 0.2026 d1.loss_lmm_region: 0.1648 loss_lmm_image: 0.8821 2024/11/11 13:05:42 - mmengine - INFO - Iter(train) [ 47100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:15:10 time: 1.9817 data_time: 0.0172 memory: 32403 grad_norm: 33.0205 loss: 9.6500 loss_cls: 0.2958 loss_bbox: 0.1397 loss_iou: 0.2327 d0.loss_cls: 0.3409 d0.loss_bbox: 0.1506 d0.loss_iou: 0.2435 d1.loss_cls: 0.3131 d1.loss_bbox: 0.1465 d1.loss_iou: 0.2388 d2.loss_cls: 0.3034 d2.loss_bbox: 0.1406 d2.loss_iou: 0.2343 d3.loss_cls: 0.3007 d3.loss_bbox: 0.1378 d3.loss_iou: 0.2307 d4.loss_cls: 0.2949 d4.loss_bbox: 0.1405 d4.loss_iou: 0.2332 enc_loss_cls: 0.3354 enc_loss_bbox: 0.1673 enc_loss_iou: 0.2627 dn_loss_cls: 0.1253 dn_loss_bbox: 0.1797 dn_loss_iou: 0.2238 d0.dn_loss_cls: 0.2078 d0.dn_loss_bbox: 0.3405 d0.dn_loss_iou: 0.3712 d1.dn_loss_cls: 0.1580 d1.dn_loss_bbox: 0.2171 d1.dn_loss_iou: 0.2562 d2.dn_loss_cls: 0.1382 d2.dn_loss_bbox: 0.1918 d2.dn_loss_iou: 0.2340 d3.dn_loss_cls: 0.1290 d3.dn_loss_bbox: 0.1823 d3.dn_loss_iou: 0.2263 d4.dn_loss_cls: 0.1256 d4.dn_loss_bbox: 0.1796 d4.dn_loss_iou: 0.2238 d1.loss_lmm_region: 0.1620 loss_lmm_image: 0.8947 2024/11/11 13:09:03 - mmengine - INFO - Iter(train) [ 47200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:11:52 time: 2.0101 data_time: 0.0173 memory: 33636 grad_norm: 33.8820 loss: 9.4016 loss_cls: 0.2603 loss_bbox: 0.1450 loss_iou: 0.2500 d0.loss_cls: 0.3055 d0.loss_bbox: 0.1512 d0.loss_iou: 0.2616 d1.loss_cls: 0.2866 d1.loss_bbox: 0.1433 d1.loss_iou: 0.2525 d2.loss_cls: 0.2742 d2.loss_bbox: 0.1430 d2.loss_iou: 0.2504 d3.loss_cls: 0.2635 d3.loss_bbox: 0.1424 d3.loss_iou: 0.2508 d4.loss_cls: 0.2612 d4.loss_bbox: 0.1446 d4.loss_iou: 0.2501 enc_loss_cls: 0.3109 enc_loss_bbox: 0.1650 enc_loss_iou: 0.2856 dn_loss_cls: 0.0901 dn_loss_bbox: 
0.1821 dn_loss_iou: 0.2314 d0.dn_loss_cls: 0.1754 d0.dn_loss_bbox: 0.3373 d0.dn_loss_iou: 0.3805 d1.dn_loss_cls: 0.1231 d1.dn_loss_bbox: 0.2178 d1.dn_loss_iou: 0.2635 d2.dn_loss_cls: 0.1026 d2.dn_loss_bbox: 0.1903 d2.dn_loss_iou: 0.2403 d3.dn_loss_cls: 0.0946 d3.dn_loss_bbox: 0.1839 d3.dn_loss_iou: 0.2341 d4.dn_loss_cls: 0.0905 d4.dn_loss_bbox: 0.1821 d4.dn_loss_iou: 0.2314 d1.loss_lmm_region: 0.1172 loss_lmm_image: 0.9354 2024/11/11 13:12:21 - mmengine - INFO - Iter(train) [ 47300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:08:27 time: 1.9580 data_time: 0.0172 memory: 35030 grad_norm: 28.2686 loss: 10.3510 loss_cls: 0.3385 loss_bbox: 0.1540 loss_iou: 0.2712 d0.loss_cls: 0.3755 d0.loss_bbox: 0.1779 d0.loss_iou: 0.2988 d1.loss_cls: 0.3468 d1.loss_bbox: 0.1708 d1.loss_iou: 0.2871 d2.loss_cls: 0.3454 d2.loss_bbox: 0.1575 d2.loss_iou: 0.2765 d3.loss_cls: 0.3447 d3.loss_bbox: 0.1557 d3.loss_iou: 0.2724 d4.loss_cls: 0.3424 d4.loss_bbox: 0.1521 d4.loss_iou: 0.2697 enc_loss_cls: 0.3850 enc_loss_bbox: 0.1852 enc_loss_iou: 0.3165 dn_loss_cls: 0.1172 dn_loss_bbox: 0.1828 dn_loss_iou: 0.2260 d0.dn_loss_cls: 0.2004 d0.dn_loss_bbox: 0.3491 d0.dn_loss_iou: 0.3830 d1.dn_loss_cls: 0.1494 d1.dn_loss_bbox: 0.2174 d1.dn_loss_iou: 0.2613 d2.dn_loss_cls: 0.1276 d2.dn_loss_bbox: 0.1935 d2.dn_loss_iou: 0.2368 d3.dn_loss_cls: 0.1241 d3.dn_loss_bbox: 0.1861 d3.dn_loss_iou: 0.2291 d4.dn_loss_cls: 0.1180 d4.dn_loss_bbox: 0.1827 d4.dn_loss_iou: 0.2259 d1.loss_lmm_region: 0.1650 loss_lmm_image: 0.8518 2024/11/11 13:15:42 - mmengine - INFO - Iter(train) [ 47400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:05:08 time: 1.9997 data_time: 0.0173 memory: 33367 grad_norm: 35.9635 loss: 10.6637 loss_cls: 0.3504 loss_bbox: 0.1647 loss_iou: 0.2715 d0.loss_cls: 0.4119 d0.loss_bbox: 0.1723 d0.loss_iou: 0.2812 d1.loss_cls: 0.3736 d1.loss_bbox: 0.1674 d1.loss_iou: 0.2774 d2.loss_cls: 0.3612 d2.loss_bbox: 0.1688 d2.loss_iou: 0.2769 d3.loss_cls: 0.3583 d3.loss_bbox: 0.1622 d3.loss_iou: 0.2696 d4.loss_cls: 0.3563 d4.loss_bbox: 0.1625 d4.loss_iou: 0.2697 enc_loss_cls: 0.4070 enc_loss_bbox: 0.1925 enc_loss_iou: 0.3114 dn_loss_cls: 0.1743 dn_loss_bbox: 0.1629 dn_loss_iou: 0.2210 d0.dn_loss_cls: 0.2473 d0.dn_loss_bbox: 0.3076 d0.dn_loss_iou: 0.3653 d1.dn_loss_cls: 0.2092 d1.dn_loss_bbox: 0.1948 d1.dn_loss_iou: 0.2517 d2.dn_loss_cls: 0.1866 d2.dn_loss_bbox: 0.1705 d2.dn_loss_iou: 0.2298 d3.dn_loss_cls: 0.1776 d3.dn_loss_bbox: 0.1642 d3.dn_loss_iou: 0.2229 d4.dn_loss_cls: 0.1754 d4.dn_loss_bbox: 0.1629 d4.dn_loss_iou: 0.2210 d1.loss_lmm_region: 0.1580 loss_lmm_image: 0.8939 2024/11/11 13:19:03 - mmengine - INFO - Iter(train) [ 47500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 9:01:49 time: 2.0112 data_time: 0.0172 memory: 36231 grad_norm: 30.0275 loss: 10.7285 loss_cls: 0.3298 loss_bbox: 0.1883 loss_iou: 0.3114 d0.loss_cls: 0.3854 d0.loss_bbox: 0.1904 d0.loss_iou: 0.3168 d1.loss_cls: 0.3711 d1.loss_bbox: 0.1663 d1.loss_iou: 0.3050 d2.loss_cls: 0.3469 d2.loss_bbox: 0.1742 d2.loss_iou: 0.3059 d3.loss_cls: 0.3374 d3.loss_bbox: 0.1840 d3.loss_iou: 0.3093 d4.loss_cls: 0.3285 d4.loss_bbox: 0.1884 d4.loss_iou: 0.3123 enc_loss_cls: 0.3990 enc_loss_bbox: 0.1877 enc_loss_iou: 0.3297 dn_loss_cls: 0.1433 dn_loss_bbox: 0.1700 dn_loss_iou: 0.2330 d0.dn_loss_cls: 0.2162 d0.dn_loss_bbox: 0.3106 d0.dn_loss_iou: 0.3711 d1.dn_loss_cls: 0.1688 d1.dn_loss_bbox: 0.2014 d1.dn_loss_iou: 0.2624 d2.dn_loss_cls: 0.1524 d2.dn_loss_bbox: 0.1793 d2.dn_loss_iou: 0.2418 d3.dn_loss_cls: 0.1451 d3.dn_loss_bbox: 0.1722 d3.dn_loss_iou: 
0.2354 d4.dn_loss_cls: 0.1392 d4.dn_loss_bbox: 0.1701 d4.dn_loss_iou: 0.2330 d1.loss_lmm_region: 0.1608 loss_lmm_image: 0.8550 2024/11/11 13:22:22 - mmengine - INFO - Iter(train) [ 47600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:58:25 time: 1.9596 data_time: 0.0171 memory: 31419 grad_norm: 26.0749 loss: 8.6975 loss_cls: 0.2477 loss_bbox: 0.1358 loss_iou: 0.2448 d0.loss_cls: 0.2883 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2486 d1.loss_cls: 0.2600 d1.loss_bbox: 0.1407 d1.loss_iou: 0.2473 d2.loss_cls: 0.2541 d2.loss_bbox: 0.1364 d2.loss_iou: 0.2452 d3.loss_cls: 0.2440 d3.loss_bbox: 0.1374 d3.loss_iou: 0.2463 d4.loss_cls: 0.2450 d4.loss_bbox: 0.1388 d4.loss_iou: 0.2453 enc_loss_cls: 0.2922 enc_loss_bbox: 0.1560 enc_loss_iou: 0.2709 dn_loss_cls: 0.0829 dn_loss_bbox: 0.1509 dn_loss_iou: 0.2203 d0.dn_loss_cls: 0.1563 d0.dn_loss_bbox: 0.2964 d0.dn_loss_iou: 0.3646 d1.dn_loss_cls: 0.1079 d1.dn_loss_bbox: 0.1789 d1.dn_loss_iou: 0.2511 d2.dn_loss_cls: 0.0913 d2.dn_loss_bbox: 0.1597 d2.dn_loss_iou: 0.2301 d3.dn_loss_cls: 0.0851 d3.dn_loss_bbox: 0.1521 d3.dn_loss_iou: 0.2225 d4.dn_loss_cls: 0.0830 d4.dn_loss_bbox: 0.1509 d4.dn_loss_iou: 0.2204 d1.loss_lmm_region: 0.1154 loss_lmm_image: 0.8071 2024/11/11 13:25:40 - mmengine - INFO - Iter(train) [ 47700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:54:59 time: 1.9913 data_time: 0.0172 memory: 35465 grad_norm: 32.6480 loss: 8.6378 loss_cls: 0.2970 loss_bbox: 0.1121 loss_iou: 0.2095 d0.loss_cls: 0.3410 d0.loss_bbox: 0.1304 d0.loss_iou: 0.2316 d1.loss_cls: 0.3207 d1.loss_bbox: 0.1177 d1.loss_iou: 0.2180 d2.loss_cls: 0.3157 d2.loss_bbox: 0.1128 d2.loss_iou: 0.2108 d3.loss_cls: 0.3134 d3.loss_bbox: 0.1130 d3.loss_iou: 0.2110 d4.loss_cls: 0.3072 d4.loss_bbox: 0.1127 d4.loss_iou: 0.2094 enc_loss_cls: 0.3506 enc_loss_bbox: 0.1342 enc_loss_iou: 0.2459 dn_loss_cls: 0.0875 dn_loss_bbox: 0.1407 dn_loss_iou: 0.1987 d0.dn_loss_cls: 0.1659 d0.dn_loss_bbox: 0.2714 d0.dn_loss_iou: 0.3327 d1.dn_loss_cls: 0.1142 d1.dn_loss_bbox: 0.1671 d1.dn_loss_iou: 0.2252 d2.dn_loss_cls: 0.0974 d2.dn_loss_bbox: 0.1463 d2.dn_loss_iou: 0.2054 d3.dn_loss_cls: 0.0912 d3.dn_loss_bbox: 0.1416 d3.dn_loss_iou: 0.2000 d4.dn_loss_cls: 0.0882 d4.dn_loss_bbox: 0.1407 d4.dn_loss_iou: 0.1986 d1.loss_lmm_region: 0.1557 loss_lmm_image: 0.8545 2024/11/11 13:28:58 - mmengine - INFO - Iter(train) [ 47800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:51:34 time: 1.9872 data_time: 0.0175 memory: 32255 grad_norm: 35.1859 loss: 11.3169 loss_cls: 0.3865 loss_bbox: 0.1671 loss_iou: 0.3137 d0.loss_cls: 0.4493 d0.loss_bbox: 0.1774 d0.loss_iou: 0.3290 d1.loss_cls: 0.4185 d1.loss_bbox: 0.1680 d1.loss_iou: 0.3173 d2.loss_cls: 0.3999 d2.loss_bbox: 0.1686 d2.loss_iou: 0.3160 d3.loss_cls: 0.3919 d3.loss_bbox: 0.1647 d3.loss_iou: 0.3112 d4.loss_cls: 0.3880 d4.loss_bbox: 0.1668 d4.loss_iou: 0.3115 enc_loss_cls: 0.4494 enc_loss_bbox: 0.1927 enc_loss_iou: 0.3476 dn_loss_cls: 0.1390 dn_loss_bbox: 0.1800 dn_loss_iou: 0.2393 d0.dn_loss_cls: 0.2201 d0.dn_loss_bbox: 0.3500 d0.dn_loss_iou: 0.3914 d1.dn_loss_cls: 0.1714 d1.dn_loss_bbox: 0.2212 d1.dn_loss_iou: 0.2734 d2.dn_loss_cls: 0.1530 d2.dn_loss_bbox: 0.1947 d2.dn_loss_iou: 0.2505 d3.dn_loss_cls: 0.1480 d3.dn_loss_bbox: 0.1832 d3.dn_loss_iou: 0.2414 d4.dn_loss_cls: 0.1406 d4.dn_loss_bbox: 0.1800 d4.dn_loss_iou: 0.2392 d1.loss_lmm_region: 0.1576 loss_lmm_image: 0.9075 2024/11/11 13:32:15 - mmengine - INFO - Iter(train) [ 47900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:48:07 time: 1.9908 data_time: 0.0173 memory: 34618 
grad_norm: 29.5722 loss: 10.3219 loss_cls: 0.3227 loss_bbox: 0.1712 loss_iou: 0.3314 d0.loss_cls: 0.3797 d0.loss_bbox: 0.1811 d0.loss_iou: 0.3463 d1.loss_cls: 0.3486 d1.loss_bbox: 0.1762 d1.loss_iou: 0.3336 d2.loss_cls: 0.3381 d2.loss_bbox: 0.1687 d2.loss_iou: 0.3272 d3.loss_cls: 0.3269 d3.loss_bbox: 0.1714 d3.loss_iou: 0.3299 d4.loss_cls: 0.3231 d4.loss_bbox: 0.1720 d4.loss_iou: 0.3311 enc_loss_cls: 0.3761 enc_loss_bbox: 0.1967 enc_loss_iou: 0.3745 dn_loss_cls: 0.0867 dn_loss_bbox: 0.1517 dn_loss_iou: 0.2297 d0.dn_loss_cls: 0.1526 d0.dn_loss_bbox: 0.2891 d0.dn_loss_iou: 0.3705 d1.dn_loss_cls: 0.1113 d1.dn_loss_bbox: 0.1796 d1.dn_loss_iou: 0.2594 d2.dn_loss_cls: 0.0945 d2.dn_loss_bbox: 0.1595 d2.dn_loss_iou: 0.2391 d3.dn_loss_cls: 0.0879 d3.dn_loss_bbox: 0.1537 d3.dn_loss_iou: 0.2321 d4.dn_loss_cls: 0.0866 d4.dn_loss_bbox: 0.1519 d4.dn_loss_iou: 0.2298 d1.loss_lmm_region: 0.1273 loss_lmm_image: 0.9022 2024/11/11 13:35:32 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 13:35:32 - mmengine - INFO - Iter(train) [ 48000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:44:41 time: 1.9671 data_time: 0.0173 memory: 33283 grad_norm: 29.2478 loss: 9.9664 loss_cls: 0.3112 loss_bbox: 0.1603 loss_iou: 0.2594 d0.loss_cls: 0.3570 d0.loss_bbox: 0.1655 d0.loss_iou: 0.2680 d1.loss_cls: 0.3261 d1.loss_bbox: 0.1631 d1.loss_iou: 0.2623 d2.loss_cls: 0.3131 d2.loss_bbox: 0.1653 d2.loss_iou: 0.2622 d3.loss_cls: 0.3143 d3.loss_bbox: 0.1618 d3.loss_iou: 0.2599 d4.loss_cls: 0.3110 d4.loss_bbox: 0.1623 d4.loss_iou: 0.2611 enc_loss_cls: 0.3584 enc_loss_bbox: 0.1771 enc_loss_iou: 0.2851 dn_loss_cls: 0.1362 dn_loss_bbox: 0.1755 dn_loss_iou: 0.2165 d0.dn_loss_cls: 0.2126 d0.dn_loss_bbox: 0.3317 d0.dn_loss_iou: 0.3591 d1.dn_loss_cls: 0.1662 d1.dn_loss_bbox: 0.2100 d1.dn_loss_iou: 0.2485 d2.dn_loss_cls: 0.1509 d2.dn_loss_bbox: 0.1854 d2.dn_loss_iou: 0.2274 d3.dn_loss_cls: 0.1405 d3.dn_loss_bbox: 0.1775 d3.dn_loss_iou: 0.2186 d4.dn_loss_cls: 0.1368 d4.dn_loss_bbox: 0.1756 d4.dn_loss_iou: 0.2165 d1.loss_lmm_region: 0.1526 loss_lmm_image: 0.8238 2024/11/11 13:38:51 - mmengine - INFO - Iter(train) [ 48100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:41:18 time: 1.9651 data_time: 0.0172 memory: 36191 grad_norm: 26.4779 loss: 8.9162 loss_cls: 0.2658 loss_bbox: 0.1132 loss_iou: 0.2431 d0.loss_cls: 0.3164 d0.loss_bbox: 0.1242 d0.loss_iou: 0.2547 d1.loss_cls: 0.2880 d1.loss_bbox: 0.1187 d1.loss_iou: 0.2475 d2.loss_cls: 0.2824 d2.loss_bbox: 0.1116 d2.loss_iou: 0.2402 d3.loss_cls: 0.2706 d3.loss_bbox: 0.1139 d3.loss_iou: 0.2442 d4.loss_cls: 0.2681 d4.loss_bbox: 0.1141 d4.loss_iou: 0.2430 enc_loss_cls: 0.3184 enc_loss_bbox: 0.1347 enc_loss_iou: 0.2730 dn_loss_cls: 0.1004 dn_loss_bbox: 0.1556 dn_loss_iou: 0.2112 d0.dn_loss_cls: 0.1784 d0.dn_loss_bbox: 0.2997 d0.dn_loss_iou: 0.3510 d1.dn_loss_cls: 0.1336 d1.dn_loss_bbox: 0.1817 d1.dn_loss_iou: 0.2374 d2.dn_loss_cls: 0.1147 d2.dn_loss_bbox: 0.1633 d2.dn_loss_iou: 0.2185 d3.dn_loss_cls: 0.1054 d3.dn_loss_bbox: 0.1579 d3.dn_loss_iou: 0.2131 d4.dn_loss_cls: 0.1011 d4.dn_loss_bbox: 0.1556 d4.dn_loss_iou: 0.2111 d1.loss_lmm_region: 0.1339 loss_lmm_image: 0.9070 2024/11/11 13:42:11 - mmengine - INFO - Iter(train) [ 48200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:37:57 time: 1.9863 data_time: 0.0177 memory: 33581 grad_norm: 31.5771 loss: 11.2489 loss_cls: 0.3739 loss_bbox: 0.1805 loss_iou: 0.3103 d0.loss_cls: 0.4335 d0.loss_bbox: 0.1887 d0.loss_iou: 0.3269 d1.loss_cls: 0.3975 d1.loss_bbox: 0.1872 d1.loss_iou: 0.3209 
d2.loss_cls: 0.3939 d2.loss_bbox: 0.1804 d2.loss_iou: 0.3129 d3.loss_cls: 0.3835 d3.loss_bbox: 0.1787 d3.loss_iou: 0.3073 d4.loss_cls: 0.3809 d4.loss_bbox: 0.1759 d4.loss_iou: 0.3089 enc_loss_cls: 0.4343 enc_loss_bbox: 0.2025 enc_loss_iou: 0.3512 dn_loss_cls: 0.1056 dn_loss_bbox: 0.2037 dn_loss_iou: 0.2464 d0.dn_loss_cls: 0.2073 d0.dn_loss_bbox: 0.3567 d0.dn_loss_iou: 0.3942 d1.dn_loss_cls: 0.1500 d1.dn_loss_bbox: 0.2384 d1.dn_loss_iou: 0.2800 d2.dn_loss_cls: 0.1262 d2.dn_loss_bbox: 0.2135 d2.dn_loss_iou: 0.2567 d3.dn_loss_cls: 0.1155 d3.dn_loss_bbox: 0.2060 d3.dn_loss_iou: 0.2494 d4.dn_loss_cls: 0.1069 d4.dn_loss_bbox: 0.2039 d4.dn_loss_iou: 0.2465 d1.loss_lmm_region: 0.1653 loss_lmm_image: 0.8468 2024/11/11 13:45:32 - mmengine - INFO - Iter(train) [ 48300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:34:37 time: 2.0103 data_time: 0.0172 memory: 32079 grad_norm: 32.9601 loss: 9.8821 loss_cls: 0.3011 loss_bbox: 0.1405 loss_iou: 0.2158 d0.loss_cls: 0.3423 d0.loss_bbox: 0.1554 d0.loss_iou: 0.2311 d1.loss_cls: 0.3200 d1.loss_bbox: 0.1468 d1.loss_iou: 0.2223 d2.loss_cls: 0.3086 d2.loss_bbox: 0.1399 d2.loss_iou: 0.2142 d3.loss_cls: 0.3041 d3.loss_bbox: 0.1444 d3.loss_iou: 0.2169 d4.loss_cls: 0.3016 d4.loss_bbox: 0.1396 d4.loss_iou: 0.2154 enc_loss_cls: 0.3423 enc_loss_bbox: 0.1721 enc_loss_iou: 0.2494 dn_loss_cls: 0.1759 dn_loss_bbox: 0.1795 dn_loss_iou: 0.2226 d0.dn_loss_cls: 0.2573 d0.dn_loss_bbox: 0.3433 d0.dn_loss_iou: 0.3760 d1.dn_loss_cls: 0.2089 d1.dn_loss_bbox: 0.2139 d1.dn_loss_iou: 0.2541 d2.dn_loss_cls: 0.1884 d2.dn_loss_bbox: 0.1879 d2.dn_loss_iou: 0.2317 d3.dn_loss_cls: 0.1816 d3.dn_loss_bbox: 0.1829 d3.dn_loss_iou: 0.2255 d4.dn_loss_cls: 0.1769 d4.dn_loss_bbox: 0.1795 d4.dn_loss_iou: 0.2224 d1.loss_lmm_region: 0.1876 loss_lmm_image: 0.8622 2024/11/11 13:48:50 - mmengine - INFO - Iter(train) [ 48400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:31:13 time: 1.9941 data_time: 0.0172 memory: 34664 grad_norm: 37.5166 loss: 10.2016 loss_cls: 0.3036 loss_bbox: 0.1514 loss_iou: 0.3030 d0.loss_cls: 0.3456 d0.loss_bbox: 0.1609 d0.loss_iou: 0.3161 d1.loss_cls: 0.3234 d1.loss_bbox: 0.1475 d1.loss_iou: 0.3037 d2.loss_cls: 0.3161 d2.loss_bbox: 0.1468 d2.loss_iou: 0.2995 d3.loss_cls: 0.3111 d3.loss_bbox: 0.1460 d3.loss_iou: 0.2983 d4.loss_cls: 0.3031 d4.loss_bbox: 0.1499 d4.loss_iou: 0.3013 enc_loss_cls: 0.3524 enc_loss_bbox: 0.1705 enc_loss_iou: 0.3292 dn_loss_cls: 0.1067 dn_loss_bbox: 0.1732 dn_loss_iou: 0.2473 d0.dn_loss_cls: 0.1957 d0.dn_loss_bbox: 0.3158 d0.dn_loss_iou: 0.3902 d1.dn_loss_cls: 0.1416 d1.dn_loss_bbox: 0.2027 d1.dn_loss_iou: 0.2760 d2.dn_loss_cls: 0.1195 d2.dn_loss_bbox: 0.1817 d2.dn_loss_iou: 0.2555 d3.dn_loss_cls: 0.1107 d3.dn_loss_bbox: 0.1749 d3.dn_loss_iou: 0.2492 d4.dn_loss_cls: 0.1077 d4.dn_loss_bbox: 0.1731 d4.dn_loss_iou: 0.2472 d1.loss_lmm_region: 0.1695 loss_lmm_image: 0.8839 2024/11/11 13:52:09 - mmengine - INFO - Iter(train) [ 48500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:27:50 time: 1.9973 data_time: 0.0171 memory: 34217 grad_norm: 30.9215 loss: 9.4221 loss_cls: 0.2863 loss_bbox: 0.1285 loss_iou: 0.2198 d0.loss_cls: 0.3251 d0.loss_bbox: 0.1408 d0.loss_iou: 0.2353 d1.loss_cls: 0.3061 d1.loss_bbox: 0.1324 d1.loss_iou: 0.2255 d2.loss_cls: 0.2934 d2.loss_bbox: 0.1317 d2.loss_iou: 0.2229 d3.loss_cls: 0.2873 d3.loss_bbox: 0.1287 d3.loss_iou: 0.2206 d4.loss_cls: 0.2869 d4.loss_bbox: 0.1281 d4.loss_iou: 0.2193 enc_loss_cls: 0.3291 enc_loss_bbox: 0.1606 enc_loss_iou: 0.2609 dn_loss_cls: 0.1220 dn_loss_bbox: 0.1980 dn_loss_iou: 
0.2165 d0.dn_loss_cls: 0.1979 d0.dn_loss_bbox: 0.3424 d0.dn_loss_iou: 0.3588 d1.dn_loss_cls: 0.1511 d1.dn_loss_bbox: 0.2288 d1.dn_loss_iou: 0.2471 d2.dn_loss_cls: 0.1345 d2.dn_loss_bbox: 0.2069 d2.dn_loss_iou: 0.2250 d3.dn_loss_cls: 0.1254 d3.dn_loss_bbox: 0.2005 d3.dn_loss_iou: 0.2186 d4.dn_loss_cls: 0.1220 d4.dn_loss_bbox: 0.1980 d4.dn_loss_iou: 0.2164 d1.loss_lmm_region: 0.1531 loss_lmm_image: 0.8898 2024/11/11 13:55:28 - mmengine - INFO - Iter(train) [ 48600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:24:26 time: 2.0035 data_time: 0.0172 memory: 35694 grad_norm: 26.4629 loss: 9.5827 loss_cls: 0.3284 loss_bbox: 0.1422 loss_iou: 0.2685 d0.loss_cls: 0.3773 d0.loss_bbox: 0.1531 d0.loss_iou: 0.2849 d1.loss_cls: 0.3475 d1.loss_bbox: 0.1514 d1.loss_iou: 0.2827 d2.loss_cls: 0.3382 d2.loss_bbox: 0.1442 d2.loss_iou: 0.2716 d3.loss_cls: 0.3312 d3.loss_bbox: 0.1420 d3.loss_iou: 0.2698 d4.loss_cls: 0.3298 d4.loss_bbox: 0.1424 d4.loss_iou: 0.2683 enc_loss_cls: 0.3791 enc_loss_bbox: 0.1691 enc_loss_iou: 0.3141 dn_loss_cls: 0.0888 dn_loss_bbox: 0.1440 dn_loss_iou: 0.2044 d0.dn_loss_cls: 0.1679 d0.dn_loss_bbox: 0.2912 d0.dn_loss_iou: 0.3477 d1.dn_loss_cls: 0.1201 d1.dn_loss_bbox: 0.1772 d1.dn_loss_iou: 0.2376 d2.dn_loss_cls: 0.0992 d2.dn_loss_bbox: 0.1535 d2.dn_loss_iou: 0.2141 d3.dn_loss_cls: 0.0928 d3.dn_loss_bbox: 0.1462 d3.dn_loss_iou: 0.2072 d4.dn_loss_cls: 0.0895 d4.dn_loss_bbox: 0.1441 d4.dn_loss_iou: 0.2042 d1.loss_lmm_region: 0.1327 loss_lmm_image: 0.8845 2024/11/11 13:58:46 - mmengine - INFO - Iter(train) [ 48700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:21:03 time: 1.9884 data_time: 0.0172 memory: 34640 grad_norm: 27.5983 loss: 9.2351 loss_cls: 0.2839 loss_bbox: 0.1337 loss_iou: 0.2261 d0.loss_cls: 0.3192 d0.loss_bbox: 0.1482 d0.loss_iou: 0.2420 d1.loss_cls: 0.3049 d1.loss_bbox: 0.1383 d1.loss_iou: 0.2332 d2.loss_cls: 0.2927 d2.loss_bbox: 0.1345 d2.loss_iou: 0.2284 d3.loss_cls: 0.2861 d3.loss_bbox: 0.1351 d3.loss_iou: 0.2279 d4.loss_cls: 0.2849 d4.loss_bbox: 0.1334 d4.loss_iou: 0.2270 enc_loss_cls: 0.3301 enc_loss_bbox: 0.1560 enc_loss_iou: 0.2595 dn_loss_cls: 0.0980 dn_loss_bbox: 0.1858 dn_loss_iou: 0.2197 d0.dn_loss_cls: 0.1773 d0.dn_loss_bbox: 0.3525 d0.dn_loss_iou: 0.3741 d1.dn_loss_cls: 0.1301 d1.dn_loss_bbox: 0.2192 d1.dn_loss_iou: 0.2531 d2.dn_loss_cls: 0.1106 d2.dn_loss_bbox: 0.1976 d2.dn_loss_iou: 0.2312 d3.dn_loss_cls: 0.1040 d3.dn_loss_bbox: 0.1881 d3.dn_loss_iou: 0.2232 d4.dn_loss_cls: 0.0997 d4.dn_loss_bbox: 0.1859 d4.dn_loss_iou: 0.2198 d1.loss_lmm_region: 0.1117 loss_lmm_image: 0.8287 2024/11/11 14:02:06 - mmengine - INFO - Iter(train) [ 48800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:17:41 time: 1.9828 data_time: 0.0170 memory: 32035 grad_norm: 29.5066 loss: 8.3446 loss_cls: 0.2391 loss_bbox: 0.1062 loss_iou: 0.1957 d0.loss_cls: 0.2784 d0.loss_bbox: 0.1184 d0.loss_iou: 0.2115 d1.loss_cls: 0.2565 d1.loss_bbox: 0.1128 d1.loss_iou: 0.2025 d2.loss_cls: 0.2463 d2.loss_bbox: 0.1109 d2.loss_iou: 0.1984 d3.loss_cls: 0.2420 d3.loss_bbox: 0.1080 d3.loss_iou: 0.1965 d4.loss_cls: 0.2406 d4.loss_bbox: 0.1070 d4.loss_iou: 0.1958 enc_loss_cls: 0.2780 enc_loss_bbox: 0.1297 enc_loss_iou: 0.2276 dn_loss_cls: 0.1225 dn_loss_bbox: 0.1539 dn_loss_iou: 0.1916 d0.dn_loss_cls: 0.1977 d0.dn_loss_bbox: 0.2996 d0.dn_loss_iou: 0.3323 d1.dn_loss_cls: 0.1503 d1.dn_loss_bbox: 0.1863 d1.dn_loss_iou: 0.2223 d2.dn_loss_cls: 0.1324 d2.dn_loss_bbox: 0.1626 d2.dn_loss_iou: 0.2000 d3.dn_loss_cls: 0.1276 d3.dn_loss_bbox: 0.1551 d3.dn_loss_iou: 0.1940 d4.dn_loss_cls: 
0.1238 d4.dn_loss_bbox: 0.1538 d4.dn_loss_iou: 0.1916 d1.loss_lmm_region: 0.1349 loss_lmm_image: 0.9107 2024/11/11 14:05:27 - mmengine - INFO - Iter(train) [ 48900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:14:21 time: 2.0067 data_time: 0.0174 memory: 35315 grad_norm: 34.7229 loss: 9.9254 loss_cls: 0.3068 loss_bbox: 0.1391 loss_iou: 0.2652 d0.loss_cls: 0.3508 d0.loss_bbox: 0.1516 d0.loss_iou: 0.2824 d1.loss_cls: 0.3223 d1.loss_bbox: 0.1446 d1.loss_iou: 0.2712 d2.loss_cls: 0.3144 d2.loss_bbox: 0.1412 d2.loss_iou: 0.2696 d3.loss_cls: 0.3076 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2666 d4.loss_cls: 0.3061 d4.loss_bbox: 0.1398 d4.loss_iou: 0.2662 enc_loss_cls: 0.3443 enc_loss_bbox: 0.1665 enc_loss_iou: 0.3045 dn_loss_cls: 0.1114 dn_loss_bbox: 0.1824 dn_loss_iou: 0.2317 d0.dn_loss_cls: 0.2057 d0.dn_loss_bbox: 0.3356 d0.dn_loss_iou: 0.3834 d1.dn_loss_cls: 0.1486 d1.dn_loss_bbox: 0.2091 d1.dn_loss_iou: 0.2630 d2.dn_loss_cls: 0.1256 d2.dn_loss_bbox: 0.1911 d2.dn_loss_iou: 0.2418 d3.dn_loss_cls: 0.1165 d3.dn_loss_bbox: 0.1833 d3.dn_loss_iou: 0.2344 d4.dn_loss_cls: 0.1118 d4.dn_loss_bbox: 0.1824 d4.dn_loss_iou: 0.2316 d1.loss_lmm_region: 0.1602 loss_lmm_image: 0.8743 2024/11/11 14:08:46 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 14:08:46 - mmengine - INFO - Iter(train) [ 49000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:10:59 time: 1.9846 data_time: 0.0172 memory: 34977 grad_norm: 29.0638 loss: 10.8439 loss_cls: 0.3645 loss_bbox: 0.1630 loss_iou: 0.2988 d0.loss_cls: 0.4235 d0.loss_bbox: 0.1761 d0.loss_iou: 0.3180 d1.loss_cls: 0.3854 d1.loss_bbox: 0.1689 d1.loss_iou: 0.3070 d2.loss_cls: 0.3725 d2.loss_bbox: 0.1625 d2.loss_iou: 0.2983 d3.loss_cls: 0.3716 d3.loss_bbox: 0.1609 d3.loss_iou: 0.2981 d4.loss_cls: 0.3667 d4.loss_bbox: 0.1616 d4.loss_iou: 0.2977 enc_loss_cls: 0.4245 enc_loss_bbox: 0.1918 enc_loss_iou: 0.3367 dn_loss_cls: 0.1133 dn_loss_bbox: 0.1846 dn_loss_iou: 0.2368 d0.dn_loss_cls: 0.1929 d0.dn_loss_bbox: 0.3436 d0.dn_loss_iou: 0.3884 d1.dn_loss_cls: 0.1437 d1.dn_loss_bbox: 0.2239 d1.dn_loss_iou: 0.2724 d2.dn_loss_cls: 0.1220 d2.dn_loss_bbox: 0.1992 d2.dn_loss_iou: 0.2480 d3.dn_loss_cls: 0.1175 d3.dn_loss_bbox: 0.1875 d3.dn_loss_iou: 0.2396 d4.dn_loss_cls: 0.1141 d4.dn_loss_bbox: 0.1845 d4.dn_loss_iou: 0.2367 d1.loss_lmm_region: 0.1445 loss_lmm_image: 0.9028 2024/11/11 14:12:03 - mmengine - INFO - Iter(train) [ 49100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:07:33 time: 1.9691 data_time: 0.0173 memory: 34626 grad_norm: 23.7468 loss: 9.1798 loss_cls: 0.2778 loss_bbox: 0.1258 loss_iou: 0.2397 d0.loss_cls: 0.3242 d0.loss_bbox: 0.1358 d0.loss_iou: 0.2537 d1.loss_cls: 0.2992 d1.loss_bbox: 0.1291 d1.loss_iou: 0.2473 d2.loss_cls: 0.2897 d2.loss_bbox: 0.1251 d2.loss_iou: 0.2395 d3.loss_cls: 0.2838 d3.loss_bbox: 0.1235 d3.loss_iou: 0.2385 d4.loss_cls: 0.2803 d4.loss_bbox: 0.1253 d4.loss_iou: 0.2397 enc_loss_cls: 0.3265 enc_loss_bbox: 0.1505 enc_loss_iou: 0.2792 dn_loss_cls: 0.1265 dn_loss_bbox: 0.1557 dn_loss_iou: 0.1958 d0.dn_loss_cls: 0.2052 d0.dn_loss_bbox: 0.3075 d0.dn_loss_iou: 0.3433 d1.dn_loss_cls: 0.1582 d1.dn_loss_bbox: 0.1922 d1.dn_loss_iou: 0.2294 d2.dn_loss_cls: 0.1375 d2.dn_loss_bbox: 0.1672 d2.dn_loss_iou: 0.2061 d3.dn_loss_cls: 0.1311 d3.dn_loss_bbox: 0.1581 d3.dn_loss_iou: 0.1989 d4.dn_loss_cls: 0.1287 d4.dn_loss_bbox: 0.1557 d4.dn_loss_iou: 0.1959 d1.loss_lmm_region: 0.1507 loss_lmm_image: 0.9016 2024/11/11 14:15:24 - mmengine - INFO - Iter(train) [ 49200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 
days, 8:04:14 time: 2.0052 data_time: 0.0171 memory: 33942 grad_norm: 33.9398 loss: 10.2321 loss_cls: 0.4027 loss_bbox: 0.1383 loss_iou: 0.2733 d0.loss_cls: 0.4488 d0.loss_bbox: 0.1577 d0.loss_iou: 0.2953 d1.loss_cls: 0.4021 d1.loss_bbox: 0.1507 d1.loss_iou: 0.2833 d2.loss_cls: 0.4037 d2.loss_bbox: 0.1443 d2.loss_iou: 0.2779 d3.loss_cls: 0.4004 d3.loss_bbox: 0.1381 d3.loss_iou: 0.2719 d4.loss_cls: 0.4017 d4.loss_bbox: 0.1390 d4.loss_iou: 0.2739 enc_loss_cls: 0.4481 enc_loss_bbox: 0.1686 enc_loss_iou: 0.3105 dn_loss_cls: 0.1044 dn_loss_bbox: 0.1558 dn_loss_iou: 0.1967 d0.dn_loss_cls: 0.1818 d0.dn_loss_bbox: 0.2929 d0.dn_loss_iou: 0.3355 d1.dn_loss_cls: 0.1331 d1.dn_loss_bbox: 0.1921 d1.dn_loss_iou: 0.2304 d2.dn_loss_cls: 0.1169 d2.dn_loss_bbox: 0.1654 d2.dn_loss_iou: 0.2065 d3.dn_loss_cls: 0.1107 d3.dn_loss_bbox: 0.1584 d3.dn_loss_iou: 0.2002 d4.dn_loss_cls: 0.1058 d4.dn_loss_bbox: 0.1557 d4.dn_loss_iou: 0.1967 d1.loss_lmm_region: 0.1451 loss_lmm_image: 0.9177 2024/11/11 14:18:44 - mmengine - INFO - Iter(train) [ 49300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 8:00:52 time: 2.0057 data_time: 0.0172 memory: 32634 grad_norm: 31.8505 loss: 10.2917 loss_cls: 0.3328 loss_bbox: 0.1610 loss_iou: 0.2808 d0.loss_cls: 0.3809 d0.loss_bbox: 0.1730 d0.loss_iou: 0.2905 d1.loss_cls: 0.3524 d1.loss_bbox: 0.1683 d1.loss_iou: 0.2877 d2.loss_cls: 0.3457 d2.loss_bbox: 0.1617 d2.loss_iou: 0.2801 d3.loss_cls: 0.3363 d3.loss_bbox: 0.1595 d3.loss_iou: 0.2798 d4.loss_cls: 0.3350 d4.loss_bbox: 0.1597 d4.loss_iou: 0.2809 enc_loss_cls: 0.3836 enc_loss_bbox: 0.1830 enc_loss_iou: 0.3093 dn_loss_cls: 0.1062 dn_loss_bbox: 0.1865 dn_loss_iou: 0.2243 d0.dn_loss_cls: 0.1926 d0.dn_loss_bbox: 0.3325 d0.dn_loss_iou: 0.3652 d1.dn_loss_cls: 0.1404 d1.dn_loss_bbox: 0.2167 d1.dn_loss_iou: 0.2531 d2.dn_loss_cls: 0.1176 d2.dn_loss_bbox: 0.1970 d2.dn_loss_iou: 0.2332 d3.dn_loss_cls: 0.1110 d3.dn_loss_bbox: 0.1886 d3.dn_loss_iou: 0.2267 d4.dn_loss_cls: 0.1072 d4.dn_loss_bbox: 0.1865 d4.dn_loss_iou: 0.2243 d1.loss_lmm_region: 0.1500 loss_lmm_image: 0.8901 2024/11/11 14:22:05 - mmengine - INFO - Iter(train) [ 49400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:57:35 time: 1.9825 data_time: 0.0171 memory: 34589 grad_norm: 29.3240 loss: 9.7640 loss_cls: 0.3381 loss_bbox: 0.1368 loss_iou: 0.2420 d0.loss_cls: 0.3771 d0.loss_bbox: 0.1485 d0.loss_iou: 0.2571 d1.loss_cls: 0.3562 d1.loss_bbox: 0.1395 d1.loss_iou: 0.2485 d2.loss_cls: 0.3528 d2.loss_bbox: 0.1318 d2.loss_iou: 0.2406 d3.loss_cls: 0.3473 d3.loss_bbox: 0.1353 d3.loss_iou: 0.2400 d4.loss_cls: 0.3373 d4.loss_bbox: 0.1379 d4.loss_iou: 0.2425 enc_loss_cls: 0.3795 enc_loss_bbox: 0.1624 enc_loss_iou: 0.2815 dn_loss_cls: 0.1343 dn_loss_bbox: 0.1596 dn_loss_iou: 0.2042 d0.dn_loss_cls: 0.2106 d0.dn_loss_bbox: 0.3011 d0.dn_loss_iou: 0.3410 d1.dn_loss_cls: 0.1634 d1.dn_loss_bbox: 0.1936 d1.dn_loss_iou: 0.2344 d2.dn_loss_cls: 0.1484 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2139 d3.dn_loss_cls: 0.1400 d3.dn_loss_bbox: 0.1624 d3.dn_loss_iou: 0.2072 d4.dn_loss_cls: 0.1349 d4.dn_loss_bbox: 0.1597 d4.dn_loss_iou: 0.2043 d1.loss_lmm_region: 0.1600 loss_lmm_image: 0.8858 2024/11/11 14:25:24 - mmengine - INFO - Iter(train) [ 49500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:54:12 time: 1.9911 data_time: 0.0172 memory: 35477 grad_norm: nan loss: 10.2908 loss_cls: 0.3100 loss_bbox: 0.1557 loss_iou: 0.2730 d0.loss_cls: 0.3518 d0.loss_bbox: 0.1688 d0.loss_iou: 0.2851 d1.loss_cls: 0.3304 d1.loss_bbox: 0.1635 d1.loss_iou: 0.2794 d2.loss_cls: 0.3221 d2.loss_bbox: 0.1597 
d2.loss_iou: 0.2778 d3.loss_cls: 0.3175 d3.loss_bbox: 0.1547 d3.loss_iou: 0.2748 d4.loss_cls: 0.3094 d4.loss_bbox: 0.1574 d4.loss_iou: 0.2731 enc_loss_cls: 0.3527 enc_loss_bbox: 0.1826 enc_loss_iou: 0.3093 dn_loss_cls: 0.1338 dn_loss_bbox: 0.1778 dn_loss_iou: 0.2373 d0.dn_loss_cls: 0.2203 d0.dn_loss_bbox: 0.3416 d0.dn_loss_iou: 0.3821 d1.dn_loss_cls: 0.1677 d1.dn_loss_bbox: 0.2081 d1.dn_loss_iou: 0.2655 d2.dn_loss_cls: 0.1506 d2.dn_loss_bbox: 0.1856 d2.dn_loss_iou: 0.2449 d3.dn_loss_cls: 0.1407 d3.dn_loss_bbox: 0.1795 d3.dn_loss_iou: 0.2396 d4.dn_loss_cls: 0.1340 d4.dn_loss_bbox: 0.1778 d4.dn_loss_iou: 0.2372 d1.loss_lmm_region: 0.1910 loss_lmm_image: 0.8669 2024/11/11 14:28:43 - mmengine - INFO - Iter(train) [ 49600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:50:49 time: 1.9711 data_time: 0.0171 memory: 33337 grad_norm: 31.7499 loss: 8.6568 loss_cls: 0.2692 loss_bbox: 0.1238 loss_iou: 0.1912 d0.loss_cls: 0.3356 d0.loss_bbox: 0.1303 d0.loss_iou: 0.1955 d1.loss_cls: 0.2971 d1.loss_bbox: 0.1303 d1.loss_iou: 0.1939 d2.loss_cls: 0.2858 d2.loss_bbox: 0.1249 d2.loss_iou: 0.1901 d3.loss_cls: 0.2757 d3.loss_bbox: 0.1254 d3.loss_iou: 0.1919 d4.loss_cls: 0.2717 d4.loss_bbox: 0.1226 d4.loss_iou: 0.1902 enc_loss_cls: 0.3325 enc_loss_bbox: 0.1458 enc_loss_iou: 0.2188 dn_loss_cls: 0.1208 dn_loss_bbox: 0.1670 dn_loss_iou: 0.1864 d0.dn_loss_cls: 0.1834 d0.dn_loss_bbox: 0.3233 d0.dn_loss_iou: 0.3260 d1.dn_loss_cls: 0.1379 d1.dn_loss_bbox: 0.2029 d1.dn_loss_iou: 0.2149 d2.dn_loss_cls: 0.1207 d2.dn_loss_bbox: 0.1771 d2.dn_loss_iou: 0.1954 d3.dn_loss_cls: 0.1193 d3.dn_loss_bbox: 0.1701 d3.dn_loss_iou: 0.1891 d4.dn_loss_cls: 0.1182 d4.dn_loss_bbox: 0.1670 d4.dn_loss_iou: 0.1864 d1.loss_lmm_region: 0.1572 loss_lmm_image: 0.8512 2024/11/11 14:32:01 - mmengine - INFO - Iter(train) [ 49700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:47:24 time: 1.9799 data_time: 0.0173 memory: 34437 grad_norm: 32.9791 loss: 9.9465 loss_cls: 0.3026 loss_bbox: 0.1533 loss_iou: 0.2637 d0.loss_cls: 0.3462 d0.loss_bbox: 0.1675 d0.loss_iou: 0.2749 d1.loss_cls: 0.3272 d1.loss_bbox: 0.1542 d1.loss_iou: 0.2656 d2.loss_cls: 0.3150 d2.loss_bbox: 0.1531 d2.loss_iou: 0.2635 d3.loss_cls: 0.3077 d3.loss_bbox: 0.1540 d3.loss_iou: 0.2635 d4.loss_cls: 0.3056 d4.loss_bbox: 0.1526 d4.loss_iou: 0.2624 enc_loss_cls: 0.3422 enc_loss_bbox: 0.1831 enc_loss_iou: 0.2976 dn_loss_cls: 0.1319 dn_loss_bbox: 0.1791 dn_loss_iou: 0.2102 d0.dn_loss_cls: 0.2124 d0.dn_loss_bbox: 0.3313 d0.dn_loss_iou: 0.3479 d1.dn_loss_cls: 0.1606 d1.dn_loss_bbox: 0.2075 d1.dn_loss_iou: 0.2358 d2.dn_loss_cls: 0.1433 d2.dn_loss_bbox: 0.1882 d2.dn_loss_iou: 0.2178 d3.dn_loss_cls: 0.1383 d3.dn_loss_bbox: 0.1810 d3.dn_loss_iou: 0.2122 d4.dn_loss_cls: 0.1335 d4.dn_loss_bbox: 0.1791 d4.dn_loss_iou: 0.2103 d1.loss_lmm_region: 0.1883 loss_lmm_image: 0.8820 2024/11/11 14:35:21 - mmengine - INFO - Iter(train) [ 49800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:44:03 time: 2.0035 data_time: 0.0172 memory: 34538 grad_norm: 31.2095 loss: 10.3132 loss_cls: 0.3121 loss_bbox: 0.1508 loss_iou: 0.2691 d0.loss_cls: 0.3626 d0.loss_bbox: 0.1717 d0.loss_iou: 0.2874 d1.loss_cls: 0.3304 d1.loss_bbox: 0.1661 d1.loss_iou: 0.2802 d2.loss_cls: 0.3185 d2.loss_bbox: 0.1563 d2.loss_iou: 0.2737 d3.loss_cls: 0.3153 d3.loss_bbox: 0.1502 d3.loss_iou: 0.2700 d4.loss_cls: 0.3138 d4.loss_bbox: 0.1499 d4.loss_iou: 0.2685 enc_loss_cls: 0.3575 enc_loss_bbox: 0.1892 enc_loss_iou: 0.3177 dn_loss_cls: 0.1448 dn_loss_bbox: 0.1762 dn_loss_iou: 0.2367 d0.dn_loss_cls: 0.2147 
d0.dn_loss_bbox: 0.3338 d0.dn_loss_iou: 0.3900 d1.dn_loss_cls: 0.1725 d1.dn_loss_bbox: 0.2102 d1.dn_loss_iou: 0.2709 d2.dn_loss_cls: 0.1564 d2.dn_loss_bbox: 0.1874 d2.dn_loss_iou: 0.2479 d3.dn_loss_cls: 0.1509 d3.dn_loss_bbox: 0.1782 d3.dn_loss_iou: 0.2394 d4.dn_loss_cls: 0.1452 d4.dn_loss_bbox: 0.1762 d4.dn_loss_iou: 0.2367 d1.loss_lmm_region: 0.1546 loss_lmm_image: 0.8795 2024/11/11 14:38:39 - mmengine - INFO - Iter(train) [ 49900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:40:39 time: 2.0002 data_time: 0.0171 memory: 34108 grad_norm: 30.5477 loss: 10.4561 loss_cls: 0.3563 loss_bbox: 0.1673 loss_iou: 0.2742 d0.loss_cls: 0.3979 d0.loss_bbox: 0.1787 d0.loss_iou: 0.2863 d1.loss_cls: 0.3767 d1.loss_bbox: 0.1706 d1.loss_iou: 0.2771 d2.loss_cls: 0.3686 d2.loss_bbox: 0.1691 d2.loss_iou: 0.2725 d3.loss_cls: 0.3598 d3.loss_bbox: 0.1680 d3.loss_iou: 0.2725 d4.loss_cls: 0.3568 d4.loss_bbox: 0.1690 d4.loss_iou: 0.2757 enc_loss_cls: 0.4009 enc_loss_bbox: 0.1995 enc_loss_iou: 0.3156 dn_loss_cls: 0.1097 dn_loss_bbox: 0.1727 dn_loss_iou: 0.2187 d0.dn_loss_cls: 0.1989 d0.dn_loss_bbox: 0.3349 d0.dn_loss_iou: 0.3697 d1.dn_loss_cls: 0.1489 d1.dn_loss_bbox: 0.2089 d1.dn_loss_iou: 0.2547 d2.dn_loss_cls: 0.1228 d2.dn_loss_bbox: 0.1834 d2.dn_loss_iou: 0.2294 d3.dn_loss_cls: 0.1150 d3.dn_loss_bbox: 0.1752 d3.dn_loss_iou: 0.2212 d4.dn_loss_cls: 0.1102 d4.dn_loss_bbox: 0.1730 d4.dn_loss_iou: 0.2187 d1.loss_lmm_region: 0.1756 loss_lmm_image: 0.9015 2024/11/11 14:41:58 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 14:41:58 - mmengine - INFO - Iter(train) [ 50000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:37:16 time: 1.9838 data_time: 0.0171 memory: 32599 grad_norm: 29.2735 loss: 7.6840 loss_cls: 0.2216 loss_bbox: 0.0976 loss_iou: 0.1904 d0.loss_cls: 0.2665 d0.loss_bbox: 0.1080 d0.loss_iou: 0.2034 d1.loss_cls: 0.2454 d1.loss_bbox: 0.0998 d1.loss_iou: 0.1942 d2.loss_cls: 0.2328 d2.loss_bbox: 0.0992 d2.loss_iou: 0.1936 d3.loss_cls: 0.2270 d3.loss_bbox: 0.0965 d3.loss_iou: 0.1899 d4.loss_cls: 0.2231 d4.loss_bbox: 0.0975 d4.loss_iou: 0.1901 enc_loss_cls: 0.2678 enc_loss_bbox: 0.1236 enc_loss_iou: 0.2297 dn_loss_cls: 0.1340 dn_loss_bbox: 0.1138 dn_loss_iou: 0.1656 d0.dn_loss_cls: 0.1933 d0.dn_loss_bbox: 0.2475 d0.dn_loss_iou: 0.2983 d1.dn_loss_cls: 0.1518 d1.dn_loss_bbox: 0.1420 d1.dn_loss_iou: 0.1927 d2.dn_loss_cls: 0.1434 d2.dn_loss_bbox: 0.1234 d2.dn_loss_iou: 0.1741 d3.dn_loss_cls: 0.1326 d3.dn_loss_bbox: 0.1163 d3.dn_loss_iou: 0.1680 d4.dn_loss_cls: 0.1343 d4.dn_loss_bbox: 0.1138 d4.dn_loss_iou: 0.1657 d1.loss_lmm_region: 0.1339 loss_lmm_image: 0.8414 2024/11/11 14:45:16 - mmengine - INFO - Iter(train) [ 50100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:33:50 time: 1.9780 data_time: 0.0170 memory: 34433 grad_norm: 25.0285 loss: 10.4854 loss_cls: 0.3561 loss_bbox: 0.1479 loss_iou: 0.2874 d0.loss_cls: 0.4302 d0.loss_bbox: 0.1591 d0.loss_iou: 0.2978 d1.loss_cls: 0.3899 d1.loss_bbox: 0.1526 d1.loss_iou: 0.2870 d2.loss_cls: 0.3696 d2.loss_bbox: 0.1495 d2.loss_iou: 0.2873 d3.loss_cls: 0.3616 d3.loss_bbox: 0.1467 d3.loss_iou: 0.2850 d4.loss_cls: 0.3568 d4.loss_bbox: 0.1472 d4.loss_iou: 0.2855 enc_loss_cls: 0.4224 enc_loss_bbox: 0.1767 enc_loss_iou: 0.3257 dn_loss_cls: 0.1171 dn_loss_bbox: 0.1665 dn_loss_iou: 0.2254 d0.dn_loss_cls: 0.2069 d0.dn_loss_bbox: 0.3274 d0.dn_loss_iou: 0.3812 d1.dn_loss_cls: 0.1527 d1.dn_loss_bbox: 0.2070 d1.dn_loss_iou: 0.2619 d2.dn_loss_cls: 0.1295 d2.dn_loss_bbox: 0.1802 d2.dn_loss_iou: 0.2378 d3.dn_loss_cls: 0.1219 
d3.dn_loss_bbox: 0.1693 d3.dn_loss_iou: 0.2288 d4.dn_loss_cls: 0.1178 d4.dn_loss_bbox: 0.1664 d4.dn_loss_iou: 0.2253 d1.loss_lmm_region: 0.1656 loss_lmm_image: 0.8744 2024/11/11 14:48:36 - mmengine - INFO - Iter(train) [ 50200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:30:31 time: 2.0112 data_time: 0.0172 memory: 35123 grad_norm: 32.0195 loss: 10.0261 loss_cls: 0.3404 loss_bbox: 0.1369 loss_iou: 0.2210 d0.loss_cls: 0.3959 d0.loss_bbox: 0.1422 d0.loss_iou: 0.2365 d1.loss_cls: 0.3687 d1.loss_bbox: 0.1394 d1.loss_iou: 0.2272 d2.loss_cls: 0.3513 d2.loss_bbox: 0.1367 d2.loss_iou: 0.2231 d3.loss_cls: 0.3428 d3.loss_bbox: 0.1370 d3.loss_iou: 0.2218 d4.loss_cls: 0.3441 d4.loss_bbox: 0.1359 d4.loss_iou: 0.2195 enc_loss_cls: 0.3985 enc_loss_bbox: 0.1587 enc_loss_iou: 0.2579 dn_loss_cls: 0.1825 dn_loss_bbox: 0.1727 dn_loss_iou: 0.2026 d0.dn_loss_cls: 0.2503 d0.dn_loss_bbox: 0.3259 d0.dn_loss_iou: 0.3442 d1.dn_loss_cls: 0.2058 d1.dn_loss_bbox: 0.2040 d1.dn_loss_iou: 0.2315 d2.dn_loss_cls: 0.1888 d2.dn_loss_bbox: 0.1782 d2.dn_loss_iou: 0.2101 d3.dn_loss_cls: 0.1835 d3.dn_loss_bbox: 0.1726 d3.dn_loss_iou: 0.2041 d4.dn_loss_cls: 0.1818 d4.dn_loss_bbox: 0.1726 d4.dn_loss_iou: 0.2025 d1.loss_lmm_region: 0.1692 loss_lmm_image: 0.9078 2024/11/11 14:51:55 - mmengine - INFO - Iter(train) [ 50300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:27:07 time: 1.9988 data_time: 0.0170 memory: 34671 grad_norm: 48.0903 loss: 9.1084 loss_cls: 0.2772 loss_bbox: 0.1166 loss_iou: 0.2309 d0.loss_cls: 0.3269 d0.loss_bbox: 0.1252 d0.loss_iou: 0.2464 d1.loss_cls: 0.2966 d1.loss_bbox: 0.1239 d1.loss_iou: 0.2410 d2.loss_cls: 0.2909 d2.loss_bbox: 0.1185 d2.loss_iou: 0.2342 d3.loss_cls: 0.2875 d3.loss_bbox: 0.1155 d3.loss_iou: 0.2294 d4.loss_cls: 0.2771 d4.loss_bbox: 0.1174 d4.loss_iou: 0.2330 enc_loss_cls: 0.3225 enc_loss_bbox: 0.1444 enc_loss_iou: 0.2697 dn_loss_cls: 0.0946 dn_loss_bbox: 0.1583 dn_loss_iou: 0.2093 d0.dn_loss_cls: 0.1665 d0.dn_loss_bbox: 0.2941 d0.dn_loss_iou: 0.3490 d1.dn_loss_cls: 0.1199 d1.dn_loss_bbox: 0.1874 d1.dn_loss_iou: 0.2407 d2.dn_loss_cls: 0.1026 d2.dn_loss_bbox: 0.1662 d2.dn_loss_iou: 0.2190 d3.dn_loss_cls: 0.0979 d3.dn_loss_bbox: 0.1602 d3.dn_loss_iou: 0.2119 d4.dn_loss_cls: 0.0942 d4.dn_loss_bbox: 0.1583 d4.dn_loss_iou: 0.2094 d1.loss_lmm_region: 0.2779 loss_lmm_image: 0.9662 2024/11/11 14:55:14 - mmengine - INFO - Iter(train) [ 50400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:23:45 time: 1.9776 data_time: 0.0172 memory: 35001 grad_norm: 30.7040 loss: 9.5248 loss_cls: 0.2803 loss_bbox: 0.1245 loss_iou: 0.2663 d0.loss_cls: 0.3254 d0.loss_bbox: 0.1373 d0.loss_iou: 0.2817 d1.loss_cls: 0.2975 d1.loss_bbox: 0.1301 d1.loss_iou: 0.2711 d2.loss_cls: 0.2834 d2.loss_bbox: 0.1245 d2.loss_iou: 0.2661 d3.loss_cls: 0.2832 d3.loss_bbox: 0.1263 d3.loss_iou: 0.2680 d4.loss_cls: 0.2771 d4.loss_bbox: 0.1248 d4.loss_iou: 0.2679 enc_loss_cls: 0.3329 enc_loss_bbox: 0.1510 enc_loss_iou: 0.3062 dn_loss_cls: 0.1263 dn_loss_bbox: 0.1615 dn_loss_iou: 0.2220 d0.dn_loss_cls: 0.1965 d0.dn_loss_bbox: 0.2989 d0.dn_loss_iou: 0.3577 d1.dn_loss_cls: 0.1533 d1.dn_loss_bbox: 0.1938 d1.dn_loss_iou: 0.2530 d2.dn_loss_cls: 0.1351 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2320 d3.dn_loss_cls: 0.1295 d3.dn_loss_bbox: 0.1638 d3.dn_loss_iou: 0.2245 d4.dn_loss_cls: 0.1248 d4.dn_loss_bbox: 0.1615 d4.dn_loss_iou: 0.2221 d1.loss_lmm_region: 0.1793 loss_lmm_image: 0.8915 2024/11/11 14:58:32 - mmengine - INFO - Iter(train) [ 50500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:20:21 time: 1.9693 
data_time: 0.0171 memory: 33728 grad_norm: 31.2451 loss: 9.4113 loss_cls: 0.2564 loss_bbox: 0.1402 loss_iou: 0.2579 d0.loss_cls: 0.3013 d0.loss_bbox: 0.1502 d0.loss_iou: 0.2723 d1.loss_cls: 0.2711 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2678 d2.loss_cls: 0.2664 d2.loss_bbox: 0.1392 d2.loss_iou: 0.2601 d3.loss_cls: 0.2622 d3.loss_bbox: 0.1358 d3.loss_iou: 0.2576 d4.loss_cls: 0.2599 d4.loss_bbox: 0.1390 d4.loss_iou: 0.2559 enc_loss_cls: 0.2957 enc_loss_bbox: 0.1604 enc_loss_iou: 0.2937 dn_loss_cls: 0.0973 dn_loss_bbox: 0.1768 dn_loss_iou: 0.2318 d0.dn_loss_cls: 0.1811 d0.dn_loss_bbox: 0.3429 d0.dn_loss_iou: 0.3910 d1.dn_loss_cls: 0.1279 d1.dn_loss_bbox: 0.2179 d1.dn_loss_iou: 0.2680 d2.dn_loss_cls: 0.1091 d2.dn_loss_bbox: 0.1870 d2.dn_loss_iou: 0.2422 d3.dn_loss_cls: 0.1017 d3.dn_loss_bbox: 0.1785 d3.dn_loss_iou: 0.2345 d4.dn_loss_cls: 0.0968 d4.dn_loss_bbox: 0.1768 d4.dn_loss_iou: 0.2317 d1.loss_lmm_region: 0.1303 loss_lmm_image: 0.8951 2024/11/11 15:01:50 - mmengine - INFO - Iter(train) [ 50600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:16:55 time: 1.9666 data_time: 0.0170 memory: 34099 grad_norm: 40.2320 loss: 10.4283 loss_cls: 0.3183 loss_bbox: 0.1569 loss_iou: 0.2638 d0.loss_cls: 0.3755 d0.loss_bbox: 0.1657 d0.loss_iou: 0.2778 d1.loss_cls: 0.3341 d1.loss_bbox: 0.1647 d1.loss_iou: 0.2721 d2.loss_cls: 0.3312 d2.loss_bbox: 0.1612 d2.loss_iou: 0.2669 d3.loss_cls: 0.3229 d3.loss_bbox: 0.1564 d3.loss_iou: 0.2622 d4.loss_cls: 0.3168 d4.loss_bbox: 0.1564 d4.loss_iou: 0.2642 enc_loss_cls: 0.3756 enc_loss_bbox: 0.1817 enc_loss_iou: 0.2993 dn_loss_cls: 0.1453 dn_loss_bbox: 0.1810 dn_loss_iou: 0.2346 d0.dn_loss_cls: 0.2403 d0.dn_loss_bbox: 0.3491 d0.dn_loss_iou: 0.3910 d1.dn_loss_cls: 0.1829 d1.dn_loss_bbox: 0.2170 d1.dn_loss_iou: 0.2696 d2.dn_loss_cls: 0.1612 d2.dn_loss_bbox: 0.1907 d2.dn_loss_iou: 0.2445 d3.dn_loss_cls: 0.1542 d3.dn_loss_bbox: 0.1826 d3.dn_loss_iou: 0.2372 d4.dn_loss_cls: 0.1468 d4.dn_loss_bbox: 0.1811 d4.dn_loss_iou: 0.2346 d1.loss_lmm_region: 0.1598 loss_lmm_image: 0.9010 2024/11/11 15:05:11 - mmengine - INFO - Iter(train) [ 50700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:13:36 time: 2.0164 data_time: 0.0170 memory: 32840 grad_norm: 33.6427 loss: 10.0738 loss_cls: 0.3385 loss_bbox: 0.1488 loss_iou: 0.2645 d0.loss_cls: 0.3924 d0.loss_bbox: 0.1619 d0.loss_iou: 0.2782 d1.loss_cls: 0.3628 d1.loss_bbox: 0.1565 d1.loss_iou: 0.2675 d2.loss_cls: 0.3599 d2.loss_bbox: 0.1433 d2.loss_iou: 0.2632 d3.loss_cls: 0.3435 d3.loss_bbox: 0.1455 d3.loss_iou: 0.2615 d4.loss_cls: 0.3407 d4.loss_bbox: 0.1465 d4.loss_iou: 0.2611 enc_loss_cls: 0.3869 enc_loss_bbox: 0.1746 enc_loss_iou: 0.2989 dn_loss_cls: 0.1300 dn_loss_bbox: 0.1538 dn_loss_iou: 0.2177 d0.dn_loss_cls: 0.2082 d0.dn_loss_bbox: 0.2839 d0.dn_loss_iou: 0.3603 d1.dn_loss_cls: 0.1609 d1.dn_loss_bbox: 0.1822 d1.dn_loss_iou: 0.2484 d2.dn_loss_cls: 0.1412 d2.dn_loss_bbox: 0.1638 d2.dn_loss_iou: 0.2270 d3.dn_loss_cls: 0.1347 d3.dn_loss_bbox: 0.1558 d3.dn_loss_iou: 0.2193 d4.dn_loss_cls: 0.1311 d4.dn_loss_bbox: 0.1539 d4.dn_loss_iou: 0.2177 d1.loss_lmm_region: 0.1699 loss_lmm_image: 0.9172 2024/11/11 15:08:29 - mmengine - INFO - Iter(train) [ 50800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:10:12 time: 1.9865 data_time: 0.0171 memory: 34998 grad_norm: 28.3526 loss: 9.2718 loss_cls: 0.3020 loss_bbox: 0.1427 loss_iou: 0.2393 d0.loss_cls: 0.3459 d0.loss_bbox: 0.1566 d0.loss_iou: 0.2529 d1.loss_cls: 0.3249 d1.loss_bbox: 0.1455 d1.loss_iou: 0.2416 d2.loss_cls: 0.3110 d2.loss_bbox: 0.1431 d2.loss_iou: 0.2394 
d3.loss_cls: 0.3037 d3.loss_bbox: 0.1434 d3.loss_iou: 0.2413 d4.loss_cls: 0.3021 d4.loss_bbox: 0.1434 d4.loss_iou: 0.2399 enc_loss_cls: 0.3513 enc_loss_bbox: 0.1717 enc_loss_iou: 0.2728 dn_loss_cls: 0.0988 dn_loss_bbox: 0.1670 dn_loss_iou: 0.1930 d0.dn_loss_cls: 0.1807 d0.dn_loss_bbox: 0.3100 d0.dn_loss_iou: 0.3310 d1.dn_loss_cls: 0.1302 d1.dn_loss_bbox: 0.2014 d1.dn_loss_iou: 0.2271 d2.dn_loss_cls: 0.1092 d2.dn_loss_bbox: 0.1756 d2.dn_loss_iou: 0.2028 d3.dn_loss_cls: 0.1029 d3.dn_loss_bbox: 0.1690 d3.dn_loss_iou: 0.1957 d4.dn_loss_cls: 0.0996 d4.dn_loss_bbox: 0.1671 d4.dn_loss_iou: 0.1930 d1.loss_lmm_region: 0.1392 loss_lmm_image: 0.8641 2024/11/11 15:11:48 - mmengine - INFO - Iter(train) [ 50900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:06:50 time: 1.9898 data_time: 0.0171 memory: 33997 grad_norm: 27.7034 loss: 9.3475 loss_cls: 0.2673 loss_bbox: 0.1381 loss_iou: 0.2434 d0.loss_cls: 0.3208 d0.loss_bbox: 0.1438 d0.loss_iou: 0.2489 d1.loss_cls: 0.2851 d1.loss_bbox: 0.1409 d1.loss_iou: 0.2447 d2.loss_cls: 0.2823 d2.loss_bbox: 0.1362 d2.loss_iou: 0.2415 d3.loss_cls: 0.2734 d3.loss_bbox: 0.1367 d3.loss_iou: 0.2411 d4.loss_cls: 0.2696 d4.loss_bbox: 0.1383 d4.loss_iou: 0.2424 enc_loss_cls: 0.3118 enc_loss_bbox: 0.1625 enc_loss_iou: 0.2750 dn_loss_cls: 0.0936 dn_loss_bbox: 0.1925 dn_loss_iou: 0.2277 d0.dn_loss_cls: 0.1797 d0.dn_loss_bbox: 0.3447 d0.dn_loss_iou: 0.3698 d1.dn_loss_cls: 0.1293 d1.dn_loss_bbox: 0.2296 d1.dn_loss_iou: 0.2594 d2.dn_loss_cls: 0.1097 d2.dn_loss_bbox: 0.2067 d2.dn_loss_iou: 0.2380 d3.dn_loss_cls: 0.1011 d3.dn_loss_bbox: 0.1955 d3.dn_loss_iou: 0.2304 d4.dn_loss_cls: 0.0942 d4.dn_loss_bbox: 0.1924 d4.dn_loss_iou: 0.2276 d1.loss_lmm_region: 0.1391 loss_lmm_image: 0.8426 2024/11/11 15:15:08 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 15:15:08 - mmengine - INFO - Iter(train) [ 51000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:03:30 time: 2.0099 data_time: 0.0170 memory: 34161 grad_norm: 29.7321 loss: 8.9195 loss_cls: 0.2903 loss_bbox: 0.1181 loss_iou: 0.2124 d0.loss_cls: 0.3436 d0.loss_bbox: 0.1242 d0.loss_iou: 0.2232 d1.loss_cls: 0.3104 d1.loss_bbox: 0.1224 d1.loss_iou: 0.2170 d2.loss_cls: 0.2988 d2.loss_bbox: 0.1195 d2.loss_iou: 0.2142 d3.loss_cls: 0.2909 d3.loss_bbox: 0.1177 d3.loss_iou: 0.2120 d4.loss_cls: 0.2928 d4.loss_bbox: 0.1185 d4.loss_iou: 0.2121 enc_loss_cls: 0.3398 enc_loss_bbox: 0.1409 enc_loss_iou: 0.2477 dn_loss_cls: 0.1252 dn_loss_bbox: 0.1500 dn_loss_iou: 0.1925 d0.dn_loss_cls: 0.1998 d0.dn_loss_bbox: 0.2964 d0.dn_loss_iou: 0.3373 d1.dn_loss_cls: 0.1526 d1.dn_loss_bbox: 0.1843 d1.dn_loss_iou: 0.2259 d2.dn_loss_cls: 0.1333 d2.dn_loss_bbox: 0.1617 d2.dn_loss_iou: 0.2028 d3.dn_loss_cls: 0.1282 d3.dn_loss_bbox: 0.1527 d3.dn_loss_iou: 0.1949 d4.dn_loss_cls: 0.1261 d4.dn_loss_bbox: 0.1502 d4.dn_loss_iou: 0.1926 d1.loss_lmm_region: 0.1533 loss_lmm_image: 0.8933 2024/11/11 15:18:28 - mmengine - INFO - Iter(train) [ 51100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 7:00:08 time: 1.9898 data_time: 0.0171 memory: 34080 grad_norm: 32.2083 loss: 10.0933 loss_cls: 0.2920 loss_bbox: 0.1667 loss_iou: 0.3062 d0.loss_cls: 0.3487 d0.loss_bbox: 0.1724 d0.loss_iou: 0.3174 d1.loss_cls: 0.3172 d1.loss_bbox: 0.1665 d1.loss_iou: 0.3070 d2.loss_cls: 0.2977 d2.loss_bbox: 0.1723 d2.loss_iou: 0.3119 d3.loss_cls: 0.2949 d3.loss_bbox: 0.1690 d3.loss_iou: 0.3078 d4.loss_cls: 0.2957 d4.loss_bbox: 0.1646 d4.loss_iou: 0.3052 enc_loss_cls: 0.3461 enc_loss_bbox: 0.1933 enc_loss_iou: 0.3498 dn_loss_cls: 0.0863 
dn_loss_bbox: 0.1711 dn_loss_iou: 0.2475 d0.dn_loss_cls: 0.1659 d0.dn_loss_bbox: 0.3116 d0.dn_loss_iou: 0.3970 d1.dn_loss_cls: 0.1138 d1.dn_loss_bbox: 0.1980 d1.dn_loss_iou: 0.2770 d2.dn_loss_cls: 0.0960 d2.dn_loss_bbox: 0.1791 d2.dn_loss_iou: 0.2561 d3.dn_loss_cls: 0.0899 d3.dn_loss_bbox: 0.1719 d3.dn_loss_iou: 0.2490 d4.dn_loss_cls: 0.0863 d4.dn_loss_bbox: 0.1711 d4.dn_loss_iou: 0.2475 d1.loss_lmm_region: 0.1475 loss_lmm_image: 0.8280 2024/11/11 15:21:46 - mmengine - INFO - Iter(train) [ 51200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:56:45 time: 1.9554 data_time: 0.0171 memory: 34527 grad_norm: 30.5122 loss: 10.2850 loss_cls: 0.3216 loss_bbox: 0.1540 loss_iou: 0.2651 d0.loss_cls: 0.3676 d0.loss_bbox: 0.1624 d0.loss_iou: 0.2845 d1.loss_cls: 0.3410 d1.loss_bbox: 0.1542 d1.loss_iou: 0.2748 d2.loss_cls: 0.3312 d2.loss_bbox: 0.1519 d2.loss_iou: 0.2705 d3.loss_cls: 0.3249 d3.loss_bbox: 0.1531 d3.loss_iou: 0.2677 d4.loss_cls: 0.3256 d4.loss_bbox: 0.1530 d4.loss_iou: 0.2645 enc_loss_cls: 0.3691 enc_loss_bbox: 0.1745 enc_loss_iou: 0.3018 dn_loss_cls: 0.1395 dn_loss_bbox: 0.1704 dn_loss_iou: 0.2371 d0.dn_loss_cls: 0.2216 d0.dn_loss_bbox: 0.3087 d0.dn_loss_iou: 0.3746 d1.dn_loss_cls: 0.1755 d1.dn_loss_bbox: 0.1953 d1.dn_loss_iou: 0.2617 d2.dn_loss_cls: 0.1546 d2.dn_loss_bbox: 0.1767 d2.dn_loss_iou: 0.2438 d3.dn_loss_cls: 0.1475 d3.dn_loss_bbox: 0.1718 d3.dn_loss_iou: 0.2383 d4.dn_loss_cls: 0.1404 d4.dn_loss_bbox: 0.1704 d4.dn_loss_iou: 0.2369 d1.loss_lmm_region: 0.2098 loss_lmm_image: 0.8973 2024/11/11 15:25:05 - mmengine - INFO - Iter(train) [ 51300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:53:22 time: 1.9648 data_time: 0.0172 memory: 35258 grad_norm: 35.4465 loss: 8.6327 loss_cls: 0.2431 loss_bbox: 0.1229 loss_iou: 0.1951 d0.loss_cls: 0.2901 d0.loss_bbox: 0.1247 d0.loss_iou: 0.2032 d1.loss_cls: 0.2632 d1.loss_bbox: 0.1252 d1.loss_iou: 0.2013 d2.loss_cls: 0.2559 d2.loss_bbox: 0.1193 d2.loss_iou: 0.1935 d3.loss_cls: 0.2445 d3.loss_bbox: 0.1228 d3.loss_iou: 0.1945 d4.loss_cls: 0.2415 d4.loss_bbox: 0.1239 d4.loss_iou: 0.1964 enc_loss_cls: 0.3021 enc_loss_bbox: 0.1399 enc_loss_iou: 0.2227 dn_loss_cls: 0.1071 dn_loss_bbox: 0.1856 dn_loss_iou: 0.2037 d0.dn_loss_cls: 0.1904 d0.dn_loss_bbox: 0.3402 d0.dn_loss_iou: 0.3395 d1.dn_loss_cls: 0.1345 d1.dn_loss_bbox: 0.2210 d1.dn_loss_iou: 0.2329 d2.dn_loss_cls: 0.1193 d2.dn_loss_bbox: 0.1985 d2.dn_loss_iou: 0.2143 d3.dn_loss_cls: 0.1116 d3.dn_loss_bbox: 0.1884 d3.dn_loss_iou: 0.2065 d4.dn_loss_cls: 0.1095 d4.dn_loss_bbox: 0.1857 d4.dn_loss_iou: 0.2037 d1.loss_lmm_region: 0.1564 loss_lmm_image: 0.8582 2024/11/11 15:28:25 - mmengine - INFO - Iter(train) [ 51400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:50:01 time: 1.9983 data_time: 0.0173 memory: 34752 grad_norm: 25.2510 loss: 10.4443 loss_cls: 0.3374 loss_bbox: 0.1635 loss_iou: 0.2910 d0.loss_cls: 0.3940 d0.loss_bbox: 0.1818 d0.loss_iou: 0.3058 d1.loss_cls: 0.3680 d1.loss_bbox: 0.1685 d1.loss_iou: 0.2930 d2.loss_cls: 0.3511 d2.loss_bbox: 0.1704 d2.loss_iou: 0.2934 d3.loss_cls: 0.3444 d3.loss_bbox: 0.1684 d3.loss_iou: 0.2939 d4.loss_cls: 0.3391 d4.loss_bbox: 0.1651 d4.loss_iou: 0.2932 enc_loss_cls: 0.3977 enc_loss_bbox: 0.1887 enc_loss_iou: 0.3282 dn_loss_cls: 0.1098 dn_loss_bbox: 0.1832 dn_loss_iou: 0.2288 d0.dn_loss_cls: 0.1855 d0.dn_loss_bbox: 0.3223 d0.dn_loss_iou: 0.3645 d1.dn_loss_cls: 0.1404 d1.dn_loss_bbox: 0.2082 d1.dn_loss_iou: 0.2551 d2.dn_loss_cls: 0.1222 d2.dn_loss_bbox: 0.1913 d2.dn_loss_iou: 0.2368 d3.dn_loss_cls: 0.1140 d3.dn_loss_bbox: 0.1842 
d3.dn_loss_iou: 0.2303 d4.dn_loss_cls: 0.1113 d4.dn_loss_bbox: 0.1832 d4.dn_loss_iou: 0.2286 d1.loss_lmm_region: 0.1413 loss_lmm_image: 0.8667 2024/11/11 15:31:45 - mmengine - INFO - Iter(train) [ 51500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:46:40 time: 1.9931 data_time: 0.0172 memory: 32770 grad_norm: 34.1403 loss: 8.6298 loss_cls: 0.2449 loss_bbox: 0.1149 loss_iou: 0.2090 d0.loss_cls: 0.2896 d0.loss_bbox: 0.1254 d0.loss_iou: 0.2202 d1.loss_cls: 0.2630 d1.loss_bbox: 0.1193 d1.loss_iou: 0.2126 d2.loss_cls: 0.2543 d2.loss_bbox: 0.1178 d2.loss_iou: 0.2111 d3.loss_cls: 0.2529 d3.loss_bbox: 0.1111 d3.loss_iou: 0.2074 d4.loss_cls: 0.2483 d4.loss_bbox: 0.1135 d4.loss_iou: 0.2067 enc_loss_cls: 0.2854 enc_loss_bbox: 0.1367 enc_loss_iou: 0.2417 dn_loss_cls: 0.1284 dn_loss_bbox: 0.1605 dn_loss_iou: 0.2111 d0.dn_loss_cls: 0.1993 d0.dn_loss_bbox: 0.2975 d0.dn_loss_iou: 0.3597 d1.dn_loss_cls: 0.1522 d1.dn_loss_bbox: 0.1897 d1.dn_loss_iou: 0.2435 d2.dn_loss_cls: 0.1335 d2.dn_loss_bbox: 0.1700 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.1275 d3.dn_loss_bbox: 0.1617 d3.dn_loss_iou: 0.2132 d4.dn_loss_cls: 0.1278 d4.dn_loss_bbox: 0.1604 d4.dn_loss_iou: 0.2110 d1.loss_lmm_region: 0.1426 loss_lmm_image: 0.8319 2024/11/11 15:35:03 - mmengine - INFO - Iter(train) [ 51600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:43:16 time: 1.9878 data_time: 0.0173 memory: 34528 grad_norm: 39.8955 loss: 9.7268 loss_cls: 0.3084 loss_bbox: 0.1561 loss_iou: 0.2676 d0.loss_cls: 0.3638 d0.loss_bbox: 0.1642 d0.loss_iou: 0.2852 d1.loss_cls: 0.3284 d1.loss_bbox: 0.1621 d1.loss_iou: 0.2761 d2.loss_cls: 0.3213 d2.loss_bbox: 0.1573 d2.loss_iou: 0.2688 d3.loss_cls: 0.3097 d3.loss_bbox: 0.1565 d3.loss_iou: 0.2696 d4.loss_cls: 0.3088 d4.loss_bbox: 0.1575 d4.loss_iou: 0.2676 enc_loss_cls: 0.3657 enc_loss_bbox: 0.1802 enc_loss_iou: 0.3041 dn_loss_cls: 0.1168 dn_loss_bbox: 0.1546 dn_loss_iou: 0.2060 d0.dn_loss_cls: 0.1930 d0.dn_loss_bbox: 0.2904 d0.dn_loss_iou: 0.3406 d1.dn_loss_cls: 0.1410 d1.dn_loss_bbox: 0.1820 d1.dn_loss_iou: 0.2334 d2.dn_loss_cls: 0.1261 d2.dn_loss_bbox: 0.1619 d2.dn_loss_iou: 0.2146 d3.dn_loss_cls: 0.1184 d3.dn_loss_bbox: 0.1566 d3.dn_loss_iou: 0.2084 d4.dn_loss_cls: 0.1162 d4.dn_loss_bbox: 0.1546 d4.dn_loss_iou: 0.2060 d1.loss_lmm_region: 0.1526 loss_lmm_image: 0.8745 2024/11/11 15:38:22 - mmengine - INFO - Iter(train) [ 51700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:39:54 time: 1.9580 data_time: 0.0171 memory: 34090 grad_norm: 29.9499 loss: 10.0517 loss_cls: 0.3146 loss_bbox: 0.1457 loss_iou: 0.2686 d0.loss_cls: 0.3551 d0.loss_bbox: 0.1647 d0.loss_iou: 0.2937 d1.loss_cls: 0.3249 d1.loss_bbox: 0.1628 d1.loss_iou: 0.2889 d2.loss_cls: 0.3224 d2.loss_bbox: 0.1513 d2.loss_iou: 0.2772 d3.loss_cls: 0.3156 d3.loss_bbox: 0.1496 d3.loss_iou: 0.2734 d4.loss_cls: 0.3169 d4.loss_bbox: 0.1463 d4.loss_iou: 0.2690 enc_loss_cls: 0.3549 enc_loss_bbox: 0.1799 enc_loss_iou: 0.3177 dn_loss_cls: 0.1306 dn_loss_bbox: 0.1619 dn_loss_iou: 0.2288 d0.dn_loss_cls: 0.2109 d0.dn_loss_bbox: 0.3200 d0.dn_loss_iou: 0.3791 d1.dn_loss_cls: 0.1627 d1.dn_loss_bbox: 0.1964 d1.dn_loss_iou: 0.2598 d2.dn_loss_cls: 0.1434 d2.dn_loss_bbox: 0.1732 d2.dn_loss_iou: 0.2388 d3.dn_loss_cls: 0.1356 d3.dn_loss_bbox: 0.1640 d3.dn_loss_iou: 0.2318 d4.dn_loss_cls: 0.1317 d4.dn_loss_bbox: 0.1619 d4.dn_loss_iou: 0.2287 d1.loss_lmm_region: 0.1432 loss_lmm_image: 0.8557 2024/11/11 15:41:43 - mmengine - INFO - Iter(train) [ 51800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:36:35 time: 2.0070 data_time: 0.0172 
memory: 33606 grad_norm: nan loss: 11.1989 loss_cls: 0.3704 loss_bbox: 0.1655 loss_iou: 0.2726 d0.loss_cls: 0.4356 d0.loss_bbox: 0.1667 d0.loss_iou: 0.2765 d1.loss_cls: 0.4032 d1.loss_bbox: 0.1613 d1.loss_iou: 0.2715 d2.loss_cls: 0.3975 d2.loss_bbox: 0.1622 d2.loss_iou: 0.2701 d3.loss_cls: 0.3812 d3.loss_bbox: 0.1633 d3.loss_iou: 0.2681 d4.loss_cls: 0.3740 d4.loss_bbox: 0.1662 d4.loss_iou: 0.2721 enc_loss_cls: 0.4346 enc_loss_bbox: 0.1814 enc_loss_iou: 0.3004 dn_loss_cls: 0.1791 dn_loss_bbox: 0.2046 dn_loss_iou: 0.2381 d0.dn_loss_cls: 0.2606 d0.dn_loss_bbox: 0.3690 d0.dn_loss_iou: 0.3903 d1.dn_loss_cls: 0.2005 d1.dn_loss_bbox: 0.2408 d1.dn_loss_iou: 0.2710 d2.dn_loss_cls: 0.1905 d2.dn_loss_bbox: 0.2157 d2.dn_loss_iou: 0.2475 d3.dn_loss_cls: 0.1844 d3.dn_loss_bbox: 0.2068 d3.dn_loss_iou: 0.2409 d4.dn_loss_cls: 0.1871 d4.dn_loss_bbox: 0.2045 d4.dn_loss_iou: 0.2381 d1.loss_lmm_region: 0.1648 loss_lmm_image: 0.8701 2024/11/11 15:45:02 - mmengine - INFO - Iter(train) [ 51900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:33:12 time: 1.9961 data_time: 0.0173 memory: 33991 grad_norm: 33.8293 loss: 7.8163 loss_cls: 0.2289 loss_bbox: 0.0991 loss_iou: 0.1623 d0.loss_cls: 0.2722 d0.loss_bbox: 0.1151 d0.loss_iou: 0.1776 d1.loss_cls: 0.2506 d1.loss_bbox: 0.1055 d1.loss_iou: 0.1656 d2.loss_cls: 0.2395 d2.loss_bbox: 0.1026 d2.loss_iou: 0.1660 d3.loss_cls: 0.2339 d3.loss_bbox: 0.0980 d3.loss_iou: 0.1604 d4.loss_cls: 0.2293 d4.loss_bbox: 0.0991 d4.loss_iou: 0.1618 enc_loss_cls: 0.2699 enc_loss_bbox: 0.1330 enc_loss_iou: 0.2026 dn_loss_cls: 0.1125 dn_loss_bbox: 0.1531 dn_loss_iou: 0.1827 d0.dn_loss_cls: 0.1755 d0.dn_loss_bbox: 0.2900 d0.dn_loss_iou: 0.3106 d1.dn_loss_cls: 0.1328 d1.dn_loss_bbox: 0.1824 d1.dn_loss_iou: 0.2102 d2.dn_loss_cls: 0.1190 d2.dn_loss_bbox: 0.1609 d2.dn_loss_iou: 0.1901 d3.dn_loss_cls: 0.1143 d3.dn_loss_bbox: 0.1547 d3.dn_loss_iou: 0.1847 d4.dn_loss_cls: 0.1124 d4.dn_loss_bbox: 0.1531 d4.dn_loss_iou: 0.1826 d1.loss_lmm_region: 0.1454 loss_lmm_image: 0.8759 2024/11/11 15:48:21 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 15:48:21 - mmengine - INFO - Iter(train) [ 52000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:29:49 time: 1.9824 data_time: 0.0170 memory: 35237 grad_norm: 28.7103 loss: 9.4132 loss_cls: 0.2908 loss_bbox: 0.1412 loss_iou: 0.2358 d0.loss_cls: 0.3369 d0.loss_bbox: 0.1585 d0.loss_iou: 0.2561 d1.loss_cls: 0.3134 d1.loss_bbox: 0.1465 d1.loss_iou: 0.2408 d2.loss_cls: 0.3049 d2.loss_bbox: 0.1415 d2.loss_iou: 0.2339 d3.loss_cls: 0.2967 d3.loss_bbox: 0.1431 d3.loss_iou: 0.2365 d4.loss_cls: 0.2899 d4.loss_bbox: 0.1425 d4.loss_iou: 0.2372 enc_loss_cls: 0.3456 enc_loss_bbox: 0.1693 enc_loss_iou: 0.2716 dn_loss_cls: 0.0973 dn_loss_bbox: 0.1719 dn_loss_iou: 0.2145 d0.dn_loss_cls: 0.1882 d0.dn_loss_bbox: 0.3238 d0.dn_loss_iou: 0.3627 d1.dn_loss_cls: 0.1323 d1.dn_loss_bbox: 0.2071 d1.dn_loss_iou: 0.2479 d2.dn_loss_cls: 0.1111 d2.dn_loss_bbox: 0.1841 d2.dn_loss_iou: 0.2245 d3.dn_loss_cls: 0.1050 d3.dn_loss_bbox: 0.1742 d3.dn_loss_iou: 0.2168 d4.dn_loss_cls: 0.0992 d4.dn_loss_bbox: 0.1718 d4.dn_loss_iou: 0.2145 d1.loss_lmm_region: 0.1572 loss_lmm_image: 0.8765 2024/11/11 15:51:40 - mmengine - INFO - Iter(train) [ 52100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:26:27 time: 1.9993 data_time: 0.0171 memory: 33419 grad_norm: 29.6974 loss: 9.7010 loss_cls: 0.2718 loss_bbox: 0.1641 loss_iou: 0.2603 d0.loss_cls: 0.3141 d0.loss_bbox: 0.1733 d0.loss_iou: 0.2703 d1.loss_cls: 0.2923 d1.loss_bbox: 0.1648 d1.loss_iou: 0.2609 
d2.loss_cls: 0.2771 d2.loss_bbox: 0.1678 d2.loss_iou: 0.2647 d3.loss_cls: 0.2724 d3.loss_bbox: 0.1672 d3.loss_iou: 0.2611 d4.loss_cls: 0.2719 d4.loss_bbox: 0.1634 d4.loss_iou: 0.2593 enc_loss_cls: 0.3138 enc_loss_bbox: 0.1920 enc_loss_iou: 0.2949 dn_loss_cls: 0.1042 dn_loss_bbox: 0.1831 dn_loss_iou: 0.2257 d0.dn_loss_cls: 0.1757 d0.dn_loss_bbox: 0.3386 d0.dn_loss_iou: 0.3676 d1.dn_loss_cls: 0.1349 d1.dn_loss_bbox: 0.2176 d1.dn_loss_iou: 0.2574 d2.dn_loss_cls: 0.1162 d2.dn_loss_bbox: 0.1928 d2.dn_loss_iou: 0.2349 d3.dn_loss_cls: 0.1094 d3.dn_loss_bbox: 0.1855 d3.dn_loss_iou: 0.2284 d4.dn_loss_cls: 0.1046 d4.dn_loss_bbox: 0.1833 d4.dn_loss_iou: 0.2258 d1.loss_lmm_region: 0.1517 loss_lmm_image: 0.8862 2024/11/11 15:54:58 - mmengine - INFO - Iter(train) [ 52200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:23:03 time: 1.9696 data_time: 0.0171 memory: 34875 grad_norm: 39.2759 loss: 8.6492 loss_cls: 0.2860 loss_bbox: 0.1122 loss_iou: 0.2170 d0.loss_cls: 0.3205 d0.loss_bbox: 0.1234 d0.loss_iou: 0.2355 d1.loss_cls: 0.2932 d1.loss_bbox: 0.1178 d1.loss_iou: 0.2266 d2.loss_cls: 0.2917 d2.loss_bbox: 0.1130 d2.loss_iou: 0.2190 d3.loss_cls: 0.2891 d3.loss_bbox: 0.1113 d3.loss_iou: 0.2168 d4.loss_cls: 0.2870 d4.loss_bbox: 0.1105 d4.loss_iou: 0.2168 enc_loss_cls: 0.3177 enc_loss_bbox: 0.1451 enc_loss_iou: 0.2590 dn_loss_cls: 0.1103 dn_loss_bbox: 0.1381 dn_loss_iou: 0.1876 d0.dn_loss_cls: 0.1721 d0.dn_loss_bbox: 0.2800 d0.dn_loss_iou: 0.3229 d1.dn_loss_cls: 0.1318 d1.dn_loss_bbox: 0.1725 d1.dn_loss_iou: 0.2190 d2.dn_loss_cls: 0.1181 d2.dn_loss_bbox: 0.1502 d2.dn_loss_iou: 0.1982 d3.dn_loss_cls: 0.1181 d3.dn_loss_bbox: 0.1408 d3.dn_loss_iou: 0.1902 d4.dn_loss_cls: 0.1133 d4.dn_loss_bbox: 0.1382 d4.dn_loss_iou: 0.1875 d1.loss_lmm_region: 0.1389 loss_lmm_image: 0.9123 2024/11/11 15:58:17 - mmengine - INFO - Iter(train) [ 52300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:19:40 time: 2.0083 data_time: 0.0172 memory: 34661 grad_norm: 34.7827 loss: 9.0280 loss_cls: 0.3038 loss_bbox: 0.1139 loss_iou: 0.2211 d0.loss_cls: 0.3499 d0.loss_bbox: 0.1294 d0.loss_iou: 0.2395 d1.loss_cls: 0.3197 d1.loss_bbox: 0.1219 d1.loss_iou: 0.2326 d2.loss_cls: 0.3167 d2.loss_bbox: 0.1130 d2.loss_iou: 0.2231 d3.loss_cls: 0.3124 d3.loss_bbox: 0.1109 d3.loss_iou: 0.2201 d4.loss_cls: 0.3073 d4.loss_bbox: 0.1129 d4.loss_iou: 0.2189 enc_loss_cls: 0.3438 enc_loss_bbox: 0.1410 enc_loss_iou: 0.2635 dn_loss_cls: 0.1136 dn_loss_bbox: 0.1498 dn_loss_iou: 0.2030 d0.dn_loss_cls: 0.1951 d0.dn_loss_bbox: 0.2896 d0.dn_loss_iou: 0.3433 d1.dn_loss_cls: 0.1443 d1.dn_loss_bbox: 0.1765 d1.dn_loss_iou: 0.2329 d2.dn_loss_cls: 0.1296 d2.dn_loss_bbox: 0.1580 d2.dn_loss_iou: 0.2122 d3.dn_loss_cls: 0.1221 d3.dn_loss_bbox: 0.1520 d3.dn_loss_iou: 0.2058 d4.dn_loss_cls: 0.1172 d4.dn_loss_bbox: 0.1499 d4.dn_loss_iou: 0.2030 d1.loss_lmm_region: 0.1504 loss_lmm_image: 0.8643 2024/11/11 16:01:34 - mmengine - INFO - Iter(train) [ 52400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:16:14 time: 1.9678 data_time: 0.0171 memory: 34089 grad_norm: 26.7896 loss: 9.1321 loss_cls: 0.2999 loss_bbox: 0.1242 loss_iou: 0.2168 d0.loss_cls: 0.3220 d0.loss_bbox: 0.1436 d0.loss_iou: 0.2408 d1.loss_cls: 0.3029 d1.loss_bbox: 0.1322 d1.loss_iou: 0.2278 d2.loss_cls: 0.3043 d2.loss_bbox: 0.1220 d2.loss_iou: 0.2158 d3.loss_cls: 0.2965 d3.loss_bbox: 0.1281 d3.loss_iou: 0.2193 d4.loss_cls: 0.2978 d4.loss_bbox: 0.1242 d4.loss_iou: 0.2166 enc_loss_cls: 0.3329 enc_loss_bbox: 0.1516 enc_loss_iou: 0.2626 dn_loss_cls: 0.1003 dn_loss_bbox: 0.1641 dn_loss_iou: 
0.2121 d0.dn_loss_cls: 0.1751 d0.dn_loss_bbox: 0.3247 d0.dn_loss_iou: 0.3603 d1.dn_loss_cls: 0.1279 d1.dn_loss_bbox: 0.1984 d1.dn_loss_iou: 0.2446 d2.dn_loss_cls: 0.1121 d2.dn_loss_bbox: 0.1751 d2.dn_loss_iou: 0.2223 d3.dn_loss_cls: 0.1080 d3.dn_loss_bbox: 0.1661 d3.dn_loss_iou: 0.2147 d4.dn_loss_cls: 0.1014 d4.dn_loss_bbox: 0.1640 d4.dn_loss_iou: 0.2121 d1.loss_lmm_region: 0.1542 loss_lmm_image: 0.9126 2024/11/11 16:04:51 - mmengine - INFO - Iter(train) [ 52500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:12:49 time: 2.0076 data_time: 0.0172 memory: 33947 grad_norm: 33.8791 loss: 10.4273 loss_cls: 0.3212 loss_bbox: 0.1670 loss_iou: 0.3450 d0.loss_cls: 0.3772 d0.loss_bbox: 0.1761 d0.loss_iou: 0.3562 d1.loss_cls: 0.3525 d1.loss_bbox: 0.1689 d1.loss_iou: 0.3387 d2.loss_cls: 0.3389 d2.loss_bbox: 0.1699 d2.loss_iou: 0.3429 d3.loss_cls: 0.3266 d3.loss_bbox: 0.1683 d3.loss_iou: 0.3444 d4.loss_cls: 0.3226 d4.loss_bbox: 0.1665 d4.loss_iou: 0.3436 enc_loss_cls: 0.3775 enc_loss_bbox: 0.1906 enc_loss_iou: 0.3814 dn_loss_cls: 0.0846 dn_loss_bbox: 0.1489 dn_loss_iou: 0.2413 d0.dn_loss_cls: 0.1643 d0.dn_loss_bbox: 0.2912 d0.dn_loss_iou: 0.3881 d1.dn_loss_cls: 0.1148 d1.dn_loss_bbox: 0.1784 d1.dn_loss_iou: 0.2719 d2.dn_loss_cls: 0.0976 d2.dn_loss_bbox: 0.1598 d2.dn_loss_iou: 0.2512 d3.dn_loss_cls: 0.0888 d3.dn_loss_bbox: 0.1514 d3.dn_loss_iou: 0.2437 d4.dn_loss_cls: 0.0843 d4.dn_loss_bbox: 0.1488 d4.dn_loss_iou: 0.2412 d1.loss_lmm_region: 0.1348 loss_lmm_image: 0.8661 2024/11/11 16:08:12 - mmengine - INFO - Iter(train) [ 52600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:09:29 time: 1.9920 data_time: 0.0171 memory: 33379 grad_norm: 29.5105 loss: 9.9035 loss_cls: 0.3215 loss_bbox: 0.1431 loss_iou: 0.2847 d0.loss_cls: 0.3758 d0.loss_bbox: 0.1522 d0.loss_iou: 0.3039 d1.loss_cls: 0.3466 d1.loss_bbox: 0.1464 d1.loss_iou: 0.2937 d2.loss_cls: 0.3347 d2.loss_bbox: 0.1412 d2.loss_iou: 0.2878 d3.loss_cls: 0.3301 d3.loss_bbox: 0.1411 d3.loss_iou: 0.2843 d4.loss_cls: 0.3252 d4.loss_bbox: 0.1430 d4.loss_iou: 0.2849 enc_loss_cls: 0.3760 enc_loss_bbox: 0.1686 enc_loss_iou: 0.3281 dn_loss_cls: 0.1240 dn_loss_bbox: 0.1425 dn_loss_iou: 0.2166 d0.dn_loss_cls: 0.1906 d0.dn_loss_bbox: 0.2735 d0.dn_loss_iou: 0.3524 d1.dn_loss_cls: 0.1481 d1.dn_loss_bbox: 0.1751 d1.dn_loss_iou: 0.2474 d2.dn_loss_cls: 0.1300 d2.dn_loss_bbox: 0.1542 d2.dn_loss_iou: 0.2263 d3.dn_loss_cls: 0.1267 d3.dn_loss_bbox: 0.1450 d3.dn_loss_iou: 0.2184 d4.dn_loss_cls: 0.1233 d4.dn_loss_bbox: 0.1425 d4.dn_loss_iou: 0.2165 d1.loss_lmm_region: 0.1363 loss_lmm_image: 0.9011 2024/11/11 16:11:30 - mmengine - INFO - Iter(train) [ 52700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:06:05 time: 1.9900 data_time: 0.0172 memory: 33981 grad_norm: 26.9277 loss: 9.4793 loss_cls: 0.3187 loss_bbox: 0.1422 loss_iou: 0.2738 d0.loss_cls: 0.3666 d0.loss_bbox: 0.1591 d0.loss_iou: 0.2890 d1.loss_cls: 0.3384 d1.loss_bbox: 0.1511 d1.loss_iou: 0.2843 d2.loss_cls: 0.3293 d2.loss_bbox: 0.1478 d2.loss_iou: 0.2802 d3.loss_cls: 0.3281 d3.loss_bbox: 0.1395 d3.loss_iou: 0.2708 d4.loss_cls: 0.3210 d4.loss_bbox: 0.1421 d4.loss_iou: 0.2724 enc_loss_cls: 0.3666 enc_loss_bbox: 0.1708 enc_loss_iou: 0.3169 dn_loss_cls: 0.1041 dn_loss_bbox: 0.1444 dn_loss_iou: 0.1886 d0.dn_loss_cls: 0.1775 d0.dn_loss_bbox: 0.2740 d0.dn_loss_iou: 0.3207 d1.dn_loss_cls: 0.1349 d1.dn_loss_bbox: 0.1721 d1.dn_loss_iou: 0.2185 d2.dn_loss_cls: 0.1156 d2.dn_loss_bbox: 0.1519 d2.dn_loss_iou: 0.1971 d3.dn_loss_cls: 0.1094 d3.dn_loss_bbox: 0.1470 d3.dn_loss_iou: 0.1910 d4.dn_loss_cls: 
0.1053 d4.dn_loss_bbox: 0.1444 d4.dn_loss_iou: 0.1886 d1.loss_lmm_region: 0.1346 loss_lmm_image: 0.8512 2024/11/11 16:14:50 - mmengine - INFO - Iter(train) [ 52800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 6:02:45 time: 1.9944 data_time: 0.0174 memory: 35039 grad_norm: 26.9287 loss: 9.4805 loss_cls: 0.2957 loss_bbox: 0.1386 loss_iou: 0.2539 d0.loss_cls: 0.3416 d0.loss_bbox: 0.1508 d0.loss_iou: 0.2698 d1.loss_cls: 0.3188 d1.loss_bbox: 0.1412 d1.loss_iou: 0.2621 d2.loss_cls: 0.3097 d2.loss_bbox: 0.1376 d2.loss_iou: 0.2561 d3.loss_cls: 0.3063 d3.loss_bbox: 0.1330 d3.loss_iou: 0.2541 d4.loss_cls: 0.2956 d4.loss_bbox: 0.1403 d4.loss_iou: 0.2554 enc_loss_cls: 0.3519 enc_loss_bbox: 0.1539 enc_loss_iou: 0.2879 dn_loss_cls: 0.0978 dn_loss_bbox: 0.1674 dn_loss_iou: 0.2257 d0.dn_loss_cls: 0.1776 d0.dn_loss_bbox: 0.3145 d0.dn_loss_iou: 0.3663 d1.dn_loss_cls: 0.1327 d1.dn_loss_bbox: 0.2013 d1.dn_loss_iou: 0.2553 d2.dn_loss_cls: 0.1117 d2.dn_loss_bbox: 0.1765 d2.dn_loss_iou: 0.2343 d3.dn_loss_cls: 0.1033 d3.dn_loss_bbox: 0.1694 d3.dn_loss_iou: 0.2279 d4.dn_loss_cls: 0.0975 d4.dn_loss_bbox: 0.1674 d4.dn_loss_iou: 0.2256 d1.loss_lmm_region: 0.1374 loss_lmm_image: 0.8365 2024/11/11 16:18:08 - mmengine - INFO - Iter(train) [ 52900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:59:20 time: 2.0023 data_time: 0.0172 memory: 33378 grad_norm: 25.4433 loss: 10.3089 loss_cls: 0.3323 loss_bbox: 0.1588 loss_iou: 0.2947 d0.loss_cls: 0.3908 d0.loss_bbox: 0.1654 d0.loss_iou: 0.3065 d1.loss_cls: 0.3495 d1.loss_bbox: 0.1651 d1.loss_iou: 0.3048 d2.loss_cls: 0.3520 d2.loss_bbox: 0.1530 d2.loss_iou: 0.2911 d3.loss_cls: 0.3418 d3.loss_bbox: 0.1528 d3.loss_iou: 0.2891 d4.loss_cls: 0.3337 d4.loss_bbox: 0.1574 d4.loss_iou: 0.2935 enc_loss_cls: 0.3831 enc_loss_bbox: 0.1773 enc_loss_iou: 0.3334 dn_loss_cls: 0.1388 dn_loss_bbox: 0.1643 dn_loss_iou: 0.2159 d0.dn_loss_cls: 0.2060 d0.dn_loss_bbox: 0.2996 d0.dn_loss_iou: 0.3530 d1.dn_loss_cls: 0.1689 d1.dn_loss_bbox: 0.1909 d1.dn_loss_iou: 0.2434 d2.dn_loss_cls: 0.1494 d2.dn_loss_bbox: 0.1721 d2.dn_loss_iou: 0.2237 d3.dn_loss_cls: 0.1427 d3.dn_loss_bbox: 0.1661 d3.dn_loss_iou: 0.2175 d4.dn_loss_cls: 0.1401 d4.dn_loss_bbox: 0.1643 d4.dn_loss_iou: 0.2157 d1.loss_lmm_region: 0.1481 loss_lmm_image: 0.8622 2024/11/11 16:21:28 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 16:21:28 - mmengine - INFO - Iter(train) [ 53000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:55:59 time: 2.0008 data_time: 0.0172 memory: 33731 grad_norm: 27.6823 loss: 9.5505 loss_cls: 0.3001 loss_bbox: 0.1545 loss_iou: 0.2349 d0.loss_cls: 0.3521 d0.loss_bbox: 0.1535 d0.loss_iou: 0.2468 d1.loss_cls: 0.3197 d1.loss_bbox: 0.1574 d1.loss_iou: 0.2415 d2.loss_cls: 0.3163 d2.loss_bbox: 0.1489 d2.loss_iou: 0.2332 d3.loss_cls: 0.3123 d3.loss_bbox: 0.1468 d3.loss_iou: 0.2324 d4.loss_cls: 0.3036 d4.loss_bbox: 0.1536 d4.loss_iou: 0.2332 enc_loss_cls: 0.3483 enc_loss_bbox: 0.1754 enc_loss_iou: 0.2706 dn_loss_cls: 0.1190 dn_loss_bbox: 0.1614 dn_loss_iou: 0.2072 d0.dn_loss_cls: 0.2015 d0.dn_loss_bbox: 0.3037 d0.dn_loss_iou: 0.3391 d1.dn_loss_cls: 0.1548 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2370 d2.dn_loss_cls: 0.1349 d2.dn_loss_bbox: 0.1726 d2.dn_loss_iou: 0.2170 d3.dn_loss_cls: 0.1258 d3.dn_loss_bbox: 0.1637 d3.dn_loss_iou: 0.2102 d4.dn_loss_cls: 0.1205 d4.dn_loss_bbox: 0.1613 d4.dn_loss_iou: 0.2070 d1.loss_lmm_region: 0.1730 loss_lmm_image: 0.9084 2024/11/11 16:24:46 - mmengine - INFO - Iter(train) [ 53100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 
days, 5:52:36 time: 1.9847 data_time: 0.0172 memory: 33756 grad_norm: 27.3643 loss: 9.2666 loss_cls: 0.3052 loss_bbox: 0.1244 loss_iou: 0.2234 d0.loss_cls: 0.3406 d0.loss_bbox: 0.1310 d0.loss_iou: 0.2355 d1.loss_cls: 0.3163 d1.loss_bbox: 0.1275 d1.loss_iou: 0.2319 d2.loss_cls: 0.3143 d2.loss_bbox: 0.1242 d2.loss_iou: 0.2242 d3.loss_cls: 0.3088 d3.loss_bbox: 0.1248 d3.loss_iou: 0.2222 d4.loss_cls: 0.3067 d4.loss_bbox: 0.1239 d4.loss_iou: 0.2223 enc_loss_cls: 0.3450 enc_loss_bbox: 0.1478 enc_loss_iou: 0.2664 dn_loss_cls: 0.1257 dn_loss_bbox: 0.1674 dn_loss_iou: 0.2073 d0.dn_loss_cls: 0.1962 d0.dn_loss_bbox: 0.3202 d0.dn_loss_iou: 0.3502 d1.dn_loss_cls: 0.1473 d1.dn_loss_bbox: 0.2015 d1.dn_loss_iou: 0.2390 d2.dn_loss_cls: 0.1345 d2.dn_loss_bbox: 0.1808 d2.dn_loss_iou: 0.2199 d3.dn_loss_cls: 0.1279 d3.dn_loss_bbox: 0.1694 d3.dn_loss_iou: 0.2102 d4.dn_loss_cls: 0.1253 d4.dn_loss_bbox: 0.1674 d4.dn_loss_iou: 0.2073 d1.loss_lmm_region: 0.1537 loss_lmm_image: 0.8489 2024/11/11 16:28:06 - mmengine - INFO - Iter(train) [ 53200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:49:16 time: 1.9838 data_time: 0.0172 memory: 33541 grad_norm: 28.3043 loss: 9.9956 loss_cls: 0.3232 loss_bbox: 0.1406 loss_iou: 0.2609 d0.loss_cls: 0.3788 d0.loss_bbox: 0.1550 d0.loss_iou: 0.2770 d1.loss_cls: 0.3447 d1.loss_bbox: 0.1495 d1.loss_iou: 0.2687 d2.loss_cls: 0.3362 d2.loss_bbox: 0.1422 d2.loss_iou: 0.2646 d3.loss_cls: 0.3279 d3.loss_bbox: 0.1374 d3.loss_iou: 0.2592 d4.loss_cls: 0.3240 d4.loss_bbox: 0.1396 d4.loss_iou: 0.2598 enc_loss_cls: 0.3759 enc_loss_bbox: 0.1682 enc_loss_iou: 0.2952 dn_loss_cls: 0.1198 dn_loss_bbox: 0.1660 dn_loss_iou: 0.2309 d0.dn_loss_cls: 0.2053 d0.dn_loss_bbox: 0.3206 d0.dn_loss_iou: 0.3775 d1.dn_loss_cls: 0.1521 d1.dn_loss_bbox: 0.2039 d1.dn_loss_iou: 0.2655 d2.dn_loss_cls: 0.1341 d2.dn_loss_bbox: 0.1810 d2.dn_loss_iou: 0.2420 d3.dn_loss_cls: 0.1233 d3.dn_loss_bbox: 0.1691 d3.dn_loss_iou: 0.2336 d4.dn_loss_cls: 0.1202 d4.dn_loss_bbox: 0.1660 d4.dn_loss_iou: 0.2307 d1.loss_lmm_region: 0.1614 loss_lmm_image: 0.8639 2024/11/11 16:31:24 - mmengine - INFO - Iter(train) [ 53300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:45:51 time: 1.9504 data_time: 0.0173 memory: 32659 grad_norm: 30.5130 loss: 8.5340 loss_cls: 0.2926 loss_bbox: 0.1111 loss_iou: 0.2080 d0.loss_cls: 0.3332 d0.loss_bbox: 0.1233 d0.loss_iou: 0.2217 d1.loss_cls: 0.3124 d1.loss_bbox: 0.1133 d1.loss_iou: 0.2129 d2.loss_cls: 0.3021 d2.loss_bbox: 0.1113 d2.loss_iou: 0.2078 d3.loss_cls: 0.2938 d3.loss_bbox: 0.1093 d3.loss_iou: 0.2069 d4.loss_cls: 0.2929 d4.loss_bbox: 0.1105 d4.loss_iou: 0.2074 enc_loss_cls: 0.3361 enc_loss_bbox: 0.1349 enc_loss_iou: 0.2413 dn_loss_cls: 0.0900 dn_loss_bbox: 0.1439 dn_loss_iou: 0.1790 d0.dn_loss_cls: 0.1612 d0.dn_loss_bbox: 0.2928 d0.dn_loss_iou: 0.3141 d1.dn_loss_cls: 0.1166 d1.dn_loss_bbox: 0.1824 d1.dn_loss_iou: 0.2117 d2.dn_loss_cls: 0.1055 d2.dn_loss_bbox: 0.1543 d2.dn_loss_iou: 0.1886 d3.dn_loss_cls: 0.0946 d3.dn_loss_bbox: 0.1463 d3.dn_loss_iou: 0.1811 d4.dn_loss_cls: 0.0898 d4.dn_loss_bbox: 0.1438 d4.dn_loss_iou: 0.1789 d1.loss_lmm_region: 0.1436 loss_lmm_image: 0.9328 2024/11/11 16:34:42 - mmengine - INFO - Iter(train) [ 53400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:42:27 time: 1.9900 data_time: 0.0173 memory: 34085 grad_norm: 27.4507 loss: 10.6365 loss_cls: 0.3348 loss_bbox: 0.1676 loss_iou: 0.2768 d0.loss_cls: 0.3837 d0.loss_bbox: 0.1791 d0.loss_iou: 0.2913 d1.loss_cls: 0.3529 d1.loss_bbox: 0.1741 d1.loss_iou: 0.2855 d2.loss_cls: 0.3525 d2.loss_bbox: 
0.1674 d2.loss_iou: 0.2780 d3.loss_cls: 0.3432 d3.loss_bbox: 0.1656 d3.loss_iou: 0.2750 d4.loss_cls: 0.3353 d4.loss_bbox: 0.1678 d4.loss_iou: 0.2777 enc_loss_cls: 0.3932 enc_loss_bbox: 0.1936 enc_loss_iou: 0.3115 dn_loss_cls: 0.1319 dn_loss_bbox: 0.2032 dn_loss_iou: 0.2355 d0.dn_loss_cls: 0.2099 d0.dn_loss_bbox: 0.3451 d0.dn_loss_iou: 0.3763 d1.dn_loss_cls: 0.1607 d1.dn_loss_bbox: 0.2373 d1.dn_loss_iou: 0.2654 d2.dn_loss_cls: 0.1429 d2.dn_loss_bbox: 0.2138 d2.dn_loss_iou: 0.2438 d3.dn_loss_cls: 0.1371 d3.dn_loss_bbox: 0.2049 d3.dn_loss_iou: 0.2372 d4.dn_loss_cls: 0.1335 d4.dn_loss_bbox: 0.2033 d4.dn_loss_iou: 0.2355 d1.loss_lmm_region: 0.1619 loss_lmm_image: 0.8506 2024/11/11 16:38:01 - mmengine - INFO - Iter(train) [ 53500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:39:06 time: 1.9813 data_time: 0.0174 memory: 34286 grad_norm: 28.3179 loss: 9.0629 loss_cls: 0.2945 loss_bbox: 0.1234 loss_iou: 0.2113 d0.loss_cls: 0.3242 d0.loss_bbox: 0.1414 d0.loss_iou: 0.2325 d1.loss_cls: 0.3075 d1.loss_bbox: 0.1354 d1.loss_iou: 0.2248 d2.loss_cls: 0.3054 d2.loss_bbox: 0.1210 d2.loss_iou: 0.2124 d3.loss_cls: 0.2957 d3.loss_bbox: 0.1238 d3.loss_iou: 0.2126 d4.loss_cls: 0.2976 d4.loss_bbox: 0.1233 d4.loss_iou: 0.2106 enc_loss_cls: 0.3264 enc_loss_bbox: 0.1538 enc_loss_iou: 0.2520 dn_loss_cls: 0.1218 dn_loss_bbox: 0.1575 dn_loss_iou: 0.2074 d0.dn_loss_cls: 0.2040 d0.dn_loss_bbox: 0.3020 d0.dn_loss_iou: 0.3493 d1.dn_loss_cls: 0.1563 d1.dn_loss_bbox: 0.1913 d1.dn_loss_iou: 0.2391 d2.dn_loss_cls: 0.1365 d2.dn_loss_bbox: 0.1677 d2.dn_loss_iou: 0.2179 d3.dn_loss_cls: 0.1311 d3.dn_loss_bbox: 0.1599 d3.dn_loss_iou: 0.2097 d4.dn_loss_cls: 0.1249 d4.dn_loss_bbox: 0.1576 d4.dn_loss_iou: 0.2073 d1.loss_lmm_region: 0.1699 loss_lmm_image: 0.8222 2024/11/11 16:41:22 - mmengine - INFO - Iter(train) [ 53600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:35:47 time: 2.0020 data_time: 0.0173 memory: 33741 grad_norm: 34.5845 loss: 10.2355 loss_cls: 0.3086 loss_bbox: 0.1578 loss_iou: 0.2897 d0.loss_cls: 0.3502 d0.loss_bbox: 0.1658 d0.loss_iou: 0.3036 d1.loss_cls: 0.3264 d1.loss_bbox: 0.1590 d1.loss_iou: 0.2895 d2.loss_cls: 0.3187 d2.loss_bbox: 0.1556 d2.loss_iou: 0.2869 d3.loss_cls: 0.3075 d3.loss_bbox: 0.1574 d3.loss_iou: 0.2897 d4.loss_cls: 0.3091 d4.loss_bbox: 0.1587 d4.loss_iou: 0.2895 enc_loss_cls: 0.3541 enc_loss_bbox: 0.1817 enc_loss_iou: 0.3288 dn_loss_cls: 0.1125 dn_loss_bbox: 0.1752 dn_loss_iou: 0.2403 d0.dn_loss_cls: 0.1895 d0.dn_loss_bbox: 0.3258 d0.dn_loss_iou: 0.3911 d1.dn_loss_cls: 0.1446 d1.dn_loss_bbox: 0.2098 d1.dn_loss_iou: 0.2734 d2.dn_loss_cls: 0.1237 d2.dn_loss_bbox: 0.1839 d2.dn_loss_iou: 0.2489 d3.dn_loss_cls: 0.1166 d3.dn_loss_bbox: 0.1778 d3.dn_loss_iou: 0.2432 d4.dn_loss_cls: 0.1145 d4.dn_loss_bbox: 0.1751 d4.dn_loss_iou: 0.2402 d1.loss_lmm_region: 0.1638 loss_lmm_image: 0.8975 2024/11/11 16:44:41 - mmengine - INFO - Iter(train) [ 53700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:32:25 time: 2.0028 data_time: 0.0174 memory: 34346 grad_norm: 47.6259 loss: 9.0731 loss_cls: 0.2713 loss_bbox: 0.1268 loss_iou: 0.2414 d0.loss_cls: 0.3053 d0.loss_bbox: 0.1405 d0.loss_iou: 0.2507 d1.loss_cls: 0.2814 d1.loss_bbox: 0.1317 d1.loss_iou: 0.2457 d2.loss_cls: 0.2797 d2.loss_bbox: 0.1275 d2.loss_iou: 0.2423 d3.loss_cls: 0.2724 d3.loss_bbox: 0.1269 d3.loss_iou: 0.2444 d4.loss_cls: 0.2715 d4.loss_bbox: 0.1267 d4.loss_iou: 0.2402 enc_loss_cls: 0.3110 enc_loss_bbox: 0.1496 enc_loss_iou: 0.2709 dn_loss_cls: 0.1107 dn_loss_bbox: 0.1629 dn_loss_iou: 0.2168 d0.dn_loss_cls: 0.1874 
d0.dn_loss_bbox: 0.2993 d0.dn_loss_iou: 0.3605 d1.dn_loss_cls: 0.1385 d1.dn_loss_bbox: 0.1914 d1.dn_loss_iou: 0.2473 d2.dn_loss_cls: 0.1227 d2.dn_loss_bbox: 0.1732 d2.dn_loss_iou: 0.2269 d3.dn_loss_cls: 0.1159 d3.dn_loss_bbox: 0.1659 d3.dn_loss_iou: 0.2199 d4.dn_loss_cls: 0.1125 d4.dn_loss_bbox: 0.1629 d4.dn_loss_iou: 0.2169 d1.loss_lmm_region: 0.1596 loss_lmm_image: 0.8241 2024/11/11 16:47:59 - mmengine - INFO - Iter(train) [ 53800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:29:00 time: 1.9497 data_time: 0.0171 memory: 34282 grad_norm: 34.4664 loss: 9.9554 loss_cls: 0.3084 loss_bbox: 0.1538 loss_iou: 0.2480 d0.loss_cls: 0.3681 d0.loss_bbox: 0.1711 d0.loss_iou: 0.2636 d1.loss_cls: 0.3345 d1.loss_bbox: 0.1631 d1.loss_iou: 0.2574 d2.loss_cls: 0.3254 d2.loss_bbox: 0.1574 d2.loss_iou: 0.2510 d3.loss_cls: 0.3100 d3.loss_bbox: 0.1578 d3.loss_iou: 0.2523 d4.loss_cls: 0.3122 d4.loss_bbox: 0.1539 d4.loss_iou: 0.2488 enc_loss_cls: 0.3566 enc_loss_bbox: 0.1979 enc_loss_iou: 0.2912 dn_loss_cls: 0.0909 dn_loss_bbox: 0.2022 dn_loss_iou: 0.2269 d0.dn_loss_cls: 0.1806 d0.dn_loss_bbox: 0.3811 d0.dn_loss_iou: 0.3790 d1.dn_loss_cls: 0.1255 d1.dn_loss_bbox: 0.2458 d1.dn_loss_iou: 0.2630 d2.dn_loss_cls: 0.1028 d2.dn_loss_bbox: 0.2168 d2.dn_loss_iou: 0.2385 d3.dn_loss_cls: 0.0946 d3.dn_loss_bbox: 0.2047 d3.dn_loss_iou: 0.2294 d4.dn_loss_cls: 0.0915 d4.dn_loss_bbox: 0.2023 d4.dn_loss_iou: 0.2269 d1.loss_lmm_region: 0.1294 loss_lmm_image: 0.8406 2024/11/11 16:51:17 - mmengine - INFO - Iter(train) [ 53900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:25:36 time: 1.9783 data_time: 0.0172 memory: 33597 grad_norm: 26.8599 loss: 9.8937 loss_cls: 0.2907 loss_bbox: 0.1471 loss_iou: 0.2321 d0.loss_cls: 0.3481 d0.loss_bbox: 0.1641 d0.loss_iou: 0.2460 d1.loss_cls: 0.3158 d1.loss_bbox: 0.1576 d1.loss_iou: 0.2420 d2.loss_cls: 0.3061 d2.loss_bbox: 0.1497 d2.loss_iou: 0.2350 d3.loss_cls: 0.2943 d3.loss_bbox: 0.1506 d3.loss_iou: 0.2338 d4.loss_cls: 0.2899 d4.loss_bbox: 0.1483 d4.loss_iou: 0.2337 enc_loss_cls: 0.3509 enc_loss_bbox: 0.1795 enc_loss_iou: 0.2697 dn_loss_cls: 0.1235 dn_loss_bbox: 0.1959 dn_loss_iou: 0.2328 d0.dn_loss_cls: 0.2160 d0.dn_loss_bbox: 0.3594 d0.dn_loss_iou: 0.3883 d1.dn_loss_cls: 0.1684 d1.dn_loss_bbox: 0.2340 d1.dn_loss_iou: 0.2693 d2.dn_loss_cls: 0.1427 d2.dn_loss_bbox: 0.2076 d2.dn_loss_iou: 0.2437 d3.dn_loss_cls: 0.1308 d3.dn_loss_bbox: 0.1981 d3.dn_loss_iou: 0.2359 d4.dn_loss_cls: 0.1242 d4.dn_loss_bbox: 0.1959 d4.dn_loss_iou: 0.2329 d1.loss_lmm_region: 0.1596 loss_lmm_image: 0.8499 2024/11/11 16:54:37 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 16:54:37 - mmengine - INFO - Iter(train) [ 54000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:22:17 time: 2.0078 data_time: 0.0172 memory: 34008 grad_norm: nan loss: 9.3235 loss_cls: 0.2792 loss_bbox: 0.1302 loss_iou: 0.2286 d0.loss_cls: 0.3093 d0.loss_bbox: 0.1409 d0.loss_iou: 0.2475 d1.loss_cls: 0.2987 d1.loss_bbox: 0.1345 d1.loss_iou: 0.2311 d2.loss_cls: 0.2925 d2.loss_bbox: 0.1283 d2.loss_iou: 0.2304 d3.loss_cls: 0.2824 d3.loss_bbox: 0.1310 d3.loss_iou: 0.2290 d4.loss_cls: 0.2797 d4.loss_bbox: 0.1313 d4.loss_iou: 0.2273 enc_loss_cls: 0.3189 enc_loss_bbox: 0.1518 enc_loss_iou: 0.2670 dn_loss_cls: 0.1133 dn_loss_bbox: 0.1906 dn_loss_iou: 0.2213 d0.dn_loss_cls: 0.1779 d0.dn_loss_bbox: 0.3522 d0.dn_loss_iou: 0.3664 d1.dn_loss_cls: 0.1321 d1.dn_loss_bbox: 0.2235 d1.dn_loss_iou: 0.2513 d2.dn_loss_cls: 0.1152 d2.dn_loss_bbox: 0.2020 d2.dn_loss_iou: 0.2312 d3.dn_loss_cls: 0.1114 
d3.dn_loss_bbox: 0.1922 d3.dn_loss_iou: 0.2228 d4.dn_loss_cls: 0.1098 d4.dn_loss_bbox: 0.1906 d4.dn_loss_iou: 0.2213 d1.loss_lmm_region: 0.1209 loss_lmm_image: 0.9076 2024/11/11 16:57:56 - mmengine - INFO - Iter(train) [ 54100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:18:54 time: 1.9881 data_time: 0.0172 memory: 34314 grad_norm: 28.9815 loss: 10.4279 loss_cls: 0.3637 loss_bbox: 0.1587 loss_iou: 0.2802 d0.loss_cls: 0.4210 d0.loss_bbox: 0.1708 d0.loss_iou: 0.2976 d1.loss_cls: 0.3857 d1.loss_bbox: 0.1634 d1.loss_iou: 0.2888 d2.loss_cls: 0.3723 d2.loss_bbox: 0.1607 d2.loss_iou: 0.2838 d3.loss_cls: 0.3672 d3.loss_bbox: 0.1573 d3.loss_iou: 0.2802 d4.loss_cls: 0.3628 d4.loss_bbox: 0.1593 d4.loss_iou: 0.2800 enc_loss_cls: 0.4187 enc_loss_bbox: 0.1881 enc_loss_iou: 0.3254 dn_loss_cls: 0.1445 dn_loss_bbox: 0.1500 dn_loss_iou: 0.2116 d0.dn_loss_cls: 0.2108 d0.dn_loss_bbox: 0.2839 d0.dn_loss_iou: 0.3525 d1.dn_loss_cls: 0.1683 d1.dn_loss_bbox: 0.1793 d1.dn_loss_iou: 0.2445 d2.dn_loss_cls: 0.1552 d2.dn_loss_bbox: 0.1569 d2.dn_loss_iou: 0.2198 d3.dn_loss_cls: 0.1488 d3.dn_loss_bbox: 0.1522 d3.dn_loss_iou: 0.2135 d4.dn_loss_cls: 0.1461 d4.dn_loss_bbox: 0.1499 d4.dn_loss_iou: 0.2115 d1.loss_lmm_region: 0.1374 loss_lmm_image: 0.9056 2024/11/11 17:01:17 - mmengine - INFO - Iter(train) [ 54200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:15:35 time: 1.9896 data_time: 0.0172 memory: 33948 grad_norm: 32.7606 loss: 9.1399 loss_cls: 0.3136 loss_bbox: 0.1171 loss_iou: 0.1916 d0.loss_cls: 0.3582 d0.loss_bbox: 0.1255 d0.loss_iou: 0.2038 d1.loss_cls: 0.3243 d1.loss_bbox: 0.1224 d1.loss_iou: 0.2028 d2.loss_cls: 0.3194 d2.loss_bbox: 0.1178 d2.loss_iou: 0.1951 d3.loss_cls: 0.3209 d3.loss_bbox: 0.1158 d3.loss_iou: 0.1902 d4.loss_cls: 0.3144 d4.loss_bbox: 0.1179 d4.loss_iou: 0.1914 enc_loss_cls: 0.3522 enc_loss_bbox: 0.1404 enc_loss_iou: 0.2271 dn_loss_cls: 0.1634 dn_loss_bbox: 0.1556 dn_loss_iou: 0.1915 d0.dn_loss_cls: 0.2305 d0.dn_loss_bbox: 0.3055 d0.dn_loss_iou: 0.3342 d1.dn_loss_cls: 0.1894 d1.dn_loss_bbox: 0.1889 d1.dn_loss_iou: 0.2241 d2.dn_loss_cls: 0.1733 d2.dn_loss_bbox: 0.1639 d2.dn_loss_iou: 0.2005 d3.dn_loss_cls: 0.1726 d3.dn_loss_bbox: 0.1566 d3.dn_loss_iou: 0.1935 d4.dn_loss_cls: 0.1669 d4.dn_loss_bbox: 0.1556 d4.dn_loss_iou: 0.1915 d1.loss_lmm_region: 0.1416 loss_lmm_image: 0.8788 2024/11/11 17:04:37 - mmengine - INFO - Iter(train) [ 54300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:12:14 time: 2.0009 data_time: 0.0173 memory: 33444 grad_norm: 33.4549 loss: 9.3181 loss_cls: 0.2831 loss_bbox: 0.1273 loss_iou: 0.1949 d0.loss_cls: 0.3377 d0.loss_bbox: 0.1448 d0.loss_iou: 0.2119 d1.loss_cls: 0.2957 d1.loss_bbox: 0.1395 d1.loss_iou: 0.2040 d2.loss_cls: 0.2911 d2.loss_bbox: 0.1322 d2.loss_iou: 0.1966 d3.loss_cls: 0.2809 d3.loss_bbox: 0.1317 d3.loss_iou: 0.1973 d4.loss_cls: 0.2812 d4.loss_bbox: 0.1302 d4.loss_iou: 0.1968 enc_loss_cls: 0.3355 enc_loss_bbox: 0.1561 enc_loss_iou: 0.2308 dn_loss_cls: 0.1772 dn_loss_bbox: 0.1649 dn_loss_iou: 0.1944 d0.dn_loss_cls: 0.2627 d0.dn_loss_bbox: 0.3344 d0.dn_loss_iou: 0.3444 d1.dn_loss_cls: 0.2068 d1.dn_loss_bbox: 0.2049 d1.dn_loss_iou: 0.2282 d2.dn_loss_cls: 0.1827 d2.dn_loss_bbox: 0.1770 d2.dn_loss_iou: 0.2055 d3.dn_loss_cls: 0.1793 d3.dn_loss_bbox: 0.1668 d3.dn_loss_iou: 0.1975 d4.dn_loss_cls: 0.1767 d4.dn_loss_bbox: 0.1648 d4.dn_loss_iou: 0.1943 d1.loss_lmm_region: 0.1574 loss_lmm_image: 0.8989 2024/11/11 17:07:56 - mmengine - INFO - Iter(train) [ 54400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:08:53 time: 1.9946 
data_time: 0.0173 memory: 33348 grad_norm: 35.6465 loss: 9.6998 loss_cls: 0.3026 loss_bbox: 0.1372 loss_iou: 0.2340 d0.loss_cls: 0.3427 d0.loss_bbox: 0.1485 d0.loss_iou: 0.2468 d1.loss_cls: 0.3219 d1.loss_bbox: 0.1391 d1.loss_iou: 0.2356 d2.loss_cls: 0.3132 d2.loss_bbox: 0.1343 d2.loss_iou: 0.2325 d3.loss_cls: 0.3052 d3.loss_bbox: 0.1365 d3.loss_iou: 0.2345 d4.loss_cls: 0.3048 d4.loss_bbox: 0.1367 d4.loss_iou: 0.2336 enc_loss_cls: 0.3477 enc_loss_bbox: 0.1604 enc_loss_iou: 0.2665 dn_loss_cls: 0.1397 dn_loss_bbox: 0.1825 dn_loss_iou: 0.2202 d0.dn_loss_cls: 0.2307 d0.dn_loss_bbox: 0.3276 d0.dn_loss_iou: 0.3568 d1.dn_loss_cls: 0.1730 d1.dn_loss_bbox: 0.2136 d1.dn_loss_iou: 0.2480 d2.dn_loss_cls: 0.1514 d2.dn_loss_bbox: 0.1922 d2.dn_loss_iou: 0.2285 d3.dn_loss_cls: 0.1448 d3.dn_loss_bbox: 0.1848 d3.dn_loss_iou: 0.2221 d4.dn_loss_cls: 0.1394 d4.dn_loss_bbox: 0.1824 d4.dn_loss_iou: 0.2201 d1.loss_lmm_region: 0.1701 loss_lmm_image: 0.8576 2024/11/11 17:11:16 - mmengine - INFO - Iter(train) [ 54500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:05:31 time: 1.9745 data_time: 0.0172 memory: 35492 grad_norm: 32.8509 loss: 8.7836 loss_cls: 0.2601 loss_bbox: 0.1283 loss_iou: 0.2291 d0.loss_cls: 0.3123 d0.loss_bbox: 0.1388 d0.loss_iou: 0.2445 d1.loss_cls: 0.2792 d1.loss_bbox: 0.1358 d1.loss_iou: 0.2336 d2.loss_cls: 0.2696 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2294 d3.loss_cls: 0.2653 d3.loss_bbox: 0.1273 d3.loss_iou: 0.2288 d4.loss_cls: 0.2611 d4.loss_bbox: 0.1277 d4.loss_iou: 0.2281 enc_loss_cls: 0.3135 enc_loss_bbox: 0.1514 enc_loss_iou: 0.2615 dn_loss_cls: 0.0887 dn_loss_bbox: 0.1537 dn_loss_iou: 0.2061 d0.dn_loss_cls: 0.1713 d0.dn_loss_bbox: 0.3061 d0.dn_loss_iou: 0.3488 d1.dn_loss_cls: 0.1195 d1.dn_loss_bbox: 0.1874 d1.dn_loss_iou: 0.2374 d2.dn_loss_cls: 0.1025 d2.dn_loss_bbox: 0.1649 d2.dn_loss_iou: 0.2161 d3.dn_loss_cls: 0.0941 d3.dn_loss_bbox: 0.1560 d3.dn_loss_iou: 0.2087 d4.dn_loss_cls: 0.0891 d4.dn_loss_bbox: 0.1536 d4.dn_loss_iou: 0.2060 d1.loss_lmm_region: 0.1501 loss_lmm_image: 0.8693 2024/11/11 17:14:36 - mmengine - INFO - Iter(train) [ 54600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 5:02:12 time: 2.0144 data_time: 0.0173 memory: 34199 grad_norm: 29.8620 loss: 11.9040 loss_cls: 0.4464 loss_bbox: 0.1822 loss_iou: 0.3062 d0.loss_cls: 0.5035 d0.loss_bbox: 0.2067 d0.loss_iou: 0.3330 d1.loss_cls: 0.4588 d1.loss_bbox: 0.1994 d1.loss_iou: 0.3190 d2.loss_cls: 0.4502 d2.loss_bbox: 0.1865 d2.loss_iou: 0.3081 d3.loss_cls: 0.4396 d3.loss_bbox: 0.1860 d3.loss_iou: 0.3081 d4.loss_cls: 0.4416 d4.loss_bbox: 0.1860 d4.loss_iou: 0.3081 enc_loss_cls: 0.4883 enc_loss_bbox: 0.2231 enc_loss_iou: 0.3528 dn_loss_cls: 0.1371 dn_loss_bbox: 0.2009 dn_loss_iou: 0.2333 d0.dn_loss_cls: 0.2317 d0.dn_loss_bbox: 0.3680 d0.dn_loss_iou: 0.3878 d1.dn_loss_cls: 0.1767 d1.dn_loss_bbox: 0.2390 d1.dn_loss_iou: 0.2708 d2.dn_loss_cls: 0.1563 d2.dn_loss_bbox: 0.2153 d2.dn_loss_iou: 0.2470 d3.dn_loss_cls: 0.1420 d3.dn_loss_bbox: 0.2034 d3.dn_loss_iou: 0.2372 d4.dn_loss_cls: 0.1387 d4.dn_loss_bbox: 0.2010 d4.dn_loss_iou: 0.2333 d1.loss_lmm_region: 0.1876 loss_lmm_image: 0.8634 2024/11/11 17:17:52 - mmengine - INFO - Iter(train) [ 54700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:58:45 time: 1.9545 data_time: 0.0172 memory: 33958 grad_norm: 36.0505 loss: 9.9868 loss_cls: 0.3071 loss_bbox: 0.1514 loss_iou: 0.2688 d0.loss_cls: 0.3551 d0.loss_bbox: 0.1688 d0.loss_iou: 0.2888 d1.loss_cls: 0.3276 d1.loss_bbox: 0.1592 d1.loss_iou: 0.2787 d2.loss_cls: 0.3161 d2.loss_bbox: 0.1553 d2.loss_iou: 0.2730 
d3.loss_cls: 0.3085 d3.loss_bbox: 0.1525 d3.loss_iou: 0.2689 d4.loss_cls: 0.3068 d4.loss_bbox: 0.1519 d4.loss_iou: 0.2685 enc_loss_cls: 0.3598 enc_loss_bbox: 0.1809 enc_loss_iou: 0.3098 dn_loss_cls: 0.0980 dn_loss_bbox: 0.1739 dn_loss_iou: 0.2287 d0.dn_loss_cls: 0.1857 d0.dn_loss_bbox: 0.3280 d0.dn_loss_iou: 0.3799 d1.dn_loss_cls: 0.1333 d1.dn_loss_bbox: 0.2107 d1.dn_loss_iou: 0.2616 d2.dn_loss_cls: 0.1119 d2.dn_loss_bbox: 0.1848 d2.dn_loss_iou: 0.2390 d3.dn_loss_cls: 0.1040 d3.dn_loss_bbox: 0.1755 d3.dn_loss_iou: 0.2314 d4.dn_loss_cls: 0.0980 d4.dn_loss_bbox: 0.1739 d4.dn_loss_iou: 0.2287 d1.loss_lmm_region: 0.1510 loss_lmm_image: 0.9313 2024/11/11 17:21:11 - mmengine - INFO - Iter(train) [ 54800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:55:22 time: 1.9762 data_time: 0.0173 memory: 34635 grad_norm: 33.2639 loss: 10.2437 loss_cls: 0.3115 loss_bbox: 0.1734 loss_iou: 0.2922 d0.loss_cls: 0.3742 d0.loss_bbox: 0.1927 d0.loss_iou: 0.3118 d1.loss_cls: 0.3351 d1.loss_bbox: 0.1818 d1.loss_iou: 0.3065 d2.loss_cls: 0.3245 d2.loss_bbox: 0.1738 d2.loss_iou: 0.2949 d3.loss_cls: 0.3156 d3.loss_bbox: 0.1736 d3.loss_iou: 0.2931 d4.loss_cls: 0.3136 d4.loss_bbox: 0.1717 d4.loss_iou: 0.2907 enc_loss_cls: 0.3712 enc_loss_bbox: 0.2015 enc_loss_iou: 0.3329 dn_loss_cls: 0.1150 dn_loss_bbox: 0.1711 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.1857 d0.dn_loss_bbox: 0.3192 d0.dn_loss_iou: 0.3534 d1.dn_loss_cls: 0.1425 d1.dn_loss_bbox: 0.2008 d1.dn_loss_iou: 0.2459 d2.dn_loss_cls: 0.1262 d2.dn_loss_bbox: 0.1810 d2.dn_loss_iou: 0.2260 d3.dn_loss_cls: 0.1184 d3.dn_loss_bbox: 0.1729 d3.dn_loss_iou: 0.2184 d4.dn_loss_cls: 0.1158 d4.dn_loss_bbox: 0.1711 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1356 loss_lmm_image: 0.8769 2024/11/11 17:24:31 - mmengine - INFO - Iter(train) [ 54900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:52:02 time: 1.9990 data_time: 0.0174 memory: 33533 grad_norm: 27.5046 loss: 10.0409 loss_cls: 0.2879 loss_bbox: 0.1476 loss_iou: 0.2688 d0.loss_cls: 0.3310 d0.loss_bbox: 0.1691 d0.loss_iou: 0.2826 d1.loss_cls: 0.3078 d1.loss_bbox: 0.1563 d1.loss_iou: 0.2763 d2.loss_cls: 0.2995 d2.loss_bbox: 0.1529 d2.loss_iou: 0.2703 d3.loss_cls: 0.2864 d3.loss_bbox: 0.1512 d3.loss_iou: 0.2714 d4.loss_cls: 0.2833 d4.loss_bbox: 0.1502 d4.loss_iou: 0.2691 enc_loss_cls: 0.3267 enc_loss_bbox: 0.1813 enc_loss_iou: 0.3025 dn_loss_cls: 0.1240 dn_loss_bbox: 0.1911 dn_loss_iou: 0.2405 d0.dn_loss_cls: 0.2044 d0.dn_loss_bbox: 0.3429 d0.dn_loss_iou: 0.3878 d1.dn_loss_cls: 0.1546 d1.dn_loss_bbox: 0.2212 d1.dn_loss_iou: 0.2715 d2.dn_loss_cls: 0.1367 d2.dn_loss_bbox: 0.2006 d2.dn_loss_iou: 0.2489 d3.dn_loss_cls: 0.1316 d3.dn_loss_bbox: 0.1922 d3.dn_loss_iou: 0.2427 d4.dn_loss_cls: 0.1284 d4.dn_loss_bbox: 0.1911 d4.dn_loss_iou: 0.2404 d1.loss_lmm_region: 0.1454 loss_lmm_image: 0.8728 2024/11/11 17:27:49 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 17:27:49 - mmengine - INFO - Iter(train) [ 55000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:48:38 time: 1.9781 data_time: 0.0175 memory: 34120 grad_norm: 29.5748 loss: 10.1725 loss_cls: 0.3385 loss_bbox: 0.1534 loss_iou: 0.2796 d0.loss_cls: 0.3825 d0.loss_bbox: 0.1659 d0.loss_iou: 0.2942 d1.loss_cls: 0.3512 d1.loss_bbox: 0.1630 d1.loss_iou: 0.2836 d2.loss_cls: 0.3469 d2.loss_bbox: 0.1558 d2.loss_iou: 0.2811 d3.loss_cls: 0.3386 d3.loss_bbox: 0.1579 d3.loss_iou: 0.2814 d4.loss_cls: 0.3333 d4.loss_bbox: 0.1572 d4.loss_iou: 0.2801 enc_loss_cls: 0.3781 enc_loss_bbox: 0.1849 enc_loss_iou: 0.3186 dn_loss_cls: 0.1080 
dn_loss_bbox: 0.1646 dn_loss_iou: 0.2317 d0.dn_loss_cls: 0.1923 d0.dn_loss_bbox: 0.3104 d0.dn_loss_iou: 0.3780 d1.dn_loss_cls: 0.1419 d1.dn_loss_bbox: 0.1923 d1.dn_loss_iou: 0.2609 d2.dn_loss_cls: 0.1205 d2.dn_loss_bbox: 0.1748 d2.dn_loss_iou: 0.2413 d3.dn_loss_cls: 0.1147 d3.dn_loss_bbox: 0.1669 d3.dn_loss_iou: 0.2345 d4.dn_loss_cls: 0.1078 d4.dn_loss_bbox: 0.1646 d4.dn_loss_iou: 0.2317 d1.loss_lmm_region: 0.1476 loss_lmm_image: 0.8621 2024/11/11 17:31:06 - mmengine - INFO - Iter(train) [ 55100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:45:13 time: 1.9881 data_time: 0.0174 memory: 34897 grad_norm: 26.7342 loss: 10.6352 loss_cls: 0.3270 loss_bbox: 0.1777 loss_iou: 0.2795 d0.loss_cls: 0.3680 d0.loss_bbox: 0.1973 d0.loss_iou: 0.2990 d1.loss_cls: 0.3518 d1.loss_bbox: 0.1788 d1.loss_iou: 0.2829 d2.loss_cls: 0.3433 d2.loss_bbox: 0.1734 d2.loss_iou: 0.2783 d3.loss_cls: 0.3316 d3.loss_bbox: 0.1787 d3.loss_iou: 0.2782 d4.loss_cls: 0.3276 d4.loss_bbox: 0.1791 d4.loss_iou: 0.2793 enc_loss_cls: 0.3713 enc_loss_bbox: 0.2143 enc_loss_iou: 0.3233 dn_loss_cls: 0.1136 dn_loss_bbox: 0.1954 dn_loss_iou: 0.2487 d0.dn_loss_cls: 0.1994 d0.dn_loss_bbox: 0.3509 d0.dn_loss_iou: 0.3963 d1.dn_loss_cls: 0.1474 d1.dn_loss_bbox: 0.2283 d1.dn_loss_iou: 0.2796 d2.dn_loss_cls: 0.1269 d2.dn_loss_bbox: 0.2062 d2.dn_loss_iou: 0.2585 d3.dn_loss_cls: 0.1197 d3.dn_loss_bbox: 0.1973 d3.dn_loss_iou: 0.2509 d4.dn_loss_cls: 0.1153 d4.dn_loss_bbox: 0.1953 d4.dn_loss_iou: 0.2486 d1.loss_lmm_region: 0.1544 loss_lmm_image: 0.8618 2024/11/11 17:34:25 - mmengine - INFO - Iter(train) [ 55200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:41:51 time: 2.0015 data_time: 0.0171 memory: 34733 grad_norm: 35.5831 loss: 9.1469 loss_cls: 0.2803 loss_bbox: 0.1279 loss_iou: 0.2246 d0.loss_cls: 0.3215 d0.loss_bbox: 0.1418 d0.loss_iou: 0.2404 d1.loss_cls: 0.3033 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2229 d2.loss_cls: 0.2983 d2.loss_bbox: 0.1195 d2.loss_iou: 0.2192 d3.loss_cls: 0.2840 d3.loss_bbox: 0.1288 d3.loss_iou: 0.2239 d4.loss_cls: 0.2812 d4.loss_bbox: 0.1269 d4.loss_iou: 0.2243 enc_loss_cls: 0.3301 enc_loss_bbox: 0.1529 enc_loss_iou: 0.2639 dn_loss_cls: 0.1244 dn_loss_bbox: 0.1615 dn_loss_iou: 0.2053 d0.dn_loss_cls: 0.2038 d0.dn_loss_bbox: 0.3306 d0.dn_loss_iou: 0.3532 d1.dn_loss_cls: 0.1531 d1.dn_loss_bbox: 0.2050 d1.dn_loss_iou: 0.2389 d2.dn_loss_cls: 0.1339 d2.dn_loss_bbox: 0.1735 d2.dn_loss_iou: 0.2142 d3.dn_loss_cls: 0.1263 d3.dn_loss_bbox: 0.1641 d3.dn_loss_iou: 0.2077 d4.dn_loss_cls: 0.1214 d4.dn_loss_bbox: 0.1617 d4.dn_loss_iou: 0.2053 d1.loss_lmm_region: 0.1492 loss_lmm_image: 0.8705 2024/11/11 17:37:44 - mmengine - INFO - Iter(train) [ 55300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:38:29 time: 1.9885 data_time: 0.0173 memory: 34074 grad_norm: 25.6403 loss: 8.9984 loss_cls: 0.2616 loss_bbox: 0.1395 loss_iou: 0.2221 d0.loss_cls: 0.3099 d0.loss_bbox: 0.1469 d0.loss_iou: 0.2382 d1.loss_cls: 0.2762 d1.loss_bbox: 0.1474 d1.loss_iou: 0.2311 d2.loss_cls: 0.2655 d2.loss_bbox: 0.1409 d2.loss_iou: 0.2287 d3.loss_cls: 0.2654 d3.loss_bbox: 0.1379 d3.loss_iou: 0.2254 d4.loss_cls: 0.2608 d4.loss_bbox: 0.1392 d4.loss_iou: 0.2242 enc_loss_cls: 0.3047 enc_loss_bbox: 0.1734 enc_loss_iou: 0.2654 dn_loss_cls: 0.0975 dn_loss_bbox: 0.1613 dn_loss_iou: 0.2123 d0.dn_loss_cls: 0.1861 d0.dn_loss_bbox: 0.3254 d0.dn_loss_iou: 0.3595 d1.dn_loss_cls: 0.1315 d1.dn_loss_bbox: 0.2002 d1.dn_loss_iou: 0.2446 d2.dn_loss_cls: 0.1100 d2.dn_loss_bbox: 0.1734 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.1013 d3.dn_loss_bbox: 0.1637 
d3.dn_loss_iou: 0.2152 d4.dn_loss_cls: 0.0988 d4.dn_loss_bbox: 0.1613 d4.dn_loss_iou: 0.2124 d1.loss_lmm_region: 0.1469 loss_lmm_image: 0.8706 2024/11/11 17:41:02 - mmengine - INFO - Iter(train) [ 55400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:35:04 time: 1.9562 data_time: 0.0173 memory: 34036 grad_norm: 31.8280 loss: 9.8272 loss_cls: 0.2851 loss_bbox: 0.1490 loss_iou: 0.2544 d0.loss_cls: 0.3439 d0.loss_bbox: 0.1588 d0.loss_iou: 0.2685 d1.loss_cls: 0.3017 d1.loss_bbox: 0.1586 d1.loss_iou: 0.2610 d2.loss_cls: 0.2900 d2.loss_bbox: 0.1544 d2.loss_iou: 0.2604 d3.loss_cls: 0.2880 d3.loss_bbox: 0.1498 d3.loss_iou: 0.2574 d4.loss_cls: 0.2833 d4.loss_bbox: 0.1495 d4.loss_iou: 0.2540 enc_loss_cls: 0.3401 enc_loss_bbox: 0.1720 enc_loss_iou: 0.2909 dn_loss_cls: 0.1134 dn_loss_bbox: 0.1935 dn_loss_iou: 0.2318 d0.dn_loss_cls: 0.1916 d0.dn_loss_bbox: 0.3446 d0.dn_loss_iou: 0.3818 d1.dn_loss_cls: 0.1438 d1.dn_loss_bbox: 0.2286 d1.dn_loss_iou: 0.2648 d2.dn_loss_cls: 0.1244 d2.dn_loss_bbox: 0.2038 d2.dn_loss_iou: 0.2423 d3.dn_loss_cls: 0.1210 d3.dn_loss_bbox: 0.1959 d3.dn_loss_iou: 0.2345 d4.dn_loss_cls: 0.1153 d4.dn_loss_bbox: 0.1935 d4.dn_loss_iou: 0.2318 d1.loss_lmm_region: 0.1272 loss_lmm_image: 0.8729 2024/11/11 17:44:19 - mmengine - INFO - Iter(train) [ 55500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:31:40 time: 1.9775 data_time: 0.0173 memory: 34431 grad_norm: 27.5790 loss: 8.6505 loss_cls: 0.2723 loss_bbox: 0.1157 loss_iou: 0.2119 d0.loss_cls: 0.3138 d0.loss_bbox: 0.1301 d0.loss_iou: 0.2288 d1.loss_cls: 0.2806 d1.loss_bbox: 0.1248 d1.loss_iou: 0.2200 d2.loss_cls: 0.2770 d2.loss_bbox: 0.1192 d2.loss_iou: 0.2144 d3.loss_cls: 0.2721 d3.loss_bbox: 0.1183 d3.loss_iou: 0.2132 d4.loss_cls: 0.2717 d4.loss_bbox: 0.1161 d4.loss_iou: 0.2121 enc_loss_cls: 0.3154 enc_loss_bbox: 0.1502 enc_loss_iou: 0.2589 dn_loss_cls: 0.0950 dn_loss_bbox: 0.1541 dn_loss_iou: 0.2084 d0.dn_loss_cls: 0.1658 d0.dn_loss_bbox: 0.2941 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1202 d1.dn_loss_bbox: 0.1853 d1.dn_loss_iou: 0.2399 d2.dn_loss_cls: 0.1051 d2.dn_loss_bbox: 0.1639 d2.dn_loss_iou: 0.2186 d3.dn_loss_cls: 0.1000 d3.dn_loss_bbox: 0.1563 d3.dn_loss_iou: 0.2114 d4.dn_loss_cls: 0.0957 d4.dn_loss_bbox: 0.1540 d4.dn_loss_iou: 0.2084 d1.loss_lmm_region: 0.1165 loss_lmm_image: 0.8668 2024/11/11 17:47:39 - mmengine - INFO - Iter(train) [ 55600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:28:19 time: 2.0001 data_time: 0.0173 memory: 34006 grad_norm: 34.4318 loss: 9.1505 loss_cls: 0.2617 loss_bbox: 0.1396 loss_iou: 0.2245 d0.loss_cls: 0.3043 d0.loss_bbox: 0.1503 d0.loss_iou: 0.2360 d1.loss_cls: 0.2811 d1.loss_bbox: 0.1436 d1.loss_iou: 0.2297 d2.loss_cls: 0.2732 d2.loss_bbox: 0.1436 d2.loss_iou: 0.2263 d3.loss_cls: 0.2666 d3.loss_bbox: 0.1394 d3.loss_iou: 0.2279 d4.loss_cls: 0.2580 d4.loss_bbox: 0.1432 d4.loss_iou: 0.2275 enc_loss_cls: 0.3104 enc_loss_bbox: 0.1591 enc_loss_iou: 0.2552 dn_loss_cls: 0.1121 dn_loss_bbox: 0.1826 dn_loss_iou: 0.2121 d0.dn_loss_cls: 0.1921 d0.dn_loss_bbox: 0.3271 d0.dn_loss_iou: 0.3499 d1.dn_loss_cls: 0.1438 d1.dn_loss_bbox: 0.2148 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1254 d2.dn_loss_bbox: 0.1942 d2.dn_loss_iou: 0.2215 d3.dn_loss_cls: 0.1161 d3.dn_loss_bbox: 0.1858 d3.dn_loss_iou: 0.2148 d4.dn_loss_cls: 0.1125 d4.dn_loss_bbox: 0.1826 d4.dn_loss_iou: 0.2121 d1.loss_lmm_region: 0.1427 loss_lmm_image: 0.8654 2024/11/11 17:50:59 - mmengine - INFO - Iter(train) [ 55700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:24:59 time: 1.9961 data_time: 0.0174 memory: 
35461 grad_norm: 27.0511 loss: 11.1083 loss_cls: 0.3623 loss_bbox: 0.1741 loss_iou: 0.3068 d0.loss_cls: 0.4093 d0.loss_bbox: 0.1898 d0.loss_iou: 0.3175 d1.loss_cls: 0.3827 d1.loss_bbox: 0.1785 d1.loss_iou: 0.3101 d2.loss_cls: 0.3713 d2.loss_bbox: 0.1791 d2.loss_iou: 0.3078 d3.loss_cls: 0.3672 d3.loss_bbox: 0.1735 d3.loss_iou: 0.3054 d4.loss_cls: 0.3620 d4.loss_bbox: 0.1765 d4.loss_iou: 0.3080 enc_loss_cls: 0.4102 enc_loss_bbox: 0.2011 enc_loss_iou: 0.3392 dn_loss_cls: 0.1335 dn_loss_bbox: 0.1968 dn_loss_iou: 0.2422 d0.dn_loss_cls: 0.2109 d0.dn_loss_bbox: 0.3623 d0.dn_loss_iou: 0.3919 d1.dn_loss_cls: 0.1599 d1.dn_loss_bbox: 0.2358 d1.dn_loss_iou: 0.2756 d2.dn_loss_cls: 0.1435 d2.dn_loss_bbox: 0.2116 d2.dn_loss_iou: 0.2538 d3.dn_loss_cls: 0.1350 d3.dn_loss_bbox: 0.1991 d3.dn_loss_iou: 0.2450 d4.dn_loss_cls: 0.1344 d4.dn_loss_bbox: 0.1968 d4.dn_loss_iou: 0.2422 d1.loss_lmm_region: 0.1506 loss_lmm_image: 0.8552 2024/11/11 17:54:18 - mmengine - INFO - Iter(train) [ 55800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:21:37 time: 1.9978 data_time: 0.0172 memory: 34175 grad_norm: 30.1820 loss: 10.1391 loss_cls: 0.2787 loss_bbox: 0.1681 loss_iou: 0.2693 d0.loss_cls: 0.3332 d0.loss_bbox: 0.1792 d0.loss_iou: 0.2728 d1.loss_cls: 0.3005 d1.loss_bbox: 0.1731 d1.loss_iou: 0.2692 d2.loss_cls: 0.2883 d2.loss_bbox: 0.1744 d2.loss_iou: 0.2704 d3.loss_cls: 0.2817 d3.loss_bbox: 0.1719 d3.loss_iou: 0.2709 d4.loss_cls: 0.2769 d4.loss_bbox: 0.1724 d4.loss_iou: 0.2694 enc_loss_cls: 0.3389 enc_loss_bbox: 0.1906 enc_loss_iou: 0.2966 dn_loss_cls: 0.0960 dn_loss_bbox: 0.2140 dn_loss_iou: 0.2498 d0.dn_loss_cls: 0.1841 d0.dn_loss_bbox: 0.3787 d0.dn_loss_iou: 0.3963 d1.dn_loss_cls: 0.1343 d1.dn_loss_bbox: 0.2494 d1.dn_loss_iou: 0.2828 d2.dn_loss_cls: 0.1143 d2.dn_loss_bbox: 0.2237 d2.dn_loss_iou: 0.2588 d3.dn_loss_cls: 0.1034 d3.dn_loss_bbox: 0.2154 d3.dn_loss_iou: 0.2523 d4.dn_loss_cls: 0.0980 d4.dn_loss_bbox: 0.2141 d4.dn_loss_iou: 0.2499 d1.loss_lmm_region: 0.1563 loss_lmm_image: 0.8213 2024/11/11 17:57:37 - mmengine - INFO - Iter(train) [ 55900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:18:14 time: 2.0015 data_time: 0.0172 memory: 33222 grad_norm: 28.4435 loss: 9.3944 loss_cls: 0.2706 loss_bbox: 0.1375 loss_iou: 0.2264 d0.loss_cls: 0.3164 d0.loss_bbox: 0.1488 d0.loss_iou: 0.2423 d1.loss_cls: 0.2857 d1.loss_bbox: 0.1439 d1.loss_iou: 0.2348 d2.loss_cls: 0.2741 d2.loss_bbox: 0.1424 d2.loss_iou: 0.2313 d3.loss_cls: 0.2751 d3.loss_bbox: 0.1384 d3.loss_iou: 0.2281 d4.loss_cls: 0.2705 d4.loss_bbox: 0.1382 d4.loss_iou: 0.2265 enc_loss_cls: 0.3097 enc_loss_bbox: 0.1646 enc_loss_iou: 0.2606 dn_loss_cls: 0.1270 dn_loss_bbox: 0.1861 dn_loss_iou: 0.2213 d0.dn_loss_cls: 0.2030 d0.dn_loss_bbox: 0.3359 d0.dn_loss_iou: 0.3650 d1.dn_loss_cls: 0.1579 d1.dn_loss_bbox: 0.2192 d1.dn_loss_iou: 0.2539 d2.dn_loss_cls: 0.1415 d2.dn_loss_bbox: 0.1956 d2.dn_loss_iou: 0.2318 d3.dn_loss_cls: 0.1319 d3.dn_loss_bbox: 0.1885 d3.dn_loss_iou: 0.2243 d4.dn_loss_cls: 0.1284 d4.dn_loss_bbox: 0.1860 d4.dn_loss_iou: 0.2214 d1.loss_lmm_region: 0.1388 loss_lmm_image: 0.8710 2024/11/11 18:00:56 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 18:00:56 - mmengine - INFO - Iter(train) [ 56000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:14:53 time: 2.0141 data_time: 0.0174 memory: 33814 grad_norm: 31.3354 loss: 10.8138 loss_cls: 0.3333 loss_bbox: 0.1828 loss_iou: 0.3178 d0.loss_cls: 0.3805 d0.loss_bbox: 0.1975 d0.loss_iou: 0.3353 d1.loss_cls: 0.3491 d1.loss_bbox: 0.1869 d1.loss_iou: 0.3275 
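Each record also logs a smoothed per-iteration `time` and an `eta`; the eta lines up with the remaining iterations multiplied by the average step time rather than the last step alone. Taking the iteration-55900 record above: 2 days, 4:18:14 is 188294 s, and 150000 − 55900 = 94100 iterations remain, which implies about 2.001 s per iteration, consistent with the ~2.0 s step times logged through this stretch. A small sketch of that conversion (illustrative helper; 150000 is the max-iteration count shown in these records):

```python
from datetime import timedelta

def implied_iter_time(eta: str, current_iter: int, max_iters: int = 150000) -> float:
    """Turn an mmengine-style eta string such as '2 days, 4:18:14' into the
    implied average seconds per remaining training iteration."""
    days = 0
    if "day" in eta:
        day_part, eta = eta.split(",", 1)
        days = int(day_part.split()[0])
    hours, minutes, seconds = (int(x) for x in eta.strip().split(":"))
    remaining = timedelta(days=days, hours=hours, minutes=minutes,
                          seconds=seconds).total_seconds()
    return remaining / (max_iters - current_iter)

print(implied_iter_time("2 days, 4:18:14", 55900))  # ≈ 2.001 s/iter
```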
d2.loss_cls: 0.3433 d2.loss_bbox: 0.1822 d2.loss_iou: 0.3195 d3.loss_cls: 0.3375 d3.loss_bbox: 0.1837 d3.loss_iou: 0.3219 d4.loss_cls: 0.3341 d4.loss_bbox: 0.1828 d4.loss_iou: 0.3196 enc_loss_cls: 0.3722 enc_loss_bbox: 0.2149 enc_loss_iou: 0.3617 dn_loss_cls: 0.1124 dn_loss_bbox: 0.1910 dn_loss_iou: 0.2406 d0.dn_loss_cls: 0.1867 d0.dn_loss_bbox: 0.3371 d0.dn_loss_iou: 0.3827 d1.dn_loss_cls: 0.1393 d1.dn_loss_bbox: 0.2217 d1.dn_loss_iou: 0.2709 d2.dn_loss_cls: 0.1207 d2.dn_loss_bbox: 0.2003 d2.dn_loss_iou: 0.2498 d3.dn_loss_cls: 0.1155 d3.dn_loss_bbox: 0.1928 d3.dn_loss_iou: 0.2422 d4.dn_loss_cls: 0.1137 d4.dn_loss_bbox: 0.1909 d4.dn_loss_iou: 0.2406 d1.loss_lmm_region: 0.1365 loss_lmm_image: 0.8441 2024/11/11 18:04:15 - mmengine - INFO - Iter(train) [ 56100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:11:32 time: 1.9799 data_time: 0.0172 memory: 33708 grad_norm: 30.8287 loss: 9.2094 loss_cls: 0.2857 loss_bbox: 0.1416 loss_iou: 0.2405 d0.loss_cls: 0.3409 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2504 d1.loss_cls: 0.3100 d1.loss_bbox: 0.1454 d1.loss_iou: 0.2405 d2.loss_cls: 0.2937 d2.loss_bbox: 0.1450 d2.loss_iou: 0.2392 d3.loss_cls: 0.2865 d3.loss_bbox: 0.1457 d3.loss_iou: 0.2410 d4.loss_cls: 0.2831 d4.loss_bbox: 0.1441 d4.loss_iou: 0.2409 enc_loss_cls: 0.3375 enc_loss_bbox: 0.1700 enc_loss_iou: 0.2772 dn_loss_cls: 0.0834 dn_loss_bbox: 0.1766 dn_loss_iou: 0.2093 d0.dn_loss_cls: 0.1562 d0.dn_loss_bbox: 0.3275 d0.dn_loss_iou: 0.3464 d1.dn_loss_cls: 0.1093 d1.dn_loss_bbox: 0.2071 d1.dn_loss_iou: 0.2393 d2.dn_loss_cls: 0.0887 d2.dn_loss_bbox: 0.1845 d2.dn_loss_iou: 0.2174 d3.dn_loss_cls: 0.0841 d3.dn_loss_bbox: 0.1772 d3.dn_loss_iou: 0.2109 d4.dn_loss_cls: 0.0831 d4.dn_loss_bbox: 0.1764 d4.dn_loss_iou: 0.2092 d1.loss_lmm_region: 0.1438 loss_lmm_image: 0.8653 2024/11/11 18:07:34 - mmengine - INFO - Iter(train) [ 56200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:08:08 time: 1.9906 data_time: 0.0172 memory: 33963 grad_norm: 32.0871 loss: 8.9647 loss_cls: 0.2822 loss_bbox: 0.1219 loss_iou: 0.2435 d0.loss_cls: 0.3238 d0.loss_bbox: 0.1322 d0.loss_iou: 0.2586 d1.loss_cls: 0.2990 d1.loss_bbox: 0.1298 d1.loss_iou: 0.2500 d2.loss_cls: 0.2928 d2.loss_bbox: 0.1254 d2.loss_iou: 0.2442 d3.loss_cls: 0.2899 d3.loss_bbox: 0.1220 d3.loss_iou: 0.2412 d4.loss_cls: 0.2877 d4.loss_bbox: 0.1210 d4.loss_iou: 0.2415 enc_loss_cls: 0.3279 enc_loss_bbox: 0.1436 enc_loss_iou: 0.2805 dn_loss_cls: 0.0887 dn_loss_bbox: 0.1556 dn_loss_iou: 0.2032 d0.dn_loss_cls: 0.1693 d0.dn_loss_bbox: 0.2920 d0.dn_loss_iou: 0.3420 d1.dn_loss_cls: 0.1243 d1.dn_loss_bbox: 0.1879 d1.dn_loss_iou: 0.2338 d2.dn_loss_cls: 0.1028 d2.dn_loss_bbox: 0.1667 d2.dn_loss_iou: 0.2121 d3.dn_loss_cls: 0.0933 d3.dn_loss_bbox: 0.1581 d3.dn_loss_iou: 0.2055 d4.dn_loss_cls: 0.0895 d4.dn_loss_bbox: 0.1556 d4.dn_loss_iou: 0.2032 d1.loss_lmm_region: 0.1314 loss_lmm_image: 0.8912 2024/11/11 18:10:53 - mmengine - INFO - Iter(train) [ 56300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:04:48 time: 1.9994 data_time: 0.0172 memory: 35587 grad_norm: 30.0289 loss: 10.0095 loss_cls: 0.2986 loss_bbox: 0.1657 loss_iou: 0.2695 d0.loss_cls: 0.3539 d0.loss_bbox: 0.1837 d0.loss_iou: 0.2883 d1.loss_cls: 0.3236 d1.loss_bbox: 0.1755 d1.loss_iou: 0.2764 d2.loss_cls: 0.3131 d2.loss_bbox: 0.1674 d2.loss_iou: 0.2697 d3.loss_cls: 0.3066 d3.loss_bbox: 0.1676 d3.loss_iou: 0.2684 d4.loss_cls: 0.3035 d4.loss_bbox: 0.1647 d4.loss_iou: 0.2674 enc_loss_cls: 0.3609 enc_loss_bbox: 0.1970 enc_loss_iou: 0.3119 dn_loss_cls: 0.1045 dn_loss_bbox: 0.1823 dn_loss_iou: 
0.2124 d0.dn_loss_cls: 0.1885 d0.dn_loss_bbox: 0.3378 d0.dn_loss_iou: 0.3572 d1.dn_loss_cls: 0.1352 d1.dn_loss_bbox: 0.2155 d1.dn_loss_iou: 0.2429 d2.dn_loss_cls: 0.1179 d2.dn_loss_bbox: 0.1930 d2.dn_loss_iou: 0.2213 d3.dn_loss_cls: 0.1099 d3.dn_loss_bbox: 0.1836 d3.dn_loss_iou: 0.2143 d4.dn_loss_cls: 0.1055 d4.dn_loss_bbox: 0.1824 d4.dn_loss_iou: 0.2124 d1.loss_lmm_region: 0.1715 loss_lmm_image: 0.8879 2024/11/11 18:14:13 - mmengine - INFO - Iter(train) [ 56400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 4:01:26 time: 1.9982 data_time: 0.0173 memory: 34222 grad_norm: 28.1062 loss: 9.3974 loss_cls: 0.2985 loss_bbox: 0.1371 loss_iou: 0.2618 d0.loss_cls: 0.3544 d0.loss_bbox: 0.1448 d0.loss_iou: 0.2775 d1.loss_cls: 0.3177 d1.loss_bbox: 0.1423 d1.loss_iou: 0.2679 d2.loss_cls: 0.3089 d2.loss_bbox: 0.1365 d2.loss_iou: 0.2641 d3.loss_cls: 0.3031 d3.loss_bbox: 0.1314 d3.loss_iou: 0.2615 d4.loss_cls: 0.2993 d4.loss_bbox: 0.1362 d4.loss_iou: 0.2623 enc_loss_cls: 0.3454 enc_loss_bbox: 0.1610 enc_loss_iou: 0.2993 dn_loss_cls: 0.0942 dn_loss_bbox: 0.1520 dn_loss_iou: 0.2143 d0.dn_loss_cls: 0.1709 d0.dn_loss_bbox: 0.2873 d0.dn_loss_iou: 0.3516 d1.dn_loss_cls: 0.1246 d1.dn_loss_bbox: 0.1846 d1.dn_loss_iou: 0.2461 d2.dn_loss_cls: 0.1080 d2.dn_loss_bbox: 0.1651 d2.dn_loss_iou: 0.2252 d3.dn_loss_cls: 0.0996 d3.dn_loss_bbox: 0.1547 d3.dn_loss_iou: 0.2171 d4.dn_loss_cls: 0.0947 d4.dn_loss_bbox: 0.1519 d4.dn_loss_iou: 0.2144 d1.loss_lmm_region: 0.1488 loss_lmm_image: 0.8811 2024/11/11 18:17:32 - mmengine - INFO - Iter(train) [ 56500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:58:04 time: 1.9658 data_time: 0.0173 memory: 33780 grad_norm: 28.7827 loss: 11.2554 loss_cls: 0.4086 loss_bbox: 0.1565 loss_iou: 0.2811 d0.loss_cls: 0.4659 d0.loss_bbox: 0.1768 d0.loss_iou: 0.3025 d1.loss_cls: 0.4344 d1.loss_bbox: 0.1636 d1.loss_iou: 0.2886 d2.loss_cls: 0.4179 d2.loss_bbox: 0.1635 d2.loss_iou: 0.2871 d3.loss_cls: 0.4251 d3.loss_bbox: 0.1567 d3.loss_iou: 0.2781 d4.loss_cls: 0.4140 d4.loss_bbox: 0.1574 d4.loss_iou: 0.2801 enc_loss_cls: 0.4677 enc_loss_bbox: 0.1835 enc_loss_iou: 0.3141 dn_loss_cls: 0.1693 dn_loss_bbox: 0.1735 dn_loss_iou: 0.2310 d0.dn_loss_cls: 0.2445 d0.dn_loss_bbox: 0.3205 d0.dn_loss_iou: 0.3772 d1.dn_loss_cls: 0.1933 d1.dn_loss_bbox: 0.2068 d1.dn_loss_iou: 0.2647 d2.dn_loss_cls: 0.1834 d2.dn_loss_bbox: 0.1837 d2.dn_loss_iou: 0.2412 d3.dn_loss_cls: 0.1777 d3.dn_loss_bbox: 0.1754 d3.dn_loss_iou: 0.2332 d4.dn_loss_cls: 0.1723 d4.dn_loss_bbox: 0.1736 d4.dn_loss_iou: 0.2310 d1.loss_lmm_region: 0.1681 loss_lmm_image: 0.9117 2024/11/11 18:20:52 - mmengine - INFO - Iter(train) [ 56600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:54:45 time: 1.9787 data_time: 0.0172 memory: 33953 grad_norm: 30.5494 loss: 10.5011 loss_cls: 0.3128 loss_bbox: 0.1565 loss_iou: 0.2826 d0.loss_cls: 0.3646 d0.loss_bbox: 0.1720 d0.loss_iou: 0.2995 d1.loss_cls: 0.3382 d1.loss_bbox: 0.1647 d1.loss_iou: 0.2897 d2.loss_cls: 0.3246 d2.loss_bbox: 0.1603 d2.loss_iou: 0.2838 d3.loss_cls: 0.3194 d3.loss_bbox: 0.1571 d3.loss_iou: 0.2840 d4.loss_cls: 0.3135 d4.loss_bbox: 0.1577 d4.loss_iou: 0.2830 enc_loss_cls: 0.3617 enc_loss_bbox: 0.1895 enc_loss_iou: 0.3225 dn_loss_cls: 0.1260 dn_loss_bbox: 0.1905 dn_loss_iou: 0.2394 d0.dn_loss_cls: 0.2221 d0.dn_loss_bbox: 0.3552 d0.dn_loss_iou: 0.3965 d1.dn_loss_cls: 0.1659 d1.dn_loss_bbox: 0.2250 d1.dn_loss_iou: 0.2729 d2.dn_loss_cls: 0.1393 d2.dn_loss_bbox: 0.2002 d2.dn_loss_iou: 0.2501 d3.dn_loss_cls: 0.1303 d3.dn_loss_bbox: 0.1921 d3.dn_loss_iou: 0.2422 d4.dn_loss_cls: 
0.1266 d4.dn_loss_bbox: 0.1903 d4.dn_loss_iou: 0.2395 d1.loss_lmm_region: 0.1626 loss_lmm_image: 0.8966 2024/11/11 18:24:11 - mmengine - INFO - Iter(train) [ 56700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:51:22 time: 2.0002 data_time: 0.0172 memory: 34832 grad_norm: 31.9074 loss: 10.4620 loss_cls: 0.4022 loss_bbox: 0.1323 loss_iou: 0.2280 d0.loss_cls: 0.4524 d0.loss_bbox: 0.1416 d0.loss_iou: 0.2401 d1.loss_cls: 0.4169 d1.loss_bbox: 0.1426 d1.loss_iou: 0.2354 d2.loss_cls: 0.4072 d2.loss_bbox: 0.1401 d2.loss_iou: 0.2303 d3.loss_cls: 0.4071 d3.loss_bbox: 0.1349 d3.loss_iou: 0.2244 d4.loss_cls: 0.4073 d4.loss_bbox: 0.1319 d4.loss_iou: 0.2256 enc_loss_cls: 0.4500 enc_loss_bbox: 0.1588 enc_loss_iou: 0.2674 dn_loss_cls: 0.1926 dn_loss_bbox: 0.1747 dn_loss_iou: 0.2065 d0.dn_loss_cls: 0.2574 d0.dn_loss_bbox: 0.3074 d0.dn_loss_iou: 0.3398 d1.dn_loss_cls: 0.2115 d1.dn_loss_bbox: 0.2025 d1.dn_loss_iou: 0.2353 d2.dn_loss_cls: 0.1979 d2.dn_loss_bbox: 0.1832 d2.dn_loss_iou: 0.2151 d3.dn_loss_cls: 0.1930 d3.dn_loss_bbox: 0.1766 d3.dn_loss_iou: 0.2094 d4.dn_loss_cls: 0.1902 d4.dn_loss_bbox: 0.1747 d4.dn_loss_iou: 0.2064 d1.loss_lmm_region: 0.1478 loss_lmm_image: 0.8633 2024/11/11 18:27:29 - mmengine - INFO - Iter(train) [ 56800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:47:59 time: 1.9912 data_time: 0.0172 memory: 34210 grad_norm: 32.0271 loss: 10.9160 loss_cls: 0.3213 loss_bbox: 0.1706 loss_iou: 0.2754 d0.loss_cls: 0.3796 d0.loss_bbox: 0.1813 d0.loss_iou: 0.2891 d1.loss_cls: 0.3433 d1.loss_bbox: 0.1795 d1.loss_iou: 0.2873 d2.loss_cls: 0.3378 d2.loss_bbox: 0.1688 d2.loss_iou: 0.2777 d3.loss_cls: 0.3239 d3.loss_bbox: 0.1715 d3.loss_iou: 0.2798 d4.loss_cls: 0.3236 d4.loss_bbox: 0.1690 d4.loss_iou: 0.2757 enc_loss_cls: 0.3758 enc_loss_bbox: 0.1904 enc_loss_iou: 0.3133 dn_loss_cls: 0.1518 dn_loss_bbox: 0.2050 dn_loss_iou: 0.2603 d0.dn_loss_cls: 0.2498 d0.dn_loss_bbox: 0.3641 d0.dn_loss_iou: 0.4127 d1.dn_loss_cls: 0.1918 d1.dn_loss_bbox: 0.2359 d1.dn_loss_iou: 0.2925 d2.dn_loss_cls: 0.1713 d2.dn_loss_bbox: 0.2122 d2.dn_loss_iou: 0.2700 d3.dn_loss_cls: 0.1598 d3.dn_loss_bbox: 0.2063 d3.dn_loss_iou: 0.2627 d4.dn_loss_cls: 0.1528 d4.dn_loss_bbox: 0.2049 d4.dn_loss_iou: 0.2604 d1.loss_lmm_region: 0.1741 loss_lmm_image: 0.8430 2024/11/11 18:30:48 - mmengine - INFO - Iter(train) [ 56900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:44:37 time: 2.0031 data_time: 0.0172 memory: 34887 grad_norm: 29.6184 loss: 10.4782 loss_cls: 0.3710 loss_bbox: 0.1501 loss_iou: 0.2598 d0.loss_cls: 0.4319 d0.loss_bbox: 0.1615 d0.loss_iou: 0.2774 d1.loss_cls: 0.3942 d1.loss_bbox: 0.1531 d1.loss_iou: 0.2652 d2.loss_cls: 0.3872 d2.loss_bbox: 0.1484 d2.loss_iou: 0.2596 d3.loss_cls: 0.3778 d3.loss_bbox: 0.1491 d3.loss_iou: 0.2610 d4.loss_cls: 0.3720 d4.loss_bbox: 0.1506 d4.loss_iou: 0.2602 enc_loss_cls: 0.4280 enc_loss_bbox: 0.1808 enc_loss_iou: 0.3067 dn_loss_cls: 0.1520 dn_loss_bbox: 0.1619 dn_loss_iou: 0.2167 d0.dn_loss_cls: 0.2408 d0.dn_loss_bbox: 0.3097 d0.dn_loss_iou: 0.3663 d1.dn_loss_cls: 0.1849 d1.dn_loss_bbox: 0.1970 d1.dn_loss_iou: 0.2485 d2.dn_loss_cls: 0.1651 d2.dn_loss_bbox: 0.1718 d2.dn_loss_iou: 0.2260 d3.dn_loss_cls: 0.1563 d3.dn_loss_bbox: 0.1634 d3.dn_loss_iou: 0.2187 d4.dn_loss_cls: 0.1513 d4.dn_loss_bbox: 0.1618 d4.dn_loss_iou: 0.2165 d1.loss_lmm_region: 0.1609 loss_lmm_image: 0.8633 2024/11/11 18:34:05 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 18:34:05 - mmengine - INFO - Iter(train) [ 57000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 
days, 3:41:12 time: 1.9720 data_time: 0.0172 memory: 33439 grad_norm: 31.3112 loss: 8.8468 loss_cls: 0.2861 loss_bbox: 0.1246 loss_iou: 0.2266 d0.loss_cls: 0.3341 d0.loss_bbox: 0.1354 d0.loss_iou: 0.2430 d1.loss_cls: 0.3052 d1.loss_bbox: 0.1329 d1.loss_iou: 0.2375 d2.loss_cls: 0.2956 d2.loss_bbox: 0.1282 d2.loss_iou: 0.2311 d3.loss_cls: 0.2928 d3.loss_bbox: 0.1270 d3.loss_iou: 0.2292 d4.loss_cls: 0.2878 d4.loss_bbox: 0.1251 d4.loss_iou: 0.2269 enc_loss_cls: 0.3316 enc_loss_bbox: 0.1529 enc_loss_iou: 0.2721 dn_loss_cls: 0.0921 dn_loss_bbox: 0.1519 dn_loss_iou: 0.2041 d0.dn_loss_cls: 0.1656 d0.dn_loss_bbox: 0.2774 d0.dn_loss_iou: 0.3389 d1.dn_loss_cls: 0.1191 d1.dn_loss_bbox: 0.1786 d1.dn_loss_iou: 0.2315 d2.dn_loss_cls: 0.1014 d2.dn_loss_bbox: 0.1613 d2.dn_loss_iou: 0.2133 d3.dn_loss_cls: 0.0950 d3.dn_loss_bbox: 0.1543 d3.dn_loss_iou: 0.2065 d4.dn_loss_cls: 0.0911 d4.dn_loss_bbox: 0.1518 d4.dn_loss_iou: 0.2041 d1.loss_lmm_region: 0.1310 loss_lmm_image: 0.8518 2024/11/11 18:37:24 - mmengine - INFO - Iter(train) [ 57100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:37:51 time: 1.9793 data_time: 0.0174 memory: 33156 grad_norm: 28.3885 loss: 9.3703 loss_cls: 0.2838 loss_bbox: 0.1328 loss_iou: 0.2744 d0.loss_cls: 0.3318 d0.loss_bbox: 0.1386 d0.loss_iou: 0.2877 d1.loss_cls: 0.2993 d1.loss_bbox: 0.1350 d1.loss_iou: 0.2768 d2.loss_cls: 0.2927 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2746 d3.loss_cls: 0.2848 d3.loss_bbox: 0.1349 d3.loss_iou: 0.2765 d4.loss_cls: 0.2884 d4.loss_bbox: 0.1321 d4.loss_iou: 0.2716 enc_loss_cls: 0.3314 enc_loss_bbox: 0.1520 enc_loss_iou: 0.3120 dn_loss_cls: 0.0823 dn_loss_bbox: 0.1647 dn_loss_iou: 0.2307 d0.dn_loss_cls: 0.1529 d0.dn_loss_bbox: 0.2990 d0.dn_loss_iou: 0.3685 d1.dn_loss_cls: 0.1100 d1.dn_loss_bbox: 0.1937 d1.dn_loss_iou: 0.2618 d2.dn_loss_cls: 0.0946 d2.dn_loss_bbox: 0.1733 d2.dn_loss_iou: 0.2401 d3.dn_loss_cls: 0.0861 d3.dn_loss_bbox: 0.1671 d3.dn_loss_iou: 0.2329 d4.dn_loss_cls: 0.0831 d4.dn_loss_bbox: 0.1648 d4.dn_loss_iou: 0.2306 d1.loss_lmm_region: 0.1207 loss_lmm_image: 0.8674 2024/11/11 18:40:45 - mmengine - INFO - Iter(train) [ 57200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:34:32 time: 2.0095 data_time: 0.0173 memory: 34222 grad_norm: 28.3204 loss: 10.1130 loss_cls: 0.3242 loss_bbox: 0.1523 loss_iou: 0.2786 d0.loss_cls: 0.3765 d0.loss_bbox: 0.1651 d0.loss_iou: 0.2985 d1.loss_cls: 0.3399 d1.loss_bbox: 0.1581 d1.loss_iou: 0.2855 d2.loss_cls: 0.3352 d2.loss_bbox: 0.1510 d2.loss_iou: 0.2789 d3.loss_cls: 0.3275 d3.loss_bbox: 0.1511 d3.loss_iou: 0.2766 d4.loss_cls: 0.3257 d4.loss_bbox: 0.1518 d4.loss_iou: 0.2776 enc_loss_cls: 0.3696 enc_loss_bbox: 0.1782 enc_loss_iou: 0.3205 dn_loss_cls: 0.1138 dn_loss_bbox: 0.1801 dn_loss_iou: 0.2151 d0.dn_loss_cls: 0.1910 d0.dn_loss_bbox: 0.3384 d0.dn_loss_iou: 0.3680 d1.dn_loss_cls: 0.1455 d1.dn_loss_bbox: 0.2145 d1.dn_loss_iou: 0.2491 d2.dn_loss_cls: 0.1247 d2.dn_loss_bbox: 0.1918 d2.dn_loss_iou: 0.2248 d3.dn_loss_cls: 0.1181 d3.dn_loss_bbox: 0.1834 d3.dn_loss_iou: 0.2178 d4.dn_loss_cls: 0.1139 d4.dn_loss_bbox: 0.1801 d4.dn_loss_iou: 0.2152 d1.loss_lmm_region: 0.1393 loss_lmm_image: 0.8663 2024/11/11 18:44:03 - mmengine - INFO - Iter(train) [ 57300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:31:09 time: 2.0079 data_time: 0.0172 memory: 34399 grad_norm: 43.6030 loss: 10.3645 loss_cls: 0.3338 loss_bbox: 0.1648 loss_iou: 0.2538 d0.loss_cls: 0.3983 d0.loss_bbox: 0.1678 d0.loss_iou: 0.2634 d1.loss_cls: 0.3639 d1.loss_bbox: 0.1564 d1.loss_iou: 0.2576 d2.loss_cls: 0.3476 d2.loss_bbox: 
0.1560 d2.loss_iou: 0.2553 d3.loss_cls: 0.3400 d3.loss_bbox: 0.1616 d3.loss_iou: 0.2537 d4.loss_cls: 0.3340 d4.loss_bbox: 0.1624 d4.loss_iou: 0.2528 enc_loss_cls: 0.3893 enc_loss_bbox: 0.1791 enc_loss_iou: 0.2846 dn_loss_cls: 0.1446 dn_loss_bbox: 0.1820 dn_loss_iou: 0.2363 d0.dn_loss_cls: 0.2234 d0.dn_loss_bbox: 0.3296 d0.dn_loss_iou: 0.3831 d1.dn_loss_cls: 0.1702 d1.dn_loss_bbox: 0.2139 d1.dn_loss_iou: 0.2670 d2.dn_loss_cls: 0.1554 d2.dn_loss_bbox: 0.1927 d2.dn_loss_iou: 0.2464 d3.dn_loss_cls: 0.1475 d3.dn_loss_bbox: 0.1863 d3.dn_loss_iou: 0.2398 d4.dn_loss_cls: 0.1432 d4.dn_loss_bbox: 0.1821 d4.dn_loss_iou: 0.2364 d1.loss_lmm_region: 0.1696 loss_lmm_image: 0.8391 2024/11/11 18:47:22 - mmengine - INFO - Iter(train) [ 57400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:27:47 time: 1.9908 data_time: 0.0173 memory: 34951 grad_norm: 40.5908 loss: 9.5379 loss_cls: 0.3289 loss_bbox: 0.1300 loss_iou: 0.2283 d0.loss_cls: 0.3709 d0.loss_bbox: 0.1436 d0.loss_iou: 0.2449 d1.loss_cls: 0.3510 d1.loss_bbox: 0.1380 d1.loss_iou: 0.2379 d2.loss_cls: 0.3462 d2.loss_bbox: 0.1270 d2.loss_iou: 0.2284 d3.loss_cls: 0.3295 d3.loss_bbox: 0.1311 d3.loss_iou: 0.2296 d4.loss_cls: 0.3270 d4.loss_bbox: 0.1307 d4.loss_iou: 0.2276 enc_loss_cls: 0.3729 enc_loss_bbox: 0.1568 enc_loss_iou: 0.2731 dn_loss_cls: 0.1321 dn_loss_bbox: 0.1643 dn_loss_iou: 0.2002 d0.dn_loss_cls: 0.2136 d0.dn_loss_bbox: 0.3011 d0.dn_loss_iou: 0.3400 d1.dn_loss_cls: 0.1641 d1.dn_loss_bbox: 0.1941 d1.dn_loss_iou: 0.2320 d2.dn_loss_cls: 0.1441 d2.dn_loss_bbox: 0.1734 d2.dn_loss_iou: 0.2102 d3.dn_loss_cls: 0.1343 d3.dn_loss_bbox: 0.1673 d3.dn_loss_iou: 0.2035 d4.dn_loss_cls: 0.1317 d4.dn_loss_bbox: 0.1644 d4.dn_loss_iou: 0.2002 d1.loss_lmm_region: 0.1469 loss_lmm_image: 0.8668 2024/11/11 18:50:43 - mmengine - INFO - Iter(train) [ 57500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:24:28 time: 2.0225 data_time: 0.0172 memory: 34600 grad_norm: 32.5080 loss: 9.2406 loss_cls: 0.2802 loss_bbox: 0.1489 loss_iou: 0.2317 d0.loss_cls: 0.3123 d0.loss_bbox: 0.1618 d0.loss_iou: 0.2483 d1.loss_cls: 0.2887 d1.loss_bbox: 0.1527 d1.loss_iou: 0.2389 d2.loss_cls: 0.2834 d2.loss_bbox: 0.1470 d2.loss_iou: 0.2357 d3.loss_cls: 0.2783 d3.loss_bbox: 0.1497 d3.loss_iou: 0.2316 d4.loss_cls: 0.2785 d4.loss_bbox: 0.1507 d4.loss_iou: 0.2327 enc_loss_cls: 0.3204 enc_loss_bbox: 0.1669 enc_loss_iou: 0.2649 dn_loss_cls: 0.1047 dn_loss_bbox: 0.1741 dn_loss_iou: 0.2061 d0.dn_loss_cls: 0.1838 d0.dn_loss_bbox: 0.3180 d0.dn_loss_iou: 0.3506 d1.dn_loss_cls: 0.1334 d1.dn_loss_bbox: 0.2024 d1.dn_loss_iou: 0.2347 d2.dn_loss_cls: 0.1183 d2.dn_loss_bbox: 0.1822 d2.dn_loss_iou: 0.2152 d3.dn_loss_cls: 0.1108 d3.dn_loss_bbox: 0.1751 d3.dn_loss_iou: 0.2082 d4.dn_loss_cls: 0.1062 d4.dn_loss_bbox: 0.1741 d4.dn_loss_iou: 0.2060 d1.loss_lmm_region: 0.1573 loss_lmm_image: 0.8763 2024/11/11 18:54:04 - mmengine - INFO - Iter(train) [ 57600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:21:10 time: 1.9958 data_time: 0.0174 memory: 35116 grad_norm: 30.6277 loss: 9.0249 loss_cls: 0.2777 loss_bbox: 0.1368 loss_iou: 0.2311 d0.loss_cls: 0.3206 d0.loss_bbox: 0.1479 d0.loss_iou: 0.2422 d1.loss_cls: 0.3045 d1.loss_bbox: 0.1395 d1.loss_iou: 0.2311 d2.loss_cls: 0.2898 d2.loss_bbox: 0.1378 d2.loss_iou: 0.2312 d3.loss_cls: 0.2869 d3.loss_bbox: 0.1347 d3.loss_iou: 0.2276 d4.loss_cls: 0.2789 d4.loss_bbox: 0.1362 d4.loss_iou: 0.2281 enc_loss_cls: 0.3231 enc_loss_bbox: 0.1637 enc_loss_iou: 0.2634 dn_loss_cls: 0.1070 dn_loss_bbox: 0.1555 dn_loss_iou: 0.2023 d0.dn_loss_cls: 0.1866 
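`grad_norm` is the global gradient norm recorded each iteration; through this stretch it mostly stays in the high 20s to mid 30s, with occasional spikes above 40 such as the 43.6030 logged for iteration 57300 above. Scanning the raw log file for such outliers is straightforward (illustrative helper; it assumes the on-disk log keeps one record per line, as mmengine writes it):

```python
import re

def grad_norm_spikes(log_path: str, threshold: float = 40.0) -> list[tuple[int, float]]:
    """Return (iteration, grad_norm) pairs whose gradient norm exceeds `threshold`."""
    with open(log_path) as f:
        text = f.read()
    pattern = r"Iter\(train\) \[\s*(\d+)/\d+\].*?grad_norm: (\d+\.\d+)"
    return [
        (int(it), float(gn))
        for it, gn in re.findall(pattern, text)
        if float(gn) > threshold
    ]
```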
d0.dn_loss_bbox: 0.3003 d0.dn_loss_iou: 0.3374 d1.dn_loss_cls: 0.1386 d1.dn_loss_bbox: 0.1856 d1.dn_loss_iou: 0.2306 d2.dn_loss_cls: 0.1201 d2.dn_loss_bbox: 0.1665 d2.dn_loss_iou: 0.2122 d3.dn_loss_cls: 0.1126 d3.dn_loss_bbox: 0.1578 d3.dn_loss_iou: 0.2044 d4.dn_loss_cls: 0.1074 d4.dn_loss_bbox: 0.1554 d4.dn_loss_iou: 0.2022 d1.loss_lmm_region: 0.1374 loss_lmm_image: 0.8721 2024/11/11 18:57:22 - mmengine - INFO - Iter(train) [ 57700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:17:46 time: 1.9545 data_time: 0.0172 memory: 33180 grad_norm: 33.4547 loss: 9.4451 loss_cls: 0.3019 loss_bbox: 0.1307 loss_iou: 0.2296 d0.loss_cls: 0.3443 d0.loss_bbox: 0.1390 d0.loss_iou: 0.2406 d1.loss_cls: 0.3185 d1.loss_bbox: 0.1312 d1.loss_iou: 0.2335 d2.loss_cls: 0.3066 d2.loss_bbox: 0.1302 d2.loss_iou: 0.2331 d3.loss_cls: 0.3057 d3.loss_bbox: 0.1281 d3.loss_iou: 0.2308 d4.loss_cls: 0.3012 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2302 enc_loss_cls: 0.3521 enc_loss_bbox: 0.1526 enc_loss_iou: 0.2561 dn_loss_cls: 0.1349 dn_loss_bbox: 0.1658 dn_loss_iou: 0.2115 d0.dn_loss_cls: 0.2222 d0.dn_loss_bbox: 0.3087 d0.dn_loss_iou: 0.3584 d1.dn_loss_cls: 0.1679 d1.dn_loss_bbox: 0.1953 d1.dn_loss_iou: 0.2415 d2.dn_loss_cls: 0.1500 d2.dn_loss_bbox: 0.1740 d2.dn_loss_iou: 0.2207 d3.dn_loss_cls: 0.1419 d3.dn_loss_bbox: 0.1675 d3.dn_loss_iou: 0.2136 d4.dn_loss_cls: 0.1362 d4.dn_loss_bbox: 0.1658 d4.dn_loss_iou: 0.2116 d1.loss_lmm_region: 0.1751 loss_lmm_image: 0.8552 2024/11/11 19:00:43 - mmengine - INFO - Iter(train) [ 57800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:14:27 time: 2.0108 data_time: 0.0173 memory: 34618 grad_norm: 31.8462 loss: 8.2150 loss_cls: 0.2602 loss_bbox: 0.1049 loss_iou: 0.2000 d0.loss_cls: 0.3007 d0.loss_bbox: 0.1088 d0.loss_iou: 0.2094 d1.loss_cls: 0.2792 d1.loss_bbox: 0.1063 d1.loss_iou: 0.2049 d2.loss_cls: 0.2742 d2.loss_bbox: 0.1021 d2.loss_iou: 0.1998 d3.loss_cls: 0.2676 d3.loss_bbox: 0.1031 d3.loss_iou: 0.1988 d4.loss_cls: 0.2644 d4.loss_bbox: 0.1011 d4.loss_iou: 0.1972 enc_loss_cls: 0.2966 enc_loss_bbox: 0.1232 enc_loss_iou: 0.2365 dn_loss_cls: 0.1038 dn_loss_bbox: 0.1343 dn_loss_iou: 0.1846 d0.dn_loss_cls: 0.1828 d0.dn_loss_bbox: 0.2815 d0.dn_loss_iou: 0.3261 d1.dn_loss_cls: 0.1345 d1.dn_loss_bbox: 0.1702 d1.dn_loss_iou: 0.2154 d2.dn_loss_cls: 0.1165 d2.dn_loss_bbox: 0.1448 d2.dn_loss_iou: 0.1942 d3.dn_loss_cls: 0.1094 d3.dn_loss_bbox: 0.1366 d3.dn_loss_iou: 0.1875 d4.dn_loss_cls: 0.1055 d4.dn_loss_bbox: 0.1344 d4.dn_loss_iou: 0.1847 d1.loss_lmm_region: 0.1360 loss_lmm_image: 0.8934 2024/11/11 19:04:00 - mmengine - INFO - Iter(train) [ 57900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:11:03 time: 1.9825 data_time: 0.0172 memory: 34029 grad_norm: 30.6411 loss: 10.2242 loss_cls: 0.3403 loss_bbox: 0.1438 loss_iou: 0.2797 d0.loss_cls: 0.3935 d0.loss_bbox: 0.1530 d0.loss_iou: 0.2948 d1.loss_cls: 0.3580 d1.loss_bbox: 0.1476 d1.loss_iou: 0.2865 d2.loss_cls: 0.3457 d2.loss_bbox: 0.1437 d2.loss_iou: 0.2799 d3.loss_cls: 0.3409 d3.loss_bbox: 0.1432 d3.loss_iou: 0.2797 d4.loss_cls: 0.3410 d4.loss_bbox: 0.1417 d4.loss_iou: 0.2784 enc_loss_cls: 0.3965 enc_loss_bbox: 0.1704 enc_loss_iou: 0.3241 dn_loss_cls: 0.1374 dn_loss_bbox: 0.1527 dn_loss_iou: 0.2206 d0.dn_loss_cls: 0.2078 d0.dn_loss_bbox: 0.2894 d0.dn_loss_iou: 0.3637 d1.dn_loss_cls: 0.1645 d1.dn_loss_bbox: 0.1864 d1.dn_loss_iou: 0.2541 d2.dn_loss_cls: 0.1471 d2.dn_loss_bbox: 0.1637 d2.dn_loss_iou: 0.2308 d3.dn_loss_cls: 0.1442 d3.dn_loss_bbox: 0.1555 d3.dn_loss_iou: 0.2230 d4.dn_loss_cls: 0.1405 d4.dn_loss_bbox: 0.1527 
d4.dn_loss_iou: 0.2204 d1.loss_lmm_region: 0.1587 loss_lmm_image: 0.9289 2024/11/11 19:07:18 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 19:07:18 - mmengine - INFO - Iter(train) [ 58000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:07:38 time: 1.9716 data_time: 0.0173 memory: 33883 grad_norm: 27.6553 loss: 8.6568 loss_cls: 0.2680 loss_bbox: 0.1059 loss_iou: 0.1893 d0.loss_cls: 0.3133 d0.loss_bbox: 0.1143 d0.loss_iou: 0.2002 d1.loss_cls: 0.2836 d1.loss_bbox: 0.1093 d1.loss_iou: 0.1951 d2.loss_cls: 0.2749 d2.loss_bbox: 0.1067 d2.loss_iou: 0.1915 d3.loss_cls: 0.2727 d3.loss_bbox: 0.1056 d3.loss_iou: 0.1891 d4.loss_cls: 0.2689 d4.loss_bbox: 0.1061 d4.loss_iou: 0.1896 enc_loss_cls: 0.3102 enc_loss_bbox: 0.1306 enc_loss_iou: 0.2255 dn_loss_cls: 0.1366 dn_loss_bbox: 0.1610 dn_loss_iou: 0.1921 d0.dn_loss_cls: 0.2216 d0.dn_loss_bbox: 0.3091 d0.dn_loss_iou: 0.3305 d1.dn_loss_cls: 0.1708 d1.dn_loss_bbox: 0.1973 d1.dn_loss_iou: 0.2234 d2.dn_loss_cls: 0.1502 d2.dn_loss_bbox: 0.1743 d2.dn_loss_iou: 0.2023 d3.dn_loss_cls: 0.1455 d3.dn_loss_bbox: 0.1637 d3.dn_loss_iou: 0.1949 d4.dn_loss_cls: 0.1379 d4.dn_loss_bbox: 0.1612 d4.dn_loss_iou: 0.1921 d1.loss_lmm_region: 0.1683 loss_lmm_image: 0.8735 2024/11/11 19:10:38 - mmengine - INFO - Iter(train) [ 58100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:04:19 time: 1.9940 data_time: 0.0172 memory: 33868 grad_norm: 32.9225 loss: 10.4151 loss_cls: 0.3084 loss_bbox: 0.1782 loss_iou: 0.2719 d0.loss_cls: 0.3676 d0.loss_bbox: 0.1894 d0.loss_iou: 0.2844 d1.loss_cls: 0.3350 d1.loss_bbox: 0.1824 d1.loss_iou: 0.2747 d2.loss_cls: 0.3245 d2.loss_bbox: 0.1784 d2.loss_iou: 0.2741 d3.loss_cls: 0.3153 d3.loss_bbox: 0.1787 d3.loss_iou: 0.2723 d4.loss_cls: 0.3083 d4.loss_bbox: 0.1770 d4.loss_iou: 0.2738 enc_loss_cls: 0.3639 enc_loss_bbox: 0.2088 enc_loss_iou: 0.3111 dn_loss_cls: 0.1081 dn_loss_bbox: 0.2033 dn_loss_iou: 0.2391 d0.dn_loss_cls: 0.2030 d0.dn_loss_bbox: 0.3644 d0.dn_loss_iou: 0.3783 d1.dn_loss_cls: 0.1450 d1.dn_loss_bbox: 0.2332 d1.dn_loss_iou: 0.2678 d2.dn_loss_cls: 0.1228 d2.dn_loss_bbox: 0.2141 d2.dn_loss_iou: 0.2478 d3.dn_loss_cls: 0.1160 d3.dn_loss_bbox: 0.2057 d3.dn_loss_iou: 0.2415 d4.dn_loss_cls: 0.1095 d4.dn_loss_bbox: 0.2033 d4.dn_loss_iou: 0.2391 d1.loss_lmm_region: 0.1631 loss_lmm_image: 0.8318 2024/11/11 19:13:57 - mmengine - INFO - Iter(train) [ 58200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 3:00:57 time: 1.9981 data_time: 0.0172 memory: 33932 grad_norm: 34.3990 loss: 9.6014 loss_cls: 0.3123 loss_bbox: 0.1471 loss_iou: 0.2581 d0.loss_cls: 0.3506 d0.loss_bbox: 0.1609 d0.loss_iou: 0.2692 d1.loss_cls: 0.3289 d1.loss_bbox: 0.1575 d1.loss_iou: 0.2652 d2.loss_cls: 0.3188 d2.loss_bbox: 0.1538 d2.loss_iou: 0.2619 d3.loss_cls: 0.3177 d3.loss_bbox: 0.1491 d3.loss_iou: 0.2592 d4.loss_cls: 0.3176 d4.loss_bbox: 0.1466 d4.loss_iou: 0.2568 enc_loss_cls: 0.3572 enc_loss_bbox: 0.1703 enc_loss_iou: 0.2892 dn_loss_cls: 0.1072 dn_loss_bbox: 0.1650 dn_loss_iou: 0.2159 d0.dn_loss_cls: 0.1798 d0.dn_loss_bbox: 0.2849 d0.dn_loss_iou: 0.3397 d1.dn_loss_cls: 0.1343 d1.dn_loss_bbox: 0.1923 d1.dn_loss_iou: 0.2428 d2.dn_loss_cls: 0.1182 d2.dn_loss_bbox: 0.1714 d2.dn_loss_iou: 0.2232 d3.dn_loss_cls: 0.1089 d3.dn_loss_bbox: 0.1656 d3.dn_loss_iou: 0.2177 d4.dn_loss_cls: 0.1069 d4.dn_loss_bbox: 0.1649 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1427 loss_lmm_image: 0.8561 2024/11/11 19:17:17 - mmengine - INFO - Iter(train) [ 58300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:57:37 time: 1.9826 
data_time: 0.0176 memory: 35349 grad_norm: 28.5168 loss: 10.0356 loss_cls: 0.3078 loss_bbox: 0.1546 loss_iou: 0.2440 d0.loss_cls: 0.3506 d0.loss_bbox: 0.1658 d0.loss_iou: 0.2580 d1.loss_cls: 0.3269 d1.loss_bbox: 0.1621 d1.loss_iou: 0.2523 d2.loss_cls: 0.3231 d2.loss_bbox: 0.1548 d2.loss_iou: 0.2460 d3.loss_cls: 0.3152 d3.loss_bbox: 0.1541 d3.loss_iou: 0.2440 d4.loss_cls: 0.3088 d4.loss_bbox: 0.1553 d4.loss_iou: 0.2442 enc_loss_cls: 0.3563 enc_loss_bbox: 0.1796 enc_loss_iou: 0.2781 dn_loss_cls: 0.1387 dn_loss_bbox: 0.1939 dn_loss_iou: 0.2237 d0.dn_loss_cls: 0.2152 d0.dn_loss_bbox: 0.3482 d0.dn_loss_iou: 0.3618 d1.dn_loss_cls: 0.1686 d1.dn_loss_bbox: 0.2251 d1.dn_loss_iou: 0.2515 d2.dn_loss_cls: 0.1531 d2.dn_loss_bbox: 0.2050 d2.dn_loss_iou: 0.2325 d3.dn_loss_cls: 0.1447 d3.dn_loss_bbox: 0.1955 d3.dn_loss_iou: 0.2257 d4.dn_loss_cls: 0.1409 d4.dn_loss_bbox: 0.1940 d4.dn_loss_iou: 0.2237 d1.loss_lmm_region: 0.1558 loss_lmm_image: 0.8563 2024/11/11 19:20:35 - mmengine - INFO - Iter(train) [ 58400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:54:13 time: 1.9735 data_time: 0.0172 memory: 33037 grad_norm: 26.6887 loss: 9.0711 loss_cls: 0.2462 loss_bbox: 0.1321 loss_iou: 0.2441 d0.loss_cls: 0.2871 d0.loss_bbox: 0.1388 d0.loss_iou: 0.2521 d1.loss_cls: 0.2595 d1.loss_bbox: 0.1361 d1.loss_iou: 0.2473 d2.loss_cls: 0.2516 d2.loss_bbox: 0.1344 d2.loss_iou: 0.2454 d3.loss_cls: 0.2448 d3.loss_bbox: 0.1329 d3.loss_iou: 0.2447 d4.loss_cls: 0.2446 d4.loss_bbox: 0.1318 d4.loss_iou: 0.2436 enc_loss_cls: 0.2910 enc_loss_bbox: 0.1543 enc_loss_iou: 0.2743 dn_loss_cls: 0.0914 dn_loss_bbox: 0.1798 dn_loss_iou: 0.2301 d0.dn_loss_cls: 0.1751 d0.dn_loss_bbox: 0.3548 d0.dn_loss_iou: 0.3828 d1.dn_loss_cls: 0.1236 d1.dn_loss_bbox: 0.2184 d1.dn_loss_iou: 0.2646 d2.dn_loss_cls: 0.1025 d2.dn_loss_bbox: 0.1900 d2.dn_loss_iou: 0.2409 d3.dn_loss_cls: 0.0962 d3.dn_loss_bbox: 0.1823 d3.dn_loss_iou: 0.2330 d4.dn_loss_cls: 0.0917 d4.dn_loss_bbox: 0.1798 d4.dn_loss_iou: 0.2299 d1.loss_lmm_region: 0.1374 loss_lmm_image: 0.8301 2024/11/11 19:23:56 - mmengine - INFO - Iter(train) [ 58500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:50:55 time: 1.9966 data_time: 0.0172 memory: 34337 grad_norm: 29.8755 loss: 10.2103 loss_cls: 0.3255 loss_bbox: 0.1542 loss_iou: 0.2538 d0.loss_cls: 0.3713 d0.loss_bbox: 0.1660 d0.loss_iou: 0.2709 d1.loss_cls: 0.3540 d1.loss_bbox: 0.1503 d1.loss_iou: 0.2565 d2.loss_cls: 0.3346 d2.loss_bbox: 0.1567 d2.loss_iou: 0.2575 d3.loss_cls: 0.3311 d3.loss_bbox: 0.1543 d3.loss_iou: 0.2553 d4.loss_cls: 0.3228 d4.loss_bbox: 0.1551 d4.loss_iou: 0.2548 enc_loss_cls: 0.3814 enc_loss_bbox: 0.1828 enc_loss_iou: 0.2981 dn_loss_cls: 0.1152 dn_loss_bbox: 0.1997 dn_loss_iou: 0.2270 d0.dn_loss_cls: 0.1981 d0.dn_loss_bbox: 0.3528 d0.dn_loss_iou: 0.3698 d1.dn_loss_cls: 0.1489 d1.dn_loss_bbox: 0.2352 d1.dn_loss_iou: 0.2595 d2.dn_loss_cls: 0.1289 d2.dn_loss_bbox: 0.2142 d2.dn_loss_iou: 0.2388 d3.dn_loss_cls: 0.1207 d3.dn_loss_bbox: 0.2027 d3.dn_loss_iou: 0.2303 d4.dn_loss_cls: 0.1160 d4.dn_loss_bbox: 0.1998 d4.dn_loss_iou: 0.2270 d1.loss_lmm_region: 0.1671 loss_lmm_image: 0.8718 2024/11/11 19:27:18 - mmengine - INFO - Iter(train) [ 58600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:47:38 time: 2.0168 data_time: 0.0173 memory: 35287 grad_norm: 26.2478 loss: 10.7489 loss_cls: 0.3578 loss_bbox: 0.1655 loss_iou: 0.3307 d0.loss_cls: 0.4089 d0.loss_bbox: 0.1760 d0.loss_iou: 0.3420 d1.loss_cls: 0.3847 d1.loss_bbox: 0.1625 d1.loss_iou: 0.3331 d2.loss_cls: 0.3729 d2.loss_bbox: 0.1609 d2.loss_iou: 0.3279 
d3.loss_cls: 0.3631 d3.loss_bbox: 0.1636 d3.loss_iou: 0.3294 d4.loss_cls: 0.3603 d4.loss_bbox: 0.1631 d4.loss_iou: 0.3307 enc_loss_cls: 0.4145 enc_loss_bbox: 0.1845 enc_loss_iou: 0.3669 dn_loss_cls: 0.0926 dn_loss_bbox: 0.1727 dn_loss_iou: 0.2376 d0.dn_loss_cls: 0.1728 d0.dn_loss_bbox: 0.3083 d0.dn_loss_iou: 0.3755 d1.dn_loss_cls: 0.1257 d1.dn_loss_bbox: 0.1995 d1.dn_loss_iou: 0.2670 d2.dn_loss_cls: 0.1057 d2.dn_loss_bbox: 0.1808 d2.dn_loss_iou: 0.2457 d3.dn_loss_cls: 0.0984 d3.dn_loss_bbox: 0.1752 d3.dn_loss_iou: 0.2397 d4.dn_loss_cls: 0.0942 d4.dn_loss_bbox: 0.1726 d4.dn_loss_iou: 0.2375 d1.loss_lmm_region: 0.1607 loss_lmm_image: 0.8879 2024/11/11 19:30:35 - mmengine - INFO - Iter(train) [ 58700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:44:14 time: 1.9866 data_time: 0.0173 memory: 33772 grad_norm: 28.2575 loss: 9.3487 loss_cls: 0.2865 loss_bbox: 0.1273 loss_iou: 0.2227 d0.loss_cls: 0.3362 d0.loss_bbox: 0.1352 d0.loss_iou: 0.2386 d1.loss_cls: 0.3059 d1.loss_bbox: 0.1333 d1.loss_iou: 0.2309 d2.loss_cls: 0.3005 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2247 d3.loss_cls: 0.2889 d3.loss_bbox: 0.1298 d3.loss_iou: 0.2272 d4.loss_cls: 0.2851 d4.loss_bbox: 0.1291 d4.loss_iou: 0.2259 enc_loss_cls: 0.3350 enc_loss_bbox: 0.1500 enc_loss_iou: 0.2602 dn_loss_cls: 0.1268 dn_loss_bbox: 0.1689 dn_loss_iou: 0.2076 d0.dn_loss_cls: 0.2136 d0.dn_loss_bbox: 0.3334 d0.dn_loss_iou: 0.3611 d1.dn_loss_cls: 0.1573 d1.dn_loss_bbox: 0.2106 d1.dn_loss_iou: 0.2433 d2.dn_loss_cls: 0.1400 d2.dn_loss_bbox: 0.1804 d2.dn_loss_iou: 0.2179 d3.dn_loss_cls: 0.1326 d3.dn_loss_bbox: 0.1708 d3.dn_loss_iou: 0.2098 d4.dn_loss_cls: 0.1266 d4.dn_loss_bbox: 0.1688 d4.dn_loss_iou: 0.2075 d1.loss_lmm_region: 0.1646 loss_lmm_image: 0.9051 2024/11/11 19:33:56 - mmengine - INFO - Iter(train) [ 58800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:40:54 time: 2.0045 data_time: 0.0173 memory: 34304 grad_norm: 42.6754 loss: 10.2753 loss_cls: 0.3281 loss_bbox: 0.1446 loss_iou: 0.2814 d0.loss_cls: 0.3717 d0.loss_bbox: 0.1652 d0.loss_iou: 0.3008 d1.loss_cls: 0.3456 d1.loss_bbox: 0.1512 d1.loss_iou: 0.2882 d2.loss_cls: 0.3351 d2.loss_bbox: 0.1465 d2.loss_iou: 0.2839 d3.loss_cls: 0.3322 d3.loss_bbox: 0.1433 d3.loss_iou: 0.2809 d4.loss_cls: 0.3276 d4.loss_bbox: 0.1441 d4.loss_iou: 0.2814 enc_loss_cls: 0.3723 enc_loss_bbox: 0.1831 enc_loss_iou: 0.3270 dn_loss_cls: 0.1240 dn_loss_bbox: 0.1837 dn_loss_iou: 0.2359 d0.dn_loss_cls: 0.2004 d0.dn_loss_bbox: 0.3293 d0.dn_loss_iou: 0.3784 d1.dn_loss_cls: 0.1530 d1.dn_loss_bbox: 0.2187 d1.dn_loss_iou: 0.2696 d2.dn_loss_cls: 0.1345 d2.dn_loss_bbox: 0.1937 d2.dn_loss_iou: 0.2458 d3.dn_loss_cls: 0.1277 d3.dn_loss_bbox: 0.1864 d3.dn_loss_iou: 0.2387 d4.dn_loss_cls: 0.1247 d4.dn_loss_bbox: 0.1838 d4.dn_loss_iou: 0.2359 d1.loss_lmm_region: 0.1306 loss_lmm_image: 0.8462 2024/11/11 19:37:14 - mmengine - INFO - Iter(train) [ 58900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:37:31 time: 1.9886 data_time: 0.0172 memory: 35210 grad_norm: 30.9824 loss: 9.6888 loss_cls: 0.2747 loss_bbox: 0.1533 loss_iou: 0.2392 d0.loss_cls: 0.3218 d0.loss_bbox: 0.1628 d0.loss_iou: 0.2515 d1.loss_cls: 0.3005 d1.loss_bbox: 0.1515 d1.loss_iou: 0.2406 d2.loss_cls: 0.2896 d2.loss_bbox: 0.1515 d2.loss_iou: 0.2405 d3.loss_cls: 0.2843 d3.loss_bbox: 0.1499 d3.loss_iou: 0.2381 d4.loss_cls: 0.2779 d4.loss_bbox: 0.1525 d4.loss_iou: 0.2393 enc_loss_cls: 0.3263 enc_loss_bbox: 0.1743 enc_loss_iou: 0.2694 dn_loss_cls: 0.1331 dn_loss_bbox: 0.1868 dn_loss_iou: 0.2300 d0.dn_loss_cls: 0.2095 d0.dn_loss_bbox: 0.3310 
d0.dn_loss_iou: 0.3673 d1.dn_loss_cls: 0.1682 d1.dn_loss_bbox: 0.2223 d1.dn_loss_iou: 0.2591 d2.dn_loss_cls: 0.1494 d2.dn_loss_bbox: 0.1955 d2.dn_loss_iou: 0.2373 d3.dn_loss_cls: 0.1402 d3.dn_loss_bbox: 0.1898 d3.dn_loss_iou: 0.2323 d4.dn_loss_cls: 0.1332 d4.dn_loss_bbox: 0.1869 d4.dn_loss_iou: 0.2301 d1.loss_lmm_region: 0.1609 loss_lmm_image: 0.8367 2024/11/11 19:40:34 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 19:40:34 - mmengine - INFO - Iter(train) [ 59000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:34:12 time: 2.0100 data_time: 0.0174 memory: 32197 grad_norm: 28.4729 loss: 9.6820 loss_cls: 0.2967 loss_bbox: 0.1431 loss_iou: 0.2444 d0.loss_cls: 0.3369 d0.loss_bbox: 0.1559 d0.loss_iou: 0.2598 d1.loss_cls: 0.3101 d1.loss_bbox: 0.1488 d1.loss_iou: 0.2519 d2.loss_cls: 0.3012 d2.loss_bbox: 0.1483 d2.loss_iou: 0.2492 d3.loss_cls: 0.2994 d3.loss_bbox: 0.1436 d3.loss_iou: 0.2465 d4.loss_cls: 0.2982 d4.loss_bbox: 0.1419 d4.loss_iou: 0.2454 enc_loss_cls: 0.3415 enc_loss_bbox: 0.1678 enc_loss_iou: 0.2819 dn_loss_cls: 0.1188 dn_loss_bbox: 0.1822 dn_loss_iou: 0.2239 d0.dn_loss_cls: 0.1971 d0.dn_loss_bbox: 0.3220 d0.dn_loss_iou: 0.3633 d1.dn_loss_cls: 0.1506 d1.dn_loss_bbox: 0.2092 d1.dn_loss_iou: 0.2528 d2.dn_loss_cls: 0.1331 d2.dn_loss_bbox: 0.1931 d2.dn_loss_iou: 0.2339 d3.dn_loss_cls: 0.1209 d3.dn_loss_bbox: 0.1836 d3.dn_loss_iou: 0.2258 d4.dn_loss_cls: 0.1194 d4.dn_loss_bbox: 0.1823 d4.dn_loss_iou: 0.2239 d1.loss_lmm_region: 0.1579 loss_lmm_image: 0.8751 2024/11/11 19:43:53 - mmengine - INFO - Iter(train) [ 59100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:30:49 time: 1.9660 data_time: 0.0173 memory: 35159 grad_norm: 27.7463 loss: 9.2687 loss_cls: 0.2954 loss_bbox: 0.1343 loss_iou: 0.2436 d0.loss_cls: 0.3390 d0.loss_bbox: 0.1465 d0.loss_iou: 0.2596 d1.loss_cls: 0.3140 d1.loss_bbox: 0.1421 d1.loss_iou: 0.2539 d2.loss_cls: 0.3073 d2.loss_bbox: 0.1384 d2.loss_iou: 0.2490 d3.loss_cls: 0.2969 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2463 d4.loss_cls: 0.2966 d4.loss_bbox: 0.1357 d4.loss_iou: 0.2442 enc_loss_cls: 0.3319 enc_loss_bbox: 0.1664 enc_loss_iou: 0.2865 dn_loss_cls: 0.0967 dn_loss_bbox: 0.1561 dn_loss_iou: 0.2068 d0.dn_loss_cls: 0.1733 d0.dn_loss_bbox: 0.3023 d0.dn_loss_iou: 0.3503 d1.dn_loss_cls: 0.1279 d1.dn_loss_bbox: 0.1908 d1.dn_loss_iou: 0.2390 d2.dn_loss_cls: 0.1103 d2.dn_loss_bbox: 0.1647 d2.dn_loss_iou: 0.2163 d3.dn_loss_cls: 0.1020 d3.dn_loss_bbox: 0.1577 d3.dn_loss_iou: 0.2089 d4.dn_loss_cls: 0.0991 d4.dn_loss_bbox: 0.1561 d4.dn_loss_iou: 0.2067 d1.loss_lmm_region: 0.1521 loss_lmm_image: 0.8870 2024/11/11 19:47:12 - mmengine - INFO - Iter(train) [ 59200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:27:28 time: 1.9991 data_time: 0.0174 memory: 34054 grad_norm: 34.0378 loss: 9.1119 loss_cls: 0.2956 loss_bbox: 0.1273 loss_iou: 0.2228 d0.loss_cls: 0.3460 d0.loss_bbox: 0.1420 d0.loss_iou: 0.2385 d1.loss_cls: 0.3161 d1.loss_bbox: 0.1408 d1.loss_iou: 0.2345 d2.loss_cls: 0.3060 d2.loss_bbox: 0.1339 d2.loss_iou: 0.2274 d3.loss_cls: 0.2980 d3.loss_bbox: 0.1285 d3.loss_iou: 0.2255 d4.loss_cls: 0.2945 d4.loss_bbox: 0.1295 d4.loss_iou: 0.2239 enc_loss_cls: 0.3381 enc_loss_bbox: 0.1635 enc_loss_iou: 0.2673 dn_loss_cls: 0.0952 dn_loss_bbox: 0.1533 dn_loss_iou: 0.2042 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.3154 d0.dn_loss_iou: 0.3521 d1.dn_loss_cls: 0.1296 d1.dn_loss_bbox: 0.1907 d1.dn_loss_iou: 0.2383 d2.dn_loss_cls: 0.1107 d2.dn_loss_bbox: 0.1667 d2.dn_loss_iou: 0.2155 d3.dn_loss_cls: 0.1022 d3.dn_loss_bbox: 0.1561 
d3.dn_loss_iou: 0.2069 d4.dn_loss_cls: 0.0970 d4.dn_loss_bbox: 0.1532 d4.dn_loss_iou: 0.2041 d1.loss_lmm_region: 0.1530 loss_lmm_image: 0.8856 2024/11/11 19:50:30 - mmengine - INFO - Iter(train) [ 59300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:24:05 time: 1.9732 data_time: 0.0173 memory: 33642 grad_norm: 29.7194 loss: 9.7085 loss_cls: 0.3065 loss_bbox: 0.1442 loss_iou: 0.2504 d0.loss_cls: 0.3690 d0.loss_bbox: 0.1603 d0.loss_iou: 0.2694 d1.loss_cls: 0.3363 d1.loss_bbox: 0.1473 d1.loss_iou: 0.2592 d2.loss_cls: 0.3204 d2.loss_bbox: 0.1489 d2.loss_iou: 0.2581 d3.loss_cls: 0.3153 d3.loss_bbox: 0.1440 d3.loss_iou: 0.2523 d4.loss_cls: 0.3114 d4.loss_bbox: 0.1425 d4.loss_iou: 0.2485 enc_loss_cls: 0.3699 enc_loss_bbox: 0.1728 enc_loss_iou: 0.2911 dn_loss_cls: 0.1098 dn_loss_bbox: 0.1766 dn_loss_iou: 0.2085 d0.dn_loss_cls: 0.1870 d0.dn_loss_bbox: 0.3278 d0.dn_loss_iou: 0.3495 d1.dn_loss_cls: 0.1374 d1.dn_loss_bbox: 0.2130 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1233 d2.dn_loss_bbox: 0.1911 d2.dn_loss_iou: 0.2189 d3.dn_loss_cls: 0.1158 d3.dn_loss_bbox: 0.1819 d3.dn_loss_iou: 0.2120 d4.dn_loss_cls: 0.1131 d4.dn_loss_bbox: 0.1767 d4.dn_loss_iou: 0.2086 d1.loss_lmm_region: 0.1333 loss_lmm_image: 0.8649 2024/11/11 19:53:47 - mmengine - INFO - Iter(train) [ 59400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:20:40 time: 1.9475 data_time: 0.0174 memory: 34161 grad_norm: 30.2940 loss: 9.8163 loss_cls: 0.3359 loss_bbox: 0.1434 loss_iou: 0.2608 d0.loss_cls: 0.3881 d0.loss_bbox: 0.1519 d0.loss_iou: 0.2702 d1.loss_cls: 0.3575 d1.loss_bbox: 0.1480 d1.loss_iou: 0.2635 d2.loss_cls: 0.3385 d2.loss_bbox: 0.1404 d2.loss_iou: 0.2560 d3.loss_cls: 0.3448 d3.loss_bbox: 0.1403 d3.loss_iou: 0.2576 d4.loss_cls: 0.3403 d4.loss_bbox: 0.1410 d4.loss_iou: 0.2596 enc_loss_cls: 0.3846 enc_loss_bbox: 0.1694 enc_loss_iou: 0.2943 dn_loss_cls: 0.1276 dn_loss_bbox: 0.1518 dn_loss_iou: 0.2011 d0.dn_loss_cls: 0.2299 d0.dn_loss_bbox: 0.2912 d0.dn_loss_iou: 0.3345 d1.dn_loss_cls: 0.1691 d1.dn_loss_bbox: 0.1822 d1.dn_loss_iou: 0.2305 d2.dn_loss_cls: 0.1406 d2.dn_loss_bbox: 0.1625 d2.dn_loss_iou: 0.2102 d3.dn_loss_cls: 0.1289 d3.dn_loss_bbox: 0.1546 d3.dn_loss_iou: 0.2035 d4.dn_loss_cls: 0.1263 d4.dn_loss_bbox: 0.1518 d4.dn_loss_iou: 0.2010 d1.loss_lmm_region: 0.1688 loss_lmm_image: 0.8639 2024/11/11 19:57:04 - mmengine - INFO - Iter(train) [ 59500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:17:16 time: 1.9753 data_time: 0.0172 memory: 32767 grad_norm: 30.8989 loss: 9.2833 loss_cls: 0.3070 loss_bbox: 0.1319 loss_iou: 0.2282 d0.loss_cls: 0.3507 d0.loss_bbox: 0.1370 d0.loss_iou: 0.2407 d1.loss_cls: 0.3299 d1.loss_bbox: 0.1318 d1.loss_iou: 0.2294 d2.loss_cls: 0.3195 d2.loss_bbox: 0.1276 d2.loss_iou: 0.2264 d3.loss_cls: 0.3110 d3.loss_bbox: 0.1307 d3.loss_iou: 0.2267 d4.loss_cls: 0.3064 d4.loss_bbox: 0.1330 d4.loss_iou: 0.2299 enc_loss_cls: 0.3597 enc_loss_bbox: 0.1484 enc_loss_iou: 0.2611 dn_loss_cls: 0.1178 dn_loss_bbox: 0.1529 dn_loss_iou: 0.2057 d0.dn_loss_cls: 0.1957 d0.dn_loss_bbox: 0.2854 d0.dn_loss_iou: 0.3468 d1.dn_loss_cls: 0.1503 d1.dn_loss_bbox: 0.1808 d1.dn_loss_iou: 0.2354 d2.dn_loss_cls: 0.1347 d2.dn_loss_bbox: 0.1582 d2.dn_loss_iou: 0.2127 d3.dn_loss_cls: 0.1255 d3.dn_loss_bbox: 0.1545 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.1213 d4.dn_loss_bbox: 0.1530 d4.dn_loss_iou: 0.2058 d1.loss_lmm_region: 0.1450 loss_lmm_image: 0.9267 2024/11/11 20:00:24 - mmengine - INFO - Iter(train) [ 59600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:13:54 time: 2.0046 data_time: 0.0176 memory: 
34973 grad_norm: 32.6340 loss: 11.0881 loss_cls: 0.3613 loss_bbox: 0.1535 loss_iou: 0.2960 d0.loss_cls: 0.4183 d0.loss_bbox: 0.1611 d0.loss_iou: 0.3090 d1.loss_cls: 0.3844 d1.loss_bbox: 0.1557 d1.loss_iou: 0.2971 d2.loss_cls: 0.3707 d2.loss_bbox: 0.1551 d2.loss_iou: 0.2971 d3.loss_cls: 0.3648 d3.loss_bbox: 0.1538 d3.loss_iou: 0.2970 d4.loss_cls: 0.3618 d4.loss_bbox: 0.1548 d4.loss_iou: 0.2989 enc_loss_cls: 0.4103 enc_loss_bbox: 0.1798 enc_loss_iou: 0.3334 dn_loss_cls: 0.1869 dn_loss_bbox: 0.1740 dn_loss_iou: 0.2423 d0.dn_loss_cls: 0.2616 d0.dn_loss_bbox: 0.3257 d0.dn_loss_iou: 0.3918 d1.dn_loss_cls: 0.2194 d1.dn_loss_bbox: 0.2074 d1.dn_loss_iou: 0.2746 d2.dn_loss_cls: 0.2087 d2.dn_loss_bbox: 0.1842 d2.dn_loss_iou: 0.2526 d3.dn_loss_cls: 0.1983 d3.dn_loss_bbox: 0.1769 d3.dn_loss_iou: 0.2449 d4.dn_loss_cls: 0.1937 d4.dn_loss_bbox: 0.1741 d4.dn_loss_iou: 0.2423 d1.loss_lmm_region: 0.1357 loss_lmm_image: 0.8796 2024/11/11 20:03:41 - mmengine - INFO - Iter(train) [ 59700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:10:30 time: 1.9710 data_time: 0.0174 memory: 33177 grad_norm: 34.7236 loss: 9.0230 loss_cls: 0.2517 loss_bbox: 0.1397 loss_iou: 0.2210 d0.loss_cls: 0.3039 d0.loss_bbox: 0.1424 d0.loss_iou: 0.2319 d1.loss_cls: 0.2773 d1.loss_bbox: 0.1437 d1.loss_iou: 0.2269 d2.loss_cls: 0.2628 d2.loss_bbox: 0.1395 d2.loss_iou: 0.2249 d3.loss_cls: 0.2550 d3.loss_bbox: 0.1406 d3.loss_iou: 0.2222 d4.loss_cls: 0.2527 d4.loss_bbox: 0.1414 d4.loss_iou: 0.2220 enc_loss_cls: 0.3004 enc_loss_bbox: 0.1612 enc_loss_iou: 0.2572 dn_loss_cls: 0.1117 dn_loss_bbox: 0.1727 dn_loss_iou: 0.2137 d0.dn_loss_cls: 0.1928 d0.dn_loss_bbox: 0.3199 d0.dn_loss_iou: 0.3561 d1.dn_loss_cls: 0.1435 d1.dn_loss_bbox: 0.2091 d1.dn_loss_iou: 0.2445 d2.dn_loss_cls: 0.1242 d2.dn_loss_bbox: 0.1831 d2.dn_loss_iou: 0.2234 d3.dn_loss_cls: 0.1167 d3.dn_loss_bbox: 0.1752 d3.dn_loss_iou: 0.2165 d4.dn_loss_cls: 0.1137 d4.dn_loss_bbox: 0.1728 d4.dn_loss_iou: 0.2137 d1.loss_lmm_region: 0.1360 loss_lmm_image: 0.8654 2024/11/11 20:07:03 - mmengine - INFO - Iter(train) [ 59800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:07:13 time: 2.0027 data_time: 0.0172 memory: 33034 grad_norm: 32.1835 loss: 10.8046 loss_cls: 0.3202 loss_bbox: 0.1694 loss_iou: 0.2790 d0.loss_cls: 0.3702 d0.loss_bbox: 0.1812 d0.loss_iou: 0.2943 d1.loss_cls: 0.3444 d1.loss_bbox: 0.1688 d1.loss_iou: 0.2828 d2.loss_cls: 0.3338 d2.loss_bbox: 0.1695 d2.loss_iou: 0.2812 d3.loss_cls: 0.3275 d3.loss_bbox: 0.1706 d3.loss_iou: 0.2801 d4.loss_cls: 0.3228 d4.loss_bbox: 0.1701 d4.loss_iou: 0.2788 enc_loss_cls: 0.3737 enc_loss_bbox: 0.1996 enc_loss_iou: 0.3135 dn_loss_cls: 0.1746 dn_loss_bbox: 0.1909 dn_loss_iou: 0.2355 d0.dn_loss_cls: 0.2610 d0.dn_loss_bbox: 0.3420 d0.dn_loss_iou: 0.3778 d1.dn_loss_cls: 0.2102 d1.dn_loss_bbox: 0.2258 d1.dn_loss_iou: 0.2632 d2.dn_loss_cls: 0.1952 d2.dn_loss_bbox: 0.2028 d2.dn_loss_iou: 0.2449 d3.dn_loss_cls: 0.1836 d3.dn_loss_bbox: 0.1935 d3.dn_loss_iou: 0.2379 d4.dn_loss_cls: 0.1813 d4.dn_loss_bbox: 0.1909 d4.dn_loss_iou: 0.2356 d1.loss_lmm_region: 0.1626 loss_lmm_image: 0.8639 2024/11/11 20:10:21 - mmengine - INFO - Iter(train) [ 59900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:03:50 time: 1.9850 data_time: 0.0175 memory: 33679 grad_norm: 32.0400 loss: 8.2231 loss_cls: 0.2332 loss_bbox: 0.1176 loss_iou: 0.1941 d0.loss_cls: 0.2749 d0.loss_bbox: 0.1329 d0.loss_iou: 0.2111 d1.loss_cls: 0.2467 d1.loss_bbox: 0.1237 d1.loss_iou: 0.2001 d2.loss_cls: 0.2397 d2.loss_bbox: 0.1187 d2.loss_iou: 0.1944 d3.loss_cls: 0.2351 
d3.loss_bbox: 0.1180 d3.loss_iou: 0.1945 d4.loss_cls: 0.2339 d4.loss_bbox: 0.1177 d4.loss_iou: 0.1941 enc_loss_cls: 0.2835 enc_loss_bbox: 0.1504 enc_loss_iou: 0.2364 dn_loss_cls: 0.0740 dn_loss_bbox: 0.1662 dn_loss_iou: 0.2005 d0.dn_loss_cls: 0.1606 d0.dn_loss_bbox: 0.3386 d0.dn_loss_iou: 0.3583 d1.dn_loss_cls: 0.1099 d1.dn_loss_bbox: 0.2076 d1.dn_loss_iou: 0.2367 d2.dn_loss_cls: 0.0881 d2.dn_loss_bbox: 0.1797 d2.dn_loss_iou: 0.2114 d3.dn_loss_cls: 0.0800 d3.dn_loss_bbox: 0.1684 d3.dn_loss_iou: 0.2029 d4.dn_loss_cls: 0.0751 d4.dn_loss_bbox: 0.1662 d4.dn_loss_iou: 0.2006 d1.loss_lmm_region: 0.1249 loss_lmm_image: 0.8228 2024/11/11 20:13:38 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 20:13:38 - mmengine - INFO - Iter(train) [ 60000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:00:25 time: 1.9676 data_time: 0.0173 memory: 33414 grad_norm: 28.5462 loss: 9.6977 loss_cls: 0.2814 loss_bbox: 0.1405 loss_iou: 0.2498 d0.loss_cls: 0.3201 d0.loss_bbox: 0.1547 d0.loss_iou: 0.2666 d1.loss_cls: 0.2965 d1.loss_bbox: 0.1496 d1.loss_iou: 0.2557 d2.loss_cls: 0.2880 d2.loss_bbox: 0.1427 d2.loss_iou: 0.2534 d3.loss_cls: 0.2860 d3.loss_bbox: 0.1405 d3.loss_iou: 0.2505 d4.loss_cls: 0.2828 d4.loss_bbox: 0.1398 d4.loss_iou: 0.2492 enc_loss_cls: 0.3194 enc_loss_bbox: 0.1723 enc_loss_iou: 0.2907 dn_loss_cls: 0.1128 dn_loss_bbox: 0.1864 dn_loss_iou: 0.2399 d0.dn_loss_cls: 0.1873 d0.dn_loss_bbox: 0.3428 d0.dn_loss_iou: 0.3926 d1.dn_loss_cls: 0.1419 d1.dn_loss_bbox: 0.2181 d1.dn_loss_iou: 0.2725 d2.dn_loss_cls: 0.1247 d2.dn_loss_bbox: 0.1977 d2.dn_loss_iou: 0.2490 d3.dn_loss_cls: 0.1158 d3.dn_loss_bbox: 0.1886 d3.dn_loss_iou: 0.2421 d4.dn_loss_cls: 0.1138 d4.dn_loss_bbox: 0.1865 d4.dn_loss_iou: 0.2398 d1.loss_lmm_region: 0.1301 loss_lmm_image: 0.8851 2024/11/11 20:13:38 - mmengine - INFO - Saving checkpoint at 60000 iterations 2024/11/11 20:23:59 - mmengine - INFO - Iter(val) [100/602] eta: 0:49:05 time: 5.9981 data_time: 0.0017 memory: 7497 2024/11/11 20:34:19 - mmengine - INFO - Iter(val) [200/602] eta: 0:40:27 time: 6.3327 data_time: 0.0018 memory: 7528 2024/11/11 20:45:43 - mmengine - INFO - Iter(val) [300/602] eta: 0:31:43 time: 6.9048 data_time: 0.0018 memory: 7505 2024/11/11 20:58:00 - mmengine - INFO - Iter(val) [400/602] eta: 0:22:07 time: 7.5626 data_time: 0.0018 memory: 7528 2024/11/11 21:10:55 - mmengine - INFO - Iter(val) [500/602] eta: 0:11:34 time: 7.7914 data_time: 0.0019 memory: 7473 2024/11/11 21:24:47 - mmengine - INFO - Iter(val) [600/602] eta: 0:00:14 time: 8.3876 data_time: 0.0019 memory: 7522 2024/11/11 21:32:28 - mmengine - INFO - === 57 classes had less than 10000 detections! Outputting 10000 detections for each class will improve AP further. 
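Training pauses at iteration 60000 to save a checkpoint and run the 602-iteration LVIS validation pass logged above (per-iteration validation time creeps from about 6.0 s to 8.4 s as it progresses). The `mAP_copypaste` dictionary printed just after this gives the overall `AP`/`AP50`/`AP75`, the size breakdown `APs`/`APm`/`APl` (small/medium/large boxes), and the LVIS frequency breakdown `APr`/`APc`/`APf` (rare/common/frequent categories); the `lvis_fixed_ap/` prefix on the summary keys and the note above about classes with fewer than 10000 detections point to the fixed-AP evaluation protocol. A small sketch for pulling that line into a dict, with the values copied from the iteration-60000 evaluation below (`parse_map_copypaste` is an illustrative name):

```python
import ast
import re

def parse_map_copypaste(line: str) -> dict:
    """Extract the metric dict from an 'mAP_copypaste: {...}' log line."""
    match = re.search(r"mAP_copypaste: (\{.*\})", line)
    return ast.literal_eval(match.group(1)) if match else {}

metrics = parse_map_copypaste(
    "mAP_copypaste: {'AP': 0.4754909609392126, 'AP50': 0.5849959064506959, "
    "'AP75': 0.5066121852967392, 'APs': 0.419807876705539, 'APm': 0.6135044369342095, "
    "'APl': 0.7003757626313989, 'APr': 0.35302448692450883, 'APc': 0.43051000036658754, "
    "'APf': 0.5374217696245214}"
)
for name, value in metrics.items():
    print(f"{name:>5}: {value:.4f}")   # AP: 0.4755, APr: 0.3530, ...
```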
=== 2024/11/11 21:34:56 - mmengine - INFO - mAP_copypaste: {'AP': 0.4754909609392126, 'AP50': 0.5849959064506959, 'AP75': 0.5066121852967392, 'APs': 0.419807876705539, 'APm': 0.6135044369342095, 'APl': 0.7003757626313989, 'APr': 0.35302448692450883, 'APc': 0.43051000036658754, 'APf': 0.5374217696245214} 2024/11/11 21:35:21 - mmengine - INFO - Iter(val) [602/602] lvis_fixed_ap/AP: 0.4755 lvis_fixed_ap/AP50: 0.5850 lvis_fixed_ap/AP75: 0.5066 lvis_fixed_ap/APs: 0.4198 lvis_fixed_ap/APm: 0.6135 lvis_fixed_ap/APl: 0.7004 lvis_fixed_ap/APr: 0.3530 lvis_fixed_ap/APc: 0.4305 lvis_fixed_ap/APf: 0.5374 data_time: 0.0020 time: 7.0648 2024/11/11 21:38:42 - mmengine - INFO - Iter(train) [ 60100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:12:29 time: 2.0038 data_time: 0.0179 memory: 34302 grad_norm: 30.6712 loss: 9.6387 loss_cls: 0.3135 loss_bbox: 0.1251 loss_iou: 0.2474 d0.loss_cls: 0.3585 d0.loss_bbox: 0.1374 d0.loss_iou: 0.2639 d1.loss_cls: 0.3341 d1.loss_bbox: 0.1297 d1.loss_iou: 0.2570 d2.loss_cls: 0.3212 d2.loss_bbox: 0.1274 d2.loss_iou: 0.2557 d3.loss_cls: 0.3210 d3.loss_bbox: 0.1244 d3.loss_iou: 0.2499 d4.loss_cls: 0.3156 d4.loss_bbox: 0.1250 d4.loss_iou: 0.2477 enc_loss_cls: 0.3559 enc_loss_bbox: 0.1475 enc_loss_iou: 0.2851 dn_loss_cls: 0.1623 dn_loss_bbox: 0.1450 dn_loss_iou: 0.2080 d0.dn_loss_cls: 0.2331 d0.dn_loss_bbox: 0.2641 d0.dn_loss_iou: 0.3401 d1.dn_loss_cls: 0.1962 d1.dn_loss_bbox: 0.1694 d1.dn_loss_iou: 0.2353 d2.dn_loss_cls: 0.1750 d2.dn_loss_bbox: 0.1518 d2.dn_loss_iou: 0.2158 d3.dn_loss_cls: 0.1679 d3.dn_loss_bbox: 0.1457 d3.dn_loss_iou: 0.2100 d4.dn_loss_cls: 0.1621 d4.dn_loss_bbox: 0.1450 d4.dn_loss_iou: 0.2080 d1.loss_lmm_region: 0.1433 loss_lmm_image: 0.9177 2024/11/11 21:42:03 - mmengine - INFO - Iter(train) [ 60200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:09:08 time: 1.9865 data_time: 0.0177 memory: 35076 grad_norm: 32.4658 loss: 9.8925 loss_cls: 0.3176 loss_bbox: 0.1383 loss_iou: 0.2336 d0.loss_cls: 0.3533 d0.loss_bbox: 0.1609 d0.loss_iou: 0.2543 d1.loss_cls: 0.3376 d1.loss_bbox: 0.1438 d1.loss_iou: 0.2423 d2.loss_cls: 0.3222 d2.loss_bbox: 0.1423 d2.loss_iou: 0.2364 d3.loss_cls: 0.3202 d3.loss_bbox: 0.1391 d3.loss_iou: 0.2329 d4.loss_cls: 0.3159 d4.loss_bbox: 0.1387 d4.loss_iou: 0.2333 enc_loss_cls: 0.3586 enc_loss_bbox: 0.1676 enc_loss_iou: 0.2739 dn_loss_cls: 0.1530 dn_loss_bbox: 0.1688 dn_loss_iou: 0.2145 d0.dn_loss_cls: 0.2445 d0.dn_loss_bbox: 0.3207 d0.dn_loss_iou: 0.3581 d1.dn_loss_cls: 0.1944 d1.dn_loss_bbox: 0.2046 d1.dn_loss_iou: 0.2490 d2.dn_loss_cls: 0.1733 d2.dn_loss_bbox: 0.1817 d2.dn_loss_iou: 0.2247 d3.dn_loss_cls: 0.1618 d3.dn_loss_bbox: 0.1719 d3.dn_loss_iou: 0.2172 d4.dn_loss_cls: 0.1553 d4.dn_loss_bbox: 0.1689 d4.dn_loss_iou: 0.2146 d1.loss_lmm_region: 0.1848 loss_lmm_image: 0.8683 2024/11/11 21:45:24 - mmengine - INFO - Iter(train) [ 60300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:05:47 time: 2.0067 data_time: 0.0179 memory: 32678 grad_norm: 27.9098 loss: 9.4813 loss_cls: 0.2695 loss_bbox: 0.1297 loss_iou: 0.2233 d0.loss_cls: 0.3147 d0.loss_bbox: 0.1443 d0.loss_iou: 0.2331 d1.loss_cls: 0.2920 d1.loss_bbox: 0.1391 d1.loss_iou: 0.2269 d2.loss_cls: 0.2785 d2.loss_bbox: 0.1344 d2.loss_iou: 0.2245 d3.loss_cls: 0.2760 d3.loss_bbox: 0.1290 d3.loss_iou: 0.2221 d4.loss_cls: 0.2699 d4.loss_bbox: 0.1323 d4.loss_iou: 0.2235 enc_loss_cls: 0.3185 enc_loss_bbox: 0.1622 enc_loss_iou: 0.2574 dn_loss_cls: 0.1344 dn_loss_bbox: 0.1944 dn_loss_iou: 0.2184 d0.dn_loss_cls: 0.2138 d0.dn_loss_bbox: 0.3543 d0.dn_loss_iou: 0.3638 
d1.dn_loss_cls: 0.1670 d1.dn_loss_bbox: 0.2296 d1.dn_loss_iou: 0.2500 d2.dn_loss_cls: 0.1449 d2.dn_loss_bbox: 0.2055 d2.dn_loss_iou: 0.2283 d3.dn_loss_cls: 0.1409 d3.dn_loss_bbox: 0.1952 d3.dn_loss_iou: 0.2208 d4.dn_loss_cls: 0.1341 d4.dn_loss_bbox: 0.1942 d4.dn_loss_iou: 0.2183 d1.loss_lmm_region: 0.1563 loss_lmm_image: 0.9162 2024/11/11 21:48:41 - mmengine - INFO - Iter(train) [ 60400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 2:02:19 time: 1.9710 data_time: 0.0177 memory: 33801 grad_norm: 31.6151 loss: 9.1569 loss_cls: 0.2892 loss_bbox: 0.1294 loss_iou: 0.2248 d0.loss_cls: 0.3293 d0.loss_bbox: 0.1318 d0.loss_iou: 0.2327 d1.loss_cls: 0.3060 d1.loss_bbox: 0.1319 d1.loss_iou: 0.2282 d2.loss_cls: 0.3011 d2.loss_bbox: 0.1235 d2.loss_iou: 0.2223 d3.loss_cls: 0.2995 d3.loss_bbox: 0.1236 d3.loss_iou: 0.2199 d4.loss_cls: 0.3001 d4.loss_bbox: 0.1225 d4.loss_iou: 0.2189 enc_loss_cls: 0.3275 enc_loss_bbox: 0.1464 enc_loss_iou: 0.2547 dn_loss_cls: 0.1319 dn_loss_bbox: 0.1606 dn_loss_iou: 0.2055 d0.dn_loss_cls: 0.2075 d0.dn_loss_bbox: 0.3150 d0.dn_loss_iou: 0.3477 d1.dn_loss_cls: 0.1584 d1.dn_loss_bbox: 0.1953 d1.dn_loss_iou: 0.2373 d2.dn_loss_cls: 0.1436 d2.dn_loss_bbox: 0.1701 d2.dn_loss_iou: 0.2156 d3.dn_loss_cls: 0.1353 d3.dn_loss_bbox: 0.1627 d3.dn_loss_iou: 0.2084 d4.dn_loss_cls: 0.1304 d4.dn_loss_bbox: 0.1606 d4.dn_loss_iou: 0.2054 d1.loss_lmm_region: 0.1637 loss_lmm_image: 0.8385 2024/11/11 21:51:59 - mmengine - INFO - Iter(train) [ 60500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:58:54 time: 2.0194 data_time: 0.0177 memory: 33647 grad_norm: 31.0575 loss: 9.3766 loss_cls: 0.2691 loss_bbox: 0.1358 loss_iou: 0.2460 d0.loss_cls: 0.3083 d0.loss_bbox: 0.1444 d0.loss_iou: 0.2613 d1.loss_cls: 0.2897 d1.loss_bbox: 0.1367 d1.loss_iou: 0.2505 d2.loss_cls: 0.2795 d2.loss_bbox: 0.1370 d2.loss_iou: 0.2480 d3.loss_cls: 0.2727 d3.loss_bbox: 0.1373 d3.loss_iou: 0.2495 d4.loss_cls: 0.2702 d4.loss_bbox: 0.1347 d4.loss_iou: 0.2439 enc_loss_cls: 0.3208 enc_loss_bbox: 0.1542 enc_loss_iou: 0.2769 dn_loss_cls: 0.1187 dn_loss_bbox: 0.1655 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.2004 d0.dn_loss_bbox: 0.3307 d0.dn_loss_iou: 0.3687 d1.dn_loss_cls: 0.1494 d1.dn_loss_bbox: 0.1995 d1.dn_loss_iou: 0.2563 d2.dn_loss_cls: 0.1307 d2.dn_loss_bbox: 0.1756 d2.dn_loss_iou: 0.2345 d3.dn_loss_cls: 0.1226 d3.dn_loss_bbox: 0.1669 d3.dn_loss_iou: 0.2277 d4.dn_loss_cls: 0.1181 d4.dn_loss_bbox: 0.1654 d4.dn_loss_iou: 0.2248 d1.loss_lmm_region: 0.1581 loss_lmm_image: 0.8717 2024/11/11 21:55:20 - mmengine - INFO - Iter(train) [ 60600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:55:32 time: 1.9947 data_time: 0.0177 memory: 35467 grad_norm: 33.6313 loss: 9.1627 loss_cls: 0.2872 loss_bbox: 0.1179 loss_iou: 0.2052 d0.loss_cls: 0.3230 d0.loss_bbox: 0.1315 d0.loss_iou: 0.2208 d1.loss_cls: 0.3058 d1.loss_bbox: 0.1176 d1.loss_iou: 0.2084 d2.loss_cls: 0.2964 d2.loss_bbox: 0.1131 d2.loss_iou: 0.2025 d3.loss_cls: 0.2975 d3.loss_bbox: 0.1166 d3.loss_iou: 0.2021 d4.loss_cls: 0.2915 d4.loss_bbox: 0.1157 d4.loss_iou: 0.2037 enc_loss_cls: 0.3205 enc_loss_bbox: 0.1501 enc_loss_iou: 0.2409 dn_loss_cls: 0.1379 dn_loss_bbox: 0.1777 dn_loss_iou: 0.2118 d0.dn_loss_cls: 0.1989 d0.dn_loss_bbox: 0.3266 d0.dn_loss_iou: 0.3521 d1.dn_loss_cls: 0.1590 d1.dn_loss_bbox: 0.2089 d1.dn_loss_iou: 0.2431 d2.dn_loss_cls: 0.1440 d2.dn_loss_bbox: 0.1863 d2.dn_loss_iou: 0.2213 d3.dn_loss_cls: 0.1408 d3.dn_loss_bbox: 0.1790 d3.dn_loss_iou: 0.2141 d4.dn_loss_cls: 0.1382 d4.dn_loss_bbox: 0.1775 d4.dn_loss_iou: 0.2118 d1.loss_lmm_region: 
0.1529 loss_lmm_image: 0.9130 2024/11/11 21:58:40 - mmengine - INFO - Iter(train) [ 60700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:52:10 time: 2.0276 data_time: 0.0178 memory: 33887 grad_norm: 30.7460 loss: 9.9859 loss_cls: 0.3274 loss_bbox: 0.1529 loss_iou: 0.2786 d0.loss_cls: 0.3690 d0.loss_bbox: 0.1629 d0.loss_iou: 0.2891 d1.loss_cls: 0.3360 d1.loss_bbox: 0.1611 d1.loss_iou: 0.2875 d2.loss_cls: 0.3313 d2.loss_bbox: 0.1571 d2.loss_iou: 0.2814 d3.loss_cls: 0.3312 d3.loss_bbox: 0.1538 d3.loss_iou: 0.2788 d4.loss_cls: 0.3288 d4.loss_bbox: 0.1548 d4.loss_iou: 0.2783 enc_loss_cls: 0.3700 enc_loss_bbox: 0.1782 enc_loss_iou: 0.3145 dn_loss_cls: 0.1134 dn_loss_bbox: 0.1588 dn_loss_iou: 0.2138 d0.dn_loss_cls: 0.1876 d0.dn_loss_bbox: 0.3099 d0.dn_loss_iou: 0.3597 d1.dn_loss_cls: 0.1428 d1.dn_loss_bbox: 0.1870 d1.dn_loss_iou: 0.2438 d2.dn_loss_cls: 0.1248 d2.dn_loss_bbox: 0.1675 d2.dn_loss_iou: 0.2222 d3.dn_loss_cls: 0.1169 d3.dn_loss_bbox: 0.1611 d3.dn_loss_iou: 0.2159 d4.dn_loss_cls: 0.1136 d4.dn_loss_bbox: 0.1589 d4.dn_loss_iou: 0.2137 d1.loss_lmm_region: 0.1472 loss_lmm_image: 0.9047 2024/11/11 22:01:59 - mmengine - INFO - Iter(train) [ 60800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:48:47 time: 1.9933 data_time: 0.0177 memory: 34301 grad_norm: 34.1783 loss: 9.3991 loss_cls: 0.3149 loss_bbox: 0.1296 loss_iou: 0.2494 d0.loss_cls: 0.3653 d0.loss_bbox: 0.1404 d0.loss_iou: 0.2622 d1.loss_cls: 0.3406 d1.loss_bbox: 0.1284 d1.loss_iou: 0.2503 d2.loss_cls: 0.3253 d2.loss_bbox: 0.1294 d2.loss_iou: 0.2495 d3.loss_cls: 0.3196 d3.loss_bbox: 0.1325 d3.loss_iou: 0.2496 d4.loss_cls: 0.3141 d4.loss_bbox: 0.1306 d4.loss_iou: 0.2505 enc_loss_cls: 0.3662 enc_loss_bbox: 0.1566 enc_loss_iou: 0.2901 dn_loss_cls: 0.1076 dn_loss_bbox: 0.1539 dn_loss_iou: 0.2131 d0.dn_loss_cls: 0.1822 d0.dn_loss_bbox: 0.2842 d0.dn_loss_iou: 0.3509 d1.dn_loss_cls: 0.1376 d1.dn_loss_bbox: 0.1859 d1.dn_loss_iou: 0.2424 d2.dn_loss_cls: 0.1209 d2.dn_loss_bbox: 0.1664 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.1121 d3.dn_loss_bbox: 0.1567 d3.dn_loss_iou: 0.2152 d4.dn_loss_cls: 0.1066 d4.dn_loss_bbox: 0.1539 d4.dn_loss_iou: 0.2131 d1.loss_lmm_region: 0.1336 loss_lmm_image: 0.8457 2024/11/11 22:05:19 - mmengine - INFO - Iter(train) [ 60900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:45:24 time: 2.0084 data_time: 0.0176 memory: 34734 grad_norm: 31.2437 loss: 10.1439 loss_cls: 0.3392 loss_bbox: 0.1430 loss_iou: 0.2834 d0.loss_cls: 0.3907 d0.loss_bbox: 0.1567 d0.loss_iou: 0.3023 d1.loss_cls: 0.3633 d1.loss_bbox: 0.1451 d1.loss_iou: 0.2900 d2.loss_cls: 0.3434 d2.loss_bbox: 0.1484 d2.loss_iou: 0.2891 d3.loss_cls: 0.3423 d3.loss_bbox: 0.1433 d3.loss_iou: 0.2846 d4.loss_cls: 0.3418 d4.loss_bbox: 0.1407 d4.loss_iou: 0.2818 enc_loss_cls: 0.3947 enc_loss_bbox: 0.1767 enc_loss_iou: 0.3274 dn_loss_cls: 0.1146 dn_loss_bbox: 0.1539 dn_loss_iou: 0.2292 d0.dn_loss_cls: 0.1999 d0.dn_loss_bbox: 0.2944 d0.dn_loss_iou: 0.3770 d1.dn_loss_cls: 0.1490 d1.dn_loss_bbox: 0.1838 d1.dn_loss_iou: 0.2619 d2.dn_loss_cls: 0.1295 d2.dn_loss_bbox: 0.1620 d2.dn_loss_iou: 0.2381 d3.dn_loss_cls: 0.1223 d3.dn_loss_bbox: 0.1568 d3.dn_loss_iou: 0.2324 d4.dn_loss_cls: 0.1168 d4.dn_loss_bbox: 0.1539 d4.dn_loss_iou: 0.2293 d1.loss_lmm_region: 0.1513 loss_lmm_image: 0.8598 2024/11/11 22:08:37 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 22:08:37 - mmengine - INFO - Iter(train) [ 61000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:41:58 time: 1.9752 data_time: 0.0178 memory: 35248 grad_norm: 
28.3658 loss: 8.9792 loss_cls: 0.2976 loss_bbox: 0.1273 loss_iou: 0.2223 d0.loss_cls: 0.3495 d0.loss_bbox: 0.1369 d0.loss_iou: 0.2349 d1.loss_cls: 0.3182 d1.loss_bbox: 0.1326 d1.loss_iou: 0.2273 d2.loss_cls: 0.3070 d2.loss_bbox: 0.1291 d2.loss_iou: 0.2274 d3.loss_cls: 0.3015 d3.loss_bbox: 0.1285 d3.loss_iou: 0.2253 d4.loss_cls: 0.2978 d4.loss_bbox: 0.1270 d4.loss_iou: 0.2239 enc_loss_cls: 0.3424 enc_loss_bbox: 0.1540 enc_loss_iou: 0.2611 dn_loss_cls: 0.1120 dn_loss_bbox: 0.1483 dn_loss_iou: 0.1932 d0.dn_loss_cls: 0.1823 d0.dn_loss_bbox: 0.2934 d0.dn_loss_iou: 0.3316 d1.dn_loss_cls: 0.1407 d1.dn_loss_bbox: 0.1816 d1.dn_loss_iou: 0.2258 d2.dn_loss_cls: 0.1229 d2.dn_loss_bbox: 0.1597 d2.dn_loss_iou: 0.2044 d3.dn_loss_cls: 0.1181 d3.dn_loss_bbox: 0.1504 d3.dn_loss_iou: 0.1962 d4.dn_loss_cls: 0.1136 d4.dn_loss_bbox: 0.1483 d4.dn_loss_iou: 0.1933 d1.loss_lmm_region: 0.1233 loss_lmm_image: 0.8684 2024/11/11 22:11:55 - mmengine - INFO - Iter(train) [ 61100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:38:33 time: 1.9837 data_time: 0.0178 memory: 33876 grad_norm: 32.3767 loss: 8.8909 loss_cls: 0.2810 loss_bbox: 0.1186 loss_iou: 0.2521 d0.loss_cls: 0.3214 d0.loss_bbox: 0.1328 d0.loss_iou: 0.2697 d1.loss_cls: 0.2963 d1.loss_bbox: 0.1244 d1.loss_iou: 0.2610 d2.loss_cls: 0.2926 d2.loss_bbox: 0.1163 d2.loss_iou: 0.2543 d3.loss_cls: 0.2825 d3.loss_bbox: 0.1196 d3.loss_iou: 0.2541 d4.loss_cls: 0.2799 d4.loss_bbox: 0.1197 d4.loss_iou: 0.2543 enc_loss_cls: 0.3194 enc_loss_bbox: 0.1521 enc_loss_iou: 0.2957 dn_loss_cls: 0.0938 dn_loss_bbox: 0.1299 dn_loss_iou: 0.2148 d0.dn_loss_cls: 0.1681 d0.dn_loss_bbox: 0.2726 d0.dn_loss_iou: 0.3617 d1.dn_loss_cls: 0.1209 d1.dn_loss_bbox: 0.1617 d1.dn_loss_iou: 0.2479 d2.dn_loss_cls: 0.1069 d2.dn_loss_bbox: 0.1405 d2.dn_loss_iou: 0.2241 d3.dn_loss_cls: 0.0986 d3.dn_loss_bbox: 0.1315 d3.dn_loss_iou: 0.2171 d4.dn_loss_cls: 0.0952 d4.dn_loss_bbox: 0.1298 d4.dn_loss_iou: 0.2147 d1.loss_lmm_region: 0.1081 loss_lmm_image: 0.8554 2024/11/11 22:15:14 - mmengine - INFO - Iter(train) [ 61200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:35:09 time: 1.9839 data_time: 0.0177 memory: 35014 grad_norm: 39.2383 loss: 11.1227 loss_cls: 0.3691 loss_bbox: 0.1722 loss_iou: 0.2730 d0.loss_cls: 0.4157 d0.loss_bbox: 0.1867 d0.loss_iou: 0.2922 d1.loss_cls: 0.3876 d1.loss_bbox: 0.1783 d1.loss_iou: 0.2871 d2.loss_cls: 0.3814 d2.loss_bbox: 0.1718 d2.loss_iou: 0.2760 d3.loss_cls: 0.3733 d3.loss_bbox: 0.1731 d3.loss_iou: 0.2732 d4.loss_cls: 0.3715 d4.loss_bbox: 0.1727 d4.loss_iou: 0.2727 enc_loss_cls: 0.4030 enc_loss_bbox: 0.2086 enc_loss_iou: 0.3167 dn_loss_cls: 0.1689 dn_loss_bbox: 0.1999 dn_loss_iou: 0.2246 d0.dn_loss_cls: 0.2494 d0.dn_loss_bbox: 0.3615 d0.dn_loss_iou: 0.3701 d1.dn_loss_cls: 0.2069 d1.dn_loss_bbox: 0.2378 d1.dn_loss_iou: 0.2560 d2.dn_loss_cls: 0.1869 d2.dn_loss_bbox: 0.2133 d2.dn_loss_iou: 0.2365 d3.dn_loss_cls: 0.1780 d3.dn_loss_bbox: 0.2029 d3.dn_loss_iou: 0.2278 d4.dn_loss_cls: 0.1708 d4.dn_loss_bbox: 0.1999 d4.dn_loss_iou: 0.2245 d1.loss_lmm_region: 0.1549 loss_lmm_image: 0.8960 2024/11/11 22:18:32 - mmengine - INFO - Iter(train) [ 61300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:31:44 time: 1.9752 data_time: 0.0176 memory: 33883 grad_norm: 31.6927 loss: 9.1760 loss_cls: 0.2707 loss_bbox: 0.1235 loss_iou: 0.2335 d0.loss_cls: 0.3142 d0.loss_bbox: 0.1375 d0.loss_iou: 0.2496 d1.loss_cls: 0.2929 d1.loss_bbox: 0.1295 d1.loss_iou: 0.2394 d2.loss_cls: 0.2794 d2.loss_bbox: 0.1263 d2.loss_iou: 0.2375 d3.loss_cls: 0.2737 d3.loss_bbox: 0.1247 
d3.loss_iou: 0.2356 d4.loss_cls: 0.2694 d4.loss_bbox: 0.1233 d4.loss_iou: 0.2346 enc_loss_cls: 0.3145 enc_loss_bbox: 0.1566 enc_loss_iou: 0.2738 dn_loss_cls: 0.1035 dn_loss_bbox: 0.1692 dn_loss_iou: 0.2212 d0.dn_loss_cls: 0.1875 d0.dn_loss_bbox: 0.3342 d0.dn_loss_iou: 0.3814 d1.dn_loss_cls: 0.1354 d1.dn_loss_bbox: 0.2036 d1.dn_loss_iou: 0.2576 d2.dn_loss_cls: 0.1134 d2.dn_loss_bbox: 0.1785 d2.dn_loss_iou: 0.2325 d3.dn_loss_cls: 0.1063 d3.dn_loss_bbox: 0.1707 d3.dn_loss_iou: 0.2239 d4.dn_loss_cls: 0.1026 d4.dn_loss_bbox: 0.1693 d4.dn_loss_iou: 0.2213 d1.loss_lmm_region: 0.1574 loss_lmm_image: 0.8659 2024/11/11 22:21:53 - mmengine - INFO - Iter(train) [ 61400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:28:22 time: 1.9959 data_time: 0.0176 memory: 34396 grad_norm: 30.9732 loss: 9.4105 loss_cls: 0.2770 loss_bbox: 0.1452 loss_iou: 0.2287 d0.loss_cls: 0.3160 d0.loss_bbox: 0.1659 d0.loss_iou: 0.2456 d1.loss_cls: 0.2914 d1.loss_bbox: 0.1553 d1.loss_iou: 0.2387 d2.loss_cls: 0.2856 d2.loss_bbox: 0.1469 d2.loss_iou: 0.2295 d3.loss_cls: 0.2807 d3.loss_bbox: 0.1468 d3.loss_iou: 0.2293 d4.loss_cls: 0.2784 d4.loss_bbox: 0.1455 d4.loss_iou: 0.2290 enc_loss_cls: 0.3195 enc_loss_bbox: 0.1775 enc_loss_iou: 0.2658 dn_loss_cls: 0.0941 dn_loss_bbox: 0.1890 dn_loss_iou: 0.2222 d0.dn_loss_cls: 0.1805 d0.dn_loss_bbox: 0.3624 d0.dn_loss_iou: 0.3740 d1.dn_loss_cls: 0.1243 d1.dn_loss_bbox: 0.2324 d1.dn_loss_iou: 0.2587 d2.dn_loss_cls: 0.1043 d2.dn_loss_bbox: 0.2009 d2.dn_loss_iou: 0.2339 d3.dn_loss_cls: 0.0985 d3.dn_loss_bbox: 0.1921 d3.dn_loss_iou: 0.2254 d4.dn_loss_cls: 0.0942 d4.dn_loss_bbox: 0.1890 d4.dn_loss_iou: 0.2222 d1.loss_lmm_region: 0.1352 loss_lmm_image: 0.8788 2024/11/11 22:25:11 - mmengine - INFO - Iter(train) [ 61500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:24:57 time: 1.9835 data_time: 0.0177 memory: 32134 grad_norm: 28.7831 loss: 10.5782 loss_cls: 0.3180 loss_bbox: 0.1730 loss_iou: 0.2803 d0.loss_cls: 0.3789 d0.loss_bbox: 0.1890 d0.loss_iou: 0.2972 d1.loss_cls: 0.3434 d1.loss_bbox: 0.1768 d1.loss_iou: 0.2857 d2.loss_cls: 0.3339 d2.loss_bbox: 0.1725 d2.loss_iou: 0.2816 d3.loss_cls: 0.3244 d3.loss_bbox: 0.1733 d3.loss_iou: 0.2807 d4.loss_cls: 0.3210 d4.loss_bbox: 0.1729 d4.loss_iou: 0.2812 enc_loss_cls: 0.3855 enc_loss_bbox: 0.1983 enc_loss_iou: 0.3151 dn_loss_cls: 0.1245 dn_loss_bbox: 0.2004 dn_loss_iou: 0.2281 d0.dn_loss_cls: 0.2095 d0.dn_loss_bbox: 0.3795 d0.dn_loss_iou: 0.3856 d1.dn_loss_cls: 0.1583 d1.dn_loss_bbox: 0.2430 d1.dn_loss_iou: 0.2636 d2.dn_loss_cls: 0.1377 d2.dn_loss_bbox: 0.2148 d2.dn_loss_iou: 0.2389 d3.dn_loss_cls: 0.1295 d3.dn_loss_bbox: 0.2031 d3.dn_loss_iou: 0.2307 d4.dn_loss_cls: 0.1246 d4.dn_loss_bbox: 0.2005 d4.dn_loss_iou: 0.2281 d1.loss_lmm_region: 0.1533 loss_lmm_image: 0.8417 2024/11/11 22:28:30 - mmengine - INFO - Iter(train) [ 61600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:21:34 time: 1.9792 data_time: 0.0177 memory: 35583 grad_norm: 28.0296 loss: 10.0336 loss_cls: 0.3006 loss_bbox: 0.1499 loss_iou: 0.2743 d0.loss_cls: 0.3574 d0.loss_bbox: 0.1624 d0.loss_iou: 0.2935 d1.loss_cls: 0.3242 d1.loss_bbox: 0.1578 d1.loss_iou: 0.2839 d2.loss_cls: 0.3127 d2.loss_bbox: 0.1509 d2.loss_iou: 0.2757 d3.loss_cls: 0.3064 d3.loss_bbox: 0.1507 d3.loss_iou: 0.2756 d4.loss_cls: 0.3026 d4.loss_bbox: 0.1500 d4.loss_iou: 0.2755 enc_loss_cls: 0.3544 enc_loss_bbox: 0.1791 enc_loss_iou: 0.3133 dn_loss_cls: 0.1081 dn_loss_bbox: 0.1820 dn_loss_iou: 0.2308 d0.dn_loss_cls: 0.1966 d0.dn_loss_bbox: 0.3528 d0.dn_loss_iou: 0.3790 d1.dn_loss_cls: 0.1403 
d1.dn_loss_bbox: 0.2160 d1.dn_loss_iou: 0.2616 d2.dn_loss_cls: 0.1196 d2.dn_loss_bbox: 0.1925 d2.dn_loss_iou: 0.2403 d3.dn_loss_cls: 0.1129 d3.dn_loss_bbox: 0.1843 d3.dn_loss_iou: 0.2329 d4.dn_loss_cls: 0.1081 d4.dn_loss_bbox: 0.1820 d4.dn_loss_iou: 0.2308 d1.loss_lmm_region: 0.1407 loss_lmm_image: 0.8712 2024/11/11 22:31:49 - mmengine - INFO - Iter(train) [ 61700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:18:08 time: 1.9962 data_time: 0.0177 memory: 33757 grad_norm: 27.7601 loss: 9.9807 loss_cls: 0.3285 loss_bbox: 0.1426 loss_iou: 0.2742 d0.loss_cls: 0.3837 d0.loss_bbox: 0.1476 d0.loss_iou: 0.2837 d1.loss_cls: 0.3536 d1.loss_bbox: 0.1433 d1.loss_iou: 0.2788 d2.loss_cls: 0.3445 d2.loss_bbox: 0.1405 d2.loss_iou: 0.2747 d3.loss_cls: 0.3360 d3.loss_bbox: 0.1392 d3.loss_iou: 0.2733 d4.loss_cls: 0.3259 d4.loss_bbox: 0.1453 d4.loss_iou: 0.2756 enc_loss_cls: 0.3807 enc_loss_bbox: 0.1648 enc_loss_iou: 0.3135 dn_loss_cls: 0.1225 dn_loss_bbox: 0.1681 dn_loss_iou: 0.2181 d0.dn_loss_cls: 0.2018 d0.dn_loss_bbox: 0.3107 d0.dn_loss_iou: 0.3593 d1.dn_loss_cls: 0.1547 d1.dn_loss_bbox: 0.2008 d1.dn_loss_iou: 0.2490 d2.dn_loss_cls: 0.1376 d2.dn_loss_bbox: 0.1768 d2.dn_loss_iou: 0.2265 d3.dn_loss_cls: 0.1288 d3.dn_loss_bbox: 0.1696 d3.dn_loss_iou: 0.2202 d4.dn_loss_cls: 0.1244 d4.dn_loss_bbox: 0.1681 d4.dn_loss_iou: 0.2181 d1.loss_lmm_region: 0.1423 loss_lmm_image: 0.8332 2024/11/11 22:35:07 - mmengine - INFO - Iter(train) [ 61800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:14:44 time: 1.9551 data_time: 0.0176 memory: 32928 grad_norm: 32.4521 loss: 9.5906 loss_cls: 0.3093 loss_bbox: 0.1261 loss_iou: 0.2049 d0.loss_cls: 0.3403 d0.loss_bbox: 0.1468 d0.loss_iou: 0.2239 d1.loss_cls: 0.3176 d1.loss_bbox: 0.1361 d1.loss_iou: 0.2149 d2.loss_cls: 0.3094 d2.loss_bbox: 0.1301 d2.loss_iou: 0.2096 d3.loss_cls: 0.3123 d3.loss_bbox: 0.1275 d3.loss_iou: 0.2057 d4.loss_cls: 0.3096 d4.loss_bbox: 0.1276 d4.loss_iou: 0.2058 enc_loss_cls: 0.3505 enc_loss_bbox: 0.1587 enc_loss_iou: 0.2414 dn_loss_cls: 0.1565 dn_loss_bbox: 0.1914 dn_loss_iou: 0.2184 d0.dn_loss_cls: 0.2296 d0.dn_loss_bbox: 0.3640 d0.dn_loss_iou: 0.3669 d1.dn_loss_cls: 0.1825 d1.dn_loss_bbox: 0.2273 d1.dn_loss_iou: 0.2496 d2.dn_loss_cls: 0.1647 d2.dn_loss_bbox: 0.2036 d2.dn_loss_iou: 0.2284 d3.dn_loss_cls: 0.1591 d3.dn_loss_bbox: 0.1940 d3.dn_loss_iou: 0.2211 d4.dn_loss_cls: 0.1567 d4.dn_loss_bbox: 0.1915 d4.dn_loss_iou: 0.2184 d1.loss_lmm_region: 0.1429 loss_lmm_image: 0.8155 2024/11/11 22:38:22 - mmengine - INFO - Iter(train) [ 61900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:11:14 time: 1.9491 data_time: 0.0177 memory: 34006 grad_norm: 30.4221 loss: 9.4178 loss_cls: 0.3050 loss_bbox: 0.1334 loss_iou: 0.2488 d0.loss_cls: 0.3674 d0.loss_bbox: 0.1445 d0.loss_iou: 0.2635 d1.loss_cls: 0.3358 d1.loss_bbox: 0.1371 d1.loss_iou: 0.2551 d2.loss_cls: 0.3197 d2.loss_bbox: 0.1333 d2.loss_iou: 0.2494 d3.loss_cls: 0.3134 d3.loss_bbox: 0.1330 d3.loss_iou: 0.2500 d4.loss_cls: 0.3066 d4.loss_bbox: 0.1331 d4.loss_iou: 0.2488 enc_loss_cls: 0.3693 enc_loss_bbox: 0.1603 enc_loss_iou: 0.2932 dn_loss_cls: 0.1179 dn_loss_bbox: 0.1437 dn_loss_iou: 0.1983 d0.dn_loss_cls: 0.2024 d0.dn_loss_bbox: 0.2771 d0.dn_loss_iou: 0.3308 d1.dn_loss_cls: 0.1495 d1.dn_loss_bbox: 0.1783 d1.dn_loss_iou: 0.2278 d2.dn_loss_cls: 0.1312 d2.dn_loss_bbox: 0.1542 d2.dn_loss_iou: 0.2073 d3.dn_loss_cls: 0.1246 d3.dn_loss_bbox: 0.1457 d3.dn_loss_iou: 0.2006 d4.dn_loss_cls: 0.1187 d4.dn_loss_bbox: 0.1437 d4.dn_loss_iou: 0.1982 d1.loss_lmm_region: 0.1738 loss_lmm_image: 
0.8932 2024/11/11 22:41:42 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 22:41:42 - mmengine - INFO - Iter(train) [ 62000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:07:51 time: 1.9888 data_time: 0.0177 memory: 32468 grad_norm: nan loss: 7.7463 loss_cls: 0.2234 loss_bbox: 0.0968 loss_iou: 0.1710 d0.loss_cls: 0.2634 d0.loss_bbox: 0.1056 d0.loss_iou: 0.1845 d1.loss_cls: 0.2452 d1.loss_bbox: 0.1014 d1.loss_iou: 0.1757 d2.loss_cls: 0.2313 d2.loss_bbox: 0.0970 d2.loss_iou: 0.1719 d3.loss_cls: 0.2259 d3.loss_bbox: 0.0964 d3.loss_iou: 0.1708 d4.loss_cls: 0.2213 d4.loss_bbox: 0.0990 d4.loss_iou: 0.1729 enc_loss_cls: 0.2685 enc_loss_bbox: 0.1210 enc_loss_iou: 0.2082 dn_loss_cls: 0.1234 dn_loss_bbox: 0.1395 dn_loss_iou: 0.1686 d0.dn_loss_cls: 0.2018 d0.dn_loss_bbox: 0.2873 d0.dn_loss_iou: 0.3169 d1.dn_loss_cls: 0.1519 d1.dn_loss_bbox: 0.1720 d1.dn_loss_iou: 0.2023 d2.dn_loss_cls: 0.1382 d2.dn_loss_bbox: 0.1499 d2.dn_loss_iou: 0.1786 d3.dn_loss_cls: 0.1317 d3.dn_loss_bbox: 0.1423 d3.dn_loss_iou: 0.1719 d4.dn_loss_cls: 0.1276 d4.dn_loss_bbox: 0.1395 d4.dn_loss_iou: 0.1686 d1.loss_lmm_region: 0.1316 loss_lmm_image: 0.8513 2024/11/11 22:45:01 - mmengine - INFO - Iter(train) [ 62100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:04:28 time: 2.0001 data_time: 0.0177 memory: 33721 grad_norm: 25.2017 loss: 8.8205 loss_cls: 0.2786 loss_bbox: 0.1210 loss_iou: 0.2140 d0.loss_cls: 0.3175 d0.loss_bbox: 0.1370 d0.loss_iou: 0.2347 d1.loss_cls: 0.2942 d1.loss_bbox: 0.1308 d1.loss_iou: 0.2268 d2.loss_cls: 0.2847 d2.loss_bbox: 0.1265 d2.loss_iou: 0.2208 d3.loss_cls: 0.2778 d3.loss_bbox: 0.1238 d3.loss_iou: 0.2170 d4.loss_cls: 0.2777 d4.loss_bbox: 0.1240 d4.loss_iou: 0.2154 enc_loss_cls: 0.3114 enc_loss_bbox: 0.1550 enc_loss_iou: 0.2623 dn_loss_cls: 0.1086 dn_loss_bbox: 0.1580 dn_loss_iou: 0.1937 d0.dn_loss_cls: 0.1766 d0.dn_loss_bbox: 0.3053 d0.dn_loss_iou: 0.3321 d1.dn_loss_cls: 0.1363 d1.dn_loss_bbox: 0.1949 d1.dn_loss_iou: 0.2249 d2.dn_loss_cls: 0.1198 d2.dn_loss_bbox: 0.1724 d2.dn_loss_iou: 0.2047 d3.dn_loss_cls: 0.1121 d3.dn_loss_bbox: 0.1608 d3.dn_loss_iou: 0.1964 d4.dn_loss_cls: 0.1094 d4.dn_loss_bbox: 0.1579 d4.dn_loss_iou: 0.1937 d1.loss_lmm_region: 0.1385 loss_lmm_image: 0.8734 2024/11/11 22:48:19 - mmengine - INFO - Iter(train) [ 62200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 1:01:02 time: 1.9810 data_time: 0.0177 memory: 32052 grad_norm: 30.0197 loss: 9.1216 loss_cls: 0.2893 loss_bbox: 0.1241 loss_iou: 0.2196 d0.loss_cls: 0.3461 d0.loss_bbox: 0.1314 d0.loss_iou: 0.2296 d1.loss_cls: 0.3177 d1.loss_bbox: 0.1269 d1.loss_iou: 0.2236 d2.loss_cls: 0.3036 d2.loss_bbox: 0.1240 d2.loss_iou: 0.2196 d3.loss_cls: 0.2979 d3.loss_bbox: 0.1230 d3.loss_iou: 0.2196 d4.loss_cls: 0.2884 d4.loss_bbox: 0.1237 d4.loss_iou: 0.2197 enc_loss_cls: 0.3382 enc_loss_bbox: 0.1469 enc_loss_iou: 0.2522 dn_loss_cls: 0.1182 dn_loss_bbox: 0.1662 dn_loss_iou: 0.2031 d0.dn_loss_cls: 0.2057 d0.dn_loss_bbox: 0.3249 d0.dn_loss_iou: 0.3448 d1.dn_loss_cls: 0.1558 d1.dn_loss_bbox: 0.1986 d1.dn_loss_iou: 0.2335 d2.dn_loss_cls: 0.1284 d2.dn_loss_bbox: 0.1761 d2.dn_loss_iou: 0.2117 d3.dn_loss_cls: 0.1210 d3.dn_loss_bbox: 0.1690 d3.dn_loss_iou: 0.2059 d4.dn_loss_cls: 0.1170 d4.dn_loss_bbox: 0.1662 d4.dn_loss_iou: 0.2031 d1.loss_lmm_region: 0.1597 loss_lmm_image: 0.8476 2024/11/11 22:51:38 - mmengine - INFO - Iter(train) [ 62300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:57:39 time: 1.9702 data_time: 0.0177 memory: 35368 grad_norm: 33.0808 loss: 9.2053 loss_cls: 
0.2927 loss_bbox: 0.1361 loss_iou: 0.2623 d0.loss_cls: 0.3378 d0.loss_bbox: 0.1426 d0.loss_iou: 0.2705 d1.loss_cls: 0.3126 d1.loss_bbox: 0.1363 d1.loss_iou: 0.2634 d2.loss_cls: 0.3061 d2.loss_bbox: 0.1355 d2.loss_iou: 0.2621 d3.loss_cls: 0.3006 d3.loss_bbox: 0.1331 d3.loss_iou: 0.2643 d4.loss_cls: 0.2969 d4.loss_bbox: 0.1349 d4.loss_iou: 0.2623 enc_loss_cls: 0.3401 enc_loss_bbox: 0.1571 enc_loss_iou: 0.3002 dn_loss_cls: 0.1176 dn_loss_bbox: 0.1336 dn_loss_iou: 0.1982 d0.dn_loss_cls: 0.1907 d0.dn_loss_bbox: 0.2552 d0.dn_loss_iou: 0.3273 d1.dn_loss_cls: 0.1405 d1.dn_loss_bbox: 0.1590 d1.dn_loss_iou: 0.2247 d2.dn_loss_cls: 0.1261 d2.dn_loss_bbox: 0.1419 d2.dn_loss_iou: 0.2057 d3.dn_loss_cls: 0.1226 d3.dn_loss_bbox: 0.1353 d3.dn_loss_iou: 0.2000 d4.dn_loss_cls: 0.1191 d4.dn_loss_bbox: 0.1335 d4.dn_loss_iou: 0.1981 d1.loss_lmm_region: 0.1509 loss_lmm_image: 0.8778 2024/11/11 22:54:53 - mmengine - INFO - Iter(train) [ 62400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:54:10 time: 1.9424 data_time: 0.0176 memory: 32580 grad_norm: 32.6791 loss: 9.9739 loss_cls: 0.3238 loss_bbox: 0.1440 loss_iou: 0.2926 d0.loss_cls: 0.3767 d0.loss_bbox: 0.1525 d0.loss_iou: 0.3113 d1.loss_cls: 0.3458 d1.loss_bbox: 0.1497 d1.loss_iou: 0.3011 d2.loss_cls: 0.3330 d2.loss_bbox: 0.1459 d2.loss_iou: 0.2946 d3.loss_cls: 0.3261 d3.loss_bbox: 0.1442 d3.loss_iou: 0.2930 d4.loss_cls: 0.3214 d4.loss_bbox: 0.1448 d4.loss_iou: 0.2941 enc_loss_cls: 0.3823 enc_loss_bbox: 0.1707 enc_loss_iou: 0.3340 dn_loss_cls: 0.1074 dn_loss_bbox: 0.1504 dn_loss_iou: 0.2190 d0.dn_loss_cls: 0.1922 d0.dn_loss_bbox: 0.2968 d0.dn_loss_iou: 0.3614 d1.dn_loss_cls: 0.1413 d1.dn_loss_bbox: 0.1848 d1.dn_loss_iou: 0.2513 d2.dn_loss_cls: 0.1213 d2.dn_loss_bbox: 0.1623 d2.dn_loss_iou: 0.2292 d3.dn_loss_cls: 0.1128 d3.dn_loss_bbox: 0.1530 d3.dn_loss_iou: 0.2214 d4.dn_loss_cls: 0.1080 d4.dn_loss_bbox: 0.1505 d4.dn_loss_iou: 0.2189 d1.loss_lmm_region: 0.1529 loss_lmm_image: 0.8575 2024/11/11 22:58:12 - mmengine - INFO - Iter(train) [ 62500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:50:46 time: 2.0020 data_time: 0.0176 memory: 34762 grad_norm: 28.2823 loss: 9.7690 loss_cls: 0.3022 loss_bbox: 0.1508 loss_iou: 0.2868 d0.loss_cls: 0.3392 d0.loss_bbox: 0.1654 d0.loss_iou: 0.2997 d1.loss_cls: 0.3180 d1.loss_bbox: 0.1558 d1.loss_iou: 0.2924 d2.loss_cls: 0.3151 d2.loss_bbox: 0.1506 d2.loss_iou: 0.2853 d3.loss_cls: 0.3092 d3.loss_bbox: 0.1493 d3.loss_iou: 0.2834 d4.loss_cls: 0.3050 d4.loss_bbox: 0.1476 d4.loss_iou: 0.2845 enc_loss_cls: 0.3396 enc_loss_bbox: 0.1787 enc_loss_iou: 0.3223 dn_loss_cls: 0.1045 dn_loss_bbox: 0.1622 dn_loss_iou: 0.2271 d0.dn_loss_cls: 0.1812 d0.dn_loss_bbox: 0.2874 d0.dn_loss_iou: 0.3641 d1.dn_loss_cls: 0.1359 d1.dn_loss_bbox: 0.1847 d1.dn_loss_iou: 0.2528 d2.dn_loss_cls: 0.1176 d2.dn_loss_bbox: 0.1690 d2.dn_loss_iou: 0.2344 d3.dn_loss_cls: 0.1096 d3.dn_loss_bbox: 0.1644 d3.dn_loss_iou: 0.2291 d4.dn_loss_cls: 0.1045 d4.dn_loss_bbox: 0.1621 d4.dn_loss_iou: 0.2271 d1.loss_lmm_region: 0.1183 loss_lmm_image: 0.8520 2024/11/11 23:01:33 - mmengine - INFO - Iter(train) [ 62600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:47:25 time: 2.0051 data_time: 0.0177 memory: 34746 grad_norm: 30.7547 loss: 11.3161 loss_cls: 0.3667 loss_bbox: 0.2003 loss_iou: 0.3002 d0.loss_cls: 0.4171 d0.loss_bbox: 0.2113 d0.loss_iou: 0.3140 d1.loss_cls: 0.3898 d1.loss_bbox: 0.2096 d1.loss_iou: 0.3085 d2.loss_cls: 0.3831 d2.loss_bbox: 0.2036 d2.loss_iou: 0.3044 d3.loss_cls: 0.3686 d3.loss_bbox: 0.2031 d3.loss_iou: 0.3056 d4.loss_cls: 
0.3679 d4.loss_bbox: 0.2012 d4.loss_iou: 0.3015 enc_loss_cls: 0.4230 enc_loss_bbox: 0.2189 enc_loss_iou: 0.3295 dn_loss_cls: 0.1464 dn_loss_bbox: 0.1829 dn_loss_iou: 0.2444 d0.dn_loss_cls: 0.2290 d0.dn_loss_bbox: 0.3368 d0.dn_loss_iou: 0.3923 d1.dn_loss_cls: 0.1733 d1.dn_loss_bbox: 0.2210 d1.dn_loss_iou: 0.2761 d2.dn_loss_cls: 0.1619 d2.dn_loss_bbox: 0.1931 d2.dn_loss_iou: 0.2539 d3.dn_loss_cls: 0.1554 d3.dn_loss_bbox: 0.1844 d3.dn_loss_iou: 0.2469 d4.dn_loss_cls: 0.1497 d4.dn_loss_bbox: 0.1829 d4.dn_loss_iou: 0.2443 d1.loss_lmm_region: 0.1587 loss_lmm_image: 0.8550 2024/11/11 23:04:52 - mmengine - INFO - Iter(train) [ 62700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:44:02 time: 1.9795 data_time: 0.0177 memory: 34936 grad_norm: 29.8583 loss: 9.5382 loss_cls: 0.3176 loss_bbox: 0.1277 loss_iou: 0.2446 d0.loss_cls: 0.3628 d0.loss_bbox: 0.1365 d0.loss_iou: 0.2568 d1.loss_cls: 0.3310 d1.loss_bbox: 0.1339 d1.loss_iou: 0.2508 d2.loss_cls: 0.3241 d2.loss_bbox: 0.1315 d2.loss_iou: 0.2457 d3.loss_cls: 0.3219 d3.loss_bbox: 0.1273 d3.loss_iou: 0.2457 d4.loss_cls: 0.3179 d4.loss_bbox: 0.1275 d4.loss_iou: 0.2461 enc_loss_cls: 0.3606 enc_loss_bbox: 0.1493 enc_loss_iou: 0.2845 dn_loss_cls: 0.1419 dn_loss_bbox: 0.1488 dn_loss_iou: 0.2111 d0.dn_loss_cls: 0.2172 d0.dn_loss_bbox: 0.2784 d0.dn_loss_iou: 0.3505 d1.dn_loss_cls: 0.1701 d1.dn_loss_bbox: 0.1784 d1.dn_loss_iou: 0.2412 d2.dn_loss_cls: 0.1543 d2.dn_loss_bbox: 0.1581 d2.dn_loss_iou: 0.2203 d3.dn_loss_cls: 0.1457 d3.dn_loss_bbox: 0.1512 d3.dn_loss_iou: 0.2137 d4.dn_loss_cls: 0.1421 d4.dn_loss_bbox: 0.1489 d4.dn_loss_iou: 0.2111 d1.loss_lmm_region: 0.1546 loss_lmm_image: 0.8568 2024/11/11 23:08:11 - mmengine - INFO - Iter(train) [ 62800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:40:37 time: 1.9963 data_time: 0.0177 memory: 32811 grad_norm: 32.8434 loss: 8.2703 loss_cls: 0.2750 loss_bbox: 0.1027 loss_iou: 0.1880 d0.loss_cls: 0.3169 d0.loss_bbox: 0.1228 d0.loss_iou: 0.2059 d1.loss_cls: 0.2983 d1.loss_bbox: 0.1095 d1.loss_iou: 0.1944 d2.loss_cls: 0.2904 d2.loss_bbox: 0.1019 d2.loss_iou: 0.1892 d3.loss_cls: 0.2795 d3.loss_bbox: 0.1019 d3.loss_iou: 0.1871 d4.loss_cls: 0.2759 d4.loss_bbox: 0.1036 d4.loss_iou: 0.1888 enc_loss_cls: 0.3193 enc_loss_bbox: 0.1384 enc_loss_iou: 0.2313 dn_loss_cls: 0.1028 dn_loss_bbox: 0.1418 dn_loss_iou: 0.1867 d0.dn_loss_cls: 0.1779 d0.dn_loss_bbox: 0.2555 d0.dn_loss_iou: 0.3119 d1.dn_loss_cls: 0.1294 d1.dn_loss_bbox: 0.1669 d1.dn_loss_iou: 0.2143 d2.dn_loss_cls: 0.1129 d2.dn_loss_bbox: 0.1490 d2.dn_loss_iou: 0.1955 d3.dn_loss_cls: 0.1063 d3.dn_loss_bbox: 0.1426 d3.dn_loss_iou: 0.1888 d4.dn_loss_cls: 0.1050 d4.dn_loss_bbox: 0.1419 d4.dn_loss_iou: 0.1866 d1.loss_lmm_region: 0.1406 loss_lmm_image: 0.8932 2024/11/11 23:11:29 - mmengine - INFO - Iter(train) [ 62900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:37:13 time: 1.9895 data_time: 0.0178 memory: 34647 grad_norm: 32.3996 loss: 9.4175 loss_cls: 0.2951 loss_bbox: 0.1240 loss_iou: 0.2463 d0.loss_cls: 0.3387 d0.loss_bbox: 0.1381 d0.loss_iou: 0.2600 d1.loss_cls: 0.3158 d1.loss_bbox: 0.1310 d1.loss_iou: 0.2528 d2.loss_cls: 0.3060 d2.loss_bbox: 0.1223 d2.loss_iou: 0.2486 d3.loss_cls: 0.3004 d3.loss_bbox: 0.1236 d3.loss_iou: 0.2490 d4.loss_cls: 0.3002 d4.loss_bbox: 0.1208 d4.loss_iou: 0.2448 enc_loss_cls: 0.3379 enc_loss_bbox: 0.1536 enc_loss_iou: 0.2857 dn_loss_cls: 0.0974 dn_loss_bbox: 0.1635 dn_loss_iou: 0.2352 d0.dn_loss_cls: 0.1834 d0.dn_loss_bbox: 0.3136 d0.dn_loss_iou: 0.3841 d1.dn_loss_cls: 0.1320 d1.dn_loss_bbox: 0.1995 d1.dn_loss_iou: 
0.2658 d2.dn_loss_cls: 0.1096 d2.dn_loss_bbox: 0.1756 d2.dn_loss_iou: 0.2445 d3.dn_loss_cls: 0.1024 d3.dn_loss_bbox: 0.1661 d3.dn_loss_iou: 0.2380 d4.dn_loss_cls: 0.0979 d4.dn_loss_bbox: 0.1636 d4.dn_loss_iou: 0.2352 d1.loss_lmm_region: 0.1431 loss_lmm_image: 0.8722 2024/11/11 23:14:46 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 23:14:46 - mmengine - INFO - Iter(train) [ 63000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:33:46 time: 1.9733 data_time: 0.0177 memory: 33784 grad_norm: 27.4967 loss: 10.2318 loss_cls: 0.3338 loss_bbox: 0.1575 loss_iou: 0.2853 d0.loss_cls: 0.3920 d0.loss_bbox: 0.1755 d0.loss_iou: 0.3052 d1.loss_cls: 0.3616 d1.loss_bbox: 0.1653 d1.loss_iou: 0.2916 d2.loss_cls: 0.3503 d2.loss_bbox: 0.1596 d2.loss_iou: 0.2852 d3.loss_cls: 0.3413 d3.loss_bbox: 0.1564 d3.loss_iou: 0.2830 d4.loss_cls: 0.3351 d4.loss_bbox: 0.1579 d4.loss_iou: 0.2864 enc_loss_cls: 0.3894 enc_loss_bbox: 0.1904 enc_loss_iou: 0.3279 dn_loss_cls: 0.0847 dn_loss_bbox: 0.1689 dn_loss_iou: 0.2301 d0.dn_loss_cls: 0.1772 d0.dn_loss_bbox: 0.3335 d0.dn_loss_iou: 0.3866 d1.dn_loss_cls: 0.1252 d1.dn_loss_bbox: 0.2075 d1.dn_loss_iou: 0.2656 d2.dn_loss_cls: 0.1004 d2.dn_loss_bbox: 0.1846 d2.dn_loss_iou: 0.2422 d3.dn_loss_cls: 0.0913 d3.dn_loss_bbox: 0.1728 d3.dn_loss_iou: 0.2339 d4.dn_loss_cls: 0.0849 d4.dn_loss_bbox: 0.1690 d4.dn_loss_iou: 0.2300 d1.loss_lmm_region: 0.1488 loss_lmm_image: 0.8640 2024/11/11 23:18:04 - mmengine - INFO - Iter(train) [ 63100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:30:22 time: 1.9875 data_time: 0.0177 memory: 33766 grad_norm: 27.3881 loss: 10.6671 loss_cls: 0.3109 loss_bbox: 0.1701 loss_iou: 0.2856 d0.loss_cls: 0.3654 d0.loss_bbox: 0.1797 d0.loss_iou: 0.2978 d1.loss_cls: 0.3380 d1.loss_bbox: 0.1720 d1.loss_iou: 0.2886 d2.loss_cls: 0.3215 d2.loss_bbox: 0.1719 d2.loss_iou: 0.2883 d3.loss_cls: 0.3146 d3.loss_bbox: 0.1715 d3.loss_iou: 0.2861 d4.loss_cls: 0.3166 d4.loss_bbox: 0.1652 d4.loss_iou: 0.2844 enc_loss_cls: 0.3648 enc_loss_bbox: 0.1951 enc_loss_iou: 0.3186 dn_loss_cls: 0.1194 dn_loss_bbox: 0.2150 dn_loss_iou: 0.2542 d0.dn_loss_cls: 0.2042 d0.dn_loss_bbox: 0.3659 d0.dn_loss_iou: 0.4018 d1.dn_loss_cls: 0.1557 d1.dn_loss_bbox: 0.2491 d1.dn_loss_iou: 0.2884 d2.dn_loss_cls: 0.1343 d2.dn_loss_bbox: 0.2267 d2.dn_loss_iou: 0.2653 d3.dn_loss_cls: 0.1243 d3.dn_loss_bbox: 0.2171 d3.dn_loss_iou: 0.2572 d4.dn_loss_cls: 0.1199 d4.dn_loss_bbox: 0.2150 d4.dn_loss_iou: 0.2541 d1.loss_lmm_region: 0.1481 loss_lmm_image: 0.8453 2024/11/11 23:21:23 - mmengine - INFO - Iter(train) [ 63200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:26:57 time: 1.9735 data_time: 0.0178 memory: 33742 grad_norm: 29.3025 loss: 8.7835 loss_cls: 0.2537 loss_bbox: 0.1304 loss_iou: 0.2216 d0.loss_cls: 0.3002 d0.loss_bbox: 0.1437 d0.loss_iou: 0.2274 d1.loss_cls: 0.2749 d1.loss_bbox: 0.1371 d1.loss_iou: 0.2277 d2.loss_cls: 0.2661 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2249 d3.loss_cls: 0.2592 d3.loss_bbox: 0.1322 d3.loss_iou: 0.2229 d4.loss_cls: 0.2529 d4.loss_bbox: 0.1321 d4.loss_iou: 0.2209 enc_loss_cls: 0.3046 enc_loss_bbox: 0.1546 enc_loss_iou: 0.2520 dn_loss_cls: 0.1018 dn_loss_bbox: 0.1638 dn_loss_iou: 0.2016 d0.dn_loss_cls: 0.1831 d0.dn_loss_bbox: 0.3130 d0.dn_loss_iou: 0.3491 d1.dn_loss_cls: 0.1354 d1.dn_loss_bbox: 0.1957 d1.dn_loss_iou: 0.2337 d2.dn_loss_cls: 0.1133 d2.dn_loss_bbox: 0.1740 d2.dn_loss_iou: 0.2111 d3.dn_loss_cls: 0.1081 d3.dn_loss_bbox: 0.1660 d3.dn_loss_iou: 0.2042 d4.dn_loss_cls: 0.1023 d4.dn_loss_bbox: 0.1639 d4.dn_loss_iou: 
0.2017 d1.loss_lmm_region: 0.1254 loss_lmm_image: 0.8622 2024/11/11 23:24:41 - mmengine - INFO - Iter(train) [ 63300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:23:32 time: 1.9498 data_time: 0.0177 memory: 33825 grad_norm: 37.0011 loss: 9.3849 loss_cls: 0.3119 loss_bbox: 0.1368 loss_iou: 0.2551 d0.loss_cls: 0.3614 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2732 d1.loss_cls: 0.3285 d1.loss_bbox: 0.1377 d1.loss_iou: 0.2593 d2.loss_cls: 0.3174 d2.loss_bbox: 0.1377 d2.loss_iou: 0.2575 d3.loss_cls: 0.3120 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2574 d4.loss_cls: 0.3135 d4.loss_bbox: 0.1346 d4.loss_iou: 0.2560 enc_loss_cls: 0.3506 enc_loss_bbox: 0.1589 enc_loss_iou: 0.2937 dn_loss_cls: 0.0949 dn_loss_bbox: 0.1590 dn_loss_iou: 0.2093 d0.dn_loss_cls: 0.1719 d0.dn_loss_bbox: 0.2962 d0.dn_loss_iou: 0.3431 d1.dn_loss_cls: 0.1223 d1.dn_loss_bbox: 0.1851 d1.dn_loss_iou: 0.2346 d2.dn_loss_cls: 0.1042 d2.dn_loss_bbox: 0.1638 d2.dn_loss_iou: 0.2156 d3.dn_loss_cls: 0.0975 d3.dn_loss_bbox: 0.1602 d3.dn_loss_iou: 0.2111 d4.dn_loss_cls: 0.0941 d4.dn_loss_bbox: 0.1589 d4.dn_loss_iou: 0.2091 d1.loss_lmm_region: 0.1388 loss_lmm_image: 0.8796 2024/11/11 23:28:00 - mmengine - INFO - Iter(train) [ 63400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:20:09 time: 1.9962 data_time: 0.0176 memory: 35394 grad_norm: 33.5402 loss: 9.6846 loss_cls: 0.2894 loss_bbox: 0.1535 loss_iou: 0.2761 d0.loss_cls: 0.3326 d0.loss_bbox: 0.1690 d0.loss_iou: 0.2959 d1.loss_cls: 0.3070 d1.loss_bbox: 0.1605 d1.loss_iou: 0.2900 d2.loss_cls: 0.2997 d2.loss_bbox: 0.1532 d2.loss_iou: 0.2790 d3.loss_cls: 0.2970 d3.loss_bbox: 0.1523 d3.loss_iou: 0.2756 d4.loss_cls: 0.2908 d4.loss_bbox: 0.1524 d4.loss_iou: 0.2772 enc_loss_cls: 0.3390 enc_loss_bbox: 0.1812 enc_loss_iou: 0.3204 dn_loss_cls: 0.1140 dn_loss_bbox: 0.1674 dn_loss_iou: 0.2189 d0.dn_loss_cls: 0.1779 d0.dn_loss_bbox: 0.2927 d0.dn_loss_iou: 0.3484 d1.dn_loss_cls: 0.1381 d1.dn_loss_bbox: 0.1932 d1.dn_loss_iou: 0.2470 d2.dn_loss_cls: 0.1235 d2.dn_loss_bbox: 0.1733 d2.dn_loss_iou: 0.2271 d3.dn_loss_cls: 0.1198 d3.dn_loss_bbox: 0.1680 d3.dn_loss_iou: 0.2209 d4.dn_loss_cls: 0.1149 d4.dn_loss_bbox: 0.1674 d4.dn_loss_iou: 0.2189 d1.loss_lmm_region: 0.1405 loss_lmm_image: 0.8211 2024/11/11 23:31:19 - mmengine - INFO - Iter(train) [ 63500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:16:46 time: 1.9828 data_time: 0.0178 memory: 33645 grad_norm: 32.9918 loss: 9.7083 loss_cls: 0.2947 loss_bbox: 0.1485 loss_iou: 0.2572 d0.loss_cls: 0.3413 d0.loss_bbox: 0.1492 d0.loss_iou: 0.2639 d1.loss_cls: 0.3093 d1.loss_bbox: 0.1485 d1.loss_iou: 0.2610 d2.loss_cls: 0.3067 d2.loss_bbox: 0.1436 d2.loss_iou: 0.2549 d3.loss_cls: 0.2950 d3.loss_bbox: 0.1457 d3.loss_iou: 0.2579 d4.loss_cls: 0.2928 d4.loss_bbox: 0.1486 d4.loss_iou: 0.2575 enc_loss_cls: 0.3387 enc_loss_bbox: 0.1591 enc_loss_iou: 0.2851 dn_loss_cls: 0.1151 dn_loss_bbox: 0.1755 dn_loss_iou: 0.2361 d0.dn_loss_cls: 0.1993 d0.dn_loss_bbox: 0.3185 d0.dn_loss_iou: 0.3770 d1.dn_loss_cls: 0.1549 d1.dn_loss_bbox: 0.2062 d1.dn_loss_iou: 0.2680 d2.dn_loss_cls: 0.1343 d2.dn_loss_bbox: 0.1842 d2.dn_loss_iou: 0.2460 d3.dn_loss_cls: 0.1240 d3.dn_loss_bbox: 0.1781 d3.dn_loss_iou: 0.2386 d4.dn_loss_cls: 0.1165 d4.dn_loss_bbox: 0.1755 d4.dn_loss_iou: 0.2360 d1.loss_lmm_region: 0.1363 loss_lmm_image: 0.8291 2024/11/11 23:34:39 - mmengine - INFO - Iter(train) [ 63600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:13:23 time: 2.0064 data_time: 0.0177 memory: 33676 grad_norm: 26.9798 loss: 8.6658 loss_cls: 0.2732 loss_bbox: 0.1080 loss_iou: 
0.2132 d0.loss_cls: 0.3110 d0.loss_bbox: 0.1183 d0.loss_iou: 0.2244 d1.loss_cls: 0.2858 d1.loss_bbox: 0.1127 d1.loss_iou: 0.2179 d2.loss_cls: 0.2849 d2.loss_bbox: 0.1077 d2.loss_iou: 0.2131 d3.loss_cls: 0.2789 d3.loss_bbox: 0.1092 d3.loss_iou: 0.2135 d4.loss_cls: 0.2711 d4.loss_bbox: 0.1086 d4.loss_iou: 0.2137 enc_loss_cls: 0.3155 enc_loss_bbox: 0.1368 enc_loss_iou: 0.2481 dn_loss_cls: 0.1030 dn_loss_bbox: 0.1504 dn_loss_iou: 0.2071 d0.dn_loss_cls: 0.1881 d0.dn_loss_bbox: 0.2872 d0.dn_loss_iou: 0.3506 d1.dn_loss_cls: 0.1368 d1.dn_loss_bbox: 0.1759 d1.dn_loss_iou: 0.2363 d2.dn_loss_cls: 0.1178 d2.dn_loss_bbox: 0.1563 d2.dn_loss_iou: 0.2152 d3.dn_loss_cls: 0.1099 d3.dn_loss_bbox: 0.1517 d3.dn_loss_iou: 0.2098 d4.dn_loss_cls: 0.1045 d4.dn_loss_bbox: 0.1504 d4.dn_loss_iou: 0.2071 d1.loss_lmm_region: 0.1368 loss_lmm_image: 0.9053 2024/11/11 23:37:58 - mmengine - INFO - Iter(train) [ 63700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:09:59 time: 1.9942 data_time: 0.0178 memory: 35370 grad_norm: 43.3798 loss: 9.0018 loss_cls: 0.2612 loss_bbox: 0.1308 loss_iou: 0.2238 d0.loss_cls: 0.2882 d0.loss_bbox: 0.1468 d0.loss_iou: 0.2409 d1.loss_cls: 0.2722 d1.loss_bbox: 0.1378 d1.loss_iou: 0.2297 d2.loss_cls: 0.2653 d2.loss_bbox: 0.1336 d2.loss_iou: 0.2253 d3.loss_cls: 0.2608 d3.loss_bbox: 0.1321 d3.loss_iou: 0.2262 d4.loss_cls: 0.2592 d4.loss_bbox: 0.1324 d4.loss_iou: 0.2245 enc_loss_cls: 0.2855 enc_loss_bbox: 0.1637 enc_loss_iou: 0.2609 dn_loss_cls: 0.1035 dn_loss_bbox: 0.1837 dn_loss_iou: 0.2191 d0.dn_loss_cls: 0.1738 d0.dn_loss_bbox: 0.3319 d0.dn_loss_iou: 0.3585 d1.dn_loss_cls: 0.1350 d1.dn_loss_bbox: 0.2166 d1.dn_loss_iou: 0.2504 d2.dn_loss_cls: 0.1163 d2.dn_loss_bbox: 0.1937 d2.dn_loss_iou: 0.2289 d3.dn_loss_cls: 0.1085 d3.dn_loss_bbox: 0.1853 d3.dn_loss_iou: 0.2214 d4.dn_loss_cls: 0.1057 d4.dn_loss_bbox: 0.1837 d4.dn_loss_iou: 0.2191 d1.loss_lmm_region: 0.1260 loss_lmm_image: 0.8400 2024/11/11 23:41:18 - mmengine - INFO - Iter(train) [ 63800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:06:37 time: 2.0148 data_time: 0.0179 memory: 34851 grad_norm: 34.4611 loss: 9.7226 loss_cls: 0.3138 loss_bbox: 0.1341 loss_iou: 0.2586 d0.loss_cls: 0.3573 d0.loss_bbox: 0.1418 d0.loss_iou: 0.2677 d1.loss_cls: 0.3426 d1.loss_bbox: 0.1337 d1.loss_iou: 0.2568 d2.loss_cls: 0.3262 d2.loss_bbox: 0.1327 d2.loss_iou: 0.2564 d3.loss_cls: 0.3180 d3.loss_bbox: 0.1354 d3.loss_iou: 0.2607 d4.loss_cls: 0.3156 d4.loss_bbox: 0.1357 d4.loss_iou: 0.2610 enc_loss_cls: 0.3616 enc_loss_bbox: 0.1563 enc_loss_iou: 0.2931 dn_loss_cls: 0.1127 dn_loss_bbox: 0.1684 dn_loss_iou: 0.2288 d0.dn_loss_cls: 0.1946 d0.dn_loss_bbox: 0.3070 d0.dn_loss_iou: 0.3729 d1.dn_loss_cls: 0.1516 d1.dn_loss_bbox: 0.1990 d1.dn_loss_iou: 0.2573 d2.dn_loss_cls: 0.1318 d2.dn_loss_bbox: 0.1778 d2.dn_loss_iou: 0.2374 d3.dn_loss_cls: 0.1225 d3.dn_loss_bbox: 0.1704 d3.dn_loss_iou: 0.2311 d4.dn_loss_cls: 0.1120 d4.dn_loss_bbox: 0.1683 d4.dn_loss_iou: 0.2288 d1.loss_lmm_region: 0.1421 loss_lmm_image: 0.8492 2024/11/11 23:44:37 - mmengine - INFO - Iter(train) [ 63900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 2 days, 0:03:14 time: 1.9896 data_time: 0.0178 memory: 35206 grad_norm: 35.2844 loss: 9.9528 loss_cls: 0.3202 loss_bbox: 0.1555 loss_iou: 0.2777 d0.loss_cls: 0.3613 d0.loss_bbox: 0.1631 d0.loss_iou: 0.2868 d1.loss_cls: 0.3416 d1.loss_bbox: 0.1602 d1.loss_iou: 0.2801 d2.loss_cls: 0.3251 d2.loss_bbox: 0.1564 d2.loss_iou: 0.2792 d3.loss_cls: 0.3211 d3.loss_bbox: 0.1556 d3.loss_iou: 0.2767 d4.loss_cls: 0.3242 d4.loss_bbox: 0.1527 d4.loss_iou: 
0.2755 enc_loss_cls: 0.3635 enc_loss_bbox: 0.1771 enc_loss_iou: 0.3095 dn_loss_cls: 0.0917 dn_loss_bbox: 0.1696 dn_loss_iou: 0.2258 d0.dn_loss_cls: 0.1743 d0.dn_loss_bbox: 0.3228 d0.dn_loss_iou: 0.3743 d1.dn_loss_cls: 0.1252 d1.dn_loss_bbox: 0.2044 d1.dn_loss_iou: 0.2577 d2.dn_loss_cls: 0.1047 d2.dn_loss_bbox: 0.1806 d2.dn_loss_iou: 0.2360 d3.dn_loss_cls: 0.0969 d3.dn_loss_bbox: 0.1717 d3.dn_loss_iou: 0.2284 d4.dn_loss_cls: 0.0928 d4.dn_loss_bbox: 0.1695 d4.dn_loss_iou: 0.2258 d1.loss_lmm_region: 0.1392 loss_lmm_image: 0.8982 2024/11/11 23:47:56 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/11 23:47:56 - mmengine - INFO - Iter(train) [ 64000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:59:50 time: 1.9842 data_time: 0.0179 memory: 34616 grad_norm: 25.8755 loss: 10.0171 loss_cls: 0.3034 loss_bbox: 0.1503 loss_iou: 0.2508 d0.loss_cls: 0.3610 d0.loss_bbox: 0.1648 d0.loss_iou: 0.2643 d1.loss_cls: 0.3278 d1.loss_bbox: 0.1614 d1.loss_iou: 0.2613 d2.loss_cls: 0.3178 d2.loss_bbox: 0.1533 d2.loss_iou: 0.2525 d3.loss_cls: 0.3091 d3.loss_bbox: 0.1482 d3.loss_iou: 0.2497 d4.loss_cls: 0.3033 d4.loss_bbox: 0.1504 d4.loss_iou: 0.2515 enc_loss_cls: 0.3613 enc_loss_bbox: 0.1819 enc_loss_iou: 0.2937 dn_loss_cls: 0.1246 dn_loss_bbox: 0.1961 dn_loss_iou: 0.2279 d0.dn_loss_cls: 0.1953 d0.dn_loss_bbox: 0.3585 d0.dn_loss_iou: 0.3743 d1.dn_loss_cls: 0.1482 d1.dn_loss_bbox: 0.2282 d1.dn_loss_iou: 0.2584 d2.dn_loss_cls: 0.1338 d2.dn_loss_bbox: 0.2053 d2.dn_loss_iou: 0.2375 d3.dn_loss_cls: 0.1286 d3.dn_loss_bbox: 0.1973 d3.dn_loss_iou: 0.2305 d4.dn_loss_cls: 0.1273 d4.dn_loss_bbox: 0.1961 d4.dn_loss_iou: 0.2279 d1.loss_lmm_region: 0.1373 loss_lmm_image: 0.8661 2024/11/11 23:51:13 - mmengine - INFO - Iter(train) [ 64100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:56:25 time: 1.9767 data_time: 0.0178 memory: 33199 grad_norm: 26.7119 loss: 10.8289 loss_cls: 0.3821 loss_bbox: 0.1575 loss_iou: 0.3024 d0.loss_cls: 0.4501 d0.loss_bbox: 0.1627 d0.loss_iou: 0.3180 d1.loss_cls: 0.4173 d1.loss_bbox: 0.1575 d1.loss_iou: 0.3058 d2.loss_cls: 0.3977 d2.loss_bbox: 0.1550 d2.loss_iou: 0.3004 d3.loss_cls: 0.3904 d3.loss_bbox: 0.1552 d3.loss_iou: 0.3013 d4.loss_cls: 0.3791 d4.loss_bbox: 0.1580 d4.loss_iou: 0.3031 enc_loss_cls: 0.4479 enc_loss_bbox: 0.1800 enc_loss_iou: 0.3460 dn_loss_cls: 0.1367 dn_loss_bbox: 0.1647 dn_loss_iou: 0.2207 d0.dn_loss_cls: 0.2001 d0.dn_loss_bbox: 0.3015 d0.dn_loss_iou: 0.3618 d1.dn_loss_cls: 0.1565 d1.dn_loss_bbox: 0.1968 d1.dn_loss_iou: 0.2541 d2.dn_loss_cls: 0.1370 d2.dn_loss_bbox: 0.1757 d2.dn_loss_iou: 0.2304 d3.dn_loss_cls: 0.1388 d3.dn_loss_bbox: 0.1670 d3.dn_loss_iou: 0.2233 d4.dn_loss_cls: 0.1367 d4.dn_loss_bbox: 0.1647 d4.dn_loss_iou: 0.2207 d1.loss_lmm_region: 0.1671 loss_lmm_image: 0.9073 2024/11/11 23:54:31 - mmengine - INFO - Iter(train) [ 64200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:52:59 time: 1.9676 data_time: 0.0178 memory: 34402 grad_norm: 25.3577 loss: 8.6636 loss_cls: 0.2761 loss_bbox: 0.1101 loss_iou: 0.2137 d0.loss_cls: 0.3140 d0.loss_bbox: 0.1290 d0.loss_iou: 0.2321 d1.loss_cls: 0.2940 d1.loss_bbox: 0.1187 d1.loss_iou: 0.2209 d2.loss_cls: 0.2832 d2.loss_bbox: 0.1171 d2.loss_iou: 0.2175 d3.loss_cls: 0.2760 d3.loss_bbox: 0.1102 d3.loss_iou: 0.2142 d4.loss_cls: 0.2748 d4.loss_bbox: 0.1100 d4.loss_iou: 0.2135 enc_loss_cls: 0.3174 enc_loss_bbox: 0.1416 enc_loss_iou: 0.2533 dn_loss_cls: 0.1186 dn_loss_bbox: 0.1491 dn_loss_iou: 0.1918 d0.dn_loss_cls: 0.1953 d0.dn_loss_bbox: 0.2899 d0.dn_loss_iou: 0.3300 
d1.dn_loss_cls: 0.1499 d1.dn_loss_bbox: 0.1823 d1.dn_loss_iou: 0.2223 d2.dn_loss_cls: 0.1319 d2.dn_loss_bbox: 0.1585 d2.dn_loss_iou: 0.2009 d3.dn_loss_cls: 0.1205 d3.dn_loss_bbox: 0.1520 d3.dn_loss_iou: 0.1945 d4.dn_loss_cls: 0.1182 d4.dn_loss_bbox: 0.1490 d4.dn_loss_iou: 0.1916 d1.loss_lmm_region: 0.1291 loss_lmm_image: 0.8509 2024/11/11 23:57:48 - mmengine - INFO - Iter(train) [ 64300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:49:34 time: 1.9843 data_time: 0.0177 memory: 34602 grad_norm: 32.8951 loss: 10.9373 loss_cls: 0.3448 loss_bbox: 0.1543 loss_iou: 0.2862 d0.loss_cls: 0.4009 d0.loss_bbox: 0.1684 d0.loss_iou: 0.2971 d1.loss_cls: 0.3762 d1.loss_bbox: 0.1566 d1.loss_iou: 0.2881 d2.loss_cls: 0.3524 d2.loss_bbox: 0.1547 d2.loss_iou: 0.2895 d3.loss_cls: 0.3452 d3.loss_bbox: 0.1522 d3.loss_iou: 0.2862 d4.loss_cls: 0.3447 d4.loss_bbox: 0.1542 d4.loss_iou: 0.2868 enc_loss_cls: 0.3985 enc_loss_bbox: 0.1772 enc_loss_iou: 0.3153 dn_loss_cls: 0.1621 dn_loss_bbox: 0.1913 dn_loss_iou: 0.2502 d0.dn_loss_cls: 0.2417 d0.dn_loss_bbox: 0.3552 d0.dn_loss_iou: 0.4056 d1.dn_loss_cls: 0.1943 d1.dn_loss_bbox: 0.2280 d1.dn_loss_iou: 0.2850 d2.dn_loss_cls: 0.1789 d2.dn_loss_bbox: 0.2030 d2.dn_loss_iou: 0.2614 d3.dn_loss_cls: 0.1684 d3.dn_loss_bbox: 0.1929 d3.dn_loss_iou: 0.2531 d4.dn_loss_cls: 0.1671 d4.dn_loss_bbox: 0.1913 d4.dn_loss_iou: 0.2502 d1.loss_lmm_region: 0.1472 loss_lmm_image: 0.8808 2024/11/12 00:01:08 - mmengine - INFO - Iter(train) [ 64400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:46:11 time: 2.0102 data_time: 0.0178 memory: 34147 grad_norm: 31.3282 loss: 9.5640 loss_cls: 0.3081 loss_bbox: 0.1291 loss_iou: 0.2421 d0.loss_cls: 0.3419 d0.loss_bbox: 0.1406 d0.loss_iou: 0.2541 d1.loss_cls: 0.3193 d1.loss_bbox: 0.1349 d1.loss_iou: 0.2461 d2.loss_cls: 0.3148 d2.loss_bbox: 0.1314 d2.loss_iou: 0.2453 d3.loss_cls: 0.3149 d3.loss_bbox: 0.1270 d3.loss_iou: 0.2417 d4.loss_cls: 0.3113 d4.loss_bbox: 0.1295 d4.loss_iou: 0.2433 enc_loss_cls: 0.3529 enc_loss_bbox: 0.1440 enc_loss_iou: 0.2688 dn_loss_cls: 0.1474 dn_loss_bbox: 0.1562 dn_loss_iou: 0.2114 d0.dn_loss_cls: 0.2239 d0.dn_loss_bbox: 0.3049 d0.dn_loss_iou: 0.3636 d1.dn_loss_cls: 0.1757 d1.dn_loss_bbox: 0.1888 d1.dn_loss_iou: 0.2419 d2.dn_loss_cls: 0.1615 d2.dn_loss_bbox: 0.1677 d2.dn_loss_iou: 0.2214 d3.dn_loss_cls: 0.1552 d3.dn_loss_bbox: 0.1580 d3.dn_loss_iou: 0.2143 d4.dn_loss_cls: 0.1481 d4.dn_loss_bbox: 0.1562 d4.dn_loss_iou: 0.2113 d1.loss_lmm_region: 0.1437 loss_lmm_image: 0.8718 2024/11/12 00:04:28 - mmengine - INFO - Iter(train) [ 64500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:42:49 time: 2.0035 data_time: 0.0177 memory: 34802 grad_norm: 28.1024 loss: 8.2405 loss_cls: 0.2472 loss_bbox: 0.1154 loss_iou: 0.1887 d0.loss_cls: 0.2933 d0.loss_bbox: 0.1291 d0.loss_iou: 0.2058 d1.loss_cls: 0.2583 d1.loss_bbox: 0.1253 d1.loss_iou: 0.1977 d2.loss_cls: 0.2546 d2.loss_bbox: 0.1205 d2.loss_iou: 0.1883 d3.loss_cls: 0.2488 d3.loss_bbox: 0.1171 d3.loss_iou: 0.1881 d4.loss_cls: 0.2472 d4.loss_bbox: 0.1155 d4.loss_iou: 0.1883 enc_loss_cls: 0.2944 enc_loss_bbox: 0.1441 enc_loss_iou: 0.2298 dn_loss_cls: 0.0962 dn_loss_bbox: 0.1510 dn_loss_iou: 0.1935 d0.dn_loss_cls: 0.1722 d0.dn_loss_bbox: 0.2965 d0.dn_loss_iou: 0.3347 d1.dn_loss_cls: 0.1248 d1.dn_loss_bbox: 0.1824 d1.dn_loss_iou: 0.2235 d2.dn_loss_cls: 0.1042 d2.dn_loss_bbox: 0.1606 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.0978 d3.dn_loss_bbox: 0.1534 d3.dn_loss_iou: 0.1960 d4.dn_loss_cls: 0.0968 d4.dn_loss_bbox: 0.1510 d4.dn_loss_iou: 0.1935 d1.loss_lmm_region: 
0.1335 loss_lmm_image: 0.8796 2024/11/12 00:07:48 - mmengine - INFO - Iter(train) [ 64600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:39:27 time: 2.0049 data_time: 0.0179 memory: 34459 grad_norm: 42.9363 loss: 9.2398 loss_cls: 0.2780 loss_bbox: 0.1376 loss_iou: 0.2494 d0.loss_cls: 0.3161 d0.loss_bbox: 0.1536 d0.loss_iou: 0.2648 d1.loss_cls: 0.2879 d1.loss_bbox: 0.1496 d1.loss_iou: 0.2570 d2.loss_cls: 0.2795 d2.loss_bbox: 0.1465 d2.loss_iou: 0.2539 d3.loss_cls: 0.2767 d3.loss_bbox: 0.1437 d3.loss_iou: 0.2496 d4.loss_cls: 0.2747 d4.loss_bbox: 0.1451 d4.loss_iou: 0.2517 enc_loss_cls: 0.3253 enc_loss_bbox: 0.1630 enc_loss_iou: 0.2858 dn_loss_cls: 0.0907 dn_loss_bbox: 0.1626 dn_loss_iou: 0.2209 d0.dn_loss_cls: 0.1832 d0.dn_loss_bbox: 0.3256 d0.dn_loss_iou: 0.3759 d1.dn_loss_cls: 0.1269 d1.dn_loss_bbox: 0.1993 d1.dn_loss_iou: 0.2548 d2.dn_loss_cls: 0.1042 d2.dn_loss_bbox: 0.1739 d2.dn_loss_iou: 0.2325 d3.dn_loss_cls: 0.0973 d3.dn_loss_bbox: 0.1658 d3.dn_loss_iou: 0.2239 d4.dn_loss_cls: 0.0932 d4.dn_loss_bbox: 0.1626 d4.dn_loss_iou: 0.2209 d1.loss_lmm_region: 0.1337 loss_lmm_image: 0.8022 2024/11/12 00:11:07 - mmengine - INFO - Iter(train) [ 64700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:36:03 time: 1.9948 data_time: 0.0178 memory: 34956 grad_norm: 28.1187 loss: 9.3569 loss_cls: 0.3093 loss_bbox: 0.1235 loss_iou: 0.2400 d0.loss_cls: 0.3571 d0.loss_bbox: 0.1330 d0.loss_iou: 0.2525 d1.loss_cls: 0.3286 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2461 d2.loss_cls: 0.3205 d2.loss_bbox: 0.1237 d2.loss_iou: 0.2403 d3.loss_cls: 0.3122 d3.loss_bbox: 0.1228 d3.loss_iou: 0.2389 d4.loss_cls: 0.3084 d4.loss_bbox: 0.1243 d4.loss_iou: 0.2398 enc_loss_cls: 0.3653 enc_loss_bbox: 0.1454 enc_loss_iou: 0.2772 dn_loss_cls: 0.1137 dn_loss_bbox: 0.1512 dn_loss_iou: 0.2108 d0.dn_loss_cls: 0.1986 d0.dn_loss_bbox: 0.3048 d0.dn_loss_iou: 0.3549 d1.dn_loss_cls: 0.1459 d1.dn_loss_bbox: 0.1845 d1.dn_loss_iou: 0.2420 d2.dn_loss_cls: 0.1268 d2.dn_loss_bbox: 0.1621 d2.dn_loss_iou: 0.2201 d3.dn_loss_cls: 0.1194 d3.dn_loss_bbox: 0.1542 d3.dn_loss_iou: 0.2137 d4.dn_loss_cls: 0.1148 d4.dn_loss_bbox: 0.1510 d4.dn_loss_iou: 0.2107 d1.loss_lmm_region: 0.1682 loss_lmm_image: 0.8733 2024/11/12 00:14:26 - mmengine - INFO - Iter(train) [ 64800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:32:40 time: 1.9767 data_time: 0.0177 memory: 33451 grad_norm: 34.4443 loss: 9.4815 loss_cls: 0.2749 loss_bbox: 0.1227 loss_iou: 0.2220 d0.loss_cls: 0.3152 d0.loss_bbox: 0.1347 d0.loss_iou: 0.2404 d1.loss_cls: 0.2899 d1.loss_bbox: 0.1310 d1.loss_iou: 0.2343 d2.loss_cls: 0.2874 d2.loss_bbox: 0.1228 d2.loss_iou: 0.2251 d3.loss_cls: 0.2751 d3.loss_bbox: 0.1255 d3.loss_iou: 0.2256 d4.loss_cls: 0.2758 d4.loss_bbox: 0.1229 d4.loss_iou: 0.2228 enc_loss_cls: 0.3186 enc_loss_bbox: 0.1477 enc_loss_iou: 0.2591 dn_loss_cls: 0.1552 dn_loss_bbox: 0.1841 dn_loss_iou: 0.2224 d0.dn_loss_cls: 0.2330 d0.dn_loss_bbox: 0.3216 d0.dn_loss_iou: 0.3628 d1.dn_loss_cls: 0.1845 d1.dn_loss_bbox: 0.2120 d1.dn_loss_iou: 0.2533 d2.dn_loss_cls: 0.1635 d2.dn_loss_bbox: 0.1936 d2.dn_loss_iou: 0.2318 d3.dn_loss_cls: 0.1582 d3.dn_loss_bbox: 0.1866 d3.dn_loss_iou: 0.2249 d4.dn_loss_cls: 0.1554 d4.dn_loss_bbox: 0.1842 d4.dn_loss_iou: 0.2223 d1.loss_lmm_region: 0.1648 loss_lmm_image: 0.8938 2024/11/12 00:17:46 - mmengine - INFO - Iter(train) [ 64900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:29:18 time: 2.0013 data_time: 0.0178 memory: 33422 grad_norm: 35.0999 loss: 10.1968 loss_cls: 0.2917 loss_bbox: 0.1626 loss_iou: 0.2519 d0.loss_cls: 0.3310 
d0.loss_bbox: 0.1764 d0.loss_iou: 0.2682 d1.loss_cls: 0.3111 d1.loss_bbox: 0.1655 d1.loss_iou: 0.2600 d2.loss_cls: 0.2997 d2.loss_bbox: 0.1681 d2.loss_iou: 0.2554 d3.loss_cls: 0.2935 d3.loss_bbox: 0.1645 d3.loss_iou: 0.2540 d4.loss_cls: 0.2889 d4.loss_bbox: 0.1651 d4.loss_iou: 0.2522 enc_loss_cls: 0.3410 enc_loss_bbox: 0.1879 enc_loss_iou: 0.2880 dn_loss_cls: 0.1239 dn_loss_bbox: 0.2100 dn_loss_iou: 0.2392 d0.dn_loss_cls: 0.2059 d0.dn_loss_bbox: 0.3784 d0.dn_loss_iou: 0.3910 d1.dn_loss_cls: 0.1548 d1.dn_loss_bbox: 0.2431 d1.dn_loss_iou: 0.2722 d2.dn_loss_cls: 0.1352 d2.dn_loss_bbox: 0.2207 d2.dn_loss_iou: 0.2506 d3.dn_loss_cls: 0.1293 d3.dn_loss_bbox: 0.2118 d3.dn_loss_iou: 0.2426 d4.dn_loss_cls: 0.1240 d4.dn_loss_bbox: 0.2100 d4.dn_loss_iou: 0.2391 d1.loss_lmm_region: 0.1526 loss_lmm_image: 0.8856 2024/11/12 00:21:05 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 00:21:05 - mmengine - INFO - Iter(train) [ 65000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:25:54 time: 2.0074 data_time: 0.0178 memory: 33743 grad_norm: 26.1883 loss: 10.3807 loss_cls: 0.3191 loss_bbox: 0.1689 loss_iou: 0.3086 d0.loss_cls: 0.3626 d0.loss_bbox: 0.1833 d0.loss_iou: 0.3226 d1.loss_cls: 0.3362 d1.loss_bbox: 0.1748 d1.loss_iou: 0.3105 d2.loss_cls: 0.3252 d2.loss_bbox: 0.1726 d2.loss_iou: 0.3108 d3.loss_cls: 0.3236 d3.loss_bbox: 0.1697 d3.loss_iou: 0.3078 d4.loss_cls: 0.3192 d4.loss_bbox: 0.1680 d4.loss_iou: 0.3070 enc_loss_cls: 0.3678 enc_loss_bbox: 0.1914 enc_loss_iou: 0.3353 dn_loss_cls: 0.1062 dn_loss_bbox: 0.1774 dn_loss_iou: 0.2343 d0.dn_loss_cls: 0.1860 d0.dn_loss_bbox: 0.3202 d0.dn_loss_iou: 0.3815 d1.dn_loss_cls: 0.1367 d1.dn_loss_bbox: 0.2058 d1.dn_loss_iou: 0.2644 d2.dn_loss_cls: 0.1203 d2.dn_loss_bbox: 0.1837 d2.dn_loss_iou: 0.2426 d3.dn_loss_cls: 0.1118 d3.dn_loss_bbox: 0.1782 d3.dn_loss_iou: 0.2359 d4.dn_loss_cls: 0.1084 d4.dn_loss_bbox: 0.1773 d4.dn_loss_iou: 0.2342 d1.loss_lmm_region: 0.1431 loss_lmm_image: 0.8479 2024/11/12 00:24:23 - mmengine - INFO - Iter(train) [ 65100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:22:30 time: 2.0014 data_time: 0.0177 memory: 35131 grad_norm: 29.2409 loss: 9.7987 loss_cls: 0.2873 loss_bbox: 0.1492 loss_iou: 0.2388 d0.loss_cls: 0.3311 d0.loss_bbox: 0.1593 d0.loss_iou: 0.2512 d1.loss_cls: 0.3063 d1.loss_bbox: 0.1521 d1.loss_iou: 0.2416 d2.loss_cls: 0.2930 d2.loss_bbox: 0.1529 d2.loss_iou: 0.2386 d3.loss_cls: 0.2870 d3.loss_bbox: 0.1466 d3.loss_iou: 0.2375 d4.loss_cls: 0.2882 d4.loss_bbox: 0.1475 d4.loss_iou: 0.2381 enc_loss_cls: 0.3389 enc_loss_bbox: 0.1719 enc_loss_iou: 0.2709 dn_loss_cls: 0.1233 dn_loss_bbox: 0.1976 dn_loss_iou: 0.2274 d0.dn_loss_cls: 0.2062 d0.dn_loss_bbox: 0.3709 d0.dn_loss_iou: 0.3851 d1.dn_loss_cls: 0.1571 d1.dn_loss_bbox: 0.2365 d1.dn_loss_iou: 0.2656 d2.dn_loss_cls: 0.1363 d2.dn_loss_bbox: 0.2096 d2.dn_loss_iou: 0.2397 d3.dn_loss_cls: 0.1289 d3.dn_loss_bbox: 0.2005 d3.dn_loss_iou: 0.2311 d4.dn_loss_cls: 0.1222 d4.dn_loss_bbox: 0.1975 d4.dn_loss_iou: 0.2272 d1.loss_lmm_region: 0.1323 loss_lmm_image: 0.8758 2024/11/12 00:27:44 - mmengine - INFO - Iter(train) [ 65200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:19:09 time: 2.0059 data_time: 0.0178 memory: 34848 grad_norm: 28.8427 loss: 9.0195 loss_cls: 0.3029 loss_bbox: 0.1128 loss_iou: 0.2110 d0.loss_cls: 0.3540 d0.loss_bbox: 0.1219 d0.loss_iou: 0.2256 d1.loss_cls: 0.3232 d1.loss_bbox: 0.1209 d1.loss_iou: 0.2180 d2.loss_cls: 0.3139 d2.loss_bbox: 0.1152 d2.loss_iou: 0.2130 d3.loss_cls: 0.3041 d3.loss_bbox: 0.1150 d3.loss_iou: 
0.2131 d4.loss_cls: 0.3032 d4.loss_bbox: 0.1130 d4.loss_iou: 0.2110 enc_loss_cls: 0.3535 enc_loss_bbox: 0.1413 enc_loss_iou: 0.2511 dn_loss_cls: 0.1475 dn_loss_bbox: 0.1433 dn_loss_iou: 0.1875 d0.dn_loss_cls: 0.2225 d0.dn_loss_bbox: 0.2891 d0.dn_loss_iou: 0.3365 d1.dn_loss_cls: 0.1789 d1.dn_loss_bbox: 0.1809 d1.dn_loss_iou: 0.2225 d2.dn_loss_cls: 0.1587 d2.dn_loss_bbox: 0.1565 d2.dn_loss_iou: 0.2001 d3.dn_loss_cls: 0.1505 d3.dn_loss_bbox: 0.1462 d3.dn_loss_iou: 0.1908 d4.dn_loss_cls: 0.1458 d4.dn_loss_bbox: 0.1433 d4.dn_loss_iou: 0.1873 d1.loss_lmm_region: 0.1305 loss_lmm_image: 0.8634 2024/11/12 00:31:00 - mmengine - INFO - Iter(train) [ 65300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:15:42 time: 1.9566 data_time: 0.0178 memory: 34380 grad_norm: 35.8810 loss: 10.1482 loss_cls: 0.3256 loss_bbox: 0.1532 loss_iou: 0.2713 d0.loss_cls: 0.3728 d0.loss_bbox: 0.1706 d0.loss_iou: 0.2885 d1.loss_cls: 0.3465 d1.loss_bbox: 0.1591 d1.loss_iou: 0.2754 d2.loss_cls: 0.3357 d2.loss_bbox: 0.1531 d2.loss_iou: 0.2733 d3.loss_cls: 0.3280 d3.loss_bbox: 0.1535 d3.loss_iou: 0.2718 d4.loss_cls: 0.3241 d4.loss_bbox: 0.1537 d4.loss_iou: 0.2719 enc_loss_cls: 0.3721 enc_loss_bbox: 0.1835 enc_loss_iou: 0.3100 dn_loss_cls: 0.1136 dn_loss_bbox: 0.1581 dn_loss_iou: 0.2333 d0.dn_loss_cls: 0.2085 d0.dn_loss_bbox: 0.3067 d0.dn_loss_iou: 0.3812 d1.dn_loss_cls: 0.1484 d1.dn_loss_bbox: 0.1920 d1.dn_loss_iou: 0.2666 d2.dn_loss_cls: 0.1283 d2.dn_loss_bbox: 0.1691 d2.dn_loss_iou: 0.2436 d3.dn_loss_cls: 0.1181 d3.dn_loss_bbox: 0.1605 d3.dn_loss_iou: 0.2359 d4.dn_loss_cls: 0.1137 d4.dn_loss_bbox: 0.1581 d4.dn_loss_iou: 0.2333 d1.loss_lmm_region: 0.1709 loss_lmm_image: 0.9148 2024/11/12 00:34:20 - mmengine - INFO - Iter(train) [ 65400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:12:20 time: 1.9797 data_time: 0.0178 memory: 34649 grad_norm: 26.0281 loss: 9.0261 loss_cls: 0.3038 loss_bbox: 0.1130 loss_iou: 0.2161 d0.loss_cls: 0.3321 d0.loss_bbox: 0.1349 d0.loss_iou: 0.2380 d1.loss_cls: 0.3157 d1.loss_bbox: 0.1232 d1.loss_iou: 0.2243 d2.loss_cls: 0.3049 d2.loss_bbox: 0.1207 d2.loss_iou: 0.2202 d3.loss_cls: 0.3086 d3.loss_bbox: 0.1135 d3.loss_iou: 0.2149 d4.loss_cls: 0.3050 d4.loss_bbox: 0.1124 d4.loss_iou: 0.2155 enc_loss_cls: 0.3312 enc_loss_bbox: 0.1442 enc_loss_iou: 0.2607 dn_loss_cls: 0.1099 dn_loss_bbox: 0.1550 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1894 d0.dn_loss_bbox: 0.2998 d0.dn_loss_iou: 0.3481 d1.dn_loss_cls: 0.1357 d1.dn_loss_bbox: 0.1874 d1.dn_loss_iou: 0.2361 d2.dn_loss_cls: 0.1161 d2.dn_loss_bbox: 0.1649 d2.dn_loss_iou: 0.2137 d3.dn_loss_cls: 0.1120 d3.dn_loss_bbox: 0.1569 d3.dn_loss_iou: 0.2064 d4.dn_loss_cls: 0.1100 d4.dn_loss_bbox: 0.1551 d4.dn_loss_iou: 0.2035 d1.loss_lmm_region: 0.1515 loss_lmm_image: 0.9179 2024/11/12 00:37:38 - mmengine - INFO - Iter(train) [ 65500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:08:55 time: 1.9633 data_time: 0.0177 memory: 35085 grad_norm: 28.9047 loss: 8.8815 loss_cls: 0.2815 loss_bbox: 0.1290 loss_iou: 0.2163 d0.loss_cls: 0.3318 d0.loss_bbox: 0.1368 d0.loss_iou: 0.2252 d1.loss_cls: 0.3005 d1.loss_bbox: 0.1350 d1.loss_iou: 0.2223 d2.loss_cls: 0.2913 d2.loss_bbox: 0.1311 d2.loss_iou: 0.2168 d3.loss_cls: 0.2829 d3.loss_bbox: 0.1307 d3.loss_iou: 0.2176 d4.loss_cls: 0.2825 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2170 enc_loss_cls: 0.3343 enc_loss_bbox: 0.1575 enc_loss_iou: 0.2541 dn_loss_cls: 0.1010 dn_loss_bbox: 0.1458 dn_loss_iou: 0.1952 d0.dn_loss_cls: 0.1867 d0.dn_loss_bbox: 0.3159 d0.dn_loss_iou: 0.3466 d1.dn_loss_cls: 0.1332 d1.dn_loss_bbox: 
0.1898 d1.dn_loss_iou: 0.2301 d2.dn_loss_cls: 0.1185 d2.dn_loss_bbox: 0.1606 d2.dn_loss_iou: 0.2055 d3.dn_loss_cls: 0.1088 d3.dn_loss_bbox: 0.1494 d3.dn_loss_iou: 0.1976 d4.dn_loss_cls: 0.1031 d4.dn_loss_bbox: 0.1459 d4.dn_loss_iou: 0.1953 d1.loss_lmm_region: 0.1409 loss_lmm_image: 0.8865 2024/11/12 00:40:55 - mmengine - INFO - Iter(train) [ 65600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:05:29 time: 2.0034 data_time: 0.0179 memory: 34079 grad_norm: 34.3933 loss: 8.6623 loss_cls: 0.2747 loss_bbox: 0.1222 loss_iou: 0.2250 d0.loss_cls: 0.3260 d0.loss_bbox: 0.1299 d0.loss_iou: 0.2293 d1.loss_cls: 0.3002 d1.loss_bbox: 0.1259 d1.loss_iou: 0.2258 d2.loss_cls: 0.2858 d2.loss_bbox: 0.1223 d2.loss_iou: 0.2259 d3.loss_cls: 0.2774 d3.loss_bbox: 0.1226 d3.loss_iou: 0.2259 d4.loss_cls: 0.2758 d4.loss_bbox: 0.1218 d4.loss_iou: 0.2247 enc_loss_cls: 0.3280 enc_loss_bbox: 0.1391 enc_loss_iou: 0.2462 dn_loss_cls: 0.0910 dn_loss_bbox: 0.1608 dn_loss_iou: 0.1886 d0.dn_loss_cls: 0.1699 d0.dn_loss_bbox: 0.2957 d0.dn_loss_iou: 0.3165 d1.dn_loss_cls: 0.1250 d1.dn_loss_bbox: 0.1903 d1.dn_loss_iou: 0.2161 d2.dn_loss_cls: 0.1068 d2.dn_loss_bbox: 0.1691 d2.dn_loss_iou: 0.1977 d3.dn_loss_cls: 0.0990 d3.dn_loss_bbox: 0.1618 d3.dn_loss_iou: 0.1909 d4.dn_loss_cls: 0.0921 d4.dn_loss_bbox: 0.1608 d4.dn_loss_iou: 0.1884 d1.loss_lmm_region: 0.1235 loss_lmm_image: 0.8639 2024/11/12 00:44:12 - mmengine - INFO - Iter(train) [ 65700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 23:02:03 time: 1.9391 data_time: 0.0177 memory: 34169 grad_norm: 27.9225 loss: 8.9828 loss_cls: 0.2644 loss_bbox: 0.1292 loss_iou: 0.2158 d0.loss_cls: 0.3049 d0.loss_bbox: 0.1435 d0.loss_iou: 0.2320 d1.loss_cls: 0.2795 d1.loss_bbox: 0.1358 d1.loss_iou: 0.2240 d2.loss_cls: 0.2758 d2.loss_bbox: 0.1300 d2.loss_iou: 0.2173 d3.loss_cls: 0.2683 d3.loss_bbox: 0.1282 d3.loss_iou: 0.2173 d4.loss_cls: 0.2681 d4.loss_bbox: 0.1267 d4.loss_iou: 0.2165 enc_loss_cls: 0.3163 enc_loss_bbox: 0.1511 enc_loss_iou: 0.2492 dn_loss_cls: 0.1038 dn_loss_bbox: 0.1809 dn_loss_iou: 0.2148 d0.dn_loss_cls: 0.1804 d0.dn_loss_bbox: 0.3162 d0.dn_loss_iou: 0.3530 d1.dn_loss_cls: 0.1318 d1.dn_loss_bbox: 0.2097 d1.dn_loss_iou: 0.2442 d2.dn_loss_cls: 0.1164 d2.dn_loss_bbox: 0.1905 d2.dn_loss_iou: 0.2236 d3.dn_loss_cls: 0.1093 d3.dn_loss_bbox: 0.1826 d3.dn_loss_iou: 0.2171 d4.dn_loss_cls: 0.1041 d4.dn_loss_bbox: 0.1809 d4.dn_loss_iou: 0.2148 d1.loss_lmm_region: 0.1375 loss_lmm_image: 0.8769 2024/11/12 00:47:32 - mmengine - INFO - Iter(train) [ 65800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:58:41 time: 2.0049 data_time: 0.0178 memory: 33473 grad_norm: 27.6586 loss: 9.3268 loss_cls: 0.2773 loss_bbox: 0.1471 loss_iou: 0.2473 d0.loss_cls: 0.3234 d0.loss_bbox: 0.1539 d0.loss_iou: 0.2566 d1.loss_cls: 0.2944 d1.loss_bbox: 0.1520 d1.loss_iou: 0.2532 d2.loss_cls: 0.2919 d2.loss_bbox: 0.1454 d2.loss_iou: 0.2468 d3.loss_cls: 0.2846 d3.loss_bbox: 0.1434 d3.loss_iou: 0.2438 d4.loss_cls: 0.2790 d4.loss_bbox: 0.1469 d4.loss_iou: 0.2475 enc_loss_cls: 0.3358 enc_loss_bbox: 0.1727 enc_loss_iou: 0.2856 dn_loss_cls: 0.0980 dn_loss_bbox: 0.1570 dn_loss_iou: 0.2164 d0.dn_loss_cls: 0.1813 d0.dn_loss_bbox: 0.3110 d0.dn_loss_iou: 0.3675 d1.dn_loss_cls: 0.1330 d1.dn_loss_bbox: 0.1896 d1.dn_loss_iou: 0.2483 d2.dn_loss_cls: 0.1121 d2.dn_loss_bbox: 0.1659 d2.dn_loss_iou: 0.2265 d3.dn_loss_cls: 0.1036 d3.dn_loss_bbox: 0.1595 d3.dn_loss_iou: 0.2195 d4.dn_loss_cls: 0.0980 d4.dn_loss_bbox: 0.1569 d4.dn_loss_iou: 0.2164 d1.loss_lmm_region: 0.1427 loss_lmm_image: 0.8952 2024/11/12 
00:50:50 - mmengine - INFO - Iter(train) [ 65900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:55:17 time: 1.9769 data_time: 0.0177 memory: 34126 grad_norm: 26.4900 loss: 8.6139 loss_cls: 0.2622 loss_bbox: 0.1144 loss_iou: 0.2234 d0.loss_cls: 0.3058 d0.loss_bbox: 0.1231 d0.loss_iou: 0.2365 d1.loss_cls: 0.2773 d1.loss_bbox: 0.1153 d1.loss_iou: 0.2267 d2.loss_cls: 0.2711 d2.loss_bbox: 0.1146 d2.loss_iou: 0.2226 d3.loss_cls: 0.2648 d3.loss_bbox: 0.1145 d3.loss_iou: 0.2237 d4.loss_cls: 0.2654 d4.loss_bbox: 0.1149 d4.loss_iou: 0.2220 enc_loss_cls: 0.3034 enc_loss_bbox: 0.1394 enc_loss_iou: 0.2613 dn_loss_cls: 0.0888 dn_loss_bbox: 0.1550 dn_loss_iou: 0.2033 d0.dn_loss_cls: 0.1682 d0.dn_loss_bbox: 0.3143 d0.dn_loss_iou: 0.3575 d1.dn_loss_cls: 0.1160 d1.dn_loss_bbox: 0.1896 d1.dn_loss_iou: 0.2367 d2.dn_loss_cls: 0.0988 d2.dn_loss_bbox: 0.1654 d2.dn_loss_iou: 0.2131 d3.dn_loss_cls: 0.0916 d3.dn_loss_bbox: 0.1580 d3.dn_loss_iou: 0.2059 d4.dn_loss_cls: 0.0886 d4.dn_loss_bbox: 0.1550 d4.dn_loss_iou: 0.2032 d1.loss_lmm_region: 0.1112 loss_lmm_image: 0.8912 2024/11/12 00:54:09 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 00:54:09 - mmengine - INFO - Iter(train) [ 66000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:51:54 time: 2.0000 data_time: 0.0178 memory: 33551 grad_norm: 24.4474 loss: 9.2910 loss_cls: 0.2866 loss_bbox: 0.1367 loss_iou: 0.2389 d0.loss_cls: 0.3285 d0.loss_bbox: 0.1507 d0.loss_iou: 0.2517 d1.loss_cls: 0.2970 d1.loss_bbox: 0.1451 d1.loss_iou: 0.2478 d2.loss_cls: 0.2912 d2.loss_bbox: 0.1403 d2.loss_iou: 0.2418 d3.loss_cls: 0.2840 d3.loss_bbox: 0.1396 d3.loss_iou: 0.2396 d4.loss_cls: 0.2821 d4.loss_bbox: 0.1403 d4.loss_iou: 0.2399 enc_loss_cls: 0.3315 enc_loss_bbox: 0.1630 enc_loss_iou: 0.2750 dn_loss_cls: 0.1054 dn_loss_bbox: 0.1842 dn_loss_iou: 0.2045 d0.dn_loss_cls: 0.1731 d0.dn_loss_bbox: 0.3301 d0.dn_loss_iou: 0.3450 d1.dn_loss_cls: 0.1295 d1.dn_loss_bbox: 0.2182 d1.dn_loss_iou: 0.2366 d2.dn_loss_cls: 0.1139 d2.dn_loss_bbox: 0.1914 d2.dn_loss_iou: 0.2147 d3.dn_loss_cls: 0.1073 d3.dn_loss_bbox: 0.1856 d3.dn_loss_iou: 0.2070 d4.dn_loss_cls: 0.1044 d4.dn_loss_bbox: 0.1843 d4.dn_loss_iou: 0.2045 d1.loss_lmm_region: 0.1305 loss_lmm_image: 0.8694 2024/11/12 00:57:27 - mmengine - INFO - Iter(train) [ 66100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:48:29 time: 1.9650 data_time: 0.0180 memory: 34384 grad_norm: 31.2268 loss: 8.7206 loss_cls: 0.2683 loss_bbox: 0.1147 loss_iou: 0.2048 d0.loss_cls: 0.3080 d0.loss_bbox: 0.1321 d0.loss_iou: 0.2201 d1.loss_cls: 0.2853 d1.loss_bbox: 0.1213 d1.loss_iou: 0.2104 d2.loss_cls: 0.2784 d2.loss_bbox: 0.1168 d2.loss_iou: 0.2059 d3.loss_cls: 0.2725 d3.loss_bbox: 0.1131 d3.loss_iou: 0.2033 d4.loss_cls: 0.2688 d4.loss_bbox: 0.1155 d4.loss_iou: 0.2051 enc_loss_cls: 0.3168 enc_loss_bbox: 0.1439 enc_loss_iou: 0.2382 dn_loss_cls: 0.1083 dn_loss_bbox: 0.1590 dn_loss_iou: 0.2085 d0.dn_loss_cls: 0.1857 d0.dn_loss_bbox: 0.3030 d0.dn_loss_iou: 0.3503 d1.dn_loss_cls: 0.1405 d1.dn_loss_bbox: 0.1885 d1.dn_loss_iou: 0.2372 d2.dn_loss_cls: 0.1218 d2.dn_loss_bbox: 0.1694 d2.dn_loss_iou: 0.2179 d3.dn_loss_cls: 0.1141 d3.dn_loss_bbox: 0.1615 d3.dn_loss_iou: 0.2109 d4.dn_loss_cls: 0.1098 d4.dn_loss_bbox: 0.1590 d4.dn_loss_iou: 0.2084 d1.loss_lmm_region: 0.1320 loss_lmm_image: 0.8915 2024/11/12 01:00:47 - mmengine - INFO - Iter(train) [ 66200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:45:07 time: 2.0023 data_time: 0.0177 memory: 33378 grad_norm: 26.0916 loss: 9.1941 loss_cls: 0.3054 loss_bbox: 
0.1167 loss_iou: 0.2178 d0.loss_cls: 0.3438 d0.loss_bbox: 0.1274 d0.loss_iou: 0.2305 d1.loss_cls: 0.3239 d1.loss_bbox: 0.1239 d1.loss_iou: 0.2238 d2.loss_cls: 0.3149 d2.loss_bbox: 0.1181 d2.loss_iou: 0.2193 d3.loss_cls: 0.3095 d3.loss_bbox: 0.1171 d3.loss_iou: 0.2177 d4.loss_cls: 0.3086 d4.loss_bbox: 0.1159 d4.loss_iou: 0.2165 enc_loss_cls: 0.3484 enc_loss_bbox: 0.1408 enc_loss_iou: 0.2540 dn_loss_cls: 0.1215 dn_loss_bbox: 0.1711 dn_loss_iou: 0.1996 d0.dn_loss_cls: 0.2014 d0.dn_loss_bbox: 0.3201 d0.dn_loss_iou: 0.3335 d1.dn_loss_cls: 0.1521 d1.dn_loss_bbox: 0.2007 d1.dn_loss_iou: 0.2265 d2.dn_loss_cls: 0.1345 d2.dn_loss_bbox: 0.1806 d2.dn_loss_iou: 0.2095 d3.dn_loss_cls: 0.1268 d3.dn_loss_bbox: 0.1722 d3.dn_loss_iou: 0.2016 d4.dn_loss_cls: 0.1229 d4.dn_loss_bbox: 0.1711 d4.dn_loss_iou: 0.1995 d1.loss_lmm_region: 0.1564 loss_lmm_image: 0.8986 2024/11/12 01:04:06 - mmengine - INFO - Iter(train) [ 66300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:41:45 time: 1.9990 data_time: 0.0178 memory: 34638 grad_norm: 30.8394 loss: 10.0969 loss_cls: 0.3266 loss_bbox: 0.1482 loss_iou: 0.2535 d0.loss_cls: 0.3738 d0.loss_bbox: 0.1670 d0.loss_iou: 0.2709 d1.loss_cls: 0.3433 d1.loss_bbox: 0.1562 d1.loss_iou: 0.2596 d2.loss_cls: 0.3377 d2.loss_bbox: 0.1489 d2.loss_iou: 0.2533 d3.loss_cls: 0.3288 d3.loss_bbox: 0.1510 d3.loss_iou: 0.2537 d4.loss_cls: 0.3270 d4.loss_bbox: 0.1476 d4.loss_iou: 0.2541 enc_loss_cls: 0.3675 enc_loss_bbox: 0.1868 enc_loss_iou: 0.2979 dn_loss_cls: 0.1025 dn_loss_bbox: 0.1910 dn_loss_iou: 0.2252 d0.dn_loss_cls: 0.1873 d0.dn_loss_bbox: 0.3538 d0.dn_loss_iou: 0.3692 d1.dn_loss_cls: 0.1362 d1.dn_loss_bbox: 0.2315 d1.dn_loss_iou: 0.2594 d2.dn_loss_cls: 0.1182 d2.dn_loss_bbox: 0.2020 d2.dn_loss_iou: 0.2354 d3.dn_loss_cls: 0.1081 d3.dn_loss_bbox: 0.1933 d3.dn_loss_iou: 0.2283 d4.dn_loss_cls: 0.1040 d4.dn_loss_bbox: 0.1909 d4.dn_loss_iou: 0.2252 d1.loss_lmm_region: 0.1865 loss_lmm_image: 0.8953 2024/11/12 01:07:27 - mmengine - INFO - Iter(train) [ 66400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:38:23 time: 1.9665 data_time: 0.0178 memory: 34994 grad_norm: 26.2164 loss: 10.7938 loss_cls: 0.3601 loss_bbox: 0.1661 loss_iou: 0.2701 d0.loss_cls: 0.4180 d0.loss_bbox: 0.1794 d0.loss_iou: 0.2851 d1.loss_cls: 0.3778 d1.loss_bbox: 0.1784 d1.loss_iou: 0.2801 d2.loss_cls: 0.3706 d2.loss_bbox: 0.1725 d2.loss_iou: 0.2737 d3.loss_cls: 0.3667 d3.loss_bbox: 0.1672 d3.loss_iou: 0.2693 d4.loss_cls: 0.3604 d4.loss_bbox: 0.1639 d4.loss_iou: 0.2673 enc_loss_cls: 0.4056 enc_loss_bbox: 0.1978 enc_loss_iou: 0.3126 dn_loss_cls: 0.1470 dn_loss_bbox: 0.1898 dn_loss_iou: 0.2306 d0.dn_loss_cls: 0.2217 d0.dn_loss_bbox: 0.3440 d0.dn_loss_iou: 0.3786 d1.dn_loss_cls: 0.1709 d1.dn_loss_bbox: 0.2204 d1.dn_loss_iou: 0.2628 d2.dn_loss_cls: 0.1604 d2.dn_loss_bbox: 0.2009 d2.dn_loss_iou: 0.2414 d3.dn_loss_cls: 0.1523 d3.dn_loss_bbox: 0.1913 d3.dn_loss_iou: 0.2335 d4.dn_loss_cls: 0.1457 d4.dn_loss_bbox: 0.1897 d4.dn_loss_iou: 0.2305 d1.loss_lmm_region: 0.1550 loss_lmm_image: 0.8845 2024/11/12 01:10:48 - mmengine - INFO - Iter(train) [ 66500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:35:04 time: 2.0153 data_time: 0.0179 memory: 35450 grad_norm: 33.4485 loss: 10.3592 loss_cls: 0.3133 loss_bbox: 0.1578 loss_iou: 0.2697 d0.loss_cls: 0.3657 d0.loss_bbox: 0.1754 d0.loss_iou: 0.2817 d1.loss_cls: 0.3324 d1.loss_bbox: 0.1688 d1.loss_iou: 0.2783 d2.loss_cls: 0.3213 d2.loss_bbox: 0.1674 d2.loss_iou: 0.2746 d3.loss_cls: 0.3241 d3.loss_bbox: 0.1553 d3.loss_iou: 0.2664 d4.loss_cls: 0.3185 d4.loss_bbox: 
0.1571 d4.loss_iou: 0.2685 enc_loss_cls: 0.3649 enc_loss_bbox: 0.1942 enc_loss_iou: 0.3044 dn_loss_cls: 0.1301 dn_loss_bbox: 0.1886 dn_loss_iou: 0.2439 d0.dn_loss_cls: 0.2192 d0.dn_loss_bbox: 0.3478 d0.dn_loss_iou: 0.3944 d1.dn_loss_cls: 0.1669 d1.dn_loss_bbox: 0.2229 d1.dn_loss_iou: 0.2756 d2.dn_loss_cls: 0.1452 d2.dn_loss_bbox: 0.1997 d2.dn_loss_iou: 0.2547 d3.dn_loss_cls: 0.1361 d3.dn_loss_bbox: 0.1917 d3.dn_loss_iou: 0.2472 d4.dn_loss_cls: 0.1318 d4.dn_loss_bbox: 0.1886 d4.dn_loss_iou: 0.2440 d1.loss_lmm_region: 0.1537 loss_lmm_image: 0.8173 2024/11/12 01:14:08 - mmengine - INFO - Iter(train) [ 66600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:31:42 time: 2.0189 data_time: 0.0178 memory: 34189 grad_norm: 27.8120 loss: 10.7626 loss_cls: 0.3460 loss_bbox: 0.1791 loss_iou: 0.2813 d0.loss_cls: 0.3910 d0.loss_bbox: 0.1899 d0.loss_iou: 0.3014 d1.loss_cls: 0.3552 d1.loss_bbox: 0.1887 d1.loss_iou: 0.2952 d2.loss_cls: 0.3543 d2.loss_bbox: 0.1805 d2.loss_iou: 0.2834 d3.loss_cls: 0.3514 d3.loss_bbox: 0.1803 d3.loss_iou: 0.2829 d4.loss_cls: 0.3477 d4.loss_bbox: 0.1785 d4.loss_iou: 0.2811 enc_loss_cls: 0.3913 enc_loss_bbox: 0.2049 enc_loss_iou: 0.3236 dn_loss_cls: 0.1129 dn_loss_bbox: 0.2060 dn_loss_iou: 0.2367 d0.dn_loss_cls: 0.2036 d0.dn_loss_bbox: 0.3783 d0.dn_loss_iou: 0.3853 d1.dn_loss_cls: 0.1459 d1.dn_loss_bbox: 0.2426 d1.dn_loss_iou: 0.2702 d2.dn_loss_cls: 0.1250 d2.dn_loss_bbox: 0.2163 d2.dn_loss_iou: 0.2470 d3.dn_loss_cls: 0.1193 d3.dn_loss_bbox: 0.2091 d3.dn_loss_iou: 0.2395 d4.dn_loss_cls: 0.1130 d4.dn_loss_bbox: 0.2060 d4.dn_loss_iou: 0.2366 d1.loss_lmm_region: 0.1470 loss_lmm_image: 0.8343 2024/11/12 01:17:28 - mmengine - INFO - Iter(train) [ 66700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:28:19 time: 2.0156 data_time: 0.0178 memory: 34358 grad_norm: 27.1054 loss: 9.9011 loss_cls: 0.2871 loss_bbox: 0.1588 loss_iou: 0.2856 d0.loss_cls: 0.3443 d0.loss_bbox: 0.1618 d0.loss_iou: 0.2949 d1.loss_cls: 0.3099 d1.loss_bbox: 0.1592 d1.loss_iou: 0.2893 d2.loss_cls: 0.2965 d2.loss_bbox: 0.1595 d2.loss_iou: 0.2842 d3.loss_cls: 0.2877 d3.loss_bbox: 0.1627 d3.loss_iou: 0.2857 d4.loss_cls: 0.2872 d4.loss_bbox: 0.1594 d4.loss_iou: 0.2838 enc_loss_cls: 0.3453 enc_loss_bbox: 0.1788 enc_loss_iou: 0.3176 dn_loss_cls: 0.0832 dn_loss_bbox: 0.1965 dn_loss_iou: 0.2442 d0.dn_loss_cls: 0.1632 d0.dn_loss_bbox: 0.3467 d0.dn_loss_iou: 0.3929 d1.dn_loss_cls: 0.1106 d1.dn_loss_bbox: 0.2283 d1.dn_loss_iou: 0.2750 d2.dn_loss_cls: 0.0947 d2.dn_loss_bbox: 0.2061 d2.dn_loss_iou: 0.2535 d3.dn_loss_cls: 0.0878 d3.dn_loss_bbox: 0.1988 d3.dn_loss_iou: 0.2467 d4.dn_loss_cls: 0.0845 d4.dn_loss_bbox: 0.1966 d4.dn_loss_iou: 0.2443 d1.loss_lmm_region: 0.1293 loss_lmm_image: 0.7793 2024/11/12 01:20:46 - mmengine - INFO - Iter(train) [ 66800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:24:54 time: 1.9491 data_time: 0.0178 memory: 34489 grad_norm: nan loss: 9.2033 loss_cls: 0.2700 loss_bbox: 0.1296 loss_iou: 0.2310 d0.loss_cls: 0.3091 d0.loss_bbox: 0.1483 d0.loss_iou: 0.2534 d1.loss_cls: 0.2956 d1.loss_bbox: 0.1359 d1.loss_iou: 0.2372 d2.loss_cls: 0.2816 d2.loss_bbox: 0.1286 d2.loss_iou: 0.2317 d3.loss_cls: 0.2740 d3.loss_bbox: 0.1293 d3.loss_iou: 0.2307 d4.loss_cls: 0.2743 d4.loss_bbox: 0.1287 d4.loss_iou: 0.2303 enc_loss_cls: 0.3173 enc_loss_bbox: 0.1683 enc_loss_iou: 0.2756 dn_loss_cls: 0.0908 dn_loss_bbox: 0.1833 dn_loss_iou: 0.2229 d0.dn_loss_cls: 0.1812 d0.dn_loss_bbox: 0.3310 d0.dn_loss_iou: 0.3764 d1.dn_loss_cls: 0.1261 d1.dn_loss_bbox: 0.2157 d1.dn_loss_iou: 0.2554 d2.dn_loss_cls: 
0.1022 d2.dn_loss_bbox: 0.1957 d2.dn_loss_iou: 0.2341 d3.dn_loss_cls: 0.0957 d3.dn_loss_bbox: 0.1861 d3.dn_loss_iou: 0.2260 d4.dn_loss_cls: 0.0914 d4.dn_loss_bbox: 0.1833 d4.dn_loss_iou: 0.2229 d1.loss_lmm_region: 0.1334 loss_lmm_image: 0.8693 2024/11/12 01:24:07 - mmengine - INFO - Iter(train) [ 66900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:21:34 time: 2.0116 data_time: 0.0180 memory: 33822 grad_norm: 28.3006 loss: 8.2728 loss_cls: 0.2844 loss_bbox: 0.0960 loss_iou: 0.2045 d0.loss_cls: 0.3199 d0.loss_bbox: 0.1077 d0.loss_iou: 0.2216 d1.loss_cls: 0.3010 d1.loss_bbox: 0.1021 d1.loss_iou: 0.2134 d2.loss_cls: 0.2845 d2.loss_bbox: 0.0997 d2.loss_iou: 0.2072 d3.loss_cls: 0.2835 d3.loss_bbox: 0.0973 d3.loss_iou: 0.2059 d4.loss_cls: 0.2830 d4.loss_bbox: 0.0969 d4.loss_iou: 0.2049 enc_loss_cls: 0.3205 enc_loss_bbox: 0.1236 enc_loss_iou: 0.2467 dn_loss_cls: 0.1020 dn_loss_bbox: 0.1298 dn_loss_iou: 0.1877 d0.dn_loss_cls: 0.1725 d0.dn_loss_bbox: 0.2472 d0.dn_loss_iou: 0.3237 d1.dn_loss_cls: 0.1333 d1.dn_loss_bbox: 0.1541 d1.dn_loss_iou: 0.2172 d2.dn_loss_cls: 0.1166 d2.dn_loss_bbox: 0.1345 d2.dn_loss_iou: 0.1957 d3.dn_loss_cls: 0.1086 d3.dn_loss_bbox: 0.1298 d3.dn_loss_iou: 0.1891 d4.dn_loss_cls: 0.1034 d4.dn_loss_bbox: 0.1298 d4.dn_loss_iou: 0.1876 d1.loss_lmm_region: 0.1248 loss_lmm_image: 0.8810 2024/11/12 01:27:21 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 01:27:21 - mmengine - INFO - Iter(train) [ 67000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:18:05 time: 1.9667 data_time: 0.0178 memory: 33775 grad_norm: 35.1239 loss: 10.4507 loss_cls: 0.3517 loss_bbox: 0.1599 loss_iou: 0.2942 d0.loss_cls: 0.3973 d0.loss_bbox: 0.1698 d0.loss_iou: 0.3087 d1.loss_cls: 0.3796 d1.loss_bbox: 0.1599 d1.loss_iou: 0.2957 d2.loss_cls: 0.3716 d2.loss_bbox: 0.1535 d2.loss_iou: 0.2892 d3.loss_cls: 0.3512 d3.loss_bbox: 0.1661 d3.loss_iou: 0.2943 d4.loss_cls: 0.3548 d4.loss_bbox: 0.1603 d4.loss_iou: 0.2929 enc_loss_cls: 0.4001 enc_loss_bbox: 0.1875 enc_loss_iou: 0.3332 dn_loss_cls: 0.1530 dn_loss_bbox: 0.1550 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.2270 d0.dn_loss_bbox: 0.2837 d0.dn_loss_iou: 0.3549 d1.dn_loss_cls: 0.1754 d1.dn_loss_bbox: 0.1833 d1.dn_loss_iou: 0.2430 d2.dn_loss_cls: 0.1618 d2.dn_loss_bbox: 0.1633 d2.dn_loss_iou: 0.2243 d3.dn_loss_cls: 0.1564 d3.dn_loss_bbox: 0.1570 d3.dn_loss_iou: 0.2176 d4.dn_loss_cls: 0.1522 d4.dn_loss_bbox: 0.1550 d4.dn_loss_iou: 0.2157 d1.loss_lmm_region: 0.1479 loss_lmm_image: 0.8368 2024/11/12 01:30:40 - mmengine - INFO - Iter(train) [ 67100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:14:41 time: 1.9875 data_time: 0.0179 memory: 34055 grad_norm: 26.8243 loss: 9.1623 loss_cls: 0.2926 loss_bbox: 0.1178 loss_iou: 0.2088 d0.loss_cls: 0.3352 d0.loss_bbox: 0.1345 d0.loss_iou: 0.2326 d1.loss_cls: 0.3093 d1.loss_bbox: 0.1254 d1.loss_iou: 0.2188 d2.loss_cls: 0.2959 d2.loss_bbox: 0.1233 d2.loss_iou: 0.2158 d3.loss_cls: 0.2947 d3.loss_bbox: 0.1198 d3.loss_iou: 0.2107 d4.loss_cls: 0.2948 d4.loss_bbox: 0.1178 d4.loss_iou: 0.2082 enc_loss_cls: 0.3375 enc_loss_bbox: 0.1432 enc_loss_iou: 0.2532 dn_loss_cls: 0.1222 dn_loss_bbox: 0.1736 dn_loss_iou: 0.2121 d0.dn_loss_cls: 0.1978 d0.dn_loss_bbox: 0.3200 d0.dn_loss_iou: 0.3620 d1.dn_loss_cls: 0.1513 d1.dn_loss_bbox: 0.2052 d1.dn_loss_iou: 0.2454 d2.dn_loss_cls: 0.1322 d2.dn_loss_bbox: 0.1841 d2.dn_loss_iou: 0.2227 d3.dn_loss_cls: 0.1276 d3.dn_loss_bbox: 0.1753 d3.dn_loss_iou: 0.2149 d4.dn_loss_cls: 0.1238 d4.dn_loss_bbox: 0.1737 d4.dn_loss_iou: 0.2120 d1.loss_lmm_region: 
0.1521 loss_lmm_image: 0.8645 2024/11/12 01:33:57 - mmengine - INFO - Iter(train) [ 67200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:11:16 time: 1.9743 data_time: 0.0180 memory: 33903 grad_norm: 29.1470 loss: 10.4127 loss_cls: 0.3189 loss_bbox: 0.1630 loss_iou: 0.3064 d0.loss_cls: 0.3744 d0.loss_bbox: 0.1869 d0.loss_iou: 0.3252 d1.loss_cls: 0.3515 d1.loss_bbox: 0.1702 d1.loss_iou: 0.3147 d2.loss_cls: 0.3379 d2.loss_bbox: 0.1609 d2.loss_iou: 0.3051 d3.loss_cls: 0.3284 d3.loss_bbox: 0.1603 d3.loss_iou: 0.3038 d4.loss_cls: 0.3243 d4.loss_bbox: 0.1605 d4.loss_iou: 0.3044 enc_loss_cls: 0.3837 enc_loss_bbox: 0.1883 enc_loss_iou: 0.3387 dn_loss_cls: 0.0884 dn_loss_bbox: 0.1793 dn_loss_iou: 0.2453 d0.dn_loss_cls: 0.1742 d0.dn_loss_bbox: 0.3229 d0.dn_loss_iou: 0.3939 d1.dn_loss_cls: 0.1257 d1.dn_loss_bbox: 0.2141 d1.dn_loss_iou: 0.2785 d2.dn_loss_cls: 0.1023 d2.dn_loss_bbox: 0.1925 d2.dn_loss_iou: 0.2560 d3.dn_loss_cls: 0.0940 d3.dn_loss_bbox: 0.1813 d3.dn_loss_iou: 0.2472 d4.dn_loss_cls: 0.0900 d4.dn_loss_bbox: 0.1794 d4.dn_loss_iou: 0.2453 d1.loss_lmm_region: 0.1257 loss_lmm_image: 0.8693 2024/11/12 01:37:15 - mmengine - INFO - Iter(train) [ 67300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:07:52 time: 1.9984 data_time: 0.0178 memory: 34512 grad_norm: 34.2442 loss: 9.7391 loss_cls: 0.3256 loss_bbox: 0.1556 loss_iou: 0.2977 d0.loss_cls: 0.3726 d0.loss_bbox: 0.1690 d0.loss_iou: 0.3130 d1.loss_cls: 0.3457 d1.loss_bbox: 0.1593 d1.loss_iou: 0.2988 d2.loss_cls: 0.3368 d2.loss_bbox: 0.1544 d2.loss_iou: 0.2939 d3.loss_cls: 0.3272 d3.loss_bbox: 0.1554 d3.loss_iou: 0.2978 d4.loss_cls: 0.3280 d4.loss_bbox: 0.1544 d4.loss_iou: 0.2961 enc_loss_cls: 0.3705 enc_loss_bbox: 0.1899 enc_loss_iou: 0.3410 dn_loss_cls: 0.0817 dn_loss_bbox: 0.1435 dn_loss_iou: 0.2062 d0.dn_loss_cls: 0.1600 d0.dn_loss_bbox: 0.2787 d0.dn_loss_iou: 0.3436 d1.dn_loss_cls: 0.1135 d1.dn_loss_bbox: 0.1773 d1.dn_loss_iou: 0.2365 d2.dn_loss_cls: 0.0959 d2.dn_loss_bbox: 0.1547 d2.dn_loss_iou: 0.2160 d3.dn_loss_cls: 0.0878 d3.dn_loss_bbox: 0.1459 d3.dn_loss_iou: 0.2090 d4.dn_loss_cls: 0.0838 d4.dn_loss_bbox: 0.1435 d4.dn_loss_iou: 0.2061 d1.loss_lmm_region: 0.1272 loss_lmm_image: 0.8451 2024/11/12 01:40:32 - mmengine - INFO - Iter(train) [ 67400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:04:27 time: 1.9732 data_time: 0.0178 memory: 32597 grad_norm: 29.6191 loss: 10.3331 loss_cls: 0.3513 loss_bbox: 0.1484 loss_iou: 0.2933 d0.loss_cls: 0.3971 d0.loss_bbox: 0.1617 d0.loss_iou: 0.3088 d1.loss_cls: 0.3738 d1.loss_bbox: 0.1555 d1.loss_iou: 0.3028 d2.loss_cls: 0.3649 d2.loss_bbox: 0.1484 d2.loss_iou: 0.2933 d3.loss_cls: 0.3562 d3.loss_bbox: 0.1464 d3.loss_iou: 0.2921 d4.loss_cls: 0.3565 d4.loss_bbox: 0.1469 d4.loss_iou: 0.2922 enc_loss_cls: 0.4052 enc_loss_bbox: 0.1756 enc_loss_iou: 0.3314 dn_loss_cls: 0.1229 dn_loss_bbox: 0.1575 dn_loss_iou: 0.2326 d0.dn_loss_cls: 0.1963 d0.dn_loss_bbox: 0.2949 d0.dn_loss_iou: 0.3727 d1.dn_loss_cls: 0.1512 d1.dn_loss_bbox: 0.1887 d1.dn_loss_iou: 0.2631 d2.dn_loss_cls: 0.1331 d2.dn_loss_bbox: 0.1661 d2.dn_loss_iou: 0.2407 d3.dn_loss_cls: 0.1289 d3.dn_loss_bbox: 0.1591 d3.dn_loss_iou: 0.2344 d4.dn_loss_cls: 0.1246 d4.dn_loss_bbox: 0.1574 d4.dn_loss_iou: 0.2324 d1.loss_lmm_region: 0.1424 loss_lmm_image: 0.8321 2024/11/12 01:43:52 - mmengine - INFO - Iter(train) [ 67500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 22:01:05 time: 1.9882 data_time: 0.0178 memory: 33235 grad_norm: 31.0522 loss: 8.7396 loss_cls: 0.2458 loss_bbox: 0.1268 loss_iou: 0.2015 d0.loss_cls: 0.2922 
d0.loss_bbox: 0.1360 d0.loss_iou: 0.2110 d1.loss_cls: 0.2702 d1.loss_bbox: 0.1252 d1.loss_iou: 0.2010 d2.loss_cls: 0.2615 d2.loss_bbox: 0.1254 d2.loss_iou: 0.2008 d3.loss_cls: 0.2558 d3.loss_bbox: 0.1235 d3.loss_iou: 0.1985 d4.loss_cls: 0.2448 d4.loss_bbox: 0.1279 d4.loss_iou: 0.2018 enc_loss_cls: 0.2897 enc_loss_bbox: 0.1450 enc_loss_iou: 0.2314 dn_loss_cls: 0.1389 dn_loss_bbox: 0.1534 dn_loss_iou: 0.1992 d0.dn_loss_cls: 0.2107 d0.dn_loss_bbox: 0.3223 d0.dn_loss_iou: 0.3553 d1.dn_loss_cls: 0.1613 d1.dn_loss_bbox: 0.1982 d1.dn_loss_iou: 0.2348 d2.dn_loss_cls: 0.1480 d2.dn_loss_bbox: 0.1663 d2.dn_loss_iou: 0.2100 d3.dn_loss_cls: 0.1433 d3.dn_loss_bbox: 0.1559 d3.dn_loss_iou: 0.2024 d4.dn_loss_cls: 0.1391 d4.dn_loss_bbox: 0.1534 d4.dn_loss_iou: 0.1992 d1.loss_lmm_region: 0.1370 loss_lmm_image: 0.8953 2024/11/12 01:47:08 - mmengine - INFO - Iter(train) [ 67600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:57:38 time: 1.9771 data_time: 0.0178 memory: 31561 grad_norm: 29.8836 loss: 8.9959 loss_cls: 0.2899 loss_bbox: 0.1356 loss_iou: 0.2133 d0.loss_cls: 0.3341 d0.loss_bbox: 0.1379 d0.loss_iou: 0.2229 d1.loss_cls: 0.3099 d1.loss_bbox: 0.1314 d1.loss_iou: 0.2134 d2.loss_cls: 0.3019 d2.loss_bbox: 0.1329 d2.loss_iou: 0.2140 d3.loss_cls: 0.2955 d3.loss_bbox: 0.1332 d3.loss_iou: 0.2132 d4.loss_cls: 0.2935 d4.loss_bbox: 0.1330 d4.loss_iou: 0.2121 enc_loss_cls: 0.3286 enc_loss_bbox: 0.1587 enc_loss_iou: 0.2476 dn_loss_cls: 0.1444 dn_loss_bbox: 0.1453 dn_loss_iou: 0.1818 d0.dn_loss_cls: 0.2256 d0.dn_loss_bbox: 0.2873 d0.dn_loss_iou: 0.3207 d1.dn_loss_cls: 0.1743 d1.dn_loss_bbox: 0.1796 d1.dn_loss_iou: 0.2127 d2.dn_loss_cls: 0.1585 d2.dn_loss_bbox: 0.1548 d2.dn_loss_iou: 0.1909 d3.dn_loss_cls: 0.1486 d3.dn_loss_bbox: 0.1475 d3.dn_loss_iou: 0.1846 d4.dn_loss_cls: 0.1449 d4.dn_loss_bbox: 0.1454 d4.dn_loss_iou: 0.1818 d1.loss_lmm_region: 0.1571 loss_lmm_image: 0.8574 2024/11/12 01:50:29 - mmengine - INFO - Iter(train) [ 67700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:54:17 time: 1.9962 data_time: 0.0179 memory: 34342 grad_norm: 30.0632 loss: 9.4919 loss_cls: 0.2948 loss_bbox: 0.1317 loss_iou: 0.2439 d0.loss_cls: 0.3472 d0.loss_bbox: 0.1308 d0.loss_iou: 0.2540 d1.loss_cls: 0.3213 d1.loss_bbox: 0.1269 d1.loss_iou: 0.2455 d2.loss_cls: 0.3113 d2.loss_bbox: 0.1224 d2.loss_iou: 0.2395 d3.loss_cls: 0.2981 d3.loss_bbox: 0.1299 d3.loss_iou: 0.2412 d4.loss_cls: 0.2936 d4.loss_bbox: 0.1318 d4.loss_iou: 0.2432 enc_loss_cls: 0.3459 enc_loss_bbox: 0.1470 enc_loss_iou: 0.2751 dn_loss_cls: 0.1053 dn_loss_bbox: 0.1897 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.1844 d0.dn_loss_bbox: 0.3606 d0.dn_loss_iou: 0.3681 d1.dn_loss_cls: 0.1344 d1.dn_loss_bbox: 0.2325 d1.dn_loss_iou: 0.2507 d2.dn_loss_cls: 0.1174 d2.dn_loss_bbox: 0.2041 d2.dn_loss_iou: 0.2266 d3.dn_loss_cls: 0.1072 d3.dn_loss_bbox: 0.1935 d3.dn_loss_iou: 0.2191 d4.dn_loss_cls: 0.1057 d4.dn_loss_bbox: 0.1898 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1341 loss_lmm_image: 0.8620 2024/11/12 01:53:47 - mmengine - INFO - Iter(train) [ 67800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:50:53 time: 1.9625 data_time: 0.0178 memory: 32621 grad_norm: 32.5518 loss: 10.1952 loss_cls: 0.3344 loss_bbox: 0.1559 loss_iou: 0.2398 d0.loss_cls: 0.3806 d0.loss_bbox: 0.1751 d0.loss_iou: 0.2587 d1.loss_cls: 0.3604 d1.loss_bbox: 0.1656 d1.loss_iou: 0.2493 d2.loss_cls: 0.3527 d2.loss_bbox: 0.1583 d2.loss_iou: 0.2422 d3.loss_cls: 0.3422 d3.loss_bbox: 0.1606 d3.loss_iou: 0.2438 d4.loss_cls: 0.3376 d4.loss_bbox: 0.1574 d4.loss_iou: 0.2415 enc_loss_cls: 
0.3954 enc_loss_bbox: 0.1796 enc_loss_iou: 0.2743 dn_loss_cls: 0.1411 dn_loss_bbox: 0.1676 dn_loss_iou: 0.2231 d0.dn_loss_cls: 0.2387 d0.dn_loss_bbox: 0.3210 d0.dn_loss_iou: 0.3667 d1.dn_loss_cls: 0.1764 d1.dn_loss_bbox: 0.1969 d1.dn_loss_iou: 0.2519 d2.dn_loss_cls: 0.1565 d2.dn_loss_bbox: 0.1771 d2.dn_loss_iou: 0.2321 d3.dn_loss_cls: 0.1465 d3.dn_loss_bbox: 0.1688 d3.dn_loss_iou: 0.2252 d4.dn_loss_cls: 0.1405 d4.dn_loss_bbox: 0.1676 d4.dn_loss_iou: 0.2230 d1.loss_lmm_region: 0.1850 loss_lmm_image: 0.8840 2024/11/12 01:57:06 - mmengine - INFO - Iter(train) [ 67900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:47:30 time: 2.0140 data_time: 0.0177 memory: 34373 grad_norm: 34.5184 loss: 9.3282 loss_cls: 0.3065 loss_bbox: 0.1180 loss_iou: 0.2632 d0.loss_cls: 0.3669 d0.loss_bbox: 0.1301 d0.loss_iou: 0.2768 d1.loss_cls: 0.3318 d1.loss_bbox: 0.1242 d1.loss_iou: 0.2703 d2.loss_cls: 0.3177 d2.loss_bbox: 0.1216 d2.loss_iou: 0.2688 d3.loss_cls: 0.3108 d3.loss_bbox: 0.1202 d3.loss_iou: 0.2658 d4.loss_cls: 0.3083 d4.loss_bbox: 0.1171 d4.loss_iou: 0.2640 enc_loss_cls: 0.3651 enc_loss_bbox: 0.1470 enc_loss_iou: 0.3000 dn_loss_cls: 0.1209 dn_loss_bbox: 0.1293 dn_loss_iou: 0.2012 d0.dn_loss_cls: 0.2018 d0.dn_loss_bbox: 0.2906 d0.dn_loss_iou: 0.3495 d1.dn_loss_cls: 0.1490 d1.dn_loss_bbox: 0.1652 d1.dn_loss_iou: 0.2335 d2.dn_loss_cls: 0.1315 d2.dn_loss_bbox: 0.1417 d2.dn_loss_iou: 0.2115 d3.dn_loss_cls: 0.1268 d3.dn_loss_bbox: 0.1323 d3.dn_loss_iou: 0.2038 d4.dn_loss_cls: 0.1193 d4.dn_loss_bbox: 0.1293 d4.dn_loss_iou: 0.2012 d1.loss_lmm_region: 0.1181 loss_lmm_image: 0.8777 2024/11/12 02:00:26 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 02:00:26 - mmengine - INFO - Iter(train) [ 68000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:44:09 time: 2.0136 data_time: 0.0178 memory: 35099 grad_norm: 26.3313 loss: 10.4767 loss_cls: 0.3291 loss_bbox: 0.1668 loss_iou: 0.2974 d0.loss_cls: 0.3735 d0.loss_bbox: 0.1756 d0.loss_iou: 0.3100 d1.loss_cls: 0.3403 d1.loss_bbox: 0.1755 d1.loss_iou: 0.3039 d2.loss_cls: 0.3359 d2.loss_bbox: 0.1711 d2.loss_iou: 0.3008 d3.loss_cls: 0.3283 d3.loss_bbox: 0.1694 d3.loss_iou: 0.2991 d4.loss_cls: 0.3228 d4.loss_bbox: 0.1709 d4.loss_iou: 0.3004 enc_loss_cls: 0.3787 enc_loss_bbox: 0.1983 enc_loss_iou: 0.3357 dn_loss_cls: 0.1100 dn_loss_bbox: 0.1777 dn_loss_iou: 0.2417 d0.dn_loss_cls: 0.1952 d0.dn_loss_bbox: 0.3205 d0.dn_loss_iou: 0.3818 d1.dn_loss_cls: 0.1432 d1.dn_loss_bbox: 0.2108 d1.dn_loss_iou: 0.2715 d2.dn_loss_cls: 0.1226 d2.dn_loss_bbox: 0.1875 d2.dn_loss_iou: 0.2495 d3.dn_loss_cls: 0.1147 d3.dn_loss_bbox: 0.1804 d3.dn_loss_iou: 0.2437 d4.dn_loss_cls: 0.1105 d4.dn_loss_bbox: 0.1777 d4.dn_loss_iou: 0.2416 d1.loss_lmm_region: 0.1660 loss_lmm_image: 0.8468 2024/11/12 02:03:43 - mmengine - INFO - Iter(train) [ 68100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:40:43 time: 1.9810 data_time: 0.0178 memory: 34197 grad_norm: 30.2754 loss: 9.4625 loss_cls: 0.2858 loss_bbox: 0.1391 loss_iou: 0.2441 d0.loss_cls: 0.3309 d0.loss_bbox: 0.1568 d0.loss_iou: 0.2600 d1.loss_cls: 0.3084 d1.loss_bbox: 0.1460 d1.loss_iou: 0.2492 d2.loss_cls: 0.2959 d2.loss_bbox: 0.1426 d2.loss_iou: 0.2425 d3.loss_cls: 0.2860 d3.loss_bbox: 0.1395 d3.loss_iou: 0.2457 d4.loss_cls: 0.2855 d4.loss_bbox: 0.1376 d4.loss_iou: 0.2444 enc_loss_cls: 0.3329 enc_loss_bbox: 0.1714 enc_loss_iou: 0.2842 dn_loss_cls: 0.1022 dn_loss_bbox: 0.1752 dn_loss_iou: 0.2187 d0.dn_loss_cls: 0.1927 d0.dn_loss_bbox: 0.3552 d0.dn_loss_iou: 0.3750 d1.dn_loss_cls: 0.1340 
d1.dn_loss_bbox: 0.2155 d1.dn_loss_iou: 0.2535 d2.dn_loss_cls: 0.1117 d2.dn_loss_bbox: 0.1872 d2.dn_loss_iou: 0.2294 d3.dn_loss_cls: 0.1047 d3.dn_loss_bbox: 0.1786 d3.dn_loss_iou: 0.2219 d4.dn_loss_cls: 0.1021 d4.dn_loss_bbox: 0.1750 d4.dn_loss_iou: 0.2186 d1.loss_lmm_region: 0.1446 loss_lmm_image: 0.8384 2024/11/12 02:07:01 - mmengine - INFO - Iter(train) [ 68200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:37:19 time: 1.9710 data_time: 0.0178 memory: 34819 grad_norm: 28.9252 loss: 10.9176 loss_cls: 0.3830 loss_bbox: 0.1609 loss_iou: 0.2918 d0.loss_cls: 0.4311 d0.loss_bbox: 0.1736 d0.loss_iou: 0.3057 d1.loss_cls: 0.4023 d1.loss_bbox: 0.1660 d1.loss_iou: 0.2978 d2.loss_cls: 0.3903 d2.loss_bbox: 0.1636 d2.loss_iou: 0.2930 d3.loss_cls: 0.3876 d3.loss_bbox: 0.1575 d3.loss_iou: 0.2914 d4.loss_cls: 0.3832 d4.loss_bbox: 0.1601 d4.loss_iou: 0.2912 enc_loss_cls: 0.4265 enc_loss_bbox: 0.1952 enc_loss_iou: 0.3292 dn_loss_cls: 0.1385 dn_loss_bbox: 0.1847 dn_loss_iou: 0.2281 d0.dn_loss_cls: 0.2277 d0.dn_loss_bbox: 0.3335 d0.dn_loss_iou: 0.3659 d1.dn_loss_cls: 0.1688 d1.dn_loss_bbox: 0.2173 d1.dn_loss_iou: 0.2575 d2.dn_loss_cls: 0.1489 d2.dn_loss_bbox: 0.1966 d2.dn_loss_iou: 0.2382 d3.dn_loss_cls: 0.1416 d3.dn_loss_bbox: 0.1871 d3.dn_loss_iou: 0.2311 d4.dn_loss_cls: 0.1394 d4.dn_loss_bbox: 0.1846 d4.dn_loss_iou: 0.2281 d1.loss_lmm_region: 0.1570 loss_lmm_image: 0.8622 2024/11/12 02:10:19 - mmengine - INFO - Iter(train) [ 68300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:33:55 time: 1.9665 data_time: 0.0179 memory: 35118 grad_norm: 33.3646 loss: 9.9143 loss_cls: 0.2900 loss_bbox: 0.1582 loss_iou: 0.2481 d0.loss_cls: 0.3225 d0.loss_bbox: 0.1725 d0.loss_iou: 0.2660 d1.loss_cls: 0.3074 d1.loss_bbox: 0.1608 d1.loss_iou: 0.2526 d2.loss_cls: 0.2939 d2.loss_bbox: 0.1633 d2.loss_iou: 0.2542 d3.loss_cls: 0.2891 d3.loss_bbox: 0.1625 d3.loss_iou: 0.2529 d4.loss_cls: 0.2867 d4.loss_bbox: 0.1611 d4.loss_iou: 0.2494 enc_loss_cls: 0.3310 enc_loss_bbox: 0.1842 enc_loss_iou: 0.2823 dn_loss_cls: 0.1275 dn_loss_bbox: 0.1886 dn_loss_iou: 0.2295 d0.dn_loss_cls: 0.2048 d0.dn_loss_bbox: 0.3630 d0.dn_loss_iou: 0.3775 d1.dn_loss_cls: 0.1529 d1.dn_loss_bbox: 0.2300 d1.dn_loss_iou: 0.2616 d2.dn_loss_cls: 0.1375 d2.dn_loss_bbox: 0.2034 d2.dn_loss_iou: 0.2402 d3.dn_loss_cls: 0.1306 d3.dn_loss_bbox: 0.1910 d3.dn_loss_iou: 0.2318 d4.dn_loss_cls: 0.1266 d4.dn_loss_bbox: 0.1885 d4.dn_loss_iou: 0.2294 d1.loss_lmm_region: 0.1606 loss_lmm_image: 0.8507 2024/11/12 02:13:37 - mmengine - INFO - Iter(train) [ 68400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:30:30 time: 1.9735 data_time: 0.0179 memory: 33803 grad_norm: 28.6527 loss: 9.4106 loss_cls: 0.3017 loss_bbox: 0.1262 loss_iou: 0.2259 d0.loss_cls: 0.3348 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2452 d1.loss_cls: 0.3162 d1.loss_bbox: 0.1323 d1.loss_iou: 0.2333 d2.loss_cls: 0.3116 d2.loss_bbox: 0.1256 d2.loss_iou: 0.2308 d3.loss_cls: 0.3034 d3.loss_bbox: 0.1300 d3.loss_iou: 0.2292 d4.loss_cls: 0.2984 d4.loss_bbox: 0.1293 d4.loss_iou: 0.2276 enc_loss_cls: 0.3477 enc_loss_bbox: 0.1547 enc_loss_iou: 0.2672 dn_loss_cls: 0.1225 dn_loss_bbox: 0.1742 dn_loss_iou: 0.2202 d0.dn_loss_cls: 0.1999 d0.dn_loss_bbox: 0.3205 d0.dn_loss_iou: 0.3628 d1.dn_loss_cls: 0.1539 d1.dn_loss_bbox: 0.2085 d1.dn_loss_iou: 0.2511 d2.dn_loss_cls: 0.1346 d2.dn_loss_bbox: 0.1880 d2.dn_loss_iou: 0.2310 d3.dn_loss_cls: 0.1270 d3.dn_loss_bbox: 0.1786 d3.dn_loss_iou: 0.2235 d4.dn_loss_cls: 0.1225 d4.dn_loss_bbox: 0.1743 d4.dn_loss_iou: 0.2204 d1.loss_lmm_region: 0.1362 loss_lmm_image: 
0.8440 2024/11/12 02:16:57 - mmengine - INFO - Iter(train) [ 68500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:27:09 time: 1.9768 data_time: 0.0179 memory: 33442 grad_norm: 27.9649 loss: 9.5660 loss_cls: 0.2999 loss_bbox: 0.1378 loss_iou: 0.2455 d0.loss_cls: 0.3513 d0.loss_bbox: 0.1556 d0.loss_iou: 0.2574 d1.loss_cls: 0.3187 d1.loss_bbox: 0.1485 d1.loss_iou: 0.2528 d2.loss_cls: 0.3081 d2.loss_bbox: 0.1496 d2.loss_iou: 0.2513 d3.loss_cls: 0.3041 d3.loss_bbox: 0.1396 d3.loss_iou: 0.2453 d4.loss_cls: 0.2977 d4.loss_bbox: 0.1402 d4.loss_iou: 0.2458 enc_loss_cls: 0.3521 enc_loss_bbox: 0.1638 enc_loss_iou: 0.2781 dn_loss_cls: 0.1110 dn_loss_bbox: 0.1611 dn_loss_iou: 0.2206 d0.dn_loss_cls: 0.2019 d0.dn_loss_bbox: 0.3091 d0.dn_loss_iou: 0.3701 d1.dn_loss_cls: 0.1451 d1.dn_loss_bbox: 0.1944 d1.dn_loss_iou: 0.2521 d2.dn_loss_cls: 0.1258 d2.dn_loss_bbox: 0.1686 d2.dn_loss_iou: 0.2287 d3.dn_loss_cls: 0.1176 d3.dn_loss_bbox: 0.1614 d3.dn_loss_iou: 0.2218 d4.dn_loss_cls: 0.1117 d4.dn_loss_bbox: 0.1612 d4.dn_loss_iou: 0.2205 d1.loss_lmm_region: 0.1778 loss_lmm_image: 0.8619 2024/11/12 02:20:18 - mmengine - INFO - Iter(train) [ 68600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:23:49 time: 2.0199 data_time: 0.0179 memory: 34213 grad_norm: 28.1701 loss: 9.3543 loss_cls: 0.3191 loss_bbox: 0.1285 loss_iou: 0.2345 d0.loss_cls: 0.3560 d0.loss_bbox: 0.1438 d0.loss_iou: 0.2518 d1.loss_cls: 0.3365 d1.loss_bbox: 0.1339 d1.loss_iou: 0.2435 d2.loss_cls: 0.3304 d2.loss_bbox: 0.1306 d2.loss_iou: 0.2402 d3.loss_cls: 0.3280 d3.loss_bbox: 0.1276 d3.loss_iou: 0.2338 d4.loss_cls: 0.3227 d4.loss_bbox: 0.1279 d4.loss_iou: 0.2352 enc_loss_cls: 0.3580 enc_loss_bbox: 0.1561 enc_loss_iou: 0.2753 dn_loss_cls: 0.1013 dn_loss_bbox: 0.1543 dn_loss_iou: 0.2150 d0.dn_loss_cls: 0.1803 d0.dn_loss_bbox: 0.2944 d0.dn_loss_iou: 0.3601 d1.dn_loss_cls: 0.1358 d1.dn_loss_bbox: 0.1862 d1.dn_loss_iou: 0.2474 d2.dn_loss_cls: 0.1147 d2.dn_loss_bbox: 0.1645 d2.dn_loss_iou: 0.2259 d3.dn_loss_cls: 0.1074 d3.dn_loss_bbox: 0.1566 d3.dn_loss_iou: 0.2182 d4.dn_loss_cls: 0.1030 d4.dn_loss_bbox: 0.1543 d4.dn_loss_iou: 0.2151 d1.loss_lmm_region: 0.1254 loss_lmm_image: 0.8809 2024/11/12 02:23:36 - mmengine - INFO - Iter(train) [ 68700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:20:24 time: 1.9817 data_time: 0.0181 memory: 33601 grad_norm: 39.7652 loss: 8.8215 loss_cls: 0.2784 loss_bbox: 0.1260 loss_iou: 0.2149 d0.loss_cls: 0.3157 d0.loss_bbox: 0.1343 d0.loss_iou: 0.2295 d1.loss_cls: 0.2905 d1.loss_bbox: 0.1312 d1.loss_iou: 0.2207 d2.loss_cls: 0.2859 d2.loss_bbox: 0.1272 d2.loss_iou: 0.2167 d3.loss_cls: 0.2818 d3.loss_bbox: 0.1258 d3.loss_iou: 0.2153 d4.loss_cls: 0.2790 d4.loss_bbox: 0.1261 d4.loss_iou: 0.2180 enc_loss_cls: 0.3248 enc_loss_bbox: 0.1432 enc_loss_iou: 0.2441 dn_loss_cls: 0.1107 dn_loss_bbox: 0.1596 dn_loss_iou: 0.1918 d0.dn_loss_cls: 0.1906 d0.dn_loss_bbox: 0.2925 d0.dn_loss_iou: 0.3266 d1.dn_loss_cls: 0.1470 d1.dn_loss_bbox: 0.1892 d1.dn_loss_iou: 0.2216 d2.dn_loss_cls: 0.1231 d2.dn_loss_bbox: 0.1705 d2.dn_loss_iou: 0.2010 d3.dn_loss_cls: 0.1168 d3.dn_loss_bbox: 0.1617 d3.dn_loss_iou: 0.1944 d4.dn_loss_cls: 0.1132 d4.dn_loss_bbox: 0.1595 d4.dn_loss_iou: 0.1917 d1.loss_lmm_region: 0.1474 loss_lmm_image: 0.8836 2024/11/12 02:26:54 - mmengine - INFO - Iter(train) [ 68800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:17:00 time: 1.9899 data_time: 0.0179 memory: 33779 grad_norm: 32.0797 loss: 9.8899 loss_cls: 0.2891 loss_bbox: 0.1392 loss_iou: 0.2497 d0.loss_cls: 0.3392 d0.loss_bbox: 0.1504 
d0.loss_iou: 0.2648 d1.loss_cls: 0.3152 d1.loss_bbox: 0.1405 d1.loss_iou: 0.2521 d2.loss_cls: 0.3027 d2.loss_bbox: 0.1389 d2.loss_iou: 0.2494 d3.loss_cls: 0.2961 d3.loss_bbox: 0.1387 d3.loss_iou: 0.2504 d4.loss_cls: 0.2885 d4.loss_bbox: 0.1403 d4.loss_iou: 0.2507 enc_loss_cls: 0.3378 enc_loss_bbox: 0.1642 enc_loss_iou: 0.2861 dn_loss_cls: 0.1128 dn_loss_bbox: 0.1986 dn_loss_iou: 0.2456 d0.dn_loss_cls: 0.2007 d0.dn_loss_bbox: 0.3506 d0.dn_loss_iou: 0.3929 d1.dn_loss_cls: 0.1501 d1.dn_loss_bbox: 0.2281 d1.dn_loss_iou: 0.2752 d2.dn_loss_cls: 0.1309 d2.dn_loss_bbox: 0.2072 d2.dn_loss_iou: 0.2547 d3.dn_loss_cls: 0.1219 d3.dn_loss_bbox: 0.2006 d3.dn_loss_iou: 0.2481 d4.dn_loss_cls: 0.1148 d4.dn_loss_bbox: 0.1986 d4.dn_loss_iou: 0.2457 d1.loss_lmm_region: 0.1422 loss_lmm_image: 0.8864 2024/11/12 02:30:12 - mmengine - INFO - Iter(train) [ 68900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:13:36 time: 1.9509 data_time: 0.0180 memory: 33479 grad_norm: 29.4447 loss: 9.0187 loss_cls: 0.2806 loss_bbox: 0.1184 loss_iou: 0.2076 d0.loss_cls: 0.3245 d0.loss_bbox: 0.1337 d0.loss_iou: 0.2282 d1.loss_cls: 0.2988 d1.loss_bbox: 0.1214 d1.loss_iou: 0.2122 d2.loss_cls: 0.2903 d2.loss_bbox: 0.1164 d2.loss_iou: 0.2083 d3.loss_cls: 0.2815 d3.loss_bbox: 0.1197 d3.loss_iou: 0.2105 d4.loss_cls: 0.2818 d4.loss_bbox: 0.1175 d4.loss_iou: 0.2078 enc_loss_cls: 0.3389 enc_loss_bbox: 0.1427 enc_loss_iou: 0.2449 dn_loss_cls: 0.1162 dn_loss_bbox: 0.1766 dn_loss_iou: 0.2086 d0.dn_loss_cls: 0.1933 d0.dn_loss_bbox: 0.3301 d0.dn_loss_iou: 0.3491 d1.dn_loss_cls: 0.1482 d1.dn_loss_bbox: 0.2083 d1.dn_loss_iou: 0.2389 d2.dn_loss_cls: 0.1265 d2.dn_loss_bbox: 0.1863 d2.dn_loss_iou: 0.2176 d3.dn_loss_cls: 0.1210 d3.dn_loss_bbox: 0.1791 d3.dn_loss_iou: 0.2118 d4.dn_loss_cls: 0.1186 d4.dn_loss_bbox: 0.1767 d4.dn_loss_iou: 0.2086 d1.loss_lmm_region: 0.1376 loss_lmm_image: 0.8797 2024/11/12 02:33:29 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 02:33:29 - mmengine - INFO - Iter(train) [ 69000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:10:11 time: 1.9804 data_time: 0.0178 memory: 35107 grad_norm: 28.5229 loss: 9.9676 loss_cls: 0.2988 loss_bbox: 0.1457 loss_iou: 0.2605 d0.loss_cls: 0.3404 d0.loss_bbox: 0.1606 d0.loss_iou: 0.2759 d1.loss_cls: 0.3200 d1.loss_bbox: 0.1500 d1.loss_iou: 0.2633 d2.loss_cls: 0.3086 d2.loss_bbox: 0.1467 d2.loss_iou: 0.2609 d3.loss_cls: 0.3027 d3.loss_bbox: 0.1494 d3.loss_iou: 0.2612 d4.loss_cls: 0.2990 d4.loss_bbox: 0.1455 d4.loss_iou: 0.2613 enc_loss_cls: 0.3422 enc_loss_bbox: 0.1783 enc_loss_iou: 0.3020 dn_loss_cls: 0.1264 dn_loss_bbox: 0.1753 dn_loss_iou: 0.2425 d0.dn_loss_cls: 0.2077 d0.dn_loss_bbox: 0.3310 d0.dn_loss_iou: 0.3973 d1.dn_loss_cls: 0.1593 d1.dn_loss_bbox: 0.2095 d1.dn_loss_iou: 0.2758 d2.dn_loss_cls: 0.1400 d2.dn_loss_bbox: 0.1844 d2.dn_loss_iou: 0.2527 d3.dn_loss_cls: 0.1319 d3.dn_loss_bbox: 0.1779 d3.dn_loss_iou: 0.2449 d4.dn_loss_cls: 0.1268 d4.dn_loss_bbox: 0.1753 d4.dn_loss_iou: 0.2425 d1.loss_lmm_region: 0.1497 loss_lmm_image: 0.8434 2024/11/12 02:36:49 - mmengine - INFO - Iter(train) [ 69100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:06:50 time: 2.0049 data_time: 0.0179 memory: 34459 grad_norm: 31.9876 loss: 9.1024 loss_cls: 0.2562 loss_bbox: 0.1174 loss_iou: 0.2112 d0.loss_cls: 0.2905 d0.loss_bbox: 0.1263 d0.loss_iou: 0.2218 d1.loss_cls: 0.2691 d1.loss_bbox: 0.1216 d1.loss_iou: 0.2124 d2.loss_cls: 0.2632 d2.loss_bbox: 0.1161 d2.loss_iou: 0.2085 d3.loss_cls: 0.2595 d3.loss_bbox: 0.1151 d3.loss_iou: 0.2090 d4.loss_cls: 
0.2558 d4.loss_bbox: 0.1173 d4.loss_iou: 0.2111 enc_loss_cls: 0.2921 enc_loss_bbox: 0.1462 enc_loss_iou: 0.2447 dn_loss_cls: 0.1527 dn_loss_bbox: 0.1756 dn_loss_iou: 0.2193 d0.dn_loss_cls: 0.2395 d0.dn_loss_bbox: 0.3405 d0.dn_loss_iou: 0.3720 d1.dn_loss_cls: 0.1883 d1.dn_loss_bbox: 0.2082 d1.dn_loss_iou: 0.2487 d2.dn_loss_cls: 0.1646 d2.dn_loss_bbox: 0.1867 d2.dn_loss_iou: 0.2270 d3.dn_loss_cls: 0.1625 d3.dn_loss_bbox: 0.1780 d3.dn_loss_iou: 0.2214 d4.dn_loss_cls: 0.1533 d4.dn_loss_bbox: 0.1757 d4.dn_loss_iou: 0.2193 d1.loss_lmm_region: 0.1422 loss_lmm_image: 0.8621 2024/11/12 02:40:09 - mmengine - INFO - Iter(train) [ 69200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:03:28 time: 1.9730 data_time: 0.0181 memory: 30665 grad_norm: 35.1263 loss: 9.2956 loss_cls: 0.2782 loss_bbox: 0.1314 loss_iou: 0.2367 d0.loss_cls: 0.3303 d0.loss_bbox: 0.1446 d0.loss_iou: 0.2544 d1.loss_cls: 0.2996 d1.loss_bbox: 0.1375 d1.loss_iou: 0.2407 d2.loss_cls: 0.2849 d2.loss_bbox: 0.1350 d2.loss_iou: 0.2410 d3.loss_cls: 0.2810 d3.loss_bbox: 0.1327 d3.loss_iou: 0.2378 d4.loss_cls: 0.2791 d4.loss_bbox: 0.1314 d4.loss_iou: 0.2371 enc_loss_cls: 0.3283 enc_loss_bbox: 0.1670 enc_loss_iou: 0.2723 dn_loss_cls: 0.0977 dn_loss_bbox: 0.1833 dn_loss_iou: 0.2101 d0.dn_loss_cls: 0.1862 d0.dn_loss_bbox: 0.3425 d0.dn_loss_iou: 0.3491 d1.dn_loss_cls: 0.1321 d1.dn_loss_bbox: 0.2243 d1.dn_loss_iou: 0.2420 d2.dn_loss_cls: 0.1118 d2.dn_loss_bbox: 0.1945 d2.dn_loss_iou: 0.2197 d3.dn_loss_cls: 0.1049 d3.dn_loss_bbox: 0.1862 d3.dn_loss_iou: 0.2130 d4.dn_loss_cls: 0.1002 d4.dn_loss_bbox: 0.1833 d4.dn_loss_iou: 0.2100 d1.loss_lmm_region: 0.1433 loss_lmm_image: 0.8803 2024/11/12 02:43:28 - mmengine - INFO - Iter(train) [ 69300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 21:00:05 time: 1.9869 data_time: 0.0179 memory: 33415 grad_norm: 27.9900 loss: 9.7492 loss_cls: 0.3204 loss_bbox: 0.1348 loss_iou: 0.2405 d0.loss_cls: 0.3481 d0.loss_bbox: 0.1615 d0.loss_iou: 0.2598 d1.loss_cls: 0.3280 d1.loss_bbox: 0.1483 d1.loss_iou: 0.2472 d2.loss_cls: 0.3321 d2.loss_bbox: 0.1351 d2.loss_iou: 0.2392 d3.loss_cls: 0.3234 d3.loss_bbox: 0.1374 d3.loss_iou: 0.2445 d4.loss_cls: 0.3211 d4.loss_bbox: 0.1339 d4.loss_iou: 0.2395 enc_loss_cls: 0.3582 enc_loss_bbox: 0.1698 enc_loss_iou: 0.2790 dn_loss_cls: 0.1318 dn_loss_bbox: 0.1668 dn_loss_iou: 0.2081 d0.dn_loss_cls: 0.2221 d0.dn_loss_bbox: 0.3297 d0.dn_loss_iou: 0.3515 d1.dn_loss_cls: 0.1710 d1.dn_loss_bbox: 0.2083 d1.dn_loss_iou: 0.2418 d2.dn_loss_cls: 0.1551 d2.dn_loss_bbox: 0.1793 d2.dn_loss_iou: 0.2183 d3.dn_loss_cls: 0.1429 d3.dn_loss_bbox: 0.1679 d3.dn_loss_iou: 0.2105 d4.dn_loss_cls: 0.1323 d4.dn_loss_bbox: 0.1669 d4.dn_loss_iou: 0.2082 d1.loss_lmm_region: 0.1510 loss_lmm_image: 0.8839 2024/11/12 02:46:46 - mmengine - INFO - Iter(train) [ 69400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:56:41 time: 2.0008 data_time: 0.0178 memory: 34915 grad_norm: 30.3769 loss: 8.3788 loss_cls: 0.2480 loss_bbox: 0.1173 loss_iou: 0.2145 d0.loss_cls: 0.2952 d0.loss_bbox: 0.1263 d0.loss_iou: 0.2294 d1.loss_cls: 0.2699 d1.loss_bbox: 0.1196 d1.loss_iou: 0.2210 d2.loss_cls: 0.2631 d2.loss_bbox: 0.1180 d2.loss_iou: 0.2154 d3.loss_cls: 0.2562 d3.loss_bbox: 0.1159 d3.loss_iou: 0.2140 d4.loss_cls: 0.2495 d4.loss_bbox: 0.1171 d4.loss_iou: 0.2141 enc_loss_cls: 0.2963 enc_loss_bbox: 0.1425 enc_loss_iou: 0.2576 dn_loss_cls: 0.0887 dn_loss_bbox: 0.1398 dn_loss_iou: 0.2003 d0.dn_loss_cls: 0.1607 d0.dn_loss_bbox: 0.2837 d0.dn_loss_iou: 0.3435 d1.dn_loss_cls: 0.1136 d1.dn_loss_bbox: 0.1743 d1.dn_loss_iou: 
0.2340 d2.dn_loss_cls: 0.0944 d2.dn_loss_bbox: 0.1532 d2.dn_loss_iou: 0.2115 d3.dn_loss_cls: 0.0926 d3.dn_loss_bbox: 0.1435 d3.dn_loss_iou: 0.2034 d4.dn_loss_cls: 0.0875 d4.dn_loss_bbox: 0.1399 d4.dn_loss_iou: 0.2002 d1.loss_lmm_region: 0.1282 loss_lmm_image: 0.8846 2024/11/12 02:50:05 - mmengine - INFO - Iter(train) [ 69500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:53:18 time: 1.9952 data_time: 0.0182 memory: 35891 grad_norm: 29.2977 loss: 9.7255 loss_cls: 0.2548 loss_bbox: 0.1492 loss_iou: 0.2127 d0.loss_cls: 0.2963 d0.loss_bbox: 0.1579 d0.loss_iou: 0.2193 d1.loss_cls: 0.2747 d1.loss_bbox: 0.1540 d1.loss_iou: 0.2124 d2.loss_cls: 0.2664 d2.loss_bbox: 0.1468 d2.loss_iou: 0.2097 d3.loss_cls: 0.2576 d3.loss_bbox: 0.1476 d3.loss_iou: 0.2119 d4.loss_cls: 0.2571 d4.loss_bbox: 0.1465 d4.loss_iou: 0.2100 enc_loss_cls: 0.3049 enc_loss_bbox: 0.1701 enc_loss_iou: 0.2360 dn_loss_cls: 0.1380 dn_loss_bbox: 0.2307 dn_loss_iou: 0.2318 d0.dn_loss_cls: 0.2066 d0.dn_loss_bbox: 0.4165 d0.dn_loss_iou: 0.3843 d1.dn_loss_cls: 0.1621 d1.dn_loss_bbox: 0.2727 d1.dn_loss_iou: 0.2663 d2.dn_loss_cls: 0.1476 d2.dn_loss_bbox: 0.2416 d2.dn_loss_iou: 0.2415 d3.dn_loss_cls: 0.1437 d3.dn_loss_bbox: 0.2330 d3.dn_loss_iou: 0.2346 d4.dn_loss_cls: 0.1383 d4.dn_loss_bbox: 0.2306 d4.dn_loss_iou: 0.2317 d1.loss_lmm_region: 0.1463 loss_lmm_image: 0.9314 2024/11/12 02:53:22 - mmengine - INFO - Iter(train) [ 69600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:49:54 time: 1.9899 data_time: 0.0181 memory: 34435 grad_norm: 30.8667 loss: 10.4301 loss_cls: 0.3415 loss_bbox: 0.1629 loss_iou: 0.2950 d0.loss_cls: 0.3787 d0.loss_bbox: 0.1856 d0.loss_iou: 0.3162 d1.loss_cls: 0.3665 d1.loss_bbox: 0.1652 d1.loss_iou: 0.2930 d2.loss_cls: 0.3537 d2.loss_bbox: 0.1646 d2.loss_iou: 0.2942 d3.loss_cls: 0.3460 d3.loss_bbox: 0.1614 d3.loss_iou: 0.2933 d4.loss_cls: 0.3415 d4.loss_bbox: 0.1630 d4.loss_iou: 0.2942 enc_loss_cls: 0.3907 enc_loss_bbox: 0.1929 enc_loss_iou: 0.3277 dn_loss_cls: 0.1073 dn_loss_bbox: 0.1751 dn_loss_iou: 0.2427 d0.dn_loss_cls: 0.1802 d0.dn_loss_bbox: 0.3180 d0.dn_loss_iou: 0.3784 d1.dn_loss_cls: 0.1384 d1.dn_loss_bbox: 0.2027 d1.dn_loss_iou: 0.2694 d2.dn_loss_cls: 0.1194 d2.dn_loss_bbox: 0.1834 d2.dn_loss_iou: 0.2503 d3.dn_loss_cls: 0.1117 d3.dn_loss_bbox: 0.1764 d3.dn_loss_iou: 0.2445 d4.dn_loss_cls: 0.1083 d4.dn_loss_bbox: 0.1751 d4.dn_loss_iou: 0.2426 d1.loss_lmm_region: 0.1240 loss_lmm_image: 0.8543 2024/11/12 02:56:40 - mmengine - INFO - Iter(train) [ 69700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:46:29 time: 1.9783 data_time: 0.0180 memory: 34059 grad_norm: 30.3404 loss: 9.5546 loss_cls: 0.2746 loss_bbox: 0.1358 loss_iou: 0.2314 d0.loss_cls: 0.3180 d0.loss_bbox: 0.1494 d0.loss_iou: 0.2473 d1.loss_cls: 0.2966 d1.loss_bbox: 0.1402 d1.loss_iou: 0.2399 d2.loss_cls: 0.2850 d2.loss_bbox: 0.1388 d2.loss_iou: 0.2331 d3.loss_cls: 0.2785 d3.loss_bbox: 0.1344 d3.loss_iou: 0.2317 d4.loss_cls: 0.2763 d4.loss_bbox: 0.1352 d4.loss_iou: 0.2321 enc_loss_cls: 0.3221 enc_loss_bbox: 0.1657 enc_loss_iou: 0.2721 dn_loss_cls: 0.1547 dn_loss_bbox: 0.1782 dn_loss_iou: 0.2150 d0.dn_loss_cls: 0.2353 d0.dn_loss_bbox: 0.3282 d0.dn_loss_iou: 0.3631 d1.dn_loss_cls: 0.1839 d1.dn_loss_bbox: 0.2092 d1.dn_loss_iou: 0.2470 d2.dn_loss_cls: 0.1643 d2.dn_loss_bbox: 0.1917 d2.dn_loss_iou: 0.2265 d3.dn_loss_cls: 0.1585 d3.dn_loss_bbox: 0.1812 d3.dn_loss_iou: 0.2186 d4.dn_loss_cls: 0.1540 d4.dn_loss_bbox: 0.1782 d4.dn_loss_iou: 0.2150 d1.loss_lmm_region: 0.1581 loss_lmm_image: 0.8559 2024/11/12 02:57:43 - mmengine - INFO 
- Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 02:57:43 - mmengine - WARNING - Reach the end of the dataloader, it will be restarted and continue to iterate. It is recommended to use `mmengine.dataset.InfiniteSampler` to enable the dataloader to iterate infinitely. 2024/11/12 03:00:01 - mmengine - INFO - Iter(train) [ 69800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:43:09 time: 1.9639 data_time: 0.0180 memory: 34901 grad_norm: 28.9661 loss: 10.0151 loss_cls: 0.3360 loss_bbox: 0.1412 loss_iou: 0.2836 d0.loss_cls: 0.3809 d0.loss_bbox: 0.1568 d0.loss_iou: 0.3028 d1.loss_cls: 0.3583 d1.loss_bbox: 0.1456 d1.loss_iou: 0.2910 d2.loss_cls: 0.3499 d2.loss_bbox: 0.1449 d2.loss_iou: 0.2867 d3.loss_cls: 0.3431 d3.loss_bbox: 0.1393 d3.loss_iou: 0.2821 d4.loss_cls: 0.3370 d4.loss_bbox: 0.1395 d4.loss_iou: 0.2825 enc_loss_cls: 0.3832 enc_loss_bbox: 0.1702 enc_loss_iou: 0.3292 dn_loss_cls: 0.1082 dn_loss_bbox: 0.1626 dn_loss_iou: 0.2128 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.3107 d0.dn_loss_iou: 0.3526 d1.dn_loss_cls: 0.1361 d1.dn_loss_bbox: 0.1927 d1.dn_loss_iou: 0.2404 d2.dn_loss_cls: 0.1167 d2.dn_loss_bbox: 0.1705 d2.dn_loss_iou: 0.2211 d3.dn_loss_cls: 0.1091 d3.dn_loss_bbox: 0.1635 d3.dn_loss_iou: 0.2144 d4.dn_loss_cls: 0.1070 d4.dn_loss_bbox: 0.1626 d4.dn_loss_iou: 0.2127 d1.loss_lmm_region: 0.1466 loss_lmm_image: 0.9080 2024/11/12 03:03:21 - mmengine - INFO - Iter(train) [ 69900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:39:47 time: 1.9793 data_time: 0.0180 memory: 33499 grad_norm: 26.5443 loss: 9.4002 loss_cls: 0.2908 loss_bbox: 0.1449 loss_iou: 0.2798 d0.loss_cls: 0.3327 d0.loss_bbox: 0.1563 d0.loss_iou: 0.2946 d1.loss_cls: 0.3048 d1.loss_bbox: 0.1523 d1.loss_iou: 0.2907 d2.loss_cls: 0.3011 d2.loss_bbox: 0.1469 d2.loss_iou: 0.2824 d3.loss_cls: 0.2968 d3.loss_bbox: 0.1449 d3.loss_iou: 0.2801 d4.loss_cls: 0.2940 d4.loss_bbox: 0.1428 d4.loss_iou: 0.2781 enc_loss_cls: 0.3356 enc_loss_bbox: 0.1740 enc_loss_iou: 0.3236 dn_loss_cls: 0.0959 dn_loss_bbox: 0.1484 dn_loss_iou: 0.2174 d0.dn_loss_cls: 0.1690 d0.dn_loss_bbox: 0.2800 d0.dn_loss_iou: 0.3563 d1.dn_loss_cls: 0.1211 d1.dn_loss_bbox: 0.1793 d1.dn_loss_iou: 0.2485 d2.dn_loss_cls: 0.1069 d2.dn_loss_bbox: 0.1576 d2.dn_loss_iou: 0.2263 d3.dn_loss_cls: 0.0990 d3.dn_loss_bbox: 0.1503 d3.dn_loss_iou: 0.2198 d4.dn_loss_cls: 0.0964 d4.dn_loss_bbox: 0.1484 d4.dn_loss_iou: 0.2174 d1.loss_lmm_region: 0.1098 loss_lmm_image: 0.8048 2024/11/12 03:06:40 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 03:06:40 - mmengine - INFO - Iter(train) [ 70000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:36:25 time: 1.9965 data_time: 0.0181 memory: 33877 grad_norm: 32.4748 loss: 9.6145 loss_cls: 0.2946 loss_bbox: 0.1397 loss_iou: 0.2613 d0.loss_cls: 0.3447 d0.loss_bbox: 0.1524 d0.loss_iou: 0.2759 d1.loss_cls: 0.3214 d1.loss_bbox: 0.1403 d1.loss_iou: 0.2629 d2.loss_cls: 0.3054 d2.loss_bbox: 0.1438 d2.loss_iou: 0.2642 d3.loss_cls: 0.2968 d3.loss_bbox: 0.1417 d3.loss_iou: 0.2631 d4.loss_cls: 0.2941 d4.loss_bbox: 0.1406 d4.loss_iou: 0.2615 enc_loss_cls: 0.3437 enc_loss_bbox: 0.1663 enc_loss_iou: 0.2984 dn_loss_cls: 0.1149 dn_loss_bbox: 0.1641 dn_loss_iou: 0.2227 d0.dn_loss_cls: 0.1822 d0.dn_loss_bbox: 0.3160 d0.dn_loss_iou: 0.3723 d1.dn_loss_cls: 0.1373 d1.dn_loss_bbox: 0.1983 d1.dn_loss_iou: 0.2564 d2.dn_loss_cls: 0.1237 d2.dn_loss_bbox: 0.1740 d2.dn_loss_iou: 0.2329 d3.dn_loss_cls: 0.1158 d3.dn_loss_bbox: 0.1662 d3.dn_loss_iou: 0.2254 d4.dn_loss_cls: 0.1138 d4.dn_loss_bbox: 0.1642 
d4.dn_loss_iou: 0.2228 d1.loss_lmm_region: 0.1131 loss_lmm_image: 0.8860 2024/11/12 03:09:59 - mmengine - INFO - Iter(train) [ 70100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:33:02 time: 1.9799 data_time: 0.0181 memory: 34227 grad_norm: 24.8130 loss: 10.4619 loss_cls: 0.3716 loss_bbox: 0.1608 loss_iou: 0.3075 d0.loss_cls: 0.4264 d0.loss_bbox: 0.1759 d0.loss_iou: 0.3238 d1.loss_cls: 0.3928 d1.loss_bbox: 0.1692 d1.loss_iou: 0.3139 d2.loss_cls: 0.3801 d2.loss_bbox: 0.1614 d2.loss_iou: 0.3066 d3.loss_cls: 0.3728 d3.loss_bbox: 0.1644 d3.loss_iou: 0.3082 d4.loss_cls: 0.3721 d4.loss_bbox: 0.1598 d4.loss_iou: 0.3066 enc_loss_cls: 0.4220 enc_loss_bbox: 0.1956 enc_loss_iou: 0.3505 dn_loss_cls: 0.0843 dn_loss_bbox: 0.1615 dn_loss_iou: 0.2218 d0.dn_loss_cls: 0.1656 d0.dn_loss_bbox: 0.2933 d0.dn_loss_iou: 0.3560 d1.dn_loss_cls: 0.1152 d1.dn_loss_bbox: 0.1899 d1.dn_loss_iou: 0.2499 d2.dn_loss_cls: 0.0950 d2.dn_loss_bbox: 0.1670 d2.dn_loss_iou: 0.2294 d3.dn_loss_cls: 0.0881 d3.dn_loss_bbox: 0.1627 d3.dn_loss_iou: 0.2238 d4.dn_loss_cls: 0.0847 d4.dn_loss_bbox: 0.1615 d4.dn_loss_iou: 0.2218 d1.loss_lmm_region: 0.1447 loss_lmm_image: 0.9036 2024/11/12 03:13:17 - mmengine - INFO - Iter(train) [ 70200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:29:38 time: 1.9810 data_time: 0.0181 memory: 34545 grad_norm: 27.3922 loss: 8.7942 loss_cls: 0.2697 loss_bbox: 0.1264 loss_iou: 0.2512 d0.loss_cls: 0.2960 d0.loss_bbox: 0.1405 d0.loss_iou: 0.2685 d1.loss_cls: 0.2808 d1.loss_bbox: 0.1318 d1.loss_iou: 0.2575 d2.loss_cls: 0.2749 d2.loss_bbox: 0.1302 d2.loss_iou: 0.2538 d3.loss_cls: 0.2687 d3.loss_bbox: 0.1273 d3.loss_iou: 0.2526 d4.loss_cls: 0.2724 d4.loss_bbox: 0.1256 d4.loss_iou: 0.2494 enc_loss_cls: 0.3075 enc_loss_bbox: 0.1530 enc_loss_iou: 0.2889 dn_loss_cls: 0.0780 dn_loss_bbox: 0.1430 dn_loss_iou: 0.1989 d0.dn_loss_cls: 0.1540 d0.dn_loss_bbox: 0.3020 d0.dn_loss_iou: 0.3429 d1.dn_loss_cls: 0.1101 d1.dn_loss_bbox: 0.1774 d1.dn_loss_iou: 0.2305 d2.dn_loss_cls: 0.0907 d2.dn_loss_bbox: 0.1546 d2.dn_loss_iou: 0.2100 d3.dn_loss_cls: 0.0830 d3.dn_loss_bbox: 0.1464 d3.dn_loss_iou: 0.2022 d4.dn_loss_cls: 0.0785 d4.dn_loss_bbox: 0.1432 d4.dn_loss_iou: 0.1989 d1.loss_lmm_region: 0.1236 loss_lmm_image: 0.8997 2024/11/12 03:16:38 - mmengine - INFO - Iter(train) [ 70300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:26:18 time: 2.0237 data_time: 0.0180 memory: 34898 grad_norm: 28.2078 loss: 10.6895 loss_cls: 0.3590 loss_bbox: 0.1689 loss_iou: 0.2785 d0.loss_cls: 0.4174 d0.loss_bbox: 0.1668 d0.loss_iou: 0.2826 d1.loss_cls: 0.3845 d1.loss_bbox: 0.1664 d1.loss_iou: 0.2792 d2.loss_cls: 0.3689 d2.loss_bbox: 0.1689 d2.loss_iou: 0.2775 d3.loss_cls: 0.3618 d3.loss_bbox: 0.1682 d3.loss_iou: 0.2798 d4.loss_cls: 0.3589 d4.loss_bbox: 0.1680 d4.loss_iou: 0.2795 enc_loss_cls: 0.4181 enc_loss_bbox: 0.1850 enc_loss_iou: 0.3097 dn_loss_cls: 0.1181 dn_loss_bbox: 0.1942 dn_loss_iou: 0.2305 d0.dn_loss_cls: 0.2081 d0.dn_loss_bbox: 0.3530 d0.dn_loss_iou: 0.3763 d1.dn_loss_cls: 0.1612 d1.dn_loss_bbox: 0.2302 d1.dn_loss_iou: 0.2633 d2.dn_loss_cls: 0.1348 d2.dn_loss_bbox: 0.2066 d2.dn_loss_iou: 0.2408 d3.dn_loss_cls: 0.1251 d3.dn_loss_bbox: 0.1972 d3.dn_loss_iou: 0.2332 d4.dn_loss_cls: 0.1213 d4.dn_loss_bbox: 0.1941 d4.dn_loss_iou: 0.2304 d1.loss_lmm_region: 0.1549 loss_lmm_image: 0.8687 2024/11/12 03:19:57 - mmengine - INFO - Iter(train) [ 70400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:22:55 time: 1.9809 data_time: 0.0180 memory: 35219 grad_norm: 27.0242 loss: 8.9422 loss_cls: 0.2588 loss_bbox: 
0.1224 loss_iou: 0.2347 d0.loss_cls: 0.3033 d0.loss_bbox: 0.1302 d0.loss_iou: 0.2478 d1.loss_cls: 0.2821 d1.loss_bbox: 0.1258 d1.loss_iou: 0.2372 d2.loss_cls: 0.2712 d2.loss_bbox: 0.1233 d2.loss_iou: 0.2330 d3.loss_cls: 0.2617 d3.loss_bbox: 0.1237 d3.loss_iou: 0.2353 d4.loss_cls: 0.2570 d4.loss_bbox: 0.1247 d4.loss_iou: 0.2356 enc_loss_cls: 0.2995 enc_loss_bbox: 0.1438 enc_loss_iou: 0.2654 dn_loss_cls: 0.1062 dn_loss_bbox: 0.1767 dn_loss_iou: 0.2127 d0.dn_loss_cls: 0.1866 d0.dn_loss_bbox: 0.3332 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1404 d1.dn_loss_bbox: 0.2091 d1.dn_loss_iou: 0.2446 d2.dn_loss_cls: 0.1180 d2.dn_loss_bbox: 0.1862 d2.dn_loss_iou: 0.2220 d3.dn_loss_cls: 0.1129 d3.dn_loss_bbox: 0.1793 d3.dn_loss_iou: 0.2156 d4.dn_loss_cls: 0.1077 d4.dn_loss_bbox: 0.1768 d4.dn_loss_iou: 0.2126 d1.loss_lmm_region: 0.1234 loss_lmm_image: 0.8072 2024/11/12 03:23:17 - mmengine - INFO - Iter(train) [ 70500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:19:33 time: 1.9947 data_time: 0.0181 memory: 33792 grad_norm: 25.4420 loss: 10.5932 loss_cls: 0.3438 loss_bbox: 0.1605 loss_iou: 0.2797 d0.loss_cls: 0.3950 d0.loss_bbox: 0.1747 d0.loss_iou: 0.2938 d1.loss_cls: 0.3633 d1.loss_bbox: 0.1641 d1.loss_iou: 0.2815 d2.loss_cls: 0.3513 d2.loss_bbox: 0.1638 d2.loss_iou: 0.2788 d3.loss_cls: 0.3490 d3.loss_bbox: 0.1590 d3.loss_iou: 0.2769 d4.loss_cls: 0.3474 d4.loss_bbox: 0.1598 d4.loss_iou: 0.2790 enc_loss_cls: 0.3960 enc_loss_bbox: 0.1993 enc_loss_iou: 0.3269 dn_loss_cls: 0.1120 dn_loss_bbox: 0.1952 dn_loss_iou: 0.2436 d0.dn_loss_cls: 0.1941 d0.dn_loss_bbox: 0.3579 d0.dn_loss_iou: 0.4031 d1.dn_loss_cls: 0.1398 d1.dn_loss_bbox: 0.2315 d1.dn_loss_iou: 0.2803 d2.dn_loss_cls: 0.1226 d2.dn_loss_bbox: 0.2051 d2.dn_loss_iou: 0.2553 d3.dn_loss_cls: 0.1171 d3.dn_loss_bbox: 0.1960 d3.dn_loss_iou: 0.2462 d4.dn_loss_cls: 0.1127 d4.dn_loss_bbox: 0.1952 d4.dn_loss_iou: 0.2435 d1.loss_lmm_region: 0.1462 loss_lmm_image: 0.8521 2024/11/12 03:26:35 - mmengine - INFO - Iter(train) [ 70600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:16:09 time: 1.9810 data_time: 0.0181 memory: 30814 grad_norm: 31.8399 loss: 8.2576 loss_cls: 0.2547 loss_bbox: 0.1124 loss_iou: 0.2147 d0.loss_cls: 0.2932 d0.loss_bbox: 0.1208 d0.loss_iou: 0.2308 d1.loss_cls: 0.2735 d1.loss_bbox: 0.1123 d1.loss_iou: 0.2146 d2.loss_cls: 0.2626 d2.loss_bbox: 0.1142 d2.loss_iou: 0.2162 d3.loss_cls: 0.2604 d3.loss_bbox: 0.1092 d3.loss_iou: 0.2116 d4.loss_cls: 0.2569 d4.loss_bbox: 0.1116 d4.loss_iou: 0.2124 enc_loss_cls: 0.2965 enc_loss_bbox: 0.1358 enc_loss_iou: 0.2486 dn_loss_cls: 0.0892 dn_loss_bbox: 0.1361 dn_loss_iou: 0.1921 d0.dn_loss_cls: 0.1673 d0.dn_loss_bbox: 0.2792 d0.dn_loss_iou: 0.3282 d1.dn_loss_cls: 0.1215 d1.dn_loss_bbox: 0.1674 d1.dn_loss_iou: 0.2186 d2.dn_loss_cls: 0.1010 d2.dn_loss_bbox: 0.1439 d2.dn_loss_iou: 0.1993 d3.dn_loss_cls: 0.0934 d3.dn_loss_bbox: 0.1377 d3.dn_loss_iou: 0.1942 d4.dn_loss_cls: 0.0897 d4.dn_loss_bbox: 0.1360 d4.dn_loss_iou: 0.1920 d1.loss_lmm_region: 0.1182 loss_lmm_image: 0.8896 2024/11/12 03:29:52 - mmengine - INFO - Iter(train) [ 70700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:12:45 time: 1.9999 data_time: 0.0181 memory: 31810 grad_norm: 27.5138 loss: 9.3273 loss_cls: 0.3000 loss_bbox: 0.1370 loss_iou: 0.2489 d0.loss_cls: 0.3516 d0.loss_bbox: 0.1507 d0.loss_iou: 0.2672 d1.loss_cls: 0.3206 d1.loss_bbox: 0.1414 d1.loss_iou: 0.2555 d2.loss_cls: 0.3055 d2.loss_bbox: 0.1423 d2.loss_iou: 0.2547 d3.loss_cls: 0.3066 d3.loss_bbox: 0.1366 d3.loss_iou: 0.2495 d4.loss_cls: 0.3035 d4.loss_bbox: 
0.1369 d4.loss_iou: 0.2480 enc_loss_cls: 0.3529 enc_loss_bbox: 0.1623 enc_loss_iou: 0.2889 dn_loss_cls: 0.1131 dn_loss_bbox: 0.1434 dn_loss_iou: 0.2032 d0.dn_loss_cls: 0.1945 d0.dn_loss_bbox: 0.2756 d0.dn_loss_iou: 0.3419 d1.dn_loss_cls: 0.1425 d1.dn_loss_bbox: 0.1726 d1.dn_loss_iou: 0.2304 d2.dn_loss_cls: 0.1227 d2.dn_loss_bbox: 0.1507 d2.dn_loss_iou: 0.2113 d3.dn_loss_cls: 0.1168 d3.dn_loss_bbox: 0.1434 d3.dn_loss_iou: 0.2044 d4.dn_loss_cls: 0.1127 d4.dn_loss_bbox: 0.1435 d4.dn_loss_iou: 0.2032 d1.loss_lmm_region: 0.1446 loss_lmm_image: 0.8964 2024/11/12 03:33:11 - mmengine - INFO - Iter(train) [ 70800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:09:22 time: 1.9805 data_time: 0.0181 memory: 35075 grad_norm: 31.4363 loss: 10.4956 loss_cls: 0.3339 loss_bbox: 0.1576 loss_iou: 0.2758 d0.loss_cls: 0.3879 d0.loss_bbox: 0.1643 d0.loss_iou: 0.2919 d1.loss_cls: 0.3544 d1.loss_bbox: 0.1630 d1.loss_iou: 0.2819 d2.loss_cls: 0.3456 d2.loss_bbox: 0.1572 d2.loss_iou: 0.2789 d3.loss_cls: 0.3383 d3.loss_bbox: 0.1558 d3.loss_iou: 0.2778 d4.loss_cls: 0.3340 d4.loss_bbox: 0.1589 d4.loss_iou: 0.2790 enc_loss_cls: 0.3924 enc_loss_bbox: 0.1788 enc_loss_iou: 0.3154 dn_loss_cls: 0.1402 dn_loss_bbox: 0.1823 dn_loss_iou: 0.2385 d0.dn_loss_cls: 0.2184 d0.dn_loss_bbox: 0.3349 d0.dn_loss_iou: 0.3810 d1.dn_loss_cls: 0.1702 d1.dn_loss_bbox: 0.2196 d1.dn_loss_iou: 0.2691 d2.dn_loss_cls: 0.1525 d2.dn_loss_bbox: 0.1932 d2.dn_loss_iou: 0.2473 d3.dn_loss_cls: 0.1456 d3.dn_loss_bbox: 0.1842 d3.dn_loss_iou: 0.2408 d4.dn_loss_cls: 0.1407 d4.dn_loss_bbox: 0.1824 d4.dn_loss_iou: 0.2385 d1.loss_lmm_region: 0.1623 loss_lmm_image: 0.8310 2024/11/12 03:36:30 - mmengine - INFO - Iter(train) [ 70900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:05:59 time: 1.9826 data_time: 0.0181 memory: 34772 grad_norm: 31.5298 loss: 10.7367 loss_cls: 0.4007 loss_bbox: 0.1505 loss_iou: 0.2750 d0.loss_cls: 0.4422 d0.loss_bbox: 0.1644 d0.loss_iou: 0.2880 d1.loss_cls: 0.4098 d1.loss_bbox: 0.1559 d1.loss_iou: 0.2795 d2.loss_cls: 0.4026 d2.loss_bbox: 0.1524 d2.loss_iou: 0.2770 d3.loss_cls: 0.4038 d3.loss_bbox: 0.1486 d3.loss_iou: 0.2724 d4.loss_cls: 0.4020 d4.loss_bbox: 0.1472 d4.loss_iou: 0.2710 enc_loss_cls: 0.4476 enc_loss_bbox: 0.1724 enc_loss_iou: 0.3083 dn_loss_cls: 0.1657 dn_loss_bbox: 0.1652 dn_loss_iou: 0.2163 d0.dn_loss_cls: 0.2448 d0.dn_loss_bbox: 0.2912 d0.dn_loss_iou: 0.3491 d1.dn_loss_cls: 0.1981 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2476 d2.dn_loss_cls: 0.1807 d2.dn_loss_bbox: 0.1748 d2.dn_loss_iou: 0.2263 d3.dn_loss_cls: 0.1736 d3.dn_loss_bbox: 0.1679 d3.dn_loss_iou: 0.2189 d4.dn_loss_cls: 0.1626 d4.dn_loss_bbox: 0.1653 d4.dn_loss_iou: 0.2163 d1.loss_lmm_region: 0.1501 loss_lmm_image: 0.8537 2024/11/12 03:39:47 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 03:39:47 - mmengine - INFO - Iter(train) [ 71000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 20:02:35 time: 1.9704 data_time: 0.0180 memory: 34861 grad_norm: 32.7938 loss: 9.1130 loss_cls: 0.2808 loss_bbox: 0.1322 loss_iou: 0.2535 d0.loss_cls: 0.3233 d0.loss_bbox: 0.1456 d0.loss_iou: 0.2642 d1.loss_cls: 0.3035 d1.loss_bbox: 0.1317 d1.loss_iou: 0.2531 d2.loss_cls: 0.2939 d2.loss_bbox: 0.1312 d2.loss_iou: 0.2519 d3.loss_cls: 0.2872 d3.loss_bbox: 0.1317 d3.loss_iou: 0.2519 d4.loss_cls: 0.2798 d4.loss_bbox: 0.1337 d4.loss_iou: 0.2541 enc_loss_cls: 0.3284 enc_loss_bbox: 0.1566 enc_loss_iou: 0.2856 dn_loss_cls: 0.1016 dn_loss_bbox: 0.1530 dn_loss_iou: 0.2020 d0.dn_loss_cls: 0.1826 d0.dn_loss_bbox: 0.3230 d0.dn_loss_iou: 
0.3479 d1.dn_loss_cls: 0.1290 d1.dn_loss_bbox: 0.1941 d1.dn_loss_iou: 0.2349 d2.dn_loss_cls: 0.1139 d2.dn_loss_bbox: 0.1650 d2.dn_loss_iou: 0.2117 d3.dn_loss_cls: 0.1076 d3.dn_loss_bbox: 0.1545 d3.dn_loss_iou: 0.2045 d4.dn_loss_cls: 0.1036 d4.dn_loss_bbox: 0.1530 d4.dn_loss_iou: 0.2021 d1.loss_lmm_region: 0.1253 loss_lmm_image: 0.8299 2024/11/12 03:43:07 - mmengine - INFO - Iter(train) [ 71100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:59:13 time: 2.0047 data_time: 0.0180 memory: 32278 grad_norm: 27.2576 loss: 10.0866 loss_cls: 0.3311 loss_bbox: 0.1557 loss_iou: 0.2789 d0.loss_cls: 0.3670 d0.loss_bbox: 0.1716 d0.loss_iou: 0.2972 d1.loss_cls: 0.3442 d1.loss_bbox: 0.1569 d1.loss_iou: 0.2812 d2.loss_cls: 0.3369 d2.loss_bbox: 0.1619 d2.loss_iou: 0.2826 d3.loss_cls: 0.3333 d3.loss_bbox: 0.1598 d3.loss_iou: 0.2833 d4.loss_cls: 0.3327 d4.loss_bbox: 0.1556 d4.loss_iou: 0.2784 enc_loss_cls: 0.3680 enc_loss_bbox: 0.1860 enc_loss_iou: 0.3186 dn_loss_cls: 0.0949 dn_loss_bbox: 0.1796 dn_loss_iou: 0.2378 d0.dn_loss_cls: 0.1762 d0.dn_loss_bbox: 0.3122 d0.dn_loss_iou: 0.3793 d1.dn_loss_cls: 0.1256 d1.dn_loss_bbox: 0.2082 d1.dn_loss_iou: 0.2667 d2.dn_loss_cls: 0.1071 d2.dn_loss_bbox: 0.1898 d2.dn_loss_iou: 0.2467 d3.dn_loss_cls: 0.0998 d3.dn_loss_bbox: 0.1821 d3.dn_loss_iou: 0.2407 d4.dn_loss_cls: 0.0963 d4.dn_loss_bbox: 0.1796 d4.dn_loss_iou: 0.2377 d1.loss_lmm_region: 0.1400 loss_lmm_image: 0.8054 2024/11/12 03:46:27 - mmengine - INFO - Iter(train) [ 71200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:55:52 time: 2.0141 data_time: 0.0181 memory: 33073 grad_norm: 26.7546 loss: 8.8610 loss_cls: 0.2687 loss_bbox: 0.1371 loss_iou: 0.2195 d0.loss_cls: 0.3099 d0.loss_bbox: 0.1507 d0.loss_iou: 0.2343 d1.loss_cls: 0.2869 d1.loss_bbox: 0.1384 d1.loss_iou: 0.2261 d2.loss_cls: 0.2762 d2.loss_bbox: 0.1343 d2.loss_iou: 0.2183 d3.loss_cls: 0.2692 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2205 d4.loss_cls: 0.2691 d4.loss_bbox: 0.1351 d4.loss_iou: 0.2189 enc_loss_cls: 0.3171 enc_loss_bbox: 0.1605 enc_loss_iou: 0.2504 dn_loss_cls: 0.1003 dn_loss_bbox: 0.1628 dn_loss_iou: 0.2054 d0.dn_loss_cls: 0.1749 d0.dn_loss_bbox: 0.3074 d0.dn_loss_iou: 0.3454 d1.dn_loss_cls: 0.1268 d1.dn_loss_bbox: 0.1920 d1.dn_loss_iou: 0.2345 d2.dn_loss_cls: 0.1106 d2.dn_loss_bbox: 0.1694 d2.dn_loss_iou: 0.2122 d3.dn_loss_cls: 0.1044 d3.dn_loss_bbox: 0.1641 d3.dn_loss_iou: 0.2072 d4.dn_loss_cls: 0.1010 d4.dn_loss_bbox: 0.1628 d4.dn_loss_iou: 0.2053 d1.loss_lmm_region: 0.1372 loss_lmm_image: 0.8590 2024/11/12 03:49:46 - mmengine - INFO - Iter(train) [ 71300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:52:29 time: 1.9790 data_time: 0.0180 memory: 32548 grad_norm: 46.8773 loss: 8.2424 loss_cls: 0.2272 loss_bbox: 0.1255 loss_iou: 0.2105 d0.loss_cls: 0.2619 d0.loss_bbox: 0.1434 d0.loss_iou: 0.2219 d1.loss_cls: 0.2406 d1.loss_bbox: 0.1332 d1.loss_iou: 0.2142 d2.loss_cls: 0.2387 d2.loss_bbox: 0.1275 d2.loss_iou: 0.2114 d3.loss_cls: 0.2336 d3.loss_bbox: 0.1260 d3.loss_iou: 0.2097 d4.loss_cls: 0.2274 d4.loss_bbox: 0.1258 d4.loss_iou: 0.2104 enc_loss_cls: 0.2725 enc_loss_bbox: 0.1498 enc_loss_iou: 0.2390 dn_loss_cls: 0.0729 dn_loss_bbox: 0.1649 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1485 d0.dn_loss_bbox: 0.3174 d0.dn_loss_iou: 0.3411 d1.dn_loss_cls: 0.1000 d1.dn_loss_bbox: 0.2018 d1.dn_loss_iou: 0.2340 d2.dn_loss_cls: 0.0834 d2.dn_loss_bbox: 0.1790 d2.dn_loss_iou: 0.2138 d3.dn_loss_cls: 0.0782 d3.dn_loss_bbox: 0.1680 d3.dn_loss_iou: 0.2061 d4.dn_loss_cls: 0.0746 d4.dn_loss_bbox: 0.1650 d4.dn_loss_iou: 0.2034 
d1.loss_lmm_region: 0.1185 loss_lmm_image: 0.8179 2024/11/12 03:53:05 - mmengine - INFO - Iter(train) [ 71400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:49:06 time: 1.9780 data_time: 0.0180 memory: 32608 grad_norm: 26.9122 loss: 9.6963 loss_cls: 0.3232 loss_bbox: 0.1453 loss_iou: 0.2537 d0.loss_cls: 0.3641 d0.loss_bbox: 0.1647 d0.loss_iou: 0.2779 d1.loss_cls: 0.3437 d1.loss_bbox: 0.1509 d1.loss_iou: 0.2611 d2.loss_cls: 0.3394 d2.loss_bbox: 0.1445 d2.loss_iou: 0.2556 d3.loss_cls: 0.3274 d3.loss_bbox: 0.1449 d3.loss_iou: 0.2549 d4.loss_cls: 0.3226 d4.loss_bbox: 0.1448 d4.loss_iou: 0.2546 enc_loss_cls: 0.3634 enc_loss_bbox: 0.1805 enc_loss_iou: 0.2987 dn_loss_cls: 0.1258 dn_loss_bbox: 0.1518 dn_loss_iou: 0.2082 d0.dn_loss_cls: 0.2030 d0.dn_loss_bbox: 0.2866 d0.dn_loss_iou: 0.3413 d1.dn_loss_cls: 0.1627 d1.dn_loss_bbox: 0.1835 d1.dn_loss_iou: 0.2353 d2.dn_loss_cls: 0.1425 d2.dn_loss_bbox: 0.1635 d2.dn_loss_iou: 0.2165 d3.dn_loss_cls: 0.1311 d3.dn_loss_bbox: 0.1543 d3.dn_loss_iou: 0.2103 d4.dn_loss_cls: 0.1264 d4.dn_loss_bbox: 0.1517 d4.dn_loss_iou: 0.2081 d1.loss_lmm_region: 0.1278 loss_lmm_image: 0.8502 2024/11/12 03:56:24 - mmengine - INFO - Iter(train) [ 71500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:45:43 time: 1.9764 data_time: 0.0181 memory: 32997 grad_norm: 36.5865 loss: 8.4602 loss_cls: 0.2324 loss_bbox: 0.1326 loss_iou: 0.2210 d0.loss_cls: 0.2774 d0.loss_bbox: 0.1471 d0.loss_iou: 0.2311 d1.loss_cls: 0.2504 d1.loss_bbox: 0.1402 d1.loss_iou: 0.2269 d2.loss_cls: 0.2423 d2.loss_bbox: 0.1366 d2.loss_iou: 0.2231 d3.loss_cls: 0.2369 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2222 d4.loss_cls: 0.2343 d4.loss_bbox: 0.1315 d4.loss_iou: 0.2204 enc_loss_cls: 0.2791 enc_loss_bbox: 0.1589 enc_loss_iou: 0.2520 dn_loss_cls: 0.0883 dn_loss_bbox: 0.1620 dn_loss_iou: 0.1965 d0.dn_loss_cls: 0.1646 d0.dn_loss_bbox: 0.3166 d0.dn_loss_iou: 0.3343 d1.dn_loss_cls: 0.1149 d1.dn_loss_bbox: 0.1933 d1.dn_loss_iou: 0.2248 d2.dn_loss_cls: 0.1022 d2.dn_loss_bbox: 0.1716 d2.dn_loss_iou: 0.2044 d3.dn_loss_cls: 0.0949 d3.dn_loss_bbox: 0.1646 d3.dn_loss_iou: 0.1982 d4.dn_loss_cls: 0.0891 d4.dn_loss_bbox: 0.1620 d4.dn_loss_iou: 0.1964 d1.loss_lmm_region: 0.1213 loss_lmm_image: 0.8281 2024/11/12 03:59:42 - mmengine - INFO - Iter(train) [ 71600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:42:21 time: 1.9745 data_time: 0.0182 memory: 34669 grad_norm: 27.5027 loss: 9.4245 loss_cls: 0.2954 loss_bbox: 0.1427 loss_iou: 0.2522 d0.loss_cls: 0.3433 d0.loss_bbox: 0.1481 d0.loss_iou: 0.2618 d1.loss_cls: 0.3088 d1.loss_bbox: 0.1545 d1.loss_iou: 0.2637 d2.loss_cls: 0.3002 d2.loss_bbox: 0.1454 d2.loss_iou: 0.2576 d3.loss_cls: 0.2973 d3.loss_bbox: 0.1426 d3.loss_iou: 0.2515 d4.loss_cls: 0.2997 d4.loss_bbox: 0.1403 d4.loss_iou: 0.2514 enc_loss_cls: 0.3430 enc_loss_bbox: 0.1667 enc_loss_iou: 0.2823 dn_loss_cls: 0.1121 dn_loss_bbox: 0.1648 dn_loss_iou: 0.2118 d0.dn_loss_cls: 0.1844 d0.dn_loss_bbox: 0.3120 d0.dn_loss_iou: 0.3530 d1.dn_loss_cls: 0.1406 d1.dn_loss_bbox: 0.1949 d1.dn_loss_iou: 0.2420 d2.dn_loss_cls: 0.1213 d2.dn_loss_bbox: 0.1729 d2.dn_loss_iou: 0.2207 d3.dn_loss_cls: 0.1151 d3.dn_loss_bbox: 0.1652 d3.dn_loss_iou: 0.2134 d4.dn_loss_cls: 0.1117 d4.dn_loss_bbox: 0.1648 d4.dn_loss_iou: 0.2118 d1.loss_lmm_region: 0.1416 loss_lmm_image: 0.8219 2024/11/12 04:03:02 - mmengine - INFO - Iter(train) [ 71700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:38:59 time: 1.9878 data_time: 0.0181 memory: 32977 grad_norm: 27.0223 loss: 9.6246 loss_cls: 0.2966 loss_bbox: 0.1481 loss_iou: 0.2760 
d0.loss_cls: 0.3284 d0.loss_bbox: 0.1733 d0.loss_iou: 0.3034 d1.loss_cls: 0.3126 d1.loss_bbox: 0.1555 d1.loss_iou: 0.2873 d2.loss_cls: 0.3073 d2.loss_bbox: 0.1479 d2.loss_iou: 0.2787 d3.loss_cls: 0.2992 d3.loss_bbox: 0.1487 d3.loss_iou: 0.2763 d4.loss_cls: 0.2965 d4.loss_bbox: 0.1491 d4.loss_iou: 0.2762 enc_loss_cls: 0.3427 enc_loss_bbox: 0.1804 enc_loss_iou: 0.3169 dn_loss_cls: 0.0784 dn_loss_bbox: 0.1646 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.1615 d0.dn_loss_bbox: 0.3079 d0.dn_loss_iou: 0.3710 d1.dn_loss_cls: 0.1079 d1.dn_loss_bbox: 0.1970 d1.dn_loss_iou: 0.2560 d2.dn_loss_cls: 0.0900 d2.dn_loss_bbox: 0.1763 d2.dn_loss_iou: 0.2351 d3.dn_loss_cls: 0.0819 d3.dn_loss_bbox: 0.1675 d3.dn_loss_iou: 0.2274 d4.dn_loss_cls: 0.0798 d4.dn_loss_bbox: 0.1646 d4.dn_loss_iou: 0.2248 d1.loss_lmm_region: 0.1297 loss_lmm_image: 0.8775 2024/11/12 04:06:23 - mmengine - INFO - Iter(train) [ 71800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:35:39 time: 2.0366 data_time: 0.0181 memory: 33716 grad_norm: 37.0350 loss: 9.4583 loss_cls: 0.2925 loss_bbox: 0.1339 loss_iou: 0.2523 d0.loss_cls: 0.3363 d0.loss_bbox: 0.1448 d0.loss_iou: 0.2653 d1.loss_cls: 0.3129 d1.loss_bbox: 0.1375 d1.loss_iou: 0.2569 d2.loss_cls: 0.3047 d2.loss_bbox: 0.1333 d2.loss_iou: 0.2522 d3.loss_cls: 0.2998 d3.loss_bbox: 0.1316 d3.loss_iou: 0.2491 d4.loss_cls: 0.2954 d4.loss_bbox: 0.1322 d4.loss_iou: 0.2497 enc_loss_cls: 0.3391 enc_loss_bbox: 0.1656 enc_loss_iou: 0.2944 dn_loss_cls: 0.1139 dn_loss_bbox: 0.1691 dn_loss_iou: 0.2188 d0.dn_loss_cls: 0.1956 d0.dn_loss_bbox: 0.3124 d0.dn_loss_iou: 0.3597 d1.dn_loss_cls: 0.1422 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2495 d2.dn_loss_cls: 0.1250 d2.dn_loss_bbox: 0.1763 d2.dn_loss_iou: 0.2278 d3.dn_loss_cls: 0.1193 d3.dn_loss_bbox: 0.1703 d3.dn_loss_iou: 0.2214 d4.dn_loss_cls: 0.1150 d4.dn_loss_bbox: 0.1690 d4.dn_loss_iou: 0.2188 d1.loss_lmm_region: 0.1297 loss_lmm_image: 0.8480 2024/11/12 04:09:42 - mmengine - INFO - Iter(train) [ 71900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:32:15 time: 1.9803 data_time: 0.0183 memory: 34019 grad_norm: nan loss: 8.2001 loss_cls: 0.2393 loss_bbox: 0.1116 loss_iou: 0.1920 d0.loss_cls: 0.2845 d0.loss_bbox: 0.1208 d0.loss_iou: 0.2030 d1.loss_cls: 0.2520 d1.loss_bbox: 0.1214 d1.loss_iou: 0.1995 d2.loss_cls: 0.2479 d2.loss_bbox: 0.1136 d2.loss_iou: 0.1945 d3.loss_cls: 0.2452 d3.loss_bbox: 0.1134 d3.loss_iou: 0.1928 d4.loss_cls: 0.2428 d4.loss_bbox: 0.1117 d4.loss_iou: 0.1915 enc_loss_cls: 0.2855 enc_loss_bbox: 0.1359 enc_loss_iou: 0.2259 dn_loss_cls: 0.1020 dn_loss_bbox: 0.1643 dn_loss_iou: 0.1964 d0.dn_loss_cls: 0.1600 d0.dn_loss_bbox: 0.2980 d0.dn_loss_iou: 0.3255 d1.dn_loss_cls: 0.1294 d1.dn_loss_bbox: 0.1964 d1.dn_loss_iou: 0.2247 d2.dn_loss_cls: 0.1132 d2.dn_loss_bbox: 0.1756 d2.dn_loss_iou: 0.2058 d3.dn_loss_cls: 0.1069 d3.dn_loss_bbox: 0.1665 d3.dn_loss_iou: 0.1986 d4.dn_loss_cls: 0.1021 d4.dn_loss_bbox: 0.1645 d4.dn_loss_iou: 0.1966 d1.loss_lmm_region: 0.0944 loss_lmm_image: 0.8544 2024/11/12 04:13:02 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 04:13:02 - mmengine - INFO - Iter(train) [ 72000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:28:55 time: 1.9867 data_time: 0.0182 memory: 34254 grad_norm: 30.2446 loss: 8.6379 loss_cls: 0.2366 loss_bbox: 0.1284 loss_iou: 0.2135 d0.loss_cls: 0.2805 d0.loss_bbox: 0.1404 d0.loss_iou: 0.2267 d1.loss_cls: 0.2559 d1.loss_bbox: 0.1324 d1.loss_iou: 0.2191 d2.loss_cls: 0.2473 d2.loss_bbox: 0.1298 d2.loss_iou: 0.2143 d3.loss_cls: 0.2421 d3.loss_bbox: 
0.1263 d3.loss_iou: 0.2118 d4.loss_cls: 0.2388 d4.loss_bbox: 0.1271 d4.loss_iou: 0.2124 enc_loss_cls: 0.2832 enc_loss_bbox: 0.1554 enc_loss_iou: 0.2467 dn_loss_cls: 0.0805 dn_loss_bbox: 0.1692 dn_loss_iou: 0.2215 d0.dn_loss_cls: 0.1706 d0.dn_loss_bbox: 0.3331 d0.dn_loss_iou: 0.3738 d1.dn_loss_cls: 0.1179 d1.dn_loss_bbox: 0.2076 d1.dn_loss_iou: 0.2562 d2.dn_loss_cls: 0.0950 d2.dn_loss_bbox: 0.1830 d2.dn_loss_iou: 0.2328 d3.dn_loss_cls: 0.0854 d3.dn_loss_bbox: 0.1722 d3.dn_loss_iou: 0.2244 d4.dn_loss_cls: 0.0813 d4.dn_loss_bbox: 0.1692 d4.dn_loss_iou: 0.2216 d1.loss_lmm_region: 0.1420 loss_lmm_image: 0.8320 2024/11/12 04:16:22 - mmengine - INFO - Iter(train) [ 72100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:25:33 time: 1.9942 data_time: 0.0181 memory: 33863 grad_norm: 29.7324 loss: 9.7680 loss_cls: 0.2809 loss_bbox: 0.1636 loss_iou: 0.2592 d0.loss_cls: 0.3239 d0.loss_bbox: 0.1709 d0.loss_iou: 0.2727 d1.loss_cls: 0.3010 d1.loss_bbox: 0.1664 d1.loss_iou: 0.2647 d2.loss_cls: 0.2945 d2.loss_bbox: 0.1651 d2.loss_iou: 0.2610 d3.loss_cls: 0.2838 d3.loss_bbox: 0.1651 d3.loss_iou: 0.2606 d4.loss_cls: 0.2829 d4.loss_bbox: 0.1646 d4.loss_iou: 0.2599 enc_loss_cls: 0.3340 enc_loss_bbox: 0.1830 enc_loss_iou: 0.2918 dn_loss_cls: 0.1031 dn_loss_bbox: 0.1877 dn_loss_iou: 0.2325 d0.dn_loss_cls: 0.1905 d0.dn_loss_bbox: 0.3305 d0.dn_loss_iou: 0.3653 d1.dn_loss_cls: 0.1352 d1.dn_loss_bbox: 0.2210 d1.dn_loss_iou: 0.2628 d2.dn_loss_cls: 0.1156 d2.dn_loss_bbox: 0.1980 d2.dn_loss_iou: 0.2421 d3.dn_loss_cls: 0.1095 d3.dn_loss_bbox: 0.1905 d3.dn_loss_iou: 0.2352 d4.dn_loss_cls: 0.1031 d4.dn_loss_bbox: 0.1877 d4.dn_loss_iou: 0.2326 d1.loss_lmm_region: 0.1397 loss_lmm_image: 0.8359 2024/11/12 04:19:41 - mmengine - INFO - Iter(train) [ 72200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:22:11 time: 1.9630 data_time: 0.0182 memory: 34811 grad_norm: 27.2339 loss: 10.9191 loss_cls: 0.3634 loss_bbox: 0.1639 loss_iou: 0.2706 d0.loss_cls: 0.4267 d0.loss_bbox: 0.1708 d0.loss_iou: 0.2828 d1.loss_cls: 0.3897 d1.loss_bbox: 0.1693 d1.loss_iou: 0.2776 d2.loss_cls: 0.3756 d2.loss_bbox: 0.1648 d2.loss_iou: 0.2720 d3.loss_cls: 0.3709 d3.loss_bbox: 0.1635 d3.loss_iou: 0.2708 d4.loss_cls: 0.3642 d4.loss_bbox: 0.1633 d4.loss_iou: 0.2697 enc_loss_cls: 0.4222 enc_loss_bbox: 0.1941 enc_loss_iou: 0.3161 dn_loss_cls: 0.1431 dn_loss_bbox: 0.2028 dn_loss_iou: 0.2407 d0.dn_loss_cls: 0.2388 d0.dn_loss_bbox: 0.3720 d0.dn_loss_iou: 0.3972 d1.dn_loss_cls: 0.1829 d1.dn_loss_bbox: 0.2445 d1.dn_loss_iou: 0.2792 d2.dn_loss_cls: 0.1570 d2.dn_loss_bbox: 0.2160 d2.dn_loss_iou: 0.2535 d3.dn_loss_cls: 0.1493 d3.dn_loss_bbox: 0.2053 d3.dn_loss_iou: 0.2439 d4.dn_loss_cls: 0.1437 d4.dn_loss_bbox: 0.2027 d4.dn_loss_iou: 0.2407 d1.loss_lmm_region: 0.1461 loss_lmm_image: 0.7975 2024/11/12 04:23:00 - mmengine - INFO - Iter(train) [ 72300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:18:48 time: 1.9705 data_time: 0.0181 memory: 33683 grad_norm: 32.3695 loss: 10.7275 loss_cls: 0.3131 loss_bbox: 0.1913 loss_iou: 0.3175 d0.loss_cls: 0.3613 d0.loss_bbox: 0.2017 d0.loss_iou: 0.3328 d1.loss_cls: 0.3377 d1.loss_bbox: 0.1919 d1.loss_iou: 0.3245 d2.loss_cls: 0.3361 d2.loss_bbox: 0.1823 d2.loss_iou: 0.3135 d3.loss_cls: 0.3247 d3.loss_bbox: 0.1832 d3.loss_iou: 0.3117 d4.loss_cls: 0.3119 d4.loss_bbox: 0.1914 d4.loss_iou: 0.3168 enc_loss_cls: 0.3709 enc_loss_bbox: 0.2140 enc_loss_iou: 0.3535 dn_loss_cls: 0.0805 dn_loss_bbox: 0.1958 dn_loss_iou: 0.2579 d0.dn_loss_cls: 0.1842 d0.dn_loss_bbox: 0.3591 d0.dn_loss_iou: 0.4054 d1.dn_loss_cls: 
0.1211 d1.dn_loss_bbox: 0.2307 d1.dn_loss_iou: 0.2872 d2.dn_loss_cls: 0.0970 d2.dn_loss_bbox: 0.2060 d2.dn_loss_iou: 0.2665 d3.dn_loss_cls: 0.0870 d3.dn_loss_bbox: 0.1976 d3.dn_loss_iou: 0.2596 d4.dn_loss_cls: 0.0815 d4.dn_loss_bbox: 0.1957 d4.dn_loss_iou: 0.2579 d1.loss_lmm_region: 0.1506 loss_lmm_image: 0.8242 2024/11/12 04:26:18 - mmengine - INFO - Iter(train) [ 72400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:15:25 time: 1.9967 data_time: 0.0181 memory: 34337 grad_norm: 47.2359 loss: 9.6992 loss_cls: 0.3091 loss_bbox: 0.1483 loss_iou: 0.2508 d0.loss_cls: 0.3501 d0.loss_bbox: 0.1573 d0.loss_iou: 0.2612 d1.loss_cls: 0.3246 d1.loss_bbox: 0.1529 d1.loss_iou: 0.2573 d2.loss_cls: 0.3217 d2.loss_bbox: 0.1465 d2.loss_iou: 0.2521 d3.loss_cls: 0.3176 d3.loss_bbox: 0.1465 d3.loss_iou: 0.2508 d4.loss_cls: 0.3132 d4.loss_bbox: 0.1470 d4.loss_iou: 0.2492 enc_loss_cls: 0.3576 enc_loss_bbox: 0.1695 enc_loss_iou: 0.2789 dn_loss_cls: 0.0990 dn_loss_bbox: 0.1851 dn_loss_iou: 0.2318 d0.dn_loss_cls: 0.1814 d0.dn_loss_bbox: 0.3292 d0.dn_loss_iou: 0.3778 d1.dn_loss_cls: 0.1287 d1.dn_loss_bbox: 0.2119 d1.dn_loss_iou: 0.2638 d2.dn_loss_cls: 0.1105 d2.dn_loss_bbox: 0.1922 d2.dn_loss_iou: 0.2400 d3.dn_loss_cls: 0.1037 d3.dn_loss_bbox: 0.1868 d3.dn_loss_iou: 0.2336 d4.dn_loss_cls: 0.1004 d4.dn_loss_bbox: 0.1852 d4.dn_loss_iou: 0.2317 d1.loss_lmm_region: 0.1402 loss_lmm_image: 0.8039 2024/11/12 04:29:35 - mmengine - INFO - Iter(train) [ 72500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:12:00 time: 1.9654 data_time: 0.0181 memory: 33463 grad_norm: 28.1402 loss: 8.2001 loss_cls: 0.2581 loss_bbox: 0.1158 loss_iou: 0.2018 d0.loss_cls: 0.2998 d0.loss_bbox: 0.1278 d0.loss_iou: 0.2191 d1.loss_cls: 0.2779 d1.loss_bbox: 0.1172 d1.loss_iou: 0.2074 d2.loss_cls: 0.2669 d2.loss_bbox: 0.1150 d2.loss_iou: 0.2042 d3.loss_cls: 0.2600 d3.loss_bbox: 0.1174 d3.loss_iou: 0.2023 d4.loss_cls: 0.2599 d4.loss_bbox: 0.1148 d4.loss_iou: 0.2009 enc_loss_cls: 0.3028 enc_loss_bbox: 0.1429 enc_loss_iou: 0.2409 dn_loss_cls: 0.0904 dn_loss_bbox: 0.1453 dn_loss_iou: 0.1865 d0.dn_loss_cls: 0.1581 d0.dn_loss_bbox: 0.2800 d0.dn_loss_iou: 0.3208 d1.dn_loss_cls: 0.1161 d1.dn_loss_bbox: 0.1740 d1.dn_loss_iou: 0.2156 d2.dn_loss_cls: 0.0992 d2.dn_loss_bbox: 0.1522 d2.dn_loss_iou: 0.1939 d3.dn_loss_cls: 0.0939 d3.dn_loss_bbox: 0.1475 d3.dn_loss_iou: 0.1886 d4.dn_loss_cls: 0.0897 d4.dn_loss_bbox: 0.1453 d4.dn_loss_iou: 0.1865 d1.loss_lmm_region: 0.1119 loss_lmm_image: 0.8514 2024/11/12 04:32:56 - mmengine - INFO - Iter(train) [ 72600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:08:39 time: 2.0082 data_time: 0.0182 memory: 34942 grad_norm: 31.3658 loss: 10.0474 loss_cls: 0.3325 loss_bbox: 0.1488 loss_iou: 0.2814 d0.loss_cls: 0.3855 d0.loss_bbox: 0.1551 d0.loss_iou: 0.2972 d1.loss_cls: 0.3538 d1.loss_bbox: 0.1550 d1.loss_iou: 0.2917 d2.loss_cls: 0.3430 d2.loss_bbox: 0.1517 d2.loss_iou: 0.2874 d3.loss_cls: 0.3363 d3.loss_bbox: 0.1485 d3.loss_iou: 0.2833 d4.loss_cls: 0.3310 d4.loss_bbox: 0.1496 d4.loss_iou: 0.2820 enc_loss_cls: 0.3909 enc_loss_bbox: 0.1715 enc_loss_iou: 0.3224 dn_loss_cls: 0.1243 dn_loss_bbox: 0.1563 dn_loss_iou: 0.2144 d0.dn_loss_cls: 0.2081 d0.dn_loss_bbox: 0.2942 d0.dn_loss_iou: 0.3542 d1.dn_loss_cls: 0.1582 d1.dn_loss_bbox: 0.1862 d1.dn_loss_iou: 0.2442 d2.dn_loss_cls: 0.1385 d2.dn_loss_bbox: 0.1637 d2.dn_loss_iou: 0.2226 d3.dn_loss_cls: 0.1284 d3.dn_loss_bbox: 0.1577 d3.dn_loss_iou: 0.2166 d4.dn_loss_cls: 0.1251 d4.dn_loss_bbox: 0.1562 d4.dn_loss_iou: 0.2142 d1.loss_lmm_region: 0.1281 
loss_lmm_image: 0.8576 2024/11/12 04:36:15 - mmengine - INFO - Iter(train) [ 72700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:05:17 time: 1.9922 data_time: 0.0182 memory: 34760 grad_norm: 27.4307 loss: 9.3500 loss_cls: 0.2720 loss_bbox: 0.1513 loss_iou: 0.2411 d0.loss_cls: 0.3148 d0.loss_bbox: 0.1600 d0.loss_iou: 0.2506 d1.loss_cls: 0.2870 d1.loss_bbox: 0.1578 d1.loss_iou: 0.2455 d2.loss_cls: 0.2855 d2.loss_bbox: 0.1499 d2.loss_iou: 0.2372 d3.loss_cls: 0.2747 d3.loss_bbox: 0.1515 d3.loss_iou: 0.2409 d4.loss_cls: 0.2719 d4.loss_bbox: 0.1506 d4.loss_iou: 0.2409 enc_loss_cls: 0.3138 enc_loss_bbox: 0.1793 enc_loss_iou: 0.2737 dn_loss_cls: 0.0975 dn_loss_bbox: 0.1864 dn_loss_iou: 0.2187 d0.dn_loss_cls: 0.1771 d0.dn_loss_bbox: 0.3388 d0.dn_loss_iou: 0.3565 d1.dn_loss_cls: 0.1280 d1.dn_loss_bbox: 0.2245 d1.dn_loss_iou: 0.2516 d2.dn_loss_cls: 0.1081 d2.dn_loss_bbox: 0.1962 d2.dn_loss_iou: 0.2288 d3.dn_loss_cls: 0.1004 d3.dn_loss_bbox: 0.1888 d3.dn_loss_iou: 0.2212 d4.dn_loss_cls: 0.0974 d4.dn_loss_bbox: 0.1864 d4.dn_loss_iou: 0.2187 d1.loss_lmm_region: 0.1305 loss_lmm_image: 0.8445 2024/11/12 04:39:34 - mmengine - INFO - Iter(train) [ 72800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 19:01:55 time: 1.9940 data_time: 0.0182 memory: 33659 grad_norm: 27.8036 loss: 9.4503 loss_cls: 0.2751 loss_bbox: 0.1272 loss_iou: 0.2114 d0.loss_cls: 0.3183 d0.loss_bbox: 0.1398 d0.loss_iou: 0.2287 d1.loss_cls: 0.2944 d1.loss_bbox: 0.1323 d1.loss_iou: 0.2180 d2.loss_cls: 0.2880 d2.loss_bbox: 0.1257 d2.loss_iou: 0.2118 d3.loss_cls: 0.2795 d3.loss_bbox: 0.1275 d3.loss_iou: 0.2136 d4.loss_cls: 0.2749 d4.loss_bbox: 0.1286 d4.loss_iou: 0.2138 enc_loss_cls: 0.3142 enc_loss_bbox: 0.1612 enc_loss_iou: 0.2571 dn_loss_cls: 0.1991 dn_loss_bbox: 0.1609 dn_loss_iou: 0.2063 d0.dn_loss_cls: 0.2622 d0.dn_loss_bbox: 0.3331 d0.dn_loss_iou: 0.3624 d1.dn_loss_cls: 0.2271 d1.dn_loss_bbox: 0.1973 d1.dn_loss_iou: 0.2402 d2.dn_loss_cls: 0.2107 d2.dn_loss_bbox: 0.1707 d2.dn_loss_iou: 0.2172 d3.dn_loss_cls: 0.2054 d3.dn_loss_bbox: 0.1627 d3.dn_loss_iou: 0.2093 d4.dn_loss_cls: 0.2006 d4.dn_loss_bbox: 0.1608 d4.dn_loss_iou: 0.2063 d1.loss_lmm_region: 0.1604 loss_lmm_image: 0.8161 2024/11/12 04:42:53 - mmengine - INFO - Iter(train) [ 72900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:58:32 time: 1.9885 data_time: 0.0182 memory: 33920 grad_norm: 31.8434 loss: 9.1625 loss_cls: 0.2507 loss_bbox: 0.1389 loss_iou: 0.2258 d0.loss_cls: 0.2939 d0.loss_bbox: 0.1433 d0.loss_iou: 0.2328 d1.loss_cls: 0.2704 d1.loss_bbox: 0.1390 d1.loss_iou: 0.2288 d2.loss_cls: 0.2597 d2.loss_bbox: 0.1409 d2.loss_iou: 0.2284 d3.loss_cls: 0.2567 d3.loss_bbox: 0.1389 d3.loss_iou: 0.2234 d4.loss_cls: 0.2564 d4.loss_bbox: 0.1356 d4.loss_iou: 0.2228 enc_loss_cls: 0.2965 enc_loss_bbox: 0.1608 enc_loss_iou: 0.2556 dn_loss_cls: 0.1167 dn_loss_bbox: 0.1754 dn_loss_iou: 0.2283 d0.dn_loss_cls: 0.2050 d0.dn_loss_bbox: 0.3362 d0.dn_loss_iou: 0.3799 d1.dn_loss_cls: 0.1539 d1.dn_loss_bbox: 0.2095 d1.dn_loss_iou: 0.2611 d2.dn_loss_cls: 0.1290 d2.dn_loss_bbox: 0.1853 d2.dn_loss_iou: 0.2390 d3.dn_loss_cls: 0.1221 d3.dn_loss_bbox: 0.1767 d3.dn_loss_iou: 0.2306 d4.dn_loss_cls: 0.1171 d4.dn_loss_bbox: 0.1754 d4.dn_loss_iou: 0.2282 d1.loss_lmm_region: 0.1561 loss_lmm_image: 0.8379 2024/11/12 04:46:10 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 04:46:10 - mmengine - INFO - Iter(train) [ 73000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:55:07 time: 1.9577 data_time: 0.0183 memory: 33967 grad_norm: 26.3492 loss: 
8.6993 loss_cls: 0.2672 loss_bbox: 0.1354 loss_iou: 0.2536 d0.loss_cls: 0.3204 d0.loss_bbox: 0.1420 d0.loss_iou: 0.2597 d1.loss_cls: 0.2911 d1.loss_bbox: 0.1378 d1.loss_iou: 0.2540 d2.loss_cls: 0.2813 d2.loss_bbox: 0.1325 d2.loss_iou: 0.2514 d3.loss_cls: 0.2713 d3.loss_bbox: 0.1334 d3.loss_iou: 0.2531 d4.loss_cls: 0.2676 d4.loss_bbox: 0.1352 d4.loss_iou: 0.2540 enc_loss_cls: 0.3157 enc_loss_bbox: 0.1568 enc_loss_iou: 0.2875 dn_loss_cls: 0.0704 dn_loss_bbox: 0.1385 dn_loss_iou: 0.2004 d0.dn_loss_cls: 0.1507 d0.dn_loss_bbox: 0.2710 d0.dn_loss_iou: 0.3392 d1.dn_loss_cls: 0.0985 d1.dn_loss_bbox: 0.1687 d1.dn_loss_iou: 0.2309 d2.dn_loss_cls: 0.0811 d2.dn_loss_bbox: 0.1475 d2.dn_loss_iou: 0.2093 d3.dn_loss_cls: 0.0760 d3.dn_loss_bbox: 0.1406 d3.dn_loss_iou: 0.2029 d4.dn_loss_cls: 0.0714 d4.dn_loss_bbox: 0.1384 d4.dn_loss_iou: 0.2003 d1.loss_lmm_region: 0.1093 loss_lmm_image: 0.8534 2024/11/12 04:49:26 - mmengine - INFO - Iter(train) [ 73100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:51:42 time: 1.9753 data_time: 0.0181 memory: 35267 grad_norm: 25.4267 loss: 9.0813 loss_cls: 0.2598 loss_bbox: 0.1391 loss_iou: 0.2650 d0.loss_cls: 0.3027 d0.loss_bbox: 0.1461 d0.loss_iou: 0.2782 d1.loss_cls: 0.2849 d1.loss_bbox: 0.1385 d1.loss_iou: 0.2654 d2.loss_cls: 0.2688 d2.loss_bbox: 0.1421 d2.loss_iou: 0.2685 d3.loss_cls: 0.2699 d3.loss_bbox: 0.1371 d3.loss_iou: 0.2638 d4.loss_cls: 0.2649 d4.loss_bbox: 0.1393 d4.loss_iou: 0.2636 enc_loss_cls: 0.3074 enc_loss_bbox: 0.1604 enc_loss_iou: 0.2945 dn_loss_cls: 0.0684 dn_loss_bbox: 0.1714 dn_loss_iou: 0.2299 d0.dn_loss_cls: 0.1493 d0.dn_loss_bbox: 0.3054 d0.dn_loss_iou: 0.3663 d1.dn_loss_cls: 0.0990 d1.dn_loss_bbox: 0.1993 d1.dn_loss_iou: 0.2568 d2.dn_loss_cls: 0.0816 d2.dn_loss_bbox: 0.1793 d2.dn_loss_iou: 0.2381 d3.dn_loss_cls: 0.0746 d3.dn_loss_bbox: 0.1728 d3.dn_loss_iou: 0.2317 d4.dn_loss_cls: 0.0697 d4.dn_loss_bbox: 0.1714 d4.dn_loss_iou: 0.2300 d1.loss_lmm_region: 0.1266 loss_lmm_image: 0.7998 2024/11/12 04:52:48 - mmengine - INFO - Iter(train) [ 73200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:48:23 time: 2.0179 data_time: 0.0183 memory: 34434 grad_norm: 25.2532 loss: 8.9242 loss_cls: 0.2565 loss_bbox: 0.1319 loss_iou: 0.2179 d0.loss_cls: 0.3028 d0.loss_bbox: 0.1333 d0.loss_iou: 0.2240 d1.loss_cls: 0.2768 d1.loss_bbox: 0.1309 d1.loss_iou: 0.2204 d2.loss_cls: 0.2658 d2.loss_bbox: 0.1297 d2.loss_iou: 0.2206 d3.loss_cls: 0.2570 d3.loss_bbox: 0.1321 d3.loss_iou: 0.2178 d4.loss_cls: 0.2572 d4.loss_bbox: 0.1295 d4.loss_iou: 0.2178 enc_loss_cls: 0.3011 enc_loss_bbox: 0.1496 enc_loss_iou: 0.2465 dn_loss_cls: 0.0917 dn_loss_bbox: 0.1733 dn_loss_iou: 0.2235 d0.dn_loss_cls: 0.1745 d0.dn_loss_bbox: 0.3410 d0.dn_loss_iou: 0.3793 d1.dn_loss_cls: 0.1258 d1.dn_loss_bbox: 0.2084 d1.dn_loss_iou: 0.2572 d2.dn_loss_cls: 0.1075 d2.dn_loss_bbox: 0.1822 d2.dn_loss_iou: 0.2338 d3.dn_loss_cls: 0.0980 d3.dn_loss_bbox: 0.1754 d3.dn_loss_iou: 0.2258 d4.dn_loss_cls: 0.0926 d4.dn_loss_bbox: 0.1733 d4.dn_loss_iou: 0.2236 d1.loss_lmm_region: 0.1395 loss_lmm_image: 0.8783 2024/11/12 04:56:07 - mmengine - INFO - Iter(train) [ 73300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:45:00 time: 1.9840 data_time: 0.0181 memory: 33913 grad_norm: 26.9232 loss: 9.6181 loss_cls: 0.3049 loss_bbox: 0.1449 loss_iou: 0.2783 d0.loss_cls: 0.3612 d0.loss_bbox: 0.1550 d0.loss_iou: 0.2916 d1.loss_cls: 0.3363 d1.loss_bbox: 0.1449 d1.loss_iou: 0.2792 d2.loss_cls: 0.3211 d2.loss_bbox: 0.1417 d2.loss_iou: 0.2768 d3.loss_cls: 0.3129 d3.loss_bbox: 0.1411 d3.loss_iou: 0.2773 
d4.loss_cls: 0.3038 d4.loss_bbox: 0.1431 d4.loss_iou: 0.2785 enc_loss_cls: 0.3650 enc_loss_bbox: 0.1694 enc_loss_iou: 0.3152 dn_loss_cls: 0.1003 dn_loss_bbox: 0.1488 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.1786 d0.dn_loss_bbox: 0.2817 d0.dn_loss_iou: 0.3566 d1.dn_loss_cls: 0.1327 d1.dn_loss_bbox: 0.1753 d1.dn_loss_iou: 0.2456 d2.dn_loss_cls: 0.1127 d2.dn_loss_bbox: 0.1574 d2.dn_loss_iou: 0.2251 d3.dn_loss_cls: 0.1051 d3.dn_loss_bbox: 0.1502 d3.dn_loss_iou: 0.2176 d4.dn_loss_cls: 0.0999 d4.dn_loss_bbox: 0.1488 d4.dn_loss_iou: 0.2157 d1.loss_lmm_region: 0.1234 loss_lmm_image: 0.8849 2024/11/12 04:59:27 - mmengine - INFO - Iter(train) [ 73400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:41:39 time: 1.9855 data_time: 0.0182 memory: 33905 grad_norm: 28.5587 loss: 9.4395 loss_cls: 0.2741 loss_bbox: 0.1505 loss_iou: 0.2638 d0.loss_cls: 0.3174 d0.loss_bbox: 0.1641 d0.loss_iou: 0.2762 d1.loss_cls: 0.2948 d1.loss_bbox: 0.1546 d1.loss_iou: 0.2687 d2.loss_cls: 0.2841 d2.loss_bbox: 0.1514 d2.loss_iou: 0.2660 d3.loss_cls: 0.2786 d3.loss_bbox: 0.1490 d3.loss_iou: 0.2634 d4.loss_cls: 0.2766 d4.loss_bbox: 0.1498 d4.loss_iou: 0.2623 enc_loss_cls: 0.3219 enc_loss_bbox: 0.1740 enc_loss_iou: 0.3016 dn_loss_cls: 0.1091 dn_loss_bbox: 0.1591 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.1866 d0.dn_loss_bbox: 0.3039 d0.dn_loss_iou: 0.3574 d1.dn_loss_cls: 0.1391 d1.dn_loss_bbox: 0.1883 d1.dn_loss_iou: 0.2486 d2.dn_loss_cls: 0.1222 d2.dn_loss_bbox: 0.1653 d2.dn_loss_iou: 0.2282 d3.dn_loss_cls: 0.1161 d3.dn_loss_bbox: 0.1610 d3.dn_loss_iou: 0.2228 d4.dn_loss_cls: 0.1106 d4.dn_loss_bbox: 0.1591 d4.dn_loss_iou: 0.2205 d1.loss_lmm_region: 0.1541 loss_lmm_image: 0.8240 2024/11/12 05:02:46 - mmengine - INFO - Iter(train) [ 73500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:38:17 time: 1.9819 data_time: 0.0181 memory: 34081 grad_norm: 29.7749 loss: 8.2282 loss_cls: 0.2284 loss_bbox: 0.1145 loss_iou: 0.1887 d0.loss_cls: 0.2663 d0.loss_bbox: 0.1219 d0.loss_iou: 0.1975 d1.loss_cls: 0.2482 d1.loss_bbox: 0.1153 d1.loss_iou: 0.1873 d2.loss_cls: 0.2355 d2.loss_bbox: 0.1160 d2.loss_iou: 0.1907 d3.loss_cls: 0.2301 d3.loss_bbox: 0.1173 d3.loss_iou: 0.1891 d4.loss_cls: 0.2284 d4.loss_bbox: 0.1149 d4.loss_iou: 0.1897 enc_loss_cls: 0.2733 enc_loss_bbox: 0.1389 enc_loss_iou: 0.2184 dn_loss_cls: 0.1350 dn_loss_bbox: 0.1539 dn_loss_iou: 0.1787 d0.dn_loss_cls: 0.2112 d0.dn_loss_bbox: 0.3160 d0.dn_loss_iou: 0.3253 d1.dn_loss_cls: 0.1617 d1.dn_loss_bbox: 0.1895 d1.dn_loss_iou: 0.2116 d2.dn_loss_cls: 0.1442 d2.dn_loss_bbox: 0.1622 d2.dn_loss_iou: 0.1884 d3.dn_loss_cls: 0.1408 d3.dn_loss_bbox: 0.1557 d3.dn_loss_iou: 0.1814 d4.dn_loss_cls: 0.1351 d4.dn_loss_bbox: 0.1539 d4.dn_loss_iou: 0.1787 d1.loss_lmm_region: 0.1236 loss_lmm_image: 0.8710 2024/11/12 05:06:05 - mmengine - INFO - Iter(train) [ 73600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:34:54 time: 1.9975 data_time: 0.0181 memory: 35164 grad_norm: 30.0324 loss: 9.0630 loss_cls: 0.2572 loss_bbox: 0.1396 loss_iou: 0.2193 d0.loss_cls: 0.2904 d0.loss_bbox: 0.1509 d0.loss_iou: 0.2363 d1.loss_cls: 0.2715 d1.loss_bbox: 0.1469 d1.loss_iou: 0.2302 d2.loss_cls: 0.2651 d2.loss_bbox: 0.1386 d2.loss_iou: 0.2209 d3.loss_cls: 0.2591 d3.loss_bbox: 0.1400 d3.loss_iou: 0.2197 d4.loss_cls: 0.2586 d4.loss_bbox: 0.1390 d4.loss_iou: 0.2177 enc_loss_cls: 0.2944 enc_loss_bbox: 0.1697 enc_loss_iou: 0.2606 dn_loss_cls: 0.1109 dn_loss_bbox: 0.1830 dn_loss_iou: 0.2201 d0.dn_loss_cls: 0.1766 d0.dn_loss_bbox: 0.3308 d0.dn_loss_iou: 0.3691 d1.dn_loss_cls: 0.1343 d1.dn_loss_bbox: 0.2161 
d1.dn_loss_iou: 0.2530 d2.dn_loss_cls: 0.1148 d2.dn_loss_bbox: 0.1942 d2.dn_loss_iou: 0.2309 d3.dn_loss_cls: 0.1085 d3.dn_loss_bbox: 0.1847 d3.dn_loss_iou: 0.2228 d4.dn_loss_cls: 0.1093 d4.dn_loss_bbox: 0.1830 d4.dn_loss_iou: 0.2201 d1.loss_lmm_region: 0.1188 loss_lmm_image: 0.8565 2024/11/12 05:09:24 - mmengine - INFO - Iter(train) [ 73700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:31:32 time: 1.9859 data_time: 0.0181 memory: 33519 grad_norm: 32.7455 loss: 9.2237 loss_cls: 0.2774 loss_bbox: 0.1379 loss_iou: 0.2348 d0.loss_cls: 0.3153 d0.loss_bbox: 0.1535 d0.loss_iou: 0.2539 d1.loss_cls: 0.3007 d1.loss_bbox: 0.1424 d1.loss_iou: 0.2407 d2.loss_cls: 0.2882 d2.loss_bbox: 0.1382 d2.loss_iou: 0.2350 d3.loss_cls: 0.2832 d3.loss_bbox: 0.1371 d3.loss_iou: 0.2316 d4.loss_cls: 0.2775 d4.loss_bbox: 0.1377 d4.loss_iou: 0.2334 enc_loss_cls: 0.3178 enc_loss_bbox: 0.1649 enc_loss_iou: 0.2789 dn_loss_cls: 0.0842 dn_loss_bbox: 0.1766 dn_loss_iou: 0.2179 d0.dn_loss_cls: 0.1731 d0.dn_loss_bbox: 0.3495 d0.dn_loss_iou: 0.3656 d1.dn_loss_cls: 0.1208 d1.dn_loss_bbox: 0.2168 d1.dn_loss_iou: 0.2507 d2.dn_loss_cls: 0.1028 d2.dn_loss_bbox: 0.1901 d2.dn_loss_iou: 0.2292 d3.dn_loss_cls: 0.0918 d3.dn_loss_bbox: 0.1794 d3.dn_loss_iou: 0.2211 d4.dn_loss_cls: 0.0859 d4.dn_loss_bbox: 0.1765 d4.dn_loss_iou: 0.2178 d1.loss_lmm_region: 0.1242 loss_lmm_image: 0.8695 2024/11/12 05:12:43 - mmengine - INFO - Iter(train) [ 73800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:28:09 time: 1.9769 data_time: 0.0182 memory: 33968 grad_norm: 29.4080 loss: 8.9778 loss_cls: 0.2665 loss_bbox: 0.1409 loss_iou: 0.2635 d0.loss_cls: 0.3086 d0.loss_bbox: 0.1507 d0.loss_iou: 0.2781 d1.loss_cls: 0.2855 d1.loss_bbox: 0.1422 d1.loss_iou: 0.2700 d2.loss_cls: 0.2784 d2.loss_bbox: 0.1383 d2.loss_iou: 0.2632 d3.loss_cls: 0.2690 d3.loss_bbox: 0.1416 d3.loss_iou: 0.2656 d4.loss_cls: 0.2706 d4.loss_bbox: 0.1388 d4.loss_iou: 0.2623 enc_loss_cls: 0.3200 enc_loss_bbox: 0.1634 enc_loss_iou: 0.3019 dn_loss_cls: 0.0810 dn_loss_bbox: 0.1542 dn_loss_iou: 0.2028 d0.dn_loss_cls: 0.1500 d0.dn_loss_bbox: 0.3007 d0.dn_loss_iou: 0.3451 d1.dn_loss_cls: 0.1047 d1.dn_loss_bbox: 0.1855 d1.dn_loss_iou: 0.2319 d2.dn_loss_cls: 0.0931 d2.dn_loss_bbox: 0.1630 d2.dn_loss_iou: 0.2119 d3.dn_loss_cls: 0.0843 d3.dn_loss_bbox: 0.1558 d3.dn_loss_iou: 0.2050 d4.dn_loss_cls: 0.0807 d4.dn_loss_bbox: 0.1543 d4.dn_loss_iou: 0.2030 d1.loss_lmm_region: 0.1092 loss_lmm_image: 0.8425 2024/11/12 05:16:01 - mmengine - INFO - Iter(train) [ 73900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:24:46 time: 1.9728 data_time: 0.0182 memory: 34909 grad_norm: 34.9339 loss: 8.4661 loss_cls: 0.2455 loss_bbox: 0.1289 loss_iou: 0.2110 d0.loss_cls: 0.2954 d0.loss_bbox: 0.1367 d0.loss_iou: 0.2231 d1.loss_cls: 0.2691 d1.loss_bbox: 0.1293 d1.loss_iou: 0.2142 d2.loss_cls: 0.2579 d2.loss_bbox: 0.1264 d2.loss_iou: 0.2108 d3.loss_cls: 0.2526 d3.loss_bbox: 0.1268 d3.loss_iou: 0.2100 d4.loss_cls: 0.2490 d4.loss_bbox: 0.1275 d4.loss_iou: 0.2108 enc_loss_cls: 0.2987 enc_loss_bbox: 0.1515 enc_loss_iou: 0.2444 dn_loss_cls: 0.1016 dn_loss_bbox: 0.1553 dn_loss_iou: 0.1903 d0.dn_loss_cls: 0.1786 d0.dn_loss_bbox: 0.3095 d0.dn_loss_iou: 0.3323 d1.dn_loss_cls: 0.1319 d1.dn_loss_bbox: 0.1884 d1.dn_loss_iou: 0.2206 d2.dn_loss_cls: 0.1116 d2.dn_loss_bbox: 0.1664 d2.dn_loss_iou: 0.2005 d3.dn_loss_cls: 0.1046 d3.dn_loss_bbox: 0.1568 d3.dn_loss_iou: 0.1927 d4.dn_loss_cls: 0.1006 d4.dn_loss_bbox: 0.1552 d4.dn_loss_iou: 0.1903 d1.loss_lmm_region: 0.1267 loss_lmm_image: 0.8328 2024/11/12 05:19:20 - 
mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 05:19:20 - mmengine - INFO - Iter(train) [ 74000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:21:24 time: 1.9886 data_time: 0.0181 memory: 34850 grad_norm: 29.3453 loss: 10.0786 loss_cls: 0.3273 loss_bbox: 0.1467 loss_iou: 0.2599 d0.loss_cls: 0.3711 d0.loss_bbox: 0.1614 d0.loss_iou: 0.2761 d1.loss_cls: 0.3560 d1.loss_bbox: 0.1498 d1.loss_iou: 0.2643 d2.loss_cls: 0.3361 d2.loss_bbox: 0.1494 d2.loss_iou: 0.2636 d3.loss_cls: 0.3289 d3.loss_bbox: 0.1499 d3.loss_iou: 0.2620 d4.loss_cls: 0.3257 d4.loss_bbox: 0.1470 d4.loss_iou: 0.2608 enc_loss_cls: 0.3792 enc_loss_bbox: 0.1748 enc_loss_iou: 0.2987 dn_loss_cls: 0.1385 dn_loss_bbox: 0.1713 dn_loss_iou: 0.2185 d0.dn_loss_cls: 0.2129 d0.dn_loss_bbox: 0.3140 d0.dn_loss_iou: 0.3587 d1.dn_loss_cls: 0.1644 d1.dn_loss_bbox: 0.2038 d1.dn_loss_iou: 0.2513 d2.dn_loss_cls: 0.1479 d2.dn_loss_bbox: 0.1829 d2.dn_loss_iou: 0.2297 d3.dn_loss_cls: 0.1454 d3.dn_loss_bbox: 0.1738 d3.dn_loss_iou: 0.2218 d4.dn_loss_cls: 0.1409 d4.dn_loss_bbox: 0.1712 d4.dn_loss_iou: 0.2185 d1.loss_lmm_region: 0.1574 loss_lmm_image: 0.8670 2024/11/12 05:22:40 - mmengine - INFO - Iter(train) [ 74100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:18:03 time: 2.0049 data_time: 0.0181 memory: 34589 grad_norm: 25.5560 loss: 10.8172 loss_cls: 0.3550 loss_bbox: 0.1688 loss_iou: 0.3020 d0.loss_cls: 0.4206 d0.loss_bbox: 0.1765 d0.loss_iou: 0.3245 d1.loss_cls: 0.3806 d1.loss_bbox: 0.1709 d1.loss_iou: 0.3145 d2.loss_cls: 0.3655 d2.loss_bbox: 0.1668 d2.loss_iou: 0.3047 d3.loss_cls: 0.3607 d3.loss_bbox: 0.1658 d3.loss_iou: 0.3001 d4.loss_cls: 0.3546 d4.loss_bbox: 0.1682 d4.loss_iou: 0.3012 enc_loss_cls: 0.4076 enc_loss_bbox: 0.1980 enc_loss_iou: 0.3550 dn_loss_cls: 0.1172 dn_loss_bbox: 0.1797 dn_loss_iou: 0.2437 d0.dn_loss_cls: 0.1963 d0.dn_loss_bbox: 0.3377 d0.dn_loss_iou: 0.3991 d1.dn_loss_cls: 0.1469 d1.dn_loss_bbox: 0.2189 d1.dn_loss_iou: 0.2818 d2.dn_loss_cls: 0.1278 d2.dn_loss_bbox: 0.1937 d2.dn_loss_iou: 0.2572 d3.dn_loss_cls: 0.1239 d3.dn_loss_bbox: 0.1843 d3.dn_loss_iou: 0.2480 d4.dn_loss_cls: 0.1182 d4.dn_loss_bbox: 0.1798 d4.dn_loss_iou: 0.2438 d1.loss_lmm_region: 0.1241 loss_lmm_image: 0.8336 2024/11/12 05:25:58 - mmengine - INFO - Iter(train) [ 74200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:14:39 time: 1.9771 data_time: 0.0182 memory: 34992 grad_norm: 26.7043 loss: 9.5022 loss_cls: 0.2792 loss_bbox: 0.1378 loss_iou: 0.2588 d0.loss_cls: 0.3324 d0.loss_bbox: 0.1494 d0.loss_iou: 0.2725 d1.loss_cls: 0.3034 d1.loss_bbox: 0.1384 d1.loss_iou: 0.2638 d2.loss_cls: 0.2928 d2.loss_bbox: 0.1356 d2.loss_iou: 0.2604 d3.loss_cls: 0.2837 d3.loss_bbox: 0.1370 d3.loss_iou: 0.2592 d4.loss_cls: 0.2776 d4.loss_bbox: 0.1371 d4.loss_iou: 0.2596 enc_loss_cls: 0.3308 enc_loss_bbox: 0.1681 enc_loss_iou: 0.3020 dn_loss_cls: 0.1163 dn_loss_bbox: 0.1600 dn_loss_iou: 0.2232 d0.dn_loss_cls: 0.1917 d0.dn_loss_bbox: 0.3131 d0.dn_loss_iou: 0.3758 d1.dn_loss_cls: 0.1444 d1.dn_loss_bbox: 0.1941 d1.dn_loss_iou: 0.2577 d2.dn_loss_cls: 0.1249 d2.dn_loss_bbox: 0.1716 d2.dn_loss_iou: 0.2347 d3.dn_loss_cls: 0.1191 d3.dn_loss_bbox: 0.1621 d3.dn_loss_iou: 0.2258 d4.dn_loss_cls: 0.1164 d4.dn_loss_bbox: 0.1599 d4.dn_loss_iou: 0.2233 d1.loss_lmm_region: 0.1078 loss_lmm_image: 0.9008 2024/11/12 05:29:16 - mmengine - INFO - Iter(train) [ 74300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:11:16 time: 1.9839 data_time: 0.0181 memory: 32994 grad_norm: 37.1295 loss: 9.8808 loss_cls: 0.2967 loss_bbox: 0.1374 
loss_iou: 0.2975 d0.loss_cls: 0.3360 d0.loss_bbox: 0.1494 d0.loss_iou: 0.3085 d1.loss_cls: 0.3175 d1.loss_bbox: 0.1395 d1.loss_iou: 0.3002 d2.loss_cls: 0.3082 d2.loss_bbox: 0.1364 d2.loss_iou: 0.2946 d3.loss_cls: 0.3015 d3.loss_bbox: 0.1363 d3.loss_iou: 0.2986 d4.loss_cls: 0.3009 d4.loss_bbox: 0.1372 d4.loss_iou: 0.2979 enc_loss_cls: 0.3344 enc_loss_bbox: 0.1670 enc_loss_iou: 0.3327 dn_loss_cls: 0.1062 dn_loss_bbox: 0.1716 dn_loss_iou: 0.2327 d0.dn_loss_cls: 0.1837 d0.dn_loss_bbox: 0.3138 d0.dn_loss_iou: 0.3700 d1.dn_loss_cls: 0.1390 d1.dn_loss_bbox: 0.2044 d1.dn_loss_iou: 0.2627 d2.dn_loss_cls: 0.1226 d2.dn_loss_bbox: 0.1823 d2.dn_loss_iou: 0.2419 d3.dn_loss_cls: 0.1135 d3.dn_loss_bbox: 0.1742 d3.dn_loss_iou: 0.2351 d4.dn_loss_cls: 0.1077 d4.dn_loss_bbox: 0.1716 d4.dn_loss_iou: 0.2326 d1.loss_lmm_region: 0.1221 loss_lmm_image: 0.8648 2024/11/12 05:32:34 - mmengine - INFO - Iter(train) [ 74400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:07:53 time: 2.0010 data_time: 0.0181 memory: 35379 grad_norm: 33.4861 loss: 9.9884 loss_cls: 0.3165 loss_bbox: 0.1424 loss_iou: 0.2510 d0.loss_cls: 0.3716 d0.loss_bbox: 0.1555 d0.loss_iou: 0.2692 d1.loss_cls: 0.3399 d1.loss_bbox: 0.1511 d1.loss_iou: 0.2586 d2.loss_cls: 0.3340 d2.loss_bbox: 0.1448 d2.loss_iou: 0.2503 d3.loss_cls: 0.3262 d3.loss_bbox: 0.1426 d3.loss_iou: 0.2505 d4.loss_cls: 0.3234 d4.loss_bbox: 0.1391 d4.loss_iou: 0.2508 enc_loss_cls: 0.3780 enc_loss_bbox: 0.1672 enc_loss_iou: 0.2850 dn_loss_cls: 0.1164 dn_loss_bbox: 0.1921 dn_loss_iou: 0.2218 d0.dn_loss_cls: 0.2011 d0.dn_loss_bbox: 0.3655 d0.dn_loss_iou: 0.3742 d1.dn_loss_cls: 0.1531 d1.dn_loss_bbox: 0.2353 d1.dn_loss_iou: 0.2592 d2.dn_loss_cls: 0.1293 d2.dn_loss_bbox: 0.2020 d2.dn_loss_iou: 0.2326 d3.dn_loss_cls: 0.1228 d3.dn_loss_bbox: 0.1952 d3.dn_loss_iou: 0.2247 d4.dn_loss_cls: 0.1187 d4.dn_loss_bbox: 0.1919 d4.dn_loss_iou: 0.2217 d1.loss_lmm_region: 0.1351 loss_lmm_image: 0.8478 2024/11/12 05:35:52 - mmengine - INFO - Iter(train) [ 74500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:04:29 time: 1.9655 data_time: 0.0183 memory: 33234 grad_norm: 44.8879 loss: 9.8853 loss_cls: 0.3006 loss_bbox: 0.1587 loss_iou: 0.2734 d0.loss_cls: 0.3419 d0.loss_bbox: 0.1759 d0.loss_iou: 0.2921 d1.loss_cls: 0.3176 d1.loss_bbox: 0.1668 d1.loss_iou: 0.2804 d2.loss_cls: 0.3084 d2.loss_bbox: 0.1620 d2.loss_iou: 0.2752 d3.loss_cls: 0.3008 d3.loss_bbox: 0.1611 d3.loss_iou: 0.2737 d4.loss_cls: 0.2969 d4.loss_bbox: 0.1613 d4.loss_iou: 0.2731 enc_loss_cls: 0.3398 enc_loss_bbox: 0.1931 enc_loss_iou: 0.3158 dn_loss_cls: 0.0950 dn_loss_bbox: 0.1779 dn_loss_iou: 0.2292 d0.dn_loss_cls: 0.1783 d0.dn_loss_bbox: 0.3520 d0.dn_loss_iou: 0.3839 d1.dn_loss_cls: 0.1274 d1.dn_loss_bbox: 0.2189 d1.dn_loss_iou: 0.2647 d2.dn_loss_cls: 0.1079 d2.dn_loss_bbox: 0.1872 d2.dn_loss_iou: 0.2389 d3.dn_loss_cls: 0.1003 d3.dn_loss_bbox: 0.1799 d3.dn_loss_iou: 0.2324 d4.dn_loss_cls: 0.0966 d4.dn_loss_bbox: 0.1780 d4.dn_loss_iou: 0.2292 d1.loss_lmm_region: 0.1520 loss_lmm_image: 0.7868 2024/11/12 05:39:11 - mmengine - INFO - Iter(train) [ 74600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 18:01:07 time: 1.9868 data_time: 0.0182 memory: 35333 grad_norm: 30.4820 loss: 8.8263 loss_cls: 0.2683 loss_bbox: 0.1293 loss_iou: 0.2130 d0.loss_cls: 0.3169 d0.loss_bbox: 0.1345 d0.loss_iou: 0.2217 d1.loss_cls: 0.2847 d1.loss_bbox: 0.1336 d1.loss_iou: 0.2174 d2.loss_cls: 0.2732 d2.loss_bbox: 0.1327 d2.loss_iou: 0.2163 d3.loss_cls: 0.2748 d3.loss_bbox: 0.1296 d3.loss_iou: 0.2138 d4.loss_cls: 0.2702 d4.loss_bbox: 0.1277 
d4.loss_iou: 0.2128 enc_loss_cls: 0.3074 enc_loss_bbox: 0.1531 enc_loss_iou: 0.2463 dn_loss_cls: 0.1139 dn_loss_bbox: 0.1642 dn_loss_iou: 0.2033 d0.dn_loss_cls: 0.1969 d0.dn_loss_bbox: 0.2998 d0.dn_loss_iou: 0.3361 d1.dn_loss_cls: 0.1428 d1.dn_loss_bbox: 0.1922 d1.dn_loss_iou: 0.2280 d2.dn_loss_cls: 0.1264 d2.dn_loss_bbox: 0.1703 d2.dn_loss_iou: 0.2096 d3.dn_loss_cls: 0.1220 d3.dn_loss_bbox: 0.1665 d3.dn_loss_iou: 0.2053 d4.dn_loss_cls: 0.1173 d4.dn_loss_bbox: 0.1642 d4.dn_loss_iou: 0.2033 d1.loss_lmm_region: 0.1512 loss_lmm_image: 0.8355 2024/11/12 05:42:28 - mmengine - INFO - Iter(train) [ 74700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:57:43 time: 1.9762 data_time: 0.0182 memory: 35256 grad_norm: 27.5629 loss: 9.8574 loss_cls: 0.3096 loss_bbox: 0.1427 loss_iou: 0.2685 d0.loss_cls: 0.3650 d0.loss_bbox: 0.1588 d0.loss_iou: 0.2841 d1.loss_cls: 0.3322 d1.loss_bbox: 0.1485 d1.loss_iou: 0.2705 d2.loss_cls: 0.3180 d2.loss_bbox: 0.1474 d2.loss_iou: 0.2709 d3.loss_cls: 0.3120 d3.loss_bbox: 0.1440 d3.loss_iou: 0.2675 d4.loss_cls: 0.3089 d4.loss_bbox: 0.1419 d4.loss_iou: 0.2686 enc_loss_cls: 0.3664 enc_loss_bbox: 0.1739 enc_loss_iou: 0.3092 dn_loss_cls: 0.1119 dn_loss_bbox: 0.1624 dn_loss_iou: 0.2256 d0.dn_loss_cls: 0.1948 d0.dn_loss_bbox: 0.3280 d0.dn_loss_iou: 0.3797 d1.dn_loss_cls: 0.1482 d1.dn_loss_bbox: 0.2017 d1.dn_loss_iou: 0.2610 d2.dn_loss_cls: 0.1245 d2.dn_loss_bbox: 0.1759 d2.dn_loss_iou: 0.2374 d3.dn_loss_cls: 0.1166 d3.dn_loss_bbox: 0.1653 d3.dn_loss_iou: 0.2287 d4.dn_loss_cls: 0.1131 d4.dn_loss_bbox: 0.1625 d4.dn_loss_iou: 0.2257 d1.loss_lmm_region: 0.1327 loss_lmm_image: 0.8530 2024/11/12 05:45:48 - mmengine - INFO - Iter(train) [ 74800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:54:21 time: 1.9939 data_time: 0.0182 memory: 34804 grad_norm: 28.1131 loss: 9.3185 loss_cls: 0.3205 loss_bbox: 0.1402 loss_iou: 0.2401 d0.loss_cls: 0.3572 d0.loss_bbox: 0.1454 d0.loss_iou: 0.2483 d1.loss_cls: 0.3311 d1.loss_bbox: 0.1520 d1.loss_iou: 0.2492 d2.loss_cls: 0.3312 d2.loss_bbox: 0.1443 d2.loss_iou: 0.2395 d3.loss_cls: 0.3220 d3.loss_bbox: 0.1453 d3.loss_iou: 0.2395 d4.loss_cls: 0.3247 d4.loss_bbox: 0.1392 d4.loss_iou: 0.2393 enc_loss_cls: 0.3503 enc_loss_bbox: 0.1670 enc_loss_iou: 0.2758 dn_loss_cls: 0.0877 dn_loss_bbox: 0.1685 dn_loss_iou: 0.2042 d0.dn_loss_cls: 0.1703 d0.dn_loss_bbox: 0.3094 d0.dn_loss_iou: 0.3343 d1.dn_loss_cls: 0.1195 d1.dn_loss_bbox: 0.1973 d1.dn_loss_iou: 0.2311 d2.dn_loss_cls: 0.0979 d2.dn_loss_bbox: 0.1767 d2.dn_loss_iou: 0.2124 d3.dn_loss_cls: 0.0927 d3.dn_loss_bbox: 0.1706 d3.dn_loss_iou: 0.2061 d4.dn_loss_cls: 0.0891 d4.dn_loss_bbox: 0.1686 d4.dn_loss_iou: 0.2042 d1.loss_lmm_region: 0.1460 loss_lmm_image: 0.8298 2024/11/12 05:49:05 - mmengine - INFO - Iter(train) [ 74900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:50:58 time: 1.9881 data_time: 0.0181 memory: 34584 grad_norm: 30.8268 loss: 10.1692 loss_cls: 0.2826 loss_bbox: 0.1592 loss_iou: 0.2645 d0.loss_cls: 0.3341 d0.loss_bbox: 0.1748 d0.loss_iou: 0.2885 d1.loss_cls: 0.3101 d1.loss_bbox: 0.1635 d1.loss_iou: 0.2699 d2.loss_cls: 0.2951 d2.loss_bbox: 0.1620 d2.loss_iou: 0.2675 d3.loss_cls: 0.2852 d3.loss_bbox: 0.1603 d3.loss_iou: 0.2655 d4.loss_cls: 0.2840 d4.loss_bbox: 0.1590 d4.loss_iou: 0.2641 enc_loss_cls: 0.3383 enc_loss_bbox: 0.1917 enc_loss_iou: 0.3110 dn_loss_cls: 0.1416 dn_loss_bbox: 0.2010 dn_loss_iou: 0.2364 d0.dn_loss_cls: 0.2182 d0.dn_loss_bbox: 0.3409 d0.dn_loss_iou: 0.3729 d1.dn_loss_cls: 0.1676 d1.dn_loss_bbox: 0.2311 d1.dn_loss_iou: 0.2653 d2.dn_loss_cls: 
0.1521 d2.dn_loss_bbox: 0.2086 d2.dn_loss_iou: 0.2457 d3.dn_loss_cls: 0.1436 d3.dn_loss_bbox: 0.2024 d3.dn_loss_iou: 0.2385 d4.dn_loss_cls: 0.1405 d4.dn_loss_bbox: 0.2009 d4.dn_loss_iou: 0.2364 d1.loss_lmm_region: 0.1384 loss_lmm_image: 0.8560 2024/11/12 05:52:25 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 05:52:25 - mmengine - INFO - Iter(train) [ 75000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:47:36 time: 1.9956 data_time: 0.0182 memory: 34608 grad_norm: 30.9551 loss: 8.4821 loss_cls: 0.2499 loss_bbox: 0.1141 loss_iou: 0.2173 d0.loss_cls: 0.2882 d0.loss_bbox: 0.1343 d0.loss_iou: 0.2445 d1.loss_cls: 0.2623 d1.loss_bbox: 0.1213 d1.loss_iou: 0.2263 d2.loss_cls: 0.2551 d2.loss_bbox: 0.1175 d2.loss_iou: 0.2224 d3.loss_cls: 0.2552 d3.loss_bbox: 0.1148 d3.loss_iou: 0.2176 d4.loss_cls: 0.2499 d4.loss_bbox: 0.1150 d4.loss_iou: 0.2184 enc_loss_cls: 0.2958 enc_loss_bbox: 0.1401 enc_loss_iou: 0.2622 dn_loss_cls: 0.0921 dn_loss_bbox: 0.1486 dn_loss_iou: 0.1961 d0.dn_loss_cls: 0.1699 d0.dn_loss_bbox: 0.3188 d0.dn_loss_iou: 0.3550 d1.dn_loss_cls: 0.1211 d1.dn_loss_bbox: 0.1888 d1.dn_loss_iou: 0.2319 d2.dn_loss_cls: 0.1057 d2.dn_loss_bbox: 0.1626 d2.dn_loss_iou: 0.2079 d3.dn_loss_cls: 0.0975 d3.dn_loss_bbox: 0.1508 d3.dn_loss_iou: 0.1992 d4.dn_loss_cls: 0.0938 d4.dn_loss_bbox: 0.1486 d4.dn_loss_iou: 0.1960 d1.loss_lmm_region: 0.1176 loss_lmm_image: 0.8579 2024/11/12 05:55:44 - mmengine - INFO - Iter(train) [ 75100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:44:14 time: 2.0054 data_time: 0.0182 memory: 33340 grad_norm: 30.0085 loss: 8.4348 loss_cls: 0.2577 loss_bbox: 0.1209 loss_iou: 0.2233 d0.loss_cls: 0.2986 d0.loss_bbox: 0.1272 d0.loss_iou: 0.2336 d1.loss_cls: 0.2732 d1.loss_bbox: 0.1249 d1.loss_iou: 0.2283 d2.loss_cls: 0.2604 d2.loss_bbox: 0.1217 d2.loss_iou: 0.2245 d3.loss_cls: 0.2590 d3.loss_bbox: 0.1216 d3.loss_iou: 0.2235 d4.loss_cls: 0.2572 d4.loss_bbox: 0.1214 d4.loss_iou: 0.2227 enc_loss_cls: 0.2966 enc_loss_bbox: 0.1364 enc_loss_iou: 0.2482 dn_loss_cls: 0.1110 dn_loss_bbox: 0.1388 dn_loss_iou: 0.1929 d0.dn_loss_cls: 0.1827 d0.dn_loss_bbox: 0.2558 d0.dn_loss_iou: 0.3258 d1.dn_loss_cls: 0.1385 d1.dn_loss_bbox: 0.1630 d1.dn_loss_iou: 0.2196 d2.dn_loss_cls: 0.1220 d2.dn_loss_bbox: 0.1456 d2.dn_loss_iou: 0.2001 d3.dn_loss_cls: 0.1160 d3.dn_loss_bbox: 0.1410 d3.dn_loss_iou: 0.1949 d4.dn_loss_cls: 0.1110 d4.dn_loss_bbox: 0.1388 d4.dn_loss_iou: 0.1930 d1.loss_lmm_region: 0.1347 loss_lmm_image: 0.8286 2024/11/12 05:59:03 - mmengine - INFO - Iter(train) [ 75200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:40:52 time: 1.9883 data_time: 0.0180 memory: 34810 grad_norm: 31.6337 loss: 8.8230 loss_cls: 0.2793 loss_bbox: 0.1169 loss_iou: 0.1846 d0.loss_cls: 0.3221 d0.loss_bbox: 0.1315 d0.loss_iou: 0.2041 d1.loss_cls: 0.3015 d1.loss_bbox: 0.1232 d1.loss_iou: 0.1918 d2.loss_cls: 0.2929 d2.loss_bbox: 0.1201 d2.loss_iou: 0.1870 d3.loss_cls: 0.2848 d3.loss_bbox: 0.1180 d3.loss_iou: 0.1856 d4.loss_cls: 0.2808 d4.loss_bbox: 0.1178 d4.loss_iou: 0.1854 enc_loss_cls: 0.3242 enc_loss_bbox: 0.1432 enc_loss_iou: 0.2222 dn_loss_cls: 0.1299 dn_loss_bbox: 0.1654 dn_loss_iou: 0.1974 d0.dn_loss_cls: 0.2200 d0.dn_loss_bbox: 0.3387 d0.dn_loss_iou: 0.3447 d1.dn_loss_cls: 0.1583 d1.dn_loss_bbox: 0.2057 d1.dn_loss_iou: 0.2297 d2.dn_loss_cls: 0.1439 d2.dn_loss_bbox: 0.1774 d2.dn_loss_iou: 0.2068 d3.dn_loss_cls: 0.1368 d3.dn_loss_bbox: 0.1682 d3.dn_loss_iou: 0.1998 d4.dn_loss_cls: 0.1298 d4.dn_loss_bbox: 0.1656 d4.dn_loss_iou: 0.1973 d1.loss_lmm_region: 
0.1467 loss_lmm_image: 0.8444 2024/11/12 06:02:23 - mmengine - INFO - Iter(train) [ 75300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:37:31 time: 2.0094 data_time: 0.0181 memory: 33161 grad_norm: 28.5702 loss: 9.6095 loss_cls: 0.2607 loss_bbox: 0.1455 loss_iou: 0.2477 d0.loss_cls: 0.2926 d0.loss_bbox: 0.1548 d0.loss_iou: 0.2610 d1.loss_cls: 0.2787 d1.loss_bbox: 0.1465 d1.loss_iou: 0.2512 d2.loss_cls: 0.2681 d2.loss_bbox: 0.1482 d2.loss_iou: 0.2521 d3.loss_cls: 0.2635 d3.loss_bbox: 0.1455 d3.loss_iou: 0.2470 d4.loss_cls: 0.2625 d4.loss_bbox: 0.1449 d4.loss_iou: 0.2473 enc_loss_cls: 0.2995 enc_loss_bbox: 0.1729 enc_loss_iou: 0.2883 dn_loss_cls: 0.1030 dn_loss_bbox: 0.2116 dn_loss_iou: 0.2391 d0.dn_loss_cls: 0.1864 d0.dn_loss_bbox: 0.3705 d0.dn_loss_iou: 0.3885 d1.dn_loss_cls: 0.1358 d1.dn_loss_bbox: 0.2468 d1.dn_loss_iou: 0.2732 d2.dn_loss_cls: 0.1144 d2.dn_loss_bbox: 0.2209 d2.dn_loss_iou: 0.2489 d3.dn_loss_cls: 0.1073 d3.dn_loss_bbox: 0.2137 d3.dn_loss_iou: 0.2408 d4.dn_loss_cls: 0.1038 d4.dn_loss_bbox: 0.2116 d4.dn_loss_iou: 0.2391 d1.loss_lmm_region: 0.1181 loss_lmm_image: 0.8575 2024/11/12 06:05:44 - mmengine - INFO - Iter(train) [ 75400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:34:10 time: 2.0265 data_time: 0.0181 memory: 33606 grad_norm: 31.6902 loss: 9.1412 loss_cls: 0.2840 loss_bbox: 0.1314 loss_iou: 0.2302 d0.loss_cls: 0.3303 d0.loss_bbox: 0.1372 d0.loss_iou: 0.2449 d1.loss_cls: 0.3053 d1.loss_bbox: 0.1374 d1.loss_iou: 0.2375 d2.loss_cls: 0.2957 d2.loss_bbox: 0.1330 d2.loss_iou: 0.2319 d3.loss_cls: 0.2894 d3.loss_bbox: 0.1316 d3.loss_iou: 0.2307 d4.loss_cls: 0.2865 d4.loss_bbox: 0.1306 d4.loss_iou: 0.2296 enc_loss_cls: 0.3358 enc_loss_bbox: 0.1527 enc_loss_iou: 0.2672 dn_loss_cls: 0.1171 dn_loss_bbox: 0.1571 dn_loss_iou: 0.2098 d0.dn_loss_cls: 0.1967 d0.dn_loss_bbox: 0.3036 d0.dn_loss_iou: 0.3533 d1.dn_loss_cls: 0.1456 d1.dn_loss_bbox: 0.1918 d1.dn_loss_iou: 0.2437 d2.dn_loss_cls: 0.1262 d2.dn_loss_bbox: 0.1679 d2.dn_loss_iou: 0.2210 d3.dn_loss_cls: 0.1190 d3.dn_loss_bbox: 0.1594 d3.dn_loss_iou: 0.2130 d4.dn_loss_cls: 0.1172 d4.dn_loss_bbox: 0.1570 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.1441 loss_lmm_image: 0.8351 2024/11/12 06:09:03 - mmengine - INFO - Iter(train) [ 75500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:30:48 time: 1.9773 data_time: 0.0181 memory: 33824 grad_norm: 27.4896 loss: 10.4337 loss_cls: 0.3321 loss_bbox: 0.1649 loss_iou: 0.2974 d0.loss_cls: 0.3691 d0.loss_bbox: 0.1795 d0.loss_iou: 0.3074 d1.loss_cls: 0.3423 d1.loss_bbox: 0.1749 d1.loss_iou: 0.3072 d2.loss_cls: 0.3404 d2.loss_bbox: 0.1644 d2.loss_iou: 0.2991 d3.loss_cls: 0.3370 d3.loss_bbox: 0.1619 d3.loss_iou: 0.2957 d4.loss_cls: 0.3344 d4.loss_bbox: 0.1649 d4.loss_iou: 0.2984 enc_loss_cls: 0.3765 enc_loss_bbox: 0.1937 enc_loss_iou: 0.3310 dn_loss_cls: 0.1062 dn_loss_bbox: 0.1881 dn_loss_iou: 0.2396 d0.dn_loss_cls: 0.1823 d0.dn_loss_bbox: 0.3428 d0.dn_loss_iou: 0.3823 d1.dn_loss_cls: 0.1364 d1.dn_loss_bbox: 0.2190 d1.dn_loss_iou: 0.2680 d2.dn_loss_cls: 0.1164 d2.dn_loss_bbox: 0.1985 d2.dn_loss_iou: 0.2492 d3.dn_loss_cls: 0.1129 d3.dn_loss_bbox: 0.1902 d3.dn_loss_iou: 0.2421 d4.dn_loss_cls: 0.1073 d4.dn_loss_bbox: 0.1881 d4.dn_loss_iou: 0.2395 d1.loss_lmm_region: 0.1276 loss_lmm_image: 0.8252 2024/11/12 06:12:22 - mmengine - INFO - Iter(train) [ 75600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:27:26 time: 1.9966 data_time: 0.0181 memory: 34108 grad_norm: 30.1065 loss: 9.5094 loss_cls: 0.2948 loss_bbox: 0.1268 loss_iou: 0.2402 d0.loss_cls: 0.3360 
d0.loss_bbox: 0.1401 d0.loss_iou: 0.2514 d1.loss_cls: 0.3202 d1.loss_bbox: 0.1355 d1.loss_iou: 0.2477 d2.loss_cls: 0.2973 d2.loss_bbox: 0.1321 d2.loss_iou: 0.2434 d3.loss_cls: 0.2991 d3.loss_bbox: 0.1263 d3.loss_iou: 0.2399 d4.loss_cls: 0.2891 d4.loss_bbox: 0.1263 d4.loss_iou: 0.2402 enc_loss_cls: 0.3374 enc_loss_bbox: 0.1561 enc_loss_iou: 0.2743 dn_loss_cls: 0.1156 dn_loss_bbox: 0.1719 dn_loss_iou: 0.2265 d0.dn_loss_cls: 0.1879 d0.dn_loss_bbox: 0.3425 d0.dn_loss_iou: 0.3792 d1.dn_loss_cls: 0.1411 d1.dn_loss_bbox: 0.2087 d1.dn_loss_iou: 0.2596 d2.dn_loss_cls: 0.1247 d2.dn_loss_bbox: 0.1844 d2.dn_loss_iou: 0.2366 d3.dn_loss_cls: 0.1202 d3.dn_loss_bbox: 0.1748 d3.dn_loss_iou: 0.2296 d4.dn_loss_cls: 0.1179 d4.dn_loss_bbox: 0.1720 d4.dn_loss_iou: 0.2265 d1.loss_lmm_region: 0.1299 loss_lmm_image: 0.9061 2024/11/12 06:15:42 - mmengine - INFO - Iter(train) [ 75700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:24:04 time: 1.9818 data_time: 0.0181 memory: 33775 grad_norm: 40.2564 loss: 9.2534 loss_cls: 0.2772 loss_bbox: 0.1255 loss_iou: 0.2302 d0.loss_cls: 0.3174 d0.loss_bbox: 0.1307 d0.loss_iou: 0.2383 d1.loss_cls: 0.2912 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2335 d2.loss_cls: 0.2816 d2.loss_bbox: 0.1264 d2.loss_iou: 0.2318 d3.loss_cls: 0.2803 d3.loss_bbox: 0.1240 d3.loss_iou: 0.2310 d4.loss_cls: 0.2771 d4.loss_bbox: 0.1254 d4.loss_iou: 0.2303 enc_loss_cls: 0.3184 enc_loss_bbox: 0.1414 enc_loss_iou: 0.2564 dn_loss_cls: 0.1225 dn_loss_bbox: 0.1706 dn_loss_iou: 0.2153 d0.dn_loss_cls: 0.2063 d0.dn_loss_bbox: 0.3343 d0.dn_loss_iou: 0.3643 d1.dn_loss_cls: 0.1543 d1.dn_loss_bbox: 0.2078 d1.dn_loss_iou: 0.2501 d2.dn_loss_cls: 0.1366 d2.dn_loss_bbox: 0.1803 d2.dn_loss_iou: 0.2271 d3.dn_loss_cls: 0.1305 d3.dn_loss_bbox: 0.1719 d3.dn_loss_iou: 0.2183 d4.dn_loss_cls: 0.1242 d4.dn_loss_bbox: 0.1706 d4.dn_loss_iou: 0.2153 d1.loss_lmm_region: 0.1570 loss_lmm_image: 0.9004 2024/11/12 06:19:00 - mmengine - INFO - Iter(train) [ 75800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:20:42 time: 1.9702 data_time: 0.0182 memory: 33693 grad_norm: 29.0772 loss: 9.4926 loss_cls: 0.2864 loss_bbox: 0.1533 loss_iou: 0.2520 d0.loss_cls: 0.3280 d0.loss_bbox: 0.1652 d0.loss_iou: 0.2673 d1.loss_cls: 0.3053 d1.loss_bbox: 0.1532 d1.loss_iou: 0.2546 d2.loss_cls: 0.2944 d2.loss_bbox: 0.1533 d2.loss_iou: 0.2509 d3.loss_cls: 0.2893 d3.loss_bbox: 0.1532 d3.loss_iou: 0.2512 d4.loss_cls: 0.2878 d4.loss_bbox: 0.1543 d4.loss_iou: 0.2521 enc_loss_cls: 0.3348 enc_loss_bbox: 0.1771 enc_loss_iou: 0.2838 dn_loss_cls: 0.0991 dn_loss_bbox: 0.1789 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.1923 d0.dn_loss_bbox: 0.3550 d0.dn_loss_iou: 0.3596 d1.dn_loss_cls: 0.1328 d1.dn_loss_bbox: 0.2145 d1.dn_loss_iou: 0.2448 d2.dn_loss_cls: 0.1125 d2.dn_loss_bbox: 0.1873 d2.dn_loss_iou: 0.2222 d3.dn_loss_cls: 0.1026 d3.dn_loss_bbox: 0.1809 d3.dn_loss_iou: 0.2165 d4.dn_loss_cls: 0.1008 d4.dn_loss_bbox: 0.1790 d4.dn_loss_iou: 0.2141 d1.loss_lmm_region: 0.1411 loss_lmm_image: 0.7970 2024/11/12 06:22:21 - mmengine - INFO - Iter(train) [ 75900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:17:21 time: 2.0276 data_time: 0.0184 memory: 34132 grad_norm: 26.1425 loss: 9.4914 loss_cls: 0.2774 loss_bbox: 0.1487 loss_iou: 0.2654 d0.loss_cls: 0.3207 d0.loss_bbox: 0.1587 d0.loss_iou: 0.2746 d1.loss_cls: 0.3067 d1.loss_bbox: 0.1471 d1.loss_iou: 0.2653 d2.loss_cls: 0.2906 d2.loss_bbox: 0.1493 d2.loss_iou: 0.2654 d3.loss_cls: 0.2810 d3.loss_bbox: 0.1494 d3.loss_iou: 0.2657 d4.loss_cls: 0.2775 d4.loss_bbox: 0.1495 d4.loss_iou: 0.2663 enc_loss_cls: 0.3232 
enc_loss_bbox: 0.1739 enc_loss_iou: 0.2980 dn_loss_cls: 0.0892 dn_loss_bbox: 0.1746 dn_loss_iou: 0.2284 d0.dn_loss_cls: 0.1690 d0.dn_loss_bbox: 0.3406 d0.dn_loss_iou: 0.3812 d1.dn_loss_cls: 0.1188 d1.dn_loss_bbox: 0.2105 d1.dn_loss_iou: 0.2601 d2.dn_loss_cls: 0.0993 d2.dn_loss_bbox: 0.1856 d2.dn_loss_iou: 0.2384 d3.dn_loss_cls: 0.0914 d3.dn_loss_bbox: 0.1767 d3.dn_loss_iou: 0.2308 d4.dn_loss_cls: 0.0891 d4.dn_loss_bbox: 0.1746 d4.dn_loss_iou: 0.2285 d1.loss_lmm_region: 0.1284 loss_lmm_image: 0.8222 2024/11/12 06:25:41 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 06:25:41 - mmengine - INFO - Iter(train) [ 76000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:14:00 time: 1.9865 data_time: 0.0181 memory: 34735 grad_norm: 27.2773 loss: 8.9441 loss_cls: 0.2857 loss_bbox: 0.1232 loss_iou: 0.2364 d0.loss_cls: 0.3285 d0.loss_bbox: 0.1341 d0.loss_iou: 0.2467 d1.loss_cls: 0.3006 d1.loss_bbox: 0.1248 d1.loss_iou: 0.2376 d2.loss_cls: 0.2894 d2.loss_bbox: 0.1253 d2.loss_iou: 0.2378 d3.loss_cls: 0.2810 d3.loss_bbox: 0.1265 d3.loss_iou: 0.2393 d4.loss_cls: 0.2809 d4.loss_bbox: 0.1248 d4.loss_iou: 0.2376 enc_loss_cls: 0.3262 enc_loss_bbox: 0.1467 enc_loss_iou: 0.2711 dn_loss_cls: 0.1175 dn_loss_bbox: 0.1569 dn_loss_iou: 0.2134 d0.dn_loss_cls: 0.1869 d0.dn_loss_bbox: 0.2764 d0.dn_loss_iou: 0.3381 d1.dn_loss_cls: 0.1375 d1.dn_loss_bbox: 0.1748 d1.dn_loss_iou: 0.2373 d2.dn_loss_cls: 0.1205 d2.dn_loss_bbox: 0.1596 d2.dn_loss_iou: 0.2200 d3.dn_loss_cls: 0.1174 d3.dn_loss_bbox: 0.1564 d3.dn_loss_iou: 0.2151 d4.dn_loss_cls: 0.1165 d4.dn_loss_bbox: 0.1569 d4.dn_loss_iou: 0.2134 d1.loss_lmm_region: 0.1392 loss_lmm_image: 0.7858 2024/11/12 06:29:01 - mmengine - INFO - Iter(train) [ 76100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:10:38 time: 1.9900 data_time: 0.0181 memory: 34421 grad_norm: 30.1413 loss: 8.8152 loss_cls: 0.2513 loss_bbox: 0.1282 loss_iou: 0.2475 d0.loss_cls: 0.3008 d0.loss_bbox: 0.1331 d0.loss_iou: 0.2590 d1.loss_cls: 0.2691 d1.loss_bbox: 0.1292 d1.loss_iou: 0.2537 d2.loss_cls: 0.2567 d2.loss_bbox: 0.1289 d2.loss_iou: 0.2500 d3.loss_cls: 0.2549 d3.loss_bbox: 0.1275 d3.loss_iou: 0.2440 d4.loss_cls: 0.2534 d4.loss_bbox: 0.1259 d4.loss_iou: 0.2479 enc_loss_cls: 0.3022 enc_loss_bbox: 0.1520 enc_loss_iou: 0.2837 dn_loss_cls: 0.0818 dn_loss_bbox: 0.1568 dn_loss_iou: 0.2175 d0.dn_loss_cls: 0.1625 d0.dn_loss_bbox: 0.3130 d0.dn_loss_iou: 0.3736 d1.dn_loss_cls: 0.1133 d1.dn_loss_bbox: 0.1908 d1.dn_loss_iou: 0.2508 d2.dn_loss_cls: 0.0939 d2.dn_loss_bbox: 0.1671 d2.dn_loss_iou: 0.2290 d3.dn_loss_cls: 0.0861 d3.dn_loss_bbox: 0.1584 d3.dn_loss_iou: 0.2208 d4.dn_loss_cls: 0.0826 d4.dn_loss_bbox: 0.1567 d4.dn_loss_iou: 0.2174 d1.loss_lmm_region: 0.1331 loss_lmm_image: 0.8110 2024/11/12 06:32:19 - mmengine - INFO - Iter(train) [ 76200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:07:16 time: 2.0000 data_time: 0.0181 memory: 33170 grad_norm: 27.7523 loss: 9.0305 loss_cls: 0.2947 loss_bbox: 0.1120 loss_iou: 0.2073 d0.loss_cls: 0.3153 d0.loss_bbox: 0.1369 d0.loss_iou: 0.2315 d1.loss_cls: 0.3061 d1.loss_bbox: 0.1268 d1.loss_iou: 0.2207 d2.loss_cls: 0.3035 d2.loss_bbox: 0.1165 d2.loss_iou: 0.2100 d3.loss_cls: 0.3019 d3.loss_bbox: 0.1122 d3.loss_iou: 0.2087 d4.loss_cls: 0.2935 d4.loss_bbox: 0.1140 d4.loss_iou: 0.2077 enc_loss_cls: 0.3303 enc_loss_bbox: 0.1480 enc_loss_iou: 0.2521 dn_loss_cls: 0.1214 dn_loss_bbox: 0.1618 dn_loss_iou: 0.2190 d0.dn_loss_cls: 0.1996 d0.dn_loss_bbox: 0.3064 d0.dn_loss_iou: 0.3608 d1.dn_loss_cls: 0.1541 d1.dn_loss_bbox: 
0.1946 d1.dn_loss_iou: 0.2486 d2.dn_loss_cls: 0.1340 d2.dn_loss_bbox: 0.1709 d2.dn_loss_iou: 0.2278 d3.dn_loss_cls: 0.1275 d3.dn_loss_bbox: 0.1628 d3.dn_loss_iou: 0.2210 d4.dn_loss_cls: 0.1232 d4.dn_loss_bbox: 0.1619 d4.dn_loss_iou: 0.2190 d1.loss_lmm_region: 0.1299 loss_lmm_image: 0.8367 2024/11/12 06:35:38 - mmengine - INFO - Iter(train) [ 76300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:03:54 time: 1.9931 data_time: 0.0182 memory: 34362 grad_norm: 29.3302 loss: 8.7401 loss_cls: 0.2524 loss_bbox: 0.1247 loss_iou: 0.2335 d0.loss_cls: 0.2988 d0.loss_bbox: 0.1425 d0.loss_iou: 0.2466 d1.loss_cls: 0.2705 d1.loss_bbox: 0.1324 d1.loss_iou: 0.2408 d2.loss_cls: 0.2580 d2.loss_bbox: 0.1279 d2.loss_iou: 0.2372 d3.loss_cls: 0.2526 d3.loss_bbox: 0.1266 d3.loss_iou: 0.2363 d4.loss_cls: 0.2550 d4.loss_bbox: 0.1230 d4.loss_iou: 0.2327 enc_loss_cls: 0.3015 enc_loss_bbox: 0.1546 enc_loss_iou: 0.2665 dn_loss_cls: 0.0916 dn_loss_bbox: 0.1663 dn_loss_iou: 0.2057 d0.dn_loss_cls: 0.1612 d0.dn_loss_bbox: 0.3007 d0.dn_loss_iou: 0.3353 d1.dn_loss_cls: 0.1195 d1.dn_loss_bbox: 0.1970 d1.dn_loss_iou: 0.2317 d2.dn_loss_cls: 0.1027 d2.dn_loss_bbox: 0.1777 d2.dn_loss_iou: 0.2140 d3.dn_loss_cls: 0.0937 d3.dn_loss_bbox: 0.1686 d3.dn_loss_iou: 0.2079 d4.dn_loss_cls: 0.0914 d4.dn_loss_bbox: 0.1662 d4.dn_loss_iou: 0.2057 d1.loss_lmm_region: 0.1149 loss_lmm_image: 0.8743 2024/11/12 06:38:58 - mmengine - INFO - Iter(train) [ 76400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 17:00:33 time: 2.0030 data_time: 0.0182 memory: 34666 grad_norm: 30.1325 loss: 8.8300 loss_cls: 0.2827 loss_bbox: 0.1073 loss_iou: 0.2108 d0.loss_cls: 0.3213 d0.loss_bbox: 0.1170 d0.loss_iou: 0.2200 d1.loss_cls: 0.3005 d1.loss_bbox: 0.1139 d1.loss_iou: 0.2152 d2.loss_cls: 0.2968 d2.loss_bbox: 0.1087 d2.loss_iou: 0.2110 d3.loss_cls: 0.2865 d3.loss_bbox: 0.1075 d3.loss_iou: 0.2099 d4.loss_cls: 0.2835 d4.loss_bbox: 0.1068 d4.loss_iou: 0.2099 enc_loss_cls: 0.3259 enc_loss_bbox: 0.1333 enc_loss_iou: 0.2415 dn_loss_cls: 0.1304 dn_loss_bbox: 0.1535 dn_loss_iou: 0.2067 d0.dn_loss_cls: 0.2021 d0.dn_loss_bbox: 0.3112 d0.dn_loss_iou: 0.3536 d1.dn_loss_cls: 0.1578 d1.dn_loss_bbox: 0.1862 d1.dn_loss_iou: 0.2375 d2.dn_loss_cls: 0.1389 d2.dn_loss_bbox: 0.1633 d2.dn_loss_iou: 0.2160 d3.dn_loss_cls: 0.1315 d3.dn_loss_bbox: 0.1551 d3.dn_loss_iou: 0.2093 d4.dn_loss_cls: 0.1298 d4.dn_loss_bbox: 0.1534 d4.dn_loss_iou: 0.2068 d1.loss_lmm_region: 0.1217 loss_lmm_image: 0.8557 2024/11/12 06:42:18 - mmengine - INFO - Iter(train) [ 76500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:57:11 time: 1.9865 data_time: 0.0181 memory: 33461 grad_norm: 27.8891 loss: 8.9736 loss_cls: 0.2610 loss_bbox: 0.1221 loss_iou: 0.2361 d0.loss_cls: 0.3030 d0.loss_bbox: 0.1333 d0.loss_iou: 0.2505 d1.loss_cls: 0.2817 d1.loss_bbox: 0.1269 d1.loss_iou: 0.2442 d2.loss_cls: 0.2703 d2.loss_bbox: 0.1251 d2.loss_iou: 0.2395 d3.loss_cls: 0.2629 d3.loss_bbox: 0.1234 d3.loss_iou: 0.2375 d4.loss_cls: 0.2614 d4.loss_bbox: 0.1227 d4.loss_iou: 0.2361 enc_loss_cls: 0.3043 enc_loss_bbox: 0.1518 enc_loss_iou: 0.2759 dn_loss_cls: 0.1255 dn_loss_bbox: 0.1578 dn_loss_iou: 0.2121 d0.dn_loss_cls: 0.1924 d0.dn_loss_bbox: 0.2952 d0.dn_loss_iou: 0.3441 d1.dn_loss_cls: 0.1515 d1.dn_loss_bbox: 0.1842 d1.dn_loss_iou: 0.2379 d2.dn_loss_cls: 0.1331 d2.dn_loss_bbox: 0.1663 d2.dn_loss_iou: 0.2206 d3.dn_loss_cls: 0.1265 d3.dn_loss_bbox: 0.1600 d3.dn_loss_iou: 0.2149 d4.dn_loss_cls: 0.1221 d4.dn_loss_bbox: 0.1578 d4.dn_loss_iou: 0.2120 d1.loss_lmm_region: 0.1381 loss_lmm_image: 0.8516 2024/11/12 
06:45:36 - mmengine - INFO - Iter(train) [ 76600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:53:48 time: 1.9761 data_time: 0.0181 memory: 32037 grad_norm: 36.8967 loss: 8.4727 loss_cls: 0.2494 loss_bbox: 0.1119 loss_iou: 0.1884 d0.loss_cls: 0.2858 d0.loss_bbox: 0.1269 d0.loss_iou: 0.2049 d1.loss_cls: 0.2625 d1.loss_bbox: 0.1177 d1.loss_iou: 0.1950 d2.loss_cls: 0.2571 d2.loss_bbox: 0.1133 d2.loss_iou: 0.1900 d3.loss_cls: 0.2509 d3.loss_bbox: 0.1135 d3.loss_iou: 0.1910 d4.loss_cls: 0.2488 d4.loss_bbox: 0.1125 d4.loss_iou: 0.1892 enc_loss_cls: 0.2869 enc_loss_bbox: 0.1409 enc_loss_iou: 0.2269 dn_loss_cls: 0.1301 dn_loss_bbox: 0.1626 dn_loss_iou: 0.1972 d0.dn_loss_cls: 0.2086 d0.dn_loss_bbox: 0.3082 d0.dn_loss_iou: 0.3365 d1.dn_loss_cls: 0.1572 d1.dn_loss_bbox: 0.1983 d1.dn_loss_iou: 0.2289 d2.dn_loss_cls: 0.1400 d2.dn_loss_bbox: 0.1726 d2.dn_loss_iou: 0.2068 d3.dn_loss_cls: 0.1333 d3.dn_loss_bbox: 0.1644 d3.dn_loss_iou: 0.1994 d4.dn_loss_cls: 0.1279 d4.dn_loss_bbox: 0.1626 d4.dn_loss_iou: 0.1972 d1.loss_lmm_region: 0.1309 loss_lmm_image: 0.8465 2024/11/12 06:48:55 - mmengine - INFO - Iter(train) [ 76700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:50:26 time: 1.9888 data_time: 0.0182 memory: 34272 grad_norm: 26.6661 loss: 10.1181 loss_cls: 0.3401 loss_bbox: 0.1540 loss_iou: 0.2935 d0.loss_cls: 0.3906 d0.loss_bbox: 0.1682 d0.loss_iou: 0.3106 d1.loss_cls: 0.3610 d1.loss_bbox: 0.1591 d1.loss_iou: 0.2984 d2.loss_cls: 0.3501 d2.loss_bbox: 0.1567 d2.loss_iou: 0.2946 d3.loss_cls: 0.3490 d3.loss_bbox: 0.1517 d3.loss_iou: 0.2923 d4.loss_cls: 0.3423 d4.loss_bbox: 0.1534 d4.loss_iou: 0.2927 enc_loss_cls: 0.3962 enc_loss_bbox: 0.1791 enc_loss_iou: 0.3294 dn_loss_cls: 0.0965 dn_loss_bbox: 0.1675 dn_loss_iou: 0.2167 d0.dn_loss_cls: 0.1766 d0.dn_loss_bbox: 0.3014 d0.dn_loss_iou: 0.3530 d1.dn_loss_cls: 0.1304 d1.dn_loss_bbox: 0.1977 d1.dn_loss_iou: 0.2473 d2.dn_loss_cls: 0.1096 d2.dn_loss_bbox: 0.1754 d2.dn_loss_iou: 0.2258 d3.dn_loss_cls: 0.1028 d3.dn_loss_bbox: 0.1693 d3.dn_loss_iou: 0.2190 d4.dn_loss_cls: 0.0981 d4.dn_loss_bbox: 0.1674 d4.dn_loss_iou: 0.2168 d1.loss_lmm_region: 0.1386 loss_lmm_image: 0.8450 2024/11/12 06:52:13 - mmengine - INFO - Iter(train) [ 76800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:47:03 time: 1.9792 data_time: 0.0181 memory: 33311 grad_norm: 28.4273 loss: 9.6637 loss_cls: 0.3036 loss_bbox: 0.1533 loss_iou: 0.2849 d0.loss_cls: 0.3575 d0.loss_bbox: 0.1650 d0.loss_iou: 0.2998 d1.loss_cls: 0.3272 d1.loss_bbox: 0.1607 d1.loss_iou: 0.2973 d2.loss_cls: 0.3172 d2.loss_bbox: 0.1566 d2.loss_iou: 0.2891 d3.loss_cls: 0.3156 d3.loss_bbox: 0.1518 d3.loss_iou: 0.2846 d4.loss_cls: 0.3053 d4.loss_bbox: 0.1537 d4.loss_iou: 0.2862 enc_loss_cls: 0.3643 enc_loss_bbox: 0.1775 enc_loss_iou: 0.3201 dn_loss_cls: 0.0838 dn_loss_bbox: 0.1645 dn_loss_iou: 0.2192 d0.dn_loss_cls: 0.1637 d0.dn_loss_bbox: 0.2918 d0.dn_loss_iou: 0.3542 d1.dn_loss_cls: 0.1148 d1.dn_loss_bbox: 0.1918 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 0.0951 d2.dn_loss_bbox: 0.1740 d2.dn_loss_iou: 0.2283 d3.dn_loss_cls: 0.0893 d3.dn_loss_bbox: 0.1667 d3.dn_loss_iou: 0.2215 d4.dn_loss_cls: 0.0841 d4.dn_loss_bbox: 0.1645 d4.dn_loss_iou: 0.2190 d1.loss_lmm_region: 0.1149 loss_lmm_image: 0.8032 2024/11/12 06:55:33 - mmengine - INFO - Iter(train) [ 76900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:43:42 time: 1.9946 data_time: 0.0181 memory: 33400 grad_norm: 35.7718 loss: 9.0764 loss_cls: 0.2918 loss_bbox: 0.1217 loss_iou: 0.2334 d0.loss_cls: 0.3264 d0.loss_bbox: 0.1320 d0.loss_iou: 0.2487 
d1.loss_cls: 0.3068 d1.loss_bbox: 0.1282 d1.loss_iou: 0.2388 d2.loss_cls: 0.3020 d2.loss_bbox: 0.1242 d2.loss_iou: 0.2340 d3.loss_cls: 0.2947 d3.loss_bbox: 0.1236 d3.loss_iou: 0.2344 d4.loss_cls: 0.2916 d4.loss_bbox: 0.1226 d4.loss_iou: 0.2334 enc_loss_cls: 0.3320 enc_loss_bbox: 0.1487 enc_loss_iou: 0.2692 dn_loss_cls: 0.1006 dn_loss_bbox: 0.1477 dn_loss_iou: 0.2164 d0.dn_loss_cls: 0.1819 d0.dn_loss_bbox: 0.2936 d0.dn_loss_iou: 0.3650 d1.dn_loss_cls: 0.1315 d1.dn_loss_bbox: 0.1791 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 0.1136 d2.dn_loss_bbox: 0.1593 d2.dn_loss_iou: 0.2267 d3.dn_loss_cls: 0.1045 d3.dn_loss_bbox: 0.1509 d3.dn_loss_iou: 0.2190 d4.dn_loss_cls: 0.1017 d4.dn_loss_bbox: 0.1477 d4.dn_loss_iou: 0.2165 d1.loss_lmm_region: 0.1510 loss_lmm_image: 0.8834 2024/11/12 06:58:51 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 06:58:51 - mmengine - INFO - Iter(train) [ 77000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:40:19 time: 1.9555 data_time: 0.0182 memory: 35555 grad_norm: 30.9035 loss: 9.7877 loss_cls: 0.3058 loss_bbox: 0.1306 loss_iou: 0.2218 d0.loss_cls: 0.3546 d0.loss_bbox: 0.1427 d0.loss_iou: 0.2336 d1.loss_cls: 0.3284 d1.loss_bbox: 0.1397 d1.loss_iou: 0.2290 d2.loss_cls: 0.3214 d2.loss_bbox: 0.1337 d2.loss_iou: 0.2261 d3.loss_cls: 0.3135 d3.loss_bbox: 0.1311 d3.loss_iou: 0.2213 d4.loss_cls: 0.3072 d4.loss_bbox: 0.1307 d4.loss_iou: 0.2222 enc_loss_cls: 0.3615 enc_loss_bbox: 0.1542 enc_loss_iou: 0.2527 dn_loss_cls: 0.1472 dn_loss_bbox: 0.1824 dn_loss_iou: 0.2189 d0.dn_loss_cls: 0.2360 d0.dn_loss_bbox: 0.3619 d0.dn_loss_iou: 0.3811 d1.dn_loss_cls: 0.1849 d1.dn_loss_bbox: 0.2256 d1.dn_loss_iou: 0.2546 d2.dn_loss_cls: 0.1644 d2.dn_loss_bbox: 0.1927 d2.dn_loss_iou: 0.2296 d3.dn_loss_cls: 0.1571 d3.dn_loss_bbox: 0.1859 d3.dn_loss_iou: 0.2217 d4.dn_loss_cls: 0.1491 d4.dn_loss_bbox: 0.1825 d4.dn_loss_iou: 0.2190 d1.loss_lmm_region: 0.1506 loss_lmm_image: 0.8807 2024/11/12 07:02:11 - mmengine - INFO - Iter(train) [ 77100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:36:58 time: 2.0258 data_time: 0.0182 memory: 32383 grad_norm: 45.6287 loss: 8.9578 loss_cls: 0.2650 loss_bbox: 0.1435 loss_iou: 0.2431 d0.loss_cls: 0.3287 d0.loss_bbox: 0.1453 d0.loss_iou: 0.2522 d1.loss_cls: 0.2859 d1.loss_bbox: 0.1469 d1.loss_iou: 0.2461 d2.loss_cls: 0.2769 d2.loss_bbox: 0.1443 d2.loss_iou: 0.2439 d3.loss_cls: 0.2735 d3.loss_bbox: 0.1425 d3.loss_iou: 0.2407 d4.loss_cls: 0.2695 d4.loss_bbox: 0.1424 d4.loss_iou: 0.2408 enc_loss_cls: 0.3255 enc_loss_bbox: 0.1631 enc_loss_iou: 0.2783 dn_loss_cls: 0.0808 dn_loss_bbox: 0.1685 dn_loss_iou: 0.2033 d0.dn_loss_cls: 0.1598 d0.dn_loss_bbox: 0.3158 d0.dn_loss_iou: 0.3484 d1.dn_loss_cls: 0.1101 d1.dn_loss_bbox: 0.1958 d1.dn_loss_iou: 0.2324 d2.dn_loss_cls: 0.0912 d2.dn_loss_bbox: 0.1762 d2.dn_loss_iou: 0.2113 d3.dn_loss_cls: 0.0862 d3.dn_loss_bbox: 0.1699 d3.dn_loss_iou: 0.2052 d4.dn_loss_cls: 0.0819 d4.dn_loss_bbox: 0.1686 d4.dn_loss_iou: 0.2033 d1.loss_lmm_region: 0.1276 loss_lmm_image: 0.8237 2024/11/12 07:05:32 - mmengine - INFO - Iter(train) [ 77200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:33:38 time: 1.9994 data_time: 0.0181 memory: 35394 grad_norm: 26.3896 loss: 9.3198 loss_cls: 0.2893 loss_bbox: 0.1422 loss_iou: 0.2489 d0.loss_cls: 0.3300 d0.loss_bbox: 0.1534 d0.loss_iou: 0.2639 d1.loss_cls: 0.3019 d1.loss_bbox: 0.1491 d1.loss_iou: 0.2610 d2.loss_cls: 0.2963 d2.loss_bbox: 0.1436 d2.loss_iou: 0.2537 d3.loss_cls: 0.2932 d3.loss_bbox: 0.1413 d3.loss_iou: 0.2508 d4.loss_cls: 0.2925 d4.loss_bbox: 
0.1390 d4.loss_iou: 0.2485 enc_loss_cls: 0.3359 enc_loss_bbox: 0.1705 enc_loss_iou: 0.2889 dn_loss_cls: 0.0952 dn_loss_bbox: 0.1665 dn_loss_iou: 0.2105 d0.dn_loss_cls: 0.1733 d0.dn_loss_bbox: 0.3049 d0.dn_loss_iou: 0.3460 d1.dn_loss_cls: 0.1277 d1.dn_loss_bbox: 0.1940 d1.dn_loss_iou: 0.2391 d2.dn_loss_cls: 0.1113 d2.dn_loss_bbox: 0.1733 d2.dn_loss_iou: 0.2192 d3.dn_loss_cls: 0.1010 d3.dn_loss_bbox: 0.1673 d3.dn_loss_iou: 0.2124 d4.dn_loss_cls: 0.0972 d4.dn_loss_bbox: 0.1666 d4.dn_loss_iou: 0.2106 d1.loss_lmm_region: 0.1314 loss_lmm_image: 0.8782 2024/11/12 07:08:53 - mmengine - INFO - Iter(train) [ 77300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:30:18 time: 2.0213 data_time: 0.0181 memory: 32561 grad_norm: 28.4328 loss: 9.1167 loss_cls: 0.2483 loss_bbox: 0.1389 loss_iou: 0.2451 d0.loss_cls: 0.2968 d0.loss_bbox: 0.1485 d0.loss_iou: 0.2600 d1.loss_cls: 0.2773 d1.loss_bbox: 0.1364 d1.loss_iou: 0.2478 d2.loss_cls: 0.2608 d2.loss_bbox: 0.1382 d2.loss_iou: 0.2473 d3.loss_cls: 0.2550 d3.loss_bbox: 0.1366 d3.loss_iou: 0.2447 d4.loss_cls: 0.2520 d4.loss_bbox: 0.1375 d4.loss_iou: 0.2447 enc_loss_cls: 0.2975 enc_loss_bbox: 0.1663 enc_loss_iou: 0.2797 dn_loss_cls: 0.0955 dn_loss_bbox: 0.1835 dn_loss_iou: 0.2196 d0.dn_loss_cls: 0.1707 d0.dn_loss_bbox: 0.3422 d0.dn_loss_iou: 0.3536 d1.dn_loss_cls: 0.1204 d1.dn_loss_bbox: 0.2185 d1.dn_loss_iou: 0.2488 d2.dn_loss_cls: 0.1055 d2.dn_loss_bbox: 0.1972 d2.dn_loss_iou: 0.2305 d3.dn_loss_cls: 0.0999 d3.dn_loss_bbox: 0.1854 d3.dn_loss_iou: 0.2218 d4.dn_loss_cls: 0.0971 d4.dn_loss_bbox: 0.1835 d4.dn_loss_iou: 0.2196 d1.loss_lmm_region: 0.1187 loss_lmm_image: 0.8453 2024/11/12 07:12:15 - mmengine - INFO - Iter(train) [ 77400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:26:58 time: 2.0114 data_time: 0.0181 memory: 34542 grad_norm: 28.7964 loss: 9.5721 loss_cls: 0.3172 loss_bbox: 0.1379 loss_iou: 0.2372 d0.loss_cls: 0.3551 d0.loss_bbox: 0.1553 d0.loss_iou: 0.2581 d1.loss_cls: 0.3296 d1.loss_bbox: 0.1472 d1.loss_iou: 0.2474 d2.loss_cls: 0.3187 d2.loss_bbox: 0.1461 d2.loss_iou: 0.2474 d3.loss_cls: 0.3138 d3.loss_bbox: 0.1440 d3.loss_iou: 0.2437 d4.loss_cls: 0.3137 d4.loss_bbox: 0.1402 d4.loss_iou: 0.2410 enc_loss_cls: 0.3597 enc_loss_bbox: 0.1648 enc_loss_iou: 0.2763 dn_loss_cls: 0.0930 dn_loss_bbox: 0.1873 dn_loss_iou: 0.2224 d0.dn_loss_cls: 0.1780 d0.dn_loss_bbox: 0.3252 d0.dn_loss_iou: 0.3647 d1.dn_loss_cls: 0.1262 d1.dn_loss_bbox: 0.2223 d1.dn_loss_iou: 0.2548 d2.dn_loss_cls: 0.1037 d2.dn_loss_bbox: 0.2010 d2.dn_loss_iou: 0.2345 d3.dn_loss_cls: 0.0953 d3.dn_loss_bbox: 0.1888 d3.dn_loss_iou: 0.2247 d4.dn_loss_cls: 0.0930 d4.dn_loss_bbox: 0.1873 d4.dn_loss_iou: 0.2225 d1.loss_lmm_region: 0.1399 loss_lmm_image: 0.8126 2024/11/12 07:15:35 - mmengine - INFO - Iter(train) [ 77500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:23:37 time: 1.9866 data_time: 0.0183 memory: 34098 grad_norm: 25.7062 loss: 9.2389 loss_cls: 0.2615 loss_bbox: 0.1392 loss_iou: 0.2323 d0.loss_cls: 0.2929 d0.loss_bbox: 0.1606 d0.loss_iou: 0.2504 d1.loss_cls: 0.2760 d1.loss_bbox: 0.1519 d1.loss_iou: 0.2407 d2.loss_cls: 0.2656 d2.loss_bbox: 0.1455 d2.loss_iou: 0.2379 d3.loss_cls: 0.2580 d3.loss_bbox: 0.1435 d3.loss_iou: 0.2350 d4.loss_cls: 0.2576 d4.loss_bbox: 0.1414 d4.loss_iou: 0.2337 enc_loss_cls: 0.2980 enc_loss_bbox: 0.1728 enc_loss_iou: 0.2692 dn_loss_cls: 0.1000 dn_loss_bbox: 0.1778 dn_loss_iou: 0.2223 d0.dn_loss_cls: 0.1905 d0.dn_loss_bbox: 0.3335 d0.dn_loss_iou: 0.3681 d1.dn_loss_cls: 0.1349 d1.dn_loss_bbox: 0.2147 d1.dn_loss_iou: 0.2561 
d2.dn_loss_cls: 0.1141 d2.dn_loss_bbox: 0.1887 d2.dn_loss_iou: 0.2332 d3.dn_loss_cls: 0.1057 d3.dn_loss_bbox: 0.1794 d3.dn_loss_iou: 0.2247 d4.dn_loss_cls: 0.1023 d4.dn_loss_bbox: 0.1779 d4.dn_loss_iou: 0.2224 d1.loss_lmm_region: 0.1465 loss_lmm_image: 0.8826 2024/11/12 07:18:54 - mmengine - INFO - Iter(train) [ 77600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:20:15 time: 1.9757 data_time: 0.0182 memory: 35056 grad_norm: 34.1597 loss: 8.7003 loss_cls: 0.2669 loss_bbox: 0.1221 loss_iou: 0.2103 d0.loss_cls: 0.3039 d0.loss_bbox: 0.1341 d0.loss_iou: 0.2248 d1.loss_cls: 0.2821 d1.loss_bbox: 0.1251 d1.loss_iou: 0.2140 d2.loss_cls: 0.2756 d2.loss_bbox: 0.1217 d2.loss_iou: 0.2081 d3.loss_cls: 0.2749 d3.loss_bbox: 0.1222 d3.loss_iou: 0.2103 d4.loss_cls: 0.2686 d4.loss_bbox: 0.1230 d4.loss_iou: 0.2108 enc_loss_cls: 0.3136 enc_loss_bbox: 0.1488 enc_loss_iou: 0.2398 dn_loss_cls: 0.1236 dn_loss_bbox: 0.1621 dn_loss_iou: 0.1898 d0.dn_loss_cls: 0.1947 d0.dn_loss_bbox: 0.3089 d0.dn_loss_iou: 0.3269 d1.dn_loss_cls: 0.1482 d1.dn_loss_bbox: 0.1982 d1.dn_loss_iou: 0.2207 d2.dn_loss_cls: 0.1306 d2.dn_loss_bbox: 0.1741 d2.dn_loss_iou: 0.1996 d3.dn_loss_cls: 0.1251 d3.dn_loss_bbox: 0.1639 d3.dn_loss_iou: 0.1916 d4.dn_loss_cls: 0.1223 d4.dn_loss_bbox: 0.1620 d4.dn_loss_iou: 0.1897 d1.loss_lmm_region: 0.1435 loss_lmm_image: 0.8245 2024/11/12 07:22:12 - mmengine - INFO - Iter(train) [ 77700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:16:52 time: 1.9843 data_time: 0.0183 memory: 34927 grad_norm: 27.8092 loss: 8.4232 loss_cls: 0.2548 loss_bbox: 0.1235 loss_iou: 0.2379 d0.loss_cls: 0.2947 d0.loss_bbox: 0.1352 d0.loss_iou: 0.2545 d1.loss_cls: 0.2699 d1.loss_bbox: 0.1304 d1.loss_iou: 0.2480 d2.loss_cls: 0.2650 d2.loss_bbox: 0.1231 d2.loss_iou: 0.2385 d3.loss_cls: 0.2588 d3.loss_bbox: 0.1231 d3.loss_iou: 0.2384 d4.loss_cls: 0.2558 d4.loss_bbox: 0.1244 d4.loss_iou: 0.2392 enc_loss_cls: 0.2971 enc_loss_bbox: 0.1508 enc_loss_iou: 0.2775 dn_loss_cls: 0.0817 dn_loss_bbox: 0.1305 dn_loss_iou: 0.1910 d0.dn_loss_cls: 0.1518 d0.dn_loss_bbox: 0.2772 d0.dn_loss_iou: 0.3262 d1.dn_loss_cls: 0.1048 d1.dn_loss_bbox: 0.1659 d1.dn_loss_iou: 0.2207 d2.dn_loss_cls: 0.0922 d2.dn_loss_bbox: 0.1427 d2.dn_loss_iou: 0.2007 d3.dn_loss_cls: 0.0861 d3.dn_loss_bbox: 0.1326 d3.dn_loss_iou: 0.1933 d4.dn_loss_cls: 0.0818 d4.dn_loss_bbox: 0.1305 d4.dn_loss_iou: 0.1911 d1.loss_lmm_region: 0.1325 loss_lmm_image: 0.8494 2024/11/12 07:25:33 - mmengine - INFO - Iter(train) [ 77800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:13:32 time: 2.0214 data_time: 0.0182 memory: 33517 grad_norm: 27.4373 loss: 8.9455 loss_cls: 0.2880 loss_bbox: 0.1287 loss_iou: 0.2018 d0.loss_cls: 0.3361 d0.loss_bbox: 0.1383 d0.loss_iou: 0.2170 d1.loss_cls: 0.3069 d1.loss_bbox: 0.1300 d1.loss_iou: 0.2050 d2.loss_cls: 0.3012 d2.loss_bbox: 0.1282 d2.loss_iou: 0.2014 d3.loss_cls: 0.2946 d3.loss_bbox: 0.1280 d3.loss_iou: 0.2020 d4.loss_cls: 0.2882 d4.loss_bbox: 0.1277 d4.loss_iou: 0.2014 enc_loss_cls: 0.3412 enc_loss_bbox: 0.1557 enc_loss_iou: 0.2384 dn_loss_cls: 0.1095 dn_loss_bbox: 0.1673 dn_loss_iou: 0.1986 d0.dn_loss_cls: 0.1947 d0.dn_loss_bbox: 0.3263 d0.dn_loss_iou: 0.3442 d1.dn_loss_cls: 0.1433 d1.dn_loss_bbox: 0.2057 d1.dn_loss_iou: 0.2331 d2.dn_loss_cls: 0.1223 d2.dn_loss_bbox: 0.1800 d2.dn_loss_iou: 0.2095 d3.dn_loss_cls: 0.1144 d3.dn_loss_bbox: 0.1711 d3.dn_loss_iou: 0.2018 d4.dn_loss_cls: 0.1103 d4.dn_loss_bbox: 0.1672 d4.dn_loss_iou: 0.1985 d1.loss_lmm_region: 0.1263 loss_lmm_image: 0.8613 2024/11/12 07:28:52 - mmengine - INFO - 
Iter(train) [ 77900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:10:10 time: 1.9647 data_time: 0.0183 memory: 33531 grad_norm: 34.2359 loss: 9.2527 loss_cls: 0.2879 loss_bbox: 0.1370 loss_iou: 0.2386 d0.loss_cls: 0.3309 d0.loss_bbox: 0.1481 d0.loss_iou: 0.2482 d1.loss_cls: 0.3096 d1.loss_bbox: 0.1411 d1.loss_iou: 0.2391 d2.loss_cls: 0.2983 d2.loss_bbox: 0.1390 d2.loss_iou: 0.2395 d3.loss_cls: 0.2898 d3.loss_bbox: 0.1376 d3.loss_iou: 0.2387 d4.loss_cls: 0.2881 d4.loss_bbox: 0.1380 d4.loss_iou: 0.2393 enc_loss_cls: 0.3339 enc_loss_bbox: 0.1594 enc_loss_iou: 0.2687 dn_loss_cls: 0.1019 dn_loss_bbox: 0.1745 dn_loss_iou: 0.2174 d0.dn_loss_cls: 0.1735 d0.dn_loss_bbox: 0.3096 d0.dn_loss_iou: 0.3483 d1.dn_loss_cls: 0.1261 d1.dn_loss_bbox: 0.2081 d1.dn_loss_iou: 0.2453 d2.dn_loss_cls: 0.1098 d2.dn_loss_bbox: 0.1841 d2.dn_loss_iou: 0.2253 d3.dn_loss_cls: 0.1040 d3.dn_loss_bbox: 0.1772 d3.dn_loss_iou: 0.2197 d4.dn_loss_cls: 0.1011 d4.dn_loss_bbox: 0.1746 d4.dn_loss_iou: 0.2174 d1.loss_lmm_region: 0.1140 loss_lmm_image: 0.8703 2024/11/12 07:32:12 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 07:32:12 - mmengine - INFO - Iter(train) [ 78000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:06:49 time: 2.0078 data_time: 0.0181 memory: 34152 grad_norm: 34.0688 loss: 10.0293 loss_cls: 0.3379 loss_bbox: 0.1539 loss_iou: 0.2472 d0.loss_cls: 0.3800 d0.loss_bbox: 0.1601 d0.loss_iou: 0.2607 d1.loss_cls: 0.3472 d1.loss_bbox: 0.1630 d1.loss_iou: 0.2591 d2.loss_cls: 0.3481 d2.loss_bbox: 0.1549 d2.loss_iou: 0.2504 d3.loss_cls: 0.3361 d3.loss_bbox: 0.1558 d3.loss_iou: 0.2509 d4.loss_cls: 0.3367 d4.loss_bbox: 0.1537 d4.loss_iou: 0.2471 enc_loss_cls: 0.3908 enc_loss_bbox: 0.1690 enc_loss_iou: 0.2773 dn_loss_cls: 0.1165 dn_loss_bbox: 0.1757 dn_loss_iou: 0.2214 d0.dn_loss_cls: 0.2186 d0.dn_loss_bbox: 0.3250 d0.dn_loss_iou: 0.3616 d1.dn_loss_cls: 0.1594 d1.dn_loss_bbox: 0.2150 d1.dn_loss_iou: 0.2543 d2.dn_loss_cls: 0.1331 d2.dn_loss_bbox: 0.1846 d2.dn_loss_iou: 0.2307 d3.dn_loss_cls: 0.1237 d3.dn_loss_bbox: 0.1764 d3.dn_loss_iou: 0.2233 d4.dn_loss_cls: 0.1178 d4.dn_loss_bbox: 0.1757 d4.dn_loss_iou: 0.2215 d1.loss_lmm_region: 0.1559 loss_lmm_image: 0.8592 2024/11/12 07:35:31 - mmengine - INFO - Iter(train) [ 78100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:03:26 time: 1.9740 data_time: 0.0182 memory: 31774 grad_norm: nan loss: 10.0427 loss_cls: 0.2955 loss_bbox: 0.1437 loss_iou: 0.2698 d0.loss_cls: 0.3452 d0.loss_bbox: 0.1593 d0.loss_iou: 0.2836 d1.loss_cls: 0.3203 d1.loss_bbox: 0.1497 d1.loss_iou: 0.2769 d2.loss_cls: 0.3034 d2.loss_bbox: 0.1461 d2.loss_iou: 0.2743 d3.loss_cls: 0.3012 d3.loss_bbox: 0.1429 d3.loss_iou: 0.2714 d4.loss_cls: 0.2945 d4.loss_bbox: 0.1440 d4.loss_iou: 0.2706 enc_loss_cls: 0.3420 enc_loss_bbox: 0.1728 enc_loss_iou: 0.3087 dn_loss_cls: 0.1110 dn_loss_bbox: 0.1932 dn_loss_iou: 0.2343 d0.dn_loss_cls: 0.2022 d0.dn_loss_bbox: 0.3498 d0.dn_loss_iou: 0.3813 d1.dn_loss_cls: 0.1475 d1.dn_loss_bbox: 0.2241 d1.dn_loss_iou: 0.2636 d2.dn_loss_cls: 0.1235 d2.dn_loss_bbox: 0.2073 d2.dn_loss_iou: 0.2446 d3.dn_loss_cls: 0.1172 d3.dn_loss_bbox: 0.1953 d3.dn_loss_iou: 0.2367 d4.dn_loss_cls: 0.1122 d4.dn_loss_bbox: 0.1933 d4.dn_loss_iou: 0.2343 d1.loss_lmm_region: 0.1609 loss_lmm_image: 0.8942 2024/11/12 07:38:50 - mmengine - INFO - Iter(train) [ 78200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 16:00:04 time: 1.9863 data_time: 0.0182 memory: 33884 grad_norm: 29.2451 loss: 9.2249 loss_cls: 0.2825 loss_bbox: 0.1210 loss_iou: 0.2408 
d0.loss_cls: 0.3272 d0.loss_bbox: 0.1295 d0.loss_iou: 0.2513 d1.loss_cls: 0.2955 d1.loss_bbox: 0.1289 d1.loss_iou: 0.2524 d2.loss_cls: 0.2935 d2.loss_bbox: 0.1205 d2.loss_iou: 0.2405 d3.loss_cls: 0.2877 d3.loss_bbox: 0.1198 d3.loss_iou: 0.2407 d4.loss_cls: 0.2834 d4.loss_bbox: 0.1213 d4.loss_iou: 0.2410 enc_loss_cls: 0.3275 enc_loss_bbox: 0.1394 enc_loss_iou: 0.2700 dn_loss_cls: 0.1444 dn_loss_bbox: 0.1577 dn_loss_iou: 0.2090 d0.dn_loss_cls: 0.2177 d0.dn_loss_bbox: 0.3020 d0.dn_loss_iou: 0.3529 d1.dn_loss_cls: 0.1768 d1.dn_loss_bbox: 0.1868 d1.dn_loss_iou: 0.2391 d2.dn_loss_cls: 0.1536 d2.dn_loss_bbox: 0.1657 d2.dn_loss_iou: 0.2176 d3.dn_loss_cls: 0.1464 d3.dn_loss_bbox: 0.1596 d3.dn_loss_iou: 0.2116 d4.dn_loss_cls: 0.1435 d4.dn_loss_bbox: 0.1578 d4.dn_loss_iou: 0.2092 d1.loss_lmm_region: 0.1497 loss_lmm_image: 0.8096 2024/11/12 07:42:11 - mmengine - INFO - Iter(train) [ 78300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:56:44 time: 2.0137 data_time: 0.0181 memory: 34278 grad_norm: 35.1666 loss: 8.8568 loss_cls: 0.2520 loss_bbox: 0.1296 loss_iou: 0.2030 d0.loss_cls: 0.2842 d0.loss_bbox: 0.1414 d0.loss_iou: 0.2194 d1.loss_cls: 0.2650 d1.loss_bbox: 0.1354 d1.loss_iou: 0.2129 d2.loss_cls: 0.2616 d2.loss_bbox: 0.1312 d2.loss_iou: 0.2057 d3.loss_cls: 0.2541 d3.loss_bbox: 0.1327 d3.loss_iou: 0.2065 d4.loss_cls: 0.2555 d4.loss_bbox: 0.1278 d4.loss_iou: 0.2003 enc_loss_cls: 0.2884 enc_loss_bbox: 0.1591 enc_loss_iou: 0.2441 dn_loss_cls: 0.1183 dn_loss_bbox: 0.1886 dn_loss_iou: 0.2031 d0.dn_loss_cls: 0.1964 d0.dn_loss_bbox: 0.3449 d0.dn_loss_iou: 0.3457 d1.dn_loss_cls: 0.1511 d1.dn_loss_bbox: 0.2209 d1.dn_loss_iou: 0.2347 d2.dn_loss_cls: 0.1269 d2.dn_loss_bbox: 0.1963 d2.dn_loss_iou: 0.2130 d3.dn_loss_cls: 0.1213 d3.dn_loss_bbox: 0.1901 d3.dn_loss_iou: 0.2054 d4.dn_loss_cls: 0.1184 d4.dn_loss_bbox: 0.1885 d4.dn_loss_iou: 0.2030 d1.loss_lmm_region: 0.1508 loss_lmm_image: 0.8294 2024/11/12 07:45:29 - mmengine - INFO - Iter(train) [ 78400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:53:21 time: 2.0005 data_time: 0.0182 memory: 35272 grad_norm: 33.6401 loss: 9.8171 loss_cls: 0.2803 loss_bbox: 0.1642 loss_iou: 0.2652 d0.loss_cls: 0.3269 d0.loss_bbox: 0.1701 d0.loss_iou: 0.2767 d1.loss_cls: 0.3106 d1.loss_bbox: 0.1582 d1.loss_iou: 0.2665 d2.loss_cls: 0.2927 d2.loss_bbox: 0.1611 d2.loss_iou: 0.2659 d3.loss_cls: 0.2849 d3.loss_bbox: 0.1611 d3.loss_iou: 0.2657 d4.loss_cls: 0.2817 d4.loss_bbox: 0.1635 d4.loss_iou: 0.2643 enc_loss_cls: 0.3346 enc_loss_bbox: 0.1785 enc_loss_iou: 0.2924 dn_loss_cls: 0.1036 dn_loss_bbox: 0.1965 dn_loss_iou: 0.2317 d0.dn_loss_cls: 0.1825 d0.dn_loss_bbox: 0.3498 d0.dn_loss_iou: 0.3692 d1.dn_loss_cls: 0.1328 d1.dn_loss_bbox: 0.2316 d1.dn_loss_iou: 0.2626 d2.dn_loss_cls: 0.1147 d2.dn_loss_bbox: 0.2070 d2.dn_loss_iou: 0.2397 d3.dn_loss_cls: 0.1067 d3.dn_loss_bbox: 0.1992 d3.dn_loss_iou: 0.2343 d4.dn_loss_cls: 0.1044 d4.dn_loss_bbox: 0.1966 d4.dn_loss_iou: 0.2317 d1.loss_lmm_region: 0.1309 loss_lmm_image: 0.8264 2024/11/12 07:48:47 - mmengine - INFO - Iter(train) [ 78500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:49:58 time: 1.9714 data_time: 0.0181 memory: 34085 grad_norm: 28.6880 loss: 9.3774 loss_cls: 0.2856 loss_bbox: 0.1411 loss_iou: 0.2402 d0.loss_cls: 0.3321 d0.loss_bbox: 0.1604 d0.loss_iou: 0.2524 d1.loss_cls: 0.3039 d1.loss_bbox: 0.1530 d1.loss_iou: 0.2442 d2.loss_cls: 0.2886 d2.loss_bbox: 0.1468 d2.loss_iou: 0.2421 d3.loss_cls: 0.2875 d3.loss_bbox: 0.1435 d3.loss_iou: 0.2410 d4.loss_cls: 0.2881 d4.loss_bbox: 0.1404 d4.loss_iou: 0.2382 
enc_loss_cls: 0.3317 enc_loss_bbox: 0.1803 enc_loss_iou: 0.2770 dn_loss_cls: 0.1229 dn_loss_bbox: 0.1713 dn_loss_iou: 0.2110 d0.dn_loss_cls: 0.2046 d0.dn_loss_bbox: 0.3192 d0.dn_loss_iou: 0.3482 d1.dn_loss_cls: 0.1515 d1.dn_loss_bbox: 0.2026 d1.dn_loss_iou: 0.2386 d2.dn_loss_cls: 0.1364 d2.dn_loss_bbox: 0.1816 d2.dn_loss_iou: 0.2194 d3.dn_loss_cls: 0.1292 d3.dn_loss_bbox: 0.1740 d3.dn_loss_iou: 0.2133 d4.dn_loss_cls: 0.1228 d4.dn_loss_bbox: 0.1713 d4.dn_loss_iou: 0.2110 d1.loss_lmm_region: 0.1345 loss_lmm_image: 0.7957 2024/11/12 07:52:07 - mmengine - INFO - Iter(train) [ 78600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:46:37 time: 2.0232 data_time: 0.0182 memory: 34875 grad_norm: 32.4180 loss: 10.2886 loss_cls: 0.3321 loss_bbox: 0.1518 loss_iou: 0.2922 d0.loss_cls: 0.3829 d0.loss_bbox: 0.1638 d0.loss_iou: 0.3081 d1.loss_cls: 0.3527 d1.loss_bbox: 0.1629 d1.loss_iou: 0.3046 d2.loss_cls: 0.3371 d2.loss_bbox: 0.1587 d2.loss_iou: 0.3031 d3.loss_cls: 0.3359 d3.loss_bbox: 0.1544 d3.loss_iou: 0.2943 d4.loss_cls: 0.3316 d4.loss_bbox: 0.1533 d4.loss_iou: 0.2938 enc_loss_cls: 0.3897 enc_loss_bbox: 0.1785 enc_loss_iou: 0.3348 dn_loss_cls: 0.0991 dn_loss_bbox: 0.1758 dn_loss_iou: 0.2356 d0.dn_loss_cls: 0.1810 d0.dn_loss_bbox: 0.3351 d0.dn_loss_iou: 0.3881 d1.dn_loss_cls: 0.1296 d1.dn_loss_bbox: 0.2136 d1.dn_loss_iou: 0.2694 d2.dn_loss_cls: 0.1108 d2.dn_loss_bbox: 0.1889 d2.dn_loss_iou: 0.2474 d3.dn_loss_cls: 0.1041 d3.dn_loss_bbox: 0.1783 d3.dn_loss_iou: 0.2385 d4.dn_loss_cls: 0.1012 d4.dn_loss_bbox: 0.1758 d4.dn_loss_iou: 0.2356 d1.loss_lmm_region: 0.1436 loss_lmm_image: 0.8210 2024/11/12 07:55:27 - mmengine - INFO - Iter(train) [ 78700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:43:16 time: 1.9974 data_time: 0.0182 memory: 33502 grad_norm: 40.9152 loss: 9.0898 loss_cls: 0.2720 loss_bbox: 0.1268 loss_iou: 0.2278 d0.loss_cls: 0.3090 d0.loss_bbox: 0.1451 d0.loss_iou: 0.2522 d1.loss_cls: 0.2883 d1.loss_bbox: 0.1357 d1.loss_iou: 0.2403 d2.loss_cls: 0.2849 d2.loss_bbox: 0.1294 d2.loss_iou: 0.2307 d3.loss_cls: 0.2792 d3.loss_bbox: 0.1272 d3.loss_iou: 0.2280 d4.loss_cls: 0.2719 d4.loss_bbox: 0.1276 d4.loss_iou: 0.2294 enc_loss_cls: 0.3177 enc_loss_bbox: 0.1638 enc_loss_iou: 0.2777 dn_loss_cls: 0.0868 dn_loss_bbox: 0.1829 dn_loss_iou: 0.2152 d0.dn_loss_cls: 0.1721 d0.dn_loss_bbox: 0.3531 d0.dn_loss_iou: 0.3678 d1.dn_loss_cls: 0.1211 d1.dn_loss_bbox: 0.2214 d1.dn_loss_iou: 0.2499 d2.dn_loss_cls: 0.1000 d2.dn_loss_bbox: 0.1961 d2.dn_loss_iou: 0.2261 d3.dn_loss_cls: 0.0924 d3.dn_loss_bbox: 0.1860 d3.dn_loss_iou: 0.2184 d4.dn_loss_cls: 0.0878 d4.dn_loss_bbox: 0.1829 d4.dn_loss_iou: 0.2152 d1.loss_lmm_region: 0.1240 loss_lmm_image: 0.8257 2024/11/12 07:58:47 - mmengine - INFO - Iter(train) [ 78800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:39:55 time: 1.9930 data_time: 0.0182 memory: 33792 grad_norm: 26.1930 loss: 9.5830 loss_cls: 0.2796 loss_bbox: 0.1509 loss_iou: 0.2630 d0.loss_cls: 0.3168 d0.loss_bbox: 0.1702 d0.loss_iou: 0.2847 d1.loss_cls: 0.2952 d1.loss_bbox: 0.1576 d1.loss_iou: 0.2727 d2.loss_cls: 0.2897 d2.loss_bbox: 0.1540 d2.loss_iou: 0.2667 d3.loss_cls: 0.2832 d3.loss_bbox: 0.1515 d3.loss_iou: 0.2646 d4.loss_cls: 0.2812 d4.loss_bbox: 0.1511 d4.loss_iou: 0.2651 enc_loss_cls: 0.3183 enc_loss_bbox: 0.1802 enc_loss_iou: 0.3030 dn_loss_cls: 0.1060 dn_loss_bbox: 0.1794 dn_loss_iou: 0.2193 d0.dn_loss_cls: 0.1774 d0.dn_loss_bbox: 0.3134 d0.dn_loss_iou: 0.3610 d1.dn_loss_cls: 0.1339 d1.dn_loss_bbox: 0.2114 d1.dn_loss_iou: 0.2515 d2.dn_loss_cls: 0.1210 d2.dn_loss_bbox: 
0.1911 d2.dn_loss_iou: 0.2296 d3.dn_loss_cls: 0.1140 d3.dn_loss_bbox: 0.1821 d3.dn_loss_iou: 0.2220 d4.dn_loss_cls: 0.1079 d4.dn_loss_bbox: 0.1794 d4.dn_loss_iou: 0.2193 d1.loss_lmm_region: 0.1135 loss_lmm_image: 0.8502 2024/11/12 08:02:05 - mmengine - INFO - Iter(train) [ 78900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:36:32 time: 1.9987 data_time: 0.0184 memory: 35341 grad_norm: 29.5682 loss: 8.8517 loss_cls: 0.2713 loss_bbox: 0.1342 loss_iou: 0.2309 d0.loss_cls: 0.3144 d0.loss_bbox: 0.1463 d0.loss_iou: 0.2449 d1.loss_cls: 0.2866 d1.loss_bbox: 0.1374 d1.loss_iou: 0.2355 d2.loss_cls: 0.2815 d2.loss_bbox: 0.1352 d2.loss_iou: 0.2325 d3.loss_cls: 0.2713 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2320 d4.loss_cls: 0.2663 d4.loss_bbox: 0.1363 d4.loss_iou: 0.2335 enc_loss_cls: 0.3183 enc_loss_bbox: 0.1701 enc_loss_iou: 0.2697 dn_loss_cls: 0.0833 dn_loss_bbox: 0.1619 dn_loss_iou: 0.2009 d0.dn_loss_cls: 0.1667 d0.dn_loss_bbox: 0.3057 d0.dn_loss_iou: 0.3340 d1.dn_loss_cls: 0.1143 d1.dn_loss_bbox: 0.1973 d1.dn_loss_iou: 0.2285 d2.dn_loss_cls: 0.0922 d2.dn_loss_bbox: 0.1717 d2.dn_loss_iou: 0.2079 d3.dn_loss_cls: 0.0858 d3.dn_loss_bbox: 0.1638 d3.dn_loss_iou: 0.2029 d4.dn_loss_cls: 0.0850 d4.dn_loss_bbox: 0.1619 d4.dn_loss_iou: 0.2009 d1.loss_lmm_region: 0.1266 loss_lmm_image: 0.8768 2024/11/12 08:05:24 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 08:05:24 - mmengine - INFO - Iter(train) [ 79000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:33:10 time: 1.9793 data_time: 0.0182 memory: 34388 grad_norm: 25.9878 loss: 8.8852 loss_cls: 0.2279 loss_bbox: 0.1338 loss_iou: 0.2432 d0.loss_cls: 0.2660 d0.loss_bbox: 0.1356 d0.loss_iou: 0.2468 d1.loss_cls: 0.2418 d1.loss_bbox: 0.1380 d1.loss_iou: 0.2463 d2.loss_cls: 0.2381 d2.loss_bbox: 0.1318 d2.loss_iou: 0.2420 d3.loss_cls: 0.2344 d3.loss_bbox: 0.1330 d3.loss_iou: 0.2408 d4.loss_cls: 0.2288 d4.loss_bbox: 0.1328 d4.loss_iou: 0.2427 enc_loss_cls: 0.2716 enc_loss_bbox: 0.1528 enc_loss_iou: 0.2633 dn_loss_cls: 0.0968 dn_loss_bbox: 0.1800 dn_loss_iou: 0.2207 d0.dn_loss_cls: 0.1770 d0.dn_loss_bbox: 0.3467 d0.dn_loss_iou: 0.3586 d1.dn_loss_cls: 0.1244 d1.dn_loss_bbox: 0.2125 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 0.1097 d2.dn_loss_bbox: 0.1895 d2.dn_loss_iou: 0.2277 d3.dn_loss_cls: 0.1036 d3.dn_loss_bbox: 0.1828 d3.dn_loss_iou: 0.2226 d4.dn_loss_cls: 0.0972 d4.dn_loss_bbox: 0.1801 d4.dn_loss_iou: 0.2208 d1.loss_lmm_region: 0.1351 loss_lmm_image: 0.8597 2024/11/12 08:08:42 - mmengine - INFO - Iter(train) [ 79100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:29:48 time: 2.0036 data_time: 0.0182 memory: 34700 grad_norm: 29.3901 loss: 9.3029 loss_cls: 0.2506 loss_bbox: 0.1481 loss_iou: 0.2733 d0.loss_cls: 0.2903 d0.loss_bbox: 0.1584 d0.loss_iou: 0.2821 d1.loss_cls: 0.2701 d1.loss_bbox: 0.1498 d1.loss_iou: 0.2746 d2.loss_cls: 0.2602 d2.loss_bbox: 0.1476 d2.loss_iou: 0.2726 d3.loss_cls: 0.2548 d3.loss_bbox: 0.1448 d3.loss_iou: 0.2711 d4.loss_cls: 0.2508 d4.loss_bbox: 0.1469 d4.loss_iou: 0.2732 enc_loss_cls: 0.3039 enc_loss_bbox: 0.1708 enc_loss_iou: 0.3018 dn_loss_cls: 0.0841 dn_loss_bbox: 0.1626 dn_loss_iou: 0.2355 d0.dn_loss_cls: 0.1683 d0.dn_loss_bbox: 0.3345 d0.dn_loss_iou: 0.3925 d1.dn_loss_cls: 0.1183 d1.dn_loss_bbox: 0.2033 d1.dn_loss_iou: 0.2714 d2.dn_loss_cls: 0.0986 d2.dn_loss_bbox: 0.1726 d2.dn_loss_iou: 0.2449 d3.dn_loss_cls: 0.0891 d3.dn_loss_bbox: 0.1653 d3.dn_loss_iou: 0.2380 d4.dn_loss_cls: 0.0848 d4.dn_loss_bbox: 0.1626 d4.dn_loss_iou: 0.2355 d1.loss_lmm_region: 0.1252 loss_lmm_image: 
0.8199 2024/11/12 08:12:03 - mmengine - INFO - Iter(train) [ 79200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:26:27 time: 1.9871 data_time: 0.0183 memory: 34812 grad_norm: 28.6183 loss: 9.0490 loss_cls: 0.2694 loss_bbox: 0.1260 loss_iou: 0.2359 d0.loss_cls: 0.3156 d0.loss_bbox: 0.1334 d0.loss_iou: 0.2483 d1.loss_cls: 0.2848 d1.loss_bbox: 0.1354 d1.loss_iou: 0.2408 d2.loss_cls: 0.2793 d2.loss_bbox: 0.1255 d2.loss_iou: 0.2367 d3.loss_cls: 0.2736 d3.loss_bbox: 0.1256 d3.loss_iou: 0.2361 d4.loss_cls: 0.2700 d4.loss_bbox: 0.1253 d4.loss_iou: 0.2352 enc_loss_cls: 0.3238 enc_loss_bbox: 0.1489 enc_loss_iou: 0.2689 dn_loss_cls: 0.0984 dn_loss_bbox: 0.1640 dn_loss_iou: 0.2208 d0.dn_loss_cls: 0.1856 d0.dn_loss_bbox: 0.3175 d0.dn_loss_iou: 0.3683 d1.dn_loss_cls: 0.1266 d1.dn_loss_bbox: 0.1954 d1.dn_loss_iou: 0.2508 d2.dn_loss_cls: 0.1096 d2.dn_loss_bbox: 0.1741 d2.dn_loss_iou: 0.2308 d3.dn_loss_cls: 0.1019 d3.dn_loss_bbox: 0.1649 d3.dn_loss_iou: 0.2228 d4.dn_loss_cls: 0.0975 d4.dn_loss_bbox: 0.1641 d4.dn_loss_iou: 0.2208 d1.loss_lmm_region: 0.1460 loss_lmm_image: 0.8508 2024/11/12 08:15:21 - mmengine - INFO - Iter(train) [ 79300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:23:04 time: 1.9785 data_time: 0.0182 memory: 33503 grad_norm: 29.0938 loss: 9.9565 loss_cls: 0.3347 loss_bbox: 0.1516 loss_iou: 0.2724 d0.loss_cls: 0.3831 d0.loss_bbox: 0.1681 d0.loss_iou: 0.2885 d1.loss_cls: 0.3621 d1.loss_bbox: 0.1565 d1.loss_iou: 0.2752 d2.loss_cls: 0.3512 d2.loss_bbox: 0.1521 d2.loss_iou: 0.2703 d3.loss_cls: 0.3435 d3.loss_bbox: 0.1501 d3.loss_iou: 0.2690 d4.loss_cls: 0.3368 d4.loss_bbox: 0.1515 d4.loss_iou: 0.2715 enc_loss_cls: 0.3941 enc_loss_bbox: 0.1768 enc_loss_iou: 0.3095 dn_loss_cls: 0.1332 dn_loss_bbox: 0.1490 dn_loss_iou: 0.2034 d0.dn_loss_cls: 0.2105 d0.dn_loss_bbox: 0.2890 d0.dn_loss_iou: 0.3412 d1.dn_loss_cls: 0.1637 d1.dn_loss_bbox: 0.1862 d1.dn_loss_iou: 0.2357 d2.dn_loss_cls: 0.1447 d2.dn_loss_bbox: 0.1600 d2.dn_loss_iou: 0.2136 d3.dn_loss_cls: 0.1343 d3.dn_loss_bbox: 0.1514 d3.dn_loss_iou: 0.2059 d4.dn_loss_cls: 0.1332 d4.dn_loss_bbox: 0.1491 d4.dn_loss_iou: 0.2033 d1.loss_lmm_region: 0.1320 loss_lmm_image: 0.8485 2024/11/12 08:18:39 - mmengine - INFO - Iter(train) [ 79400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:19:42 time: 1.9821 data_time: 0.0182 memory: 32384 grad_norm: 39.2944 loss: 9.7339 loss_cls: 0.3161 loss_bbox: 0.1438 loss_iou: 0.2524 d0.loss_cls: 0.3581 d0.loss_bbox: 0.1669 d0.loss_iou: 0.2661 d1.loss_cls: 0.3329 d1.loss_bbox: 0.1564 d1.loss_iou: 0.2615 d2.loss_cls: 0.3240 d2.loss_bbox: 0.1528 d2.loss_iou: 0.2548 d3.loss_cls: 0.3203 d3.loss_bbox: 0.1474 d3.loss_iou: 0.2535 d4.loss_cls: 0.3179 d4.loss_bbox: 0.1444 d4.loss_iou: 0.2515 enc_loss_cls: 0.3535 enc_loss_bbox: 0.1888 enc_loss_iou: 0.2897 dn_loss_cls: 0.1135 dn_loss_bbox: 0.1595 dn_loss_iou: 0.2182 d0.dn_loss_cls: 0.2042 d0.dn_loss_bbox: 0.3208 d0.dn_loss_iou: 0.3600 d1.dn_loss_cls: 0.1454 d1.dn_loss_bbox: 0.1932 d1.dn_loss_iou: 0.2486 d2.dn_loss_cls: 0.1265 d2.dn_loss_bbox: 0.1692 d2.dn_loss_iou: 0.2275 d3.dn_loss_cls: 0.1195 d3.dn_loss_bbox: 0.1627 d3.dn_loss_iou: 0.2210 d4.dn_loss_cls: 0.1141 d4.dn_loss_bbox: 0.1596 d4.dn_loss_iou: 0.2183 d1.loss_lmm_region: 0.1808 loss_lmm_image: 0.8186 2024/11/12 08:21:57 - mmengine - INFO - Iter(train) [ 79500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:16:19 time: 1.9775 data_time: 0.0183 memory: 34716 grad_norm: 34.4520 loss: 8.5964 loss_cls: 0.2943 loss_bbox: 0.1165 loss_iou: 0.2212 d0.loss_cls: 0.3356 d0.loss_bbox: 0.1228 
d0.loss_iou: 0.2310 d1.loss_cls: 0.3121 d1.loss_bbox: 0.1183 d1.loss_iou: 0.2247 d2.loss_cls: 0.3057 d2.loss_bbox: 0.1136 d2.loss_iou: 0.2210 d3.loss_cls: 0.2978 d3.loss_bbox: 0.1186 d3.loss_iou: 0.2215 d4.loss_cls: 0.2959 d4.loss_bbox: 0.1162 d4.loss_iou: 0.2210 enc_loss_cls: 0.3385 enc_loss_bbox: 0.1392 enc_loss_iou: 0.2542 dn_loss_cls: 0.1009 dn_loss_bbox: 0.1362 dn_loss_iou: 0.1899 d0.dn_loss_cls: 0.1712 d0.dn_loss_bbox: 0.2673 d0.dn_loss_iou: 0.3212 d1.dn_loss_cls: 0.1266 d1.dn_loss_bbox: 0.1693 d1.dn_loss_iou: 0.2207 d2.dn_loss_cls: 0.1112 d2.dn_loss_bbox: 0.1480 d2.dn_loss_iou: 0.1994 d3.dn_loss_cls: 0.1032 d3.dn_loss_bbox: 0.1391 d3.dn_loss_iou: 0.1927 d4.dn_loss_cls: 0.1014 d4.dn_loss_bbox: 0.1362 d4.dn_loss_iou: 0.1899 d1.loss_lmm_region: 0.1310 loss_lmm_image: 0.8214 2024/11/12 08:25:14 - mmengine - INFO - Iter(train) [ 79600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:12:56 time: 1.9600 data_time: 0.0181 memory: 33865 grad_norm: 34.7824 loss: 10.1559 loss_cls: 0.3036 loss_bbox: 0.1660 loss_iou: 0.3135 d0.loss_cls: 0.3485 d0.loss_bbox: 0.1764 d0.loss_iou: 0.3256 d1.loss_cls: 0.3173 d1.loss_bbox: 0.1767 d1.loss_iou: 0.3234 d2.loss_cls: 0.3133 d2.loss_bbox: 0.1662 d2.loss_iou: 0.3133 d3.loss_cls: 0.3034 d3.loss_bbox: 0.1676 d3.loss_iou: 0.3152 d4.loss_cls: 0.3031 d4.loss_bbox: 0.1667 d4.loss_iou: 0.3136 enc_loss_cls: 0.3457 enc_loss_bbox: 0.1955 enc_loss_iou: 0.3584 dn_loss_cls: 0.0910 dn_loss_bbox: 0.1719 dn_loss_iou: 0.2402 d0.dn_loss_cls: 0.1639 d0.dn_loss_bbox: 0.3095 d0.dn_loss_iou: 0.3859 d1.dn_loss_cls: 0.1185 d1.dn_loss_bbox: 0.2033 d1.dn_loss_iou: 0.2716 d2.dn_loss_cls: 0.1011 d2.dn_loss_bbox: 0.1803 d2.dn_loss_iou: 0.2498 d3.dn_loss_cls: 0.0956 d3.dn_loss_bbox: 0.1744 d3.dn_loss_iou: 0.2430 d4.dn_loss_cls: 0.0921 d4.dn_loss_bbox: 0.1718 d4.dn_loss_iou: 0.2401 d1.loss_lmm_region: 0.1034 loss_lmm_image: 0.8358 2024/11/12 08:28:34 - mmengine - INFO - Iter(train) [ 79700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:09:34 time: 1.9902 data_time: 0.0182 memory: 34124 grad_norm: 32.3416 loss: 8.9681 loss_cls: 0.2613 loss_bbox: 0.1298 loss_iou: 0.1991 d0.loss_cls: 0.3010 d0.loss_bbox: 0.1471 d0.loss_iou: 0.2181 d1.loss_cls: 0.2818 d1.loss_bbox: 0.1309 d1.loss_iou: 0.2028 d2.loss_cls: 0.2773 d2.loss_bbox: 0.1268 d2.loss_iou: 0.1960 d3.loss_cls: 0.2667 d3.loss_bbox: 0.1280 d3.loss_iou: 0.1972 d4.loss_cls: 0.2599 d4.loss_bbox: 0.1311 d4.loss_iou: 0.2003 enc_loss_cls: 0.3121 enc_loss_bbox: 0.1588 enc_loss_iou: 0.2360 dn_loss_cls: 0.1194 dn_loss_bbox: 0.1933 dn_loss_iou: 0.2097 d0.dn_loss_cls: 0.1943 d0.dn_loss_bbox: 0.3543 d0.dn_loss_iou: 0.3548 d1.dn_loss_cls: 0.1468 d1.dn_loss_bbox: 0.2246 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1332 d2.dn_loss_bbox: 0.2031 d2.dn_loss_iou: 0.2192 d3.dn_loss_cls: 0.1250 d3.dn_loss_bbox: 0.1954 d3.dn_loss_iou: 0.2120 d4.dn_loss_cls: 0.1224 d4.dn_loss_bbox: 0.1933 d4.dn_loss_iou: 0.2098 d1.loss_lmm_region: 0.1346 loss_lmm_image: 0.8195 2024/11/12 08:31:53 - mmengine - INFO - Iter(train) [ 79800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:06:13 time: 1.9874 data_time: 0.0182 memory: 33917 grad_norm: 36.0910 loss: 9.4611 loss_cls: 0.3205 loss_bbox: 0.1403 loss_iou: 0.2595 d0.loss_cls: 0.3683 d0.loss_bbox: 0.1434 d0.loss_iou: 0.2623 d1.loss_cls: 0.3363 d1.loss_bbox: 0.1440 d1.loss_iou: 0.2670 d2.loss_cls: 0.3351 d2.loss_bbox: 0.1367 d2.loss_iou: 0.2574 d3.loss_cls: 0.3256 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2591 d4.loss_cls: 0.3195 d4.loss_bbox: 0.1426 d4.loss_iou: 0.2603 enc_loss_cls: 0.3735 enc_loss_bbox: 
0.1626 enc_loss_iou: 0.2886 dn_loss_cls: 0.1131 dn_loss_bbox: 0.1496 dn_loss_iou: 0.1984 d0.dn_loss_cls: 0.1806 d0.dn_loss_bbox: 0.2998 d0.dn_loss_iou: 0.3346 d1.dn_loss_cls: 0.1339 d1.dn_loss_bbox: 0.1835 d1.dn_loss_iou: 0.2309 d2.dn_loss_cls: 0.1204 d2.dn_loss_bbox: 0.1605 d2.dn_loss_iou: 0.2100 d3.dn_loss_cls: 0.1145 d3.dn_loss_bbox: 0.1529 d3.dn_loss_iou: 0.2012 d4.dn_loss_cls: 0.1134 d4.dn_loss_bbox: 0.1496 d4.dn_loss_iou: 0.1984 d1.loss_lmm_region: 0.1441 loss_lmm_image: 0.8283 2024/11/12 08:35:11 - mmengine - INFO - Iter(train) [ 79900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 15:02:49 time: 1.9645 data_time: 0.0183 memory: 35065 grad_norm: 29.0195 loss: 8.5109 loss_cls: 0.2632 loss_bbox: 0.1139 loss_iou: 0.2018 d0.loss_cls: 0.2986 d0.loss_bbox: 0.1218 d0.loss_iou: 0.2100 d1.loss_cls: 0.2716 d1.loss_bbox: 0.1204 d1.loss_iou: 0.2059 d2.loss_cls: 0.2719 d2.loss_bbox: 0.1145 d2.loss_iou: 0.2011 d3.loss_cls: 0.2669 d3.loss_bbox: 0.1124 d3.loss_iou: 0.2009 d4.loss_cls: 0.2636 d4.loss_bbox: 0.1142 d4.loss_iou: 0.2018 enc_loss_cls: 0.3125 enc_loss_bbox: 0.1348 enc_loss_iou: 0.2323 dn_loss_cls: 0.0935 dn_loss_bbox: 0.1678 dn_loss_iou: 0.2008 d0.dn_loss_cls: 0.1691 d0.dn_loss_bbox: 0.3019 d0.dn_loss_iou: 0.3330 d1.dn_loss_cls: 0.1246 d1.dn_loss_bbox: 0.1984 d1.dn_loss_iou: 0.2314 d2.dn_loss_cls: 0.1052 d2.dn_loss_bbox: 0.1771 d2.dn_loss_iou: 0.2110 d3.dn_loss_cls: 0.0997 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2030 d4.dn_loss_cls: 0.0949 d4.dn_loss_bbox: 0.1678 d4.dn_loss_iou: 0.2009 d1.loss_lmm_region: 0.1539 loss_lmm_image: 0.8741 2024/11/12 08:38:28 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 08:38:28 - mmengine - INFO - Iter(train) [ 80000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:59:26 time: 1.9736 data_time: 0.0182 memory: 34098 grad_norm: 28.2187 loss: 9.8503 loss_cls: 0.2934 loss_bbox: 0.1534 loss_iou: 0.2660 d0.loss_cls: 0.3431 d0.loss_bbox: 0.1613 d0.loss_iou: 0.2754 d1.loss_cls: 0.3155 d1.loss_bbox: 0.1560 d1.loss_iou: 0.2689 d2.loss_cls: 0.3027 d2.loss_bbox: 0.1579 d2.loss_iou: 0.2687 d3.loss_cls: 0.2998 d3.loss_bbox: 0.1552 d3.loss_iou: 0.2676 d4.loss_cls: 0.2969 d4.loss_bbox: 0.1523 d4.loss_iou: 0.2645 enc_loss_cls: 0.3399 enc_loss_bbox: 0.1764 enc_loss_iou: 0.3029 dn_loss_cls: 0.1311 dn_loss_bbox: 0.1705 dn_loss_iou: 0.2247 d0.dn_loss_cls: 0.2049 d0.dn_loss_bbox: 0.3085 d0.dn_loss_iou: 0.3584 d1.dn_loss_cls: 0.1535 d1.dn_loss_bbox: 0.2012 d1.dn_loss_iou: 0.2557 d2.dn_loss_cls: 0.1418 d2.dn_loss_bbox: 0.1768 d2.dn_loss_iou: 0.2331 d3.dn_loss_cls: 0.1364 d3.dn_loss_bbox: 0.1711 d3.dn_loss_iou: 0.2272 d4.dn_loss_cls: 0.1314 d4.dn_loss_bbox: 0.1705 d4.dn_loss_iou: 0.2247 d1.loss_lmm_region: 0.1247 loss_lmm_image: 0.8864 2024/11/12 08:41:50 - mmengine - INFO - Iter(train) [ 80100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:56:07 time: 2.0298 data_time: 0.0182 memory: 33271 grad_norm: 35.8142 loss: 9.7443 loss_cls: 0.3167 loss_bbox: 0.1391 loss_iou: 0.2508 d0.loss_cls: 0.3584 d0.loss_bbox: 0.1475 d0.loss_iou: 0.2644 d1.loss_cls: 0.3313 d1.loss_bbox: 0.1423 d1.loss_iou: 0.2604 d2.loss_cls: 0.3245 d2.loss_bbox: 0.1398 d2.loss_iou: 0.2538 d3.loss_cls: 0.3243 d3.loss_bbox: 0.1384 d3.loss_iou: 0.2511 d4.loss_cls: 0.3194 d4.loss_bbox: 0.1386 d4.loss_iou: 0.2502 enc_loss_cls: 0.3708 enc_loss_bbox: 0.1617 enc_loss_iou: 0.2886 dn_loss_cls: 0.1081 dn_loss_bbox: 0.1773 dn_loss_iou: 0.2262 d0.dn_loss_cls: 0.1915 d0.dn_loss_bbox: 0.3368 d0.dn_loss_iou: 0.3811 d1.dn_loss_cls: 0.1374 d1.dn_loss_bbox: 0.2061 
d1.dn_loss_iou: 0.2577 d2.dn_loss_cls: 0.1194 d2.dn_loss_bbox: 0.1874 d2.dn_loss_iou: 0.2361 d3.dn_loss_cls: 0.1126 d3.dn_loss_bbox: 0.1797 d3.dn_loss_iou: 0.2286 d4.dn_loss_cls: 0.1107 d4.dn_loss_bbox: 0.1772 d4.dn_loss_iou: 0.2260 d1.loss_lmm_region: 0.1418 loss_lmm_image: 0.8305 2024/11/12 08:45:10 - mmengine - INFO - Iter(train) [ 80200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:52:46 time: 1.9766 data_time: 0.0183 memory: 33901 grad_norm: 27.5569 loss: 7.3796 loss_cls: 0.2248 loss_bbox: 0.0912 loss_iou: 0.1708 d0.loss_cls: 0.2603 d0.loss_bbox: 0.1059 d0.loss_iou: 0.1894 d1.loss_cls: 0.2386 d1.loss_bbox: 0.0973 d1.loss_iou: 0.1788 d2.loss_cls: 0.2313 d2.loss_bbox: 0.0930 d2.loss_iou: 0.1750 d3.loss_cls: 0.2255 d3.loss_bbox: 0.0923 d3.loss_iou: 0.1734 d4.loss_cls: 0.2242 d4.loss_bbox: 0.0918 d4.loss_iou: 0.1721 enc_loss_cls: 0.2681 enc_loss_bbox: 0.1227 enc_loss_iou: 0.2175 dn_loss_cls: 0.0752 dn_loss_bbox: 0.1315 dn_loss_iou: 0.1697 d0.dn_loss_cls: 0.1459 d0.dn_loss_bbox: 0.2776 d0.dn_loss_iou: 0.3049 d1.dn_loss_cls: 0.1086 d1.dn_loss_bbox: 0.1631 d1.dn_loss_iou: 0.2001 d2.dn_loss_cls: 0.0859 d2.dn_loss_bbox: 0.1426 d2.dn_loss_iou: 0.1798 d3.dn_loss_cls: 0.0783 d3.dn_loss_bbox: 0.1333 d3.dn_loss_iou: 0.1720 d4.dn_loss_cls: 0.0744 d4.dn_loss_bbox: 0.1315 d4.dn_loss_iou: 0.1696 d1.loss_lmm_region: 0.1211 loss_lmm_image: 0.8703 2024/11/12 08:48:26 - mmengine - INFO - Iter(train) [ 80300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:49:22 time: 1.9597 data_time: 0.0182 memory: 32173 grad_norm: 22.8974 loss: 10.3156 loss_cls: 0.3193 loss_bbox: 0.1664 loss_iou: 0.2849 d0.loss_cls: 0.3614 d0.loss_bbox: 0.1833 d0.loss_iou: 0.3110 d1.loss_cls: 0.3367 d1.loss_bbox: 0.1732 d1.loss_iou: 0.2949 d2.loss_cls: 0.3301 d2.loss_bbox: 0.1730 d2.loss_iou: 0.2897 d3.loss_cls: 0.3247 d3.loss_bbox: 0.1656 d3.loss_iou: 0.2855 d4.loss_cls: 0.3183 d4.loss_bbox: 0.1675 d4.loss_iou: 0.2847 enc_loss_cls: 0.3562 enc_loss_bbox: 0.2059 enc_loss_iou: 0.3388 dn_loss_cls: 0.0886 dn_loss_bbox: 0.1937 dn_loss_iou: 0.2389 d0.dn_loss_cls: 0.1784 d0.dn_loss_bbox: 0.3472 d0.dn_loss_iou: 0.3952 d1.dn_loss_cls: 0.1257 d1.dn_loss_bbox: 0.2303 d1.dn_loss_iou: 0.2758 d2.dn_loss_cls: 0.1048 d2.dn_loss_bbox: 0.2070 d2.dn_loss_iou: 0.2515 d3.dn_loss_cls: 0.0970 d3.dn_loss_bbox: 0.1966 d3.dn_loss_iou: 0.2421 d4.dn_loss_cls: 0.0901 d4.dn_loss_bbox: 0.1938 d4.dn_loss_iou: 0.2389 d1.loss_lmm_region: 0.1436 loss_lmm_image: 0.8056 2024/11/12 08:51:45 - mmengine - INFO - Iter(train) [ 80400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:46:00 time: 1.9923 data_time: 0.0182 memory: 35743 grad_norm: 25.2577 loss: 9.6515 loss_cls: 0.3402 loss_bbox: 0.1363 loss_iou: 0.2564 d0.loss_cls: 0.3816 d0.loss_bbox: 0.1467 d0.loss_iou: 0.2772 d1.loss_cls: 0.3608 d1.loss_bbox: 0.1365 d1.loss_iou: 0.2619 d2.loss_cls: 0.3449 d2.loss_bbox: 0.1369 d2.loss_iou: 0.2628 d3.loss_cls: 0.3452 d3.loss_bbox: 0.1331 d3.loss_iou: 0.2546 d4.loss_cls: 0.3432 d4.loss_bbox: 0.1366 d4.loss_iou: 0.2571 enc_loss_cls: 0.3853 enc_loss_bbox: 0.1621 enc_loss_iou: 0.3043 dn_loss_cls: 0.1179 dn_loss_bbox: 0.1457 dn_loss_iou: 0.2018 d0.dn_loss_cls: 0.1969 d0.dn_loss_bbox: 0.2873 d0.dn_loss_iou: 0.3402 d1.dn_loss_cls: 0.1527 d1.dn_loss_bbox: 0.1756 d1.dn_loss_iou: 0.2301 d2.dn_loss_cls: 0.1351 d2.dn_loss_bbox: 0.1551 d2.dn_loss_iou: 0.2108 d3.dn_loss_cls: 0.1267 d3.dn_loss_bbox: 0.1476 d3.dn_loss_iou: 0.2040 d4.dn_loss_cls: 0.1201 d4.dn_loss_bbox: 0.1458 d4.dn_loss_iou: 0.2019 d1.loss_lmm_region: 0.1258 loss_lmm_image: 0.8667 2024/11/12 08:55:04 - 
mmengine - INFO - Iter(train) [ 80500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:42:38 time: 1.9721 data_time: 0.0182 memory: 35192 grad_norm: 30.4683 loss: 8.8740 loss_cls: 0.2662 loss_bbox: 0.1364 loss_iou: 0.2472 d0.loss_cls: 0.3086 d0.loss_bbox: 0.1455 d0.loss_iou: 0.2650 d1.loss_cls: 0.2867 d1.loss_bbox: 0.1388 d1.loss_iou: 0.2521 d2.loss_cls: 0.2791 d2.loss_bbox: 0.1354 d2.loss_iou: 0.2470 d3.loss_cls: 0.2724 d3.loss_bbox: 0.1364 d3.loss_iou: 0.2442 d4.loss_cls: 0.2679 d4.loss_bbox: 0.1361 d4.loss_iou: 0.2444 enc_loss_cls: 0.3233 enc_loss_bbox: 0.1549 enc_loss_iou: 0.2819 dn_loss_cls: 0.0858 dn_loss_bbox: 0.1444 dn_loss_iou: 0.2080 d0.dn_loss_cls: 0.1651 d0.dn_loss_bbox: 0.2944 d0.dn_loss_iou: 0.3566 d1.dn_loss_cls: 0.1147 d1.dn_loss_bbox: 0.1783 d1.dn_loss_iou: 0.2408 d2.dn_loss_cls: 0.0996 d2.dn_loss_bbox: 0.1546 d2.dn_loss_iou: 0.2180 d3.dn_loss_cls: 0.0912 d3.dn_loss_bbox: 0.1462 d3.dn_loss_iou: 0.2105 d4.dn_loss_cls: 0.0870 d4.dn_loss_bbox: 0.1443 d4.dn_loss_iou: 0.2079 d1.loss_lmm_region: 0.1351 loss_lmm_image: 0.8222 2024/11/12 08:58:22 - mmengine - INFO - Iter(train) [ 80600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:39:15 time: 1.9869 data_time: 0.0182 memory: 33744 grad_norm: 26.4906 loss: 9.3495 loss_cls: 0.3145 loss_bbox: 0.1206 loss_iou: 0.2590 d0.loss_cls: 0.3560 d0.loss_bbox: 0.1319 d0.loss_iou: 0.2721 d1.loss_cls: 0.3402 d1.loss_bbox: 0.1272 d1.loss_iou: 0.2607 d2.loss_cls: 0.3258 d2.loss_bbox: 0.1298 d2.loss_iou: 0.2628 d3.loss_cls: 0.3202 d3.loss_bbox: 0.1240 d3.loss_iou: 0.2586 d4.loss_cls: 0.3177 d4.loss_bbox: 0.1201 d4.loss_iou: 0.2579 enc_loss_cls: 0.3625 enc_loss_bbox: 0.1503 enc_loss_iou: 0.2983 dn_loss_cls: 0.1148 dn_loss_bbox: 0.1347 dn_loss_iou: 0.2137 d0.dn_loss_cls: 0.1898 d0.dn_loss_bbox: 0.2629 d0.dn_loss_iou: 0.3504 d1.dn_loss_cls: 0.1479 d1.dn_loss_bbox: 0.1663 d1.dn_loss_iou: 0.2440 d2.dn_loss_cls: 0.1305 d2.dn_loss_bbox: 0.1455 d2.dn_loss_iou: 0.2238 d3.dn_loss_cls: 0.1195 d3.dn_loss_bbox: 0.1368 d3.dn_loss_iou: 0.2166 d4.dn_loss_cls: 0.1155 d4.dn_loss_bbox: 0.1348 d4.dn_loss_iou: 0.2136 d1.loss_lmm_region: 0.1310 loss_lmm_image: 0.8469 2024/11/12 09:01:40 - mmengine - INFO - Iter(train) [ 80700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:35:52 time: 1.9889 data_time: 0.0182 memory: 34683 grad_norm: 32.8036 loss: 9.0850 loss_cls: 0.2774 loss_bbox: 0.1230 loss_iou: 0.2237 d0.loss_cls: 0.3249 d0.loss_bbox: 0.1404 d0.loss_iou: 0.2459 d1.loss_cls: 0.3019 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2302 d2.loss_cls: 0.2909 d2.loss_bbox: 0.1241 d2.loss_iou: 0.2261 d3.loss_cls: 0.2870 d3.loss_bbox: 0.1213 d3.loss_iou: 0.2192 d4.loss_cls: 0.2794 d4.loss_bbox: 0.1235 d4.loss_iou: 0.2234 enc_loss_cls: 0.3242 enc_loss_bbox: 0.1603 enc_loss_iou: 0.2716 dn_loss_cls: 0.1043 dn_loss_bbox: 0.1644 dn_loss_iou: 0.2136 d0.dn_loss_cls: 0.1882 d0.dn_loss_bbox: 0.3058 d0.dn_loss_iou: 0.3592 d1.dn_loss_cls: 0.1398 d1.dn_loss_bbox: 0.2016 d1.dn_loss_iou: 0.2484 d2.dn_loss_cls: 0.1181 d2.dn_loss_bbox: 0.1784 d2.dn_loss_iou: 0.2254 d3.dn_loss_cls: 0.1071 d3.dn_loss_bbox: 0.1665 d3.dn_loss_iou: 0.2165 d4.dn_loss_cls: 0.1051 d4.dn_loss_bbox: 0.1643 d4.dn_loss_iou: 0.2135 d1.loss_lmm_region: 0.1642 loss_lmm_image: 0.8543 2024/11/12 09:05:01 - mmengine - INFO - Iter(train) [ 80800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:32:32 time: 1.9919 data_time: 0.0183 memory: 34381 grad_norm: 30.5811 loss: 8.8316 loss_cls: 0.2588 loss_bbox: 0.1276 loss_iou: 0.2571 d0.loss_cls: 0.3086 d0.loss_bbox: 0.1318 d0.loss_iou: 0.2702 d1.loss_cls: 
0.2868 d1.loss_bbox: 0.1259 d1.loss_iou: 0.2604 d2.loss_cls: 0.2751 d2.loss_bbox: 0.1227 d2.loss_iou: 0.2545 d3.loss_cls: 0.2636 d3.loss_bbox: 0.1262 d3.loss_iou: 0.2579 d4.loss_cls: 0.2625 d4.loss_bbox: 0.1258 d4.loss_iou: 0.2560 enc_loss_cls: 0.3114 enc_loss_bbox: 0.1443 enc_loss_iou: 0.2895 dn_loss_cls: 0.0846 dn_loss_bbox: 0.1407 dn_loss_iou: 0.2135 d0.dn_loss_cls: 0.1652 d0.dn_loss_bbox: 0.2887 d0.dn_loss_iou: 0.3546 d1.dn_loss_cls: 0.1098 d1.dn_loss_bbox: 0.1695 d1.dn_loss_iou: 0.2429 d2.dn_loss_cls: 0.0954 d2.dn_loss_bbox: 0.1481 d2.dn_loss_iou: 0.2217 d3.dn_loss_cls: 0.0894 d3.dn_loss_bbox: 0.1432 d3.dn_loss_iou: 0.2160 d4.dn_loss_cls: 0.0856 d4.dn_loss_bbox: 0.1408 d4.dn_loss_iou: 0.2135 d1.loss_lmm_region: 0.1337 loss_lmm_image: 0.8580 2024/11/12 09:08:21 - mmengine - INFO - Iter(train) [ 80900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:29:11 time: 1.9953 data_time: 0.0182 memory: 34563 grad_norm: 35.5796 loss: 9.5545 loss_cls: 0.2848 loss_bbox: 0.1408 loss_iou: 0.2281 d0.loss_cls: 0.3445 d0.loss_bbox: 0.1544 d0.loss_iou: 0.2412 d1.loss_cls: 0.3108 d1.loss_bbox: 0.1477 d1.loss_iou: 0.2374 d2.loss_cls: 0.2989 d2.loss_bbox: 0.1442 d2.loss_iou: 0.2312 d3.loss_cls: 0.2890 d3.loss_bbox: 0.1415 d3.loss_iou: 0.2287 d4.loss_cls: 0.2885 d4.loss_bbox: 0.1401 d4.loss_iou: 0.2291 enc_loss_cls: 0.3498 enc_loss_bbox: 0.1693 enc_loss_iou: 0.2629 dn_loss_cls: 0.1169 dn_loss_bbox: 0.1891 dn_loss_iou: 0.2273 d0.dn_loss_cls: 0.2059 d0.dn_loss_bbox: 0.3286 d0.dn_loss_iou: 0.3695 d1.dn_loss_cls: 0.1536 d1.dn_loss_bbox: 0.2201 d1.dn_loss_iou: 0.2570 d2.dn_loss_cls: 0.1312 d2.dn_loss_bbox: 0.1985 d2.dn_loss_iou: 0.2357 d3.dn_loss_cls: 0.1237 d3.dn_loss_bbox: 0.1909 d3.dn_loss_iou: 0.2296 d4.dn_loss_cls: 0.1190 d4.dn_loss_bbox: 0.1891 d4.dn_loss_iou: 0.2274 d1.loss_lmm_region: 0.1372 loss_lmm_image: 0.8409 2024/11/12 09:11:40 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 09:11:40 - mmengine - INFO - Iter(train) [ 81000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:25:50 time: 2.0273 data_time: 0.0181 memory: 35864 grad_norm: 27.7353 loss: 8.9672 loss_cls: 0.2702 loss_bbox: 0.1339 loss_iou: 0.2305 d0.loss_cls: 0.3123 d0.loss_bbox: 0.1564 d0.loss_iou: 0.2479 d1.loss_cls: 0.2888 d1.loss_bbox: 0.1429 d1.loss_iou: 0.2382 d2.loss_cls: 0.2831 d2.loss_bbox: 0.1327 d2.loss_iou: 0.2296 d3.loss_cls: 0.2791 d3.loss_bbox: 0.1310 d3.loss_iou: 0.2289 d4.loss_cls: 0.2730 d4.loss_bbox: 0.1332 d4.loss_iou: 0.2307 enc_loss_cls: 0.3138 enc_loss_bbox: 0.1657 enc_loss_iou: 0.2695 dn_loss_cls: 0.1066 dn_loss_bbox: 0.1530 dn_loss_iou: 0.2120 d0.dn_loss_cls: 0.1894 d0.dn_loss_bbox: 0.2956 d0.dn_loss_iou: 0.3550 d1.dn_loss_cls: 0.1398 d1.dn_loss_bbox: 0.1857 d1.dn_loss_iou: 0.2431 d2.dn_loss_cls: 0.1162 d2.dn_loss_bbox: 0.1624 d2.dn_loss_iou: 0.2228 d3.dn_loss_cls: 0.1084 d3.dn_loss_bbox: 0.1553 d3.dn_loss_iou: 0.2148 d4.dn_loss_cls: 0.1056 d4.dn_loss_bbox: 0.1530 d4.dn_loss_iou: 0.2120 d1.loss_lmm_region: 0.1346 loss_lmm_image: 0.8104 2024/11/12 09:15:00 - mmengine - INFO - Iter(train) [ 81100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:22:29 time: 2.0037 data_time: 0.0182 memory: 33167 grad_norm: 32.3700 loss: 8.5439 loss_cls: 0.2577 loss_bbox: 0.1224 loss_iou: 0.2094 d0.loss_cls: 0.3000 d0.loss_bbox: 0.1330 d0.loss_iou: 0.2235 d1.loss_cls: 0.2805 d1.loss_bbox: 0.1269 d1.loss_iou: 0.2168 d2.loss_cls: 0.2676 d2.loss_bbox: 0.1272 d2.loss_iou: 0.2160 d3.loss_cls: 0.2632 d3.loss_bbox: 0.1213 d3.loss_iou: 0.2108 d4.loss_cls: 0.2596 d4.loss_bbox: 0.1212 
d4.loss_iou: 0.2099 enc_loss_cls: 0.2950 enc_loss_bbox: 0.1480 enc_loss_iou: 0.2444 dn_loss_cls: 0.0806 dn_loss_bbox: 0.1585 dn_loss_iou: 0.2150 d0.dn_loss_cls: 0.1604 d0.dn_loss_bbox: 0.3120 d0.dn_loss_iou: 0.3635 d1.dn_loss_cls: 0.1082 d1.dn_loss_bbox: 0.1938 d1.dn_loss_iou: 0.2464 d2.dn_loss_cls: 0.0900 d2.dn_loss_bbox: 0.1689 d2.dn_loss_iou: 0.2241 d3.dn_loss_cls: 0.0851 d3.dn_loss_bbox: 0.1601 d3.dn_loss_iou: 0.2171 d4.dn_loss_cls: 0.0814 d4.dn_loss_bbox: 0.1585 d4.dn_loss_iou: 0.2151 d1.loss_lmm_region: 0.1216 loss_lmm_image: 0.8294 2024/11/12 09:18:21 - mmengine - INFO - Iter(train) [ 81200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:19:09 time: 2.0174 data_time: 0.0182 memory: 34430 grad_norm: 26.7225 loss: 10.0261 loss_cls: 0.3187 loss_bbox: 0.1430 loss_iou: 0.2618 d0.loss_cls: 0.3649 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2762 d1.loss_cls: 0.3420 d1.loss_bbox: 0.1461 d1.loss_iou: 0.2678 d2.loss_cls: 0.3324 d2.loss_bbox: 0.1425 d2.loss_iou: 0.2623 d3.loss_cls: 0.3257 d3.loss_bbox: 0.1416 d3.loss_iou: 0.2594 d4.loss_cls: 0.3236 d4.loss_bbox: 0.1389 d4.loss_iou: 0.2585 enc_loss_cls: 0.3628 enc_loss_bbox: 0.1657 enc_loss_iou: 0.2981 dn_loss_cls: 0.1211 dn_loss_bbox: 0.1848 dn_loss_iou: 0.2330 d0.dn_loss_cls: 0.2089 d0.dn_loss_bbox: 0.3176 d0.dn_loss_iou: 0.3721 d1.dn_loss_cls: 0.1561 d1.dn_loss_bbox: 0.2127 d1.dn_loss_iou: 0.2625 d2.dn_loss_cls: 0.1358 d2.dn_loss_bbox: 0.1913 d2.dn_loss_iou: 0.2399 d3.dn_loss_cls: 0.1264 d3.dn_loss_bbox: 0.1868 d3.dn_loss_iou: 0.2348 d4.dn_loss_cls: 0.1218 d4.dn_loss_bbox: 0.1848 d4.dn_loss_iou: 0.2330 d1.loss_lmm_region: 0.1526 loss_lmm_image: 0.8669 2024/11/12 09:21:41 - mmengine - INFO - Iter(train) [ 81300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:15:47 time: 1.9963 data_time: 0.0183 memory: 32998 grad_norm: 29.9521 loss: 9.8468 loss_cls: 0.2755 loss_bbox: 0.1514 loss_iou: 0.2544 d0.loss_cls: 0.3255 d0.loss_bbox: 0.1626 d0.loss_iou: 0.2671 d1.loss_cls: 0.3063 d1.loss_bbox: 0.1603 d1.loss_iou: 0.2623 d2.loss_cls: 0.2878 d2.loss_bbox: 0.1578 d2.loss_iou: 0.2628 d3.loss_cls: 0.2806 d3.loss_bbox: 0.1535 d3.loss_iou: 0.2582 d4.loss_cls: 0.2770 d4.loss_bbox: 0.1526 d4.loss_iou: 0.2566 enc_loss_cls: 0.3290 enc_loss_bbox: 0.1762 enc_loss_iou: 0.2874 dn_loss_cls: 0.1191 dn_loss_bbox: 0.1947 dn_loss_iou: 0.2322 d0.dn_loss_cls: 0.2007 d0.dn_loss_bbox: 0.3553 d0.dn_loss_iou: 0.3740 d1.dn_loss_cls: 0.1505 d1.dn_loss_bbox: 0.2277 d1.dn_loss_iou: 0.2617 d2.dn_loss_cls: 0.1277 d2.dn_loss_bbox: 0.2071 d2.dn_loss_iou: 0.2424 d3.dn_loss_cls: 0.1228 d3.dn_loss_bbox: 0.1971 d3.dn_loss_iou: 0.2342 d4.dn_loss_cls: 0.1221 d4.dn_loss_bbox: 0.1947 d4.dn_loss_iou: 0.2321 d1.loss_lmm_region: 0.1467 loss_lmm_image: 0.8594 2024/11/12 09:24:59 - mmengine - INFO - Iter(train) [ 81400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:12:25 time: 1.9887 data_time: 0.0185 memory: 34266 grad_norm: 29.0612 loss: 9.2812 loss_cls: 0.3042 loss_bbox: 0.1397 loss_iou: 0.2478 d0.loss_cls: 0.3533 d0.loss_bbox: 0.1558 d0.loss_iou: 0.2688 d1.loss_cls: 0.3254 d1.loss_bbox: 0.1455 d1.loss_iou: 0.2540 d2.loss_cls: 0.3150 d2.loss_bbox: 0.1388 d2.loss_iou: 0.2508 d3.loss_cls: 0.3138 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2435 d4.loss_cls: 0.3086 d4.loss_bbox: 0.1359 d4.loss_iou: 0.2438 enc_loss_cls: 0.3579 enc_loss_bbox: 0.1693 enc_loss_iou: 0.2971 dn_loss_cls: 0.0772 dn_loss_bbox: 0.1499 dn_loss_iou: 0.2126 d0.dn_loss_cls: 0.1506 d0.dn_loss_bbox: 0.3043 d0.dn_loss_iou: 0.3661 d1.dn_loss_cls: 0.1051 d1.dn_loss_bbox: 0.1885 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 
0.0873 d2.dn_loss_bbox: 0.1640 d2.dn_loss_iou: 0.2244 d3.dn_loss_cls: 0.0825 d3.dn_loss_bbox: 0.1524 d3.dn_loss_iou: 0.2153 d4.dn_loss_cls: 0.0781 d4.dn_loss_bbox: 0.1500 d4.dn_loss_iou: 0.2126 d1.loss_lmm_region: 0.1097 loss_lmm_image: 0.8984 2024/11/12 09:28:20 - mmengine - INFO - Iter(train) [ 81500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:09:05 time: 2.0189 data_time: 0.0183 memory: 34171 grad_norm: 30.6924 loss: 9.1164 loss_cls: 0.2753 loss_bbox: 0.1362 loss_iou: 0.2450 d0.loss_cls: 0.3195 d0.loss_bbox: 0.1443 d0.loss_iou: 0.2581 d1.loss_cls: 0.2946 d1.loss_bbox: 0.1347 d1.loss_iou: 0.2485 d2.loss_cls: 0.2870 d2.loss_bbox: 0.1363 d2.loss_iou: 0.2460 d3.loss_cls: 0.2817 d3.loss_bbox: 0.1359 d3.loss_iou: 0.2438 d4.loss_cls: 0.2773 d4.loss_bbox: 0.1350 d4.loss_iou: 0.2449 enc_loss_cls: 0.3219 enc_loss_bbox: 0.1585 enc_loss_iou: 0.2825 dn_loss_cls: 0.0927 dn_loss_bbox: 0.1632 dn_loss_iou: 0.2138 d0.dn_loss_cls: 0.1648 d0.dn_loss_bbox: 0.3096 d0.dn_loss_iou: 0.3573 d1.dn_loss_cls: 0.1212 d1.dn_loss_bbox: 0.1931 d1.dn_loss_iou: 0.2425 d2.dn_loss_cls: 0.1069 d2.dn_loss_bbox: 0.1701 d2.dn_loss_iou: 0.2220 d3.dn_loss_cls: 0.1010 d3.dn_loss_bbox: 0.1648 d3.dn_loss_iou: 0.2157 d4.dn_loss_cls: 0.0952 d4.dn_loss_bbox: 0.1632 d4.dn_loss_iou: 0.2138 d1.loss_lmm_region: 0.1298 loss_lmm_image: 0.8686 2024/11/12 09:31:40 - mmengine - INFO - Iter(train) [ 81600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:05:44 time: 1.9997 data_time: 0.0183 memory: 34273 grad_norm: 33.3867 loss: 10.2931 loss_cls: 0.3493 loss_bbox: 0.1547 loss_iou: 0.2782 d0.loss_cls: 0.3878 d0.loss_bbox: 0.1669 d0.loss_iou: 0.2910 d1.loss_cls: 0.3658 d1.loss_bbox: 0.1609 d1.loss_iou: 0.2869 d2.loss_cls: 0.3589 d2.loss_bbox: 0.1543 d2.loss_iou: 0.2794 d3.loss_cls: 0.3543 d3.loss_bbox: 0.1524 d3.loss_iou: 0.2777 d4.loss_cls: 0.3487 d4.loss_bbox: 0.1537 d4.loss_iou: 0.2787 enc_loss_cls: 0.3925 enc_loss_bbox: 0.1804 enc_loss_iou: 0.3163 dn_loss_cls: 0.1660 dn_loss_bbox: 0.1541 dn_loss_iou: 0.2041 d0.dn_loss_cls: 0.2183 d0.dn_loss_bbox: 0.2807 d0.dn_loss_iou: 0.3369 d1.dn_loss_cls: 0.1836 d1.dn_loss_bbox: 0.1787 d1.dn_loss_iou: 0.2310 d2.dn_loss_cls: 0.1717 d2.dn_loss_bbox: 0.1623 d2.dn_loss_iou: 0.2131 d3.dn_loss_cls: 0.1654 d3.dn_loss_bbox: 0.1559 d3.dn_loss_iou: 0.2063 d4.dn_loss_cls: 0.1663 d4.dn_loss_bbox: 0.1541 d4.dn_loss_iou: 0.2041 d1.loss_lmm_region: 0.1600 loss_lmm_image: 0.8916 2024/11/12 09:34:59 - mmengine - INFO - Iter(train) [ 81700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 14:02:22 time: 1.9985 data_time: 0.0182 memory: 34361 grad_norm: 32.3469 loss: 10.1088 loss_cls: 0.3466 loss_bbox: 0.1553 loss_iou: 0.2943 d0.loss_cls: 0.4128 d0.loss_bbox: 0.1715 d0.loss_iou: 0.3102 d1.loss_cls: 0.3768 d1.loss_bbox: 0.1594 d1.loss_iou: 0.2983 d2.loss_cls: 0.3636 d2.loss_bbox: 0.1537 d2.loss_iou: 0.2922 d3.loss_cls: 0.3553 d3.loss_bbox: 0.1534 d3.loss_iou: 0.2935 d4.loss_cls: 0.3486 d4.loss_bbox: 0.1554 d4.loss_iou: 0.2933 enc_loss_cls: 0.4052 enc_loss_bbox: 0.1855 enc_loss_iou: 0.3311 dn_loss_cls: 0.0985 dn_loss_bbox: 0.1554 dn_loss_iou: 0.2051 d0.dn_loss_cls: 0.1813 d0.dn_loss_bbox: 0.2942 d0.dn_loss_iou: 0.3358 d1.dn_loss_cls: 0.1288 d1.dn_loss_bbox: 0.1935 d1.dn_loss_iou: 0.2349 d2.dn_loss_cls: 0.1137 d2.dn_loss_bbox: 0.1696 d2.dn_loss_iou: 0.2156 d3.dn_loss_cls: 0.1053 d3.dn_loss_bbox: 0.1589 d3.dn_loss_iou: 0.2083 d4.dn_loss_cls: 0.0994 d4.dn_loss_bbox: 0.1552 d4.dn_loss_iou: 0.2050 d1.loss_lmm_region: 0.1376 loss_lmm_image: 0.8567 2024/11/12 09:38:20 - mmengine - INFO - Iter(train) [ 
81800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:59:02 time: 2.0283 data_time: 0.0182 memory: 34411 grad_norm: 27.3164 loss: 9.4905 loss_cls: 0.3022 loss_bbox: 0.1386 loss_iou: 0.2607 d0.loss_cls: 0.3514 d0.loss_bbox: 0.1499 d0.loss_iou: 0.2760 d1.loss_cls: 0.3214 d1.loss_bbox: 0.1469 d1.loss_iou: 0.2674 d2.loss_cls: 0.3172 d2.loss_bbox: 0.1408 d2.loss_iou: 0.2612 d3.loss_cls: 0.3110 d3.loss_bbox: 0.1388 d3.loss_iou: 0.2580 d4.loss_cls: 0.3032 d4.loss_bbox: 0.1394 d4.loss_iou: 0.2592 enc_loss_cls: 0.3468 enc_loss_bbox: 0.1664 enc_loss_iou: 0.3069 dn_loss_cls: 0.1017 dn_loss_bbox: 0.1531 dn_loss_iou: 0.2134 d0.dn_loss_cls: 0.1769 d0.dn_loss_bbox: 0.2975 d0.dn_loss_iou: 0.3629 d1.dn_loss_cls: 0.1346 d1.dn_loss_bbox: 0.1854 d1.dn_loss_iou: 0.2479 d2.dn_loss_cls: 0.1155 d2.dn_loss_bbox: 0.1651 d2.dn_loss_iou: 0.2252 d3.dn_loss_cls: 0.1068 d3.dn_loss_bbox: 0.1554 d3.dn_loss_iou: 0.2164 d4.dn_loss_cls: 0.1031 d4.dn_loss_bbox: 0.1531 d4.dn_loss_iou: 0.2135 d1.loss_lmm_region: 0.1180 loss_lmm_image: 0.8817 2024/11/12 09:41:39 - mmengine - INFO - Iter(train) [ 81900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:55:40 time: 1.9732 data_time: 0.0182 memory: 34341 grad_norm: 33.8653 loss: 9.9450 loss_cls: 0.3171 loss_bbox: 0.1413 loss_iou: 0.2664 d0.loss_cls: 0.3712 d0.loss_bbox: 0.1581 d0.loss_iou: 0.2889 d1.loss_cls: 0.3376 d1.loss_bbox: 0.1500 d1.loss_iou: 0.2775 d2.loss_cls: 0.3272 d2.loss_bbox: 0.1405 d2.loss_iou: 0.2698 d3.loss_cls: 0.3213 d3.loss_bbox: 0.1415 d3.loss_iou: 0.2688 d4.loss_cls: 0.3171 d4.loss_bbox: 0.1409 d4.loss_iou: 0.2677 enc_loss_cls: 0.3649 enc_loss_bbox: 0.1765 enc_loss_iou: 0.3110 dn_loss_cls: 0.1140 dn_loss_bbox: 0.1589 dn_loss_iou: 0.2349 d0.dn_loss_cls: 0.2149 d0.dn_loss_bbox: 0.3202 d0.dn_loss_iou: 0.3948 d1.dn_loss_cls: 0.1541 d1.dn_loss_bbox: 0.1889 d1.dn_loss_iou: 0.2679 d2.dn_loss_cls: 0.1305 d2.dn_loss_bbox: 0.1677 d2.dn_loss_iou: 0.2449 d3.dn_loss_cls: 0.1211 d3.dn_loss_bbox: 0.1613 d3.dn_loss_iou: 0.2371 d4.dn_loss_cls: 0.1161 d4.dn_loss_bbox: 0.1591 d4.dn_loss_iou: 0.2350 d1.loss_lmm_region: 0.1580 loss_lmm_image: 0.8103 2024/11/12 09:44:59 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 09:44:59 - mmengine - INFO - Iter(train) [ 82000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:52:19 time: 1.9931 data_time: 0.0181 memory: 33104 grad_norm: 27.1593 loss: 9.0888 loss_cls: 0.2720 loss_bbox: 0.1430 loss_iou: 0.2562 d0.loss_cls: 0.3135 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2670 d1.loss_cls: 0.2904 d1.loss_bbox: 0.1452 d1.loss_iou: 0.2579 d2.loss_cls: 0.2882 d2.loss_bbox: 0.1359 d2.loss_iou: 0.2549 d3.loss_cls: 0.2799 d3.loss_bbox: 0.1362 d3.loss_iou: 0.2575 d4.loss_cls: 0.2752 d4.loss_bbox: 0.1418 d4.loss_iou: 0.2540 enc_loss_cls: 0.3246 enc_loss_bbox: 0.1643 enc_loss_iou: 0.2899 dn_loss_cls: 0.0954 dn_loss_bbox: 0.1549 dn_loss_iou: 0.2070 d0.dn_loss_cls: 0.1823 d0.dn_loss_bbox: 0.2995 d0.dn_loss_iou: 0.3463 d1.dn_loss_cls: 0.1284 d1.dn_loss_bbox: 0.1909 d1.dn_loss_iou: 0.2366 d2.dn_loss_cls: 0.1084 d2.dn_loss_bbox: 0.1681 d2.dn_loss_iou: 0.2167 d3.dn_loss_cls: 0.1024 d3.dn_loss_bbox: 0.1567 d3.dn_loss_iou: 0.2092 d4.dn_loss_cls: 0.0974 d4.dn_loss_bbox: 0.1549 d4.dn_loss_iou: 0.2070 d1.loss_lmm_region: 0.1326 loss_lmm_image: 0.7950 2024/11/12 09:48:16 - mmengine - INFO - Iter(train) [ 82100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:48:56 time: 1.9739 data_time: 0.0181 memory: 34934 grad_norm: 30.9339 loss: 8.7549 loss_cls: 0.2754 loss_bbox: 0.1342 loss_iou: 0.2575 d0.loss_cls: 
0.2975 d0.loss_bbox: 0.1606 d0.loss_iou: 0.2736 d1.loss_cls: 0.2878 d1.loss_bbox: 0.1394 d1.loss_iou: 0.2587 d2.loss_cls: 0.2797 d2.loss_bbox: 0.1400 d2.loss_iou: 0.2577 d3.loss_cls: 0.2765 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2553 d4.loss_cls: 0.2765 d4.loss_bbox: 0.1326 d4.loss_iou: 0.2557 enc_loss_cls: 0.3016 enc_loss_bbox: 0.1761 enc_loss_iou: 0.2936 dn_loss_cls: 0.0714 dn_loss_bbox: 0.1327 dn_loss_iou: 0.2012 d0.dn_loss_cls: 0.1499 d0.dn_loss_bbox: 0.2749 d0.dn_loss_iou: 0.3406 d1.dn_loss_cls: 0.0992 d1.dn_loss_bbox: 0.1624 d1.dn_loss_iou: 0.2293 d2.dn_loss_cls: 0.0820 d2.dn_loss_bbox: 0.1431 d2.dn_loss_iou: 0.2099 d3.dn_loss_cls: 0.0761 d3.dn_loss_bbox: 0.1344 d3.dn_loss_iou: 0.2036 d4.dn_loss_cls: 0.0710 d4.dn_loss_bbox: 0.1328 d4.dn_loss_iou: 0.2013 d1.loss_lmm_region: 0.1262 loss_lmm_image: 0.8472 2024/11/12 09:51:35 - mmengine - INFO - Iter(train) [ 82200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:45:34 time: 1.9667 data_time: 0.0182 memory: 34953 grad_norm: 33.2967 loss: 9.2480 loss_cls: 0.2969 loss_bbox: 0.1403 loss_iou: 0.2356 d0.loss_cls: 0.3479 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2553 d1.loss_cls: 0.3224 d1.loss_bbox: 0.1398 d1.loss_iou: 0.2395 d2.loss_cls: 0.3087 d2.loss_bbox: 0.1388 d2.loss_iou: 0.2356 d3.loss_cls: 0.3013 d3.loss_bbox: 0.1398 d3.loss_iou: 0.2365 d4.loss_cls: 0.3015 d4.loss_bbox: 0.1385 d4.loss_iou: 0.2359 enc_loss_cls: 0.3542 enc_loss_bbox: 0.1604 enc_loss_iou: 0.2727 dn_loss_cls: 0.1331 dn_loss_bbox: 0.1454 dn_loss_iou: 0.1936 d0.dn_loss_cls: 0.2088 d0.dn_loss_bbox: 0.2783 d0.dn_loss_iou: 0.3326 d1.dn_loss_cls: 0.1599 d1.dn_loss_bbox: 0.1744 d1.dn_loss_iou: 0.2237 d2.dn_loss_cls: 0.1432 d2.dn_loss_bbox: 0.1542 d2.dn_loss_iou: 0.2021 d3.dn_loss_cls: 0.1391 d3.dn_loss_bbox: 0.1468 d3.dn_loss_iou: 0.1952 d4.dn_loss_cls: 0.1383 d4.dn_loss_bbox: 0.1454 d4.dn_loss_iou: 0.1935 d1.loss_lmm_region: 0.1266 loss_lmm_image: 0.8608 2024/11/12 09:54:53 - mmengine - INFO - Iter(train) [ 82300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:42:11 time: 1.9599 data_time: 0.0182 memory: 33902 grad_norm: 39.8928 loss: 9.1807 loss_cls: 0.2731 loss_bbox: 0.1387 loss_iou: 0.2544 d0.loss_cls: 0.3115 d0.loss_bbox: 0.1537 d0.loss_iou: 0.2714 d1.loss_cls: 0.2910 d1.loss_bbox: 0.1482 d1.loss_iou: 0.2626 d2.loss_cls: 0.2880 d2.loss_bbox: 0.1406 d2.loss_iou: 0.2557 d3.loss_cls: 0.2808 d3.loss_bbox: 0.1388 d3.loss_iou: 0.2520 d4.loss_cls: 0.2750 d4.loss_bbox: 0.1380 d4.loss_iou: 0.2534 enc_loss_cls: 0.3192 enc_loss_bbox: 0.1659 enc_loss_iou: 0.2894 dn_loss_cls: 0.0726 dn_loss_bbox: 0.1729 dn_loss_iou: 0.2214 d0.dn_loss_cls: 0.1576 d0.dn_loss_bbox: 0.3224 d0.dn_loss_iou: 0.3638 d1.dn_loss_cls: 0.1065 d1.dn_loss_bbox: 0.2026 d1.dn_loss_iou: 0.2509 d2.dn_loss_cls: 0.0866 d2.dn_loss_bbox: 0.1800 d2.dn_loss_iou: 0.2303 d3.dn_loss_cls: 0.0783 d3.dn_loss_bbox: 0.1735 d3.dn_loss_iou: 0.2231 d4.dn_loss_cls: 0.0732 d4.dn_loss_bbox: 0.1729 d4.dn_loss_iou: 0.2214 d1.loss_lmm_region: 0.1150 loss_lmm_image: 0.8543 2024/11/12 09:58:10 - mmengine - INFO - Iter(train) [ 82400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:38:49 time: 1.9888 data_time: 0.0183 memory: 33376 grad_norm: 35.8889 loss: 10.2521 loss_cls: 0.3523 loss_bbox: 0.1484 loss_iou: 0.2740 d0.loss_cls: 0.4048 d0.loss_bbox: 0.1588 d0.loss_iou: 0.2881 d1.loss_cls: 0.3691 d1.loss_bbox: 0.1545 d1.loss_iou: 0.2822 d2.loss_cls: 0.3598 d2.loss_bbox: 0.1493 d2.loss_iou: 0.2755 d3.loss_cls: 0.3549 d3.loss_bbox: 0.1496 d3.loss_iou: 0.2768 d4.loss_cls: 0.3524 d4.loss_bbox: 0.1494 d4.loss_iou: 0.2754 
enc_loss_cls: 0.4127 enc_loss_bbox: 0.1734 enc_loss_iou: 0.3104 dn_loss_cls: 0.1481 dn_loss_bbox: 0.1501 dn_loss_iou: 0.2182 d0.dn_loss_cls: 0.2225 d0.dn_loss_bbox: 0.2788 d0.dn_loss_iou: 0.3535 d1.dn_loss_cls: 0.1866 d1.dn_loss_bbox: 0.1796 d1.dn_loss_iou: 0.2471 d2.dn_loss_cls: 0.1678 d2.dn_loss_bbox: 0.1618 d2.dn_loss_iou: 0.2285 d3.dn_loss_cls: 0.1536 d3.dn_loss_bbox: 0.1525 d3.dn_loss_iou: 0.2210 d4.dn_loss_cls: 0.1463 d4.dn_loss_bbox: 0.1501 d4.dn_loss_iou: 0.2182 d1.loss_lmm_region: 0.1434 loss_lmm_image: 0.8526 2024/11/12 10:01:31 - mmengine - INFO - Iter(train) [ 82500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:35:28 time: 2.0134 data_time: 0.0183 memory: 35484 grad_norm: 29.7417 loss: 8.7170 loss_cls: 0.2559 loss_bbox: 0.1363 loss_iou: 0.2136 d0.loss_cls: 0.2940 d0.loss_bbox: 0.1492 d0.loss_iou: 0.2298 d1.loss_cls: 0.2740 d1.loss_bbox: 0.1415 d1.loss_iou: 0.2189 d2.loss_cls: 0.2649 d2.loss_bbox: 0.1407 d2.loss_iou: 0.2152 d3.loss_cls: 0.2620 d3.loss_bbox: 0.1370 d3.loss_iou: 0.2140 d4.loss_cls: 0.2601 d4.loss_bbox: 0.1353 d4.loss_iou: 0.2139 enc_loss_cls: 0.2952 enc_loss_bbox: 0.1594 enc_loss_iou: 0.2501 dn_loss_cls: 0.1045 dn_loss_bbox: 0.1613 dn_loss_iou: 0.1959 d0.dn_loss_cls: 0.1802 d0.dn_loss_bbox: 0.3066 d0.dn_loss_iou: 0.3354 d1.dn_loss_cls: 0.1375 d1.dn_loss_bbox: 0.1967 d1.dn_loss_iou: 0.2280 d2.dn_loss_cls: 0.1185 d2.dn_loss_bbox: 0.1722 d2.dn_loss_iou: 0.2056 d3.dn_loss_cls: 0.1073 d3.dn_loss_bbox: 0.1639 d3.dn_loss_iou: 0.1981 d4.dn_loss_cls: 0.1047 d4.dn_loss_bbox: 0.1613 d4.dn_loss_iou: 0.1959 d1.loss_lmm_region: 0.1347 loss_lmm_image: 0.8476 2024/11/12 10:04:51 - mmengine - INFO - Iter(train) [ 82600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:32:07 time: 2.0104 data_time: 0.0182 memory: 31827 grad_norm: 23.2306 loss: 9.8875 loss_cls: 0.2950 loss_bbox: 0.1509 loss_iou: 0.2711 d0.loss_cls: 0.3393 d0.loss_bbox: 0.1631 d0.loss_iou: 0.2855 d1.loss_cls: 0.3144 d1.loss_bbox: 0.1589 d1.loss_iou: 0.2807 d2.loss_cls: 0.3041 d2.loss_bbox: 0.1520 d2.loss_iou: 0.2744 d3.loss_cls: 0.2985 d3.loss_bbox: 0.1516 d3.loss_iou: 0.2726 d4.loss_cls: 0.2999 d4.loss_bbox: 0.1498 d4.loss_iou: 0.2706 enc_loss_cls: 0.3449 enc_loss_bbox: 0.1745 enc_loss_iou: 0.3036 dn_loss_cls: 0.1068 dn_loss_bbox: 0.1750 dn_loss_iou: 0.2293 d0.dn_loss_cls: 0.1894 d0.dn_loss_bbox: 0.3365 d0.dn_loss_iou: 0.3757 d1.dn_loss_cls: 0.1440 d1.dn_loss_bbox: 0.2111 d1.dn_loss_iou: 0.2636 d2.dn_loss_cls: 0.1227 d2.dn_loss_bbox: 0.1890 d2.dn_loss_iou: 0.2418 d3.dn_loss_cls: 0.1143 d3.dn_loss_bbox: 0.1787 d3.dn_loss_iou: 0.2327 d4.dn_loss_cls: 0.1092 d4.dn_loss_bbox: 0.1751 d4.dn_loss_iou: 0.2293 d1.loss_lmm_region: 0.1441 loss_lmm_image: 0.8635 2024/11/12 10:08:09 - mmengine - INFO - Iter(train) [ 82700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:28:45 time: 1.9863 data_time: 0.0182 memory: 32898 grad_norm: 32.9308 loss: 8.6557 loss_cls: 0.2497 loss_bbox: 0.1246 loss_iou: 0.1974 d0.loss_cls: 0.2926 d0.loss_bbox: 0.1325 d0.loss_iou: 0.2079 d1.loss_cls: 0.2693 d1.loss_bbox: 0.1274 d1.loss_iou: 0.2035 d2.loss_cls: 0.2619 d2.loss_bbox: 0.1253 d2.loss_iou: 0.1994 d3.loss_cls: 0.2567 d3.loss_bbox: 0.1207 d3.loss_iou: 0.1972 d4.loss_cls: 0.2512 d4.loss_bbox: 0.1229 d4.loss_iou: 0.1966 enc_loss_cls: 0.2987 enc_loss_bbox: 0.1428 enc_loss_iou: 0.2255 dn_loss_cls: 0.1165 dn_loss_bbox: 0.1712 dn_loss_iou: 0.2033 d0.dn_loss_cls: 0.1890 d0.dn_loss_bbox: 0.3124 d0.dn_loss_iou: 0.3342 d1.dn_loss_cls: 0.1443 d1.dn_loss_bbox: 0.2046 d1.dn_loss_iou: 0.2333 d2.dn_loss_cls: 0.1256 d2.dn_loss_bbox: 
0.1819 d2.dn_loss_iou: 0.2121 d3.dn_loss_cls: 0.1211 d3.dn_loss_bbox: 0.1731 d3.dn_loss_iou: 0.2054 d4.dn_loss_cls: 0.1150 d4.dn_loss_bbox: 0.1712 d4.dn_loss_iou: 0.2034 d1.loss_lmm_region: 0.1412 loss_lmm_image: 0.8933 2024/11/12 10:11:27 - mmengine - INFO - Iter(train) [ 82800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:25:22 time: 1.9964 data_time: 0.0183 memory: 33867 grad_norm: 38.1620 loss: 8.1933 loss_cls: 0.2597 loss_bbox: 0.1134 loss_iou: 0.2082 d0.loss_cls: 0.2997 d0.loss_bbox: 0.1277 d0.loss_iou: 0.2200 d1.loss_cls: 0.2798 d1.loss_bbox: 0.1201 d1.loss_iou: 0.2113 d2.loss_cls: 0.2681 d2.loss_bbox: 0.1165 d2.loss_iou: 0.2096 d3.loss_cls: 0.2656 d3.loss_bbox: 0.1144 d3.loss_iou: 0.2070 d4.loss_cls: 0.2620 d4.loss_bbox: 0.1139 d4.loss_iou: 0.2076 enc_loss_cls: 0.3035 enc_loss_bbox: 0.1377 enc_loss_iou: 0.2360 dn_loss_cls: 0.0862 dn_loss_bbox: 0.1457 dn_loss_iou: 0.1840 d0.dn_loss_cls: 0.1541 d0.dn_loss_bbox: 0.2840 d0.dn_loss_iou: 0.3141 d1.dn_loss_cls: 0.1104 d1.dn_loss_bbox: 0.1732 d1.dn_loss_iou: 0.2110 d2.dn_loss_cls: 0.0956 d2.dn_loss_bbox: 0.1528 d2.dn_loss_iou: 0.1914 d3.dn_loss_cls: 0.0895 d3.dn_loss_bbox: 0.1474 d3.dn_loss_iou: 0.1859 d4.dn_loss_cls: 0.0874 d4.dn_loss_bbox: 0.1457 d4.dn_loss_iou: 0.1839 d1.loss_lmm_region: 0.1083 loss_lmm_image: 0.8607 2024/11/12 10:14:46 - mmengine - INFO - Iter(train) [ 82900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:22:01 time: 1.9984 data_time: 0.0182 memory: 34562 grad_norm: 35.5287 loss: 10.8836 loss_cls: 0.3798 loss_bbox: 0.1641 loss_iou: 0.3254 d0.loss_cls: 0.4329 d0.loss_bbox: 0.1771 d0.loss_iou: 0.3359 d1.loss_cls: 0.4038 d1.loss_bbox: 0.1677 d1.loss_iou: 0.3255 d2.loss_cls: 0.3893 d2.loss_bbox: 0.1687 d2.loss_iou: 0.3267 d3.loss_cls: 0.3863 d3.loss_bbox: 0.1677 d3.loss_iou: 0.3267 d4.loss_cls: 0.3797 d4.loss_bbox: 0.1659 d4.loss_iou: 0.3261 enc_loss_cls: 0.4321 enc_loss_bbox: 0.1960 enc_loss_iou: 0.3628 dn_loss_cls: 0.1166 dn_loss_bbox: 0.1635 dn_loss_iou: 0.2372 d0.dn_loss_cls: 0.1977 d0.dn_loss_bbox: 0.3015 d0.dn_loss_iou: 0.3786 d1.dn_loss_cls: 0.1470 d1.dn_loss_bbox: 0.1972 d1.dn_loss_iou: 0.2696 d2.dn_loss_cls: 0.1278 d2.dn_loss_bbox: 0.1729 d2.dn_loss_iou: 0.2475 d3.dn_loss_cls: 0.1196 d3.dn_loss_bbox: 0.1657 d3.dn_loss_iou: 0.2398 d4.dn_loss_cls: 0.1163 d4.dn_loss_bbox: 0.1635 d4.dn_loss_iou: 0.2372 d1.loss_lmm_region: 0.1217 loss_lmm_image: 0.8227 2024/11/12 10:18:04 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 10:18:04 - mmengine - INFO - Iter(train) [ 83000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:18:38 time: 1.9715 data_time: 0.0181 memory: 34198 grad_norm: 30.4461 loss: 10.0989 loss_cls: 0.3396 loss_bbox: 0.1527 loss_iou: 0.2817 d0.loss_cls: 0.3931 d0.loss_bbox: 0.1585 d0.loss_iou: 0.2939 d1.loss_cls: 0.3618 d1.loss_bbox: 0.1575 d1.loss_iou: 0.2941 d2.loss_cls: 0.3488 d2.loss_bbox: 0.1572 d2.loss_iou: 0.2904 d3.loss_cls: 0.3411 d3.loss_bbox: 0.1540 d3.loss_iou: 0.2841 d4.loss_cls: 0.3396 d4.loss_bbox: 0.1531 d4.loss_iou: 0.2837 enc_loss_cls: 0.4077 enc_loss_bbox: 0.1749 enc_loss_iou: 0.3113 dn_loss_cls: 0.1064 dn_loss_bbox: 0.1587 dn_loss_iou: 0.2231 d0.dn_loss_cls: 0.1752 d0.dn_loss_bbox: 0.2954 d0.dn_loss_iou: 0.3563 d1.dn_loss_cls: 0.1275 d1.dn_loss_bbox: 0.1891 d1.dn_loss_iou: 0.2520 d2.dn_loss_cls: 0.1120 d2.dn_loss_bbox: 0.1705 d2.dn_loss_iou: 0.2326 d3.dn_loss_cls: 0.1096 d3.dn_loss_bbox: 0.1620 d3.dn_loss_iou: 0.2254 d4.dn_loss_cls: 0.1081 d4.dn_loss_bbox: 0.1588 d4.dn_loss_iou: 0.2231 d1.loss_lmm_region: 0.1484 loss_lmm_image: 
0.8861 2024/11/12 10:21:25 - mmengine - INFO - Iter(train) [ 83100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:15:18 time: 1.9900 data_time: 0.0182 memory: 33559 grad_norm: 41.8673 loss: 9.2235 loss_cls: 0.2964 loss_bbox: 0.1230 loss_iou: 0.2231 d0.loss_cls: 0.3365 d0.loss_bbox: 0.1327 d0.loss_iou: 0.2288 d1.loss_cls: 0.3120 d1.loss_bbox: 0.1267 d1.loss_iou: 0.2239 d2.loss_cls: 0.3097 d2.loss_bbox: 0.1211 d2.loss_iou: 0.2177 d3.loss_cls: 0.2984 d3.loss_bbox: 0.1208 d3.loss_iou: 0.2234 d4.loss_cls: 0.2974 d4.loss_bbox: 0.1237 d4.loss_iou: 0.2226 enc_loss_cls: 0.3372 enc_loss_bbox: 0.1427 enc_loss_iou: 0.2485 dn_loss_cls: 0.1528 dn_loss_bbox: 0.1647 dn_loss_iou: 0.1995 d0.dn_loss_cls: 0.2279 d0.dn_loss_bbox: 0.2963 d0.dn_loss_iou: 0.3248 d1.dn_loss_cls: 0.1864 d1.dn_loss_bbox: 0.1897 d1.dn_loss_iou: 0.2239 d2.dn_loss_cls: 0.1589 d2.dn_loss_bbox: 0.1709 d2.dn_loss_iou: 0.2064 d3.dn_loss_cls: 0.1536 d3.dn_loss_bbox: 0.1658 d3.dn_loss_iou: 0.2011 d4.dn_loss_cls: 0.1555 d4.dn_loss_bbox: 0.1647 d4.dn_loss_iou: 0.1995 d1.loss_lmm_region: 0.1381 loss_lmm_image: 0.8768 2024/11/12 10:24:45 - mmengine - INFO - Iter(train) [ 83200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:11:57 time: 1.9955 data_time: 0.0182 memory: 33613 grad_norm: 31.4207 loss: 9.1272 loss_cls: 0.2809 loss_bbox: 0.1232 loss_iou: 0.2252 d0.loss_cls: 0.3148 d0.loss_bbox: 0.1455 d0.loss_iou: 0.2520 d1.loss_cls: 0.2966 d1.loss_bbox: 0.1313 d1.loss_iou: 0.2369 d2.loss_cls: 0.2879 d2.loss_bbox: 0.1270 d2.loss_iou: 0.2292 d3.loss_cls: 0.2857 d3.loss_bbox: 0.1237 d3.loss_iou: 0.2264 d4.loss_cls: 0.2833 d4.loss_bbox: 0.1232 d4.loss_iou: 0.2247 enc_loss_cls: 0.3231 enc_loss_bbox: 0.1577 enc_loss_iou: 0.2725 dn_loss_cls: 0.1249 dn_loss_bbox: 0.1668 dn_loss_iou: 0.2061 d0.dn_loss_cls: 0.2010 d0.dn_loss_bbox: 0.3164 d0.dn_loss_iou: 0.3537 d1.dn_loss_cls: 0.1562 d1.dn_loss_bbox: 0.2000 d1.dn_loss_iou: 0.2378 d2.dn_loss_cls: 0.1365 d2.dn_loss_bbox: 0.1741 d2.dn_loss_iou: 0.2151 d3.dn_loss_cls: 0.1304 d3.dn_loss_bbox: 0.1686 d3.dn_loss_iou: 0.2089 d4.dn_loss_cls: 0.1252 d4.dn_loss_bbox: 0.1668 d4.dn_loss_iou: 0.2061 d1.loss_lmm_region: 0.1246 loss_lmm_image: 0.8372 2024/11/12 10:28:04 - mmengine - INFO - Iter(train) [ 83300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:08:36 time: 1.9937 data_time: 0.0183 memory: 35225 grad_norm: 30.8795 loss: 9.8939 loss_cls: 0.3446 loss_bbox: 0.1445 loss_iou: 0.2871 d0.loss_cls: 0.4067 d0.loss_bbox: 0.1557 d0.loss_iou: 0.3027 d1.loss_cls: 0.3725 d1.loss_bbox: 0.1474 d1.loss_iou: 0.2898 d2.loss_cls: 0.3724 d2.loss_bbox: 0.1382 d2.loss_iou: 0.2775 d3.loss_cls: 0.3534 d3.loss_bbox: 0.1434 d3.loss_iou: 0.2855 d4.loss_cls: 0.3485 d4.loss_bbox: 0.1450 d4.loss_iou: 0.2880 enc_loss_cls: 0.3941 enc_loss_bbox: 0.1752 enc_loss_iou: 0.3254 dn_loss_cls: 0.1089 dn_loss_bbox: 0.1424 dn_loss_iou: 0.2116 d0.dn_loss_cls: 0.1830 d0.dn_loss_bbox: 0.2793 d0.dn_loss_iou: 0.3520 d1.dn_loss_cls: 0.1374 d1.dn_loss_bbox: 0.1712 d1.dn_loss_iou: 0.2424 d2.dn_loss_cls: 0.1214 d2.dn_loss_bbox: 0.1526 d2.dn_loss_iou: 0.2209 d3.dn_loss_cls: 0.1133 d3.dn_loss_bbox: 0.1443 d3.dn_loss_iou: 0.2136 d4.dn_loss_cls: 0.1096 d4.dn_loss_bbox: 0.1424 d4.dn_loss_iou: 0.2116 d1.loss_lmm_region: 0.1207 loss_lmm_image: 0.8176 2024/11/12 10:31:24 - mmengine - INFO - Iter(train) [ 83400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:05:15 time: 2.0082 data_time: 0.0182 memory: 33225 grad_norm: 27.0935 loss: 8.1658 loss_cls: 0.2515 loss_bbox: 0.1097 loss_iou: 0.2036 d0.loss_cls: 0.2914 d0.loss_bbox: 0.1242 
d0.loss_iou: 0.2237 d1.loss_cls: 0.2670 d1.loss_bbox: 0.1183 d1.loss_iou: 0.2128 d2.loss_cls: 0.2609 d2.loss_bbox: 0.1103 d2.loss_iou: 0.2052 d3.loss_cls: 0.2545 d3.loss_bbox: 0.1096 d3.loss_iou: 0.2032 d4.loss_cls: 0.2541 d4.loss_bbox: 0.1092 d4.loss_iou: 0.2038 enc_loss_cls: 0.2937 enc_loss_bbox: 0.1350 enc_loss_iou: 0.2431 dn_loss_cls: 0.0870 dn_loss_bbox: 0.1410 dn_loss_iou: 0.1962 d0.dn_loss_cls: 0.1605 d0.dn_loss_bbox: 0.2732 d0.dn_loss_iou: 0.3350 d1.dn_loss_cls: 0.1130 d1.dn_loss_bbox: 0.1732 d1.dn_loss_iou: 0.2263 d2.dn_loss_cls: 0.0965 d2.dn_loss_bbox: 0.1493 d2.dn_loss_iou: 0.2058 d3.dn_loss_cls: 0.0902 d3.dn_loss_bbox: 0.1424 d3.dn_loss_iou: 0.1989 d4.dn_loss_cls: 0.0862 d4.dn_loss_bbox: 0.1409 d4.dn_loss_iou: 0.1962 d1.loss_lmm_region: 0.1146 loss_lmm_image: 0.8543 2024/11/12 10:34:43 - mmengine - INFO - Iter(train) [ 83500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 13:01:53 time: 1.9684 data_time: 0.0182 memory: 34088 grad_norm: 36.7802 loss: 8.5867 loss_cls: 0.2706 loss_bbox: 0.1112 loss_iou: 0.2240 d0.loss_cls: 0.3032 d0.loss_bbox: 0.1257 d0.loss_iou: 0.2407 d1.loss_cls: 0.2896 d1.loss_bbox: 0.1131 d1.loss_iou: 0.2296 d2.loss_cls: 0.2775 d2.loss_bbox: 0.1121 d2.loss_iou: 0.2261 d3.loss_cls: 0.2730 d3.loss_bbox: 0.1097 d3.loss_iou: 0.2228 d4.loss_cls: 0.2734 d4.loss_bbox: 0.1106 d4.loss_iou: 0.2238 enc_loss_cls: 0.3210 enc_loss_bbox: 0.1347 enc_loss_iou: 0.2599 dn_loss_cls: 0.0851 dn_loss_bbox: 0.1490 dn_loss_iou: 0.2041 d0.dn_loss_cls: 0.1634 d0.dn_loss_bbox: 0.2918 d0.dn_loss_iou: 0.3506 d1.dn_loss_cls: 0.1144 d1.dn_loss_bbox: 0.1782 d1.dn_loss_iou: 0.2337 d2.dn_loss_cls: 0.0982 d2.dn_loss_bbox: 0.1573 d2.dn_loss_iou: 0.2135 d3.dn_loss_cls: 0.0916 d3.dn_loss_bbox: 0.1507 d3.dn_loss_iou: 0.2062 d4.dn_loss_cls: 0.0879 d4.dn_loss_bbox: 0.1490 d4.dn_loss_iou: 0.2042 d1.loss_lmm_region: 0.1267 loss_lmm_image: 0.8785 2024/11/12 10:38:03 - mmengine - INFO - Iter(train) [ 83600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:58:32 time: 1.9971 data_time: 0.0184 memory: 34164 grad_norm: 46.2804 loss: 8.7179 loss_cls: 0.2624 loss_bbox: 0.1089 loss_iou: 0.1963 d0.loss_cls: 0.3133 d0.loss_bbox: 0.1156 d0.loss_iou: 0.2037 d1.loss_cls: 0.2892 d1.loss_bbox: 0.1105 d1.loss_iou: 0.1999 d2.loss_cls: 0.2740 d2.loss_bbox: 0.1096 d2.loss_iou: 0.1966 d3.loss_cls: 0.2693 d3.loss_bbox: 0.1076 d3.loss_iou: 0.1938 d4.loss_cls: 0.2641 d4.loss_bbox: 0.1083 d4.loss_iou: 0.1955 enc_loss_cls: 0.3096 enc_loss_bbox: 0.1348 enc_loss_iou: 0.2297 dn_loss_cls: 0.1495 dn_loss_bbox: 0.1607 dn_loss_iou: 0.1993 d0.dn_loss_cls: 0.2189 d0.dn_loss_bbox: 0.3182 d0.dn_loss_iou: 0.3442 d1.dn_loss_cls: 0.1750 d1.dn_loss_bbox: 0.1894 d1.dn_loss_iou: 0.2306 d2.dn_loss_cls: 0.1591 d2.dn_loss_bbox: 0.1693 d2.dn_loss_iou: 0.2083 d3.dn_loss_cls: 0.1496 d3.dn_loss_bbox: 0.1628 d3.dn_loss_iou: 0.2016 d4.dn_loss_cls: 0.1490 d4.dn_loss_bbox: 0.1607 d4.dn_loss_iou: 0.1992 d1.loss_lmm_region: 0.1382 loss_lmm_image: 0.8416 2024/11/12 10:41:24 - mmengine - INFO - Iter(train) [ 83700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:55:12 time: 2.0175 data_time: 0.0182 memory: 32714 grad_norm: 36.2035 loss: 10.1283 loss_cls: 0.3299 loss_bbox: 0.1386 loss_iou: 0.2537 d0.loss_cls: 0.3697 d0.loss_bbox: 0.1619 d0.loss_iou: 0.2695 d1.loss_cls: 0.3475 d1.loss_bbox: 0.1501 d1.loss_iou: 0.2653 d2.loss_cls: 0.3481 d2.loss_bbox: 0.1400 d2.loss_iou: 0.2536 d3.loss_cls: 0.3350 d3.loss_bbox: 0.1381 d3.loss_iou: 0.2520 d4.loss_cls: 0.3294 d4.loss_bbox: 0.1389 d4.loss_iou: 0.2542 enc_loss_cls: 0.3816 enc_loss_bbox: 
0.1706 enc_loss_iou: 0.2911 dn_loss_cls: 0.1317 dn_loss_bbox: 0.1822 dn_loss_iou: 0.2213 d0.dn_loss_cls: 0.2129 d0.dn_loss_bbox: 0.3356 d0.dn_loss_iou: 0.3685 d1.dn_loss_cls: 0.1637 d1.dn_loss_bbox: 0.2191 d1.dn_loss_iou: 0.2561 d2.dn_loss_cls: 0.1479 d2.dn_loss_bbox: 0.1954 d2.dn_loss_iou: 0.2334 d3.dn_loss_cls: 0.1373 d3.dn_loss_bbox: 0.1848 d3.dn_loss_iou: 0.2244 d4.dn_loss_cls: 0.1329 d4.dn_loss_bbox: 0.1822 d4.dn_loss_iou: 0.2213 d1.loss_lmm_region: 0.1450 loss_lmm_image: 0.9138 2024/11/12 10:44:44 - mmengine - INFO - Iter(train) [ 83800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:51:51 time: 2.0207 data_time: 0.0183 memory: 34124 grad_norm: 32.9259 loss: 9.7295 loss_cls: 0.2993 loss_bbox: 0.1438 loss_iou: 0.2640 d0.loss_cls: 0.3359 d0.loss_bbox: 0.1596 d0.loss_iou: 0.2800 d1.loss_cls: 0.3150 d1.loss_bbox: 0.1488 d1.loss_iou: 0.2706 d2.loss_cls: 0.3059 d2.loss_bbox: 0.1446 d2.loss_iou: 0.2643 d3.loss_cls: 0.3018 d3.loss_bbox: 0.1442 d3.loss_iou: 0.2638 d4.loss_cls: 0.3009 d4.loss_bbox: 0.1424 d4.loss_iou: 0.2639 enc_loss_cls: 0.3499 enc_loss_bbox: 0.1732 enc_loss_iou: 0.3017 dn_loss_cls: 0.1166 dn_loss_bbox: 0.1627 dn_loss_iou: 0.2355 d0.dn_loss_cls: 0.2027 d0.dn_loss_bbox: 0.3023 d0.dn_loss_iou: 0.3784 d1.dn_loss_cls: 0.1487 d1.dn_loss_bbox: 0.1958 d1.dn_loss_iou: 0.2653 d2.dn_loss_cls: 0.1282 d2.dn_loss_bbox: 0.1736 d2.dn_loss_iou: 0.2445 d3.dn_loss_cls: 0.1212 d3.dn_loss_bbox: 0.1651 d3.dn_loss_iou: 0.2378 d4.dn_loss_cls: 0.1166 d4.dn_loss_bbox: 0.1628 d4.dn_loss_iou: 0.2355 d1.loss_lmm_region: 0.1447 loss_lmm_image: 0.8178 2024/11/12 10:48:01 - mmengine - INFO - Iter(train) [ 83900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:48:28 time: 1.9795 data_time: 0.0182 memory: 34283 grad_norm: 29.2622 loss: 9.9823 loss_cls: 0.3547 loss_bbox: 0.1565 loss_iou: 0.2653 d0.loss_cls: 0.3998 d0.loss_bbox: 0.1656 d0.loss_iou: 0.2809 d1.loss_cls: 0.3782 d1.loss_bbox: 0.1599 d1.loss_iou: 0.2734 d2.loss_cls: 0.3670 d2.loss_bbox: 0.1554 d2.loss_iou: 0.2690 d3.loss_cls: 0.3628 d3.loss_bbox: 0.1546 d3.loss_iou: 0.2636 d4.loss_cls: 0.3566 d4.loss_bbox: 0.1560 d4.loss_iou: 0.2641 enc_loss_cls: 0.3983 enc_loss_bbox: 0.1885 enc_loss_iou: 0.3124 dn_loss_cls: 0.0956 dn_loss_bbox: 0.1592 dn_loss_iou: 0.2155 d0.dn_loss_cls: 0.1715 d0.dn_loss_bbox: 0.3096 d0.dn_loss_iou: 0.3552 d1.dn_loss_cls: 0.1220 d1.dn_loss_bbox: 0.1906 d1.dn_loss_iou: 0.2440 d2.dn_loss_cls: 0.1045 d2.dn_loss_bbox: 0.1710 d2.dn_loss_iou: 0.2249 d3.dn_loss_cls: 0.0998 d3.dn_loss_bbox: 0.1618 d3.dn_loss_iou: 0.2178 d4.dn_loss_cls: 0.0948 d4.dn_loss_bbox: 0.1593 d4.dn_loss_iou: 0.2154 d1.loss_lmm_region: 0.1346 loss_lmm_image: 0.8525 2024/11/12 10:51:19 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 10:51:19 - mmengine - INFO - Iter(train) [ 84000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:45:06 time: 1.9828 data_time: 0.0183 memory: 33500 grad_norm: 31.8123 loss: 9.3080 loss_cls: 0.3015 loss_bbox: 0.1385 loss_iou: 0.2452 d0.loss_cls: 0.3511 d0.loss_bbox: 0.1557 d0.loss_iou: 0.2609 d1.loss_cls: 0.3209 d1.loss_bbox: 0.1460 d1.loss_iou: 0.2515 d2.loss_cls: 0.3121 d2.loss_bbox: 0.1391 d2.loss_iou: 0.2455 d3.loss_cls: 0.3026 d3.loss_bbox: 0.1367 d3.loss_iou: 0.2428 d4.loss_cls: 0.3016 d4.loss_bbox: 0.1376 d4.loss_iou: 0.2445 enc_loss_cls: 0.3492 enc_loss_bbox: 0.1699 enc_loss_iou: 0.2890 dn_loss_cls: 0.0955 dn_loss_bbox: 0.1666 dn_loss_iou: 0.2102 d0.dn_loss_cls: 0.1682 d0.dn_loss_bbox: 0.3290 d0.dn_loss_iou: 0.3609 d1.dn_loss_cls: 0.1201 d1.dn_loss_bbox: 0.2021 
d1.dn_loss_iou: 0.2442 d2.dn_loss_cls: 0.1041 d2.dn_loss_bbox: 0.1747 d2.dn_loss_iou: 0.2194 d3.dn_loss_cls: 0.0977 d3.dn_loss_bbox: 0.1683 d3.dn_loss_iou: 0.2130 d4.dn_loss_cls: 0.0959 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2103 d1.loss_lmm_region: 0.1080 loss_lmm_image: 0.8113 2024/11/12 10:54:39 - mmengine - INFO - Iter(train) [ 84100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:41:45 time: 1.9906 data_time: 0.0183 memory: 34750 grad_norm: 29.8360 loss: 8.9905 loss_cls: 0.2844 loss_bbox: 0.1195 loss_iou: 0.2441 d0.loss_cls: 0.3233 d0.loss_bbox: 0.1333 d0.loss_iou: 0.2582 d1.loss_cls: 0.2985 d1.loss_bbox: 0.1236 d1.loss_iou: 0.2522 d2.loss_cls: 0.2895 d2.loss_bbox: 0.1210 d2.loss_iou: 0.2487 d3.loss_cls: 0.2882 d3.loss_bbox: 0.1176 d3.loss_iou: 0.2441 d4.loss_cls: 0.2843 d4.loss_bbox: 0.1195 d4.loss_iou: 0.2441 enc_loss_cls: 0.3233 enc_loss_bbox: 0.1481 enc_loss_iou: 0.2810 dn_loss_cls: 0.0820 dn_loss_bbox: 0.1486 dn_loss_iou: 0.2208 d0.dn_loss_cls: 0.1678 d0.dn_loss_bbox: 0.2949 d0.dn_loss_iou: 0.3681 d1.dn_loss_cls: 0.1178 d1.dn_loss_bbox: 0.1819 d1.dn_loss_iou: 0.2540 d2.dn_loss_cls: 0.0962 d2.dn_loss_bbox: 0.1582 d2.dn_loss_iou: 0.2310 d3.dn_loss_cls: 0.0873 d3.dn_loss_bbox: 0.1505 d3.dn_loss_iou: 0.2229 d4.dn_loss_cls: 0.0823 d4.dn_loss_bbox: 0.1487 d4.dn_loss_iou: 0.2208 d1.loss_lmm_region: 0.1122 loss_lmm_image: 0.8981 2024/11/12 10:57:59 - mmengine - INFO - Iter(train) [ 84200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:38:24 time: 2.0015 data_time: 0.0186 memory: 34813 grad_norm: 29.2275 loss: 10.1497 loss_cls: 0.3377 loss_bbox: 0.1305 loss_iou: 0.2124 d0.loss_cls: 0.3652 d0.loss_bbox: 0.1544 d0.loss_iou: 0.2283 d1.loss_cls: 0.3610 d1.loss_bbox: 0.1374 d1.loss_iou: 0.2205 d2.loss_cls: 0.3482 d2.loss_bbox: 0.1343 d2.loss_iou: 0.2154 d3.loss_cls: 0.3389 d3.loss_bbox: 0.1332 d3.loss_iou: 0.2136 d4.loss_cls: 0.3403 d4.loss_bbox: 0.1298 d4.loss_iou: 0.2122 enc_loss_cls: 0.3750 enc_loss_bbox: 0.1587 enc_loss_iou: 0.2467 dn_loss_cls: 0.2061 dn_loss_bbox: 0.1770 dn_loss_iou: 0.2159 d0.dn_loss_cls: 0.2898 d0.dn_loss_bbox: 0.3455 d0.dn_loss_iou: 0.3756 d1.dn_loss_cls: 0.2501 d1.dn_loss_bbox: 0.2137 d1.dn_loss_iou: 0.2502 d2.dn_loss_cls: 0.2289 d2.dn_loss_bbox: 0.1892 d2.dn_loss_iou: 0.2268 d3.dn_loss_cls: 0.2203 d3.dn_loss_bbox: 0.1794 d3.dn_loss_iou: 0.2187 d4.dn_loss_cls: 0.2087 d4.dn_loss_bbox: 0.1771 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1557 loss_lmm_image: 0.8113 2024/11/12 11:01:17 - mmengine - INFO - Iter(train) [ 84300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:35:01 time: 1.9841 data_time: 0.0183 memory: 33257 grad_norm: 25.5298 loss: 8.9478 loss_cls: 0.2632 loss_bbox: 0.1276 loss_iou: 0.2398 d0.loss_cls: 0.3006 d0.loss_bbox: 0.1374 d0.loss_iou: 0.2543 d1.loss_cls: 0.2836 d1.loss_bbox: 0.1301 d1.loss_iou: 0.2437 d2.loss_cls: 0.2712 d2.loss_bbox: 0.1296 d2.loss_iou: 0.2418 d3.loss_cls: 0.2683 d3.loss_bbox: 0.1291 d3.loss_iou: 0.2399 d4.loss_cls: 0.2649 d4.loss_bbox: 0.1272 d4.loss_iou: 0.2395 enc_loss_cls: 0.3000 enc_loss_bbox: 0.1549 enc_loss_iou: 0.2803 dn_loss_cls: 0.1007 dn_loss_bbox: 0.1688 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.1750 d0.dn_loss_bbox: 0.3138 d0.dn_loss_iou: 0.3481 d1.dn_loss_cls: 0.1292 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2415 d2.dn_loss_cls: 0.1117 d2.dn_loss_bbox: 0.1774 d2.dn_loss_iou: 0.2225 d3.dn_loss_cls: 0.1070 d3.dn_loss_bbox: 0.1702 d3.dn_loss_iou: 0.2166 d4.dn_loss_cls: 0.1018 d4.dn_loss_bbox: 0.1688 d4.dn_loss_iou: 0.2142 d1.loss_lmm_region: 0.1394 loss_lmm_image: 0.8027 2024/11/12 11:04:38 - 
mmengine - INFO - Iter(train) [ 84400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:31:41 time: 2.0131 data_time: 0.0184 memory: 34334 grad_norm: 25.6935 loss: 9.5758 loss_cls: 0.2934 loss_bbox: 0.1483 loss_iou: 0.2525 d0.loss_cls: 0.3347 d0.loss_bbox: 0.1587 d0.loss_iou: 0.2549 d1.loss_cls: 0.3210 d1.loss_bbox: 0.1489 d1.loss_iou: 0.2450 d2.loss_cls: 0.3026 d2.loss_bbox: 0.1510 d2.loss_iou: 0.2470 d3.loss_cls: 0.2977 d3.loss_bbox: 0.1461 d3.loss_iou: 0.2471 d4.loss_cls: 0.2942 d4.loss_bbox: 0.1474 d4.loss_iou: 0.2509 enc_loss_cls: 0.3400 enc_loss_bbox: 0.1751 enc_loss_iou: 0.2761 dn_loss_cls: 0.1241 dn_loss_bbox: 0.1655 dn_loss_iou: 0.2131 d0.dn_loss_cls: 0.2010 d0.dn_loss_bbox: 0.3227 d0.dn_loss_iou: 0.3632 d1.dn_loss_cls: 0.1496 d1.dn_loss_bbox: 0.1994 d1.dn_loss_iou: 0.2440 d2.dn_loss_cls: 0.1337 d2.dn_loss_bbox: 0.1751 d2.dn_loss_iou: 0.2209 d3.dn_loss_cls: 0.1261 d3.dn_loss_bbox: 0.1674 d3.dn_loss_iou: 0.2151 d4.dn_loss_cls: 0.1244 d4.dn_loss_bbox: 0.1655 d4.dn_loss_iou: 0.2129 d1.loss_lmm_region: 0.1509 loss_lmm_image: 0.8687 2024/11/12 11:07:58 - mmengine - INFO - Iter(train) [ 84500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:28:21 time: 2.0112 data_time: 0.0182 memory: 34561 grad_norm: 29.3947 loss: 9.6061 loss_cls: 0.2877 loss_bbox: 0.1388 loss_iou: 0.2499 d0.loss_cls: 0.3319 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2685 d1.loss_cls: 0.3021 d1.loss_bbox: 0.1458 d1.loss_iou: 0.2564 d2.loss_cls: 0.2959 d2.loss_bbox: 0.1412 d2.loss_iou: 0.2516 d3.loss_cls: 0.2959 d3.loss_bbox: 0.1400 d3.loss_iou: 0.2498 d4.loss_cls: 0.2896 d4.loss_bbox: 0.1400 d4.loss_iou: 0.2509 enc_loss_cls: 0.3391 enc_loss_bbox: 0.1669 enc_loss_iou: 0.2853 dn_loss_cls: 0.1189 dn_loss_bbox: 0.1776 dn_loss_iou: 0.2262 d0.dn_loss_cls: 0.2053 d0.dn_loss_bbox: 0.3257 d0.dn_loss_iou: 0.3714 d1.dn_loss_cls: 0.1491 d1.dn_loss_bbox: 0.2111 d1.dn_loss_iou: 0.2559 d2.dn_loss_cls: 0.1317 d2.dn_loss_bbox: 0.1879 d2.dn_loss_iou: 0.2359 d3.dn_loss_cls: 0.1239 d3.dn_loss_bbox: 0.1803 d3.dn_loss_iou: 0.2290 d4.dn_loss_cls: 0.1195 d4.dn_loss_bbox: 0.1776 d4.dn_loss_iou: 0.2261 d1.loss_lmm_region: 0.1414 loss_lmm_image: 0.8298 2024/11/12 11:11:18 - mmengine - INFO - Iter(train) [ 84600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:25:00 time: 2.0007 data_time: 0.0182 memory: 33641 grad_norm: 31.1302 loss: 11.7097 loss_cls: 0.3752 loss_bbox: 0.1981 loss_iou: 0.3184 d0.loss_cls: 0.4099 d0.loss_bbox: 0.2200 d0.loss_iou: 0.3410 d1.loss_cls: 0.3815 d1.loss_bbox: 0.2177 d1.loss_iou: 0.3338 d2.loss_cls: 0.3800 d2.loss_bbox: 0.2073 d2.loss_iou: 0.3272 d3.loss_cls: 0.3801 d3.loss_bbox: 0.2018 d3.loss_iou: 0.3215 d4.loss_cls: 0.3827 d4.loss_bbox: 0.1986 d4.loss_iou: 0.3185 enc_loss_cls: 0.4176 enc_loss_bbox: 0.2319 enc_loss_iou: 0.3630 dn_loss_cls: 0.1528 dn_loss_bbox: 0.2035 dn_loss_iou: 0.2516 d0.dn_loss_cls: 0.2255 d0.dn_loss_bbox: 0.3625 d0.dn_loss_iou: 0.3989 d1.dn_loss_cls: 0.1790 d1.dn_loss_bbox: 0.2398 d1.dn_loss_iou: 0.2828 d2.dn_loss_cls: 0.1657 d2.dn_loss_bbox: 0.2123 d2.dn_loss_iou: 0.2607 d3.dn_loss_cls: 0.1566 d3.dn_loss_bbox: 0.2051 d3.dn_loss_iou: 0.2536 d4.dn_loss_cls: 0.1541 d4.dn_loss_bbox: 0.2035 d4.dn_loss_iou: 0.2516 d1.loss_lmm_region: 0.1635 loss_lmm_image: 0.8608 2024/11/12 11:14:38 - mmengine - INFO - Iter(train) [ 84700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:21:39 time: 1.9904 data_time: 0.0182 memory: 35087 grad_norm: 30.3538 loss: 9.8410 loss_cls: 0.3072 loss_bbox: 0.1528 loss_iou: 0.2497 d0.loss_cls: 0.3503 d0.loss_bbox: 0.1635 d0.loss_iou: 0.2629 d1.loss_cls: 
0.3307 d1.loss_bbox: 0.1594 d1.loss_iou: 0.2539 d2.loss_cls: 0.3201 d2.loss_bbox: 0.1555 d2.loss_iou: 0.2492 d3.loss_cls: 0.3114 d3.loss_bbox: 0.1521 d3.loss_iou: 0.2488 d4.loss_cls: 0.3094 d4.loss_bbox: 0.1507 d4.loss_iou: 0.2487 enc_loss_cls: 0.3551 enc_loss_bbox: 0.1791 enc_loss_iou: 0.2840 dn_loss_cls: 0.1246 dn_loss_bbox: 0.1820 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.2053 d0.dn_loss_bbox: 0.3325 d0.dn_loss_iou: 0.3600 d1.dn_loss_cls: 0.1560 d1.dn_loss_bbox: 0.2196 d1.dn_loss_iou: 0.2511 d2.dn_loss_cls: 0.1384 d2.dn_loss_bbox: 0.1936 d2.dn_loss_iou: 0.2306 d3.dn_loss_cls: 0.1286 d3.dn_loss_bbox: 0.1845 d3.dn_loss_iou: 0.2227 d4.dn_loss_cls: 0.1264 d4.dn_loss_bbox: 0.1820 d4.dn_loss_iou: 0.2203 d1.loss_lmm_region: 0.1428 loss_lmm_image: 0.8252 2024/11/12 11:17:58 - mmengine - INFO - Iter(train) [ 84800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:18:18 time: 1.9801 data_time: 0.0183 memory: 33298 grad_norm: 32.6952 loss: 9.8587 loss_cls: 0.3100 loss_bbox: 0.1499 loss_iou: 0.3153 d0.loss_cls: 0.3520 d0.loss_bbox: 0.1514 d0.loss_iou: 0.3179 d1.loss_cls: 0.3248 d1.loss_bbox: 0.1459 d1.loss_iou: 0.3130 d2.loss_cls: 0.3212 d2.loss_bbox: 0.1446 d2.loss_iou: 0.3135 d3.loss_cls: 0.3182 d3.loss_bbox: 0.1439 d3.loss_iou: 0.3107 d4.loss_cls: 0.3116 d4.loss_bbox: 0.1487 d4.loss_iou: 0.3151 enc_loss_cls: 0.3476 enc_loss_bbox: 0.1728 enc_loss_iou: 0.3450 dn_loss_cls: 0.1060 dn_loss_bbox: 0.1489 dn_loss_iou: 0.2156 d0.dn_loss_cls: 0.1780 d0.dn_loss_bbox: 0.2866 d0.dn_loss_iou: 0.3488 d1.dn_loss_cls: 0.1310 d1.dn_loss_bbox: 0.1802 d1.dn_loss_iou: 0.2455 d2.dn_loss_cls: 0.1127 d2.dn_loss_bbox: 0.1590 d2.dn_loss_iou: 0.2244 d3.dn_loss_cls: 0.1077 d3.dn_loss_bbox: 0.1510 d3.dn_loss_iou: 0.2178 d4.dn_loss_cls: 0.1068 d4.dn_loss_bbox: 0.1489 d4.dn_loss_iou: 0.2157 d1.loss_lmm_region: 0.1354 loss_lmm_image: 0.8655 2024/11/12 11:21:16 - mmengine - INFO - Iter(train) [ 84900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:14:56 time: 1.9763 data_time: 0.0183 memory: 33581 grad_norm: 27.5258 loss: 8.3809 loss_cls: 0.2290 loss_bbox: 0.1348 loss_iou: 0.2041 d0.loss_cls: 0.2664 d0.loss_bbox: 0.1526 d0.loss_iou: 0.2184 d1.loss_cls: 0.2453 d1.loss_bbox: 0.1438 d1.loss_iou: 0.2130 d2.loss_cls: 0.2401 d2.loss_bbox: 0.1368 d2.loss_iou: 0.2049 d3.loss_cls: 0.2332 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2041 d4.loss_cls: 0.2291 d4.loss_bbox: 0.1358 d4.loss_iou: 0.2032 enc_loss_cls: 0.2700 enc_loss_bbox: 0.1690 enc_loss_iou: 0.2422 dn_loss_cls: 0.0826 dn_loss_bbox: 0.1680 dn_loss_iou: 0.1963 d0.dn_loss_cls: 0.1588 d0.dn_loss_bbox: 0.3397 d0.dn_loss_iou: 0.3529 d1.dn_loss_cls: 0.1081 d1.dn_loss_bbox: 0.2054 d1.dn_loss_iou: 0.2301 d2.dn_loss_cls: 0.0915 d2.dn_loss_bbox: 0.1785 d2.dn_loss_iou: 0.2059 d3.dn_loss_cls: 0.0855 d3.dn_loss_bbox: 0.1697 d3.dn_loss_iou: 0.1988 d4.dn_loss_cls: 0.0831 d4.dn_loss_bbox: 0.1680 d4.dn_loss_iou: 0.1963 d1.loss_lmm_region: 0.1256 loss_lmm_image: 0.8243 2024/11/12 11:24:36 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 11:24:36 - mmengine - INFO - Iter(train) [ 85000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:11:35 time: 2.0090 data_time: 0.0183 memory: 33504 grad_norm: nan loss: 8.6037 loss_cls: 0.2671 loss_bbox: 0.1177 loss_iou: 0.2224 d0.loss_cls: 0.3070 d0.loss_bbox: 0.1313 d0.loss_iou: 0.2381 d1.loss_cls: 0.2811 d1.loss_bbox: 0.1258 d1.loss_iou: 0.2329 d2.loss_cls: 0.2738 d2.loss_bbox: 0.1216 d2.loss_iou: 0.2263 d3.loss_cls: 0.2708 d3.loss_bbox: 0.1170 d3.loss_iou: 0.2221 d4.loss_cls: 0.2667 d4.loss_bbox: 0.1187 
d4.loss_iou: 0.2220 enc_loss_cls: 0.3052 enc_loss_bbox: 0.1424 enc_loss_iou: 0.2610 dn_loss_cls: 0.0856 dn_loss_bbox: 0.1456 dn_loss_iou: 0.2028 d0.dn_loss_cls: 0.1627 d0.dn_loss_bbox: 0.2895 d0.dn_loss_iou: 0.3520 d1.dn_loss_cls: 0.1147 d1.dn_loss_bbox: 0.1803 d1.dn_loss_iou: 0.2390 d2.dn_loss_cls: 0.0963 d2.dn_loss_bbox: 0.1569 d2.dn_loss_iou: 0.2139 d3.dn_loss_cls: 0.0902 d3.dn_loss_bbox: 0.1481 d3.dn_loss_iou: 0.2055 d4.dn_loss_cls: 0.0874 d4.dn_loss_bbox: 0.1457 d4.dn_loss_iou: 0.2029 d1.loss_lmm_region: 0.1194 loss_lmm_image: 0.8942 2024/11/12 11:27:56 - mmengine - INFO - Iter(train) [ 85100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:08:14 time: 2.0040 data_time: 0.0183 memory: 34767 grad_norm: 28.1609 loss: 8.6866 loss_cls: 0.2657 loss_bbox: 0.1231 loss_iou: 0.2436 d0.loss_cls: 0.3194 d0.loss_bbox: 0.1297 d0.loss_iou: 0.2541 d1.loss_cls: 0.2870 d1.loss_bbox: 0.1233 d1.loss_iou: 0.2487 d2.loss_cls: 0.2777 d2.loss_bbox: 0.1243 d2.loss_iou: 0.2422 d3.loss_cls: 0.2728 d3.loss_bbox: 0.1242 d3.loss_iou: 0.2430 d4.loss_cls: 0.2706 d4.loss_bbox: 0.1209 d4.loss_iou: 0.2430 enc_loss_cls: 0.3157 enc_loss_bbox: 0.1449 enc_loss_iou: 0.2784 dn_loss_cls: 0.0805 dn_loss_bbox: 0.1489 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1580 d0.dn_loss_bbox: 0.2813 d0.dn_loss_iou: 0.3439 d1.dn_loss_cls: 0.1104 d1.dn_loss_bbox: 0.1785 d1.dn_loss_iou: 0.2343 d2.dn_loss_cls: 0.0938 d2.dn_loss_bbox: 0.1596 d2.dn_loss_iou: 0.2138 d3.dn_loss_cls: 0.0830 d3.dn_loss_bbox: 0.1513 d3.dn_loss_iou: 0.2062 d4.dn_loss_cls: 0.0809 d4.dn_loss_bbox: 0.1489 d4.dn_loss_iou: 0.2035 d1.loss_lmm_region: 0.1395 loss_lmm_image: 0.8145 2024/11/12 11:31:15 - mmengine - INFO - Iter(train) [ 85200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:04:53 time: 1.9819 data_time: 0.0183 memory: 34006 grad_norm: 28.6644 loss: 9.7736 loss_cls: 0.2891 loss_bbox: 0.1586 loss_iou: 0.2803 d0.loss_cls: 0.3357 d0.loss_bbox: 0.1681 d0.loss_iou: 0.2927 d1.loss_cls: 0.3142 d1.loss_bbox: 0.1611 d1.loss_iou: 0.2856 d2.loss_cls: 0.3010 d2.loss_bbox: 0.1593 d2.loss_iou: 0.2839 d3.loss_cls: 0.2902 d3.loss_bbox: 0.1587 d3.loss_iou: 0.2809 d4.loss_cls: 0.2914 d4.loss_bbox: 0.1570 d4.loss_iou: 0.2795 enc_loss_cls: 0.3394 enc_loss_bbox: 0.1816 enc_loss_iou: 0.3127 dn_loss_cls: 0.0972 dn_loss_bbox: 0.1632 dn_loss_iou: 0.2286 d0.dn_loss_cls: 0.1759 d0.dn_loss_bbox: 0.3198 d0.dn_loss_iou: 0.3819 d1.dn_loss_cls: 0.1268 d1.dn_loss_bbox: 0.1954 d1.dn_loss_iou: 0.2609 d2.dn_loss_cls: 0.1073 d2.dn_loss_bbox: 0.1720 d2.dn_loss_iou: 0.2390 d3.dn_loss_cls: 0.1018 d3.dn_loss_bbox: 0.1660 d3.dn_loss_iou: 0.2312 d4.dn_loss_cls: 0.0989 d4.dn_loss_bbox: 0.1632 d4.dn_loss_iou: 0.2286 d1.loss_lmm_region: 0.1452 loss_lmm_image: 0.8497 2024/11/12 11:34:34 - mmengine - INFO - Iter(train) [ 85300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 12:01:31 time: 1.9680 data_time: 0.0183 memory: 34419 grad_norm: 27.5422 loss: 8.5704 loss_cls: 0.2885 loss_bbox: 0.1213 loss_iou: 0.2130 d0.loss_cls: 0.3251 d0.loss_bbox: 0.1333 d0.loss_iou: 0.2279 d1.loss_cls: 0.2984 d1.loss_bbox: 0.1315 d1.loss_iou: 0.2243 d2.loss_cls: 0.2936 d2.loss_bbox: 0.1271 d2.loss_iou: 0.2194 d3.loss_cls: 0.2890 d3.loss_bbox: 0.1230 d3.loss_iou: 0.2158 d4.loss_cls: 0.2856 d4.loss_bbox: 0.1230 d4.loss_iou: 0.2155 enc_loss_cls: 0.3224 enc_loss_bbox: 0.1504 enc_loss_iou: 0.2510 dn_loss_cls: 0.0958 dn_loss_bbox: 0.1457 dn_loss_iou: 0.1823 d0.dn_loss_cls: 0.1697 d0.dn_loss_bbox: 0.2842 d0.dn_loss_iou: 0.3121 d1.dn_loss_cls: 0.1223 d1.dn_loss_bbox: 0.1739 d1.dn_loss_iou: 0.2086 d2.dn_loss_cls: 
0.1044 d2.dn_loss_bbox: 0.1570 d2.dn_loss_iou: 0.1908 d3.dn_loss_cls: 0.0977 d3.dn_loss_bbox: 0.1474 d3.dn_loss_iou: 0.1843 d4.dn_loss_cls: 0.0954 d4.dn_loss_bbox: 0.1457 d4.dn_loss_iou: 0.1822 d1.loss_lmm_region: 0.1214 loss_lmm_image: 0.8707 2024/11/12 11:37:50 - mmengine - INFO - Iter(train) [ 85400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:58:07 time: 1.9210 data_time: 0.0183 memory: 34149 grad_norm: 28.7932 loss: 8.0963 loss_cls: 0.2102 loss_bbox: 0.1156 loss_iou: 0.2035 d0.loss_cls: 0.2426 d0.loss_bbox: 0.1235 d0.loss_iou: 0.2070 d1.loss_cls: 0.2273 d1.loss_bbox: 0.1173 d1.loss_iou: 0.1980 d2.loss_cls: 0.2193 d2.loss_bbox: 0.1166 d2.loss_iou: 0.2018 d3.loss_cls: 0.2185 d3.loss_bbox: 0.1156 d3.loss_iou: 0.2002 d4.loss_cls: 0.2158 d4.loss_bbox: 0.1133 d4.loss_iou: 0.1982 enc_loss_cls: 0.2475 enc_loss_bbox: 0.1321 enc_loss_iou: 0.2250 dn_loss_cls: 0.0784 dn_loss_bbox: 0.1723 dn_loss_iou: 0.2047 d0.dn_loss_cls: 0.1544 d0.dn_loss_bbox: 0.3236 d0.dn_loss_iou: 0.3421 d1.dn_loss_cls: 0.1024 d1.dn_loss_bbox: 0.2041 d1.dn_loss_iou: 0.2332 d2.dn_loss_cls: 0.0860 d2.dn_loss_bbox: 0.1813 d2.dn_loss_iou: 0.2127 d3.dn_loss_cls: 0.0810 d3.dn_loss_bbox: 0.1739 d3.dn_loss_iou: 0.2065 d4.dn_loss_cls: 0.0780 d4.dn_loss_bbox: 0.1724 d4.dn_loss_iou: 0.2047 d1.loss_lmm_region: 0.1329 loss_lmm_image: 0.9024 2024/11/12 11:41:07 - mmengine - INFO - Iter(train) [ 85500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:54:44 time: 1.9580 data_time: 0.0183 memory: 35131 grad_norm: 26.6815 loss: 10.1552 loss_cls: 0.3437 loss_bbox: 0.1419 loss_iou: 0.2748 d0.loss_cls: 0.3826 d0.loss_bbox: 0.1619 d0.loss_iou: 0.2959 d1.loss_cls: 0.3618 d1.loss_bbox: 0.1512 d1.loss_iou: 0.2831 d2.loss_cls: 0.3535 d2.loss_bbox: 0.1448 d2.loss_iou: 0.2768 d3.loss_cls: 0.3420 d3.loss_bbox: 0.1481 d3.loss_iou: 0.2772 d4.loss_cls: 0.3447 d4.loss_bbox: 0.1442 d4.loss_iou: 0.2768 enc_loss_cls: 0.3864 enc_loss_bbox: 0.1783 enc_loss_iou: 0.3203 dn_loss_cls: 0.1075 dn_loss_bbox: 0.1798 dn_loss_iou: 0.2240 d0.dn_loss_cls: 0.1837 d0.dn_loss_bbox: 0.3249 d0.dn_loss_iou: 0.3640 d1.dn_loss_cls: 0.1383 d1.dn_loss_bbox: 0.2027 d1.dn_loss_iou: 0.2522 d2.dn_loss_cls: 0.1211 d2.dn_loss_bbox: 0.1884 d2.dn_loss_iou: 0.2326 d3.dn_loss_cls: 0.1121 d3.dn_loss_bbox: 0.1816 d3.dn_loss_iou: 0.2267 d4.dn_loss_cls: 0.1083 d4.dn_loss_bbox: 0.1798 d4.dn_loss_iou: 0.2239 d1.loss_lmm_region: 0.1384 loss_lmm_image: 0.8752 2024/11/12 11:44:26 - mmengine - INFO - Iter(train) [ 85600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:51:22 time: 1.9630 data_time: 0.0183 memory: 34143 grad_norm: 25.3808 loss: 9.7472 loss_cls: 0.3171 loss_bbox: 0.1517 loss_iou: 0.2552 d0.loss_cls: 0.3658 d0.loss_bbox: 0.1636 d0.loss_iou: 0.2653 d1.loss_cls: 0.3350 d1.loss_bbox: 0.1575 d1.loss_iou: 0.2596 d2.loss_cls: 0.3252 d2.loss_bbox: 0.1543 d2.loss_iou: 0.2571 d3.loss_cls: 0.3183 d3.loss_bbox: 0.1531 d3.loss_iou: 0.2571 d4.loss_cls: 0.3188 d4.loss_bbox: 0.1515 d4.loss_iou: 0.2545 enc_loss_cls: 0.3672 enc_loss_bbox: 0.1723 enc_loss_iou: 0.2850 dn_loss_cls: 0.1017 dn_loss_bbox: 0.1772 dn_loss_iou: 0.2148 d0.dn_loss_cls: 0.1978 d0.dn_loss_bbox: 0.3346 d0.dn_loss_iou: 0.3510 d1.dn_loss_cls: 0.1391 d1.dn_loss_bbox: 0.2138 d1.dn_loss_iou: 0.2455 d2.dn_loss_cls: 0.1164 d2.dn_loss_bbox: 0.1891 d2.dn_loss_iou: 0.2252 d3.dn_loss_cls: 0.1082 d3.dn_loss_bbox: 0.1798 d3.dn_loss_iou: 0.2174 d4.dn_loss_cls: 0.1028 d4.dn_loss_bbox: 0.1772 d4.dn_loss_iou: 0.2148 d1.loss_lmm_region: 0.1518 loss_lmm_image: 0.8037 2024/11/12 11:47:45 - mmengine - INFO - Iter(train) [ 
85700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:48:01 time: 1.9682 data_time: 0.0187 memory: 36050 grad_norm: 33.2348 loss: 10.0281 loss_cls: 0.3125 loss_bbox: 0.1496 loss_iou: 0.2468 d0.loss_cls: 0.3529 d0.loss_bbox: 0.1533 d0.loss_iou: 0.2586 d1.loss_cls: 0.3343 d1.loss_bbox: 0.1506 d1.loss_iou: 0.2447 d2.loss_cls: 0.3299 d2.loss_bbox: 0.1471 d2.loss_iou: 0.2415 d3.loss_cls: 0.3194 d3.loss_bbox: 0.1473 d3.loss_iou: 0.2429 d4.loss_cls: 0.3192 d4.loss_bbox: 0.1440 d4.loss_iou: 0.2420 enc_loss_cls: 0.3591 enc_loss_bbox: 0.1680 enc_loss_iou: 0.2817 dn_loss_cls: 0.1740 dn_loss_bbox: 0.1840 dn_loss_iou: 0.2150 d0.dn_loss_cls: 0.2433 d0.dn_loss_bbox: 0.3406 d0.dn_loss_iou: 0.3589 d1.dn_loss_cls: 0.2121 d1.dn_loss_bbox: 0.2202 d1.dn_loss_iou: 0.2474 d2.dn_loss_cls: 0.1989 d2.dn_loss_bbox: 0.1967 d2.dn_loss_iou: 0.2255 d3.dn_loss_cls: 0.1837 d3.dn_loss_bbox: 0.1866 d3.dn_loss_iou: 0.2178 d4.dn_loss_cls: 0.1753 d4.dn_loss_bbox: 0.1840 d4.dn_loss_iou: 0.2150 d1.loss_lmm_region: 0.1359 loss_lmm_image: 0.7676 2024/11/12 11:51:03 - mmengine - INFO - Iter(train) [ 85800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:44:39 time: 1.9942 data_time: 0.0183 memory: 32987 grad_norm: 34.3340 loss: 10.1229 loss_cls: 0.3290 loss_bbox: 0.1583 loss_iou: 0.2511 d0.loss_cls: 0.3770 d0.loss_bbox: 0.1673 d0.loss_iou: 0.2661 d1.loss_cls: 0.3535 d1.loss_bbox: 0.1652 d1.loss_iou: 0.2577 d2.loss_cls: 0.3390 d2.loss_bbox: 0.1632 d2.loss_iou: 0.2546 d3.loss_cls: 0.3250 d3.loss_bbox: 0.1606 d3.loss_iou: 0.2520 d4.loss_cls: 0.3261 d4.loss_bbox: 0.1610 d4.loss_iou: 0.2522 enc_loss_cls: 0.3756 enc_loss_bbox: 0.1769 enc_loss_iou: 0.2832 dn_loss_cls: 0.1144 dn_loss_bbox: 0.1942 dn_loss_iou: 0.2298 d0.dn_loss_cls: 0.1861 d0.dn_loss_bbox: 0.3624 d0.dn_loss_iou: 0.3812 d1.dn_loss_cls: 0.1421 d1.dn_loss_bbox: 0.2282 d1.dn_loss_iou: 0.2627 d2.dn_loss_cls: 0.1246 d2.dn_loss_bbox: 0.2072 d2.dn_loss_iou: 0.2409 d3.dn_loss_cls: 0.1173 d3.dn_loss_bbox: 0.1977 d3.dn_loss_iou: 0.2333 d4.dn_loss_cls: 0.1143 d4.dn_loss_bbox: 0.1942 d4.dn_loss_iou: 0.2298 d1.loss_lmm_region: 0.1165 loss_lmm_image: 0.8516 2024/11/12 11:54:22 - mmengine - INFO - Iter(train) [ 85900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:41:17 time: 1.9913 data_time: 0.0183 memory: 33754 grad_norm: 53.8185 loss: 9.6481 loss_cls: 0.2765 loss_bbox: 0.1553 loss_iou: 0.2770 d0.loss_cls: 0.3134 d0.loss_bbox: 0.1658 d0.loss_iou: 0.2888 d1.loss_cls: 0.2934 d1.loss_bbox: 0.1582 d1.loss_iou: 0.2823 d2.loss_cls: 0.2859 d2.loss_bbox: 0.1518 d2.loss_iou: 0.2767 d3.loss_cls: 0.2825 d3.loss_bbox: 0.1536 d3.loss_iou: 0.2745 d4.loss_cls: 0.2796 d4.loss_bbox: 0.1510 d4.loss_iou: 0.2744 enc_loss_cls: 0.3263 enc_loss_bbox: 0.1752 enc_loss_iou: 0.3009 dn_loss_cls: 0.1034 dn_loss_bbox: 0.1777 dn_loss_iou: 0.2171 d0.dn_loss_cls: 0.1759 d0.dn_loss_bbox: 0.3420 d0.dn_loss_iou: 0.3603 d1.dn_loss_cls: 0.1301 d1.dn_loss_bbox: 0.2200 d1.dn_loss_iou: 0.2493 d2.dn_loss_cls: 0.1115 d2.dn_loss_bbox: 0.1888 d2.dn_loss_iou: 0.2262 d3.dn_loss_cls: 0.1076 d3.dn_loss_bbox: 0.1796 d3.dn_loss_iou: 0.2191 d4.dn_loss_cls: 0.1031 d4.dn_loss_bbox: 0.1778 d4.dn_loss_iou: 0.2172 d1.loss_lmm_region: 0.1219 loss_lmm_image: 0.8762 2024/11/12 11:57:41 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 11:57:41 - mmengine - INFO - Iter(train) [ 86000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:37:56 time: 2.0128 data_time: 0.0184 memory: 34183 grad_norm: 34.1158 loss: 9.4654 loss_cls: 0.3034 loss_bbox: 0.1230 loss_iou: 0.2214 d0.loss_cls: 
0.3267 d0.loss_bbox: 0.1341 d0.loss_iou: 0.2396 d1.loss_cls: 0.3115 d1.loss_bbox: 0.1243 d1.loss_iou: 0.2286 d2.loss_cls: 0.3076 d2.loss_bbox: 0.1251 d2.loss_iou: 0.2261 d3.loss_cls: 0.3025 d3.loss_bbox: 0.1242 d3.loss_iou: 0.2219 d4.loss_cls: 0.3051 d4.loss_bbox: 0.1194 d4.loss_iou: 0.2227 enc_loss_cls: 0.3299 enc_loss_bbox: 0.1461 enc_loss_iou: 0.2581 dn_loss_cls: 0.1908 dn_loss_bbox: 0.1540 dn_loss_iou: 0.2030 d0.dn_loss_cls: 0.2583 d0.dn_loss_bbox: 0.2964 d0.dn_loss_iou: 0.3444 d1.dn_loss_cls: 0.2208 d1.dn_loss_bbox: 0.1888 d1.dn_loss_iou: 0.2335 d2.dn_loss_cls: 0.2026 d2.dn_loss_bbox: 0.1643 d2.dn_loss_iou: 0.2118 d3.dn_loss_cls: 0.1935 d3.dn_loss_bbox: 0.1563 d3.dn_loss_iou: 0.2051 d4.dn_loss_cls: 0.1900 d4.dn_loss_bbox: 0.1539 d4.dn_loss_iou: 0.2027 d1.loss_lmm_region: 0.1513 loss_lmm_image: 0.8426 2024/11/12 12:01:01 - mmengine - INFO - Iter(train) [ 86100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:34:35 time: 1.9844 data_time: 0.0182 memory: 34054 grad_norm: 34.7442 loss: 8.2020 loss_cls: 0.2409 loss_bbox: 0.1243 loss_iou: 0.2006 d0.loss_cls: 0.2668 d0.loss_bbox: 0.1458 d0.loss_iou: 0.2159 d1.loss_cls: 0.2572 d1.loss_bbox: 0.1299 d1.loss_iou: 0.2047 d2.loss_cls: 0.2480 d2.loss_bbox: 0.1256 d2.loss_iou: 0.2004 d3.loss_cls: 0.2422 d3.loss_bbox: 0.1259 d3.loss_iou: 0.2020 d4.loss_cls: 0.2445 d4.loss_bbox: 0.1234 d4.loss_iou: 0.2006 enc_loss_cls: 0.2789 enc_loss_bbox: 0.1551 enc_loss_iou: 0.2362 dn_loss_cls: 0.1017 dn_loss_bbox: 0.1459 dn_loss_iou: 0.1890 d0.dn_loss_cls: 0.1753 d0.dn_loss_bbox: 0.2875 d0.dn_loss_iou: 0.3317 d1.dn_loss_cls: 0.1275 d1.dn_loss_bbox: 0.1733 d1.dn_loss_iou: 0.2182 d2.dn_loss_cls: 0.1088 d2.dn_loss_bbox: 0.1529 d2.dn_loss_iou: 0.1960 d3.dn_loss_cls: 0.1047 d3.dn_loss_bbox: 0.1459 d3.dn_loss_iou: 0.1911 d4.dn_loss_cls: 0.1009 d4.dn_loss_bbox: 0.1458 d4.dn_loss_iou: 0.1891 d1.loss_lmm_region: 0.1253 loss_lmm_image: 0.8228 2024/11/12 12:04:23 - mmengine - INFO - Iter(train) [ 86200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:31:15 time: 2.0103 data_time: 0.0183 memory: 34061 grad_norm: 30.8456 loss: 9.8035 loss_cls: 0.2817 loss_bbox: 0.1616 loss_iou: 0.2576 d0.loss_cls: 0.3144 d0.loss_bbox: 0.1790 d0.loss_iou: 0.2789 d1.loss_cls: 0.2906 d1.loss_bbox: 0.1761 d1.loss_iou: 0.2761 d2.loss_cls: 0.2927 d2.loss_bbox: 0.1628 d2.loss_iou: 0.2643 d3.loss_cls: 0.2849 d3.loss_bbox: 0.1619 d3.loss_iou: 0.2591 d4.loss_cls: 0.2781 d4.loss_bbox: 0.1633 d4.loss_iou: 0.2608 enc_loss_cls: 0.3303 enc_loss_bbox: 0.1871 enc_loss_iou: 0.2969 dn_loss_cls: 0.0940 dn_loss_bbox: 0.1892 dn_loss_iou: 0.2360 d0.dn_loss_cls: 0.1794 d0.dn_loss_bbox: 0.3511 d0.dn_loss_iou: 0.3829 d1.dn_loss_cls: 0.1299 d1.dn_loss_bbox: 0.2227 d1.dn_loss_iou: 0.2675 d2.dn_loss_cls: 0.1090 d2.dn_loss_bbox: 0.2017 d2.dn_loss_iou: 0.2461 d3.dn_loss_cls: 0.1027 d3.dn_loss_bbox: 0.1912 d3.dn_loss_iou: 0.2387 d4.dn_loss_cls: 0.0952 d4.dn_loss_bbox: 0.1890 d4.dn_loss_iou: 0.2361 d1.loss_lmm_region: 0.1320 loss_lmm_image: 0.8508 2024/11/12 12:07:43 - mmengine - INFO - Iter(train) [ 86300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:27:55 time: 1.9951 data_time: 0.0183 memory: 34345 grad_norm: 33.0151 loss: 10.0529 loss_cls: 0.3022 loss_bbox: 0.1696 loss_iou: 0.2740 d0.loss_cls: 0.3547 d0.loss_bbox: 0.1797 d0.loss_iou: 0.2863 d1.loss_cls: 0.3233 d1.loss_bbox: 0.1715 d1.loss_iou: 0.2791 d2.loss_cls: 0.3194 d2.loss_bbox: 0.1642 d2.loss_iou: 0.2727 d3.loss_cls: 0.3081 d3.loss_bbox: 0.1675 d3.loss_iou: 0.2726 d4.loss_cls: 0.3029 d4.loss_bbox: 0.1674 d4.loss_iou: 0.2717 
enc_loss_cls: 0.3521 enc_loss_bbox: 0.1958 enc_loss_iou: 0.3116 dn_loss_cls: 0.0902 dn_loss_bbox: 0.2003 dn_loss_iou: 0.2282 d0.dn_loss_cls: 0.1764 d0.dn_loss_bbox: 0.3564 d0.dn_loss_iou: 0.3730 d1.dn_loss_cls: 0.1258 d1.dn_loss_bbox: 0.2295 d1.dn_loss_iou: 0.2591 d2.dn_loss_cls: 0.1041 d2.dn_loss_bbox: 0.2086 d2.dn_loss_iou: 0.2375 d3.dn_loss_cls: 0.0948 d3.dn_loss_bbox: 0.2020 d3.dn_loss_iou: 0.2304 d4.dn_loss_cls: 0.0919 d4.dn_loss_bbox: 0.2004 d4.dn_loss_iou: 0.2283 d1.loss_lmm_region: 0.1587 loss_lmm_image: 0.8110 2024/11/12 12:11:02 - mmengine - INFO - Iter(train) [ 86400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:24:33 time: 1.9755 data_time: 0.0183 memory: 34908 grad_norm: 25.1686 loss: 9.3533 loss_cls: 0.3013 loss_bbox: 0.1263 loss_iou: 0.2311 d0.loss_cls: 0.3413 d0.loss_bbox: 0.1429 d0.loss_iou: 0.2510 d1.loss_cls: 0.3181 d1.loss_bbox: 0.1342 d1.loss_iou: 0.2395 d2.loss_cls: 0.3123 d2.loss_bbox: 0.1289 d2.loss_iou: 0.2354 d3.loss_cls: 0.3017 d3.loss_bbox: 0.1298 d3.loss_iou: 0.2348 d4.loss_cls: 0.3026 d4.loss_bbox: 0.1260 d4.loss_iou: 0.2313 enc_loss_cls: 0.3503 enc_loss_bbox: 0.1531 enc_loss_iou: 0.2691 dn_loss_cls: 0.1022 dn_loss_bbox: 0.1686 dn_loss_iou: 0.2181 d0.dn_loss_cls: 0.1824 d0.dn_loss_bbox: 0.3253 d0.dn_loss_iou: 0.3725 d1.dn_loss_cls: 0.1297 d1.dn_loss_bbox: 0.2003 d1.dn_loss_iou: 0.2499 d2.dn_loss_cls: 0.1118 d2.dn_loss_bbox: 0.1772 d2.dn_loss_iou: 0.2263 d3.dn_loss_cls: 0.1056 d3.dn_loss_bbox: 0.1698 d3.dn_loss_iou: 0.2205 d4.dn_loss_cls: 0.1017 d4.dn_loss_bbox: 0.1686 d4.dn_loss_iou: 0.2181 d1.loss_lmm_region: 0.1578 loss_lmm_image: 0.8858 2024/11/12 12:14:23 - mmengine - INFO - Iter(train) [ 86500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:21:13 time: 2.0035 data_time: 0.0182 memory: 33596 grad_norm: 29.8117 loss: 11.7988 loss_cls: 0.4824 loss_bbox: 0.1538 loss_iou: 0.3073 d0.loss_cls: 0.5444 d0.loss_bbox: 0.1667 d0.loss_iou: 0.3210 d1.loss_cls: 0.5152 d1.loss_bbox: 0.1606 d1.loss_iou: 0.3145 d2.loss_cls: 0.5003 d2.loss_bbox: 0.1521 d2.loss_iou: 0.3089 d3.loss_cls: 0.4949 d3.loss_bbox: 0.1525 d3.loss_iou: 0.3057 d4.loss_cls: 0.4881 d4.loss_bbox: 0.1537 d4.loss_iou: 0.3073 enc_loss_cls: 0.5487 enc_loss_bbox: 0.1829 enc_loss_iou: 0.3485 dn_loss_cls: 0.2280 dn_loss_bbox: 0.1373 dn_loss_iou: 0.2113 d0.dn_loss_cls: 0.2871 d0.dn_loss_bbox: 0.2845 d0.dn_loss_iou: 0.3522 d1.dn_loss_cls: 0.2453 d1.dn_loss_bbox: 0.1692 d1.dn_loss_iou: 0.2426 d2.dn_loss_cls: 0.2295 d2.dn_loss_bbox: 0.1461 d2.dn_loss_iou: 0.2205 d3.dn_loss_cls: 0.2286 d3.dn_loss_bbox: 0.1401 d3.dn_loss_iou: 0.2139 d4.dn_loss_cls: 0.2262 d4.dn_loss_bbox: 0.1373 d4.dn_loss_iou: 0.2113 d1.loss_lmm_region: 0.1469 loss_lmm_image: 0.8315 2024/11/12 12:17:42 - mmengine - INFO - Iter(train) [ 86600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:17:52 time: 1.9805 data_time: 0.0184 memory: 33334 grad_norm: 27.1139 loss: 7.6467 loss_cls: 0.2359 loss_bbox: 0.1031 loss_iou: 0.1833 d0.loss_cls: 0.2675 d0.loss_bbox: 0.1214 d0.loss_iou: 0.2019 d1.loss_cls: 0.2412 d1.loss_bbox: 0.1123 d1.loss_iou: 0.1904 d2.loss_cls: 0.2363 d2.loss_bbox: 0.1083 d2.loss_iou: 0.1873 d3.loss_cls: 0.2396 d3.loss_bbox: 0.1037 d3.loss_iou: 0.1847 d4.loss_cls: 0.2357 d4.loss_bbox: 0.1028 d4.loss_iou: 0.1836 enc_loss_cls: 0.2746 enc_loss_bbox: 0.1374 enc_loss_iou: 0.2287 dn_loss_cls: 0.0771 dn_loss_bbox: 0.1378 dn_loss_iou: 0.1742 d0.dn_loss_cls: 0.1541 d0.dn_loss_bbox: 0.2831 d0.dn_loss_iou: 0.3161 d1.dn_loss_cls: 0.1055 d1.dn_loss_bbox: 0.1637 d1.dn_loss_iou: 0.2020 d2.dn_loss_cls: 0.0874 d2.dn_loss_bbox: 
0.1452 d2.dn_loss_iou: 0.1825 d3.dn_loss_cls: 0.0821 d3.dn_loss_bbox: 0.1394 d3.dn_loss_iou: 0.1763 d4.dn_loss_cls: 0.0787 d4.dn_loss_bbox: 0.1378 d4.dn_loss_iou: 0.1743 d1.loss_lmm_region: 0.1148 loss_lmm_image: 0.8353 2024/11/12 12:21:01 - mmengine - INFO - Iter(train) [ 86700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:14:31 time: 1.9800 data_time: 0.0185 memory: 34841 grad_norm: 29.9669 loss: 9.7294 loss_cls: 0.2937 loss_bbox: 0.1603 loss_iou: 0.2630 d0.loss_cls: 0.3416 d0.loss_bbox: 0.1770 d0.loss_iou: 0.2838 d1.loss_cls: 0.3140 d1.loss_bbox: 0.1690 d1.loss_iou: 0.2732 d2.loss_cls: 0.3044 d2.loss_bbox: 0.1619 d2.loss_iou: 0.2674 d3.loss_cls: 0.2989 d3.loss_bbox: 0.1596 d3.loss_iou: 0.2657 d4.loss_cls: 0.2957 d4.loss_bbox: 0.1599 d4.loss_iou: 0.2631 enc_loss_cls: 0.3486 enc_loss_bbox: 0.1878 enc_loss_iou: 0.3042 dn_loss_cls: 0.0913 dn_loss_bbox: 0.1827 dn_loss_iou: 0.2168 d0.dn_loss_cls: 0.1707 d0.dn_loss_bbox: 0.3196 d0.dn_loss_iou: 0.3553 d1.dn_loss_cls: 0.1205 d1.dn_loss_bbox: 0.2116 d1.dn_loss_iou: 0.2453 d2.dn_loss_cls: 0.1034 d2.dn_loss_bbox: 0.1906 d2.dn_loss_iou: 0.2254 d3.dn_loss_cls: 0.0956 d3.dn_loss_bbox: 0.1845 d3.dn_loss_iou: 0.2193 d4.dn_loss_cls: 0.0909 d4.dn_loss_bbox: 0.1827 d4.dn_loss_iou: 0.2167 d1.loss_lmm_region: 0.1329 loss_lmm_image: 0.8807 2024/11/12 12:24:20 - mmengine - INFO - Iter(train) [ 86800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:11:09 time: 1.9792 data_time: 0.0185 memory: 34869 grad_norm: 34.2891 loss: 9.2511 loss_cls: 0.3179 loss_bbox: 0.1274 loss_iou: 0.2504 d0.loss_cls: 0.3676 d0.loss_bbox: 0.1414 d0.loss_iou: 0.2647 d1.loss_cls: 0.3440 d1.loss_bbox: 0.1317 d1.loss_iou: 0.2522 d2.loss_cls: 0.3296 d2.loss_bbox: 0.1304 d2.loss_iou: 0.2509 d3.loss_cls: 0.3211 d3.loss_bbox: 0.1292 d3.loss_iou: 0.2515 d4.loss_cls: 0.3209 d4.loss_bbox: 0.1275 d4.loss_iou: 0.2491 enc_loss_cls: 0.3689 enc_loss_bbox: 0.1544 enc_loss_iou: 0.2859 dn_loss_cls: 0.1067 dn_loss_bbox: 0.1378 dn_loss_iou: 0.2001 d0.dn_loss_cls: 0.1869 d0.dn_loss_bbox: 0.2764 d0.dn_loss_iou: 0.3458 d1.dn_loss_cls: 0.1350 d1.dn_loss_bbox: 0.1704 d1.dn_loss_iou: 0.2333 d2.dn_loss_cls: 0.1178 d2.dn_loss_bbox: 0.1490 d2.dn_loss_iou: 0.2103 d3.dn_loss_cls: 0.1111 d3.dn_loss_bbox: 0.1412 d3.dn_loss_iou: 0.2031 d4.dn_loss_cls: 0.1076 d4.dn_loss_bbox: 0.1378 d4.dn_loss_iou: 0.2000 d1.loss_lmm_region: 0.1264 loss_lmm_image: 0.8379 2024/11/12 12:27:41 - mmengine - INFO - Iter(train) [ 86900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:07:49 time: 2.0400 data_time: 0.0183 memory: 34122 grad_norm: 34.1068 loss: 8.9315 loss_cls: 0.2750 loss_bbox: 0.1324 loss_iou: 0.2474 d0.loss_cls: 0.3217 d0.loss_bbox: 0.1475 d0.loss_iou: 0.2640 d1.loss_cls: 0.2917 d1.loss_bbox: 0.1402 d1.loss_iou: 0.2560 d2.loss_cls: 0.2807 d2.loss_bbox: 0.1367 d2.loss_iou: 0.2522 d3.loss_cls: 0.2781 d3.loss_bbox: 0.1337 d3.loss_iou: 0.2491 d4.loss_cls: 0.2763 d4.loss_bbox: 0.1322 d4.loss_iou: 0.2486 enc_loss_cls: 0.3269 enc_loss_bbox: 0.1628 enc_loss_iou: 0.2909 dn_loss_cls: 0.0799 dn_loss_bbox: 0.1584 dn_loss_iou: 0.2108 d0.dn_loss_cls: 0.1583 d0.dn_loss_bbox: 0.2992 d0.dn_loss_iou: 0.3526 d1.dn_loss_cls: 0.1096 d1.dn_loss_bbox: 0.1842 d1.dn_loss_iou: 0.2381 d2.dn_loss_cls: 0.0904 d2.dn_loss_bbox: 0.1676 d2.dn_loss_iou: 0.2205 d3.dn_loss_cls: 0.0842 d3.dn_loss_bbox: 0.1604 d3.dn_loss_iou: 0.2134 d4.dn_loss_cls: 0.0804 d4.dn_loss_bbox: 0.1584 d4.dn_loss_iou: 0.2107 d1.loss_lmm_region: 0.1242 loss_lmm_image: 0.7863 2024/11/12 12:31:00 - mmengine - INFO - Exp name: 
grounding_dino_swin_l_20241110_092759 2024/11/12 12:31:00 - mmengine - INFO - Iter(train) [ 87000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:04:27 time: 1.9643 data_time: 0.0182 memory: 34261 grad_norm: 31.8302 loss: 8.7095 loss_cls: 0.2550 loss_bbox: 0.1298 loss_iou: 0.2205 d0.loss_cls: 0.2950 d0.loss_bbox: 0.1477 d0.loss_iou: 0.2384 d1.loss_cls: 0.2738 d1.loss_bbox: 0.1398 d1.loss_iou: 0.2298 d2.loss_cls: 0.2640 d2.loss_bbox: 0.1342 d2.loss_iou: 0.2224 d3.loss_cls: 0.2596 d3.loss_bbox: 0.1326 d3.loss_iou: 0.2212 d4.loss_cls: 0.2554 d4.loss_bbox: 0.1307 d4.loss_iou: 0.2199 enc_loss_cls: 0.3048 enc_loss_bbox: 0.1602 enc_loss_iou: 0.2547 dn_loss_cls: 0.0841 dn_loss_bbox: 0.1744 dn_loss_iou: 0.2014 d0.dn_loss_cls: 0.1598 d0.dn_loss_bbox: 0.3204 d0.dn_loss_iou: 0.3327 d1.dn_loss_cls: 0.1149 d1.dn_loss_bbox: 0.2046 d1.dn_loss_iou: 0.2300 d2.dn_loss_cls: 0.0967 d2.dn_loss_bbox: 0.1847 d2.dn_loss_iou: 0.2096 d3.dn_loss_cls: 0.0890 d3.dn_loss_bbox: 0.1770 d3.dn_loss_iou: 0.2033 d4.dn_loss_cls: 0.0855 d4.dn_loss_bbox: 0.1744 d4.dn_loss_iou: 0.2014 d1.loss_lmm_region: 0.1175 loss_lmm_image: 0.8585 2024/11/12 12:34:20 - mmengine - INFO - Iter(train) [ 87100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 11:01:07 time: 2.0169 data_time: 0.0183 memory: 34023 grad_norm: 25.7350 loss: 8.5046 loss_cls: 0.2526 loss_bbox: 0.1171 loss_iou: 0.2068 d0.loss_cls: 0.2899 d0.loss_bbox: 0.1265 d0.loss_iou: 0.2170 d1.loss_cls: 0.2622 d1.loss_bbox: 0.1228 d1.loss_iou: 0.2126 d2.loss_cls: 0.2595 d2.loss_bbox: 0.1192 d2.loss_iou: 0.2091 d3.loss_cls: 0.2531 d3.loss_bbox: 0.1193 d3.loss_iou: 0.2080 d4.loss_cls: 0.2507 d4.loss_bbox: 0.1187 d4.loss_iou: 0.2093 enc_loss_cls: 0.3026 enc_loss_bbox: 0.1425 enc_loss_iou: 0.2374 dn_loss_cls: 0.0974 dn_loss_bbox: 0.1614 dn_loss_iou: 0.2078 d0.dn_loss_cls: 0.1860 d0.dn_loss_bbox: 0.3104 d0.dn_loss_iou: 0.3480 d1.dn_loss_cls: 0.1322 d1.dn_loss_bbox: 0.1932 d1.dn_loss_iou: 0.2377 d2.dn_loss_cls: 0.1086 d2.dn_loss_bbox: 0.1704 d2.dn_loss_iou: 0.2163 d3.dn_loss_cls: 0.1015 d3.dn_loss_bbox: 0.1630 d3.dn_loss_iou: 0.2094 d4.dn_loss_cls: 0.0983 d4.dn_loss_bbox: 0.1615 d4.dn_loss_iou: 0.2078 d1.loss_lmm_region: 0.1362 loss_lmm_image: 0.8208 2024/11/12 12:37:41 - mmengine - INFO - Iter(train) [ 87200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:57:47 time: 2.0093 data_time: 0.0183 memory: 34267 grad_norm: 41.8934 loss: 8.4246 loss_cls: 0.2688 loss_bbox: 0.1123 loss_iou: 0.1889 d0.loss_cls: 0.3065 d0.loss_bbox: 0.1190 d0.loss_iou: 0.1979 d1.loss_cls: 0.2845 d1.loss_bbox: 0.1149 d1.loss_iou: 0.1925 d2.loss_cls: 0.2774 d2.loss_bbox: 0.1130 d2.loss_iou: 0.1888 d3.loss_cls: 0.2719 d3.loss_bbox: 0.1121 d3.loss_iou: 0.1884 d4.loss_cls: 0.2686 d4.loss_bbox: 0.1120 d4.loss_iou: 0.1881 enc_loss_cls: 0.3149 enc_loss_bbox: 0.1319 enc_loss_iou: 0.2199 dn_loss_cls: 0.1242 dn_loss_bbox: 0.1540 dn_loss_iou: 0.1859 d0.dn_loss_cls: 0.1945 d0.dn_loss_bbox: 0.3072 d0.dn_loss_iou: 0.3300 d1.dn_loss_cls: 0.1548 d1.dn_loss_bbox: 0.1888 d1.dn_loss_iou: 0.2166 d2.dn_loss_cls: 0.1338 d2.dn_loss_bbox: 0.1653 d2.dn_loss_iou: 0.1945 d3.dn_loss_cls: 0.1277 d3.dn_loss_bbox: 0.1564 d3.dn_loss_iou: 0.1885 d4.dn_loss_cls: 0.1248 d4.dn_loss_bbox: 0.1541 d4.dn_loss_iou: 0.1859 d1.loss_lmm_region: 0.1180 loss_lmm_image: 0.8474 2024/11/12 12:41:01 - mmengine - INFO - Iter(train) [ 87300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:54:26 time: 1.9725 data_time: 0.0184 memory: 34128 grad_norm: 24.1261 loss: 10.0271 loss_cls: 0.2932 loss_bbox: 0.1685 loss_iou: 0.2778 d0.loss_cls: 
0.3344 d0.loss_bbox: 0.1865 d0.loss_iou: 0.2958 d1.loss_cls: 0.3201 d1.loss_bbox: 0.1744 d1.loss_iou: 0.2859 d2.loss_cls: 0.3069 d2.loss_bbox: 0.1748 d2.loss_iou: 0.2840 d3.loss_cls: 0.2989 d3.loss_bbox: 0.1678 d3.loss_iou: 0.2778 d4.loss_cls: 0.2955 d4.loss_bbox: 0.1657 d4.loss_iou: 0.2765 enc_loss_cls: 0.3370 enc_loss_bbox: 0.1935 enc_loss_iou: 0.3146 dn_loss_cls: 0.0900 dn_loss_bbox: 0.1969 dn_loss_iou: 0.2400 d0.dn_loss_cls: 0.1748 d0.dn_loss_bbox: 0.3382 d0.dn_loss_iou: 0.3799 d1.dn_loss_cls: 0.1250 d1.dn_loss_bbox: 0.2302 d1.dn_loss_iou: 0.2711 d2.dn_loss_cls: 0.1059 d2.dn_loss_bbox: 0.2053 d2.dn_loss_iou: 0.2491 d3.dn_loss_cls: 0.0968 d3.dn_loss_bbox: 0.1984 d3.dn_loss_iou: 0.2417 d4.dn_loss_cls: 0.0919 d4.dn_loss_bbox: 0.1969 d4.dn_loss_iou: 0.2398 d1.loss_lmm_region: 0.1396 loss_lmm_image: 0.7863 2024/11/12 12:44:20 - mmengine - INFO - Iter(train) [ 87400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:51:05 time: 1.9938 data_time: 0.0183 memory: 35656 grad_norm: 33.6047 loss: 8.9527 loss_cls: 0.2686 loss_bbox: 0.1371 loss_iou: 0.2505 d0.loss_cls: 0.3152 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2648 d1.loss_cls: 0.2869 d1.loss_bbox: 0.1424 d1.loss_iou: 0.2541 d2.loss_cls: 0.2845 d2.loss_bbox: 0.1364 d2.loss_iou: 0.2511 d3.loss_cls: 0.2755 d3.loss_bbox: 0.1344 d3.loss_iou: 0.2484 d4.loss_cls: 0.2704 d4.loss_bbox: 0.1365 d4.loss_iou: 0.2501 enc_loss_cls: 0.3222 enc_loss_bbox: 0.1634 enc_loss_iou: 0.2839 dn_loss_cls: 0.0763 dn_loss_bbox: 0.1656 dn_loss_iou: 0.2125 d0.dn_loss_cls: 0.1511 d0.dn_loss_bbox: 0.2997 d0.dn_loss_iou: 0.3471 d1.dn_loss_cls: 0.1015 d1.dn_loss_bbox: 0.1942 d1.dn_loss_iou: 0.2416 d2.dn_loss_cls: 0.0865 d2.dn_loss_bbox: 0.1772 d2.dn_loss_iou: 0.2227 d3.dn_loss_cls: 0.0797 d3.dn_loss_bbox: 0.1683 d3.dn_loss_iou: 0.2153 d4.dn_loss_cls: 0.0771 d4.dn_loss_bbox: 0.1656 d4.dn_loss_iou: 0.2123 d1.loss_lmm_region: 0.1065 loss_lmm_image: 0.8240 2024/11/12 12:47:40 - mmengine - INFO - Iter(train) [ 87500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:47:44 time: 1.9861 data_time: 0.0182 memory: 34558 grad_norm: 30.3391 loss: 8.7317 loss_cls: 0.2550 loss_bbox: 0.1305 loss_iou: 0.2359 d0.loss_cls: 0.3061 d0.loss_bbox: 0.1446 d0.loss_iou: 0.2469 d1.loss_cls: 0.2826 d1.loss_bbox: 0.1299 d1.loss_iou: 0.2345 d2.loss_cls: 0.2658 d2.loss_bbox: 0.1329 d2.loss_iou: 0.2389 d3.loss_cls: 0.2605 d3.loss_bbox: 0.1315 d3.loss_iou: 0.2346 d4.loss_cls: 0.2576 d4.loss_bbox: 0.1299 d4.loss_iou: 0.2356 enc_loss_cls: 0.3128 enc_loss_bbox: 0.1595 enc_loss_iou: 0.2720 dn_loss_cls: 0.0839 dn_loss_bbox: 0.1577 dn_loss_iou: 0.2081 d0.dn_loss_cls: 0.1628 d0.dn_loss_bbox: 0.2970 d0.dn_loss_iou: 0.3513 d1.dn_loss_cls: 0.1130 d1.dn_loss_bbox: 0.1898 d1.dn_loss_iou: 0.2401 d2.dn_loss_cls: 0.0933 d2.dn_loss_bbox: 0.1693 d2.dn_loss_iou: 0.2193 d3.dn_loss_cls: 0.0867 d3.dn_loss_bbox: 0.1607 d3.dn_loss_iou: 0.2115 d4.dn_loss_cls: 0.0839 d4.dn_loss_bbox: 0.1576 d4.dn_loss_iou: 0.2081 d1.loss_lmm_region: 0.1246 loss_lmm_image: 0.8155 2024/11/12 12:51:01 - mmengine - INFO - Iter(train) [ 87600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:44:24 time: 2.0193 data_time: 0.0184 memory: 34890 grad_norm: 33.9010 loss: 10.2073 loss_cls: 0.3415 loss_bbox: 0.1524 loss_iou: 0.2810 d0.loss_cls: 0.3740 d0.loss_bbox: 0.1755 d0.loss_iou: 0.3042 d1.loss_cls: 0.3596 d1.loss_bbox: 0.1605 d1.loss_iou: 0.2868 d2.loss_cls: 0.3429 d2.loss_bbox: 0.1600 d2.loss_iou: 0.2881 d3.loss_cls: 0.3423 d3.loss_bbox: 0.1541 d3.loss_iou: 0.2818 d4.loss_cls: 0.3371 d4.loss_bbox: 0.1551 d4.loss_iou: 0.2832 
enc_loss_cls: 0.3849 enc_loss_bbox: 0.1829 enc_loss_iou: 0.3175 dn_loss_cls: 0.1121 dn_loss_bbox: 0.1724 dn_loss_iou: 0.2252 d0.dn_loss_cls: 0.1897 d0.dn_loss_bbox: 0.3247 d0.dn_loss_iou: 0.3681 d1.dn_loss_cls: 0.1412 d1.dn_loss_bbox: 0.2030 d1.dn_loss_iou: 0.2551 d2.dn_loss_cls: 0.1247 d2.dn_loss_bbox: 0.1772 d2.dn_loss_iou: 0.2324 d3.dn_loss_cls: 0.1171 d3.dn_loss_bbox: 0.1744 d3.dn_loss_iou: 0.2272 d4.dn_loss_cls: 0.1128 d4.dn_loss_bbox: 0.1724 d4.dn_loss_iou: 0.2252 d1.loss_lmm_region: 0.1279 loss_lmm_image: 0.8594 2024/11/12 12:54:18 - mmengine - INFO - Iter(train) [ 87700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:41:01 time: 1.9649 data_time: 0.0182 memory: 33612 grad_norm: nan loss: 8.9841 loss_cls: 0.2779 loss_bbox: 0.1379 loss_iou: 0.2531 d0.loss_cls: 0.3267 d0.loss_bbox: 0.1565 d0.loss_iou: 0.2750 d1.loss_cls: 0.3048 d1.loss_bbox: 0.1419 d1.loss_iou: 0.2601 d2.loss_cls: 0.2904 d2.loss_bbox: 0.1390 d2.loss_iou: 0.2549 d3.loss_cls: 0.2840 d3.loss_bbox: 0.1370 d3.loss_iou: 0.2524 d4.loss_cls: 0.2780 d4.loss_bbox: 0.1365 d4.loss_iou: 0.2529 enc_loss_cls: 0.3337 enc_loss_bbox: 0.1729 enc_loss_iou: 0.3002 dn_loss_cls: 0.0790 dn_loss_bbox: 0.1554 dn_loss_iou: 0.1924 d0.dn_loss_cls: 0.1556 d0.dn_loss_bbox: 0.2857 d0.dn_loss_iou: 0.3232 d1.dn_loss_cls: 0.1111 d1.dn_loss_bbox: 0.1875 d1.dn_loss_iou: 0.2221 d2.dn_loss_cls: 0.0924 d2.dn_loss_bbox: 0.1677 d2.dn_loss_iou: 0.2035 d3.dn_loss_cls: 0.0837 d3.dn_loss_bbox: 0.1574 d3.dn_loss_iou: 0.1949 d4.dn_loss_cls: 0.0802 d4.dn_loss_bbox: 0.1553 d4.dn_loss_iou: 0.1924 d1.loss_lmm_region: 0.1203 loss_lmm_image: 0.8582 2024/11/12 12:57:37 - mmengine - INFO - Iter(train) [ 87800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:37:40 time: 1.9851 data_time: 0.0185 memory: 33462 grad_norm: 25.3585 loss: 8.4444 loss_cls: 0.2867 loss_bbox: 0.1203 loss_iou: 0.2205 d0.loss_cls: 0.3275 d0.loss_bbox: 0.1281 d0.loss_iou: 0.2351 d1.loss_cls: 0.3038 d1.loss_bbox: 0.1226 d1.loss_iou: 0.2256 d2.loss_cls: 0.2928 d2.loss_bbox: 0.1223 d2.loss_iou: 0.2247 d3.loss_cls: 0.2874 d3.loss_bbox: 0.1214 d3.loss_iou: 0.2233 d4.loss_cls: 0.2857 d4.loss_bbox: 0.1219 d4.loss_iou: 0.2223 enc_loss_cls: 0.3228 enc_loss_bbox: 0.1444 enc_loss_iou: 0.2605 dn_loss_cls: 0.0778 dn_loss_bbox: 0.1394 dn_loss_iou: 0.1903 d0.dn_loss_cls: 0.1519 d0.dn_loss_bbox: 0.2654 d0.dn_loss_iou: 0.3217 d1.dn_loss_cls: 0.1072 d1.dn_loss_bbox: 0.1643 d1.dn_loss_iou: 0.2195 d2.dn_loss_cls: 0.0907 d2.dn_loss_bbox: 0.1492 d2.dn_loss_iou: 0.1996 d3.dn_loss_cls: 0.0824 d3.dn_loss_bbox: 0.1416 d3.dn_loss_iou: 0.1926 d4.dn_loss_cls: 0.0776 d4.dn_loss_bbox: 0.1394 d4.dn_loss_iou: 0.1903 d1.loss_lmm_region: 0.1276 loss_lmm_image: 0.8162 2024/11/12 13:00:57 - mmengine - INFO - Iter(train) [ 87900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:34:19 time: 2.0003 data_time: 0.0183 memory: 33659 grad_norm: 31.5638 loss: 8.5961 loss_cls: 0.2719 loss_bbox: 0.1214 loss_iou: 0.2097 d0.loss_cls: 0.3127 d0.loss_bbox: 0.1375 d0.loss_iou: 0.2225 d1.loss_cls: 0.2845 d1.loss_bbox: 0.1312 d1.loss_iou: 0.2198 d2.loss_cls: 0.2778 d2.loss_bbox: 0.1246 d2.loss_iou: 0.2119 d3.loss_cls: 0.2753 d3.loss_bbox: 0.1237 d3.loss_iou: 0.2112 d4.loss_cls: 0.2715 d4.loss_bbox: 0.1212 d4.loss_iou: 0.2087 enc_loss_cls: 0.3154 enc_loss_bbox: 0.1536 enc_loss_iou: 0.2467 dn_loss_cls: 0.0994 dn_loss_bbox: 0.1567 dn_loss_iou: 0.1848 d0.dn_loss_cls: 0.1713 d0.dn_loss_bbox: 0.3095 d0.dn_loss_iou: 0.3250 d1.dn_loss_cls: 0.1250 d1.dn_loss_bbox: 0.1878 d1.dn_loss_iou: 0.2152 d2.dn_loss_cls: 0.1098 d2.dn_loss_bbox: 
0.1649 d2.dn_loss_iou: 0.1939 d3.dn_loss_cls: 0.1038 d3.dn_loss_bbox: 0.1601 d3.dn_loss_iou: 0.1880 d4.dn_loss_cls: 0.1007 d4.dn_loss_bbox: 0.1569 d4.dn_loss_iou: 0.1847 d1.loss_lmm_region: 0.1176 loss_lmm_image: 0.8882 2024/11/12 13:04:16 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 13:04:16 - mmengine - INFO - Iter(train) [ 88000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:30:57 time: 1.9855 data_time: 0.0183 memory: 34473 grad_norm: 35.7850 loss: 9.9174 loss_cls: 0.3165 loss_bbox: 0.1398 loss_iou: 0.2264 d0.loss_cls: 0.3502 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2486 d1.loss_cls: 0.3341 d1.loss_bbox: 0.1509 d1.loss_iou: 0.2345 d2.loss_cls: 0.3276 d2.loss_bbox: 0.1445 d2.loss_iou: 0.2317 d3.loss_cls: 0.3204 d3.loss_bbox: 0.1409 d3.loss_iou: 0.2282 d4.loss_cls: 0.3150 d4.loss_bbox: 0.1401 d4.loss_iou: 0.2273 enc_loss_cls: 0.3574 enc_loss_bbox: 0.1686 enc_loss_iou: 0.2622 dn_loss_cls: 0.1356 dn_loss_bbox: 0.1942 dn_loss_iou: 0.2331 d0.dn_loss_cls: 0.2261 d0.dn_loss_bbox: 0.3527 d0.dn_loss_iou: 0.3852 d1.dn_loss_cls: 0.1668 d1.dn_loss_bbox: 0.2332 d1.dn_loss_iou: 0.2678 d2.dn_loss_cls: 0.1505 d2.dn_loss_bbox: 0.2068 d2.dn_loss_iou: 0.2440 d3.dn_loss_cls: 0.1412 d3.dn_loss_bbox: 0.1976 d3.dn_loss_iou: 0.2365 d4.dn_loss_cls: 0.1379 d4.dn_loss_bbox: 0.1943 d4.dn_loss_iou: 0.2332 d1.loss_lmm_region: 0.1586 loss_lmm_image: 0.7982 2024/11/12 13:07:37 - mmengine - INFO - Iter(train) [ 88100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:27:37 time: 2.0340 data_time: 0.0184 memory: 34888 grad_norm: 25.5386 loss: 8.8904 loss_cls: 0.2521 loss_bbox: 0.1379 loss_iou: 0.2496 d0.loss_cls: 0.2965 d0.loss_bbox: 0.1523 d0.loss_iou: 0.2639 d1.loss_cls: 0.2707 d1.loss_bbox: 0.1420 d1.loss_iou: 0.2571 d2.loss_cls: 0.2667 d2.loss_bbox: 0.1357 d2.loss_iou: 0.2495 d3.loss_cls: 0.2602 d3.loss_bbox: 0.1363 d3.loss_iou: 0.2503 d4.loss_cls: 0.2524 d4.loss_bbox: 0.1377 d4.loss_iou: 0.2502 enc_loss_cls: 0.3070 enc_loss_bbox: 0.1641 enc_loss_iou: 0.2818 dn_loss_cls: 0.0792 dn_loss_bbox: 0.1628 dn_loss_iou: 0.2158 d0.dn_loss_cls: 0.1618 d0.dn_loss_bbox: 0.3168 d0.dn_loss_iou: 0.3578 d1.dn_loss_cls: 0.1140 d1.dn_loss_bbox: 0.1945 d1.dn_loss_iou: 0.2456 d2.dn_loss_cls: 0.0922 d2.dn_loss_bbox: 0.1717 d2.dn_loss_iou: 0.2242 d3.dn_loss_cls: 0.0862 d3.dn_loss_bbox: 0.1642 d3.dn_loss_iou: 0.2176 d4.dn_loss_cls: 0.0808 d4.dn_loss_bbox: 0.1629 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1213 loss_lmm_image: 0.7911 2024/11/12 13:10:57 - mmengine - INFO - Iter(train) [ 88200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:24:17 time: 2.0026 data_time: 0.0183 memory: 33172 grad_norm: 26.7843 loss: 8.4783 loss_cls: 0.2405 loss_bbox: 0.1165 loss_iou: 0.2431 d0.loss_cls: 0.2772 d0.loss_bbox: 0.1270 d0.loss_iou: 0.2562 d1.loss_cls: 0.2577 d1.loss_bbox: 0.1214 d1.loss_iou: 0.2481 d2.loss_cls: 0.2502 d2.loss_bbox: 0.1140 d2.loss_iou: 0.2414 d3.loss_cls: 0.2486 d3.loss_bbox: 0.1145 d3.loss_iou: 0.2385 d4.loss_cls: 0.2414 d4.loss_bbox: 0.1168 d4.loss_iou: 0.2440 enc_loss_cls: 0.2878 enc_loss_bbox: 0.1404 enc_loss_iou: 0.2785 dn_loss_cls: 0.0860 dn_loss_bbox: 0.1438 dn_loss_iou: 0.2094 d0.dn_loss_cls: 0.1636 d0.dn_loss_bbox: 0.2804 d0.dn_loss_iou: 0.3459 d1.dn_loss_cls: 0.1195 d1.dn_loss_bbox: 0.1776 d1.dn_loss_iou: 0.2407 d2.dn_loss_cls: 0.0968 d2.dn_loss_bbox: 0.1519 d2.dn_loss_iou: 0.2179 d3.dn_loss_cls: 0.0889 d3.dn_loss_bbox: 0.1457 d3.dn_loss_iou: 0.2113 d4.dn_loss_cls: 0.0856 d4.dn_loss_bbox: 0.1438 d4.dn_loss_iou: 0.2095 d1.loss_lmm_region: 0.1157 loss_lmm_image: 
0.8401 2024/11/12 13:14:16 - mmengine - INFO - Iter(train) [ 88300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:20:55 time: 2.0053 data_time: 0.0183 memory: 34613 grad_norm: 24.8169 loss: 8.5876 loss_cls: 0.2992 loss_bbox: 0.1293 loss_iou: 0.2196 d0.loss_cls: 0.3433 d0.loss_bbox: 0.1412 d0.loss_iou: 0.2334 d1.loss_cls: 0.3192 d1.loss_bbox: 0.1336 d1.loss_iou: 0.2251 d2.loss_cls: 0.3106 d2.loss_bbox: 0.1296 d2.loss_iou: 0.2244 d3.loss_cls: 0.3024 d3.loss_bbox: 0.1287 d3.loss_iou: 0.2212 d4.loss_cls: 0.3006 d4.loss_bbox: 0.1277 d4.loss_iou: 0.2182 enc_loss_cls: 0.3476 enc_loss_bbox: 0.1576 enc_loss_iou: 0.2593 dn_loss_cls: 0.0751 dn_loss_bbox: 0.1399 dn_loss_iou: 0.1793 d0.dn_loss_cls: 0.1451 d0.dn_loss_bbox: 0.2876 d0.dn_loss_iou: 0.3156 d1.dn_loss_cls: 0.1028 d1.dn_loss_bbox: 0.1816 d1.dn_loss_iou: 0.2137 d2.dn_loss_cls: 0.0865 d2.dn_loss_bbox: 0.1539 d2.dn_loss_iou: 0.1905 d3.dn_loss_cls: 0.0788 d3.dn_loss_bbox: 0.1426 d3.dn_loss_iou: 0.1825 d4.dn_loss_cls: 0.0748 d4.dn_loss_bbox: 0.1400 d4.dn_loss_iou: 0.1795 d1.loss_lmm_region: 0.1213 loss_lmm_image: 0.8249 2024/11/12 13:17:33 - mmengine - INFO - Iter(train) [ 88400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:17:33 time: 1.9568 data_time: 0.0183 memory: 33943 grad_norm: 25.4264 loss: 9.0582 loss_cls: 0.2972 loss_bbox: 0.1355 loss_iou: 0.2291 d0.loss_cls: 0.3403 d0.loss_bbox: 0.1449 d0.loss_iou: 0.2390 d1.loss_cls: 0.3192 d1.loss_bbox: 0.1372 d1.loss_iou: 0.2305 d2.loss_cls: 0.3071 d2.loss_bbox: 0.1362 d2.loss_iou: 0.2276 d3.loss_cls: 0.2997 d3.loss_bbox: 0.1344 d3.loss_iou: 0.2291 d4.loss_cls: 0.2980 d4.loss_bbox: 0.1351 d4.loss_iou: 0.2282 enc_loss_cls: 0.3473 enc_loss_bbox: 0.1603 enc_loss_iou: 0.2610 dn_loss_cls: 0.0863 dn_loss_bbox: 0.1587 dn_loss_iou: 0.2041 d0.dn_loss_cls: 0.1648 d0.dn_loss_bbox: 0.3041 d0.dn_loss_iou: 0.3399 d1.dn_loss_cls: 0.1176 d1.dn_loss_bbox: 0.1925 d1.dn_loss_iou: 0.2337 d2.dn_loss_cls: 0.0988 d2.dn_loss_bbox: 0.1729 d2.dn_loss_iou: 0.2144 d3.dn_loss_cls: 0.0915 d3.dn_loss_bbox: 0.1610 d3.dn_loss_iou: 0.2064 d4.dn_loss_cls: 0.0863 d4.dn_loss_bbox: 0.1588 d4.dn_loss_iou: 0.2041 d1.loss_lmm_region: 0.1396 loss_lmm_image: 0.8858 2024/11/12 13:20:55 - mmengine - INFO - Iter(train) [ 88500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:14:13 time: 2.0208 data_time: 0.0185 memory: 34582 grad_norm: 37.0764 loss: 9.1014 loss_cls: 0.2716 loss_bbox: 0.1444 loss_iou: 0.2662 d0.loss_cls: 0.3124 d0.loss_bbox: 0.1609 d0.loss_iou: 0.2838 d1.loss_cls: 0.2925 d1.loss_bbox: 0.1496 d1.loss_iou: 0.2692 d2.loss_cls: 0.2839 d2.loss_bbox: 0.1452 d2.loss_iou: 0.2632 d3.loss_cls: 0.2797 d3.loss_bbox: 0.1426 d3.loss_iou: 0.2616 d4.loss_cls: 0.2702 d4.loss_bbox: 0.1503 d4.loss_iou: 0.2675 enc_loss_cls: 0.3136 enc_loss_bbox: 0.1713 enc_loss_iou: 0.3051 dn_loss_cls: 0.0864 dn_loss_bbox: 0.1491 dn_loss_iou: 0.2112 d0.dn_loss_cls: 0.1563 d0.dn_loss_bbox: 0.2876 d0.dn_loss_iou: 0.3457 d1.dn_loss_cls: 0.1127 d1.dn_loss_bbox: 0.1829 d1.dn_loss_iou: 0.2401 d2.dn_loss_cls: 0.0963 d2.dn_loss_bbox: 0.1581 d2.dn_loss_iou: 0.2197 d3.dn_loss_cls: 0.0902 d3.dn_loss_bbox: 0.1517 d3.dn_loss_iou: 0.2132 d4.dn_loss_cls: 0.0873 d4.dn_loss_bbox: 0.1491 d4.dn_loss_iou: 0.2113 d1.loss_lmm_region: 0.1301 loss_lmm_image: 0.8175 2024/11/12 13:24:16 - mmengine - INFO - Iter(train) [ 88600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:10:53 time: 2.0327 data_time: 0.0184 memory: 34929 grad_norm: 24.8347 loss: 8.8722 loss_cls: 0.2626 loss_bbox: 0.1287 loss_iou: 0.2153 d0.loss_cls: 0.3010 d0.loss_bbox: 0.1420 
d0.loss_iou: 0.2288 d1.loss_cls: 0.2789 d1.loss_bbox: 0.1358 d1.loss_iou: 0.2212 d2.loss_cls: 0.2723 d2.loss_bbox: 0.1332 d2.loss_iou: 0.2198 d3.loss_cls: 0.2668 d3.loss_bbox: 0.1303 d3.loss_iou: 0.2169 d4.loss_cls: 0.2622 d4.loss_bbox: 0.1302 d4.loss_iou: 0.2164 enc_loss_cls: 0.3116 enc_loss_bbox: 0.1514 enc_loss_iou: 0.2508 dn_loss_cls: 0.1008 dn_loss_bbox: 0.1719 dn_loss_iou: 0.2139 d0.dn_loss_cls: 0.1767 d0.dn_loss_bbox: 0.3172 d0.dn_loss_iou: 0.3608 d1.dn_loss_cls: 0.1277 d1.dn_loss_bbox: 0.2025 d1.dn_loss_iou: 0.2456 d2.dn_loss_cls: 0.1108 d2.dn_loss_bbox: 0.1820 d2.dn_loss_iou: 0.2240 d3.dn_loss_cls: 0.1043 d3.dn_loss_bbox: 0.1747 d3.dn_loss_iou: 0.2164 d4.dn_loss_cls: 0.1023 d4.dn_loss_bbox: 0.1719 d4.dn_loss_iou: 0.2139 d1.loss_lmm_region: 0.1204 loss_lmm_image: 0.8584 2024/11/12 13:27:35 - mmengine - INFO - Iter(train) [ 88700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:07:32 time: 2.0023 data_time: 0.0183 memory: 33412 grad_norm: 24.6657 loss: 8.8044 loss_cls: 0.2312 loss_bbox: 0.1398 loss_iou: 0.2286 d0.loss_cls: 0.2712 d0.loss_bbox: 0.1505 d0.loss_iou: 0.2386 d1.loss_cls: 0.2470 d1.loss_bbox: 0.1460 d1.loss_iou: 0.2355 d2.loss_cls: 0.2392 d2.loss_bbox: 0.1413 d2.loss_iou: 0.2323 d3.loss_cls: 0.2340 d3.loss_bbox: 0.1413 d3.loss_iou: 0.2309 d4.loss_cls: 0.2345 d4.loss_bbox: 0.1396 d4.loss_iou: 0.2282 enc_loss_cls: 0.2791 enc_loss_bbox: 0.1603 enc_loss_iou: 0.2561 dn_loss_cls: 0.0995 dn_loss_bbox: 0.1758 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.1658 d0.dn_loss_bbox: 0.3158 d0.dn_loss_iou: 0.3673 d1.dn_loss_cls: 0.1222 d1.dn_loss_bbox: 0.2084 d1.dn_loss_iou: 0.2581 d2.dn_loss_cls: 0.1074 d2.dn_loss_bbox: 0.1867 d2.dn_loss_iou: 0.2363 d3.dn_loss_cls: 0.1024 d3.dn_loss_bbox: 0.1783 d3.dn_loss_iou: 0.2275 d4.dn_loss_cls: 0.1005 d4.dn_loss_bbox: 0.1758 d4.dn_loss_iou: 0.2249 d1.loss_lmm_region: 0.1295 loss_lmm_image: 0.7921 2024/11/12 13:30:51 - mmengine - INFO - Iter(train) [ 88800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:04:08 time: 1.9241 data_time: 0.0182 memory: 33103 grad_norm: 30.1888 loss: 9.9636 loss_cls: 0.2710 loss_bbox: 0.1619 loss_iou: 0.2655 d0.loss_cls: 0.3135 d0.loss_bbox: 0.1689 d0.loss_iou: 0.2752 d1.loss_cls: 0.2926 d1.loss_bbox: 0.1651 d1.loss_iou: 0.2715 d2.loss_cls: 0.2846 d2.loss_bbox: 0.1614 d2.loss_iou: 0.2679 d3.loss_cls: 0.2760 d3.loss_bbox: 0.1626 d3.loss_iou: 0.2668 d4.loss_cls: 0.2726 d4.loss_bbox: 0.1631 d4.loss_iou: 0.2659 enc_loss_cls: 0.3254 enc_loss_bbox: 0.1840 enc_loss_iou: 0.2938 dn_loss_cls: 0.1125 dn_loss_bbox: 0.2080 dn_loss_iou: 0.2470 d0.dn_loss_cls: 0.1801 d0.dn_loss_bbox: 0.3643 d0.dn_loss_iou: 0.3951 d1.dn_loss_cls: 0.1331 d1.dn_loss_bbox: 0.2419 d1.dn_loss_iou: 0.2780 d2.dn_loss_cls: 0.1173 d2.dn_loss_bbox: 0.2197 d2.dn_loss_iou: 0.2571 d3.dn_loss_cls: 0.1155 d3.dn_loss_bbox: 0.2099 d3.dn_loss_iou: 0.2499 d4.dn_loss_cls: 0.1129 d4.dn_loss_bbox: 0.2080 d4.dn_loss_iou: 0.2470 d1.loss_lmm_region: 0.1228 loss_lmm_image: 0.8341 2024/11/12 13:34:11 - mmengine - INFO - Iter(train) [ 88900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 10:00:47 time: 1.9930 data_time: 0.0183 memory: 33094 grad_norm: 30.5775 loss: 9.0198 loss_cls: 0.2674 loss_bbox: 0.1378 loss_iou: 0.2384 d0.loss_cls: 0.3050 d0.loss_bbox: 0.1449 d0.loss_iou: 0.2502 d1.loss_cls: 0.2840 d1.loss_bbox: 0.1362 d1.loss_iou: 0.2398 d2.loss_cls: 0.2791 d2.loss_bbox: 0.1319 d2.loss_iou: 0.2372 d3.loss_cls: 0.2707 d3.loss_bbox: 0.1343 d3.loss_iou: 0.2387 d4.loss_cls: 0.2707 d4.loss_bbox: 0.1366 d4.loss_iou: 0.2380 enc_loss_cls: 0.3149 enc_loss_bbox: 
0.1573 enc_loss_iou: 0.2738 dn_loss_cls: 0.0919 dn_loss_bbox: 0.1652 dn_loss_iou: 0.2094 d0.dn_loss_cls: 0.1672 d0.dn_loss_bbox: 0.3446 d0.dn_loss_iou: 0.3619 d1.dn_loss_cls: 0.1202 d1.dn_loss_bbox: 0.2051 d1.dn_loss_iou: 0.2433 d2.dn_loss_cls: 0.1041 d2.dn_loss_bbox: 0.1782 d2.dn_loss_iou: 0.2201 d3.dn_loss_cls: 0.0947 d3.dn_loss_bbox: 0.1680 d3.dn_loss_iou: 0.2123 d4.dn_loss_cls: 0.0917 d4.dn_loss_bbox: 0.1653 d4.dn_loss_iou: 0.2094 d1.loss_lmm_region: 0.1062 loss_lmm_image: 0.8741 2024/11/12 13:37:29 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 13:37:29 - mmengine - INFO - Iter(train) [ 89000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:57:26 time: 2.0006 data_time: 0.0184 memory: 33870 grad_norm: 33.5251 loss: 9.6172 loss_cls: 0.2962 loss_bbox: 0.1345 loss_iou: 0.2402 d0.loss_cls: 0.3479 d0.loss_bbox: 0.1463 d0.loss_iou: 0.2517 d1.loss_cls: 0.3183 d1.loss_bbox: 0.1373 d1.loss_iou: 0.2449 d2.loss_cls: 0.3078 d2.loss_bbox: 0.1330 d2.loss_iou: 0.2402 d3.loss_cls: 0.3008 d3.loss_bbox: 0.1349 d3.loss_iou: 0.2390 d4.loss_cls: 0.2976 d4.loss_bbox: 0.1354 d4.loss_iou: 0.2403 enc_loss_cls: 0.3468 enc_loss_bbox: 0.1625 enc_loss_iou: 0.2759 dn_loss_cls: 0.1224 dn_loss_bbox: 0.1829 dn_loss_iou: 0.2276 d0.dn_loss_cls: 0.1992 d0.dn_loss_bbox: 0.3485 d0.dn_loss_iou: 0.3757 d1.dn_loss_cls: 0.1545 d1.dn_loss_bbox: 0.2191 d1.dn_loss_iou: 0.2597 d2.dn_loss_cls: 0.1377 d2.dn_loss_bbox: 0.1932 d2.dn_loss_iou: 0.2372 d3.dn_loss_cls: 0.1291 d3.dn_loss_bbox: 0.1847 d3.dn_loss_iou: 0.2298 d4.dn_loss_cls: 0.1213 d4.dn_loss_bbox: 0.1830 d4.dn_loss_iou: 0.2276 d1.loss_lmm_region: 0.1214 loss_lmm_image: 0.8313 2024/11/12 13:40:49 - mmengine - INFO - Iter(train) [ 89100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:54:05 time: 1.9916 data_time: 0.0182 memory: 35122 grad_norm: 30.4225 loss: 8.8593 loss_cls: 0.2665 loss_bbox: 0.1248 loss_iou: 0.2165 d0.loss_cls: 0.3181 d0.loss_bbox: 0.1317 d0.loss_iou: 0.2275 d1.loss_cls: 0.2906 d1.loss_bbox: 0.1231 d1.loss_iou: 0.2150 d2.loss_cls: 0.2773 d2.loss_bbox: 0.1232 d2.loss_iou: 0.2150 d3.loss_cls: 0.2721 d3.loss_bbox: 0.1206 d3.loss_iou: 0.2135 d4.loss_cls: 0.2675 d4.loss_bbox: 0.1250 d4.loss_iou: 0.2165 enc_loss_cls: 0.3153 enc_loss_bbox: 0.1476 enc_loss_iou: 0.2540 dn_loss_cls: 0.0906 dn_loss_bbox: 0.1708 dn_loss_iou: 0.2122 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.3388 d0.dn_loss_iou: 0.3644 d1.dn_loss_cls: 0.1247 d1.dn_loss_bbox: 0.2156 d1.dn_loss_iou: 0.2492 d2.dn_loss_cls: 0.1046 d2.dn_loss_bbox: 0.1844 d2.dn_loss_iou: 0.2244 d3.dn_loss_cls: 0.0952 d3.dn_loss_bbox: 0.1743 d3.dn_loss_iou: 0.2154 d4.dn_loss_cls: 0.0914 d4.dn_loss_bbox: 0.1709 d4.dn_loss_iou: 0.2121 d1.loss_lmm_region: 0.1152 loss_lmm_image: 0.8609 2024/11/12 13:44:08 - mmengine - INFO - Iter(train) [ 89200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:50:43 time: 1.9657 data_time: 0.0185 memory: 33615 grad_norm: 27.2027 loss: 9.0168 loss_cls: 0.2804 loss_bbox: 0.1267 loss_iou: 0.2147 d0.loss_cls: 0.3192 d0.loss_bbox: 0.1414 d0.loss_iou: 0.2262 d1.loss_cls: 0.3030 d1.loss_bbox: 0.1329 d1.loss_iou: 0.2185 d2.loss_cls: 0.2888 d2.loss_bbox: 0.1272 d2.loss_iou: 0.2157 d3.loss_cls: 0.2833 d3.loss_bbox: 0.1285 d3.loss_iou: 0.2159 d4.loss_cls: 0.2819 d4.loss_bbox: 0.1250 d4.loss_iou: 0.2143 enc_loss_cls: 0.3208 enc_loss_bbox: 0.1641 enc_loss_iou: 0.2528 dn_loss_cls: 0.0921 dn_loss_bbox: 0.1860 dn_loss_iou: 0.2101 d0.dn_loss_cls: 0.1729 d0.dn_loss_bbox: 0.3305 d0.dn_loss_iou: 0.3453 d1.dn_loss_cls: 0.1214 d1.dn_loss_bbox: 0.2161 
d1.dn_loss_iou: 0.2410 d2.dn_loss_cls: 0.1030 d2.dn_loss_bbox: 0.1951 d2.dn_loss_iou: 0.2193 d3.dn_loss_cls: 0.0957 d3.dn_loss_bbox: 0.1876 d3.dn_loss_iou: 0.2124 d4.dn_loss_cls: 0.0938 d4.dn_loss_bbox: 0.1860 d4.dn_loss_iou: 0.2101 d1.loss_lmm_region: 0.1474 loss_lmm_image: 0.8700 2024/11/12 13:47:27 - mmengine - INFO - Iter(train) [ 89300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:47:22 time: 2.0131 data_time: 0.0184 memory: 34503 grad_norm: 27.4777 loss: 9.4134 loss_cls: 0.2583 loss_bbox: 0.1517 loss_iou: 0.2692 d0.loss_cls: 0.3008 d0.loss_bbox: 0.1619 d0.loss_iou: 0.2812 d1.loss_cls: 0.2806 d1.loss_bbox: 0.1558 d1.loss_iou: 0.2743 d2.loss_cls: 0.2680 d2.loss_bbox: 0.1549 d2.loss_iou: 0.2728 d3.loss_cls: 0.2658 d3.loss_bbox: 0.1520 d3.loss_iou: 0.2682 d4.loss_cls: 0.2612 d4.loss_bbox: 0.1499 d4.loss_iou: 0.2677 enc_loss_cls: 0.3116 enc_loss_bbox: 0.1741 enc_loss_iou: 0.2989 dn_loss_cls: 0.1045 dn_loss_bbox: 0.1624 dn_loss_iou: 0.2198 d0.dn_loss_cls: 0.1850 d0.dn_loss_bbox: 0.3178 d0.dn_loss_iou: 0.3622 d1.dn_loss_cls: 0.1375 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2513 d2.dn_loss_cls: 0.1189 d2.dn_loss_bbox: 0.1741 d2.dn_loss_iou: 0.2300 d3.dn_loss_cls: 0.1123 d3.dn_loss_bbox: 0.1664 d3.dn_loss_iou: 0.2231 d4.dn_loss_cls: 0.1049 d4.dn_loss_bbox: 0.1624 d4.dn_loss_iou: 0.2198 d1.loss_lmm_region: 0.1360 loss_lmm_image: 0.8487 2024/11/12 13:50:46 - mmengine - INFO - Iter(train) [ 89400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:44:01 time: 1.9852 data_time: 0.0183 memory: 33740 grad_norm: 30.1984 loss: 10.1424 loss_cls: 0.2941 loss_bbox: 0.1534 loss_iou: 0.2867 d0.loss_cls: 0.3322 d0.loss_bbox: 0.1700 d0.loss_iou: 0.3023 d1.loss_cls: 0.3071 d1.loss_bbox: 0.1647 d1.loss_iou: 0.2958 d2.loss_cls: 0.3016 d2.loss_bbox: 0.1556 d2.loss_iou: 0.2887 d3.loss_cls: 0.2951 d3.loss_bbox: 0.1554 d3.loss_iou: 0.2871 d4.loss_cls: 0.2942 d4.loss_bbox: 0.1552 d4.loss_iou: 0.2871 enc_loss_cls: 0.3325 enc_loss_bbox: 0.1818 enc_loss_iou: 0.3270 dn_loss_cls: 0.1093 dn_loss_bbox: 0.2040 dn_loss_iou: 0.2338 d0.dn_loss_cls: 0.1984 d0.dn_loss_bbox: 0.3605 d0.dn_loss_iou: 0.3762 d1.dn_loss_cls: 0.1435 d1.dn_loss_bbox: 0.2336 d1.dn_loss_iou: 0.2642 d2.dn_loss_cls: 0.1222 d2.dn_loss_bbox: 0.2104 d2.dn_loss_iou: 0.2427 d3.dn_loss_cls: 0.1140 d3.dn_loss_bbox: 0.2056 d3.dn_loss_iou: 0.2364 d4.dn_loss_cls: 0.1106 d4.dn_loss_bbox: 0.2040 d4.dn_loss_iou: 0.2340 d1.loss_lmm_region: 0.1467 loss_lmm_image: 0.8245 2024/11/12 13:54:07 - mmengine - INFO - Iter(train) [ 89500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:40:41 time: 2.0016 data_time: 0.0184 memory: 35152 grad_norm: 27.8651 loss: 8.4563 loss_cls: 0.2523 loss_bbox: 0.1222 loss_iou: 0.2009 d0.loss_cls: 0.2830 d0.loss_bbox: 0.1338 d0.loss_iou: 0.2147 d1.loss_cls: 0.2676 d1.loss_bbox: 0.1245 d1.loss_iou: 0.2032 d2.loss_cls: 0.2623 d2.loss_bbox: 0.1211 d2.loss_iou: 0.2012 d3.loss_cls: 0.2540 d3.loss_bbox: 0.1232 d3.loss_iou: 0.2029 d4.loss_cls: 0.2520 d4.loss_bbox: 0.1220 d4.loss_iou: 0.2004 enc_loss_cls: 0.2920 enc_loss_bbox: 0.1430 enc_loss_iou: 0.2316 dn_loss_cls: 0.0890 dn_loss_bbox: 0.1690 dn_loss_iou: 0.1963 d0.dn_loss_cls: 0.1733 d0.dn_loss_bbox: 0.3366 d0.dn_loss_iou: 0.3496 d1.dn_loss_cls: 0.1251 d1.dn_loss_bbox: 0.2055 d1.dn_loss_iou: 0.2290 d2.dn_loss_cls: 0.1039 d2.dn_loss_bbox: 0.1798 d2.dn_loss_iou: 0.2058 d3.dn_loss_cls: 0.0950 d3.dn_loss_bbox: 0.1706 d3.dn_loss_iou: 0.1988 d4.dn_loss_cls: 0.0900 d4.dn_loss_bbox: 0.1690 d4.dn_loss_iou: 0.1964 d1.loss_lmm_region: 0.1409 loss_lmm_image: 0.8252 2024/11/12 13:57:25 - 
mmengine - INFO - Iter(train) [ 89600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:37:19 time: 1.9782 data_time: 0.0184 memory: 34893 grad_norm: 30.8022 loss: 10.9796 loss_cls: 0.3445 loss_bbox: 0.1920 loss_iou: 0.3262 d0.loss_cls: 0.3897 d0.loss_bbox: 0.2083 d0.loss_iou: 0.3482 d1.loss_cls: 0.3682 d1.loss_bbox: 0.1965 d1.loss_iou: 0.3348 d2.loss_cls: 0.3548 d2.loss_bbox: 0.1945 d2.loss_iou: 0.3315 d3.loss_cls: 0.3519 d3.loss_bbox: 0.1901 d3.loss_iou: 0.3241 d4.loss_cls: 0.3410 d4.loss_bbox: 0.1949 d4.loss_iou: 0.3273 enc_loss_cls: 0.3979 enc_loss_bbox: 0.2193 enc_loss_iou: 0.3700 dn_loss_cls: 0.0824 dn_loss_bbox: 0.2037 dn_loss_iou: 0.2495 d0.dn_loss_cls: 0.1653 d0.dn_loss_bbox: 0.3607 d0.dn_loss_iou: 0.3984 d1.dn_loss_cls: 0.1145 d1.dn_loss_bbox: 0.2358 d1.dn_loss_iou: 0.2795 d2.dn_loss_cls: 0.0965 d2.dn_loss_bbox: 0.2123 d2.dn_loss_iou: 0.2579 d3.dn_loss_cls: 0.0876 d3.dn_loss_bbox: 0.2055 d3.dn_loss_iou: 0.2507 d4.dn_loss_cls: 0.0832 d4.dn_loss_bbox: 0.2037 d4.dn_loss_iou: 0.2494 d1.loss_lmm_region: 0.1288 loss_lmm_image: 0.8087 2024/11/12 14:00:46 - mmengine - INFO - Iter(train) [ 89700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:33:59 time: 2.0140 data_time: 0.0184 memory: 33700 grad_norm: 35.3115 loss: 8.9843 loss_cls: 0.2987 loss_bbox: 0.1148 loss_iou: 0.2408 d0.loss_cls: 0.3404 d0.loss_bbox: 0.1337 d0.loss_iou: 0.2629 d1.loss_cls: 0.3182 d1.loss_bbox: 0.1237 d1.loss_iou: 0.2527 d2.loss_cls: 0.3099 d2.loss_bbox: 0.1159 d2.loss_iou: 0.2459 d3.loss_cls: 0.3041 d3.loss_bbox: 0.1139 d3.loss_iou: 0.2418 d4.loss_cls: 0.2978 d4.loss_bbox: 0.1150 d4.loss_iou: 0.2415 enc_loss_cls: 0.3450 enc_loss_bbox: 0.1503 enc_loss_iou: 0.2871 dn_loss_cls: 0.0929 dn_loss_bbox: 0.1508 dn_loss_iou: 0.2017 d0.dn_loss_cls: 0.1725 d0.dn_loss_bbox: 0.2928 d0.dn_loss_iou: 0.3471 d1.dn_loss_cls: 0.1225 d1.dn_loss_bbox: 0.1785 d1.dn_loss_iou: 0.2319 d2.dn_loss_cls: 0.1036 d2.dn_loss_bbox: 0.1574 d2.dn_loss_iou: 0.2091 d3.dn_loss_cls: 0.0973 d3.dn_loss_bbox: 0.1521 d3.dn_loss_iou: 0.2032 d4.dn_loss_cls: 0.0931 d4.dn_loss_bbox: 0.1508 d4.dn_loss_iou: 0.2016 d1.loss_lmm_region: 0.1200 loss_lmm_image: 0.8512 2024/11/12 14:04:06 - mmengine - INFO - Iter(train) [ 89800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:30:38 time: 1.9939 data_time: 0.0184 memory: 34665 grad_norm: 33.7823 loss: 9.6062 loss_cls: 0.2542 loss_bbox: 0.1533 loss_iou: 0.2863 d0.loss_cls: 0.3060 d0.loss_bbox: 0.1654 d0.loss_iou: 0.2973 d1.loss_cls: 0.2809 d1.loss_bbox: 0.1579 d1.loss_iou: 0.2890 d2.loss_cls: 0.2649 d2.loss_bbox: 0.1547 d2.loss_iou: 0.2864 d3.loss_cls: 0.2580 d3.loss_bbox: 0.1560 d3.loss_iou: 0.2842 d4.loss_cls: 0.2549 d4.loss_bbox: 0.1550 d4.loss_iou: 0.2835 enc_loss_cls: 0.3090 enc_loss_bbox: 0.1801 enc_loss_iou: 0.3157 dn_loss_cls: 0.1001 dn_loss_bbox: 0.1750 dn_loss_iou: 0.2296 d0.dn_loss_cls: 0.1826 d0.dn_loss_bbox: 0.3343 d0.dn_loss_iou: 0.3700 d1.dn_loss_cls: 0.1273 d1.dn_loss_bbox: 0.2040 d1.dn_loss_iou: 0.2563 d2.dn_loss_cls: 0.1114 d2.dn_loss_bbox: 0.1843 d2.dn_loss_iou: 0.2379 d3.dn_loss_cls: 0.1041 d3.dn_loss_bbox: 0.1787 d3.dn_loss_iou: 0.2319 d4.dn_loss_cls: 0.1006 d4.dn_loss_bbox: 0.1750 d4.dn_loss_iou: 0.2294 d1.loss_lmm_region: 0.1243 loss_lmm_image: 0.8569 2024/11/12 14:07:27 - mmengine - INFO - Iter(train) [ 89900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:27:18 time: 2.0189 data_time: 0.0185 memory: 35207 grad_norm: 29.1707 loss: 10.2659 loss_cls: 0.3301 loss_bbox: 0.1578 loss_iou: 0.3016 d0.loss_cls: 0.3865 d0.loss_bbox: 0.1667 d0.loss_iou: 0.3179 d1.loss_cls: 
0.3625 d1.loss_bbox: 0.1582 d1.loss_iou: 0.3041 d2.loss_cls: 0.3459 d2.loss_bbox: 0.1604 d2.loss_iou: 0.3019 d3.loss_cls: 0.3367 d3.loss_bbox: 0.1565 d3.loss_iou: 0.2989 d4.loss_cls: 0.3325 d4.loss_bbox: 0.1550 d4.loss_iou: 0.3004 enc_loss_cls: 0.3860 enc_loss_bbox: 0.1886 enc_loss_iou: 0.3464 dn_loss_cls: 0.0898 dn_loss_bbox: 0.1667 dn_loss_iou: 0.2483 d0.dn_loss_cls: 0.1764 d0.dn_loss_bbox: 0.2938 d0.dn_loss_iou: 0.3884 d1.dn_loss_cls: 0.1307 d1.dn_loss_bbox: 0.1925 d1.dn_loss_iou: 0.2781 d2.dn_loss_cls: 0.1068 d2.dn_loss_bbox: 0.1743 d2.dn_loss_iou: 0.2579 d3.dn_loss_cls: 0.0960 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2507 d4.dn_loss_cls: 0.0903 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2483 d1.loss_lmm_region: 0.1244 loss_lmm_image: 0.8227 2024/11/12 14:10:48 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 14:10:48 - mmengine - INFO - Iter(train) [ 90000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:23:58 time: 2.0303 data_time: 0.0184 memory: 32882 grad_norm: 25.7595 loss: 9.0383 loss_cls: 0.2696 loss_bbox: 0.1453 loss_iou: 0.2334 d0.loss_cls: 0.3160 d0.loss_bbox: 0.1579 d0.loss_iou: 0.2484 d1.loss_cls: 0.2905 d1.loss_bbox: 0.1465 d1.loss_iou: 0.2395 d2.loss_cls: 0.2839 d2.loss_bbox: 0.1410 d2.loss_iou: 0.2344 d3.loss_cls: 0.2748 d3.loss_bbox: 0.1447 d3.loss_iou: 0.2360 d4.loss_cls: 0.2710 d4.loss_bbox: 0.1448 d4.loss_iou: 0.2323 enc_loss_cls: 0.3221 enc_loss_bbox: 0.1605 enc_loss_iou: 0.2616 dn_loss_cls: 0.0938 dn_loss_bbox: 0.1750 dn_loss_iou: 0.2093 d0.dn_loss_cls: 0.1731 d0.dn_loss_bbox: 0.3000 d0.dn_loss_iou: 0.3406 d1.dn_loss_cls: 0.1259 d1.dn_loss_bbox: 0.1996 d1.dn_loss_iou: 0.2376 d2.dn_loss_cls: 0.1070 d2.dn_loss_bbox: 0.1829 d2.dn_loss_iou: 0.2185 d3.dn_loss_cls: 0.1000 d3.dn_loss_bbox: 0.1754 d3.dn_loss_iou: 0.2115 d4.dn_loss_cls: 0.0951 d4.dn_loss_bbox: 0.1751 d4.dn_loss_iou: 0.2093 d1.loss_lmm_region: 0.1200 loss_lmm_image: 0.8345 2024/11/12 14:10:48 - mmengine - INFO - Saving checkpoint at 90000 iterations 2024/11/12 14:21:25 - mmengine - INFO - Iter(val) [100/602] eta: 0:48:39 time: 5.9567 data_time: 0.0017 memory: 7497 2024/11/12 14:31:39 - mmengine - INFO - Iter(val) [200/602] eta: 0:40:03 time: 6.2762 data_time: 0.0018 memory: 7528 2024/11/12 14:42:54 - mmengine - INFO - Iter(val) [300/602] eta: 0:31:22 time: 6.8308 data_time: 0.0018 memory: 7506 2024/11/12 14:55:03 - mmengine - INFO - Iter(val) [400/602] eta: 0:21:52 time: 7.4421 data_time: 0.0019 memory: 7528 2024/11/12 15:07:53 - mmengine - INFO - Iter(val) [500/602] eta: 0:11:27 time: 7.8647 data_time: 0.0019 memory: 7473 2024/11/12 15:21:40 - mmengine - INFO - Iter(val) [600/602] eta: 0:00:13 time: 8.3034 data_time: 0.0019 memory: 7522 2024/11/12 15:28:33 - mmengine - INFO - === 65 classes had less than 10000 detections! Outputting 10000 detections for each class will improve AP further. 
=== 2024/11/12 15:30:52 - mmengine - INFO - mAP_copypaste: {'AP': 0.47572854385242147, 'AP50': 0.586117432511868, 'AP75': 0.5033830407007226, 'APs': 0.43263302344790894, 'APm': 0.6187157222643316, 'APl': 0.6942080440369518, 'APr': 0.3749900082761499, 'APc': 0.4245109873566035, 'APf': 0.5392806119279899} 2024/11/12 15:31:16 - mmengine - INFO - Iter(val) [602/602] lvis_fixed_ap/AP: 0.4757 lvis_fixed_ap/AP50: 0.5861 lvis_fixed_ap/AP75: 0.5034 lvis_fixed_ap/APs: 0.4326 lvis_fixed_ap/APm: 0.6187 lvis_fixed_ap/APl: 0.6942 lvis_fixed_ap/APr: 0.3750 lvis_fixed_ap/APc: 0.4245 lvis_fixed_ap/APf: 0.5393 data_time: 0.0020 time: 7.0016 2024/11/12 15:34:35 - mmengine - INFO - Iter(train) [ 90100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:26:48 time: 1.9619 data_time: 0.0187 memory: 35346 grad_norm: 28.1637 loss: 10.3283 loss_cls: 0.3030 loss_bbox: 0.1588 loss_iou: 0.2846 d0.loss_cls: 0.3416 d0.loss_bbox: 0.1782 d0.loss_iou: 0.3054 d1.loss_cls: 0.3154 d1.loss_bbox: 0.1714 d1.loss_iou: 0.3000 d2.loss_cls: 0.3074 d2.loss_bbox: 0.1667 d2.loss_iou: 0.2938 d3.loss_cls: 0.3042 d3.loss_bbox: 0.1614 d3.loss_iou: 0.2884 d4.loss_cls: 0.3038 d4.loss_bbox: 0.1602 d4.loss_iou: 0.2846 enc_loss_cls: 0.3548 enc_loss_bbox: 0.1925 enc_loss_iou: 0.3269 dn_loss_cls: 0.0981 dn_loss_bbox: 0.1941 dn_loss_iou: 0.2490 d0.dn_loss_cls: 0.1896 d0.dn_loss_bbox: 0.3804 d0.dn_loss_iou: 0.4085 d1.dn_loss_cls: 0.1337 d1.dn_loss_bbox: 0.2343 d1.dn_loss_iou: 0.2816 d2.dn_loss_cls: 0.1118 d2.dn_loss_bbox: 0.2081 d2.dn_loss_iou: 0.2603 d3.dn_loss_cls: 0.1056 d3.dn_loss_bbox: 0.1967 d3.dn_loss_iou: 0.2516 d4.dn_loss_cls: 0.0994 d4.dn_loss_bbox: 0.1942 d4.dn_loss_iou: 0.2489 d1.loss_lmm_region: 0.1339 loss_lmm_image: 0.8457 2024/11/12 15:37:55 - mmengine - INFO - Iter(train) [ 90200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:23:27 time: 2.0061 data_time: 0.0184 memory: 34240 grad_norm: 41.3223 loss: 10.6730 loss_cls: 0.3795 loss_bbox: 0.1673 loss_iou: 0.3144 d0.loss_cls: 0.4250 d0.loss_bbox: 0.1851 d0.loss_iou: 0.3394 d1.loss_cls: 0.3990 d1.loss_bbox: 0.1737 d1.loss_iou: 0.3265 d2.loss_cls: 0.3868 d2.loss_bbox: 0.1713 d2.loss_iou: 0.3183 d3.loss_cls: 0.3808 d3.loss_bbox: 0.1655 d3.loss_iou: 0.3163 d4.loss_cls: 0.3806 d4.loss_bbox: 0.1648 d4.loss_iou: 0.3152 enc_loss_cls: 0.4349 enc_loss_bbox: 0.1999 enc_loss_iou: 0.3670 dn_loss_cls: 0.1097 dn_loss_bbox: 0.1447 dn_loss_iou: 0.2215 d0.dn_loss_cls: 0.1890 d0.dn_loss_bbox: 0.2817 d0.dn_loss_iou: 0.3642 d1.dn_loss_cls: 0.1425 d1.dn_loss_bbox: 0.1756 d1.dn_loss_iou: 0.2532 d2.dn_loss_cls: 0.1201 d2.dn_loss_bbox: 0.1519 d2.dn_loss_iou: 0.2292 d3.dn_loss_cls: 0.1142 d3.dn_loss_bbox: 0.1468 d3.dn_loss_iou: 0.2236 d4.dn_loss_cls: 0.1095 d4.dn_loss_bbox: 0.1447 d4.dn_loss_iou: 0.2215 d1.loss_lmm_region: 0.1606 loss_lmm_image: 0.8571 2024/11/12 15:41:15 - mmengine - INFO - Iter(train) [ 90300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:20:05 time: 2.0086 data_time: 0.0183 memory: 36110 grad_norm: 31.7077 loss: 10.7668 loss_cls: 0.3804 loss_bbox: 0.1679 loss_iou: 0.2868 d0.loss_cls: 0.4238 d0.loss_bbox: 0.1807 d0.loss_iou: 0.3066 d1.loss_cls: 0.3961 d1.loss_bbox: 0.1769 d1.loss_iou: 0.2976 d2.loss_cls: 0.3950 d2.loss_bbox: 0.1698 d2.loss_iou: 0.2893 d3.loss_cls: 0.3813 d3.loss_bbox: 0.1700 d3.loss_iou: 0.2889 d4.loss_cls: 0.3809 d4.loss_bbox: 0.1678 d4.loss_iou: 0.2880 enc_loss_cls: 0.4339 enc_loss_bbox: 0.1979 enc_loss_iou: 0.3320 dn_loss_cls: 0.1266 dn_loss_bbox: 0.1802 dn_loss_iou: 0.2147 d0.dn_loss_cls: 0.2000 d0.dn_loss_bbox: 0.3271 d0.dn_loss_iou: 0.3504 
d1.dn_loss_cls: 0.1602 d1.dn_loss_bbox: 0.2136 d1.dn_loss_iou: 0.2478 d2.dn_loss_cls: 0.1405 d2.dn_loss_bbox: 0.1906 d2.dn_loss_iou: 0.2254 d3.dn_loss_cls: 0.1296 d3.dn_loss_bbox: 0.1824 d3.dn_loss_iou: 0.2177 d4.dn_loss_cls: 0.1286 d4.dn_loss_bbox: 0.1801 d4.dn_loss_iou: 0.2148 d1.loss_lmm_region: 0.1473 loss_lmm_image: 0.8775 2024/11/12 15:44:35 - mmengine - INFO - Iter(train) [ 90400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:16:43 time: 2.0189 data_time: 0.0184 memory: 32925 grad_norm: 36.2793 loss: 8.8758 loss_cls: 0.2533 loss_bbox: 0.1326 loss_iou: 0.2403 d0.loss_cls: 0.2938 d0.loss_bbox: 0.1413 d0.loss_iou: 0.2497 d1.loss_cls: 0.2755 d1.loss_bbox: 0.1386 d1.loss_iou: 0.2462 d2.loss_cls: 0.2670 d2.loss_bbox: 0.1359 d2.loss_iou: 0.2414 d3.loss_cls: 0.2601 d3.loss_bbox: 0.1350 d3.loss_iou: 0.2400 d4.loss_cls: 0.2567 d4.loss_bbox: 0.1354 d4.loss_iou: 0.2394 enc_loss_cls: 0.3035 enc_loss_bbox: 0.1516 enc_loss_iou: 0.2720 dn_loss_cls: 0.1096 dn_loss_bbox: 0.1505 dn_loss_iou: 0.1990 d0.dn_loss_cls: 0.1830 d0.dn_loss_bbox: 0.2964 d0.dn_loss_iou: 0.3432 d1.dn_loss_cls: 0.1354 d1.dn_loss_bbox: 0.1904 d1.dn_loss_iou: 0.2344 d2.dn_loss_cls: 0.1198 d2.dn_loss_bbox: 0.1629 d2.dn_loss_iou: 0.2106 d3.dn_loss_cls: 0.1133 d3.dn_loss_bbox: 0.1530 d3.dn_loss_iou: 0.2018 d4.dn_loss_cls: 0.1088 d4.dn_loss_bbox: 0.1505 d4.dn_loss_iou: 0.1990 d1.loss_lmm_region: 0.1270 loss_lmm_image: 0.8779 2024/11/12 15:47:52 - mmengine - INFO - Iter(train) [ 90500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:13:20 time: 1.9795 data_time: 0.0183 memory: 33451 grad_norm: 33.0012 loss: 8.4781 loss_cls: 0.2380 loss_bbox: 0.1349 loss_iou: 0.2335 d0.loss_cls: 0.2783 d0.loss_bbox: 0.1445 d0.loss_iou: 0.2464 d1.loss_cls: 0.2557 d1.loss_bbox: 0.1372 d1.loss_iou: 0.2398 d2.loss_cls: 0.2461 d2.loss_bbox: 0.1355 d2.loss_iou: 0.2354 d3.loss_cls: 0.2413 d3.loss_bbox: 0.1334 d3.loss_iou: 0.2336 d4.loss_cls: 0.2380 d4.loss_bbox: 0.1352 d4.loss_iou: 0.2356 enc_loss_cls: 0.2840 enc_loss_bbox: 0.1624 enc_loss_iou: 0.2719 dn_loss_cls: 0.0751 dn_loss_bbox: 0.1536 dn_loss_iou: 0.1999 d0.dn_loss_cls: 0.1430 d0.dn_loss_bbox: 0.3061 d0.dn_loss_iou: 0.3402 d1.dn_loss_cls: 0.0990 d1.dn_loss_bbox: 0.1881 d1.dn_loss_iou: 0.2296 d2.dn_loss_cls: 0.0854 d2.dn_loss_bbox: 0.1645 d2.dn_loss_iou: 0.2088 d3.dn_loss_cls: 0.0795 d3.dn_loss_bbox: 0.1559 d3.dn_loss_iou: 0.2025 d4.dn_loss_cls: 0.0768 d4.dn_loss_bbox: 0.1535 d4.dn_loss_iou: 0.1999 d1.loss_lmm_region: 0.1194 loss_lmm_image: 0.8369 2024/11/12 15:51:11 - mmengine - INFO - Iter(train) [ 90600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:09:57 time: 1.9974 data_time: 0.0184 memory: 33555 grad_norm: 27.9306 loss: 9.8096 loss_cls: 0.2994 loss_bbox: 0.1448 loss_iou: 0.2797 d0.loss_cls: 0.3420 d0.loss_bbox: 0.1569 d0.loss_iou: 0.2945 d1.loss_cls: 0.3205 d1.loss_bbox: 0.1541 d1.loss_iou: 0.2874 d2.loss_cls: 0.3112 d2.loss_bbox: 0.1460 d2.loss_iou: 0.2825 d3.loss_cls: 0.3001 d3.loss_bbox: 0.1466 d3.loss_iou: 0.2815 d4.loss_cls: 0.2970 d4.loss_bbox: 0.1454 d4.loss_iou: 0.2803 enc_loss_cls: 0.3463 enc_loss_bbox: 0.1654 enc_loss_iou: 0.3141 dn_loss_cls: 0.1076 dn_loss_bbox: 0.1692 dn_loss_iou: 0.2389 d0.dn_loss_cls: 0.1838 d0.dn_loss_bbox: 0.3101 d0.dn_loss_iou: 0.3756 d1.dn_loss_cls: 0.1414 d1.dn_loss_bbox: 0.1966 d1.dn_loss_iou: 0.2675 d2.dn_loss_cls: 0.1214 d2.dn_loss_bbox: 0.1778 d2.dn_loss_iou: 0.2478 d3.dn_loss_cls: 0.1120 d3.dn_loss_bbox: 0.1711 d3.dn_loss_iou: 0.2412 d4.dn_loss_cls: 0.1088 d4.dn_loss_bbox: 0.1693 d4.dn_loss_iou: 0.2388 d1.loss_lmm_region: 0.1277 
loss_lmm_image: 0.8075 2024/11/12 15:54:29 - mmengine - INFO - Iter(train) [ 90700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:06:34 time: 1.9927 data_time: 0.0184 memory: 33852 grad_norm: 25.9637 loss: 9.2517 loss_cls: 0.2832 loss_bbox: 0.1331 loss_iou: 0.2360 d0.loss_cls: 0.3251 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2552 d1.loss_cls: 0.2977 d1.loss_bbox: 0.1439 d1.loss_iou: 0.2496 d2.loss_cls: 0.2953 d2.loss_bbox: 0.1352 d2.loss_iou: 0.2399 d3.loss_cls: 0.2871 d3.loss_bbox: 0.1321 d3.loss_iou: 0.2370 d4.loss_cls: 0.2877 d4.loss_bbox: 0.1323 d4.loss_iou: 0.2351 enc_loss_cls: 0.3284 enc_loss_bbox: 0.1601 enc_loss_iou: 0.2766 dn_loss_cls: 0.0943 dn_loss_bbox: 0.1840 dn_loss_iou: 0.2091 d0.dn_loss_cls: 0.1742 d0.dn_loss_bbox: 0.3522 d0.dn_loss_iou: 0.3556 d1.dn_loss_cls: 0.1242 d1.dn_loss_bbox: 0.2247 d1.dn_loss_iou: 0.2418 d2.dn_loss_cls: 0.1067 d2.dn_loss_bbox: 0.1967 d2.dn_loss_iou: 0.2197 d3.dn_loss_cls: 0.0979 d3.dn_loss_bbox: 0.1873 d3.dn_loss_iou: 0.2118 d4.dn_loss_cls: 0.0942 d4.dn_loss_bbox: 0.1841 d4.dn_loss_iou: 0.2092 d1.loss_lmm_region: 0.1264 loss_lmm_image: 0.8414 2024/11/12 15:57:47 - mmengine - INFO - Iter(train) [ 90800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 9:03:11 time: 1.9560 data_time: 0.0183 memory: 34863 grad_norm: 30.1167 loss: 9.4903 loss_cls: 0.3164 loss_bbox: 0.1383 loss_iou: 0.2594 d0.loss_cls: 0.3599 d0.loss_bbox: 0.1515 d0.loss_iou: 0.2762 d1.loss_cls: 0.3295 d1.loss_bbox: 0.1477 d1.loss_iou: 0.2683 d2.loss_cls: 0.3236 d2.loss_bbox: 0.1412 d2.loss_iou: 0.2624 d3.loss_cls: 0.3158 d3.loss_bbox: 0.1404 d3.loss_iou: 0.2619 d4.loss_cls: 0.3161 d4.loss_bbox: 0.1396 d4.loss_iou: 0.2602 enc_loss_cls: 0.3562 enc_loss_bbox: 0.1610 enc_loss_iou: 0.2933 dn_loss_cls: 0.1074 dn_loss_bbox: 0.1476 dn_loss_iou: 0.2086 d0.dn_loss_cls: 0.1902 d0.dn_loss_bbox: 0.2826 d0.dn_loss_iou: 0.3441 d1.dn_loss_cls: 0.1417 d1.dn_loss_bbox: 0.1776 d1.dn_loss_iou: 0.2379 d2.dn_loss_cls: 0.1178 d2.dn_loss_bbox: 0.1560 d2.dn_loss_iou: 0.2174 d3.dn_loss_cls: 0.1123 d3.dn_loss_bbox: 0.1505 d3.dn_loss_iou: 0.2109 d4.dn_loss_cls: 0.1076 d4.dn_loss_bbox: 0.1475 d4.dn_loss_iou: 0.2086 d1.loss_lmm_region: 0.1322 loss_lmm_image: 0.8726 2024/11/12 16:01:06 - mmengine - INFO - Iter(train) [ 90900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:59:49 time: 1.9899 data_time: 0.0184 memory: 34841 grad_norm: 30.4461 loss: 9.7809 loss_cls: 0.2981 loss_bbox: 0.1478 loss_iou: 0.2642 d0.loss_cls: 0.3485 d0.loss_bbox: 0.1579 d0.loss_iou: 0.2721 d1.loss_cls: 0.3158 d1.loss_bbox: 0.1558 d1.loss_iou: 0.2755 d2.loss_cls: 0.3072 d2.loss_bbox: 0.1496 d2.loss_iou: 0.2685 d3.loss_cls: 0.2980 d3.loss_bbox: 0.1480 d3.loss_iou: 0.2663 d4.loss_cls: 0.2943 d4.loss_bbox: 0.1476 d4.loss_iou: 0.2668 enc_loss_cls: 0.3475 enc_loss_bbox: 0.1785 enc_loss_iou: 0.3013 dn_loss_cls: 0.1074 dn_loss_bbox: 0.1788 dn_loss_iou: 0.2266 d0.dn_loss_cls: 0.1886 d0.dn_loss_bbox: 0.3256 d0.dn_loss_iou: 0.3741 d1.dn_loss_cls: 0.1418 d1.dn_loss_bbox: 0.2125 d1.dn_loss_iou: 0.2602 d2.dn_loss_cls: 0.1201 d2.dn_loss_bbox: 0.1914 d2.dn_loss_iou: 0.2382 d3.dn_loss_cls: 0.1128 d3.dn_loss_bbox: 0.1825 d3.dn_loss_iou: 0.2300 d4.dn_loss_cls: 0.1070 d4.dn_loss_bbox: 0.1788 d4.dn_loss_iou: 0.2264 d1.loss_lmm_region: 0.1371 loss_lmm_image: 0.8318 2024/11/12 16:04:23 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 16:04:23 - mmengine - INFO - Iter(train) [ 91000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:56:25 time: 1.9694 data_time: 0.0183 memory: 33426 grad_norm: nan loss: 10.0817 
loss_cls: 0.3033 loss_bbox: 0.1625 loss_iou: 0.2562 d0.loss_cls: 0.3394 d0.loss_bbox: 0.1782 d0.loss_iou: 0.2741 d1.loss_cls: 0.3215 d1.loss_bbox: 0.1674 d1.loss_iou: 0.2642 d2.loss_cls: 0.3149 d2.loss_bbox: 0.1586 d2.loss_iou: 0.2557 d3.loss_cls: 0.3072 d3.loss_bbox: 0.1619 d3.loss_iou: 0.2549 d4.loss_cls: 0.3046 d4.loss_bbox: 0.1619 d4.loss_iou: 0.2555 enc_loss_cls: 0.3589 enc_loss_bbox: 0.1841 enc_loss_iou: 0.2935 dn_loss_cls: 0.1164 dn_loss_bbox: 0.1976 dn_loss_iou: 0.2307 d0.dn_loss_cls: 0.2052 d0.dn_loss_bbox: 0.3438 d0.dn_loss_iou: 0.3739 d1.dn_loss_cls: 0.1531 d1.dn_loss_bbox: 0.2226 d1.dn_loss_iou: 0.2562 d2.dn_loss_cls: 0.1345 d2.dn_loss_bbox: 0.2064 d2.dn_loss_iou: 0.2384 d3.dn_loss_cls: 0.1223 d3.dn_loss_bbox: 0.2000 d3.dn_loss_iou: 0.2325 d4.dn_loss_cls: 0.1179 d4.dn_loss_bbox: 0.1976 d4.dn_loss_iou: 0.2306 d1.loss_lmm_region: 0.1477 loss_lmm_image: 0.8758 2024/11/12 16:07:43 - mmengine - INFO - Iter(train) [ 91100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:53:03 time: 2.0094 data_time: 0.0182 memory: 34216 grad_norm: 29.8346 loss: 8.6352 loss_cls: 0.2819 loss_bbox: 0.1146 loss_iou: 0.1984 d0.loss_cls: 0.3184 d0.loss_bbox: 0.1234 d0.loss_iou: 0.2159 d1.loss_cls: 0.2939 d1.loss_bbox: 0.1203 d1.loss_iou: 0.2103 d2.loss_cls: 0.2870 d2.loss_bbox: 0.1171 d2.loss_iou: 0.2030 d3.loss_cls: 0.2845 d3.loss_bbox: 0.1147 d3.loss_iou: 0.1993 d4.loss_cls: 0.2849 d4.loss_bbox: 0.1124 d4.loss_iou: 0.1971 enc_loss_cls: 0.3255 enc_loss_bbox: 0.1355 enc_loss_iou: 0.2376 dn_loss_cls: 0.1243 dn_loss_bbox: 0.1517 dn_loss_iou: 0.1987 d0.dn_loss_cls: 0.1923 d0.dn_loss_bbox: 0.3052 d0.dn_loss_iou: 0.3471 d1.dn_loss_cls: 0.1486 d1.dn_loss_bbox: 0.1867 d1.dn_loss_iou: 0.2306 d2.dn_loss_cls: 0.1364 d2.dn_loss_bbox: 0.1617 d2.dn_loss_iou: 0.2068 d3.dn_loss_cls: 0.1313 d3.dn_loss_bbox: 0.1547 d3.dn_loss_iou: 0.2016 d4.dn_loss_cls: 0.1263 d4.dn_loss_bbox: 0.1517 d4.dn_loss_iou: 0.1989 d1.loss_lmm_region: 0.1075 loss_lmm_image: 0.7973 2024/11/12 16:11:01 - mmengine - INFO - Iter(train) [ 91200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:49:41 time: 2.0210 data_time: 0.0184 memory: 35006 grad_norm: 29.7108 loss: 9.3474 loss_cls: 0.3112 loss_bbox: 0.1490 loss_iou: 0.2593 d0.loss_cls: 0.3541 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2858 d1.loss_cls: 0.3224 d1.loss_bbox: 0.1572 d1.loss_iou: 0.2728 d2.loss_cls: 0.3231 d2.loss_bbox: 0.1442 d2.loss_iou: 0.2581 d3.loss_cls: 0.3109 d3.loss_bbox: 0.1469 d3.loss_iou: 0.2598 d4.loss_cls: 0.3138 d4.loss_bbox: 0.1449 d4.loss_iou: 0.2584 enc_loss_cls: 0.3553 enc_loss_bbox: 0.1785 enc_loss_iou: 0.3108 dn_loss_cls: 0.0943 dn_loss_bbox: 0.1470 dn_loss_iou: 0.2049 d0.dn_loss_cls: 0.1728 d0.dn_loss_bbox: 0.2884 d0.dn_loss_iou: 0.3420 d1.dn_loss_cls: 0.1263 d1.dn_loss_bbox: 0.1734 d1.dn_loss_iou: 0.2333 d2.dn_loss_cls: 0.1053 d2.dn_loss_bbox: 0.1531 d2.dn_loss_iou: 0.2131 d3.dn_loss_cls: 0.0972 d3.dn_loss_bbox: 0.1474 d3.dn_loss_iou: 0.2063 d4.dn_loss_cls: 0.0937 d4.dn_loss_bbox: 0.1469 d4.dn_loss_iou: 0.2049 d1.loss_lmm_region: 0.1192 loss_lmm_image: 0.8024 2024/11/12 16:14:17 - mmengine - INFO - Iter(train) [ 91300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:46:16 time: 1.9408 data_time: 0.0183 memory: 34924 grad_norm: 25.9930 loss: 9.2140 loss_cls: 0.2880 loss_bbox: 0.1442 loss_iou: 0.2487 d0.loss_cls: 0.3343 d0.loss_bbox: 0.1519 d0.loss_iou: 0.2619 d1.loss_cls: 0.3056 d1.loss_bbox: 0.1495 d1.loss_iou: 0.2589 d2.loss_cls: 0.2954 d2.loss_bbox: 0.1477 d2.loss_iou: 0.2523 d3.loss_cls: 0.2909 d3.loss_bbox: 0.1429 d3.loss_iou: 0.2493 d4.loss_cls: 
0.2869 d4.loss_bbox: 0.1436 d4.loss_iou: 0.2483 enc_loss_cls: 0.3409 enc_loss_bbox: 0.1670 enc_loss_iou: 0.2881 dn_loss_cls: 0.0998 dn_loss_bbox: 0.1549 dn_loss_iou: 0.2047 d0.dn_loss_cls: 0.1658 d0.dn_loss_bbox: 0.2895 d0.dn_loss_iou: 0.3382 d1.dn_loss_cls: 0.1206 d1.dn_loss_bbox: 0.1867 d1.dn_loss_iou: 0.2341 d2.dn_loss_cls: 0.1093 d2.dn_loss_bbox: 0.1652 d2.dn_loss_iou: 0.2141 d3.dn_loss_cls: 0.1040 d3.dn_loss_bbox: 0.1581 d3.dn_loss_iou: 0.2075 d4.dn_loss_cls: 0.1022 d4.dn_loss_bbox: 0.1550 d4.dn_loss_iou: 0.2048 d1.loss_lmm_region: 0.1151 loss_lmm_image: 0.8884 2024/11/12 16:17:36 - mmengine - INFO - Iter(train) [ 91400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:42:55 time: 1.9880 data_time: 0.0183 memory: 31476 grad_norm: 28.9101 loss: 8.4957 loss_cls: 0.2420 loss_bbox: 0.1191 loss_iou: 0.1988 d0.loss_cls: 0.2736 d0.loss_bbox: 0.1341 d0.loss_iou: 0.2149 d1.loss_cls: 0.2588 d1.loss_bbox: 0.1255 d1.loss_iou: 0.2045 d2.loss_cls: 0.2533 d2.loss_bbox: 0.1195 d2.loss_iou: 0.1996 d3.loss_cls: 0.2451 d3.loss_bbox: 0.1188 d3.loss_iou: 0.1968 d4.loss_cls: 0.2437 d4.loss_bbox: 0.1191 d4.loss_iou: 0.1986 enc_loss_cls: 0.2824 enc_loss_bbox: 0.1427 enc_loss_iou: 0.2305 dn_loss_cls: 0.1016 dn_loss_bbox: 0.1751 dn_loss_iou: 0.2017 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.3287 d0.dn_loss_iou: 0.3337 d1.dn_loss_cls: 0.1340 d1.dn_loss_bbox: 0.2077 d1.dn_loss_iou: 0.2300 d2.dn_loss_cls: 0.1148 d2.dn_loss_bbox: 0.1870 d2.dn_loss_iou: 0.2122 d3.dn_loss_cls: 0.1077 d3.dn_loss_bbox: 0.1774 d3.dn_loss_iou: 0.2044 d4.dn_loss_cls: 0.0990 d4.dn_loss_bbox: 0.1751 d4.dn_loss_iou: 0.2016 d1.loss_lmm_region: 0.1286 loss_lmm_image: 0.8711 2024/11/12 16:20:56 - mmengine - INFO - Iter(train) [ 91500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:39:33 time: 1.9973 data_time: 0.0183 memory: 32383 grad_norm: 34.7173 loss: 8.7005 loss_cls: 0.2491 loss_bbox: 0.1236 loss_iou: 0.2110 d0.loss_cls: 0.2915 d0.loss_bbox: 0.1353 d0.loss_iou: 0.2313 d1.loss_cls: 0.2676 d1.loss_bbox: 0.1253 d1.loss_iou: 0.2158 d2.loss_cls: 0.2608 d2.loss_bbox: 0.1238 d2.loss_iou: 0.2147 d3.loss_cls: 0.2519 d3.loss_bbox: 0.1231 d3.loss_iou: 0.2121 d4.loss_cls: 0.2494 d4.loss_bbox: 0.1237 d4.loss_iou: 0.2114 enc_loss_cls: 0.2972 enc_loss_bbox: 0.1468 enc_loss_iou: 0.2558 dn_loss_cls: 0.1207 dn_loss_bbox: 0.1594 dn_loss_iou: 0.1996 d0.dn_loss_cls: 0.1923 d0.dn_loss_bbox: 0.3012 d0.dn_loss_iou: 0.3447 d1.dn_loss_cls: 0.1445 d1.dn_loss_bbox: 0.1885 d1.dn_loss_iou: 0.2304 d2.dn_loss_cls: 0.1268 d2.dn_loss_bbox: 0.1674 d2.dn_loss_iou: 0.2090 d3.dn_loss_cls: 0.1202 d3.dn_loss_bbox: 0.1607 d3.dn_loss_iou: 0.2019 d4.dn_loss_cls: 0.1197 d4.dn_loss_bbox: 0.1594 d4.dn_loss_iou: 0.1995 d1.loss_lmm_region: 0.1228 loss_lmm_image: 0.9109 2024/11/12 16:24:15 - mmengine - INFO - Iter(train) [ 91600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:36:10 time: 1.9710 data_time: 0.0182 memory: 33550 grad_norm: 29.4504 loss: 9.0066 loss_cls: 0.3233 loss_bbox: 0.1314 loss_iou: 0.2402 d0.loss_cls: 0.3511 d0.loss_bbox: 0.1450 d0.loss_iou: 0.2522 d1.loss_cls: 0.3410 d1.loss_bbox: 0.1302 d1.loss_iou: 0.2363 d2.loss_cls: 0.3380 d2.loss_bbox: 0.1252 d2.loss_iou: 0.2303 d3.loss_cls: 0.3289 d3.loss_bbox: 0.1266 d3.loss_iou: 0.2341 d4.loss_cls: 0.3229 d4.loss_bbox: 0.1314 d4.loss_iou: 0.2368 enc_loss_cls: 0.3569 enc_loss_bbox: 0.1520 enc_loss_iou: 0.2727 dn_loss_cls: 0.1027 dn_loss_bbox: 0.1394 dn_loss_iou: 0.1894 d0.dn_loss_cls: 0.1712 d0.dn_loss_bbox: 0.2727 d0.dn_loss_iou: 0.3256 d1.dn_loss_cls: 0.1278 d1.dn_loss_bbox: 0.1680 d1.dn_loss_iou: 
0.2200 d2.dn_loss_cls: 0.1118 d2.dn_loss_bbox: 0.1494 d2.dn_loss_iou: 0.1997 d3.dn_loss_cls: 0.1058 d3.dn_loss_bbox: 0.1412 d3.dn_loss_iou: 0.1916 d4.dn_loss_cls: 0.1030 d4.dn_loss_bbox: 0.1394 d4.dn_loss_iou: 0.1894 d1.loss_lmm_region: 0.1172 loss_lmm_image: 0.8347 2024/11/12 16:27:34 - mmengine - INFO - Iter(train) [ 91700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:32:49 time: 1.9843 data_time: 0.0182 memory: 33845 grad_norm: 31.8330 loss: 9.8476 loss_cls: 0.2715 loss_bbox: 0.1570 loss_iou: 0.2568 d0.loss_cls: 0.3131 d0.loss_bbox: 0.1678 d0.loss_iou: 0.2716 d1.loss_cls: 0.2967 d1.loss_bbox: 0.1521 d1.loss_iou: 0.2586 d2.loss_cls: 0.2850 d2.loss_bbox: 0.1469 d2.loss_iou: 0.2535 d3.loss_cls: 0.2789 d3.loss_bbox: 0.1507 d3.loss_iou: 0.2531 d4.loss_cls: 0.2737 d4.loss_bbox: 0.1556 d4.loss_iou: 0.2558 enc_loss_cls: 0.3172 enc_loss_bbox: 0.1774 enc_loss_iou: 0.2898 dn_loss_cls: 0.1082 dn_loss_bbox: 0.2034 dn_loss_iou: 0.2505 d0.dn_loss_cls: 0.1974 d0.dn_loss_bbox: 0.3866 d0.dn_loss_iou: 0.4106 d1.dn_loss_cls: 0.1355 d1.dn_loss_bbox: 0.2443 d1.dn_loss_iou: 0.2844 d2.dn_loss_cls: 0.1160 d2.dn_loss_bbox: 0.2165 d2.dn_loss_iou: 0.2614 d3.dn_loss_cls: 0.1094 d3.dn_loss_bbox: 0.2061 d3.dn_loss_iou: 0.2530 d4.dn_loss_cls: 0.1081 d4.dn_loss_bbox: 0.2036 d4.dn_loss_iou: 0.2506 d1.loss_lmm_region: 0.1309 loss_lmm_image: 0.7886 2024/11/12 16:30:54 - mmengine - INFO - Iter(train) [ 91800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:29:27 time: 1.9905 data_time: 0.0184 memory: 32980 grad_norm: 26.1369 loss: 8.2957 loss_cls: 0.2497 loss_bbox: 0.1072 loss_iou: 0.2308 d0.loss_cls: 0.2891 d0.loss_bbox: 0.1206 d0.loss_iou: 0.2500 d1.loss_cls: 0.2682 d1.loss_bbox: 0.1121 d1.loss_iou: 0.2360 d2.loss_cls: 0.2643 d2.loss_bbox: 0.1070 d2.loss_iou: 0.2307 d3.loss_cls: 0.2557 d3.loss_bbox: 0.1075 d3.loss_iou: 0.2305 d4.loss_cls: 0.2517 d4.loss_bbox: 0.1061 d4.loss_iou: 0.2310 enc_loss_cls: 0.2952 enc_loss_bbox: 0.1334 enc_loss_iou: 0.2749 dn_loss_cls: 0.0878 dn_loss_bbox: 0.1303 dn_loss_iou: 0.2090 d0.dn_loss_cls: 0.1485 d0.dn_loss_bbox: 0.2513 d0.dn_loss_iou: 0.3395 d1.dn_loss_cls: 0.1078 d1.dn_loss_bbox: 0.1521 d1.dn_loss_iou: 0.2314 d2.dn_loss_cls: 0.0951 d2.dn_loss_bbox: 0.1374 d2.dn_loss_iou: 0.2170 d3.dn_loss_cls: 0.0924 d3.dn_loss_bbox: 0.1326 d3.dn_loss_iou: 0.2117 d4.dn_loss_cls: 0.0887 d4.dn_loss_bbox: 0.1303 d4.dn_loss_iou: 0.2090 d1.loss_lmm_region: 0.1188 loss_lmm_image: 0.8529 2024/11/12 16:34:12 - mmengine - INFO - Iter(train) [ 91900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:26:04 time: 1.9728 data_time: 0.0184 memory: 33497 grad_norm: 36.7030 loss: 9.4605 loss_cls: 0.2778 loss_bbox: 0.1526 loss_iou: 0.2409 d0.loss_cls: 0.3145 d0.loss_bbox: 0.1711 d0.loss_iou: 0.2572 d1.loss_cls: 0.2933 d1.loss_bbox: 0.1650 d1.loss_iou: 0.2517 d2.loss_cls: 0.2885 d2.loss_bbox: 0.1541 d2.loss_iou: 0.2430 d3.loss_cls: 0.2798 d3.loss_bbox: 0.1535 d3.loss_iou: 0.2433 d4.loss_cls: 0.2776 d4.loss_bbox: 0.1531 d4.loss_iou: 0.2428 enc_loss_cls: 0.3205 enc_loss_bbox: 0.1852 enc_loss_iou: 0.2800 dn_loss_cls: 0.1059 dn_loss_bbox: 0.1720 dn_loss_iou: 0.2207 d0.dn_loss_cls: 0.1902 d0.dn_loss_bbox: 0.3224 d0.dn_loss_iou: 0.3618 d1.dn_loss_cls: 0.1399 d1.dn_loss_bbox: 0.2056 d1.dn_loss_iou: 0.2505 d2.dn_loss_cls: 0.1185 d2.dn_loss_bbox: 0.1838 d2.dn_loss_iou: 0.2305 d3.dn_loss_cls: 0.1133 d3.dn_loss_bbox: 0.1747 d3.dn_loss_iou: 0.2230 d4.dn_loss_cls: 0.1075 d4.dn_loss_bbox: 0.1719 d4.dn_loss_iou: 0.2207 d1.loss_lmm_region: 0.1419 loss_lmm_image: 0.8601 2024/11/12 16:37:31 - mmengine - INFO - 
Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 16:37:31 - mmengine - INFO - Iter(train) [ 92000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:22:42 time: 1.9950 data_time: 0.0183 memory: 35388 grad_norm: 30.9685 loss: 8.8063 loss_cls: 0.3086 loss_bbox: 0.1085 loss_iou: 0.2584 d0.loss_cls: 0.3563 d0.loss_bbox: 0.1263 d0.loss_iou: 0.2780 d1.loss_cls: 0.3288 d1.loss_bbox: 0.1155 d1.loss_iou: 0.2649 d2.loss_cls: 0.3259 d2.loss_bbox: 0.1064 d2.loss_iou: 0.2558 d3.loss_cls: 0.3137 d3.loss_bbox: 0.1093 d3.loss_iou: 0.2599 d4.loss_cls: 0.3109 d4.loss_bbox: 0.1098 d4.loss_iou: 0.2576 enc_loss_cls: 0.3555 enc_loss_bbox: 0.1437 enc_loss_iou: 0.3066 dn_loss_cls: 0.0808 dn_loss_bbox: 0.1190 dn_loss_iou: 0.1963 d0.dn_loss_cls: 0.1600 d0.dn_loss_bbox: 0.2348 d0.dn_loss_iou: 0.3243 d1.dn_loss_cls: 0.1116 d1.dn_loss_bbox: 0.1469 d1.dn_loss_iou: 0.2236 d2.dn_loss_cls: 0.0942 d2.dn_loss_bbox: 0.1268 d2.dn_loss_iou: 0.2035 d3.dn_loss_cls: 0.0833 d3.dn_loss_bbox: 0.1214 d3.dn_loss_iou: 0.1983 d4.dn_loss_cls: 0.0808 d4.dn_loss_bbox: 0.1189 d4.dn_loss_iou: 0.1962 d1.loss_lmm_region: 0.1168 loss_lmm_image: 0.8685 2024/11/12 16:40:50 - mmengine - INFO - Iter(train) [ 92100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:19:19 time: 1.9996 data_time: 0.0183 memory: 33308 grad_norm: 35.2905 loss: 8.4750 loss_cls: 0.2324 loss_bbox: 0.1337 loss_iou: 0.2096 d0.loss_cls: 0.2620 d0.loss_bbox: 0.1477 d0.loss_iou: 0.2216 d1.loss_cls: 0.2489 d1.loss_bbox: 0.1366 d1.loss_iou: 0.2115 d2.loss_cls: 0.2442 d2.loss_bbox: 0.1336 d2.loss_iou: 0.2078 d3.loss_cls: 0.2388 d3.loss_bbox: 0.1327 d3.loss_iou: 0.2080 d4.loss_cls: 0.2350 d4.loss_bbox: 0.1319 d4.loss_iou: 0.2082 enc_loss_cls: 0.2717 enc_loss_bbox: 0.1576 enc_loss_iou: 0.2372 dn_loss_cls: 0.0804 dn_loss_bbox: 0.1655 dn_loss_iou: 0.2089 d0.dn_loss_cls: 0.1627 d0.dn_loss_bbox: 0.3482 d0.dn_loss_iou: 0.3664 d1.dn_loss_cls: 0.1113 d1.dn_loss_bbox: 0.2106 d1.dn_loss_iou: 0.2422 d2.dn_loss_cls: 0.0913 d2.dn_loss_bbox: 0.1778 d2.dn_loss_iou: 0.2179 d3.dn_loss_cls: 0.0854 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2116 d4.dn_loss_cls: 0.0811 d4.dn_loss_bbox: 0.1655 d4.dn_loss_iou: 0.2089 d1.loss_lmm_region: 0.1218 loss_lmm_image: 0.8380 2024/11/12 16:44:10 - mmengine - INFO - Iter(train) [ 92200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:15:58 time: 1.9879 data_time: 0.0183 memory: 34624 grad_norm: 31.4672 loss: 8.0668 loss_cls: 0.2540 loss_bbox: 0.1091 loss_iou: 0.1876 d0.loss_cls: 0.2797 d0.loss_bbox: 0.1265 d0.loss_iou: 0.2135 d1.loss_cls: 0.2719 d1.loss_bbox: 0.1151 d1.loss_iou: 0.1954 d2.loss_cls: 0.2602 d2.loss_bbox: 0.1125 d2.loss_iou: 0.1911 d3.loss_cls: 0.2540 d3.loss_bbox: 0.1111 d3.loss_iou: 0.1878 d4.loss_cls: 0.2522 d4.loss_bbox: 0.1087 d4.loss_iou: 0.1870 enc_loss_cls: 0.2943 enc_loss_bbox: 0.1352 enc_loss_iou: 0.2237 dn_loss_cls: 0.0850 dn_loss_bbox: 0.1453 dn_loss_iou: 0.1879 d0.dn_loss_cls: 0.1629 d0.dn_loss_bbox: 0.2893 d0.dn_loss_iou: 0.3282 d1.dn_loss_cls: 0.1128 d1.dn_loss_bbox: 0.1827 d1.dn_loss_iou: 0.2200 d2.dn_loss_cls: 0.0978 d2.dn_loss_bbox: 0.1560 d2.dn_loss_iou: 0.1979 d3.dn_loss_cls: 0.0884 d3.dn_loss_bbox: 0.1467 d3.dn_loss_iou: 0.1900 d4.dn_loss_cls: 0.0827 d4.dn_loss_bbox: 0.1454 d4.dn_loss_iou: 0.1880 d1.loss_lmm_region: 0.1212 loss_lmm_image: 0.8682 2024/11/12 16:47:28 - mmengine - INFO - Iter(train) [ 92300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:12:35 time: 2.0025 data_time: 0.0183 memory: 35771 grad_norm: 31.1040 loss: 10.1621 loss_cls: 0.3219 loss_bbox: 0.1780 loss_iou: 0.2818 
d0.loss_cls: 0.3680 d0.loss_bbox: 0.1795 d0.loss_iou: 0.2879 d1.loss_cls: 0.3368 d1.loss_bbox: 0.1774 d1.loss_iou: 0.2817 d2.loss_cls: 0.3352 d2.loss_bbox: 0.1746 d2.loss_iou: 0.2792 d3.loss_cls: 0.3282 d3.loss_bbox: 0.1795 d3.loss_iou: 0.2800 d4.loss_cls: 0.3247 d4.loss_bbox: 0.1796 d4.loss_iou: 0.2804 enc_loss_cls: 0.3674 enc_loss_bbox: 0.1943 enc_loss_iou: 0.3071 dn_loss_cls: 0.1170 dn_loss_bbox: 0.1741 dn_loss_iou: 0.2221 d0.dn_loss_cls: 0.2000 d0.dn_loss_bbox: 0.3200 d0.dn_loss_iou: 0.3630 d1.dn_loss_cls: 0.1500 d1.dn_loss_bbox: 0.2057 d1.dn_loss_iou: 0.2529 d2.dn_loss_cls: 0.1291 d2.dn_loss_bbox: 0.1824 d2.dn_loss_iou: 0.2308 d3.dn_loss_cls: 0.1207 d3.dn_loss_bbox: 0.1760 d3.dn_loss_iou: 0.2242 d4.dn_loss_cls: 0.1192 d4.dn_loss_bbox: 0.1741 d4.dn_loss_iou: 0.2221 d1.loss_lmm_region: 0.1277 loss_lmm_image: 0.8073 2024/11/12 16:50:46 - mmengine - INFO - Iter(train) [ 92400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:09:12 time: 1.9804 data_time: 0.0184 memory: 33659 grad_norm: 30.8010 loss: 8.4919 loss_cls: 0.2377 loss_bbox: 0.1294 loss_iou: 0.2359 d0.loss_cls: 0.2752 d0.loss_bbox: 0.1339 d0.loss_iou: 0.2517 d1.loss_cls: 0.2571 d1.loss_bbox: 0.1297 d1.loss_iou: 0.2405 d2.loss_cls: 0.2492 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2395 d3.loss_cls: 0.2442 d3.loss_bbox: 0.1268 d3.loss_iou: 0.2366 d4.loss_cls: 0.2419 d4.loss_bbox: 0.1274 d4.loss_iou: 0.2355 enc_loss_cls: 0.2819 enc_loss_bbox: 0.1455 enc_loss_iou: 0.2690 dn_loss_cls: 0.0777 dn_loss_bbox: 0.1472 dn_loss_iou: 0.2077 d0.dn_loss_cls: 0.1559 d0.dn_loss_bbox: 0.2760 d0.dn_loss_iou: 0.3416 d1.dn_loss_cls: 0.1101 d1.dn_loss_bbox: 0.1770 d1.dn_loss_iou: 0.2354 d2.dn_loss_cls: 0.0892 d2.dn_loss_bbox: 0.1532 d2.dn_loss_iou: 0.2147 d3.dn_loss_cls: 0.0822 d3.dn_loss_bbox: 0.1488 d3.dn_loss_iou: 0.2098 d4.dn_loss_cls: 0.0771 d4.dn_loss_bbox: 0.1473 d4.dn_loss_iou: 0.2077 d1.loss_lmm_region: 0.1178 loss_lmm_image: 0.8981 2024/11/12 16:54:03 - mmengine - INFO - Iter(train) [ 92500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:05:48 time: 1.9826 data_time: 0.0184 memory: 33842 grad_norm: 27.7810 loss: 9.8295 loss_cls: 0.2869 loss_bbox: 0.1584 loss_iou: 0.2719 d0.loss_cls: 0.3366 d0.loss_bbox: 0.1774 d0.loss_iou: 0.2954 d1.loss_cls: 0.3091 d1.loss_bbox: 0.1678 d1.loss_iou: 0.2849 d2.loss_cls: 0.3011 d2.loss_bbox: 0.1596 d2.loss_iou: 0.2744 d3.loss_cls: 0.2932 d3.loss_bbox: 0.1586 d3.loss_iou: 0.2705 d4.loss_cls: 0.2936 d4.loss_bbox: 0.1574 d4.loss_iou: 0.2715 enc_loss_cls: 0.3378 enc_loss_bbox: 0.1924 enc_loss_iou: 0.3180 dn_loss_cls: 0.1103 dn_loss_bbox: 0.1782 dn_loss_iou: 0.2241 d0.dn_loss_cls: 0.1776 d0.dn_loss_bbox: 0.3215 d0.dn_loss_iou: 0.3700 d1.dn_loss_cls: 0.1348 d1.dn_loss_bbox: 0.2104 d1.dn_loss_iou: 0.2533 d2.dn_loss_cls: 0.1183 d2.dn_loss_bbox: 0.1869 d2.dn_loss_iou: 0.2333 d3.dn_loss_cls: 0.1139 d3.dn_loss_bbox: 0.1788 d3.dn_loss_iou: 0.2262 d4.dn_loss_cls: 0.1103 d4.dn_loss_bbox: 0.1782 d4.dn_loss_iou: 0.2241 d1.loss_lmm_region: 0.1272 loss_lmm_image: 0.8356 2024/11/12 16:57:20 - mmengine - INFO - Iter(train) [ 92600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 8:02:25 time: 1.9501 data_time: 0.0182 memory: 34999 grad_norm: 34.0949 loss: 8.7250 loss_cls: 0.2792 loss_bbox: 0.1226 loss_iou: 0.2425 d0.loss_cls: 0.3106 d0.loss_bbox: 0.1359 d0.loss_iou: 0.2611 d1.loss_cls: 0.2919 d1.loss_bbox: 0.1286 d1.loss_iou: 0.2516 d2.loss_cls: 0.2778 d2.loss_bbox: 0.1292 d2.loss_iou: 0.2468 d3.loss_cls: 0.2800 d3.loss_bbox: 0.1256 d3.loss_iou: 0.2445 d4.loss_cls: 0.2746 d4.loss_bbox: 0.1251 d4.loss_iou: 0.2435 
enc_loss_cls: 0.3133 enc_loss_bbox: 0.1517 enc_loss_iou: 0.2818 dn_loss_cls: 0.1131 dn_loss_bbox: 0.1303 dn_loss_iou: 0.1905 d0.dn_loss_cls: 0.1838 d0.dn_loss_bbox: 0.2600 d0.dn_loss_iou: 0.3268 d1.dn_loss_cls: 0.1363 d1.dn_loss_bbox: 0.1545 d1.dn_loss_iou: 0.2185 d2.dn_loss_cls: 0.1201 d2.dn_loss_bbox: 0.1372 d2.dn_loss_iou: 0.2000 d3.dn_loss_cls: 0.1159 d3.dn_loss_bbox: 0.1325 d3.dn_loss_iou: 0.1934 d4.dn_loss_cls: 0.1134 d4.dn_loss_bbox: 0.1302 d4.dn_loss_iou: 0.1904 d1.loss_lmm_region: 0.1343 loss_lmm_image: 0.8261 2024/11/12 17:00:38 - mmengine - INFO - Iter(train) [ 92700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:59:03 time: 1.9964 data_time: 0.0182 memory: 33225 grad_norm: 40.6264 loss: 10.5671 loss_cls: 0.2922 loss_bbox: 0.1696 loss_iou: 0.2858 d0.loss_cls: 0.3380 d0.loss_bbox: 0.1870 d0.loss_iou: 0.3001 d1.loss_cls: 0.3267 d1.loss_bbox: 0.1666 d1.loss_iou: 0.2811 d2.loss_cls: 0.3012 d2.loss_bbox: 0.1755 d2.loss_iou: 0.2910 d3.loss_cls: 0.2986 d3.loss_bbox: 0.1706 d3.loss_iou: 0.2856 d4.loss_cls: 0.2917 d4.loss_bbox: 0.1709 d4.loss_iou: 0.2871 enc_loss_cls: 0.3525 enc_loss_bbox: 0.1931 enc_loss_iou: 0.3178 dn_loss_cls: 0.1382 dn_loss_bbox: 0.1949 dn_loss_iou: 0.2506 d0.dn_loss_cls: 0.2280 d0.dn_loss_bbox: 0.3572 d0.dn_loss_iou: 0.4018 d1.dn_loss_cls: 0.1749 d1.dn_loss_bbox: 0.2281 d1.dn_loss_iou: 0.2809 d2.dn_loss_cls: 0.1520 d2.dn_loss_bbox: 0.2051 d2.dn_loss_iou: 0.2606 d3.dn_loss_cls: 0.1442 d3.dn_loss_bbox: 0.1977 d3.dn_loss_iou: 0.2529 d4.dn_loss_cls: 0.1383 d4.dn_loss_bbox: 0.1949 d4.dn_loss_iou: 0.2505 d1.loss_lmm_region: 0.1466 loss_lmm_image: 0.8869 2024/11/12 17:03:57 - mmengine - INFO - Iter(train) [ 92800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:55:40 time: 1.9949 data_time: 0.0183 memory: 34379 grad_norm: 25.8765 loss: 9.3362 loss_cls: 0.2506 loss_bbox: 0.1384 loss_iou: 0.2407 d0.loss_cls: 0.2844 d0.loss_bbox: 0.1557 d0.loss_iou: 0.2544 d1.loss_cls: 0.2682 d1.loss_bbox: 0.1443 d1.loss_iou: 0.2466 d2.loss_cls: 0.2588 d2.loss_bbox: 0.1383 d2.loss_iou: 0.2414 d3.loss_cls: 0.2583 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2376 d4.loss_cls: 0.2524 d4.loss_bbox: 0.1361 d4.loss_iou: 0.2390 enc_loss_cls: 0.2972 enc_loss_bbox: 0.1602 enc_loss_iou: 0.2726 dn_loss_cls: 0.1262 dn_loss_bbox: 0.1784 dn_loss_iou: 0.2319 d0.dn_loss_cls: 0.2047 d0.dn_loss_bbox: 0.3347 d0.dn_loss_iou: 0.3818 d1.dn_loss_cls: 0.1552 d1.dn_loss_bbox: 0.2134 d1.dn_loss_iou: 0.2657 d2.dn_loss_cls: 0.1410 d2.dn_loss_bbox: 0.1905 d2.dn_loss_iou: 0.2419 d3.dn_loss_cls: 0.1324 d3.dn_loss_bbox: 0.1809 d3.dn_loss_iou: 0.2343 d4.dn_loss_cls: 0.1269 d4.dn_loss_bbox: 0.1783 d4.dn_loss_iou: 0.2318 d1.loss_lmm_region: 0.1438 loss_lmm_image: 0.8319 2024/11/12 17:07:16 - mmengine - INFO - Iter(train) [ 92900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:52:18 time: 1.9980 data_time: 0.0182 memory: 33771 grad_norm: 24.0797 loss: 10.7508 loss_cls: 0.3349 loss_bbox: 0.1853 loss_iou: 0.3278 d0.loss_cls: 0.3805 d0.loss_bbox: 0.2006 d0.loss_iou: 0.3419 d1.loss_cls: 0.3571 d1.loss_bbox: 0.1862 d1.loss_iou: 0.3324 d2.loss_cls: 0.3511 d2.loss_bbox: 0.1843 d2.loss_iou: 0.3267 d3.loss_cls: 0.3414 d3.loss_bbox: 0.1879 d3.loss_iou: 0.3289 d4.loss_cls: 0.3356 d4.loss_bbox: 0.1860 d4.loss_iou: 0.3286 enc_loss_cls: 0.4026 enc_loss_bbox: 0.2059 enc_loss_iou: 0.3612 dn_loss_cls: 0.0969 dn_loss_bbox: 0.1781 dn_loss_iou: 0.2384 d0.dn_loss_cls: 0.1752 d0.dn_loss_bbox: 0.3206 d0.dn_loss_iou: 0.3856 d1.dn_loss_cls: 0.1226 d1.dn_loss_bbox: 0.2106 d1.dn_loss_iou: 0.2713 d2.dn_loss_cls: 0.1038 d2.dn_loss_bbox: 
0.1885 d2.dn_loss_iou: 0.2485 d3.dn_loss_cls: 0.0962 d3.dn_loss_bbox: 0.1804 d3.dn_loss_iou: 0.2411 d4.dn_loss_cls: 0.0963 d4.dn_loss_bbox: 0.1781 d4.dn_loss_iou: 0.2383 d1.loss_lmm_region: 0.1495 loss_lmm_image: 0.8438 2024/11/12 17:10:34 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 17:10:34 - mmengine - INFO - Iter(train) [ 93000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:48:55 time: 1.9787 data_time: 0.0183 memory: 32496 grad_norm: 30.7738 loss: 8.4278 loss_cls: 0.2733 loss_bbox: 0.1060 loss_iou: 0.2125 d0.loss_cls: 0.3169 d0.loss_bbox: 0.1143 d0.loss_iou: 0.2261 d1.loss_cls: 0.2914 d1.loss_bbox: 0.1113 d1.loss_iou: 0.2215 d2.loss_cls: 0.2812 d2.loss_bbox: 0.1095 d2.loss_iou: 0.2161 d3.loss_cls: 0.2737 d3.loss_bbox: 0.1084 d3.loss_iou: 0.2156 d4.loss_cls: 0.2754 d4.loss_bbox: 0.1059 d4.loss_iou: 0.2119 enc_loss_cls: 0.3245 enc_loss_bbox: 0.1322 enc_loss_iou: 0.2497 dn_loss_cls: 0.0893 dn_loss_bbox: 0.1438 dn_loss_iou: 0.1936 d0.dn_loss_cls: 0.1744 d0.dn_loss_bbox: 0.2899 d0.dn_loss_iou: 0.3356 d1.dn_loss_cls: 0.1212 d1.dn_loss_bbox: 0.1788 d1.dn_loss_iou: 0.2262 d2.dn_loss_cls: 0.1019 d2.dn_loss_bbox: 0.1568 d2.dn_loss_iou: 0.2041 d3.dn_loss_cls: 0.0943 d3.dn_loss_bbox: 0.1471 d3.dn_loss_iou: 0.1968 d4.dn_loss_cls: 0.0895 d4.dn_loss_bbox: 0.1439 d4.dn_loss_iou: 0.1937 d1.loss_lmm_region: 0.1336 loss_lmm_image: 0.8361 2024/11/12 17:13:53 - mmengine - INFO - Iter(train) [ 93100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:45:34 time: 1.9968 data_time: 0.0183 memory: 32246 grad_norm: 41.2669 loss: 9.0658 loss_cls: 0.2832 loss_bbox: 0.1318 loss_iou: 0.2230 d0.loss_cls: 0.3188 d0.loss_bbox: 0.1475 d0.loss_iou: 0.2396 d1.loss_cls: 0.3001 d1.loss_bbox: 0.1355 d1.loss_iou: 0.2279 d2.loss_cls: 0.2933 d2.loss_bbox: 0.1313 d2.loss_iou: 0.2228 d3.loss_cls: 0.2860 d3.loss_bbox: 0.1294 d3.loss_iou: 0.2217 d4.loss_cls: 0.2808 d4.loss_bbox: 0.1314 d4.loss_iou: 0.2230 enc_loss_cls: 0.3237 enc_loss_bbox: 0.1574 enc_loss_iou: 0.2585 dn_loss_cls: 0.1301 dn_loss_bbox: 0.1589 dn_loss_iou: 0.1986 d0.dn_loss_cls: 0.1963 d0.dn_loss_bbox: 0.2952 d0.dn_loss_iou: 0.3279 d1.dn_loss_cls: 0.1503 d1.dn_loss_bbox: 0.1889 d1.dn_loss_iou: 0.2284 d2.dn_loss_cls: 0.1408 d2.dn_loss_bbox: 0.1673 d2.dn_loss_iou: 0.2077 d3.dn_loss_cls: 0.1350 d3.dn_loss_bbox: 0.1607 d3.dn_loss_iou: 0.2013 d4.dn_loss_cls: 0.1296 d4.dn_loss_bbox: 0.1589 d4.dn_loss_iou: 0.1987 d1.loss_lmm_region: 0.1429 loss_lmm_image: 0.8818 2024/11/12 17:17:13 - mmengine - INFO - Iter(train) [ 93200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:42:12 time: 1.9839 data_time: 0.0182 memory: 34572 grad_norm: 33.6249 loss: 9.5604 loss_cls: 0.2989 loss_bbox: 0.1477 loss_iou: 0.2444 d0.loss_cls: 0.3477 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2536 d1.loss_cls: 0.3159 d1.loss_bbox: 0.1486 d1.loss_iou: 0.2475 d2.loss_cls: 0.3087 d2.loss_bbox: 0.1481 d2.loss_iou: 0.2449 d3.loss_cls: 0.3022 d3.loss_bbox: 0.1482 d3.loss_iou: 0.2427 d4.loss_cls: 0.3010 d4.loss_bbox: 0.1488 d4.loss_iou: 0.2449 enc_loss_cls: 0.3504 enc_loss_bbox: 0.1720 enc_loss_iou: 0.2758 dn_loss_cls: 0.1163 dn_loss_bbox: 0.1686 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.2012 d0.dn_loss_bbox: 0.3236 d0.dn_loss_iou: 0.3614 d1.dn_loss_cls: 0.1458 d1.dn_loss_bbox: 0.2027 d1.dn_loss_iou: 0.2514 d2.dn_loss_cls: 0.1292 d2.dn_loss_bbox: 0.1790 d2.dn_loss_iou: 0.2293 d3.dn_loss_cls: 0.1220 d3.dn_loss_bbox: 0.1709 d3.dn_loss_iou: 0.2228 d4.dn_loss_cls: 0.1179 d4.dn_loss_bbox: 0.1686 d4.dn_loss_iou: 0.2203 d1.loss_lmm_region: 0.1227 loss_lmm_image: 0.8395 
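The records above follow mmengine's fixed text format ("Iter(train) [ N/150000] ... grad_norm: X loss: Y ..." for training steps, and a final "Iter(val) [602/602] lvis_fixed_ap/AP: ..." summary after each 90000-iteration-style validation), so the loss curve, the NaN grad_norm entries (seen at iterations 87700 and 91000), and the LVIS fixed-AP checkpoints can be pulled straight out of the raw log. The following Python sketch is illustrative only: it assumes each record sits on a single line as mmengine writes it, and the file name train.log and the regexes are assumptions for this example, not part of the original run.

import math
import re

# Matches "Iter(train) [ 87700/150000] ... grad_norm: nan loss: 8.9841 ..."
TRAIN_RE = re.compile(
    r"Iter\(train\)\s+\[\s*(\d+)/\d+\].*?"
    r"grad_norm:\s*(\S+)\s+loss:\s*([\d.]+)"
)
# Matches the per-validation summary line "... lvis_fixed_ap/AP: 0.4757 ..."
VAL_RE = re.compile(r"lvis_fixed_ap/AP:\s*([\d.]+)")

def parse_log(path="train.log"):
    """Return (train_records, val_aps) from an mmengine text log.

    train_records is a list of (iteration, grad_norm, total_loss);
    val_aps is the list of overall LVIS fixed AP values, one per eval.
    """
    with open(path) as f:
        text = f.read()
    train = [
        (int(it), float(grad_norm), float(loss))  # float("nan") is accepted
        for it, grad_norm, loss in TRAIN_RE.findall(text)
    ]
    val = [float(ap) for ap in VAL_RE.findall(text)]
    return train, val

if __name__ == "__main__":
    records, aps = parse_log()
    nan_iters = [it for it, g, _ in records if math.isnan(g)]
    print(f"{len(records)} train records; NaN grad_norm at iterations: {nan_iters}")
    print(f"LVIS fixed AP per validation: {aps}")

If the full breakdown is needed (AP50/AP75/APs/APm/APl/APr/APc/APf), the "mAP_copypaste:" line is a Python dict literal and can be read with ast.literal_eval instead of a dedicated regex.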
2024/11/12 17:20:33 - mmengine - INFO - Iter(train) [ 93300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:38:50 time: 1.9974 data_time: 0.0184 memory: 34305 grad_norm: 26.8916 loss: 9.1786 loss_cls: 0.2850 loss_bbox: 0.1421 loss_iou: 0.2635 d0.loss_cls: 0.3269 d0.loss_bbox: 0.1543 d0.loss_iou: 0.2804 d1.loss_cls: 0.3036 d1.loss_bbox: 0.1509 d1.loss_iou: 0.2716 d2.loss_cls: 0.2980 d2.loss_bbox: 0.1438 d2.loss_iou: 0.2664 d3.loss_cls: 0.2876 d3.loss_bbox: 0.1445 d3.loss_iou: 0.2652 d4.loss_cls: 0.2841 d4.loss_bbox: 0.1429 d4.loss_iou: 0.2645 enc_loss_cls: 0.3288 enc_loss_bbox: 0.1711 enc_loss_iou: 0.3001 dn_loss_cls: 0.0928 dn_loss_bbox: 0.1573 dn_loss_iou: 0.1922 d0.dn_loss_cls: 0.1649 d0.dn_loss_bbox: 0.3064 d0.dn_loss_iou: 0.3363 d1.dn_loss_cls: 0.1192 d1.dn_loss_bbox: 0.1955 d1.dn_loss_iou: 0.2248 d2.dn_loss_cls: 0.1025 d2.dn_loss_bbox: 0.1692 d2.dn_loss_iou: 0.2017 d3.dn_loss_cls: 0.0974 d3.dn_loss_bbox: 0.1593 d3.dn_loss_iou: 0.1950 d4.dn_loss_cls: 0.0938 d4.dn_loss_bbox: 0.1572 d4.dn_loss_iou: 0.1921 d1.loss_lmm_region: 0.1086 loss_lmm_image: 0.8371 2024/11/12 17:23:51 - mmengine - INFO - Iter(train) [ 93400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:35:28 time: 1.9816 data_time: 0.0182 memory: 34409 grad_norm: 28.0648 loss: 9.9541 loss_cls: 0.2987 loss_bbox: 0.1526 loss_iou: 0.2815 d0.loss_cls: 0.3521 d0.loss_bbox: 0.1680 d0.loss_iou: 0.2985 d1.loss_cls: 0.3248 d1.loss_bbox: 0.1574 d1.loss_iou: 0.2868 d2.loss_cls: 0.3147 d2.loss_bbox: 0.1527 d2.loss_iou: 0.2813 d3.loss_cls: 0.3066 d3.loss_bbox: 0.1508 d3.loss_iou: 0.2804 d4.loss_cls: 0.2990 d4.loss_bbox: 0.1522 d4.loss_iou: 0.2818 enc_loss_cls: 0.3574 enc_loss_bbox: 0.1812 enc_loss_iou: 0.3221 dn_loss_cls: 0.0912 dn_loss_bbox: 0.1770 dn_loss_iou: 0.2406 d0.dn_loss_cls: 0.1752 d0.dn_loss_bbox: 0.3282 d0.dn_loss_iou: 0.3956 d1.dn_loss_cls: 0.1223 d1.dn_loss_bbox: 0.2087 d1.dn_loss_iou: 0.2717 d2.dn_loss_cls: 0.1055 d2.dn_loss_bbox: 0.1877 d2.dn_loss_iou: 0.2496 d3.dn_loss_cls: 0.0977 d3.dn_loss_bbox: 0.1797 d3.dn_loss_iou: 0.2427 d4.dn_loss_cls: 0.0918 d4.dn_loss_bbox: 0.1770 d4.dn_loss_iou: 0.2405 d1.loss_lmm_region: 0.1218 loss_lmm_image: 0.8490 2024/11/12 17:27:10 - mmengine - INFO - Iter(train) [ 93500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:32:05 time: 1.9797 data_time: 0.0182 memory: 35047 grad_norm: 38.3851 loss: 8.6060 loss_cls: 0.2543 loss_bbox: 0.1211 loss_iou: 0.2308 d0.loss_cls: 0.2906 d0.loss_bbox: 0.1320 d0.loss_iou: 0.2481 d1.loss_cls: 0.2727 d1.loss_bbox: 0.1253 d1.loss_iou: 0.2354 d2.loss_cls: 0.2642 d2.loss_bbox: 0.1222 d2.loss_iou: 0.2330 d3.loss_cls: 0.2567 d3.loss_bbox: 0.1213 d3.loss_iou: 0.2302 d4.loss_cls: 0.2555 d4.loss_bbox: 0.1206 d4.loss_iou: 0.2302 enc_loss_cls: 0.2948 enc_loss_bbox: 0.1428 enc_loss_iou: 0.2705 dn_loss_cls: 0.0871 dn_loss_bbox: 0.1566 dn_loss_iou: 0.2019 d0.dn_loss_cls: 0.1590 d0.dn_loss_bbox: 0.2918 d0.dn_loss_iou: 0.3418 d1.dn_loss_cls: 0.1179 d1.dn_loss_bbox: 0.1855 d1.dn_loss_iou: 0.2335 d2.dn_loss_cls: 0.0993 d2.dn_loss_bbox: 0.1687 d2.dn_loss_iou: 0.2119 d3.dn_loss_cls: 0.0906 d3.dn_loss_bbox: 0.1578 d3.dn_loss_iou: 0.2045 d4.dn_loss_cls: 0.0877 d4.dn_loss_bbox: 0.1567 d4.dn_loss_iou: 0.2020 d1.loss_lmm_region: 0.1149 loss_lmm_image: 0.8841 2024/11/12 17:30:30 - mmengine - INFO - Iter(train) [ 93600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:28:44 time: 2.0017 data_time: 0.0183 memory: 32394 grad_norm: 28.7199 loss: 8.6136 loss_cls: 0.2613 loss_bbox: 0.1071 loss_iou: 0.2050 d0.loss_cls: 0.2880 d0.loss_bbox: 0.1210 d0.loss_iou: 
0.2205 d1.loss_cls: 0.2801 d1.loss_bbox: 0.1092 d1.loss_iou: 0.2089 d2.loss_cls: 0.2650 d2.loss_bbox: 0.1116 d2.loss_iou: 0.2097 d3.loss_cls: 0.2586 d3.loss_bbox: 0.1088 d3.loss_iou: 0.2076 d4.loss_cls: 0.2600 d4.loss_bbox: 0.1079 d4.loss_iou: 0.2062 enc_loss_cls: 0.2917 enc_loss_bbox: 0.1311 enc_loss_iou: 0.2384 dn_loss_cls: 0.1272 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2112 d0.dn_loss_cls: 0.2037 d0.dn_loss_bbox: 0.3039 d0.dn_loss_iou: 0.3514 d1.dn_loss_cls: 0.1627 d1.dn_loss_bbox: 0.1886 d1.dn_loss_iou: 0.2384 d2.dn_loss_cls: 0.1468 d2.dn_loss_bbox: 0.1652 d2.dn_loss_iou: 0.2190 d3.dn_loss_cls: 0.1403 d3.dn_loss_bbox: 0.1578 d3.dn_loss_iou: 0.2126 d4.dn_loss_cls: 0.1311 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.2111 d1.loss_lmm_region: 0.1114 loss_lmm_image: 0.8216 2024/11/12 17:33:49 - mmengine - INFO - Iter(train) [ 93700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:25:22 time: 1.9824 data_time: 0.0182 memory: 33788 grad_norm: 41.5865 loss: 9.6735 loss_cls: 0.2841 loss_bbox: 0.1514 loss_iou: 0.2836 d0.loss_cls: 0.3357 d0.loss_bbox: 0.1572 d0.loss_iou: 0.2920 d1.loss_cls: 0.3145 d1.loss_bbox: 0.1478 d1.loss_iou: 0.2845 d2.loss_cls: 0.2941 d2.loss_bbox: 0.1494 d2.loss_iou: 0.2826 d3.loss_cls: 0.2860 d3.loss_bbox: 0.1498 d3.loss_iou: 0.2827 d4.loss_cls: 0.2829 d4.loss_bbox: 0.1514 d4.loss_iou: 0.2827 enc_loss_cls: 0.3383 enc_loss_bbox: 0.1740 enc_loss_iou: 0.3174 dn_loss_cls: 0.1001 dn_loss_bbox: 0.1598 dn_loss_iou: 0.2244 d0.dn_loss_cls: 0.1876 d0.dn_loss_bbox: 0.3097 d0.dn_loss_iou: 0.3709 d1.dn_loss_cls: 0.1342 d1.dn_loss_bbox: 0.1996 d1.dn_loss_iou: 0.2579 d2.dn_loss_cls: 0.1131 d2.dn_loss_bbox: 0.1719 d2.dn_loss_iou: 0.2350 d3.dn_loss_cls: 0.1043 d3.dn_loss_bbox: 0.1624 d3.dn_loss_iou: 0.2271 d4.dn_loss_cls: 0.1005 d4.dn_loss_bbox: 0.1597 d4.dn_loss_iou: 0.2243 d1.loss_lmm_region: 0.1387 loss_lmm_image: 0.8501 2024/11/12 17:37:08 - mmengine - INFO - Iter(train) [ 93800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:21:59 time: 1.9804 data_time: 0.0183 memory: 34301 grad_norm: 24.3266 loss: 8.8593 loss_cls: 0.2651 loss_bbox: 0.1336 loss_iou: 0.2272 d0.loss_cls: 0.3077 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2414 d1.loss_cls: 0.2903 d1.loss_bbox: 0.1362 d1.loss_iou: 0.2309 d2.loss_cls: 0.2777 d2.loss_bbox: 0.1340 d2.loss_iou: 0.2268 d3.loss_cls: 0.2719 d3.loss_bbox: 0.1313 d3.loss_iou: 0.2244 d4.loss_cls: 0.2678 d4.loss_bbox: 0.1317 d4.loss_iou: 0.2254 enc_loss_cls: 0.3155 enc_loss_bbox: 0.1596 enc_loss_iou: 0.2638 dn_loss_cls: 0.0848 dn_loss_bbox: 0.1681 dn_loss_iou: 0.2005 d0.dn_loss_cls: 0.1657 d0.dn_loss_bbox: 0.3210 d0.dn_loss_iou: 0.3415 d1.dn_loss_cls: 0.1159 d1.dn_loss_bbox: 0.2074 d1.dn_loss_iou: 0.2354 d2.dn_loss_cls: 0.0982 d2.dn_loss_bbox: 0.1783 d2.dn_loss_iou: 0.2115 d3.dn_loss_cls: 0.0881 d3.dn_loss_bbox: 0.1694 d3.dn_loss_iou: 0.2034 d4.dn_loss_cls: 0.0861 d4.dn_loss_bbox: 0.1682 d4.dn_loss_iou: 0.2006 d1.loss_lmm_region: 0.1360 loss_lmm_image: 0.8655 2024/11/12 17:40:25 - mmengine - INFO - Iter(train) [ 93900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:18:37 time: 1.9876 data_time: 0.0184 memory: 34945 grad_norm: 37.1324 loss: 9.7674 loss_cls: 0.2982 loss_bbox: 0.1479 loss_iou: 0.2651 d0.loss_cls: 0.3498 d0.loss_bbox: 0.1596 d0.loss_iou: 0.2743 d1.loss_cls: 0.3192 d1.loss_bbox: 0.1559 d1.loss_iou: 0.2723 d2.loss_cls: 0.3078 d2.loss_bbox: 0.1516 d2.loss_iou: 0.2658 d3.loss_cls: 0.3032 d3.loss_bbox: 0.1501 d3.loss_iou: 0.2664 d4.loss_cls: 0.2980 d4.loss_bbox: 0.1499 d4.loss_iou: 0.2659 enc_loss_cls: 0.3537 enc_loss_bbox: 0.1739 enc_loss_iou: 
0.2984 dn_loss_cls: 0.1226 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2299 d0.dn_loss_cls: 0.2124 d0.dn_loss_bbox: 0.2931 d0.dn_loss_iou: 0.3730 d1.dn_loss_cls: 0.1600 d1.dn_loss_bbox: 0.1887 d1.dn_loss_iou: 0.2620 d2.dn_loss_cls: 0.1373 d2.dn_loss_bbox: 0.1666 d2.dn_loss_iou: 0.2406 d3.dn_loss_cls: 0.1310 d3.dn_loss_bbox: 0.1581 d3.dn_loss_iou: 0.2329 d4.dn_loss_cls: 0.1241 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.2299 d1.loss_lmm_region: 0.1352 loss_lmm_image: 0.8315 2024/11/12 17:43:46 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 17:43:46 - mmengine - INFO - Iter(train) [ 94000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:15:16 time: 1.9961 data_time: 0.0182 memory: 34905 grad_norm: 30.7856 loss: 10.5868 loss_cls: 0.3431 loss_bbox: 0.1706 loss_iou: 0.3204 d0.loss_cls: 0.3801 d0.loss_bbox: 0.1826 d0.loss_iou: 0.3322 d1.loss_cls: 0.3538 d1.loss_bbox: 0.1800 d1.loss_iou: 0.3290 d2.loss_cls: 0.3446 d2.loss_bbox: 0.1809 d2.loss_iou: 0.3266 d3.loss_cls: 0.3492 d3.loss_bbox: 0.1695 d3.loss_iou: 0.3245 d4.loss_cls: 0.3437 d4.loss_bbox: 0.1713 d4.loss_iou: 0.3229 enc_loss_cls: 0.3849 enc_loss_bbox: 0.1933 enc_loss_iou: 0.3484 dn_loss_cls: 0.1074 dn_loss_bbox: 0.1610 dn_loss_iou: 0.2324 d0.dn_loss_cls: 0.1976 d0.dn_loss_bbox: 0.3059 d0.dn_loss_iou: 0.3792 d1.dn_loss_cls: 0.1432 d1.dn_loss_bbox: 0.1957 d1.dn_loss_iou: 0.2669 d2.dn_loss_cls: 0.1227 d2.dn_loss_bbox: 0.1729 d2.dn_loss_iou: 0.2446 d3.dn_loss_cls: 0.1127 d3.dn_loss_bbox: 0.1643 d3.dn_loss_iou: 0.2356 d4.dn_loss_cls: 0.1078 d4.dn_loss_bbox: 0.1609 d4.dn_loss_iou: 0.2324 d1.loss_lmm_region: 0.1524 loss_lmm_image: 0.8398 2024/11/12 17:47:06 - mmengine - INFO - Iter(train) [ 94100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:11:54 time: 1.9763 data_time: 0.0182 memory: 34061 grad_norm: 26.6029 loss: 8.8504 loss_cls: 0.2690 loss_bbox: 0.1362 loss_iou: 0.2045 d0.loss_cls: 0.3062 d0.loss_bbox: 0.1476 d0.loss_iou: 0.2184 d1.loss_cls: 0.2893 d1.loss_bbox: 0.1420 d1.loss_iou: 0.2096 d2.loss_cls: 0.2827 d2.loss_bbox: 0.1364 d2.loss_iou: 0.2026 d3.loss_cls: 0.2729 d3.loss_bbox: 0.1371 d3.loss_iou: 0.2050 d4.loss_cls: 0.2691 d4.loss_bbox: 0.1375 d4.loss_iou: 0.2041 enc_loss_cls: 0.3171 enc_loss_bbox: 0.1615 enc_loss_iou: 0.2357 dn_loss_cls: 0.0975 dn_loss_bbox: 0.1758 dn_loss_iou: 0.2069 d0.dn_loss_cls: 0.1899 d0.dn_loss_bbox: 0.3303 d0.dn_loss_iou: 0.3513 d1.dn_loss_cls: 0.1401 d1.dn_loss_bbox: 0.2087 d1.dn_loss_iou: 0.2366 d2.dn_loss_cls: 0.1174 d2.dn_loss_bbox: 0.1840 d2.dn_loss_iou: 0.2152 d3.dn_loss_cls: 0.1057 d3.dn_loss_bbox: 0.1759 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.0989 d4.dn_loss_bbox: 0.1758 d4.dn_loss_iou: 0.2068 d1.loss_lmm_region: 0.1284 loss_lmm_image: 0.8129 2024/11/12 17:50:26 - mmengine - INFO - Iter(train) [ 94200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:08:33 time: 1.9952 data_time: 0.0183 memory: 31787 grad_norm: 26.2471 loss: 9.5496 loss_cls: 0.3058 loss_bbox: 0.1433 loss_iou: 0.2894 d0.loss_cls: 0.3676 d0.loss_bbox: 0.1556 d0.loss_iou: 0.3057 d1.loss_cls: 0.3299 d1.loss_bbox: 0.1502 d1.loss_iou: 0.2983 d2.loss_cls: 0.3194 d2.loss_bbox: 0.1463 d2.loss_iou: 0.2948 d3.loss_cls: 0.3132 d3.loss_bbox: 0.1425 d3.loss_iou: 0.2908 d4.loss_cls: 0.3049 d4.loss_bbox: 0.1440 d4.loss_iou: 0.2929 enc_loss_cls: 0.3604 enc_loss_bbox: 0.1668 enc_loss_iou: 0.3321 dn_loss_cls: 0.0868 dn_loss_bbox: 0.1372 dn_loss_iou: 0.2068 d0.dn_loss_cls: 0.1681 d0.dn_loss_bbox: 0.2818 d0.dn_loss_iou: 0.3457 d1.dn_loss_cls: 0.1183 d1.dn_loss_bbox: 0.1729 d1.dn_loss_iou: 0.2376 
d2.dn_loss_cls: 0.0975 d2.dn_loss_bbox: 0.1493 d2.dn_loss_iou: 0.2172 d3.dn_loss_cls: 0.0913 d3.dn_loss_bbox: 0.1400 d3.dn_loss_iou: 0.2096 d4.dn_loss_cls: 0.0881 d4.dn_loss_bbox: 0.1372 d4.dn_loss_iou: 0.2069 d1.loss_lmm_region: 0.1291 loss_lmm_image: 0.8744 2024/11/12 17:53:47 - mmengine - INFO - Iter(train) [ 94300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:05:12 time: 2.0166 data_time: 0.0181 memory: 34966 grad_norm: 33.2260 loss: 9.7828 loss_cls: 0.2861 loss_bbox: 0.1652 loss_iou: 0.2577 d0.loss_cls: 0.3260 d0.loss_bbox: 0.1790 d0.loss_iou: 0.2738 d1.loss_cls: 0.3071 d1.loss_bbox: 0.1677 d1.loss_iou: 0.2582 d2.loss_cls: 0.2974 d2.loss_bbox: 0.1647 d2.loss_iou: 0.2514 d3.loss_cls: 0.2865 d3.loss_bbox: 0.1669 d3.loss_iou: 0.2551 d4.loss_cls: 0.2860 d4.loss_bbox: 0.1637 d4.loss_iou: 0.2569 enc_loss_cls: 0.3459 enc_loss_bbox: 0.1825 enc_loss_iou: 0.2833 dn_loss_cls: 0.1126 dn_loss_bbox: 0.1765 dn_loss_iou: 0.2275 d0.dn_loss_cls: 0.1927 d0.dn_loss_bbox: 0.3575 d0.dn_loss_iou: 0.3883 d1.dn_loss_cls: 0.1407 d1.dn_loss_bbox: 0.2212 d1.dn_loss_iou: 0.2644 d2.dn_loss_cls: 0.1238 d2.dn_loss_bbox: 0.1890 d2.dn_loss_iou: 0.2391 d3.dn_loss_cls: 0.1166 d3.dn_loss_bbox: 0.1802 d3.dn_loss_iou: 0.2311 d4.dn_loss_cls: 0.1131 d4.dn_loss_bbox: 0.1766 d4.dn_loss_iou: 0.2275 d1.loss_lmm_region: 0.1269 loss_lmm_image: 0.8164 2024/11/12 17:57:06 - mmengine - INFO - Iter(train) [ 94400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 7:01:49 time: 1.9983 data_time: 0.0182 memory: 33515 grad_norm: 29.8662 loss: 8.7121 loss_cls: 0.2680 loss_bbox: 0.1215 loss_iou: 0.2203 d0.loss_cls: 0.3116 d0.loss_bbox: 0.1290 d0.loss_iou: 0.2327 d1.loss_cls: 0.2889 d1.loss_bbox: 0.1229 d1.loss_iou: 0.2200 d2.loss_cls: 0.2751 d2.loss_bbox: 0.1235 d2.loss_iou: 0.2215 d3.loss_cls: 0.2703 d3.loss_bbox: 0.1220 d3.loss_iou: 0.2199 d4.loss_cls: 0.2684 d4.loss_bbox: 0.1218 d4.loss_iou: 0.2199 enc_loss_cls: 0.3164 enc_loss_bbox: 0.1479 enc_loss_iou: 0.2538 dn_loss_cls: 0.1214 dn_loss_bbox: 0.1518 dn_loss_iou: 0.1828 d0.dn_loss_cls: 0.1988 d0.dn_loss_bbox: 0.3101 d0.dn_loss_iou: 0.3225 d1.dn_loss_cls: 0.1488 d1.dn_loss_bbox: 0.1926 d1.dn_loss_iou: 0.2150 d2.dn_loss_cls: 0.1293 d2.dn_loss_bbox: 0.1653 d2.dn_loss_iou: 0.1931 d3.dn_loss_cls: 0.1238 d3.dn_loss_bbox: 0.1543 d3.dn_loss_iou: 0.1855 d4.dn_loss_cls: 0.1226 d4.dn_loss_bbox: 0.1518 d4.dn_loss_iou: 0.1828 d1.loss_lmm_region: 0.1369 loss_lmm_image: 0.8475 2024/11/12 18:00:27 - mmengine - INFO - Iter(train) [ 94500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:58:29 time: 2.0194 data_time: 0.0183 memory: 34955 grad_norm: 31.7290 loss: 9.6484 loss_cls: 0.2966 loss_bbox: 0.1490 loss_iou: 0.2889 d0.loss_cls: 0.3495 d0.loss_bbox: 0.1594 d0.loss_iou: 0.3010 d1.loss_cls: 0.3199 d1.loss_bbox: 0.1457 d1.loss_iou: 0.2916 d2.loss_cls: 0.3084 d2.loss_bbox: 0.1441 d2.loss_iou: 0.2907 d3.loss_cls: 0.2995 d3.loss_bbox: 0.1484 d3.loss_iou: 0.2918 d4.loss_cls: 0.2972 d4.loss_bbox: 0.1470 d4.loss_iou: 0.2904 enc_loss_cls: 0.3484 enc_loss_bbox: 0.1741 enc_loss_iou: 0.3289 dn_loss_cls: 0.0933 dn_loss_bbox: 0.1467 dn_loss_iou: 0.2207 d0.dn_loss_cls: 0.1729 d0.dn_loss_bbox: 0.3002 d0.dn_loss_iou: 0.3692 d1.dn_loss_cls: 0.1244 d1.dn_loss_bbox: 0.1839 d1.dn_loss_iou: 0.2544 d2.dn_loss_cls: 0.1055 d2.dn_loss_bbox: 0.1586 d2.dn_loss_iou: 0.2306 d3.dn_loss_cls: 0.0967 d3.dn_loss_bbox: 0.1494 d3.dn_loss_iou: 0.2230 d4.dn_loss_cls: 0.0953 d4.dn_loss_bbox: 0.1469 d4.dn_loss_iou: 0.2208 d1.loss_lmm_region: 0.1338 loss_lmm_image: 0.8519 2024/11/12 18:03:45 - mmengine - INFO - 
Iter(train) [ 94600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:55:06 time: 1.9931 data_time: 0.0182 memory: 34137 grad_norm: 27.1218 loss: 7.4661 loss_cls: 0.1975 loss_bbox: 0.1132 loss_iou: 0.1774 d0.loss_cls: 0.2264 d0.loss_bbox: 0.1254 d0.loss_iou: 0.1905 d1.loss_cls: 0.2070 d1.loss_bbox: 0.1256 d1.loss_iou: 0.1873 d2.loss_cls: 0.2069 d2.loss_bbox: 0.1128 d2.loss_iou: 0.1784 d3.loss_cls: 0.1993 d3.loss_bbox: 0.1127 d3.loss_iou: 0.1774 d4.loss_cls: 0.1995 d4.loss_bbox: 0.1110 d4.loss_iou: 0.1754 enc_loss_cls: 0.2307 enc_loss_bbox: 0.1373 enc_loss_iou: 0.2070 dn_loss_cls: 0.0603 dn_loss_bbox: 0.1553 dn_loss_iou: 0.1880 d0.dn_loss_cls: 0.1340 d0.dn_loss_bbox: 0.3152 d0.dn_loss_iou: 0.3280 d1.dn_loss_cls: 0.0863 d1.dn_loss_bbox: 0.1941 d1.dn_loss_iou: 0.2190 d2.dn_loss_cls: 0.0731 d2.dn_loss_bbox: 0.1672 d2.dn_loss_iou: 0.1978 d3.dn_loss_cls: 0.0651 d3.dn_loss_bbox: 0.1581 d3.dn_loss_iou: 0.1909 d4.dn_loss_cls: 0.0609 d4.dn_loss_bbox: 0.1552 d4.dn_loss_iou: 0.1879 d1.loss_lmm_region: 0.1143 loss_lmm_image: 0.8169 2024/11/12 18:07:03 - mmengine - INFO - Iter(train) [ 94700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:51:44 time: 1.9690 data_time: 0.0181 memory: 34965 grad_norm: 24.9810 loss: 8.9398 loss_cls: 0.2660 loss_bbox: 0.1250 loss_iou: 0.2436 d0.loss_cls: 0.3082 d0.loss_bbox: 0.1356 d0.loss_iou: 0.2595 d1.loss_cls: 0.2810 d1.loss_bbox: 0.1304 d1.loss_iou: 0.2524 d2.loss_cls: 0.2778 d2.loss_bbox: 0.1250 d2.loss_iou: 0.2442 d3.loss_cls: 0.2708 d3.loss_bbox: 0.1244 d3.loss_iou: 0.2425 d4.loss_cls: 0.2689 d4.loss_bbox: 0.1241 d4.loss_iou: 0.2430 enc_loss_cls: 0.3029 enc_loss_bbox: 0.1547 enc_loss_iou: 0.2860 dn_loss_cls: 0.0826 dn_loss_bbox: 0.1575 dn_loss_iou: 0.2181 d0.dn_loss_cls: 0.1705 d0.dn_loss_bbox: 0.3108 d0.dn_loss_iou: 0.3719 d1.dn_loss_cls: 0.1134 d1.dn_loss_bbox: 0.1911 d1.dn_loss_iou: 0.2496 d2.dn_loss_cls: 0.0971 d2.dn_loss_bbox: 0.1664 d2.dn_loss_iou: 0.2269 d3.dn_loss_cls: 0.0873 d3.dn_loss_bbox: 0.1594 d3.dn_loss_iou: 0.2204 d4.dn_loss_cls: 0.0833 d4.dn_loss_bbox: 0.1575 d4.dn_loss_iou: 0.2179 d1.loss_lmm_region: 0.1524 loss_lmm_image: 0.8397 2024/11/12 18:10:22 - mmengine - INFO - Iter(train) [ 94800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:48:21 time: 1.9781 data_time: 0.0182 memory: 34547 grad_norm: 28.7721 loss: 8.1712 loss_cls: 0.2129 loss_bbox: 0.1205 loss_iou: 0.1973 d0.loss_cls: 0.2523 d0.loss_bbox: 0.1345 d0.loss_iou: 0.2113 d1.loss_cls: 0.2274 d1.loss_bbox: 0.1240 d1.loss_iou: 0.2011 d2.loss_cls: 0.2209 d2.loss_bbox: 0.1205 d2.loss_iou: 0.1977 d3.loss_cls: 0.2139 d3.loss_bbox: 0.1206 d3.loss_iou: 0.1974 d4.loss_cls: 0.2133 d4.loss_bbox: 0.1204 d4.loss_iou: 0.1970 enc_loss_cls: 0.2570 enc_loss_bbox: 0.1562 enc_loss_iou: 0.2364 dn_loss_cls: 0.0694 dn_loss_bbox: 0.1787 dn_loss_iou: 0.2072 d0.dn_loss_cls: 0.1494 d0.dn_loss_bbox: 0.3434 d0.dn_loss_iou: 0.3602 d1.dn_loss_cls: 0.0966 d1.dn_loss_bbox: 0.2168 d1.dn_loss_iou: 0.2417 d2.dn_loss_cls: 0.0802 d2.dn_loss_bbox: 0.1917 d2.dn_loss_iou: 0.2174 d3.dn_loss_cls: 0.0728 d3.dn_loss_bbox: 0.1811 d3.dn_loss_iou: 0.2095 d4.dn_loss_cls: 0.0690 d4.dn_loss_bbox: 0.1788 d4.dn_loss_iou: 0.2072 d1.loss_lmm_region: 0.1175 loss_lmm_image: 0.8498 2024/11/12 18:13:41 - mmengine - INFO - Iter(train) [ 94900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:44:59 time: 2.0081 data_time: 0.0182 memory: 33139 grad_norm: 40.2729 loss: 9.0179 loss_cls: 0.3140 loss_bbox: 0.1212 loss_iou: 0.2232 d0.loss_cls: 0.3419 d0.loss_bbox: 0.1334 d0.loss_iou: 0.2355 d1.loss_cls: 0.3276 d1.loss_bbox: 
0.1240 d1.loss_iou: 0.2270 d2.loss_cls: 0.3191 d2.loss_bbox: 0.1199 d2.loss_iou: 0.2207 d3.loss_cls: 0.3144 d3.loss_bbox: 0.1211 d3.loss_iou: 0.2225 d4.loss_cls: 0.3152 d4.loss_bbox: 0.1204 d4.loss_iou: 0.2216 enc_loss_cls: 0.3520 enc_loss_bbox: 0.1482 enc_loss_iou: 0.2567 dn_loss_cls: 0.1008 dn_loss_bbox: 0.1567 dn_loss_iou: 0.2063 d0.dn_loss_cls: 0.1723 d0.dn_loss_bbox: 0.3036 d0.dn_loss_iou: 0.3435 d1.dn_loss_cls: 0.1280 d1.dn_loss_bbox: 0.1919 d1.dn_loss_iou: 0.2360 d2.dn_loss_cls: 0.1105 d2.dn_loss_bbox: 0.1693 d2.dn_loss_iou: 0.2160 d3.dn_loss_cls: 0.1064 d3.dn_loss_bbox: 0.1590 d3.dn_loss_iou: 0.2083 d4.dn_loss_cls: 0.1017 d4.dn_loss_bbox: 0.1568 d4.dn_loss_iou: 0.2063 d1.loss_lmm_region: 0.1200 loss_lmm_image: 0.8449 2024/11/12 18:16:59 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 18:16:59 - mmengine - INFO - Iter(train) [ 95000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:41:37 time: 1.9654 data_time: 0.0182 memory: 34097 grad_norm: 27.5339 loss: 8.3403 loss_cls: 0.2397 loss_bbox: 0.1045 loss_iou: 0.1757 d0.loss_cls: 0.2684 d0.loss_bbox: 0.1137 d0.loss_iou: 0.1888 d1.loss_cls: 0.2508 d1.loss_bbox: 0.1083 d1.loss_iou: 0.1795 d2.loss_cls: 0.2470 d2.loss_bbox: 0.1055 d2.loss_iou: 0.1757 d3.loss_cls: 0.2416 d3.loss_bbox: 0.1046 d3.loss_iou: 0.1769 d4.loss_cls: 0.2393 d4.loss_bbox: 0.1053 d4.loss_iou: 0.1761 enc_loss_cls: 0.2741 enc_loss_bbox: 0.1318 enc_loss_iou: 0.2127 dn_loss_cls: 0.1911 dn_loss_bbox: 0.1447 dn_loss_iou: 0.1799 d0.dn_loss_cls: 0.2302 d0.dn_loss_bbox: 0.2834 d0.dn_loss_iou: 0.3121 d1.dn_loss_cls: 0.1994 d1.dn_loss_bbox: 0.1781 d1.dn_loss_iou: 0.2087 d2.dn_loss_cls: 0.1954 d2.dn_loss_bbox: 0.1568 d2.dn_loss_iou: 0.1894 d3.dn_loss_cls: 0.1882 d3.dn_loss_bbox: 0.1485 d3.dn_loss_iou: 0.1825 d4.dn_loss_cls: 0.1949 d4.dn_loss_bbox: 0.1448 d4.dn_loss_iou: 0.1799 d1.loss_lmm_region: 0.1278 loss_lmm_image: 0.8845 2024/11/12 18:20:18 - mmengine - INFO - Iter(train) [ 95100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:38:15 time: 1.9917 data_time: 0.0181 memory: 34482 grad_norm: 27.7795 loss: 8.3406 loss_cls: 0.2743 loss_bbox: 0.1130 loss_iou: 0.2284 d0.loss_cls: 0.3075 d0.loss_bbox: 0.1298 d0.loss_iou: 0.2525 d1.loss_cls: 0.2949 d1.loss_bbox: 0.1189 d1.loss_iou: 0.2390 d2.loss_cls: 0.2850 d2.loss_bbox: 0.1146 d2.loss_iou: 0.2323 d3.loss_cls: 0.2799 d3.loss_bbox: 0.1123 d3.loss_iou: 0.2301 d4.loss_cls: 0.2769 d4.loss_bbox: 0.1116 d4.loss_iou: 0.2277 enc_loss_cls: 0.3137 enc_loss_bbox: 0.1464 enc_loss_iou: 0.2803 dn_loss_cls: 0.0799 dn_loss_bbox: 0.1306 dn_loss_iou: 0.1844 d0.dn_loss_cls: 0.1526 d0.dn_loss_bbox: 0.2551 d0.dn_loss_iou: 0.3128 d1.dn_loss_cls: 0.1104 d1.dn_loss_bbox: 0.1619 d1.dn_loss_iou: 0.2129 d2.dn_loss_cls: 0.0930 d2.dn_loss_bbox: 0.1409 d2.dn_loss_iou: 0.1918 d3.dn_loss_cls: 0.0842 d3.dn_loss_bbox: 0.1323 d3.dn_loss_iou: 0.1864 d4.dn_loss_cls: 0.0803 d4.dn_loss_bbox: 0.1307 d4.dn_loss_iou: 0.1843 d1.loss_lmm_region: 0.1173 loss_lmm_image: 0.8296 2024/11/12 18:23:37 - mmengine - INFO - Iter(train) [ 95200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:34:53 time: 1.9677 data_time: 0.0182 memory: 32186 grad_norm: 29.2803 loss: 8.8197 loss_cls: 0.2822 loss_bbox: 0.1194 loss_iou: 0.2510 d0.loss_cls: 0.3124 d0.loss_bbox: 0.1329 d0.loss_iou: 0.2639 d1.loss_cls: 0.2928 d1.loss_bbox: 0.1253 d1.loss_iou: 0.2572 d2.loss_cls: 0.2865 d2.loss_bbox: 0.1202 d2.loss_iou: 0.2520 d3.loss_cls: 0.2808 d3.loss_bbox: 0.1172 d3.loss_iou: 0.2496 d4.loss_cls: 0.2798 d4.loss_bbox: 0.1187 d4.loss_iou: 0.2524 
enc_loss_cls: 0.3202 enc_loss_bbox: 0.1446 enc_loss_iou: 0.2813 dn_loss_cls: 0.1037 dn_loss_bbox: 0.1431 dn_loss_iou: 0.2038 d0.dn_loss_cls: 0.1783 d0.dn_loss_bbox: 0.2838 d0.dn_loss_iou: 0.3370 d1.dn_loss_cls: 0.1277 d1.dn_loss_bbox: 0.1730 d1.dn_loss_iou: 0.2328 d2.dn_loss_cls: 0.1126 d2.dn_loss_bbox: 0.1523 d2.dn_loss_iou: 0.2126 d3.dn_loss_cls: 0.1084 d3.dn_loss_bbox: 0.1455 d3.dn_loss_iou: 0.2058 d4.dn_loss_cls: 0.1056 d4.dn_loss_bbox: 0.1430 d4.dn_loss_iou: 0.2037 d1.loss_lmm_region: 0.1220 loss_lmm_image: 0.7845 2024/11/12 18:26:56 - mmengine - INFO - Iter(train) [ 95300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:31:31 time: 1.9884 data_time: 0.0183 memory: 35682 grad_norm: 29.7736 loss: 9.0939 loss_cls: 0.2739 loss_bbox: 0.1334 loss_iou: 0.2508 d0.loss_cls: 0.3184 d0.loss_bbox: 0.1424 d0.loss_iou: 0.2663 d1.loss_cls: 0.2942 d1.loss_bbox: 0.1380 d1.loss_iou: 0.2564 d2.loss_cls: 0.2805 d2.loss_bbox: 0.1364 d2.loss_iou: 0.2552 d3.loss_cls: 0.2765 d3.loss_bbox: 0.1350 d3.loss_iou: 0.2527 d4.loss_cls: 0.2735 d4.loss_bbox: 0.1337 d4.loss_iou: 0.2528 enc_loss_cls: 0.3225 enc_loss_bbox: 0.1539 enc_loss_iou: 0.2913 dn_loss_cls: 0.1037 dn_loss_bbox: 0.1544 dn_loss_iou: 0.2131 d0.dn_loss_cls: 0.1825 d0.dn_loss_bbox: 0.2958 d0.dn_loss_iou: 0.3560 d1.dn_loss_cls: 0.1317 d1.dn_loss_bbox: 0.1847 d1.dn_loss_iou: 0.2448 d2.dn_loss_cls: 0.1172 d2.dn_loss_bbox: 0.1638 d2.dn_loss_iou: 0.2230 d3.dn_loss_cls: 0.1080 d3.dn_loss_bbox: 0.1568 d3.dn_loss_iou: 0.2156 d4.dn_loss_cls: 0.1044 d4.dn_loss_bbox: 0.1545 d4.dn_loss_iou: 0.2131 d1.loss_lmm_region: 0.1191 loss_lmm_image: 0.8140 2024/11/12 18:30:15 - mmengine - INFO - Iter(train) [ 95400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:28:09 time: 1.9829 data_time: 0.0182 memory: 34217 grad_norm: 30.8917 loss: 8.9048 loss_cls: 0.3000 loss_bbox: 0.1325 loss_iou: 0.2264 d0.loss_cls: 0.3321 d0.loss_bbox: 0.1444 d0.loss_iou: 0.2364 d1.loss_cls: 0.3194 d1.loss_bbox: 0.1364 d1.loss_iou: 0.2312 d2.loss_cls: 0.3146 d2.loss_bbox: 0.1281 d2.loss_iou: 0.2222 d3.loss_cls: 0.3095 d3.loss_bbox: 0.1282 d3.loss_iou: 0.2235 d4.loss_cls: 0.3025 d4.loss_bbox: 0.1296 d4.loss_iou: 0.2236 enc_loss_cls: 0.3429 enc_loss_bbox: 0.1561 enc_loss_iou: 0.2603 dn_loss_cls: 0.1106 dn_loss_bbox: 0.1404 dn_loss_iou: 0.1891 d0.dn_loss_cls: 0.1831 d0.dn_loss_bbox: 0.2795 d0.dn_loss_iou: 0.3248 d1.dn_loss_cls: 0.1364 d1.dn_loss_bbox: 0.1710 d1.dn_loss_iou: 0.2183 d2.dn_loss_cls: 0.1204 d2.dn_loss_bbox: 0.1509 d2.dn_loss_iou: 0.1978 d3.dn_loss_cls: 0.1135 d3.dn_loss_bbox: 0.1423 d3.dn_loss_iou: 0.1908 d4.dn_loss_cls: 0.1110 d4.dn_loss_bbox: 0.1404 d4.dn_loss_iou: 0.1890 d1.loss_lmm_region: 0.1150 loss_lmm_image: 0.8806 2024/11/12 18:33:37 - mmengine - INFO - Iter(train) [ 95500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:24:48 time: 2.0079 data_time: 0.0184 memory: 33465 grad_norm: 39.4428 loss: 9.8325 loss_cls: 0.3341 loss_bbox: 0.1343 loss_iou: 0.2290 d0.loss_cls: 0.3781 d0.loss_bbox: 0.1487 d0.loss_iou: 0.2418 d1.loss_cls: 0.3464 d1.loss_bbox: 0.1429 d1.loss_iou: 0.2380 d2.loss_cls: 0.3387 d2.loss_bbox: 0.1394 d2.loss_iou: 0.2348 d3.loss_cls: 0.3347 d3.loss_bbox: 0.1345 d3.loss_iou: 0.2348 d4.loss_cls: 0.3331 d4.loss_bbox: 0.1338 d4.loss_iou: 0.2308 enc_loss_cls: 0.3819 enc_loss_bbox: 0.1585 enc_loss_iou: 0.2669 dn_loss_cls: 0.1243 dn_loss_bbox: 0.1817 dn_loss_iou: 0.2261 d0.dn_loss_cls: 0.2158 d0.dn_loss_bbox: 0.3226 d0.dn_loss_iou: 0.3646 d1.dn_loss_cls: 0.1626 d1.dn_loss_bbox: 0.2089 d1.dn_loss_iou: 0.2550 d2.dn_loss_cls: 0.1428 d2.dn_loss_bbox: 
0.1886 d2.dn_loss_iou: 0.2349 d3.dn_loss_cls: 0.1302 d3.dn_loss_bbox: 0.1834 d3.dn_loss_iou: 0.2288 d4.dn_loss_cls: 0.1256 d4.dn_loss_bbox: 0.1818 d4.dn_loss_iou: 0.2262 d1.loss_lmm_region: 0.1470 loss_lmm_image: 0.8666 2024/11/12 18:36:58 - mmengine - INFO - Iter(train) [ 95600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:21:28 time: 2.0161 data_time: 0.0184 memory: 32173 grad_norm: 31.8843 loss: 9.1064 loss_cls: 0.2704 loss_bbox: 0.1399 loss_iou: 0.2500 d0.loss_cls: 0.3321 d0.loss_bbox: 0.1542 d0.loss_iou: 0.2594 d1.loss_cls: 0.2947 d1.loss_bbox: 0.1464 d1.loss_iou: 0.2545 d2.loss_cls: 0.2800 d2.loss_bbox: 0.1417 d2.loss_iou: 0.2475 d3.loss_cls: 0.2749 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2458 d4.loss_cls: 0.2710 d4.loss_bbox: 0.1413 d4.loss_iou: 0.2477 enc_loss_cls: 0.3298 enc_loss_bbox: 0.1684 enc_loss_iou: 0.2828 dn_loss_cls: 0.0838 dn_loss_bbox: 0.1658 dn_loss_iou: 0.2116 d0.dn_loss_cls: 0.1604 d0.dn_loss_bbox: 0.2915 d0.dn_loss_iou: 0.3521 d1.dn_loss_cls: 0.1170 d1.dn_loss_bbox: 0.1943 d1.dn_loss_iou: 0.2424 d2.dn_loss_cls: 0.0960 d2.dn_loss_bbox: 0.1739 d2.dn_loss_iou: 0.2216 d3.dn_loss_cls: 0.0895 d3.dn_loss_bbox: 0.1671 d3.dn_loss_iou: 0.2145 d4.dn_loss_cls: 0.0843 d4.dn_loss_bbox: 0.1658 d4.dn_loss_iou: 0.2117 d1.loss_lmm_region: 0.1304 loss_lmm_image: 0.8595 2024/11/12 18:40:17 - mmengine - INFO - Iter(train) [ 95700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:18:06 time: 1.9883 data_time: 0.0185 memory: 35858 grad_norm: 38.8051 loss: 9.0518 loss_cls: 0.2734 loss_bbox: 0.1344 loss_iou: 0.2276 d0.loss_cls: 0.3096 d0.loss_bbox: 0.1513 d0.loss_iou: 0.2390 d1.loss_cls: 0.2908 d1.loss_bbox: 0.1415 d1.loss_iou: 0.2319 d2.loss_cls: 0.2833 d2.loss_bbox: 0.1377 d2.loss_iou: 0.2289 d3.loss_cls: 0.2797 d3.loss_bbox: 0.1337 d3.loss_iou: 0.2248 d4.loss_cls: 0.2765 d4.loss_bbox: 0.1326 d4.loss_iou: 0.2263 enc_loss_cls: 0.3061 enc_loss_bbox: 0.1653 enc_loss_iou: 0.2618 dn_loss_cls: 0.1022 dn_loss_bbox: 0.1680 dn_loss_iou: 0.2114 d0.dn_loss_cls: 0.1790 d0.dn_loss_bbox: 0.3217 d0.dn_loss_iou: 0.3586 d1.dn_loss_cls: 0.1338 d1.dn_loss_bbox: 0.2030 d1.dn_loss_iou: 0.2428 d2.dn_loss_cls: 0.1146 d2.dn_loss_bbox: 0.1782 d2.dn_loss_iou: 0.2206 d3.dn_loss_cls: 0.1070 d3.dn_loss_bbox: 0.1704 d3.dn_loss_iou: 0.2140 d4.dn_loss_cls: 0.1025 d4.dn_loss_bbox: 0.1680 d4.dn_loss_iou: 0.2113 d1.loss_lmm_region: 0.1337 loss_lmm_image: 0.8551 2024/11/12 18:43:37 - mmengine - INFO - Iter(train) [ 95800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:14:45 time: 2.0076 data_time: 0.0182 memory: 34574 grad_norm: 27.0400 loss: 10.3644 loss_cls: 0.3078 loss_bbox: 0.1632 loss_iou: 0.3005 d0.loss_cls: 0.3501 d0.loss_bbox: 0.1737 d0.loss_iou: 0.3108 d1.loss_cls: 0.3301 d1.loss_bbox: 0.1675 d1.loss_iou: 0.3054 d2.loss_cls: 0.3198 d2.loss_bbox: 0.1665 d2.loss_iou: 0.2985 d3.loss_cls: 0.3144 d3.loss_bbox: 0.1620 d3.loss_iou: 0.2997 d4.loss_cls: 0.3102 d4.loss_bbox: 0.1648 d4.loss_iou: 0.2992 enc_loss_cls: 0.3506 enc_loss_bbox: 0.1943 enc_loss_iou: 0.3346 dn_loss_cls: 0.1158 dn_loss_bbox: 0.1886 dn_loss_iou: 0.2455 d0.dn_loss_cls: 0.1863 d0.dn_loss_bbox: 0.3291 d0.dn_loss_iou: 0.3879 d1.dn_loss_cls: 0.1436 d1.dn_loss_bbox: 0.2173 d1.dn_loss_iou: 0.2745 d2.dn_loss_cls: 0.1291 d2.dn_loss_bbox: 0.1952 d2.dn_loss_iou: 0.2541 d3.dn_loss_cls: 0.1212 d3.dn_loss_bbox: 0.1904 d3.dn_loss_iou: 0.2482 d4.dn_loss_cls: 0.1162 d4.dn_loss_bbox: 0.1886 d4.dn_loss_iou: 0.2454 d1.loss_lmm_region: 0.1283 loss_lmm_image: 0.8355 2024/11/12 18:46:55 - mmengine - INFO - Iter(train) [ 95900/150000] base_lr: 1.0000e-04 
lr: 1.0000e-04 eta: 1 day, 6:11:22 time: 1.9878 data_time: 0.0182 memory: 33962 grad_norm: 34.7526 loss: 8.3642 loss_cls: 0.2627 loss_bbox: 0.1122 loss_iou: 0.2162 d0.loss_cls: 0.3028 d0.loss_bbox: 0.1216 d0.loss_iou: 0.2242 d1.loss_cls: 0.2774 d1.loss_bbox: 0.1169 d1.loss_iou: 0.2215 d2.loss_cls: 0.2687 d2.loss_bbox: 0.1151 d2.loss_iou: 0.2199 d3.loss_cls: 0.2665 d3.loss_bbox: 0.1131 d3.loss_iou: 0.2174 d4.loss_cls: 0.2625 d4.loss_bbox: 0.1132 d4.loss_iou: 0.2158 enc_loss_cls: 0.3076 enc_loss_bbox: 0.1386 enc_loss_iou: 0.2466 dn_loss_cls: 0.1072 dn_loss_bbox: 0.1460 dn_loss_iou: 0.1842 d0.dn_loss_cls: 0.1812 d0.dn_loss_bbox: 0.2799 d0.dn_loss_iou: 0.3124 d1.dn_loss_cls: 0.1387 d1.dn_loss_bbox: 0.1760 d1.dn_loss_iou: 0.2124 d2.dn_loss_cls: 0.1206 d2.dn_loss_bbox: 0.1558 d2.dn_loss_iou: 0.1940 d3.dn_loss_cls: 0.1115 d3.dn_loss_bbox: 0.1480 d3.dn_loss_iou: 0.1861 d4.dn_loss_cls: 0.1079 d4.dn_loss_bbox: 0.1459 d4.dn_loss_iou: 0.1841 d1.loss_lmm_region: 0.1141 loss_lmm_image: 0.8177 2024/11/12 18:50:13 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 18:50:13 - mmengine - INFO - Iter(train) [ 96000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:07:59 time: 1.9890 data_time: 0.0183 memory: 34850 grad_norm: 30.9378 loss: 9.6685 loss_cls: 0.2900 loss_bbox: 0.1402 loss_iou: 0.2254 d0.loss_cls: 0.3216 d0.loss_bbox: 0.1618 d0.loss_iou: 0.2429 d1.loss_cls: 0.3034 d1.loss_bbox: 0.1485 d1.loss_iou: 0.2313 d2.loss_cls: 0.3005 d2.loss_bbox: 0.1405 d2.loss_iou: 0.2231 d3.loss_cls: 0.2955 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2222 d4.loss_cls: 0.2902 d4.loss_bbox: 0.1409 d4.loss_iou: 0.2259 enc_loss_cls: 0.3259 enc_loss_bbox: 0.1654 enc_loss_iou: 0.2564 dn_loss_cls: 0.1355 dn_loss_bbox: 0.2083 dn_loss_iou: 0.2227 d0.dn_loss_cls: 0.2166 d0.dn_loss_bbox: 0.3684 d0.dn_loss_iou: 0.3694 d1.dn_loss_cls: 0.1671 d1.dn_loss_bbox: 0.2479 d1.dn_loss_iou: 0.2594 d2.dn_loss_cls: 0.1494 d2.dn_loss_bbox: 0.2224 d2.dn_loss_iou: 0.2352 d3.dn_loss_cls: 0.1411 d3.dn_loss_bbox: 0.2118 d3.dn_loss_iou: 0.2259 d4.dn_loss_cls: 0.1373 d4.dn_loss_bbox: 0.2084 d4.dn_loss_iou: 0.2229 d1.loss_lmm_region: 0.1499 loss_lmm_image: 0.7815 2024/11/12 18:53:31 - mmengine - INFO - Iter(train) [ 96100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:04:37 time: 1.9748 data_time: 0.0183 memory: 33364 grad_norm: 29.8811 loss: 9.6139 loss_cls: 0.3066 loss_bbox: 0.1476 loss_iou: 0.2615 d0.loss_cls: 0.3526 d0.loss_bbox: 0.1564 d0.loss_iou: 0.2717 d1.loss_cls: 0.3270 d1.loss_bbox: 0.1514 d1.loss_iou: 0.2650 d2.loss_cls: 0.3195 d2.loss_bbox: 0.1470 d2.loss_iou: 0.2602 d3.loss_cls: 0.3121 d3.loss_bbox: 0.1468 d3.loss_iou: 0.2611 d4.loss_cls: 0.3055 d4.loss_bbox: 0.1482 d4.loss_iou: 0.2618 enc_loss_cls: 0.3589 enc_loss_bbox: 0.1741 enc_loss_iou: 0.2990 dn_loss_cls: 0.0975 dn_loss_bbox: 0.1878 dn_loss_iou: 0.2181 d0.dn_loss_cls: 0.1775 d0.dn_loss_bbox: 0.3202 d0.dn_loss_iou: 0.3491 d1.dn_loss_cls: 0.1297 d1.dn_loss_bbox: 0.2160 d1.dn_loss_iou: 0.2454 d2.dn_loss_cls: 0.1108 d2.dn_loss_bbox: 0.1956 d2.dn_loss_iou: 0.2261 d3.dn_loss_cls: 0.1028 d3.dn_loss_bbox: 0.1895 d3.dn_loss_iou: 0.2199 d4.dn_loss_cls: 0.0981 d4.dn_loss_bbox: 0.1878 d4.dn_loss_iou: 0.2180 d1.loss_lmm_region: 0.1358 loss_lmm_image: 0.7542 2024/11/12 18:56:50 - mmengine - INFO - Iter(train) [ 96200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 6:01:15 time: 2.0058 data_time: 0.0182 memory: 34308 grad_norm: 29.1175 loss: 9.4725 loss_cls: 0.2645 loss_bbox: 0.1460 loss_iou: 0.2496 d0.loss_cls: 0.3111 d0.loss_bbox: 0.1521 d0.loss_iou: 
0.2571 d1.loss_cls: 0.2867 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2498 d2.loss_cls: 0.2771 d2.loss_bbox: 0.1492 d2.loss_iou: 0.2494 d3.loss_cls: 0.2716 d3.loss_bbox: 0.1452 d3.loss_iou: 0.2483 d4.loss_cls: 0.2678 d4.loss_bbox: 0.1441 d4.loss_iou: 0.2485 enc_loss_cls: 0.3131 enc_loss_bbox: 0.1713 enc_loss_iou: 0.2807 dn_loss_cls: 0.0836 dn_loss_bbox: 0.1929 dn_loss_iou: 0.2397 d0.dn_loss_cls: 0.1784 d0.dn_loss_bbox: 0.3402 d0.dn_loss_iou: 0.3825 d1.dn_loss_cls: 0.1210 d1.dn_loss_bbox: 0.2237 d1.dn_loss_iou: 0.2693 d2.dn_loss_cls: 0.0977 d2.dn_loss_bbox: 0.2044 d2.dn_loss_iou: 0.2501 d3.dn_loss_cls: 0.0902 d3.dn_loss_bbox: 0.1947 d3.dn_loss_iou: 0.2425 d4.dn_loss_cls: 0.0846 d4.dn_loss_bbox: 0.1931 d4.dn_loss_iou: 0.2396 d1.loss_lmm_region: 0.1512 loss_lmm_image: 0.8605 2024/11/12 19:00:12 - mmengine - INFO - Iter(train) [ 96300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:57:55 time: 2.0375 data_time: 0.0181 memory: 34506 grad_norm: 34.4866 loss: 9.6611 loss_cls: 0.2883 loss_bbox: 0.1433 loss_iou: 0.2679 d0.loss_cls: 0.3237 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2852 d1.loss_cls: 0.3054 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2745 d2.loss_cls: 0.2985 d2.loss_bbox: 0.1442 d2.loss_iou: 0.2704 d3.loss_cls: 0.2909 d3.loss_bbox: 0.1443 d3.loss_iou: 0.2687 d4.loss_cls: 0.2889 d4.loss_bbox: 0.1437 d4.loss_iou: 0.2681 enc_loss_cls: 0.3263 enc_loss_bbox: 0.1807 enc_loss_iou: 0.3126 dn_loss_cls: 0.0897 dn_loss_bbox: 0.1776 dn_loss_iou: 0.2317 d0.dn_loss_cls: 0.1735 d0.dn_loss_bbox: 0.3259 d0.dn_loss_iou: 0.3801 d1.dn_loss_cls: 0.1206 d1.dn_loss_bbox: 0.2085 d1.dn_loss_iou: 0.2622 d2.dn_loss_cls: 0.1029 d2.dn_loss_bbox: 0.1885 d2.dn_loss_iou: 0.2419 d3.dn_loss_cls: 0.0950 d3.dn_loss_bbox: 0.1803 d3.dn_loss_iou: 0.2345 d4.dn_loss_cls: 0.0908 d4.dn_loss_bbox: 0.1776 d4.dn_loss_iou: 0.2317 d1.loss_lmm_region: 0.1242 loss_lmm_image: 0.8903 2024/11/12 19:03:31 - mmengine - INFO - Iter(train) [ 96400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:54:33 time: 1.9971 data_time: 0.0182 memory: 34745 grad_norm: 29.4604 loss: 8.8778 loss_cls: 0.2696 loss_bbox: 0.1402 loss_iou: 0.2679 d0.loss_cls: 0.3117 d0.loss_bbox: 0.1431 d0.loss_iou: 0.2782 d1.loss_cls: 0.2896 d1.loss_bbox: 0.1402 d1.loss_iou: 0.2726 d2.loss_cls: 0.2760 d2.loss_bbox: 0.1419 d2.loss_iou: 0.2698 d3.loss_cls: 0.2692 d3.loss_bbox: 0.1427 d3.loss_iou: 0.2707 d4.loss_cls: 0.2686 d4.loss_bbox: 0.1393 d4.loss_iou: 0.2672 enc_loss_cls: 0.3157 enc_loss_bbox: 0.1645 enc_loss_iou: 0.3041 dn_loss_cls: 0.0904 dn_loss_bbox: 0.1421 dn_loss_iou: 0.1960 d0.dn_loss_cls: 0.1583 d0.dn_loss_bbox: 0.2624 d0.dn_loss_iou: 0.3278 d1.dn_loss_cls: 0.1122 d1.dn_loss_bbox: 0.1672 d1.dn_loss_iou: 0.2245 d2.dn_loss_cls: 0.0965 d2.dn_loss_bbox: 0.1501 d2.dn_loss_iou: 0.2049 d3.dn_loss_cls: 0.0931 d3.dn_loss_bbox: 0.1438 d3.dn_loss_iou: 0.1983 d4.dn_loss_cls: 0.0903 d4.dn_loss_bbox: 0.1421 d4.dn_loss_iou: 0.1960 d1.loss_lmm_region: 0.1099 loss_lmm_image: 0.8288 2024/11/12 19:06:50 - mmengine - INFO - Iter(train) [ 96500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:51:11 time: 2.0039 data_time: 0.0182 memory: 33071 grad_norm: 32.5247 loss: 9.2897 loss_cls: 0.3030 loss_bbox: 0.1401 loss_iou: 0.2395 d0.loss_cls: 0.3388 d0.loss_bbox: 0.1514 d0.loss_iou: 0.2554 d1.loss_cls: 0.3181 d1.loss_bbox: 0.1473 d1.loss_iou: 0.2475 d2.loss_cls: 0.3132 d2.loss_bbox: 0.1378 d2.loss_iou: 0.2403 d3.loss_cls: 0.3068 d3.loss_bbox: 0.1395 d3.loss_iou: 0.2387 d4.loss_cls: 0.3000 d4.loss_bbox: 0.1411 d4.loss_iou: 0.2407 enc_loss_cls: 0.3447 enc_loss_bbox: 0.1641 enc_loss_iou: 
0.2767 dn_loss_cls: 0.1141 dn_loss_bbox: 0.1592 dn_loss_iou: 0.2058 d0.dn_loss_cls: 0.1952 d0.dn_loss_bbox: 0.2883 d0.dn_loss_iou: 0.3432 d1.dn_loss_cls: 0.1478 d1.dn_loss_bbox: 0.1863 d1.dn_loss_iou: 0.2358 d2.dn_loss_cls: 0.1261 d2.dn_loss_bbox: 0.1680 d2.dn_loss_iou: 0.2152 d3.dn_loss_cls: 0.1187 d3.dn_loss_bbox: 0.1601 d3.dn_loss_iou: 0.2078 d4.dn_loss_cls: 0.1145 d4.dn_loss_bbox: 0.1591 d4.dn_loss_iou: 0.2058 d1.loss_lmm_region: 0.1334 loss_lmm_image: 0.8209 2024/11/12 19:10:11 - mmengine - INFO - Iter(train) [ 96600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:47:50 time: 2.0002 data_time: 0.0183 memory: 34761 grad_norm: 22.7533 loss: 7.2646 loss_cls: 0.2267 loss_bbox: 0.0954 loss_iou: 0.1749 d0.loss_cls: 0.2584 d0.loss_bbox: 0.1119 d0.loss_iou: 0.1956 d1.loss_cls: 0.2378 d1.loss_bbox: 0.1051 d1.loss_iou: 0.1877 d2.loss_cls: 0.2369 d2.loss_bbox: 0.0955 d2.loss_iou: 0.1790 d3.loss_cls: 0.2313 d3.loss_bbox: 0.0947 d3.loss_iou: 0.1759 d4.loss_cls: 0.2274 d4.loss_bbox: 0.0954 d4.loss_iou: 0.1747 enc_loss_cls: 0.2680 enc_loss_bbox: 0.1275 enc_loss_iou: 0.2207 dn_loss_cls: 0.0531 dn_loss_bbox: 0.1273 dn_loss_iou: 0.1731 d0.dn_loss_cls: 0.1259 d0.dn_loss_bbox: 0.2646 d0.dn_loss_iou: 0.3090 d1.dn_loss_cls: 0.0775 d1.dn_loss_bbox: 0.1593 d1.dn_loss_iou: 0.2020 d2.dn_loss_cls: 0.0607 d2.dn_loss_bbox: 0.1372 d2.dn_loss_iou: 0.1823 d3.dn_loss_cls: 0.0561 d3.dn_loss_bbox: 0.1294 d3.dn_loss_iou: 0.1757 d4.dn_loss_cls: 0.0530 d4.dn_loss_bbox: 0.1273 d4.dn_loss_iou: 0.1730 d1.loss_lmm_region: 0.1009 loss_lmm_image: 0.8566 2024/11/12 19:13:30 - mmengine - INFO - Iter(train) [ 96700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:44:28 time: 2.0014 data_time: 0.0182 memory: 34892 grad_norm: 28.1455 loss: 9.5058 loss_cls: 0.3204 loss_bbox: 0.1320 loss_iou: 0.2660 d0.loss_cls: 0.3737 d0.loss_bbox: 0.1467 d0.loss_iou: 0.2876 d1.loss_cls: 0.3447 d1.loss_bbox: 0.1391 d1.loss_iou: 0.2772 d2.loss_cls: 0.3327 d2.loss_bbox: 0.1349 d2.loss_iou: 0.2693 d3.loss_cls: 0.3264 d3.loss_bbox: 0.1302 d3.loss_iou: 0.2648 d4.loss_cls: 0.3207 d4.loss_bbox: 0.1325 d4.loss_iou: 0.2663 enc_loss_cls: 0.3754 enc_loss_bbox: 0.1606 enc_loss_iou: 0.3161 dn_loss_cls: 0.0884 dn_loss_bbox: 0.1587 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1716 d0.dn_loss_bbox: 0.3003 d0.dn_loss_iou: 0.3463 d1.dn_loss_cls: 0.1211 d1.dn_loss_bbox: 0.1913 d1.dn_loss_iou: 0.2346 d2.dn_loss_cls: 0.1007 d2.dn_loss_bbox: 0.1677 d2.dn_loss_iou: 0.2137 d3.dn_loss_cls: 0.0919 d3.dn_loss_bbox: 0.1617 d3.dn_loss_iou: 0.2070 d4.dn_loss_cls: 0.0876 d4.dn_loss_bbox: 0.1587 d4.dn_loss_iou: 0.2034 d1.loss_lmm_region: 0.1073 loss_lmm_image: 0.8732 2024/11/12 19:16:51 - mmengine - INFO - Iter(train) [ 96800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:41:07 time: 1.9841 data_time: 0.0183 memory: 33994 grad_norm: 23.4241 loss: 9.1350 loss_cls: 0.2806 loss_bbox: 0.1314 loss_iou: 0.2306 d0.loss_cls: 0.3233 d0.loss_bbox: 0.1453 d0.loss_iou: 0.2462 d1.loss_cls: 0.2982 d1.loss_bbox: 0.1374 d1.loss_iou: 0.2382 d2.loss_cls: 0.2925 d2.loss_bbox: 0.1313 d2.loss_iou: 0.2331 d3.loss_cls: 0.2881 d3.loss_bbox: 0.1316 d3.loss_iou: 0.2309 d4.loss_cls: 0.2816 d4.loss_bbox: 0.1304 d4.loss_iou: 0.2294 enc_loss_cls: 0.3347 enc_loss_bbox: 0.1543 enc_loss_iou: 0.2630 dn_loss_cls: 0.0919 dn_loss_bbox: 0.1760 dn_loss_iou: 0.2161 d0.dn_loss_cls: 0.1761 d0.dn_loss_bbox: 0.3271 d0.dn_loss_iou: 0.3611 d1.dn_loss_cls: 0.1244 d1.dn_loss_bbox: 0.2108 d1.dn_loss_iou: 0.2509 d2.dn_loss_cls: 0.1045 d2.dn_loss_bbox: 0.1871 d2.dn_loss_iou: 0.2276 d3.dn_loss_cls: 0.0985 
d3.dn_loss_bbox: 0.1785 d3.dn_loss_iou: 0.2192 d4.dn_loss_cls: 0.0937 d4.dn_loss_bbox: 0.1761 d4.dn_loss_iou: 0.2162 d1.loss_lmm_region: 0.1224 loss_lmm_image: 0.8449 2024/11/12 19:20:12 - mmengine - INFO - Iter(train) [ 96900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:37:46 time: 1.9893 data_time: 0.0182 memory: 34541 grad_norm: 26.5979 loss: 9.9127 loss_cls: 0.2698 loss_bbox: 0.1754 loss_iou: 0.2802 d0.loss_cls: 0.3030 d0.loss_bbox: 0.1881 d0.loss_iou: 0.2888 d1.loss_cls: 0.2870 d1.loss_bbox: 0.1750 d1.loss_iou: 0.2801 d2.loss_cls: 0.2837 d2.loss_bbox: 0.1713 d2.loss_iou: 0.2776 d3.loss_cls: 0.2727 d3.loss_bbox: 0.1752 d3.loss_iou: 0.2808 d4.loss_cls: 0.2696 d4.loss_bbox: 0.1761 d4.loss_iou: 0.2813 enc_loss_cls: 0.3119 enc_loss_bbox: 0.1988 enc_loss_iou: 0.3073 dn_loss_cls: 0.0793 dn_loss_bbox: 0.2079 dn_loss_iou: 0.2547 d0.dn_loss_cls: 0.1625 d0.dn_loss_bbox: 0.3552 d0.dn_loss_iou: 0.3914 d1.dn_loss_cls: 0.1123 d1.dn_loss_bbox: 0.2375 d1.dn_loss_iou: 0.2802 d2.dn_loss_cls: 0.0948 d2.dn_loss_bbox: 0.2177 d2.dn_loss_iou: 0.2627 d3.dn_loss_cls: 0.0852 d3.dn_loss_bbox: 0.2090 d3.dn_loss_iou: 0.2560 d4.dn_loss_cls: 0.0802 d4.dn_loss_bbox: 0.2078 d4.dn_loss_iou: 0.2545 d1.loss_lmm_region: 0.1283 loss_lmm_image: 0.7818 2024/11/12 19:23:32 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 19:23:32 - mmengine - INFO - Iter(train) [ 97000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:34:25 time: 1.9969 data_time: 0.0184 memory: 33293 grad_norm: 28.4351 loss: 9.3704 loss_cls: 0.3146 loss_bbox: 0.1254 loss_iou: 0.2250 d0.loss_cls: 0.3460 d0.loss_bbox: 0.1441 d0.loss_iou: 0.2427 d1.loss_cls: 0.3295 d1.loss_bbox: 0.1366 d1.loss_iou: 0.2327 d2.loss_cls: 0.3206 d2.loss_bbox: 0.1247 d2.loss_iou: 0.2267 d3.loss_cls: 0.3146 d3.loss_bbox: 0.1254 d3.loss_iou: 0.2259 d4.loss_cls: 0.3120 d4.loss_bbox: 0.1269 d4.loss_iou: 0.2276 enc_loss_cls: 0.3544 enc_loss_bbox: 0.1618 enc_loss_iou: 0.2648 dn_loss_cls: 0.1359 dn_loss_bbox: 0.1714 dn_loss_iou: 0.1915 d0.dn_loss_cls: 0.2045 d0.dn_loss_bbox: 0.3210 d0.dn_loss_iou: 0.3283 d1.dn_loss_cls: 0.1624 d1.dn_loss_bbox: 0.2027 d1.dn_loss_iou: 0.2215 d2.dn_loss_cls: 0.1454 d2.dn_loss_bbox: 0.1795 d2.dn_loss_iou: 0.2000 d3.dn_loss_cls: 0.1369 d3.dn_loss_bbox: 0.1722 d3.dn_loss_iou: 0.1933 d4.dn_loss_cls: 0.1348 d4.dn_loss_bbox: 0.1712 d4.dn_loss_iou: 0.1915 d1.loss_lmm_region: 0.1384 loss_lmm_image: 0.8858 2024/11/12 19:26:50 - mmengine - INFO - Iter(train) [ 97100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:31:02 time: 1.9807 data_time: 0.0182 memory: 31386 grad_norm: nan loss: 9.0293 loss_cls: 0.2729 loss_bbox: 0.1351 loss_iou: 0.2243 d0.loss_cls: 0.3080 d0.loss_bbox: 0.1424 d0.loss_iou: 0.2350 d1.loss_cls: 0.2906 d1.loss_bbox: 0.1356 d1.loss_iou: 0.2256 d2.loss_cls: 0.2841 d2.loss_bbox: 0.1323 d2.loss_iou: 0.2226 d3.loss_cls: 0.2786 d3.loss_bbox: 0.1323 d3.loss_iou: 0.2217 d4.loss_cls: 0.2766 d4.loss_bbox: 0.1333 d4.loss_iou: 0.2219 enc_loss_cls: 0.3139 enc_loss_bbox: 0.1536 enc_loss_iou: 0.2568 dn_loss_cls: 0.1175 dn_loss_bbox: 0.1716 dn_loss_iou: 0.2069 d0.dn_loss_cls: 0.1921 d0.dn_loss_bbox: 0.3132 d0.dn_loss_iou: 0.3484 d1.dn_loss_cls: 0.1485 d1.dn_loss_bbox: 0.2007 d1.dn_loss_iou: 0.2351 d2.dn_loss_cls: 0.1261 d2.dn_loss_bbox: 0.1794 d2.dn_loss_iou: 0.2149 d3.dn_loss_cls: 0.1220 d3.dn_loss_bbox: 0.1740 d3.dn_loss_iou: 0.2095 d4.dn_loss_cls: 0.1192 d4.dn_loss_bbox: 0.1716 d4.dn_loss_iou: 0.2069 d1.loss_lmm_region: 0.1320 loss_lmm_image: 0.8423 2024/11/12 19:30:11 - mmengine - INFO - Iter(train) [ 
97200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:27:42 time: 2.0055 data_time: 0.0183 memory: 34854 grad_norm: 28.2741 loss: 8.6711 loss_cls: 0.2565 loss_bbox: 0.1321 loss_iou: 0.2350 d0.loss_cls: 0.2900 d0.loss_bbox: 0.1432 d0.loss_iou: 0.2477 d1.loss_cls: 0.2708 d1.loss_bbox: 0.1401 d1.loss_iou: 0.2449 d2.loss_cls: 0.2679 d2.loss_bbox: 0.1304 d2.loss_iou: 0.2347 d3.loss_cls: 0.2604 d3.loss_bbox: 0.1333 d3.loss_iou: 0.2362 d4.loss_cls: 0.2573 d4.loss_bbox: 0.1312 d4.loss_iou: 0.2347 enc_loss_cls: 0.2981 enc_loss_bbox: 0.1544 enc_loss_iou: 0.2671 dn_loss_cls: 0.0865 dn_loss_bbox: 0.1512 dn_loss_iou: 0.1980 d0.dn_loss_cls: 0.1630 d0.dn_loss_bbox: 0.2910 d0.dn_loss_iou: 0.3320 d1.dn_loss_cls: 0.1204 d1.dn_loss_bbox: 0.1815 d1.dn_loss_iou: 0.2295 d2.dn_loss_cls: 0.0993 d2.dn_loss_bbox: 0.1609 d2.dn_loss_iou: 0.2084 d3.dn_loss_cls: 0.0939 d3.dn_loss_bbox: 0.1530 d3.dn_loss_iou: 0.2002 d4.dn_loss_cls: 0.0901 d4.dn_loss_bbox: 0.1512 d4.dn_loss_iou: 0.1979 d1.loss_lmm_region: 0.1008 loss_lmm_image: 0.8960 2024/11/12 19:33:31 - mmengine - INFO - Iter(train) [ 97300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:24:20 time: 2.0186 data_time: 0.0181 memory: 35029 grad_norm: 30.6902 loss: 9.5696 loss_cls: 0.3275 loss_bbox: 0.1323 loss_iou: 0.2636 d0.loss_cls: 0.3716 d0.loss_bbox: 0.1423 d0.loss_iou: 0.2782 d1.loss_cls: 0.3438 d1.loss_bbox: 0.1370 d1.loss_iou: 0.2691 d2.loss_cls: 0.3322 d2.loss_bbox: 0.1347 d2.loss_iou: 0.2650 d3.loss_cls: 0.3290 d3.loss_bbox: 0.1329 d3.loss_iou: 0.2631 d4.loss_cls: 0.3299 d4.loss_bbox: 0.1331 d4.loss_iou: 0.2643 enc_loss_cls: 0.3754 enc_loss_bbox: 0.1617 enc_loss_iou: 0.3040 dn_loss_cls: 0.1162 dn_loss_bbox: 0.1422 dn_loss_iou: 0.2100 d0.dn_loss_cls: 0.1855 d0.dn_loss_bbox: 0.2726 d0.dn_loss_iou: 0.3428 d1.dn_loss_cls: 0.1433 d1.dn_loss_bbox: 0.1731 d1.dn_loss_iou: 0.2398 d2.dn_loss_cls: 0.1261 d2.dn_loss_bbox: 0.1547 d2.dn_loss_iou: 0.2204 d3.dn_loss_cls: 0.1181 d3.dn_loss_bbox: 0.1456 d3.dn_loss_iou: 0.2133 d4.dn_loss_cls: 0.1154 d4.dn_loss_bbox: 0.1424 d4.dn_loss_iou: 0.2100 d1.loss_lmm_region: 0.1159 loss_lmm_image: 0.8914 2024/11/12 19:36:51 - mmengine - INFO - Iter(train) [ 97400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:20:59 time: 1.9992 data_time: 0.0182 memory: 33068 grad_norm: 34.6426 loss: 8.4734 loss_cls: 0.2951 loss_bbox: 0.1018 loss_iou: 0.2135 d0.loss_cls: 0.3277 d0.loss_bbox: 0.1232 d0.loss_iou: 0.2325 d1.loss_cls: 0.3114 d1.loss_bbox: 0.1079 d1.loss_iou: 0.2187 d2.loss_cls: 0.3039 d2.loss_bbox: 0.1042 d2.loss_iou: 0.2152 d3.loss_cls: 0.3026 d3.loss_bbox: 0.1041 d3.loss_iou: 0.2162 d4.loss_cls: 0.2956 d4.loss_bbox: 0.1034 d4.loss_iou: 0.2144 enc_loss_cls: 0.3373 enc_loss_bbox: 0.1299 enc_loss_iou: 0.2504 dn_loss_cls: 0.1170 dn_loss_bbox: 0.1266 dn_loss_iou: 0.1760 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.2718 d0.dn_loss_iou: 0.3113 d1.dn_loss_cls: 0.1392 d1.dn_loss_bbox: 0.1596 d1.dn_loss_iou: 0.2055 d2.dn_loss_cls: 0.1215 d2.dn_loss_bbox: 0.1355 d2.dn_loss_iou: 0.1842 d3.dn_loss_cls: 0.1191 d3.dn_loss_bbox: 0.1289 d3.dn_loss_iou: 0.1783 d4.dn_loss_cls: 0.1156 d4.dn_loss_bbox: 0.1266 d4.dn_loss_iou: 0.1760 d1.loss_lmm_region: 0.1429 loss_lmm_image: 0.8459 2024/11/12 19:40:09 - mmengine - INFO - Iter(train) [ 97500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:17:36 time: 1.9574 data_time: 0.0184 memory: 33799 grad_norm: 28.9108 loss: 8.7611 loss_cls: 0.2668 loss_bbox: 0.1208 loss_iou: 0.2155 d0.loss_cls: 0.3064 d0.loss_bbox: 0.1303 d0.loss_iou: 0.2261 d1.loss_cls: 0.2812 d1.loss_bbox: 0.1277 
d1.loss_iou: 0.2234 d2.loss_cls: 0.2737 d2.loss_bbox: 0.1212 d2.loss_iou: 0.2206 d3.loss_cls: 0.2751 d3.loss_bbox: 0.1185 d3.loss_iou: 0.2169 d4.loss_cls: 0.2673 d4.loss_bbox: 0.1211 d4.loss_iou: 0.2162 enc_loss_cls: 0.3146 enc_loss_bbox: 0.1410 enc_loss_iou: 0.2457 dn_loss_cls: 0.1206 dn_loss_bbox: 0.1473 dn_loss_iou: 0.2044 d0.dn_loss_cls: 0.1881 d0.dn_loss_bbox: 0.2938 d0.dn_loss_iou: 0.3431 d1.dn_loss_cls: 0.1454 d1.dn_loss_bbox: 0.1808 d1.dn_loss_iou: 0.2337 d2.dn_loss_cls: 0.1284 d2.dn_loss_bbox: 0.1565 d2.dn_loss_iou: 0.2130 d3.dn_loss_cls: 0.1245 d3.dn_loss_bbox: 0.1504 d3.dn_loss_iou: 0.2066 d4.dn_loss_cls: 0.1210 d4.dn_loss_bbox: 0.1473 d4.dn_loss_iou: 0.2042 d1.loss_lmm_region: 0.1569 loss_lmm_image: 0.8649 2024/11/12 19:43:30 - mmengine - INFO - Iter(train) [ 97600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:14:16 time: 1.9915 data_time: 0.0184 memory: 35353 grad_norm: 29.6285 loss: 9.6391 loss_cls: 0.3154 loss_bbox: 0.1367 loss_iou: 0.2316 d0.loss_cls: 0.3559 d0.loss_bbox: 0.1433 d0.loss_iou: 0.2445 d1.loss_cls: 0.3250 d1.loss_bbox: 0.1457 d1.loss_iou: 0.2429 d2.loss_cls: 0.3219 d2.loss_bbox: 0.1387 d2.loss_iou: 0.2335 d3.loss_cls: 0.3169 d3.loss_bbox: 0.1372 d3.loss_iou: 0.2344 d4.loss_cls: 0.3134 d4.loss_bbox: 0.1361 d4.loss_iou: 0.2312 enc_loss_cls: 0.3585 enc_loss_bbox: 0.1604 enc_loss_iou: 0.2711 dn_loss_cls: 0.1978 dn_loss_bbox: 0.1568 dn_loss_iou: 0.2015 d0.dn_loss_cls: 0.2337 d0.dn_loss_bbox: 0.2870 d0.dn_loss_iou: 0.3384 d1.dn_loss_cls: 0.1895 d1.dn_loss_bbox: 0.1861 d1.dn_loss_iou: 0.2314 d2.dn_loss_cls: 0.1833 d2.dn_loss_bbox: 0.1674 d2.dn_loss_iou: 0.2115 d3.dn_loss_cls: 0.1830 d3.dn_loss_bbox: 0.1593 d3.dn_loss_iou: 0.2041 d4.dn_loss_cls: 0.1864 d4.dn_loss_bbox: 0.1569 d4.dn_loss_iou: 0.2016 d1.loss_lmm_region: 0.1270 loss_lmm_image: 0.8424 2024/11/12 19:46:50 - mmengine - INFO - Iter(train) [ 97700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:10:54 time: 1.9882 data_time: 0.0183 memory: 34451 grad_norm: 28.3860 loss: 9.1605 loss_cls: 0.2800 loss_bbox: 0.1245 loss_iou: 0.2371 d0.loss_cls: 0.3235 d0.loss_bbox: 0.1331 d0.loss_iou: 0.2521 d1.loss_cls: 0.2991 d1.loss_bbox: 0.1291 d1.loss_iou: 0.2431 d2.loss_cls: 0.2924 d2.loss_bbox: 0.1256 d2.loss_iou: 0.2397 d3.loss_cls: 0.2872 d3.loss_bbox: 0.1232 d3.loss_iou: 0.2351 d4.loss_cls: 0.2824 d4.loss_bbox: 0.1222 d4.loss_iou: 0.2330 enc_loss_cls: 0.3287 enc_loss_bbox: 0.1514 enc_loss_iou: 0.2710 dn_loss_cls: 0.1000 dn_loss_bbox: 0.1666 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.1832 d0.dn_loss_bbox: 0.3065 d0.dn_loss_iou: 0.3523 d1.dn_loss_cls: 0.1330 d1.dn_loss_bbox: 0.1986 d1.dn_loss_iou: 0.2441 d2.dn_loss_cls: 0.1132 d2.dn_loss_bbox: 0.1751 d2.dn_loss_iou: 0.2237 d3.dn_loss_cls: 0.1045 d3.dn_loss_bbox: 0.1688 d3.dn_loss_iou: 0.2159 d4.dn_loss_cls: 0.1023 d4.dn_loss_bbox: 0.1666 d4.dn_loss_iou: 0.2142 d1.loss_lmm_region: 0.1284 loss_lmm_image: 0.9354 2024/11/12 19:50:10 - mmengine - INFO - Iter(train) [ 97800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:07:33 time: 1.9944 data_time: 0.0184 memory: 34312 grad_norm: 23.4352 loss: 8.5951 loss_cls: 0.2340 loss_bbox: 0.1197 loss_iou: 0.2165 d0.loss_cls: 0.2687 d0.loss_bbox: 0.1341 d0.loss_iou: 0.2307 d1.loss_cls: 0.2520 d1.loss_bbox: 0.1248 d1.loss_iou: 0.2216 d2.loss_cls: 0.2427 d2.loss_bbox: 0.1239 d2.loss_iou: 0.2166 d3.loss_cls: 0.2376 d3.loss_bbox: 0.1201 d3.loss_iou: 0.2172 d4.loss_cls: 0.2353 d4.loss_bbox: 0.1202 d4.loss_iou: 0.2172 enc_loss_cls: 0.2760 enc_loss_bbox: 0.1514 enc_loss_iou: 0.2517 dn_loss_cls: 0.0870 dn_loss_bbox: 0.1710 
dn_loss_iou: 0.2280 d0.dn_loss_cls: 0.1689 d0.dn_loss_bbox: 0.3356 d0.dn_loss_iou: 0.3775 d1.dn_loss_cls: 0.1177 d1.dn_loss_bbox: 0.2058 d1.dn_loss_iou: 0.2610 d2.dn_loss_cls: 0.0972 d2.dn_loss_bbox: 0.1817 d2.dn_loss_iou: 0.2387 d3.dn_loss_cls: 0.0898 d3.dn_loss_bbox: 0.1742 d3.dn_loss_iou: 0.2311 d4.dn_loss_cls: 0.0871 d4.dn_loss_bbox: 0.1710 d4.dn_loss_iou: 0.2280 d1.loss_lmm_region: 0.1145 loss_lmm_image: 0.8172 2024/11/12 19:53:29 - mmengine - INFO - Iter(train) [ 97900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:04:11 time: 2.0175 data_time: 0.0183 memory: 35001 grad_norm: 28.9285 loss: 9.3746 loss_cls: 0.3281 loss_bbox: 0.1308 loss_iou: 0.2239 d0.loss_cls: 0.3637 d0.loss_bbox: 0.1408 d0.loss_iou: 0.2374 d1.loss_cls: 0.3424 d1.loss_bbox: 0.1384 d1.loss_iou: 0.2318 d2.loss_cls: 0.3386 d2.loss_bbox: 0.1342 d2.loss_iou: 0.2261 d3.loss_cls: 0.3268 d3.loss_bbox: 0.1355 d3.loss_iou: 0.2276 d4.loss_cls: 0.3282 d4.loss_bbox: 0.1304 d4.loss_iou: 0.2237 enc_loss_cls: 0.3686 enc_loss_bbox: 0.1548 enc_loss_iou: 0.2602 dn_loss_cls: 0.1356 dn_loss_bbox: 0.1460 dn_loss_iou: 0.2043 d0.dn_loss_cls: 0.2065 d0.dn_loss_bbox: 0.2896 d0.dn_loss_iou: 0.3410 d1.dn_loss_cls: 0.1676 d1.dn_loss_bbox: 0.1805 d1.dn_loss_iou: 0.2340 d2.dn_loss_cls: 0.1471 d2.dn_loss_bbox: 0.1583 d2.dn_loss_iou: 0.2133 d3.dn_loss_cls: 0.1409 d3.dn_loss_bbox: 0.1482 d3.dn_loss_iou: 0.2059 d4.dn_loss_cls: 0.1383 d4.dn_loss_bbox: 0.1461 d4.dn_loss_iou: 0.2043 d1.loss_lmm_region: 0.1342 loss_lmm_image: 0.8408 2024/11/12 19:56:47 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 19:56:47 - mmengine - INFO - Iter(train) [ 98000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 5:00:49 time: 1.9963 data_time: 0.0183 memory: 35197 grad_norm: 26.4388 loss: 9.5389 loss_cls: 0.3007 loss_bbox: 0.1487 loss_iou: 0.2532 d0.loss_cls: 0.3606 d0.loss_bbox: 0.1578 d0.loss_iou: 0.2667 d1.loss_cls: 0.3269 d1.loss_bbox: 0.1494 d1.loss_iou: 0.2581 d2.loss_cls: 0.3159 d2.loss_bbox: 0.1457 d2.loss_iou: 0.2533 d3.loss_cls: 0.3059 d3.loss_bbox: 0.1485 d3.loss_iou: 0.2535 d4.loss_cls: 0.2993 d4.loss_bbox: 0.1485 d4.loss_iou: 0.2543 enc_loss_cls: 0.3624 enc_loss_bbox: 0.1781 enc_loss_iou: 0.2985 dn_loss_cls: 0.0956 dn_loss_bbox: 0.1670 dn_loss_iou: 0.2172 d0.dn_loss_cls: 0.1843 d0.dn_loss_bbox: 0.3163 d0.dn_loss_iou: 0.3650 d1.dn_loss_cls: 0.1296 d1.dn_loss_bbox: 0.2006 d1.dn_loss_iou: 0.2487 d2.dn_loss_cls: 0.1088 d2.dn_loss_bbox: 0.1758 d2.dn_loss_iou: 0.2266 d3.dn_loss_cls: 0.0997 d3.dn_loss_bbox: 0.1698 d3.dn_loss_iou: 0.2200 d4.dn_loss_cls: 0.0963 d4.dn_loss_bbox: 0.1670 d4.dn_loss_iou: 0.2171 d1.loss_lmm_region: 0.1308 loss_lmm_image: 0.8168 2024/11/12 20:00:06 - mmengine - INFO - Iter(train) [ 98100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:57:27 time: 1.9565 data_time: 0.0183 memory: 33288 grad_norm: 32.2910 loss: 9.0927 loss_cls: 0.2665 loss_bbox: 0.1304 loss_iou: 0.2401 d0.loss_cls: 0.3265 d0.loss_bbox: 0.1364 d0.loss_iou: 0.2481 d1.loss_cls: 0.2883 d1.loss_bbox: 0.1352 d1.loss_iou: 0.2449 d2.loss_cls: 0.2773 d2.loss_bbox: 0.1301 d2.loss_iou: 0.2412 d3.loss_cls: 0.2703 d3.loss_bbox: 0.1294 d3.loss_iou: 0.2428 d4.loss_cls: 0.2702 d4.loss_bbox: 0.1267 d4.loss_iou: 0.2388 enc_loss_cls: 0.3231 enc_loss_bbox: 0.1530 enc_loss_iou: 0.2686 dn_loss_cls: 0.1320 dn_loss_bbox: 0.1505 dn_loss_iou: 0.2115 d0.dn_loss_cls: 0.2052 d0.dn_loss_bbox: 0.2753 d0.dn_loss_iou: 0.3366 d1.dn_loss_cls: 0.1631 d1.dn_loss_bbox: 0.1735 d1.dn_loss_iou: 0.2376 d2.dn_loss_cls: 0.1482 d2.dn_loss_bbox: 0.1573 
d2.dn_loss_iou: 0.2197 d3.dn_loss_cls: 0.1401 d3.dn_loss_bbox: 0.1524 d3.dn_loss_iou: 0.2138 d4.dn_loss_cls: 0.1337 d4.dn_loss_bbox: 0.1506 d4.dn_loss_iou: 0.2116 d1.loss_lmm_region: 0.1266 loss_lmm_image: 0.8654 2024/11/12 20:03:24 - mmengine - INFO - Iter(train) [ 98200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:54:05 time: 1.9708 data_time: 0.0183 memory: 36544 grad_norm: 30.0651 loss: 9.1966 loss_cls: 0.3201 loss_bbox: 0.1244 loss_iou: 0.2328 d0.loss_cls: 0.3621 d0.loss_bbox: 0.1364 d0.loss_iou: 0.2464 d1.loss_cls: 0.3294 d1.loss_bbox: 0.1363 d1.loss_iou: 0.2415 d2.loss_cls: 0.3199 d2.loss_bbox: 0.1318 d2.loss_iou: 0.2377 d3.loss_cls: 0.3229 d3.loss_bbox: 0.1256 d3.loss_iou: 0.2341 d4.loss_cls: 0.3256 d4.loss_bbox: 0.1204 d4.loss_iou: 0.2286 enc_loss_cls: 0.3630 enc_loss_bbox: 0.1572 enc_loss_iou: 0.2657 dn_loss_cls: 0.1157 dn_loss_bbox: 0.1530 dn_loss_iou: 0.2002 d0.dn_loss_cls: 0.1948 d0.dn_loss_bbox: 0.2956 d0.dn_loss_iou: 0.3336 d1.dn_loss_cls: 0.1423 d1.dn_loss_bbox: 0.1836 d1.dn_loss_iou: 0.2299 d2.dn_loss_cls: 0.1211 d2.dn_loss_bbox: 0.1635 d2.dn_loss_iou: 0.2092 d3.dn_loss_cls: 0.1166 d3.dn_loss_bbox: 0.1551 d3.dn_loss_iou: 0.2024 d4.dn_loss_cls: 0.1172 d4.dn_loss_bbox: 0.1530 d4.dn_loss_iou: 0.2001 d1.loss_lmm_region: 0.1322 loss_lmm_image: 0.8157 2024/11/12 20:06:43 - mmengine - INFO - Iter(train) [ 98300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:50:43 time: 1.9864 data_time: 0.0185 memory: 33974 grad_norm: 26.2597 loss: 10.4188 loss_cls: 0.3464 loss_bbox: 0.1586 loss_iou: 0.2688 d0.loss_cls: 0.3930 d0.loss_bbox: 0.1787 d0.loss_iou: 0.2906 d1.loss_cls: 0.3691 d1.loss_bbox: 0.1659 d1.loss_iou: 0.2757 d2.loss_cls: 0.3566 d2.loss_bbox: 0.1632 d2.loss_iou: 0.2724 d3.loss_cls: 0.3479 d3.loss_bbox: 0.1590 d3.loss_iou: 0.2698 d4.loss_cls: 0.3459 d4.loss_bbox: 0.1583 d4.loss_iou: 0.2701 enc_loss_cls: 0.3946 enc_loss_bbox: 0.1950 enc_loss_iou: 0.3179 dn_loss_cls: 0.1163 dn_loss_bbox: 0.1838 dn_loss_iou: 0.2355 d0.dn_loss_cls: 0.2005 d0.dn_loss_bbox: 0.3468 d0.dn_loss_iou: 0.3856 d1.dn_loss_cls: 0.1496 d1.dn_loss_bbox: 0.2231 d1.dn_loss_iou: 0.2695 d2.dn_loss_cls: 0.1302 d2.dn_loss_bbox: 0.1968 d2.dn_loss_iou: 0.2470 d3.dn_loss_cls: 0.1217 d3.dn_loss_bbox: 0.1866 d3.dn_loss_iou: 0.2383 d4.dn_loss_cls: 0.1161 d4.dn_loss_bbox: 0.1839 d4.dn_loss_iou: 0.2356 d1.loss_lmm_region: 0.1525 loss_lmm_image: 0.8021 2024/11/12 20:10:02 - mmengine - INFO - Iter(train) [ 98400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:47:21 time: 1.9871 data_time: 0.0183 memory: 35056 grad_norm: 28.7806 loss: 9.0613 loss_cls: 0.2654 loss_bbox: 0.1477 loss_iou: 0.2428 d0.loss_cls: 0.3079 d0.loss_bbox: 0.1633 d0.loss_iou: 0.2614 d1.loss_cls: 0.2799 d1.loss_bbox: 0.1569 d1.loss_iou: 0.2542 d2.loss_cls: 0.2727 d2.loss_bbox: 0.1487 d2.loss_iou: 0.2449 d3.loss_cls: 0.2661 d3.loss_bbox: 0.1483 d3.loss_iou: 0.2443 d4.loss_cls: 0.2624 d4.loss_bbox: 0.1483 d4.loss_iou: 0.2446 enc_loss_cls: 0.3143 enc_loss_bbox: 0.1716 enc_loss_iou: 0.2775 dn_loss_cls: 0.0905 dn_loss_bbox: 0.1618 dn_loss_iou: 0.2157 d0.dn_loss_cls: 0.1700 d0.dn_loss_bbox: 0.3052 d0.dn_loss_iou: 0.3530 d1.dn_loss_cls: 0.1171 d1.dn_loss_bbox: 0.1879 d1.dn_loss_iou: 0.2422 d2.dn_loss_cls: 0.1009 d2.dn_loss_bbox: 0.1693 d2.dn_loss_iou: 0.2227 d3.dn_loss_cls: 0.0944 d3.dn_loss_bbox: 0.1636 d3.dn_loss_iou: 0.2177 d4.dn_loss_cls: 0.0908 d4.dn_loss_bbox: 0.1618 d4.dn_loss_iou: 0.2157 d1.loss_lmm_region: 0.1397 loss_lmm_image: 0.8180 2024/11/12 20:13:22 - mmengine - INFO - Iter(train) [ 98500/150000] base_lr: 1.0000e-04 lr: 
1.0000e-04 eta: 1 day, 4:43:59 time: 2.0231 data_time: 0.0184 memory: 31565 grad_norm: 24.9407 loss: 8.7476 loss_cls: 0.2640 loss_bbox: 0.1095 loss_iou: 0.1868 d0.loss_cls: 0.2850 d0.loss_bbox: 0.1239 d0.loss_iou: 0.2047 d1.loss_cls: 0.2657 d1.loss_bbox: 0.1198 d1.loss_iou: 0.1984 d2.loss_cls: 0.2594 d2.loss_bbox: 0.1115 d2.loss_iou: 0.1902 d3.loss_cls: 0.2591 d3.loss_bbox: 0.1105 d3.loss_iou: 0.1892 d4.loss_cls: 0.2610 d4.loss_bbox: 0.1099 d4.loss_iou: 0.1869 enc_loss_cls: 0.2906 enc_loss_bbox: 0.1346 enc_loss_iou: 0.2240 dn_loss_cls: 0.1369 dn_loss_bbox: 0.1959 dn_loss_iou: 0.2072 d0.dn_loss_cls: 0.2052 d0.dn_loss_bbox: 0.3614 d0.dn_loss_iou: 0.3556 d1.dn_loss_cls: 0.1573 d1.dn_loss_bbox: 0.2306 d1.dn_loss_iou: 0.2372 d2.dn_loss_cls: 0.1362 d2.dn_loss_bbox: 0.2070 d2.dn_loss_iou: 0.2158 d3.dn_loss_cls: 0.1402 d3.dn_loss_bbox: 0.1969 d3.dn_loss_iou: 0.2088 d4.dn_loss_cls: 0.1356 d4.dn_loss_bbox: 0.1959 d4.dn_loss_iou: 0.2072 d1.loss_lmm_region: 0.1160 loss_lmm_image: 0.8163 2024/11/12 20:16:41 - mmengine - INFO - Iter(train) [ 98600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:40:38 time: 1.9844 data_time: 0.0183 memory: 33853 grad_norm: 26.0451 loss: 9.3765 loss_cls: 0.2947 loss_bbox: 0.1315 loss_iou: 0.2662 d0.loss_cls: 0.3314 d0.loss_bbox: 0.1477 d0.loss_iou: 0.2845 d1.loss_cls: 0.3110 d1.loss_bbox: 0.1382 d1.loss_iou: 0.2694 d2.loss_cls: 0.3073 d2.loss_bbox: 0.1310 d2.loss_iou: 0.2636 d3.loss_cls: 0.2992 d3.loss_bbox: 0.1349 d3.loss_iou: 0.2663 d4.loss_cls: 0.2965 d4.loss_bbox: 0.1328 d4.loss_iou: 0.2670 enc_loss_cls: 0.3418 enc_loss_bbox: 0.1591 enc_loss_iou: 0.3093 dn_loss_cls: 0.1087 dn_loss_bbox: 0.1515 dn_loss_iou: 0.2190 d0.dn_loss_cls: 0.1798 d0.dn_loss_bbox: 0.2889 d0.dn_loss_iou: 0.3666 d1.dn_loss_cls: 0.1350 d1.dn_loss_bbox: 0.1827 d1.dn_loss_iou: 0.2518 d2.dn_loss_cls: 0.1196 d2.dn_loss_bbox: 0.1613 d2.dn_loss_iou: 0.2294 d3.dn_loss_cls: 0.1127 d3.dn_loss_bbox: 0.1535 d3.dn_loss_iou: 0.2215 d4.dn_loss_cls: 0.1097 d4.dn_loss_bbox: 0.1514 d4.dn_loss_iou: 0.2190 d1.loss_lmm_region: 0.1219 loss_lmm_image: 0.8092 2024/11/12 20:19:59 - mmengine - INFO - Iter(train) [ 98700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:37:15 time: 1.9654 data_time: 0.0183 memory: 34248 grad_norm: 37.1335 loss: 8.9631 loss_cls: 0.2607 loss_bbox: 0.1364 loss_iou: 0.2485 d0.loss_cls: 0.2985 d0.loss_bbox: 0.1479 d0.loss_iou: 0.2632 d1.loss_cls: 0.2780 d1.loss_bbox: 0.1421 d1.loss_iou: 0.2528 d2.loss_cls: 0.2683 d2.loss_bbox: 0.1396 d2.loss_iou: 0.2483 d3.loss_cls: 0.2612 d3.loss_bbox: 0.1390 d3.loss_iou: 0.2494 d4.loss_cls: 0.2611 d4.loss_bbox: 0.1377 d4.loss_iou: 0.2480 enc_loss_cls: 0.3004 enc_loss_bbox: 0.1670 enc_loss_iou: 0.2826 dn_loss_cls: 0.0840 dn_loss_bbox: 0.1607 dn_loss_iou: 0.2130 d0.dn_loss_cls: 0.1632 d0.dn_loss_bbox: 0.3180 d0.dn_loss_iou: 0.3560 d1.dn_loss_cls: 0.1152 d1.dn_loss_bbox: 0.1996 d1.dn_loss_iou: 0.2452 d2.dn_loss_cls: 0.0962 d2.dn_loss_bbox: 0.1759 d2.dn_loss_iou: 0.2254 d3.dn_loss_cls: 0.0892 d3.dn_loss_bbox: 0.1641 d3.dn_loss_iou: 0.2163 d4.dn_loss_cls: 0.0845 d4.dn_loss_bbox: 0.1608 d4.dn_loss_iou: 0.2131 d1.loss_lmm_region: 0.1111 loss_lmm_image: 0.8408 2024/11/12 20:23:19 - mmengine - INFO - Iter(train) [ 98800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:33:54 time: 2.0050 data_time: 0.0184 memory: 33869 grad_norm: 25.8434 loss: 8.8455 loss_cls: 0.2518 loss_bbox: 0.1207 loss_iou: 0.2302 d0.loss_cls: 0.2934 d0.loss_bbox: 0.1298 d0.loss_iou: 0.2432 d1.loss_cls: 0.2681 d1.loss_bbox: 0.1242 d1.loss_iou: 0.2366 d2.loss_cls: 0.2629 
d2.loss_bbox: 0.1199 d2.loss_iou: 0.2305 d3.loss_cls: 0.2572 d3.loss_bbox: 0.1192 d3.loss_iou: 0.2288 d4.loss_cls: 0.2525 d4.loss_bbox: 0.1209 d4.loss_iou: 0.2315 enc_loss_cls: 0.3001 enc_loss_bbox: 0.1439 enc_loss_iou: 0.2629 dn_loss_cls: 0.0981 dn_loss_bbox: 0.1769 dn_loss_iou: 0.2221 d0.dn_loss_cls: 0.1728 d0.dn_loss_bbox: 0.3257 d0.dn_loss_iou: 0.3630 d1.dn_loss_cls: 0.1266 d1.dn_loss_bbox: 0.2116 d1.dn_loss_iou: 0.2518 d2.dn_loss_cls: 0.1078 d2.dn_loss_bbox: 0.1879 d2.dn_loss_iou: 0.2310 d3.dn_loss_cls: 0.1027 d3.dn_loss_bbox: 0.1786 d3.dn_loss_iou: 0.2240 d4.dn_loss_cls: 0.0985 d4.dn_loss_bbox: 0.1768 d4.dn_loss_iou: 0.2221 d1.loss_lmm_region: 0.1211 loss_lmm_image: 0.8184 2024/11/12 20:26:38 - mmengine - INFO - Iter(train) [ 98900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:30:33 time: 2.0058 data_time: 0.0183 memory: 34849 grad_norm: 28.3403 loss: 9.4969 loss_cls: 0.3239 loss_bbox: 0.1538 loss_iou: 0.2409 d0.loss_cls: 0.3739 d0.loss_bbox: 0.1713 d0.loss_iou: 0.2550 d1.loss_cls: 0.3521 d1.loss_bbox: 0.1568 d1.loss_iou: 0.2445 d2.loss_cls: 0.3323 d2.loss_bbox: 0.1541 d2.loss_iou: 0.2396 d3.loss_cls: 0.3248 d3.loss_bbox: 0.1548 d3.loss_iou: 0.2416 d4.loss_cls: 0.3255 d4.loss_bbox: 0.1523 d4.loss_iou: 0.2399 enc_loss_cls: 0.3871 enc_loss_bbox: 0.1834 enc_loss_iou: 0.2734 dn_loss_cls: 0.1050 dn_loss_bbox: 0.1520 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1836 d0.dn_loss_bbox: 0.2970 d0.dn_loss_iou: 0.3405 d1.dn_loss_cls: 0.1335 d1.dn_loss_bbox: 0.1848 d1.dn_loss_iou: 0.2335 d2.dn_loss_cls: 0.1134 d2.dn_loss_bbox: 0.1633 d2.dn_loss_iou: 0.2128 d3.dn_loss_cls: 0.1070 d3.dn_loss_bbox: 0.1562 d3.dn_loss_iou: 0.2065 d4.dn_loss_cls: 0.1052 d4.dn_loss_bbox: 0.1520 d4.dn_loss_iou: 0.2035 d1.loss_lmm_region: 0.1429 loss_lmm_image: 0.8196 2024/11/12 20:29:58 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 20:29:58 - mmengine - INFO - Iter(train) [ 99000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:27:11 time: 2.0117 data_time: 0.0183 memory: 33079 grad_norm: 38.5328 loss: 8.5857 loss_cls: 0.2687 loss_bbox: 0.1158 loss_iou: 0.1880 d0.loss_cls: 0.3122 d0.loss_bbox: 0.1235 d0.loss_iou: 0.1992 d1.loss_cls: 0.2941 d1.loss_bbox: 0.1169 d1.loss_iou: 0.1912 d2.loss_cls: 0.2855 d2.loss_bbox: 0.1122 d2.loss_iou: 0.1849 d3.loss_cls: 0.2757 d3.loss_bbox: 0.1128 d3.loss_iou: 0.1840 d4.loss_cls: 0.2730 d4.loss_bbox: 0.1129 d4.loss_iou: 0.1862 enc_loss_cls: 0.3168 enc_loss_bbox: 0.1406 enc_loss_iou: 0.2247 dn_loss_cls: 0.1019 dn_loss_bbox: 0.1806 dn_loss_iou: 0.2017 d0.dn_loss_cls: 0.1859 d0.dn_loss_bbox: 0.3423 d0.dn_loss_iou: 0.3473 d1.dn_loss_cls: 0.1370 d1.dn_loss_bbox: 0.2127 d1.dn_loss_iou: 0.2326 d2.dn_loss_cls: 0.1158 d2.dn_loss_bbox: 0.1882 d2.dn_loss_iou: 0.2113 d3.dn_loss_cls: 0.1067 d3.dn_loss_bbox: 0.1816 d3.dn_loss_iou: 0.2039 d4.dn_loss_cls: 0.1027 d4.dn_loss_bbox: 0.1807 d4.dn_loss_iou: 0.2016 d1.loss_lmm_region: 0.1342 loss_lmm_image: 0.7980 2024/11/12 20:33:18 - mmengine - INFO - Iter(train) [ 99100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:23:50 time: 1.9831 data_time: 0.0183 memory: 34927 grad_norm: 37.5573 loss: 8.5834 loss_cls: 0.2647 loss_bbox: 0.1248 loss_iou: 0.2253 d0.loss_cls: 0.3047 d0.loss_bbox: 0.1383 d0.loss_iou: 0.2425 d1.loss_cls: 0.2805 d1.loss_bbox: 0.1298 d1.loss_iou: 0.2343 d2.loss_cls: 0.2690 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2288 d3.loss_cls: 0.2672 d3.loss_bbox: 0.1267 d3.loss_iou: 0.2269 d4.loss_cls: 0.2664 d4.loss_bbox: 0.1248 d4.loss_iou: 0.2252 enc_loss_cls: 0.3125 enc_loss_bbox: 0.1593 enc_loss_iou: 
0.2694 dn_loss_cls: 0.0716 dn_loss_bbox: 0.1498 dn_loss_iou: 0.1973 d0.dn_loss_cls: 0.1511 d0.dn_loss_bbox: 0.2962 d0.dn_loss_iou: 0.3428 d1.dn_loss_cls: 0.1043 d1.dn_loss_bbox: 0.1868 d1.dn_loss_iou: 0.2314 d2.dn_loss_cls: 0.0843 d2.dn_loss_bbox: 0.1622 d2.dn_loss_iou: 0.2085 d3.dn_loss_cls: 0.0777 d3.dn_loss_bbox: 0.1532 d3.dn_loss_iou: 0.2006 d4.dn_loss_cls: 0.0726 d4.dn_loss_bbox: 0.1499 d4.dn_loss_iou: 0.1972 d1.loss_lmm_region: 0.1213 loss_lmm_image: 0.8749 2024/11/12 20:36:37 - mmengine - INFO - Iter(train) [ 99200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:20:28 time: 1.9885 data_time: 0.0182 memory: 32433 grad_norm: 29.3975 loss: 10.7674 loss_cls: 0.3248 loss_bbox: 0.1903 loss_iou: 0.3043 d0.loss_cls: 0.3717 d0.loss_bbox: 0.1908 d0.loss_iou: 0.3143 d1.loss_cls: 0.3457 d1.loss_bbox: 0.1942 d1.loss_iou: 0.3118 d2.loss_cls: 0.3397 d2.loss_bbox: 0.1895 d2.loss_iou: 0.3035 d3.loss_cls: 0.3316 d3.loss_bbox: 0.1893 d3.loss_iou: 0.3040 d4.loss_cls: 0.3280 d4.loss_bbox: 0.1881 d4.loss_iou: 0.3032 enc_loss_cls: 0.3869 enc_loss_bbox: 0.1982 enc_loss_iou: 0.3299 dn_loss_cls: 0.1270 dn_loss_bbox: 0.1889 dn_loss_iou: 0.2380 d0.dn_loss_cls: 0.2044 d0.dn_loss_bbox: 0.3290 d0.dn_loss_iou: 0.3808 d1.dn_loss_cls: 0.1523 d1.dn_loss_bbox: 0.2196 d1.dn_loss_iou: 0.2679 d2.dn_loss_cls: 0.1344 d2.dn_loss_bbox: 0.1987 d2.dn_loss_iou: 0.2477 d3.dn_loss_cls: 0.1308 d3.dn_loss_bbox: 0.1917 d3.dn_loss_iou: 0.2413 d4.dn_loss_cls: 0.1272 d4.dn_loss_bbox: 0.1890 d4.dn_loss_iou: 0.2380 d1.loss_lmm_region: 0.1593 loss_lmm_image: 0.8618 2024/11/12 20:39:57 - mmengine - INFO - Iter(train) [ 99300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:17:06 time: 1.9830 data_time: 0.0184 memory: 34327 grad_norm: 37.0076 loss: 9.4881 loss_cls: 0.2692 loss_bbox: 0.1452 loss_iou: 0.2419 d0.loss_cls: 0.3095 d0.loss_bbox: 0.1585 d0.loss_iou: 0.2542 d1.loss_cls: 0.2882 d1.loss_bbox: 0.1492 d1.loss_iou: 0.2468 d2.loss_cls: 0.2770 d2.loss_bbox: 0.1476 d2.loss_iou: 0.2411 d3.loss_cls: 0.2694 d3.loss_bbox: 0.1472 d3.loss_iou: 0.2433 d4.loss_cls: 0.2684 d4.loss_bbox: 0.1456 d4.loss_iou: 0.2421 enc_loss_cls: 0.3192 enc_loss_bbox: 0.1652 enc_loss_iou: 0.2759 dn_loss_cls: 0.0841 dn_loss_bbox: 0.2037 dn_loss_iou: 0.2432 d0.dn_loss_cls: 0.1774 d0.dn_loss_bbox: 0.3618 d0.dn_loss_iou: 0.3971 d1.dn_loss_cls: 0.1183 d1.dn_loss_bbox: 0.2371 d1.dn_loss_iou: 0.2751 d2.dn_loss_cls: 0.0976 d2.dn_loss_bbox: 0.2132 d2.dn_loss_iou: 0.2531 d3.dn_loss_cls: 0.0878 d3.dn_loss_bbox: 0.2048 d3.dn_loss_iou: 0.2450 d4.dn_loss_cls: 0.0839 d4.dn_loss_bbox: 0.2037 d4.dn_loss_iou: 0.2432 d1.loss_lmm_region: 0.1297 loss_lmm_image: 0.8235 2024/11/12 20:43:16 - mmengine - INFO - Iter(train) [ 99400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:13:45 time: 1.9988 data_time: 0.0184 memory: 32150 grad_norm: 26.5535 loss: 9.9217 loss_cls: 0.3086 loss_bbox: 0.1637 loss_iou: 0.2780 d0.loss_cls: 0.3623 d0.loss_bbox: 0.1788 d0.loss_iou: 0.2933 d1.loss_cls: 0.3307 d1.loss_bbox: 0.1683 d1.loss_iou: 0.2826 d2.loss_cls: 0.3263 d2.loss_bbox: 0.1601 d2.loss_iou: 0.2752 d3.loss_cls: 0.3154 d3.loss_bbox: 0.1637 d3.loss_iou: 0.2769 d4.loss_cls: 0.3083 d4.loss_bbox: 0.1639 d4.loss_iou: 0.2778 enc_loss_cls: 0.3609 enc_loss_bbox: 0.2029 enc_loss_iou: 0.3231 dn_loss_cls: 0.0969 dn_loss_bbox: 0.1738 dn_loss_iou: 0.2166 d0.dn_loss_cls: 0.1826 d0.dn_loss_bbox: 0.3207 d0.dn_loss_iou: 0.3593 d1.dn_loss_cls: 0.1274 d1.dn_loss_bbox: 0.2065 d1.dn_loss_iou: 0.2486 d2.dn_loss_cls: 0.1081 d2.dn_loss_bbox: 0.1840 d2.dn_loss_iou: 0.2264 d3.dn_loss_cls: 0.1012 
d3.dn_loss_bbox: 0.1760 d3.dn_loss_iou: 0.2189 d4.dn_loss_cls: 0.0977 d4.dn_loss_bbox: 0.1739 d4.dn_loss_iou: 0.2166 d1.loss_lmm_region: 0.1176 loss_lmm_image: 0.8481 2024/11/12 20:46:34 - mmengine - INFO - Iter(train) [ 99500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:10:23 time: 1.9747 data_time: 0.0183 memory: 34199 grad_norm: 33.8274 loss: 9.5326 loss_cls: 0.2879 loss_bbox: 0.1402 loss_iou: 0.2861 d0.loss_cls: 0.3250 d0.loss_bbox: 0.1524 d0.loss_iou: 0.2976 d1.loss_cls: 0.3089 d1.loss_bbox: 0.1471 d1.loss_iou: 0.2866 d2.loss_cls: 0.3042 d2.loss_bbox: 0.1380 d2.loss_iou: 0.2844 d3.loss_cls: 0.2915 d3.loss_bbox: 0.1411 d3.loss_iou: 0.2861 d4.loss_cls: 0.2900 d4.loss_bbox: 0.1397 d4.loss_iou: 0.2858 enc_loss_cls: 0.3313 enc_loss_bbox: 0.1678 enc_loss_iou: 0.3250 dn_loss_cls: 0.0870 dn_loss_bbox: 0.1719 dn_loss_iou: 0.2242 d0.dn_loss_cls: 0.1593 d0.dn_loss_bbox: 0.3041 d0.dn_loss_iou: 0.3632 d1.dn_loss_cls: 0.1172 d1.dn_loss_bbox: 0.2038 d1.dn_loss_iou: 0.2554 d2.dn_loss_cls: 0.1014 d2.dn_loss_bbox: 0.1807 d2.dn_loss_iou: 0.2331 d3.dn_loss_cls: 0.0937 d3.dn_loss_bbox: 0.1742 d3.dn_loss_iou: 0.2267 d4.dn_loss_cls: 0.0873 d4.dn_loss_bbox: 0.1719 d4.dn_loss_iou: 0.2241 d1.loss_lmm_region: 0.1115 loss_lmm_image: 0.8253 2024/11/12 20:49:51 - mmengine - INFO - Iter(train) [ 99600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:07:00 time: 1.9962 data_time: 0.0182 memory: 33321 grad_norm: 31.0688 loss: 10.5814 loss_cls: 0.3665 loss_bbox: 0.1673 loss_iou: 0.2752 d0.loss_cls: 0.4189 d0.loss_bbox: 0.1756 d0.loss_iou: 0.2903 d1.loss_cls: 0.3992 d1.loss_bbox: 0.1677 d1.loss_iou: 0.2772 d2.loss_cls: 0.3818 d2.loss_bbox: 0.1663 d2.loss_iou: 0.2730 d3.loss_cls: 0.3734 d3.loss_bbox: 0.1679 d3.loss_iou: 0.2752 d4.loss_cls: 0.3683 d4.loss_bbox: 0.1667 d4.loss_iou: 0.2763 enc_loss_cls: 0.4293 enc_loss_bbox: 0.1869 enc_loss_iou: 0.3120 dn_loss_cls: 0.1384 dn_loss_bbox: 0.1798 dn_loss_iou: 0.2214 d0.dn_loss_cls: 0.2183 d0.dn_loss_bbox: 0.3203 d0.dn_loss_iou: 0.3569 d1.dn_loss_cls: 0.1609 d1.dn_loss_bbox: 0.2042 d1.dn_loss_iou: 0.2477 d2.dn_loss_cls: 0.1449 d2.dn_loss_bbox: 0.1892 d2.dn_loss_iou: 0.2301 d3.dn_loss_cls: 0.1390 d3.dn_loss_bbox: 0.1814 d3.dn_loss_iou: 0.2240 d4.dn_loss_cls: 0.1370 d4.dn_loss_bbox: 0.1798 d4.dn_loss_iou: 0.2215 d1.loss_lmm_region: 0.1394 loss_lmm_image: 0.8321 2024/11/12 20:53:08 - mmengine - INFO - Iter(train) [ 99700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:03:37 time: 1.9847 data_time: 0.0183 memory: 32293 grad_norm: 27.1866 loss: 9.7445 loss_cls: 0.3162 loss_bbox: 0.1443 loss_iou: 0.2579 d0.loss_cls: 0.3554 d0.loss_bbox: 0.1572 d0.loss_iou: 0.2754 d1.loss_cls: 0.3406 d1.loss_bbox: 0.1478 d1.loss_iou: 0.2621 d2.loss_cls: 0.3327 d2.loss_bbox: 0.1453 d2.loss_iou: 0.2589 d3.loss_cls: 0.3239 d3.loss_bbox: 0.1427 d3.loss_iou: 0.2587 d4.loss_cls: 0.3180 d4.loss_bbox: 0.1462 d4.loss_iou: 0.2592 enc_loss_cls: 0.3656 enc_loss_bbox: 0.1657 enc_loss_iou: 0.2900 dn_loss_cls: 0.1135 dn_loss_bbox: 0.1626 dn_loss_iou: 0.2190 d0.dn_loss_cls: 0.1972 d0.dn_loss_bbox: 0.3148 d0.dn_loss_iou: 0.3676 d1.dn_loss_cls: 0.1436 d1.dn_loss_bbox: 0.1991 d1.dn_loss_iou: 0.2523 d2.dn_loss_cls: 0.1270 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2272 d3.dn_loss_cls: 0.1179 d3.dn_loss_bbox: 0.1649 d3.dn_loss_iou: 0.2217 d4.dn_loss_cls: 0.1129 d4.dn_loss_bbox: 0.1627 d4.dn_loss_iou: 0.2190 d1.loss_lmm_region: 0.1214 loss_lmm_image: 0.8641 2024/11/12 20:56:26 - mmengine - INFO - Iter(train) [ 99800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 4:00:15 time: 1.9850 
data_time: 0.0183 memory: 35017 grad_norm: 33.4937 loss: 9.3108 loss_cls: 0.2802 loss_bbox: 0.1501 loss_iou: 0.2588 d0.loss_cls: 0.3186 d0.loss_bbox: 0.1657 d0.loss_iou: 0.2751 d1.loss_cls: 0.2977 d1.loss_bbox: 0.1556 d1.loss_iou: 0.2665 d2.loss_cls: 0.2882 d2.loss_bbox: 0.1517 d2.loss_iou: 0.2626 d3.loss_cls: 0.2822 d3.loss_bbox: 0.1532 d3.loss_iou: 0.2622 d4.loss_cls: 0.2825 d4.loss_bbox: 0.1497 d4.loss_iou: 0.2589 enc_loss_cls: 0.3249 enc_loss_bbox: 0.1741 enc_loss_iou: 0.2948 dn_loss_cls: 0.0984 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2108 d0.dn_loss_cls: 0.1736 d0.dn_loss_bbox: 0.2959 d0.dn_loss_iou: 0.3509 d1.dn_loss_cls: 0.1273 d1.dn_loss_bbox: 0.1829 d1.dn_loss_iou: 0.2403 d2.dn_loss_cls: 0.1111 d2.dn_loss_bbox: 0.1624 d2.dn_loss_iou: 0.2186 d3.dn_loss_cls: 0.1048 d3.dn_loss_bbox: 0.1566 d3.dn_loss_iou: 0.2122 d4.dn_loss_cls: 0.0987 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.2106 d1.loss_lmm_region: 0.1299 loss_lmm_image: 0.8605 2024/11/12 20:59:44 - mmengine - INFO - Iter(train) [ 99900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:56:52 time: 1.9786 data_time: 0.0184 memory: 34095 grad_norm: 36.7184 loss: 9.7694 loss_cls: 0.3140 loss_bbox: 0.1517 loss_iou: 0.2909 d0.loss_cls: 0.3534 d0.loss_bbox: 0.1718 d0.loss_iou: 0.3193 d1.loss_cls: 0.3277 d1.loss_bbox: 0.1611 d1.loss_iou: 0.3074 d2.loss_cls: 0.3158 d2.loss_bbox: 0.1588 d2.loss_iou: 0.3026 d3.loss_cls: 0.3117 d3.loss_bbox: 0.1539 d3.loss_iou: 0.2963 d4.loss_cls: 0.3140 d4.loss_bbox: 0.1520 d4.loss_iou: 0.2935 enc_loss_cls: 0.3585 enc_loss_bbox: 0.1839 enc_loss_iou: 0.3409 dn_loss_cls: 0.0870 dn_loss_bbox: 0.1428 dn_loss_iou: 0.2278 d0.dn_loss_cls: 0.1733 d0.dn_loss_bbox: 0.2854 d0.dn_loss_iou: 0.3739 d1.dn_loss_cls: 0.1234 d1.dn_loss_bbox: 0.1723 d1.dn_loss_iou: 0.2571 d2.dn_loss_cls: 0.1011 d2.dn_loss_bbox: 0.1524 d2.dn_loss_iou: 0.2372 d3.dn_loss_cls: 0.0920 d3.dn_loss_bbox: 0.1449 d3.dn_loss_iou: 0.2297 d4.dn_loss_cls: 0.0874 d4.dn_loss_bbox: 0.1428 d4.dn_loss_iou: 0.2277 d1.loss_lmm_region: 0.1307 loss_lmm_image: 0.8011 2024/11/12 21:03:03 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 21:03:03 - mmengine - INFO - Iter(train) [100000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:53:31 time: 1.9920 data_time: 0.0184 memory: 34299 grad_norm: 31.0370 loss: 9.9343 loss_cls: 0.3050 loss_bbox: 0.1581 loss_iou: 0.2652 d0.loss_cls: 0.3563 d0.loss_bbox: 0.1713 d0.loss_iou: 0.2787 d1.loss_cls: 0.3333 d1.loss_bbox: 0.1608 d1.loss_iou: 0.2665 d2.loss_cls: 0.3156 d2.loss_bbox: 0.1580 d2.loss_iou: 0.2676 d3.loss_cls: 0.3115 d3.loss_bbox: 0.1581 d3.loss_iou: 0.2656 d4.loss_cls: 0.3047 d4.loss_bbox: 0.1598 d4.loss_iou: 0.2664 enc_loss_cls: 0.3545 enc_loss_bbox: 0.1921 enc_loss_iou: 0.3066 dn_loss_cls: 0.0828 dn_loss_bbox: 0.1907 dn_loss_iou: 0.2307 d0.dn_loss_cls: 0.1788 d0.dn_loss_bbox: 0.3552 d0.dn_loss_iou: 0.3730 d1.dn_loss_cls: 0.1185 d1.dn_loss_bbox: 0.2254 d1.dn_loss_iou: 0.2621 d2.dn_loss_cls: 0.0954 d2.dn_loss_bbox: 0.2007 d2.dn_loss_iou: 0.2400 d3.dn_loss_cls: 0.0861 d3.dn_loss_bbox: 0.1917 d3.dn_loss_iou: 0.2331 d4.dn_loss_cls: 0.0835 d4.dn_loss_bbox: 0.1906 d4.dn_loss_iou: 0.2307 d1.loss_lmm_region: 0.1346 loss_lmm_image: 0.8750 2024/11/12 21:06:22 - mmengine - INFO - Iter(train) [100100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:50:09 time: 2.0012 data_time: 0.0184 memory: 34512 grad_norm: 37.0394 loss: 9.1872 loss_cls: 0.2870 loss_bbox: 0.1338 loss_iou: 0.2467 d0.loss_cls: 0.3260 d0.loss_bbox: 0.1508 d0.loss_iou: 0.2681 d1.loss_cls: 0.3099 d1.loss_bbox: 0.1353 
d1.loss_iou: 0.2497 d2.loss_cls: 0.2951 d2.loss_bbox: 0.1356 d2.loss_iou: 0.2474 d3.loss_cls: 0.2934 d3.loss_bbox: 0.1339 d3.loss_iou: 0.2474 d4.loss_cls: 0.2872 d4.loss_bbox: 0.1338 d4.loss_iou: 0.2466 enc_loss_cls: 0.3366 enc_loss_bbox: 0.1624 enc_loss_iou: 0.2872 dn_loss_cls: 0.1016 dn_loss_bbox: 0.1507 dn_loss_iou: 0.2113 d0.dn_loss_cls: 0.1797 d0.dn_loss_bbox: 0.2984 d0.dn_loss_iou: 0.3581 d1.dn_loss_cls: 0.1277 d1.dn_loss_bbox: 0.1872 d1.dn_loss_iou: 0.2444 d2.dn_loss_cls: 0.1101 d2.dn_loss_bbox: 0.1627 d2.dn_loss_iou: 0.2216 d3.dn_loss_cls: 0.1046 d3.dn_loss_bbox: 0.1536 d3.dn_loss_iou: 0.2140 d4.dn_loss_cls: 0.1021 d4.dn_loss_bbox: 0.1507 d4.dn_loss_iou: 0.2112 d1.loss_lmm_region: 0.1209 loss_lmm_image: 0.8628 2024/11/12 21:09:43 - mmengine - INFO - Iter(train) [100200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:46:48 time: 2.0073 data_time: 0.0184 memory: 34019 grad_norm: 29.7092 loss: 7.2502 loss_cls: 0.2150 loss_bbox: 0.0923 loss_iou: 0.1722 d0.loss_cls: 0.2475 d0.loss_bbox: 0.1087 d0.loss_iou: 0.1872 d1.loss_cls: 0.2363 d1.loss_bbox: 0.0968 d1.loss_iou: 0.1769 d2.loss_cls: 0.2229 d2.loss_bbox: 0.0957 d2.loss_iou: 0.1765 d3.loss_cls: 0.2157 d3.loss_bbox: 0.0929 d3.loss_iou: 0.1727 d4.loss_cls: 0.2158 d4.loss_bbox: 0.0926 d4.loss_iou: 0.1724 enc_loss_cls: 0.2599 enc_loss_bbox: 0.1202 enc_loss_iou: 0.2099 dn_loss_cls: 0.0722 dn_loss_bbox: 0.1299 dn_loss_iou: 0.1808 d0.dn_loss_cls: 0.1452 d0.dn_loss_bbox: 0.2642 d0.dn_loss_iou: 0.3174 d1.dn_loss_cls: 0.0947 d1.dn_loss_bbox: 0.1547 d1.dn_loss_iou: 0.2084 d2.dn_loss_cls: 0.0802 d2.dn_loss_bbox: 0.1372 d2.dn_loss_iou: 0.1886 d3.dn_loss_cls: 0.0766 d3.dn_loss_bbox: 0.1320 d3.dn_loss_iou: 0.1827 d4.dn_loss_cls: 0.0727 d4.dn_loss_bbox: 0.1299 d4.dn_loss_iou: 0.1808 d1.loss_lmm_region: 0.1060 loss_lmm_image: 0.8158 2024/11/12 21:13:00 - mmengine - INFO - Iter(train) [100300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:43:26 time: 1.9772 data_time: 0.0183 memory: 33593 grad_norm: 26.2496 loss: 8.7682 loss_cls: 0.2532 loss_bbox: 0.1229 loss_iou: 0.2328 d0.loss_cls: 0.2890 d0.loss_bbox: 0.1355 d0.loss_iou: 0.2488 d1.loss_cls: 0.2724 d1.loss_bbox: 0.1318 d1.loss_iou: 0.2407 d2.loss_cls: 0.2654 d2.loss_bbox: 0.1231 d2.loss_iou: 0.2337 d3.loss_cls: 0.2609 d3.loss_bbox: 0.1211 d3.loss_iou: 0.2309 d4.loss_cls: 0.2546 d4.loss_bbox: 0.1229 d4.loss_iou: 0.2327 enc_loss_cls: 0.3007 enc_loss_bbox: 0.1489 enc_loss_iou: 0.2664 dn_loss_cls: 0.1067 dn_loss_bbox: 0.1610 dn_loss_iou: 0.2127 d0.dn_loss_cls: 0.1821 d0.dn_loss_bbox: 0.2987 d0.dn_loss_iou: 0.3497 d1.dn_loss_cls: 0.1376 d1.dn_loss_bbox: 0.1922 d1.dn_loss_iou: 0.2443 d2.dn_loss_cls: 0.1159 d2.dn_loss_bbox: 0.1704 d2.dn_loss_iou: 0.2227 d3.dn_loss_cls: 0.1094 d3.dn_loss_bbox: 0.1628 d3.dn_loss_iou: 0.2155 d4.dn_loss_cls: 0.1053 d4.dn_loss_bbox: 0.1610 d4.dn_loss_iou: 0.2127 d1.loss_lmm_region: 0.1173 loss_lmm_image: 0.8019 2024/11/12 21:16:20 - mmengine - INFO - Iter(train) [100400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:40:05 time: 1.9939 data_time: 0.0184 memory: 33983 grad_norm: 29.2838 loss: 8.4217 loss_cls: 0.2565 loss_bbox: 0.1243 loss_iou: 0.2352 d0.loss_cls: 0.2967 d0.loss_bbox: 0.1382 d0.loss_iou: 0.2518 d1.loss_cls: 0.2776 d1.loss_bbox: 0.1315 d1.loss_iou: 0.2412 d2.loss_cls: 0.2657 d2.loss_bbox: 0.1252 d2.loss_iou: 0.2368 d3.loss_cls: 0.2591 d3.loss_bbox: 0.1249 d3.loss_iou: 0.2340 d4.loss_cls: 0.2558 d4.loss_bbox: 0.1259 d4.loss_iou: 0.2349 enc_loss_cls: 0.3052 enc_loss_bbox: 0.1513 enc_loss_iou: 0.2739 dn_loss_cls: 0.0768 dn_loss_bbox: 0.1471 
dn_loss_iou: 0.2048 d0.dn_loss_cls: 0.1468 d0.dn_loss_bbox: 0.2802 d0.dn_loss_iou: 0.3472 d1.dn_loss_cls: 0.1027 d1.dn_loss_bbox: 0.1736 d1.dn_loss_iou: 0.2327 d2.dn_loss_cls: 0.0857 d2.dn_loss_bbox: 0.1526 d2.dn_loss_iou: 0.2129 d3.dn_loss_cls: 0.0795 d3.dn_loss_bbox: 0.1488 d3.dn_loss_iou: 0.2071 d4.dn_loss_cls: 0.0780 d4.dn_loss_bbox: 0.1471 d4.dn_loss_iou: 0.2049 d1.loss_lmm_region: 0.0878 loss_lmm_image: 0.7600 2024/11/12 21:19:37 - mmengine - INFO - Iter(train) [100500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:36:42 time: 1.9571 data_time: 0.0183 memory: 35182 grad_norm: nan loss: 10.0611 loss_cls: 0.3240 loss_bbox: 0.1432 loss_iou: 0.2709 d0.loss_cls: 0.3651 d0.loss_bbox: 0.1629 d0.loss_iou: 0.2855 d1.loss_cls: 0.3407 d1.loss_bbox: 0.1516 d1.loss_iou: 0.2746 d2.loss_cls: 0.3320 d2.loss_bbox: 0.1466 d2.loss_iou: 0.2730 d3.loss_cls: 0.3255 d3.loss_bbox: 0.1433 d3.loss_iou: 0.2718 d4.loss_cls: 0.3189 d4.loss_bbox: 0.1495 d4.loss_iou: 0.2721 enc_loss_cls: 0.3702 enc_loss_bbox: 0.1712 enc_loss_iou: 0.3080 dn_loss_cls: 0.1162 dn_loss_bbox: 0.1653 dn_loss_iou: 0.2394 d0.dn_loss_cls: 0.2028 d0.dn_loss_bbox: 0.3176 d0.dn_loss_iou: 0.4035 d1.dn_loss_cls: 0.1463 d1.dn_loss_bbox: 0.2004 d1.dn_loss_iou: 0.2748 d2.dn_loss_cls: 0.1286 d2.dn_loss_bbox: 0.1750 d2.dn_loss_iou: 0.2490 d3.dn_loss_cls: 0.1217 d3.dn_loss_bbox: 0.1666 d3.dn_loss_iou: 0.2416 d4.dn_loss_cls: 0.1179 d4.dn_loss_bbox: 0.1654 d4.dn_loss_iou: 0.2392 d1.loss_lmm_region: 0.1343 loss_lmm_image: 0.8547 2024/11/12 21:22:55 - mmengine - INFO - Iter(train) [100600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:33:19 time: 1.9912 data_time: 0.0184 memory: 33787 grad_norm: 26.8997 loss: 9.0778 loss_cls: 0.2614 loss_bbox: 0.1323 loss_iou: 0.2309 d0.loss_cls: 0.2998 d0.loss_bbox: 0.1485 d0.loss_iou: 0.2469 d1.loss_cls: 0.2760 d1.loss_bbox: 0.1420 d1.loss_iou: 0.2401 d2.loss_cls: 0.2655 d2.loss_bbox: 0.1376 d2.loss_iou: 0.2344 d3.loss_cls: 0.2629 d3.loss_bbox: 0.1337 d3.loss_iou: 0.2321 d4.loss_cls: 0.2619 d4.loss_bbox: 0.1321 d4.loss_iou: 0.2312 enc_loss_cls: 0.3107 enc_loss_bbox: 0.1623 enc_loss_iou: 0.2677 dn_loss_cls: 0.0976 dn_loss_bbox: 0.1723 dn_loss_iou: 0.2213 d0.dn_loss_cls: 0.1759 d0.dn_loss_bbox: 0.3534 d0.dn_loss_iou: 0.3758 d1.dn_loss_cls: 0.1272 d1.dn_loss_bbox: 0.2165 d1.dn_loss_iou: 0.2570 d2.dn_loss_cls: 0.1071 d2.dn_loss_bbox: 0.1889 d2.dn_loss_iou: 0.2343 d3.dn_loss_cls: 0.1013 d3.dn_loss_bbox: 0.1763 d3.dn_loss_iou: 0.2253 d4.dn_loss_cls: 0.0975 d4.dn_loss_bbox: 0.1723 d4.dn_loss_iou: 0.2214 d1.loss_lmm_region: 0.1029 loss_lmm_image: 0.8434 2024/11/12 21:26:15 - mmengine - INFO - Iter(train) [100700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:29:58 time: 2.0124 data_time: 0.0185 memory: 34278 grad_norm: 30.6086 loss: 8.8271 loss_cls: 0.2886 loss_bbox: 0.1333 loss_iou: 0.2419 d0.loss_cls: 0.3236 d0.loss_bbox: 0.1444 d0.loss_iou: 0.2618 d1.loss_cls: 0.3115 d1.loss_bbox: 0.1336 d1.loss_iou: 0.2473 d2.loss_cls: 0.3019 d2.loss_bbox: 0.1280 d2.loss_iou: 0.2433 d3.loss_cls: 0.2919 d3.loss_bbox: 0.1304 d3.loss_iou: 0.2419 d4.loss_cls: 0.2904 d4.loss_bbox: 0.1327 d4.loss_iou: 0.2423 enc_loss_cls: 0.3433 enc_loss_bbox: 0.1522 enc_loss_iou: 0.2784 dn_loss_cls: 0.0828 dn_loss_bbox: 0.1453 dn_loss_iou: 0.1974 d0.dn_loss_cls: 0.1549 d0.dn_loss_bbox: 0.2844 d0.dn_loss_iou: 0.3337 d1.dn_loss_cls: 0.1100 d1.dn_loss_bbox: 0.1756 d1.dn_loss_iou: 0.2260 d2.dn_loss_cls: 0.0905 d2.dn_loss_bbox: 0.1525 d2.dn_loss_iou: 0.2060 d3.dn_loss_cls: 0.0843 d3.dn_loss_bbox: 0.1461 d3.dn_loss_iou: 0.1991 
d4.dn_loss_cls: 0.0826 d4.dn_loss_bbox: 0.1452 d4.dn_loss_iou: 0.1973 d1.loss_lmm_region: 0.1146 loss_lmm_image: 0.8360 2024/11/12 21:29:36 - mmengine - INFO - Iter(train) [100800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:26:37 time: 2.0091 data_time: 0.0183 memory: 34122 grad_norm: 28.8991 loss: 8.8359 loss_cls: 0.2793 loss_bbox: 0.1307 loss_iou: 0.2337 d0.loss_cls: 0.3171 d0.loss_bbox: 0.1431 d0.loss_iou: 0.2528 d1.loss_cls: 0.3005 d1.loss_bbox: 0.1302 d1.loss_iou: 0.2383 d2.loss_cls: 0.2847 d2.loss_bbox: 0.1321 d2.loss_iou: 0.2351 d3.loss_cls: 0.2787 d3.loss_bbox: 0.1316 d3.loss_iou: 0.2355 d4.loss_cls: 0.2797 d4.loss_bbox: 0.1298 d4.loss_iou: 0.2335 enc_loss_cls: 0.3229 enc_loss_bbox: 0.1577 enc_loss_iou: 0.2749 dn_loss_cls: 0.0993 dn_loss_bbox: 0.1511 dn_loss_iou: 0.2002 d0.dn_loss_cls: 0.1747 d0.dn_loss_bbox: 0.2703 d0.dn_loss_iou: 0.3224 d1.dn_loss_cls: 0.1317 d1.dn_loss_bbox: 0.1741 d1.dn_loss_iou: 0.2227 d2.dn_loss_cls: 0.1115 d2.dn_loss_bbox: 0.1614 d2.dn_loss_iou: 0.2086 d3.dn_loss_cls: 0.1050 d3.dn_loss_bbox: 0.1534 d3.dn_loss_iou: 0.2028 d4.dn_loss_cls: 0.1002 d4.dn_loss_bbox: 0.1512 d4.dn_loss_iou: 0.2003 d1.loss_lmm_region: 0.1305 loss_lmm_image: 0.8427 2024/11/12 21:32:56 - mmengine - INFO - Iter(train) [100900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:23:16 time: 2.0163 data_time: 0.0185 memory: 34891 grad_norm: 28.6331 loss: 9.9561 loss_cls: 0.3183 loss_bbox: 0.1428 loss_iou: 0.2998 d0.loss_cls: 0.3595 d0.loss_bbox: 0.1625 d0.loss_iou: 0.3191 d1.loss_cls: 0.3406 d1.loss_bbox: 0.1498 d1.loss_iou: 0.3085 d2.loss_cls: 0.3360 d2.loss_bbox: 0.1425 d2.loss_iou: 0.2981 d3.loss_cls: 0.3236 d3.loss_bbox: 0.1430 d3.loss_iou: 0.2979 d4.loss_cls: 0.3210 d4.loss_bbox: 0.1432 d4.loss_iou: 0.2971 enc_loss_cls: 0.3731 enc_loss_bbox: 0.1753 enc_loss_iou: 0.3459 dn_loss_cls: 0.0893 dn_loss_bbox: 0.1663 dn_loss_iou: 0.2400 d0.dn_loss_cls: 0.1695 d0.dn_loss_bbox: 0.2999 d0.dn_loss_iou: 0.3903 d1.dn_loss_cls: 0.1184 d1.dn_loss_bbox: 0.1917 d1.dn_loss_iou: 0.2682 d2.dn_loss_cls: 0.0991 d2.dn_loss_bbox: 0.1720 d2.dn_loss_iou: 0.2467 d3.dn_loss_cls: 0.0922 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2424 d4.dn_loss_cls: 0.0893 d4.dn_loss_bbox: 0.1662 d4.dn_loss_iou: 0.2399 d1.loss_lmm_region: 0.1072 loss_lmm_image: 0.8007 2024/11/12 21:36:16 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 21:36:16 - mmengine - INFO - Iter(train) [101000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:19:55 time: 1.9954 data_time: 0.0183 memory: 35063 grad_norm: 29.3830 loss: 9.4022 loss_cls: 0.3231 loss_bbox: 0.1385 loss_iou: 0.2646 d0.loss_cls: 0.3630 d0.loss_bbox: 0.1613 d0.loss_iou: 0.2804 d1.loss_cls: 0.3418 d1.loss_bbox: 0.1469 d1.loss_iou: 0.2720 d2.loss_cls: 0.3346 d2.loss_bbox: 0.1410 d2.loss_iou: 0.2703 d3.loss_cls: 0.3268 d3.loss_bbox: 0.1424 d3.loss_iou: 0.2670 d4.loss_cls: 0.3256 d4.loss_bbox: 0.1415 d4.loss_iou: 0.2654 enc_loss_cls: 0.3717 enc_loss_bbox: 0.1776 enc_loss_iou: 0.3110 dn_loss_cls: 0.0875 dn_loss_bbox: 0.1427 dn_loss_iou: 0.1925 d0.dn_loss_cls: 0.1679 d0.dn_loss_bbox: 0.2965 d0.dn_loss_iou: 0.3397 d1.dn_loss_cls: 0.1174 d1.dn_loss_bbox: 0.1802 d1.dn_loss_iou: 0.2247 d2.dn_loss_cls: 0.0988 d2.dn_loss_bbox: 0.1546 d2.dn_loss_iou: 0.2029 d3.dn_loss_cls: 0.0908 d3.dn_loss_bbox: 0.1467 d3.dn_loss_iou: 0.1956 d4.dn_loss_cls: 0.0873 d4.dn_loss_bbox: 0.1428 d4.dn_loss_iou: 0.1926 d1.loss_lmm_region: 0.1233 loss_lmm_image: 0.8512 2024/11/12 21:39:33 - mmengine - INFO - Iter(train) [101100/150000] base_lr: 1.0000e-04 lr: 
1.0000e-04 eta: 1 day, 3:16:32 time: 1.9678 data_time: 0.0184 memory: 35504 grad_norm: 33.6035 loss: 9.5161 loss_cls: 0.3074 loss_bbox: 0.1335 loss_iou: 0.2572 d0.loss_cls: 0.3496 d0.loss_bbox: 0.1470 d0.loss_iou: 0.2736 d1.loss_cls: 0.3300 d1.loss_bbox: 0.1358 d1.loss_iou: 0.2596 d2.loss_cls: 0.3199 d2.loss_bbox: 0.1320 d2.loss_iou: 0.2582 d3.loss_cls: 0.3105 d3.loss_bbox: 0.1354 d3.loss_iou: 0.2594 d4.loss_cls: 0.3068 d4.loss_bbox: 0.1348 d4.loss_iou: 0.2599 enc_loss_cls: 0.3524 enc_loss_bbox: 0.1612 enc_loss_iou: 0.2927 dn_loss_cls: 0.1003 dn_loss_bbox: 0.1744 dn_loss_iou: 0.2132 d0.dn_loss_cls: 0.1739 d0.dn_loss_bbox: 0.2944 d0.dn_loss_iou: 0.3459 d1.dn_loss_cls: 0.1320 d1.dn_loss_bbox: 0.2001 d1.dn_loss_iou: 0.2417 d2.dn_loss_cls: 0.1140 d2.dn_loss_bbox: 0.1861 d2.dn_loss_iou: 0.2225 d3.dn_loss_cls: 0.1047 d3.dn_loss_bbox: 0.1782 d3.dn_loss_iou: 0.2155 d4.dn_loss_cls: 0.1016 d4.dn_loss_bbox: 0.1744 d4.dn_loss_iou: 0.2132 d1.loss_lmm_region: 0.1103 loss_lmm_image: 0.9025 2024/11/12 21:42:51 - mmengine - INFO - Iter(train) [101200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:13:10 time: 1.9622 data_time: 0.0184 memory: 35100 grad_norm: 28.2395 loss: 8.8186 loss_cls: 0.2722 loss_bbox: 0.1323 loss_iou: 0.2093 d0.loss_cls: 0.3146 d0.loss_bbox: 0.1382 d0.loss_iou: 0.2193 d1.loss_cls: 0.2964 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2077 d2.loss_cls: 0.2861 d2.loss_bbox: 0.1267 d2.loss_iou: 0.2062 d3.loss_cls: 0.2820 d3.loss_bbox: 0.1273 d3.loss_iou: 0.2063 d4.loss_cls: 0.2769 d4.loss_bbox: 0.1295 d4.loss_iou: 0.2075 enc_loss_cls: 0.3178 enc_loss_bbox: 0.1531 enc_loss_iou: 0.2381 dn_loss_cls: 0.1010 dn_loss_bbox: 0.1667 dn_loss_iou: 0.2095 d0.dn_loss_cls: 0.1918 d0.dn_loss_bbox: 0.3122 d0.dn_loss_iou: 0.3489 d1.dn_loss_cls: 0.1392 d1.dn_loss_bbox: 0.1959 d1.dn_loss_iou: 0.2373 d2.dn_loss_cls: 0.1133 d2.dn_loss_bbox: 0.1734 d2.dn_loss_iou: 0.2173 d3.dn_loss_cls: 0.1057 d3.dn_loss_bbox: 0.1685 d3.dn_loss_iou: 0.2117 d4.dn_loss_cls: 0.1021 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2094 d1.loss_lmm_region: 0.1352 loss_lmm_image: 0.8378 2024/11/12 21:46:08 - mmengine - INFO - Iter(train) [101300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:09:48 time: 1.9640 data_time: 0.0184 memory: 34692 grad_norm: 29.8744 loss: 10.3636 loss_cls: 0.2723 loss_bbox: 0.1738 loss_iou: 0.3071 d0.loss_cls: 0.3206 d0.loss_bbox: 0.1764 d0.loss_iou: 0.3117 d1.loss_cls: 0.3005 d1.loss_bbox: 0.1732 d1.loss_iou: 0.3036 d2.loss_cls: 0.2866 d2.loss_bbox: 0.1707 d2.loss_iou: 0.3027 d3.loss_cls: 0.2725 d3.loss_bbox: 0.1767 d3.loss_iou: 0.3063 d4.loss_cls: 0.2700 d4.loss_bbox: 0.1741 d4.loss_iou: 0.3071 enc_loss_cls: 0.3269 enc_loss_bbox: 0.1898 enc_loss_iou: 0.3333 dn_loss_cls: 0.1102 dn_loss_bbox: 0.2054 dn_loss_iou: 0.2608 d0.dn_loss_cls: 0.1941 d0.dn_loss_bbox: 0.3586 d0.dn_loss_iou: 0.4044 d1.dn_loss_cls: 0.1423 d1.dn_loss_bbox: 0.2356 d1.dn_loss_iou: 0.2893 d2.dn_loss_cls: 0.1200 d2.dn_loss_bbox: 0.2128 d2.dn_loss_iou: 0.2681 d3.dn_loss_cls: 0.1142 d3.dn_loss_bbox: 0.2063 d3.dn_loss_iou: 0.2621 d4.dn_loss_cls: 0.1098 d4.dn_loss_bbox: 0.2054 d4.dn_loss_iou: 0.2608 d1.loss_lmm_region: 0.1402 loss_lmm_image: 0.8072 2024/11/12 21:49:28 - mmengine - INFO - Iter(train) [101400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:06:27 time: 2.0007 data_time: 0.0183 memory: 34848 grad_norm: 25.2948 loss: 9.9339 loss_cls: 0.3183 loss_bbox: 0.1612 loss_iou: 0.2833 d0.loss_cls: 0.3615 d0.loss_bbox: 0.1766 d0.loss_iou: 0.2997 d1.loss_cls: 0.3379 d1.loss_bbox: 0.1672 d1.loss_iou: 0.2921 d2.loss_cls: 0.3261 
d2.loss_bbox: 0.1647 d2.loss_iou: 0.2880 d3.loss_cls: 0.3193 d3.loss_bbox: 0.1624 d3.loss_iou: 0.2865 d4.loss_cls: 0.3204 d4.loss_bbox: 0.1612 d4.loss_iou: 0.2823 enc_loss_cls: 0.3644 enc_loss_bbox: 0.1933 enc_loss_iou: 0.3259 dn_loss_cls: 0.0908 dn_loss_bbox: 0.1721 dn_loss_iou: 0.2143 d0.dn_loss_cls: 0.1707 d0.dn_loss_bbox: 0.3078 d0.dn_loss_iou: 0.3513 d1.dn_loss_cls: 0.1190 d1.dn_loss_bbox: 0.2041 d1.dn_loss_iou: 0.2439 d2.dn_loss_cls: 0.0988 d2.dn_loss_bbox: 0.1807 d2.dn_loss_iou: 0.2223 d3.dn_loss_cls: 0.0919 d3.dn_loss_bbox: 0.1736 d3.dn_loss_iou: 0.2164 d4.dn_loss_cls: 0.0893 d4.dn_loss_bbox: 0.1721 d4.dn_loss_iou: 0.2142 d1.loss_lmm_region: 0.1294 loss_lmm_image: 0.8790 2024/11/12 21:52:48 - mmengine - INFO - Iter(train) [101500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 3:03:05 time: 1.9854 data_time: 0.0183 memory: 34107 grad_norm: 25.1596 loss: 10.3103 loss_cls: 0.3294 loss_bbox: 0.1626 loss_iou: 0.3108 d0.loss_cls: 0.3729 d0.loss_bbox: 0.1806 d0.loss_iou: 0.3309 d1.loss_cls: 0.3519 d1.loss_bbox: 0.1676 d1.loss_iou: 0.3167 d2.loss_cls: 0.3458 d2.loss_bbox: 0.1604 d2.loss_iou: 0.3092 d3.loss_cls: 0.3319 d3.loss_bbox: 0.1637 d3.loss_iou: 0.3110 d4.loss_cls: 0.3279 d4.loss_bbox: 0.1630 d4.loss_iou: 0.3117 enc_loss_cls: 0.3835 enc_loss_bbox: 0.1926 enc_loss_iou: 0.3517 dn_loss_cls: 0.1085 dn_loss_bbox: 0.1656 dn_loss_iou: 0.2332 d0.dn_loss_cls: 0.1828 d0.dn_loss_bbox: 0.2835 d0.dn_loss_iou: 0.3663 d1.dn_loss_cls: 0.1328 d1.dn_loss_bbox: 0.1914 d1.dn_loss_iou: 0.2608 d2.dn_loss_cls: 0.1189 d2.dn_loss_bbox: 0.1745 d2.dn_loss_iou: 0.2418 d3.dn_loss_cls: 0.1136 d3.dn_loss_bbox: 0.1668 d3.dn_loss_iou: 0.2350 d4.dn_loss_cls: 0.1110 d4.dn_loss_bbox: 0.1656 d4.dn_loss_iou: 0.2330 d1.loss_lmm_region: 0.1300 loss_lmm_image: 0.8195 2024/11/12 21:56:10 - mmengine - INFO - Iter(train) [101600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:59:45 time: 2.0054 data_time: 0.0183 memory: 34868 grad_norm: 31.7143 loss: 9.6366 loss_cls: 0.2952 loss_bbox: 0.1395 loss_iou: 0.2597 d0.loss_cls: 0.3419 d0.loss_bbox: 0.1532 d0.loss_iou: 0.2757 d1.loss_cls: 0.3222 d1.loss_bbox: 0.1393 d1.loss_iou: 0.2642 d2.loss_cls: 0.3097 d2.loss_bbox: 0.1398 d2.loss_iou: 0.2644 d3.loss_cls: 0.3025 d3.loss_bbox: 0.1385 d3.loss_iou: 0.2611 d4.loss_cls: 0.2960 d4.loss_bbox: 0.1398 d4.loss_iou: 0.2601 enc_loss_cls: 0.3481 enc_loss_bbox: 0.1684 enc_loss_iou: 0.2966 dn_loss_cls: 0.1216 dn_loss_bbox: 0.1639 dn_loss_iou: 0.2163 d0.dn_loss_cls: 0.1983 d0.dn_loss_bbox: 0.3162 d0.dn_loss_iou: 0.3590 d1.dn_loss_cls: 0.1538 d1.dn_loss_bbox: 0.2038 d1.dn_loss_iou: 0.2497 d2.dn_loss_cls: 0.1331 d2.dn_loss_bbox: 0.1768 d2.dn_loss_iou: 0.2271 d3.dn_loss_cls: 0.1224 d3.dn_loss_bbox: 0.1664 d3.dn_loss_iou: 0.2188 d4.dn_loss_cls: 0.1191 d4.dn_loss_bbox: 0.1640 d4.dn_loss_iou: 0.2162 d1.loss_lmm_region: 0.1503 loss_lmm_image: 0.8437 2024/11/12 21:59:29 - mmengine - INFO - Iter(train) [101700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:56:23 time: 2.0024 data_time: 0.0183 memory: 33690 grad_norm: 31.3205 loss: 8.3444 loss_cls: 0.2498 loss_bbox: 0.1156 loss_iou: 0.2427 d0.loss_cls: 0.2861 d0.loss_bbox: 0.1277 d0.loss_iou: 0.2578 d1.loss_cls: 0.2614 d1.loss_bbox: 0.1250 d1.loss_iou: 0.2540 d2.loss_cls: 0.2562 d2.loss_bbox: 0.1207 d2.loss_iou: 0.2475 d3.loss_cls: 0.2533 d3.loss_bbox: 0.1170 d3.loss_iou: 0.2443 d4.loss_cls: 0.2521 d4.loss_bbox: 0.1163 d4.loss_iou: 0.2427 enc_loss_cls: 0.2933 enc_loss_bbox: 0.1414 enc_loss_iou: 0.2783 dn_loss_cls: 0.0856 dn_loss_bbox: 0.1401 dn_loss_iou: 0.1946 d0.dn_loss_cls: 
0.1470 d0.dn_loss_bbox: 0.2685 d0.dn_loss_iou: 0.3301 d1.dn_loss_cls: 0.1076 d1.dn_loss_bbox: 0.1650 d1.dn_loss_iou: 0.2215 d2.dn_loss_cls: 0.0929 d2.dn_loss_bbox: 0.1466 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.0870 d3.dn_loss_bbox: 0.1404 d3.dn_loss_iou: 0.1961 d4.dn_loss_cls: 0.0852 d4.dn_loss_bbox: 0.1401 d4.dn_loss_iou: 0.1947 d1.loss_lmm_region: 0.0950 loss_lmm_image: 0.8211 2024/11/12 22:02:47 - mmengine - INFO - Iter(train) [101800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:53:01 time: 2.0177 data_time: 0.0184 memory: 33183 grad_norm: 35.4148 loss: 8.4689 loss_cls: 0.2455 loss_bbox: 0.1323 loss_iou: 0.2072 d0.loss_cls: 0.2844 d0.loss_bbox: 0.1424 d0.loss_iou: 0.2174 d1.loss_cls: 0.2602 d1.loss_bbox: 0.1332 d1.loss_iou: 0.2096 d2.loss_cls: 0.2524 d2.loss_bbox: 0.1345 d2.loss_iou: 0.2096 d3.loss_cls: 0.2433 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2108 d4.loss_cls: 0.2433 d4.loss_bbox: 0.1333 d4.loss_iou: 0.2085 enc_loss_cls: 0.2837 enc_loss_bbox: 0.1560 enc_loss_iou: 0.2398 dn_loss_cls: 0.0993 dn_loss_bbox: 0.1557 dn_loss_iou: 0.2011 d0.dn_loss_cls: 0.1767 d0.dn_loss_bbox: 0.2850 d0.dn_loss_iou: 0.3265 d1.dn_loss_cls: 0.1278 d1.dn_loss_bbox: 0.1820 d1.dn_loss_iou: 0.2263 d2.dn_loss_cls: 0.1103 d2.dn_loss_bbox: 0.1611 d2.dn_loss_iou: 0.2084 d3.dn_loss_cls: 0.1057 d3.dn_loss_bbox: 0.1567 d3.dn_loss_iou: 0.2035 d4.dn_loss_cls: 0.1005 d4.dn_loss_bbox: 0.1557 d4.dn_loss_iou: 0.2011 d1.loss_lmm_region: 0.1240 loss_lmm_image: 0.8782 2024/11/12 22:06:06 - mmengine - INFO - Iter(train) [101900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:49:40 time: 1.9965 data_time: 0.0183 memory: 34080 grad_norm: 36.8366 loss: 9.3702 loss_cls: 0.2928 loss_bbox: 0.1434 loss_iou: 0.2453 d0.loss_cls: 0.3387 d0.loss_bbox: 0.1552 d0.loss_iou: 0.2671 d1.loss_cls: 0.3087 d1.loss_bbox: 0.1505 d1.loss_iou: 0.2582 d2.loss_cls: 0.3002 d2.loss_bbox: 0.1448 d2.loss_iou: 0.2511 d3.loss_cls: 0.2938 d3.loss_bbox: 0.1424 d3.loss_iou: 0.2467 d4.loss_cls: 0.2945 d4.loss_bbox: 0.1437 d4.loss_iou: 0.2461 enc_loss_cls: 0.3422 enc_loss_bbox: 0.1708 enc_loss_iou: 0.2888 dn_loss_cls: 0.0906 dn_loss_bbox: 0.1761 dn_loss_iou: 0.2123 d0.dn_loss_cls: 0.1760 d0.dn_loss_bbox: 0.3353 d0.dn_loss_iou: 0.3650 d1.dn_loss_cls: 0.1274 d1.dn_loss_bbox: 0.2096 d1.dn_loss_iou: 0.2450 d2.dn_loss_cls: 0.1056 d2.dn_loss_bbox: 0.1885 d2.dn_loss_iou: 0.2238 d3.dn_loss_cls: 0.0968 d3.dn_loss_bbox: 0.1783 d3.dn_loss_iou: 0.2150 d4.dn_loss_cls: 0.0917 d4.dn_loss_bbox: 0.1761 d4.dn_loss_iou: 0.2122 d1.loss_lmm_region: 0.1287 loss_lmm_image: 0.7912 2024/11/12 22:09:26 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 22:09:26 - mmengine - INFO - Iter(train) [102000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:46:18 time: 1.9865 data_time: 0.0184 memory: 34042 grad_norm: 26.6360 loss: 9.6471 loss_cls: 0.2714 loss_bbox: 0.1644 loss_iou: 0.2677 d0.loss_cls: 0.3200 d0.loss_bbox: 0.1707 d0.loss_iou: 0.2802 d1.loss_cls: 0.2986 d1.loss_bbox: 0.1605 d1.loss_iou: 0.2705 d2.loss_cls: 0.2862 d2.loss_bbox: 0.1618 d2.loss_iou: 0.2685 d3.loss_cls: 0.2751 d3.loss_bbox: 0.1637 d3.loss_iou: 0.2684 d4.loss_cls: 0.2714 d4.loss_bbox: 0.1643 d4.loss_iou: 0.2680 enc_loss_cls: 0.3276 enc_loss_bbox: 0.1855 enc_loss_iou: 0.3059 dn_loss_cls: 0.1179 dn_loss_bbox: 0.1747 dn_loss_iou: 0.2258 d0.dn_loss_cls: 0.1857 d0.dn_loss_bbox: 0.3178 d0.dn_loss_iou: 0.3703 d1.dn_loss_cls: 0.1426 d1.dn_loss_bbox: 0.2060 d1.dn_loss_iou: 0.2569 d2.dn_loss_cls: 0.1250 d2.dn_loss_bbox: 0.1854 d2.dn_loss_iou: 0.2354 d3.dn_loss_cls: 0.1213 
d3.dn_loss_bbox: 0.1767 d3.dn_loss_iou: 0.2283 d4.dn_loss_cls: 0.1161 d4.dn_loss_bbox: 0.1746 d4.dn_loss_iou: 0.2256 d1.loss_lmm_region: 0.1243 loss_lmm_image: 0.7863 2024/11/12 22:12:44 - mmengine - INFO - Iter(train) [102100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:42:56 time: 2.0005 data_time: 0.0184 memory: 33996 grad_norm: 26.9200 loss: 9.1959 loss_cls: 0.2703 loss_bbox: 0.1420 loss_iou: 0.2395 d0.loss_cls: 0.3080 d0.loss_bbox: 0.1570 d0.loss_iou: 0.2580 d1.loss_cls: 0.2843 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2508 d2.loss_cls: 0.2764 d2.loss_bbox: 0.1441 d2.loss_iou: 0.2438 d3.loss_cls: 0.2724 d3.loss_bbox: 0.1421 d3.loss_iou: 0.2438 d4.loss_cls: 0.2693 d4.loss_bbox: 0.1412 d4.loss_iou: 0.2400 enc_loss_cls: 0.3164 enc_loss_bbox: 0.1654 enc_loss_iou: 0.2744 dn_loss_cls: 0.1176 dn_loss_bbox: 0.1565 dn_loss_iou: 0.2156 d0.dn_loss_cls: 0.1970 d0.dn_loss_bbox: 0.2864 d0.dn_loss_iou: 0.3531 d1.dn_loss_cls: 0.1504 d1.dn_loss_bbox: 0.1765 d1.dn_loss_iou: 0.2412 d2.dn_loss_cls: 0.1283 d2.dn_loss_bbox: 0.1622 d2.dn_loss_iou: 0.2240 d3.dn_loss_cls: 0.1201 d3.dn_loss_bbox: 0.1578 d3.dn_loss_iou: 0.2179 d4.dn_loss_cls: 0.1177 d4.dn_loss_bbox: 0.1565 d4.dn_loss_iou: 0.2156 d1.loss_lmm_region: 0.1395 loss_lmm_image: 0.8733 2024/11/12 22:16:03 - mmengine - INFO - Iter(train) [102200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:39:35 time: 1.9889 data_time: 0.0184 memory: 34648 grad_norm: 28.9530 loss: 9.0534 loss_cls: 0.2612 loss_bbox: 0.1379 loss_iou: 0.2424 d0.loss_cls: 0.3019 d0.loss_bbox: 0.1401 d0.loss_iou: 0.2475 d1.loss_cls: 0.2854 d1.loss_bbox: 0.1395 d1.loss_iou: 0.2400 d2.loss_cls: 0.2741 d2.loss_bbox: 0.1353 d2.loss_iou: 0.2406 d3.loss_cls: 0.2654 d3.loss_bbox: 0.1359 d3.loss_iou: 0.2428 d4.loss_cls: 0.2621 d4.loss_bbox: 0.1381 d4.loss_iou: 0.2430 enc_loss_cls: 0.3092 enc_loss_bbox: 0.1609 enc_loss_iou: 0.2742 dn_loss_cls: 0.0851 dn_loss_bbox: 0.1850 dn_loss_iou: 0.2141 d0.dn_loss_cls: 0.1571 d0.dn_loss_bbox: 0.3462 d0.dn_loss_iou: 0.3545 d1.dn_loss_cls: 0.1162 d1.dn_loss_bbox: 0.2214 d1.dn_loss_iou: 0.2440 d2.dn_loss_cls: 0.0983 d2.dn_loss_bbox: 0.1929 d2.dn_loss_iou: 0.2219 d3.dn_loss_cls: 0.0896 d3.dn_loss_bbox: 0.1864 d3.dn_loss_iou: 0.2160 d4.dn_loss_cls: 0.0867 d4.dn_loss_bbox: 0.1850 d4.dn_loss_iou: 0.2139 d1.loss_lmm_region: 0.1171 loss_lmm_image: 0.8444 2024/11/12 22:19:21 - mmengine - INFO - Iter(train) [102300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:36:13 time: 1.9650 data_time: 0.0184 memory: 33956 grad_norm: 32.4816 loss: 8.8514 loss_cls: 0.2622 loss_bbox: 0.1354 loss_iou: 0.2381 d0.loss_cls: 0.2934 d0.loss_bbox: 0.1469 d0.loss_iou: 0.2522 d1.loss_cls: 0.2770 d1.loss_bbox: 0.1402 d1.loss_iou: 0.2441 d2.loss_cls: 0.2647 d2.loss_bbox: 0.1391 d2.loss_iou: 0.2427 d3.loss_cls: 0.2612 d3.loss_bbox: 0.1363 d3.loss_iou: 0.2385 d4.loss_cls: 0.2602 d4.loss_bbox: 0.1357 d4.loss_iou: 0.2383 enc_loss_cls: 0.3140 enc_loss_bbox: 0.1620 enc_loss_iou: 0.2690 dn_loss_cls: 0.0643 dn_loss_bbox: 0.1792 dn_loss_iou: 0.2190 d0.dn_loss_cls: 0.1470 d0.dn_loss_bbox: 0.3318 d0.dn_loss_iou: 0.3552 d1.dn_loss_cls: 0.0950 d1.dn_loss_bbox: 0.2112 d1.dn_loss_iou: 0.2475 d2.dn_loss_cls: 0.0743 d2.dn_loss_bbox: 0.1869 d2.dn_loss_iou: 0.2271 d3.dn_loss_cls: 0.0689 d3.dn_loss_bbox: 0.1812 d3.dn_loss_iou: 0.2214 d4.dn_loss_cls: 0.0638 d4.dn_loss_bbox: 0.1792 d4.dn_loss_iou: 0.2191 d1.loss_lmm_region: 0.1310 loss_lmm_image: 0.7971 2024/11/12 22:22:41 - mmengine - INFO - Iter(train) [102400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:32:51 time: 2.0121 
data_time: 0.0184 memory: 34846 grad_norm: 35.8186 loss: 9.5071 loss_cls: 0.3017 loss_bbox: 0.1408 loss_iou: 0.2717 d0.loss_cls: 0.3467 d0.loss_bbox: 0.1496 d0.loss_iou: 0.2863 d1.loss_cls: 0.3218 d1.loss_bbox: 0.1494 d1.loss_iou: 0.2833 d2.loss_cls: 0.3152 d2.loss_bbox: 0.1440 d2.loss_iou: 0.2774 d3.loss_cls: 0.3075 d3.loss_bbox: 0.1419 d3.loss_iou: 0.2754 d4.loss_cls: 0.3040 d4.loss_bbox: 0.1401 d4.loss_iou: 0.2713 enc_loss_cls: 0.3533 enc_loss_bbox: 0.1648 enc_loss_iou: 0.3130 dn_loss_cls: 0.0978 dn_loss_bbox: 0.1492 dn_loss_iou: 0.2167 d0.dn_loss_cls: 0.1731 d0.dn_loss_bbox: 0.2862 d0.dn_loss_iou: 0.3573 d1.dn_loss_cls: 0.1250 d1.dn_loss_bbox: 0.1797 d1.dn_loss_iou: 0.2472 d2.dn_loss_cls: 0.1086 d2.dn_loss_bbox: 0.1580 d2.dn_loss_iou: 0.2255 d3.dn_loss_cls: 0.1026 d3.dn_loss_bbox: 0.1520 d3.dn_loss_iou: 0.2192 d4.dn_loss_cls: 0.0985 d4.dn_loss_bbox: 0.1491 d4.dn_loss_iou: 0.2166 d1.loss_lmm_region: 0.1176 loss_lmm_image: 0.8681 2024/11/12 22:26:01 - mmengine - INFO - Iter(train) [102500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:29:30 time: 1.9930 data_time: 0.0186 memory: 35227 grad_norm: 27.6492 loss: 9.1826 loss_cls: 0.2801 loss_bbox: 0.1419 loss_iou: 0.2614 d0.loss_cls: 0.3271 d0.loss_bbox: 0.1501 d0.loss_iou: 0.2769 d1.loss_cls: 0.3012 d1.loss_bbox: 0.1470 d1.loss_iou: 0.2672 d2.loss_cls: 0.2892 d2.loss_bbox: 0.1458 d2.loss_iou: 0.2657 d3.loss_cls: 0.2847 d3.loss_bbox: 0.1438 d3.loss_iou: 0.2628 d4.loss_cls: 0.2831 d4.loss_bbox: 0.1415 d4.loss_iou: 0.2610 enc_loss_cls: 0.3234 enc_loss_bbox: 0.1681 enc_loss_iou: 0.2989 dn_loss_cls: 0.0852 dn_loss_bbox: 0.1528 dn_loss_iou: 0.2086 d0.dn_loss_cls: 0.1703 d0.dn_loss_bbox: 0.3002 d0.dn_loss_iou: 0.3494 d1.dn_loss_cls: 0.1169 d1.dn_loss_bbox: 0.1844 d1.dn_loss_iou: 0.2377 d2.dn_loss_cls: 0.0962 d2.dn_loss_bbox: 0.1637 d2.dn_loss_iou: 0.2177 d3.dn_loss_cls: 0.0885 d3.dn_loss_bbox: 0.1555 d3.dn_loss_iou: 0.2107 d4.dn_loss_cls: 0.0853 d4.dn_loss_bbox: 0.1528 d4.dn_loss_iou: 0.2085 d1.loss_lmm_region: 0.1135 loss_lmm_image: 0.8635 2024/11/12 22:29:18 - mmengine - INFO - Iter(train) [102600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:26:08 time: 1.9794 data_time: 0.0183 memory: 33538 grad_norm: 30.1121 loss: 8.5818 loss_cls: 0.2224 loss_bbox: 0.1298 loss_iou: 0.2236 d0.loss_cls: 0.2694 d0.loss_bbox: 0.1336 d0.loss_iou: 0.2298 d1.loss_cls: 0.2398 d1.loss_bbox: 0.1314 d1.loss_iou: 0.2255 d2.loss_cls: 0.2355 d2.loss_bbox: 0.1272 d2.loss_iou: 0.2218 d3.loss_cls: 0.2266 d3.loss_bbox: 0.1302 d3.loss_iou: 0.2236 d4.loss_cls: 0.2238 d4.loss_bbox: 0.1263 d4.loss_iou: 0.2230 enc_loss_cls: 0.2699 enc_loss_bbox: 0.1477 enc_loss_iou: 0.2524 dn_loss_cls: 0.1045 dn_loss_bbox: 0.1695 dn_loss_iou: 0.2183 d0.dn_loss_cls: 0.1661 d0.dn_loss_bbox: 0.3216 d0.dn_loss_iou: 0.3668 d1.dn_loss_cls: 0.1259 d1.dn_loss_bbox: 0.2061 d1.dn_loss_iou: 0.2533 d2.dn_loss_cls: 0.1113 d2.dn_loss_bbox: 0.1816 d2.dn_loss_iou: 0.2300 d3.dn_loss_cls: 0.1060 d3.dn_loss_bbox: 0.1734 d3.dn_loss_iou: 0.2216 d4.dn_loss_cls: 0.1018 d4.dn_loss_bbox: 0.1695 d4.dn_loss_iou: 0.2183 d1.loss_lmm_region: 0.1094 loss_lmm_image: 0.8133 2024/11/12 22:32:36 - mmengine - INFO - Iter(train) [102700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:22:46 time: 1.9488 data_time: 0.0184 memory: 33475 grad_norm: 32.8700 loss: 10.3068 loss_cls: 0.3156 loss_bbox: 0.1609 loss_iou: 0.2591 d0.loss_cls: 0.3484 d0.loss_bbox: 0.1772 d0.loss_iou: 0.2791 d1.loss_cls: 0.3271 d1.loss_bbox: 0.1726 d1.loss_iou: 0.2731 d2.loss_cls: 0.3224 d2.loss_bbox: 0.1637 d2.loss_iou: 0.2656 
d3.loss_cls: 0.3169 d3.loss_bbox: 0.1641 d3.loss_iou: 0.2641 d4.loss_cls: 0.3194 d4.loss_bbox: 0.1595 d4.loss_iou: 0.2617 enc_loss_cls: 0.3559 enc_loss_bbox: 0.1935 enc_loss_iou: 0.3040 dn_loss_cls: 0.1010 dn_loss_bbox: 0.2106 dn_loss_iou: 0.2443 d0.dn_loss_cls: 0.2007 d0.dn_loss_bbox: 0.3618 d0.dn_loss_iou: 0.3972 d1.dn_loss_cls: 0.1428 d1.dn_loss_bbox: 0.2430 d1.dn_loss_iou: 0.2762 d2.dn_loss_cls: 0.1174 d2.dn_loss_bbox: 0.2185 d2.dn_loss_iou: 0.2528 d3.dn_loss_cls: 0.1064 d3.dn_loss_bbox: 0.2133 d3.dn_loss_iou: 0.2470 d4.dn_loss_cls: 0.1023 d4.dn_loss_bbox: 0.2106 d4.dn_loss_iou: 0.2443 d1.loss_lmm_region: 0.1511 loss_lmm_image: 0.8615 2024/11/12 22:35:55 - mmengine - INFO - Iter(train) [102800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:19:24 time: 2.0079 data_time: 0.0182 memory: 32823 grad_norm: 23.2027 loss: 8.8751 loss_cls: 0.2707 loss_bbox: 0.1302 loss_iou: 0.2589 d0.loss_cls: 0.3098 d0.loss_bbox: 0.1433 d0.loss_iou: 0.2749 d1.loss_cls: 0.2857 d1.loss_bbox: 0.1359 d1.loss_iou: 0.2683 d2.loss_cls: 0.2808 d2.loss_bbox: 0.1297 d2.loss_iou: 0.2615 d3.loss_cls: 0.2753 d3.loss_bbox: 0.1290 d3.loss_iou: 0.2596 d4.loss_cls: 0.2712 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2591 enc_loss_cls: 0.3226 enc_loss_bbox: 0.1517 enc_loss_iou: 0.2960 dn_loss_cls: 0.0730 dn_loss_bbox: 0.1537 dn_loss_iou: 0.2088 d0.dn_loss_cls: 0.1492 d0.dn_loss_bbox: 0.2892 d0.dn_loss_iou: 0.3403 d1.dn_loss_cls: 0.1036 d1.dn_loss_bbox: 0.1809 d1.dn_loss_iou: 0.2339 d2.dn_loss_cls: 0.0840 d2.dn_loss_bbox: 0.1598 d2.dn_loss_iou: 0.2155 d3.dn_loss_cls: 0.0785 d3.dn_loss_bbox: 0.1554 d3.dn_loss_iou: 0.2109 d4.dn_loss_cls: 0.0726 d4.dn_loss_bbox: 0.1537 d4.dn_loss_iou: 0.2087 d1.loss_lmm_region: 0.1158 loss_lmm_image: 0.8425 2024/11/12 22:39:14 - mmengine - INFO - Iter(train) [102900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:16:03 time: 1.9867 data_time: 0.0184 memory: 34348 grad_norm: 30.9081 loss: 10.5234 loss_cls: 0.3359 loss_bbox: 0.1699 loss_iou: 0.2538 d0.loss_cls: 0.3739 d0.loss_bbox: 0.1844 d0.loss_iou: 0.2681 d1.loss_cls: 0.3482 d1.loss_bbox: 0.1778 d1.loss_iou: 0.2628 d2.loss_cls: 0.3430 d2.loss_bbox: 0.1719 d2.loss_iou: 0.2578 d3.loss_cls: 0.3364 d3.loss_bbox: 0.1737 d3.loss_iou: 0.2573 d4.loss_cls: 0.3349 d4.loss_bbox: 0.1732 d4.loss_iou: 0.2561 enc_loss_cls: 0.3863 enc_loss_bbox: 0.1965 enc_loss_iou: 0.2860 dn_loss_cls: 0.1566 dn_loss_bbox: 0.1926 dn_loss_iou: 0.2268 d0.dn_loss_cls: 0.2310 d0.dn_loss_bbox: 0.3494 d0.dn_loss_iou: 0.3654 d1.dn_loss_cls: 0.1830 d1.dn_loss_bbox: 0.2298 d1.dn_loss_iou: 0.2556 d2.dn_loss_cls: 0.1626 d2.dn_loss_bbox: 0.2070 d2.dn_loss_iou: 0.2364 d3.dn_loss_cls: 0.1613 d3.dn_loss_bbox: 0.1958 d3.dn_loss_iou: 0.2300 d4.dn_loss_cls: 0.1559 d4.dn_loss_bbox: 0.1927 d4.dn_loss_iou: 0.2269 d1.loss_lmm_region: 0.1583 loss_lmm_image: 0.8584 2024/11/12 22:42:35 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 22:42:35 - mmengine - INFO - Iter(train) [103000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:12:42 time: 2.0153 data_time: 0.0183 memory: 32675 grad_norm: 35.2868 loss: 9.1903 loss_cls: 0.2827 loss_bbox: 0.1343 loss_iou: 0.2579 d0.loss_cls: 0.3267 d0.loss_bbox: 0.1454 d0.loss_iou: 0.2694 d1.loss_cls: 0.2999 d1.loss_bbox: 0.1383 d1.loss_iou: 0.2632 d2.loss_cls: 0.2929 d2.loss_bbox: 0.1364 d2.loss_iou: 0.2580 d3.loss_cls: 0.2872 d3.loss_bbox: 0.1335 d3.loss_iou: 0.2571 d4.loss_cls: 0.2836 d4.loss_bbox: 0.1347 d4.loss_iou: 0.2590 enc_loss_cls: 0.3308 enc_loss_bbox: 0.1595 enc_loss_iou: 0.2909 dn_loss_cls: 0.0796 dn_loss_bbox: 
0.1719 dn_loss_iou: 0.2187 d0.dn_loss_cls: 0.1524 d0.dn_loss_bbox: 0.3216 d0.dn_loss_iou: 0.3577 d1.dn_loss_cls: 0.1094 d1.dn_loss_bbox: 0.2051 d1.dn_loss_iou: 0.2465 d2.dn_loss_cls: 0.0926 d2.dn_loss_bbox: 0.1825 d2.dn_loss_iou: 0.2270 d3.dn_loss_cls: 0.0857 d3.dn_loss_bbox: 0.1743 d3.dn_loss_iou: 0.2208 d4.dn_loss_cls: 0.0798 d4.dn_loss_bbox: 0.1719 d4.dn_loss_iou: 0.2185 d1.loss_lmm_region: 0.1171 loss_lmm_image: 0.8161 2024/11/12 22:45:56 - mmengine - INFO - Iter(train) [103100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:09:21 time: 2.0135 data_time: 0.0184 memory: 35356 grad_norm: 29.5832 loss: 9.0520 loss_cls: 0.2687 loss_bbox: 0.1357 loss_iou: 0.2372 d0.loss_cls: 0.3120 d0.loss_bbox: 0.1466 d0.loss_iou: 0.2524 d1.loss_cls: 0.2857 d1.loss_bbox: 0.1435 d1.loss_iou: 0.2476 d2.loss_cls: 0.2827 d2.loss_bbox: 0.1375 d2.loss_iou: 0.2411 d3.loss_cls: 0.2760 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2371 d4.loss_cls: 0.2706 d4.loss_bbox: 0.1351 d4.loss_iou: 0.2374 enc_loss_cls: 0.3121 enc_loss_bbox: 0.1660 enc_loss_iou: 0.2799 dn_loss_cls: 0.0923 dn_loss_bbox: 0.1737 dn_loss_iou: 0.2009 d0.dn_loss_cls: 0.1743 d0.dn_loss_bbox: 0.3235 d0.dn_loss_iou: 0.3362 d1.dn_loss_cls: 0.1230 d1.dn_loss_bbox: 0.2107 d1.dn_loss_iou: 0.2339 d2.dn_loss_cls: 0.1056 d2.dn_loss_bbox: 0.1886 d2.dn_loss_iou: 0.2125 d3.dn_loss_cls: 0.0974 d3.dn_loss_bbox: 0.1756 d3.dn_loss_iou: 0.2037 d4.dn_loss_cls: 0.0929 d4.dn_loss_bbox: 0.1738 d4.dn_loss_iou: 0.2009 d1.loss_lmm_region: 0.1389 loss_lmm_image: 0.8530 2024/11/12 22:49:15 - mmengine - INFO - Iter(train) [103200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:06:00 time: 1.9738 data_time: 0.0185 memory: 35433 grad_norm: 33.6925 loss: 11.6296 loss_cls: 0.4438 loss_bbox: 0.1637 loss_iou: 0.2968 d0.loss_cls: 0.5102 d0.loss_bbox: 0.1682 d0.loss_iou: 0.3063 d1.loss_cls: 0.4735 d1.loss_bbox: 0.1651 d1.loss_iou: 0.3005 d2.loss_cls: 0.4605 d2.loss_bbox: 0.1603 d2.loss_iou: 0.2952 d3.loss_cls: 0.4518 d3.loss_bbox: 0.1627 d3.loss_iou: 0.2957 d4.loss_cls: 0.4447 d4.loss_bbox: 0.1632 d4.loss_iou: 0.2980 enc_loss_cls: 0.4973 enc_loss_bbox: 0.1922 enc_loss_iou: 0.3428 dn_loss_cls: 0.2159 dn_loss_bbox: 0.1597 dn_loss_iou: 0.2186 d0.dn_loss_cls: 0.3144 d0.dn_loss_bbox: 0.3103 d0.dn_loss_iou: 0.3634 d1.dn_loss_cls: 0.2819 d1.dn_loss_bbox: 0.1962 d1.dn_loss_iou: 0.2523 d2.dn_loss_cls: 0.2388 d2.dn_loss_bbox: 0.1696 d2.dn_loss_iou: 0.2284 d3.dn_loss_cls: 0.2196 d3.dn_loss_bbox: 0.1621 d3.dn_loss_iou: 0.2207 d4.dn_loss_cls: 0.2166 d4.dn_loss_bbox: 0.1596 d4.dn_loss_iou: 0.2185 d1.loss_lmm_region: 0.1444 loss_lmm_image: 0.7458 2024/11/12 22:52:35 - mmengine - INFO - Iter(train) [103300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 2:02:39 time: 1.9828 data_time: 0.0183 memory: 32904 grad_norm: 28.5864 loss: 9.9330 loss_cls: 0.3066 loss_bbox: 0.1421 loss_iou: 0.2546 d0.loss_cls: 0.3419 d0.loss_bbox: 0.1547 d0.loss_iou: 0.2710 d1.loss_cls: 0.3225 d1.loss_bbox: 0.1470 d1.loss_iou: 0.2608 d2.loss_cls: 0.3154 d2.loss_bbox: 0.1425 d2.loss_iou: 0.2575 d3.loss_cls: 0.3106 d3.loss_bbox: 0.1444 d3.loss_iou: 0.2552 d4.loss_cls: 0.3053 d4.loss_bbox: 0.1439 d4.loss_iou: 0.2558 enc_loss_cls: 0.3523 enc_loss_bbox: 0.1679 enc_loss_iou: 0.2908 dn_loss_cls: 0.1107 dn_loss_bbox: 0.1876 dn_loss_iou: 0.2425 d0.dn_loss_cls: 0.2027 d0.dn_loss_bbox: 0.3412 d0.dn_loss_iou: 0.3859 d1.dn_loss_cls: 0.1507 d1.dn_loss_bbox: 0.2230 d1.dn_loss_iou: 0.2721 d2.dn_loss_cls: 0.1263 d2.dn_loss_bbox: 0.1996 d2.dn_loss_iou: 0.2525 d3.dn_loss_cls: 0.1182 d3.dn_loss_bbox: 0.1903 d3.dn_loss_iou: 0.2447 
d4.dn_loss_cls: 0.1125 d4.dn_loss_bbox: 0.1876 d4.dn_loss_iou: 0.2426 d1.loss_lmm_region: 0.1442 loss_lmm_image: 0.8554 2024/11/12 22:55:54 - mmengine - INFO - Iter(train) [103400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:59:17 time: 1.9782 data_time: 0.0183 memory: 33066 grad_norm: 29.0708 loss: 8.9100 loss_cls: 0.2598 loss_bbox: 0.1210 loss_iou: 0.2282 d0.loss_cls: 0.3128 d0.loss_bbox: 0.1290 d0.loss_iou: 0.2416 d1.loss_cls: 0.2797 d1.loss_bbox: 0.1258 d1.loss_iou: 0.2348 d2.loss_cls: 0.2711 d2.loss_bbox: 0.1219 d2.loss_iou: 0.2312 d3.loss_cls: 0.2652 d3.loss_bbox: 0.1207 d3.loss_iou: 0.2280 d4.loss_cls: 0.2613 d4.loss_bbox: 0.1211 d4.loss_iou: 0.2286 enc_loss_cls: 0.3142 enc_loss_bbox: 0.1443 enc_loss_iou: 0.2585 dn_loss_cls: 0.0961 dn_loss_bbox: 0.1706 dn_loss_iou: 0.2158 d0.dn_loss_cls: 0.1738 d0.dn_loss_bbox: 0.3217 d0.dn_loss_iou: 0.3642 d1.dn_loss_cls: 0.1281 d1.dn_loss_bbox: 0.2067 d1.dn_loss_iou: 0.2505 d2.dn_loss_cls: 0.1103 d2.dn_loss_bbox: 0.1842 d2.dn_loss_iou: 0.2277 d3.dn_loss_cls: 0.1002 d3.dn_loss_bbox: 0.1730 d3.dn_loss_iou: 0.2189 d4.dn_loss_cls: 0.0953 d4.dn_loss_bbox: 0.1707 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1328 loss_lmm_image: 0.8550 2024/11/12 22:59:15 - mmengine - INFO - Iter(train) [103500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:55:56 time: 2.0134 data_time: 0.0184 memory: 34748 grad_norm: 30.6105 loss: 10.3736 loss_cls: 0.3410 loss_bbox: 0.1505 loss_iou: 0.2539 d0.loss_cls: 0.3732 d0.loss_bbox: 0.1677 d0.loss_iou: 0.2701 d1.loss_cls: 0.3554 d1.loss_bbox: 0.1594 d1.loss_iou: 0.2590 d2.loss_cls: 0.3434 d2.loss_bbox: 0.1561 d2.loss_iou: 0.2560 d3.loss_cls: 0.3394 d3.loss_bbox: 0.1547 d3.loss_iou: 0.2552 d4.loss_cls: 0.3443 d4.loss_bbox: 0.1491 d4.loss_iou: 0.2528 enc_loss_cls: 0.3840 enc_loss_bbox: 0.1797 enc_loss_iou: 0.2854 dn_loss_cls: 0.1480 dn_loss_bbox: 0.1886 dn_loss_iou: 0.2336 d0.dn_loss_cls: 0.2441 d0.dn_loss_bbox: 0.3375 d0.dn_loss_iou: 0.3769 d1.dn_loss_cls: 0.1887 d1.dn_loss_bbox: 0.2225 d1.dn_loss_iou: 0.2642 d2.dn_loss_cls: 0.1629 d2.dn_loss_bbox: 0.2028 d2.dn_loss_iou: 0.2444 d3.dn_loss_cls: 0.1476 d3.dn_loss_bbox: 0.1924 d3.dn_loss_iou: 0.2361 d4.dn_loss_cls: 0.1481 d4.dn_loss_bbox: 0.1887 d4.dn_loss_iou: 0.2335 d1.loss_lmm_region: 0.1598 loss_lmm_image: 0.8229 2024/11/12 23:02:34 - mmengine - INFO - Iter(train) [103600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:52:34 time: 1.9903 data_time: 0.0185 memory: 34801 grad_norm: 29.1177 loss: 8.7570 loss_cls: 0.2558 loss_bbox: 0.1233 loss_iou: 0.2266 d0.loss_cls: 0.2972 d0.loss_bbox: 0.1365 d0.loss_iou: 0.2447 d1.loss_cls: 0.2737 d1.loss_bbox: 0.1253 d1.loss_iou: 0.2335 d2.loss_cls: 0.2680 d2.loss_bbox: 0.1220 d2.loss_iou: 0.2265 d3.loss_cls: 0.2633 d3.loss_bbox: 0.1211 d3.loss_iou: 0.2239 d4.loss_cls: 0.2561 d4.loss_bbox: 0.1234 d4.loss_iou: 0.2271 enc_loss_cls: 0.3023 enc_loss_bbox: 0.1497 enc_loss_iou: 0.2657 dn_loss_cls: 0.1323 dn_loss_bbox: 0.1414 dn_loss_iou: 0.1961 d0.dn_loss_cls: 0.1996 d0.dn_loss_bbox: 0.2864 d0.dn_loss_iou: 0.3342 d1.dn_loss_cls: 0.1544 d1.dn_loss_bbox: 0.1742 d1.dn_loss_iou: 0.2279 d2.dn_loss_cls: 0.1430 d2.dn_loss_bbox: 0.1530 d2.dn_loss_iou: 0.2070 d3.dn_loss_cls: 0.1361 d3.dn_loss_bbox: 0.1441 d3.dn_loss_iou: 0.1990 d4.dn_loss_cls: 0.1291 d4.dn_loss_bbox: 0.1414 d4.dn_loss_iou: 0.1961 d1.loss_lmm_region: 0.1298 loss_lmm_image: 0.8661 2024/11/12 23:05:53 - mmengine - INFO - Iter(train) [103700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:49:13 time: 1.9702 data_time: 0.0183 memory: 33149 grad_norm: 24.5060 
loss: 8.5150 loss_cls: 0.2461 loss_bbox: 0.1320 loss_iou: 0.2090 d0.loss_cls: 0.2768 d0.loss_bbox: 0.1394 d0.loss_iou: 0.2221 d1.loss_cls: 0.2587 d1.loss_bbox: 0.1353 d1.loss_iou: 0.2160 d2.loss_cls: 0.2524 d2.loss_bbox: 0.1306 d2.loss_iou: 0.2087 d3.loss_cls: 0.2522 d3.loss_bbox: 0.1294 d3.loss_iou: 0.2082 d4.loss_cls: 0.2441 d4.loss_bbox: 0.1327 d4.loss_iou: 0.2093 enc_loss_cls: 0.2906 enc_loss_bbox: 0.1472 enc_loss_iou: 0.2341 dn_loss_cls: 0.0943 dn_loss_bbox: 0.1738 dn_loss_iou: 0.2010 d0.dn_loss_cls: 0.1755 d0.dn_loss_bbox: 0.3303 d0.dn_loss_iou: 0.3392 d1.dn_loss_cls: 0.1248 d1.dn_loss_bbox: 0.2113 d1.dn_loss_iou: 0.2336 d2.dn_loss_cls: 0.1032 d2.dn_loss_bbox: 0.1856 d2.dn_loss_iou: 0.2109 d3.dn_loss_cls: 0.0965 d3.dn_loss_bbox: 0.1760 d3.dn_loss_iou: 0.2032 d4.dn_loss_cls: 0.0939 d4.dn_loss_bbox: 0.1738 d4.dn_loss_iou: 0.2010 d1.loss_lmm_region: 0.1150 loss_lmm_image: 0.7975 2024/11/12 23:09:10 - mmengine - INFO - Iter(train) [103800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:45:51 time: 1.9767 data_time: 0.0183 memory: 32967 grad_norm: 26.1548 loss: 9.9055 loss_cls: 0.3224 loss_bbox: 0.1499 loss_iou: 0.2717 d0.loss_cls: 0.3577 d0.loss_bbox: 0.1626 d0.loss_iou: 0.2853 d1.loss_cls: 0.3382 d1.loss_bbox: 0.1552 d1.loss_iou: 0.2808 d2.loss_cls: 0.3259 d2.loss_bbox: 0.1557 d2.loss_iou: 0.2765 d3.loss_cls: 0.3180 d3.loss_bbox: 0.1553 d3.loss_iou: 0.2737 d4.loss_cls: 0.3173 d4.loss_bbox: 0.1524 d4.loss_iou: 0.2725 enc_loss_cls: 0.3603 enc_loss_bbox: 0.1746 enc_loss_iou: 0.3078 dn_loss_cls: 0.1009 dn_loss_bbox: 0.1675 dn_loss_iou: 0.2270 d0.dn_loss_cls: 0.1869 d0.dn_loss_bbox: 0.2951 d0.dn_loss_iou: 0.3698 d1.dn_loss_cls: 0.1415 d1.dn_loss_bbox: 0.1906 d1.dn_loss_iou: 0.2567 d2.dn_loss_cls: 0.1175 d2.dn_loss_bbox: 0.1766 d2.dn_loss_iou: 0.2372 d3.dn_loss_cls: 0.1074 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2295 d4.dn_loss_cls: 0.1022 d4.dn_loss_bbox: 0.1675 d4.dn_loss_iou: 0.2270 d1.loss_lmm_region: 0.1379 loss_lmm_image: 0.8844 2024/11/12 23:12:28 - mmengine - INFO - Iter(train) [103900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:42:29 time: 1.9724 data_time: 0.0183 memory: 32813 grad_norm: 25.1615 loss: 9.6599 loss_cls: 0.2954 loss_bbox: 0.1550 loss_iou: 0.2545 d0.loss_cls: 0.3315 d0.loss_bbox: 0.1714 d0.loss_iou: 0.2697 d1.loss_cls: 0.3225 d1.loss_bbox: 0.1518 d1.loss_iou: 0.2571 d2.loss_cls: 0.3132 d2.loss_bbox: 0.1473 d2.loss_iou: 0.2477 d3.loss_cls: 0.3013 d3.loss_bbox: 0.1518 d3.loss_iou: 0.2521 d4.loss_cls: 0.2986 d4.loss_bbox: 0.1534 d4.loss_iou: 0.2530 enc_loss_cls: 0.3397 enc_loss_bbox: 0.1846 enc_loss_iou: 0.2919 dn_loss_cls: 0.1115 dn_loss_bbox: 0.1863 dn_loss_iou: 0.2225 d0.dn_loss_cls: 0.1914 d0.dn_loss_bbox: 0.3365 d0.dn_loss_iou: 0.3629 d1.dn_loss_cls: 0.1416 d1.dn_loss_bbox: 0.2235 d1.dn_loss_iou: 0.2521 d2.dn_loss_cls: 0.1240 d2.dn_loss_bbox: 0.1959 d2.dn_loss_iou: 0.2305 d3.dn_loss_cls: 0.1179 d3.dn_loss_bbox: 0.1871 d3.dn_loss_iou: 0.2245 d4.dn_loss_cls: 0.1119 d4.dn_loss_bbox: 0.1863 d4.dn_loss_iou: 0.2224 d1.loss_lmm_region: 0.1096 loss_lmm_image: 0.7780 2024/11/12 23:15:46 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 23:15:46 - mmengine - INFO - Iter(train) [104000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:39:07 time: 1.9796 data_time: 0.0183 memory: 35979 grad_norm: 26.9844 loss: 9.5845 loss_cls: 0.2794 loss_bbox: 0.1553 loss_iou: 0.2400 d0.loss_cls: 0.3203 d0.loss_bbox: 0.1597 d0.loss_iou: 0.2499 d1.loss_cls: 0.2967 d1.loss_bbox: 0.1559 d1.loss_iou: 0.2425 d2.loss_cls: 0.2878 d2.loss_bbox: 
0.1545 d2.loss_iou: 0.2385 d3.loss_cls: 0.2842 d3.loss_bbox: 0.1516 d3.loss_iou: 0.2369 d4.loss_cls: 0.2812 d4.loss_bbox: 0.1531 d4.loss_iou: 0.2376 enc_loss_cls: 0.3262 enc_loss_bbox: 0.1771 enc_loss_iou: 0.2772 dn_loss_cls: 0.1250 dn_loss_bbox: 0.1807 dn_loss_iou: 0.2236 d0.dn_loss_cls: 0.1998 d0.dn_loss_bbox: 0.3538 d0.dn_loss_iou: 0.3795 d1.dn_loss_cls: 0.1547 d1.dn_loss_bbox: 0.2170 d1.dn_loss_iou: 0.2550 d2.dn_loss_cls: 0.1365 d2.dn_loss_bbox: 0.1920 d2.dn_loss_iou: 0.2333 d3.dn_loss_cls: 0.1304 d3.dn_loss_bbox: 0.1838 d3.dn_loss_iou: 0.2262 d4.dn_loss_cls: 0.1255 d4.dn_loss_bbox: 0.1807 d4.dn_loss_iou: 0.2236 d1.loss_lmm_region: 0.1436 loss_lmm_image: 0.8142 2024/11/12 23:19:04 - mmengine - INFO - Iter(train) [104100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:35:45 time: 1.9655 data_time: 0.0184 memory: 35463 grad_norm: 32.1336 loss: 9.5969 loss_cls: 0.2839 loss_bbox: 0.1495 loss_iou: 0.2901 d0.loss_cls: 0.3258 d0.loss_bbox: 0.1626 d0.loss_iou: 0.3053 d1.loss_cls: 0.3067 d1.loss_bbox: 0.1552 d1.loss_iou: 0.2953 d2.loss_cls: 0.2949 d2.loss_bbox: 0.1513 d2.loss_iou: 0.2949 d3.loss_cls: 0.2895 d3.loss_bbox: 0.1505 d3.loss_iou: 0.2894 d4.loss_cls: 0.2866 d4.loss_bbox: 0.1486 d4.loss_iou: 0.2891 enc_loss_cls: 0.3292 enc_loss_bbox: 0.1722 enc_loss_iou: 0.3251 dn_loss_cls: 0.0884 dn_loss_bbox: 0.1697 dn_loss_iou: 0.2254 d0.dn_loss_cls: 0.1626 d0.dn_loss_bbox: 0.3004 d0.dn_loss_iou: 0.3621 d1.dn_loss_cls: 0.1138 d1.dn_loss_bbox: 0.1997 d1.dn_loss_iou: 0.2559 d2.dn_loss_cls: 0.0983 d2.dn_loss_bbox: 0.1780 d2.dn_loss_iou: 0.2352 d3.dn_loss_cls: 0.0907 d3.dn_loss_bbox: 0.1711 d3.dn_loss_iou: 0.2274 d4.dn_loss_cls: 0.0877 d4.dn_loss_bbox: 0.1696 d4.dn_loss_iou: 0.2254 d1.loss_lmm_region: 0.1176 loss_lmm_image: 0.8220 2024/11/12 23:22:22 - mmengine - INFO - Iter(train) [104200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:32:23 time: 1.9805 data_time: 0.0184 memory: 33844 grad_norm: 34.8489 loss: 9.2459 loss_cls: 0.3117 loss_bbox: 0.1295 loss_iou: 0.2428 d0.loss_cls: 0.3490 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2645 d1.loss_cls: 0.3267 d1.loss_bbox: 0.1341 d1.loss_iou: 0.2518 d2.loss_cls: 0.3230 d2.loss_bbox: 0.1337 d2.loss_iou: 0.2486 d3.loss_cls: 0.3117 d3.loss_bbox: 0.1326 d3.loss_iou: 0.2472 d4.loss_cls: 0.3096 d4.loss_bbox: 0.1309 d4.loss_iou: 0.2451 enc_loss_cls: 0.3590 enc_loss_bbox: 0.1566 enc_loss_iou: 0.2829 dn_loss_cls: 0.1013 dn_loss_bbox: 0.1556 dn_loss_iou: 0.2016 d0.dn_loss_cls: 0.1931 d0.dn_loss_bbox: 0.2904 d0.dn_loss_iou: 0.3431 d1.dn_loss_cls: 0.1379 d1.dn_loss_bbox: 0.1843 d1.dn_loss_iou: 0.2329 d2.dn_loss_cls: 0.1166 d2.dn_loss_bbox: 0.1625 d2.dn_loss_iou: 0.2118 d3.dn_loss_cls: 0.1055 d3.dn_loss_bbox: 0.1569 d3.dn_loss_iou: 0.2045 d4.dn_loss_cls: 0.1012 d4.dn_loss_bbox: 0.1555 d4.dn_loss_iou: 0.2015 d1.loss_lmm_region: 0.1383 loss_lmm_image: 0.8147 2024/11/12 23:25:42 - mmengine - INFO - Iter(train) [104300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:29:02 time: 1.9798 data_time: 0.0184 memory: 33748 grad_norm: 26.3968 loss: 10.5388 loss_cls: 0.4052 loss_bbox: 0.1575 loss_iou: 0.2414 d0.loss_cls: 0.4368 d0.loss_bbox: 0.1704 d0.loss_iou: 0.2604 d1.loss_cls: 0.4289 d1.loss_bbox: 0.1599 d1.loss_iou: 0.2473 d2.loss_cls: 0.4067 d2.loss_bbox: 0.1575 d2.loss_iou: 0.2456 d3.loss_cls: 0.4082 d3.loss_bbox: 0.1584 d3.loss_iou: 0.2435 d4.loss_cls: 0.4017 d4.loss_bbox: 0.1562 d4.loss_iou: 0.2426 enc_loss_cls: 0.4521 enc_loss_bbox: 0.1845 enc_loss_iou: 0.2799 dn_loss_cls: 0.1629 dn_loss_bbox: 0.1797 dn_loss_iou: 0.1980 d0.dn_loss_cls: 0.2272 
d0.dn_loss_bbox: 0.3320 d0.dn_loss_iou: 0.3421 d1.dn_loss_cls: 0.1905 d1.dn_loss_bbox: 0.2163 d1.dn_loss_iou: 0.2308 d2.dn_loss_cls: 0.1745 d2.dn_loss_bbox: 0.1891 d2.dn_loss_iou: 0.2075 d3.dn_loss_cls: 0.1665 d3.dn_loss_bbox: 0.1816 d3.dn_loss_iou: 0.2007 d4.dn_loss_cls: 0.1601 d4.dn_loss_bbox: 0.1796 d4.dn_loss_iou: 0.1979 d1.loss_lmm_region: 0.1297 loss_lmm_image: 0.8273 2024/11/12 23:29:00 - mmengine - INFO - Iter(train) [104400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:25:40 time: 1.9915 data_time: 0.0183 memory: 33365 grad_norm: 28.1091 loss: 10.1013 loss_cls: 0.3001 loss_bbox: 0.1527 loss_iou: 0.2629 d0.loss_cls: 0.3360 d0.loss_bbox: 0.1696 d0.loss_iou: 0.2821 d1.loss_cls: 0.3217 d1.loss_bbox: 0.1601 d1.loss_iou: 0.2718 d2.loss_cls: 0.3124 d2.loss_bbox: 0.1595 d2.loss_iou: 0.2682 d3.loss_cls: 0.3053 d3.loss_bbox: 0.1550 d3.loss_iou: 0.2655 d4.loss_cls: 0.2991 d4.loss_bbox: 0.1541 d4.loss_iou: 0.2639 enc_loss_cls: 0.3465 enc_loss_bbox: 0.1821 enc_loss_iou: 0.2996 dn_loss_cls: 0.1461 dn_loss_bbox: 0.1804 dn_loss_iou: 0.2234 d0.dn_loss_cls: 0.2335 d0.dn_loss_bbox: 0.3288 d0.dn_loss_iou: 0.3630 d1.dn_loss_cls: 0.1856 d1.dn_loss_bbox: 0.2144 d1.dn_loss_iou: 0.2571 d2.dn_loss_cls: 0.1613 d2.dn_loss_bbox: 0.1898 d2.dn_loss_iou: 0.2338 d3.dn_loss_cls: 0.1515 d3.dn_loss_bbox: 0.1830 d3.dn_loss_iou: 0.2263 d4.dn_loss_cls: 0.1499 d4.dn_loss_bbox: 0.1804 d4.dn_loss_iou: 0.2234 d1.loss_lmm_region: 0.1408 loss_lmm_image: 0.8608 2024/11/12 23:32:19 - mmengine - INFO - Iter(train) [104500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:22:18 time: 1.9903 data_time: 0.0182 memory: 34096 grad_norm: 33.2807 loss: 8.5076 loss_cls: 0.2304 loss_bbox: 0.1181 loss_iou: 0.2032 d0.loss_cls: 0.2702 d0.loss_bbox: 0.1271 d0.loss_iou: 0.2130 d1.loss_cls: 0.2511 d1.loss_bbox: 0.1198 d1.loss_iou: 0.2070 d2.loss_cls: 0.2441 d2.loss_bbox: 0.1158 d2.loss_iou: 0.2013 d3.loss_cls: 0.2345 d3.loss_bbox: 0.1169 d3.loss_iou: 0.2026 d4.loss_cls: 0.2318 d4.loss_bbox: 0.1179 d4.loss_iou: 0.2033 enc_loss_cls: 0.2801 enc_loss_bbox: 0.1375 enc_loss_iou: 0.2291 dn_loss_cls: 0.0970 dn_loss_bbox: 0.1887 dn_loss_iou: 0.2096 d0.dn_loss_cls: 0.1756 d0.dn_loss_bbox: 0.3541 d0.dn_loss_iou: 0.3581 d1.dn_loss_cls: 0.1263 d1.dn_loss_bbox: 0.2257 d1.dn_loss_iou: 0.2408 d2.dn_loss_cls: 0.1084 d2.dn_loss_bbox: 0.2000 d2.dn_loss_iou: 0.2194 d3.dn_loss_cls: 0.1012 d3.dn_loss_bbox: 0.1906 d3.dn_loss_iou: 0.2117 d4.dn_loss_cls: 0.0976 d4.dn_loss_bbox: 0.1887 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.1410 loss_lmm_image: 0.8086 2024/11/12 23:35:40 - mmengine - INFO - Iter(train) [104600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:18:57 time: 1.9839 data_time: 0.0183 memory: 34161 grad_norm: 31.5528 loss: 9.4256 loss_cls: 0.2919 loss_bbox: 0.1408 loss_iou: 0.2203 d0.loss_cls: 0.3362 d0.loss_bbox: 0.1554 d0.loss_iou: 0.2392 d1.loss_cls: 0.3137 d1.loss_bbox: 0.1425 d1.loss_iou: 0.2256 d2.loss_cls: 0.2999 d2.loss_bbox: 0.1427 d2.loss_iou: 0.2228 d3.loss_cls: 0.2952 d3.loss_bbox: 0.1408 d3.loss_iou: 0.2210 d4.loss_cls: 0.2903 d4.loss_bbox: 0.1423 d4.loss_iou: 0.2207 enc_loss_cls: 0.3476 enc_loss_bbox: 0.1587 enc_loss_iou: 0.2551 dn_loss_cls: 0.0976 dn_loss_bbox: 0.1956 dn_loss_iou: 0.2328 d0.dn_loss_cls: 0.1840 d0.dn_loss_bbox: 0.3584 d0.dn_loss_iou: 0.3836 d1.dn_loss_cls: 0.1307 d1.dn_loss_bbox: 0.2287 d1.dn_loss_iou: 0.2649 d2.dn_loss_cls: 0.1136 d2.dn_loss_bbox: 0.2043 d2.dn_loss_iou: 0.2417 d3.dn_loss_cls: 0.1023 d3.dn_loss_bbox: 0.1977 d3.dn_loss_iou: 0.2356 d4.dn_loss_cls: 0.0985 d4.dn_loss_bbox: 0.1956 
d4.dn_loss_iou: 0.2328 d1.loss_lmm_region: 0.1428 loss_lmm_image: 0.7817 2024/11/12 23:38:58 - mmengine - INFO - Iter(train) [104700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:15:36 time: 1.9817 data_time: 0.0183 memory: 34007 grad_norm: 31.4428 loss: 10.0547 loss_cls: 0.3179 loss_bbox: 0.1633 loss_iou: 0.3024 d0.loss_cls: 0.3533 d0.loss_bbox: 0.1775 d0.loss_iou: 0.3188 d1.loss_cls: 0.3308 d1.loss_bbox: 0.1686 d1.loss_iou: 0.3084 d2.loss_cls: 0.3227 d2.loss_bbox: 0.1663 d2.loss_iou: 0.3038 d3.loss_cls: 0.3233 d3.loss_bbox: 0.1613 d3.loss_iou: 0.2997 d4.loss_cls: 0.3211 d4.loss_bbox: 0.1624 d4.loss_iou: 0.3017 enc_loss_cls: 0.3629 enc_loss_bbox: 0.1911 enc_loss_iou: 0.3381 dn_loss_cls: 0.0975 dn_loss_bbox: 0.1629 dn_loss_iou: 0.2315 d0.dn_loss_cls: 0.1747 d0.dn_loss_bbox: 0.2958 d0.dn_loss_iou: 0.3726 d1.dn_loss_cls: 0.1261 d1.dn_loss_bbox: 0.1862 d1.dn_loss_iou: 0.2592 d2.dn_loss_cls: 0.1105 d2.dn_loss_bbox: 0.1694 d2.dn_loss_iou: 0.2405 d3.dn_loss_cls: 0.1018 d3.dn_loss_bbox: 0.1648 d3.dn_loss_iou: 0.2338 d4.dn_loss_cls: 0.0981 d4.dn_loss_bbox: 0.1629 d4.dn_loss_iou: 0.2316 d1.loss_lmm_region: 0.1393 loss_lmm_image: 0.7999 2024/11/12 23:42:18 - mmengine - INFO - Iter(train) [104800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:12:15 time: 2.0087 data_time: 0.0183 memory: 32783 grad_norm: 38.8600 loss: 8.5584 loss_cls: 0.2588 loss_bbox: 0.1188 loss_iou: 0.1929 d0.loss_cls: 0.2952 d0.loss_bbox: 0.1312 d0.loss_iou: 0.2076 d1.loss_cls: 0.2744 d1.loss_bbox: 0.1249 d1.loss_iou: 0.1982 d2.loss_cls: 0.2721 d2.loss_bbox: 0.1178 d2.loss_iou: 0.1939 d3.loss_cls: 0.2639 d3.loss_bbox: 0.1199 d3.loss_iou: 0.1934 d4.loss_cls: 0.2580 d4.loss_bbox: 0.1204 d4.loss_iou: 0.1929 enc_loss_cls: 0.2950 enc_loss_bbox: 0.1465 enc_loss_iou: 0.2317 dn_loss_cls: 0.1087 dn_loss_bbox: 0.1692 dn_loss_iou: 0.1967 d0.dn_loss_cls: 0.1839 d0.dn_loss_bbox: 0.3140 d0.dn_loss_iou: 0.3423 d1.dn_loss_cls: 0.1412 d1.dn_loss_bbox: 0.1984 d1.dn_loss_iou: 0.2295 d2.dn_loss_cls: 0.1238 d2.dn_loss_bbox: 0.1787 d2.dn_loss_iou: 0.2070 d3.dn_loss_cls: 0.1171 d3.dn_loss_bbox: 0.1709 d3.dn_loss_iou: 0.1994 d4.dn_loss_cls: 0.1109 d4.dn_loss_bbox: 0.1693 d4.dn_loss_iou: 0.1967 d1.loss_lmm_region: 0.1256 loss_lmm_image: 0.8672 2024/11/12 23:45:36 - mmengine - INFO - Iter(train) [104900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:08:52 time: 1.9584 data_time: 0.0183 memory: 34827 grad_norm: nan loss: 8.4444 loss_cls: 0.2392 loss_bbox: 0.1200 loss_iou: 0.2284 d0.loss_cls: 0.2785 d0.loss_bbox: 0.1313 d0.loss_iou: 0.2480 d1.loss_cls: 0.2633 d1.loss_bbox: 0.1206 d1.loss_iou: 0.2352 d2.loss_cls: 0.2519 d2.loss_bbox: 0.1190 d2.loss_iou: 0.2314 d3.loss_cls: 0.2487 d3.loss_bbox: 0.1143 d3.loss_iou: 0.2268 d4.loss_cls: 0.2438 d4.loss_bbox: 0.1154 d4.loss_iou: 0.2279 enc_loss_cls: 0.2870 enc_loss_bbox: 0.1437 enc_loss_iou: 0.2697 dn_loss_cls: 0.1323 dn_loss_bbox: 0.1297 dn_loss_iou: 0.1868 d0.dn_loss_cls: 0.2019 d0.dn_loss_bbox: 0.2836 d0.dn_loss_iou: 0.3299 d1.dn_loss_cls: 0.1501 d1.dn_loss_bbox: 0.1634 d1.dn_loss_iou: 0.2183 d2.dn_loss_cls: 0.1388 d2.dn_loss_bbox: 0.1405 d2.dn_loss_iou: 0.1969 d3.dn_loss_cls: 0.1351 d3.dn_loss_bbox: 0.1328 d3.dn_loss_iou: 0.1895 d4.dn_loss_cls: 0.1338 d4.dn_loss_bbox: 0.1299 d4.dn_loss_iou: 0.1867 d1.loss_lmm_region: 0.1220 loss_lmm_image: 0.7981 2024/11/12 23:48:54 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/12 23:48:54 - mmengine - INFO - Iter(train) [105000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:05:31 time: 2.0036 data_time: 
0.0186 memory: 34992 grad_norm: 35.9283 loss: 7.9685 loss_cls: 0.2430 loss_bbox: 0.1058 loss_iou: 0.1988 d0.loss_cls: 0.2829 d0.loss_bbox: 0.1101 d0.loss_iou: 0.2081 d1.loss_cls: 0.2594 d1.loss_bbox: 0.1068 d1.loss_iou: 0.2000 d2.loss_cls: 0.2471 d2.loss_bbox: 0.1083 d2.loss_iou: 0.2037 d3.loss_cls: 0.2458 d3.loss_bbox: 0.1070 d3.loss_iou: 0.1991 d4.loss_cls: 0.2388 d4.loss_bbox: 0.1073 d4.loss_iou: 0.2006 enc_loss_cls: 0.2766 enc_loss_bbox: 0.1291 enc_loss_iou: 0.2364 dn_loss_cls: 0.0780 dn_loss_bbox: 0.1494 dn_loss_iou: 0.1831 d0.dn_loss_cls: 0.1573 d0.dn_loss_bbox: 0.2885 d0.dn_loss_iou: 0.3196 d1.dn_loss_cls: 0.1076 d1.dn_loss_bbox: 0.1818 d1.dn_loss_iou: 0.2133 d2.dn_loss_cls: 0.0886 d2.dn_loss_bbox: 0.1593 d2.dn_loss_iou: 0.1930 d3.dn_loss_cls: 0.0833 d3.dn_loss_bbox: 0.1526 d3.dn_loss_iou: 0.1857 d4.dn_loss_cls: 0.0797 d4.dn_loss_bbox: 0.1494 d4.dn_loss_iou: 0.1830 d1.loss_lmm_region: 0.1452 loss_lmm_image: 0.8556 2024/11/12 23:52:14 - mmengine - INFO - Iter(train) [105100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 1:02:10 time: 1.9952 data_time: 0.0184 memory: 35380 grad_norm: 31.2169 loss: 8.0380 loss_cls: 0.2077 loss_bbox: 0.1249 loss_iou: 0.2030 d0.loss_cls: 0.2446 d0.loss_bbox: 0.1373 d0.loss_iou: 0.2141 d1.loss_cls: 0.2259 d1.loss_bbox: 0.1317 d1.loss_iou: 0.2092 d2.loss_cls: 0.2140 d2.loss_bbox: 0.1303 d2.loss_iou: 0.2052 d3.loss_cls: 0.2096 d3.loss_bbox: 0.1252 d3.loss_iou: 0.2029 d4.loss_cls: 0.2084 d4.loss_bbox: 0.1239 d4.loss_iou: 0.2031 enc_loss_cls: 0.2492 enc_loss_bbox: 0.1584 enc_loss_iou: 0.2377 dn_loss_cls: 0.0700 dn_loss_bbox: 0.1558 dn_loss_iou: 0.2020 d0.dn_loss_cls: 0.1570 d0.dn_loss_bbox: 0.3152 d0.dn_loss_iou: 0.3520 d1.dn_loss_cls: 0.1077 d1.dn_loss_bbox: 0.1924 d1.dn_loss_iou: 0.2355 d2.dn_loss_cls: 0.0846 d2.dn_loss_bbox: 0.1668 d2.dn_loss_iou: 0.2124 d3.dn_loss_cls: 0.0753 d3.dn_loss_bbox: 0.1577 d3.dn_loss_iou: 0.2041 d4.dn_loss_cls: 0.0713 d4.dn_loss_bbox: 0.1558 d4.dn_loss_iou: 0.2019 d1.loss_lmm_region: 0.1356 loss_lmm_image: 0.8185 2024/11/12 23:55:32 - mmengine - INFO - Iter(train) [105200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:58:48 time: 1.9879 data_time: 0.0183 memory: 33655 grad_norm: 31.4485 loss: 10.6204 loss_cls: 0.3318 loss_bbox: 0.1789 loss_iou: 0.2969 d0.loss_cls: 0.3787 d0.loss_bbox: 0.1903 d0.loss_iou: 0.3171 d1.loss_cls: 0.3640 d1.loss_bbox: 0.1768 d1.loss_iou: 0.2998 d2.loss_cls: 0.3453 d2.loss_bbox: 0.1782 d2.loss_iou: 0.2984 d3.loss_cls: 0.3388 d3.loss_bbox: 0.1796 d3.loss_iou: 0.2964 d4.loss_cls: 0.3309 d4.loss_bbox: 0.1813 d4.loss_iou: 0.2977 enc_loss_cls: 0.3930 enc_loss_bbox: 0.1973 enc_loss_iou: 0.3279 dn_loss_cls: 0.1106 dn_loss_bbox: 0.1860 dn_loss_iou: 0.2384 d0.dn_loss_cls: 0.2016 d0.dn_loss_bbox: 0.3245 d0.dn_loss_iou: 0.3754 d1.dn_loss_cls: 0.1464 d1.dn_loss_bbox: 0.2176 d1.dn_loss_iou: 0.2685 d2.dn_loss_cls: 0.1261 d2.dn_loss_bbox: 0.1959 d2.dn_loss_iou: 0.2476 d3.dn_loss_cls: 0.1164 d3.dn_loss_bbox: 0.1882 d3.dn_loss_iou: 0.2411 d4.dn_loss_cls: 0.1110 d4.dn_loss_bbox: 0.1859 d4.dn_loss_iou: 0.2383 d1.loss_lmm_region: 0.1669 loss_lmm_image: 0.8343 2024/11/12 23:58:54 - mmengine - INFO - Iter(train) [105300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:55:27 time: 2.0527 data_time: 0.0184 memory: 34812 grad_norm: 26.3780 loss: 9.3184 loss_cls: 0.2747 loss_bbox: 0.1498 loss_iou: 0.2408 d0.loss_cls: 0.3207 d0.loss_bbox: 0.1583 d0.loss_iou: 0.2589 d1.loss_cls: 0.3003 d1.loss_bbox: 0.1515 d1.loss_iou: 0.2433 d2.loss_cls: 0.2930 d2.loss_bbox: 0.1459 d2.loss_iou: 0.2419 d3.loss_cls: 0.2843 
d3.loss_bbox: 0.1458 d3.loss_iou: 0.2384 d4.loss_cls: 0.2795 d4.loss_bbox: 0.1467 d4.loss_iou: 0.2381 enc_loss_cls: 0.3243 enc_loss_bbox: 0.1680 enc_loss_iou: 0.2771 dn_loss_cls: 0.0886 dn_loss_bbox: 0.1781 dn_loss_iou: 0.2182 d0.dn_loss_cls: 0.1875 d0.dn_loss_bbox: 0.3181 d0.dn_loss_iou: 0.3595 d1.dn_loss_cls: 0.1238 d1.dn_loss_bbox: 0.2107 d1.dn_loss_iou: 0.2494 d2.dn_loss_cls: 0.1027 d2.dn_loss_bbox: 0.1857 d2.dn_loss_iou: 0.2275 d3.dn_loss_cls: 0.0928 d3.dn_loss_bbox: 0.1806 d3.dn_loss_iou: 0.2209 d4.dn_loss_cls: 0.0886 d4.dn_loss_bbox: 0.1780 d4.dn_loss_iou: 0.2181 d1.loss_lmm_region: 0.1374 loss_lmm_image: 0.8709 2024/11/13 00:02:13 - mmengine - INFO - Iter(train) [105400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:52:06 time: 1.9781 data_time: 0.0184 memory: 31199 grad_norm: 31.2775 loss: 8.5174 loss_cls: 0.2399 loss_bbox: 0.1265 loss_iou: 0.2317 d0.loss_cls: 0.2730 d0.loss_bbox: 0.1458 d0.loss_iou: 0.2495 d1.loss_cls: 0.2551 d1.loss_bbox: 0.1333 d1.loss_iou: 0.2392 d2.loss_cls: 0.2493 d2.loss_bbox: 0.1278 d2.loss_iou: 0.2363 d3.loss_cls: 0.2403 d3.loss_bbox: 0.1268 d3.loss_iou: 0.2333 d4.loss_cls: 0.2391 d4.loss_bbox: 0.1278 d4.loss_iou: 0.2323 enc_loss_cls: 0.2904 enc_loss_bbox: 0.1548 enc_loss_iou: 0.2711 dn_loss_cls: 0.0607 dn_loss_bbox: 0.1596 dn_loss_iou: 0.2212 d0.dn_loss_cls: 0.1387 d0.dn_loss_bbox: 0.3041 d0.dn_loss_iou: 0.3611 d1.dn_loss_cls: 0.0856 d1.dn_loss_bbox: 0.1910 d1.dn_loss_iou: 0.2496 d2.dn_loss_cls: 0.0683 d2.dn_loss_bbox: 0.1678 d2.dn_loss_iou: 0.2289 d3.dn_loss_cls: 0.0627 d3.dn_loss_bbox: 0.1612 d3.dn_loss_iou: 0.2230 d4.dn_loss_cls: 0.0606 d4.dn_loss_bbox: 0.1597 d4.dn_loss_iou: 0.2211 d1.loss_lmm_region: 0.1117 loss_lmm_image: 0.8571 2024/11/13 00:05:29 - mmengine - INFO - Iter(train) [105500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:48:43 time: 1.9790 data_time: 0.0185 memory: 33882 grad_norm: 27.2451 loss: 8.9306 loss_cls: 0.3003 loss_bbox: 0.1318 loss_iou: 0.2128 d0.loss_cls: 0.3416 d0.loss_bbox: 0.1418 d0.loss_iou: 0.2277 d1.loss_cls: 0.3156 d1.loss_bbox: 0.1297 d1.loss_iou: 0.2150 d2.loss_cls: 0.3087 d2.loss_bbox: 0.1280 d2.loss_iou: 0.2090 d3.loss_cls: 0.3055 d3.loss_bbox: 0.1315 d3.loss_iou: 0.2125 d4.loss_cls: 0.3013 d4.loss_bbox: 0.1293 d4.loss_iou: 0.2125 enc_loss_cls: 0.3465 enc_loss_bbox: 0.1525 enc_loss_iou: 0.2460 dn_loss_cls: 0.1071 dn_loss_bbox: 0.1559 dn_loss_iou: 0.1987 d0.dn_loss_cls: 0.1877 d0.dn_loss_bbox: 0.2991 d0.dn_loss_iou: 0.3453 d1.dn_loss_cls: 0.1396 d1.dn_loss_bbox: 0.1848 d1.dn_loss_iou: 0.2294 d2.dn_loss_cls: 0.1206 d2.dn_loss_bbox: 0.1644 d2.dn_loss_iou: 0.2091 d3.dn_loss_cls: 0.1149 d3.dn_loss_bbox: 0.1567 d3.dn_loss_iou: 0.2011 d4.dn_loss_cls: 0.1065 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.1988 d1.loss_lmm_region: 0.1185 loss_lmm_image: 0.8368 2024/11/13 00:08:50 - mmengine - INFO - Iter(train) [105600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:45:23 time: 1.9904 data_time: 0.0183 memory: 33850 grad_norm: 34.7502 loss: 8.7500 loss_cls: 0.2700 loss_bbox: 0.1223 loss_iou: 0.2309 d0.loss_cls: 0.3024 d0.loss_bbox: 0.1363 d0.loss_iou: 0.2506 d1.loss_cls: 0.2892 d1.loss_bbox: 0.1239 d1.loss_iou: 0.2344 d2.loss_cls: 0.2809 d2.loss_bbox: 0.1217 d2.loss_iou: 0.2322 d3.loss_cls: 0.2735 d3.loss_bbox: 0.1237 d3.loss_iou: 0.2337 d4.loss_cls: 0.2706 d4.loss_bbox: 0.1215 d4.loss_iou: 0.2314 enc_loss_cls: 0.3039 enc_loss_bbox: 0.1490 enc_loss_iou: 0.2722 dn_loss_cls: 0.0961 dn_loss_bbox: 0.1514 dn_loss_iou: 0.2006 d0.dn_loss_cls: 0.1733 d0.dn_loss_bbox: 0.2993 d0.dn_loss_iou: 0.3441 
d1.dn_loss_cls: 0.1232 d1.dn_loss_bbox: 0.1842 d1.dn_loss_iou: 0.2325 d2.dn_loss_cls: 0.1071 d2.dn_loss_bbox: 0.1604 d2.dn_loss_iou: 0.2112 d3.dn_loss_cls: 0.1008 d3.dn_loss_bbox: 0.1537 d3.dn_loss_iou: 0.2034 d4.dn_loss_cls: 0.0978 d4.dn_loss_bbox: 0.1515 d4.dn_loss_iou: 0.2006 d1.loss_lmm_region: 0.1354 loss_lmm_image: 0.8492 2024/11/13 00:12:08 - mmengine - INFO - Iter(train) [105700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:42:01 time: 1.9941 data_time: 0.0183 memory: 34291 grad_norm: 33.0544 loss: 9.2039 loss_cls: 0.2693 loss_bbox: 0.1336 loss_iou: 0.2468 d0.loss_cls: 0.3045 d0.loss_bbox: 0.1423 d0.loss_iou: 0.2593 d1.loss_cls: 0.2904 d1.loss_bbox: 0.1399 d1.loss_iou: 0.2529 d2.loss_cls: 0.2786 d2.loss_bbox: 0.1356 d2.loss_iou: 0.2509 d3.loss_cls: 0.2720 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2478 d4.loss_cls: 0.2677 d4.loss_bbox: 0.1342 d4.loss_iou: 0.2469 enc_loss_cls: 0.3107 enc_loss_bbox: 0.1537 enc_loss_iou: 0.2806 dn_loss_cls: 0.1152 dn_loss_bbox: 0.1609 dn_loss_iou: 0.2292 d0.dn_loss_cls: 0.1935 d0.dn_loss_bbox: 0.2847 d0.dn_loss_iou: 0.3588 d1.dn_loss_cls: 0.1530 d1.dn_loss_bbox: 0.1831 d1.dn_loss_iou: 0.2556 d2.dn_loss_cls: 0.1304 d2.dn_loss_bbox: 0.1675 d2.dn_loss_iou: 0.2378 d3.dn_loss_cls: 0.1180 d3.dn_loss_bbox: 0.1635 d3.dn_loss_iou: 0.2323 d4.dn_loss_cls: 0.1152 d4.dn_loss_bbox: 0.1608 d4.dn_loss_iou: 0.2292 d1.loss_lmm_region: 0.1285 loss_lmm_image: 0.8332 2024/11/13 00:15:28 - mmengine - INFO - Iter(train) [105800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:38:40 time: 2.0140 data_time: 0.0183 memory: 34757 grad_norm: 36.4261 loss: 10.2823 loss_cls: 0.2780 loss_bbox: 0.1579 loss_iou: 0.2678 d0.loss_cls: 0.3191 d0.loss_bbox: 0.1693 d0.loss_iou: 0.2816 d1.loss_cls: 0.2983 d1.loss_bbox: 0.1638 d1.loss_iou: 0.2771 d2.loss_cls: 0.2881 d2.loss_bbox: 0.1591 d2.loss_iou: 0.2727 d3.loss_cls: 0.2808 d3.loss_bbox: 0.1614 d3.loss_iou: 0.2716 d4.loss_cls: 0.2761 d4.loss_bbox: 0.1583 d4.loss_iou: 0.2683 enc_loss_cls: 0.3228 enc_loss_bbox: 0.1859 enc_loss_iou: 0.3042 dn_loss_cls: 0.1597 dn_loss_bbox: 0.2035 dn_loss_iou: 0.2417 d0.dn_loss_cls: 0.2371 d0.dn_loss_bbox: 0.3559 d0.dn_loss_iou: 0.3860 d1.dn_loss_cls: 0.1858 d1.dn_loss_bbox: 0.2374 d1.dn_loss_iou: 0.2753 d2.dn_loss_cls: 0.1737 d2.dn_loss_bbox: 0.2150 d2.dn_loss_iou: 0.2523 d3.dn_loss_cls: 0.1695 d3.dn_loss_bbox: 0.2060 d3.dn_loss_iou: 0.2451 d4.dn_loss_cls: 0.1596 d4.dn_loss_bbox: 0.2035 d4.dn_loss_iou: 0.2418 d1.loss_lmm_region: 0.1477 loss_lmm_image: 0.8238 2024/11/13 00:18:47 - mmengine - INFO - Iter(train) [105900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:35:18 time: 2.0135 data_time: 0.0184 memory: 35261 grad_norm: 28.8973 loss: 9.5440 loss_cls: 0.2811 loss_bbox: 0.1519 loss_iou: 0.2572 d0.loss_cls: 0.3333 d0.loss_bbox: 0.1643 d0.loss_iou: 0.2709 d1.loss_cls: 0.3029 d1.loss_bbox: 0.1585 d1.loss_iou: 0.2645 d2.loss_cls: 0.2945 d2.loss_bbox: 0.1518 d2.loss_iou: 0.2585 d3.loss_cls: 0.2926 d3.loss_bbox: 0.1490 d3.loss_iou: 0.2557 d4.loss_cls: 0.2811 d4.loss_bbox: 0.1532 d4.loss_iou: 0.2581 enc_loss_cls: 0.3384 enc_loss_bbox: 0.1743 enc_loss_iou: 0.2911 dn_loss_cls: 0.1264 dn_loss_bbox: 0.1667 dn_loss_iou: 0.2183 d0.dn_loss_cls: 0.1921 d0.dn_loss_bbox: 0.2978 d0.dn_loss_iou: 0.3556 d1.dn_loss_cls: 0.1498 d1.dn_loss_bbox: 0.2002 d1.dn_loss_iou: 0.2486 d2.dn_loss_cls: 0.1350 d2.dn_loss_bbox: 0.1765 d2.dn_loss_iou: 0.2274 d3.dn_loss_cls: 0.1272 d3.dn_loss_bbox: 0.1692 d3.dn_loss_iou: 0.2205 d4.dn_loss_cls: 0.1250 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2184 d1.loss_lmm_region: 0.1134 
loss_lmm_image: 0.8265 2024/11/13 00:22:05 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 00:22:05 - mmengine - INFO - Iter(train) [106000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:31:57 time: 1.9665 data_time: 0.0186 memory: 34710 grad_norm: 35.9307 loss: 9.0431 loss_cls: 0.2934 loss_bbox: 0.1393 loss_iou: 0.2498 d0.loss_cls: 0.3366 d0.loss_bbox: 0.1517 d0.loss_iou: 0.2713 d1.loss_cls: 0.3120 d1.loss_bbox: 0.1444 d1.loss_iou: 0.2603 d2.loss_cls: 0.3028 d2.loss_bbox: 0.1419 d2.loss_iou: 0.2555 d3.loss_cls: 0.3005 d3.loss_bbox: 0.1369 d3.loss_iou: 0.2495 d4.loss_cls: 0.2932 d4.loss_bbox: 0.1397 d4.loss_iou: 0.2494 enc_loss_cls: 0.3474 enc_loss_bbox: 0.1617 enc_loss_iou: 0.2924 dn_loss_cls: 0.0770 dn_loss_bbox: 0.1577 dn_loss_iou: 0.2027 d0.dn_loss_cls: 0.1544 d0.dn_loss_bbox: 0.2821 d0.dn_loss_iou: 0.3340 d1.dn_loss_cls: 0.1079 d1.dn_loss_bbox: 0.1872 d1.dn_loss_iou: 0.2296 d2.dn_loss_cls: 0.0897 d2.dn_loss_bbox: 0.1647 d2.dn_loss_iou: 0.2097 d3.dn_loss_cls: 0.0825 d3.dn_loss_bbox: 0.1600 d3.dn_loss_iou: 0.2049 d4.dn_loss_cls: 0.0774 d4.dn_loss_bbox: 0.1577 d4.dn_loss_iou: 0.2026 d1.loss_lmm_region: 0.1146 loss_lmm_image: 0.8168 2024/11/13 00:25:25 - mmengine - INFO - Iter(train) [106100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:28:35 time: 1.9864 data_time: 0.0183 memory: 33782 grad_norm: 31.6045 loss: 9.6849 loss_cls: 0.2789 loss_bbox: 0.1502 loss_iou: 0.2892 d0.loss_cls: 0.3262 d0.loss_bbox: 0.1600 d0.loss_iou: 0.3001 d1.loss_cls: 0.2985 d1.loss_bbox: 0.1550 d1.loss_iou: 0.2962 d2.loss_cls: 0.2925 d2.loss_bbox: 0.1503 d2.loss_iou: 0.2912 d3.loss_cls: 0.2847 d3.loss_bbox: 0.1501 d3.loss_iou: 0.2891 d4.loss_cls: 0.2816 d4.loss_bbox: 0.1506 d4.loss_iou: 0.2876 enc_loss_cls: 0.3345 enc_loss_bbox: 0.1713 enc_loss_iou: 0.3189 dn_loss_cls: 0.0854 dn_loss_bbox: 0.1710 dn_loss_iou: 0.2420 d0.dn_loss_cls: 0.1693 d0.dn_loss_bbox: 0.3182 d0.dn_loss_iou: 0.3884 d1.dn_loss_cls: 0.1156 d1.dn_loss_bbox: 0.2015 d1.dn_loss_iou: 0.2717 d2.dn_loss_cls: 0.0927 d2.dn_loss_bbox: 0.1804 d2.dn_loss_iou: 0.2508 d3.dn_loss_cls: 0.0859 d3.dn_loss_bbox: 0.1738 d3.dn_loss_iou: 0.2446 d4.dn_loss_cls: 0.0852 d4.dn_loss_bbox: 0.1711 d4.dn_loss_iou: 0.2421 d1.loss_lmm_region: 0.1149 loss_lmm_image: 0.8236 2024/11/13 00:28:43 - mmengine - INFO - Iter(train) [106200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:25:13 time: 1.9729 data_time: 0.0183 memory: 34086 grad_norm: 29.4376 loss: 9.3353 loss_cls: 0.2872 loss_bbox: 0.1505 loss_iou: 0.2501 d0.loss_cls: 0.3354 d0.loss_bbox: 0.1575 d0.loss_iou: 0.2627 d1.loss_cls: 0.3149 d1.loss_bbox: 0.1528 d1.loss_iou: 0.2524 d2.loss_cls: 0.3028 d2.loss_bbox: 0.1507 d2.loss_iou: 0.2511 d3.loss_cls: 0.2959 d3.loss_bbox: 0.1492 d3.loss_iou: 0.2485 d4.loss_cls: 0.2879 d4.loss_bbox: 0.1494 d4.loss_iou: 0.2493 enc_loss_cls: 0.3388 enc_loss_bbox: 0.1672 enc_loss_iou: 0.2781 dn_loss_cls: 0.0732 dn_loss_bbox: 0.1711 dn_loss_iou: 0.2158 d0.dn_loss_cls: 0.1706 d0.dn_loss_bbox: 0.3149 d0.dn_loss_iou: 0.3543 d1.dn_loss_cls: 0.1142 d1.dn_loss_bbox: 0.2023 d1.dn_loss_iou: 0.2458 d2.dn_loss_cls: 0.0908 d2.dn_loss_bbox: 0.1827 d2.dn_loss_iou: 0.2258 d3.dn_loss_cls: 0.0779 d3.dn_loss_bbox: 0.1736 d3.dn_loss_iou: 0.2185 d4.dn_loss_cls: 0.0735 d4.dn_loss_bbox: 0.1712 d4.dn_loss_iou: 0.2158 d1.loss_lmm_region: 0.1617 loss_lmm_image: 0.8489 2024/11/13 00:32:04 - mmengine - INFO - Iter(train) [106300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:21:53 time: 2.0308 data_time: 0.0183 memory: 32025 grad_norm: 28.1249 loss: 
9.0600 loss_cls: 0.2634 loss_bbox: 0.1333 loss_iou: 0.2111 d0.loss_cls: 0.3007 d0.loss_bbox: 0.1422 d0.loss_iou: 0.2280 d1.loss_cls: 0.2846 d1.loss_bbox: 0.1397 d1.loss_iou: 0.2206 d2.loss_cls: 0.2754 d2.loss_bbox: 0.1339 d2.loss_iou: 0.2125 d3.loss_cls: 0.2692 d3.loss_bbox: 0.1327 d3.loss_iou: 0.2130 d4.loss_cls: 0.2669 d4.loss_bbox: 0.1339 d4.loss_iou: 0.2124 enc_loss_cls: 0.3150 enc_loss_bbox: 0.1529 enc_loss_iou: 0.2438 dn_loss_cls: 0.0913 dn_loss_bbox: 0.2069 dn_loss_iou: 0.2208 d0.dn_loss_cls: 0.1751 d0.dn_loss_bbox: 0.3707 d0.dn_loss_iou: 0.3624 d1.dn_loss_cls: 0.1221 d1.dn_loss_bbox: 0.2455 d1.dn_loss_iou: 0.2511 d2.dn_loss_cls: 0.1037 d2.dn_loss_bbox: 0.2179 d2.dn_loss_iou: 0.2299 d3.dn_loss_cls: 0.0965 d3.dn_loss_bbox: 0.2096 d3.dn_loss_iou: 0.2234 d4.dn_loss_cls: 0.0926 d4.dn_loss_bbox: 0.2069 d4.dn_loss_iou: 0.2208 d1.loss_lmm_region: 0.1246 loss_lmm_image: 0.8028 2024/11/13 00:35:25 - mmengine - INFO - Iter(train) [106400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:18:32 time: 2.0022 data_time: 0.0185 memory: 34578 grad_norm: 26.2959 loss: 9.1105 loss_cls: 0.2725 loss_bbox: 0.1318 loss_iou: 0.2451 d0.loss_cls: 0.3205 d0.loss_bbox: 0.1477 d0.loss_iou: 0.2606 d1.loss_cls: 0.2935 d1.loss_bbox: 0.1366 d1.loss_iou: 0.2518 d2.loss_cls: 0.2836 d2.loss_bbox: 0.1328 d2.loss_iou: 0.2442 d3.loss_cls: 0.2784 d3.loss_bbox: 0.1307 d3.loss_iou: 0.2416 d4.loss_cls: 0.2711 d4.loss_bbox: 0.1322 d4.loss_iou: 0.2454 enc_loss_cls: 0.3223 enc_loss_bbox: 0.1579 enc_loss_iou: 0.2805 dn_loss_cls: 0.1005 dn_loss_bbox: 0.1613 dn_loss_iou: 0.2103 d0.dn_loss_cls: 0.1734 d0.dn_loss_bbox: 0.3132 d0.dn_loss_iou: 0.3533 d1.dn_loss_cls: 0.1256 d1.dn_loss_bbox: 0.1943 d1.dn_loss_iou: 0.2399 d2.dn_loss_cls: 0.1078 d2.dn_loss_bbox: 0.1701 d2.dn_loss_iou: 0.2186 d3.dn_loss_cls: 0.1037 d3.dn_loss_bbox: 0.1643 d3.dn_loss_iou: 0.2126 d4.dn_loss_cls: 0.1011 d4.dn_loss_bbox: 0.1612 d4.dn_loss_iou: 0.2101 d1.loss_lmm_region: 0.1220 loss_lmm_image: 0.8864 2024/11/13 00:38:45 - mmengine - INFO - Iter(train) [106500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:15:12 time: 1.9942 data_time: 0.0185 memory: 34505 grad_norm: 26.4683 loss: 9.8950 loss_cls: 0.3339 loss_bbox: 0.1600 loss_iou: 0.2807 d0.loss_cls: 0.3876 d0.loss_bbox: 0.1690 d0.loss_iou: 0.2949 d1.loss_cls: 0.3600 d1.loss_bbox: 0.1609 d1.loss_iou: 0.2866 d2.loss_cls: 0.3429 d2.loss_bbox: 0.1595 d2.loss_iou: 0.2808 d3.loss_cls: 0.3397 d3.loss_bbox: 0.1581 d3.loss_iou: 0.2792 d4.loss_cls: 0.3365 d4.loss_bbox: 0.1580 d4.loss_iou: 0.2787 enc_loss_cls: 0.3836 enc_loss_bbox: 0.1879 enc_loss_iou: 0.3214 dn_loss_cls: 0.0929 dn_loss_bbox: 0.1534 dn_loss_iou: 0.2084 d0.dn_loss_cls: 0.1705 d0.dn_loss_bbox: 0.3109 d0.dn_loss_iou: 0.3549 d1.dn_loss_cls: 0.1242 d1.dn_loss_bbox: 0.1909 d1.dn_loss_iou: 0.2422 d2.dn_loss_cls: 0.1046 d2.dn_loss_bbox: 0.1671 d2.dn_loss_iou: 0.2191 d3.dn_loss_cls: 0.0973 d3.dn_loss_bbox: 0.1554 d3.dn_loss_iou: 0.2104 d4.dn_loss_cls: 0.0940 d4.dn_loss_bbox: 0.1535 d4.dn_loss_iou: 0.2083 d1.loss_lmm_region: 0.1222 loss_lmm_image: 0.8551 2024/11/13 00:42:05 - mmengine - INFO - Iter(train) [106600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:11:50 time: 1.9989 data_time: 0.0185 memory: 34066 grad_norm: 28.4362 loss: 9.1344 loss_cls: 0.3162 loss_bbox: 0.1295 loss_iou: 0.2507 d0.loss_cls: 0.3626 d0.loss_bbox: 0.1468 d0.loss_iou: 0.2728 d1.loss_cls: 0.3397 d1.loss_bbox: 0.1334 d1.loss_iou: 0.2577 d2.loss_cls: 0.3322 d2.loss_bbox: 0.1277 d2.loss_iou: 0.2506 d3.loss_cls: 0.3191 d3.loss_bbox: 0.1303 d3.loss_iou: 0.2507 
d4.loss_cls: 0.3174 d4.loss_bbox: 0.1290 d4.loss_iou: 0.2504 enc_loss_cls: 0.3656 enc_loss_bbox: 0.1636 enc_loss_iou: 0.2993 dn_loss_cls: 0.0989 dn_loss_bbox: 0.1299 dn_loss_iou: 0.1938 d0.dn_loss_cls: 0.1772 d0.dn_loss_bbox: 0.2570 d0.dn_loss_iou: 0.3244 d1.dn_loss_cls: 0.1292 d1.dn_loss_bbox: 0.1562 d1.dn_loss_iou: 0.2216 d2.dn_loss_cls: 0.1095 d2.dn_loss_bbox: 0.1376 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.1033 d3.dn_loss_bbox: 0.1311 d3.dn_loss_iou: 0.1955 d4.dn_loss_cls: 0.0987 d4.dn_loss_bbox: 0.1299 d4.dn_loss_iou: 0.1937 d1.loss_lmm_region: 0.1346 loss_lmm_image: 0.8650 2024/11/13 00:45:25 - mmengine - INFO - Iter(train) [106700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:08:29 time: 2.0231 data_time: 0.0185 memory: 35078 grad_norm: 25.2464 loss: 9.6459 loss_cls: 0.3078 loss_bbox: 0.1572 loss_iou: 0.2548 d0.loss_cls: 0.3544 d0.loss_bbox: 0.1736 d0.loss_iou: 0.2779 d1.loss_cls: 0.3333 d1.loss_bbox: 0.1591 d1.loss_iou: 0.2622 d2.loss_cls: 0.3234 d2.loss_bbox: 0.1608 d2.loss_iou: 0.2569 d3.loss_cls: 0.3121 d3.loss_bbox: 0.1581 d3.loss_iou: 0.2562 d4.loss_cls: 0.3081 d4.loss_bbox: 0.1575 d4.loss_iou: 0.2556 enc_loss_cls: 0.3606 enc_loss_bbox: 0.1894 enc_loss_iou: 0.3014 dn_loss_cls: 0.0992 dn_loss_bbox: 0.1668 dn_loss_iou: 0.2056 d0.dn_loss_cls: 0.1775 d0.dn_loss_bbox: 0.3143 d0.dn_loss_iou: 0.3527 d1.dn_loss_cls: 0.1289 d1.dn_loss_bbox: 0.1987 d1.dn_loss_iou: 0.2387 d2.dn_loss_cls: 0.1097 d2.dn_loss_bbox: 0.1764 d2.dn_loss_iou: 0.2151 d3.dn_loss_cls: 0.1020 d3.dn_loss_bbox: 0.1687 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.0984 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2056 d1.loss_lmm_region: 0.1210 loss_lmm_image: 0.8715 2024/11/13 00:48:46 - mmengine - INFO - Iter(train) [106800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:05:09 time: 1.9899 data_time: 0.0183 memory: 34199 grad_norm: 37.2183 loss: 10.0325 loss_cls: 0.2987 loss_bbox: 0.1742 loss_iou: 0.2748 d0.loss_cls: 0.3333 d0.loss_bbox: 0.1927 d0.loss_iou: 0.2969 d1.loss_cls: 0.3202 d1.loss_bbox: 0.1804 d1.loss_iou: 0.2802 d2.loss_cls: 0.3099 d2.loss_bbox: 0.1741 d2.loss_iou: 0.2775 d3.loss_cls: 0.2940 d3.loss_bbox: 0.1796 d3.loss_iou: 0.2791 d4.loss_cls: 0.2999 d4.loss_bbox: 0.1751 d4.loss_iou: 0.2758 enc_loss_cls: 0.3486 enc_loss_bbox: 0.1978 enc_loss_iou: 0.3069 dn_loss_cls: 0.1022 dn_loss_bbox: 0.1942 dn_loss_iou: 0.2257 d0.dn_loss_cls: 0.1726 d0.dn_loss_bbox: 0.3522 d0.dn_loss_iou: 0.3663 d1.dn_loss_cls: 0.1247 d1.dn_loss_bbox: 0.2306 d1.dn_loss_iou: 0.2551 d2.dn_loss_cls: 0.1095 d2.dn_loss_bbox: 0.2037 d2.dn_loss_iou: 0.2343 d3.dn_loss_cls: 0.1038 d3.dn_loss_bbox: 0.1964 d3.dn_loss_iou: 0.2280 d4.dn_loss_cls: 0.1007 d4.dn_loss_bbox: 0.1943 d4.dn_loss_iou: 0.2257 d1.loss_lmm_region: 0.1178 loss_lmm_image: 0.8249 2024/11/13 00:52:06 - mmengine - INFO - Iter(train) [106900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 1 day, 0:01:48 time: 2.0240 data_time: 0.0185 memory: 33760 grad_norm: 34.6786 loss: 9.0767 loss_cls: 0.2715 loss_bbox: 0.1166 loss_iou: 0.2020 d0.loss_cls: 0.2972 d0.loss_bbox: 0.1324 d0.loss_iou: 0.2189 d1.loss_cls: 0.2830 d1.loss_bbox: 0.1253 d1.loss_iou: 0.2083 d2.loss_cls: 0.2737 d2.loss_bbox: 0.1205 d2.loss_iou: 0.2034 d3.loss_cls: 0.2713 d3.loss_bbox: 0.1180 d3.loss_iou: 0.2031 d4.loss_cls: 0.2742 d4.loss_bbox: 0.1176 d4.loss_iou: 0.2017 enc_loss_cls: 0.3070 enc_loss_bbox: 0.1425 enc_loss_iou: 0.2363 dn_loss_cls: 0.1599 dn_loss_bbox: 0.1785 dn_loss_iou: 0.2114 d0.dn_loss_cls: 0.2392 d0.dn_loss_bbox: 0.3472 d0.dn_loss_iou: 0.3584 d1.dn_loss_cls: 0.1962 d1.dn_loss_bbox: 0.2139 
d1.dn_loss_iou: 0.2422 d2.dn_loss_cls: 0.1734 d2.dn_loss_bbox: 0.1878 d2.dn_loss_iou: 0.2201 d3.dn_loss_cls: 0.1672 d3.dn_loss_bbox: 0.1803 d3.dn_loss_iou: 0.2138 d4.dn_loss_cls: 0.1615 d4.dn_loss_bbox: 0.1785 d4.dn_loss_iou: 0.2114 d1.loss_lmm_region: 0.1413 loss_lmm_image: 0.7700 2024/11/13 00:55:25 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 00:55:25 - mmengine - INFO - Iter(train) [107000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:58:26 time: 1.9902 data_time: 0.0186 memory: 34565 grad_norm: 23.7239 loss: 10.3607 loss_cls: 0.3290 loss_bbox: 0.1638 loss_iou: 0.2974 d0.loss_cls: 0.3781 d0.loss_bbox: 0.1773 d0.loss_iou: 0.3091 d1.loss_cls: 0.3581 d1.loss_bbox: 0.1699 d1.loss_iou: 0.3016 d2.loss_cls: 0.3433 d2.loss_bbox: 0.1659 d2.loss_iou: 0.3004 d3.loss_cls: 0.3331 d3.loss_bbox: 0.1633 d3.loss_iou: 0.2983 d4.loss_cls: 0.3288 d4.loss_bbox: 0.1621 d4.loss_iou: 0.2979 enc_loss_cls: 0.3755 enc_loss_bbox: 0.1958 enc_loss_iou: 0.3322 dn_loss_cls: 0.1012 dn_loss_bbox: 0.1856 dn_loss_iou: 0.2382 d0.dn_loss_cls: 0.1765 d0.dn_loss_bbox: 0.3335 d0.dn_loss_iou: 0.3774 d1.dn_loss_cls: 0.1281 d1.dn_loss_bbox: 0.2203 d1.dn_loss_iou: 0.2684 d2.dn_loss_cls: 0.1098 d2.dn_loss_bbox: 0.1977 d2.dn_loss_iou: 0.2486 d3.dn_loss_cls: 0.1054 d3.dn_loss_bbox: 0.1873 d3.dn_loss_iou: 0.2404 d4.dn_loss_cls: 0.1026 d4.dn_loss_bbox: 0.1856 d4.dn_loss_iou: 0.2382 d1.loss_lmm_region: 0.1150 loss_lmm_image: 0.8198 2024/11/13 00:58:45 - mmengine - INFO - Iter(train) [107100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:55:05 time: 1.9941 data_time: 0.0185 memory: 34407 grad_norm: 35.2737 loss: 11.2674 loss_cls: 0.3506 loss_bbox: 0.1960 loss_iou: 0.3413 d0.loss_cls: 0.3887 d0.loss_bbox: 0.2101 d0.loss_iou: 0.3652 d1.loss_cls: 0.3691 d1.loss_bbox: 0.1986 d1.loss_iou: 0.3452 d2.loss_cls: 0.3576 d2.loss_bbox: 0.1937 d2.loss_iou: 0.3415 d3.loss_cls: 0.3498 d3.loss_bbox: 0.1953 d3.loss_iou: 0.3428 d4.loss_cls: 0.3526 d4.loss_bbox: 0.1955 d4.loss_iou: 0.3426 enc_loss_cls: 0.4097 enc_loss_bbox: 0.2191 enc_loss_iou: 0.3794 dn_loss_cls: 0.1115 dn_loss_bbox: 0.1822 dn_loss_iou: 0.2551 d0.dn_loss_cls: 0.2090 d0.dn_loss_bbox: 0.3365 d0.dn_loss_iou: 0.4092 d1.dn_loss_cls: 0.1542 d1.dn_loss_bbox: 0.2175 d1.dn_loss_iou: 0.2904 d2.dn_loss_cls: 0.1284 d2.dn_loss_bbox: 0.1952 d2.dn_loss_iou: 0.2666 d3.dn_loss_cls: 0.1180 d3.dn_loss_bbox: 0.1859 d3.dn_loss_iou: 0.2589 d4.dn_loss_cls: 0.1123 d4.dn_loss_bbox: 0.1823 d4.dn_loss_iou: 0.2551 d1.loss_lmm_region: 0.1331 loss_lmm_image: 0.8216 2024/11/13 01:02:03 - mmengine - INFO - Iter(train) [107200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:51:43 time: 1.9983 data_time: 0.0185 memory: 34196 grad_norm: 31.5339 loss: 8.7936 loss_cls: 0.2355 loss_bbox: 0.1376 loss_iou: 0.2313 d0.loss_cls: 0.2770 d0.loss_bbox: 0.1469 d0.loss_iou: 0.2419 d1.loss_cls: 0.2535 d1.loss_bbox: 0.1393 d1.loss_iou: 0.2320 d2.loss_cls: 0.2456 d2.loss_bbox: 0.1355 d2.loss_iou: 0.2288 d3.loss_cls: 0.2398 d3.loss_bbox: 0.1348 d3.loss_iou: 0.2272 d4.loss_cls: 0.2385 d4.loss_bbox: 0.1359 d4.loss_iou: 0.2298 enc_loss_cls: 0.2857 enc_loss_bbox: 0.1606 enc_loss_iou: 0.2629 dn_loss_cls: 0.1017 dn_loss_bbox: 0.1708 dn_loss_iou: 0.2149 d0.dn_loss_cls: 0.1698 d0.dn_loss_bbox: 0.3218 d0.dn_loss_iou: 0.3501 d1.dn_loss_cls: 0.1237 d1.dn_loss_bbox: 0.2044 d1.dn_loss_iou: 0.2451 d2.dn_loss_cls: 0.1099 d2.dn_loss_bbox: 0.1798 d2.dn_loss_iou: 0.2236 d3.dn_loss_cls: 0.1031 d3.dn_loss_bbox: 0.1718 d3.dn_loss_iou: 0.2167 d4.dn_loss_cls: 0.1001 d4.dn_loss_bbox: 0.1709 d4.dn_loss_iou: 0.2150 
d1.loss_lmm_region: 0.1474 loss_lmm_image: 0.8329 2024/11/13 01:05:23 - mmengine - INFO - Iter(train) [107300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:48:22 time: 1.9746 data_time: 0.0185 memory: 34589 grad_norm: 25.2584 loss: 8.9170 loss_cls: 0.2676 loss_bbox: 0.1260 loss_iou: 0.2191 d0.loss_cls: 0.3031 d0.loss_bbox: 0.1357 d0.loss_iou: 0.2302 d1.loss_cls: 0.2872 d1.loss_bbox: 0.1270 d1.loss_iou: 0.2208 d2.loss_cls: 0.2710 d2.loss_bbox: 0.1300 d2.loss_iou: 0.2225 d3.loss_cls: 0.2694 d3.loss_bbox: 0.1267 d3.loss_iou: 0.2197 d4.loss_cls: 0.2701 d4.loss_bbox: 0.1248 d4.loss_iou: 0.2164 enc_loss_cls: 0.3130 enc_loss_bbox: 0.1445 enc_loss_iou: 0.2482 dn_loss_cls: 0.1093 dn_loss_bbox: 0.1636 dn_loss_iou: 0.2185 d0.dn_loss_cls: 0.1959 d0.dn_loss_bbox: 0.3217 d0.dn_loss_iou: 0.3656 d1.dn_loss_cls: 0.1454 d1.dn_loss_bbox: 0.2009 d1.dn_loss_iou: 0.2506 d2.dn_loss_cls: 0.1253 d2.dn_loss_bbox: 0.1744 d2.dn_loss_iou: 0.2287 d3.dn_loss_cls: 0.1159 d3.dn_loss_bbox: 0.1671 d3.dn_loss_iou: 0.2212 d4.dn_loss_cls: 0.1103 d4.dn_loss_bbox: 0.1636 d4.dn_loss_iou: 0.2185 d1.loss_lmm_region: 0.1279 loss_lmm_image: 0.8197 2024/11/13 01:08:43 - mmengine - INFO - Iter(train) [107400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:45:01 time: 2.0198 data_time: 0.0184 memory: 33050 grad_norm: 26.2242 loss: 9.8363 loss_cls: 0.3247 loss_bbox: 0.1540 loss_iou: 0.2657 d0.loss_cls: 0.3693 d0.loss_bbox: 0.1639 d0.loss_iou: 0.2797 d1.loss_cls: 0.3437 d1.loss_bbox: 0.1593 d1.loss_iou: 0.2721 d2.loss_cls: 0.3385 d2.loss_bbox: 0.1510 d2.loss_iou: 0.2637 d3.loss_cls: 0.3340 d3.loss_bbox: 0.1482 d3.loss_iou: 0.2625 d4.loss_cls: 0.3255 d4.loss_bbox: 0.1527 d4.loss_iou: 0.2664 enc_loss_cls: 0.3715 enc_loss_bbox: 0.1768 enc_loss_iou: 0.3020 dn_loss_cls: 0.1020 dn_loss_bbox: 0.1792 dn_loss_iou: 0.2252 d0.dn_loss_cls: 0.1761 d0.dn_loss_bbox: 0.3200 d0.dn_loss_iou: 0.3565 d1.dn_loss_cls: 0.1294 d1.dn_loss_bbox: 0.2093 d1.dn_loss_iou: 0.2560 d2.dn_loss_cls: 0.1117 d2.dn_loss_bbox: 0.1888 d2.dn_loss_iou: 0.2352 d3.dn_loss_cls: 0.1054 d3.dn_loss_bbox: 0.1815 d3.dn_loss_iou: 0.2277 d4.dn_loss_cls: 0.1020 d4.dn_loss_bbox: 0.1792 d4.dn_loss_iou: 0.2253 d1.loss_lmm_region: 0.1190 loss_lmm_image: 0.7816 2024/11/13 01:12:04 - mmengine - INFO - Iter(train) [107500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:41:41 time: 2.0147 data_time: 0.0184 memory: 34459 grad_norm: 31.4275 loss: 10.3797 loss_cls: 0.3230 loss_bbox: 0.1571 loss_iou: 0.2604 d0.loss_cls: 0.3684 d0.loss_bbox: 0.1659 d0.loss_iou: 0.2715 d1.loss_cls: 0.3373 d1.loss_bbox: 0.1613 d1.loss_iou: 0.2655 d2.loss_cls: 0.3287 d2.loss_bbox: 0.1563 d2.loss_iou: 0.2597 d3.loss_cls: 0.3251 d3.loss_bbox: 0.1567 d3.loss_iou: 0.2588 d4.loss_cls: 0.3226 d4.loss_bbox: 0.1552 d4.loss_iou: 0.2590 enc_loss_cls: 0.3747 enc_loss_bbox: 0.1810 enc_loss_iou: 0.2905 dn_loss_cls: 0.1306 dn_loss_bbox: 0.2013 dn_loss_iou: 0.2432 d0.dn_loss_cls: 0.2155 d0.dn_loss_bbox: 0.3673 d0.dn_loss_iou: 0.3936 d1.dn_loss_cls: 0.1613 d1.dn_loss_bbox: 0.2319 d1.dn_loss_iou: 0.2732 d2.dn_loss_cls: 0.1399 d2.dn_loss_bbox: 0.2128 d2.dn_loss_iou: 0.2538 d3.dn_loss_cls: 0.1333 d3.dn_loss_bbox: 0.2029 d3.dn_loss_iou: 0.2457 d4.dn_loss_cls: 0.1288 d4.dn_loss_bbox: 0.2014 d4.dn_loss_iou: 0.2432 d1.loss_lmm_region: 0.1703 loss_lmm_image: 0.8507 2024/11/13 01:15:24 - mmengine - INFO - Iter(train) [107600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:38:20 time: 2.0095 data_time: 0.0185 memory: 34147 grad_norm: 29.1342 loss: 9.8052 loss_cls: 0.3226 loss_bbox: 0.1338 loss_iou: 0.2527 d0.loss_cls: 0.3727 
d0.loss_bbox: 0.1496 d0.loss_iou: 0.2708 d1.loss_cls: 0.3400 d1.loss_bbox: 0.1430 d1.loss_iou: 0.2623 d2.loss_cls: 0.3336 d2.loss_bbox: 0.1379 d2.loss_iou: 0.2554 d3.loss_cls: 0.3315 d3.loss_bbox: 0.1364 d3.loss_iou: 0.2537 d4.loss_cls: 0.3250 d4.loss_bbox: 0.1346 d4.loss_iou: 0.2536 enc_loss_cls: 0.3730 enc_loss_bbox: 0.1652 enc_loss_iou: 0.2970 dn_loss_cls: 0.1025 dn_loss_bbox: 0.1707 dn_loss_iou: 0.2255 d0.dn_loss_cls: 0.1815 d0.dn_loss_bbox: 0.3240 d0.dn_loss_iou: 0.3759 d1.dn_loss_cls: 0.1325 d1.dn_loss_bbox: 0.2057 d1.dn_loss_iou: 0.2567 d2.dn_loss_cls: 0.1147 d2.dn_loss_bbox: 0.1822 d2.dn_loss_iou: 0.2345 d3.dn_loss_cls: 0.1085 d3.dn_loss_bbox: 0.1726 d3.dn_loss_iou: 0.2275 d4.dn_loss_cls: 0.1038 d4.dn_loss_bbox: 0.1706 d4.dn_loss_iou: 0.2255 d1.loss_lmm_region: 0.1516 loss_lmm_image: 0.8946 2024/11/13 01:18:43 - mmengine - INFO - Iter(train) [107700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:34:58 time: 1.9933 data_time: 0.0183 memory: 34691 grad_norm: 28.5186 loss: 9.4956 loss_cls: 0.3016 loss_bbox: 0.1464 loss_iou: 0.2519 d0.loss_cls: 0.3498 d0.loss_bbox: 0.1532 d0.loss_iou: 0.2640 d1.loss_cls: 0.3240 d1.loss_bbox: 0.1478 d1.loss_iou: 0.2569 d2.loss_cls: 0.3129 d2.loss_bbox: 0.1467 d2.loss_iou: 0.2544 d3.loss_cls: 0.3077 d3.loss_bbox: 0.1445 d3.loss_iou: 0.2519 d4.loss_cls: 0.3023 d4.loss_bbox: 0.1465 d4.loss_iou: 0.2513 enc_loss_cls: 0.3437 enc_loss_bbox: 0.1699 enc_loss_iou: 0.2868 dn_loss_cls: 0.0954 dn_loss_bbox: 0.1685 dn_loss_iou: 0.2259 d0.dn_loss_cls: 0.1788 d0.dn_loss_bbox: 0.3124 d0.dn_loss_iou: 0.3670 d1.dn_loss_cls: 0.1222 d1.dn_loss_bbox: 0.1967 d1.dn_loss_iou: 0.2538 d2.dn_loss_cls: 0.1061 d2.dn_loss_bbox: 0.1775 d2.dn_loss_iou: 0.2340 d3.dn_loss_cls: 0.0991 d3.dn_loss_bbox: 0.1708 d3.dn_loss_iou: 0.2280 d4.dn_loss_cls: 0.0972 d4.dn_loss_bbox: 0.1685 d4.dn_loss_iou: 0.2258 d1.loss_lmm_region: 0.1397 loss_lmm_image: 0.8142 2024/11/13 01:22:04 - mmengine - INFO - Iter(train) [107800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:31:38 time: 2.0067 data_time: 0.0184 memory: 34600 grad_norm: 25.2195 loss: 9.4524 loss_cls: 0.3331 loss_bbox: 0.1297 loss_iou: 0.2790 d0.loss_cls: 0.3738 d0.loss_bbox: 0.1428 d0.loss_iou: 0.2950 d1.loss_cls: 0.3476 d1.loss_bbox: 0.1370 d1.loss_iou: 0.2867 d2.loss_cls: 0.3462 d2.loss_bbox: 0.1329 d2.loss_iou: 0.2828 d3.loss_cls: 0.3389 d3.loss_bbox: 0.1323 d3.loss_iou: 0.2791 d4.loss_cls: 0.3336 d4.loss_bbox: 0.1300 d4.loss_iou: 0.2788 enc_loss_cls: 0.3864 enc_loss_bbox: 0.1549 enc_loss_iou: 0.3102 dn_loss_cls: 0.0881 dn_loss_bbox: 0.1301 dn_loss_iou: 0.2072 d0.dn_loss_cls: 0.1587 d0.dn_loss_bbox: 0.2803 d0.dn_loss_iou: 0.3488 d1.dn_loss_cls: 0.1154 d1.dn_loss_bbox: 0.1719 d1.dn_loss_iou: 0.2420 d2.dn_loss_cls: 0.0988 d2.dn_loss_bbox: 0.1439 d2.dn_loss_iou: 0.2181 d3.dn_loss_cls: 0.0952 d3.dn_loss_bbox: 0.1332 d3.dn_loss_iou: 0.2100 d4.dn_loss_cls: 0.0888 d4.dn_loss_bbox: 0.1301 d4.dn_loss_iou: 0.2074 d1.loss_lmm_region: 0.1167 loss_lmm_image: 0.8370 2024/11/13 01:25:24 - mmengine - INFO - Iter(train) [107900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:28:17 time: 1.9928 data_time: 0.0185 memory: 32802 grad_norm: 36.9908 loss: 9.7623 loss_cls: 0.2865 loss_bbox: 0.1617 loss_iou: 0.2669 d0.loss_cls: 0.3237 d0.loss_bbox: 0.1708 d0.loss_iou: 0.2768 d1.loss_cls: 0.2984 d1.loss_bbox: 0.1684 d1.loss_iou: 0.2722 d2.loss_cls: 0.2962 d2.loss_bbox: 0.1597 d2.loss_iou: 0.2666 d3.loss_cls: 0.2867 d3.loss_bbox: 0.1624 d3.loss_iou: 0.2707 d4.loss_cls: 0.2885 d4.loss_bbox: 0.1596 d4.loss_iou: 0.2658 enc_loss_cls: 0.3280 enc_loss_bbox: 
0.1921 enc_loss_iou: 0.3038 dn_loss_cls: 0.1158 dn_loss_bbox: 0.1806 dn_loss_iou: 0.2278 d0.dn_loss_cls: 0.1916 d0.dn_loss_bbox: 0.3269 d0.dn_loss_iou: 0.3664 d1.dn_loss_cls: 0.1494 d1.dn_loss_bbox: 0.2098 d1.dn_loss_iou: 0.2563 d2.dn_loss_cls: 0.1309 d2.dn_loss_bbox: 0.1888 d2.dn_loss_iou: 0.2359 d3.dn_loss_cls: 0.1234 d3.dn_loss_bbox: 0.1827 d3.dn_loss_iou: 0.2301 d4.dn_loss_cls: 0.1173 d4.dn_loss_bbox: 0.1807 d4.dn_loss_iou: 0.2277 d1.loss_lmm_region: 0.1282 loss_lmm_image: 0.7862 2024/11/13 01:28:45 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 01:28:45 - mmengine - INFO - Iter(train) [108000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:24:56 time: 2.0169 data_time: 0.0184 memory: 33745 grad_norm: 25.9795 loss: 8.4850 loss_cls: 0.2456 loss_bbox: 0.1200 loss_iou: 0.2216 d0.loss_cls: 0.2758 d0.loss_bbox: 0.1303 d0.loss_iou: 0.2361 d1.loss_cls: 0.2609 d1.loss_bbox: 0.1214 d1.loss_iou: 0.2289 d2.loss_cls: 0.2538 d2.loss_bbox: 0.1194 d2.loss_iou: 0.2235 d3.loss_cls: 0.2506 d3.loss_bbox: 0.1191 d3.loss_iou: 0.2193 d4.loss_cls: 0.2480 d4.loss_bbox: 0.1196 d4.loss_iou: 0.2204 enc_loss_cls: 0.2872 enc_loss_bbox: 0.1410 enc_loss_iou: 0.2536 dn_loss_cls: 0.0838 dn_loss_bbox: 0.1640 dn_loss_iou: 0.2166 d0.dn_loss_cls: 0.1581 d0.dn_loss_bbox: 0.3029 d0.dn_loss_iou: 0.3565 d1.dn_loss_cls: 0.1134 d1.dn_loss_bbox: 0.1895 d1.dn_loss_iou: 0.2444 d2.dn_loss_cls: 0.0986 d2.dn_loss_bbox: 0.1741 d2.dn_loss_iou: 0.2255 d3.dn_loss_cls: 0.0901 d3.dn_loss_bbox: 0.1643 d3.dn_loss_iou: 0.2181 d4.dn_loss_cls: 0.0842 d4.dn_loss_bbox: 0.1640 d4.dn_loss_iou: 0.2167 d1.loss_lmm_region: 0.1225 loss_lmm_image: 0.8015 2024/11/13 01:32:04 - mmengine - INFO - Iter(train) [108100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:21:35 time: 1.9983 data_time: 0.0185 memory: 34402 grad_norm: 25.3158 loss: 9.8033 loss_cls: 0.3200 loss_bbox: 0.1511 loss_iou: 0.2631 d0.loss_cls: 0.3588 d0.loss_bbox: 0.1658 d0.loss_iou: 0.2801 d1.loss_cls: 0.3332 d1.loss_bbox: 0.1606 d1.loss_iou: 0.2759 d2.loss_cls: 0.3233 d2.loss_bbox: 0.1543 d2.loss_iou: 0.2722 d3.loss_cls: 0.3202 d3.loss_bbox: 0.1531 d3.loss_iou: 0.2672 d4.loss_cls: 0.3175 d4.loss_bbox: 0.1524 d4.loss_iou: 0.2655 enc_loss_cls: 0.3688 enc_loss_bbox: 0.1788 enc_loss_iou: 0.3038 dn_loss_cls: 0.1157 dn_loss_bbox: 0.1617 dn_loss_iou: 0.2202 d0.dn_loss_cls: 0.1824 d0.dn_loss_bbox: 0.2952 d0.dn_loss_iou: 0.3597 d1.dn_loss_cls: 0.1376 d1.dn_loss_bbox: 0.1894 d1.dn_loss_iou: 0.2488 d2.dn_loss_cls: 0.1254 d2.dn_loss_bbox: 0.1702 d2.dn_loss_iou: 0.2284 d3.dn_loss_cls: 0.1175 d3.dn_loss_bbox: 0.1635 d3.dn_loss_iou: 0.2226 d4.dn_loss_cls: 0.1157 d4.dn_loss_bbox: 0.1617 d4.dn_loss_iou: 0.2202 d1.loss_lmm_region: 0.1221 loss_lmm_image: 0.8595 2024/11/13 01:35:24 - mmengine - INFO - Iter(train) [108200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:18:14 time: 2.0171 data_time: 0.0185 memory: 34599 grad_norm: 26.5766 loss: 8.4009 loss_cls: 0.2649 loss_bbox: 0.1124 loss_iou: 0.2097 d0.loss_cls: 0.3029 d0.loss_bbox: 0.1219 d0.loss_iou: 0.2223 d1.loss_cls: 0.2790 d1.loss_bbox: 0.1148 d1.loss_iou: 0.2116 d2.loss_cls: 0.2713 d2.loss_bbox: 0.1116 d2.loss_iou: 0.2098 d3.loss_cls: 0.2658 d3.loss_bbox: 0.1144 d3.loss_iou: 0.2103 d4.loss_cls: 0.2687 d4.loss_bbox: 0.1121 d4.loss_iou: 0.2093 enc_loss_cls: 0.3106 enc_loss_bbox: 0.1312 enc_loss_iou: 0.2389 dn_loss_cls: 0.0884 dn_loss_bbox: 0.1497 dn_loss_iou: 0.1978 d0.dn_loss_cls: 0.1571 d0.dn_loss_bbox: 0.2876 d0.dn_loss_iou: 0.3323 d1.dn_loss_cls: 0.1195 d1.dn_loss_bbox: 0.1783 d1.dn_loss_iou: 0.2259 
d2.dn_loss_cls: 0.1029 d2.dn_loss_bbox: 0.1599 d2.dn_loss_iou: 0.2074 d3.dn_loss_cls: 0.0976 d3.dn_loss_bbox: 0.1520 d3.dn_loss_iou: 0.2008 d4.dn_loss_cls: 0.0911 d4.dn_loss_bbox: 0.1497 d4.dn_loss_iou: 0.1978 d1.loss_lmm_region: 0.1138 loss_lmm_image: 0.8978 2024/11/13 01:38:43 - mmengine - INFO - Iter(train) [108300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:14:53 time: 1.9947 data_time: 0.0185 memory: 33210 grad_norm: 26.4980 loss: 8.4523 loss_cls: 0.2460 loss_bbox: 0.1210 loss_iou: 0.2315 d0.loss_cls: 0.2845 d0.loss_bbox: 0.1262 d0.loss_iou: 0.2436 d1.loss_cls: 0.2643 d1.loss_bbox: 0.1210 d1.loss_iou: 0.2350 d2.loss_cls: 0.2515 d2.loss_bbox: 0.1208 d2.loss_iou: 0.2352 d3.loss_cls: 0.2492 d3.loss_bbox: 0.1193 d3.loss_iou: 0.2322 d4.loss_cls: 0.2503 d4.loss_bbox: 0.1195 d4.loss_iou: 0.2303 enc_loss_cls: 0.2914 enc_loss_bbox: 0.1409 enc_loss_iou: 0.2620 dn_loss_cls: 0.0816 dn_loss_bbox: 0.1475 dn_loss_iou: 0.2084 d0.dn_loss_cls: 0.1640 d0.dn_loss_bbox: 0.2910 d0.dn_loss_iou: 0.3525 d1.dn_loss_cls: 0.1139 d1.dn_loss_bbox: 0.1762 d1.dn_loss_iou: 0.2377 d2.dn_loss_cls: 0.0947 d2.dn_loss_bbox: 0.1541 d2.dn_loss_iou: 0.2158 d3.dn_loss_cls: 0.0876 d3.dn_loss_bbox: 0.1487 d3.dn_loss_iou: 0.2101 d4.dn_loss_cls: 0.0828 d4.dn_loss_bbox: 0.1476 d4.dn_loss_iou: 0.2083 d1.loss_lmm_region: 0.1255 loss_lmm_image: 0.8289 2024/11/13 01:42:05 - mmengine - INFO - Iter(train) [108400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:11:32 time: 2.0047 data_time: 0.0184 memory: 35376 grad_norm: 26.7924 loss: 9.8496 loss_cls: 0.3165 loss_bbox: 0.1385 loss_iou: 0.2408 d0.loss_cls: 0.3562 d0.loss_bbox: 0.1511 d0.loss_iou: 0.2551 d1.loss_cls: 0.3394 d1.loss_bbox: 0.1393 d1.loss_iou: 0.2446 d2.loss_cls: 0.3357 d2.loss_bbox: 0.1378 d2.loss_iou: 0.2393 d3.loss_cls: 0.3248 d3.loss_bbox: 0.1359 d3.loss_iou: 0.2380 d4.loss_cls: 0.3192 d4.loss_bbox: 0.1375 d4.loss_iou: 0.2402 enc_loss_cls: 0.3611 enc_loss_bbox: 0.1613 enc_loss_iou: 0.2757 dn_loss_cls: 0.1636 dn_loss_bbox: 0.1667 dn_loss_iou: 0.2165 d0.dn_loss_cls: 0.2295 d0.dn_loss_bbox: 0.3227 d0.dn_loss_iou: 0.3622 d1.dn_loss_cls: 0.1909 d1.dn_loss_bbox: 0.2006 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 0.1755 d2.dn_loss_bbox: 0.1766 d2.dn_loss_iou: 0.2259 d3.dn_loss_cls: 0.1707 d3.dn_loss_bbox: 0.1690 d3.dn_loss_iou: 0.2184 d4.dn_loss_cls: 0.1654 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2165 d1.loss_lmm_region: 0.1435 loss_lmm_image: 0.8324 2024/11/13 01:45:24 - mmengine - INFO - Iter(train) [108500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:08:11 time: 2.0164 data_time: 0.0187 memory: 34274 grad_norm: 46.2163 loss: 10.2611 loss_cls: 0.3317 loss_bbox: 0.1577 loss_iou: 0.2958 d0.loss_cls: 0.3829 d0.loss_bbox: 0.1690 d0.loss_iou: 0.3150 d1.loss_cls: 0.3567 d1.loss_bbox: 0.1576 d1.loss_iou: 0.3040 d2.loss_cls: 0.3435 d2.loss_bbox: 0.1562 d2.loss_iou: 0.3009 d3.loss_cls: 0.3333 d3.loss_bbox: 0.1599 d3.loss_iou: 0.3028 d4.loss_cls: 0.3311 d4.loss_bbox: 0.1587 d4.loss_iou: 0.2967 enc_loss_cls: 0.3956 enc_loss_bbox: 0.1831 enc_loss_iou: 0.3360 dn_loss_cls: 0.1039 dn_loss_bbox: 0.1655 dn_loss_iou: 0.2328 d0.dn_loss_cls: 0.1818 d0.dn_loss_bbox: 0.3026 d0.dn_loss_iou: 0.3698 d1.dn_loss_cls: 0.1367 d1.dn_loss_bbox: 0.1993 d1.dn_loss_iou: 0.2634 d2.dn_loss_cls: 0.1189 d2.dn_loss_bbox: 0.1774 d2.dn_loss_iou: 0.2433 d3.dn_loss_cls: 0.1108 d3.dn_loss_bbox: 0.1676 d3.dn_loss_iou: 0.2351 d4.dn_loss_cls: 0.1050 d4.dn_loss_bbox: 0.1654 d4.dn_loss_iou: 0.2328 d1.loss_lmm_region: 0.1318 loss_lmm_image: 0.8489 2024/11/13 01:48:45 - mmengine - INFO - Iter(train) 
[108600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:04:50 time: 2.0056 data_time: 0.0186 memory: 34364 grad_norm: 34.6704 loss: 10.4854 loss_cls: 0.3472 loss_bbox: 0.1744 loss_iou: 0.2934 d0.loss_cls: 0.3905 d0.loss_bbox: 0.1879 d0.loss_iou: 0.3028 d1.loss_cls: 0.3705 d1.loss_bbox: 0.1809 d1.loss_iou: 0.2967 d2.loss_cls: 0.3582 d2.loss_bbox: 0.1732 d2.loss_iou: 0.2916 d3.loss_cls: 0.3545 d3.loss_bbox: 0.1727 d3.loss_iou: 0.2916 d4.loss_cls: 0.3522 d4.loss_bbox: 0.1697 d4.loss_iou: 0.2908 enc_loss_cls: 0.4052 enc_loss_bbox: 0.1954 enc_loss_iou: 0.3225 dn_loss_cls: 0.1043 dn_loss_bbox: 0.1813 dn_loss_iou: 0.2187 d0.dn_loss_cls: 0.1876 d0.dn_loss_bbox: 0.3359 d0.dn_loss_iou: 0.3625 d1.dn_loss_cls: 0.1447 d1.dn_loss_bbox: 0.2191 d1.dn_loss_iou: 0.2511 d2.dn_loss_cls: 0.1253 d2.dn_loss_bbox: 0.1962 d2.dn_loss_iou: 0.2300 d3.dn_loss_cls: 0.1128 d3.dn_loss_bbox: 0.1846 d3.dn_loss_iou: 0.2219 d4.dn_loss_cls: 0.1078 d4.dn_loss_bbox: 0.1813 d4.dn_loss_iou: 0.2187 d1.loss_lmm_region: 0.1332 loss_lmm_image: 0.8465 2024/11/13 01:52:07 - mmengine - INFO - Iter(train) [108700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 23:01:30 time: 2.0321 data_time: 0.0185 memory: 34268 grad_norm: 32.7969 loss: 11.0915 loss_cls: 0.3768 loss_bbox: 0.1921 loss_iou: 0.3006 d0.loss_cls: 0.4301 d0.loss_bbox: 0.1969 d0.loss_iou: 0.3076 d1.loss_cls: 0.4091 d1.loss_bbox: 0.1904 d1.loss_iou: 0.3004 d2.loss_cls: 0.3992 d2.loss_bbox: 0.1868 d2.loss_iou: 0.2963 d3.loss_cls: 0.3870 d3.loss_bbox: 0.1910 d3.loss_iou: 0.2981 d4.loss_cls: 0.3791 d4.loss_bbox: 0.1908 d4.loss_iou: 0.3019 enc_loss_cls: 0.4339 enc_loss_bbox: 0.2185 enc_loss_iou: 0.3312 dn_loss_cls: 0.1070 dn_loss_bbox: 0.1990 dn_loss_iou: 0.2407 d0.dn_loss_cls: 0.1947 d0.dn_loss_bbox: 0.3559 d0.dn_loss_iou: 0.3926 d1.dn_loss_cls: 0.1389 d1.dn_loss_bbox: 0.2317 d1.dn_loss_iou: 0.2731 d2.dn_loss_cls: 0.1187 d2.dn_loss_bbox: 0.2074 d2.dn_loss_iou: 0.2493 d3.dn_loss_cls: 0.1113 d3.dn_loss_bbox: 0.2007 d3.dn_loss_iou: 0.2433 d4.dn_loss_cls: 0.1068 d4.dn_loss_bbox: 0.1990 d4.dn_loss_iou: 0.2405 d1.loss_lmm_region: 0.1363 loss_lmm_image: 0.8268 2024/11/13 01:55:26 - mmengine - INFO - Iter(train) [108800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:58:09 time: 2.0133 data_time: 0.0185 memory: 33850 grad_norm: nan loss: 9.3899 loss_cls: 0.2809 loss_bbox: 0.1433 loss_iou: 0.2396 d0.loss_cls: 0.3220 d0.loss_bbox: 0.1589 d0.loss_iou: 0.2574 d1.loss_cls: 0.3043 d1.loss_bbox: 0.1474 d1.loss_iou: 0.2402 d2.loss_cls: 0.2929 d2.loss_bbox: 0.1451 d2.loss_iou: 0.2405 d3.loss_cls: 0.2860 d3.loss_bbox: 0.1446 d3.loss_iou: 0.2411 d4.loss_cls: 0.2835 d4.loss_bbox: 0.1417 d4.loss_iou: 0.2395 enc_loss_cls: 0.3401 enc_loss_bbox: 0.1704 enc_loss_iou: 0.2743 dn_loss_cls: 0.0910 dn_loss_bbox: 0.1854 dn_loss_iou: 0.2241 d0.dn_loss_cls: 0.1696 d0.dn_loss_bbox: 0.3498 d0.dn_loss_iou: 0.3715 d1.dn_loss_cls: 0.1195 d1.dn_loss_bbox: 0.2221 d1.dn_loss_iou: 0.2551 d2.dn_loss_cls: 0.1015 d2.dn_loss_bbox: 0.1995 d2.dn_loss_iou: 0.2351 d3.dn_loss_cls: 0.0981 d3.dn_loss_bbox: 0.1894 d3.dn_loss_iou: 0.2270 d4.dn_loss_cls: 0.0915 d4.dn_loss_bbox: 0.1855 d4.dn_loss_iou: 0.2241 d1.loss_lmm_region: 0.1298 loss_lmm_image: 0.8265 2024/11/13 01:58:47 - mmengine - INFO - Iter(train) [108900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:54:48 time: 1.9965 data_time: 0.0185 memory: 33704 grad_norm: 28.6322 loss: 9.5224 loss_cls: 0.2794 loss_bbox: 0.1475 loss_iou: 0.2780 d0.loss_cls: 0.3124 d0.loss_bbox: 0.1580 d0.loss_iou: 0.2950 d1.loss_cls: 0.2922 d1.loss_bbox: 0.1526 d1.loss_iou: 0.2853 d2.loss_cls: 
0.2860 d2.loss_bbox: 0.1478 d2.loss_iou: 0.2797 d3.loss_cls: 0.2826 d3.loss_bbox: 0.1460 d3.loss_iou: 0.2779 d4.loss_cls: 0.2812 d4.loss_bbox: 0.1484 d4.loss_iou: 0.2778 enc_loss_cls: 0.3255 enc_loss_bbox: 0.1689 enc_loss_iou: 0.3157 dn_loss_cls: 0.0858 dn_loss_bbox: 0.1714 dn_loss_iou: 0.2229 d0.dn_loss_cls: 0.1756 d0.dn_loss_bbox: 0.3253 d0.dn_loss_iou: 0.3731 d1.dn_loss_cls: 0.1223 d1.dn_loss_bbox: 0.2068 d1.dn_loss_iou: 0.2559 d2.dn_loss_cls: 0.0993 d2.dn_loss_bbox: 0.1809 d2.dn_loss_iou: 0.2326 d3.dn_loss_cls: 0.0894 d3.dn_loss_bbox: 0.1748 d3.dn_loss_iou: 0.2256 d4.dn_loss_cls: 0.0872 d4.dn_loss_bbox: 0.1714 d4.dn_loss_iou: 0.2230 d1.loss_lmm_region: 0.1138 loss_lmm_image: 0.8473 2024/11/13 02:02:05 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 02:02:05 - mmengine - INFO - Iter(train) [109000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:51:27 time: 1.9765 data_time: 0.0184 memory: 35697 grad_norm: 28.9301 loss: 9.2250 loss_cls: 0.2810 loss_bbox: 0.1387 loss_iou: 0.2439 d0.loss_cls: 0.3203 d0.loss_bbox: 0.1580 d0.loss_iou: 0.2622 d1.loss_cls: 0.3007 d1.loss_bbox: 0.1434 d1.loss_iou: 0.2495 d2.loss_cls: 0.2907 d2.loss_bbox: 0.1404 d2.loss_iou: 0.2466 d3.loss_cls: 0.2900 d3.loss_bbox: 0.1321 d3.loss_iou: 0.2409 d4.loss_cls: 0.2802 d4.loss_bbox: 0.1393 d4.loss_iou: 0.2436 enc_loss_cls: 0.3270 enc_loss_bbox: 0.1719 enc_loss_iou: 0.2828 dn_loss_cls: 0.0969 dn_loss_bbox: 0.1709 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.1751 d0.dn_loss_bbox: 0.3029 d0.dn_loss_iou: 0.3545 d1.dn_loss_cls: 0.1257 d1.dn_loss_bbox: 0.1962 d1.dn_loss_iou: 0.2492 d2.dn_loss_cls: 0.1074 d2.dn_loss_bbox: 0.1757 d2.dn_loss_iou: 0.2291 d3.dn_loss_cls: 0.1014 d3.dn_loss_bbox: 0.1713 d3.dn_loss_iou: 0.2227 d4.dn_loss_cls: 0.0973 d4.dn_loss_bbox: 0.1707 d4.dn_loss_iou: 0.2203 d1.loss_lmm_region: 0.1319 loss_lmm_image: 0.8222 2024/11/13 02:05:26 - mmengine - INFO - Iter(train) [109100/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:48:06 time: 2.0165 data_time: 0.0183 memory: 35172 grad_norm: 29.0742 loss: 8.3235 loss_cls: 0.2603 loss_bbox: 0.1137 loss_iou: 0.2090 d0.loss_cls: 0.2888 d0.loss_bbox: 0.1272 d0.loss_iou: 0.2237 d1.loss_cls: 0.2725 d1.loss_bbox: 0.1205 d1.loss_iou: 0.2170 d2.loss_cls: 0.2659 d2.loss_bbox: 0.1172 d2.loss_iou: 0.2120 d3.loss_cls: 0.2630 d3.loss_bbox: 0.1140 d3.loss_iou: 0.2088 d4.loss_cls: 0.2617 d4.loss_bbox: 0.1133 d4.loss_iou: 0.2086 enc_loss_cls: 0.2988 enc_loss_bbox: 0.1419 enc_loss_iou: 0.2418 dn_loss_cls: 0.0852 dn_loss_bbox: 0.1512 dn_loss_iou: 0.2035 d0.dn_loss_cls: 0.1648 d0.dn_loss_bbox: 0.2812 d0.dn_loss_iou: 0.3359 d1.dn_loss_cls: 0.1119 d1.dn_loss_bbox: 0.1806 d1.dn_loss_iou: 0.2300 d2.dn_loss_cls: 0.0967 d2.dn_loss_bbox: 0.1601 d2.dn_loss_iou: 0.2112 d3.dn_loss_cls: 0.0889 d3.dn_loss_bbox: 0.1533 d3.dn_loss_iou: 0.2055 d4.dn_loss_cls: 0.0859 d4.dn_loss_bbox: 0.1512 d4.dn_loss_iou: 0.2036 d1.loss_lmm_region: 0.1347 loss_lmm_image: 0.8088 2024/11/13 02:08:46 - mmengine - INFO - Iter(train) [109200/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:44:45 time: 1.9974 data_time: 0.0185 memory: 32076 grad_norm: 29.7515 loss: 9.8456 loss_cls: 0.3037 loss_bbox: 0.1502 loss_iou: 0.2506 d0.loss_cls: 0.3420 d0.loss_bbox: 0.1643 d0.loss_iou: 0.2649 d1.loss_cls: 0.3207 d1.loss_bbox: 0.1531 d1.loss_iou: 0.2554 d2.loss_cls: 0.3047 d2.loss_bbox: 0.1491 d2.loss_iou: 0.2525 d3.loss_cls: 0.3027 d3.loss_bbox: 0.1513 d3.loss_iou: 0.2515 d4.loss_cls: 0.2983 d4.loss_bbox: 0.1540 d4.loss_iou: 0.2537 enc_loss_cls: 0.3467 enc_loss_bbox: 0.1802 enc_loss_iou: 0.2877 
dn_loss_cls: 0.1287 dn_loss_bbox: 0.1827 dn_loss_iou: 0.2194 d0.dn_loss_cls: 0.2115 d0.dn_loss_bbox: 0.3468 d0.dn_loss_iou: 0.3631 d1.dn_loss_cls: 0.1604 d1.dn_loss_bbox: 0.2227 d1.dn_loss_iou: 0.2522 d2.dn_loss_cls: 0.1437 d2.dn_loss_bbox: 0.1948 d2.dn_loss_iou: 0.2290 d3.dn_loss_cls: 0.1342 d3.dn_loss_bbox: 0.1839 d3.dn_loss_iou: 0.2213 d4.dn_loss_cls: 0.1262 d4.dn_loss_bbox: 0.1827 d4.dn_loss_iou: 0.2193 d1.loss_lmm_region: 0.1643 loss_lmm_image: 0.8214 2024/11/13 02:12:05 - mmengine - INFO - Iter(train) [109300/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:41:24 time: 2.0106 data_time: 0.0184 memory: 34190 grad_norm: 32.6299 loss: 9.2160 loss_cls: 0.2719 loss_bbox: 0.1404 loss_iou: 0.2381 d0.loss_cls: 0.3132 d0.loss_bbox: 0.1531 d0.loss_iou: 0.2479 d1.loss_cls: 0.2888 d1.loss_bbox: 0.1519 d1.loss_iou: 0.2444 d2.loss_cls: 0.2850 d2.loss_bbox: 0.1469 d2.loss_iou: 0.2409 d3.loss_cls: 0.2759 d3.loss_bbox: 0.1421 d3.loss_iou: 0.2396 d4.loss_cls: 0.2732 d4.loss_bbox: 0.1396 d4.loss_iou: 0.2380 enc_loss_cls: 0.3100 enc_loss_bbox: 0.1722 enc_loss_iou: 0.2668 dn_loss_cls: 0.1078 dn_loss_bbox: 0.1728 dn_loss_iou: 0.2200 d0.dn_loss_cls: 0.1799 d0.dn_loss_bbox: 0.3192 d0.dn_loss_iou: 0.3567 d1.dn_loss_cls: 0.1329 d1.dn_loss_bbox: 0.2016 d1.dn_loss_iou: 0.2463 d2.dn_loss_cls: 0.1152 d2.dn_loss_bbox: 0.1815 d2.dn_loss_iou: 0.2283 d3.dn_loss_cls: 0.1098 d3.dn_loss_bbox: 0.1745 d3.dn_loss_iou: 0.2223 d4.dn_loss_cls: 0.1083 d4.dn_loss_bbox: 0.1729 d4.dn_loss_iou: 0.2199 d1.loss_lmm_region: 0.1289 loss_lmm_image: 0.8375 2024/11/13 02:15:26 - mmengine - INFO - Iter(train) [109400/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:38:03 time: 2.0035 data_time: 0.0185 memory: 35560 grad_norm: 26.3279 loss: 9.1494 loss_cls: 0.3289 loss_bbox: 0.1190 loss_iou: 0.2174 d0.loss_cls: 0.3618 d0.loss_bbox: 0.1386 d0.loss_iou: 0.2382 d1.loss_cls: 0.3461 d1.loss_bbox: 0.1270 d1.loss_iou: 0.2242 d2.loss_cls: 0.3387 d2.loss_bbox: 0.1213 d2.loss_iou: 0.2178 d3.loss_cls: 0.3322 d3.loss_bbox: 0.1184 d3.loss_iou: 0.2153 d4.loss_cls: 0.3328 d4.loss_bbox: 0.1176 d4.loss_iou: 0.2152 enc_loss_cls: 0.3763 enc_loss_bbox: 0.1491 enc_loss_iou: 0.2573 dn_loss_cls: 0.1221 dn_loss_bbox: 0.1496 dn_loss_iou: 0.1990 d0.dn_loss_cls: 0.2076 d0.dn_loss_bbox: 0.2878 d0.dn_loss_iou: 0.3318 d1.dn_loss_cls: 0.1541 d1.dn_loss_bbox: 0.1717 d1.dn_loss_iou: 0.2250 d2.dn_loss_cls: 0.1339 d2.dn_loss_bbox: 0.1556 d2.dn_loss_iou: 0.2066 d3.dn_loss_cls: 0.1283 d3.dn_loss_bbox: 0.1505 d3.dn_loss_iou: 0.2010 d4.dn_loss_cls: 0.1224 d4.dn_loss_bbox: 0.1495 d4.dn_loss_iou: 0.1990 d1.loss_lmm_region: 0.1335 loss_lmm_image: 0.8269 2024/11/13 02:18:46 - mmengine - INFO - Iter(train) [109500/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:34:42 time: 1.9921 data_time: 0.0184 memory: 31809 grad_norm: 29.5669 loss: 7.9677 loss_cls: 0.2279 loss_bbox: 0.1127 loss_iou: 0.1981 d0.loss_cls: 0.2572 d0.loss_bbox: 0.1272 d0.loss_iou: 0.2171 d1.loss_cls: 0.2449 d1.loss_bbox: 0.1150 d1.loss_iou: 0.2038 d2.loss_cls: 0.2360 d2.loss_bbox: 0.1135 d2.loss_iou: 0.2008 d3.loss_cls: 0.2304 d3.loss_bbox: 0.1126 d3.loss_iou: 0.1999 d4.loss_cls: 0.2280 d4.loss_bbox: 0.1138 d4.loss_iou: 0.2001 enc_loss_cls: 0.2644 enc_loss_bbox: 0.1378 enc_loss_iou: 0.2354 dn_loss_cls: 0.0898 dn_loss_bbox: 0.1491 dn_loss_iou: 0.1987 d0.dn_loss_cls: 0.1596 d0.dn_loss_bbox: 0.2934 d0.dn_loss_iou: 0.3365 d1.dn_loss_cls: 0.1179 d1.dn_loss_bbox: 0.1830 d1.dn_loss_iou: 0.2306 d2.dn_loss_cls: 0.0980 d2.dn_loss_bbox: 0.1589 d2.dn_loss_iou: 0.2084 d3.dn_loss_cls: 0.0935 d3.dn_loss_bbox: 0.1514 
d3.dn_loss_iou: 0.2008 d4.dn_loss_cls: 0.0891 d4.dn_loss_bbox: 0.1492 d4.dn_loss_iou: 0.1986 d1.loss_lmm_region: 0.0993 loss_lmm_image: 0.7855 2024/11/13 02:22:06 - mmengine - INFO - Iter(train) [109600/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:31:21 time: 1.9942 data_time: 0.0185 memory: 33533 grad_norm: 25.0791 loss: 9.3242 loss_cls: 0.2717 loss_bbox: 0.1436 loss_iou: 0.2617 d0.loss_cls: 0.3169 d0.loss_bbox: 0.1535 d0.loss_iou: 0.2776 d1.loss_cls: 0.2919 d1.loss_bbox: 0.1438 d1.loss_iou: 0.2673 d2.loss_cls: 0.2818 d2.loss_bbox: 0.1447 d2.loss_iou: 0.2636 d3.loss_cls: 0.2790 d3.loss_bbox: 0.1409 d3.loss_iou: 0.2599 d4.loss_cls: 0.2735 d4.loss_bbox: 0.1433 d4.loss_iou: 0.2623 enc_loss_cls: 0.3247 enc_loss_bbox: 0.1677 enc_loss_iou: 0.3010 dn_loss_cls: 0.0882 dn_loss_bbox: 0.1641 dn_loss_iou: 0.2213 d0.dn_loss_cls: 0.1696 d0.dn_loss_bbox: 0.3092 d0.dn_loss_iou: 0.3705 d1.dn_loss_cls: 0.1193 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2542 d2.dn_loss_cls: 0.1003 d2.dn_loss_bbox: 0.1747 d2.dn_loss_iou: 0.2316 d3.dn_loss_cls: 0.0920 d3.dn_loss_bbox: 0.1678 d3.dn_loss_iou: 0.2243 d4.dn_loss_cls: 0.0896 d4.dn_loss_bbox: 0.1641 d4.dn_loss_iou: 0.2214 d1.loss_lmm_region: 0.1243 loss_lmm_image: 0.8699 2024/11/13 02:25:25 - mmengine - INFO - Iter(train) [109700/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:28:00 time: 1.9937 data_time: 0.0185 memory: 35005 grad_norm: 31.9945 loss: 9.1768 loss_cls: 0.2786 loss_bbox: 0.1391 loss_iou: 0.2588 d0.loss_cls: 0.3241 d0.loss_bbox: 0.1529 d0.loss_iou: 0.2716 d1.loss_cls: 0.2973 d1.loss_bbox: 0.1453 d1.loss_iou: 0.2617 d2.loss_cls: 0.2887 d2.loss_bbox: 0.1406 d2.loss_iou: 0.2594 d3.loss_cls: 0.2845 d3.loss_bbox: 0.1399 d3.loss_iou: 0.2574 d4.loss_cls: 0.2792 d4.loss_bbox: 0.1394 d4.loss_iou: 0.2589 enc_loss_cls: 0.3339 enc_loss_bbox: 0.1652 enc_loss_iou: 0.2909 dn_loss_cls: 0.0912 dn_loss_bbox: 0.1586 dn_loss_iou: 0.2073 d0.dn_loss_cls: 0.1726 d0.dn_loss_bbox: 0.2984 d0.dn_loss_iou: 0.3461 d1.dn_loss_cls: 0.1224 d1.dn_loss_bbox: 0.1868 d1.dn_loss_iou: 0.2351 d2.dn_loss_cls: 0.1027 d2.dn_loss_bbox: 0.1691 d2.dn_loss_iou: 0.2163 d3.dn_loss_cls: 0.0962 d3.dn_loss_bbox: 0.1612 d3.dn_loss_iou: 0.2094 d4.dn_loss_cls: 0.0919 d4.dn_loss_bbox: 0.1585 d4.dn_loss_iou: 0.2074 d1.loss_lmm_region: 0.1140 loss_lmm_image: 0.8646 2024/11/13 02:28:45 - mmengine - INFO - Iter(train) [109800/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:24:39 time: 2.0089 data_time: 0.0186 memory: 34573 grad_norm: 27.5167 loss: 8.5400 loss_cls: 0.2601 loss_bbox: 0.1239 loss_iou: 0.2388 d0.loss_cls: 0.3027 d0.loss_bbox: 0.1333 d0.loss_iou: 0.2512 d1.loss_cls: 0.2741 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2475 d2.loss_cls: 0.2679 d2.loss_bbox: 0.1241 d2.loss_iou: 0.2418 d3.loss_cls: 0.2663 d3.loss_bbox: 0.1250 d3.loss_iou: 0.2375 d4.loss_cls: 0.2628 d4.loss_bbox: 0.1226 d4.loss_iou: 0.2388 enc_loss_cls: 0.3070 enc_loss_bbox: 0.1431 enc_loss_iou: 0.2715 dn_loss_cls: 0.0874 dn_loss_bbox: 0.1497 dn_loss_iou: 0.1972 d0.dn_loss_cls: 0.1517 d0.dn_loss_bbox: 0.2677 d0.dn_loss_iou: 0.3290 d1.dn_loss_cls: 0.1086 d1.dn_loss_bbox: 0.1754 d1.dn_loss_iou: 0.2243 d2.dn_loss_cls: 0.0947 d2.dn_loss_bbox: 0.1569 d2.dn_loss_iou: 0.2049 d3.dn_loss_cls: 0.0923 d3.dn_loss_bbox: 0.1510 d3.dn_loss_iou: 0.1992 d4.dn_loss_cls: 0.0878 d4.dn_loss_bbox: 0.1497 d4.dn_loss_iou: 0.1973 d1.loss_lmm_region: 0.1097 loss_lmm_image: 0.8378 2024/11/13 02:32:03 - mmengine - INFO - Iter(train) [109900/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:21:17 time: 1.9860 data_time: 0.0184 memory: 34998 grad_norm: 31.6362 
loss: 7.9779 loss_cls: 0.2518 loss_bbox: 0.0987 loss_iou: 0.1805 d0.loss_cls: 0.2845 d0.loss_bbox: 0.1109 d0.loss_iou: 0.1926 d1.loss_cls: 0.2647 d1.loss_bbox: 0.1044 d1.loss_iou: 0.1876 d2.loss_cls: 0.2607 d2.loss_bbox: 0.1001 d2.loss_iou: 0.1835 d3.loss_cls: 0.2542 d3.loss_bbox: 0.1011 d3.loss_iou: 0.1833 d4.loss_cls: 0.2508 d4.loss_bbox: 0.0995 d4.loss_iou: 0.1802 enc_loss_cls: 0.2900 enc_loss_bbox: 0.1211 enc_loss_iou: 0.2145 dn_loss_cls: 0.1247 dn_loss_bbox: 0.1338 dn_loss_iou: 0.1728 d0.dn_loss_cls: 0.2006 d0.dn_loss_bbox: 0.2836 d0.dn_loss_iou: 0.3113 d1.dn_loss_cls: 0.1505 d1.dn_loss_bbox: 0.1664 d1.dn_loss_iou: 0.2023 d2.dn_loss_cls: 0.1362 d2.dn_loss_bbox: 0.1461 d2.dn_loss_iou: 0.1825 d3.dn_loss_cls: 0.1283 d3.dn_loss_bbox: 0.1369 d3.dn_loss_iou: 0.1759 d4.dn_loss_cls: 0.1246 d4.dn_loss_bbox: 0.1338 d4.dn_loss_iou: 0.1727 d1.loss_lmm_region: 0.1265 loss_lmm_image: 0.8536 2024/11/13 02:35:25 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 02:35:25 - mmengine - INFO - Iter(train) [110000/150000] base_lr: 1.0000e-04 lr: 1.0000e-04 eta: 22:17:57 time: 2.0388 data_time: 0.0185 memory: 35082 grad_norm: 28.1601 loss: 8.3588 loss_cls: 0.2410 loss_bbox: 0.1176 loss_iou: 0.2262 d0.loss_cls: 0.2818 d0.loss_bbox: 0.1286 d0.loss_iou: 0.2382 d1.loss_cls: 0.2582 d1.loss_bbox: 0.1222 d1.loss_iou: 0.2308 d2.loss_cls: 0.2536 d2.loss_bbox: 0.1162 d2.loss_iou: 0.2262 d3.loss_cls: 0.2454 d3.loss_bbox: 0.1160 d3.loss_iou: 0.2259 d4.loss_cls: 0.2447 d4.loss_bbox: 0.1147 d4.loss_iou: 0.2246 enc_loss_cls: 0.2863 enc_loss_bbox: 0.1430 enc_loss_iou: 0.2639 dn_loss_cls: 0.0730 dn_loss_bbox: 0.1483 dn_loss_iou: 0.2087 d0.dn_loss_cls: 0.1575 d0.dn_loss_bbox: 0.2964 d0.dn_loss_iou: 0.3480 d1.dn_loss_cls: 0.1059 d1.dn_loss_bbox: 0.1818 d1.dn_loss_iou: 0.2376 d2.dn_loss_cls: 0.0842 d2.dn_loss_bbox: 0.1576 d2.dn_loss_iou: 0.2172 d3.dn_loss_cls: 0.0752 d3.dn_loss_bbox: 0.1495 d3.dn_loss_iou: 0.2105 d4.dn_loss_cls: 0.0731 d4.dn_loss_bbox: 0.1483 d4.dn_loss_iou: 0.2088 d1.loss_lmm_region: 0.1278 loss_lmm_image: 0.8443 2024/11/13 02:38:46 - mmengine - INFO - Iter(train) [110100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 22:14:36 time: 2.0192 data_time: 0.0187 memory: 34514 grad_norm: 24.7105 loss: 10.9873 loss_cls: 0.3625 loss_bbox: 0.1977 loss_iou: 0.3590 d0.loss_cls: 0.4101 d0.loss_bbox: 0.2060 d0.loss_iou: 0.3746 d1.loss_cls: 0.3862 d1.loss_bbox: 0.2025 d1.loss_iou: 0.3655 d2.loss_cls: 0.3717 d2.loss_bbox: 0.2011 d2.loss_iou: 0.3614 d3.loss_cls: 0.3653 d3.loss_bbox: 0.1977 d3.loss_iou: 0.3582 d4.loss_cls: 0.3593 d4.loss_bbox: 0.2005 d4.loss_iou: 0.3608 enc_loss_cls: 0.4190 enc_loss_bbox: 0.2152 enc_loss_iou: 0.3905 dn_loss_cls: 0.0905 dn_loss_bbox: 0.1611 dn_loss_iou: 0.2330 d0.dn_loss_cls: 0.1725 d0.dn_loss_bbox: 0.3035 d0.dn_loss_iou: 0.3767 d1.dn_loss_cls: 0.1233 d1.dn_loss_bbox: 0.1956 d1.dn_loss_iou: 0.2646 d2.dn_loss_cls: 0.1061 d2.dn_loss_bbox: 0.1678 d2.dn_loss_iou: 0.2419 d3.dn_loss_cls: 0.0952 d3.dn_loss_bbox: 0.1617 d3.dn_loss_iou: 0.2354 d4.dn_loss_cls: 0.0908 d4.dn_loss_bbox: 0.1610 d4.dn_loss_iou: 0.2331 d1.loss_lmm_region: 0.1136 loss_lmm_image: 0.7953 2024/11/13 02:42:06 - mmengine - INFO - Iter(train) [110200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 22:11:15 time: 2.0072 data_time: 0.0185 memory: 33929 grad_norm: 22.5938 loss: 9.0396 loss_cls: 0.2879 loss_bbox: 0.1310 loss_iou: 0.2493 d0.loss_cls: 0.3184 d0.loss_bbox: 0.1425 d0.loss_iou: 0.2634 d1.loss_cls: 0.2972 d1.loss_bbox: 0.1393 d1.loss_iou: 0.2579 d2.loss_cls: 0.2876 d2.loss_bbox: 0.1379 
d2.loss_iou: 0.2535 d3.loss_cls: 0.2893 d3.loss_bbox: 0.1322 d3.loss_iou: 0.2494 d4.loss_cls: 0.2887 d4.loss_bbox: 0.1306 d4.loss_iou: 0.2497 enc_loss_cls: 0.3262 enc_loss_bbox: 0.1561 enc_loss_iou: 0.2791 dn_loss_cls: 0.0935 dn_loss_bbox: 0.1560 dn_loss_iou: 0.2123 d0.dn_loss_cls: 0.1729 d0.dn_loss_bbox: 0.2848 d0.dn_loss_iou: 0.3400 d1.dn_loss_cls: 0.1275 d1.dn_loss_bbox: 0.1854 d1.dn_loss_iou: 0.2409 d2.dn_loss_cls: 0.1050 d2.dn_loss_bbox: 0.1679 d2.dn_loss_iou: 0.2224 d3.dn_loss_cls: 0.0973 d3.dn_loss_bbox: 0.1586 d3.dn_loss_iou: 0.2151 d4.dn_loss_cls: 0.0945 d4.dn_loss_bbox: 0.1560 d4.dn_loss_iou: 0.2125 d1.loss_lmm_region: 0.1185 loss_lmm_image: 0.8111 2024/11/13 02:45:25 - mmengine - INFO - Iter(train) [110300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 22:07:54 time: 1.9932 data_time: 0.0187 memory: 34770 grad_norm: 28.1231 loss: 8.5804 loss_cls: 0.2863 loss_bbox: 0.1270 loss_iou: 0.2169 d0.loss_cls: 0.3223 d0.loss_bbox: 0.1381 d0.loss_iou: 0.2293 d1.loss_cls: 0.3053 d1.loss_bbox: 0.1299 d1.loss_iou: 0.2187 d2.loss_cls: 0.2960 d2.loss_bbox: 0.1286 d2.loss_iou: 0.2177 d3.loss_cls: 0.2910 d3.loss_bbox: 0.1270 d3.loss_iou: 0.2161 d4.loss_cls: 0.2860 d4.loss_bbox: 0.1299 d4.loss_iou: 0.2173 enc_loss_cls: 0.3285 enc_loss_bbox: 0.1519 enc_loss_iou: 0.2492 dn_loss_cls: 0.0862 dn_loss_bbox: 0.1444 dn_loss_iou: 0.1925 d0.dn_loss_cls: 0.1645 d0.dn_loss_bbox: 0.2747 d0.dn_loss_iou: 0.3233 d1.dn_loss_cls: 0.1152 d1.dn_loss_bbox: 0.1740 d1.dn_loss_iou: 0.2208 d2.dn_loss_cls: 0.0988 d2.dn_loss_bbox: 0.1545 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.0896 d3.dn_loss_bbox: 0.1470 d3.dn_loss_iou: 0.1946 d4.dn_loss_cls: 0.0861 d4.dn_loss_bbox: 0.1445 d4.dn_loss_iou: 0.1925 d1.loss_lmm_region: 0.1367 loss_lmm_image: 0.8255 2024/11/13 02:48:46 - mmengine - INFO - Iter(train) [110400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 22:04:34 time: 2.0178 data_time: 0.0187 memory: 34226 grad_norm: 26.3796 loss: 8.9872 loss_cls: 0.2917 loss_bbox: 0.1404 loss_iou: 0.2560 d0.loss_cls: 0.3346 d0.loss_bbox: 0.1488 d0.loss_iou: 0.2691 d1.loss_cls: 0.3151 d1.loss_bbox: 0.1434 d1.loss_iou: 0.2604 d2.loss_cls: 0.3020 d2.loss_bbox: 0.1404 d2.loss_iou: 0.2581 d3.loss_cls: 0.2985 d3.loss_bbox: 0.1395 d3.loss_iou: 0.2565 d4.loss_cls: 0.2952 d4.loss_bbox: 0.1396 d4.loss_iou: 0.2562 enc_loss_cls: 0.3364 enc_loss_bbox: 0.1630 enc_loss_iou: 0.2828 dn_loss_cls: 0.0920 dn_loss_bbox: 0.1501 dn_loss_iou: 0.1963 d0.dn_loss_cls: 0.1727 d0.dn_loss_bbox: 0.2715 d0.dn_loss_iou: 0.3170 d1.dn_loss_cls: 0.1197 d1.dn_loss_bbox: 0.1719 d1.dn_loss_iou: 0.2201 d2.dn_loss_cls: 0.1043 d2.dn_loss_bbox: 0.1544 d2.dn_loss_iou: 0.2025 d3.dn_loss_cls: 0.0960 d3.dn_loss_bbox: 0.1513 d3.dn_loss_iou: 0.1980 d4.dn_loss_cls: 0.0947 d4.dn_loss_bbox: 0.1500 d4.dn_loss_iou: 0.1964 d1.loss_lmm_region: 0.1344 loss_lmm_image: 0.7662 2024/11/13 02:52:05 - mmengine - INFO - Iter(train) [110500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 22:01:12 time: 1.9634 data_time: 0.0186 memory: 34696 grad_norm: 26.4776 loss: 8.1866 loss_cls: 0.2471 loss_bbox: 0.1226 loss_iou: 0.2094 d0.loss_cls: 0.2817 d0.loss_bbox: 0.1290 d0.loss_iou: 0.2224 d1.loss_cls: 0.2621 d1.loss_bbox: 0.1236 d1.loss_iou: 0.2095 d2.loss_cls: 0.2530 d2.loss_bbox: 0.1230 d2.loss_iou: 0.2094 d3.loss_cls: 0.2510 d3.loss_bbox: 0.1211 d3.loss_iou: 0.2078 d4.loss_cls: 0.2504 d4.loss_bbox: 0.1198 d4.loss_iou: 0.2066 enc_loss_cls: 0.2894 enc_loss_bbox: 0.1388 enc_loss_iou: 0.2387 dn_loss_cls: 0.0654 dn_loss_bbox: 0.1599 dn_loss_iou: 0.2062 d0.dn_loss_cls: 0.1468 d0.dn_loss_bbox: 0.3069 
d0.dn_loss_iou: 0.3438 d1.dn_loss_cls: 0.0974 d1.dn_loss_bbox: 0.1901 d1.dn_loss_iou: 0.2346 d2.dn_loss_cls: 0.0787 d2.dn_loss_bbox: 0.1691 d2.dn_loss_iou: 0.2162 d3.dn_loss_cls: 0.0721 d3.dn_loss_bbox: 0.1626 d3.dn_loss_iou: 0.2089 d4.dn_loss_cls: 0.0658 d4.dn_loss_bbox: 0.1599 d4.dn_loss_iou: 0.2062 d1.loss_lmm_region: 0.0995 loss_lmm_image: 0.7801 2024/11/13 02:55:25 - mmengine - INFO - Iter(train) [110600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:57:51 time: 1.9800 data_time: 0.0186 memory: 35096 grad_norm: 35.2394 loss: 9.6271 loss_cls: 0.3077 loss_bbox: 0.1548 loss_iou: 0.2759 d0.loss_cls: 0.3468 d0.loss_bbox: 0.1690 d0.loss_iou: 0.2946 d1.loss_cls: 0.3213 d1.loss_bbox: 0.1611 d1.loss_iou: 0.2838 d2.loss_cls: 0.3156 d2.loss_bbox: 0.1570 d2.loss_iou: 0.2787 d3.loss_cls: 0.3082 d3.loss_bbox: 0.1569 d3.loss_iou: 0.2783 d4.loss_cls: 0.3086 d4.loss_bbox: 0.1546 d4.loss_iou: 0.2763 enc_loss_cls: 0.3564 enc_loss_bbox: 0.1797 enc_loss_iou: 0.3049 dn_loss_cls: 0.0937 dn_loss_bbox: 0.1640 dn_loss_iou: 0.2193 d0.dn_loss_cls: 0.1706 d0.dn_loss_bbox: 0.3038 d0.dn_loss_iou: 0.3484 d1.dn_loss_cls: 0.1228 d1.dn_loss_bbox: 0.1858 d1.dn_loss_iou: 0.2420 d2.dn_loss_cls: 0.1037 d2.dn_loss_bbox: 0.1722 d2.dn_loss_iou: 0.2258 d3.dn_loss_cls: 0.0996 d3.dn_loss_bbox: 0.1658 d3.dn_loss_iou: 0.2204 d4.dn_loss_cls: 0.0942 d4.dn_loss_bbox: 0.1640 d4.dn_loss_iou: 0.2192 d1.loss_lmm_region: 0.1220 loss_lmm_image: 0.7996 2024/11/13 02:58:43 - mmengine - INFO - Iter(train) [110700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:54:30 time: 1.9660 data_time: 0.0184 memory: 33740 grad_norm: 37.7119 loss: 8.4488 loss_cls: 0.2660 loss_bbox: 0.1240 loss_iou: 0.2270 d0.loss_cls: 0.3041 d0.loss_bbox: 0.1296 d0.loss_iou: 0.2391 d1.loss_cls: 0.2847 d1.loss_bbox: 0.1235 d1.loss_iou: 0.2344 d2.loss_cls: 0.2773 d2.loss_bbox: 0.1199 d2.loss_iou: 0.2297 d3.loss_cls: 0.2719 d3.loss_bbox: 0.1220 d3.loss_iou: 0.2268 d4.loss_cls: 0.2686 d4.loss_bbox: 0.1217 d4.loss_iou: 0.2253 enc_loss_cls: 0.3094 enc_loss_bbox: 0.1438 enc_loss_iou: 0.2601 dn_loss_cls: 0.1035 dn_loss_bbox: 0.1430 dn_loss_iou: 0.1871 d0.dn_loss_cls: 0.1753 d0.dn_loss_bbox: 0.2870 d0.dn_loss_iou: 0.3172 d1.dn_loss_cls: 0.1257 d1.dn_loss_bbox: 0.1742 d1.dn_loss_iou: 0.2138 d2.dn_loss_cls: 0.1106 d2.dn_loss_bbox: 0.1539 d2.dn_loss_iou: 0.1957 d3.dn_loss_cls: 0.1042 d3.dn_loss_bbox: 0.1456 d3.dn_loss_iou: 0.1892 d4.dn_loss_cls: 0.1042 d4.dn_loss_bbox: 0.1431 d4.dn_loss_iou: 0.1871 d1.loss_lmm_region: 0.0931 loss_lmm_image: 0.7865 2024/11/13 03:02:01 - mmengine - INFO - Iter(train) [110800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:51:08 time: 1.9678 data_time: 0.0187 memory: 33358 grad_norm: 24.1822 loss: 8.9555 loss_cls: 0.3143 loss_bbox: 0.1253 loss_iou: 0.2251 d0.loss_cls: 0.3502 d0.loss_bbox: 0.1403 d0.loss_iou: 0.2396 d1.loss_cls: 0.3267 d1.loss_bbox: 0.1292 d1.loss_iou: 0.2292 d2.loss_cls: 0.3180 d2.loss_bbox: 0.1287 d2.loss_iou: 0.2294 d3.loss_cls: 0.3178 d3.loss_bbox: 0.1251 d3.loss_iou: 0.2260 d4.loss_cls: 0.3161 d4.loss_bbox: 0.1237 d4.loss_iou: 0.2238 enc_loss_cls: 0.3567 enc_loss_bbox: 0.1503 enc_loss_iou: 0.2590 dn_loss_cls: 0.1150 dn_loss_bbox: 0.1509 dn_loss_iou: 0.1902 d0.dn_loss_cls: 0.1907 d0.dn_loss_bbox: 0.2848 d0.dn_loss_iou: 0.3186 d1.dn_loss_cls: 0.1448 d1.dn_loss_bbox: 0.1857 d1.dn_loss_iou: 0.2193 d2.dn_loss_cls: 0.1269 d2.dn_loss_bbox: 0.1636 d2.dn_loss_iou: 0.2003 d3.dn_loss_cls: 0.1192 d3.dn_loss_bbox: 0.1525 d3.dn_loss_iou: 0.1926 d4.dn_loss_cls: 0.1167 d4.dn_loss_bbox: 0.1509 d4.dn_loss_iou: 0.1902 d1.loss_lmm_region: 
0.1116 loss_lmm_image: 0.7768 2024/11/13 03:05:23 - mmengine - INFO - Iter(train) [110900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:47:48 time: 2.0383 data_time: 0.0187 memory: 34001 grad_norm: 25.9471 loss: 9.7086 loss_cls: 0.3279 loss_bbox: 0.1549 loss_iou: 0.2753 d0.loss_cls: 0.3822 d0.loss_bbox: 0.1573 d0.loss_iou: 0.2820 d1.loss_cls: 0.3631 d1.loss_bbox: 0.1530 d1.loss_iou: 0.2754 d2.loss_cls: 0.3418 d2.loss_bbox: 0.1568 d2.loss_iou: 0.2768 d3.loss_cls: 0.3292 d3.loss_bbox: 0.1575 d3.loss_iou: 0.2758 d4.loss_cls: 0.3289 d4.loss_bbox: 0.1542 d4.loss_iou: 0.2742 enc_loss_cls: 0.3818 enc_loss_bbox: 0.1731 enc_loss_iou: 0.3052 dn_loss_cls: 0.1171 dn_loss_bbox: 0.1525 dn_loss_iou: 0.2077 d0.dn_loss_cls: 0.1889 d0.dn_loss_bbox: 0.2832 d0.dn_loss_iou: 0.3395 d1.dn_loss_cls: 0.1419 d1.dn_loss_bbox: 0.1797 d1.dn_loss_iou: 0.2331 d2.dn_loss_cls: 0.1183 d2.dn_loss_bbox: 0.1590 d2.dn_loss_iou: 0.2139 d3.dn_loss_cls: 0.1155 d3.dn_loss_bbox: 0.1531 d3.dn_loss_iou: 0.2087 d4.dn_loss_cls: 0.1165 d4.dn_loss_bbox: 0.1526 d4.dn_loss_iou: 0.2076 d1.loss_lmm_region: 0.1028 loss_lmm_image: 0.7904 2024/11/13 03:08:42 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 03:08:42 - mmengine - INFO - Iter(train) [111000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:44:27 time: 1.9988 data_time: 0.0186 memory: 34626 grad_norm: 27.8823 loss: 9.5897 loss_cls: 0.2876 loss_bbox: 0.1555 loss_iou: 0.2888 d0.loss_cls: 0.3309 d0.loss_bbox: 0.1644 d0.loss_iou: 0.2989 d1.loss_cls: 0.3038 d1.loss_bbox: 0.1586 d1.loss_iou: 0.2916 d2.loss_cls: 0.2970 d2.loss_bbox: 0.1562 d2.loss_iou: 0.2904 d3.loss_cls: 0.2908 d3.loss_bbox: 0.1564 d3.loss_iou: 0.2913 d4.loss_cls: 0.2878 d4.loss_bbox: 0.1564 d4.loss_iou: 0.2900 enc_loss_cls: 0.3335 enc_loss_bbox: 0.1757 enc_loss_iou: 0.3134 dn_loss_cls: 0.0990 dn_loss_bbox: 0.1567 dn_loss_iou: 0.2280 d0.dn_loss_cls: 0.1784 d0.dn_loss_bbox: 0.2960 d0.dn_loss_iou: 0.3578 d1.dn_loss_cls: 0.1346 d1.dn_loss_bbox: 0.1888 d1.dn_loss_iou: 0.2550 d2.dn_loss_cls: 0.1101 d2.dn_loss_bbox: 0.1668 d2.dn_loss_iou: 0.2369 d3.dn_loss_cls: 0.1024 d3.dn_loss_bbox: 0.1590 d3.dn_loss_iou: 0.2302 d4.dn_loss_cls: 0.1007 d4.dn_loss_bbox: 0.1565 d4.dn_loss_iou: 0.2279 d1.loss_lmm_region: 0.1174 loss_lmm_image: 0.7684 2024/11/13 03:12:02 - mmengine - INFO - Iter(train) [111100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:41:06 time: 2.0023 data_time: 0.0186 memory: 33859 grad_norm: 29.5248 loss: 7.7049 loss_cls: 0.2346 loss_bbox: 0.0990 loss_iou: 0.1940 d0.loss_cls: 0.2719 d0.loss_bbox: 0.1118 d0.loss_iou: 0.2049 d1.loss_cls: 0.2514 d1.loss_bbox: 0.1038 d1.loss_iou: 0.2002 d2.loss_cls: 0.2376 d2.loss_bbox: 0.1020 d2.loss_iou: 0.1997 d3.loss_cls: 0.2338 d3.loss_bbox: 0.0997 d3.loss_iou: 0.1953 d4.loss_cls: 0.2332 d4.loss_bbox: 0.1001 d4.loss_iou: 0.1957 enc_loss_cls: 0.2784 enc_loss_bbox: 0.1208 enc_loss_iou: 0.2228 dn_loss_cls: 0.0849 dn_loss_bbox: 0.1513 dn_loss_iou: 0.1818 d0.dn_loss_cls: 0.1575 d0.dn_loss_bbox: 0.2868 d0.dn_loss_iou: 0.2998 d1.dn_loss_cls: 0.1098 d1.dn_loss_bbox: 0.1857 d1.dn_loss_iou: 0.2079 d2.dn_loss_cls: 0.0907 d2.dn_loss_bbox: 0.1628 d2.dn_loss_iou: 0.1896 d3.dn_loss_cls: 0.0856 d3.dn_loss_bbox: 0.1546 d3.dn_loss_iou: 0.1846 d4.dn_loss_cls: 0.0837 d4.dn_loss_bbox: 0.1512 d4.dn_loss_iou: 0.1816 d1.loss_lmm_region: 0.1179 loss_lmm_image: 0.7464 2024/11/13 03:15:22 - mmengine - INFO - Iter(train) [111200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:37:45 time: 2.0003 data_time: 0.0186 memory: 33459 grad_norm: 22.5566 loss: 8.4385 loss_cls: 
0.2316 loss_bbox: 0.1258 loss_iou: 0.2155 d0.loss_cls: 0.2596 d0.loss_bbox: 0.1369 d0.loss_iou: 0.2290 d1.loss_cls: 0.2423 d1.loss_bbox: 0.1315 d1.loss_iou: 0.2243 d2.loss_cls: 0.2390 d2.loss_bbox: 0.1271 d2.loss_iou: 0.2171 d3.loss_cls: 0.2368 d3.loss_bbox: 0.1237 d3.loss_iou: 0.2150 d4.loss_cls: 0.2334 d4.loss_bbox: 0.1252 d4.loss_iou: 0.2142 enc_loss_cls: 0.2711 enc_loss_bbox: 0.1446 enc_loss_iou: 0.2441 dn_loss_cls: 0.0875 dn_loss_bbox: 0.1756 dn_loss_iou: 0.2229 d0.dn_loss_cls: 0.1641 d0.dn_loss_bbox: 0.3194 d0.dn_loss_iou: 0.3655 d1.dn_loss_cls: 0.1176 d1.dn_loss_bbox: 0.2039 d1.dn_loss_iou: 0.2498 d2.dn_loss_cls: 0.1002 d2.dn_loss_bbox: 0.1825 d2.dn_loss_iou: 0.2310 d3.dn_loss_cls: 0.0940 d3.dn_loss_bbox: 0.1760 d3.dn_loss_iou: 0.2244 d4.dn_loss_cls: 0.0887 d4.dn_loss_bbox: 0.1756 d4.dn_loss_iou: 0.2229 d1.loss_lmm_region: 0.0932 loss_lmm_image: 0.7561 2024/11/13 03:18:42 - mmengine - INFO - Iter(train) [111300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:34:24 time: 1.9808 data_time: 0.0186 memory: 33894 grad_norm: 23.0299 loss: 7.6275 loss_cls: 0.2088 loss_bbox: 0.1113 loss_iou: 0.2101 d0.loss_cls: 0.2502 d0.loss_bbox: 0.1177 d0.loss_iou: 0.2234 d1.loss_cls: 0.2280 d1.loss_bbox: 0.1122 d1.loss_iou: 0.2131 d2.loss_cls: 0.2201 d2.loss_bbox: 0.1113 d2.loss_iou: 0.2112 d3.loss_cls: 0.2121 d3.loss_bbox: 0.1127 d3.loss_iou: 0.2111 d4.loss_cls: 0.2098 d4.loss_bbox: 0.1113 d4.loss_iou: 0.2101 enc_loss_cls: 0.2591 enc_loss_bbox: 0.1296 enc_loss_iou: 0.2423 dn_loss_cls: 0.0822 dn_loss_bbox: 0.1272 dn_loss_iou: 0.1867 d0.dn_loss_cls: 0.1566 d0.dn_loss_bbox: 0.2550 d0.dn_loss_iou: 0.3131 d1.dn_loss_cls: 0.1064 d1.dn_loss_bbox: 0.1524 d1.dn_loss_iou: 0.2109 d2.dn_loss_cls: 0.0931 d2.dn_loss_bbox: 0.1360 d2.dn_loss_iou: 0.1941 d3.dn_loss_cls: 0.0856 d3.dn_loss_bbox: 0.1294 d3.dn_loss_iou: 0.1891 d4.dn_loss_cls: 0.0833 d4.dn_loss_bbox: 0.1273 d4.dn_loss_iou: 0.1867 d1.loss_lmm_region: 0.1186 loss_lmm_image: 0.7782 2024/11/13 03:22:03 - mmengine - INFO - Iter(train) [111400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:31:03 time: 2.0146 data_time: 0.0185 memory: 35395 grad_norm: 25.1994 loss: 8.6419 loss_cls: 0.2868 loss_bbox: 0.1194 loss_iou: 0.2498 d0.loss_cls: 0.3281 d0.loss_bbox: 0.1286 d0.loss_iou: 0.2650 d1.loss_cls: 0.3053 d1.loss_bbox: 0.1231 d1.loss_iou: 0.2553 d2.loss_cls: 0.2947 d2.loss_bbox: 0.1215 d2.loss_iou: 0.2533 d3.loss_cls: 0.2909 d3.loss_bbox: 0.1212 d3.loss_iou: 0.2515 d4.loss_cls: 0.2854 d4.loss_bbox: 0.1210 d4.loss_iou: 0.2509 enc_loss_cls: 0.3328 enc_loss_bbox: 0.1418 enc_loss_iou: 0.2840 dn_loss_cls: 0.1051 dn_loss_bbox: 0.1341 dn_loss_iou: 0.1804 d0.dn_loss_cls: 0.1661 d0.dn_loss_bbox: 0.2519 d0.dn_loss_iou: 0.3072 d1.dn_loss_cls: 0.1250 d1.dn_loss_bbox: 0.1521 d1.dn_loss_iou: 0.2050 d2.dn_loss_cls: 0.1111 d2.dn_loss_bbox: 0.1398 d2.dn_loss_iou: 0.1885 d3.dn_loss_cls: 0.1071 d3.dn_loss_bbox: 0.1338 d3.dn_loss_iou: 0.1816 d4.dn_loss_cls: 0.1060 d4.dn_loss_bbox: 0.1340 d4.dn_loss_iou: 0.1805 d1.loss_lmm_region: 0.1193 loss_lmm_image: 0.8030 2024/11/13 03:25:23 - mmengine - INFO - Iter(train) [111500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:27:42 time: 2.0163 data_time: 0.0186 memory: 34329 grad_norm: 29.3569 loss: 8.6287 loss_cls: 0.2536 loss_bbox: 0.1325 loss_iou: 0.2437 d0.loss_cls: 0.2868 d0.loss_bbox: 0.1471 d0.loss_iou: 0.2559 d1.loss_cls: 0.2696 d1.loss_bbox: 0.1381 d1.loss_iou: 0.2467 d2.loss_cls: 0.2595 d2.loss_bbox: 0.1369 d2.loss_iou: 0.2428 d3.loss_cls: 0.2581 d3.loss_bbox: 0.1330 d3.loss_iou: 0.2417 d4.loss_cls: 0.2551 d4.loss_bbox: 0.1327 
d4.loss_iou: 0.2436 enc_loss_cls: 0.2861 enc_loss_bbox: 0.1652 enc_loss_iou: 0.2703 dn_loss_cls: 0.0842 dn_loss_bbox: 0.1612 dn_loss_iou: 0.2135 d0.dn_loss_cls: 0.1599 d0.dn_loss_bbox: 0.2898 d0.dn_loss_iou: 0.3386 d1.dn_loss_cls: 0.1164 d1.dn_loss_bbox: 0.1869 d1.dn_loss_iou: 0.2379 d2.dn_loss_cls: 0.0962 d2.dn_loss_bbox: 0.1680 d2.dn_loss_iou: 0.2210 d3.dn_loss_cls: 0.0894 d3.dn_loss_bbox: 0.1629 d3.dn_loss_iou: 0.2152 d4.dn_loss_cls: 0.0857 d4.dn_loss_bbox: 0.1612 d4.dn_loss_iou: 0.2135 d1.loss_lmm_region: 0.1040 loss_lmm_image: 0.7242 2024/11/13 03:28:43 - mmengine - INFO - Iter(train) [111600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:24:21 time: 1.9903 data_time: 0.0187 memory: 33874 grad_norm: 25.4705 loss: 9.0590 loss_cls: 0.2905 loss_bbox: 0.1321 loss_iou: 0.2378 d0.loss_cls: 0.3214 d0.loss_bbox: 0.1443 d0.loss_iou: 0.2563 d1.loss_cls: 0.3068 d1.loss_bbox: 0.1343 d1.loss_iou: 0.2456 d2.loss_cls: 0.2993 d2.loss_bbox: 0.1293 d2.loss_iou: 0.2404 d3.loss_cls: 0.2964 d3.loss_bbox: 0.1295 d3.loss_iou: 0.2388 d4.loss_cls: 0.2911 d4.loss_bbox: 0.1306 d4.loss_iou: 0.2384 enc_loss_cls: 0.3321 enc_loss_bbox: 0.1521 enc_loss_iou: 0.2684 dn_loss_cls: 0.1288 dn_loss_bbox: 0.1421 dn_loss_iou: 0.2028 d0.dn_loss_cls: 0.2025 d0.dn_loss_bbox: 0.2858 d0.dn_loss_iou: 0.3366 d1.dn_loss_cls: 0.1517 d1.dn_loss_bbox: 0.1751 d1.dn_loss_iou: 0.2316 d2.dn_loss_cls: 0.1382 d2.dn_loss_bbox: 0.1518 d2.dn_loss_iou: 0.2124 d3.dn_loss_cls: 0.1336 d3.dn_loss_bbox: 0.1437 d3.dn_loss_iou: 0.2049 d4.dn_loss_cls: 0.1298 d4.dn_loss_bbox: 0.1422 d4.dn_loss_iou: 0.2030 d1.loss_lmm_region: 0.1265 loss_lmm_image: 0.8003 2024/11/13 03:32:02 - mmengine - INFO - Iter(train) [111700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:21:00 time: 1.9844 data_time: 0.0185 memory: 33867 grad_norm: 28.4904 loss: 7.9170 loss_cls: 0.2470 loss_bbox: 0.1145 loss_iou: 0.2098 d0.loss_cls: 0.2827 d0.loss_bbox: 0.1250 d0.loss_iou: 0.2187 d1.loss_cls: 0.2617 d1.loss_bbox: 0.1218 d1.loss_iou: 0.2146 d2.loss_cls: 0.2562 d2.loss_bbox: 0.1166 d2.loss_iou: 0.2116 d3.loss_cls: 0.2510 d3.loss_bbox: 0.1148 d3.loss_iou: 0.2107 d4.loss_cls: 0.2460 d4.loss_bbox: 0.1146 d4.loss_iou: 0.2102 enc_loss_cls: 0.2922 enc_loss_bbox: 0.1370 enc_loss_iou: 0.2361 dn_loss_cls: 0.0887 dn_loss_bbox: 0.1370 dn_loss_iou: 0.1669 d0.dn_loss_cls: 0.1620 d0.dn_loss_bbox: 0.2698 d0.dn_loss_iou: 0.2887 d1.dn_loss_cls: 0.1113 d1.dn_loss_bbox: 0.1667 d1.dn_loss_iou: 0.1933 d2.dn_loss_cls: 0.0975 d2.dn_loss_bbox: 0.1448 d2.dn_loss_iou: 0.1741 d3.dn_loss_cls: 0.0916 d3.dn_loss_bbox: 0.1381 d3.dn_loss_iou: 0.1689 d4.dn_loss_cls: 0.0886 d4.dn_loss_bbox: 0.1370 d4.dn_loss_iou: 0.1668 d1.loss_lmm_region: 0.1055 loss_lmm_image: 0.8266 2024/11/13 03:35:21 - mmengine - INFO - Iter(train) [111800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:17:39 time: 1.9946 data_time: 0.0185 memory: 34404 grad_norm: 24.1676 loss: 8.1710 loss_cls: 0.2429 loss_bbox: 0.1233 loss_iou: 0.2337 d0.loss_cls: 0.2748 d0.loss_bbox: 0.1352 d0.loss_iou: 0.2475 d1.loss_cls: 0.2594 d1.loss_bbox: 0.1290 d1.loss_iou: 0.2398 d2.loss_cls: 0.2512 d2.loss_bbox: 0.1236 d2.loss_iou: 0.2352 d3.loss_cls: 0.2460 d3.loss_bbox: 0.1230 d3.loss_iou: 0.2342 d4.loss_cls: 0.2453 d4.loss_bbox: 0.1228 d4.loss_iou: 0.2333 enc_loss_cls: 0.2808 enc_loss_bbox: 0.1456 enc_loss_iou: 0.2641 dn_loss_cls: 0.0772 dn_loss_bbox: 0.1403 dn_loss_iou: 0.1929 d0.dn_loss_cls: 0.1515 d0.dn_loss_bbox: 0.2642 d0.dn_loss_iou: 0.3217 d1.dn_loss_cls: 0.1035 d1.dn_loss_bbox: 0.1628 d1.dn_loss_iou: 0.2172 d2.dn_loss_cls: 0.0873 d2.dn_loss_bbox: 
0.1458 d2.dn_loss_iou: 0.2005 d3.dn_loss_cls: 0.0828 d3.dn_loss_bbox: 0.1417 d3.dn_loss_iou: 0.1950 d4.dn_loss_cls: 0.0775 d4.dn_loss_bbox: 0.1402 d4.dn_loss_iou: 0.1929 d1.loss_lmm_region: 0.0969 loss_lmm_image: 0.7885 2024/11/13 03:38:41 - mmengine - INFO - Iter(train) [111900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:14:18 time: 2.0076 data_time: 0.0186 memory: 33654 grad_norm: 25.5157 loss: 9.2265 loss_cls: 0.2821 loss_bbox: 0.1384 loss_iou: 0.2466 d0.loss_cls: 0.3183 d0.loss_bbox: 0.1433 d0.loss_iou: 0.2577 d1.loss_cls: 0.3038 d1.loss_bbox: 0.1369 d1.loss_iou: 0.2480 d2.loss_cls: 0.2905 d2.loss_bbox: 0.1419 d2.loss_iou: 0.2493 d3.loss_cls: 0.2850 d3.loss_bbox: 0.1414 d3.loss_iou: 0.2480 d4.loss_cls: 0.2847 d4.loss_bbox: 0.1392 d4.loss_iou: 0.2482 enc_loss_cls: 0.3342 enc_loss_bbox: 0.1537 enc_loss_iou: 0.2711 dn_loss_cls: 0.1063 dn_loss_bbox: 0.1659 dn_loss_iou: 0.2102 d0.dn_loss_cls: 0.1937 d0.dn_loss_bbox: 0.3134 d0.dn_loss_iou: 0.3409 d1.dn_loss_cls: 0.1527 d1.dn_loss_bbox: 0.2000 d1.dn_loss_iou: 0.2393 d2.dn_loss_cls: 0.1298 d2.dn_loss_bbox: 0.1770 d2.dn_loss_iou: 0.2194 d3.dn_loss_cls: 0.1206 d3.dn_loss_bbox: 0.1682 d3.dn_loss_iou: 0.2128 d4.dn_loss_cls: 0.1077 d4.dn_loss_bbox: 0.1658 d4.dn_loss_iou: 0.2102 d1.loss_lmm_region: 0.1478 loss_lmm_image: 0.7822 2024/11/13 03:42:00 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 03:42:00 - mmengine - INFO - Iter(train) [112000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:10:56 time: 1.9789 data_time: 0.0185 memory: 33480 grad_norm: 28.3807 loss: 9.4916 loss_cls: 0.3026 loss_bbox: 0.1446 loss_iou: 0.2559 d0.loss_cls: 0.3504 d0.loss_bbox: 0.1524 d0.loss_iou: 0.2676 d1.loss_cls: 0.3274 d1.loss_bbox: 0.1414 d1.loss_iou: 0.2563 d2.loss_cls: 0.3186 d2.loss_bbox: 0.1418 d2.loss_iou: 0.2549 d3.loss_cls: 0.3113 d3.loss_bbox: 0.1401 d3.loss_iou: 0.2543 d4.loss_cls: 0.3077 d4.loss_bbox: 0.1393 d4.loss_iou: 0.2540 enc_loss_cls: 0.3563 enc_loss_bbox: 0.1647 enc_loss_iou: 0.2842 dn_loss_cls: 0.1043 dn_loss_bbox: 0.1646 dn_loss_iou: 0.2162 d0.dn_loss_cls: 0.1921 d0.dn_loss_bbox: 0.3017 d0.dn_loss_iou: 0.3505 d1.dn_loss_cls: 0.1443 d1.dn_loss_bbox: 0.2002 d1.dn_loss_iou: 0.2481 d2.dn_loss_cls: 0.1239 d2.dn_loss_bbox: 0.1766 d2.dn_loss_iou: 0.2259 d3.dn_loss_cls: 0.1120 d3.dn_loss_bbox: 0.1675 d3.dn_loss_iou: 0.2184 d4.dn_loss_cls: 0.1051 d4.dn_loss_bbox: 0.1647 d4.dn_loss_iou: 0.2163 d1.loss_lmm_region: 0.1343 loss_lmm_image: 0.7991 2024/11/13 03:45:23 - mmengine - INFO - Iter(train) [112100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:07:37 time: 2.0351 data_time: 0.0185 memory: 35819 grad_norm: 27.3883 loss: 8.1466 loss_cls: 0.2521 loss_bbox: 0.1038 loss_iou: 0.1978 d0.loss_cls: 0.2803 d0.loss_bbox: 0.1132 d0.loss_iou: 0.2093 d1.loss_cls: 0.2629 d1.loss_bbox: 0.1076 d1.loss_iou: 0.2023 d2.loss_cls: 0.2576 d2.loss_bbox: 0.1031 d2.loss_iou: 0.1975 d3.loss_cls: 0.2532 d3.loss_bbox: 0.1038 d3.loss_iou: 0.1990 d4.loss_cls: 0.2456 d4.loss_bbox: 0.1122 d4.loss_iou: 0.2009 enc_loss_cls: 0.2849 enc_loss_bbox: 0.1267 enc_loss_iou: 0.2253 dn_loss_cls: 0.1015 dn_loss_bbox: 0.1548 dn_loss_iou: 0.1946 d0.dn_loss_cls: 0.1672 d0.dn_loss_bbox: 0.3070 d0.dn_loss_iou: 0.3248 d1.dn_loss_cls: 0.1241 d1.dn_loss_bbox: 0.1882 d1.dn_loss_iou: 0.2215 d2.dn_loss_cls: 0.1116 d2.dn_loss_bbox: 0.1638 d2.dn_loss_iou: 0.2031 d3.dn_loss_cls: 0.1058 d3.dn_loss_bbox: 0.1561 d3.dn_loss_iou: 0.1965 d4.dn_loss_cls: 0.0971 d4.dn_loss_bbox: 0.1548 d4.dn_loss_iou: 0.1946 d1.loss_lmm_region: 0.1062 loss_lmm_image: 0.8342 2024/11/13 
03:48:44 - mmengine - INFO - Iter(train) [112200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:04:16 time: 2.0041 data_time: 0.0187 memory: 34677 grad_norm: 21.6226 loss: 9.1131 loss_cls: 0.2775 loss_bbox: 0.1491 loss_iou: 0.2579 d0.loss_cls: 0.3322 d0.loss_bbox: 0.1530 d0.loss_iou: 0.2677 d1.loss_cls: 0.2964 d1.loss_bbox: 0.1522 d1.loss_iou: 0.2638 d2.loss_cls: 0.2911 d2.loss_bbox: 0.1486 d2.loss_iou: 0.2597 d3.loss_cls: 0.2831 d3.loss_bbox: 0.1492 d3.loss_iou: 0.2588 d4.loss_cls: 0.2798 d4.loss_bbox: 0.1494 d4.loss_iou: 0.2577 enc_loss_cls: 0.3363 enc_loss_bbox: 0.1647 enc_loss_iou: 0.2855 dn_loss_cls: 0.0844 dn_loss_bbox: 0.1540 dn_loss_iou: 0.2103 d0.dn_loss_cls: 0.1676 d0.dn_loss_bbox: 0.2958 d0.dn_loss_iou: 0.3438 d1.dn_loss_cls: 0.1146 d1.dn_loss_bbox: 0.1819 d1.dn_loss_iou: 0.2371 d2.dn_loss_cls: 0.0983 d2.dn_loss_bbox: 0.1626 d2.dn_loss_iou: 0.2187 d3.dn_loss_cls: 0.0885 d3.dn_loss_bbox: 0.1566 d3.dn_loss_iou: 0.2132 d4.dn_loss_cls: 0.0846 d4.dn_loss_bbox: 0.1541 d4.dn_loss_iou: 0.2104 d1.loss_lmm_region: 0.1169 loss_lmm_image: 0.8057 2024/11/13 03:52:05 - mmengine - INFO - Iter(train) [112300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 21:00:55 time: 2.0168 data_time: 0.0185 memory: 34585 grad_norm: 27.7930 loss: 8.7186 loss_cls: 0.2614 loss_bbox: 0.1366 loss_iou: 0.2493 d0.loss_cls: 0.3012 d0.loss_bbox: 0.1534 d0.loss_iou: 0.2654 d1.loss_cls: 0.2800 d1.loss_bbox: 0.1411 d1.loss_iou: 0.2529 d2.loss_cls: 0.2750 d2.loss_bbox: 0.1349 d2.loss_iou: 0.2485 d3.loss_cls: 0.2685 d3.loss_bbox: 0.1350 d3.loss_iou: 0.2471 d4.loss_cls: 0.2622 d4.loss_bbox: 0.1355 d4.loss_iou: 0.2484 enc_loss_cls: 0.3189 enc_loss_bbox: 0.1589 enc_loss_iou: 0.2754 dn_loss_cls: 0.0741 dn_loss_bbox: 0.1535 dn_loss_iou: 0.1999 d0.dn_loss_cls: 0.1519 d0.dn_loss_bbox: 0.2873 d0.dn_loss_iou: 0.3219 d1.dn_loss_cls: 0.1047 d1.dn_loss_bbox: 0.1808 d1.dn_loss_iou: 0.2250 d2.dn_loss_cls: 0.0872 d2.dn_loss_bbox: 0.1618 d2.dn_loss_iou: 0.2072 d3.dn_loss_cls: 0.0780 d3.dn_loss_bbox: 0.1560 d3.dn_loss_iou: 0.2019 d4.dn_loss_cls: 0.0745 d4.dn_loss_bbox: 0.1536 d4.dn_loss_iou: 0.1999 d1.loss_lmm_region: 0.1176 loss_lmm_image: 0.8325 2024/11/13 03:55:24 - mmengine - INFO - Iter(train) [112400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:57:34 time: 1.9581 data_time: 0.0185 memory: 35286 grad_norm: 29.6399 loss: 11.1365 loss_cls: 0.5428 loss_bbox: 0.1416 loss_iou: 0.2362 d0.loss_cls: 0.6096 d0.loss_bbox: 0.1568 d0.loss_iou: 0.2495 d1.loss_cls: 0.5841 d1.loss_bbox: 0.1467 d1.loss_iou: 0.2397 d2.loss_cls: 0.5694 d2.loss_bbox: 0.1449 d2.loss_iou: 0.2401 d3.loss_cls: 0.5541 d3.loss_bbox: 0.1423 d3.loss_iou: 0.2356 d4.loss_cls: 0.5452 d4.loss_bbox: 0.1400 d4.loss_iou: 0.2346 enc_loss_cls: 0.6087 enc_loss_bbox: 0.1648 enc_loss_iou: 0.2613 dn_loss_cls: 0.1353 dn_loss_bbox: 0.1698 dn_loss_iou: 0.1946 d0.dn_loss_cls: 0.2307 d0.dn_loss_bbox: 0.2998 d0.dn_loss_iou: 0.3162 d1.dn_loss_cls: 0.1729 d1.dn_loss_bbox: 0.1941 d1.dn_loss_iou: 0.2186 d2.dn_loss_cls: 0.1543 d2.dn_loss_bbox: 0.1768 d2.dn_loss_iou: 0.2024 d3.dn_loss_cls: 0.1443 d3.dn_loss_bbox: 0.1712 d3.dn_loss_iou: 0.1971 d4.dn_loss_cls: 0.1378 d4.dn_loss_bbox: 0.1698 d4.dn_loss_iou: 0.1947 d1.loss_lmm_region: 0.1362 loss_lmm_image: 0.7720 2024/11/13 03:58:43 - mmengine - INFO - Iter(train) [112500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:54:13 time: 1.9692 data_time: 0.0186 memory: 34351 grad_norm: 29.4143 loss: 9.4708 loss_cls: 0.2945 loss_bbox: 0.1617 loss_iou: 0.2788 d0.loss_cls: 0.3378 d0.loss_bbox: 0.1732 d0.loss_iou: 0.2906 d1.loss_cls: 0.3069 
d1.loss_bbox: 0.1731 d1.loss_iou: 0.2814 d2.loss_cls: 0.3028 d2.loss_bbox: 0.1681 d2.loss_iou: 0.2801 d3.loss_cls: 0.2965 d3.loss_bbox: 0.1625 d3.loss_iou: 0.2791 d4.loss_cls: 0.2954 d4.loss_bbox: 0.1613 d4.loss_iou: 0.2793 enc_loss_cls: 0.3458 enc_loss_bbox: 0.1825 enc_loss_iou: 0.2991 dn_loss_cls: 0.0842 dn_loss_bbox: 0.1547 dn_loss_iou: 0.2175 d0.dn_loss_cls: 0.1631 d0.dn_loss_bbox: 0.2947 d0.dn_loss_iou: 0.3541 d1.dn_loss_cls: 0.1120 d1.dn_loss_bbox: 0.1872 d1.dn_loss_iou: 0.2455 d2.dn_loss_cls: 0.0932 d2.dn_loss_bbox: 0.1648 d2.dn_loss_iou: 0.2256 d3.dn_loss_cls: 0.0869 d3.dn_loss_bbox: 0.1561 d3.dn_loss_iou: 0.2197 d4.dn_loss_cls: 0.0845 d4.dn_loss_bbox: 0.1546 d4.dn_loss_iou: 0.2176 d1.loss_lmm_region: 0.1191 loss_lmm_image: 0.7850 2024/11/13 04:02:04 - mmengine - INFO - Iter(train) [112600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:50:53 time: 2.0264 data_time: 0.0187 memory: 33218 grad_norm: 25.2022 loss: 8.5281 loss_cls: 0.2409 loss_bbox: 0.1377 loss_iou: 0.2362 d0.loss_cls: 0.2791 d0.loss_bbox: 0.1465 d0.loss_iou: 0.2427 d1.loss_cls: 0.2578 d1.loss_bbox: 0.1390 d1.loss_iou: 0.2378 d2.loss_cls: 0.2500 d2.loss_bbox: 0.1391 d2.loss_iou: 0.2385 d3.loss_cls: 0.2457 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2359 d4.loss_cls: 0.2441 d4.loss_bbox: 0.1353 d4.loss_iou: 0.2334 enc_loss_cls: 0.2823 enc_loss_bbox: 0.1578 enc_loss_iou: 0.2574 dn_loss_cls: 0.0987 dn_loss_bbox: 0.1598 dn_loss_iou: 0.1987 d0.dn_loss_cls: 0.1668 d0.dn_loss_bbox: 0.2923 d0.dn_loss_iou: 0.3266 d1.dn_loss_cls: 0.1203 d1.dn_loss_bbox: 0.1925 d1.dn_loss_iou: 0.2254 d2.dn_loss_cls: 0.1097 d2.dn_loss_bbox: 0.1711 d2.dn_loss_iou: 0.2069 d3.dn_loss_cls: 0.1000 d3.dn_loss_bbox: 0.1621 d3.dn_loss_iou: 0.2008 d4.dn_loss_cls: 0.0988 d4.dn_loss_bbox: 0.1598 d4.dn_loss_iou: 0.1986 d1.loss_lmm_region: 0.1090 loss_lmm_image: 0.7561 2024/11/13 04:05:24 - mmengine - INFO - Iter(train) [112700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:47:32 time: 1.9976 data_time: 0.0186 memory: 31879 grad_norm: 34.3008 loss: 10.4141 loss_cls: 0.3320 loss_bbox: 0.1794 loss_iou: 0.2859 d0.loss_cls: 0.3794 d0.loss_bbox: 0.1909 d0.loss_iou: 0.2998 d1.loss_cls: 0.3603 d1.loss_bbox: 0.1809 d1.loss_iou: 0.2868 d2.loss_cls: 0.3482 d2.loss_bbox: 0.1749 d2.loss_iou: 0.2830 d3.loss_cls: 0.3390 d3.loss_bbox: 0.1759 d3.loss_iou: 0.2836 d4.loss_cls: 0.3335 d4.loss_bbox: 0.1771 d4.loss_iou: 0.2855 enc_loss_cls: 0.3888 enc_loss_bbox: 0.2010 enc_loss_iou: 0.3172 dn_loss_cls: 0.1246 dn_loss_bbox: 0.1975 dn_loss_iou: 0.2265 d0.dn_loss_cls: 0.2037 d0.dn_loss_bbox: 0.3383 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1587 d1.dn_loss_bbox: 0.2317 d1.dn_loss_iou: 0.2545 d2.dn_loss_cls: 0.1423 d2.dn_loss_bbox: 0.2101 d2.dn_loss_iou: 0.2363 d3.dn_loss_cls: 0.1367 d3.dn_loss_bbox: 0.2002 d3.dn_loss_iou: 0.2290 d4.dn_loss_cls: 0.1256 d4.dn_loss_bbox: 0.1976 d4.dn_loss_iou: 0.2266 d1.loss_lmm_region: 0.1152 loss_lmm_image: 0.7016 2024/11/13 04:08:44 - mmengine - INFO - Iter(train) [112800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:44:11 time: 1.9834 data_time: 0.0185 memory: 35237 grad_norm: 30.9214 loss: 9.3654 loss_cls: 0.2826 loss_bbox: 0.1512 loss_iou: 0.2422 d0.loss_cls: 0.3232 d0.loss_bbox: 0.1585 d0.loss_iou: 0.2481 d1.loss_cls: 0.2931 d1.loss_bbox: 0.1583 d1.loss_iou: 0.2459 d2.loss_cls: 0.2889 d2.loss_bbox: 0.1553 d2.loss_iou: 0.2434 d3.loss_cls: 0.2871 d3.loss_bbox: 0.1519 d3.loss_iou: 0.2439 d4.loss_cls: 0.2855 d4.loss_bbox: 0.1510 d4.loss_iou: 0.2412 enc_loss_cls: 0.3253 enc_loss_bbox: 0.1735 enc_loss_iou: 0.2646 dn_loss_cls: 0.1087 dn_loss_bbox: 
0.1764 dn_loss_iou: 0.2137 d0.dn_loss_cls: 0.1962 d0.dn_loss_bbox: 0.3323 d0.dn_loss_iou: 0.3502 d1.dn_loss_cls: 0.1437 d1.dn_loss_bbox: 0.2128 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1224 d2.dn_loss_bbox: 0.1879 d2.dn_loss_iou: 0.2229 d3.dn_loss_cls: 0.1144 d3.dn_loss_bbox: 0.1774 d3.dn_loss_iou: 0.2154 d4.dn_loss_cls: 0.1092 d4.dn_loss_bbox: 0.1764 d4.dn_loss_iou: 0.2137 d1.loss_lmm_region: 0.1354 loss_lmm_image: 0.8003 2024/11/13 04:12:03 - mmengine - INFO - Iter(train) [112900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:40:49 time: 1.9764 data_time: 0.0186 memory: 34582 grad_norm: 26.6308 loss: 7.7013 loss_cls: 0.2416 loss_bbox: 0.1117 loss_iou: 0.2330 d0.loss_cls: 0.2747 d0.loss_bbox: 0.1264 d0.loss_iou: 0.2511 d1.loss_cls: 0.2562 d1.loss_bbox: 0.1163 d1.loss_iou: 0.2398 d2.loss_cls: 0.2501 d2.loss_bbox: 0.1123 d2.loss_iou: 0.2311 d3.loss_cls: 0.2465 d3.loss_bbox: 0.1104 d3.loss_iou: 0.2311 d4.loss_cls: 0.2408 d4.loss_bbox: 0.1117 d4.loss_iou: 0.2345 enc_loss_cls: 0.2856 enc_loss_bbox: 0.1391 enc_loss_iou: 0.2714 dn_loss_cls: 0.0590 dn_loss_bbox: 0.1205 dn_loss_iou: 0.1786 d0.dn_loss_cls: 0.1258 d0.dn_loss_bbox: 0.2276 d0.dn_loss_iou: 0.2950 d1.dn_loss_cls: 0.0838 d1.dn_loss_bbox: 0.1429 d1.dn_loss_iou: 0.2026 d2.dn_loss_cls: 0.0669 d2.dn_loss_bbox: 0.1281 d2.dn_loss_iou: 0.1873 d3.dn_loss_cls: 0.0630 d3.dn_loss_bbox: 0.1218 d3.dn_loss_iou: 0.1809 d4.dn_loss_cls: 0.0596 d4.dn_loss_bbox: 0.1206 d4.dn_loss_iou: 0.1787 d1.loss_lmm_region: 0.0795 loss_lmm_image: 0.7638 2024/11/13 04:15:25 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 04:15:25 - mmengine - INFO - Iter(train) [113000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:37:29 time: 2.0049 data_time: 0.0186 memory: 33719 grad_norm: 27.6330 loss: 8.7996 loss_cls: 0.2674 loss_bbox: 0.1389 loss_iou: 0.2444 d0.loss_cls: 0.3063 d0.loss_bbox: 0.1564 d0.loss_iou: 0.2673 d1.loss_cls: 0.2909 d1.loss_bbox: 0.1414 d1.loss_iou: 0.2495 d2.loss_cls: 0.2777 d2.loss_bbox: 0.1421 d2.loss_iou: 0.2476 d3.loss_cls: 0.2724 d3.loss_bbox: 0.1397 d3.loss_iou: 0.2463 d4.loss_cls: 0.2676 d4.loss_bbox: 0.1387 d4.loss_iou: 0.2450 enc_loss_cls: 0.3174 enc_loss_bbox: 0.1589 enc_loss_iou: 0.2820 dn_loss_cls: 0.0685 dn_loss_bbox: 0.1580 dn_loss_iou: 0.2100 d0.dn_loss_cls: 0.1534 d0.dn_loss_bbox: 0.2948 d0.dn_loss_iou: 0.3427 d1.dn_loss_cls: 0.1005 d1.dn_loss_bbox: 0.1887 d1.dn_loss_iou: 0.2393 d2.dn_loss_cls: 0.0798 d2.dn_loss_bbox: 0.1655 d2.dn_loss_iou: 0.2176 d3.dn_loss_cls: 0.0726 d3.dn_loss_bbox: 0.1588 d3.dn_loss_iou: 0.2117 d4.dn_loss_cls: 0.0697 d4.dn_loss_bbox: 0.1580 d4.dn_loss_iou: 0.2101 d1.loss_lmm_region: 0.1146 loss_lmm_image: 0.7874 2024/11/13 04:18:43 - mmengine - INFO - Iter(train) [113100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:34:08 time: 1.9949 data_time: 0.0185 memory: 33463 grad_norm: 26.4798 loss: 9.7111 loss_cls: 0.3166 loss_bbox: 0.1516 loss_iou: 0.2552 d0.loss_cls: 0.3573 d0.loss_bbox: 0.1594 d0.loss_iou: 0.2626 d1.loss_cls: 0.3347 d1.loss_bbox: 0.1546 d1.loss_iou: 0.2576 d2.loss_cls: 0.3244 d2.loss_bbox: 0.1487 d2.loss_iou: 0.2527 d3.loss_cls: 0.3175 d3.loss_bbox: 0.1531 d3.loss_iou: 0.2567 d4.loss_cls: 0.3197 d4.loss_bbox: 0.1513 d4.loss_iou: 0.2559 enc_loss_cls: 0.3644 enc_loss_bbox: 0.1734 enc_loss_iou: 0.2833 dn_loss_cls: 0.1011 dn_loss_bbox: 0.1781 dn_loss_iou: 0.2272 d0.dn_loss_cls: 0.1988 d0.dn_loss_bbox: 0.3246 d0.dn_loss_iou: 0.3681 d1.dn_loss_cls: 0.1362 d1.dn_loss_bbox: 0.2120 d1.dn_loss_iou: 0.2589 d2.dn_loss_cls: 0.1168 d2.dn_loss_bbox: 0.1890 d2.dn_loss_iou: 0.2375 
d3.dn_loss_cls: 0.1068 d3.dn_loss_bbox: 0.1785 d3.dn_loss_iou: 0.2293 d4.dn_loss_cls: 0.1017 d4.dn_loss_bbox: 0.1781 d4.dn_loss_iou: 0.2272 d1.loss_lmm_region: 0.1126 loss_lmm_image: 0.7782 2024/11/13 04:22:02 - mmengine - INFO - Iter(train) [113200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:30:47 time: 2.0065 data_time: 0.0185 memory: 33055 grad_norm: 33.9947 loss: 8.7363 loss_cls: 0.2530 loss_bbox: 0.1427 loss_iou: 0.2562 d0.loss_cls: 0.2889 d0.loss_bbox: 0.1541 d0.loss_iou: 0.2701 d1.loss_cls: 0.2664 d1.loss_bbox: 0.1460 d1.loss_iou: 0.2610 d2.loss_cls: 0.2647 d2.loss_bbox: 0.1417 d2.loss_iou: 0.2551 d3.loss_cls: 0.2524 d3.loss_bbox: 0.1433 d3.loss_iou: 0.2567 d4.loss_cls: 0.2533 d4.loss_bbox: 0.1422 d4.loss_iou: 0.2556 enc_loss_cls: 0.2976 enc_loss_bbox: 0.1646 enc_loss_iou: 0.2868 dn_loss_cls: 0.0985 dn_loss_bbox: 0.1462 dn_loss_iou: 0.2070 d0.dn_loss_cls: 0.1670 d0.dn_loss_bbox: 0.2582 d0.dn_loss_iou: 0.3312 d1.dn_loss_cls: 0.1277 d1.dn_loss_bbox: 0.1706 d1.dn_loss_iou: 0.2328 d2.dn_loss_cls: 0.1104 d2.dn_loss_bbox: 0.1518 d2.dn_loss_iou: 0.2150 d3.dn_loss_cls: 0.1030 d3.dn_loss_bbox: 0.1483 d3.dn_loss_iou: 0.2096 d4.dn_loss_cls: 0.1006 d4.dn_loss_bbox: 0.1461 d4.dn_loss_iou: 0.2070 d1.loss_lmm_region: 0.1069 loss_lmm_image: 0.7457 2024/11/13 04:25:21 - mmengine - INFO - Iter(train) [113300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:27:25 time: 2.0020 data_time: 0.0186 memory: 33737 grad_norm: 26.0752 loss: 8.6600 loss_cls: 0.2623 loss_bbox: 0.1335 loss_iou: 0.2385 d0.loss_cls: 0.2964 d0.loss_bbox: 0.1409 d0.loss_iou: 0.2492 d1.loss_cls: 0.2787 d1.loss_bbox: 0.1360 d1.loss_iou: 0.2441 d2.loss_cls: 0.2682 d2.loss_bbox: 0.1363 d2.loss_iou: 0.2445 d3.loss_cls: 0.2663 d3.loss_bbox: 0.1338 d3.loss_iou: 0.2416 d4.loss_cls: 0.2660 d4.loss_bbox: 0.1325 d4.loss_iou: 0.2365 enc_loss_cls: 0.3112 enc_loss_bbox: 0.1493 enc_loss_iou: 0.2649 dn_loss_cls: 0.0796 dn_loss_bbox: 0.1638 dn_loss_iou: 0.2072 d0.dn_loss_cls: 0.1587 d0.dn_loss_bbox: 0.2831 d0.dn_loss_iou: 0.3284 d1.dn_loss_cls: 0.1111 d1.dn_loss_bbox: 0.1887 d1.dn_loss_iou: 0.2318 d2.dn_loss_cls: 0.0918 d2.dn_loss_bbox: 0.1683 d2.dn_loss_iou: 0.2143 d3.dn_loss_cls: 0.0875 d3.dn_loss_bbox: 0.1644 d3.dn_loss_iou: 0.2089 d4.dn_loss_cls: 0.0802 d4.dn_loss_bbox: 0.1639 d4.dn_loss_iou: 0.2071 d1.loss_lmm_region: 0.0984 loss_lmm_image: 0.7921 2024/11/13 04:28:40 - mmengine - INFO - Iter(train) [113400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:24:04 time: 1.9821 data_time: 0.0185 memory: 34766 grad_norm: 22.4755 loss: 9.2827 loss_cls: 0.2674 loss_bbox: 0.1625 loss_iou: 0.2607 d0.loss_cls: 0.3078 d0.loss_bbox: 0.1744 d0.loss_iou: 0.2723 d1.loss_cls: 0.2887 d1.loss_bbox: 0.1651 d1.loss_iou: 0.2678 d2.loss_cls: 0.2819 d2.loss_bbox: 0.1653 d2.loss_iou: 0.2644 d3.loss_cls: 0.2745 d3.loss_bbox: 0.1638 d3.loss_iou: 0.2612 d4.loss_cls: 0.2725 d4.loss_bbox: 0.1621 d4.loss_iou: 0.2594 enc_loss_cls: 0.3184 enc_loss_bbox: 0.1832 enc_loss_iou: 0.2890 dn_loss_cls: 0.0819 dn_loss_bbox: 0.1743 dn_loss_iou: 0.2152 d0.dn_loss_cls: 0.1689 d0.dn_loss_bbox: 0.3082 d0.dn_loss_iou: 0.3430 d1.dn_loss_cls: 0.1152 d1.dn_loss_bbox: 0.2073 d1.dn_loss_iou: 0.2424 d2.dn_loss_cls: 0.0945 d2.dn_loss_bbox: 0.1849 d2.dn_loss_iou: 0.2239 d3.dn_loss_cls: 0.0873 d3.dn_loss_bbox: 0.1774 d3.dn_loss_iou: 0.2176 d4.dn_loss_cls: 0.0832 d4.dn_loss_bbox: 0.1744 d4.dn_loss_iou: 0.2152 d1.loss_lmm_region: 0.1265 loss_lmm_image: 0.7787 2024/11/13 04:31:59 - mmengine - INFO - Iter(train) [113500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:20:43 time: 1.9591 
data_time: 0.0185 memory: 34293 grad_norm: 25.5789 loss: 8.5182 loss_cls: 0.2483 loss_bbox: 0.1339 loss_iou: 0.2200 d0.loss_cls: 0.2778 d0.loss_bbox: 0.1429 d0.loss_iou: 0.2298 d1.loss_cls: 0.2609 d1.loss_bbox: 0.1406 d1.loss_iou: 0.2268 d2.loss_cls: 0.2535 d2.loss_bbox: 0.1389 d2.loss_iou: 0.2240 d3.loss_cls: 0.2484 d3.loss_bbox: 0.1362 d3.loss_iou: 0.2235 d4.loss_cls: 0.2502 d4.loss_bbox: 0.1348 d4.loss_iou: 0.2223 enc_loss_cls: 0.2848 enc_loss_bbox: 0.1477 enc_loss_iou: 0.2416 dn_loss_cls: 0.0897 dn_loss_bbox: 0.1578 dn_loss_iou: 0.2026 d0.dn_loss_cls: 0.1692 d0.dn_loss_bbox: 0.3083 d0.dn_loss_iou: 0.3346 d1.dn_loss_cls: 0.1255 d1.dn_loss_bbox: 0.1904 d1.dn_loss_iou: 0.2285 d2.dn_loss_cls: 0.1072 d2.dn_loss_bbox: 0.1681 d2.dn_loss_iou: 0.2103 d3.dn_loss_cls: 0.0985 d3.dn_loss_bbox: 0.1595 d3.dn_loss_iou: 0.2045 d4.dn_loss_cls: 0.0932 d4.dn_loss_bbox: 0.1578 d4.dn_loss_iou: 0.2027 d1.loss_lmm_region: 0.1218 loss_lmm_image: 0.8011 2024/11/13 04:35:18 - mmengine - INFO - Iter(train) [113600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:17:22 time: 2.0070 data_time: 0.0190 memory: 34816 grad_norm: 30.0239 loss: 9.4708 loss_cls: 0.2763 loss_bbox: 0.1590 loss_iou: 0.2540 d0.loss_cls: 0.3137 d0.loss_bbox: 0.1715 d0.loss_iou: 0.2669 d1.loss_cls: 0.2999 d1.loss_bbox: 0.1592 d1.loss_iou: 0.2551 d2.loss_cls: 0.2920 d2.loss_bbox: 0.1563 d2.loss_iou: 0.2503 d3.loss_cls: 0.2815 d3.loss_bbox: 0.1558 d3.loss_iou: 0.2498 d4.loss_cls: 0.2799 d4.loss_bbox: 0.1568 d4.loss_iou: 0.2509 enc_loss_cls: 0.3198 enc_loss_bbox: 0.1748 enc_loss_iou: 0.2758 dn_loss_cls: 0.1241 dn_loss_bbox: 0.1733 dn_loss_iou: 0.2128 d0.dn_loss_cls: 0.1976 d0.dn_loss_bbox: 0.3274 d0.dn_loss_iou: 0.3532 d1.dn_loss_cls: 0.1520 d1.dn_loss_bbox: 0.2072 d1.dn_loss_iou: 0.2422 d2.dn_loss_cls: 0.1352 d2.dn_loss_bbox: 0.1852 d2.dn_loss_iou: 0.2229 d3.dn_loss_cls: 0.1279 d3.dn_loss_bbox: 0.1758 d3.dn_loss_iou: 0.2150 d4.dn_loss_cls: 0.1246 d4.dn_loss_bbox: 0.1735 d4.dn_loss_iou: 0.2128 d1.loss_lmm_region: 0.1229 loss_lmm_image: 0.7860 2024/11/13 04:38:37 - mmengine - INFO - Iter(train) [113700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:14:00 time: 1.9647 data_time: 0.0186 memory: 33731 grad_norm: nan loss: 9.4342 loss_cls: 0.3185 loss_bbox: 0.1408 loss_iou: 0.2829 d0.loss_cls: 0.3476 d0.loss_bbox: 0.1595 d0.loss_iou: 0.2995 d1.loss_cls: 0.3301 d1.loss_bbox: 0.1454 d1.loss_iou: 0.2889 d2.loss_cls: 0.3252 d2.loss_bbox: 0.1432 d2.loss_iou: 0.2835 d3.loss_cls: 0.3189 d3.loss_bbox: 0.1393 d3.loss_iou: 0.2813 d4.loss_cls: 0.3169 d4.loss_bbox: 0.1407 d4.loss_iou: 0.2827 enc_loss_cls: 0.3551 enc_loss_bbox: 0.1632 enc_loss_iou: 0.3101 dn_loss_cls: 0.0730 dn_loss_bbox: 0.1536 dn_loss_iou: 0.2285 d0.dn_loss_cls: 0.1498 d0.dn_loss_bbox: 0.2817 d0.dn_loss_iou: 0.3561 d1.dn_loss_cls: 0.0982 d1.dn_loss_bbox: 0.1801 d1.dn_loss_iou: 0.2554 d2.dn_loss_cls: 0.0828 d2.dn_loss_bbox: 0.1612 d2.dn_loss_iou: 0.2365 d3.dn_loss_cls: 0.0764 d3.dn_loss_bbox: 0.1551 d3.dn_loss_iou: 0.2304 d4.dn_loss_cls: 0.0733 d4.dn_loss_bbox: 0.1536 d4.dn_loss_iou: 0.2286 d1.loss_lmm_region: 0.1216 loss_lmm_image: 0.7650 2024/11/13 04:41:58 - mmengine - INFO - Iter(train) [113800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:10:40 time: 2.0050 data_time: 0.0186 memory: 34711 grad_norm: 23.1871 loss: 7.4620 loss_cls: 0.1904 loss_bbox: 0.1150 loss_iou: 0.1747 d0.loss_cls: 0.2208 d0.loss_bbox: 0.1263 d0.loss_iou: 0.1855 d1.loss_cls: 0.2062 d1.loss_bbox: 0.1182 d1.loss_iou: 0.1762 d2.loss_cls: 0.2003 d2.loss_bbox: 0.1147 d2.loss_iou: 0.1731 d3.loss_cls: 0.1948 
d3.loss_bbox: 0.1149 d3.loss_iou: 0.1736 d4.loss_cls: 0.1913 d4.loss_bbox: 0.1154 d4.loss_iou: 0.1755 enc_loss_cls: 0.2306 enc_loss_bbox: 0.1327 enc_loss_iou: 0.1972 dn_loss_cls: 0.0678 dn_loss_bbox: 0.1722 dn_loss_iou: 0.1961 d0.dn_loss_cls: 0.1494 d0.dn_loss_bbox: 0.3141 d0.dn_loss_iou: 0.3183 d1.dn_loss_cls: 0.0990 d1.dn_loss_bbox: 0.1964 d1.dn_loss_iou: 0.2192 d2.dn_loss_cls: 0.0800 d2.dn_loss_bbox: 0.1784 d2.dn_loss_iou: 0.2028 d3.dn_loss_cls: 0.0732 d3.dn_loss_bbox: 0.1731 d3.dn_loss_iou: 0.1978 d4.dn_loss_cls: 0.0689 d4.dn_loss_bbox: 0.1721 d4.dn_loss_iou: 0.1961 d1.loss_lmm_region: 0.1036 loss_lmm_image: 0.7560 2024/11/13 04:45:16 - mmengine - INFO - Iter(train) [113900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:07:18 time: 1.9747 data_time: 0.0185 memory: 34304 grad_norm: 24.2810 loss: 8.5908 loss_cls: 0.2457 loss_bbox: 0.1354 loss_iou: 0.2369 d0.loss_cls: 0.2857 d0.loss_bbox: 0.1506 d0.loss_iou: 0.2532 d1.loss_cls: 0.2621 d1.loss_bbox: 0.1461 d1.loss_iou: 0.2455 d2.loss_cls: 0.2544 d2.loss_bbox: 0.1392 d2.loss_iou: 0.2382 d3.loss_cls: 0.2497 d3.loss_bbox: 0.1379 d3.loss_iou: 0.2376 d4.loss_cls: 0.2490 d4.loss_bbox: 0.1353 d4.loss_iou: 0.2374 enc_loss_cls: 0.2987 enc_loss_bbox: 0.1608 enc_loss_iou: 0.2664 dn_loss_cls: 0.0680 dn_loss_bbox: 0.1701 dn_loss_iou: 0.2138 d0.dn_loss_cls: 0.1444 d0.dn_loss_bbox: 0.3117 d0.dn_loss_iou: 0.3397 d1.dn_loss_cls: 0.0952 d1.dn_loss_bbox: 0.2050 d1.dn_loss_iou: 0.2402 d2.dn_loss_cls: 0.0831 d2.dn_loss_bbox: 0.1826 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.0761 d3.dn_loss_bbox: 0.1717 d3.dn_loss_iou: 0.2155 d4.dn_loss_cls: 0.0678 d4.dn_loss_bbox: 0.1701 d4.dn_loss_iou: 0.2138 d1.loss_lmm_region: 0.1042 loss_lmm_image: 0.7298 2024/11/13 04:48:35 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 04:48:35 - mmengine - INFO - Iter(train) [114000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:03:57 time: 2.0078 data_time: 0.0186 memory: 34040 grad_norm: 23.2112 loss: 8.8614 loss_cls: 0.2661 loss_bbox: 0.1457 loss_iou: 0.2322 d0.loss_cls: 0.3038 d0.loss_bbox: 0.1577 d0.loss_iou: 0.2484 d1.loss_cls: 0.2867 d1.loss_bbox: 0.1510 d1.loss_iou: 0.2408 d2.loss_cls: 0.2753 d2.loss_bbox: 0.1515 d2.loss_iou: 0.2406 d3.loss_cls: 0.2691 d3.loss_bbox: 0.1471 d3.loss_iou: 0.2323 d4.loss_cls: 0.2666 d4.loss_bbox: 0.1456 d4.loss_iou: 0.2308 enc_loss_cls: 0.3108 enc_loss_bbox: 0.1659 enc_loss_iou: 0.2634 dn_loss_cls: 0.0981 dn_loss_bbox: 0.1653 dn_loss_iou: 0.2011 d0.dn_loss_cls: 0.1695 d0.dn_loss_bbox: 0.2882 d0.dn_loss_iou: 0.3200 d1.dn_loss_cls: 0.1212 d1.dn_loss_bbox: 0.1842 d1.dn_loss_iou: 0.2230 d2.dn_loss_cls: 0.1063 d2.dn_loss_bbox: 0.1685 d2.dn_loss_iou: 0.2073 d3.dn_loss_cls: 0.1010 d3.dn_loss_bbox: 0.1659 d3.dn_loss_iou: 0.2025 d4.dn_loss_cls: 0.0981 d4.dn_loss_bbox: 0.1653 d4.dn_loss_iou: 0.2010 d1.loss_lmm_region: 0.1183 loss_lmm_image: 0.8252 2024/11/13 04:51:53 - mmengine - INFO - Iter(train) [114100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 20:00:36 time: 1.9944 data_time: 0.0186 memory: 33507 grad_norm: 22.5411 loss: 8.5641 loss_cls: 0.2951 loss_bbox: 0.1220 loss_iou: 0.2304 d0.loss_cls: 0.3400 d0.loss_bbox: 0.1331 d0.loss_iou: 0.2474 d1.loss_cls: 0.3133 d1.loss_bbox: 0.1273 d1.loss_iou: 0.2362 d2.loss_cls: 0.3014 d2.loss_bbox: 0.1224 d2.loss_iou: 0.2289 d3.loss_cls: 0.2957 d3.loss_bbox: 0.1224 d3.loss_iou: 0.2308 d4.loss_cls: 0.2974 d4.loss_bbox: 0.1219 d4.loss_iou: 0.2305 enc_loss_cls: 0.3454 enc_loss_bbox: 0.1437 enc_loss_iou: 0.2602 dn_loss_cls: 0.0750 dn_loss_bbox: 0.1286 dn_loss_iou: 0.1990 
d0.dn_loss_cls: 0.1580 d0.dn_loss_bbox: 0.2619 d0.dn_loss_iou: 0.3318 d1.dn_loss_cls: 0.1075 d1.dn_loss_bbox: 0.1576 d1.dn_loss_iou: 0.2276 d2.dn_loss_cls: 0.0875 d2.dn_loss_bbox: 0.1378 d2.dn_loss_iou: 0.2086 d3.dn_loss_cls: 0.0795 d3.dn_loss_bbox: 0.1310 d3.dn_loss_iou: 0.2017 d4.dn_loss_cls: 0.0754 d4.dn_loss_bbox: 0.1287 d4.dn_loss_iou: 0.1990 d1.loss_lmm_region: 0.1024 loss_lmm_image: 0.8197 2024/11/13 04:55:16 - mmengine - INFO - Iter(train) [114200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:57:16 time: 2.0203 data_time: 0.0185 memory: 34845 grad_norm: 26.3463 loss: 7.7375 loss_cls: 0.2481 loss_bbox: 0.1124 loss_iou: 0.2297 d0.loss_cls: 0.2920 d0.loss_bbox: 0.1213 d0.loss_iou: 0.2396 d1.loss_cls: 0.2658 d1.loss_bbox: 0.1169 d1.loss_iou: 0.2339 d2.loss_cls: 0.2549 d2.loss_bbox: 0.1165 d2.loss_iou: 0.2310 d3.loss_cls: 0.2503 d3.loss_bbox: 0.1160 d3.loss_iou: 0.2335 d4.loss_cls: 0.2493 d4.loss_bbox: 0.1128 d4.loss_iou: 0.2299 enc_loss_cls: 0.3021 enc_loss_bbox: 0.1351 enc_loss_iou: 0.2602 dn_loss_cls: 0.0575 dn_loss_bbox: 0.1160 dn_loss_iou: 0.1773 d0.dn_loss_cls: 0.1325 d0.dn_loss_bbox: 0.2375 d0.dn_loss_iou: 0.3069 d1.dn_loss_cls: 0.0839 d1.dn_loss_bbox: 0.1430 d1.dn_loss_iou: 0.2032 d2.dn_loss_cls: 0.0674 d2.dn_loss_bbox: 0.1265 d2.dn_loss_iou: 0.1860 d3.dn_loss_cls: 0.0600 d3.dn_loss_bbox: 0.1188 d3.dn_loss_iou: 0.1794 d4.dn_loss_cls: 0.0578 d4.dn_loss_bbox: 0.1161 d4.dn_loss_iou: 0.1773 d1.loss_lmm_region: 0.0893 loss_lmm_image: 0.7497 2024/11/13 04:58:37 - mmengine - INFO - Iter(train) [114300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:53:55 time: 1.9988 data_time: 0.0186 memory: 34735 grad_norm: 22.5822 loss: 9.5243 loss_cls: 0.2911 loss_bbox: 0.1581 loss_iou: 0.2500 d0.loss_cls: 0.3298 d0.loss_bbox: 0.1755 d0.loss_iou: 0.2645 d1.loss_cls: 0.3152 d1.loss_bbox: 0.1631 d1.loss_iou: 0.2542 d2.loss_cls: 0.3021 d2.loss_bbox: 0.1584 d2.loss_iou: 0.2518 d3.loss_cls: 0.2923 d3.loss_bbox: 0.1574 d3.loss_iou: 0.2510 d4.loss_cls: 0.2886 d4.loss_bbox: 0.1589 d4.loss_iou: 0.2520 enc_loss_cls: 0.3347 enc_loss_bbox: 0.1936 enc_loss_iou: 0.2862 dn_loss_cls: 0.1109 dn_loss_bbox: 0.1883 dn_loss_iou: 0.2118 d0.dn_loss_cls: 0.1801 d0.dn_loss_bbox: 0.3328 d0.dn_loss_iou: 0.3454 d1.dn_loss_cls: 0.1400 d1.dn_loss_bbox: 0.2212 d1.dn_loss_iou: 0.2414 d2.dn_loss_cls: 0.1229 d2.dn_loss_bbox: 0.1966 d2.dn_loss_iou: 0.2207 d3.dn_loss_cls: 0.1156 d3.dn_loss_bbox: 0.1895 d3.dn_loss_iou: 0.2138 d4.dn_loss_cls: 0.1104 d4.dn_loss_bbox: 0.1883 d4.dn_loss_iou: 0.2119 d1.loss_lmm_region: 0.1240 loss_lmm_image: 0.7302 2024/11/13 05:01:57 - mmengine - INFO - Iter(train) [114400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:50:34 time: 1.9987 data_time: 0.0185 memory: 34841 grad_norm: 26.7330 loss: 7.7976 loss_cls: 0.2360 loss_bbox: 0.1137 loss_iou: 0.2006 d0.loss_cls: 0.2716 d0.loss_bbox: 0.1208 d0.loss_iou: 0.2112 d1.loss_cls: 0.2508 d1.loss_bbox: 0.1152 d1.loss_iou: 0.2077 d2.loss_cls: 0.2492 d2.loss_bbox: 0.1100 d2.loss_iou: 0.2022 d3.loss_cls: 0.2438 d3.loss_bbox: 0.1095 d3.loss_iou: 0.1991 d4.loss_cls: 0.2376 d4.loss_bbox: 0.1129 d4.loss_iou: 0.2004 enc_loss_cls: 0.2798 enc_loss_bbox: 0.1309 enc_loss_iou: 0.2275 dn_loss_cls: 0.0718 dn_loss_bbox: 0.1407 dn_loss_iou: 0.1890 d0.dn_loss_cls: 0.1443 d0.dn_loss_bbox: 0.2721 d0.dn_loss_iou: 0.3209 d1.dn_loss_cls: 0.0973 d1.dn_loss_bbox: 0.1724 d1.dn_loss_iou: 0.2192 d2.dn_loss_cls: 0.0808 d2.dn_loss_bbox: 0.1504 d2.dn_loss_iou: 0.1979 d3.dn_loss_cls: 0.0742 d3.dn_loss_bbox: 0.1421 d3.dn_loss_iou: 0.1910 d4.dn_loss_cls: 0.0715 d4.dn_loss_bbox: 0.1406 
d4.dn_loss_iou: 0.1889 d1.loss_lmm_region: 0.1109 loss_lmm_image: 0.7913 2024/11/13 05:05:16 - mmengine - INFO - Iter(train) [114500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:47:13 time: 1.9875 data_time: 0.0187 memory: 32961 grad_norm: 24.8412 loss: 9.3608 loss_cls: 0.2865 loss_bbox: 0.1570 loss_iou: 0.2384 d0.loss_cls: 0.3269 d0.loss_bbox: 0.1718 d0.loss_iou: 0.2522 d1.loss_cls: 0.3039 d1.loss_bbox: 0.1584 d1.loss_iou: 0.2450 d2.loss_cls: 0.2983 d2.loss_bbox: 0.1546 d2.loss_iou: 0.2390 d3.loss_cls: 0.2912 d3.loss_bbox: 0.1571 d3.loss_iou: 0.2380 d4.loss_cls: 0.2869 d4.loss_bbox: 0.1574 d4.loss_iou: 0.2396 enc_loss_cls: 0.3379 enc_loss_bbox: 0.1868 enc_loss_iou: 0.2709 dn_loss_cls: 0.1076 dn_loss_bbox: 0.1758 dn_loss_iou: 0.2098 d0.dn_loss_cls: 0.1899 d0.dn_loss_bbox: 0.3159 d0.dn_loss_iou: 0.3418 d1.dn_loss_cls: 0.1353 d1.dn_loss_bbox: 0.2056 d1.dn_loss_iou: 0.2358 d2.dn_loss_cls: 0.1224 d2.dn_loss_bbox: 0.1865 d2.dn_loss_iou: 0.2197 d3.dn_loss_cls: 0.1109 d3.dn_loss_bbox: 0.1788 d3.dn_loss_iou: 0.2129 d4.dn_loss_cls: 0.1081 d4.dn_loss_bbox: 0.1759 d4.dn_loss_iou: 0.2099 d1.loss_lmm_region: 0.1130 loss_lmm_image: 0.8073 2024/11/13 05:08:36 - mmengine - INFO - Iter(train) [114600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:43:52 time: 2.0018 data_time: 0.0186 memory: 33428 grad_norm: 38.6874 loss: 8.1580 loss_cls: 0.2553 loss_bbox: 0.1336 loss_iou: 0.2288 d0.loss_cls: 0.2906 d0.loss_bbox: 0.1448 d0.loss_iou: 0.2391 d1.loss_cls: 0.2711 d1.loss_bbox: 0.1358 d1.loss_iou: 0.2321 d2.loss_cls: 0.2682 d2.loss_bbox: 0.1313 d2.loss_iou: 0.2271 d3.loss_cls: 0.2631 d3.loss_bbox: 0.1279 d3.loss_iou: 0.2270 d4.loss_cls: 0.2603 d4.loss_bbox: 0.1277 d4.loss_iou: 0.2280 enc_loss_cls: 0.3005 enc_loss_bbox: 0.1508 enc_loss_iou: 0.2516 dn_loss_cls: 0.0738 dn_loss_bbox: 0.1375 dn_loss_iou: 0.1854 d0.dn_loss_cls: 0.1511 d0.dn_loss_bbox: 0.2493 d0.dn_loss_iou: 0.2998 d1.dn_loss_cls: 0.1004 d1.dn_loss_bbox: 0.1640 d1.dn_loss_iou: 0.2095 d2.dn_loss_cls: 0.0847 d2.dn_loss_bbox: 0.1478 d2.dn_loss_iou: 0.1936 d3.dn_loss_cls: 0.0784 d3.dn_loss_bbox: 0.1397 d3.dn_loss_iou: 0.1872 d4.dn_loss_cls: 0.0745 d4.dn_loss_bbox: 0.1375 d4.dn_loss_iou: 0.1855 d1.loss_lmm_region: 0.1079 loss_lmm_image: 0.7556 2024/11/13 05:11:56 - mmengine - INFO - Iter(train) [114700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:40:31 time: 1.9961 data_time: 0.0185 memory: 35695 grad_norm: 28.3927 loss: 8.7221 loss_cls: 0.2702 loss_bbox: 0.1286 loss_iou: 0.2450 d0.loss_cls: 0.3138 d0.loss_bbox: 0.1350 d0.loss_iou: 0.2586 d1.loss_cls: 0.2880 d1.loss_bbox: 0.1314 d1.loss_iou: 0.2491 d2.loss_cls: 0.2834 d2.loss_bbox: 0.1264 d2.loss_iou: 0.2450 d3.loss_cls: 0.2775 d3.loss_bbox: 0.1272 d3.loss_iou: 0.2434 d4.loss_cls: 0.2713 d4.loss_bbox: 0.1279 d4.loss_iou: 0.2440 enc_loss_cls: 0.3259 enc_loss_bbox: 0.1464 enc_loss_iou: 0.2728 dn_loss_cls: 0.0727 dn_loss_bbox: 0.1469 dn_loss_iou: 0.2138 d0.dn_loss_cls: 0.1595 d0.dn_loss_bbox: 0.2981 d0.dn_loss_iou: 0.3587 d1.dn_loss_cls: 0.1080 d1.dn_loss_bbox: 0.1864 d1.dn_loss_iou: 0.2463 d2.dn_loss_cls: 0.0865 d2.dn_loss_bbox: 0.1620 d2.dn_loss_iou: 0.2255 d3.dn_loss_cls: 0.0777 d3.dn_loss_bbox: 0.1490 d3.dn_loss_iou: 0.2160 d4.dn_loss_cls: 0.0739 d4.dn_loss_bbox: 0.1469 d4.dn_loss_iou: 0.2139 d1.loss_lmm_region: 0.1041 loss_lmm_image: 0.7654 2024/11/13 05:15:16 - mmengine - INFO - Iter(train) [114800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:37:10 time: 1.9717 data_time: 0.0185 memory: 33811 grad_norm: 24.7707 loss: 8.5925 loss_cls: 0.2520 loss_bbox: 0.1338 loss_iou: 0.2346 
d0.loss_cls: 0.2839 d0.loss_bbox: 0.1424 d0.loss_iou: 0.2458 d1.loss_cls: 0.2678 d1.loss_bbox: 0.1335 d1.loss_iou: 0.2336 d2.loss_cls: 0.2593 d2.loss_bbox: 0.1328 d2.loss_iou: 0.2334 d3.loss_cls: 0.2554 d3.loss_bbox: 0.1334 d3.loss_iou: 0.2329 d4.loss_cls: 0.2549 d4.loss_bbox: 0.1337 d4.loss_iou: 0.2338 enc_loss_cls: 0.2905 enc_loss_bbox: 0.1541 enc_loss_iou: 0.2615 dn_loss_cls: 0.0793 dn_loss_bbox: 0.1717 dn_loss_iou: 0.2119 d0.dn_loss_cls: 0.1519 d0.dn_loss_bbox: 0.3046 d0.dn_loss_iou: 0.3379 d1.dn_loss_cls: 0.1059 d1.dn_loss_bbox: 0.1969 d1.dn_loss_iou: 0.2368 d2.dn_loss_cls: 0.0879 d2.dn_loss_bbox: 0.1762 d2.dn_loss_iou: 0.2190 d3.dn_loss_cls: 0.0823 d3.dn_loss_bbox: 0.1718 d3.dn_loss_iou: 0.2140 d4.dn_loss_cls: 0.0800 d4.dn_loss_bbox: 0.1716 d4.dn_loss_iou: 0.2119 d1.loss_lmm_region: 0.1012 loss_lmm_image: 0.7768 2024/11/13 05:18:35 - mmengine - INFO - Iter(train) [114900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:33:49 time: 2.0077 data_time: 0.0186 memory: 33069 grad_norm: 26.3806 loss: 8.1411 loss_cls: 0.2443 loss_bbox: 0.1128 loss_iou: 0.2164 d0.loss_cls: 0.2771 d0.loss_bbox: 0.1241 d0.loss_iou: 0.2294 d1.loss_cls: 0.2533 d1.loss_bbox: 0.1199 d1.loss_iou: 0.2263 d2.loss_cls: 0.2471 d2.loss_bbox: 0.1172 d2.loss_iou: 0.2213 d3.loss_cls: 0.2442 d3.loss_bbox: 0.1144 d3.loss_iou: 0.2180 d4.loss_cls: 0.2442 d4.loss_bbox: 0.1134 d4.loss_iou: 0.2170 enc_loss_cls: 0.2781 enc_loss_bbox: 0.1337 enc_loss_iou: 0.2484 dn_loss_cls: 0.0896 dn_loss_bbox: 0.1393 dn_loss_iou: 0.2058 d0.dn_loss_cls: 0.1621 d0.dn_loss_bbox: 0.2783 d0.dn_loss_iou: 0.3393 d1.dn_loss_cls: 0.1199 d1.dn_loss_bbox: 0.1660 d1.dn_loss_iou: 0.2323 d2.dn_loss_cls: 0.1012 d2.dn_loss_bbox: 0.1471 d2.dn_loss_iou: 0.2138 d3.dn_loss_cls: 0.0942 d3.dn_loss_bbox: 0.1413 d3.dn_loss_iou: 0.2083 d4.dn_loss_cls: 0.0896 d4.dn_loss_bbox: 0.1393 d4.dn_loss_iou: 0.2058 d1.loss_lmm_region: 0.1050 loss_lmm_image: 0.7625 2024/11/13 05:21:53 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 05:21:53 - mmengine - INFO - Iter(train) [115000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:30:28 time: 1.9694 data_time: 0.0186 memory: 34264 grad_norm: 29.5515 loss: 10.1284 loss_cls: 0.3105 loss_bbox: 0.1822 loss_iou: 0.3220 d0.loss_cls: 0.3482 d0.loss_bbox: 0.1925 d0.loss_iou: 0.3349 d1.loss_cls: 0.3304 d1.loss_bbox: 0.1856 d1.loss_iou: 0.3249 d2.loss_cls: 0.3264 d2.loss_bbox: 0.1767 d2.loss_iou: 0.3193 d3.loss_cls: 0.3154 d3.loss_bbox: 0.1785 d3.loss_iou: 0.3201 d4.loss_cls: 0.3125 d4.loss_bbox: 0.1825 d4.loss_iou: 0.3221 enc_loss_cls: 0.3649 enc_loss_bbox: 0.1939 enc_loss_iou: 0.3414 dn_loss_cls: 0.0804 dn_loss_bbox: 0.1844 dn_loss_iou: 0.2272 d0.dn_loss_cls: 0.1616 d0.dn_loss_bbox: 0.3303 d0.dn_loss_iou: 0.3616 d1.dn_loss_cls: 0.1101 d1.dn_loss_bbox: 0.2188 d1.dn_loss_iou: 0.2553 d2.dn_loss_cls: 0.0925 d2.dn_loss_bbox: 0.1962 d2.dn_loss_iou: 0.2361 d3.dn_loss_cls: 0.0847 d3.dn_loss_bbox: 0.1864 d3.dn_loss_iou: 0.2286 d4.dn_loss_cls: 0.0806 d4.dn_loss_bbox: 0.1843 d4.dn_loss_iou: 0.2272 d1.loss_lmm_region: 0.1069 loss_lmm_image: 0.6905 2024/11/13 05:25:12 - mmengine - INFO - Iter(train) [115100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:27:07 time: 2.0178 data_time: 0.0185 memory: 35176 grad_norm: 24.4524 loss: 8.3587 loss_cls: 0.2585 loss_bbox: 0.1186 loss_iou: 0.2079 d0.loss_cls: 0.2985 d0.loss_bbox: 0.1344 d0.loss_iou: 0.2279 d1.loss_cls: 0.2741 d1.loss_bbox: 0.1223 d1.loss_iou: 0.2157 d2.loss_cls: 0.2703 d2.loss_bbox: 0.1185 d2.loss_iou: 0.2105 d3.loss_cls: 0.2625 d3.loss_bbox: 0.1186 d3.loss_iou: 
0.2077 d4.loss_cls: 0.2598 d4.loss_bbox: 0.1184 d4.loss_iou: 0.2077 enc_loss_cls: 0.3109 enc_loss_bbox: 0.1440 enc_loss_iou: 0.2454 dn_loss_cls: 0.0949 dn_loss_bbox: 0.1566 dn_loss_iou: 0.1954 d0.dn_loss_cls: 0.1674 d0.dn_loss_bbox: 0.3007 d0.dn_loss_iou: 0.3310 d1.dn_loss_cls: 0.1171 d1.dn_loss_bbox: 0.1914 d1.dn_loss_iou: 0.2256 d2.dn_loss_cls: 0.1028 d2.dn_loss_bbox: 0.1700 d2.dn_loss_iou: 0.2064 d3.dn_loss_cls: 0.0976 d3.dn_loss_bbox: 0.1592 d3.dn_loss_iou: 0.1982 d4.dn_loss_cls: 0.0955 d4.dn_loss_bbox: 0.1566 d4.dn_loss_iou: 0.1954 d1.loss_lmm_region: 0.1034 loss_lmm_image: 0.7610 2024/11/13 05:28:30 - mmengine - INFO - Iter(train) [115200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:23:45 time: 1.9925 data_time: 0.0186 memory: 34190 grad_norm: 34.2938 loss: 9.6533 loss_cls: 0.3230 loss_bbox: 0.1535 loss_iou: 0.2801 d0.loss_cls: 0.3733 d0.loss_bbox: 0.1654 d0.loss_iou: 0.2942 d1.loss_cls: 0.3448 d1.loss_bbox: 0.1552 d1.loss_iou: 0.2819 d2.loss_cls: 0.3414 d2.loss_bbox: 0.1505 d2.loss_iou: 0.2815 d3.loss_cls: 0.3321 d3.loss_bbox: 0.1482 d3.loss_iou: 0.2777 d4.loss_cls: 0.3263 d4.loss_bbox: 0.1515 d4.loss_iou: 0.2796 enc_loss_cls: 0.3712 enc_loss_bbox: 0.1764 enc_loss_iou: 0.3186 dn_loss_cls: 0.0768 dn_loss_bbox: 0.1508 dn_loss_iou: 0.2192 d0.dn_loss_cls: 0.1720 d0.dn_loss_bbox: 0.2997 d0.dn_loss_iou: 0.3597 d1.dn_loss_cls: 0.1158 d1.dn_loss_bbox: 0.1852 d1.dn_loss_iou: 0.2471 d2.dn_loss_cls: 0.0928 d2.dn_loss_bbox: 0.1629 d2.dn_loss_iou: 0.2283 d3.dn_loss_cls: 0.0838 d3.dn_loss_bbox: 0.1533 d3.dn_loss_iou: 0.2211 d4.dn_loss_cls: 0.0778 d4.dn_loss_bbox: 0.1509 d4.dn_loss_iou: 0.2193 d1.loss_lmm_region: 0.1277 loss_lmm_image: 0.7826 2024/11/13 05:31:50 - mmengine - INFO - Iter(train) [115300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:20:24 time: 1.9875 data_time: 0.0185 memory: 35001 grad_norm: 24.1430 loss: 8.9118 loss_cls: 0.2625 loss_bbox: 0.1521 loss_iou: 0.2528 d0.loss_cls: 0.3000 d0.loss_bbox: 0.1660 d0.loss_iou: 0.2712 d1.loss_cls: 0.2773 d1.loss_bbox: 0.1576 d1.loss_iou: 0.2605 d2.loss_cls: 0.2719 d2.loss_bbox: 0.1522 d2.loss_iou: 0.2558 d3.loss_cls: 0.2693 d3.loss_bbox: 0.1507 d3.loss_iou: 0.2533 d4.loss_cls: 0.2642 d4.loss_bbox: 0.1502 d4.loss_iou: 0.2518 enc_loss_cls: 0.3101 enc_loss_bbox: 0.1744 enc_loss_iou: 0.2813 dn_loss_cls: 0.0766 dn_loss_bbox: 0.1617 dn_loss_iou: 0.2127 d0.dn_loss_cls: 0.1577 d0.dn_loss_bbox: 0.3044 d0.dn_loss_iou: 0.3488 d1.dn_loss_cls: 0.1070 d1.dn_loss_bbox: 0.1897 d1.dn_loss_iou: 0.2398 d2.dn_loss_cls: 0.0884 d2.dn_loss_bbox: 0.1713 d2.dn_loss_iou: 0.2219 d3.dn_loss_cls: 0.0821 d3.dn_loss_bbox: 0.1636 d3.dn_loss_iou: 0.2147 d4.dn_loss_cls: 0.0771 d4.dn_loss_bbox: 0.1618 d4.dn_loss_iou: 0.2128 d1.loss_lmm_region: 0.1049 loss_lmm_image: 0.7294 2024/11/13 05:35:09 - mmengine - INFO - Iter(train) [115400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:17:03 time: 1.9953 data_time: 0.0185 memory: 34029 grad_norm: 27.6087 loss: 8.5767 loss_cls: 0.2583 loss_bbox: 0.1251 loss_iou: 0.2038 d0.loss_cls: 0.2998 d0.loss_bbox: 0.1309 d0.loss_iou: 0.2137 d1.loss_cls: 0.2772 d1.loss_bbox: 0.1216 d1.loss_iou: 0.2034 d2.loss_cls: 0.2696 d2.loss_bbox: 0.1225 d2.loss_iou: 0.2003 d3.loss_cls: 0.2606 d3.loss_bbox: 0.1253 d3.loss_iou: 0.2048 d4.loss_cls: 0.2592 d4.loss_bbox: 0.1241 d4.loss_iou: 0.2022 enc_loss_cls: 0.2943 enc_loss_bbox: 0.1534 enc_loss_iou: 0.2338 dn_loss_cls: 0.0948 dn_loss_bbox: 0.1867 dn_loss_iou: 0.2060 d0.dn_loss_cls: 0.1721 d0.dn_loss_bbox: 0.3434 d0.dn_loss_iou: 0.3405 d1.dn_loss_cls: 0.1211 d1.dn_loss_bbox: 0.2157 d1.dn_loss_iou: 
0.2322 d2.dn_loss_cls: 0.1061 d2.dn_loss_bbox: 0.1933 d2.dn_loss_iou: 0.2137 d3.dn_loss_cls: 0.0979 d3.dn_loss_bbox: 0.1876 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.0947 d4.dn_loss_bbox: 0.1867 d4.dn_loss_iou: 0.2060 d1.loss_lmm_region: 0.1050 loss_lmm_image: 0.7818 2024/11/13 05:38:28 - mmengine - INFO - Iter(train) [115500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:13:42 time: 1.9820 data_time: 0.0187 memory: 33985 grad_norm: 21.5997 loss: 8.9749 loss_cls: 0.2901 loss_bbox: 0.1351 loss_iou: 0.2830 d0.loss_cls: 0.3336 d0.loss_bbox: 0.1536 d0.loss_iou: 0.3022 d1.loss_cls: 0.3165 d1.loss_bbox: 0.1367 d1.loss_iou: 0.2868 d2.loss_cls: 0.3069 d2.loss_bbox: 0.1316 d2.loss_iou: 0.2826 d3.loss_cls: 0.2956 d3.loss_bbox: 0.1332 d3.loss_iou: 0.2841 d4.loss_cls: 0.2915 d4.loss_bbox: 0.1346 d4.loss_iou: 0.2826 enc_loss_cls: 0.3422 enc_loss_bbox: 0.1659 enc_loss_iou: 0.3262 dn_loss_cls: 0.0612 dn_loss_bbox: 0.1452 dn_loss_iou: 0.1966 d0.dn_loss_cls: 0.1442 d0.dn_loss_bbox: 0.2843 d0.dn_loss_iou: 0.3273 d1.dn_loss_cls: 0.0904 d1.dn_loss_bbox: 0.1780 d1.dn_loss_iou: 0.2262 d2.dn_loss_cls: 0.0710 d2.dn_loss_bbox: 0.1524 d2.dn_loss_iou: 0.2055 d3.dn_loss_cls: 0.0644 d3.dn_loss_bbox: 0.1466 d3.dn_loss_iou: 0.1989 d4.dn_loss_cls: 0.0608 d4.dn_loss_bbox: 0.1451 d4.dn_loss_iou: 0.1966 d1.loss_lmm_region: 0.1104 loss_lmm_image: 0.7551 2024/11/13 05:41:48 - mmengine - INFO - Iter(train) [115600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:10:21 time: 1.9977 data_time: 0.0188 memory: 34170 grad_norm: 25.1948 loss: 8.7325 loss_cls: 0.2813 loss_bbox: 0.1256 loss_iou: 0.2390 d0.loss_cls: 0.3208 d0.loss_bbox: 0.1449 d0.loss_iou: 0.2585 d1.loss_cls: 0.3015 d1.loss_bbox: 0.1280 d1.loss_iou: 0.2420 d2.loss_cls: 0.2990 d2.loss_bbox: 0.1227 d2.loss_iou: 0.2368 d3.loss_cls: 0.2885 d3.loss_bbox: 0.1234 d3.loss_iou: 0.2355 d4.loss_cls: 0.2841 d4.loss_bbox: 0.1233 d4.loss_iou: 0.2363 enc_loss_cls: 0.3308 enc_loss_bbox: 0.1510 enc_loss_iou: 0.2662 dn_loss_cls: 0.0829 dn_loss_bbox: 0.1448 dn_loss_iou: 0.2022 d0.dn_loss_cls: 0.1801 d0.dn_loss_bbox: 0.2946 d0.dn_loss_iou: 0.3457 d1.dn_loss_cls: 0.1206 d1.dn_loss_bbox: 0.1784 d1.dn_loss_iou: 0.2340 d2.dn_loss_cls: 0.0979 d2.dn_loss_bbox: 0.1536 d2.dn_loss_iou: 0.2110 d3.dn_loss_cls: 0.0901 d3.dn_loss_bbox: 0.1469 d3.dn_loss_iou: 0.2044 d4.dn_loss_cls: 0.0827 d4.dn_loss_bbox: 0.1448 d4.dn_loss_iou: 0.2022 d1.loss_lmm_region: 0.1161 loss_lmm_image: 0.7601 2024/11/13 05:45:08 - mmengine - INFO - Iter(train) [115700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:07:00 time: 2.0080 data_time: 0.0185 memory: 35547 grad_norm: 27.1177 loss: 9.1355 loss_cls: 0.2645 loss_bbox: 0.1647 loss_iou: 0.2625 d0.loss_cls: 0.3001 d0.loss_bbox: 0.1777 d0.loss_iou: 0.2812 d1.loss_cls: 0.2818 d1.loss_bbox: 0.1675 d1.loss_iou: 0.2702 d2.loss_cls: 0.2729 d2.loss_bbox: 0.1648 d2.loss_iou: 0.2620 d3.loss_cls: 0.2656 d3.loss_bbox: 0.1641 d3.loss_iou: 0.2631 d4.loss_cls: 0.2639 d4.loss_bbox: 0.1660 d4.loss_iou: 0.2636 enc_loss_cls: 0.3123 enc_loss_bbox: 0.1890 enc_loss_iou: 0.2920 dn_loss_cls: 0.0722 dn_loss_bbox: 0.1783 dn_loss_iou: 0.2063 d0.dn_loss_cls: 0.1523 d0.dn_loss_bbox: 0.3111 d0.dn_loss_iou: 0.3270 d1.dn_loss_cls: 0.1057 d1.dn_loss_bbox: 0.2090 d1.dn_loss_iou: 0.2314 d2.dn_loss_cls: 0.0863 d2.dn_loss_bbox: 0.1873 d2.dn_loss_iou: 0.2133 d3.dn_loss_cls: 0.0786 d3.dn_loss_bbox: 0.1800 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.0733 d4.dn_loss_bbox: 0.1784 d4.dn_loss_iou: 0.2062 d1.loss_lmm_region: 0.1066 loss_lmm_image: 0.7749 2024/11/13 05:48:30 - mmengine - INFO - Iter(train) 
[115800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:03:40 time: 2.0164 data_time: 0.0186 memory: 33358 grad_norm: 31.4951 loss: 8.9563 loss_cls: 0.2801 loss_bbox: 0.1450 loss_iou: 0.2389 d0.loss_cls: 0.3215 d0.loss_bbox: 0.1598 d0.loss_iou: 0.2573 d1.loss_cls: 0.2987 d1.loss_bbox: 0.1467 d1.loss_iou: 0.2432 d2.loss_cls: 0.2867 d2.loss_bbox: 0.1469 d2.loss_iou: 0.2440 d3.loss_cls: 0.2854 d3.loss_bbox: 0.1448 d3.loss_iou: 0.2421 d4.loss_cls: 0.2804 d4.loss_bbox: 0.1451 d4.loss_iou: 0.2395 enc_loss_cls: 0.3338 enc_loss_bbox: 0.1705 enc_loss_iou: 0.2694 dn_loss_cls: 0.0778 dn_loss_bbox: 0.1612 dn_loss_iou: 0.2046 d0.dn_loss_cls: 0.1783 d0.dn_loss_bbox: 0.3213 d0.dn_loss_iou: 0.3466 d1.dn_loss_cls: 0.1145 d1.dn_loss_bbox: 0.1929 d1.dn_loss_iou: 0.2365 d2.dn_loss_cls: 0.0919 d2.dn_loss_bbox: 0.1725 d2.dn_loss_iou: 0.2150 d3.dn_loss_cls: 0.0821 d3.dn_loss_bbox: 0.1639 d3.dn_loss_iou: 0.2073 d4.dn_loss_cls: 0.0771 d4.dn_loss_bbox: 0.1612 d4.dn_loss_iou: 0.2048 d1.loss_lmm_region: 0.1196 loss_lmm_image: 0.7470 2024/11/13 05:51:48 - mmengine - INFO - Iter(train) [115900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 19:00:19 time: 1.9838 data_time: 0.0187 memory: 34824 grad_norm: 28.0482 loss: 9.3156 loss_cls: 0.2946 loss_bbox: 0.1493 loss_iou: 0.2589 d0.loss_cls: 0.3357 d0.loss_bbox: 0.1556 d0.loss_iou: 0.2645 d1.loss_cls: 0.3088 d1.loss_bbox: 0.1550 d1.loss_iou: 0.2640 d2.loss_cls: 0.3019 d2.loss_bbox: 0.1505 d2.loss_iou: 0.2593 d3.loss_cls: 0.2971 d3.loss_bbox: 0.1515 d3.loss_iou: 0.2588 d4.loss_cls: 0.2950 d4.loss_bbox: 0.1506 d4.loss_iou: 0.2590 enc_loss_cls: 0.3448 enc_loss_bbox: 0.1646 enc_loss_iou: 0.2787 dn_loss_cls: 0.0907 dn_loss_bbox: 0.1653 dn_loss_iou: 0.2292 d0.dn_loss_cls: 0.1733 d0.dn_loss_bbox: 0.3031 d0.dn_loss_iou: 0.3640 d1.dn_loss_cls: 0.1178 d1.dn_loss_bbox: 0.1924 d1.dn_loss_iou: 0.2564 d2.dn_loss_cls: 0.0994 d2.dn_loss_bbox: 0.1752 d2.dn_loss_iou: 0.2378 d3.dn_loss_cls: 0.0943 d3.dn_loss_bbox: 0.1675 d3.dn_loss_iou: 0.2316 d4.dn_loss_cls: 0.0912 d4.dn_loss_bbox: 0.1652 d4.dn_loss_iou: 0.2292 d1.loss_lmm_region: 0.1074 loss_lmm_image: 0.7263 2024/11/13 05:55:08 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 05:55:08 - mmengine - INFO - Iter(train) [116000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:56:58 time: 2.0136 data_time: 0.0187 memory: 33647 grad_norm: 25.7874 loss: 9.3311 loss_cls: 0.3164 loss_bbox: 0.1514 loss_iou: 0.2860 d0.loss_cls: 0.3721 d0.loss_bbox: 0.1606 d0.loss_iou: 0.3015 d1.loss_cls: 0.3455 d1.loss_bbox: 0.1537 d1.loss_iou: 0.2906 d2.loss_cls: 0.3322 d2.loss_bbox: 0.1512 d2.loss_iou: 0.2869 d3.loss_cls: 0.3202 d3.loss_bbox: 0.1499 d3.loss_iou: 0.2863 d4.loss_cls: 0.3183 d4.loss_bbox: 0.1517 d4.loss_iou: 0.2853 enc_loss_cls: 0.3764 enc_loss_bbox: 0.1770 enc_loss_iou: 0.3194 dn_loss_cls: 0.0754 dn_loss_bbox: 0.1360 dn_loss_iou: 0.1957 d0.dn_loss_cls: 0.1593 d0.dn_loss_bbox: 0.2596 d0.dn_loss_iou: 0.3238 d1.dn_loss_cls: 0.1079 d1.dn_loss_bbox: 0.1619 d1.dn_loss_iou: 0.2207 d2.dn_loss_cls: 0.0868 d2.dn_loss_bbox: 0.1441 d2.dn_loss_iou: 0.2037 d3.dn_loss_cls: 0.0791 d3.dn_loss_bbox: 0.1381 d3.dn_loss_iou: 0.1982 d4.dn_loss_cls: 0.0762 d4.dn_loss_bbox: 0.1360 d4.dn_loss_iou: 0.1957 d1.loss_lmm_region: 0.1031 loss_lmm_image: 0.7973 2024/11/13 05:58:27 - mmengine - INFO - Iter(train) [116100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:53:37 time: 1.9882 data_time: 0.0187 memory: 34707 grad_norm: 22.6155 loss: 9.0283 loss_cls: 0.2798 loss_bbox: 0.1410 loss_iou: 0.2326 d0.loss_cls: 0.3232 d0.loss_bbox: 0.1468 
d0.loss_iou: 0.2440 d1.loss_cls: 0.2980 d1.loss_bbox: 0.1455 d1.loss_iou: 0.2381 d2.loss_cls: 0.2922 d2.loss_bbox: 0.1419 d2.loss_iou: 0.2352 d3.loss_cls: 0.2870 d3.loss_bbox: 0.1402 d3.loss_iou: 0.2346 d4.loss_cls: 0.2805 d4.loss_bbox: 0.1414 d4.loss_iou: 0.2339 enc_loss_cls: 0.3270 enc_loss_bbox: 0.1538 enc_loss_iou: 0.2565 dn_loss_cls: 0.0937 dn_loss_bbox: 0.1707 dn_loss_iou: 0.2179 d0.dn_loss_cls: 0.1721 d0.dn_loss_bbox: 0.3147 d0.dn_loss_iou: 0.3493 d1.dn_loss_cls: 0.1232 d1.dn_loss_bbox: 0.1996 d1.dn_loss_iou: 0.2442 d2.dn_loss_cls: 0.1074 d2.dn_loss_bbox: 0.1804 d2.dn_loss_iou: 0.2265 d3.dn_loss_cls: 0.0983 d3.dn_loss_bbox: 0.1717 d3.dn_loss_iou: 0.2196 d4.dn_loss_cls: 0.0953 d4.dn_loss_bbox: 0.1707 d4.dn_loss_iou: 0.2180 d1.loss_lmm_region: 0.1009 loss_lmm_image: 0.7811 2024/11/13 06:01:46 - mmengine - INFO - Iter(train) [116200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:50:16 time: 1.9983 data_time: 0.0186 memory: 32417 grad_norm: 36.4918 loss: 8.6035 loss_cls: 0.2550 loss_bbox: 0.1392 loss_iou: 0.2202 d0.loss_cls: 0.2907 d0.loss_bbox: 0.1493 d0.loss_iou: 0.2287 d1.loss_cls: 0.2747 d1.loss_bbox: 0.1396 d1.loss_iou: 0.2224 d2.loss_cls: 0.2652 d2.loss_bbox: 0.1366 d2.loss_iou: 0.2171 d3.loss_cls: 0.2639 d3.loss_bbox: 0.1362 d3.loss_iou: 0.2173 d4.loss_cls: 0.2584 d4.loss_bbox: 0.1381 d4.loss_iou: 0.2190 enc_loss_cls: 0.3039 enc_loss_bbox: 0.1540 enc_loss_iou: 0.2435 dn_loss_cls: 0.1023 dn_loss_bbox: 0.1703 dn_loss_iou: 0.1987 d0.dn_loss_cls: 0.1828 d0.dn_loss_bbox: 0.2971 d0.dn_loss_iou: 0.3261 d1.dn_loss_cls: 0.1319 d1.dn_loss_bbox: 0.1973 d1.dn_loss_iou: 0.2251 d2.dn_loss_cls: 0.1140 d2.dn_loss_bbox: 0.1775 d2.dn_loss_iou: 0.2067 d3.dn_loss_cls: 0.1086 d3.dn_loss_bbox: 0.1718 d3.dn_loss_iou: 0.2004 d4.dn_loss_cls: 0.1034 d4.dn_loss_bbox: 0.1702 d4.dn_loss_iou: 0.1987 d1.loss_lmm_region: 0.1267 loss_lmm_image: 0.7209 2024/11/13 06:05:04 - mmengine - INFO - Iter(train) [116300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:46:54 time: 1.9761 data_time: 0.0186 memory: 34149 grad_norm: 25.1518 loss: 9.2077 loss_cls: 0.2899 loss_bbox: 0.1556 loss_iou: 0.2697 d0.loss_cls: 0.3402 d0.loss_bbox: 0.1597 d0.loss_iou: 0.2798 d1.loss_cls: 0.3124 d1.loss_bbox: 0.1586 d1.loss_iou: 0.2765 d2.loss_cls: 0.3040 d2.loss_bbox: 0.1557 d2.loss_iou: 0.2718 d3.loss_cls: 0.2967 d3.loss_bbox: 0.1519 d3.loss_iou: 0.2701 d4.loss_cls: 0.2888 d4.loss_bbox: 0.1557 d4.loss_iou: 0.2727 enc_loss_cls: 0.3429 enc_loss_bbox: 0.1679 enc_loss_iou: 0.2923 dn_loss_cls: 0.0999 dn_loss_bbox: 0.1482 dn_loss_iou: 0.1947 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.2807 d0.dn_loss_iou: 0.3231 d1.dn_loss_cls: 0.1281 d1.dn_loss_bbox: 0.1735 d1.dn_loss_iou: 0.2206 d2.dn_loss_cls: 0.1090 d2.dn_loss_bbox: 0.1555 d2.dn_loss_iou: 0.2030 d3.dn_loss_cls: 0.1011 d3.dn_loss_bbox: 0.1511 d3.dn_loss_iou: 0.1974 d4.dn_loss_cls: 0.0987 d4.dn_loss_bbox: 0.1482 d4.dn_loss_iou: 0.1948 d1.loss_lmm_region: 0.1076 loss_lmm_image: 0.7764 2024/11/13 06:08:26 - mmengine - INFO - Iter(train) [116400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:43:34 time: 2.0336 data_time: 0.0186 memory: 34394 grad_norm: 26.6140 loss: 8.2904 loss_cls: 0.2667 loss_bbox: 0.1148 loss_iou: 0.2117 d0.loss_cls: 0.3095 d0.loss_bbox: 0.1325 d0.loss_iou: 0.2294 d1.loss_cls: 0.2939 d1.loss_bbox: 0.1174 d1.loss_iou: 0.2149 d2.loss_cls: 0.2777 d2.loss_bbox: 0.1178 d2.loss_iou: 0.2142 d3.loss_cls: 0.2695 d3.loss_bbox: 0.1160 d3.loss_iou: 0.2133 d4.loss_cls: 0.2669 d4.loss_bbox: 0.1164 d4.loss_iou: 0.2125 enc_loss_cls: 0.3168 enc_loss_bbox: 0.1406 enc_loss_iou: 
0.2417 dn_loss_cls: 0.0880 dn_loss_bbox: 0.1538 dn_loss_iou: 0.1943 d0.dn_loss_cls: 0.1654 d0.dn_loss_bbox: 0.2911 d0.dn_loss_iou: 0.3214 d1.dn_loss_cls: 0.1159 d1.dn_loss_bbox: 0.1824 d1.dn_loss_iou: 0.2219 d2.dn_loss_cls: 0.0992 d2.dn_loss_bbox: 0.1602 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.0925 d3.dn_loss_bbox: 0.1552 d3.dn_loss_iou: 0.1963 d4.dn_loss_cls: 0.0881 d4.dn_loss_bbox: 0.1539 d4.dn_loss_iou: 0.1943 d1.loss_lmm_region: 0.1151 loss_lmm_image: 0.7051 2024/11/13 06:11:46 - mmengine - INFO - Iter(train) [116500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:40:13 time: 1.9844 data_time: 0.0184 memory: 34214 grad_norm: 23.2167 loss: 8.6377 loss_cls: 0.2556 loss_bbox: 0.1343 loss_iou: 0.2307 d0.loss_cls: 0.2914 d0.loss_bbox: 0.1451 d0.loss_iou: 0.2477 d1.loss_cls: 0.2736 d1.loss_bbox: 0.1374 d1.loss_iou: 0.2325 d2.loss_cls: 0.2684 d2.loss_bbox: 0.1339 d2.loss_iou: 0.2286 d3.loss_cls: 0.2618 d3.loss_bbox: 0.1350 d3.loss_iou: 0.2307 d4.loss_cls: 0.2573 d4.loss_bbox: 0.1351 d4.loss_iou: 0.2300 enc_loss_cls: 0.3059 enc_loss_bbox: 0.1554 enc_loss_iou: 0.2596 dn_loss_cls: 0.0938 dn_loss_bbox: 0.1491 dn_loss_iou: 0.2109 d0.dn_loss_cls: 0.1773 d0.dn_loss_bbox: 0.2940 d0.dn_loss_iou: 0.3498 d1.dn_loss_cls: 0.1211 d1.dn_loss_bbox: 0.1810 d1.dn_loss_iou: 0.2403 d2.dn_loss_cls: 0.1024 d2.dn_loss_bbox: 0.1590 d2.dn_loss_iou: 0.2200 d3.dn_loss_cls: 0.0980 d3.dn_loss_bbox: 0.1520 d3.dn_loss_iou: 0.2134 d4.dn_loss_cls: 0.0923 d4.dn_loss_bbox: 0.1491 d4.dn_loss_iou: 0.2109 d1.loss_lmm_region: 0.1097 loss_lmm_image: 0.7637 2024/11/13 06:15:05 - mmengine - INFO - Iter(train) [116600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:36:52 time: 1.9987 data_time: 0.0186 memory: 35607 grad_norm: 22.0468 loss: 8.6087 loss_cls: 0.2635 loss_bbox: 0.1305 loss_iou: 0.2492 d0.loss_cls: 0.3039 d0.loss_bbox: 0.1402 d0.loss_iou: 0.2596 d1.loss_cls: 0.2805 d1.loss_bbox: 0.1323 d1.loss_iou: 0.2561 d2.loss_cls: 0.2757 d2.loss_bbox: 0.1304 d2.loss_iou: 0.2487 d3.loss_cls: 0.2664 d3.loss_bbox: 0.1323 d3.loss_iou: 0.2501 d4.loss_cls: 0.2665 d4.loss_bbox: 0.1296 d4.loss_iou: 0.2488 enc_loss_cls: 0.3111 enc_loss_bbox: 0.1517 enc_loss_iou: 0.2767 dn_loss_cls: 0.0801 dn_loss_bbox: 0.1454 dn_loss_iou: 0.2052 d0.dn_loss_cls: 0.1673 d0.dn_loss_bbox: 0.2756 d0.dn_loss_iou: 0.3325 d1.dn_loss_cls: 0.1126 d1.dn_loss_bbox: 0.1699 d1.dn_loss_iou: 0.2313 d2.dn_loss_cls: 0.0942 d2.dn_loss_bbox: 0.1526 d2.dn_loss_iou: 0.2138 d3.dn_loss_cls: 0.0849 d3.dn_loss_bbox: 0.1469 d3.dn_loss_iou: 0.2071 d4.dn_loss_cls: 0.0807 d4.dn_loss_bbox: 0.1454 d4.dn_loss_iou: 0.2052 d1.loss_lmm_region: 0.1041 loss_lmm_image: 0.7503 2024/11/13 06:18:25 - mmengine - INFO - Iter(train) [116700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:33:31 time: 2.0068 data_time: 0.0186 memory: 33692 grad_norm: 26.7660 loss: 8.0113 loss_cls: 0.2287 loss_bbox: 0.1301 loss_iou: 0.2043 d0.loss_cls: 0.2589 d0.loss_bbox: 0.1424 d0.loss_iou: 0.2175 d1.loss_cls: 0.2437 d1.loss_bbox: 0.1318 d1.loss_iou: 0.2072 d2.loss_cls: 0.2370 d2.loss_bbox: 0.1280 d2.loss_iou: 0.2030 d3.loss_cls: 0.2334 d3.loss_bbox: 0.1279 d3.loss_iou: 0.2030 d4.loss_cls: 0.2301 d4.loss_bbox: 0.1298 d4.loss_iou: 0.2038 enc_loss_cls: 0.2718 enc_loss_bbox: 0.1521 enc_loss_iou: 0.2314 dn_loss_cls: 0.0645 dn_loss_bbox: 0.1686 dn_loss_iou: 0.1940 d0.dn_loss_cls: 0.1397 d0.dn_loss_bbox: 0.3052 d0.dn_loss_iou: 0.3190 d1.dn_loss_cls: 0.0918 d1.dn_loss_bbox: 0.1978 d1.dn_loss_iou: 0.2205 d2.dn_loss_cls: 0.0728 d2.dn_loss_bbox: 0.1772 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.0680 d3.dn_loss_bbox: 0.1708 
d3.dn_loss_iou: 0.1960 d4.dn_loss_cls: 0.0648 d4.dn_loss_bbox: 0.1685 d4.dn_loss_iou: 0.1940 d1.loss_lmm_region: 0.0998 loss_lmm_image: 0.7803 2024/11/13 06:21:46 - mmengine - INFO - Iter(train) [116800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:30:11 time: 2.0168 data_time: 0.0186 memory: 33973 grad_norm: 24.8127 loss: 8.4724 loss_cls: 0.2385 loss_bbox: 0.1274 loss_iou: 0.2458 d0.loss_cls: 0.2708 d0.loss_bbox: 0.1329 d0.loss_iou: 0.2484 d1.loss_cls: 0.2513 d1.loss_bbox: 0.1282 d1.loss_iou: 0.2467 d2.loss_cls: 0.2462 d2.loss_bbox: 0.1275 d2.loss_iou: 0.2460 d3.loss_cls: 0.2431 d3.loss_bbox: 0.1286 d3.loss_iou: 0.2453 d4.loss_cls: 0.2392 d4.loss_bbox: 0.1287 d4.loss_iou: 0.2461 enc_loss_cls: 0.2765 enc_loss_bbox: 0.1441 enc_loss_iou: 0.2615 dn_loss_cls: 0.0726 dn_loss_bbox: 0.1587 dn_loss_iou: 0.2140 d0.dn_loss_cls: 0.1555 d0.dn_loss_bbox: 0.3080 d0.dn_loss_iou: 0.3466 d1.dn_loss_cls: 0.1075 d1.dn_loss_bbox: 0.1897 d1.dn_loss_iou: 0.2417 d2.dn_loss_cls: 0.0853 d2.dn_loss_bbox: 0.1673 d2.dn_loss_iou: 0.2219 d3.dn_loss_cls: 0.0779 d3.dn_loss_bbox: 0.1608 d3.dn_loss_iou: 0.2162 d4.dn_loss_cls: 0.0738 d4.dn_loss_bbox: 0.1587 d4.dn_loss_iou: 0.2140 d1.loss_lmm_region: 0.1205 loss_lmm_image: 0.7590 2024/11/13 06:25:04 - mmengine - INFO - Iter(train) [116900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:26:49 time: 1.9616 data_time: 0.0186 memory: 33276 grad_norm: 27.6239 loss: 7.2503 loss_cls: 0.2083 loss_bbox: 0.1187 loss_iou: 0.1803 d0.loss_cls: 0.2373 d0.loss_bbox: 0.1268 d0.loss_iou: 0.1944 d1.loss_cls: 0.2245 d1.loss_bbox: 0.1192 d1.loss_iou: 0.1869 d2.loss_cls: 0.2167 d2.loss_bbox: 0.1166 d2.loss_iou: 0.1816 d3.loss_cls: 0.2104 d3.loss_bbox: 0.1180 d3.loss_iou: 0.1810 d4.loss_cls: 0.2084 d4.loss_bbox: 0.1172 d4.loss_iou: 0.1790 enc_loss_cls: 0.2455 enc_loss_bbox: 0.1348 enc_loss_iou: 0.2075 dn_loss_cls: 0.0670 dn_loss_bbox: 0.1335 dn_loss_iou: 0.1798 d0.dn_loss_cls: 0.1329 d0.dn_loss_bbox: 0.2598 d0.dn_loss_iou: 0.3050 d1.dn_loss_cls: 0.0904 d1.dn_loss_bbox: 0.1569 d1.dn_loss_iou: 0.2009 d2.dn_loss_cls: 0.0763 d2.dn_loss_bbox: 0.1402 d2.dn_loss_iou: 0.1873 d3.dn_loss_cls: 0.0713 d3.dn_loss_bbox: 0.1345 d3.dn_loss_iou: 0.1813 d4.dn_loss_cls: 0.0678 d4.dn_loss_bbox: 0.1336 d4.dn_loss_iou: 0.1799 d1.loss_lmm_region: 0.0707 loss_lmm_image: 0.7677 2024/11/13 06:28:24 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 06:28:24 - mmengine - INFO - Iter(train) [117000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:23:29 time: 2.0193 data_time: 0.0186 memory: 34668 grad_norm: 24.7302 loss: 8.3049 loss_cls: 0.2653 loss_bbox: 0.1184 loss_iou: 0.2038 d0.loss_cls: 0.2973 d0.loss_bbox: 0.1253 d0.loss_iou: 0.2145 d1.loss_cls: 0.2807 d1.loss_bbox: 0.1154 d1.loss_iou: 0.2050 d2.loss_cls: 0.2752 d2.loss_bbox: 0.1156 d2.loss_iou: 0.2024 d3.loss_cls: 0.2657 d3.loss_bbox: 0.1192 d3.loss_iou: 0.2055 d4.loss_cls: 0.2614 d4.loss_bbox: 0.1226 d4.loss_iou: 0.2070 enc_loss_cls: 0.3022 enc_loss_bbox: 0.1391 enc_loss_iou: 0.2323 dn_loss_cls: 0.1066 dn_loss_bbox: 0.1609 dn_loss_iou: 0.1873 d0.dn_loss_cls: 0.1825 d0.dn_loss_bbox: 0.3042 d0.dn_loss_iou: 0.3104 d1.dn_loss_cls: 0.1400 d1.dn_loss_bbox: 0.1938 d1.dn_loss_iou: 0.2126 d2.dn_loss_cls: 0.1195 d2.dn_loss_bbox: 0.1722 d2.dn_loss_iou: 0.1957 d3.dn_loss_cls: 0.1116 d3.dn_loss_bbox: 0.1640 d3.dn_loss_iou: 0.1894 d4.dn_loss_cls: 0.1096 d4.dn_loss_bbox: 0.1609 d4.dn_loss_iou: 0.1873 d1.loss_lmm_region: 0.1037 loss_lmm_image: 0.7188 2024/11/13 06:31:42 - mmengine - INFO - Iter(train) [117100/150000] base_lr: 1.0000e-05 lr: 
1.0000e-05 eta: 18:20:07 time: 1.9663 data_time: 0.0186 memory: 33287 grad_norm: 25.6254 loss: 7.8012 loss_cls: 0.2265 loss_bbox: 0.1091 loss_iou: 0.2029 d0.loss_cls: 0.2594 d0.loss_bbox: 0.1179 d0.loss_iou: 0.2123 d1.loss_cls: 0.2390 d1.loss_bbox: 0.1151 d1.loss_iou: 0.2080 d2.loss_cls: 0.2322 d2.loss_bbox: 0.1155 d2.loss_iou: 0.2058 d3.loss_cls: 0.2267 d3.loss_bbox: 0.1130 d3.loss_iou: 0.2041 d4.loss_cls: 0.2264 d4.loss_bbox: 0.1108 d4.loss_iou: 0.2031 enc_loss_cls: 0.2646 enc_loss_bbox: 0.1313 enc_loss_iou: 0.2313 dn_loss_cls: 0.0922 dn_loss_bbox: 0.1482 dn_loss_iou: 0.1823 d0.dn_loss_cls: 0.1700 d0.dn_loss_bbox: 0.2826 d0.dn_loss_iou: 0.3082 d1.dn_loss_cls: 0.1201 d1.dn_loss_bbox: 0.1774 d1.dn_loss_iou: 0.2091 d2.dn_loss_cls: 0.1032 d2.dn_loss_bbox: 0.1577 d2.dn_loss_iou: 0.1915 d3.dn_loss_cls: 0.0962 d3.dn_loss_bbox: 0.1504 d3.dn_loss_iou: 0.1847 d4.dn_loss_cls: 0.0924 d4.dn_loss_bbox: 0.1481 d4.dn_loss_iou: 0.1822 d1.loss_lmm_region: 0.1047 loss_lmm_image: 0.7450 2024/11/13 06:35:04 - mmengine - INFO - Iter(train) [117200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:16:47 time: 2.0222 data_time: 0.0184 memory: 34901 grad_norm: 23.4169 loss: 7.3845 loss_cls: 0.2173 loss_bbox: 0.1045 loss_iou: 0.1926 d0.loss_cls: 0.2441 d0.loss_bbox: 0.1154 d0.loss_iou: 0.2041 d1.loss_cls: 0.2302 d1.loss_bbox: 0.1095 d1.loss_iou: 0.2017 d2.loss_cls: 0.2236 d2.loss_bbox: 0.1086 d2.loss_iou: 0.1949 d3.loss_cls: 0.2165 d3.loss_bbox: 0.1065 d3.loss_iou: 0.1949 d4.loss_cls: 0.2178 d4.loss_bbox: 0.1049 d4.loss_iou: 0.1925 enc_loss_cls: 0.2493 enc_loss_bbox: 0.1232 enc_loss_iou: 0.2193 dn_loss_cls: 0.0739 dn_loss_bbox: 0.1389 dn_loss_iou: 0.1874 d0.dn_loss_cls: 0.1520 d0.dn_loss_bbox: 0.2695 d0.dn_loss_iou: 0.3095 d1.dn_loss_cls: 0.0991 d1.dn_loss_bbox: 0.1723 d1.dn_loss_iou: 0.2139 d2.dn_loss_cls: 0.0848 d2.dn_loss_bbox: 0.1510 d2.dn_loss_iou: 0.1969 d3.dn_loss_cls: 0.0774 d3.dn_loss_bbox: 0.1422 d3.dn_loss_iou: 0.1898 d4.dn_loss_cls: 0.0742 d4.dn_loss_bbox: 0.1389 d4.dn_loss_iou: 0.1875 d1.loss_lmm_region: 0.0865 loss_lmm_image: 0.6670 2024/11/13 06:38:22 - mmengine - INFO - Iter(train) [117300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:13:25 time: 1.9543 data_time: 0.0184 memory: 34544 grad_norm: 24.2289 loss: 8.8966 loss_cls: 0.2540 loss_bbox: 0.1567 loss_iou: 0.2377 d0.loss_cls: 0.2918 d0.loss_bbox: 0.1593 d0.loss_iou: 0.2440 d1.loss_cls: 0.2721 d1.loss_bbox: 0.1575 d1.loss_iou: 0.2386 d2.loss_cls: 0.2641 d2.loss_bbox: 0.1555 d2.loss_iou: 0.2374 d3.loss_cls: 0.2589 d3.loss_bbox: 0.1530 d3.loss_iou: 0.2354 d4.loss_cls: 0.2566 d4.loss_bbox: 0.1547 d4.loss_iou: 0.2365 enc_loss_cls: 0.2986 enc_loss_bbox: 0.1717 enc_loss_iou: 0.2566 dn_loss_cls: 0.1068 dn_loss_bbox: 0.1679 dn_loss_iou: 0.2013 d0.dn_loss_cls: 0.1811 d0.dn_loss_bbox: 0.3004 d0.dn_loss_iou: 0.3248 d1.dn_loss_cls: 0.1318 d1.dn_loss_bbox: 0.1953 d1.dn_loss_iou: 0.2256 d2.dn_loss_cls: 0.1152 d2.dn_loss_bbox: 0.1762 d2.dn_loss_iou: 0.2095 d3.dn_loss_cls: 0.1115 d3.dn_loss_bbox: 0.1684 d3.dn_loss_iou: 0.2032 d4.dn_loss_cls: 0.1083 d4.dn_loss_bbox: 0.1679 d4.dn_loss_iou: 0.2013 d1.loss_lmm_region: 0.1255 loss_lmm_image: 0.7837 2024/11/13 06:41:44 - mmengine - INFO - Iter(train) [117400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:10:05 time: 2.0167 data_time: 0.0186 memory: 34085 grad_norm: 30.1616 loss: 8.5468 loss_cls: 0.2265 loss_bbox: 0.1181 loss_iou: 0.2291 d0.loss_cls: 0.2570 d0.loss_bbox: 0.1302 d0.loss_iou: 0.2384 d1.loss_cls: 0.2438 d1.loss_bbox: 0.1214 d1.loss_iou: 0.2282 d2.loss_cls: 0.2370 d2.loss_bbox: 0.1202 
d2.loss_iou: 0.2286 d3.loss_cls: 0.2300 d3.loss_bbox: 0.1199 d3.loss_iou: 0.2299 d4.loss_cls: 0.2285 d4.loss_bbox: 0.1194 d4.loss_iou: 0.2300 enc_loss_cls: 0.2680 enc_loss_bbox: 0.1404 enc_loss_iou: 0.2517 dn_loss_cls: 0.1274 dn_loss_bbox: 0.1633 dn_loss_iou: 0.2098 d0.dn_loss_cls: 0.1951 d0.dn_loss_bbox: 0.3101 d0.dn_loss_iou: 0.3438 d1.dn_loss_cls: 0.1541 d1.dn_loss_bbox: 0.1974 d1.dn_loss_iou: 0.2394 d2.dn_loss_cls: 0.1327 d2.dn_loss_bbox: 0.1736 d2.dn_loss_iou: 0.2193 d3.dn_loss_cls: 0.1338 d3.dn_loss_bbox: 0.1656 d3.dn_loss_iou: 0.2122 d4.dn_loss_cls: 0.1305 d4.dn_loss_bbox: 0.1633 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.1079 loss_lmm_image: 0.7615 2024/11/13 06:45:02 - mmengine - INFO - Iter(train) [117500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:06:44 time: 1.9945 data_time: 0.0185 memory: 33231 grad_norm: 33.2912 loss: 8.5356 loss_cls: 0.2888 loss_bbox: 0.1151 loss_iou: 0.2050 d0.loss_cls: 0.3342 d0.loss_bbox: 0.1287 d0.loss_iou: 0.2231 d1.loss_cls: 0.3076 d1.loss_bbox: 0.1258 d1.loss_iou: 0.2127 d2.loss_cls: 0.3032 d2.loss_bbox: 0.1154 d2.loss_iou: 0.2062 d3.loss_cls: 0.2943 d3.loss_bbox: 0.1171 d3.loss_iou: 0.2062 d4.loss_cls: 0.2937 d4.loss_bbox: 0.1153 d4.loss_iou: 0.2052 enc_loss_cls: 0.3467 enc_loss_bbox: 0.1378 enc_loss_iou: 0.2382 dn_loss_cls: 0.0899 dn_loss_bbox: 0.1518 dn_loss_iou: 0.1892 d0.dn_loss_cls: 0.1747 d0.dn_loss_bbox: 0.3019 d0.dn_loss_iou: 0.3242 d1.dn_loss_cls: 0.1236 d1.dn_loss_bbox: 0.1878 d1.dn_loss_iou: 0.2196 d2.dn_loss_cls: 0.1039 d2.dn_loss_bbox: 0.1612 d2.dn_loss_iou: 0.1979 d3.dn_loss_cls: 0.0958 d3.dn_loss_bbox: 0.1536 d3.dn_loss_iou: 0.1916 d4.dn_loss_cls: 0.0907 d4.dn_loss_bbox: 0.1519 d4.dn_loss_iou: 0.1893 d1.loss_lmm_region: 0.1300 loss_lmm_image: 0.7868 2024/11/13 06:48:20 - mmengine - INFO - Iter(train) [117600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:03:22 time: 1.9821 data_time: 0.0185 memory: 34941 grad_norm: 37.6937 loss: 9.2446 loss_cls: 0.2920 loss_bbox: 0.1425 loss_iou: 0.2384 d0.loss_cls: 0.3394 d0.loss_bbox: 0.1560 d0.loss_iou: 0.2521 d1.loss_cls: 0.3158 d1.loss_bbox: 0.1462 d1.loss_iou: 0.2460 d2.loss_cls: 0.3060 d2.loss_bbox: 0.1432 d2.loss_iou: 0.2401 d3.loss_cls: 0.3014 d3.loss_bbox: 0.1460 d3.loss_iou: 0.2406 d4.loss_cls: 0.2970 d4.loss_bbox: 0.1425 d4.loss_iou: 0.2387 enc_loss_cls: 0.3472 enc_loss_bbox: 0.1648 enc_loss_iou: 0.2670 dn_loss_cls: 0.1141 dn_loss_bbox: 0.1645 dn_loss_iou: 0.2109 d0.dn_loss_cls: 0.2022 d0.dn_loss_bbox: 0.2991 d0.dn_loss_iou: 0.3400 d1.dn_loss_cls: 0.1454 d1.dn_loss_bbox: 0.1923 d1.dn_loss_iou: 0.2367 d2.dn_loss_cls: 0.1252 d2.dn_loss_bbox: 0.1703 d2.dn_loss_iou: 0.2180 d3.dn_loss_cls: 0.1183 d3.dn_loss_bbox: 0.1650 d3.dn_loss_iou: 0.2123 d4.dn_loss_cls: 0.1135 d4.dn_loss_bbox: 0.1645 d4.dn_loss_iou: 0.2109 d1.loss_lmm_region: 0.1140 loss_lmm_image: 0.7647 2024/11/13 06:51:40 - mmengine - INFO - Iter(train) [117700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 18:00:02 time: 2.0056 data_time: 0.0187 memory: 34757 grad_norm: 25.7722 loss: 10.0997 loss_cls: 0.3142 loss_bbox: 0.1556 loss_iou: 0.2927 d0.loss_cls: 0.3629 d0.loss_bbox: 0.1650 d0.loss_iou: 0.3024 d1.loss_cls: 0.3389 d1.loss_bbox: 0.1620 d1.loss_iou: 0.2968 d2.loss_cls: 0.3304 d2.loss_bbox: 0.1583 d2.loss_iou: 0.2936 d3.loss_cls: 0.3195 d3.loss_bbox: 0.1573 d3.loss_iou: 0.2922 d4.loss_cls: 0.3134 d4.loss_bbox: 0.1567 d4.loss_iou: 0.2947 enc_loss_cls: 0.3704 enc_loss_bbox: 0.1763 enc_loss_iou: 0.3142 dn_loss_cls: 0.1139 dn_loss_bbox: 0.1736 dn_loss_iou: 0.2398 d0.dn_loss_cls: 0.2013 d0.dn_loss_bbox: 0.3192 
d0.dn_loss_iou: 0.3788 d1.dn_loss_cls: 0.1470 d1.dn_loss_bbox: 0.2098 d1.dn_loss_iou: 0.2712 d2.dn_loss_cls: 0.1289 d2.dn_loss_bbox: 0.1834 d2.dn_loss_iou: 0.2488 d3.dn_loss_cls: 0.1202 d3.dn_loss_bbox: 0.1751 d3.dn_loss_iou: 0.2418 d4.dn_loss_cls: 0.1142 d4.dn_loss_bbox: 0.1736 d4.dn_loss_iou: 0.2399 d1.loss_lmm_region: 0.1270 loss_lmm_image: 0.7249 2024/11/13 06:55:00 - mmengine - INFO - Iter(train) [117800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:56:41 time: 1.9982 data_time: 0.0187 memory: 35240 grad_norm: 36.6428 loss: 8.9597 loss_cls: 0.2772 loss_bbox: 0.1135 loss_iou: 0.2211 d0.loss_cls: 0.3148 d0.loss_bbox: 0.1265 d0.loss_iou: 0.2369 d1.loss_cls: 0.2922 d1.loss_bbox: 0.1188 d1.loss_iou: 0.2249 d2.loss_cls: 0.2833 d2.loss_bbox: 0.1154 d2.loss_iou: 0.2218 d3.loss_cls: 0.2814 d3.loss_bbox: 0.1132 d3.loss_iou: 0.2198 d4.loss_cls: 0.2777 d4.loss_bbox: 0.1153 d4.loss_iou: 0.2203 enc_loss_cls: 0.3312 enc_loss_bbox: 0.1322 enc_loss_iou: 0.2489 dn_loss_cls: 0.1983 dn_loss_bbox: 0.1357 dn_loss_iou: 0.1929 d0.dn_loss_cls: 0.2464 d0.dn_loss_bbox: 0.2833 d0.dn_loss_iou: 0.3217 d1.dn_loss_cls: 0.2211 d1.dn_loss_bbox: 0.1692 d1.dn_loss_iou: 0.2202 d2.dn_loss_cls: 0.2016 d2.dn_loss_bbox: 0.1468 d2.dn_loss_iou: 0.2013 d3.dn_loss_cls: 0.1959 d3.dn_loss_bbox: 0.1377 d3.dn_loss_iou: 0.1951 d4.dn_loss_cls: 0.1957 d4.dn_loss_bbox: 0.1357 d4.dn_loss_iou: 0.1929 d1.loss_lmm_region: 0.1075 loss_lmm_image: 0.7746 2024/11/13 06:58:20 - mmengine - INFO - Iter(train) [117900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:53:20 time: 1.9654 data_time: 0.0186 memory: 34306 grad_norm: 34.5393 loss: 9.2281 loss_cls: 0.2880 loss_bbox: 0.1391 loss_iou: 0.2653 d0.loss_cls: 0.3243 d0.loss_bbox: 0.1504 d0.loss_iou: 0.2759 d1.loss_cls: 0.3044 d1.loss_bbox: 0.1418 d1.loss_iou: 0.2706 d2.loss_cls: 0.2980 d2.loss_bbox: 0.1399 d2.loss_iou: 0.2650 d3.loss_cls: 0.2920 d3.loss_bbox: 0.1392 d3.loss_iou: 0.2650 d4.loss_cls: 0.2914 d4.loss_bbox: 0.1380 d4.loss_iou: 0.2645 enc_loss_cls: 0.3393 enc_loss_bbox: 0.1577 enc_loss_iou: 0.2914 dn_loss_cls: 0.0972 dn_loss_bbox: 0.1628 dn_loss_iou: 0.2205 d0.dn_loss_cls: 0.1719 d0.dn_loss_bbox: 0.3057 d0.dn_loss_iou: 0.3558 d1.dn_loss_cls: 0.1269 d1.dn_loss_bbox: 0.1912 d1.dn_loss_iou: 0.2486 d2.dn_loss_cls: 0.1042 d2.dn_loss_bbox: 0.1717 d2.dn_loss_iou: 0.2297 d3.dn_loss_cls: 0.1000 d3.dn_loss_bbox: 0.1640 d3.dn_loss_iou: 0.2224 d4.dn_loss_cls: 0.0968 d4.dn_loss_bbox: 0.1628 d4.dn_loss_iou: 0.2206 d1.loss_lmm_region: 0.1218 loss_lmm_image: 0.7123 2024/11/13 07:01:40 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 07:01:40 - mmengine - INFO - Iter(train) [118000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:49:59 time: 2.0267 data_time: 0.0185 memory: 35354 grad_norm: 28.8176 loss: 8.6571 loss_cls: 0.2680 loss_bbox: 0.1309 loss_iou: 0.2136 d0.loss_cls: 0.3019 d0.loss_bbox: 0.1441 d0.loss_iou: 0.2272 d1.loss_cls: 0.2825 d1.loss_bbox: 0.1361 d1.loss_iou: 0.2190 d2.loss_cls: 0.2756 d2.loss_bbox: 0.1298 d2.loss_iou: 0.2154 d3.loss_cls: 0.2704 d3.loss_bbox: 0.1309 d3.loss_iou: 0.2141 d4.loss_cls: 0.2641 d4.loss_bbox: 0.1330 d4.loss_iou: 0.2156 enc_loss_cls: 0.3114 enc_loss_bbox: 0.1500 enc_loss_iou: 0.2375 dn_loss_cls: 0.1124 dn_loss_bbox: 0.1648 dn_loss_iou: 0.1980 d0.dn_loss_cls: 0.1976 d0.dn_loss_bbox: 0.3217 d0.dn_loss_iou: 0.3329 d1.dn_loss_cls: 0.1430 d1.dn_loss_bbox: 0.2004 d1.dn_loss_iou: 0.2286 d2.dn_loss_cls: 0.1208 d2.dn_loss_bbox: 0.1763 d2.dn_loss_iou: 0.2078 d3.dn_loss_cls: 0.1162 d3.dn_loss_bbox: 0.1673 d3.dn_loss_iou: 0.2005 
d4.dn_loss_cls: 0.1123 d4.dn_loss_bbox: 0.1648 d4.dn_loss_iou: 0.1981 d1.loss_lmm_region: 0.1104 loss_lmm_image: 0.7120 2024/11/13 07:05:00 - mmengine - INFO - Iter(train) [118100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:46:38 time: 1.9916 data_time: 0.0185 memory: 34719 grad_norm: 44.8420 loss: 8.4970 loss_cls: 0.2644 loss_bbox: 0.1170 loss_iou: 0.2136 d0.loss_cls: 0.3026 d0.loss_bbox: 0.1279 d0.loss_iou: 0.2285 d1.loss_cls: 0.2848 d1.loss_bbox: 0.1225 d1.loss_iou: 0.2176 d2.loss_cls: 0.2731 d2.loss_bbox: 0.1175 d2.loss_iou: 0.2138 d3.loss_cls: 0.2681 d3.loss_bbox: 0.1170 d3.loss_iou: 0.2141 d4.loss_cls: 0.2658 d4.loss_bbox: 0.1181 d4.loss_iou: 0.2134 enc_loss_cls: 0.3144 enc_loss_bbox: 0.1402 enc_loss_iou: 0.2434 dn_loss_cls: 0.1075 dn_loss_bbox: 0.1542 dn_loss_iou: 0.2033 d0.dn_loss_cls: 0.1866 d0.dn_loss_bbox: 0.3006 d0.dn_loss_iou: 0.3354 d1.dn_loss_cls: 0.1309 d1.dn_loss_bbox: 0.1882 d1.dn_loss_iou: 0.2305 d2.dn_loss_cls: 0.1142 d2.dn_loss_bbox: 0.1659 d2.dn_loss_iou: 0.2118 d3.dn_loss_cls: 0.1102 d3.dn_loss_bbox: 0.1560 d3.dn_loss_iou: 0.2050 d4.dn_loss_cls: 0.1067 d4.dn_loss_bbox: 0.1542 d4.dn_loss_iou: 0.2033 d1.loss_lmm_region: 0.1092 loss_lmm_image: 0.7455 2024/11/13 07:08:19 - mmengine - INFO - Iter(train) [118200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:43:17 time: 2.0073 data_time: 0.0187 memory: 33808 grad_norm: 27.1791 loss: 8.9523 loss_cls: 0.2753 loss_bbox: 0.1427 loss_iou: 0.2505 d0.loss_cls: 0.3062 d0.loss_bbox: 0.1520 d0.loss_iou: 0.2599 d1.loss_cls: 0.2862 d1.loss_bbox: 0.1490 d1.loss_iou: 0.2551 d2.loss_cls: 0.2851 d2.loss_bbox: 0.1425 d2.loss_iou: 0.2519 d3.loss_cls: 0.2777 d3.loss_bbox: 0.1440 d3.loss_iou: 0.2510 d4.loss_cls: 0.2767 d4.loss_bbox: 0.1424 d4.loss_iou: 0.2504 enc_loss_cls: 0.3116 enc_loss_bbox: 0.1679 enc_loss_iou: 0.2797 dn_loss_cls: 0.0853 dn_loss_bbox: 0.1553 dn_loss_iou: 0.2267 d0.dn_loss_cls: 0.1597 d0.dn_loss_bbox: 0.2895 d0.dn_loss_iou: 0.3580 d1.dn_loss_cls: 0.1126 d1.dn_loss_bbox: 0.1835 d1.dn_loss_iou: 0.2529 d2.dn_loss_cls: 0.0976 d2.dn_loss_bbox: 0.1637 d2.dn_loss_iou: 0.2354 d3.dn_loss_cls: 0.0912 d3.dn_loss_bbox: 0.1562 d3.dn_loss_iou: 0.2283 d4.dn_loss_cls: 0.0883 d4.dn_loss_bbox: 0.1554 d4.dn_loss_iou: 0.2267 d1.loss_lmm_region: 0.1094 loss_lmm_image: 0.7186 2024/11/13 07:11:38 - mmengine - INFO - Iter(train) [118300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:39:56 time: 1.9923 data_time: 0.0185 memory: 34843 grad_norm: 27.5011 loss: 8.4967 loss_cls: 0.2409 loss_bbox: 0.1299 loss_iou: 0.2143 d0.loss_cls: 0.2694 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2267 d1.loss_cls: 0.2574 d1.loss_bbox: 0.1322 d1.loss_iou: 0.2178 d2.loss_cls: 0.2516 d2.loss_bbox: 0.1260 d2.loss_iou: 0.2118 d3.loss_cls: 0.2456 d3.loss_bbox: 0.1276 d3.loss_iou: 0.2146 d4.loss_cls: 0.2454 d4.loss_bbox: 0.1275 d4.loss_iou: 0.2129 enc_loss_cls: 0.2821 enc_loss_bbox: 0.1503 enc_loss_iou: 0.2421 dn_loss_cls: 0.0940 dn_loss_bbox: 0.1717 dn_loss_iou: 0.2107 d0.dn_loss_cls: 0.1738 d0.dn_loss_bbox: 0.3274 d0.dn_loss_iou: 0.3452 d1.dn_loss_cls: 0.1245 d1.dn_loss_bbox: 0.2015 d1.dn_loss_iou: 0.2356 d2.dn_loss_cls: 0.1049 d2.dn_loss_bbox: 0.1817 d2.dn_loss_iou: 0.2184 d3.dn_loss_cls: 0.0981 d3.dn_loss_bbox: 0.1740 d3.dn_loss_iou: 0.2129 d4.dn_loss_cls: 0.0951 d4.dn_loss_bbox: 0.1718 d4.dn_loss_iou: 0.2109 d1.loss_lmm_region: 0.1163 loss_lmm_image: 0.7564 2024/11/13 07:15:00 - mmengine - INFO - Iter(train) [118400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:36:36 time: 2.0136 data_time: 0.0186 memory: 35463 grad_norm: 26.5484 loss: 10.1034 loss_cls: 
0.3391 loss_bbox: 0.1804 loss_iou: 0.3115 d0.loss_cls: 0.3793 d0.loss_bbox: 0.2007 d0.loss_iou: 0.3267 d1.loss_cls: 0.3534 d1.loss_bbox: 0.1879 d1.loss_iou: 0.3154 d2.loss_cls: 0.3586 d2.loss_bbox: 0.1782 d2.loss_iou: 0.3085 d3.loss_cls: 0.3484 d3.loss_bbox: 0.1812 d3.loss_iou: 0.3106 d4.loss_cls: 0.3414 d4.loss_bbox: 0.1808 d4.loss_iou: 0.3108 enc_loss_cls: 0.3888 enc_loss_bbox: 0.2094 enc_loss_iou: 0.3444 dn_loss_cls: 0.0893 dn_loss_bbox: 0.1464 dn_loss_iou: 0.2080 d0.dn_loss_cls: 0.1726 d0.dn_loss_bbox: 0.2837 d0.dn_loss_iou: 0.3401 d1.dn_loss_cls: 0.1229 d1.dn_loss_bbox: 0.1728 d1.dn_loss_iou: 0.2353 d2.dn_loss_cls: 0.1024 d2.dn_loss_bbox: 0.1563 d2.dn_loss_iou: 0.2173 d3.dn_loss_cls: 0.0945 d3.dn_loss_bbox: 0.1485 d3.dn_loss_iou: 0.2105 d4.dn_loss_cls: 0.0896 d4.dn_loss_bbox: 0.1465 d4.dn_loss_iou: 0.2079 d1.loss_lmm_region: 0.1287 loss_lmm_image: 0.7748 2024/11/13 07:18:18 - mmengine - INFO - Iter(train) [118500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:33:15 time: 2.0004 data_time: 0.0186 memory: 33407 grad_norm: 20.6852 loss: 9.5763 loss_cls: 0.2911 loss_bbox: 0.1745 loss_iou: 0.3072 d0.loss_cls: 0.3346 d0.loss_bbox: 0.1836 d0.loss_iou: 0.3200 d1.loss_cls: 0.3186 d1.loss_bbox: 0.1721 d1.loss_iou: 0.3080 d2.loss_cls: 0.3040 d2.loss_bbox: 0.1778 d2.loss_iou: 0.3074 d3.loss_cls: 0.3003 d3.loss_bbox: 0.1702 d3.loss_iou: 0.3062 d4.loss_cls: 0.2887 d4.loss_bbox: 0.1747 d4.loss_iou: 0.3128 enc_loss_cls: 0.3404 enc_loss_bbox: 0.1935 enc_loss_iou: 0.3341 dn_loss_cls: 0.0721 dn_loss_bbox: 0.1485 dn_loss_iou: 0.2070 d0.dn_loss_cls: 0.1547 d0.dn_loss_bbox: 0.3008 d0.dn_loss_iou: 0.3443 d1.dn_loss_cls: 0.1063 d1.dn_loss_bbox: 0.1852 d1.dn_loss_iou: 0.2361 d2.dn_loss_cls: 0.0842 d2.dn_loss_bbox: 0.1594 d2.dn_loss_iou: 0.2151 d3.dn_loss_cls: 0.0772 d3.dn_loss_bbox: 0.1500 d3.dn_loss_iou: 0.2090 d4.dn_loss_cls: 0.0727 d4.dn_loss_bbox: 0.1485 d4.dn_loss_iou: 0.2070 d1.loss_lmm_region: 0.1004 loss_lmm_image: 0.7775 2024/11/13 07:21:37 - mmengine - INFO - Iter(train) [118600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:29:54 time: 1.9885 data_time: 0.0186 memory: 33559 grad_norm: 24.9734 loss: 8.0381 loss_cls: 0.2497 loss_bbox: 0.1080 loss_iou: 0.2004 d0.loss_cls: 0.2827 d0.loss_bbox: 0.1184 d0.loss_iou: 0.2136 d1.loss_cls: 0.2642 d1.loss_bbox: 0.1123 d1.loss_iou: 0.2048 d2.loss_cls: 0.2609 d2.loss_bbox: 0.1081 d2.loss_iou: 0.2013 d3.loss_cls: 0.2567 d3.loss_bbox: 0.1087 d3.loss_iou: 0.2016 d4.loss_cls: 0.2523 d4.loss_bbox: 0.1077 d4.loss_iou: 0.2001 enc_loss_cls: 0.2909 enc_loss_bbox: 0.1280 enc_loss_iou: 0.2276 dn_loss_cls: 0.1206 dn_loss_bbox: 0.1321 dn_loss_iou: 0.1907 d0.dn_loss_cls: 0.1907 d0.dn_loss_bbox: 0.2603 d0.dn_loss_iou: 0.3177 d1.dn_loss_cls: 0.1459 d1.dn_loss_bbox: 0.1548 d1.dn_loss_iou: 0.2149 d2.dn_loss_cls: 0.1290 d2.dn_loss_bbox: 0.1407 d2.dn_loss_iou: 0.1985 d3.dn_loss_cls: 0.1252 d3.dn_loss_bbox: 0.1340 d3.dn_loss_iou: 0.1930 d4.dn_loss_cls: 0.1216 d4.dn_loss_bbox: 0.1321 d4.dn_loss_iou: 0.1908 d1.loss_lmm_region: 0.1067 loss_lmm_image: 0.7409 2024/11/13 07:24:57 - mmengine - INFO - Iter(train) [118700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:26:33 time: 2.0030 data_time: 0.0186 memory: 34401 grad_norm: 33.6515 loss: 7.8022 loss_cls: 0.2197 loss_bbox: 0.1256 loss_iou: 0.2227 d0.loss_cls: 0.2490 d0.loss_bbox: 0.1397 d0.loss_iou: 0.2336 d1.loss_cls: 0.2338 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2240 d2.loss_cls: 0.2311 d2.loss_bbox: 0.1247 d2.loss_iou: 0.2208 d3.loss_cls: 0.2242 d3.loss_bbox: 0.1261 d3.loss_iou: 0.2231 d4.loss_cls: 0.2204 d4.loss_bbox: 0.1268 
d4.loss_iou: 0.2242 enc_loss_cls: 0.2601 enc_loss_bbox: 0.1452 enc_loss_iou: 0.2456 dn_loss_cls: 0.0647 dn_loss_bbox: 0.1395 dn_loss_iou: 0.1937 d0.dn_loss_cls: 0.1317 d0.dn_loss_bbox: 0.2690 d0.dn_loss_iou: 0.3189 d1.dn_loss_cls: 0.0848 d1.dn_loss_bbox: 0.1703 d1.dn_loss_iou: 0.2195 d2.dn_loss_cls: 0.0714 d2.dn_loss_bbox: 0.1490 d2.dn_loss_iou: 0.2016 d3.dn_loss_cls: 0.0658 d3.dn_loss_bbox: 0.1406 d3.dn_loss_iou: 0.1946 d4.dn_loss_cls: 0.0643 d4.dn_loss_bbox: 0.1396 d4.dn_loss_iou: 0.1937 d1.loss_lmm_region: 0.0909 loss_lmm_image: 0.7506 2024/11/13 07:28:16 - mmengine - INFO - Iter(train) [118800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:23:12 time: 2.0054 data_time: 0.0185 memory: 34626 grad_norm: 30.7004 loss: 7.5468 loss_cls: 0.2396 loss_bbox: 0.1030 loss_iou: 0.1948 d0.loss_cls: 0.2819 d0.loss_bbox: 0.1125 d0.loss_iou: 0.2079 d1.loss_cls: 0.2558 d1.loss_bbox: 0.1078 d1.loss_iou: 0.2011 d2.loss_cls: 0.2466 d2.loss_bbox: 0.1030 d2.loss_iou: 0.1956 d3.loss_cls: 0.2429 d3.loss_bbox: 0.1003 d3.loss_iou: 0.1922 d4.loss_cls: 0.2399 d4.loss_bbox: 0.1020 d4.loss_iou: 0.1943 enc_loss_cls: 0.2848 enc_loss_bbox: 0.1228 enc_loss_iou: 0.2271 dn_loss_cls: 0.0579 dn_loss_bbox: 0.1342 dn_loss_iou: 0.1770 d0.dn_loss_cls: 0.1371 d0.dn_loss_bbox: 0.2874 d0.dn_loss_iou: 0.3172 d1.dn_loss_cls: 0.0885 d1.dn_loss_bbox: 0.1684 d1.dn_loss_iou: 0.2077 d2.dn_loss_cls: 0.0690 d2.dn_loss_bbox: 0.1455 d2.dn_loss_iou: 0.1868 d3.dn_loss_cls: 0.0614 d3.dn_loss_bbox: 0.1364 d3.dn_loss_iou: 0.1793 d4.dn_loss_cls: 0.0583 d4.dn_loss_bbox: 0.1343 d4.dn_loss_iou: 0.1771 d1.loss_lmm_region: 0.0876 loss_lmm_image: 0.7798 2024/11/13 07:31:34 - mmengine - INFO - Iter(train) [118900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:19:51 time: 1.9771 data_time: 0.0187 memory: 34997 grad_norm: 25.8932 loss: 7.8002 loss_cls: 0.2523 loss_bbox: 0.1073 loss_iou: 0.1972 d0.loss_cls: 0.2819 d0.loss_bbox: 0.1187 d0.loss_iou: 0.2084 d1.loss_cls: 0.2620 d1.loss_bbox: 0.1119 d1.loss_iou: 0.2005 d2.loss_cls: 0.2557 d2.loss_bbox: 0.1079 d2.loss_iou: 0.1986 d3.loss_cls: 0.2561 d3.loss_bbox: 0.1057 d3.loss_iou: 0.1961 d4.loss_cls: 0.2541 d4.loss_bbox: 0.1061 d4.loss_iou: 0.1961 enc_loss_cls: 0.2991 enc_loss_bbox: 0.1257 enc_loss_iou: 0.2228 dn_loss_cls: 0.1273 dn_loss_bbox: 0.1167 dn_loss_iou: 0.1720 d0.dn_loss_cls: 0.1917 d0.dn_loss_bbox: 0.2403 d0.dn_loss_iou: 0.2977 d1.dn_loss_cls: 0.1436 d1.dn_loss_bbox: 0.1483 d1.dn_loss_iou: 0.2000 d2.dn_loss_cls: 0.1315 d2.dn_loss_bbox: 0.1276 d2.dn_loss_iou: 0.1812 d3.dn_loss_cls: 0.1278 d3.dn_loss_bbox: 0.1193 d3.dn_loss_iou: 0.1747 d4.dn_loss_cls: 0.1292 d4.dn_loss_bbox: 0.1167 d4.dn_loss_iou: 0.1719 d1.loss_lmm_region: 0.0963 loss_lmm_image: 0.7220 2024/11/13 07:34:54 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 07:34:54 - mmengine - INFO - Iter(train) [119000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:16:30 time: 2.0229 data_time: 0.0187 memory: 33479 grad_norm: 29.8863 loss: 8.7327 loss_cls: 0.3039 loss_bbox: 0.1305 loss_iou: 0.2578 d0.loss_cls: 0.3489 d0.loss_bbox: 0.1363 d0.loss_iou: 0.2712 d1.loss_cls: 0.3275 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2549 d2.loss_cls: 0.3132 d2.loss_bbox: 0.1271 d2.loss_iou: 0.2582 d3.loss_cls: 0.3112 d3.loss_bbox: 0.1254 d3.loss_iou: 0.2561 d4.loss_cls: 0.3102 d4.loss_bbox: 0.1255 d4.loss_iou: 0.2564 enc_loss_cls: 0.3508 enc_loss_bbox: 0.1487 enc_loss_iou: 0.2865 dn_loss_cls: 0.0852 dn_loss_bbox: 0.1313 dn_loss_iou: 0.1855 d0.dn_loss_cls: 0.1663 d0.dn_loss_bbox: 0.2595 d0.dn_loss_iou: 0.3140 d1.dn_loss_cls: 0.1173 
d1.dn_loss_bbox: 0.1537 d1.dn_loss_iou: 0.2113 d2.dn_loss_cls: 0.0947 d2.dn_loss_bbox: 0.1359 d2.dn_loss_iou: 0.1931 d3.dn_loss_cls: 0.0904 d3.dn_loss_bbox: 0.1317 d3.dn_loss_iou: 0.1867 d4.dn_loss_cls: 0.0855 d4.dn_loss_bbox: 0.1313 d4.dn_loss_iou: 0.1853 d1.loss_lmm_region: 0.1023 loss_lmm_image: 0.7446 2024/11/13 07:38:14 - mmengine - INFO - Iter(train) [119100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:13:09 time: 2.0090 data_time: 0.0185 memory: 33162 grad_norm: 31.9325 loss: 8.1629 loss_cls: 0.2216 loss_bbox: 0.1213 loss_iou: 0.2148 d0.loss_cls: 0.2455 d0.loss_bbox: 0.1358 d0.loss_iou: 0.2279 d1.loss_cls: 0.2312 d1.loss_bbox: 0.1301 d1.loss_iou: 0.2235 d2.loss_cls: 0.2297 d2.loss_bbox: 0.1257 d2.loss_iou: 0.2174 d3.loss_cls: 0.2258 d3.loss_bbox: 0.1222 d3.loss_iou: 0.2164 d4.loss_cls: 0.2232 d4.loss_bbox: 0.1215 d4.loss_iou: 0.2160 enc_loss_cls: 0.2546 enc_loss_bbox: 0.1409 enc_loss_iou: 0.2399 dn_loss_cls: 0.0603 dn_loss_bbox: 0.1801 dn_loss_iou: 0.2216 d0.dn_loss_cls: 0.1337 d0.dn_loss_bbox: 0.3178 d0.dn_loss_iou: 0.3485 d1.dn_loss_cls: 0.0866 d1.dn_loss_bbox: 0.2040 d1.dn_loss_iou: 0.2455 d2.dn_loss_cls: 0.0701 d2.dn_loss_bbox: 0.1850 d2.dn_loss_iou: 0.2287 d3.dn_loss_cls: 0.0649 d3.dn_loss_bbox: 0.1821 d3.dn_loss_iou: 0.2241 d4.dn_loss_cls: 0.0615 d4.dn_loss_bbox: 0.1801 d4.dn_loss_iou: 0.2216 d1.loss_lmm_region: 0.0996 loss_lmm_image: 0.7621 2024/11/13 07:41:31 - mmengine - INFO - Iter(train) [119200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:09:47 time: 1.9763 data_time: 0.0186 memory: 33758 grad_norm: 34.4398 loss: 8.9694 loss_cls: 0.2669 loss_bbox: 0.1309 loss_iou: 0.2223 d0.loss_cls: 0.2995 d0.loss_bbox: 0.1510 d0.loss_iou: 0.2369 d1.loss_cls: 0.2802 d1.loss_bbox: 0.1406 d1.loss_iou: 0.2306 d2.loss_cls: 0.2752 d2.loss_bbox: 0.1338 d2.loss_iou: 0.2245 d3.loss_cls: 0.2707 d3.loss_bbox: 0.1287 d3.loss_iou: 0.2221 d4.loss_cls: 0.2692 d4.loss_bbox: 0.1302 d4.loss_iou: 0.2217 enc_loss_cls: 0.3064 enc_loss_bbox: 0.1597 enc_loss_iou: 0.2524 dn_loss_cls: 0.1514 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2022 d0.dn_loss_cls: 0.2265 d0.dn_loss_bbox: 0.3086 d0.dn_loss_iou: 0.3361 d1.dn_loss_cls: 0.1759 d1.dn_loss_bbox: 0.1882 d1.dn_loss_iou: 0.2319 d2.dn_loss_cls: 0.1609 d2.dn_loss_bbox: 0.1676 d2.dn_loss_iou: 0.2125 d3.dn_loss_cls: 0.1549 d3.dn_loss_bbox: 0.1574 d3.dn_loss_iou: 0.2044 d4.dn_loss_cls: 0.1510 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.2022 d1.loss_lmm_region: 0.1119 loss_lmm_image: 0.7608 2024/11/13 07:44:51 - mmengine - INFO - Iter(train) [119300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:06:27 time: 2.0106 data_time: 0.0187 memory: 34446 grad_norm: 24.1228 loss: 8.2136 loss_cls: 0.2421 loss_bbox: 0.1311 loss_iou: 0.2138 d0.loss_cls: 0.2789 d0.loss_bbox: 0.1408 d0.loss_iou: 0.2218 d1.loss_cls: 0.2568 d1.loss_bbox: 0.1336 d1.loss_iou: 0.2158 d2.loss_cls: 0.2509 d2.loss_bbox: 0.1312 d2.loss_iou: 0.2132 d3.loss_cls: 0.2432 d3.loss_bbox: 0.1316 d3.loss_iou: 0.2137 d4.loss_cls: 0.2435 d4.loss_bbox: 0.1297 d4.loss_iou: 0.2108 enc_loss_cls: 0.2875 enc_loss_bbox: 0.1497 enc_loss_iou: 0.2342 dn_loss_cls: 0.0845 dn_loss_bbox: 0.1648 dn_loss_iou: 0.1926 d0.dn_loss_cls: 0.1568 d0.dn_loss_bbox: 0.2917 d0.dn_loss_iou: 0.3144 d1.dn_loss_cls: 0.1130 d1.dn_loss_bbox: 0.1938 d1.dn_loss_iou: 0.2171 d2.dn_loss_cls: 0.0951 d2.dn_loss_bbox: 0.1740 d2.dn_loss_iou: 0.1998 d3.dn_loss_cls: 0.0871 d3.dn_loss_bbox: 0.1666 d3.dn_loss_iou: 0.1947 d4.dn_loss_cls: 0.0848 d4.dn_loss_bbox: 0.1648 d4.dn_loss_iou: 0.1927 d1.loss_lmm_region: 0.1019 loss_lmm_image: 0.7497 2024/11/13 07:48:10 
- mmengine - INFO - Iter(train) [119400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 17:03:06 time: 2.0009 data_time: 0.0185 memory: 32941 grad_norm: 24.0864 loss: 8.7501 loss_cls: 0.2671 loss_bbox: 0.1548 loss_iou: 0.2535 d0.loss_cls: 0.3186 d0.loss_bbox: 0.1588 d0.loss_iou: 0.2640 d1.loss_cls: 0.2890 d1.loss_bbox: 0.1537 d1.loss_iou: 0.2557 d2.loss_cls: 0.2811 d2.loss_bbox: 0.1522 d2.loss_iou: 0.2526 d3.loss_cls: 0.2722 d3.loss_bbox: 0.1538 d3.loss_iou: 0.2533 d4.loss_cls: 0.2695 d4.loss_bbox: 0.1527 d4.loss_iou: 0.2526 enc_loss_cls: 0.3250 enc_loss_bbox: 0.1638 enc_loss_iou: 0.2745 dn_loss_cls: 0.0689 dn_loss_bbox: 0.1522 dn_loss_iou: 0.2059 d0.dn_loss_cls: 0.1430 d0.dn_loss_bbox: 0.2855 d0.dn_loss_iou: 0.3399 d1.dn_loss_cls: 0.0968 d1.dn_loss_bbox: 0.1840 d1.dn_loss_iou: 0.2338 d2.dn_loss_cls: 0.0772 d2.dn_loss_bbox: 0.1613 d2.dn_loss_iou: 0.2146 d3.dn_loss_cls: 0.0727 d3.dn_loss_bbox: 0.1539 d3.dn_loss_iou: 0.2080 d4.dn_loss_cls: 0.0694 d4.dn_loss_bbox: 0.1522 d4.dn_loss_iou: 0.2060 d1.loss_lmm_region: 0.1010 loss_lmm_image: 0.7052 2024/11/13 07:51:31 - mmengine - INFO - Iter(train) [119500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:59:45 time: 1.9996 data_time: 0.0186 memory: 33677 grad_norm: 22.2091 loss: 8.8951 loss_cls: 0.2504 loss_bbox: 0.1516 loss_iou: 0.2378 d0.loss_cls: 0.2847 d0.loss_bbox: 0.1616 d0.loss_iou: 0.2499 d1.loss_cls: 0.2680 d1.loss_bbox: 0.1552 d1.loss_iou: 0.2416 d2.loss_cls: 0.2628 d2.loss_bbox: 0.1495 d2.loss_iou: 0.2364 d3.loss_cls: 0.2571 d3.loss_bbox: 0.1499 d3.loss_iou: 0.2378 d4.loss_cls: 0.2545 d4.loss_bbox: 0.1492 d4.loss_iou: 0.2363 enc_loss_cls: 0.2937 enc_loss_bbox: 0.1694 enc_loss_iou: 0.2635 dn_loss_cls: 0.0715 dn_loss_bbox: 0.1873 dn_loss_iou: 0.2233 d0.dn_loss_cls: 0.1493 d0.dn_loss_bbox: 0.3318 d0.dn_loss_iou: 0.3582 d1.dn_loss_cls: 0.0999 d1.dn_loss_bbox: 0.2171 d1.dn_loss_iou: 0.2517 d2.dn_loss_cls: 0.0816 d2.dn_loss_bbox: 0.1975 d2.dn_loss_iou: 0.2325 d3.dn_loss_cls: 0.0755 d3.dn_loss_bbox: 0.1891 d3.dn_loss_iou: 0.2255 d4.dn_loss_cls: 0.0721 d4.dn_loss_bbox: 0.1872 d4.dn_loss_iou: 0.2233 d1.loss_lmm_region: 0.0905 loss_lmm_image: 0.7696 2024/11/13 07:54:49 - mmengine - INFO - Iter(train) [119600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:56:24 time: 1.9871 data_time: 0.0186 memory: 34714 grad_norm: 40.8096 loss: 8.6669 loss_cls: 0.2790 loss_bbox: 0.1190 loss_iou: 0.2181 d0.loss_cls: 0.3075 d0.loss_bbox: 0.1355 d0.loss_iou: 0.2327 d1.loss_cls: 0.2906 d1.loss_bbox: 0.1244 d1.loss_iou: 0.2225 d2.loss_cls: 0.2837 d2.loss_bbox: 0.1233 d2.loss_iou: 0.2213 d3.loss_cls: 0.2785 d3.loss_bbox: 0.1193 d3.loss_iou: 0.2190 d4.loss_cls: 0.2792 d4.loss_bbox: 0.1181 d4.loss_iou: 0.2176 enc_loss_cls: 0.3182 enc_loss_bbox: 0.1443 enc_loss_iou: 0.2483 dn_loss_cls: 0.1195 dn_loss_bbox: 0.1589 dn_loss_iou: 0.1991 d0.dn_loss_cls: 0.1896 d0.dn_loss_bbox: 0.2873 d0.dn_loss_iou: 0.3226 d1.dn_loss_cls: 0.1436 d1.dn_loss_bbox: 0.1843 d1.dn_loss_iou: 0.2243 d2.dn_loss_cls: 0.1311 d2.dn_loss_bbox: 0.1673 d2.dn_loss_iou: 0.2073 d3.dn_loss_cls: 0.1230 d3.dn_loss_bbox: 0.1609 d3.dn_loss_iou: 0.2014 d4.dn_loss_cls: 0.1200 d4.dn_loss_bbox: 0.1589 d4.dn_loss_iou: 0.1992 d1.loss_lmm_region: 0.0994 loss_lmm_image: 0.7689 2024/11/13 07:58:09 - mmengine - INFO - Iter(train) [119700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:53:03 time: 1.9933 data_time: 0.0186 memory: 35259 grad_norm: 37.2892 loss: 8.6109 loss_cls: 0.2892 loss_bbox: 0.1176 loss_iou: 0.2269 d0.loss_cls: 0.3283 d0.loss_bbox: 0.1264 d0.loss_iou: 0.2399 d1.loss_cls: 0.3058 d1.loss_bbox: 
0.1230 d1.loss_iou: 0.2307 d2.loss_cls: 0.2987 d2.loss_bbox: 0.1168 d2.loss_iou: 0.2262 d3.loss_cls: 0.2924 d3.loss_bbox: 0.1160 d3.loss_iou: 0.2256 d4.loss_cls: 0.2898 d4.loss_bbox: 0.1184 d4.loss_iou: 0.2274 enc_loss_cls: 0.3350 enc_loss_bbox: 0.1395 enc_loss_iou: 0.2605 dn_loss_cls: 0.1058 dn_loss_bbox: 0.1496 dn_loss_iou: 0.1876 d0.dn_loss_cls: 0.1737 d0.dn_loss_bbox: 0.2772 d0.dn_loss_iou: 0.3117 d1.dn_loss_cls: 0.1307 d1.dn_loss_bbox: 0.1766 d1.dn_loss_iou: 0.2131 d2.dn_loss_cls: 0.1170 d2.dn_loss_bbox: 0.1612 d2.dn_loss_iou: 0.1969 d3.dn_loss_cls: 0.1121 d3.dn_loss_bbox: 0.1520 d3.dn_loss_iou: 0.1898 d4.dn_loss_cls: 0.1075 d4.dn_loss_bbox: 0.1496 d4.dn_loss_iou: 0.1876 d1.loss_lmm_region: 0.1107 loss_lmm_image: 0.7665 2024/11/13 08:01:30 - mmengine - INFO - Iter(train) [119800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:49:43 time: 2.0178 data_time: 0.0185 memory: 35394 grad_norm: 32.4025 loss: 8.4951 loss_cls: 0.2457 loss_bbox: 0.1396 loss_iou: 0.2278 d0.loss_cls: 0.2793 d0.loss_bbox: 0.1481 d0.loss_iou: 0.2403 d1.loss_cls: 0.2618 d1.loss_bbox: 0.1398 d1.loss_iou: 0.2292 d2.loss_cls: 0.2519 d2.loss_bbox: 0.1422 d2.loss_iou: 0.2302 d3.loss_cls: 0.2498 d3.loss_bbox: 0.1381 d3.loss_iou: 0.2279 d4.loss_cls: 0.2476 d4.loss_bbox: 0.1391 d4.loss_iou: 0.2278 enc_loss_cls: 0.2895 enc_loss_bbox: 0.1547 enc_loss_iou: 0.2522 dn_loss_cls: 0.0796 dn_loss_bbox: 0.1668 dn_loss_iou: 0.2039 d0.dn_loss_cls: 0.1661 d0.dn_loss_bbox: 0.3227 d0.dn_loss_iou: 0.3465 d1.dn_loss_cls: 0.1099 d1.dn_loss_bbox: 0.1981 d1.dn_loss_iou: 0.2329 d2.dn_loss_cls: 0.0892 d2.dn_loss_bbox: 0.1781 d2.dn_loss_iou: 0.2140 d3.dn_loss_cls: 0.0836 d3.dn_loss_bbox: 0.1690 d3.dn_loss_iou: 0.2069 d4.dn_loss_cls: 0.0800 d4.dn_loss_bbox: 0.1667 d4.dn_loss_iou: 0.2038 d1.loss_lmm_region: 0.0937 loss_lmm_image: 0.7210 2024/11/13 08:04:50 - mmengine - INFO - Iter(train) [119900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:46:22 time: 2.0113 data_time: 0.0186 memory: 34196 grad_norm: 23.5711 loss: 8.0125 loss_cls: 0.2540 loss_bbox: 0.1135 loss_iou: 0.2181 d0.loss_cls: 0.2921 d0.loss_bbox: 0.1282 d0.loss_iou: 0.2337 d1.loss_cls: 0.2702 d1.loss_bbox: 0.1183 d1.loss_iou: 0.2224 d2.loss_cls: 0.2621 d2.loss_bbox: 0.1185 d2.loss_iou: 0.2228 d3.loss_cls: 0.2587 d3.loss_bbox: 0.1140 d3.loss_iou: 0.2199 d4.loss_cls: 0.2543 d4.loss_bbox: 0.1151 d4.loss_iou: 0.2205 enc_loss_cls: 0.2982 enc_loss_bbox: 0.1363 enc_loss_iou: 0.2476 dn_loss_cls: 0.0769 dn_loss_bbox: 0.1356 dn_loss_iou: 0.1912 d0.dn_loss_cls: 0.1473 d0.dn_loss_bbox: 0.2552 d0.dn_loss_iou: 0.3105 d1.dn_loss_cls: 0.0994 d1.dn_loss_bbox: 0.1591 d1.dn_loss_iou: 0.2147 d2.dn_loss_cls: 0.0865 d2.dn_loss_bbox: 0.1431 d2.dn_loss_iou: 0.1986 d3.dn_loss_cls: 0.0790 d3.dn_loss_bbox: 0.1369 d3.dn_loss_iou: 0.1929 d4.dn_loss_cls: 0.0771 d4.dn_loss_bbox: 0.1356 d4.dn_loss_iou: 0.1914 d1.loss_lmm_region: 0.1005 loss_lmm_image: 0.7626 2024/11/13 08:08:09 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 08:08:09 - mmengine - INFO - Iter(train) [120000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:43:01 time: 2.0037 data_time: 0.0186 memory: 34780 grad_norm: 25.9364 loss: 9.1279 loss_cls: 0.2586 loss_bbox: 0.1452 loss_iou: 0.2260 d0.loss_cls: 0.2917 d0.loss_bbox: 0.1625 d0.loss_iou: 0.2465 d1.loss_cls: 0.2746 d1.loss_bbox: 0.1543 d1.loss_iou: 0.2342 d2.loss_cls: 0.2636 d2.loss_bbox: 0.1501 d2.loss_iou: 0.2336 d3.loss_cls: 0.2637 d3.loss_bbox: 0.1462 d3.loss_iou: 0.2280 d4.loss_cls: 0.2628 d4.loss_bbox: 0.1419 d4.loss_iou: 0.2244 enc_loss_cls: 0.3020 
enc_loss_bbox: 0.1662 enc_loss_iou: 0.2566 dn_loss_cls: 0.1006 dn_loss_bbox: 0.2024 dn_loss_iou: 0.2189 d0.dn_loss_cls: 0.1771 d0.dn_loss_bbox: 0.3582 d0.dn_loss_iou: 0.3624 d1.dn_loss_cls: 0.1269 d1.dn_loss_bbox: 0.2300 d1.dn_loss_iou: 0.2480 d2.dn_loss_cls: 0.1111 d2.dn_loss_bbox: 0.2091 d2.dn_loss_iou: 0.2267 d3.dn_loss_cls: 0.1058 d3.dn_loss_bbox: 0.2028 d3.dn_loss_iou: 0.2206 d4.dn_loss_cls: 0.1019 d4.dn_loss_bbox: 0.2023 d4.dn_loss_iou: 0.2189 d1.loss_lmm_region: 0.1176 loss_lmm_image: 0.7538
2024/11/13 08:08:09 - mmengine - INFO - Saving checkpoint at 120000 iterations
2024/11/13 08:18:37 - mmengine - INFO - Iter(val) [100/602] eta: 0:49:02 time: 6.0091 data_time: 0.0017 memory: 7497
2024/11/13 08:28:58 - mmengine - INFO - Iter(val) [200/602] eta: 0:40:25 time: 6.3254 data_time: 0.0018 memory: 7528
2024/11/13 08:40:17 - mmengine - INFO - Iter(val) [300/602] eta: 0:31:38 time: 6.8790 data_time: 0.0018 memory: 7506
2024/11/13 08:52:33 - mmengine - INFO - Iter(val) [400/602] eta: 0:22:03 time: 7.5067 data_time: 0.0018 memory: 7528
2024/11/13 09:05:17 - mmengine - INFO - Iter(val) [500/602] eta: 0:11:30 time: 7.7956 data_time: 0.0019 memory: 7473
2024/11/13 09:18:57 - mmengine - INFO - Iter(val) [600/602] eta: 0:00:14 time: 8.2744 data_time: 0.0019 memory: 7522
2024/11/13 09:26:17 - mmengine - INFO - === 81 classes had less than 10000 detections! Outputting 10000 detections for each class will improve AP further. ===
2024/11/13 09:28:46 - mmengine - INFO - mAP_copypaste: {'AP': 0.4993315631405046, 'AP50': 0.6118461886502762, 'AP75': 0.5308775158954654, 'APs': 0.44930762195323115, 'APm': 0.6292050539901716, 'APl': 0.7159949039809201, 'APr': 0.41302113430998694, 'APc': 0.4549220541201559, 'APf': 0.5542493282565885}
2024/11/13 09:29:11 - mmengine - INFO - Iter(val) [602/602] lvis_fixed_ap/AP: 0.4993 lvis_fixed_ap/AP50: 0.6118 lvis_fixed_ap/AP75: 0.5309 lvis_fixed_ap/APs: 0.4493 lvis_fixed_ap/APm: 0.6292 lvis_fixed_ap/APl: 0.7160 lvis_fixed_ap/APr: 0.4130 lvis_fixed_ap/APc: 0.4549 lvis_fixed_ap/APf: 0.5542 data_time: 0.0020 time: 7.0163
2024/11/13 09:32:29 - mmengine - INFO - Iter(train) [120100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:42:08 time: 1.9985 data_time: 0.0186 memory: 34190 grad_norm: 24.7185 loss: 8.9441 loss_cls: 0.2621 loss_bbox: 0.1362 loss_iou: 0.2542 d0.loss_cls: 0.2948 d0.loss_bbox: 0.1512 d0.loss_iou: 0.2658 d1.loss_cls: 0.2788 d1.loss_bbox: 0.1396 d1.loss_iou: 0.2569 d2.loss_cls: 0.2719 d2.loss_bbox: 0.1374 d2.loss_iou: 0.2552 d3.loss_cls: 0.2667 d3.loss_bbox: 0.1389 d3.loss_iou: 0.2561 d4.loss_cls: 0.2619 d4.loss_bbox: 0.1380 d4.loss_iou: 0.2566 enc_loss_cls: 0.3013 enc_loss_bbox: 0.1600 enc_loss_iou: 0.2788 dn_loss_cls: 0.0932 dn_loss_bbox: 0.1633 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.1649 d0.dn_loss_bbox: 0.3073 d0.dn_loss_iou: 0.3437 d1.dn_loss_cls: 0.1256 d1.dn_loss_bbox: 0.1965 d1.dn_loss_iou: 0.2420 d2.dn_loss_cls: 0.1049 d2.dn_loss_bbox: 0.1731 d2.dn_loss_iou: 0.2222 d3.dn_loss_cls: 0.0991 d3.dn_loss_bbox: 0.1650 d3.dn_loss_iou: 0.2163 d4.dn_loss_cls: 0.0936 d4.dn_loss_bbox: 0.1633 d4.dn_loss_iou: 0.2143 d1.loss_lmm_region: 0.1034 loss_lmm_image: 0.7758
2024/11/13 09:35:49 - mmengine - INFO - Iter(train) [120200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:38:47 time: 2.0016 data_time: 0.0186 memory: 34307 grad_norm: 22.9581 loss: 10.0752 loss_cls: 0.3568 loss_bbox: 0.1546 loss_iou: 0.2841 d0.loss_cls: 0.4028 d0.loss_bbox: 0.1723 d0.loss_iou: 0.3004 d1.loss_cls: 0.3732 d1.loss_bbox: 0.1642 d1.loss_iou: 0.2914 d2.loss_cls: 0.3666 d2.loss_bbox:
0.1589 d2.loss_iou: 0.2876 d3.loss_cls: 0.3614 d3.loss_bbox: 0.1563 d3.loss_iou: 0.2843 d4.loss_cls: 0.3552 d4.loss_bbox: 0.1573 d4.loss_iou: 0.2866 enc_loss_cls: 0.4146 enc_loss_bbox: 0.1846 enc_loss_iou: 0.3156 dn_loss_cls: 0.0971 dn_loss_bbox: 0.1693 dn_loss_iou: 0.2123 d0.dn_loss_cls: 0.1803 d0.dn_loss_bbox: 0.3151 d0.dn_loss_iou: 0.3491 d1.dn_loss_cls: 0.1315 d1.dn_loss_bbox: 0.2041 d1.dn_loss_iou: 0.2408 d2.dn_loss_cls: 0.1143 d2.dn_loss_bbox: 0.1788 d2.dn_loss_iou: 0.2210 d3.dn_loss_cls: 0.1056 d3.dn_loss_bbox: 0.1704 d3.dn_loss_iou: 0.2144 d4.dn_loss_cls: 0.0991 d4.dn_loss_bbox: 0.1693 d4.dn_loss_iou: 0.2124 d1.loss_lmm_region: 0.1135 loss_lmm_image: 0.7479 2024/11/13 09:39:11 - mmengine - INFO - Iter(train) [120300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:35:26 time: 2.0132 data_time: 0.0188 memory: 35430 grad_norm: 27.4660 loss: 9.2254 loss_cls: 0.2808 loss_bbox: 0.1367 loss_iou: 0.2792 d0.loss_cls: 0.3082 d0.loss_bbox: 0.1534 d0.loss_iou: 0.2925 d1.loss_cls: 0.2959 d1.loss_bbox: 0.1427 d1.loss_iou: 0.2841 d2.loss_cls: 0.2919 d2.loss_bbox: 0.1357 d2.loss_iou: 0.2772 d3.loss_cls: 0.2835 d3.loss_bbox: 0.1379 d3.loss_iou: 0.2792 d4.loss_cls: 0.2787 d4.loss_bbox: 0.1381 d4.loss_iou: 0.2796 enc_loss_cls: 0.3182 enc_loss_bbox: 0.1637 enc_loss_iou: 0.3065 dn_loss_cls: 0.0771 dn_loss_bbox: 0.1665 dn_loss_iou: 0.2355 d0.dn_loss_cls: 0.1527 d0.dn_loss_bbox: 0.3065 d0.dn_loss_iou: 0.3720 d1.dn_loss_cls: 0.1087 d1.dn_loss_bbox: 0.1927 d1.dn_loss_iou: 0.2600 d2.dn_loss_cls: 0.0893 d2.dn_loss_bbox: 0.1750 d2.dn_loss_iou: 0.2435 d3.dn_loss_cls: 0.0818 d3.dn_loss_bbox: 0.1674 d3.dn_loss_iou: 0.2369 d4.dn_loss_cls: 0.0778 d4.dn_loss_bbox: 0.1665 d4.dn_loss_iou: 0.2354 d1.loss_lmm_region: 0.1043 loss_lmm_image: 0.7124 2024/11/13 09:42:31 - mmengine - INFO - Iter(train) [120400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:32:05 time: 1.9982 data_time: 0.0186 memory: 32875 grad_norm: 32.1804 loss: 8.9162 loss_cls: 0.2614 loss_bbox: 0.1425 loss_iou: 0.2367 d0.loss_cls: 0.3029 d0.loss_bbox: 0.1633 d0.loss_iou: 0.2571 d1.loss_cls: 0.2828 d1.loss_bbox: 0.1475 d1.loss_iou: 0.2441 d2.loss_cls: 0.2707 d2.loss_bbox: 0.1414 d2.loss_iou: 0.2397 d3.loss_cls: 0.2678 d3.loss_bbox: 0.1386 d3.loss_iou: 0.2374 d4.loss_cls: 0.2619 d4.loss_bbox: 0.1442 d4.loss_iou: 0.2368 enc_loss_cls: 0.3090 enc_loss_bbox: 0.1686 enc_loss_iou: 0.2737 dn_loss_cls: 0.0822 dn_loss_bbox: 0.1785 dn_loss_iou: 0.2184 d0.dn_loss_cls: 0.1633 d0.dn_loss_bbox: 0.3232 d0.dn_loss_iou: 0.3512 d1.dn_loss_cls: 0.1112 d1.dn_loss_bbox: 0.2070 d1.dn_loss_iou: 0.2443 d2.dn_loss_cls: 0.0910 d2.dn_loss_bbox: 0.1844 d2.dn_loss_iou: 0.2247 d3.dn_loss_cls: 0.0855 d3.dn_loss_bbox: 0.1787 d3.dn_loss_iou: 0.2199 d4.dn_loss_cls: 0.0826 d4.dn_loss_bbox: 0.1785 d4.dn_loss_iou: 0.2184 d1.loss_lmm_region: 0.1088 loss_lmm_image: 0.7363 2024/11/13 09:45:52 - mmengine - INFO - Iter(train) [120500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:28:44 time: 2.0185 data_time: 0.0187 memory: 33933 grad_norm: 21.7125 loss: 9.4455 loss_cls: 0.3085 loss_bbox: 0.1567 loss_iou: 0.2787 d0.loss_cls: 0.3416 d0.loss_bbox: 0.1703 d0.loss_iou: 0.2911 d1.loss_cls: 0.3417 d1.loss_bbox: 0.1522 d1.loss_iou: 0.2793 d2.loss_cls: 0.3150 d2.loss_bbox: 0.1599 d2.loss_iou: 0.2825 d3.loss_cls: 0.3071 d3.loss_bbox: 0.1570 d3.loss_iou: 0.2800 d4.loss_cls: 0.3076 d4.loss_bbox: 0.1561 d4.loss_iou: 0.2800 enc_loss_cls: 0.3550 enc_loss_bbox: 0.1701 enc_loss_iou: 0.3026 dn_loss_cls: 0.0856 dn_loss_bbox: 0.1494 dn_loss_iou: 0.2029 d0.dn_loss_cls: 0.1750 d0.dn_loss_bbox: 0.3044 
d0.dn_loss_iou: 0.3422 d1.dn_loss_cls: 0.1185 d1.dn_loss_bbox: 0.1831 d1.dn_loss_iou: 0.2322 d2.dn_loss_cls: 0.0978 d2.dn_loss_bbox: 0.1586 d2.dn_loss_iou: 0.2127 d3.dn_loss_cls: 0.0895 d3.dn_loss_bbox: 0.1503 d3.dn_loss_iou: 0.2047 d4.dn_loss_cls: 0.0851 d4.dn_loss_bbox: 0.1493 d4.dn_loss_iou: 0.2029 d1.loss_lmm_region: 0.1133 loss_lmm_image: 0.7946 2024/11/13 09:49:12 - mmengine - INFO - Iter(train) [120600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:25:22 time: 1.9801 data_time: 0.0185 memory: 34663 grad_norm: 33.9559 loss: 8.5393 loss_cls: 0.2501 loss_bbox: 0.1299 loss_iou: 0.2268 d0.loss_cls: 0.2923 d0.loss_bbox: 0.1444 d0.loss_iou: 0.2423 d1.loss_cls: 0.2637 d1.loss_bbox: 0.1376 d1.loss_iou: 0.2357 d2.loss_cls: 0.2608 d2.loss_bbox: 0.1325 d2.loss_iou: 0.2282 d3.loss_cls: 0.2535 d3.loss_bbox: 0.1303 d3.loss_iou: 0.2268 d4.loss_cls: 0.2503 d4.loss_bbox: 0.1302 d4.loss_iou: 0.2273 enc_loss_cls: 0.2941 enc_loss_bbox: 0.1601 enc_loss_iou: 0.2568 dn_loss_cls: 0.1049 dn_loss_bbox: 0.1458 dn_loss_iou: 0.2042 d0.dn_loss_cls: 0.1765 d0.dn_loss_bbox: 0.2857 d0.dn_loss_iou: 0.3361 d1.dn_loss_cls: 0.1249 d1.dn_loss_bbox: 0.1785 d1.dn_loss_iou: 0.2310 d2.dn_loss_cls: 0.1151 d2.dn_loss_bbox: 0.1550 d2.dn_loss_iou: 0.2119 d3.dn_loss_cls: 0.1069 d3.dn_loss_bbox: 0.1484 d3.dn_loss_iou: 0.2066 d4.dn_loss_cls: 0.1070 d4.dn_loss_bbox: 0.1458 d4.dn_loss_iou: 0.2043 d1.loss_lmm_region: 0.1128 loss_lmm_image: 0.7637 2024/11/13 09:52:33 - mmengine - INFO - Iter(train) [120700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:22:01 time: 2.0094 data_time: 0.0186 memory: 33761 grad_norm: 37.5372 loss: 9.6212 loss_cls: 0.2876 loss_bbox: 0.1456 loss_iou: 0.2413 d0.loss_cls: 0.3306 d0.loss_bbox: 0.1599 d0.loss_iou: 0.2592 d1.loss_cls: 0.3032 d1.loss_bbox: 0.1546 d1.loss_iou: 0.2497 d2.loss_cls: 0.2951 d2.loss_bbox: 0.1458 d2.loss_iou: 0.2416 d3.loss_cls: 0.2905 d3.loss_bbox: 0.1467 d3.loss_iou: 0.2423 d4.loss_cls: 0.2846 d4.loss_bbox: 0.1450 d4.loss_iou: 0.2425 enc_loss_cls: 0.3345 enc_loss_bbox: 0.1709 enc_loss_iou: 0.2745 dn_loss_cls: 0.1396 dn_loss_bbox: 0.1842 dn_loss_iou: 0.2270 d0.dn_loss_cls: 0.2287 d0.dn_loss_bbox: 0.3312 d0.dn_loss_iou: 0.3602 d1.dn_loss_cls: 0.1807 d1.dn_loss_bbox: 0.2149 d1.dn_loss_iou: 0.2539 d2.dn_loss_cls: 0.1581 d2.dn_loss_bbox: 0.1970 d2.dn_loss_iou: 0.2372 d3.dn_loss_cls: 0.1480 d3.dn_loss_bbox: 0.1856 d3.dn_loss_iou: 0.2290 d4.dn_loss_cls: 0.1440 d4.dn_loss_bbox: 0.1842 d4.dn_loss_iou: 0.2271 d1.loss_lmm_region: 0.1206 loss_lmm_image: 0.7242 2024/11/13 09:55:53 - mmengine - INFO - Iter(train) [120800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:18:40 time: 2.0016 data_time: 0.0185 memory: 34737 grad_norm: 24.6587 loss: 9.2300 loss_cls: 0.2861 loss_bbox: 0.1620 loss_iou: 0.2877 d0.loss_cls: 0.3192 d0.loss_bbox: 0.1697 d0.loss_iou: 0.3016 d1.loss_cls: 0.3097 d1.loss_bbox: 0.1585 d1.loss_iou: 0.2900 d2.loss_cls: 0.2978 d2.loss_bbox: 0.1549 d2.loss_iou: 0.2925 d3.loss_cls: 0.2946 d3.loss_bbox: 0.1532 d3.loss_iou: 0.2868 d4.loss_cls: 0.2849 d4.loss_bbox: 0.1625 d4.loss_iou: 0.2881 enc_loss_cls: 0.3304 enc_loss_bbox: 0.1792 enc_loss_iou: 0.3138 dn_loss_cls: 0.0661 dn_loss_bbox: 0.1579 dn_loss_iou: 0.2116 d0.dn_loss_cls: 0.1361 d0.dn_loss_bbox: 0.2868 d0.dn_loss_iou: 0.3414 d1.dn_loss_cls: 0.0901 d1.dn_loss_bbox: 0.1873 d1.dn_loss_iou: 0.2391 d2.dn_loss_cls: 0.0755 d2.dn_loss_bbox: 0.1663 d2.dn_loss_iou: 0.2203 d3.dn_loss_cls: 0.0696 d3.dn_loss_bbox: 0.1590 d3.dn_loss_iou: 0.2132 d4.dn_loss_cls: 0.0666 d4.dn_loss_bbox: 0.1579 d4.dn_loss_iou: 0.2116 d1.loss_lmm_region: 
0.1017 loss_lmm_image: 0.7486 2024/11/13 09:59:13 - mmengine - INFO - Iter(train) [120900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:15:18 time: 1.9774 data_time: 0.0187 memory: 35076 grad_norm: 25.5372 loss: 8.0408 loss_cls: 0.2334 loss_bbox: 0.1228 loss_iou: 0.2295 d0.loss_cls: 0.2654 d0.loss_bbox: 0.1298 d0.loss_iou: 0.2432 d1.loss_cls: 0.2424 d1.loss_bbox: 0.1279 d1.loss_iou: 0.2353 d2.loss_cls: 0.2380 d2.loss_bbox: 0.1241 d2.loss_iou: 0.2316 d3.loss_cls: 0.2328 d3.loss_bbox: 0.1250 d3.loss_iou: 0.2306 d4.loss_cls: 0.2318 d4.loss_bbox: 0.1240 d4.loss_iou: 0.2311 enc_loss_cls: 0.2747 enc_loss_bbox: 0.1408 enc_loss_iou: 0.2604 dn_loss_cls: 0.0747 dn_loss_bbox: 0.1396 dn_loss_iou: 0.1927 d0.dn_loss_cls: 0.1454 d0.dn_loss_bbox: 0.2710 d0.dn_loss_iou: 0.3160 d1.dn_loss_cls: 0.0999 d1.dn_loss_bbox: 0.1720 d1.dn_loss_iou: 0.2194 d2.dn_loss_cls: 0.0852 d2.dn_loss_bbox: 0.1485 d2.dn_loss_iou: 0.2001 d3.dn_loss_cls: 0.0795 d3.dn_loss_bbox: 0.1415 d3.dn_loss_iou: 0.1942 d4.dn_loss_cls: 0.0752 d4.dn_loss_bbox: 0.1397 d4.dn_loss_iou: 0.1927 d1.loss_lmm_region: 0.1014 loss_lmm_image: 0.7776 2024/11/13 10:02:33 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 10:02:33 - mmengine - INFO - Iter(train) [121000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:11:57 time: 2.0026 data_time: 0.0186 memory: 33163 grad_norm: 30.8261 loss: 8.8557 loss_cls: 0.2640 loss_bbox: 0.1424 loss_iou: 0.2353 d0.loss_cls: 0.3057 d0.loss_bbox: 0.1562 d0.loss_iou: 0.2443 d1.loss_cls: 0.2866 d1.loss_bbox: 0.1452 d1.loss_iou: 0.2360 d2.loss_cls: 0.2772 d2.loss_bbox: 0.1431 d2.loss_iou: 0.2342 d3.loss_cls: 0.2714 d3.loss_bbox: 0.1418 d3.loss_iou: 0.2324 d4.loss_cls: 0.2687 d4.loss_bbox: 0.1420 d4.loss_iou: 0.2336 enc_loss_cls: 0.3167 enc_loss_bbox: 0.1611 enc_loss_iou: 0.2549 dn_loss_cls: 0.0861 dn_loss_bbox: 0.1708 dn_loss_iou: 0.2159 d0.dn_loss_cls: 0.1689 d0.dn_loss_bbox: 0.3013 d0.dn_loss_iou: 0.3480 d1.dn_loss_cls: 0.1129 d1.dn_loss_bbox: 0.1955 d1.dn_loss_iou: 0.2426 d2.dn_loss_cls: 0.0927 d2.dn_loss_bbox: 0.1770 d2.dn_loss_iou: 0.2243 d3.dn_loss_cls: 0.0871 d3.dn_loss_bbox: 0.1725 d3.dn_loss_iou: 0.2181 d4.dn_loss_cls: 0.0862 d4.dn_loss_bbox: 0.1709 d4.dn_loss_iou: 0.2159 d1.loss_lmm_region: 0.1241 loss_lmm_image: 0.7522 2024/11/13 10:05:49 - mmengine - INFO - Iter(train) [121100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:08:35 time: 1.9807 data_time: 0.0185 memory: 33500 grad_norm: 22.7558 loss: 9.5234 loss_cls: 0.3378 loss_bbox: 0.1409 loss_iou: 0.2624 d0.loss_cls: 0.3880 d0.loss_bbox: 0.1520 d0.loss_iou: 0.2760 d1.loss_cls: 0.3686 d1.loss_bbox: 0.1453 d1.loss_iou: 0.2681 d2.loss_cls: 0.3580 d2.loss_bbox: 0.1416 d2.loss_iou: 0.2631 d3.loss_cls: 0.3460 d3.loss_bbox: 0.1397 d3.loss_iou: 0.2640 d4.loss_cls: 0.3437 d4.loss_bbox: 0.1395 d4.loss_iou: 0.2603 enc_loss_cls: 0.3956 enc_loss_bbox: 0.1617 enc_loss_iou: 0.2895 dn_loss_cls: 0.0957 dn_loss_bbox: 0.1582 dn_loss_iou: 0.2058 d0.dn_loss_cls: 0.1801 d0.dn_loss_bbox: 0.2906 d0.dn_loss_iou: 0.3352 d1.dn_loss_cls: 0.1277 d1.dn_loss_bbox: 0.1868 d1.dn_loss_iou: 0.2337 d2.dn_loss_cls: 0.1093 d2.dn_loss_bbox: 0.1657 d2.dn_loss_iou: 0.2142 d3.dn_loss_cls: 0.0998 d3.dn_loss_bbox: 0.1598 d3.dn_loss_iou: 0.2082 d4.dn_loss_cls: 0.0950 d4.dn_loss_bbox: 0.1583 d4.dn_loss_iou: 0.2058 d1.loss_lmm_region: 0.1063 loss_lmm_image: 0.7455 2024/11/13 10:09:09 - mmengine - INFO - Iter(train) [121200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:05:13 time: 2.0109 data_time: 0.0187 memory: 34086 grad_norm: 35.2362 loss: 8.0130 loss_cls: 
0.2336 loss_bbox: 0.1185 loss_iou: 0.2316 d0.loss_cls: 0.2580 d0.loss_bbox: 0.1342 d0.loss_iou: 0.2479 d1.loss_cls: 0.2394 d1.loss_bbox: 0.1297 d1.loss_iou: 0.2418 d2.loss_cls: 0.2371 d2.loss_bbox: 0.1264 d2.loss_iou: 0.2355 d3.loss_cls: 0.2366 d3.loss_bbox: 0.1206 d3.loss_iou: 0.2337 d4.loss_cls: 0.2326 d4.loss_bbox: 0.1201 d4.loss_iou: 0.2329 enc_loss_cls: 0.2707 enc_loss_bbox: 0.1431 enc_loss_iou: 0.2638 dn_loss_cls: 0.0701 dn_loss_bbox: 0.1421 dn_loss_iou: 0.2054 d0.dn_loss_cls: 0.1404 d0.dn_loss_bbox: 0.2646 d0.dn_loss_iou: 0.3257 d1.dn_loss_cls: 0.0946 d1.dn_loss_bbox: 0.1661 d1.dn_loss_iou: 0.2288 d2.dn_loss_cls: 0.0818 d2.dn_loss_bbox: 0.1494 d2.dn_loss_iou: 0.2135 d3.dn_loss_cls: 0.0751 d3.dn_loss_bbox: 0.1427 d3.dn_loss_iou: 0.2074 d4.dn_loss_cls: 0.0708 d4.dn_loss_bbox: 0.1421 d4.dn_loss_iou: 0.2054 d1.loss_lmm_region: 0.0966 loss_lmm_image: 0.7028 2024/11/13 10:12:27 - mmengine - INFO - Iter(train) [121300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 16:01:51 time: 1.9762 data_time: 0.0185 memory: 35314 grad_norm: 27.3878 loss: 7.6205 loss_cls: 0.2233 loss_bbox: 0.1070 loss_iou: 0.2032 d0.loss_cls: 0.2586 d0.loss_bbox: 0.1177 d0.loss_iou: 0.2116 d1.loss_cls: 0.2388 d1.loss_bbox: 0.1131 d1.loss_iou: 0.2066 d2.loss_cls: 0.2369 d2.loss_bbox: 0.1056 d2.loss_iou: 0.1991 d3.loss_cls: 0.2309 d3.loss_bbox: 0.1074 d3.loss_iou: 0.1985 d4.loss_cls: 0.2247 d4.loss_bbox: 0.1069 d4.loss_iou: 0.2031 enc_loss_cls: 0.2689 enc_loss_bbox: 0.1229 enc_loss_iou: 0.2272 dn_loss_cls: 0.0851 dn_loss_bbox: 0.1282 dn_loss_iou: 0.1918 d0.dn_loss_cls: 0.1561 d0.dn_loss_bbox: 0.2550 d0.dn_loss_iou: 0.3216 d1.dn_loss_cls: 0.1118 d1.dn_loss_bbox: 0.1525 d1.dn_loss_iou: 0.2192 d2.dn_loss_cls: 0.1005 d2.dn_loss_bbox: 0.1355 d2.dn_loss_iou: 0.2001 d3.dn_loss_cls: 0.0908 d3.dn_loss_bbox: 0.1294 d3.dn_loss_iou: 0.1936 d4.dn_loss_cls: 0.0853 d4.dn_loss_bbox: 0.1283 d4.dn_loss_iou: 0.1918 d1.loss_lmm_region: 0.0994 loss_lmm_image: 0.7321 2024/11/13 10:15:44 - mmengine - INFO - Iter(train) [121400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:58:29 time: 1.9809 data_time: 0.0185 memory: 35187 grad_norm: 23.3861 loss: 8.4377 loss_cls: 0.2526 loss_bbox: 0.1297 loss_iou: 0.2382 d0.loss_cls: 0.2871 d0.loss_bbox: 0.1431 d0.loss_iou: 0.2539 d1.loss_cls: 0.2713 d1.loss_bbox: 0.1325 d1.loss_iou: 0.2400 d2.loss_cls: 0.2648 d2.loss_bbox: 0.1304 d2.loss_iou: 0.2407 d3.loss_cls: 0.2579 d3.loss_bbox: 0.1290 d3.loss_iou: 0.2384 d4.loss_cls: 0.2545 d4.loss_bbox: 0.1284 d4.loss_iou: 0.2387 enc_loss_cls: 0.2932 enc_loss_bbox: 0.1534 enc_loss_iou: 0.2678 dn_loss_cls: 0.0700 dn_loss_bbox: 0.1515 dn_loss_iou: 0.2058 d0.dn_loss_cls: 0.1547 d0.dn_loss_bbox: 0.2871 d0.dn_loss_iou: 0.3444 d1.dn_loss_cls: 0.1069 d1.dn_loss_bbox: 0.1839 d1.dn_loss_iou: 0.2364 d2.dn_loss_cls: 0.0857 d2.dn_loss_bbox: 0.1594 d2.dn_loss_iou: 0.2150 d3.dn_loss_cls: 0.0759 d3.dn_loss_bbox: 0.1548 d3.dn_loss_iou: 0.2086 d4.dn_loss_cls: 0.0708 d4.dn_loss_bbox: 0.1515 d4.dn_loss_iou: 0.2060 d1.loss_lmm_region: 0.0964 loss_lmm_image: 0.7275 2024/11/13 10:19:06 - mmengine - INFO - Iter(train) [121500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:55:09 time: 2.0241 data_time: 0.0185 memory: 33620 grad_norm: 24.5695 loss: 7.8302 loss_cls: 0.2318 loss_bbox: 0.1154 loss_iou: 0.1955 d0.loss_cls: 0.2712 d0.loss_bbox: 0.1222 d0.loss_iou: 0.2067 d1.loss_cls: 0.2539 d1.loss_bbox: 0.1140 d1.loss_iou: 0.1955 d2.loss_cls: 0.2440 d2.loss_bbox: 0.1122 d2.loss_iou: 0.1912 d3.loss_cls: 0.2385 d3.loss_bbox: 0.1132 d3.loss_iou: 0.1942 d4.loss_cls: 0.2324 d4.loss_bbox: 0.1146 
d4.loss_iou: 0.1954 enc_loss_cls: 0.2759 enc_loss_bbox: 0.1342 enc_loss_iou: 0.2258 dn_loss_cls: 0.0778 dn_loss_bbox: 0.1528 dn_loss_iou: 0.1892 d0.dn_loss_cls: 0.1444 d0.dn_loss_bbox: 0.2890 d0.dn_loss_iou: 0.3258 d1.dn_loss_cls: 0.1011 d1.dn_loss_bbox: 0.1771 d1.dn_loss_iou: 0.2167 d2.dn_loss_cls: 0.0863 d2.dn_loss_bbox: 0.1584 d2.dn_loss_iou: 0.1968 d3.dn_loss_cls: 0.0804 d3.dn_loss_bbox: 0.1535 d3.dn_loss_iou: 0.1911 d4.dn_loss_cls: 0.0784 d4.dn_loss_bbox: 0.1528 d4.dn_loss_iou: 0.1892 d1.loss_lmm_region: 0.0893 loss_lmm_image: 0.8024 2024/11/13 10:22:24 - mmengine - INFO - Iter(train) [121600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:51:47 time: 1.9721 data_time: 0.0186 memory: 33364 grad_norm: 24.8329 loss: 8.5910 loss_cls: 0.2600 loss_bbox: 0.1285 loss_iou: 0.2484 d0.loss_cls: 0.2942 d0.loss_bbox: 0.1412 d0.loss_iou: 0.2586 d1.loss_cls: 0.2711 d1.loss_bbox: 0.1385 d1.loss_iou: 0.2601 d2.loss_cls: 0.2671 d2.loss_bbox: 0.1294 d2.loss_iou: 0.2525 d3.loss_cls: 0.2656 d3.loss_bbox: 0.1278 d3.loss_iou: 0.2480 d4.loss_cls: 0.2611 d4.loss_bbox: 0.1305 d4.loss_iou: 0.2495 enc_loss_cls: 0.3042 enc_loss_bbox: 0.1459 enc_loss_iou: 0.2779 dn_loss_cls: 0.0820 dn_loss_bbox: 0.1459 dn_loss_iou: 0.2120 d0.dn_loss_cls: 0.1635 d0.dn_loss_bbox: 0.2779 d0.dn_loss_iou: 0.3470 d1.dn_loss_cls: 0.1143 d1.dn_loss_bbox: 0.1745 d1.dn_loss_iou: 0.2384 d2.dn_loss_cls: 0.0923 d2.dn_loss_bbox: 0.1540 d2.dn_loss_iou: 0.2203 d3.dn_loss_cls: 0.0860 d3.dn_loss_bbox: 0.1476 d3.dn_loss_iou: 0.2136 d4.dn_loss_cls: 0.0836 d4.dn_loss_bbox: 0.1459 d4.dn_loss_iou: 0.2120 d1.loss_lmm_region: 0.1017 loss_lmm_image: 0.7183 2024/11/13 10:25:45 - mmengine - INFO - Iter(train) [121700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:48:26 time: 1.9972 data_time: 0.0186 memory: 33771 grad_norm: 27.9856 loss: 9.1112 loss_cls: 0.3087 loss_bbox: 0.1316 loss_iou: 0.2878 d0.loss_cls: 0.3533 d0.loss_bbox: 0.1410 d0.loss_iou: 0.3060 d1.loss_cls: 0.3271 d1.loss_bbox: 0.1348 d1.loss_iou: 0.2952 d2.loss_cls: 0.3186 d2.loss_bbox: 0.1311 d2.loss_iou: 0.2930 d3.loss_cls: 0.3112 d3.loss_bbox: 0.1327 d3.loss_iou: 0.2943 d4.loss_cls: 0.3091 d4.loss_bbox: 0.1336 d4.loss_iou: 0.2893 enc_loss_cls: 0.3628 enc_loss_bbox: 0.1505 enc_loss_iou: 0.3178 dn_loss_cls: 0.1070 dn_loss_bbox: 0.1209 dn_loss_iou: 0.1897 d0.dn_loss_cls: 0.1873 d0.dn_loss_bbox: 0.2378 d0.dn_loss_iou: 0.3106 d1.dn_loss_cls: 0.1395 d1.dn_loss_bbox: 0.1477 d1.dn_loss_iou: 0.2162 d2.dn_loss_cls: 0.1208 d2.dn_loss_bbox: 0.1281 d2.dn_loss_iou: 0.1986 d3.dn_loss_cls: 0.1114 d3.dn_loss_bbox: 0.1222 d3.dn_loss_iou: 0.1922 d4.dn_loss_cls: 0.1070 d4.dn_loss_bbox: 0.1209 d4.dn_loss_iou: 0.1897 d1.loss_lmm_region: 0.1094 loss_lmm_image: 0.7247 2024/11/13 10:29:05 - mmengine - INFO - Iter(train) [121800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:45:04 time: 1.9944 data_time: 0.0186 memory: 31635 grad_norm: 29.3322 loss: 9.0671 loss_cls: 0.2701 loss_bbox: 0.1524 loss_iou: 0.2464 d0.loss_cls: 0.3149 d0.loss_bbox: 0.1626 d0.loss_iou: 0.2618 d1.loss_cls: 0.2962 d1.loss_bbox: 0.1529 d1.loss_iou: 0.2502 d2.loss_cls: 0.2857 d2.loss_bbox: 0.1482 d2.loss_iou: 0.2472 d3.loss_cls: 0.2809 d3.loss_bbox: 0.1472 d3.loss_iou: 0.2429 d4.loss_cls: 0.2747 d4.loss_bbox: 0.1499 d4.loss_iou: 0.2453 enc_loss_cls: 0.3221 enc_loss_bbox: 0.1669 enc_loss_iou: 0.2740 dn_loss_cls: 0.0972 dn_loss_bbox: 0.1616 dn_loss_iou: 0.2097 d0.dn_loss_cls: 0.1676 d0.dn_loss_bbox: 0.3049 d0.dn_loss_iou: 0.3473 d1.dn_loss_cls: 0.1263 d1.dn_loss_bbox: 0.1963 d1.dn_loss_iou: 0.2406 d2.dn_loss_cls: 0.1083 d2.dn_loss_bbox: 
0.1715 d2.dn_loss_iou: 0.2194 d3.dn_loss_cls: 0.1026 d3.dn_loss_bbox: 0.1623 d3.dn_loss_iou: 0.2115 d4.dn_loss_cls: 0.0984 d4.dn_loss_bbox: 0.1616 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.1109 loss_lmm_image: 0.7669 2024/11/13 10:32:25 - mmengine - INFO - Iter(train) [121900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:41:43 time: 1.9886 data_time: 0.0188 memory: 33666 grad_norm: 24.0532 loss: 7.8237 loss_cls: 0.2277 loss_bbox: 0.1232 loss_iou: 0.2185 d0.loss_cls: 0.2632 d0.loss_bbox: 0.1356 d0.loss_iou: 0.2283 d1.loss_cls: 0.2424 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2269 d2.loss_cls: 0.2394 d2.loss_bbox: 0.1232 d2.loss_iou: 0.2224 d3.loss_cls: 0.2339 d3.loss_bbox: 0.1227 d3.loss_iou: 0.2213 d4.loss_cls: 0.2314 d4.loss_bbox: 0.1205 d4.loss_iou: 0.2176 enc_loss_cls: 0.2729 enc_loss_bbox: 0.1449 enc_loss_iou: 0.2464 dn_loss_cls: 0.0765 dn_loss_bbox: 0.1359 dn_loss_iou: 0.1930 d0.dn_loss_cls: 0.1396 d0.dn_loss_bbox: 0.2581 d0.dn_loss_iou: 0.3169 d1.dn_loss_cls: 0.0952 d1.dn_loss_bbox: 0.1591 d1.dn_loss_iou: 0.2147 d2.dn_loss_cls: 0.0853 d2.dn_loss_bbox: 0.1429 d2.dn_loss_iou: 0.1996 d3.dn_loss_cls: 0.0825 d3.dn_loss_bbox: 0.1372 d3.dn_loss_iou: 0.1948 d4.dn_loss_cls: 0.0771 d4.dn_loss_bbox: 0.1359 d4.dn_loss_iou: 0.1930 d1.loss_lmm_region: 0.1025 loss_lmm_image: 0.6955 2024/11/13 10:35:43 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 10:35:43 - mmengine - INFO - Iter(train) [122000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:38:21 time: 1.9705 data_time: 0.0187 memory: 33303 grad_norm: 28.6418 loss: 9.0728 loss_cls: 0.2913 loss_bbox: 0.1409 loss_iou: 0.2490 d0.loss_cls: 0.3295 d0.loss_bbox: 0.1582 d0.loss_iou: 0.2655 d1.loss_cls: 0.3053 d1.loss_bbox: 0.1468 d1.loss_iou: 0.2536 d2.loss_cls: 0.2969 d2.loss_bbox: 0.1448 d2.loss_iou: 0.2518 d3.loss_cls: 0.2941 d3.loss_bbox: 0.1393 d3.loss_iou: 0.2454 d4.loss_cls: 0.2899 d4.loss_bbox: 0.1420 d4.loss_iou: 0.2502 enc_loss_cls: 0.3347 enc_loss_bbox: 0.1732 enc_loss_iou: 0.2850 dn_loss_cls: 0.0879 dn_loss_bbox: 0.1664 dn_loss_iou: 0.2101 d0.dn_loss_cls: 0.1631 d0.dn_loss_bbox: 0.2985 d0.dn_loss_iou: 0.3431 d1.dn_loss_cls: 0.1136 d1.dn_loss_bbox: 0.1878 d1.dn_loss_iou: 0.2359 d2.dn_loss_cls: 0.0951 d2.dn_loss_bbox: 0.1697 d2.dn_loss_iou: 0.2165 d3.dn_loss_cls: 0.0893 d3.dn_loss_bbox: 0.1664 d3.dn_loss_iou: 0.2116 d4.dn_loss_cls: 0.0870 d4.dn_loss_bbox: 0.1664 d4.dn_loss_iou: 0.2101 d1.loss_lmm_region: 0.1133 loss_lmm_image: 0.7534 2024/11/13 10:39:01 - mmengine - INFO - Iter(train) [122100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:34:59 time: 1.9787 data_time: 0.0189 memory: 33119 grad_norm: 30.2050 loss: 9.0804 loss_cls: 0.2738 loss_bbox: 0.1489 loss_iou: 0.2641 d0.loss_cls: 0.3049 d0.loss_bbox: 0.1605 d0.loss_iou: 0.2760 d1.loss_cls: 0.2879 d1.loss_bbox: 0.1515 d1.loss_iou: 0.2649 d2.loss_cls: 0.2817 d2.loss_bbox: 0.1486 d2.loss_iou: 0.2609 d3.loss_cls: 0.2742 d3.loss_bbox: 0.1493 d3.loss_iou: 0.2639 d4.loss_cls: 0.2741 d4.loss_bbox: 0.1493 d4.loss_iou: 0.2639 enc_loss_cls: 0.3153 enc_loss_bbox: 0.1736 enc_loss_iou: 0.2896 dn_loss_cls: 0.0917 dn_loss_bbox: 0.1557 dn_loss_iou: 0.2196 d0.dn_loss_cls: 0.1651 d0.dn_loss_bbox: 0.2845 d0.dn_loss_iou: 0.3496 d1.dn_loss_cls: 0.1225 d1.dn_loss_bbox: 0.1800 d1.dn_loss_iou: 0.2442 d2.dn_loss_cls: 0.1044 d2.dn_loss_bbox: 0.1642 d2.dn_loss_iou: 0.2266 d3.dn_loss_cls: 0.0973 d3.dn_loss_bbox: 0.1568 d3.dn_loss_iou: 0.2210 d4.dn_loss_cls: 0.0929 d4.dn_loss_bbox: 0.1557 d4.dn_loss_iou: 0.2197 d1.loss_lmm_region: 0.1069 loss_lmm_image: 0.7452 2024/11/13 
10:42:21 - mmengine - INFO - Iter(train) [122200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:31:38 time: 2.0082 data_time: 0.0185 memory: 33671 grad_norm: 24.5089 loss: 8.2381 loss_cls: 0.2703 loss_bbox: 0.1159 loss_iou: 0.2261 d0.loss_cls: 0.3077 d0.loss_bbox: 0.1291 d0.loss_iou: 0.2386 d1.loss_cls: 0.2914 d1.loss_bbox: 0.1222 d1.loss_iou: 0.2323 d2.loss_cls: 0.2782 d2.loss_bbox: 0.1182 d2.loss_iou: 0.2315 d3.loss_cls: 0.2741 d3.loss_bbox: 0.1169 d3.loss_iou: 0.2291 d4.loss_cls: 0.2718 d4.loss_bbox: 0.1169 d4.loss_iou: 0.2282 enc_loss_cls: 0.3084 enc_loss_bbox: 0.1403 enc_loss_iou: 0.2539 dn_loss_cls: 0.0941 dn_loss_bbox: 0.1373 dn_loss_iou: 0.1917 d0.dn_loss_cls: 0.1641 d0.dn_loss_bbox: 0.2544 d0.dn_loss_iou: 0.3074 d1.dn_loss_cls: 0.1156 d1.dn_loss_bbox: 0.1640 d1.dn_loss_iou: 0.2157 d2.dn_loss_cls: 0.1006 d2.dn_loss_bbox: 0.1465 d2.dn_loss_iou: 0.1987 d3.dn_loss_cls: 0.0959 d3.dn_loss_bbox: 0.1393 d3.dn_loss_iou: 0.1936 d4.dn_loss_cls: 0.0944 d4.dn_loss_bbox: 0.1373 d4.dn_loss_iou: 0.1917 d1.loss_lmm_region: 0.0986 loss_lmm_image: 0.6959 2024/11/13 10:45:40 - mmengine - INFO - Iter(train) [122300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:28:16 time: 1.9749 data_time: 0.0187 memory: 32663 grad_norm: 24.8384 loss: 8.8210 loss_cls: 0.2745 loss_bbox: 0.1229 loss_iou: 0.2555 d0.loss_cls: 0.3038 d0.loss_bbox: 0.1346 d0.loss_iou: 0.2698 d1.loss_cls: 0.2861 d1.loss_bbox: 0.1256 d1.loss_iou: 0.2602 d2.loss_cls: 0.2771 d2.loss_bbox: 0.1230 d2.loss_iou: 0.2583 d3.loss_cls: 0.2767 d3.loss_bbox: 0.1222 d3.loss_iou: 0.2551 d4.loss_cls: 0.2756 d4.loss_bbox: 0.1234 d4.loss_iou: 0.2566 enc_loss_cls: 0.3162 enc_loss_bbox: 0.1445 enc_loss_iou: 0.2787 dn_loss_cls: 0.0855 dn_loss_bbox: 0.1398 dn_loss_iou: 0.2228 d0.dn_loss_cls: 0.1685 d0.dn_loss_bbox: 0.2853 d0.dn_loss_iou: 0.3638 d1.dn_loss_cls: 0.1194 d1.dn_loss_bbox: 0.1722 d1.dn_loss_iou: 0.2531 d2.dn_loss_cls: 0.0963 d2.dn_loss_bbox: 0.1503 d2.dn_loss_iou: 0.2327 d3.dn_loss_cls: 0.0898 d3.dn_loss_bbox: 0.1427 d3.dn_loss_iou: 0.2259 d4.dn_loss_cls: 0.0874 d4.dn_loss_bbox: 0.1399 d4.dn_loss_iou: 0.2229 d1.loss_lmm_region: 0.0885 loss_lmm_image: 0.7938 2024/11/13 10:48:59 - mmengine - INFO - Iter(train) [122400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:24:55 time: 1.9867 data_time: 0.0187 memory: 34089 grad_norm: 26.5092 loss: 8.4906 loss_cls: 0.2458 loss_bbox: 0.1282 loss_iou: 0.2542 d0.loss_cls: 0.2923 d0.loss_bbox: 0.1339 d0.loss_iou: 0.2640 d1.loss_cls: 0.2632 d1.loss_bbox: 0.1286 d1.loss_iou: 0.2590 d2.loss_cls: 0.2549 d2.loss_bbox: 0.1268 d2.loss_iou: 0.2571 d3.loss_cls: 0.2476 d3.loss_bbox: 0.1269 d3.loss_iou: 0.2547 d4.loss_cls: 0.2464 d4.loss_bbox: 0.1293 d4.loss_iou: 0.2555 enc_loss_cls: 0.2990 enc_loss_bbox: 0.1455 enc_loss_iou: 0.2808 dn_loss_cls: 0.0890 dn_loss_bbox: 0.1406 dn_loss_iou: 0.2051 d0.dn_loss_cls: 0.1778 d0.dn_loss_bbox: 0.2725 d0.dn_loss_iou: 0.3368 d1.dn_loss_cls: 0.1250 d1.dn_loss_bbox: 0.1697 d1.dn_loss_iou: 0.2324 d2.dn_loss_cls: 0.1046 d2.dn_loss_bbox: 0.1490 d2.dn_loss_iou: 0.2133 d3.dn_loss_cls: 0.0960 d3.dn_loss_bbox: 0.1429 d3.dn_loss_iou: 0.2069 d4.dn_loss_cls: 0.0899 d4.dn_loss_bbox: 0.1407 d4.dn_loss_iou: 0.2052 d1.loss_lmm_region: 0.1040 loss_lmm_image: 0.6956 2024/11/13 10:52:19 - mmengine - INFO - Iter(train) [122500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:21:34 time: 2.0072 data_time: 0.0187 memory: 33813 grad_norm: 23.8681 loss: 8.4847 loss_cls: 0.2524 loss_bbox: 0.1303 loss_iou: 0.2383 d0.loss_cls: 0.2812 d0.loss_bbox: 0.1379 d0.loss_iou: 0.2475 d1.loss_cls: 0.2654 
d1.loss_bbox: 0.1333 d1.loss_iou: 0.2440 d2.loss_cls: 0.2621 d2.loss_bbox: 0.1299 d2.loss_iou: 0.2381 d3.loss_cls: 0.2557 d3.loss_bbox: 0.1295 d3.loss_iou: 0.2383 d4.loss_cls: 0.2525 d4.loss_bbox: 0.1307 d4.loss_iou: 0.2398 enc_loss_cls: 0.2872 enc_loss_bbox: 0.1451 enc_loss_iou: 0.2608 dn_loss_cls: 0.0988 dn_loss_bbox: 0.1476 dn_loss_iou: 0.2097 d0.dn_loss_cls: 0.1819 d0.dn_loss_bbox: 0.2666 d0.dn_loss_iou: 0.3333 d1.dn_loss_cls: 0.1315 d1.dn_loss_bbox: 0.1732 d1.dn_loss_iou: 0.2337 d2.dn_loss_cls: 0.1117 d2.dn_loss_bbox: 0.1556 d2.dn_loss_iou: 0.2165 d3.dn_loss_cls: 0.1031 d3.dn_loss_bbox: 0.1485 d3.dn_loss_iou: 0.2111 d4.dn_loss_cls: 0.0993 d4.dn_loss_bbox: 0.1475 d4.dn_loss_iou: 0.2096 d1.loss_lmm_region: 0.1059 loss_lmm_image: 0.6994 2024/11/13 10:55:39 - mmengine - INFO - Iter(train) [122600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:18:12 time: 2.0083 data_time: 0.0185 memory: 34519 grad_norm: 22.3715 loss: 9.1884 loss_cls: 0.2730 loss_bbox: 0.1545 loss_iou: 0.2607 d0.loss_cls: 0.3102 d0.loss_bbox: 0.1625 d0.loss_iou: 0.2663 d1.loss_cls: 0.2891 d1.loss_bbox: 0.1593 d1.loss_iou: 0.2638 d2.loss_cls: 0.2829 d2.loss_bbox: 0.1551 d2.loss_iou: 0.2587 d3.loss_cls: 0.2768 d3.loss_bbox: 0.1555 d3.loss_iou: 0.2603 d4.loss_cls: 0.2726 d4.loss_bbox: 0.1553 d4.loss_iou: 0.2606 enc_loss_cls: 0.3207 enc_loss_bbox: 0.1724 enc_loss_iou: 0.2801 dn_loss_cls: 0.0785 dn_loss_bbox: 0.1708 dn_loss_iou: 0.2205 d0.dn_loss_cls: 0.1688 d0.dn_loss_bbox: 0.3280 d0.dn_loss_iou: 0.3570 d1.dn_loss_cls: 0.1117 d1.dn_loss_bbox: 0.2041 d1.dn_loss_iou: 0.2503 d2.dn_loss_cls: 0.0908 d2.dn_loss_bbox: 0.1822 d2.dn_loss_iou: 0.2302 d3.dn_loss_cls: 0.0833 d3.dn_loss_bbox: 0.1733 d3.dn_loss_iou: 0.2233 d4.dn_loss_cls: 0.0792 d4.dn_loss_bbox: 0.1708 d4.dn_loss_iou: 0.2205 d1.loss_lmm_region: 0.1022 loss_lmm_image: 0.7522 2024/11/13 10:58:57 - mmengine - INFO - Iter(train) [122700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:14:51 time: 1.9887 data_time: 0.0186 memory: 34297 grad_norm: 32.6917 loss: 8.6155 loss_cls: 0.2722 loss_bbox: 0.1510 loss_iou: 0.2463 d0.loss_cls: 0.3088 d0.loss_bbox: 0.1668 d0.loss_iou: 0.2650 d1.loss_cls: 0.2871 d1.loss_bbox: 0.1551 d1.loss_iou: 0.2523 d2.loss_cls: 0.2799 d2.loss_bbox: 0.1511 d2.loss_iou: 0.2456 d3.loss_cls: 0.2733 d3.loss_bbox: 0.1509 d3.loss_iou: 0.2466 d4.loss_cls: 0.2701 d4.loss_bbox: 0.1515 d4.loss_iou: 0.2454 enc_loss_cls: 0.3261 enc_loss_bbox: 0.1716 enc_loss_iou: 0.2722 dn_loss_cls: 0.0799 dn_loss_bbox: 0.1419 dn_loss_iou: 0.1928 d0.dn_loss_cls: 0.1571 d0.dn_loss_bbox: 0.2815 d0.dn_loss_iou: 0.3209 d1.dn_loss_cls: 0.1076 d1.dn_loss_bbox: 0.1724 d1.dn_loss_iou: 0.2188 d2.dn_loss_cls: 0.0918 d2.dn_loss_bbox: 0.1508 d2.dn_loss_iou: 0.1999 d3.dn_loss_cls: 0.0841 d3.dn_loss_bbox: 0.1436 d3.dn_loss_iou: 0.1947 d4.dn_loss_cls: 0.0794 d4.dn_loss_bbox: 0.1419 d4.dn_loss_iou: 0.1928 d1.loss_lmm_region: 0.0855 loss_lmm_image: 0.6894 2024/11/13 11:02:16 - mmengine - INFO - Iter(train) [122800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:11:29 time: 1.9958 data_time: 0.0188 memory: 34171 grad_norm: 34.2217 loss: 9.9425 loss_cls: 0.3106 loss_bbox: 0.1627 loss_iou: 0.2906 d0.loss_cls: 0.3527 d0.loss_bbox: 0.1856 d0.loss_iou: 0.3116 d1.loss_cls: 0.3313 d1.loss_bbox: 0.1728 d1.loss_iou: 0.3001 d2.loss_cls: 0.3166 d2.loss_bbox: 0.1713 d2.loss_iou: 0.2960 d3.loss_cls: 0.3141 d3.loss_bbox: 0.1650 d3.loss_iou: 0.2919 d4.loss_cls: 0.3082 d4.loss_bbox: 0.1661 d4.loss_iou: 0.2937 enc_loss_cls: 0.3652 enc_loss_bbox: 0.1857 enc_loss_iou: 0.3194 dn_loss_cls: 0.0877 dn_loss_bbox: 
0.1699 dn_loss_iou: 0.2358 d0.dn_loss_cls: 0.1959 d0.dn_loss_bbox: 0.3122 d0.dn_loss_iou: 0.3700 d1.dn_loss_cls: 0.1340 d1.dn_loss_bbox: 0.2047 d1.dn_loss_iou: 0.2655 d2.dn_loss_cls: 0.1026 d2.dn_loss_bbox: 0.1840 d2.dn_loss_iou: 0.2455 d3.dn_loss_cls: 0.0947 d3.dn_loss_bbox: 0.1730 d3.dn_loss_iou: 0.2380 d4.dn_loss_cls: 0.0889 d4.dn_loss_bbox: 0.1699 d4.dn_loss_iou: 0.2358 d1.loss_lmm_region: 0.1293 loss_lmm_image: 0.6941 2024/11/13 11:05:33 - mmengine - INFO - Iter(train) [122900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:08:07 time: 1.9716 data_time: 0.0186 memory: 33784 grad_norm: 22.6107 loss: 9.2004 loss_cls: 0.2606 loss_bbox: 0.1585 loss_iou: 0.2635 d0.loss_cls: 0.2962 d0.loss_bbox: 0.1747 d0.loss_iou: 0.2808 d1.loss_cls: 0.2815 d1.loss_bbox: 0.1584 d1.loss_iou: 0.2677 d2.loss_cls: 0.2711 d2.loss_bbox: 0.1588 d2.loss_iou: 0.2683 d3.loss_cls: 0.2638 d3.loss_bbox: 0.1579 d3.loss_iou: 0.2650 d4.loss_cls: 0.2637 d4.loss_bbox: 0.1574 d4.loss_iou: 0.2634 enc_loss_cls: 0.3067 enc_loss_bbox: 0.1831 enc_loss_iou: 0.2949 dn_loss_cls: 0.0919 dn_loss_bbox: 0.1739 dn_loss_iou: 0.2302 d0.dn_loss_cls: 0.1651 d0.dn_loss_bbox: 0.3192 d0.dn_loss_iou: 0.3635 d1.dn_loss_cls: 0.1184 d1.dn_loss_bbox: 0.2051 d1.dn_loss_iou: 0.2586 d2.dn_loss_cls: 0.1042 d2.dn_loss_bbox: 0.1823 d2.dn_loss_iou: 0.2393 d3.dn_loss_cls: 0.0969 d3.dn_loss_bbox: 0.1752 d3.dn_loss_iou: 0.2321 d4.dn_loss_cls: 0.0935 d4.dn_loss_bbox: 0.1738 d4.dn_loss_iou: 0.2301 d1.loss_lmm_region: 0.0981 loss_lmm_image: 0.6531 2024/11/13 11:08:54 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 11:08:54 - mmengine - INFO - Iter(train) [123000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:04:46 time: 2.0027 data_time: 0.0187 memory: 34138 grad_norm: 24.8470 loss: 7.8920 loss_cls: 0.2878 loss_bbox: 0.1061 loss_iou: 0.2003 d0.loss_cls: 0.3183 d0.loss_bbox: 0.1174 d0.loss_iou: 0.2094 d1.loss_cls: 0.3025 d1.loss_bbox: 0.1104 d1.loss_iou: 0.2058 d2.loss_cls: 0.2954 d2.loss_bbox: 0.1096 d2.loss_iou: 0.2037 d3.loss_cls: 0.2913 d3.loss_bbox: 0.1084 d3.loss_iou: 0.2008 d4.loss_cls: 0.2863 d4.loss_bbox: 0.1062 d4.loss_iou: 0.2004 enc_loss_cls: 0.3201 enc_loss_bbox: 0.1321 enc_loss_iou: 0.2298 dn_loss_cls: 0.0739 dn_loss_bbox: 0.1362 dn_loss_iou: 0.1669 d0.dn_loss_cls: 0.1427 d0.dn_loss_bbox: 0.2645 d0.dn_loss_iou: 0.2828 d1.dn_loss_cls: 0.0983 d1.dn_loss_bbox: 0.1632 d1.dn_loss_iou: 0.1915 d2.dn_loss_cls: 0.0838 d2.dn_loss_bbox: 0.1448 d2.dn_loss_iou: 0.1746 d3.dn_loss_cls: 0.0764 d3.dn_loss_bbox: 0.1388 d3.dn_loss_iou: 0.1696 d4.dn_loss_cls: 0.0744 d4.dn_loss_bbox: 0.1362 d4.dn_loss_iou: 0.1669 d1.loss_lmm_region: 0.1045 loss_lmm_image: 0.7599 2024/11/13 11:12:14 - mmengine - INFO - Iter(train) [123100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 15:01:25 time: 1.9941 data_time: 0.0187 memory: 34178 grad_norm: 26.6122 loss: 9.3447 loss_cls: 0.2937 loss_bbox: 0.1525 loss_iou: 0.2540 d0.loss_cls: 0.3404 d0.loss_bbox: 0.1608 d0.loss_iou: 0.2648 d1.loss_cls: 0.3208 d1.loss_bbox: 0.1522 d1.loss_iou: 0.2557 d2.loss_cls: 0.3091 d2.loss_bbox: 0.1486 d2.loss_iou: 0.2529 d3.loss_cls: 0.2959 d3.loss_bbox: 0.1510 d3.loss_iou: 0.2537 d4.loss_cls: 0.2975 d4.loss_bbox: 0.1504 d4.loss_iou: 0.2538 enc_loss_cls: 0.3434 enc_loss_bbox: 0.1657 enc_loss_iou: 0.2741 dn_loss_cls: 0.1184 dn_loss_bbox: 0.1625 dn_loss_iou: 0.2007 d0.dn_loss_cls: 0.2022 d0.dn_loss_bbox: 0.3027 d0.dn_loss_iou: 0.3341 d1.dn_loss_cls: 0.1469 d1.dn_loss_bbox: 0.1896 d1.dn_loss_iou: 0.2272 d2.dn_loss_cls: 0.1255 d2.dn_loss_bbox: 0.1719 d2.dn_loss_iou: 0.2087 
d3.dn_loss_cls: 0.1201 d3.dn_loss_bbox: 0.1649 d3.dn_loss_iou: 0.2031 d4.dn_loss_cls: 0.1170 d4.dn_loss_bbox: 0.1625 d4.dn_loss_iou: 0.2008 d1.loss_lmm_region: 0.1102 loss_lmm_image: 0.7848 2024/11/13 11:15:33 - mmengine - INFO - Iter(train) [123200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:58:03 time: 2.0116 data_time: 0.0186 memory: 33590 grad_norm: 27.0784 loss: 9.1112 loss_cls: 0.2810 loss_bbox: 0.1530 loss_iou: 0.2430 d0.loss_cls: 0.3167 d0.loss_bbox: 0.1626 d0.loss_iou: 0.2504 d1.loss_cls: 0.2938 d1.loss_bbox: 0.1542 d1.loss_iou: 0.2488 d2.loss_cls: 0.2875 d2.loss_bbox: 0.1541 d2.loss_iou: 0.2462 d3.loss_cls: 0.2819 d3.loss_bbox: 0.1546 d3.loss_iou: 0.2439 d4.loss_cls: 0.2862 d4.loss_bbox: 0.1504 d4.loss_iou: 0.2403 enc_loss_cls: 0.3202 enc_loss_bbox: 0.1689 enc_loss_iou: 0.2619 dn_loss_cls: 0.0962 dn_loss_bbox: 0.1601 dn_loss_iou: 0.2135 d0.dn_loss_cls: 0.1831 d0.dn_loss_bbox: 0.2995 d0.dn_loss_iou: 0.3500 d1.dn_loss_cls: 0.1368 d1.dn_loss_bbox: 0.1936 d1.dn_loss_iou: 0.2435 d2.dn_loss_cls: 0.1136 d2.dn_loss_bbox: 0.1697 d2.dn_loss_iou: 0.2233 d3.dn_loss_cls: 0.1026 d3.dn_loss_bbox: 0.1620 d3.dn_loss_iou: 0.2159 d4.dn_loss_cls: 0.0965 d4.dn_loss_bbox: 0.1601 d4.dn_loss_iou: 0.2135 d1.loss_lmm_region: 0.1218 loss_lmm_image: 0.7565 2024/11/13 11:18:52 - mmengine - INFO - Iter(train) [123300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:54:42 time: 1.9614 data_time: 0.0186 memory: 33302 grad_norm: 31.6994 loss: 9.8412 loss_cls: 0.3008 loss_bbox: 0.1607 loss_iou: 0.2818 d0.loss_cls: 0.3374 d0.loss_bbox: 0.1757 d0.loss_iou: 0.2961 d1.loss_cls: 0.3190 d1.loss_bbox: 0.1638 d1.loss_iou: 0.2867 d2.loss_cls: 0.3136 d2.loss_bbox: 0.1600 d2.loss_iou: 0.2823 d3.loss_cls: 0.3060 d3.loss_bbox: 0.1599 d3.loss_iou: 0.2811 d4.loss_cls: 0.3044 d4.loss_bbox: 0.1598 d4.loss_iou: 0.2798 enc_loss_cls: 0.3457 enc_loss_bbox: 0.1870 enc_loss_iou: 0.3117 dn_loss_cls: 0.1142 dn_loss_bbox: 0.1772 dn_loss_iou: 0.2267 d0.dn_loss_cls: 0.1931 d0.dn_loss_bbox: 0.3221 d0.dn_loss_iou: 0.3626 d1.dn_loss_cls: 0.1436 d1.dn_loss_bbox: 0.2110 d1.dn_loss_iou: 0.2564 d2.dn_loss_cls: 0.1271 d2.dn_loss_bbox: 0.1861 d2.dn_loss_iou: 0.2356 d3.dn_loss_cls: 0.1216 d3.dn_loss_bbox: 0.1782 d3.dn_loss_iou: 0.2285 d4.dn_loss_cls: 0.1162 d4.dn_loss_bbox: 0.1773 d4.dn_loss_iou: 0.2268 d1.loss_lmm_region: 0.1015 loss_lmm_image: 0.7221 2024/11/13 11:22:11 - mmengine - INFO - Iter(train) [123400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:51:20 time: 2.0013 data_time: 0.0185 memory: 33988 grad_norm: 25.3123 loss: 8.0658 loss_cls: 0.2209 loss_bbox: 0.1331 loss_iou: 0.2257 d0.loss_cls: 0.2507 d0.loss_bbox: 0.1423 d0.loss_iou: 0.2406 d1.loss_cls: 0.2353 d1.loss_bbox: 0.1365 d1.loss_iou: 0.2340 d2.loss_cls: 0.2321 d2.loss_bbox: 0.1291 d2.loss_iou: 0.2267 d3.loss_cls: 0.2312 d3.loss_bbox: 0.1252 d3.loss_iou: 0.2228 d4.loss_cls: 0.2234 d4.loss_bbox: 0.1336 d4.loss_iou: 0.2260 enc_loss_cls: 0.2578 enc_loss_bbox: 0.1479 enc_loss_iou: 0.2488 dn_loss_cls: 0.0590 dn_loss_bbox: 0.1540 dn_loss_iou: 0.2072 d0.dn_loss_cls: 0.1444 d0.dn_loss_bbox: 0.2960 d0.dn_loss_iou: 0.3456 d1.dn_loss_cls: 0.0925 d1.dn_loss_bbox: 0.1860 d1.dn_loss_iou: 0.2360 d2.dn_loss_cls: 0.0718 d2.dn_loss_bbox: 0.1651 d2.dn_loss_iou: 0.2155 d3.dn_loss_cls: 0.0630 d3.dn_loss_bbox: 0.1568 d3.dn_loss_iou: 0.2098 d4.dn_loss_cls: 0.0602 d4.dn_loss_bbox: 0.1540 d4.dn_loss_iou: 0.2073 d1.loss_lmm_region: 0.1008 loss_lmm_image: 0.7170 2024/11/13 11:25:30 - mmengine - INFO - Iter(train) [123500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:47:59 time: 1.9826 
data_time: 0.0187 memory: 33278 grad_norm: 28.5878 loss: 8.1716 loss_cls: 0.2477 loss_bbox: 0.1190 loss_iou: 0.2006 d0.loss_cls: 0.2816 d0.loss_bbox: 0.1241 d0.loss_iou: 0.2082 d1.loss_cls: 0.2676 d1.loss_bbox: 0.1201 d1.loss_iou: 0.1998 d2.loss_cls: 0.2593 d2.loss_bbox: 0.1173 d2.loss_iou: 0.1981 d3.loss_cls: 0.2496 d3.loss_bbox: 0.1205 d3.loss_iou: 0.1989 d4.loss_cls: 0.2491 d4.loss_bbox: 0.1186 d4.loss_iou: 0.1987 enc_loss_cls: 0.2841 enc_loss_bbox: 0.1394 enc_loss_iou: 0.2263 dn_loss_cls: 0.1094 dn_loss_bbox: 0.1591 dn_loss_iou: 0.1917 d0.dn_loss_cls: 0.1817 d0.dn_loss_bbox: 0.2944 d0.dn_loss_iou: 0.3192 d1.dn_loss_cls: 0.1388 d1.dn_loss_bbox: 0.1908 d1.dn_loss_iou: 0.2176 d2.dn_loss_cls: 0.1164 d2.dn_loss_bbox: 0.1665 d2.dn_loss_iou: 0.1981 d3.dn_loss_cls: 0.1101 d3.dn_loss_bbox: 0.1608 d3.dn_loss_iou: 0.1931 d4.dn_loss_cls: 0.1091 d4.dn_loss_bbox: 0.1591 d4.dn_loss_iou: 0.1917 d1.loss_lmm_region: 0.1053 loss_lmm_image: 0.7299 2024/11/13 11:28:52 - mmengine - INFO - Iter(train) [123600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:44:38 time: 2.0219 data_time: 0.0187 memory: 33795 grad_norm: 23.6452 loss: 8.1332 loss_cls: 0.2383 loss_bbox: 0.1109 loss_iou: 0.2192 d0.loss_cls: 0.2770 d0.loss_bbox: 0.1211 d0.loss_iou: 0.2342 d1.loss_cls: 0.2535 d1.loss_bbox: 0.1150 d1.loss_iou: 0.2239 d2.loss_cls: 0.2454 d2.loss_bbox: 0.1118 d2.loss_iou: 0.2211 d3.loss_cls: 0.2425 d3.loss_bbox: 0.1093 d3.loss_iou: 0.2188 d4.loss_cls: 0.2355 d4.loss_bbox: 0.1129 d4.loss_iou: 0.2205 enc_loss_cls: 0.2781 enc_loss_bbox: 0.1277 enc_loss_iou: 0.2489 dn_loss_cls: 0.0760 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2139 d0.dn_loss_cls: 0.1546 d0.dn_loss_bbox: 0.2748 d0.dn_loss_iou: 0.3431 d1.dn_loss_cls: 0.1027 d1.dn_loss_bbox: 0.1826 d1.dn_loss_iou: 0.2429 d2.dn_loss_cls: 0.0856 d2.dn_loss_bbox: 0.1636 d2.dn_loss_iou: 0.2225 d3.dn_loss_cls: 0.0814 d3.dn_loss_bbox: 0.1574 d3.dn_loss_iou: 0.2157 d4.dn_loss_cls: 0.0774 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.2140 d1.loss_lmm_region: 0.1080 loss_lmm_image: 0.7395 2024/11/13 11:32:11 - mmengine - INFO - Iter(train) [123700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:41:16 time: 1.9988 data_time: 0.0186 memory: 33626 grad_norm: 27.4288 loss: 8.7511 loss_cls: 0.3004 loss_bbox: 0.1256 loss_iou: 0.2530 d0.loss_cls: 0.3448 d0.loss_bbox: 0.1346 d0.loss_iou: 0.2677 d1.loss_cls: 0.3211 d1.loss_bbox: 0.1257 d1.loss_iou: 0.2571 d2.loss_cls: 0.3059 d2.loss_bbox: 0.1286 d2.loss_iou: 0.2573 d3.loss_cls: 0.3014 d3.loss_bbox: 0.1283 d3.loss_iou: 0.2575 d4.loss_cls: 0.3004 d4.loss_bbox: 0.1270 d4.loss_iou: 0.2538 enc_loss_cls: 0.3450 enc_loss_bbox: 0.1469 enc_loss_iou: 0.2842 dn_loss_cls: 0.0798 dn_loss_bbox: 0.1366 dn_loss_iou: 0.1953 d0.dn_loss_cls: 0.1641 d0.dn_loss_bbox: 0.2623 d0.dn_loss_iou: 0.3190 d1.dn_loss_cls: 0.1125 d1.dn_loss_bbox: 0.1619 d1.dn_loss_iou: 0.2203 d2.dn_loss_cls: 0.0913 d2.dn_loss_bbox: 0.1450 d2.dn_loss_iou: 0.2024 d3.dn_loss_cls: 0.0837 d3.dn_loss_bbox: 0.1382 d3.dn_loss_iou: 0.1974 d4.dn_loss_cls: 0.0795 d4.dn_loss_bbox: 0.1366 d4.dn_loss_iou: 0.1954 d1.loss_lmm_region: 0.1106 loss_lmm_image: 0.7527 2024/11/13 11:35:33 - mmengine - INFO - Iter(train) [123800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:37:55 time: 2.0162 data_time: 0.0187 memory: 32066 grad_norm: 27.3525 loss: 8.1854 loss_cls: 0.2792 loss_bbox: 0.1090 loss_iou: 0.2350 d0.loss_cls: 0.3285 d0.loss_bbox: 0.1190 d0.loss_iou: 0.2502 d1.loss_cls: 0.3033 d1.loss_bbox: 0.1106 d1.loss_iou: 0.2384 d2.loss_cls: 0.2933 d2.loss_bbox: 0.1113 d2.loss_iou: 0.2364 d3.loss_cls: 0.2854 
d3.loss_bbox: 0.1092 d3.loss_iou: 0.2330 d4.loss_cls: 0.2815 d4.loss_bbox: 0.1091 d4.loss_iou: 0.2346 enc_loss_cls: 0.3302 enc_loss_bbox: 0.1273 enc_loss_iou: 0.2606 dn_loss_cls: 0.0802 dn_loss_bbox: 0.1242 dn_loss_iou: 0.1840 d0.dn_loss_cls: 0.1575 d0.dn_loss_bbox: 0.2383 d0.dn_loss_iou: 0.3064 d1.dn_loss_cls: 0.1044 d1.dn_loss_bbox: 0.1466 d1.dn_loss_iou: 0.2076 d2.dn_loss_cls: 0.0887 d2.dn_loss_bbox: 0.1306 d2.dn_loss_iou: 0.1911 d3.dn_loss_cls: 0.0840 d3.dn_loss_bbox: 0.1259 d3.dn_loss_iou: 0.1858 d4.dn_loss_cls: 0.0812 d4.dn_loss_bbox: 0.1242 d4.dn_loss_iou: 0.1841 d1.loss_lmm_region: 0.1012 loss_lmm_image: 0.7541 2024/11/13 11:38:53 - mmengine - INFO - Iter(train) [123900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:34:34 time: 1.9999 data_time: 0.0185 memory: 33370 grad_norm: 21.8190 loss: 8.2026 loss_cls: 0.2508 loss_bbox: 0.1282 loss_iou: 0.2228 d0.loss_cls: 0.2950 d0.loss_bbox: 0.1361 d0.loss_iou: 0.2362 d1.loss_cls: 0.2696 d1.loss_bbox: 0.1301 d1.loss_iou: 0.2304 d2.loss_cls: 0.2570 d2.loss_bbox: 0.1295 d2.loss_iou: 0.2284 d3.loss_cls: 0.2525 d3.loss_bbox: 0.1278 d3.loss_iou: 0.2233 d4.loss_cls: 0.2510 d4.loss_bbox: 0.1272 d4.loss_iou: 0.2217 enc_loss_cls: 0.2978 enc_loss_bbox: 0.1516 enc_loss_iou: 0.2572 dn_loss_cls: 0.0687 dn_loss_bbox: 0.1452 dn_loss_iou: 0.1934 d0.dn_loss_cls: 0.1522 d0.dn_loss_bbox: 0.2917 d0.dn_loss_iou: 0.3314 d1.dn_loss_cls: 0.0996 d1.dn_loss_bbox: 0.1791 d1.dn_loss_iou: 0.2242 d2.dn_loss_cls: 0.0832 d2.dn_loss_bbox: 0.1568 d2.dn_loss_iou: 0.2036 d3.dn_loss_cls: 0.0740 d3.dn_loss_bbox: 0.1473 d3.dn_loss_iou: 0.1956 d4.dn_loss_cls: 0.0692 d4.dn_loss_bbox: 0.1453 d4.dn_loss_iou: 0.1934 d1.loss_lmm_region: 0.0983 loss_lmm_image: 0.7265 2024/11/13 11:42:14 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 11:42:14 - mmengine - INFO - Iter(train) [124000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:31:13 time: 2.0145 data_time: 0.0186 memory: 34563 grad_norm: 25.3171 loss: 7.4591 loss_cls: 0.2353 loss_bbox: 0.1130 loss_iou: 0.1871 d0.loss_cls: 0.2728 d0.loss_bbox: 0.1258 d0.loss_iou: 0.2002 d1.loss_cls: 0.2575 d1.loss_bbox: 0.1144 d1.loss_iou: 0.1873 d2.loss_cls: 0.2454 d2.loss_bbox: 0.1137 d2.loss_iou: 0.1880 d3.loss_cls: 0.2386 d3.loss_bbox: 0.1127 d3.loss_iou: 0.1851 d4.loss_cls: 0.2358 d4.loss_bbox: 0.1130 d4.loss_iou: 0.1868 enc_loss_cls: 0.2775 enc_loss_bbox: 0.1346 enc_loss_iou: 0.2152 dn_loss_cls: 0.0822 dn_loss_bbox: 0.1326 dn_loss_iou: 0.1598 d0.dn_loss_cls: 0.1500 d0.dn_loss_bbox: 0.2595 d0.dn_loss_iou: 0.2789 d1.dn_loss_cls: 0.1056 d1.dn_loss_bbox: 0.1586 d1.dn_loss_iou: 0.1829 d2.dn_loss_cls: 0.0909 d2.dn_loss_bbox: 0.1414 d2.dn_loss_iou: 0.1673 d3.dn_loss_cls: 0.0871 d3.dn_loss_bbox: 0.1348 d3.dn_loss_iou: 0.1612 d4.dn_loss_cls: 0.0837 d4.dn_loss_bbox: 0.1326 d4.dn_loss_iou: 0.1598 d1.loss_lmm_region: 0.1004 loss_lmm_image: 0.7498 2024/11/13 11:45:32 - mmengine - INFO - Iter(train) [124100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:27:51 time: 1.9538 data_time: 0.0185 memory: 33421 grad_norm: 22.8213 loss: 8.3990 loss_cls: 0.2758 loss_bbox: 0.1215 loss_iou: 0.2173 d0.loss_cls: 0.2993 d0.loss_bbox: 0.1389 d0.loss_iou: 0.2356 d1.loss_cls: 0.2867 d1.loss_bbox: 0.1296 d1.loss_iou: 0.2266 d2.loss_cls: 0.2856 d2.loss_bbox: 0.1245 d2.loss_iou: 0.2245 d3.loss_cls: 0.2829 d3.loss_bbox: 0.1217 d3.loss_iou: 0.2187 d4.loss_cls: 0.2784 d4.loss_bbox: 0.1221 d4.loss_iou: 0.2174 enc_loss_cls: 0.3058 enc_loss_bbox: 0.1509 enc_loss_iou: 0.2513 dn_loss_cls: 0.0707 dn_loss_bbox: 0.1562 dn_loss_iou: 0.1993 
d0.dn_loss_cls: 0.1524 d0.dn_loss_bbox: 0.2816 d0.dn_loss_iou: 0.3296 d1.dn_loss_cls: 0.1065 d1.dn_loss_bbox: 0.1819 d1.dn_loss_iou: 0.2261 d2.dn_loss_cls: 0.0831 d2.dn_loss_bbox: 0.1629 d2.dn_loss_iou: 0.2061 d3.dn_loss_cls: 0.0764 d3.dn_loss_bbox: 0.1582 d3.dn_loss_iou: 0.2016 d4.dn_loss_cls: 0.0714 d4.dn_loss_bbox: 0.1562 d4.dn_loss_iou: 0.1992 d1.loss_lmm_region: 0.1034 loss_lmm_image: 0.7613 2024/11/13 11:48:51 - mmengine - INFO - Iter(train) [124200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:24:30 time: 1.9847 data_time: 0.0186 memory: 33955 grad_norm: 25.6679 loss: 8.2794 loss_cls: 0.2694 loss_bbox: 0.1150 loss_iou: 0.2384 d0.loss_cls: 0.3079 d0.loss_bbox: 0.1318 d0.loss_iou: 0.2538 d1.loss_cls: 0.2876 d1.loss_bbox: 0.1234 d1.loss_iou: 0.2473 d2.loss_cls: 0.2762 d2.loss_bbox: 0.1216 d2.loss_iou: 0.2450 d3.loss_cls: 0.2682 d3.loss_bbox: 0.1179 d3.loss_iou: 0.2417 d4.loss_cls: 0.2688 d4.loss_bbox: 0.1164 d4.loss_iou: 0.2392 enc_loss_cls: 0.3221 enc_loss_bbox: 0.1387 enc_loss_iou: 0.2666 dn_loss_cls: 0.0865 dn_loss_bbox: 0.1243 dn_loss_iou: 0.1906 d0.dn_loss_cls: 0.1563 d0.dn_loss_bbox: 0.2541 d0.dn_loss_iou: 0.3207 d1.dn_loss_cls: 0.1065 d1.dn_loss_bbox: 0.1512 d1.dn_loss_iou: 0.2168 d2.dn_loss_cls: 0.0929 d2.dn_loss_bbox: 0.1323 d2.dn_loss_iou: 0.1981 d3.dn_loss_cls: 0.0876 d3.dn_loss_bbox: 0.1267 d3.dn_loss_iou: 0.1930 d4.dn_loss_cls: 0.0861 d4.dn_loss_bbox: 0.1244 d4.dn_loss_iou: 0.1906 d1.loss_lmm_region: 0.0931 loss_lmm_image: 0.7505 2024/11/13 11:52:10 - mmengine - INFO - Iter(train) [124300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:21:09 time: 1.9958 data_time: 0.0187 memory: 34622 grad_norm: 23.2938 loss: 7.4124 loss_cls: 0.2418 loss_bbox: 0.1083 loss_iou: 0.1964 d0.loss_cls: 0.2746 d0.loss_bbox: 0.1148 d0.loss_iou: 0.2086 d1.loss_cls: 0.2513 d1.loss_bbox: 0.1117 d1.loss_iou: 0.1989 d2.loss_cls: 0.2453 d2.loss_bbox: 0.1074 d2.loss_iou: 0.1959 d3.loss_cls: 0.2449 d3.loss_bbox: 0.1080 d3.loss_iou: 0.1943 d4.loss_cls: 0.2409 d4.loss_bbox: 0.1074 d4.loss_iou: 0.1959 enc_loss_cls: 0.2782 enc_loss_bbox: 0.1228 enc_loss_iou: 0.2247 dn_loss_cls: 0.0598 dn_loss_bbox: 0.1324 dn_loss_iou: 0.1742 d0.dn_loss_cls: 0.1265 d0.dn_loss_bbox: 0.2512 d0.dn_loss_iou: 0.2969 d1.dn_loss_cls: 0.0837 d1.dn_loss_bbox: 0.1536 d1.dn_loss_iou: 0.1977 d2.dn_loss_cls: 0.0672 d2.dn_loss_bbox: 0.1372 d2.dn_loss_iou: 0.1808 d3.dn_loss_cls: 0.0615 d3.dn_loss_bbox: 0.1338 d3.dn_loss_iou: 0.1766 d4.dn_loss_cls: 0.0592 d4.dn_loss_bbox: 0.1325 d4.dn_loss_iou: 0.1744 d1.loss_lmm_region: 0.0827 loss_lmm_image: 0.7585 2024/11/13 11:55:32 - mmengine - INFO - Iter(train) [124400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:17:48 time: 2.0232 data_time: 0.0188 memory: 35626 grad_norm: 27.9293 loss: 8.9613 loss_cls: 0.2696 loss_bbox: 0.1501 loss_iou: 0.2971 d0.loss_cls: 0.3147 d0.loss_bbox: 0.1526 d0.loss_iou: 0.3063 d1.loss_cls: 0.2862 d1.loss_bbox: 0.1504 d1.loss_iou: 0.2984 d2.loss_cls: 0.2810 d2.loss_bbox: 0.1463 d2.loss_iou: 0.2923 d3.loss_cls: 0.2761 d3.loss_bbox: 0.1457 d3.loss_iou: 0.2930 d4.loss_cls: 0.2720 d4.loss_bbox: 0.1490 d4.loss_iou: 0.2959 enc_loss_cls: 0.3196 enc_loss_bbox: 0.1653 enc_loss_iou: 0.3199 dn_loss_cls: 0.0656 dn_loss_bbox: 0.1456 dn_loss_iou: 0.2173 d0.dn_loss_cls: 0.1473 d0.dn_loss_bbox: 0.2499 d0.dn_loss_iou: 0.3371 d1.dn_loss_cls: 0.0946 d1.dn_loss_bbox: 0.1690 d1.dn_loss_iou: 0.2432 d2.dn_loss_cls: 0.0749 d2.dn_loss_bbox: 0.1538 d2.dn_loss_iou: 0.2266 d3.dn_loss_cls: 0.0691 d3.dn_loss_bbox: 0.1473 d3.dn_loss_iou: 0.2194 d4.dn_loss_cls: 0.0672 d4.dn_loss_bbox: 0.1457 
d4.dn_loss_iou: 0.2173 d1.loss_lmm_region: 0.0941 loss_lmm_image: 0.6948 2024/11/13 11:58:50 - mmengine - INFO - Iter(train) [124500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:14:26 time: 1.9834 data_time: 0.0187 memory: 35147 grad_norm: 33.0099 loss: 8.5860 loss_cls: 0.2882 loss_bbox: 0.1250 loss_iou: 0.2503 d0.loss_cls: 0.3329 d0.loss_bbox: 0.1329 d0.loss_iou: 0.2645 d1.loss_cls: 0.3081 d1.loss_bbox: 0.1255 d1.loss_iou: 0.2540 d2.loss_cls: 0.2991 d2.loss_bbox: 0.1262 d2.loss_iou: 0.2518 d3.loss_cls: 0.2943 d3.loss_bbox: 0.1239 d3.loss_iou: 0.2495 d4.loss_cls: 0.2897 d4.loss_bbox: 0.1249 d4.loss_iou: 0.2503 enc_loss_cls: 0.3409 enc_loss_bbox: 0.1431 enc_loss_iou: 0.2796 dn_loss_cls: 0.1054 dn_loss_bbox: 0.1231 dn_loss_iou: 0.1915 d0.dn_loss_cls: 0.1786 d0.dn_loss_bbox: 0.2389 d0.dn_loss_iou: 0.3137 d1.dn_loss_cls: 0.1315 d1.dn_loss_bbox: 0.1467 d1.dn_loss_iou: 0.2156 d2.dn_loss_cls: 0.1146 d2.dn_loss_bbox: 0.1274 d2.dn_loss_iou: 0.1983 d3.dn_loss_cls: 0.1069 d3.dn_loss_bbox: 0.1238 d3.dn_loss_iou: 0.1934 d4.dn_loss_cls: 0.1041 d4.dn_loss_bbox: 0.1231 d4.dn_loss_iou: 0.1916 d1.loss_lmm_region: 0.0946 loss_lmm_image: 0.7082 2024/11/13 12:02:10 - mmengine - INFO - Iter(train) [124600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:11:05 time: 2.0271 data_time: 0.0186 memory: 33835 grad_norm: 22.4638 loss: 7.5491 loss_cls: 0.2296 loss_bbox: 0.1192 loss_iou: 0.2309 d0.loss_cls: 0.2632 d0.loss_bbox: 0.1371 d0.loss_iou: 0.2524 d1.loss_cls: 0.2505 d1.loss_bbox: 0.1254 d1.loss_iou: 0.2344 d2.loss_cls: 0.2375 d2.loss_bbox: 0.1218 d2.loss_iou: 0.2360 d3.loss_cls: 0.2349 d3.loss_bbox: 0.1201 d3.loss_iou: 0.2326 d4.loss_cls: 0.2306 d4.loss_bbox: 0.1190 d4.loss_iou: 0.2303 enc_loss_cls: 0.2783 enc_loss_bbox: 0.1384 enc_loss_iou: 0.2628 dn_loss_cls: 0.0573 dn_loss_bbox: 0.1084 dn_loss_iou: 0.1772 d0.dn_loss_cls: 0.1296 d0.dn_loss_bbox: 0.2135 d0.dn_loss_iou: 0.2930 d1.dn_loss_cls: 0.0838 d1.dn_loss_bbox: 0.1358 d1.dn_loss_iou: 0.2040 d2.dn_loss_cls: 0.0674 d2.dn_loss_bbox: 0.1175 d2.dn_loss_iou: 0.1862 d3.dn_loss_cls: 0.0615 d3.dn_loss_bbox: 0.1100 d3.dn_loss_iou: 0.1791 d4.dn_loss_cls: 0.0574 d4.dn_loss_bbox: 0.1085 d4.dn_loss_iou: 0.1772 d1.loss_lmm_region: 0.0792 loss_lmm_image: 0.7175 2024/11/13 12:05:30 - mmengine - INFO - Iter(train) [124700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:07:43 time: 2.0053 data_time: 0.0187 memory: 34364 grad_norm: 28.6353 loss: 10.1416 loss_cls: 0.3005 loss_bbox: 0.1628 loss_iou: 0.2736 d0.loss_cls: 0.3245 d0.loss_bbox: 0.1808 d0.loss_iou: 0.2886 d1.loss_cls: 0.3114 d1.loss_bbox: 0.1696 d1.loss_iou: 0.2789 d2.loss_cls: 0.3074 d2.loss_bbox: 0.1646 d2.loss_iou: 0.2748 d3.loss_cls: 0.3059 d3.loss_bbox: 0.1628 d3.loss_iou: 0.2719 d4.loss_cls: 0.3035 d4.loss_bbox: 0.1634 d4.loss_iou: 0.2732 enc_loss_cls: 0.3364 enc_loss_bbox: 0.1873 enc_loss_iou: 0.2980 dn_loss_cls: 0.1506 dn_loss_bbox: 0.1909 dn_loss_iou: 0.2375 d0.dn_loss_cls: 0.2219 d0.dn_loss_bbox: 0.3371 d0.dn_loss_iou: 0.3755 d1.dn_loss_cls: 0.1718 d1.dn_loss_bbox: 0.2232 d1.dn_loss_iou: 0.2647 d2.dn_loss_cls: 0.1545 d2.dn_loss_bbox: 0.2026 d2.dn_loss_iou: 0.2469 d3.dn_loss_cls: 0.1532 d3.dn_loss_bbox: 0.1936 d3.dn_loss_iou: 0.2395 d4.dn_loss_cls: 0.1505 d4.dn_loss_bbox: 0.1907 d4.dn_loss_iou: 0.2374 d1.loss_lmm_region: 0.1332 loss_lmm_image: 0.7267 2024/11/13 12:08:49 - mmengine - INFO - Iter(train) [124800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:04:22 time: 1.9984 data_time: 0.0186 memory: 34417 grad_norm: 28.2339 loss: 8.0538 loss_cls: 0.2515 loss_bbox: 0.1274 loss_iou: 0.2461 
d0.loss_cls: 0.2964 d0.loss_bbox: 0.1325 d0.loss_iou: 0.2574 d1.loss_cls: 0.2692 d1.loss_bbox: 0.1263 d1.loss_iou: 0.2527 d2.loss_cls: 0.2650 d2.loss_bbox: 0.1232 d2.loss_iou: 0.2490 d3.loss_cls: 0.2550 d3.loss_bbox: 0.1278 d3.loss_iou: 0.2469 d4.loss_cls: 0.2528 d4.loss_bbox: 0.1276 d4.loss_iou: 0.2463 enc_loss_cls: 0.2980 enc_loss_bbox: 0.1466 enc_loss_iou: 0.2804 dn_loss_cls: 0.0640 dn_loss_bbox: 0.1248 dn_loss_iou: 0.1776 d0.dn_loss_cls: 0.1398 d0.dn_loss_bbox: 0.2683 d0.dn_loss_iou: 0.3081 d1.dn_loss_cls: 0.0914 d1.dn_loss_bbox: 0.1541 d1.dn_loss_iou: 0.2045 d2.dn_loss_cls: 0.0759 d2.dn_loss_bbox: 0.1331 d2.dn_loss_iou: 0.1864 d3.dn_loss_cls: 0.0690 d3.dn_loss_bbox: 0.1268 d3.dn_loss_iou: 0.1800 d4.dn_loss_cls: 0.0653 d4.dn_loss_bbox: 0.1249 d4.dn_loss_iou: 0.1777 d1.loss_lmm_region: 0.0841 loss_lmm_image: 0.7198 2024/11/13 12:12:09 - mmengine - INFO - Iter(train) [124900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 14:01:01 time: 1.9877 data_time: 0.0185 memory: 34723 grad_norm: 26.9573 loss: 6.8566 loss_cls: 0.1993 loss_bbox: 0.0866 loss_iou: 0.1724 d0.loss_cls: 0.2132 d0.loss_bbox: 0.0977 d0.loss_iou: 0.1859 d1.loss_cls: 0.2055 d1.loss_bbox: 0.0918 d1.loss_iou: 0.1765 d2.loss_cls: 0.2062 d2.loss_bbox: 0.0869 d2.loss_iou: 0.1726 d3.loss_cls: 0.2009 d3.loss_bbox: 0.0854 d3.loss_iou: 0.1720 d4.loss_cls: 0.1990 d4.loss_bbox: 0.0871 d4.loss_iou: 0.1732 enc_loss_cls: 0.2225 enc_loss_bbox: 0.1073 enc_loss_iou: 0.1962 dn_loss_cls: 0.0774 dn_loss_bbox: 0.1267 dn_loss_iou: 0.1766 d0.dn_loss_cls: 0.1366 d0.dn_loss_bbox: 0.2585 d0.dn_loss_iou: 0.2980 d1.dn_loss_cls: 0.0979 d1.dn_loss_bbox: 0.1535 d1.dn_loss_iou: 0.2001 d2.dn_loss_cls: 0.0859 d2.dn_loss_bbox: 0.1347 d2.dn_loss_iou: 0.1839 d3.dn_loss_cls: 0.0795 d3.dn_loss_bbox: 0.1295 d3.dn_loss_iou: 0.1787 d4.dn_loss_cls: 0.0776 d4.dn_loss_bbox: 0.1267 d4.dn_loss_iou: 0.1766 d1.loss_lmm_region: 0.1012 loss_lmm_image: 0.7187 2024/11/13 12:15:31 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 12:15:31 - mmengine - INFO - Iter(train) [125000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:57:40 time: 2.0163 data_time: 0.0185 memory: 33968 grad_norm: 28.4395 loss: 8.4870 loss_cls: 0.2493 loss_bbox: 0.1113 loss_iou: 0.1987 d0.loss_cls: 0.2776 d0.loss_bbox: 0.1208 d0.loss_iou: 0.2085 d1.loss_cls: 0.2622 d1.loss_bbox: 0.1155 d1.loss_iou: 0.2017 d2.loss_cls: 0.2586 d2.loss_bbox: 0.1114 d2.loss_iou: 0.1981 d3.loss_cls: 0.2541 d3.loss_bbox: 0.1118 d3.loss_iou: 0.1971 d4.loss_cls: 0.2514 d4.loss_bbox: 0.1117 d4.loss_iou: 0.1997 enc_loss_cls: 0.2858 enc_loss_bbox: 0.1280 enc_loss_iou: 0.2223 dn_loss_cls: 0.1587 dn_loss_bbox: 0.1622 dn_loss_iou: 0.1976 d0.dn_loss_cls: 0.2057 d0.dn_loss_bbox: 0.3087 d0.dn_loss_iou: 0.3243 d1.dn_loss_cls: 0.1917 d1.dn_loss_bbox: 0.1915 d1.dn_loss_iou: 0.2220 d2.dn_loss_cls: 0.1785 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2051 d3.dn_loss_cls: 0.1676 d3.dn_loss_bbox: 0.1652 d3.dn_loss_iou: 0.1995 d4.dn_loss_cls: 0.1623 d4.dn_loss_bbox: 0.1623 d4.dn_loss_iou: 0.1977 d1.loss_lmm_region: 0.0940 loss_lmm_image: 0.7445 2024/11/13 12:18:49 - mmengine - INFO - Iter(train) [125100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:54:18 time: 1.9832 data_time: 0.0189 memory: 33431 grad_norm: 26.7836 loss: 8.7212 loss_cls: 0.3239 loss_bbox: 0.1327 loss_iou: 0.2660 d0.loss_cls: 0.3675 d0.loss_bbox: 0.1433 d0.loss_iou: 0.2737 d1.loss_cls: 0.3438 d1.loss_bbox: 0.1386 d1.loss_iou: 0.2704 d2.loss_cls: 0.3367 d2.loss_bbox: 0.1344 d2.loss_iou: 0.2677 d3.loss_cls: 0.3289 d3.loss_bbox: 0.1325 d3.loss_iou: 
0.2657 d4.loss_cls: 0.3291 d4.loss_bbox: 0.1312 d4.loss_iou: 0.2623 enc_loss_cls: 0.3709 enc_loss_bbox: 0.1510 enc_loss_iou: 0.2948 dn_loss_cls: 0.0750 dn_loss_bbox: 0.1121 dn_loss_iou: 0.1710 d0.dn_loss_cls: 0.1506 d0.dn_loss_bbox: 0.2215 d0.dn_loss_iou: 0.2906 d1.dn_loss_cls: 0.1050 d1.dn_loss_bbox: 0.1393 d1.dn_loss_iou: 0.1971 d2.dn_loss_cls: 0.0882 d2.dn_loss_bbox: 0.1204 d2.dn_loss_iou: 0.1788 d3.dn_loss_cls: 0.0810 d3.dn_loss_bbox: 0.1143 d3.dn_loss_iou: 0.1733 d4.dn_loss_cls: 0.0767 d4.dn_loss_bbox: 0.1123 d4.dn_loss_iou: 0.1711 d1.loss_lmm_region: 0.1011 loss_lmm_image: 0.7767 2024/11/13 12:22:07 - mmengine - INFO - Iter(train) [125200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:50:56 time: 1.9785 data_time: 0.0186 memory: 33054 grad_norm: 29.7426 loss: 7.2061 loss_cls: 0.2118 loss_bbox: 0.1049 loss_iou: 0.1959 d0.loss_cls: 0.2441 d0.loss_bbox: 0.1125 d0.loss_iou: 0.2075 d1.loss_cls: 0.2319 d1.loss_bbox: 0.1039 d1.loss_iou: 0.1998 d2.loss_cls: 0.2244 d2.loss_bbox: 0.1010 d2.loss_iou: 0.1949 d3.loss_cls: 0.2194 d3.loss_bbox: 0.1020 d3.loss_iou: 0.1943 d4.loss_cls: 0.2145 d4.loss_bbox: 0.1036 d4.loss_iou: 0.1952 enc_loss_cls: 0.2548 enc_loss_bbox: 0.1193 enc_loss_iou: 0.2214 dn_loss_cls: 0.0753 dn_loss_bbox: 0.1195 dn_loss_iou: 0.1732 d0.dn_loss_cls: 0.1421 d0.dn_loss_bbox: 0.2564 d0.dn_loss_iou: 0.3032 d1.dn_loss_cls: 0.1017 d1.dn_loss_bbox: 0.1454 d1.dn_loss_iou: 0.1996 d2.dn_loss_cls: 0.0844 d2.dn_loss_bbox: 0.1282 d2.dn_loss_iou: 0.1803 d3.dn_loss_cls: 0.0789 d3.dn_loss_bbox: 0.1210 d3.dn_loss_iou: 0.1742 d4.dn_loss_cls: 0.0764 d4.dn_loss_bbox: 0.1195 d4.dn_loss_iou: 0.1732 d1.loss_lmm_region: 0.0767 loss_lmm_image: 0.7195 2024/11/13 12:25:25 - mmengine - INFO - Iter(train) [125300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:47:35 time: 1.9596 data_time: 0.0186 memory: 33853 grad_norm: 25.8403 loss: 8.8102 loss_cls: 0.2758 loss_bbox: 0.1401 loss_iou: 0.2278 d0.loss_cls: 0.3229 d0.loss_bbox: 0.1524 d0.loss_iou: 0.2475 d1.loss_cls: 0.2993 d1.loss_bbox: 0.1472 d1.loss_iou: 0.2387 d2.loss_cls: 0.2867 d2.loss_bbox: 0.1443 d2.loss_iou: 0.2351 d3.loss_cls: 0.2793 d3.loss_bbox: 0.1423 d3.loss_iou: 0.2303 d4.loss_cls: 0.2752 d4.loss_bbox: 0.1440 d4.loss_iou: 0.2300 enc_loss_cls: 0.3190 enc_loss_bbox: 0.1690 enc_loss_iou: 0.2680 dn_loss_cls: 0.0925 dn_loss_bbox: 0.1594 dn_loss_iou: 0.1964 d0.dn_loss_cls: 0.1824 d0.dn_loss_bbox: 0.3143 d0.dn_loss_iou: 0.3307 d1.dn_loss_cls: 0.1254 d1.dn_loss_bbox: 0.1901 d1.dn_loss_iou: 0.2223 d2.dn_loss_cls: 0.1064 d2.dn_loss_bbox: 0.1684 d2.dn_loss_iou: 0.2039 d3.dn_loss_cls: 0.0986 d3.dn_loss_bbox: 0.1608 d3.dn_loss_iou: 0.1983 d4.dn_loss_cls: 0.0940 d4.dn_loss_bbox: 0.1594 d4.dn_loss_iou: 0.1963 d1.loss_lmm_region: 0.1290 loss_lmm_image: 0.7065 2024/11/13 12:28:42 - mmengine - INFO - Iter(train) [125400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:44:13 time: 1.9836 data_time: 0.0187 memory: 32024 grad_norm: 28.0725 loss: 7.8502 loss_cls: 0.2238 loss_bbox: 0.1062 loss_iou: 0.2042 d0.loss_cls: 0.2529 d0.loss_bbox: 0.1194 d0.loss_iou: 0.2209 d1.loss_cls: 0.2399 d1.loss_bbox: 0.1111 d1.loss_iou: 0.2080 d2.loss_cls: 0.2316 d2.loss_bbox: 0.1082 d2.loss_iou: 0.2055 d3.loss_cls: 0.2265 d3.loss_bbox: 0.1074 d3.loss_iou: 0.2046 d4.loss_cls: 0.2238 d4.loss_bbox: 0.1066 d4.loss_iou: 0.2050 enc_loss_cls: 0.2575 enc_loss_bbox: 0.1289 enc_loss_iou: 0.2369 dn_loss_cls: 0.0815 dn_loss_bbox: 0.1461 dn_loss_iou: 0.2030 d0.dn_loss_cls: 0.1517 d0.dn_loss_bbox: 0.2911 d0.dn_loss_iou: 0.3266 d1.dn_loss_cls: 0.1047 d1.dn_loss_bbox: 0.1762 d1.dn_loss_iou: 
0.2293 d2.dn_loss_cls: 0.0898 d2.dn_loss_bbox: 0.1547 d2.dn_loss_iou: 0.2102 d3.dn_loss_cls: 0.0845 d3.dn_loss_bbox: 0.1485 d3.dn_loss_iou: 0.2045 d4.dn_loss_cls: 0.0821 d4.dn_loss_bbox: 0.1462 d4.dn_loss_iou: 0.2030 d1.loss_lmm_region: 0.0882 loss_lmm_image: 0.7996 2024/11/13 12:32:02 - mmengine - INFO - Iter(train) [125500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:40:52 time: 1.9971 data_time: 0.0187 memory: 34708 grad_norm: 31.3991 loss: 9.1561 loss_cls: 0.2648 loss_bbox: 0.1562 loss_iou: 0.2922 d0.loss_cls: 0.3130 d0.loss_bbox: 0.1657 d0.loss_iou: 0.3010 d1.loss_cls: 0.2884 d1.loss_bbox: 0.1575 d1.loss_iou: 0.2930 d2.loss_cls: 0.2771 d2.loss_bbox: 0.1554 d2.loss_iou: 0.2906 d3.loss_cls: 0.2715 d3.loss_bbox: 0.1557 d3.loss_iou: 0.2909 d4.loss_cls: 0.2643 d4.loss_bbox: 0.1579 d4.loss_iou: 0.2936 enc_loss_cls: 0.3176 enc_loss_bbox: 0.1815 enc_loss_iou: 0.3185 dn_loss_cls: 0.0771 dn_loss_bbox: 0.1627 dn_loss_iou: 0.2063 d0.dn_loss_cls: 0.1511 d0.dn_loss_bbox: 0.2813 d0.dn_loss_iou: 0.3283 d1.dn_loss_cls: 0.1034 d1.dn_loss_bbox: 0.1910 d1.dn_loss_iou: 0.2327 d2.dn_loss_cls: 0.0861 d2.dn_loss_bbox: 0.1693 d2.dn_loss_iou: 0.2135 d3.dn_loss_cls: 0.0810 d3.dn_loss_bbox: 0.1649 d3.dn_loss_iou: 0.2085 d4.dn_loss_cls: 0.0782 d4.dn_loss_bbox: 0.1628 d4.dn_loss_iou: 0.2063 d1.loss_lmm_region: 0.1000 loss_lmm_image: 0.7455 2024/11/13 12:35:21 - mmengine - INFO - Iter(train) [125600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:37:30 time: 1.9892 data_time: 0.0187 memory: 35365 grad_norm: nan loss: 7.7234 loss_cls: 0.2598 loss_bbox: 0.1049 loss_iou: 0.2106 d0.loss_cls: 0.2940 d0.loss_bbox: 0.1093 d0.loss_iou: 0.2214 d1.loss_cls: 0.2802 d1.loss_bbox: 0.1032 d1.loss_iou: 0.2110 d2.loss_cls: 0.2715 d2.loss_bbox: 0.1021 d2.loss_iou: 0.2077 d3.loss_cls: 0.2619 d3.loss_bbox: 0.1056 d3.loss_iou: 0.2126 d4.loss_cls: 0.2597 d4.loss_bbox: 0.1041 d4.loss_iou: 0.2107 enc_loss_cls: 0.3051 enc_loss_bbox: 0.1224 enc_loss_iou: 0.2402 dn_loss_cls: 0.0715 dn_loss_bbox: 0.1248 dn_loss_iou: 0.1731 d0.dn_loss_cls: 0.1498 d0.dn_loss_bbox: 0.2507 d0.dn_loss_iou: 0.2937 d1.dn_loss_cls: 0.1056 d1.dn_loss_bbox: 0.1489 d1.dn_loss_iou: 0.1966 d2.dn_loss_cls: 0.0844 d2.dn_loss_bbox: 0.1310 d2.dn_loss_iou: 0.1796 d3.dn_loss_cls: 0.0758 d3.dn_loss_bbox: 0.1269 d3.dn_loss_iou: 0.1754 d4.dn_loss_cls: 0.0726 d4.dn_loss_bbox: 0.1248 d4.dn_loss_iou: 0.1731 d1.loss_lmm_region: 0.1034 loss_lmm_image: 0.7636 2024/11/13 12:38:41 - mmengine - INFO - Iter(train) [125700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:34:09 time: 2.0004 data_time: 0.0186 memory: 35438 grad_norm: 29.3437 loss: 8.9552 loss_cls: 0.2797 loss_bbox: 0.1343 loss_iou: 0.2754 d0.loss_cls: 0.3267 d0.loss_bbox: 0.1447 d0.loss_iou: 0.2861 d1.loss_cls: 0.2999 d1.loss_bbox: 0.1398 d1.loss_iou: 0.2798 d2.loss_cls: 0.2908 d2.loss_bbox: 0.1395 d2.loss_iou: 0.2759 d3.loss_cls: 0.2839 d3.loss_bbox: 0.1376 d3.loss_iou: 0.2782 d4.loss_cls: 0.2821 d4.loss_bbox: 0.1353 d4.loss_iou: 0.2756 enc_loss_cls: 0.3324 enc_loss_bbox: 0.1571 enc_loss_iou: 0.3035 dn_loss_cls: 0.0761 dn_loss_bbox: 0.1379 dn_loss_iou: 0.2163 d0.dn_loss_cls: 0.1484 d0.dn_loss_bbox: 0.2542 d0.dn_loss_iou: 0.3418 d1.dn_loss_cls: 0.1049 d1.dn_loss_bbox: 0.1624 d1.dn_loss_iou: 0.2402 d2.dn_loss_cls: 0.0897 d2.dn_loss_bbox: 0.1463 d2.dn_loss_iou: 0.2237 d3.dn_loss_cls: 0.0868 d3.dn_loss_bbox: 0.1398 d3.dn_loss_iou: 0.2183 d4.dn_loss_cls: 0.0805 d4.dn_loss_bbox: 0.1380 d4.dn_loss_iou: 0.2162 d1.loss_lmm_region: 0.1032 loss_lmm_image: 0.7724 2024/11/13 12:42:01 - mmengine - INFO - Iter(train) 
[125800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:30:48 time: 1.9983 data_time: 0.0186 memory: 33991 grad_norm: 36.7224 loss: 8.3886 loss_cls: 0.2293 loss_bbox: 0.1457 loss_iou: 0.2431 d0.loss_cls: 0.2817 d0.loss_bbox: 0.1516 d0.loss_iou: 0.2505 d1.loss_cls: 0.2603 d1.loss_bbox: 0.1442 d1.loss_iou: 0.2410 d2.loss_cls: 0.2477 d2.loss_bbox: 0.1408 d2.loss_iou: 0.2405 d3.loss_cls: 0.2362 d3.loss_bbox: 0.1442 d3.loss_iou: 0.2408 d4.loss_cls: 0.2332 d4.loss_bbox: 0.1455 d4.loss_iou: 0.2419 enc_loss_cls: 0.2860 enc_loss_bbox: 0.1638 enc_loss_iou: 0.2667 dn_loss_cls: 0.0817 dn_loss_bbox: 0.1503 dn_loss_iou: 0.1954 d0.dn_loss_cls: 0.1599 d0.dn_loss_bbox: 0.2931 d0.dn_loss_iou: 0.3309 d1.dn_loss_cls: 0.1096 d1.dn_loss_bbox: 0.1801 d1.dn_loss_iou: 0.2234 d2.dn_loss_cls: 0.0946 d2.dn_loss_bbox: 0.1598 d2.dn_loss_iou: 0.2041 d3.dn_loss_cls: 0.0874 d3.dn_loss_bbox: 0.1524 d3.dn_loss_iou: 0.1973 d4.dn_loss_cls: 0.0839 d4.dn_loss_bbox: 0.1503 d4.dn_loss_iou: 0.1954 d1.loss_lmm_region: 0.1004 loss_lmm_image: 0.7039 2024/11/13 12:45:20 - mmengine - INFO - Iter(train) [125900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:27:27 time: 1.9769 data_time: 0.0187 memory: 34776 grad_norm: 24.8476 loss: 8.5762 loss_cls: 0.2665 loss_bbox: 0.1232 loss_iou: 0.2485 d0.loss_cls: 0.2988 d0.loss_bbox: 0.1381 d0.loss_iou: 0.2658 d1.loss_cls: 0.2794 d1.loss_bbox: 0.1307 d1.loss_iou: 0.2568 d2.loss_cls: 0.2738 d2.loss_bbox: 0.1255 d2.loss_iou: 0.2517 d3.loss_cls: 0.2667 d3.loss_bbox: 0.1266 d3.loss_iou: 0.2514 d4.loss_cls: 0.2635 d4.loss_bbox: 0.1250 d4.loss_iou: 0.2488 enc_loss_cls: 0.3125 enc_loss_bbox: 0.1492 enc_loss_iou: 0.2804 dn_loss_cls: 0.1048 dn_loss_bbox: 0.1393 dn_loss_iou: 0.1906 d0.dn_loss_cls: 0.1772 d0.dn_loss_bbox: 0.2726 d0.dn_loss_iou: 0.3130 d1.dn_loss_cls: 0.1336 d1.dn_loss_bbox: 0.1675 d1.dn_loss_iou: 0.2169 d2.dn_loss_cls: 0.1171 d2.dn_loss_bbox: 0.1488 d2.dn_loss_iou: 0.1992 d3.dn_loss_cls: 0.1102 d3.dn_loss_bbox: 0.1423 d3.dn_loss_iou: 0.1934 d4.dn_loss_cls: 0.1051 d4.dn_loss_bbox: 0.1393 d4.dn_loss_iou: 0.1906 d1.loss_lmm_region: 0.0987 loss_lmm_image: 0.7333 2024/11/13 12:48:38 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 12:48:38 - mmengine - INFO - Iter(train) [126000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:24:05 time: 1.9914 data_time: 0.0186 memory: 33871 grad_norm: 23.9229 loss: 8.5944 loss_cls: 0.2469 loss_bbox: 0.1390 loss_iou: 0.2420 d0.loss_cls: 0.2840 d0.loss_bbox: 0.1491 d0.loss_iou: 0.2551 d1.loss_cls: 0.2666 d1.loss_bbox: 0.1418 d1.loss_iou: 0.2465 d2.loss_cls: 0.2627 d2.loss_bbox: 0.1378 d2.loss_iou: 0.2420 d3.loss_cls: 0.2507 d3.loss_bbox: 0.1394 d3.loss_iou: 0.2424 d4.loss_cls: 0.2476 d4.loss_bbox: 0.1399 d4.loss_iou: 0.2432 enc_loss_cls: 0.2873 enc_loss_bbox: 0.1608 enc_loss_iou: 0.2702 dn_loss_cls: 0.0749 dn_loss_bbox: 0.1674 dn_loss_iou: 0.2049 d0.dn_loss_cls: 0.1547 d0.dn_loss_bbox: 0.3004 d0.dn_loss_iou: 0.3338 d1.dn_loss_cls: 0.1068 d1.dn_loss_bbox: 0.2023 d1.dn_loss_iou: 0.2342 d2.dn_loss_cls: 0.0870 d2.dn_loss_bbox: 0.1770 d2.dn_loss_iou: 0.2134 d3.dn_loss_cls: 0.0819 d3.dn_loss_bbox: 0.1698 d3.dn_loss_iou: 0.2073 d4.dn_loss_cls: 0.0774 d4.dn_loss_bbox: 0.1674 d4.dn_loss_iou: 0.2051 d1.loss_lmm_region: 0.0992 loss_lmm_image: 0.7344 2024/11/13 12:51:58 - mmengine - INFO - Iter(train) [126100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:20:44 time: 2.0050 data_time: 0.0185 memory: 31262 grad_norm: 28.1625 loss: 8.9809 loss_cls: 0.3095 loss_bbox: 0.1420 loss_iou: 0.2515 d0.loss_cls: 0.3424 d0.loss_bbox: 0.1587 
d0.loss_iou: 0.2726 d1.loss_cls: 0.3197 d1.loss_bbox: 0.1498 d1.loss_iou: 0.2622 d2.loss_cls: 0.3151 d2.loss_bbox: 0.1463 d2.loss_iou: 0.2569 d3.loss_cls: 0.3149 d3.loss_bbox: 0.1436 d3.loss_iou: 0.2539 d4.loss_cls: 0.3125 d4.loss_bbox: 0.1419 d4.loss_iou: 0.2518 enc_loss_cls: 0.3430 enc_loss_bbox: 0.1773 enc_loss_iou: 0.2910 dn_loss_cls: 0.0893 dn_loss_bbox: 0.1440 dn_loss_iou: 0.1957 d0.dn_loss_cls: 0.1602 d0.dn_loss_bbox: 0.2673 d0.dn_loss_iou: 0.3232 d1.dn_loss_cls: 0.1152 d1.dn_loss_bbox: 0.1711 d1.dn_loss_iou: 0.2226 d2.dn_loss_cls: 0.0977 d2.dn_loss_bbox: 0.1534 d2.dn_loss_iou: 0.2046 d3.dn_loss_cls: 0.0928 d3.dn_loss_bbox: 0.1467 d3.dn_loss_iou: 0.1977 d4.dn_loss_cls: 0.0887 d4.dn_loss_bbox: 0.1440 d4.dn_loss_iou: 0.1958 d1.loss_lmm_region: 0.1010 loss_lmm_image: 0.7135 2024/11/13 12:55:18 - mmengine - INFO - Iter(train) [126200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:17:22 time: 1.9932 data_time: 0.0187 memory: 34561 grad_norm: 25.9959 loss: 7.7971 loss_cls: 0.2415 loss_bbox: 0.1113 loss_iou: 0.2180 d0.loss_cls: 0.2785 d0.loss_bbox: 0.1208 d0.loss_iou: 0.2318 d1.loss_cls: 0.2565 d1.loss_bbox: 0.1175 d1.loss_iou: 0.2272 d2.loss_cls: 0.2487 d2.loss_bbox: 0.1125 d2.loss_iou: 0.2213 d3.loss_cls: 0.2444 d3.loss_bbox: 0.1119 d3.loss_iou: 0.2204 d4.loss_cls: 0.2417 d4.loss_bbox: 0.1115 d4.loss_iou: 0.2185 enc_loss_cls: 0.2884 enc_loss_bbox: 0.1309 enc_loss_iou: 0.2466 dn_loss_cls: 0.0745 dn_loss_bbox: 0.1342 dn_loss_iou: 0.1888 d0.dn_loss_cls: 0.1352 d0.dn_loss_bbox: 0.2573 d0.dn_loss_iou: 0.3104 d1.dn_loss_cls: 0.0963 d1.dn_loss_bbox: 0.1626 d1.dn_loss_iou: 0.2152 d2.dn_loss_cls: 0.0844 d2.dn_loss_bbox: 0.1427 d2.dn_loss_iou: 0.1963 d3.dn_loss_cls: 0.0808 d3.dn_loss_bbox: 0.1357 d3.dn_loss_iou: 0.1903 d4.dn_loss_cls: 0.0776 d4.dn_loss_bbox: 0.1342 d4.dn_loss_iou: 0.1889 d1.loss_lmm_region: 0.0809 loss_lmm_image: 0.7110 2024/11/13 12:58:38 - mmengine - INFO - Iter(train) [126300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:14:01 time: 1.9774 data_time: 0.0187 memory: 34163 grad_norm: 25.0817 loss: 7.9604 loss_cls: 0.2392 loss_bbox: 0.1146 loss_iou: 0.1986 d0.loss_cls: 0.2748 d0.loss_bbox: 0.1190 d0.loss_iou: 0.2127 d1.loss_cls: 0.2547 d1.loss_bbox: 0.1137 d1.loss_iou: 0.2042 d2.loss_cls: 0.2490 d2.loss_bbox: 0.1135 d2.loss_iou: 0.2016 d3.loss_cls: 0.2474 d3.loss_bbox: 0.1100 d3.loss_iou: 0.1976 d4.loss_cls: 0.2420 d4.loss_bbox: 0.1120 d4.loss_iou: 0.1984 enc_loss_cls: 0.2821 enc_loss_bbox: 0.1344 enc_loss_iou: 0.2309 dn_loss_cls: 0.0874 dn_loss_bbox: 0.1563 dn_loss_iou: 0.1997 d0.dn_loss_cls: 0.1697 d0.dn_loss_bbox: 0.2846 d0.dn_loss_iou: 0.3261 d1.dn_loss_cls: 0.1163 d1.dn_loss_bbox: 0.1838 d1.dn_loss_iou: 0.2255 d2.dn_loss_cls: 0.0979 d2.dn_loss_bbox: 0.1648 d2.dn_loss_iou: 0.2074 d3.dn_loss_cls: 0.0918 d3.dn_loss_bbox: 0.1579 d3.dn_loss_iou: 0.2016 d4.dn_loss_cls: 0.0878 d4.dn_loss_bbox: 0.1562 d4.dn_loss_iou: 0.1996 d1.loss_lmm_region: 0.1131 loss_lmm_image: 0.6825 2024/11/13 13:01:59 - mmengine - INFO - Iter(train) [126400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:10:40 time: 2.0088 data_time: 0.0187 memory: 33963 grad_norm: 42.2455 loss: 8.6299 loss_cls: 0.2694 loss_bbox: 0.1349 loss_iou: 0.2406 d0.loss_cls: 0.3082 d0.loss_bbox: 0.1436 d0.loss_iou: 0.2548 d1.loss_cls: 0.2953 d1.loss_bbox: 0.1361 d1.loss_iou: 0.2444 d2.loss_cls: 0.2857 d2.loss_bbox: 0.1320 d2.loss_iou: 0.2377 d3.loss_cls: 0.2757 d3.loss_bbox: 0.1331 d3.loss_iou: 0.2386 d4.loss_cls: 0.2716 d4.loss_bbox: 0.1343 d4.loss_iou: 0.2403 enc_loss_cls: 0.3198 enc_loss_bbox: 0.1504 enc_loss_iou: 
0.2686 dn_loss_cls: 0.0961 dn_loss_bbox: 0.1472 dn_loss_iou: 0.2001 d0.dn_loss_cls: 0.1761 d0.dn_loss_bbox: 0.2870 d0.dn_loss_iou: 0.3317 d1.dn_loss_cls: 0.1241 d1.dn_loss_bbox: 0.1798 d1.dn_loss_iou: 0.2283 d2.dn_loss_cls: 0.1074 d2.dn_loss_bbox: 0.1570 d2.dn_loss_iou: 0.2081 d3.dn_loss_cls: 0.0987 d3.dn_loss_bbox: 0.1492 d3.dn_loss_iou: 0.2025 d4.dn_loss_cls: 0.0960 d4.dn_loss_bbox: 0.1471 d4.dn_loss_iou: 0.2000 d1.loss_lmm_region: 0.1065 loss_lmm_image: 0.6717 2024/11/13 13:05:20 - mmengine - INFO - Iter(train) [126500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:07:19 time: 2.0144 data_time: 0.0186 memory: 34902 grad_norm: 23.1249 loss: 7.9294 loss_cls: 0.2101 loss_bbox: 0.1270 loss_iou: 0.2249 d0.loss_cls: 0.2441 d0.loss_bbox: 0.1305 d0.loss_iou: 0.2351 d1.loss_cls: 0.2254 d1.loss_bbox: 0.1279 d1.loss_iou: 0.2310 d2.loss_cls: 0.2245 d2.loss_bbox: 0.1242 d2.loss_iou: 0.2226 d3.loss_cls: 0.2164 d3.loss_bbox: 0.1245 d3.loss_iou: 0.2235 d4.loss_cls: 0.2136 d4.loss_bbox: 0.1253 d4.loss_iou: 0.2218 enc_loss_cls: 0.2599 enc_loss_bbox: 0.1418 enc_loss_iou: 0.2497 dn_loss_cls: 0.0732 dn_loss_bbox: 0.1528 dn_loss_iou: 0.1904 d0.dn_loss_cls: 0.1501 d0.dn_loss_bbox: 0.2931 d0.dn_loss_iou: 0.3207 d1.dn_loss_cls: 0.1010 d1.dn_loss_bbox: 0.1827 d1.dn_loss_iou: 0.2176 d2.dn_loss_cls: 0.0842 d2.dn_loss_bbox: 0.1594 d2.dn_loss_iou: 0.1985 d3.dn_loss_cls: 0.0780 d3.dn_loss_bbox: 0.1548 d3.dn_loss_iou: 0.1925 d4.dn_loss_cls: 0.0744 d4.dn_loss_bbox: 0.1528 d4.dn_loss_iou: 0.1903 d1.loss_lmm_region: 0.0817 loss_lmm_image: 0.7774 2024/11/13 13:08:40 - mmengine - INFO - Iter(train) [126600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:03:58 time: 2.0094 data_time: 0.0186 memory: 34667 grad_norm: 23.7833 loss: 7.7999 loss_cls: 0.2478 loss_bbox: 0.1048 loss_iou: 0.2083 d0.loss_cls: 0.2808 d0.loss_bbox: 0.1202 d0.loss_iou: 0.2265 d1.loss_cls: 0.2655 d1.loss_bbox: 0.1112 d1.loss_iou: 0.2158 d2.loss_cls: 0.2568 d2.loss_bbox: 0.1063 d2.loss_iou: 0.2103 d3.loss_cls: 0.2497 d3.loss_bbox: 0.1056 d3.loss_iou: 0.2083 d4.loss_cls: 0.2488 d4.loss_bbox: 0.1051 d4.loss_iou: 0.2083 enc_loss_cls: 0.2875 enc_loss_bbox: 0.1284 enc_loss_iou: 0.2449 dn_loss_cls: 0.1001 dn_loss_bbox: 0.1253 dn_loss_iou: 0.1807 d0.dn_loss_cls: 0.1481 d0.dn_loss_bbox: 0.2481 d0.dn_loss_iou: 0.3048 d1.dn_loss_cls: 0.1162 d1.dn_loss_bbox: 0.1512 d1.dn_loss_iou: 0.2056 d2.dn_loss_cls: 0.1067 d2.dn_loss_bbox: 0.1329 d2.dn_loss_iou: 0.1881 d3.dn_loss_cls: 0.0995 d3.dn_loss_bbox: 0.1266 d3.dn_loss_iou: 0.1825 d4.dn_loss_cls: 0.0975 d4.dn_loss_bbox: 0.1252 d4.dn_loss_iou: 0.1805 d1.loss_lmm_region: 0.0766 loss_lmm_image: 0.7629 2024/11/13 13:11:59 - mmengine - INFO - Iter(train) [126700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 13:00:37 time: 1.9916 data_time: 0.0189 memory: 34080 grad_norm: 28.2257 loss: 7.8879 loss_cls: 0.2462 loss_bbox: 0.1177 loss_iou: 0.1963 d0.loss_cls: 0.2783 d0.loss_bbox: 0.1274 d0.loss_iou: 0.2082 d1.loss_cls: 0.2614 d1.loss_bbox: 0.1214 d1.loss_iou: 0.2002 d2.loss_cls: 0.2548 d2.loss_bbox: 0.1190 d2.loss_iou: 0.1996 d3.loss_cls: 0.2507 d3.loss_bbox: 0.1165 d3.loss_iou: 0.1960 d4.loss_cls: 0.2495 d4.loss_bbox: 0.1172 d4.loss_iou: 0.1959 enc_loss_cls: 0.2806 enc_loss_bbox: 0.1389 enc_loss_iou: 0.2236 dn_loss_cls: 0.0936 dn_loss_bbox: 0.1436 dn_loss_iou: 0.1781 d0.dn_loss_cls: 0.1631 d0.dn_loss_bbox: 0.2821 d0.dn_loss_iou: 0.3054 d1.dn_loss_cls: 0.1190 d1.dn_loss_bbox: 0.1726 d1.dn_loss_iou: 0.2064 d2.dn_loss_cls: 0.1026 d2.dn_loss_bbox: 0.1533 d2.dn_loss_iou: 0.1870 d3.dn_loss_cls: 0.0977 d3.dn_loss_bbox: 0.1445 
d3.dn_loss_iou: 0.1801 d4.dn_loss_cls: 0.0951 d4.dn_loss_bbox: 0.1435 d4.dn_loss_iou: 0.1782 d1.loss_lmm_region: 0.1026 loss_lmm_image: 0.7399 2024/11/13 13:15:18 - mmengine - INFO - Iter(train) [126800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:57:15 time: 2.0015 data_time: 0.0188 memory: 33851 grad_norm: 23.1271 loss: 9.5669 loss_cls: 0.2872 loss_bbox: 0.1593 loss_iou: 0.2799 d0.loss_cls: 0.3265 d0.loss_bbox: 0.1601 d0.loss_iou: 0.2924 d1.loss_cls: 0.3064 d1.loss_bbox: 0.1618 d1.loss_iou: 0.2886 d2.loss_cls: 0.3056 d2.loss_bbox: 0.1537 d2.loss_iou: 0.2762 d3.loss_cls: 0.2911 d3.loss_bbox: 0.1589 d3.loss_iou: 0.2791 d4.loss_cls: 0.2909 d4.loss_bbox: 0.1560 d4.loss_iou: 0.2784 enc_loss_cls: 0.3394 enc_loss_bbox: 0.1704 enc_loss_iou: 0.3054 dn_loss_cls: 0.0971 dn_loss_bbox: 0.1694 dn_loss_iou: 0.2336 d0.dn_loss_cls: 0.1850 d0.dn_loss_bbox: 0.3046 d0.dn_loss_iou: 0.3619 d1.dn_loss_cls: 0.1354 d1.dn_loss_bbox: 0.1939 d1.dn_loss_iou: 0.2586 d2.dn_loss_cls: 0.1118 d2.dn_loss_bbox: 0.1750 d2.dn_loss_iou: 0.2404 d3.dn_loss_cls: 0.1056 d3.dn_loss_bbox: 0.1702 d3.dn_loss_iou: 0.2352 d4.dn_loss_cls: 0.0983 d4.dn_loss_bbox: 0.1693 d4.dn_loss_iou: 0.2335 d1.loss_lmm_region: 0.1108 loss_lmm_image: 0.7097 2024/11/13 13:18:36 - mmengine - INFO - Iter(train) [126900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:53:54 time: 1.9789 data_time: 0.0187 memory: 34499 grad_norm: 31.2698 loss: 8.3037 loss_cls: 0.2491 loss_bbox: 0.1224 loss_iou: 0.2084 d0.loss_cls: 0.2811 d0.loss_bbox: 0.1309 d0.loss_iou: 0.2211 d1.loss_cls: 0.2663 d1.loss_bbox: 0.1232 d1.loss_iou: 0.2125 d2.loss_cls: 0.2631 d2.loss_bbox: 0.1199 d2.loss_iou: 0.2053 d3.loss_cls: 0.2601 d3.loss_bbox: 0.1200 d3.loss_iou: 0.2053 d4.loss_cls: 0.2529 d4.loss_bbox: 0.1232 d4.loss_iou: 0.2080 enc_loss_cls: 0.2979 enc_loss_bbox: 0.1375 enc_loss_iou: 0.2304 dn_loss_cls: 0.1350 dn_loss_bbox: 0.1468 dn_loss_iou: 0.1882 d0.dn_loss_cls: 0.1833 d0.dn_loss_bbox: 0.2906 d0.dn_loss_iou: 0.3195 d1.dn_loss_cls: 0.1498 d1.dn_loss_bbox: 0.1808 d1.dn_loss_iou: 0.2186 d2.dn_loss_cls: 0.1403 d2.dn_loss_bbox: 0.1570 d2.dn_loss_iou: 0.1982 d3.dn_loss_cls: 0.1372 d3.dn_loss_bbox: 0.1494 d3.dn_loss_iou: 0.1911 d4.dn_loss_cls: 0.1347 d4.dn_loss_bbox: 0.1468 d4.dn_loss_iou: 0.1881 d1.loss_lmm_region: 0.0885 loss_lmm_image: 0.7209 2024/11/13 13:21:54 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 13:21:54 - mmengine - INFO - Iter(train) [127000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:50:32 time: 1.9915 data_time: 0.0186 memory: 35484 grad_norm: 31.5318 loss: 8.8778 loss_cls: 0.2776 loss_bbox: 0.1415 loss_iou: 0.2637 d0.loss_cls: 0.3197 d0.loss_bbox: 0.1503 d0.loss_iou: 0.2857 d1.loss_cls: 0.2930 d1.loss_bbox: 0.1460 d1.loss_iou: 0.2756 d2.loss_cls: 0.2881 d2.loss_bbox: 0.1411 d2.loss_iou: 0.2665 d3.loss_cls: 0.2786 d3.loss_bbox: 0.1423 d3.loss_iou: 0.2659 d4.loss_cls: 0.2762 d4.loss_bbox: 0.1427 d4.loss_iou: 0.2665 enc_loss_cls: 0.3246 enc_loss_bbox: 0.1622 enc_loss_iou: 0.3002 dn_loss_cls: 0.0758 dn_loss_bbox: 0.1552 dn_loss_iou: 0.2101 d0.dn_loss_cls: 0.1428 d0.dn_loss_bbox: 0.2859 d0.dn_loss_iou: 0.3294 d1.dn_loss_cls: 0.1004 d1.dn_loss_bbox: 0.1845 d1.dn_loss_iou: 0.2333 d2.dn_loss_cls: 0.0850 d2.dn_loss_bbox: 0.1649 d2.dn_loss_iou: 0.2172 d3.dn_loss_cls: 0.0786 d3.dn_loss_bbox: 0.1564 d3.dn_loss_iou: 0.2119 d4.dn_loss_cls: 0.0768 d4.dn_loss_bbox: 0.1552 d4.dn_loss_iou: 0.2101 d1.loss_lmm_region: 0.0900 loss_lmm_image: 0.7060 2024/11/13 13:25:12 - mmengine - INFO - Iter(train) [127100/150000] base_lr: 1.0000e-05 lr: 
1.0000e-05 eta: 12:47:10 time: 1.9807 data_time: 0.0186 memory: 34371 grad_norm: 27.9836 loss: 7.3625 loss_cls: 0.2076 loss_bbox: 0.1138 loss_iou: 0.1892 d0.loss_cls: 0.2440 d0.loss_bbox: 0.1167 d0.loss_iou: 0.1969 d1.loss_cls: 0.2257 d1.loss_bbox: 0.1117 d1.loss_iou: 0.1909 d2.loss_cls: 0.2187 d2.loss_bbox: 0.1097 d2.loss_iou: 0.1890 d3.loss_cls: 0.2106 d3.loss_bbox: 0.1132 d3.loss_iou: 0.1898 d4.loss_cls: 0.2075 d4.loss_bbox: 0.1142 d4.loss_iou: 0.1900 enc_loss_cls: 0.2481 enc_loss_bbox: 0.1275 enc_loss_iou: 0.2130 dn_loss_cls: 0.0786 dn_loss_bbox: 0.1367 dn_loss_iou: 0.1838 d0.dn_loss_cls: 0.1484 d0.dn_loss_bbox: 0.2752 d0.dn_loss_iou: 0.3162 d1.dn_loss_cls: 0.1038 d1.dn_loss_bbox: 0.1673 d1.dn_loss_iou: 0.2100 d2.dn_loss_cls: 0.0852 d2.dn_loss_bbox: 0.1441 d2.dn_loss_iou: 0.1917 d3.dn_loss_cls: 0.0800 d3.dn_loss_bbox: 0.1388 d3.dn_loss_iou: 0.1857 d4.dn_loss_cls: 0.0775 d4.dn_loss_bbox: 0.1367 d4.dn_loss_iou: 0.1838 d1.loss_lmm_region: 0.1010 loss_lmm_image: 0.6899 2024/11/13 13:28:33 - mmengine - INFO - Iter(train) [127200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:43:49 time: 2.0153 data_time: 0.0185 memory: 32976 grad_norm: 29.2986 loss: 8.7235 loss_cls: 0.2689 loss_bbox: 0.1435 loss_iou: 0.2153 d0.loss_cls: 0.3035 d0.loss_bbox: 0.1597 d0.loss_iou: 0.2307 d1.loss_cls: 0.2870 d1.loss_bbox: 0.1508 d1.loss_iou: 0.2229 d2.loss_cls: 0.2770 d2.loss_bbox: 0.1468 d2.loss_iou: 0.2165 d3.loss_cls: 0.2775 d3.loss_bbox: 0.1433 d3.loss_iou: 0.2158 d4.loss_cls: 0.2695 d4.loss_bbox: 0.1434 d4.loss_iou: 0.2157 enc_loss_cls: 0.3066 enc_loss_bbox: 0.1677 enc_loss_iou: 0.2409 dn_loss_cls: 0.0836 dn_loss_bbox: 0.1833 dn_loss_iou: 0.2009 d0.dn_loss_cls: 0.1724 d0.dn_loss_bbox: 0.3336 d0.dn_loss_iou: 0.3282 d1.dn_loss_cls: 0.1172 d1.dn_loss_bbox: 0.2206 d1.dn_loss_iou: 0.2286 d2.dn_loss_cls: 0.0968 d2.dn_loss_bbox: 0.1961 d2.dn_loss_iou: 0.2093 d3.dn_loss_cls: 0.0890 d3.dn_loss_bbox: 0.1861 d3.dn_loss_iou: 0.2030 d4.dn_loss_cls: 0.0848 d4.dn_loss_bbox: 0.1833 d4.dn_loss_iou: 0.2009 d1.loss_lmm_region: 0.1135 loss_lmm_image: 0.6894 2024/11/13 13:31:51 - mmengine - INFO - Iter(train) [127300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:40:28 time: 1.9574 data_time: 0.0186 memory: 33760 grad_norm: 28.0313 loss: 7.6793 loss_cls: 0.2225 loss_bbox: 0.1198 loss_iou: 0.1935 d0.loss_cls: 0.2619 d0.loss_bbox: 0.1250 d0.loss_iou: 0.1953 d1.loss_cls: 0.2422 d1.loss_bbox: 0.1201 d1.loss_iou: 0.1919 d2.loss_cls: 0.2389 d2.loss_bbox: 0.1136 d2.loss_iou: 0.1897 d3.loss_cls: 0.2319 d3.loss_bbox: 0.1161 d3.loss_iou: 0.1917 d4.loss_cls: 0.2252 d4.loss_bbox: 0.1178 d4.loss_iou: 0.1934 enc_loss_cls: 0.2715 enc_loss_bbox: 0.1306 enc_loss_iou: 0.2065 dn_loss_cls: 0.0762 dn_loss_bbox: 0.1503 dn_loss_iou: 0.1854 d0.dn_loss_cls: 0.1497 d0.dn_loss_bbox: 0.2828 d0.dn_loss_iou: 0.3049 d1.dn_loss_cls: 0.1019 d1.dn_loss_bbox: 0.1775 d1.dn_loss_iou: 0.2083 d2.dn_loss_cls: 0.0841 d2.dn_loss_bbox: 0.1585 d2.dn_loss_iou: 0.1918 d3.dn_loss_cls: 0.0792 d3.dn_loss_bbox: 0.1516 d3.dn_loss_iou: 0.1870 d4.dn_loss_cls: 0.0765 d4.dn_loss_bbox: 0.1503 d4.dn_loss_iou: 0.1854 d1.loss_lmm_region: 0.1002 loss_lmm_image: 0.7785 2024/11/13 13:35:09 - mmengine - INFO - Iter(train) [127400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:37:06 time: 1.9835 data_time: 0.0186 memory: 34554 grad_norm: 37.9264 loss: 8.2179 loss_cls: 0.2452 loss_bbox: 0.1294 loss_iou: 0.2296 d0.loss_cls: 0.2868 d0.loss_bbox: 0.1379 d0.loss_iou: 0.2392 d1.loss_cls: 0.2642 d1.loss_bbox: 0.1329 d1.loss_iou: 0.2355 d2.loss_cls: 0.2541 d2.loss_bbox: 0.1295 
d2.loss_iou: 0.2308 d3.loss_cls: 0.2486 d3.loss_bbox: 0.1284 d3.loss_iou: 0.2287 d4.loss_cls: 0.2488 d4.loss_bbox: 0.1269 d4.loss_iou: 0.2279 enc_loss_cls: 0.2914 enc_loss_bbox: 0.1485 enc_loss_iou: 0.2537 dn_loss_cls: 0.0574 dn_loss_bbox: 0.1513 dn_loss_iou: 0.2002 d0.dn_loss_cls: 0.1347 d0.dn_loss_bbox: 0.2916 d0.dn_loss_iou: 0.3318 d1.dn_loss_cls: 0.0863 d1.dn_loss_bbox: 0.1791 d1.dn_loss_iou: 0.2267 d2.dn_loss_cls: 0.0710 d2.dn_loss_bbox: 0.1594 d2.dn_loss_iou: 0.2086 d3.dn_loss_cls: 0.0637 d3.dn_loss_bbox: 0.1528 d3.dn_loss_iou: 0.2022 d4.dn_loss_cls: 0.0587 d4.dn_loss_bbox: 0.1512 d4.dn_loss_iou: 0.2002 d1.loss_lmm_region: 0.0937 loss_lmm_image: 0.7792 2024/11/13 13:38:27 - mmengine - INFO - Iter(train) [127500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:33:45 time: 1.9744 data_time: 0.0187 memory: 33230 grad_norm: 24.3789 loss: 7.1647 loss_cls: 0.2215 loss_bbox: 0.0936 loss_iou: 0.1904 d0.loss_cls: 0.2652 d0.loss_bbox: 0.1021 d0.loss_iou: 0.2033 d1.loss_cls: 0.2395 d1.loss_bbox: 0.0952 d1.loss_iou: 0.1960 d2.loss_cls: 0.2301 d2.loss_bbox: 0.0955 d2.loss_iou: 0.1951 d3.loss_cls: 0.2264 d3.loss_bbox: 0.0939 d3.loss_iou: 0.1915 d4.loss_cls: 0.2224 d4.loss_bbox: 0.0933 d4.loss_iou: 0.1908 enc_loss_cls: 0.2736 enc_loss_bbox: 0.1120 enc_loss_iou: 0.2189 dn_loss_cls: 0.0621 dn_loss_bbox: 0.1207 dn_loss_iou: 0.1649 d0.dn_loss_cls: 0.1413 d0.dn_loss_bbox: 0.2503 d0.dn_loss_iou: 0.2899 d1.dn_loss_cls: 0.0931 d1.dn_loss_bbox: 0.1495 d1.dn_loss_iou: 0.1909 d2.dn_loss_cls: 0.0703 d2.dn_loss_bbox: 0.1295 d2.dn_loss_iou: 0.1729 d3.dn_loss_cls: 0.0659 d3.dn_loss_bbox: 0.1227 d3.dn_loss_iou: 0.1665 d4.dn_loss_cls: 0.0619 d4.dn_loss_bbox: 0.1208 d4.dn_loss_iou: 0.1649 d1.loss_lmm_region: 0.0985 loss_lmm_image: 0.7779 2024/11/13 13:41:44 - mmengine - INFO - Iter(train) [127600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:30:23 time: 1.9728 data_time: 0.0188 memory: 33781 grad_norm: 28.7283 loss: 9.7428 loss_cls: 0.3148 loss_bbox: 0.1305 loss_iou: 0.2645 d0.loss_cls: 0.3602 d0.loss_bbox: 0.1399 d0.loss_iou: 0.2790 d1.loss_cls: 0.3386 d1.loss_bbox: 0.1346 d1.loss_iou: 0.2771 d2.loss_cls: 0.3255 d2.loss_bbox: 0.1289 d2.loss_iou: 0.2691 d3.loss_cls: 0.3152 d3.loss_bbox: 0.1327 d3.loss_iou: 0.2664 d4.loss_cls: 0.3132 d4.loss_bbox: 0.1320 d4.loss_iou: 0.2643 enc_loss_cls: 0.3751 enc_loss_bbox: 0.1435 enc_loss_iou: 0.2921 dn_loss_cls: 0.1511 dn_loss_bbox: 0.1591 dn_loss_iou: 0.2289 d0.dn_loss_cls: 0.2246 d0.dn_loss_bbox: 0.3182 d0.dn_loss_iou: 0.3695 d1.dn_loss_cls: 0.1731 d1.dn_loss_bbox: 0.1980 d1.dn_loss_iou: 0.2604 d2.dn_loss_cls: 0.1533 d2.dn_loss_bbox: 0.1701 d2.dn_loss_iou: 0.2379 d3.dn_loss_cls: 0.1521 d3.dn_loss_bbox: 0.1603 d3.dn_loss_iou: 0.2306 d4.dn_loss_cls: 0.1516 d4.dn_loss_bbox: 0.1591 d4.dn_loss_iou: 0.2288 d1.loss_lmm_region: 0.1093 loss_lmm_image: 0.7096 2024/11/13 13:45:02 - mmengine - INFO - Iter(train) [127700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:27:02 time: 1.9981 data_time: 0.0186 memory: 34154 grad_norm: 40.9710 loss: 9.0953 loss_cls: 0.3247 loss_bbox: 0.1305 loss_iou: 0.2250 d0.loss_cls: 0.3613 d0.loss_bbox: 0.1428 d0.loss_iou: 0.2391 d1.loss_cls: 0.3437 d1.loss_bbox: 0.1336 d1.loss_iou: 0.2277 d2.loss_cls: 0.3384 d2.loss_bbox: 0.1294 d2.loss_iou: 0.2275 d3.loss_cls: 0.3305 d3.loss_bbox: 0.1290 d3.loss_iou: 0.2265 d4.loss_cls: 0.3240 d4.loss_bbox: 0.1317 d4.loss_iou: 0.2266 enc_loss_cls: 0.3779 enc_loss_bbox: 0.1533 enc_loss_iou: 0.2512 dn_loss_cls: 0.1422 dn_loss_bbox: 0.1409 dn_loss_iou: 0.1870 d0.dn_loss_cls: 0.2228 d0.dn_loss_bbox: 0.2697 
d0.dn_loss_iou: 0.3111 d1.dn_loss_cls: 0.1765 d1.dn_loss_bbox: 0.1677 d1.dn_loss_iou: 0.2133 d2.dn_loss_cls: 0.1510 d2.dn_loss_bbox: 0.1494 d2.dn_loss_iou: 0.1947 d3.dn_loss_cls: 0.1422 d3.dn_loss_bbox: 0.1430 d3.dn_loss_iou: 0.1891 d4.dn_loss_cls: 0.1411 d4.dn_loss_bbox: 0.1409 d4.dn_loss_iou: 0.1869 d1.loss_lmm_region: 0.1121 loss_lmm_image: 0.7392 2024/11/13 13:48:24 - mmengine - INFO - Iter(train) [127800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:23:41 time: 2.0135 data_time: 0.0187 memory: 34524 grad_norm: 26.6398 loss: 8.1048 loss_cls: 0.2507 loss_bbox: 0.1284 loss_iou: 0.2284 d0.loss_cls: 0.2821 d0.loss_bbox: 0.1403 d0.loss_iou: 0.2466 d1.loss_cls: 0.2628 d1.loss_bbox: 0.1328 d1.loss_iou: 0.2368 d2.loss_cls: 0.2604 d2.loss_bbox: 0.1262 d2.loss_iou: 0.2302 d3.loss_cls: 0.2526 d3.loss_bbox: 0.1273 d3.loss_iou: 0.2284 d4.loss_cls: 0.2499 d4.loss_bbox: 0.1286 d4.loss_iou: 0.2298 enc_loss_cls: 0.2911 enc_loss_bbox: 0.1528 enc_loss_iou: 0.2609 dn_loss_cls: 0.0693 dn_loss_bbox: 0.1458 dn_loss_iou: 0.1887 d0.dn_loss_cls: 0.1458 d0.dn_loss_bbox: 0.2700 d0.dn_loss_iou: 0.3135 d1.dn_loss_cls: 0.0950 d1.dn_loss_bbox: 0.1736 d1.dn_loss_iou: 0.2161 d2.dn_loss_cls: 0.0788 d2.dn_loss_bbox: 0.1531 d2.dn_loss_iou: 0.1966 d3.dn_loss_cls: 0.0735 d3.dn_loss_bbox: 0.1491 d3.dn_loss_iou: 0.1915 d4.dn_loss_cls: 0.0701 d4.dn_loss_bbox: 0.1459 d4.dn_loss_iou: 0.1887 d1.loss_lmm_region: 0.0973 loss_lmm_image: 0.6952 2024/11/13 13:51:42 - mmengine - INFO - Iter(train) [127900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:20:19 time: 1.9608 data_time: 0.0187 memory: 34173 grad_norm: 22.0577 loss: 8.9572 loss_cls: 0.2876 loss_bbox: 0.1343 loss_iou: 0.2473 d0.loss_cls: 0.3254 d0.loss_bbox: 0.1433 d0.loss_iou: 0.2621 d1.loss_cls: 0.3057 d1.loss_bbox: 0.1381 d1.loss_iou: 0.2523 d2.loss_cls: 0.2980 d2.loss_bbox: 0.1328 d2.loss_iou: 0.2471 d3.loss_cls: 0.2928 d3.loss_bbox: 0.1333 d3.loss_iou: 0.2469 d4.loss_cls: 0.2894 d4.loss_bbox: 0.1346 d4.loss_iou: 0.2467 enc_loss_cls: 0.3323 enc_loss_bbox: 0.1555 enc_loss_iou: 0.2790 dn_loss_cls: 0.0724 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2185 d0.dn_loss_cls: 0.1590 d0.dn_loss_bbox: 0.3007 d0.dn_loss_iou: 0.3552 d1.dn_loss_cls: 0.1068 d1.dn_loss_bbox: 0.1857 d1.dn_loss_iou: 0.2475 d2.dn_loss_cls: 0.0854 d2.dn_loss_bbox: 0.1656 d2.dn_loss_iou: 0.2274 d3.dn_loss_cls: 0.0767 d3.dn_loss_bbox: 0.1575 d3.dn_loss_iou: 0.2206 d4.dn_loss_cls: 0.0734 d4.dn_loss_bbox: 0.1559 d4.dn_loss_iou: 0.2186 d1.loss_lmm_region: 0.0983 loss_lmm_image: 0.7915 2024/11/13 13:55:00 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 13:55:00 - mmengine - INFO - Iter(train) [128000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:16:58 time: 1.9902 data_time: 0.0186 memory: 33083 grad_norm: 28.7788 loss: 7.6273 loss_cls: 0.2137 loss_bbox: 0.1193 loss_iou: 0.2142 d0.loss_cls: 0.2448 d0.loss_bbox: 0.1315 d0.loss_iou: 0.2272 d1.loss_cls: 0.2255 d1.loss_bbox: 0.1286 d1.loss_iou: 0.2232 d2.loss_cls: 0.2185 d2.loss_bbox: 0.1228 d2.loss_iou: 0.2169 d3.loss_cls: 0.2160 d3.loss_bbox: 0.1182 d3.loss_iou: 0.2146 d4.loss_cls: 0.2175 d4.loss_bbox: 0.1179 d4.loss_iou: 0.2122 enc_loss_cls: 0.2600 enc_loss_bbox: 0.1393 enc_loss_iou: 0.2403 dn_loss_cls: 0.0595 dn_loss_bbox: 0.1433 dn_loss_iou: 0.1879 d0.dn_loss_cls: 0.1339 d0.dn_loss_bbox: 0.2915 d0.dn_loss_iou: 0.3205 d1.dn_loss_cls: 0.0857 d1.dn_loss_bbox: 0.1761 d1.dn_loss_iou: 0.2131 d2.dn_loss_cls: 0.0697 d2.dn_loss_bbox: 0.1522 d2.dn_loss_iou: 0.1950 d3.dn_loss_cls: 0.0638 d3.dn_loss_bbox: 0.1461 d3.dn_loss_iou: 0.1899 
d4.dn_loss_cls: 0.0605 d4.dn_loss_bbox: 0.1433 d4.dn_loss_iou: 0.1879 d1.loss_lmm_region: 0.0858 loss_lmm_image: 0.6995 2024/11/13 13:58:19 - mmengine - INFO - Iter(train) [128100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:13:37 time: 1.9806 data_time: 0.0187 memory: 33654 grad_norm: 27.6148 loss: 8.9237 loss_cls: 0.2751 loss_bbox: 0.1313 loss_iou: 0.2621 d0.loss_cls: 0.3179 d0.loss_bbox: 0.1385 d0.loss_iou: 0.2660 d1.loss_cls: 0.2967 d1.loss_bbox: 0.1350 d1.loss_iou: 0.2622 d2.loss_cls: 0.2864 d2.loss_bbox: 0.1320 d2.loss_iou: 0.2589 d3.loss_cls: 0.2802 d3.loss_bbox: 0.1301 d3.loss_iou: 0.2604 d4.loss_cls: 0.2761 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2620 enc_loss_cls: 0.3225 enc_loss_bbox: 0.1476 enc_loss_iou: 0.2796 dn_loss_cls: 0.1234 dn_loss_bbox: 0.1487 dn_loss_iou: 0.2032 d0.dn_loss_cls: 0.1919 d0.dn_loss_bbox: 0.2625 d0.dn_loss_iou: 0.3210 d1.dn_loss_cls: 0.1480 d1.dn_loss_bbox: 0.1750 d1.dn_loss_iou: 0.2282 d2.dn_loss_cls: 0.1346 d2.dn_loss_bbox: 0.1569 d2.dn_loss_iou: 0.2116 d3.dn_loss_cls: 0.1320 d3.dn_loss_bbox: 0.1501 d3.dn_loss_iou: 0.2053 d4.dn_loss_cls: 0.1276 d4.dn_loss_bbox: 0.1487 d4.dn_loss_iou: 0.2033 d1.loss_lmm_region: 0.1092 loss_lmm_image: 0.6908 2024/11/13 14:01:39 - mmengine - INFO - Iter(train) [128200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:10:15 time: 1.9921 data_time: 0.0187 memory: 34611 grad_norm: 26.5052 loss: 8.0375 loss_cls: 0.2935 loss_bbox: 0.0972 loss_iou: 0.1820 d0.loss_cls: 0.3238 d0.loss_bbox: 0.1060 d0.loss_iou: 0.1975 d1.loss_cls: 0.3046 d1.loss_bbox: 0.1009 d1.loss_iou: 0.1868 d2.loss_cls: 0.3027 d2.loss_bbox: 0.0982 d2.loss_iou: 0.1828 d3.loss_cls: 0.2974 d3.loss_bbox: 0.0970 d3.loss_iou: 0.1834 d4.loss_cls: 0.2948 d4.loss_bbox: 0.0953 d4.loss_iou: 0.1820 enc_loss_cls: 0.3238 enc_loss_bbox: 0.1172 enc_loss_iou: 0.2098 dn_loss_cls: 0.1338 dn_loss_bbox: 0.1166 dn_loss_iou: 0.1682 d0.dn_loss_cls: 0.2092 d0.dn_loss_bbox: 0.2615 d0.dn_loss_iou: 0.2981 d1.dn_loss_cls: 0.1651 d1.dn_loss_bbox: 0.1507 d1.dn_loss_iou: 0.1964 d2.dn_loss_cls: 0.1468 d2.dn_loss_bbox: 0.1245 d2.dn_loss_iou: 0.1750 d3.dn_loss_cls: 0.1416 d3.dn_loss_bbox: 0.1186 d3.dn_loss_iou: 0.1704 d4.dn_loss_cls: 0.1364 d4.dn_loss_bbox: 0.1167 d4.dn_loss_iou: 0.1682 d1.loss_lmm_region: 0.1064 loss_lmm_image: 0.7568 2024/11/13 14:04:58 - mmengine - INFO - Iter(train) [128300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:06:54 time: 1.9777 data_time: 0.0186 memory: 34954 grad_norm: 27.7964 loss: 9.1748 loss_cls: 0.2744 loss_bbox: 0.1531 loss_iou: 0.2670 d0.loss_cls: 0.3075 d0.loss_bbox: 0.1620 d0.loss_iou: 0.2772 d1.loss_cls: 0.2882 d1.loss_bbox: 0.1584 d1.loss_iou: 0.2724 d2.loss_cls: 0.2793 d2.loss_bbox: 0.1554 d2.loss_iou: 0.2690 d3.loss_cls: 0.2746 d3.loss_bbox: 0.1556 d3.loss_iou: 0.2684 d4.loss_cls: 0.2766 d4.loss_bbox: 0.1535 d4.loss_iou: 0.2659 enc_loss_cls: 0.3165 enc_loss_bbox: 0.1806 enc_loss_iou: 0.2968 dn_loss_cls: 0.0720 dn_loss_bbox: 0.1692 dn_loss_iou: 0.2204 d0.dn_loss_cls: 0.1456 d0.dn_loss_bbox: 0.3140 d0.dn_loss_iou: 0.3531 d1.dn_loss_cls: 0.0948 d1.dn_loss_bbox: 0.1957 d1.dn_loss_iou: 0.2470 d2.dn_loss_cls: 0.0821 d2.dn_loss_bbox: 0.1764 d2.dn_loss_iou: 0.2279 d3.dn_loss_cls: 0.0742 d3.dn_loss_bbox: 0.1713 d3.dn_loss_iou: 0.2223 d4.dn_loss_cls: 0.0716 d4.dn_loss_bbox: 0.1692 d4.dn_loss_iou: 0.2203 d1.loss_lmm_region: 0.1133 loss_lmm_image: 0.7817 2024/11/13 14:08:15 - mmengine - INFO - Iter(train) [128400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:03:32 time: 1.9649 data_time: 0.0186 memory: 34263 grad_norm: 27.2162 loss: 8.5253 loss_cls: 
0.2740 loss_bbox: 0.1280 loss_iou: 0.2460 d0.loss_cls: 0.3069 d0.loss_bbox: 0.1404 d0.loss_iou: 0.2624 d1.loss_cls: 0.2918 d1.loss_bbox: 0.1330 d1.loss_iou: 0.2505 d2.loss_cls: 0.2875 d2.loss_bbox: 0.1265 d2.loss_iou: 0.2434 d3.loss_cls: 0.2785 d3.loss_bbox: 0.1283 d3.loss_iou: 0.2450 d4.loss_cls: 0.2747 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2462 enc_loss_cls: 0.3121 enc_loss_bbox: 0.1517 enc_loss_iou: 0.2743 dn_loss_cls: 0.0690 dn_loss_bbox: 0.1497 dn_loss_iou: 0.2029 d0.dn_loss_cls: 0.1452 d0.dn_loss_bbox: 0.2723 d0.dn_loss_iou: 0.3241 d1.dn_loss_cls: 0.0968 d1.dn_loss_bbox: 0.1748 d1.dn_loss_iou: 0.2279 d2.dn_loss_cls: 0.0801 d2.dn_loss_bbox: 0.1578 d2.dn_loss_iou: 0.2105 d3.dn_loss_cls: 0.0727 d3.dn_loss_bbox: 0.1512 d3.dn_loss_iou: 0.2046 d4.dn_loss_cls: 0.0699 d4.dn_loss_bbox: 0.1496 d4.dn_loss_iou: 0.2028 d1.loss_lmm_region: 0.0824 loss_lmm_image: 0.7491 2024/11/13 14:11:36 - mmengine - INFO - Iter(train) [128500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 12:00:11 time: 2.0174 data_time: 0.0187 memory: 34280 grad_norm: 28.3927 loss: 7.3374 loss_cls: 0.2141 loss_bbox: 0.1088 loss_iou: 0.1869 d0.loss_cls: 0.2450 d0.loss_bbox: 0.1193 d0.loss_iou: 0.2004 d1.loss_cls: 0.2309 d1.loss_bbox: 0.1135 d1.loss_iou: 0.1917 d2.loss_cls: 0.2232 d2.loss_bbox: 0.1113 d2.loss_iou: 0.1894 d3.loss_cls: 0.2165 d3.loss_bbox: 0.1104 d3.loss_iou: 0.1890 d4.loss_cls: 0.2134 d4.loss_bbox: 0.1110 d4.loss_iou: 0.1869 enc_loss_cls: 0.2509 enc_loss_bbox: 0.1319 enc_loss_iou: 0.2141 dn_loss_cls: 0.0725 dn_loss_bbox: 0.1381 dn_loss_iou: 0.1721 d0.dn_loss_cls: 0.1413 d0.dn_loss_bbox: 0.2608 d0.dn_loss_iou: 0.2928 d1.dn_loss_cls: 0.0951 d1.dn_loss_bbox: 0.1641 d1.dn_loss_iou: 0.1969 d2.dn_loss_cls: 0.0806 d2.dn_loss_bbox: 0.1455 d2.dn_loss_iou: 0.1797 d3.dn_loss_cls: 0.0761 d3.dn_loss_bbox: 0.1391 d3.dn_loss_iou: 0.1738 d4.dn_loss_cls: 0.0732 d4.dn_loss_bbox: 0.1381 d4.dn_loss_iou: 0.1720 d1.loss_lmm_region: 0.1006 loss_lmm_image: 0.7666 2024/11/13 14:14:58 - mmengine - INFO - Iter(train) [128600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:56:50 time: 2.0297 data_time: 0.0186 memory: 34990 grad_norm: 26.6181 loss: 9.3168 loss_cls: 0.2837 loss_bbox: 0.1515 loss_iou: 0.2653 d0.loss_cls: 0.3141 d0.loss_bbox: 0.1636 d0.loss_iou: 0.2801 d1.loss_cls: 0.2958 d1.loss_bbox: 0.1558 d1.loss_iou: 0.2707 d2.loss_cls: 0.2890 d2.loss_bbox: 0.1534 d2.loss_iou: 0.2677 d3.loss_cls: 0.2894 d3.loss_bbox: 0.1525 d3.loss_iou: 0.2674 d4.loss_cls: 0.2847 d4.loss_bbox: 0.1499 d4.loss_iou: 0.2645 enc_loss_cls: 0.3192 enc_loss_bbox: 0.1783 enc_loss_iou: 0.2994 dn_loss_cls: 0.0744 dn_loss_bbox: 0.1859 dn_loss_iou: 0.2324 d0.dn_loss_cls: 0.1582 d0.dn_loss_bbox: 0.3125 d0.dn_loss_iou: 0.3584 d1.dn_loss_cls: 0.1058 d1.dn_loss_bbox: 0.2119 d1.dn_loss_iou: 0.2553 d2.dn_loss_cls: 0.0883 d2.dn_loss_bbox: 0.1941 d2.dn_loss_iou: 0.2389 d3.dn_loss_cls: 0.0818 d3.dn_loss_bbox: 0.1871 d3.dn_loss_iou: 0.2341 d4.dn_loss_cls: 0.0754 d4.dn_loss_bbox: 0.1859 d4.dn_loss_iou: 0.2324 d1.loss_lmm_region: 0.0970 loss_lmm_image: 0.7107 2024/11/13 14:18:16 - mmengine - INFO - Iter(train) [128700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:53:29 time: 1.9886 data_time: 0.0186 memory: 34041 grad_norm: 23.7524 loss: 7.8496 loss_cls: 0.2137 loss_bbox: 0.1282 loss_iou: 0.2003 d0.loss_cls: 0.2435 d0.loss_bbox: 0.1345 d0.loss_iou: 0.2132 d1.loss_cls: 0.2281 d1.loss_bbox: 0.1307 d1.loss_iou: 0.2040 d2.loss_cls: 0.2238 d2.loss_bbox: 0.1268 d2.loss_iou: 0.2018 d3.loss_cls: 0.2168 d3.loss_bbox: 0.1280 d3.loss_iou: 0.2007 d4.loss_cls: 0.2129 d4.loss_bbox: 0.1291 
d4.loss_iou: 0.2009 enc_loss_cls: 0.2481 enc_loss_bbox: 0.1527 enc_loss_iou: 0.2289 dn_loss_cls: 0.0692 dn_loss_bbox: 0.1692 dn_loss_iou: 0.1891 d0.dn_loss_cls: 0.1389 d0.dn_loss_bbox: 0.3048 d0.dn_loss_iou: 0.3165 d1.dn_loss_cls: 0.0897 d1.dn_loss_bbox: 0.1964 d1.dn_loss_iou: 0.2137 d2.dn_loss_cls: 0.0774 d2.dn_loss_bbox: 0.1792 d2.dn_loss_iou: 0.1969 d3.dn_loss_cls: 0.0749 d3.dn_loss_bbox: 0.1709 d3.dn_loss_iou: 0.1909 d4.dn_loss_cls: 0.0709 d4.dn_loss_bbox: 0.1692 d4.dn_loss_iou: 0.1891 d1.loss_lmm_region: 0.1029 loss_lmm_image: 0.7731 2024/11/13 14:21:37 - mmengine - INFO - Iter(train) [128800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:50:08 time: 2.0140 data_time: 0.0187 memory: 33033 grad_norm: 27.8914 loss: 7.4330 loss_cls: 0.2458 loss_bbox: 0.0993 loss_iou: 0.2031 d0.loss_cls: 0.2844 d0.loss_bbox: 0.1072 d0.loss_iou: 0.2189 d1.loss_cls: 0.2709 d1.loss_bbox: 0.0980 d1.loss_iou: 0.2027 d2.loss_cls: 0.2626 d2.loss_bbox: 0.0957 d2.loss_iou: 0.2007 d3.loss_cls: 0.2523 d3.loss_bbox: 0.0988 d3.loss_iou: 0.2024 d4.loss_cls: 0.2472 d4.loss_bbox: 0.0998 d4.loss_iou: 0.2031 enc_loss_cls: 0.2954 enc_loss_bbox: 0.1186 enc_loss_iou: 0.2309 dn_loss_cls: 0.0717 dn_loss_bbox: 0.1148 dn_loss_iou: 0.1705 d0.dn_loss_cls: 0.1424 d0.dn_loss_bbox: 0.2380 d0.dn_loss_iou: 0.2946 d1.dn_loss_cls: 0.1005 d1.dn_loss_bbox: 0.1394 d1.dn_loss_iou: 0.1948 d2.dn_loss_cls: 0.0822 d2.dn_loss_bbox: 0.1215 d2.dn_loss_iou: 0.1771 d3.dn_loss_cls: 0.0759 d3.dn_loss_bbox: 0.1164 d3.dn_loss_iou: 0.1722 d4.dn_loss_cls: 0.0723 d4.dn_loss_bbox: 0.1149 d4.dn_loss_iou: 0.1705 d1.loss_lmm_region: 0.0706 loss_lmm_image: 0.7548 2024/11/13 14:24:57 - mmengine - INFO - Iter(train) [128900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:46:47 time: 2.0121 data_time: 0.0186 memory: 33589 grad_norm: 27.2083 loss: 8.5400 loss_cls: 0.2607 loss_bbox: 0.1195 loss_iou: 0.2465 d0.loss_cls: 0.2907 d0.loss_bbox: 0.1301 d0.loss_iou: 0.2643 d1.loss_cls: 0.2780 d1.loss_bbox: 0.1192 d1.loss_iou: 0.2505 d2.loss_cls: 0.2710 d2.loss_bbox: 0.1169 d2.loss_iou: 0.2491 d3.loss_cls: 0.2643 d3.loss_bbox: 0.1197 d3.loss_iou: 0.2493 d4.loss_cls: 0.2607 d4.loss_bbox: 0.1197 d4.loss_iou: 0.2481 enc_loss_cls: 0.3033 enc_loss_bbox: 0.1368 enc_loss_iou: 0.2710 dn_loss_cls: 0.0904 dn_loss_bbox: 0.1411 dn_loss_iou: 0.1961 d0.dn_loss_cls: 0.1753 d0.dn_loss_bbox: 0.2786 d0.dn_loss_iou: 0.3220 d1.dn_loss_cls: 0.1237 d1.dn_loss_bbox: 0.1735 d1.dn_loss_iou: 0.2225 d2.dn_loss_cls: 0.1022 d2.dn_loss_bbox: 0.1526 d2.dn_loss_iou: 0.2044 d3.dn_loss_cls: 0.0939 d3.dn_loss_bbox: 0.1440 d3.dn_loss_iou: 0.1985 d4.dn_loss_cls: 0.0899 d4.dn_loss_bbox: 0.1410 d4.dn_loss_iou: 0.1961 d1.loss_lmm_region: 0.1179 loss_lmm_image: 0.8069 2024/11/13 14:28:17 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 14:28:17 - mmengine - INFO - Iter(train) [129000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:43:26 time: 2.0015 data_time: 0.0187 memory: 35540 grad_norm: 29.2307 loss: 7.9962 loss_cls: 0.2692 loss_bbox: 0.1063 loss_iou: 0.2196 d0.loss_cls: 0.2986 d0.loss_bbox: 0.1269 d0.loss_iou: 0.2379 d1.loss_cls: 0.2862 d1.loss_bbox: 0.1108 d1.loss_iou: 0.2271 d2.loss_cls: 0.2773 d2.loss_bbox: 0.1065 d2.loss_iou: 0.2205 d3.loss_cls: 0.2718 d3.loss_bbox: 0.1072 d3.loss_iou: 0.2204 d4.loss_cls: 0.2687 d4.loss_bbox: 0.1076 d4.loss_iou: 0.2213 enc_loss_cls: 0.3066 enc_loss_bbox: 0.1356 enc_loss_iou: 0.2563 dn_loss_cls: 0.0562 dn_loss_bbox: 0.1339 dn_loss_iou: 0.1964 d0.dn_loss_cls: 0.1384 d0.dn_loss_bbox: 0.2700 d0.dn_loss_iou: 0.3325 d1.dn_loss_cls: 0.0844 
d1.dn_loss_bbox: 0.1660 d1.dn_loss_iou: 0.2245 d2.dn_loss_cls: 0.0659 d2.dn_loss_bbox: 0.1459 d2.dn_loss_iou: 0.2049 d3.dn_loss_cls: 0.0595 d3.dn_loss_bbox: 0.1358 d3.dn_loss_iou: 0.1981 d4.dn_loss_cls: 0.0573 d4.dn_loss_bbox: 0.1338 d4.dn_loss_iou: 0.1964 d1.loss_lmm_region: 0.0931 loss_lmm_image: 0.7208 2024/11/13 14:31:40 - mmengine - INFO - Iter(train) [129100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:40:05 time: 2.0228 data_time: 0.0189 memory: 34528 grad_norm: 23.2046 loss: 8.4442 loss_cls: 0.2583 loss_bbox: 0.1370 loss_iou: 0.2586 d0.loss_cls: 0.2980 d0.loss_bbox: 0.1467 d0.loss_iou: 0.2710 d1.loss_cls: 0.2744 d1.loss_bbox: 0.1412 d1.loss_iou: 0.2657 d2.loss_cls: 0.2683 d2.loss_bbox: 0.1379 d2.loss_iou: 0.2596 d3.loss_cls: 0.2597 d3.loss_bbox: 0.1396 d3.loss_iou: 0.2622 d4.loss_cls: 0.2585 d4.loss_bbox: 0.1373 d4.loss_iou: 0.2603 enc_loss_cls: 0.2992 enc_loss_bbox: 0.1594 enc_loss_iou: 0.2857 dn_loss_cls: 0.0713 dn_loss_bbox: 0.1335 dn_loss_iou: 0.1963 d0.dn_loss_cls: 0.1431 d0.dn_loss_bbox: 0.2438 d0.dn_loss_iou: 0.3151 d1.dn_loss_cls: 0.0955 d1.dn_loss_bbox: 0.1567 d1.dn_loss_iou: 0.2206 d2.dn_loss_cls: 0.0792 d2.dn_loss_bbox: 0.1400 d2.dn_loss_iou: 0.2034 d3.dn_loss_cls: 0.0726 d3.dn_loss_bbox: 0.1343 d3.dn_loss_iou: 0.1979 d4.dn_loss_cls: 0.0724 d4.dn_loss_bbox: 0.1335 d4.dn_loss_iou: 0.1963 d1.loss_lmm_region: 0.0852 loss_lmm_image: 0.7748 2024/11/13 14:34:58 - mmengine - INFO - Iter(train) [129200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:36:44 time: 2.0019 data_time: 0.0185 memory: 35675 grad_norm: 34.5361 loss: 9.0635 loss_cls: 0.2789 loss_bbox: 0.1434 loss_iou: 0.2499 d0.loss_cls: 0.3126 d0.loss_bbox: 0.1515 d0.loss_iou: 0.2640 d1.loss_cls: 0.2978 d1.loss_bbox: 0.1450 d1.loss_iou: 0.2533 d2.loss_cls: 0.2895 d2.loss_bbox: 0.1437 d2.loss_iou: 0.2488 d3.loss_cls: 0.2844 d3.loss_bbox: 0.1446 d3.loss_iou: 0.2495 d4.loss_cls: 0.2790 d4.loss_bbox: 0.1450 d4.loss_iou: 0.2500 enc_loss_cls: 0.3206 enc_loss_bbox: 0.1598 enc_loss_iou: 0.2740 dn_loss_cls: 0.0992 dn_loss_bbox: 0.1617 dn_loss_iou: 0.2223 d0.dn_loss_cls: 0.1721 d0.dn_loss_bbox: 0.3015 d0.dn_loss_iou: 0.3564 d1.dn_loss_cls: 0.1278 d1.dn_loss_bbox: 0.1941 d1.dn_loss_iou: 0.2505 d2.dn_loss_cls: 0.1116 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2319 d3.dn_loss_cls: 0.1031 d3.dn_loss_bbox: 0.1637 d3.dn_loss_iou: 0.2243 d4.dn_loss_cls: 0.0990 d4.dn_loss_bbox: 0.1617 d4.dn_loss_iou: 0.2223 d1.loss_lmm_region: 0.1125 loss_lmm_image: 0.6903 2024/11/13 14:38:16 - mmengine - INFO - Iter(train) [129300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:33:22 time: 1.9890 data_time: 0.0187 memory: 33893 grad_norm: 25.5419 loss: 8.3898 loss_cls: 0.2575 loss_bbox: 0.1255 loss_iou: 0.2261 d0.loss_cls: 0.2884 d0.loss_bbox: 0.1389 d0.loss_iou: 0.2416 d1.loss_cls: 0.2757 d1.loss_bbox: 0.1320 d1.loss_iou: 0.2320 d2.loss_cls: 0.2714 d2.loss_bbox: 0.1239 d2.loss_iou: 0.2263 d3.loss_cls: 0.2596 d3.loss_bbox: 0.1256 d3.loss_iou: 0.2263 d4.loss_cls: 0.2582 d4.loss_bbox: 0.1248 d4.loss_iou: 0.2256 enc_loss_cls: 0.3067 enc_loss_bbox: 0.1457 enc_loss_iou: 0.2541 dn_loss_cls: 0.0917 dn_loss_bbox: 0.1533 dn_loss_iou: 0.1978 d0.dn_loss_cls: 0.1629 d0.dn_loss_bbox: 0.2902 d0.dn_loss_iou: 0.3245 d1.dn_loss_cls: 0.1247 d1.dn_loss_bbox: 0.1827 d1.dn_loss_iou: 0.2253 d2.dn_loss_cls: 0.1082 d2.dn_loss_bbox: 0.1611 d2.dn_loss_iou: 0.2059 d3.dn_loss_cls: 0.1010 d3.dn_loss_bbox: 0.1543 d3.dn_loss_iou: 0.1995 d4.dn_loss_cls: 0.0925 d4.dn_loss_bbox: 0.1534 d4.dn_loss_iou: 0.1978 d1.loss_lmm_region: 0.0830 loss_lmm_image: 0.7138 2024/11/13 14:41:36 
- mmengine - INFO - Iter(train) [129400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:30:01 time: 2.0061 data_time: 0.0186 memory: 34662 grad_norm: 21.2844 loss: 8.6681 loss_cls: 0.2770 loss_bbox: 0.1223 loss_iou: 0.2347 d0.loss_cls: 0.3142 d0.loss_bbox: 0.1359 d0.loss_iou: 0.2451 d1.loss_cls: 0.2936 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2373 d2.loss_cls: 0.2861 d2.loss_bbox: 0.1298 d2.loss_iou: 0.2378 d3.loss_cls: 0.2856 d3.loss_bbox: 0.1220 d3.loss_iou: 0.2353 d4.loss_cls: 0.2786 d4.loss_bbox: 0.1220 d4.loss_iou: 0.2344 enc_loss_cls: 0.3173 enc_loss_bbox: 0.1478 enc_loss_iou: 0.2675 dn_loss_cls: 0.0793 dn_loss_bbox: 0.1575 dn_loss_iou: 0.2100 d0.dn_loss_cls: 0.1577 d0.dn_loss_bbox: 0.3006 d0.dn_loss_iou: 0.3465 d1.dn_loss_cls: 0.1086 d1.dn_loss_bbox: 0.1887 d1.dn_loss_iou: 0.2408 d2.dn_loss_cls: 0.0904 d2.dn_loss_bbox: 0.1665 d2.dn_loss_iou: 0.2193 d3.dn_loss_cls: 0.0833 d3.dn_loss_bbox: 0.1589 d3.dn_loss_iou: 0.2124 d4.dn_loss_cls: 0.0793 d4.dn_loss_bbox: 0.1574 d4.dn_loss_iou: 0.2100 d1.loss_lmm_region: 0.1087 loss_lmm_image: 0.7406 2024/11/13 14:44:56 - mmengine - INFO - Iter(train) [129500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:26:40 time: 2.0027 data_time: 0.0186 memory: 33859 grad_norm: 25.4476 loss: 7.6248 loss_cls: 0.2218 loss_bbox: 0.1039 loss_iou: 0.1834 d0.loss_cls: 0.2518 d0.loss_bbox: 0.1165 d0.loss_iou: 0.1936 d1.loss_cls: 0.2363 d1.loss_bbox: 0.1060 d1.loss_iou: 0.1849 d2.loss_cls: 0.2320 d2.loss_bbox: 0.1028 d2.loss_iou: 0.1819 d3.loss_cls: 0.2269 d3.loss_bbox: 0.1014 d3.loss_iou: 0.1816 d4.loss_cls: 0.2228 d4.loss_bbox: 0.1034 d4.loss_iou: 0.1828 enc_loss_cls: 0.2593 enc_loss_bbox: 0.1237 enc_loss_iou: 0.2051 dn_loss_cls: 0.0981 dn_loss_bbox: 0.1468 dn_loss_iou: 0.1862 d0.dn_loss_cls: 0.1765 d0.dn_loss_bbox: 0.2780 d0.dn_loss_iou: 0.3091 d1.dn_loss_cls: 0.1306 d1.dn_loss_bbox: 0.1746 d1.dn_loss_iou: 0.2111 d2.dn_loss_cls: 0.1109 d2.dn_loss_bbox: 0.1522 d2.dn_loss_iou: 0.1930 d3.dn_loss_cls: 0.1039 d3.dn_loss_bbox: 0.1480 d3.dn_loss_iou: 0.1879 d4.dn_loss_cls: 0.1005 d4.dn_loss_bbox: 0.1467 d4.dn_loss_iou: 0.1862 d1.loss_lmm_region: 0.0887 loss_lmm_image: 0.7734 2024/11/13 14:48:13 - mmengine - INFO - Iter(train) [129600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:23:18 time: 1.9817 data_time: 0.0186 memory: 32679 grad_norm: 32.2761 loss: 8.0870 loss_cls: 0.2669 loss_bbox: 0.1157 loss_iou: 0.2256 d0.loss_cls: 0.2992 d0.loss_bbox: 0.1284 d0.loss_iou: 0.2408 d1.loss_cls: 0.2791 d1.loss_bbox: 0.1178 d1.loss_iou: 0.2288 d2.loss_cls: 0.2719 d2.loss_bbox: 0.1167 d2.loss_iou: 0.2264 d3.loss_cls: 0.2709 d3.loss_bbox: 0.1144 d3.loss_iou: 0.2247 d4.loss_cls: 0.2665 d4.loss_bbox: 0.1177 d4.loss_iou: 0.2262 enc_loss_cls: 0.3115 enc_loss_bbox: 0.1399 enc_loss_iou: 0.2600 dn_loss_cls: 0.0835 dn_loss_bbox: 0.1400 dn_loss_iou: 0.1847 d0.dn_loss_cls: 0.1548 d0.dn_loss_bbox: 0.2496 d0.dn_loss_iou: 0.2989 d1.dn_loss_cls: 0.1088 d1.dn_loss_bbox: 0.1619 d1.dn_loss_iou: 0.2065 d2.dn_loss_cls: 0.0928 d2.dn_loss_bbox: 0.1489 d2.dn_loss_iou: 0.1926 d3.dn_loss_cls: 0.0873 d3.dn_loss_bbox: 0.1429 d3.dn_loss_iou: 0.1872 d4.dn_loss_cls: 0.0823 d4.dn_loss_bbox: 0.1400 d4.dn_loss_iou: 0.1848 d1.loss_lmm_region: 0.0920 loss_lmm_image: 0.6985 2024/11/13 14:51:33 - mmengine - INFO - Iter(train) [129700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:19:57 time: 1.9937 data_time: 0.0186 memory: 34593 grad_norm: 25.5607 loss: 8.1926 loss_cls: 0.2445 loss_bbox: 0.1386 loss_iou: 0.2377 d0.loss_cls: 0.2871 d0.loss_bbox: 0.1482 d0.loss_iou: 0.2506 d1.loss_cls: 0.2623 d1.loss_bbox: 
0.1465 d1.loss_iou: 0.2468 d2.loss_cls: 0.2565 d2.loss_bbox: 0.1377 d2.loss_iou: 0.2391 d3.loss_cls: 0.2506 d3.loss_bbox: 0.1378 d3.loss_iou: 0.2393 d4.loss_cls: 0.2480 d4.loss_bbox: 0.1372 d4.loss_iou: 0.2380 enc_loss_cls: 0.2916 enc_loss_bbox: 0.1564 enc_loss_iou: 0.2629 dn_loss_cls: 0.0720 dn_loss_bbox: 0.1324 dn_loss_iou: 0.1836 d0.dn_loss_cls: 0.1516 d0.dn_loss_bbox: 0.2622 d0.dn_loss_iou: 0.3109 d1.dn_loss_cls: 0.0996 d1.dn_loss_bbox: 0.1595 d1.dn_loss_iou: 0.2098 d2.dn_loss_cls: 0.0846 d2.dn_loss_bbox: 0.1407 d2.dn_loss_iou: 0.1914 d3.dn_loss_cls: 0.0769 d3.dn_loss_bbox: 0.1340 d3.dn_loss_iou: 0.1851 d4.dn_loss_cls: 0.0723 d4.dn_loss_bbox: 0.1325 d4.dn_loss_iou: 0.1836 d1.loss_lmm_region: 0.0981 loss_lmm_image: 0.7548 2024/11/13 14:54:54 - mmengine - INFO - Iter(train) [129800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:16:36 time: 2.0059 data_time: 0.0187 memory: 32625 grad_norm: 32.7554 loss: 7.7044 loss_cls: 0.2194 loss_bbox: 0.1125 loss_iou: 0.2011 d0.loss_cls: 0.2461 d0.loss_bbox: 0.1231 d0.loss_iou: 0.2127 d1.loss_cls: 0.2335 d1.loss_bbox: 0.1131 d1.loss_iou: 0.2035 d2.loss_cls: 0.2253 d2.loss_bbox: 0.1132 d2.loss_iou: 0.2011 d3.loss_cls: 0.2226 d3.loss_bbox: 0.1123 d3.loss_iou: 0.2004 d4.loss_cls: 0.2223 d4.loss_bbox: 0.1117 d4.loss_iou: 0.2000 enc_loss_cls: 0.2527 enc_loss_bbox: 0.1324 enc_loss_iou: 0.2280 dn_loss_cls: 0.0968 dn_loss_bbox: 0.1437 dn_loss_iou: 0.1810 d0.dn_loss_cls: 0.1668 d0.dn_loss_bbox: 0.2643 d0.dn_loss_iou: 0.3015 d1.dn_loss_cls: 0.1265 d1.dn_loss_bbox: 0.1660 d1.dn_loss_iou: 0.2049 d2.dn_loss_cls: 0.1067 d2.dn_loss_bbox: 0.1484 d2.dn_loss_iou: 0.1880 d3.dn_loss_cls: 0.1004 d3.dn_loss_bbox: 0.1434 d3.dn_loss_iou: 0.1820 d4.dn_loss_cls: 0.0987 d4.dn_loss_bbox: 0.1437 d4.dn_loss_iou: 0.1810 d1.loss_lmm_region: 0.0983 loss_lmm_image: 0.7753 2024/11/13 14:58:12 - mmengine - INFO - Iter(train) [129900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:13:15 time: 1.9660 data_time: 0.0187 memory: 34294 grad_norm: 32.0227 loss: 8.3865 loss_cls: 0.2607 loss_bbox: 0.1253 loss_iou: 0.2128 d0.loss_cls: 0.3083 d0.loss_bbox: 0.1371 d0.loss_iou: 0.2186 d1.loss_cls: 0.2844 d1.loss_bbox: 0.1251 d1.loss_iou: 0.2097 d2.loss_cls: 0.2743 d2.loss_bbox: 0.1270 d2.loss_iou: 0.2117 d3.loss_cls: 0.2699 d3.loss_bbox: 0.1243 d3.loss_iou: 0.2105 d4.loss_cls: 0.2631 d4.loss_bbox: 0.1257 d4.loss_iou: 0.2130 enc_loss_cls: 0.3157 enc_loss_bbox: 0.1482 enc_loss_iou: 0.2312 dn_loss_cls: 0.0943 dn_loss_bbox: 0.1579 dn_loss_iou: 0.1916 d0.dn_loss_cls: 0.1710 d0.dn_loss_bbox: 0.3111 d0.dn_loss_iou: 0.3197 d1.dn_loss_cls: 0.1205 d1.dn_loss_bbox: 0.1927 d1.dn_loss_iou: 0.2171 d2.dn_loss_cls: 0.1046 d2.dn_loss_bbox: 0.1672 d2.dn_loss_iou: 0.1982 d3.dn_loss_cls: 0.0970 d3.dn_loss_bbox: 0.1601 d3.dn_loss_iou: 0.1932 d4.dn_loss_cls: 0.0956 d4.dn_loss_bbox: 0.1579 d4.dn_loss_iou: 0.1916 d1.loss_lmm_region: 0.1013 loss_lmm_image: 0.7475 2024/11/13 15:01:34 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 15:01:34 - mmengine - INFO - Iter(train) [130000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:09:54 time: 1.9955 data_time: 0.0188 memory: 34541 grad_norm: 24.4751 loss: 7.7255 loss_cls: 0.2447 loss_bbox: 0.1055 loss_iou: 0.1923 d0.loss_cls: 0.2631 d0.loss_bbox: 0.1186 d0.loss_iou: 0.2065 d1.loss_cls: 0.2574 d1.loss_bbox: 0.1089 d1.loss_iou: 0.1963 d2.loss_cls: 0.2537 d2.loss_bbox: 0.1067 d2.loss_iou: 0.1935 d3.loss_cls: 0.2500 d3.loss_bbox: 0.1042 d3.loss_iou: 0.1908 d4.loss_cls: 0.2482 d4.loss_bbox: 0.1056 d4.loss_iou: 0.1936 enc_loss_cls: 0.2706 
enc_loss_bbox: 0.1243 enc_loss_iou: 0.2142 dn_loss_cls: 0.0873 dn_loss_bbox: 0.1412 dn_loss_iou: 0.1812 d0.dn_loss_cls: 0.1615 d0.dn_loss_bbox: 0.2690 d0.dn_loss_iou: 0.3050 d1.dn_loss_cls: 0.1093 d1.dn_loss_bbox: 0.1754 d1.dn_loss_iou: 0.2085 d2.dn_loss_cls: 0.0925 d2.dn_loss_bbox: 0.1510 d2.dn_loss_iou: 0.1902 d3.dn_loss_cls: 0.0876 d3.dn_loss_bbox: 0.1428 d3.dn_loss_iou: 0.1832 d4.dn_loss_cls: 0.0864 d4.dn_loss_bbox: 0.1413 d4.dn_loss_iou: 0.1812 d1.loss_lmm_region: 0.1073 loss_lmm_image: 0.7749 2024/11/13 15:04:51 - mmengine - INFO - Iter(train) [130100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:06:32 time: 1.9991 data_time: 0.0187 memory: 35060 grad_norm: 24.3920 loss: 9.1807 loss_cls: 0.2614 loss_bbox: 0.1424 loss_iou: 0.2700 d0.loss_cls: 0.3036 d0.loss_bbox: 0.1472 d0.loss_iou: 0.2782 d1.loss_cls: 0.2840 d1.loss_bbox: 0.1428 d1.loss_iou: 0.2688 d2.loss_cls: 0.2723 d2.loss_bbox: 0.1423 d2.loss_iou: 0.2691 d3.loss_cls: 0.2629 d3.loss_bbox: 0.1435 d3.loss_iou: 0.2710 d4.loss_cls: 0.2591 d4.loss_bbox: 0.1426 d4.loss_iou: 0.2703 enc_loss_cls: 0.3105 enc_loss_bbox: 0.1532 enc_loss_iou: 0.2854 dn_loss_cls: 0.0932 dn_loss_bbox: 0.1643 dn_loss_iou: 0.2385 d0.dn_loss_cls: 0.1726 d0.dn_loss_bbox: 0.3104 d0.dn_loss_iou: 0.3725 d1.dn_loss_cls: 0.1238 d1.dn_loss_bbox: 0.1954 d1.dn_loss_iou: 0.2647 d2.dn_loss_cls: 0.1072 d2.dn_loss_bbox: 0.1751 d2.dn_loss_iou: 0.2471 d3.dn_loss_cls: 0.0997 d3.dn_loss_bbox: 0.1660 d3.dn_loss_iou: 0.2403 d4.dn_loss_cls: 0.0952 d4.dn_loss_bbox: 0.1643 d4.dn_loss_iou: 0.2385 d1.loss_lmm_region: 0.0964 loss_lmm_image: 0.7348 2024/11/13 15:08:11 - mmengine - INFO - Iter(train) [130200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 11:03:11 time: 1.9867 data_time: 0.0187 memory: 33506 grad_norm: 29.7093 loss: 8.2937 loss_cls: 0.2345 loss_bbox: 0.1225 loss_iou: 0.2237 d0.loss_cls: 0.2617 d0.loss_bbox: 0.1354 d0.loss_iou: 0.2382 d1.loss_cls: 0.2424 d1.loss_bbox: 0.1306 d1.loss_iou: 0.2299 d2.loss_cls: 0.2378 d2.loss_bbox: 0.1265 d2.loss_iou: 0.2245 d3.loss_cls: 0.2337 d3.loss_bbox: 0.1255 d3.loss_iou: 0.2276 d4.loss_cls: 0.2342 d4.loss_bbox: 0.1226 d4.loss_iou: 0.2247 enc_loss_cls: 0.2703 enc_loss_bbox: 0.1421 enc_loss_iou: 0.2506 dn_loss_cls: 0.0651 dn_loss_bbox: 0.1615 dn_loss_iou: 0.2270 d0.dn_loss_cls: 0.1526 d0.dn_loss_bbox: 0.3169 d0.dn_loss_iou: 0.3722 d1.dn_loss_cls: 0.0996 d1.dn_loss_bbox: 0.1973 d1.dn_loss_iou: 0.2564 d2.dn_loss_cls: 0.0790 d2.dn_loss_bbox: 0.1730 d2.dn_loss_iou: 0.2357 d3.dn_loss_cls: 0.0710 d3.dn_loss_bbox: 0.1647 d3.dn_loss_iou: 0.2290 d4.dn_loss_cls: 0.0668 d4.dn_loss_bbox: 0.1616 d4.dn_loss_iou: 0.2270 d1.loss_lmm_region: 0.0929 loss_lmm_image: 0.7057 2024/11/13 15:11:32 - mmengine - INFO - Iter(train) [130300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:59:50 time: 2.0106 data_time: 0.0186 memory: 34019 grad_norm: 30.6527 loss: 9.2465 loss_cls: 0.2752 loss_bbox: 0.1489 loss_iou: 0.2547 d0.loss_cls: 0.3290 d0.loss_bbox: 0.1571 d0.loss_iou: 0.2668 d1.loss_cls: 0.3033 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2571 d2.loss_cls: 0.2932 d2.loss_bbox: 0.1441 d2.loss_iou: 0.2538 d3.loss_cls: 0.2768 d3.loss_bbox: 0.1511 d3.loss_iou: 0.2556 d4.loss_cls: 0.2745 d4.loss_bbox: 0.1489 d4.loss_iou: 0.2549 enc_loss_cls: 0.3340 enc_loss_bbox: 0.1717 enc_loss_iou: 0.2829 dn_loss_cls: 0.0810 dn_loss_bbox: 0.1688 dn_loss_iou: 0.2271 d0.dn_loss_cls: 0.1667 d0.dn_loss_bbox: 0.3389 d0.dn_loss_iou: 0.3713 d1.dn_loss_cls: 0.1118 d1.dn_loss_bbox: 0.2061 d1.dn_loss_iou: 0.2585 d2.dn_loss_cls: 0.0933 d2.dn_loss_bbox: 0.1815 d2.dn_loss_iou: 0.2383 
d3.dn_loss_cls: 0.0861 d3.dn_loss_bbox: 0.1712 d3.dn_loss_iou: 0.2297 d4.dn_loss_cls: 0.0827 d4.dn_loss_bbox: 0.1689 d4.dn_loss_iou: 0.2272 d1.loss_lmm_region: 0.1036 loss_lmm_image: 0.7511 2024/11/13 15:14:52 - mmengine - INFO - Iter(train) [130400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:56:29 time: 2.0095 data_time: 0.0187 memory: 33214 grad_norm: 21.5973 loss: 8.4312 loss_cls: 0.2453 loss_bbox: 0.1386 loss_iou: 0.2204 d0.loss_cls: 0.2824 d0.loss_bbox: 0.1478 d0.loss_iou: 0.2299 d1.loss_cls: 0.2641 d1.loss_bbox: 0.1419 d1.loss_iou: 0.2224 d2.loss_cls: 0.2595 d2.loss_bbox: 0.1344 d2.loss_iou: 0.2176 d3.loss_cls: 0.2503 d3.loss_bbox: 0.1367 d3.loss_iou: 0.2179 d4.loss_cls: 0.2464 d4.loss_bbox: 0.1387 d4.loss_iou: 0.2203 enc_loss_cls: 0.2887 enc_loss_bbox: 0.1548 enc_loss_iou: 0.2415 dn_loss_cls: 0.0698 dn_loss_bbox: 0.1678 dn_loss_iou: 0.2084 d0.dn_loss_cls: 0.1482 d0.dn_loss_bbox: 0.3089 d0.dn_loss_iou: 0.3382 d1.dn_loss_cls: 0.0979 d1.dn_loss_bbox: 0.2006 d1.dn_loss_iou: 0.2370 d2.dn_loss_cls: 0.0799 d2.dn_loss_bbox: 0.1802 d2.dn_loss_iou: 0.2178 d3.dn_loss_cls: 0.0754 d3.dn_loss_bbox: 0.1706 d3.dn_loss_iou: 0.2113 d4.dn_loss_cls: 0.0714 d4.dn_loss_bbox: 0.1679 d4.dn_loss_iou: 0.2085 d1.loss_lmm_region: 0.1083 loss_lmm_image: 0.7637 2024/11/13 15:18:09 - mmengine - INFO - Iter(train) [130500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:53:08 time: 1.9824 data_time: 0.0186 memory: 34162 grad_norm: 26.2606 loss: 8.9440 loss_cls: 0.3138 loss_bbox: 0.1193 loss_iou: 0.2317 d0.loss_cls: 0.3527 d0.loss_bbox: 0.1345 d0.loss_iou: 0.2509 d1.loss_cls: 0.3371 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2368 d2.loss_cls: 0.3282 d2.loss_bbox: 0.1216 d2.loss_iou: 0.2321 d3.loss_cls: 0.3186 d3.loss_bbox: 0.1207 d3.loss_iou: 0.2318 d4.loss_cls: 0.3145 d4.loss_bbox: 0.1225 d4.loss_iou: 0.2327 enc_loss_cls: 0.3559 enc_loss_bbox: 0.1463 enc_loss_iou: 0.2676 dn_loss_cls: 0.1126 dn_loss_bbox: 0.1403 dn_loss_iou: 0.2046 d0.dn_loss_cls: 0.1838 d0.dn_loss_bbox: 0.2939 d0.dn_loss_iou: 0.3477 d1.dn_loss_cls: 0.1373 d1.dn_loss_bbox: 0.1706 d1.dn_loss_iou: 0.2319 d2.dn_loss_cls: 0.1200 d2.dn_loss_bbox: 0.1471 d2.dn_loss_iou: 0.2114 d3.dn_loss_cls: 0.1171 d3.dn_loss_bbox: 0.1416 d3.dn_loss_iou: 0.2060 d4.dn_loss_cls: 0.1137 d4.dn_loss_bbox: 0.1402 d4.dn_loss_iou: 0.2046 d1.loss_lmm_region: 0.0976 loss_lmm_image: 0.7261 2024/11/13 15:21:29 - mmengine - INFO - Iter(train) [130600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:49:47 time: 1.9863 data_time: 0.0187 memory: 34355 grad_norm: 24.1185 loss: 9.1484 loss_cls: 0.3127 loss_bbox: 0.1397 loss_iou: 0.2489 d0.loss_cls: 0.3521 d0.loss_bbox: 0.1547 d0.loss_iou: 0.2669 d1.loss_cls: 0.3400 d1.loss_bbox: 0.1392 d1.loss_iou: 0.2523 d2.loss_cls: 0.3270 d2.loss_bbox: 0.1371 d2.loss_iou: 0.2496 d3.loss_cls: 0.3169 d3.loss_bbox: 0.1393 d3.loss_iou: 0.2488 d4.loss_cls: 0.3133 d4.loss_bbox: 0.1392 d4.loss_iou: 0.2485 enc_loss_cls: 0.3574 enc_loss_bbox: 0.1625 enc_loss_iou: 0.2801 dn_loss_cls: 0.0905 dn_loss_bbox: 0.1606 dn_loss_iou: 0.2022 d0.dn_loss_cls: 0.1736 d0.dn_loss_bbox: 0.3052 d0.dn_loss_iou: 0.3335 d1.dn_loss_cls: 0.1203 d1.dn_loss_bbox: 0.1913 d1.dn_loss_iou: 0.2279 d2.dn_loss_cls: 0.1013 d2.dn_loss_bbox: 0.1685 d2.dn_loss_iou: 0.2098 d3.dn_loss_cls: 0.0947 d3.dn_loss_bbox: 0.1623 d3.dn_loss_iou: 0.2042 d4.dn_loss_cls: 0.0919 d4.dn_loss_bbox: 0.1606 d4.dn_loss_iou: 0.2022 d1.loss_lmm_region: 0.1086 loss_lmm_image: 0.7131 2024/11/13 15:24:50 - mmengine - INFO - Iter(train) [130700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:46:25 time: 1.9987 
data_time: 0.0189 memory: 35586 grad_norm: 23.8671 loss: 9.7840 loss_cls: 0.3088 loss_bbox: 0.1576 loss_iou: 0.2536 d0.loss_cls: 0.3456 d0.loss_bbox: 0.1747 d0.loss_iou: 0.2671 d1.loss_cls: 0.3225 d1.loss_bbox: 0.1649 d1.loss_iou: 0.2598 d2.loss_cls: 0.3145 d2.loss_bbox: 0.1640 d2.loss_iou: 0.2562 d3.loss_cls: 0.3101 d3.loss_bbox: 0.1589 d3.loss_iou: 0.2542 d4.loss_cls: 0.3070 d4.loss_bbox: 0.1597 d4.loss_iou: 0.2553 enc_loss_cls: 0.3599 enc_loss_bbox: 0.1825 enc_loss_iou: 0.2791 dn_loss_cls: 0.1394 dn_loss_bbox: 0.1810 dn_loss_iou: 0.2184 d0.dn_loss_cls: 0.2106 d0.dn_loss_bbox: 0.3119 d0.dn_loss_iou: 0.3478 d1.dn_loss_cls: 0.1766 d1.dn_loss_bbox: 0.2058 d1.dn_loss_iou: 0.2437 d2.dn_loss_cls: 0.1632 d2.dn_loss_bbox: 0.1886 d2.dn_loss_iou: 0.2261 d3.dn_loss_cls: 0.1501 d3.dn_loss_bbox: 0.1822 d3.dn_loss_iou: 0.2203 d4.dn_loss_cls: 0.1434 d4.dn_loss_bbox: 0.1810 d4.dn_loss_iou: 0.2184 d1.loss_lmm_region: 0.1099 loss_lmm_image: 0.7094 2024/11/13 15:28:09 - mmengine - INFO - Iter(train) [130800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:43:04 time: 1.9984 data_time: 0.0188 memory: 34007 grad_norm: 33.7671 loss: 8.6456 loss_cls: 0.2567 loss_bbox: 0.1482 loss_iou: 0.2481 d0.loss_cls: 0.2982 d0.loss_bbox: 0.1541 d0.loss_iou: 0.2549 d1.loss_cls: 0.2754 d1.loss_bbox: 0.1470 d1.loss_iou: 0.2484 d2.loss_cls: 0.2660 d2.loss_bbox: 0.1481 d2.loss_iou: 0.2478 d3.loss_cls: 0.2601 d3.loss_bbox: 0.1474 d3.loss_iou: 0.2478 d4.loss_cls: 0.2582 d4.loss_bbox: 0.1459 d4.loss_iou: 0.2477 enc_loss_cls: 0.3050 enc_loss_bbox: 0.1618 enc_loss_iou: 0.2676 dn_loss_cls: 0.0865 dn_loss_bbox: 0.1495 dn_loss_iou: 0.1983 d0.dn_loss_cls: 0.1622 d0.dn_loss_bbox: 0.2784 d0.dn_loss_iou: 0.3260 d1.dn_loss_cls: 0.1134 d1.dn_loss_bbox: 0.1769 d1.dn_loss_iou: 0.2240 d2.dn_loss_cls: 0.0966 d2.dn_loss_bbox: 0.1574 d2.dn_loss_iou: 0.2052 d3.dn_loss_cls: 0.0908 d3.dn_loss_bbox: 0.1512 d3.dn_loss_iou: 0.2001 d4.dn_loss_cls: 0.0862 d4.dn_loss_bbox: 0.1495 d4.dn_loss_iou: 0.1984 d1.loss_lmm_region: 0.0996 loss_lmm_image: 0.7607 2024/11/13 15:31:29 - mmengine - INFO - Iter(train) [130900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:39:43 time: 1.9986 data_time: 0.0187 memory: 34835 grad_norm: 26.1891 loss: 8.7330 loss_cls: 0.2573 loss_bbox: 0.1423 loss_iou: 0.2496 d0.loss_cls: 0.2969 d0.loss_bbox: 0.1480 d0.loss_iou: 0.2588 d1.loss_cls: 0.2736 d1.loss_bbox: 0.1430 d1.loss_iou: 0.2503 d2.loss_cls: 0.2695 d2.loss_bbox: 0.1423 d2.loss_iou: 0.2481 d3.loss_cls: 0.2592 d3.loss_bbox: 0.1418 d3.loss_iou: 0.2494 d4.loss_cls: 0.2580 d4.loss_bbox: 0.1432 d4.loss_iou: 0.2496 enc_loss_cls: 0.3061 enc_loss_bbox: 0.1585 enc_loss_iou: 0.2732 dn_loss_cls: 0.0949 dn_loss_bbox: 0.1464 dn_loss_iou: 0.2062 d0.dn_loss_cls: 0.1772 d0.dn_loss_bbox: 0.2698 d0.dn_loss_iou: 0.3319 d1.dn_loss_cls: 0.1291 d1.dn_loss_bbox: 0.1703 d1.dn_loss_iou: 0.2318 d2.dn_loss_cls: 0.1097 d2.dn_loss_bbox: 0.1549 d2.dn_loss_iou: 0.2152 d3.dn_loss_cls: 0.1010 d3.dn_loss_bbox: 0.1478 d3.dn_loss_iou: 0.2084 d4.dn_loss_cls: 0.0970 d4.dn_loss_bbox: 0.1464 d4.dn_loss_iou: 0.2061 d1.loss_lmm_region: 0.1087 loss_lmm_image: 0.7611 2024/11/13 15:34:50 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 15:34:50 - mmengine - INFO - Iter(train) [131000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:36:22 time: 2.0246 data_time: 0.0186 memory: 31770 grad_norm: 29.5249 loss: 8.4408 loss_cls: 0.2458 loss_bbox: 0.1270 loss_iou: 0.2314 d0.loss_cls: 0.2797 d0.loss_bbox: 0.1361 d0.loss_iou: 0.2458 d1.loss_cls: 0.2651 d1.loss_bbox: 0.1296 d1.loss_iou: 0.2396 
d2.loss_cls: 0.2548 d2.loss_bbox: 0.1275 d2.loss_iou: 0.2343 d3.loss_cls: 0.2523 d3.loss_bbox: 0.1258 d3.loss_iou: 0.2319 d4.loss_cls: 0.2488 d4.loss_bbox: 0.1257 d4.loss_iou: 0.2304 enc_loss_cls: 0.2901 enc_loss_bbox: 0.1466 enc_loss_iou: 0.2598 dn_loss_cls: 0.0875 dn_loss_bbox: 0.1522 dn_loss_iou: 0.2074 d0.dn_loss_cls: 0.1659 d0.dn_loss_bbox: 0.2973 d0.dn_loss_iou: 0.3376 d1.dn_loss_cls: 0.1154 d1.dn_loss_bbox: 0.1860 d1.dn_loss_iou: 0.2336 d2.dn_loss_cls: 0.0977 d2.dn_loss_bbox: 0.1632 d2.dn_loss_iou: 0.2159 d3.dn_loss_cls: 0.0919 d3.dn_loss_bbox: 0.1542 d3.dn_loss_iou: 0.2093 d4.dn_loss_cls: 0.0889 d4.dn_loss_bbox: 0.1523 d4.dn_loss_iou: 0.2074 d1.loss_lmm_region: 0.1124 loss_lmm_image: 0.7366 2024/11/13 15:38:08 - mmengine - INFO - Iter(train) [131100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:33:01 time: 1.9543 data_time: 0.0187 memory: 35538 grad_norm: 28.0556 loss: 9.0879 loss_cls: 0.2543 loss_bbox: 0.1630 loss_iou: 0.2704 d0.loss_cls: 0.2852 d0.loss_bbox: 0.1785 d0.loss_iou: 0.2778 d1.loss_cls: 0.2678 d1.loss_bbox: 0.1648 d1.loss_iou: 0.2731 d2.loss_cls: 0.2603 d2.loss_bbox: 0.1654 d2.loss_iou: 0.2719 d3.loss_cls: 0.2571 d3.loss_bbox: 0.1652 d3.loss_iou: 0.2717 d4.loss_cls: 0.2544 d4.loss_bbox: 0.1630 d4.loss_iou: 0.2708 enc_loss_cls: 0.3009 enc_loss_bbox: 0.1831 enc_loss_iou: 0.2886 dn_loss_cls: 0.0620 dn_loss_bbox: 0.1767 dn_loss_iou: 0.2292 d0.dn_loss_cls: 0.1476 d0.dn_loss_bbox: 0.3264 d0.dn_loss_iou: 0.3639 d1.dn_loss_cls: 0.0924 d1.dn_loss_bbox: 0.2076 d1.dn_loss_iou: 0.2590 d2.dn_loss_cls: 0.0734 d2.dn_loss_bbox: 0.1880 d2.dn_loss_iou: 0.2395 d3.dn_loss_cls: 0.0660 d3.dn_loss_bbox: 0.1775 d3.dn_loss_iou: 0.2310 d4.dn_loss_cls: 0.0630 d4.dn_loss_bbox: 0.1765 d4.dn_loss_iou: 0.2292 d1.loss_lmm_region: 0.0978 loss_lmm_image: 0.6940 2024/11/13 15:41:26 - mmengine - INFO - Iter(train) [131200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:29:39 time: 1.9658 data_time: 0.0190 memory: 33494 grad_norm: 33.5638 loss: 9.6260 loss_cls: 0.3094 loss_bbox: 0.1605 loss_iou: 0.2685 d0.loss_cls: 0.3561 d0.loss_bbox: 0.1782 d0.loss_iou: 0.2915 d1.loss_cls: 0.3256 d1.loss_bbox: 0.1693 d1.loss_iou: 0.2792 d2.loss_cls: 0.3169 d2.loss_bbox: 0.1650 d2.loss_iou: 0.2756 d3.loss_cls: 0.3097 d3.loss_bbox: 0.1648 d3.loss_iou: 0.2727 d4.loss_cls: 0.3131 d4.loss_bbox: 0.1591 d4.loss_iou: 0.2677 enc_loss_cls: 0.3549 enc_loss_bbox: 0.1948 enc_loss_iou: 0.3124 dn_loss_cls: 0.0894 dn_loss_bbox: 0.1698 dn_loss_iou: 0.2103 d0.dn_loss_cls: 0.1753 d0.dn_loss_bbox: 0.3157 d0.dn_loss_iou: 0.3445 d1.dn_loss_cls: 0.1229 d1.dn_loss_bbox: 0.1998 d1.dn_loss_iou: 0.2381 d2.dn_loss_cls: 0.1021 d2.dn_loss_bbox: 0.1781 d2.dn_loss_iou: 0.2189 d3.dn_loss_cls: 0.0928 d3.dn_loss_bbox: 0.1719 d3.dn_loss_iou: 0.2128 d4.dn_loss_cls: 0.0884 d4.dn_loss_bbox: 0.1699 d4.dn_loss_iou: 0.2103 d1.loss_lmm_region: 0.1025 loss_lmm_image: 0.7678 2024/11/13 15:44:46 - mmengine - INFO - Iter(train) [131300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:26:18 time: 1.9822 data_time: 0.0187 memory: 34367 grad_norm: 22.3251 loss: 8.3294 loss_cls: 0.2402 loss_bbox: 0.1243 loss_iou: 0.2125 d0.loss_cls: 0.2657 d0.loss_bbox: 0.1388 d0.loss_iou: 0.2272 d1.loss_cls: 0.2558 d1.loss_bbox: 0.1277 d1.loss_iou: 0.2184 d2.loss_cls: 0.2498 d2.loss_bbox: 0.1217 d2.loss_iou: 0.2118 d3.loss_cls: 0.2442 d3.loss_bbox: 0.1243 d3.loss_iou: 0.2126 d4.loss_cls: 0.2406 d4.loss_bbox: 0.1247 d4.loss_iou: 0.2135 enc_loss_cls: 0.2724 enc_loss_bbox: 0.1495 enc_loss_iou: 0.2379 dn_loss_cls: 0.0952 dn_loss_bbox: 0.1693 dn_loss_iou: 0.2067 d0.dn_loss_cls: 
0.1688 d0.dn_loss_bbox: 0.3171 d0.dn_loss_iou: 0.3386 d1.dn_loss_cls: 0.1218 d1.dn_loss_bbox: 0.2095 d1.dn_loss_iou: 0.2366 d2.dn_loss_cls: 0.1071 d2.dn_loss_bbox: 0.1840 d2.dn_loss_iou: 0.2170 d3.dn_loss_cls: 0.1015 d3.dn_loss_bbox: 0.1719 d3.dn_loss_iou: 0.2090 d4.dn_loss_cls: 0.0976 d4.dn_loss_bbox: 0.1695 d4.dn_loss_iou: 0.2068 d1.loss_lmm_region: 0.1131 loss_lmm_image: 0.6747 2024/11/13 15:48:06 - mmengine - INFO - Iter(train) [131400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:22:57 time: 2.0155 data_time: 0.0186 memory: 35224 grad_norm: 26.5609 loss: 9.4122 loss_cls: 0.3141 loss_bbox: 0.1407 loss_iou: 0.2579 d0.loss_cls: 0.3568 d0.loss_bbox: 0.1479 d0.loss_iou: 0.2673 d1.loss_cls: 0.3458 d1.loss_bbox: 0.1381 d1.loss_iou: 0.2575 d2.loss_cls: 0.3262 d2.loss_bbox: 0.1405 d2.loss_iou: 0.2604 d3.loss_cls: 0.3189 d3.loss_bbox: 0.1432 d3.loss_iou: 0.2585 d4.loss_cls: 0.3155 d4.loss_bbox: 0.1404 d4.loss_iou: 0.2576 enc_loss_cls: 0.3595 enc_loss_bbox: 0.1568 enc_loss_iou: 0.2888 dn_loss_cls: 0.1220 dn_loss_bbox: 0.1483 dn_loss_iou: 0.2095 d0.dn_loss_cls: 0.2016 d0.dn_loss_bbox: 0.2941 d0.dn_loss_iou: 0.3519 d1.dn_loss_cls: 0.1458 d1.dn_loss_bbox: 0.1790 d1.dn_loss_iou: 0.2415 d2.dn_loss_cls: 0.1258 d2.dn_loss_bbox: 0.1570 d2.dn_loss_iou: 0.2190 d3.dn_loss_cls: 0.1265 d3.dn_loss_bbox: 0.1501 d3.dn_loss_iou: 0.2122 d4.dn_loss_cls: 0.1240 d4.dn_loss_bbox: 0.1484 d4.dn_loss_iou: 0.2096 d1.loss_lmm_region: 0.1071 loss_lmm_image: 0.7463 2024/11/13 15:51:25 - mmengine - INFO - Iter(train) [131500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:19:36 time: 1.9856 data_time: 0.0188 memory: 34191 grad_norm: 31.1273 loss: 10.5541 loss_cls: 0.3434 loss_bbox: 0.1812 loss_iou: 0.3235 d0.loss_cls: 0.3902 d0.loss_bbox: 0.1888 d0.loss_iou: 0.3448 d1.loss_cls: 0.3736 d1.loss_bbox: 0.1807 d1.loss_iou: 0.3286 d2.loss_cls: 0.3570 d2.loss_bbox: 0.1805 d2.loss_iou: 0.3316 d3.loss_cls: 0.3442 d3.loss_bbox: 0.1829 d3.loss_iou: 0.3280 d4.loss_cls: 0.3445 d4.loss_bbox: 0.1819 d4.loss_iou: 0.3237 enc_loss_cls: 0.3990 enc_loss_bbox: 0.2024 enc_loss_iou: 0.3574 dn_loss_cls: 0.0931 dn_loss_bbox: 0.1701 dn_loss_iou: 0.2314 d0.dn_loss_cls: 0.1795 d0.dn_loss_bbox: 0.3260 d0.dn_loss_iou: 0.3766 d1.dn_loss_cls: 0.1268 d1.dn_loss_bbox: 0.2005 d1.dn_loss_iou: 0.2612 d2.dn_loss_cls: 0.1074 d2.dn_loss_bbox: 0.1816 d2.dn_loss_iou: 0.2412 d3.dn_loss_cls: 0.0982 d3.dn_loss_bbox: 0.1724 d3.dn_loss_iou: 0.2337 d4.dn_loss_cls: 0.0937 d4.dn_loss_bbox: 0.1701 d4.dn_loss_iou: 0.2314 d1.loss_lmm_region: 0.1206 loss_lmm_image: 0.7507 2024/11/13 15:54:44 - mmengine - INFO - Iter(train) [131600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:16:15 time: 1.9943 data_time: 0.0186 memory: 33124 grad_norm: 28.4202 loss: 8.2164 loss_cls: 0.2408 loss_bbox: 0.1314 loss_iou: 0.2146 d0.loss_cls: 0.2729 d0.loss_bbox: 0.1391 d0.loss_iou: 0.2231 d1.loss_cls: 0.2552 d1.loss_bbox: 0.1313 d1.loss_iou: 0.2167 d2.loss_cls: 0.2503 d2.loss_bbox: 0.1285 d2.loss_iou: 0.2147 d3.loss_cls: 0.2448 d3.loss_bbox: 0.1302 d3.loss_iou: 0.2155 d4.loss_cls: 0.2437 d4.loss_bbox: 0.1302 d4.loss_iou: 0.2142 enc_loss_cls: 0.2782 enc_loss_bbox: 0.1492 enc_loss_iou: 0.2385 dn_loss_cls: 0.0736 dn_loss_bbox: 0.1573 dn_loss_iou: 0.2005 d0.dn_loss_cls: 0.1471 d0.dn_loss_bbox: 0.3051 d0.dn_loss_iou: 0.3267 d1.dn_loss_cls: 0.1051 d1.dn_loss_bbox: 0.1934 d1.dn_loss_iou: 0.2284 d2.dn_loss_cls: 0.0860 d2.dn_loss_bbox: 0.1687 d2.dn_loss_iou: 0.2097 d3.dn_loss_cls: 0.0782 d3.dn_loss_bbox: 0.1602 d3.dn_loss_iou: 0.2030 d4.dn_loss_cls: 0.0746 d4.dn_loss_bbox: 0.1573 
d4.dn_loss_iou: 0.2006 d1.loss_lmm_region: 0.0986 loss_lmm_image: 0.7790 2024/11/13 15:58:05 - mmengine - INFO - Iter(train) [131700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:12:54 time: 1.9933 data_time: 0.0187 memory: 34578 grad_norm: 26.6339 loss: 8.5196 loss_cls: 0.2528 loss_bbox: 0.1393 loss_iou: 0.2424 d0.loss_cls: 0.2840 d0.loss_bbox: 0.1527 d0.loss_iou: 0.2519 d1.loss_cls: 0.2685 d1.loss_bbox: 0.1416 d1.loss_iou: 0.2455 d2.loss_cls: 0.2583 d2.loss_bbox: 0.1390 d2.loss_iou: 0.2415 d3.loss_cls: 0.2580 d3.loss_bbox: 0.1381 d3.loss_iou: 0.2421 d4.loss_cls: 0.2559 d4.loss_bbox: 0.1380 d4.loss_iou: 0.2414 enc_loss_cls: 0.2974 enc_loss_bbox: 0.1637 enc_loss_iou: 0.2682 dn_loss_cls: 0.0676 dn_loss_bbox: 0.1614 dn_loss_iou: 0.2153 d0.dn_loss_cls: 0.1432 d0.dn_loss_bbox: 0.2965 d0.dn_loss_iou: 0.3444 d1.dn_loss_cls: 0.0961 d1.dn_loss_bbox: 0.1930 d1.dn_loss_iou: 0.2420 d2.dn_loss_cls: 0.0766 d2.dn_loss_bbox: 0.1725 d2.dn_loss_iou: 0.2240 d3.dn_loss_cls: 0.0709 d3.dn_loss_bbox: 0.1628 d3.dn_loss_iou: 0.2170 d4.dn_loss_cls: 0.0676 d4.dn_loss_bbox: 0.1614 d4.dn_loss_iou: 0.2154 d1.loss_lmm_region: 0.0961 loss_lmm_image: 0.6755 2024/11/13 16:01:22 - mmengine - INFO - Iter(train) [131800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:09:32 time: 1.9717 data_time: 0.0186 memory: 35337 grad_norm: 29.5004 loss: 9.9274 loss_cls: 0.2875 loss_bbox: 0.1844 loss_iou: 0.2832 d0.loss_cls: 0.3358 d0.loss_bbox: 0.1964 d0.loss_iou: 0.3001 d1.loss_cls: 0.3121 d1.loss_bbox: 0.1818 d1.loss_iou: 0.2844 d2.loss_cls: 0.3004 d2.loss_bbox: 0.1777 d2.loss_iou: 0.2801 d3.loss_cls: 0.2947 d3.loss_bbox: 0.1786 d3.loss_iou: 0.2810 d4.loss_cls: 0.2888 d4.loss_bbox: 0.1811 d4.loss_iou: 0.2827 enc_loss_cls: 0.3509 enc_loss_bbox: 0.1994 enc_loss_iou: 0.3091 dn_loss_cls: 0.1027 dn_loss_bbox: 0.1882 dn_loss_iou: 0.2231 d0.dn_loss_cls: 0.1903 d0.dn_loss_bbox: 0.3367 d0.dn_loss_iou: 0.3572 d1.dn_loss_cls: 0.1358 d1.dn_loss_bbox: 0.2153 d1.dn_loss_iou: 0.2485 d2.dn_loss_cls: 0.1156 d2.dn_loss_bbox: 0.1974 d2.dn_loss_iou: 0.2313 d3.dn_loss_cls: 0.1063 d3.dn_loss_bbox: 0.1890 d3.dn_loss_iou: 0.2250 d4.dn_loss_cls: 0.1046 d4.dn_loss_bbox: 0.1883 d4.dn_loss_iou: 0.2230 d1.loss_lmm_region: 0.1211 loss_lmm_image: 0.7381 2024/11/13 16:04:42 - mmengine - INFO - Iter(train) [131900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:06:11 time: 1.9989 data_time: 0.0185 memory: 34177 grad_norm: 24.5701 loss: 7.9231 loss_cls: 0.2859 loss_bbox: 0.1098 loss_iou: 0.2039 d0.loss_cls: 0.3205 d0.loss_bbox: 0.1133 d0.loss_iou: 0.2097 d1.loss_cls: 0.2995 d1.loss_bbox: 0.1063 d1.loss_iou: 0.2021 d2.loss_cls: 0.2957 d2.loss_bbox: 0.1039 d2.loss_iou: 0.1996 d3.loss_cls: 0.2904 d3.loss_bbox: 0.1080 d3.loss_iou: 0.2026 d4.loss_cls: 0.2876 d4.loss_bbox: 0.1079 d4.loss_iou: 0.2018 enc_loss_cls: 0.3196 enc_loss_bbox: 0.1318 enc_loss_iou: 0.2279 dn_loss_cls: 0.0731 dn_loss_bbox: 0.1364 dn_loss_iou: 0.1752 d0.dn_loss_cls: 0.1429 d0.dn_loss_bbox: 0.2567 d0.dn_loss_iou: 0.2899 d1.dn_loss_cls: 0.1009 d1.dn_loss_bbox: 0.1650 d1.dn_loss_iou: 0.1997 d2.dn_loss_cls: 0.0850 d2.dn_loss_bbox: 0.1490 d2.dn_loss_iou: 0.1844 d3.dn_loss_cls: 0.0785 d3.dn_loss_bbox: 0.1389 d3.dn_loss_iou: 0.1774 d4.dn_loss_cls: 0.0715 d4.dn_loss_bbox: 0.1363 d4.dn_loss_iou: 0.1751 d1.loss_lmm_region: 0.0991 loss_lmm_image: 0.7603 2024/11/13 16:08:02 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 16:08:02 - mmengine - INFO - Iter(train) [132000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 10:02:50 time: 1.9990 data_time: 0.0186 memory: 34653 
grad_norm: 24.3498 loss: 7.9667 loss_cls: 0.2458 loss_bbox: 0.1158 loss_iou: 0.2059 d0.loss_cls: 0.2884 d0.loss_bbox: 0.1189 d0.loss_iou: 0.2172 d1.loss_cls: 0.2693 d1.loss_bbox: 0.1145 d1.loss_iou: 0.2076 d2.loss_cls: 0.2584 d2.loss_bbox: 0.1170 d2.loss_iou: 0.2083 d3.loss_cls: 0.2524 d3.loss_bbox: 0.1161 d3.loss_iou: 0.2067 d4.loss_cls: 0.2471 d4.loss_bbox: 0.1156 d4.loss_iou: 0.2050 enc_loss_cls: 0.2970 enc_loss_bbox: 0.1283 enc_loss_iou: 0.2328 dn_loss_cls: 0.0755 dn_loss_bbox: 0.1413 dn_loss_iou: 0.1898 d0.dn_loss_cls: 0.1596 d0.dn_loss_bbox: 0.2862 d0.dn_loss_iou: 0.3274 d1.dn_loss_cls: 0.1045 d1.dn_loss_bbox: 0.1768 d1.dn_loss_iou: 0.2206 d2.dn_loss_cls: 0.0872 d2.dn_loss_bbox: 0.1507 d2.dn_loss_iou: 0.1990 d3.dn_loss_cls: 0.0781 d3.dn_loss_bbox: 0.1434 d3.dn_loss_iou: 0.1923 d4.dn_loss_cls: 0.0764 d4.dn_loss_bbox: 0.1413 d4.dn_loss_iou: 0.1898 d1.loss_lmm_region: 0.1010 loss_lmm_image: 0.7578 2024/11/13 16:11:22 - mmengine - INFO - Iter(train) [132100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:59:29 time: 2.0209 data_time: 0.0186 memory: 33440 grad_norm: 28.2146 loss: 7.9174 loss_cls: 0.2151 loss_bbox: 0.1348 loss_iou: 0.2115 d0.loss_cls: 0.2516 d0.loss_bbox: 0.1369 d0.loss_iou: 0.2154 d1.loss_cls: 0.2344 d1.loss_bbox: 0.1325 d1.loss_iou: 0.2133 d2.loss_cls: 0.2284 d2.loss_bbox: 0.1315 d2.loss_iou: 0.2103 d3.loss_cls: 0.2170 d3.loss_bbox: 0.1372 d3.loss_iou: 0.2136 d4.loss_cls: 0.2180 d4.loss_bbox: 0.1353 d4.loss_iou: 0.2113 enc_loss_cls: 0.2558 enc_loss_bbox: 0.1474 enc_loss_iou: 0.2320 dn_loss_cls: 0.0879 dn_loss_bbox: 0.1591 dn_loss_iou: 0.1875 d0.dn_loss_cls: 0.1515 d0.dn_loss_bbox: 0.2779 d0.dn_loss_iou: 0.3027 d1.dn_loss_cls: 0.1114 d1.dn_loss_bbox: 0.1855 d1.dn_loss_iou: 0.2116 d2.dn_loss_cls: 0.0968 d2.dn_loss_bbox: 0.1682 d2.dn_loss_iou: 0.1955 d3.dn_loss_cls: 0.0913 d3.dn_loss_bbox: 0.1608 d3.dn_loss_iou: 0.1897 d4.dn_loss_cls: 0.0884 d4.dn_loss_bbox: 0.1591 d4.dn_loss_iou: 0.1875 d1.loss_lmm_region: 0.0804 loss_lmm_image: 0.7411 2024/11/13 16:14:44 - mmengine - INFO - Iter(train) [132200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:56:08 time: 2.0411 data_time: 0.0187 memory: 33435 grad_norm: 23.1108 loss: 8.2853 loss_cls: 0.2627 loss_bbox: 0.1308 loss_iou: 0.2352 d0.loss_cls: 0.3065 d0.loss_bbox: 0.1414 d0.loss_iou: 0.2495 d1.loss_cls: 0.2847 d1.loss_bbox: 0.1325 d1.loss_iou: 0.2374 d2.loss_cls: 0.2759 d2.loss_bbox: 0.1309 d2.loss_iou: 0.2332 d3.loss_cls: 0.2704 d3.loss_bbox: 0.1300 d3.loss_iou: 0.2348 d4.loss_cls: 0.2651 d4.loss_bbox: 0.1301 d4.loss_iou: 0.2353 enc_loss_cls: 0.3114 enc_loss_bbox: 0.1510 enc_loss_iou: 0.2649 dn_loss_cls: 0.0892 dn_loss_bbox: 0.1295 dn_loss_iou: 0.1867 d0.dn_loss_cls: 0.1581 d0.dn_loss_bbox: 0.2585 d0.dn_loss_iou: 0.3084 d1.dn_loss_cls: 0.1142 d1.dn_loss_bbox: 0.1563 d1.dn_loss_iou: 0.2111 d2.dn_loss_cls: 0.0990 d2.dn_loss_bbox: 0.1356 d2.dn_loss_iou: 0.1941 d3.dn_loss_cls: 0.0929 d3.dn_loss_bbox: 0.1303 d3.dn_loss_iou: 0.1885 d4.dn_loss_cls: 0.0904 d4.dn_loss_bbox: 0.1294 d4.dn_loss_iou: 0.1867 d1.loss_lmm_region: 0.0901 loss_lmm_image: 0.7226 2024/11/13 16:18:04 - mmengine - INFO - Iter(train) [132300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:52:47 time: 2.0100 data_time: 0.0187 memory: 34480 grad_norm: 27.6187 loss: 8.7333 loss_cls: 0.2807 loss_bbox: 0.1493 loss_iou: 0.2431 d0.loss_cls: 0.3241 d0.loss_bbox: 0.1561 d0.loss_iou: 0.2518 d1.loss_cls: 0.2914 d1.loss_bbox: 0.1587 d1.loss_iou: 0.2511 d2.loss_cls: 0.2881 d2.loss_bbox: 0.1512 d2.loss_iou: 0.2447 d3.loss_cls: 0.2874 d3.loss_bbox: 0.1479 d3.loss_iou: 0.2440 
d4.loss_cls: 0.2773 d4.loss_bbox: 0.1519 d4.loss_iou: 0.2454 enc_loss_cls: 0.3249 enc_loss_bbox: 0.1622 enc_loss_iou: 0.2665 dn_loss_cls: 0.0697 dn_loss_bbox: 0.1583 dn_loss_iou: 0.2023 d0.dn_loss_cls: 0.1429 d0.dn_loss_bbox: 0.2942 d0.dn_loss_iou: 0.3297 d1.dn_loss_cls: 0.0963 d1.dn_loss_bbox: 0.1830 d1.dn_loss_iou: 0.2279 d2.dn_loss_cls: 0.0818 d2.dn_loss_bbox: 0.1679 d2.dn_loss_iou: 0.2116 d3.dn_loss_cls: 0.0745 d3.dn_loss_bbox: 0.1591 d3.dn_loss_iou: 0.2041 d4.dn_loss_cls: 0.0700 d4.dn_loss_bbox: 0.1582 d4.dn_loss_iou: 0.2023 d1.loss_lmm_region: 0.0805 loss_lmm_image: 0.7212 2024/11/13 16:21:21 - mmengine - INFO - Iter(train) [132400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:49:26 time: 1.9232 data_time: 0.0186 memory: 32812 grad_norm: 27.2410 loss: 8.0980 loss_cls: 0.2588 loss_bbox: 0.1093 loss_iou: 0.1967 d0.loss_cls: 0.2868 d0.loss_bbox: 0.1191 d0.loss_iou: 0.2069 d1.loss_cls: 0.2690 d1.loss_bbox: 0.1142 d1.loss_iou: 0.2034 d2.loss_cls: 0.2637 d2.loss_bbox: 0.1091 d2.loss_iou: 0.1964 d3.loss_cls: 0.2595 d3.loss_bbox: 0.1082 d3.loss_iou: 0.1961 d4.loss_cls: 0.2598 d4.loss_bbox: 0.1088 d4.loss_iou: 0.1964 enc_loss_cls: 0.2960 enc_loss_bbox: 0.1251 enc_loss_iou: 0.2187 dn_loss_cls: 0.1316 dn_loss_bbox: 0.1401 dn_loss_iou: 0.1833 d0.dn_loss_cls: 0.2001 d0.dn_loss_bbox: 0.2847 d0.dn_loss_iou: 0.3150 d1.dn_loss_cls: 0.1552 d1.dn_loss_bbox: 0.1687 d1.dn_loss_iou: 0.2094 d2.dn_loss_cls: 0.1384 d2.dn_loss_bbox: 0.1476 d2.dn_loss_iou: 0.1919 d3.dn_loss_cls: 0.1345 d3.dn_loss_bbox: 0.1409 d3.dn_loss_iou: 0.1851 d4.dn_loss_cls: 0.1331 d4.dn_loss_bbox: 0.1401 d4.dn_loss_iou: 0.1833 d1.loss_lmm_region: 0.1224 loss_lmm_image: 0.6907 2024/11/13 16:24:43 - mmengine - INFO - Iter(train) [132500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:46:05 time: 2.0262 data_time: 0.0187 memory: 34850 grad_norm: 28.6539 loss: 9.2272 loss_cls: 0.2921 loss_bbox: 0.1316 loss_iou: 0.2338 d0.loss_cls: 0.3207 d0.loss_bbox: 0.1430 d0.loss_iou: 0.2492 d1.loss_cls: 0.3050 d1.loss_bbox: 0.1372 d1.loss_iou: 0.2444 d2.loss_cls: 0.3032 d2.loss_bbox: 0.1351 d2.loss_iou: 0.2429 d3.loss_cls: 0.3021 d3.loss_bbox: 0.1281 d3.loss_iou: 0.2321 d4.loss_cls: 0.2921 d4.loss_bbox: 0.1322 d4.loss_iou: 0.2341 enc_loss_cls: 0.3235 enc_loss_bbox: 0.1613 enc_loss_iou: 0.2664 dn_loss_cls: 0.1255 dn_loss_bbox: 0.1690 dn_loss_iou: 0.2235 d0.dn_loss_cls: 0.1947 d0.dn_loss_bbox: 0.3177 d0.dn_loss_iou: 0.3599 d1.dn_loss_cls: 0.1514 d1.dn_loss_bbox: 0.2021 d1.dn_loss_iou: 0.2509 d2.dn_loss_cls: 0.1336 d2.dn_loss_bbox: 0.1790 d2.dn_loss_iou: 0.2323 d3.dn_loss_cls: 0.1289 d3.dn_loss_bbox: 0.1713 d3.dn_loss_iou: 0.2254 d4.dn_loss_cls: 0.1269 d4.dn_loss_bbox: 0.1690 d4.dn_loss_iou: 0.2234 d1.loss_lmm_region: 0.1052 loss_lmm_image: 0.7270 2024/11/13 16:28:03 - mmengine - INFO - Iter(train) [132600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:42:44 time: 1.9814 data_time: 0.0189 memory: 33537 grad_norm: 34.0801 loss: 7.7683 loss_cls: 0.2193 loss_bbox: 0.1206 loss_iou: 0.2115 d0.loss_cls: 0.2468 d0.loss_bbox: 0.1321 d0.loss_iou: 0.2264 d1.loss_cls: 0.2324 d1.loss_bbox: 0.1220 d1.loss_iou: 0.2170 d2.loss_cls: 0.2284 d2.loss_bbox: 0.1206 d2.loss_iou: 0.2144 d3.loss_cls: 0.2259 d3.loss_bbox: 0.1184 d3.loss_iou: 0.2132 d4.loss_cls: 0.2256 d4.loss_bbox: 0.1204 d4.loss_iou: 0.2116 enc_loss_cls: 0.2606 enc_loss_bbox: 0.1370 enc_loss_iou: 0.2360 dn_loss_cls: 0.0708 dn_loss_bbox: 0.1413 dn_loss_iou: 0.1933 d0.dn_loss_cls: 0.1443 d0.dn_loss_bbox: 0.2823 d0.dn_loss_iou: 0.3229 d1.dn_loss_cls: 0.0964 d1.dn_loss_bbox: 0.1728 d1.dn_loss_iou: 0.2217 
d2.dn_loss_cls: 0.0809 d2.dn_loss_bbox: 0.1503 d2.dn_loss_iou: 0.2019 d3.dn_loss_cls: 0.0747 d3.dn_loss_bbox: 0.1423 d3.dn_loss_iou: 0.1955 d4.dn_loss_cls: 0.0711 d4.dn_loss_bbox: 0.1413 d4.dn_loss_iou: 0.1933 d1.loss_lmm_region: 0.1136 loss_lmm_image: 0.7172 2024/11/13 16:31:21 - mmengine - INFO - Iter(train) [132700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:39:22 time: 1.9705 data_time: 0.0187 memory: 34932 grad_norm: 21.3333 loss: 8.0271 loss_cls: 0.2353 loss_bbox: 0.1248 loss_iou: 0.2116 d0.loss_cls: 0.2640 d0.loss_bbox: 0.1315 d0.loss_iou: 0.2192 d1.loss_cls: 0.2443 d1.loss_bbox: 0.1279 d1.loss_iou: 0.2146 d2.loss_cls: 0.2395 d2.loss_bbox: 0.1255 d2.loss_iou: 0.2129 d3.loss_cls: 0.2350 d3.loss_bbox: 0.1256 d3.loss_iou: 0.2133 d4.loss_cls: 0.2315 d4.loss_bbox: 0.1259 d4.loss_iou: 0.2146 enc_loss_cls: 0.2802 enc_loss_bbox: 0.1374 enc_loss_iou: 0.2317 dn_loss_cls: 0.0748 dn_loss_bbox: 0.1525 dn_loss_iou: 0.1955 d0.dn_loss_cls: 0.1533 d0.dn_loss_bbox: 0.3047 d0.dn_loss_iou: 0.3329 d1.dn_loss_cls: 0.1028 d1.dn_loss_bbox: 0.1824 d1.dn_loss_iou: 0.2236 d2.dn_loss_cls: 0.0855 d2.dn_loss_bbox: 0.1615 d2.dn_loss_iou: 0.2039 d3.dn_loss_cls: 0.0794 d3.dn_loss_bbox: 0.1536 d3.dn_loss_iou: 0.1972 d4.dn_loss_cls: 0.0756 d4.dn_loss_bbox: 0.1526 d4.dn_loss_iou: 0.1955 d1.loss_lmm_region: 0.0914 loss_lmm_image: 0.7621 2024/11/13 16:34:42 - mmengine - INFO - Iter(train) [132800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:36:02 time: 2.0265 data_time: 0.0187 memory: 34602 grad_norm: 27.8579 loss: 8.8419 loss_cls: 0.2398 loss_bbox: 0.1468 loss_iou: 0.2707 d0.loss_cls: 0.2701 d0.loss_bbox: 0.1574 d0.loss_iou: 0.2861 d1.loss_cls: 0.2578 d1.loss_bbox: 0.1491 d1.loss_iou: 0.2756 d2.loss_cls: 0.2491 d2.loss_bbox: 0.1457 d2.loss_iou: 0.2714 d3.loss_cls: 0.2439 d3.loss_bbox: 0.1462 d3.loss_iou: 0.2709 d4.loss_cls: 0.2425 d4.loss_bbox: 0.1453 d4.loss_iou: 0.2693 enc_loss_cls: 0.2805 enc_loss_bbox: 0.1639 enc_loss_iou: 0.2993 dn_loss_cls: 0.0750 dn_loss_bbox: 0.1687 dn_loss_iou: 0.2244 d0.dn_loss_cls: 0.1496 d0.dn_loss_bbox: 0.3062 d0.dn_loss_iou: 0.3547 d1.dn_loss_cls: 0.1047 d1.dn_loss_bbox: 0.2004 d1.dn_loss_iou: 0.2506 d2.dn_loss_cls: 0.0868 d2.dn_loss_bbox: 0.1789 d2.dn_loss_iou: 0.2331 d3.dn_loss_cls: 0.0810 d3.dn_loss_bbox: 0.1709 d3.dn_loss_iou: 0.2268 d4.dn_loss_cls: 0.0760 d4.dn_loss_bbox: 0.1687 d4.dn_loss_iou: 0.2245 d1.loss_lmm_region: 0.0818 loss_lmm_image: 0.6975 2024/11/13 16:38:01 - mmengine - INFO - Iter(train) [132900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:32:40 time: 1.9984 data_time: 0.0187 memory: 33760 grad_norm: 33.0203 loss: 7.9696 loss_cls: 0.2477 loss_bbox: 0.1266 loss_iou: 0.2127 d0.loss_cls: 0.2908 d0.loss_bbox: 0.1324 d0.loss_iou: 0.2195 d1.loss_cls: 0.2695 d1.loss_bbox: 0.1276 d1.loss_iou: 0.2135 d2.loss_cls: 0.2591 d2.loss_bbox: 0.1264 d2.loss_iou: 0.2110 d3.loss_cls: 0.2518 d3.loss_bbox: 0.1264 d3.loss_iou: 0.2124 d4.loss_cls: 0.2466 d4.loss_bbox: 0.1268 d4.loss_iou: 0.2129 enc_loss_cls: 0.2939 enc_loss_bbox: 0.1410 enc_loss_iou: 0.2316 dn_loss_cls: 0.0715 dn_loss_bbox: 0.1459 dn_loss_iou: 0.1858 d0.dn_loss_cls: 0.1500 d0.dn_loss_bbox: 0.2851 d0.dn_loss_iou: 0.3174 d1.dn_loss_cls: 0.0976 d1.dn_loss_bbox: 0.1780 d1.dn_loss_iou: 0.2122 d2.dn_loss_cls: 0.0800 d2.dn_loss_bbox: 0.1540 d2.dn_loss_iou: 0.1933 d3.dn_loss_cls: 0.0747 d3.dn_loss_bbox: 0.1471 d3.dn_loss_iou: 0.1872 d4.dn_loss_cls: 0.0707 d4.dn_loss_bbox: 0.1459 d4.dn_loss_iou: 0.1858 d1.loss_lmm_region: 0.0961 loss_lmm_image: 0.7109 2024/11/13 16:41:21 - mmengine - INFO - Exp name: 
grounding_dino_swin_l_20241110_092759 2024/11/13 16:41:21 - mmengine - INFO - Iter(train) [133000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:29:19 time: 2.0211 data_time: 0.0187 memory: 33764 grad_norm: 27.4974 loss: 8.1680 loss_cls: 0.2414 loss_bbox: 0.1133 loss_iou: 0.2083 d0.loss_cls: 0.2667 d0.loss_bbox: 0.1312 d0.loss_iou: 0.2239 d1.loss_cls: 0.2566 d1.loss_bbox: 0.1236 d1.loss_iou: 0.2195 d2.loss_cls: 0.2525 d2.loss_bbox: 0.1186 d2.loss_iou: 0.2129 d3.loss_cls: 0.2450 d3.loss_bbox: 0.1151 d3.loss_iou: 0.2105 d4.loss_cls: 0.2411 d4.loss_bbox: 0.1136 d4.loss_iou: 0.2076 enc_loss_cls: 0.2810 enc_loss_bbox: 0.1366 enc_loss_iou: 0.2340 dn_loss_cls: 0.0720 dn_loss_bbox: 0.1602 dn_loss_iou: 0.2119 d0.dn_loss_cls: 0.1629 d0.dn_loss_bbox: 0.3110 d0.dn_loss_iou: 0.3486 d1.dn_loss_cls: 0.1072 d1.dn_loss_bbox: 0.1908 d1.dn_loss_iou: 0.2391 d2.dn_loss_cls: 0.0848 d2.dn_loss_bbox: 0.1690 d2.dn_loss_iou: 0.2204 d3.dn_loss_cls: 0.0767 d3.dn_loss_bbox: 0.1624 d3.dn_loss_iou: 0.2141 d4.dn_loss_cls: 0.0720 d4.dn_loss_bbox: 0.1602 d4.dn_loss_iou: 0.2120 d1.loss_lmm_region: 0.1045 loss_lmm_image: 0.7349 2024/11/13 16:44:38 - mmengine - INFO - Iter(train) [133100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:25:58 time: 1.9694 data_time: 0.0186 memory: 32477 grad_norm: nan loss: 8.2708 loss_cls: 0.2571 loss_bbox: 0.1271 loss_iou: 0.2131 d0.loss_cls: 0.2805 d0.loss_bbox: 0.1437 d0.loss_iou: 0.2306 d1.loss_cls: 0.2679 d1.loss_bbox: 0.1353 d1.loss_iou: 0.2213 d2.loss_cls: 0.2627 d2.loss_bbox: 0.1304 d2.loss_iou: 0.2145 d3.loss_cls: 0.2591 d3.loss_bbox: 0.1298 d3.loss_iou: 0.2124 d4.loss_cls: 0.2589 d4.loss_bbox: 0.1268 d4.loss_iou: 0.2132 enc_loss_cls: 0.2894 enc_loss_bbox: 0.1502 enc_loss_iou: 0.2450 dn_loss_cls: 0.0722 dn_loss_bbox: 0.1571 dn_loss_iou: 0.2027 d0.dn_loss_cls: 0.1466 d0.dn_loss_bbox: 0.2917 d0.dn_loss_iou: 0.3290 d1.dn_loss_cls: 0.0954 d1.dn_loss_bbox: 0.1878 d1.dn_loss_iou: 0.2303 d2.dn_loss_cls: 0.0803 d2.dn_loss_bbox: 0.1672 d2.dn_loss_iou: 0.2116 d3.dn_loss_cls: 0.0754 d3.dn_loss_bbox: 0.1589 d3.dn_loss_iou: 0.2046 d4.dn_loss_cls: 0.0723 d4.dn_loss_bbox: 0.1571 d4.dn_loss_iou: 0.2027 d1.loss_lmm_region: 0.1020 loss_lmm_image: 0.7569 2024/11/13 16:47:58 - mmengine - INFO - Iter(train) [133200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:22:37 time: 1.9763 data_time: 0.0186 memory: 33793 grad_norm: 32.2794 loss: 9.3157 loss_cls: 0.2696 loss_bbox: 0.1576 loss_iou: 0.2533 d0.loss_cls: 0.3018 d0.loss_bbox: 0.1692 d0.loss_iou: 0.2696 d1.loss_cls: 0.2843 d1.loss_bbox: 0.1651 d1.loss_iou: 0.2645 d2.loss_cls: 0.2787 d2.loss_bbox: 0.1578 d2.loss_iou: 0.2561 d3.loss_cls: 0.2771 d3.loss_bbox: 0.1561 d3.loss_iou: 0.2508 d4.loss_cls: 0.2709 d4.loss_bbox: 0.1584 d4.loss_iou: 0.2516 enc_loss_cls: 0.3079 enc_loss_bbox: 0.1794 enc_loss_iou: 0.2790 dn_loss_cls: 0.0811 dn_loss_bbox: 0.1953 dn_loss_iou: 0.2328 d0.dn_loss_cls: 0.1654 d0.dn_loss_bbox: 0.3627 d0.dn_loss_iou: 0.3765 d1.dn_loss_cls: 0.1133 d1.dn_loss_bbox: 0.2313 d1.dn_loss_iou: 0.2627 d2.dn_loss_cls: 0.0936 d2.dn_loss_bbox: 0.2064 d2.dn_loss_iou: 0.2422 d3.dn_loss_cls: 0.0853 d3.dn_loss_bbox: 0.1960 d3.dn_loss_iou: 0.2346 d4.dn_loss_cls: 0.0811 d4.dn_loss_bbox: 0.1953 d4.dn_loss_iou: 0.2327 d1.loss_lmm_region: 0.0934 loss_lmm_image: 0.6754 2024/11/13 16:51:16 - mmengine - INFO - Iter(train) [133300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:19:16 time: 1.9792 data_time: 0.0189 memory: 31949 grad_norm: 22.5612 loss: 9.1404 loss_cls: 0.2679 loss_bbox: 0.1501 loss_iou: 0.2590 d0.loss_cls: 0.3040 d0.loss_bbox: 0.1612 
d0.loss_iou: 0.2710 d1.loss_cls: 0.2850 d1.loss_bbox: 0.1555 d1.loss_iou: 0.2656 d2.loss_cls: 0.2795 d2.loss_bbox: 0.1510 d2.loss_iou: 0.2619 d3.loss_cls: 0.2712 d3.loss_bbox: 0.1521 d3.loss_iou: 0.2612 d4.loss_cls: 0.2696 d4.loss_bbox: 0.1498 d4.loss_iou: 0.2589 enc_loss_cls: 0.3154 enc_loss_bbox: 0.1706 enc_loss_iou: 0.2849 dn_loss_cls: 0.0619 dn_loss_bbox: 0.1890 dn_loss_iou: 0.2248 d0.dn_loss_cls: 0.1493 d0.dn_loss_bbox: 0.3365 d0.dn_loss_iou: 0.3637 d1.dn_loss_cls: 0.0944 d1.dn_loss_bbox: 0.2239 d1.dn_loss_iou: 0.2536 d2.dn_loss_cls: 0.0741 d2.dn_loss_bbox: 0.2023 d2.dn_loss_iou: 0.2346 d3.dn_loss_cls: 0.0662 d3.dn_loss_bbox: 0.1925 d3.dn_loss_iou: 0.2275 d4.dn_loss_cls: 0.0619 d4.dn_loss_bbox: 0.1890 d4.dn_loss_iou: 0.2248 d1.loss_lmm_region: 0.0959 loss_lmm_image: 0.7293 2024/11/13 16:54:36 - mmengine - INFO - Iter(train) [133400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:15:54 time: 2.0089 data_time: 0.0187 memory: 34606 grad_norm: 27.1379 loss: 8.9038 loss_cls: 0.2813 loss_bbox: 0.1417 loss_iou: 0.2307 d0.loss_cls: 0.3222 d0.loss_bbox: 0.1475 d0.loss_iou: 0.2387 d1.loss_cls: 0.3029 d1.loss_bbox: 0.1413 d1.loss_iou: 0.2304 d2.loss_cls: 0.2966 d2.loss_bbox: 0.1429 d2.loss_iou: 0.2301 d3.loss_cls: 0.2912 d3.loss_bbox: 0.1414 d3.loss_iou: 0.2291 d4.loss_cls: 0.2837 d4.loss_bbox: 0.1430 d4.loss_iou: 0.2301 enc_loss_cls: 0.3240 enc_loss_bbox: 0.1599 enc_loss_iou: 0.2582 dn_loss_cls: 0.0912 dn_loss_bbox: 0.1705 dn_loss_iou: 0.2083 d0.dn_loss_cls: 0.1731 d0.dn_loss_bbox: 0.2867 d0.dn_loss_iou: 0.3317 d1.dn_loss_cls: 0.1183 d1.dn_loss_bbox: 0.1934 d1.dn_loss_iou: 0.2340 d2.dn_loss_cls: 0.1040 d2.dn_loss_bbox: 0.1789 d2.dn_loss_iou: 0.2178 d3.dn_loss_cls: 0.0967 d3.dn_loss_bbox: 0.1715 d3.dn_loss_iou: 0.2105 d4.dn_loss_cls: 0.0938 d4.dn_loss_bbox: 0.1705 d4.dn_loss_iou: 0.2084 d1.loss_lmm_region: 0.1110 loss_lmm_image: 0.7665 2024/11/13 16:57:54 - mmengine - INFO - Iter(train) [133500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:12:33 time: 1.9927 data_time: 0.0187 memory: 33950 grad_norm: 23.7552 loss: 9.0146 loss_cls: 0.2968 loss_bbox: 0.1575 loss_iou: 0.2578 d0.loss_cls: 0.3367 d0.loss_bbox: 0.1752 d0.loss_iou: 0.2746 d1.loss_cls: 0.3178 d1.loss_bbox: 0.1608 d1.loss_iou: 0.2612 d2.loss_cls: 0.3066 d2.loss_bbox: 0.1561 d2.loss_iou: 0.2569 d3.loss_cls: 0.3036 d3.loss_bbox: 0.1556 d3.loss_iou: 0.2551 d4.loss_cls: 0.3034 d4.loss_bbox: 0.1556 d4.loss_iou: 0.2542 enc_loss_cls: 0.3421 enc_loss_bbox: 0.1801 enc_loss_iou: 0.2868 dn_loss_cls: 0.0652 dn_loss_bbox: 0.1627 dn_loss_iou: 0.1979 d0.dn_loss_cls: 0.1422 d0.dn_loss_bbox: 0.2831 d0.dn_loss_iou: 0.3173 d1.dn_loss_cls: 0.0904 d1.dn_loss_bbox: 0.1866 d1.dn_loss_iou: 0.2199 d2.dn_loss_cls: 0.0742 d2.dn_loss_bbox: 0.1703 d2.dn_loss_iou: 0.2044 d3.dn_loss_cls: 0.0689 d3.dn_loss_bbox: 0.1639 d3.dn_loss_iou: 0.1997 d4.dn_loss_cls: 0.0656 d4.dn_loss_bbox: 0.1628 d4.dn_loss_iou: 0.1980 d1.loss_lmm_region: 0.0889 loss_lmm_image: 0.7581 2024/11/13 17:01:12 - mmengine - INFO - Iter(train) [133600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:09:12 time: 1.9952 data_time: 0.0187 memory: 33713 grad_norm: 20.7453 loss: 8.9153 loss_cls: 0.2919 loss_bbox: 0.1407 loss_iou: 0.2671 d0.loss_cls: 0.3281 d0.loss_bbox: 0.1513 d0.loss_iou: 0.2880 d1.loss_cls: 0.3063 d1.loss_bbox: 0.1448 d1.loss_iou: 0.2789 d2.loss_cls: 0.3048 d2.loss_bbox: 0.1389 d2.loss_iou: 0.2721 d3.loss_cls: 0.2964 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2681 d4.loss_cls: 0.2926 d4.loss_bbox: 0.1410 d4.loss_iou: 0.2696 enc_loss_cls: 0.3352 enc_loss_bbox: 0.1588 enc_loss_iou: 0.3037 
dn_loss_cls: 0.0901 dn_loss_bbox: 0.1378 dn_loss_iou: 0.1992 d0.dn_loss_cls: 0.1610 d0.dn_loss_bbox: 0.2608 d0.dn_loss_iou: 0.3276 d1.dn_loss_cls: 0.1177 d1.dn_loss_bbox: 0.1613 d1.dn_loss_iou: 0.2243 d2.dn_loss_cls: 0.1019 d2.dn_loss_bbox: 0.1437 d2.dn_loss_iou: 0.2072 d3.dn_loss_cls: 0.0959 d3.dn_loss_bbox: 0.1388 d3.dn_loss_iou: 0.2010 d4.dn_loss_cls: 0.0900 d4.dn_loss_bbox: 0.1378 d4.dn_loss_iou: 0.1992 d1.loss_lmm_region: 0.0922 loss_lmm_image: 0.7090 2024/11/13 17:04:32 - mmengine - INFO - Iter(train) [133700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:05:51 time: 1.9963 data_time: 0.0187 memory: 33836 grad_norm: 25.7448 loss: 9.1303 loss_cls: 0.2763 loss_bbox: 0.1375 loss_iou: 0.2490 d0.loss_cls: 0.3208 d0.loss_bbox: 0.1428 d0.loss_iou: 0.2612 d1.loss_cls: 0.3002 d1.loss_bbox: 0.1423 d1.loss_iou: 0.2550 d2.loss_cls: 0.2908 d2.loss_bbox: 0.1365 d2.loss_iou: 0.2493 d3.loss_cls: 0.2798 d3.loss_bbox: 0.1355 d3.loss_iou: 0.2489 d4.loss_cls: 0.2812 d4.loss_bbox: 0.1340 d4.loss_iou: 0.2471 enc_loss_cls: 0.3204 enc_loss_bbox: 0.1518 enc_loss_iou: 0.2782 dn_loss_cls: 0.1129 dn_loss_bbox: 0.1657 dn_loss_iou: 0.2182 d0.dn_loss_cls: 0.1988 d0.dn_loss_bbox: 0.3032 d0.dn_loss_iou: 0.3491 d1.dn_loss_cls: 0.1438 d1.dn_loss_bbox: 0.1910 d1.dn_loss_iou: 0.2436 d2.dn_loss_cls: 0.1254 d2.dn_loss_bbox: 0.1725 d2.dn_loss_iou: 0.2258 d3.dn_loss_cls: 0.1163 d3.dn_loss_bbox: 0.1664 d3.dn_loss_iou: 0.2194 d4.dn_loss_cls: 0.1138 d4.dn_loss_bbox: 0.1656 d4.dn_loss_iou: 0.2182 d1.loss_lmm_region: 0.1088 loss_lmm_image: 0.7333 2024/11/13 17:07:52 - mmengine - INFO - Iter(train) [133800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 9:02:30 time: 2.0191 data_time: 0.0187 memory: 33966 grad_norm: 25.6200 loss: 6.8205 loss_cls: 0.2071 loss_bbox: 0.0964 loss_iou: 0.1783 d0.loss_cls: 0.2267 d0.loss_bbox: 0.1086 d0.loss_iou: 0.1908 d1.loss_cls: 0.2144 d1.loss_bbox: 0.1003 d1.loss_iou: 0.1833 d2.loss_cls: 0.2093 d2.loss_bbox: 0.0976 d2.loss_iou: 0.1814 d3.loss_cls: 0.2126 d3.loss_bbox: 0.0962 d3.loss_iou: 0.1789 d4.loss_cls: 0.2064 d4.loss_bbox: 0.0957 d4.loss_iou: 0.1788 enc_loss_cls: 0.2406 enc_loss_bbox: 0.1155 enc_loss_iou: 0.2042 dn_loss_cls: 0.0608 dn_loss_bbox: 0.1196 dn_loss_iou: 0.1606 d0.dn_loss_cls: 0.1230 d0.dn_loss_bbox: 0.2536 d0.dn_loss_iou: 0.2853 d1.dn_loss_cls: 0.0794 d1.dn_loss_bbox: 0.1416 d1.dn_loss_iou: 0.1818 d2.dn_loss_cls: 0.0663 d2.dn_loss_bbox: 0.1254 d2.dn_loss_iou: 0.1662 d3.dn_loss_cls: 0.0623 d3.dn_loss_bbox: 0.1210 d3.dn_loss_iou: 0.1620 d4.dn_loss_cls: 0.0609 d4.dn_loss_bbox: 0.1196 d4.dn_loss_iou: 0.1605 d1.loss_lmm_region: 0.0707 loss_lmm_image: 0.7770 2024/11/13 17:11:12 - mmengine - INFO - Iter(train) [133900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:59:09 time: 2.0087 data_time: 0.0187 memory: 33938 grad_norm: 23.8411 loss: 8.5977 loss_cls: 0.2615 loss_bbox: 0.1457 loss_iou: 0.2568 d0.loss_cls: 0.3044 d0.loss_bbox: 0.1606 d0.loss_iou: 0.2665 d1.loss_cls: 0.2816 d1.loss_bbox: 0.1513 d1.loss_iou: 0.2624 d2.loss_cls: 0.2729 d2.loss_bbox: 0.1473 d2.loss_iou: 0.2596 d3.loss_cls: 0.2690 d3.loss_bbox: 0.1446 d3.loss_iou: 0.2558 d4.loss_cls: 0.2624 d4.loss_bbox: 0.1460 d4.loss_iou: 0.2568 enc_loss_cls: 0.3125 enc_loss_bbox: 0.1717 enc_loss_iou: 0.2836 dn_loss_cls: 0.0719 dn_loss_bbox: 0.1450 dn_loss_iou: 0.1928 d0.dn_loss_cls: 0.1481 d0.dn_loss_bbox: 0.2778 d0.dn_loss_iou: 0.3118 d1.dn_loss_cls: 0.1027 d1.dn_loss_bbox: 0.1761 d1.dn_loss_iou: 0.2187 d2.dn_loss_cls: 0.0847 d2.dn_loss_bbox: 0.1550 d2.dn_loss_iou: 0.2011 d3.dn_loss_cls: 0.0778 d3.dn_loss_bbox: 0.1473 
d3.dn_loss_iou: 0.1950 d4.dn_loss_cls: 0.0725 d4.dn_loss_bbox: 0.1452 d4.dn_loss_iou: 0.1930 d1.loss_lmm_region: 0.0964 loss_lmm_image: 0.7117 2024/11/13 17:14:30 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 17:14:30 - mmengine - INFO - Iter(train) [134000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:55:47 time: 1.9819 data_time: 0.0187 memory: 34587 grad_norm: 31.2006 loss: 8.8122 loss_cls: 0.2946 loss_bbox: 0.1433 loss_iou: 0.2506 d0.loss_cls: 0.3437 d0.loss_bbox: 0.1507 d0.loss_iou: 0.2651 d1.loss_cls: 0.3175 d1.loss_bbox: 0.1451 d1.loss_iou: 0.2555 d2.loss_cls: 0.3083 d2.loss_bbox: 0.1426 d2.loss_iou: 0.2536 d3.loss_cls: 0.3019 d3.loss_bbox: 0.1415 d3.loss_iou: 0.2503 d4.loss_cls: 0.2963 d4.loss_bbox: 0.1429 d4.loss_iou: 0.2505 enc_loss_cls: 0.3472 enc_loss_bbox: 0.1621 enc_loss_iou: 0.2762 dn_loss_cls: 0.0773 dn_loss_bbox: 0.1481 dn_loss_iou: 0.1910 d0.dn_loss_cls: 0.1515 d0.dn_loss_bbox: 0.2733 d0.dn_loss_iou: 0.3013 d1.dn_loss_cls: 0.1046 d1.dn_loss_bbox: 0.1773 d1.dn_loss_iou: 0.2129 d2.dn_loss_cls: 0.0894 d2.dn_loss_bbox: 0.1553 d2.dn_loss_iou: 0.1979 d3.dn_loss_cls: 0.0826 d3.dn_loss_bbox: 0.1503 d3.dn_loss_iou: 0.1923 d4.dn_loss_cls: 0.0785 d4.dn_loss_bbox: 0.1481 d4.dn_loss_iou: 0.1911 d1.loss_lmm_region: 0.0993 loss_lmm_image: 0.7507 2024/11/13 17:17:50 - mmengine - INFO - Iter(train) [134100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:52:26 time: 1.9880 data_time: 0.0187 memory: 34937 grad_norm: 39.6683 loss: 7.7219 loss_cls: 0.2488 loss_bbox: 0.1113 loss_iou: 0.2177 d0.loss_cls: 0.2897 d0.loss_bbox: 0.1175 d0.loss_iou: 0.2281 d1.loss_cls: 0.2607 d1.loss_bbox: 0.1146 d1.loss_iou: 0.2240 d2.loss_cls: 0.2573 d2.loss_bbox: 0.1116 d2.loss_iou: 0.2165 d3.loss_cls: 0.2542 d3.loss_bbox: 0.1106 d3.loss_iou: 0.2161 d4.loss_cls: 0.2503 d4.loss_bbox: 0.1129 d4.loss_iou: 0.2171 enc_loss_cls: 0.2929 enc_loss_bbox: 0.1308 enc_loss_iou: 0.2467 dn_loss_cls: 0.0773 dn_loss_bbox: 0.1228 dn_loss_iou: 0.1796 d0.dn_loss_cls: 0.1425 d0.dn_loss_bbox: 0.2355 d0.dn_loss_iou: 0.2892 d1.dn_loss_cls: 0.1013 d1.dn_loss_bbox: 0.1449 d1.dn_loss_iou: 0.2016 d2.dn_loss_cls: 0.0864 d2.dn_loss_bbox: 0.1303 d2.dn_loss_iou: 0.1869 d3.dn_loss_cls: 0.0806 d3.dn_loss_bbox: 0.1243 d3.dn_loss_iou: 0.1811 d4.dn_loss_cls: 0.0770 d4.dn_loss_bbox: 0.1227 d4.dn_loss_iou: 0.1797 d1.loss_lmm_region: 0.0884 loss_lmm_image: 0.7403 2024/11/13 17:21:09 - mmengine - INFO - Iter(train) [134200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:49:05 time: 1.9665 data_time: 0.0187 memory: 33541 grad_norm: 35.3404 loss: 7.8068 loss_cls: 0.2397 loss_bbox: 0.1211 loss_iou: 0.2239 d0.loss_cls: 0.2645 d0.loss_bbox: 0.1337 d0.loss_iou: 0.2420 d1.loss_cls: 0.2432 d1.loss_bbox: 0.1317 d1.loss_iou: 0.2335 d2.loss_cls: 0.2432 d2.loss_bbox: 0.1252 d2.loss_iou: 0.2276 d3.loss_cls: 0.2408 d3.loss_bbox: 0.1210 d3.loss_iou: 0.2251 d4.loss_cls: 0.2374 d4.loss_bbox: 0.1215 d4.loss_iou: 0.2245 enc_loss_cls: 0.2813 enc_loss_bbox: 0.1395 enc_loss_iou: 0.2496 dn_loss_cls: 0.0578 dn_loss_bbox: 0.1352 dn_loss_iou: 0.1784 d0.dn_loss_cls: 0.1418 d0.dn_loss_bbox: 0.2796 d0.dn_loss_iou: 0.3056 d1.dn_loss_cls: 0.0890 d1.dn_loss_bbox: 0.1651 d1.dn_loss_iou: 0.2045 d2.dn_loss_cls: 0.0704 d2.dn_loss_bbox: 0.1436 d2.dn_loss_iou: 0.1862 d3.dn_loss_cls: 0.0619 d3.dn_loss_bbox: 0.1370 d3.dn_loss_iou: 0.1805 d4.dn_loss_cls: 0.0583 d4.dn_loss_bbox: 0.1352 d4.dn_loss_iou: 0.1785 d1.loss_lmm_region: 0.1041 loss_lmm_image: 0.7244 2024/11/13 17:24:28 - mmengine - INFO - Iter(train) [134300/150000] base_lr: 1.0000e-05 lr: 
1.0000e-05 eta: 8:45:44 time: 1.9890 data_time: 0.0187 memory: 33382 grad_norm: 25.4763 loss: 8.7767 loss_cls: 0.2492 loss_bbox: 0.1358 loss_iou: 0.2314 d0.loss_cls: 0.2807 d0.loss_bbox: 0.1512 d0.loss_iou: 0.2466 d1.loss_cls: 0.2606 d1.loss_bbox: 0.1477 d1.loss_iou: 0.2401 d2.loss_cls: 0.2570 d2.loss_bbox: 0.1398 d2.loss_iou: 0.2355 d3.loss_cls: 0.2527 d3.loss_bbox: 0.1365 d3.loss_iou: 0.2324 d4.loss_cls: 0.2492 d4.loss_bbox: 0.1365 d4.loss_iou: 0.2321 enc_loss_cls: 0.2855 enc_loss_bbox: 0.1608 enc_loss_iou: 0.2627 dn_loss_cls: 0.0859 dn_loss_bbox: 0.1812 dn_loss_iou: 0.2197 d0.dn_loss_cls: 0.1623 d0.dn_loss_bbox: 0.3201 d0.dn_loss_iou: 0.3557 d1.dn_loss_cls: 0.1124 d1.dn_loss_bbox: 0.2118 d1.dn_loss_iou: 0.2485 d2.dn_loss_cls: 0.0956 d2.dn_loss_bbox: 0.1928 d2.dn_loss_iou: 0.2297 d3.dn_loss_cls: 0.0906 d3.dn_loss_bbox: 0.1828 d3.dn_loss_iou: 0.2216 d4.dn_loss_cls: 0.0862 d4.dn_loss_bbox: 0.1811 d4.dn_loss_iou: 0.2197 d1.loss_lmm_region: 0.1002 loss_lmm_image: 0.7549 2024/11/13 17:27:46 - mmengine - INFO - Iter(train) [134400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:42:23 time: 1.9756 data_time: 0.0187 memory: 34800 grad_norm: 29.8189 loss: 8.7924 loss_cls: 0.2602 loss_bbox: 0.1410 loss_iou: 0.2510 d0.loss_cls: 0.2876 d0.loss_bbox: 0.1495 d0.loss_iou: 0.2637 d1.loss_cls: 0.2755 d1.loss_bbox: 0.1387 d1.loss_iou: 0.2523 d2.loss_cls: 0.2705 d2.loss_bbox: 0.1384 d2.loss_iou: 0.2485 d3.loss_cls: 0.2670 d3.loss_bbox: 0.1376 d3.loss_iou: 0.2484 d4.loss_cls: 0.2653 d4.loss_bbox: 0.1372 d4.loss_iou: 0.2495 enc_loss_cls: 0.2988 enc_loss_bbox: 0.1543 enc_loss_iou: 0.2721 dn_loss_cls: 0.1061 dn_loss_bbox: 0.1608 dn_loss_iou: 0.2059 d0.dn_loss_cls: 0.1722 d0.dn_loss_bbox: 0.3118 d0.dn_loss_iou: 0.3402 d1.dn_loss_cls: 0.1261 d1.dn_loss_bbox: 0.1916 d1.dn_loss_iou: 0.2345 d2.dn_loss_cls: 0.1085 d2.dn_loss_bbox: 0.1710 d2.dn_loss_iou: 0.2153 d3.dn_loss_cls: 0.1045 d3.dn_loss_bbox: 0.1629 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.1054 d4.dn_loss_bbox: 0.1609 d4.dn_loss_iou: 0.2059 d1.loss_lmm_region: 0.0975 loss_lmm_image: 0.6964 2024/11/13 17:31:07 - mmengine - INFO - Iter(train) [134500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:39:02 time: 2.0108 data_time: 0.0188 memory: 35753 grad_norm: 23.5216 loss: 8.4427 loss_cls: 0.2461 loss_bbox: 0.1385 loss_iou: 0.2412 d0.loss_cls: 0.2853 d0.loss_bbox: 0.1454 d0.loss_iou: 0.2504 d1.loss_cls: 0.2611 d1.loss_bbox: 0.1391 d1.loss_iou: 0.2436 d2.loss_cls: 0.2567 d2.loss_bbox: 0.1387 d2.loss_iou: 0.2416 d3.loss_cls: 0.2502 d3.loss_bbox: 0.1366 d3.loss_iou: 0.2405 d4.loss_cls: 0.2466 d4.loss_bbox: 0.1383 d4.loss_iou: 0.2411 enc_loss_cls: 0.2931 enc_loss_bbox: 0.1559 enc_loss_iou: 0.2644 dn_loss_cls: 0.0792 dn_loss_bbox: 0.1507 dn_loss_iou: 0.1964 d0.dn_loss_cls: 0.1578 d0.dn_loss_bbox: 0.2895 d0.dn_loss_iou: 0.3271 d1.dn_loss_cls: 0.1093 d1.dn_loss_bbox: 0.1763 d1.dn_loss_iou: 0.2204 d2.dn_loss_cls: 0.0936 d2.dn_loss_bbox: 0.1616 d2.dn_loss_iou: 0.2049 d3.dn_loss_cls: 0.0844 d3.dn_loss_bbox: 0.1537 d3.dn_loss_iou: 0.1985 d4.dn_loss_cls: 0.0809 d4.dn_loss_bbox: 0.1507 d4.dn_loss_iou: 0.1963 d1.loss_lmm_region: 0.1079 loss_lmm_image: 0.7492 2024/11/13 17:34:27 - mmengine - INFO - Iter(train) [134600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:35:41 time: 1.9893 data_time: 0.0188 memory: 33376 grad_norm: 26.1830 loss: 8.2953 loss_cls: 0.2544 loss_bbox: 0.1295 loss_iou: 0.2127 d0.loss_cls: 0.2843 d0.loss_bbox: 0.1475 d0.loss_iou: 0.2299 d1.loss_cls: 0.2639 d1.loss_bbox: 0.1407 d1.loss_iou: 0.2216 d2.loss_cls: 0.2608 d2.loss_bbox: 0.1318 d2.loss_iou: 
0.2138 d3.loss_cls: 0.2564 d3.loss_bbox: 0.1307 d3.loss_iou: 0.2130 d4.loss_cls: 0.2559 d4.loss_bbox: 0.1297 d4.loss_iou: 0.2128 enc_loss_cls: 0.2997 enc_loss_bbox: 0.1518 enc_loss_iou: 0.2385 dn_loss_cls: 0.0847 dn_loss_bbox: 0.1575 dn_loss_iou: 0.1945 d0.dn_loss_cls: 0.1673 d0.dn_loss_bbox: 0.2894 d0.dn_loss_iou: 0.3171 d1.dn_loss_cls: 0.1184 d1.dn_loss_bbox: 0.1843 d1.dn_loss_iou: 0.2164 d2.dn_loss_cls: 0.1018 d2.dn_loss_bbox: 0.1656 d2.dn_loss_iou: 0.2013 d3.dn_loss_cls: 0.0954 d3.dn_loss_bbox: 0.1591 d3.dn_loss_iou: 0.1960 d4.dn_loss_cls: 0.0860 d4.dn_loss_bbox: 0.1575 d4.dn_loss_iou: 0.1945 d1.loss_lmm_region: 0.1042 loss_lmm_image: 0.7245 2024/11/13 17:37:46 - mmengine - INFO - Iter(train) [134700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:32:20 time: 2.0136 data_time: 0.0190 memory: 35053 grad_norm: 23.0253 loss: 8.2886 loss_cls: 0.2461 loss_bbox: 0.1181 loss_iou: 0.2216 d0.loss_cls: 0.2777 d0.loss_bbox: 0.1320 d0.loss_iou: 0.2318 d1.loss_cls: 0.2658 d1.loss_bbox: 0.1217 d1.loss_iou: 0.2199 d2.loss_cls: 0.2578 d2.loss_bbox: 0.1141 d2.loss_iou: 0.2178 d3.loss_cls: 0.2501 d3.loss_bbox: 0.1180 d3.loss_iou: 0.2200 d4.loss_cls: 0.2466 d4.loss_bbox: 0.1179 d4.loss_iou: 0.2207 enc_loss_cls: 0.2885 enc_loss_bbox: 0.1343 enc_loss_iou: 0.2391 dn_loss_cls: 0.0872 dn_loss_bbox: 0.1522 dn_loss_iou: 0.2164 d0.dn_loss_cls: 0.1697 d0.dn_loss_bbox: 0.2805 d0.dn_loss_iou: 0.3452 d1.dn_loss_cls: 0.1171 d1.dn_loss_bbox: 0.1869 d1.dn_loss_iou: 0.2464 d2.dn_loss_cls: 0.0952 d2.dn_loss_bbox: 0.1623 d2.dn_loss_iou: 0.2259 d3.dn_loss_cls: 0.0906 d3.dn_loss_bbox: 0.1550 d3.dn_loss_iou: 0.2191 d4.dn_loss_cls: 0.0869 d4.dn_loss_bbox: 0.1523 d4.dn_loss_iou: 0.2165 d1.loss_lmm_region: 0.1186 loss_lmm_image: 0.7052 2024/11/13 17:41:05 - mmengine - INFO - Iter(train) [134800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:28:59 time: 1.9817 data_time: 0.0187 memory: 33662 grad_norm: 22.8065 loss: 7.2836 loss_cls: 0.2181 loss_bbox: 0.1023 loss_iou: 0.1842 d0.loss_cls: 0.2554 d0.loss_bbox: 0.1197 d0.loss_iou: 0.2001 d1.loss_cls: 0.2393 d1.loss_bbox: 0.1094 d1.loss_iou: 0.1908 d2.loss_cls: 0.2311 d2.loss_bbox: 0.1033 d2.loss_iou: 0.1860 d3.loss_cls: 0.2223 d3.loss_bbox: 0.1024 d3.loss_iou: 0.1834 d4.loss_cls: 0.2217 d4.loss_bbox: 0.1024 d4.loss_iou: 0.1836 enc_loss_cls: 0.2646 enc_loss_bbox: 0.1297 enc_loss_iou: 0.2172 dn_loss_cls: 0.0777 dn_loss_bbox: 0.1368 dn_loss_iou: 0.1637 d0.dn_loss_cls: 0.1462 d0.dn_loss_bbox: 0.2628 d0.dn_loss_iou: 0.2863 d1.dn_loss_cls: 0.1020 d1.dn_loss_bbox: 0.1634 d1.dn_loss_iou: 0.1892 d2.dn_loss_cls: 0.0845 d2.dn_loss_bbox: 0.1472 d2.dn_loss_iou: 0.1733 d3.dn_loss_cls: 0.0808 d3.dn_loss_bbox: 0.1394 d3.dn_loss_iou: 0.1665 d4.dn_loss_cls: 0.0787 d4.dn_loss_bbox: 0.1370 d4.dn_loss_iou: 0.1638 d1.loss_lmm_region: 0.0767 loss_lmm_image: 0.7404 2024/11/13 17:44:26 - mmengine - INFO - Iter(train) [134900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:25:38 time: 2.0236 data_time: 0.0188 memory: 34601 grad_norm: 29.9996 loss: 8.1928 loss_cls: 0.2660 loss_bbox: 0.1100 loss_iou: 0.2044 d0.loss_cls: 0.2956 d0.loss_bbox: 0.1172 d0.loss_iou: 0.2151 d1.loss_cls: 0.2824 d1.loss_bbox: 0.1088 d1.loss_iou: 0.2051 d2.loss_cls: 0.2720 d2.loss_bbox: 0.1077 d2.loss_iou: 0.2023 d3.loss_cls: 0.2706 d3.loss_bbox: 0.1082 d3.loss_iou: 0.2040 d4.loss_cls: 0.2674 d4.loss_bbox: 0.1092 d4.loss_iou: 0.2034 enc_loss_cls: 0.3010 enc_loss_bbox: 0.1298 enc_loss_iou: 0.2368 dn_loss_cls: 0.1427 dn_loss_bbox: 0.1265 dn_loss_iou: 0.1870 d0.dn_loss_cls: 0.1946 d0.dn_loss_bbox: 0.2477 d0.dn_loss_iou: 0.3139 
d1.dn_loss_cls: 0.1634 d1.dn_loss_bbox: 0.1526 d1.dn_loss_iou: 0.2138 d2.dn_loss_cls: 0.1421 d2.dn_loss_bbox: 0.1346 d2.dn_loss_iou: 0.1951 d3.dn_loss_cls: 0.1430 d3.dn_loss_bbox: 0.1275 d3.dn_loss_iou: 0.1888 d4.dn_loss_cls: 0.1418 d4.dn_loss_bbox: 0.1266 d4.dn_loss_iou: 0.1872 d1.loss_lmm_region: 0.0993 loss_lmm_image: 0.7477 2024/11/13 17:47:45 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 17:47:45 - mmengine - INFO - Iter(train) [135000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:22:17 time: 2.0086 data_time: 0.0188 memory: 32711 grad_norm: 24.5259 loss: 7.9745 loss_cls: 0.2465 loss_bbox: 0.1182 loss_iou: 0.2144 d0.loss_cls: 0.2750 d0.loss_bbox: 0.1306 d0.loss_iou: 0.2281 d1.loss_cls: 0.2598 d1.loss_bbox: 0.1224 d1.loss_iou: 0.2214 d2.loss_cls: 0.2539 d2.loss_bbox: 0.1185 d2.loss_iou: 0.2169 d3.loss_cls: 0.2491 d3.loss_bbox: 0.1164 d3.loss_iou: 0.2135 d4.loss_cls: 0.2468 d4.loss_bbox: 0.1177 d4.loss_iou: 0.2133 enc_loss_cls: 0.2854 enc_loss_bbox: 0.1389 enc_loss_iou: 0.2396 dn_loss_cls: 0.0765 dn_loss_bbox: 0.1442 dn_loss_iou: 0.1911 d0.dn_loss_cls: 0.1419 d0.dn_loss_bbox: 0.2735 d0.dn_loss_iou: 0.3162 d1.dn_loss_cls: 0.0963 d1.dn_loss_bbox: 0.1726 d1.dn_loss_iou: 0.2164 d2.dn_loss_cls: 0.0824 d2.dn_loss_bbox: 0.1525 d2.dn_loss_iou: 0.1983 d3.dn_loss_cls: 0.0785 d3.dn_loss_bbox: 0.1460 d3.dn_loss_iou: 0.1928 d4.dn_loss_cls: 0.0770 d4.dn_loss_bbox: 0.1442 d4.dn_loss_iou: 0.1910 d1.loss_lmm_region: 0.0915 loss_lmm_image: 0.7652 2024/11/13 17:51:05 - mmengine - INFO - Iter(train) [135100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:18:56 time: 1.9861 data_time: 0.0190 memory: 33850 grad_norm: 24.4347 loss: 8.8594 loss_cls: 0.2806 loss_bbox: 0.1381 loss_iou: 0.2641 d0.loss_cls: 0.3194 d0.loss_bbox: 0.1467 d0.loss_iou: 0.2743 d1.loss_cls: 0.2970 d1.loss_bbox: 0.1431 d1.loss_iou: 0.2709 d2.loss_cls: 0.2888 d2.loss_bbox: 0.1386 d2.loss_iou: 0.2654 d3.loss_cls: 0.2864 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2632 d4.loss_cls: 0.2845 d4.loss_bbox: 0.1367 d4.loss_iou: 0.2638 enc_loss_cls: 0.3237 enc_loss_bbox: 0.1564 enc_loss_iou: 0.2875 dn_loss_cls: 0.0845 dn_loss_bbox: 0.1399 dn_loss_iou: 0.2108 d0.dn_loss_cls: 0.1630 d0.dn_loss_bbox: 0.2741 d0.dn_loss_iou: 0.3423 d1.dn_loss_cls: 0.1109 d1.dn_loss_bbox: 0.1662 d1.dn_loss_iou: 0.2356 d2.dn_loss_cls: 0.0944 d2.dn_loss_bbox: 0.1497 d2.dn_loss_iou: 0.2187 d3.dn_loss_cls: 0.0875 d3.dn_loss_bbox: 0.1414 d3.dn_loss_iou: 0.2123 d4.dn_loss_cls: 0.0849 d4.dn_loss_bbox: 0.1399 d4.dn_loss_iou: 0.2108 d1.loss_lmm_region: 0.1162 loss_lmm_image: 0.7105 2024/11/13 17:54:26 - mmengine - INFO - Iter(train) [135200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:15:35 time: 2.0106 data_time: 0.0190 memory: 34334 grad_norm: 33.0364 loss: 8.4552 loss_cls: 0.2728 loss_bbox: 0.1284 loss_iou: 0.2638 d0.loss_cls: 0.3175 d0.loss_bbox: 0.1405 d0.loss_iou: 0.2788 d1.loss_cls: 0.2854 d1.loss_bbox: 0.1336 d1.loss_iou: 0.2688 d2.loss_cls: 0.2831 d2.loss_bbox: 0.1280 d2.loss_iou: 0.2615 d3.loss_cls: 0.2780 d3.loss_bbox: 0.1277 d3.loss_iou: 0.2638 d4.loss_cls: 0.2735 d4.loss_bbox: 0.1293 d4.loss_iou: 0.2641 enc_loss_cls: 0.3243 enc_loss_bbox: 0.1507 enc_loss_iou: 0.2942 dn_loss_cls: 0.0602 dn_loss_bbox: 0.1283 dn_loss_iou: 0.1879 d0.dn_loss_cls: 0.1360 d0.dn_loss_bbox: 0.2629 d0.dn_loss_iou: 0.3194 d1.dn_loss_cls: 0.0867 d1.dn_loss_bbox: 0.1617 d1.dn_loss_iou: 0.2164 d2.dn_loss_cls: 0.0710 d2.dn_loss_bbox: 0.1405 d2.dn_loss_iou: 0.1978 d3.dn_loss_cls: 0.0641 d3.dn_loss_bbox: 0.1317 d3.dn_loss_iou: 0.1906 d4.dn_loss_cls: 0.0603 
d4.dn_loss_bbox: 0.1284 d4.dn_loss_iou: 0.1878 d1.loss_lmm_region: 0.0898 loss_lmm_image: 0.7661 2024/11/13 17:57:45 - mmengine - INFO - Iter(train) [135300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:12:14 time: 1.9978 data_time: 0.0186 memory: 34232 grad_norm: 26.9267 loss: 7.1512 loss_cls: 0.2163 loss_bbox: 0.1012 loss_iou: 0.1974 d0.loss_cls: 0.2503 d0.loss_bbox: 0.1110 d0.loss_iou: 0.2107 d1.loss_cls: 0.2313 d1.loss_bbox: 0.1012 d1.loss_iou: 0.2005 d2.loss_cls: 0.2240 d2.loss_bbox: 0.1005 d2.loss_iou: 0.1964 d3.loss_cls: 0.2234 d3.loss_bbox: 0.1014 d3.loss_iou: 0.1969 d4.loss_cls: 0.2178 d4.loss_bbox: 0.1010 d4.loss_iou: 0.1977 enc_loss_cls: 0.2538 enc_loss_bbox: 0.1236 enc_loss_iou: 0.2301 dn_loss_cls: 0.0552 dn_loss_bbox: 0.1258 dn_loss_iou: 0.1812 d0.dn_loss_cls: 0.1240 d0.dn_loss_bbox: 0.2435 d0.dn_loss_iou: 0.3131 d1.dn_loss_cls: 0.0774 d1.dn_loss_bbox: 0.1468 d1.dn_loss_iou: 0.2074 d2.dn_loss_cls: 0.0629 d2.dn_loss_bbox: 0.1320 d2.dn_loss_iou: 0.1893 d3.dn_loss_cls: 0.0589 d3.dn_loss_bbox: 0.1276 d3.dn_loss_iou: 0.1834 d4.dn_loss_cls: 0.0556 d4.dn_loss_bbox: 0.1259 d4.dn_loss_iou: 0.1813 d1.loss_lmm_region: 0.0926 loss_lmm_image: 0.6808 2024/11/13 18:01:06 - mmengine - INFO - Iter(train) [135400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:08:53 time: 2.0204 data_time: 0.0188 memory: 34572 grad_norm: 38.4552 loss: 8.5175 loss_cls: 0.2936 loss_bbox: 0.1167 loss_iou: 0.2344 d0.loss_cls: 0.3360 d0.loss_bbox: 0.1302 d0.loss_iou: 0.2496 d1.loss_cls: 0.3075 d1.loss_bbox: 0.1280 d1.loss_iou: 0.2425 d2.loss_cls: 0.2985 d2.loss_bbox: 0.1214 d2.loss_iou: 0.2366 d3.loss_cls: 0.2986 d3.loss_bbox: 0.1172 d3.loss_iou: 0.2357 d4.loss_cls: 0.2942 d4.loss_bbox: 0.1169 d4.loss_iou: 0.2355 enc_loss_cls: 0.3427 enc_loss_bbox: 0.1460 enc_loss_iou: 0.2688 dn_loss_cls: 0.0856 dn_loss_bbox: 0.1452 dn_loss_iou: 0.1887 d0.dn_loss_cls: 0.1685 d0.dn_loss_bbox: 0.2692 d0.dn_loss_iou: 0.3074 d1.dn_loss_cls: 0.1183 d1.dn_loss_bbox: 0.1715 d1.dn_loss_iou: 0.2133 d2.dn_loss_cls: 0.1004 d2.dn_loss_bbox: 0.1545 d2.dn_loss_iou: 0.1967 d3.dn_loss_cls: 0.0943 d3.dn_loss_bbox: 0.1480 d3.dn_loss_iou: 0.1909 d4.dn_loss_cls: 0.0867 d4.dn_loss_bbox: 0.1452 d4.dn_loss_iou: 0.1887 d1.loss_lmm_region: 0.0872 loss_lmm_image: 0.7067 2024/11/13 18:04:26 - mmengine - INFO - Iter(train) [135500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:05:32 time: 2.0151 data_time: 0.0187 memory: 33786 grad_norm: 31.2154 loss: 9.1379 loss_cls: 0.2821 loss_bbox: 0.1566 loss_iou: 0.2644 d0.loss_cls: 0.3266 d0.loss_bbox: 0.1667 d0.loss_iou: 0.2723 d1.loss_cls: 0.3028 d1.loss_bbox: 0.1610 d1.loss_iou: 0.2688 d2.loss_cls: 0.2948 d2.loss_bbox: 0.1521 d2.loss_iou: 0.2612 d3.loss_cls: 0.2851 d3.loss_bbox: 0.1574 d3.loss_iou: 0.2658 d4.loss_cls: 0.2831 d4.loss_bbox: 0.1548 d4.loss_iou: 0.2622 enc_loss_cls: 0.3345 enc_loss_bbox: 0.1727 enc_loss_iou: 0.2868 dn_loss_cls: 0.0993 dn_loss_bbox: 0.1539 dn_loss_iou: 0.2137 d0.dn_loss_cls: 0.1752 d0.dn_loss_bbox: 0.2796 d0.dn_loss_iou: 0.3396 d1.dn_loss_cls: 0.1308 d1.dn_loss_bbox: 0.1801 d1.dn_loss_iou: 0.2386 d2.dn_loss_cls: 0.1112 d2.dn_loss_bbox: 0.1619 d2.dn_loss_iou: 0.2213 d3.dn_loss_cls: 0.1035 d3.dn_loss_bbox: 0.1549 d3.dn_loss_iou: 0.2155 d4.dn_loss_cls: 0.0998 d4.dn_loss_bbox: 0.1540 d4.dn_loss_iou: 0.2138 d1.loss_lmm_region: 0.1053 loss_lmm_image: 0.6744 2024/11/13 18:07:44 - mmengine - INFO - Iter(train) [135600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 8:02:10 time: 1.9603 data_time: 0.0188 memory: 35133 grad_norm: 37.6840 loss: 8.6743 loss_cls: 0.2703 loss_bbox: 0.1449 
loss_iou: 0.2582 d0.loss_cls: 0.3156 d0.loss_bbox: 0.1537 d0.loss_iou: 0.2662 d1.loss_cls: 0.2964 d1.loss_bbox: 0.1422 d1.loss_iou: 0.2566 d2.loss_cls: 0.2846 d2.loss_bbox: 0.1439 d2.loss_iou: 0.2563 d3.loss_cls: 0.2784 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2568 d4.loss_cls: 0.2708 d4.loss_bbox: 0.1438 d4.loss_iou: 0.2571 enc_loss_cls: 0.3277 enc_loss_bbox: 0.1636 enc_loss_iou: 0.2793 dn_loss_cls: 0.0782 dn_loss_bbox: 0.1405 dn_loss_iou: 0.1953 d0.dn_loss_cls: 0.1584 d0.dn_loss_bbox: 0.2622 d0.dn_loss_iou: 0.3162 d1.dn_loss_cls: 0.1153 d1.dn_loss_bbox: 0.1694 d1.dn_loss_iou: 0.2207 d2.dn_loss_cls: 0.0962 d2.dn_loss_bbox: 0.1512 d2.dn_loss_iou: 0.2037 d3.dn_loss_cls: 0.0878 d3.dn_loss_bbox: 0.1440 d3.dn_loss_iou: 0.1977 d4.dn_loss_cls: 0.0810 d4.dn_loss_bbox: 0.1405 d4.dn_loss_iou: 0.1953 d1.loss_lmm_region: 0.0966 loss_lmm_image: 0.7168 2024/11/13 18:11:03 - mmengine - INFO - Iter(train) [135700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:58:49 time: 1.9761 data_time: 0.0188 memory: 33711 grad_norm: 27.2556 loss: 9.2078 loss_cls: 0.3007 loss_bbox: 0.1393 loss_iou: 0.2297 d0.loss_cls: 0.3301 d0.loss_bbox: 0.1543 d0.loss_iou: 0.2477 d1.loss_cls: 0.3201 d1.loss_bbox: 0.1433 d1.loss_iou: 0.2316 d2.loss_cls: 0.3137 d2.loss_bbox: 0.1384 d2.loss_iou: 0.2277 d3.loss_cls: 0.3101 d3.loss_bbox: 0.1375 d3.loss_iou: 0.2278 d4.loss_cls: 0.3014 d4.loss_bbox: 0.1372 d4.loss_iou: 0.2285 enc_loss_cls: 0.3390 enc_loss_bbox: 0.1616 enc_loss_iou: 0.2616 dn_loss_cls: 0.1212 dn_loss_bbox: 0.1742 dn_loss_iou: 0.2102 d0.dn_loss_cls: 0.1834 d0.dn_loss_bbox: 0.3292 d0.dn_loss_iou: 0.3521 d1.dn_loss_cls: 0.1453 d1.dn_loss_bbox: 0.2071 d1.dn_loss_iou: 0.2401 d2.dn_loss_cls: 0.1218 d2.dn_loss_bbox: 0.1838 d2.dn_loss_iou: 0.2191 d3.dn_loss_cls: 0.1339 d3.dn_loss_bbox: 0.1766 d3.dn_loss_iou: 0.2125 d4.dn_loss_cls: 0.1265 d4.dn_loss_bbox: 0.1742 d4.dn_loss_iou: 0.2102 d1.loss_lmm_region: 0.0893 loss_lmm_image: 0.7158 2024/11/13 18:14:23 - mmengine - INFO - Iter(train) [135800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:55:28 time: 1.9988 data_time: 0.0187 memory: 34069 grad_norm: 27.4719 loss: 7.8649 loss_cls: 0.2352 loss_bbox: 0.1218 loss_iou: 0.2183 d0.loss_cls: 0.2547 d0.loss_bbox: 0.1382 d0.loss_iou: 0.2326 d1.loss_cls: 0.2448 d1.loss_bbox: 0.1218 d1.loss_iou: 0.2210 d2.loss_cls: 0.2405 d2.loss_bbox: 0.1182 d2.loss_iou: 0.2196 d3.loss_cls: 0.2353 d3.loss_bbox: 0.1247 d3.loss_iou: 0.2203 d4.loss_cls: 0.2362 d4.loss_bbox: 0.1216 d4.loss_iou: 0.2184 enc_loss_cls: 0.2706 enc_loss_bbox: 0.1397 enc_loss_iou: 0.2421 dn_loss_cls: 0.0715 dn_loss_bbox: 0.1470 dn_loss_iou: 0.1883 d0.dn_loss_cls: 0.1531 d0.dn_loss_bbox: 0.2869 d0.dn_loss_iou: 0.3167 d1.dn_loss_cls: 0.1005 d1.dn_loss_bbox: 0.1770 d1.dn_loss_iou: 0.2129 d2.dn_loss_cls: 0.0824 d2.dn_loss_bbox: 0.1566 d2.dn_loss_iou: 0.1958 d3.dn_loss_cls: 0.0760 d3.dn_loss_bbox: 0.1488 d3.dn_loss_iou: 0.1902 d4.dn_loss_cls: 0.0727 d4.dn_loss_bbox: 0.1469 d4.dn_loss_iou: 0.1883 d1.loss_lmm_region: 0.1017 loss_lmm_image: 0.6762 2024/11/13 18:17:44 - mmengine - INFO - Iter(train) [135900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:52:07 time: 2.0255 data_time: 0.0187 memory: 32984 grad_norm: 23.3928 loss: 7.5454 loss_cls: 0.2362 loss_bbox: 0.1175 loss_iou: 0.2008 d0.loss_cls: 0.2593 d0.loss_bbox: 0.1254 d0.loss_iou: 0.2118 d1.loss_cls: 0.2479 d1.loss_bbox: 0.1170 d1.loss_iou: 0.2051 d2.loss_cls: 0.2468 d2.loss_bbox: 0.1173 d2.loss_iou: 0.2034 d3.loss_cls: 0.2352 d3.loss_bbox: 0.1171 d3.loss_iou: 0.2023 d4.loss_cls: 0.2348 d4.loss_bbox: 0.1167 d4.loss_iou: 0.2011 
enc_loss_cls: 0.2626 enc_loss_bbox: 0.1343 enc_loss_iou: 0.2267 dn_loss_cls: 0.0643 dn_loss_bbox: 0.1291 dn_loss_iou: 0.1850 d0.dn_loss_cls: 0.1391 d0.dn_loss_bbox: 0.2539 d0.dn_loss_iou: 0.3063 d1.dn_loss_cls: 0.0886 d1.dn_loss_bbox: 0.1532 d1.dn_loss_iou: 0.2072 d2.dn_loss_cls: 0.0737 d2.dn_loss_bbox: 0.1378 d2.dn_loss_iou: 0.1928 d3.dn_loss_cls: 0.0669 d3.dn_loss_bbox: 0.1304 d3.dn_loss_iou: 0.1866 d4.dn_loss_cls: 0.0638 d4.dn_loss_bbox: 0.1291 d4.dn_loss_iou: 0.1849 d1.loss_lmm_region: 0.0898 loss_lmm_image: 0.7436 2024/11/13 18:21:07 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 18:21:07 - mmengine - INFO - Iter(train) [136000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:48:47 time: 2.0326 data_time: 0.0188 memory: 34326 grad_norm: 29.2834 loss: 8.3218 loss_cls: 0.2759 loss_bbox: 0.1245 loss_iou: 0.2072 d0.loss_cls: 0.3052 d0.loss_bbox: 0.1368 d0.loss_iou: 0.2202 d1.loss_cls: 0.2894 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2103 d2.loss_cls: 0.2830 d2.loss_bbox: 0.1240 d2.loss_iou: 0.2069 d3.loss_cls: 0.2759 d3.loss_bbox: 0.1247 d3.loss_iou: 0.2063 d4.loss_cls: 0.2765 d4.loss_bbox: 0.1249 d4.loss_iou: 0.2063 enc_loss_cls: 0.3190 enc_loss_bbox: 0.1398 enc_loss_iou: 0.2289 dn_loss_cls: 0.0965 dn_loss_bbox: 0.1482 dn_loss_iou: 0.1800 d0.dn_loss_cls: 0.1794 d0.dn_loss_bbox: 0.2824 d0.dn_loss_iou: 0.2998 d1.dn_loss_cls: 0.1291 d1.dn_loss_bbox: 0.1807 d1.dn_loss_iou: 0.2068 d2.dn_loss_cls: 0.1088 d2.dn_loss_bbox: 0.1589 d2.dn_loss_iou: 0.1880 d3.dn_loss_cls: 0.1043 d3.dn_loss_bbox: 0.1514 d3.dn_loss_iou: 0.1821 d4.dn_loss_cls: 0.0967 d4.dn_loss_bbox: 0.1482 d4.dn_loss_iou: 0.1800 d1.loss_lmm_region: 0.1135 loss_lmm_image: 0.7745 2024/11/13 18:24:28 - mmengine - INFO - Iter(train) [136100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:45:26 time: 2.0150 data_time: 0.0189 memory: 33778 grad_norm: 35.0648 loss: 8.5663 loss_cls: 0.2475 loss_bbox: 0.1251 loss_iou: 0.2308 d0.loss_cls: 0.2871 d0.loss_bbox: 0.1412 d0.loss_iou: 0.2481 d1.loss_cls: 0.2625 d1.loss_bbox: 0.1330 d1.loss_iou: 0.2394 d2.loss_cls: 0.2590 d2.loss_bbox: 0.1279 d2.loss_iou: 0.2343 d3.loss_cls: 0.2542 d3.loss_bbox: 0.1254 d3.loss_iou: 0.2303 d4.loss_cls: 0.2482 d4.loss_bbox: 0.1251 d4.loss_iou: 0.2322 enc_loss_cls: 0.2996 enc_loss_bbox: 0.1488 enc_loss_iou: 0.2600 dn_loss_cls: 0.1018 dn_loss_bbox: 0.1563 dn_loss_iou: 0.2112 d0.dn_loss_cls: 0.1838 d0.dn_loss_bbox: 0.2996 d0.dn_loss_iou: 0.3441 d1.dn_loss_cls: 0.1273 d1.dn_loss_bbox: 0.1856 d1.dn_loss_iou: 0.2389 d2.dn_loss_cls: 0.1102 d2.dn_loss_bbox: 0.1662 d2.dn_loss_iou: 0.2201 d3.dn_loss_cls: 0.1078 d3.dn_loss_bbox: 0.1595 d3.dn_loss_iou: 0.2143 d4.dn_loss_cls: 0.1053 d4.dn_loss_bbox: 0.1564 d4.dn_loss_iou: 0.2112 d1.loss_lmm_region: 0.1043 loss_lmm_image: 0.7027 2024/11/13 18:27:48 - mmengine - INFO - Iter(train) [136200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:42:05 time: 2.0014 data_time: 0.0188 memory: 34960 grad_norm: 26.7924 loss: 7.4701 loss_cls: 0.2098 loss_bbox: 0.1170 loss_iou: 0.2093 d0.loss_cls: 0.2490 d0.loss_bbox: 0.1270 d0.loss_iou: 0.2184 d1.loss_cls: 0.2268 d1.loss_bbox: 0.1195 d1.loss_iou: 0.2148 d2.loss_cls: 0.2188 d2.loss_bbox: 0.1147 d2.loss_iou: 0.2112 d3.loss_cls: 0.2125 d3.loss_bbox: 0.1151 d3.loss_iou: 0.2085 d4.loss_cls: 0.2112 d4.loss_bbox: 0.1158 d4.loss_iou: 0.2093 enc_loss_cls: 0.2603 enc_loss_bbox: 0.1348 enc_loss_iou: 0.2356 dn_loss_cls: 0.0618 dn_loss_bbox: 0.1493 dn_loss_iou: 0.1822 d0.dn_loss_cls: 0.1358 d0.dn_loss_bbox: 0.2636 d0.dn_loss_iou: 0.2972 d1.dn_loss_cls: 0.0843 d1.dn_loss_bbox: 0.1754 
d1.dn_loss_iou: 0.2063 d2.dn_loss_cls: 0.0708 d2.dn_loss_bbox: 0.1585 d2.dn_loss_iou: 0.1897 d3.dn_loss_cls: 0.0650 d3.dn_loss_bbox: 0.1505 d3.dn_loss_iou: 0.1836 d4.dn_loss_cls: 0.0621 d4.dn_loss_bbox: 0.1493 d4.dn_loss_iou: 0.1823 d1.loss_lmm_region: 0.0812 loss_lmm_image: 0.6817 2024/11/13 18:31:09 - mmengine - INFO - Iter(train) [136300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:38:44 time: 1.9981 data_time: 0.0187 memory: 33856 grad_norm: 29.8438 loss: 8.8429 loss_cls: 0.2711 loss_bbox: 0.1357 loss_iou: 0.2556 d0.loss_cls: 0.3072 d0.loss_bbox: 0.1497 d0.loss_iou: 0.2673 d1.loss_cls: 0.2889 d1.loss_bbox: 0.1410 d1.loss_iou: 0.2558 d2.loss_cls: 0.2846 d2.loss_bbox: 0.1365 d2.loss_iou: 0.2517 d3.loss_cls: 0.2741 d3.loss_bbox: 0.1357 d3.loss_iou: 0.2535 d4.loss_cls: 0.2705 d4.loss_bbox: 0.1366 d4.loss_iou: 0.2548 enc_loss_cls: 0.3103 enc_loss_bbox: 0.1567 enc_loss_iou: 0.2820 dn_loss_cls: 0.0836 dn_loss_bbox: 0.1575 dn_loss_iou: 0.2183 d0.dn_loss_cls: 0.1511 d0.dn_loss_bbox: 0.2875 d0.dn_loss_iou: 0.3478 d1.dn_loss_cls: 0.1054 d1.dn_loss_bbox: 0.1862 d1.dn_loss_iou: 0.2461 d2.dn_loss_cls: 0.0946 d2.dn_loss_bbox: 0.1647 d2.dn_loss_iou: 0.2262 d3.dn_loss_cls: 0.0898 d3.dn_loss_bbox: 0.1579 d3.dn_loss_iou: 0.2197 d4.dn_loss_cls: 0.0833 d4.dn_loss_bbox: 0.1575 d4.dn_loss_iou: 0.2183 d1.loss_lmm_region: 0.1130 loss_lmm_image: 0.7153 2024/11/13 18:34:26 - mmengine - INFO - Iter(train) [136400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:35:23 time: 1.9712 data_time: 0.0187 memory: 33917 grad_norm: 36.7464 loss: 9.3855 loss_cls: 0.3002 loss_bbox: 0.1592 loss_iou: 0.2736 d0.loss_cls: 0.3394 d0.loss_bbox: 0.1748 d0.loss_iou: 0.2881 d1.loss_cls: 0.3212 d1.loss_bbox: 0.1656 d1.loss_iou: 0.2782 d2.loss_cls: 0.3149 d2.loss_bbox: 0.1610 d2.loss_iou: 0.2743 d3.loss_cls: 0.3062 d3.loss_bbox: 0.1609 d3.loss_iou: 0.2732 d4.loss_cls: 0.3028 d4.loss_bbox: 0.1588 d4.loss_iou: 0.2726 enc_loss_cls: 0.3501 enc_loss_bbox: 0.1865 enc_loss_iou: 0.3092 dn_loss_cls: 0.0931 dn_loss_bbox: 0.1636 dn_loss_iou: 0.2038 d0.dn_loss_cls: 0.1598 d0.dn_loss_bbox: 0.3066 d0.dn_loss_iou: 0.3317 d1.dn_loss_cls: 0.1145 d1.dn_loss_bbox: 0.1883 d1.dn_loss_iou: 0.2265 d2.dn_loss_cls: 0.1007 d2.dn_loss_bbox: 0.1705 d2.dn_loss_iou: 0.2110 d3.dn_loss_cls: 0.0954 d3.dn_loss_bbox: 0.1655 d3.dn_loss_iou: 0.2055 d4.dn_loss_cls: 0.0909 d4.dn_loss_bbox: 0.1637 d4.dn_loss_iou: 0.2040 d1.loss_lmm_region: 0.1045 loss_lmm_image: 0.7152 2024/11/13 18:37:45 - mmengine - INFO - Iter(train) [136500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:32:01 time: 1.9954 data_time: 0.0188 memory: 32580 grad_norm: 27.7746 loss: 9.4900 loss_cls: 0.2859 loss_bbox: 0.1571 loss_iou: 0.2895 d0.loss_cls: 0.3311 d0.loss_bbox: 0.1635 d0.loss_iou: 0.3006 d1.loss_cls: 0.3063 d1.loss_bbox: 0.1593 d1.loss_iou: 0.2920 d2.loss_cls: 0.2996 d2.loss_bbox: 0.1562 d2.loss_iou: 0.2888 d3.loss_cls: 0.2936 d3.loss_bbox: 0.1551 d3.loss_iou: 0.2893 d4.loss_cls: 0.2866 d4.loss_bbox: 0.1554 d4.loss_iou: 0.2914 enc_loss_cls: 0.3358 enc_loss_bbox: 0.1767 enc_loss_iou: 0.3139 dn_loss_cls: 0.0831 dn_loss_bbox: 0.1618 dn_loss_iou: 0.2256 d0.dn_loss_cls: 0.1626 d0.dn_loss_bbox: 0.3005 d0.dn_loss_iou: 0.3570 d1.dn_loss_cls: 0.1107 d1.dn_loss_bbox: 0.1899 d1.dn_loss_iou: 0.2503 d2.dn_loss_cls: 0.0927 d2.dn_loss_bbox: 0.1710 d2.dn_loss_iou: 0.2337 d3.dn_loss_cls: 0.0864 d3.dn_loss_bbox: 0.1647 d3.dn_loss_iou: 0.2276 d4.dn_loss_cls: 0.0840 d4.dn_loss_bbox: 0.1620 d4.dn_loss_iou: 0.2257 d1.loss_lmm_region: 0.1151 loss_lmm_image: 0.7581 2024/11/13 18:41:05 - mmengine - INFO - 
Iter(train) [136600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:28:41 time: 2.0343 data_time: 0.0188 memory: 34952 grad_norm: 29.9116 loss: 8.3032 loss_cls: 0.2376 loss_bbox: 0.1265 loss_iou: 0.2226 d0.loss_cls: 0.2811 d0.loss_bbox: 0.1317 d0.loss_iou: 0.2292 d1.loss_cls: 0.2629 d1.loss_bbox: 0.1249 d1.loss_iou: 0.2220 d2.loss_cls: 0.2585 d2.loss_bbox: 0.1214 d2.loss_iou: 0.2191 d3.loss_cls: 0.2502 d3.loss_bbox: 0.1238 d3.loss_iou: 0.2185 d4.loss_cls: 0.2399 d4.loss_bbox: 0.1259 d4.loss_iou: 0.2196 enc_loss_cls: 0.2889 enc_loss_bbox: 0.1419 enc_loss_iou: 0.2430 dn_loss_cls: 0.1087 dn_loss_bbox: 0.1555 dn_loss_iou: 0.1978 d0.dn_loss_cls: 0.1787 d0.dn_loss_bbox: 0.2952 d0.dn_loss_iou: 0.3237 d1.dn_loss_cls: 0.1326 d1.dn_loss_bbox: 0.1851 d1.dn_loss_iou: 0.2211 d2.dn_loss_cls: 0.1142 d2.dn_loss_bbox: 0.1651 d2.dn_loss_iou: 0.2053 d3.dn_loss_cls: 0.1112 d3.dn_loss_bbox: 0.1579 d3.dn_loss_iou: 0.1999 d4.dn_loss_cls: 0.1073 d4.dn_loss_bbox: 0.1556 d4.dn_loss_iou: 0.1978 d1.loss_lmm_region: 0.0997 loss_lmm_image: 0.7016 2024/11/13 18:44:26 - mmengine - INFO - Iter(train) [136700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:25:20 time: 2.0021 data_time: 0.0187 memory: 35005 grad_norm: 28.3727 loss: 7.7360 loss_cls: 0.2405 loss_bbox: 0.1105 loss_iou: 0.1958 d0.loss_cls: 0.2857 d0.loss_bbox: 0.1229 d0.loss_iou: 0.2158 d1.loss_cls: 0.2637 d1.loss_bbox: 0.1139 d1.loss_iou: 0.2035 d2.loss_cls: 0.2528 d2.loss_bbox: 0.1129 d2.loss_iou: 0.1990 d3.loss_cls: 0.2441 d3.loss_bbox: 0.1106 d3.loss_iou: 0.1971 d4.loss_cls: 0.2423 d4.loss_bbox: 0.1109 d4.loss_iou: 0.1964 enc_loss_cls: 0.2957 enc_loss_bbox: 0.1278 enc_loss_iou: 0.2235 dn_loss_cls: 0.0736 dn_loss_bbox: 0.1426 dn_loss_iou: 0.1813 d0.dn_loss_cls: 0.1558 d0.dn_loss_bbox: 0.2776 d0.dn_loss_iou: 0.3048 d1.dn_loss_cls: 0.1052 d1.dn_loss_bbox: 0.1723 d1.dn_loss_iou: 0.2076 d2.dn_loss_cls: 0.0881 d2.dn_loss_bbox: 0.1517 d2.dn_loss_iou: 0.1897 d3.dn_loss_cls: 0.0790 d3.dn_loss_bbox: 0.1445 d3.dn_loss_iou: 0.1835 d4.dn_loss_cls: 0.0745 d4.dn_loss_bbox: 0.1426 d4.dn_loss_iou: 0.1813 d1.loss_lmm_region: 0.0918 loss_lmm_image: 0.7232 2024/11/13 18:47:47 - mmengine - INFO - Iter(train) [136800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:21:59 time: 1.9837 data_time: 0.0188 memory: 33493 grad_norm: 30.0189 loss: 8.6897 loss_cls: 0.2760 loss_bbox: 0.1135 loss_iou: 0.2288 d0.loss_cls: 0.3119 d0.loss_bbox: 0.1251 d0.loss_iou: 0.2411 d1.loss_cls: 0.2894 d1.loss_bbox: 0.1185 d1.loss_iou: 0.2362 d2.loss_cls: 0.2816 d2.loss_bbox: 0.1147 d2.loss_iou: 0.2303 d3.loss_cls: 0.2783 d3.loss_bbox: 0.1158 d3.loss_iou: 0.2283 d4.loss_cls: 0.2785 d4.loss_bbox: 0.1132 d4.loss_iou: 0.2277 enc_loss_cls: 0.3165 enc_loss_bbox: 0.1373 enc_loss_iou: 0.2556 dn_loss_cls: 0.1175 dn_loss_bbox: 0.1600 dn_loss_iou: 0.2048 d0.dn_loss_cls: 0.1958 d0.dn_loss_bbox: 0.2805 d0.dn_loss_iou: 0.3340 d1.dn_loss_cls: 0.1467 d1.dn_loss_bbox: 0.1822 d1.dn_loss_iou: 0.2289 d2.dn_loss_cls: 0.1276 d2.dn_loss_bbox: 0.1675 d2.dn_loss_iou: 0.2123 d3.dn_loss_cls: 0.1213 d3.dn_loss_bbox: 0.1614 d3.dn_loss_iou: 0.2068 d4.dn_loss_cls: 0.1177 d4.dn_loss_bbox: 0.1600 d4.dn_loss_iou: 0.2048 d1.loss_lmm_region: 0.0994 loss_lmm_image: 0.7422 2024/11/13 18:51:07 - mmengine - INFO - Iter(train) [136900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:18:38 time: 2.0149 data_time: 0.0188 memory: 33572 grad_norm: 24.1916 loss: 8.3234 loss_cls: 0.2501 loss_bbox: 0.1370 loss_iou: 0.2478 d0.loss_cls: 0.2933 d0.loss_bbox: 0.1456 d0.loss_iou: 0.2589 d1.loss_cls: 0.2730 d1.loss_bbox: 0.1368 d1.loss_iou: 0.2493 
d2.loss_cls: 0.2599 d2.loss_bbox: 0.1359 d2.loss_iou: 0.2471 d3.loss_cls: 0.2546 d3.loss_bbox: 0.1364 d3.loss_iou: 0.2470 d4.loss_cls: 0.2507 d4.loss_bbox: 0.1380 d4.loss_iou: 0.2484 enc_loss_cls: 0.2933 enc_loss_bbox: 0.1586 enc_loss_iou: 0.2725 dn_loss_cls: 0.0722 dn_loss_bbox: 0.1478 dn_loss_iou: 0.1952 d0.dn_loss_cls: 0.1402 d0.dn_loss_bbox: 0.2650 d0.dn_loss_iou: 0.3133 d1.dn_loss_cls: 0.0990 d1.dn_loss_bbox: 0.1728 d1.dn_loss_iou: 0.2195 d2.dn_loss_cls: 0.0826 d2.dn_loss_bbox: 0.1580 d2.dn_loss_iou: 0.2035 d3.dn_loss_cls: 0.0769 d3.dn_loss_bbox: 0.1492 d3.dn_loss_iou: 0.1972 d4.dn_loss_cls: 0.0735 d4.dn_loss_bbox: 0.1478 d4.dn_loss_iou: 0.1954 d1.loss_lmm_region: 0.0791 loss_lmm_image: 0.7007 2024/11/13 18:54:27 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 18:54:27 - mmengine - INFO - Iter(train) [137000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:15:17 time: 1.9851 data_time: 0.0187 memory: 35296 grad_norm: 25.5666 loss: 9.8567 loss_cls: 0.3104 loss_bbox: 0.1666 loss_iou: 0.3000 d0.loss_cls: 0.3564 d0.loss_bbox: 0.1727 d0.loss_iou: 0.3098 d1.loss_cls: 0.3268 d1.loss_bbox: 0.1719 d1.loss_iou: 0.3058 d2.loss_cls: 0.3206 d2.loss_bbox: 0.1656 d2.loss_iou: 0.2992 d3.loss_cls: 0.3148 d3.loss_bbox: 0.1666 d3.loss_iou: 0.3002 d4.loss_cls: 0.3124 d4.loss_bbox: 0.1665 d4.loss_iou: 0.2978 enc_loss_cls: 0.3590 enc_loss_bbox: 0.1883 enc_loss_iou: 0.3278 dn_loss_cls: 0.0887 dn_loss_bbox: 0.1621 dn_loss_iou: 0.2322 d0.dn_loss_cls: 0.1779 d0.dn_loss_bbox: 0.3137 d0.dn_loss_iou: 0.3795 d1.dn_loss_cls: 0.1221 d1.dn_loss_bbox: 0.1946 d1.dn_loss_iou: 0.2614 d2.dn_loss_cls: 0.1015 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2413 d3.dn_loss_cls: 0.0919 d3.dn_loss_bbox: 0.1645 d3.dn_loss_iou: 0.2344 d4.dn_loss_cls: 0.0895 d4.dn_loss_bbox: 0.1621 d4.dn_loss_iou: 0.2323 d1.loss_lmm_region: 0.1000 loss_lmm_image: 0.6957 2024/11/13 18:57:47 - mmengine - INFO - Iter(train) [137100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:11:56 time: 2.0106 data_time: 0.0189 memory: 33972 grad_norm: 23.2586 loss: 8.4767 loss_cls: 0.2583 loss_bbox: 0.1401 loss_iou: 0.2583 d0.loss_cls: 0.2979 d0.loss_bbox: 0.1567 d0.loss_iou: 0.2730 d1.loss_cls: 0.2751 d1.loss_bbox: 0.1493 d1.loss_iou: 0.2663 d2.loss_cls: 0.2699 d2.loss_bbox: 0.1410 d2.loss_iou: 0.2601 d3.loss_cls: 0.2611 d3.loss_bbox: 0.1421 d3.loss_iou: 0.2603 d4.loss_cls: 0.2601 d4.loss_bbox: 0.1407 d4.loss_iou: 0.2594 enc_loss_cls: 0.3070 enc_loss_bbox: 0.1625 enc_loss_iou: 0.2840 dn_loss_cls: 0.0688 dn_loss_bbox: 0.1416 dn_loss_iou: 0.1951 d0.dn_loss_cls: 0.1418 d0.dn_loss_bbox: 0.2721 d0.dn_loss_iou: 0.3189 d1.dn_loss_cls: 0.0960 d1.dn_loss_bbox: 0.1721 d1.dn_loss_iou: 0.2219 d2.dn_loss_cls: 0.0779 d2.dn_loss_bbox: 0.1530 d2.dn_loss_iou: 0.2046 d3.dn_loss_cls: 0.0723 d3.dn_loss_bbox: 0.1437 d3.dn_loss_iou: 0.1978 d4.dn_loss_cls: 0.0694 d4.dn_loss_bbox: 0.1417 d4.dn_loss_iou: 0.1952 d1.loss_lmm_region: 0.0954 loss_lmm_image: 0.6743 2024/11/13 19:01:08 - mmengine - INFO - Iter(train) [137200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:08:35 time: 2.0176 data_time: 0.0187 memory: 34150 grad_norm: 24.1825 loss: 8.4782 loss_cls: 0.2828 loss_bbox: 0.1197 loss_iou: 0.2480 d0.loss_cls: 0.3243 d0.loss_bbox: 0.1351 d0.loss_iou: 0.2637 d1.loss_cls: 0.3002 d1.loss_bbox: 0.1281 d1.loss_iou: 0.2564 d2.loss_cls: 0.2902 d2.loss_bbox: 0.1225 d2.loss_iou: 0.2492 d3.loss_cls: 0.2870 d3.loss_bbox: 0.1200 d3.loss_iou: 0.2477 d4.loss_cls: 0.2825 d4.loss_bbox: 0.1219 d4.loss_iou: 0.2490 enc_loss_cls: 0.3211 enc_loss_bbox: 0.1521 enc_loss_iou: 
0.2865 dn_loss_cls: 0.0582 dn_loss_bbox: 0.1482 dn_loss_iou: 0.1950 d0.dn_loss_cls: 0.1353 d0.dn_loss_bbox: 0.2663 d0.dn_loss_iou: 0.3158 d1.dn_loss_cls: 0.0839 d1.dn_loss_bbox: 0.1728 d1.dn_loss_iou: 0.2182 d2.dn_loss_cls: 0.0665 d2.dn_loss_bbox: 0.1565 d2.dn_loss_iou: 0.2033 d3.dn_loss_cls: 0.0617 d3.dn_loss_bbox: 0.1497 d3.dn_loss_iou: 0.1970 d4.dn_loss_cls: 0.0582 d4.dn_loss_bbox: 0.1482 d4.dn_loss_iou: 0.1951 d1.loss_lmm_region: 0.0887 loss_lmm_image: 0.7719 2024/11/13 19:04:30 - mmengine - INFO - Iter(train) [137300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:05:14 time: 2.0253 data_time: 0.0187 memory: 34873 grad_norm: 24.7851 loss: 9.4193 loss_cls: 0.3137 loss_bbox: 0.1531 loss_iou: 0.2781 d0.loss_cls: 0.3451 d0.loss_bbox: 0.1790 d0.loss_iou: 0.2958 d1.loss_cls: 0.3339 d1.loss_bbox: 0.1501 d1.loss_iou: 0.2842 d2.loss_cls: 0.3248 d2.loss_bbox: 0.1485 d2.loss_iou: 0.2822 d3.loss_cls: 0.3197 d3.loss_bbox: 0.1460 d3.loss_iou: 0.2788 d4.loss_cls: 0.3178 d4.loss_bbox: 0.1439 d4.loss_iou: 0.2762 enc_loss_cls: 0.3557 enc_loss_bbox: 0.1790 enc_loss_iou: 0.3105 dn_loss_cls: 0.0857 dn_loss_bbox: 0.1529 dn_loss_iou: 0.2081 d0.dn_loss_cls: 0.1712 d0.dn_loss_bbox: 0.2931 d0.dn_loss_iou: 0.3402 d1.dn_loss_cls: 0.1250 d1.dn_loss_bbox: 0.1869 d1.dn_loss_iou: 0.2368 d2.dn_loss_cls: 0.0998 d2.dn_loss_bbox: 0.1635 d2.dn_loss_iou: 0.2172 d3.dn_loss_cls: 0.0916 d3.dn_loss_bbox: 0.1540 d3.dn_loss_iou: 0.2106 d4.dn_loss_cls: 0.0855 d4.dn_loss_bbox: 0.1529 d4.dn_loss_iou: 0.2082 d1.loss_lmm_region: 0.1099 loss_lmm_image: 0.7104 2024/11/13 19:07:48 - mmengine - INFO - Iter(train) [137400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 7:01:53 time: 1.9717 data_time: 0.0187 memory: 34296 grad_norm: 22.7515 loss: 8.4988 loss_cls: 0.3084 loss_bbox: 0.1175 loss_iou: 0.2361 d0.loss_cls: 0.3463 d0.loss_bbox: 0.1260 d0.loss_iou: 0.2471 d1.loss_cls: 0.3196 d1.loss_bbox: 0.1237 d1.loss_iou: 0.2427 d2.loss_cls: 0.3113 d2.loss_bbox: 0.1219 d2.loss_iou: 0.2402 d3.loss_cls: 0.3117 d3.loss_bbox: 0.1205 d3.loss_iou: 0.2371 d4.loss_cls: 0.3119 d4.loss_bbox: 0.1179 d4.loss_iou: 0.2372 enc_loss_cls: 0.3517 enc_loss_bbox: 0.1372 enc_loss_iou: 0.2626 dn_loss_cls: 0.0617 dn_loss_bbox: 0.1355 dn_loss_iou: 0.2008 d0.dn_loss_cls: 0.1499 d0.dn_loss_bbox: 0.2609 d0.dn_loss_iou: 0.3287 d1.dn_loss_cls: 0.0948 d1.dn_loss_bbox: 0.1673 d1.dn_loss_iou: 0.2282 d2.dn_loss_cls: 0.0748 d2.dn_loss_bbox: 0.1441 d2.dn_loss_iou: 0.2089 d3.dn_loss_cls: 0.0665 d3.dn_loss_bbox: 0.1381 d3.dn_loss_iou: 0.2031 d4.dn_loss_cls: 0.0625 d4.dn_loss_bbox: 0.1356 d4.dn_loss_iou: 0.2009 d1.loss_lmm_region: 0.0941 loss_lmm_image: 0.7137 2024/11/13 19:11:07 - mmengine - INFO - Iter(train) [137500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:58:32 time: 1.9849 data_time: 0.0186 memory: 33278 grad_norm: 26.6206 loss: 9.5970 loss_cls: 0.3026 loss_bbox: 0.1736 loss_iou: 0.2658 d0.loss_cls: 0.3524 d0.loss_bbox: 0.1790 d0.loss_iou: 0.2767 d1.loss_cls: 0.3236 d1.loss_bbox: 0.1746 d1.loss_iou: 0.2688 d2.loss_cls: 0.3106 d2.loss_bbox: 0.1728 d2.loss_iou: 0.2675 d3.loss_cls: 0.3045 d3.loss_bbox: 0.1731 d3.loss_iou: 0.2673 d4.loss_cls: 0.3037 d4.loss_bbox: 0.1710 d4.loss_iou: 0.2652 enc_loss_cls: 0.3560 enc_loss_bbox: 0.1878 enc_loss_iou: 0.2958 dn_loss_cls: 0.0828 dn_loss_bbox: 0.1867 dn_loss_iou: 0.2131 d0.dn_loss_cls: 0.1733 d0.dn_loss_bbox: 0.3371 d0.dn_loss_iou: 0.3544 d1.dn_loss_cls: 0.1158 d1.dn_loss_bbox: 0.2174 d1.dn_loss_iou: 0.2416 d2.dn_loss_cls: 0.0959 d2.dn_loss_bbox: 0.1964 d2.dn_loss_iou: 0.2221 d3.dn_loss_cls: 0.0882 d3.dn_loss_bbox: 0.1879 
d3.dn_loss_iou: 0.2146 d4.dn_loss_cls: 0.0837 d4.dn_loss_bbox: 0.1867 d4.dn_loss_iou: 0.2131 d1.loss_lmm_region: 0.0904 loss_lmm_image: 0.7033 2024/11/13 19:14:25 - mmengine - INFO - Iter(train) [137600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:55:11 time: 1.9840 data_time: 0.0188 memory: 34946 grad_norm: 27.6336 loss: 9.2798 loss_cls: 0.3369 loss_bbox: 0.1220 loss_iou: 0.2438 d0.loss_cls: 0.3772 d0.loss_bbox: 0.1456 d0.loss_iou: 0.2598 d1.loss_cls: 0.3523 d1.loss_bbox: 0.1327 d1.loss_iou: 0.2524 d2.loss_cls: 0.3441 d2.loss_bbox: 0.1261 d2.loss_iou: 0.2489 d3.loss_cls: 0.3420 d3.loss_bbox: 0.1206 d3.loss_iou: 0.2408 d4.loss_cls: 0.3378 d4.loss_bbox: 0.1206 d4.loss_iou: 0.2435 enc_loss_cls: 0.3823 enc_loss_bbox: 0.1510 enc_loss_iou: 0.2682 dn_loss_cls: 0.0969 dn_loss_bbox: 0.1496 dn_loss_iou: 0.2176 d0.dn_loss_cls: 0.1752 d0.dn_loss_bbox: 0.2926 d0.dn_loss_iou: 0.3507 d1.dn_loss_cls: 0.1249 d1.dn_loss_bbox: 0.1847 d1.dn_loss_iou: 0.2466 d2.dn_loss_cls: 0.1038 d2.dn_loss_bbox: 0.1608 d2.dn_loss_iou: 0.2255 d3.dn_loss_cls: 0.0997 d3.dn_loss_bbox: 0.1518 d3.dn_loss_iou: 0.2195 d4.dn_loss_cls: 0.0973 d4.dn_loss_bbox: 0.1496 d4.dn_loss_iou: 0.2176 d1.loss_lmm_region: 0.1038 loss_lmm_image: 0.7631 2024/11/13 19:17:46 - mmengine - INFO - Iter(train) [137700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:51:50 time: 1.9955 data_time: 0.0187 memory: 35302 grad_norm: 32.1447 loss: 9.4977 loss_cls: 0.2751 loss_bbox: 0.1644 loss_iou: 0.2554 d0.loss_cls: 0.3170 d0.loss_bbox: 0.1783 d0.loss_iou: 0.2690 d1.loss_cls: 0.2923 d1.loss_bbox: 0.1707 d1.loss_iou: 0.2617 d2.loss_cls: 0.2841 d2.loss_bbox: 0.1671 d2.loss_iou: 0.2581 d3.loss_cls: 0.2792 d3.loss_bbox: 0.1650 d3.loss_iou: 0.2551 d4.loss_cls: 0.2749 d4.loss_bbox: 0.1661 d4.loss_iou: 0.2568 enc_loss_cls: 0.3316 enc_loss_bbox: 0.1827 enc_loss_iou: 0.2791 dn_loss_cls: 0.0998 dn_loss_bbox: 0.2019 dn_loss_iou: 0.2227 d0.dn_loss_cls: 0.1787 d0.dn_loss_bbox: 0.3402 d0.dn_loss_iou: 0.3476 d1.dn_loss_cls: 0.1323 d1.dn_loss_bbox: 0.2321 d1.dn_loss_iou: 0.2484 d2.dn_loss_cls: 0.1142 d2.dn_loss_bbox: 0.2130 d2.dn_loss_iou: 0.2308 d3.dn_loss_cls: 0.1048 d3.dn_loss_bbox: 0.2045 d3.dn_loss_iou: 0.2248 d4.dn_loss_cls: 0.1010 d4.dn_loss_bbox: 0.2019 d4.dn_loss_iou: 0.2228 d1.loss_lmm_region: 0.0965 loss_lmm_image: 0.6956 2024/11/13 19:21:06 - mmengine - INFO - Iter(train) [137800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:48:29 time: 1.9977 data_time: 0.0188 memory: 33947 grad_norm: 27.8711 loss: 9.1614 loss_cls: 0.3041 loss_bbox: 0.1375 loss_iou: 0.2728 d0.loss_cls: 0.3355 d0.loss_bbox: 0.1508 d0.loss_iou: 0.2911 d1.loss_cls: 0.3196 d1.loss_bbox: 0.1413 d1.loss_iou: 0.2785 d2.loss_cls: 0.3104 d2.loss_bbox: 0.1402 d2.loss_iou: 0.2790 d3.loss_cls: 0.3056 d3.loss_bbox: 0.1394 d3.loss_iou: 0.2739 d4.loss_cls: 0.3019 d4.loss_bbox: 0.1397 d4.loss_iou: 0.2728 enc_loss_cls: 0.3423 enc_loss_bbox: 0.1592 enc_loss_iou: 0.3074 dn_loss_cls: 0.0938 dn_loss_bbox: 0.1507 dn_loss_iou: 0.2094 d0.dn_loss_cls: 0.1725 d0.dn_loss_bbox: 0.2826 d0.dn_loss_iou: 0.3363 d1.dn_loss_cls: 0.1234 d1.dn_loss_bbox: 0.1819 d1.dn_loss_iou: 0.2364 d2.dn_loss_cls: 0.1010 d2.dn_loss_bbox: 0.1600 d2.dn_loss_iou: 0.2175 d3.dn_loss_cls: 0.0949 d3.dn_loss_bbox: 0.1523 d3.dn_loss_iou: 0.2109 d4.dn_loss_cls: 0.0949 d4.dn_loss_bbox: 0.1506 d4.dn_loss_iou: 0.2094 d1.loss_lmm_region: 0.0955 loss_lmm_image: 0.6846 2024/11/13 19:24:26 - mmengine - INFO - Iter(train) [137900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:45:08 time: 1.9831 data_time: 0.0187 memory: 33852 grad_norm: 29.7562 loss: 
9.2812 loss_cls: 0.3005 loss_bbox: 0.1373 loss_iou: 0.2563 d0.loss_cls: 0.3424 d0.loss_bbox: 0.1461 d0.loss_iou: 0.2706 d1.loss_cls: 0.3252 d1.loss_bbox: 0.1375 d1.loss_iou: 0.2601 d2.loss_cls: 0.3080 d2.loss_bbox: 0.1380 d2.loss_iou: 0.2587 d3.loss_cls: 0.3042 d3.loss_bbox: 0.1380 d3.loss_iou: 0.2584 d4.loss_cls: 0.3030 d4.loss_bbox: 0.1370 d4.loss_iou: 0.2563 enc_loss_cls: 0.3465 enc_loss_bbox: 0.1586 enc_loss_iou: 0.2892 dn_loss_cls: 0.1050 dn_loss_bbox: 0.1788 dn_loss_iou: 0.2135 d0.dn_loss_cls: 0.1740 d0.dn_loss_bbox: 0.2905 d0.dn_loss_iou: 0.3342 d1.dn_loss_cls: 0.1304 d1.dn_loss_bbox: 0.2048 d1.dn_loss_iou: 0.2399 d2.dn_loss_cls: 0.1148 d2.dn_loss_bbox: 0.1844 d2.dn_loss_iou: 0.2210 d3.dn_loss_cls: 0.1093 d3.dn_loss_bbox: 0.1801 d3.dn_loss_iou: 0.2153 d4.dn_loss_cls: 0.1063 d4.dn_loss_bbox: 0.1789 d4.dn_loss_iou: 0.2135 d1.loss_lmm_region: 0.1021 loss_lmm_image: 0.7125 2024/11/13 19:27:45 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 19:27:45 - mmengine - INFO - Iter(train) [138000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:41:47 time: 1.9894 data_time: 0.0190 memory: 35062 grad_norm: 26.8583 loss: 9.9159 loss_cls: 0.3013 loss_bbox: 0.1754 loss_iou: 0.3138 d0.loss_cls: 0.3478 d0.loss_bbox: 0.1855 d0.loss_iou: 0.3290 d1.loss_cls: 0.3247 d1.loss_bbox: 0.1763 d1.loss_iou: 0.3164 d2.loss_cls: 0.3120 d2.loss_bbox: 0.1765 d2.loss_iou: 0.3183 d3.loss_cls: 0.3077 d3.loss_bbox: 0.1756 d3.loss_iou: 0.3132 d4.loss_cls: 0.3026 d4.loss_bbox: 0.1762 d4.loss_iou: 0.3146 enc_loss_cls: 0.3544 enc_loss_bbox: 0.1936 enc_loss_iou: 0.3426 dn_loss_cls: 0.0637 dn_loss_bbox: 0.1781 dn_loss_iou: 0.2397 d0.dn_loss_cls: 0.1465 d0.dn_loss_bbox: 0.3039 d0.dn_loss_iou: 0.3694 d1.dn_loss_cls: 0.0947 d1.dn_loss_bbox: 0.2078 d1.dn_loss_iou: 0.2669 d2.dn_loss_cls: 0.0748 d2.dn_loss_bbox: 0.1875 d2.dn_loss_iou: 0.2487 d3.dn_loss_cls: 0.0687 d3.dn_loss_bbox: 0.1815 d3.dn_loss_iou: 0.2420 d4.dn_loss_cls: 0.0648 d4.dn_loss_bbox: 0.1782 d4.dn_loss_iou: 0.2397 d1.loss_lmm_region: 0.1098 loss_lmm_image: 0.6921 2024/11/13 19:31:04 - mmengine - INFO - Iter(train) [138100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:38:26 time: 2.0115 data_time: 0.0188 memory: 32326 grad_norm: 28.4259 loss: 8.3709 loss_cls: 0.2727 loss_bbox: 0.1189 loss_iou: 0.2283 d0.loss_cls: 0.3140 d0.loss_bbox: 0.1265 d0.loss_iou: 0.2398 d1.loss_cls: 0.2920 d1.loss_bbox: 0.1211 d1.loss_iou: 0.2320 d2.loss_cls: 0.2848 d2.loss_bbox: 0.1204 d2.loss_iou: 0.2295 d3.loss_cls: 0.2770 d3.loss_bbox: 0.1168 d3.loss_iou: 0.2276 d4.loss_cls: 0.2735 d4.loss_bbox: 0.1187 d4.loss_iou: 0.2282 enc_loss_cls: 0.3235 enc_loss_bbox: 0.1353 enc_loss_iou: 0.2544 dn_loss_cls: 0.0813 dn_loss_bbox: 0.1489 dn_loss_iou: 0.1980 d0.dn_loss_cls: 0.1505 d0.dn_loss_bbox: 0.2701 d0.dn_loss_iou: 0.3226 d1.dn_loss_cls: 0.1039 d1.dn_loss_bbox: 0.1742 d1.dn_loss_iou: 0.2231 d2.dn_loss_cls: 0.0902 d2.dn_loss_bbox: 0.1569 d2.dn_loss_iou: 0.2054 d3.dn_loss_cls: 0.0863 d3.dn_loss_bbox: 0.1502 d3.dn_loss_iou: 0.1996 d4.dn_loss_cls: 0.0815 d4.dn_loss_bbox: 0.1490 d4.dn_loss_iou: 0.1981 d1.loss_lmm_region: 0.0954 loss_lmm_image: 0.7506 2024/11/13 19:34:23 - mmengine - INFO - Iter(train) [138200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:35:05 time: 1.9814 data_time: 0.0189 memory: 34838 grad_norm: 27.9176 loss: 8.3350 loss_cls: 0.2749 loss_bbox: 0.1188 loss_iou: 0.2357 d0.loss_cls: 0.3045 d0.loss_bbox: 0.1308 d0.loss_iou: 0.2466 d1.loss_cls: 0.2808 d1.loss_bbox: 0.1292 d1.loss_iou: 0.2451 d2.loss_cls: 0.2758 d2.loss_bbox: 0.1213 d2.loss_iou: 0.2392 
d3.loss_cls: 0.2744 d3.loss_bbox: 0.1177 d3.loss_iou: 0.2363 d4.loss_cls: 0.2736 d4.loss_bbox: 0.1198 d4.loss_iou: 0.2375 enc_loss_cls: 0.3106 enc_loss_bbox: 0.1426 enc_loss_iou: 0.2692 dn_loss_cls: 0.0812 dn_loss_bbox: 0.1388 dn_loss_iou: 0.1964 d0.dn_loss_cls: 0.1475 d0.dn_loss_bbox: 0.2802 d0.dn_loss_iou: 0.3315 d1.dn_loss_cls: 0.1008 d1.dn_loss_bbox: 0.1670 d1.dn_loss_iou: 0.2237 d2.dn_loss_cls: 0.0875 d2.dn_loss_bbox: 0.1457 d2.dn_loss_iou: 0.2039 d3.dn_loss_cls: 0.0827 d3.dn_loss_bbox: 0.1410 d3.dn_loss_iou: 0.1986 d4.dn_loss_cls: 0.0823 d4.dn_loss_bbox: 0.1388 d4.dn_loss_iou: 0.1964 d1.loss_lmm_region: 0.0963 loss_lmm_image: 0.7104 2024/11/13 19:37:40 - mmengine - INFO - Iter(train) [138300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:31:43 time: 1.9559 data_time: 0.0189 memory: 33635 grad_norm: 22.0713 loss: 8.8815 loss_cls: 0.2633 loss_bbox: 0.1463 loss_iou: 0.2726 d0.loss_cls: 0.3011 d0.loss_bbox: 0.1564 d0.loss_iou: 0.2894 d1.loss_cls: 0.2756 d1.loss_bbox: 0.1531 d1.loss_iou: 0.2856 d2.loss_cls: 0.2717 d2.loss_bbox: 0.1485 d2.loss_iou: 0.2775 d3.loss_cls: 0.2699 d3.loss_bbox: 0.1447 d3.loss_iou: 0.2746 d4.loss_cls: 0.2653 d4.loss_bbox: 0.1463 d4.loss_iou: 0.2724 enc_loss_cls: 0.3096 enc_loss_bbox: 0.1697 enc_loss_iou: 0.3020 dn_loss_cls: 0.0741 dn_loss_bbox: 0.1474 dn_loss_iou: 0.2142 d0.dn_loss_cls: 0.1596 d0.dn_loss_bbox: 0.2726 d0.dn_loss_iou: 0.3450 d1.dn_loss_cls: 0.1058 d1.dn_loss_bbox: 0.1800 d1.dn_loss_iou: 0.2416 d2.dn_loss_cls: 0.0854 d2.dn_loss_bbox: 0.1579 d2.dn_loss_iou: 0.2230 d3.dn_loss_cls: 0.0789 d3.dn_loss_bbox: 0.1496 d3.dn_loss_iou: 0.2162 d4.dn_loss_cls: 0.0744 d4.dn_loss_bbox: 0.1473 d4.dn_loss_iou: 0.2142 d1.loss_lmm_region: 0.1099 loss_lmm_image: 0.6885 2024/11/13 19:41:00 - mmengine - INFO - Iter(train) [138400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:28:22 time: 1.9809 data_time: 0.0188 memory: 33402 grad_norm: 32.3561 loss: 7.9577 loss_cls: 0.2382 loss_bbox: 0.1201 loss_iou: 0.2238 d0.loss_cls: 0.2691 d0.loss_bbox: 0.1277 d0.loss_iou: 0.2280 d1.loss_cls: 0.2561 d1.loss_bbox: 0.1236 d1.loss_iou: 0.2231 d2.loss_cls: 0.2487 d2.loss_bbox: 0.1179 d2.loss_iou: 0.2203 d3.loss_cls: 0.2432 d3.loss_bbox: 0.1193 d3.loss_iou: 0.2216 d4.loss_cls: 0.2403 d4.loss_bbox: 0.1194 d4.loss_iou: 0.2218 enc_loss_cls: 0.2774 enc_loss_bbox: 0.1365 enc_loss_iou: 0.2417 dn_loss_cls: 0.0754 dn_loss_bbox: 0.1445 dn_loss_iou: 0.1990 d0.dn_loss_cls: 0.1547 d0.dn_loss_bbox: 0.2761 d0.dn_loss_iou: 0.3186 d1.dn_loss_cls: 0.1096 d1.dn_loss_bbox: 0.1679 d1.dn_loss_iou: 0.2219 d2.dn_loss_cls: 0.0898 d2.dn_loss_bbox: 0.1511 d2.dn_loss_iou: 0.2041 d3.dn_loss_cls: 0.0819 d3.dn_loss_bbox: 0.1456 d3.dn_loss_iou: 0.2001 d4.dn_loss_cls: 0.0757 d4.dn_loss_bbox: 0.1444 d4.dn_loss_iou: 0.1989 d1.loss_lmm_region: 0.1150 loss_lmm_image: 0.6657 2024/11/13 19:44:21 - mmengine - INFO - Iter(train) [138500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:25:02 time: 2.0159 data_time: 0.0188 memory: 32670 grad_norm: 35.6806 loss: 7.5434 loss_cls: 0.2308 loss_bbox: 0.1087 loss_iou: 0.2096 d0.loss_cls: 0.2661 d0.loss_bbox: 0.1184 d0.loss_iou: 0.2181 d1.loss_cls: 0.2393 d1.loss_bbox: 0.1172 d1.loss_iou: 0.2195 d2.loss_cls: 0.2331 d2.loss_bbox: 0.1130 d2.loss_iou: 0.2129 d3.loss_cls: 0.2306 d3.loss_bbox: 0.1119 d3.loss_iou: 0.2139 d4.loss_cls: 0.2278 d4.loss_bbox: 0.1125 d4.loss_iou: 0.2127 enc_loss_cls: 0.2785 enc_loss_bbox: 0.1288 enc_loss_iou: 0.2339 dn_loss_cls: 0.0660 dn_loss_bbox: 0.1275 dn_loss_iou: 0.1798 d0.dn_loss_cls: 0.1421 d0.dn_loss_bbox: 0.2645 d0.dn_loss_iou: 0.3034 
d1.dn_loss_cls: 0.0946 d1.dn_loss_bbox: 0.1569 d1.dn_loss_iou: 0.2073 d2.dn_loss_cls: 0.0768 d2.dn_loss_bbox: 0.1367 d2.dn_loss_iou: 0.1886 d3.dn_loss_cls: 0.0705 d3.dn_loss_bbox: 0.1297 d3.dn_loss_iou: 0.1821 d4.dn_loss_cls: 0.0673 d4.dn_loss_bbox: 0.1276 d4.dn_loss_iou: 0.1797 d1.loss_lmm_region: 0.0926 loss_lmm_image: 0.7126 2024/11/13 19:47:40 - mmengine - INFO - Iter(train) [138600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:21:40 time: 1.9877 data_time: 0.0189 memory: 34624 grad_norm: 24.5888 loss: 8.7466 loss_cls: 0.2611 loss_bbox: 0.1476 loss_iou: 0.2561 d0.loss_cls: 0.2960 d0.loss_bbox: 0.1529 d0.loss_iou: 0.2652 d1.loss_cls: 0.2768 d1.loss_bbox: 0.1523 d1.loss_iou: 0.2564 d2.loss_cls: 0.2717 d2.loss_bbox: 0.1477 d2.loss_iou: 0.2571 d3.loss_cls: 0.2641 d3.loss_bbox: 0.1472 d3.loss_iou: 0.2535 d4.loss_cls: 0.2579 d4.loss_bbox: 0.1489 d4.loss_iou: 0.2578 enc_loss_cls: 0.3087 enc_loss_bbox: 0.1624 enc_loss_iou: 0.2799 dn_loss_cls: 0.0780 dn_loss_bbox: 0.1559 dn_loss_iou: 0.2093 d0.dn_loss_cls: 0.1523 d0.dn_loss_bbox: 0.3012 d0.dn_loss_iou: 0.3423 d1.dn_loss_cls: 0.1037 d1.dn_loss_bbox: 0.1827 d1.dn_loss_iou: 0.2351 d2.dn_loss_cls: 0.0874 d2.dn_loss_bbox: 0.1647 d2.dn_loss_iou: 0.2172 d3.dn_loss_cls: 0.0831 d3.dn_loss_bbox: 0.1577 d3.dn_loss_iou: 0.2111 d4.dn_loss_cls: 0.0784 d4.dn_loss_bbox: 0.1560 d4.dn_loss_iou: 0.2092 d1.loss_lmm_region: 0.1024 loss_lmm_image: 0.6975 2024/11/13 19:50:58 - mmengine - INFO - Iter(train) [138700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:18:19 time: 1.9971 data_time: 0.0187 memory: 32623 grad_norm: 23.7750 loss: 8.2643 loss_cls: 0.2566 loss_bbox: 0.1286 loss_iou: 0.2328 d0.loss_cls: 0.3006 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2512 d1.loss_cls: 0.2734 d1.loss_bbox: 0.1398 d1.loss_iou: 0.2401 d2.loss_cls: 0.2626 d2.loss_bbox: 0.1386 d2.loss_iou: 0.2377 d3.loss_cls: 0.2619 d3.loss_bbox: 0.1296 d3.loss_iou: 0.2326 d4.loss_cls: 0.2568 d4.loss_bbox: 0.1305 d4.loss_iou: 0.2325 enc_loss_cls: 0.3152 enc_loss_bbox: 0.1454 enc_loss_iou: 0.2616 dn_loss_cls: 0.0764 dn_loss_bbox: 0.1380 dn_loss_iou: 0.1904 d0.dn_loss_cls: 0.1517 d0.dn_loss_bbox: 0.2584 d0.dn_loss_iou: 0.3071 d1.dn_loss_cls: 0.1044 d1.dn_loss_bbox: 0.1640 d1.dn_loss_iou: 0.2148 d2.dn_loss_cls: 0.0868 d2.dn_loss_bbox: 0.1458 d2.dn_loss_iou: 0.1969 d3.dn_loss_cls: 0.0799 d3.dn_loss_bbox: 0.1394 d3.dn_loss_iou: 0.1921 d4.dn_loss_cls: 0.0765 d4.dn_loss_bbox: 0.1380 d4.dn_loss_iou: 0.1904 d1.loss_lmm_region: 0.0897 loss_lmm_image: 0.7496 2024/11/13 19:54:19 - mmengine - INFO - Iter(train) [138800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:14:59 time: 2.0139 data_time: 0.0187 memory: 33886 grad_norm: 28.0094 loss: 8.7008 loss_cls: 0.2945 loss_bbox: 0.1252 loss_iou: 0.2317 d0.loss_cls: 0.3349 d0.loss_bbox: 0.1426 d0.loss_iou: 0.2463 d1.loss_cls: 0.3103 d1.loss_bbox: 0.1352 d1.loss_iou: 0.2371 d2.loss_cls: 0.3052 d2.loss_bbox: 0.1308 d2.loss_iou: 0.2335 d3.loss_cls: 0.3020 d3.loss_bbox: 0.1256 d3.loss_iou: 0.2306 d4.loss_cls: 0.2974 d4.loss_bbox: 0.1256 d4.loss_iou: 0.2317 enc_loss_cls: 0.3442 enc_loss_bbox: 0.1455 enc_loss_iou: 0.2552 dn_loss_cls: 0.0831 dn_loss_bbox: 0.1507 dn_loss_iou: 0.1980 d0.dn_loss_cls: 0.1711 d0.dn_loss_bbox: 0.2851 d0.dn_loss_iou: 0.3309 d1.dn_loss_cls: 0.1211 d1.dn_loss_bbox: 0.1806 d1.dn_loss_iou: 0.2267 d2.dn_loss_cls: 0.0977 d2.dn_loss_bbox: 0.1625 d2.dn_loss_iou: 0.2071 d3.dn_loss_cls: 0.0884 d3.dn_loss_bbox: 0.1531 d3.dn_loss_iou: 0.2001 d4.dn_loss_cls: 0.0825 d4.dn_loss_bbox: 0.1509 d4.dn_loss_iou: 0.1981 d1.loss_lmm_region: 0.0903 loss_lmm_image: 0.7378 
2024/11/13 19:57:39 - mmengine - INFO - Iter(train) [138900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:11:38 time: 1.9725 data_time: 0.0188 memory: 34599 grad_norm: 30.1725 loss: 9.2437 loss_cls: 0.3016 loss_bbox: 0.1644 loss_iou: 0.2620 d0.loss_cls: 0.3369 d0.loss_bbox: 0.1749 d0.loss_iou: 0.2742 d1.loss_cls: 0.3124 d1.loss_bbox: 0.1687 d1.loss_iou: 0.2685 d2.loss_cls: 0.3109 d2.loss_bbox: 0.1639 d2.loss_iou: 0.2636 d3.loss_cls: 0.3046 d3.loss_bbox: 0.1652 d3.loss_iou: 0.2626 d4.loss_cls: 0.3049 d4.loss_bbox: 0.1643 d4.loss_iou: 0.2619 enc_loss_cls: 0.3415 enc_loss_bbox: 0.1858 enc_loss_iou: 0.2913 dn_loss_cls: 0.0793 dn_loss_bbox: 0.1609 dn_loss_iou: 0.2041 d0.dn_loss_cls: 0.1434 d0.dn_loss_bbox: 0.3090 d0.dn_loss_iou: 0.3397 d1.dn_loss_cls: 0.1024 d1.dn_loss_bbox: 0.1907 d1.dn_loss_iou: 0.2319 d2.dn_loss_cls: 0.0899 d2.dn_loss_bbox: 0.1672 d2.dn_loss_iou: 0.2119 d3.dn_loss_cls: 0.0833 d3.dn_loss_bbox: 0.1629 d3.dn_loss_iou: 0.2060 d4.dn_loss_cls: 0.0796 d4.dn_loss_bbox: 0.1610 d4.dn_loss_iou: 0.2042 d1.loss_lmm_region: 0.0987 loss_lmm_image: 0.7337 2024/11/13 20:00:59 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 20:00:59 - mmengine - INFO - Iter(train) [139000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:08:17 time: 2.0176 data_time: 0.0188 memory: 34085 grad_norm: 23.0696 loss: 7.6519 loss_cls: 0.2247 loss_bbox: 0.1153 loss_iou: 0.2102 d0.loss_cls: 0.2634 d0.loss_bbox: 0.1233 d0.loss_iou: 0.2193 d1.loss_cls: 0.2417 d1.loss_bbox: 0.1156 d1.loss_iou: 0.2130 d2.loss_cls: 0.2331 d2.loss_bbox: 0.1152 d2.loss_iou: 0.2093 d3.loss_cls: 0.2287 d3.loss_bbox: 0.1135 d3.loss_iou: 0.2092 d4.loss_cls: 0.2252 d4.loss_bbox: 0.1149 d4.loss_iou: 0.2107 enc_loss_cls: 0.2706 enc_loss_bbox: 0.1308 enc_loss_iou: 0.2320 dn_loss_cls: 0.0829 dn_loss_bbox: 0.1403 dn_loss_iou: 0.1816 d0.dn_loss_cls: 0.1553 d0.dn_loss_bbox: 0.2569 d0.dn_loss_iou: 0.2957 d1.dn_loss_cls: 0.1086 d1.dn_loss_bbox: 0.1650 d1.dn_loss_iou: 0.2035 d2.dn_loss_cls: 0.0933 d2.dn_loss_bbox: 0.1475 d2.dn_loss_iou: 0.1884 d3.dn_loss_cls: 0.0882 d3.dn_loss_bbox: 0.1421 d3.dn_loss_iou: 0.1838 d4.dn_loss_cls: 0.0838 d4.dn_loss_bbox: 0.1403 d4.dn_loss_iou: 0.1816 d1.loss_lmm_region: 0.0786 loss_lmm_image: 0.7148 2024/11/13 20:04:19 - mmengine - INFO - Iter(train) [139100/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:04:56 time: 2.0117 data_time: 0.0188 memory: 34597 grad_norm: 24.8168 loss: 8.9750 loss_cls: 0.3122 loss_bbox: 0.1429 loss_iou: 0.2657 d0.loss_cls: 0.3428 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2798 d1.loss_cls: 0.3294 d1.loss_bbox: 0.1433 d1.loss_iou: 0.2702 d2.loss_cls: 0.3204 d2.loss_bbox: 0.1395 d2.loss_iou: 0.2655 d3.loss_cls: 0.3148 d3.loss_bbox: 0.1420 d3.loss_iou: 0.2637 d4.loss_cls: 0.3103 d4.loss_bbox: 0.1438 d4.loss_iou: 0.2663 enc_loss_cls: 0.3503 enc_loss_bbox: 0.1595 enc_loss_iou: 0.2934 dn_loss_cls: 0.0783 dn_loss_bbox: 0.1393 dn_loss_iou: 0.1937 d0.dn_loss_cls: 0.1570 d0.dn_loss_bbox: 0.2976 d0.dn_loss_iou: 0.3271 d1.dn_loss_cls: 0.1074 d1.dn_loss_bbox: 0.1719 d1.dn_loss_iou: 0.2206 d2.dn_loss_cls: 0.0884 d2.dn_loss_bbox: 0.1497 d2.dn_loss_iou: 0.2025 d3.dn_loss_cls: 0.0838 d3.dn_loss_bbox: 0.1412 d3.dn_loss_iou: 0.1956 d4.dn_loss_cls: 0.0783 d4.dn_loss_bbox: 0.1393 d4.dn_loss_iou: 0.1938 d1.loss_lmm_region: 0.1001 loss_lmm_image: 0.6991 2024/11/13 20:07:40 - mmengine - INFO - Iter(train) [139200/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 6:01:35 time: 2.0047 data_time: 0.0188 memory: 33354 grad_norm: 25.3405 loss: 8.2975 loss_cls: 0.2752 loss_bbox: 0.1181 loss_iou: 
0.2083 d0.loss_cls: 0.3099 d0.loss_bbox: 0.1333 d0.loss_iou: 0.2261 d1.loss_cls: 0.2911 d1.loss_bbox: 0.1233 d1.loss_iou: 0.2129 d2.loss_cls: 0.2864 d2.loss_bbox: 0.1195 d2.loss_iou: 0.2101 d3.loss_cls: 0.2766 d3.loss_bbox: 0.1188 d3.loss_iou: 0.2093 d4.loss_cls: 0.2774 d4.loss_bbox: 0.1177 d4.loss_iou: 0.2087 enc_loss_cls: 0.3181 enc_loss_bbox: 0.1488 enc_loss_iou: 0.2411 dn_loss_cls: 0.0995 dn_loss_bbox: 0.1450 dn_loss_iou: 0.1895 d0.dn_loss_cls: 0.1730 d0.dn_loss_bbox: 0.2640 d0.dn_loss_iou: 0.3133 d1.dn_loss_cls: 0.1232 d1.dn_loss_bbox: 0.1681 d1.dn_loss_iou: 0.2124 d2.dn_loss_cls: 0.1094 d2.dn_loss_bbox: 0.1520 d2.dn_loss_iou: 0.1973 d3.dn_loss_cls: 0.1001 d3.dn_loss_bbox: 0.1462 d3.dn_loss_iou: 0.1914 d4.dn_loss_cls: 0.1026 d4.dn_loss_bbox: 0.1450 d4.dn_loss_iou: 0.1893 d1.loss_lmm_region: 0.0964 loss_lmm_image: 0.7490 2024/11/13 20:10:59 - mmengine - INFO - Iter(train) [139300/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:58:14 time: 1.9636 data_time: 0.0188 memory: 33901 grad_norm: 44.7840 loss: 7.6575 loss_cls: 0.2454 loss_bbox: 0.0992 loss_iou: 0.1946 d0.loss_cls: 0.2759 d0.loss_bbox: 0.1074 d0.loss_iou: 0.2061 d1.loss_cls: 0.2567 d1.loss_bbox: 0.1002 d1.loss_iou: 0.1975 d2.loss_cls: 0.2508 d2.loss_bbox: 0.0978 d2.loss_iou: 0.1944 d3.loss_cls: 0.2491 d3.loss_bbox: 0.0977 d3.loss_iou: 0.1933 d4.loss_cls: 0.2467 d4.loss_bbox: 0.0974 d4.loss_iou: 0.1932 enc_loss_cls: 0.2785 enc_loss_bbox: 0.1194 enc_loss_iou: 0.2217 dn_loss_cls: 0.0904 dn_loss_bbox: 0.1318 dn_loss_iou: 0.1792 d0.dn_loss_cls: 0.1632 d0.dn_loss_bbox: 0.2646 d0.dn_loss_iou: 0.3042 d1.dn_loss_cls: 0.1241 d1.dn_loss_bbox: 0.1605 d1.dn_loss_iou: 0.2059 d2.dn_loss_cls: 0.1027 d2.dn_loss_bbox: 0.1400 d2.dn_loss_iou: 0.1864 d3.dn_loss_cls: 0.0936 d3.dn_loss_bbox: 0.1336 d3.dn_loss_iou: 0.1810 d4.dn_loss_cls: 0.0892 d4.dn_loss_bbox: 0.1319 d4.dn_loss_iou: 0.1792 d1.loss_lmm_region: 0.0934 loss_lmm_image: 0.7796 2024/11/13 20:14:18 - mmengine - INFO - Iter(train) [139400/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:54:53 time: 1.9862 data_time: 0.0187 memory: 35627 grad_norm: 27.8819 loss: 9.5507 loss_cls: 0.2892 loss_bbox: 0.1452 loss_iou: 0.2725 d0.loss_cls: 0.3338 d0.loss_bbox: 0.1533 d0.loss_iou: 0.2808 d1.loss_cls: 0.3136 d1.loss_bbox: 0.1439 d1.loss_iou: 0.2742 d2.loss_cls: 0.3024 d2.loss_bbox: 0.1472 d2.loss_iou: 0.2712 d3.loss_cls: 0.2989 d3.loss_bbox: 0.1445 d3.loss_iou: 0.2697 d4.loss_cls: 0.2910 d4.loss_bbox: 0.1458 d4.loss_iou: 0.2722 enc_loss_cls: 0.3382 enc_loss_bbox: 0.1646 enc_loss_iou: 0.2987 dn_loss_cls: 0.0906 dn_loss_bbox: 0.1788 dn_loss_iou: 0.2268 d0.dn_loss_cls: 0.1879 d0.dn_loss_bbox: 0.3326 d0.dn_loss_iou: 0.3680 d1.dn_loss_cls: 0.1303 d1.dn_loss_bbox: 0.2146 d1.dn_loss_iou: 0.2569 d2.dn_loss_cls: 0.1031 d2.dn_loss_bbox: 0.1912 d2.dn_loss_iou: 0.2357 d3.dn_loss_cls: 0.0960 d3.dn_loss_bbox: 0.1809 d3.dn_loss_iou: 0.2283 d4.dn_loss_cls: 0.0916 d4.dn_loss_bbox: 0.1789 d4.dn_loss_iou: 0.2267 d1.loss_lmm_region: 0.1317 loss_lmm_image: 0.7492 2024/11/13 20:17:38 - mmengine - INFO - Iter(train) [139500/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:51:32 time: 2.0081 data_time: 0.0630 memory: 34710 grad_norm: 22.6053 loss: 8.4798 loss_cls: 0.2471 loss_bbox: 0.1311 loss_iou: 0.2647 d0.loss_cls: 0.2892 d0.loss_bbox: 0.1462 d0.loss_iou: 0.2756 d1.loss_cls: 0.2722 d1.loss_bbox: 0.1342 d1.loss_iou: 0.2653 d2.loss_cls: 0.2623 d2.loss_bbox: 0.1344 d2.loss_iou: 0.2620 d3.loss_cls: 0.2558 d3.loss_bbox: 0.1298 d3.loss_iou: 0.2602 d4.loss_cls: 0.2477 d4.loss_bbox: 0.1318 d4.loss_iou: 0.2647 enc_loss_cls: 
0.3089 enc_loss_bbox: 0.1514 enc_loss_iou: 0.2882 dn_loss_cls: 0.0692 dn_loss_bbox: 0.1424 dn_loss_iou: 0.2066 d0.dn_loss_cls: 0.1395 d0.dn_loss_bbox: 0.2900 d0.dn_loss_iou: 0.3510 d1.dn_loss_cls: 0.0905 d1.dn_loss_bbox: 0.1749 d1.dn_loss_iou: 0.2363 d2.dn_loss_cls: 0.0782 d2.dn_loss_bbox: 0.1528 d2.dn_loss_iou: 0.2156 d3.dn_loss_cls: 0.0720 d3.dn_loss_bbox: 0.1442 d3.dn_loss_iou: 0.2092 d4.dn_loss_cls: 0.0700 d4.dn_loss_bbox: 0.1425 d4.dn_loss_iou: 0.2067 d1.loss_lmm_region: 0.0828 loss_lmm_image: 0.6829 2024/11/13 20:20:56 - mmengine - INFO - Iter(train) [139600/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:48:11 time: 1.9713 data_time: 0.0187 memory: 34350 grad_norm: 25.0533 loss: 8.1968 loss_cls: 0.2424 loss_bbox: 0.1277 loss_iou: 0.2197 d0.loss_cls: 0.2707 d0.loss_bbox: 0.1368 d0.loss_iou: 0.2291 d1.loss_cls: 0.2509 d1.loss_bbox: 0.1323 d1.loss_iou: 0.2234 d2.loss_cls: 0.2432 d2.loss_bbox: 0.1307 d2.loss_iou: 0.2225 d3.loss_cls: 0.2461 d3.loss_bbox: 0.1278 d3.loss_iou: 0.2192 d4.loss_cls: 0.2455 d4.loss_bbox: 0.1269 d4.loss_iou: 0.2183 enc_loss_cls: 0.2783 enc_loss_bbox: 0.1437 enc_loss_iou: 0.2418 dn_loss_cls: 0.1000 dn_loss_bbox: 0.1503 dn_loss_iou: 0.2004 d0.dn_loss_cls: 0.1554 d0.dn_loss_bbox: 0.2987 d0.dn_loss_iou: 0.3361 d1.dn_loss_cls: 0.1114 d1.dn_loss_bbox: 0.1861 d1.dn_loss_iou: 0.2303 d2.dn_loss_cls: 0.1069 d2.dn_loss_bbox: 0.1617 d2.dn_loss_iou: 0.2100 d3.dn_loss_cls: 0.1012 d3.dn_loss_bbox: 0.1533 d3.dn_loss_iou: 0.2027 d4.dn_loss_cls: 0.0975 d4.dn_loss_bbox: 0.1503 d4.dn_loss_iou: 0.2004 d1.loss_lmm_region: 0.0873 loss_lmm_image: 0.6798 2024/11/13 20:24:14 - mmengine - INFO - Iter(train) [139700/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:44:50 time: 1.9748 data_time: 0.0189 memory: 34595 grad_norm: 24.6217 loss: 8.2096 loss_cls: 0.2735 loss_bbox: 0.1125 loss_iou: 0.2576 d0.loss_cls: 0.3104 d0.loss_bbox: 0.1168 d0.loss_iou: 0.2682 d1.loss_cls: 0.2930 d1.loss_bbox: 0.1126 d1.loss_iou: 0.2582 d2.loss_cls: 0.2831 d2.loss_bbox: 0.1125 d2.loss_iou: 0.2573 d3.loss_cls: 0.2745 d3.loss_bbox: 0.1112 d3.loss_iou: 0.2545 d4.loss_cls: 0.2740 d4.loss_bbox: 0.1116 d4.loss_iou: 0.2548 enc_loss_cls: 0.3187 enc_loss_bbox: 0.1294 enc_loss_iou: 0.2891 dn_loss_cls: 0.0660 dn_loss_bbox: 0.1202 dn_loss_iou: 0.1984 d0.dn_loss_cls: 0.1371 d0.dn_loss_bbox: 0.2407 d0.dn_loss_iou: 0.3224 d1.dn_loss_cls: 0.0950 d1.dn_loss_bbox: 0.1467 d1.dn_loss_iou: 0.2241 d2.dn_loss_cls: 0.0774 d2.dn_loss_bbox: 0.1270 d2.dn_loss_iou: 0.2067 d3.dn_loss_cls: 0.0690 d3.dn_loss_bbox: 0.1209 d3.dn_loss_iou: 0.1998 d4.dn_loss_cls: 0.0655 d4.dn_loss_bbox: 0.1201 d4.dn_loss_iou: 0.1982 d1.loss_lmm_region: 0.0738 loss_lmm_image: 0.7275 2024/11/13 20:27:32 - mmengine - INFO - Iter(train) [139800/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:41:29 time: 1.9729 data_time: 0.0187 memory: 33729 grad_norm: 35.5532 loss: 8.5399 loss_cls: 0.2821 loss_bbox: 0.1260 loss_iou: 0.2352 d0.loss_cls: 0.3201 d0.loss_bbox: 0.1345 d0.loss_iou: 0.2452 d1.loss_cls: 0.3020 d1.loss_bbox: 0.1265 d1.loss_iou: 0.2376 d2.loss_cls: 0.2929 d2.loss_bbox: 0.1249 d2.loss_iou: 0.2354 d3.loss_cls: 0.2831 d3.loss_bbox: 0.1263 d3.loss_iou: 0.2364 d4.loss_cls: 0.2811 d4.loss_bbox: 0.1263 d4.loss_iou: 0.2362 enc_loss_cls: 0.3307 enc_loss_bbox: 0.1482 enc_loss_iou: 0.2677 dn_loss_cls: 0.0774 dn_loss_bbox: 0.1504 dn_loss_iou: 0.1972 d0.dn_loss_cls: 0.1532 d0.dn_loss_bbox: 0.2961 d0.dn_loss_iou: 0.3342 d1.dn_loss_cls: 0.1033 d1.dn_loss_bbox: 0.1861 d1.dn_loss_iou: 0.2257 d2.dn_loss_cls: 0.0903 d2.dn_loss_bbox: 0.1612 d2.dn_loss_iou: 0.2063 
d3.dn_loss_cls: 0.0828 d3.dn_loss_bbox: 0.1531 d3.dn_loss_iou: 0.1996 d4.dn_loss_cls: 0.0782 d4.dn_loss_bbox: 0.1504 d4.dn_loss_iou: 0.1972 d1.loss_lmm_region: 0.0914 loss_lmm_image: 0.7072 2024/11/13 20:30:53 - mmengine - INFO - Iter(train) [139900/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:38:08 time: 1.9873 data_time: 0.0189 memory: 32154 grad_norm: 24.3912 loss: 8.6396 loss_cls: 0.2635 loss_bbox: 0.1360 loss_iou: 0.2212 d0.loss_cls: 0.3039 d0.loss_bbox: 0.1525 d0.loss_iou: 0.2331 d1.loss_cls: 0.2868 d1.loss_bbox: 0.1383 d1.loss_iou: 0.2233 d2.loss_cls: 0.2764 d2.loss_bbox: 0.1368 d2.loss_iou: 0.2193 d3.loss_cls: 0.2699 d3.loss_bbox: 0.1367 d3.loss_iou: 0.2198 d4.loss_cls: 0.2669 d4.loss_bbox: 0.1357 d4.loss_iou: 0.2198 enc_loss_cls: 0.3116 enc_loss_bbox: 0.1589 enc_loss_iou: 0.2494 dn_loss_cls: 0.0930 dn_loss_bbox: 0.1646 dn_loss_iou: 0.2028 d0.dn_loss_cls: 0.1711 d0.dn_loss_bbox: 0.3072 d0.dn_loss_iou: 0.3276 d1.dn_loss_cls: 0.1216 d1.dn_loss_bbox: 0.1964 d1.dn_loss_iou: 0.2292 d2.dn_loss_cls: 0.1033 d2.dn_loss_bbox: 0.1792 d2.dn_loss_iou: 0.2129 d3.dn_loss_cls: 0.0965 d3.dn_loss_bbox: 0.1673 d3.dn_loss_iou: 0.2056 d4.dn_loss_cls: 0.0930 d4.dn_loss_bbox: 0.1646 d4.dn_loss_iou: 0.2028 d1.loss_lmm_region: 0.1059 loss_lmm_image: 0.7351 2024/11/13 20:34:14 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 20:34:14 - mmengine - INFO - Iter(train) [140000/150000] base_lr: 1.0000e-05 lr: 1.0000e-05 eta: 5:34:47 time: 2.0019 data_time: 0.0188 memory: 32597 grad_norm: 28.3079 loss: 9.4978 loss_cls: 0.2740 loss_bbox: 0.1704 loss_iou: 0.2845 d0.loss_cls: 0.3185 d0.loss_bbox: 0.1786 d0.loss_iou: 0.2903 d1.loss_cls: 0.2915 d1.loss_bbox: 0.1742 d1.loss_iou: 0.2869 d2.loss_cls: 0.2940 d2.loss_bbox: 0.1643 d2.loss_iou: 0.2810 d3.loss_cls: 0.2849 d3.loss_bbox: 0.1659 d3.loss_iou: 0.2827 d4.loss_cls: 0.2810 d4.loss_bbox: 0.1673 d4.loss_iou: 0.2822 enc_loss_cls: 0.3218 enc_loss_bbox: 0.1899 enc_loss_iou: 0.3125 dn_loss_cls: 0.0758 dn_loss_bbox: 0.1712 dn_loss_iou: 0.2316 d0.dn_loss_cls: 0.1660 d0.dn_loss_bbox: 0.3198 d0.dn_loss_iou: 0.3734 d1.dn_loss_cls: 0.1122 d1.dn_loss_bbox: 0.2079 d1.dn_loss_iou: 0.2638 d2.dn_loss_cls: 0.0918 d2.dn_loss_bbox: 0.1848 d2.dn_loss_iou: 0.2431 d3.dn_loss_cls: 0.0832 d3.dn_loss_bbox: 0.1742 d3.dn_loss_iou: 0.2344 d4.dn_loss_cls: 0.0771 d4.dn_loss_bbox: 0.1713 d4.dn_loss_iou: 0.2318 d1.loss_lmm_region: 0.1126 loss_lmm_image: 0.6756 2024/11/13 20:37:33 - mmengine - INFO - Iter(train) [140100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:31:26 time: 1.9929 data_time: 0.0188 memory: 33883 grad_norm: 26.0726 loss: 6.9287 loss_cls: 0.2077 loss_bbox: 0.1033 loss_iou: 0.1822 d0.loss_cls: 0.2339 d0.loss_bbox: 0.1210 d0.loss_iou: 0.1970 d1.loss_cls: 0.2185 d1.loss_bbox: 0.1052 d1.loss_iou: 0.1846 d2.loss_cls: 0.2170 d2.loss_bbox: 0.1021 d2.loss_iou: 0.1789 d3.loss_cls: 0.2095 d3.loss_bbox: 0.1043 d3.loss_iou: 0.1840 d4.loss_cls: 0.2109 d4.loss_bbox: 0.1002 d4.loss_iou: 0.1826 enc_loss_cls: 0.2485 enc_loss_bbox: 0.1290 enc_loss_iou: 0.2108 dn_loss_cls: 0.0554 dn_loss_bbox: 0.1339 dn_loss_iou: 0.1638 d0.dn_loss_cls: 0.1201 d0.dn_loss_bbox: 0.2552 d0.dn_loss_iou: 0.2748 d1.dn_loss_cls: 0.0751 d1.dn_loss_bbox: 0.1581 d1.dn_loss_iou: 0.1854 d2.dn_loss_cls: 0.0617 d2.dn_loss_bbox: 0.1415 d2.dn_loss_iou: 0.1702 d3.dn_loss_cls: 0.0579 d3.dn_loss_bbox: 0.1343 d3.dn_loss_iou: 0.1649 d4.dn_loss_cls: 0.0559 d4.dn_loss_bbox: 0.1339 d4.dn_loss_iou: 0.1638 d1.loss_lmm_region: 0.0819 loss_lmm_image: 0.7096 2024/11/13 20:40:54 - mmengine - INFO - Iter(train) 
[140200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:28:05 time: 2.0187 data_time: 0.0188 memory: 33449 grad_norm: 32.9133 loss: 8.5019 loss_cls: 0.2797 loss_bbox: 0.1305 loss_iou: 0.2200 d0.loss_cls: 0.3011 d0.loss_bbox: 0.1493 d0.loss_iou: 0.2315 d1.loss_cls: 0.2892 d1.loss_bbox: 0.1337 d1.loss_iou: 0.2256 d2.loss_cls: 0.2889 d2.loss_bbox: 0.1302 d2.loss_iou: 0.2232 d3.loss_cls: 0.2803 d3.loss_bbox: 0.1324 d3.loss_iou: 0.2230 d4.loss_cls: 0.2782 d4.loss_bbox: 0.1306 d4.loss_iou: 0.2209 enc_loss_cls: 0.3110 enc_loss_bbox: 0.1549 enc_loss_iou: 0.2459 dn_loss_cls: 0.1238 dn_loss_bbox: 0.1458 dn_loss_iou: 0.1913 d0.dn_loss_cls: 0.1907 d0.dn_loss_bbox: 0.2580 d0.dn_loss_iou: 0.3047 d1.dn_loss_cls: 0.1479 d1.dn_loss_bbox: 0.1706 d1.dn_loss_iou: 0.2133 d2.dn_loss_cls: 0.1322 d2.dn_loss_bbox: 0.1522 d2.dn_loss_iou: 0.1974 d3.dn_loss_cls: 0.1282 d3.dn_loss_bbox: 0.1471 d3.dn_loss_iou: 0.1927 d4.dn_loss_cls: 0.1248 d4.dn_loss_bbox: 0.1458 d4.dn_loss_iou: 0.1912 d1.loss_lmm_region: 0.1117 loss_lmm_image: 0.6523 2024/11/13 20:44:15 - mmengine - INFO - Iter(train) [140300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:24:44 time: 2.0242 data_time: 0.0191 memory: 33408 grad_norm: 39.9867 loss: 7.5341 loss_cls: 0.2131 loss_bbox: 0.1115 loss_iou: 0.1997 d0.loss_cls: 0.2523 d0.loss_bbox: 0.1248 d0.loss_iou: 0.2133 d1.loss_cls: 0.2312 d1.loss_bbox: 0.1178 d1.loss_iou: 0.2055 d2.loss_cls: 0.2231 d2.loss_bbox: 0.1130 d2.loss_iou: 0.2015 d3.loss_cls: 0.2157 d3.loss_bbox: 0.1115 d3.loss_iou: 0.2000 d4.loss_cls: 0.2131 d4.loss_bbox: 0.1112 d4.loss_iou: 0.1997 enc_loss_cls: 0.2621 enc_loss_bbox: 0.1373 enc_loss_iou: 0.2279 dn_loss_cls: 0.0557 dn_loss_bbox: 0.1589 dn_loss_iou: 0.1856 d0.dn_loss_cls: 0.1354 d0.dn_loss_bbox: 0.2938 d0.dn_loss_iou: 0.3152 d1.dn_loss_cls: 0.0857 d1.dn_loss_bbox: 0.1890 d1.dn_loss_iou: 0.2138 d2.dn_loss_cls: 0.0663 d2.dn_loss_bbox: 0.1673 d2.dn_loss_iou: 0.1946 d3.dn_loss_cls: 0.0603 d3.dn_loss_bbox: 0.1601 d3.dn_loss_iou: 0.1878 d4.dn_loss_cls: 0.0566 d4.dn_loss_bbox: 0.1588 d4.dn_loss_iou: 0.1857 d1.loss_lmm_region: 0.0853 loss_lmm_image: 0.6930 2024/11/13 20:47:37 - mmengine - INFO - Iter(train) [140400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:21:23 time: 2.0258 data_time: 0.0188 memory: 35653 grad_norm: 29.1603 loss: 8.0022 loss_cls: 0.2546 loss_bbox: 0.1110 loss_iou: 0.2088 d0.loss_cls: 0.2926 d0.loss_bbox: 0.1202 d0.loss_iou: 0.2191 d1.loss_cls: 0.2682 d1.loss_bbox: 0.1127 d1.loss_iou: 0.2103 d2.loss_cls: 0.2608 d2.loss_bbox: 0.1117 d2.loss_iou: 0.2074 d3.loss_cls: 0.2552 d3.loss_bbox: 0.1107 d3.loss_iou: 0.2096 d4.loss_cls: 0.2559 d4.loss_bbox: 0.1093 d4.loss_iou: 0.2075 enc_loss_cls: 0.2948 enc_loss_bbox: 0.1291 enc_loss_iou: 0.2314 dn_loss_cls: 0.1170 dn_loss_bbox: 0.1453 dn_loss_iou: 0.1883 d0.dn_loss_cls: 0.1829 d0.dn_loss_bbox: 0.2586 d0.dn_loss_iou: 0.3063 d1.dn_loss_cls: 0.1408 d1.dn_loss_bbox: 0.1654 d1.dn_loss_iou: 0.2105 d2.dn_loss_cls: 0.1244 d2.dn_loss_bbox: 0.1495 d2.dn_loss_iou: 0.1944 d3.dn_loss_cls: 0.1169 d3.dn_loss_bbox: 0.1463 d3.dn_loss_iou: 0.1896 d4.dn_loss_cls: 0.1161 d4.dn_loss_bbox: 0.1453 d4.dn_loss_iou: 0.1884 d1.loss_lmm_region: 0.0857 loss_lmm_image: 0.6495 2024/11/13 20:50:56 - mmengine - INFO - Iter(train) [140500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:18:02 time: 1.9771 data_time: 0.0189 memory: 34528 grad_norm: 25.1087 loss: 6.9912 loss_cls: 0.1968 loss_bbox: 0.1047 loss_iou: 0.1682 d0.loss_cls: 0.2283 d0.loss_bbox: 0.1135 d0.loss_iou: 0.1765 d1.loss_cls: 0.2106 d1.loss_bbox: 0.1090 d1.loss_iou: 0.1725 d2.loss_cls: 
0.2061 d2.loss_bbox: 0.1063 d2.loss_iou: 0.1684 d3.loss_cls: 0.1985 d3.loss_bbox: 0.1056 d3.loss_iou: 0.1685 d4.loss_cls: 0.1998 d4.loss_bbox: 0.1045 d4.loss_iou: 0.1680 enc_loss_cls: 0.2354 enc_loss_bbox: 0.1237 enc_loss_iou: 0.1914 dn_loss_cls: 0.0660 dn_loss_bbox: 0.1443 dn_loss_iou: 0.1781 d0.dn_loss_cls: 0.1481 d0.dn_loss_bbox: 0.2899 d0.dn_loss_iou: 0.3073 d1.dn_loss_cls: 0.0965 d1.dn_loss_bbox: 0.1775 d1.dn_loss_iou: 0.2064 d2.dn_loss_cls: 0.0783 d2.dn_loss_bbox: 0.1545 d2.dn_loss_iou: 0.1855 d3.dn_loss_cls: 0.0704 d3.dn_loss_bbox: 0.1476 d3.dn_loss_iou: 0.1804 d4.dn_loss_cls: 0.0666 d4.dn_loss_bbox: 0.1442 d4.dn_loss_iou: 0.1781 d1.loss_lmm_region: 0.0961 loss_lmm_image: 0.6190 2024/11/13 20:54:16 - mmengine - INFO - Iter(train) [140600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:14:41 time: 2.0040 data_time: 0.0189 memory: 34345 grad_norm: 29.7766 loss: 8.2481 loss_cls: 0.2311 loss_bbox: 0.1245 loss_iou: 0.2235 d0.loss_cls: 0.2636 d0.loss_bbox: 0.1324 d0.loss_iou: 0.2320 d1.loss_cls: 0.2483 d1.loss_bbox: 0.1240 d1.loss_iou: 0.2257 d2.loss_cls: 0.2402 d2.loss_bbox: 0.1238 d2.loss_iou: 0.2238 d3.loss_cls: 0.2327 d3.loss_bbox: 0.1251 d3.loss_iou: 0.2257 d4.loss_cls: 0.2277 d4.loss_bbox: 0.1252 d4.loss_iou: 0.2249 enc_loss_cls: 0.2707 enc_loss_bbox: 0.1413 enc_loss_iou: 0.2459 dn_loss_cls: 0.1088 dn_loss_bbox: 0.1630 dn_loss_iou: 0.1998 d0.dn_loss_cls: 0.1776 d0.dn_loss_bbox: 0.2921 d0.dn_loss_iou: 0.3178 d1.dn_loss_cls: 0.1301 d1.dn_loss_bbox: 0.1892 d1.dn_loss_iou: 0.2231 d2.dn_loss_cls: 0.1188 d2.dn_loss_bbox: 0.1713 d2.dn_loss_iou: 0.2061 d3.dn_loss_cls: 0.1072 d3.dn_loss_bbox: 0.1645 d3.dn_loss_iou: 0.2011 d4.dn_loss_cls: 0.1089 d4.dn_loss_bbox: 0.1630 d4.dn_loss_iou: 0.1998 d1.loss_lmm_region: 0.0869 loss_lmm_image: 0.7067 2024/11/13 20:57:35 - mmengine - INFO - Iter(train) [140700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:11:20 time: 2.0032 data_time: 0.0190 memory: 34567 grad_norm: 27.0805 loss: 10.1953 loss_cls: 0.3532 loss_bbox: 0.1578 loss_iou: 0.2525 d0.loss_cls: 0.3755 d0.loss_bbox: 0.1686 d0.loss_iou: 0.2658 d1.loss_cls: 0.3594 d1.loss_bbox: 0.1587 d1.loss_iou: 0.2557 d2.loss_cls: 0.3506 d2.loss_bbox: 0.1575 d2.loss_iou: 0.2542 d3.loss_cls: 0.3536 d3.loss_bbox: 0.1580 d3.loss_iou: 0.2535 d4.loss_cls: 0.3516 d4.loss_bbox: 0.1582 d4.loss_iou: 0.2536 enc_loss_cls: 0.3841 enc_loss_bbox: 0.1743 enc_loss_iou: 0.2790 dn_loss_cls: 0.1847 dn_loss_bbox: 0.1843 dn_loss_iou: 0.2339 d0.dn_loss_cls: 0.2432 d0.dn_loss_bbox: 0.3301 d0.dn_loss_iou: 0.3745 d1.dn_loss_cls: 0.1727 d1.dn_loss_bbox: 0.2089 d1.dn_loss_iou: 0.2607 d2.dn_loss_cls: 0.1704 d2.dn_loss_bbox: 0.1896 d2.dn_loss_iou: 0.2412 d3.dn_loss_cls: 0.1723 d3.dn_loss_bbox: 0.1856 d3.dn_loss_iou: 0.2356 d4.dn_loss_cls: 0.1801 d4.dn_loss_bbox: 0.1844 d4.dn_loss_iou: 0.2340 d1.loss_lmm_region: 0.0951 loss_lmm_image: 0.6384 2024/11/13 21:00:56 - mmengine - INFO - Iter(train) [140800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:07:59 time: 1.9894 data_time: 0.0189 memory: 35300 grad_norm: 27.3609 loss: 6.8493 loss_cls: 0.1854 loss_bbox: 0.0974 loss_iou: 0.1730 d0.loss_cls: 0.2177 d0.loss_bbox: 0.1060 d0.loss_iou: 0.1849 d1.loss_cls: 0.2043 d1.loss_bbox: 0.0989 d1.loss_iou: 0.1757 d2.loss_cls: 0.1972 d2.loss_bbox: 0.0968 d2.loss_iou: 0.1729 d3.loss_cls: 0.1901 d3.loss_bbox: 0.0973 d3.loss_iou: 0.1737 d4.loss_cls: 0.1882 d4.loss_bbox: 0.0966 d4.loss_iou: 0.1731 enc_loss_cls: 0.2227 enc_loss_bbox: 0.1170 enc_loss_iou: 0.1980 dn_loss_cls: 0.0585 dn_loss_bbox: 0.1422 dn_loss_iou: 0.1692 d0.dn_loss_cls: 0.1400 
d0.dn_loss_bbox: 0.2962 d0.dn_loss_iou: 0.3027 d1.dn_loss_cls: 0.0885 d1.dn_loss_bbox: 0.1783 d1.dn_loss_iou: 0.1972 d2.dn_loss_cls: 0.0691 d2.dn_loss_bbox: 0.1535 d2.dn_loss_iou: 0.1781 d3.dn_loss_cls: 0.0625 d3.dn_loss_bbox: 0.1443 d3.dn_loss_iou: 0.1712 d4.dn_loss_cls: 0.0593 d4.dn_loss_bbox: 0.1422 d4.dn_loss_iou: 0.1693 d1.loss_lmm_region: 0.0885 loss_lmm_image: 0.6717 2024/11/13 21:04:17 - mmengine - INFO - Iter(train) [140900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:04:39 time: 2.0077 data_time: 0.0189 memory: 33747 grad_norm: 23.2412 loss: 8.2046 loss_cls: 0.2485 loss_bbox: 0.1326 loss_iou: 0.2115 d0.loss_cls: 0.2834 d0.loss_bbox: 0.1397 d0.loss_iou: 0.2218 d1.loss_cls: 0.2636 d1.loss_bbox: 0.1368 d1.loss_iou: 0.2180 d2.loss_cls: 0.2577 d2.loss_bbox: 0.1319 d2.loss_iou: 0.2150 d3.loss_cls: 0.2525 d3.loss_bbox: 0.1314 d3.loss_iou: 0.2132 d4.loss_cls: 0.2501 d4.loss_bbox: 0.1323 d4.loss_iou: 0.2113 enc_loss_cls: 0.2895 enc_loss_bbox: 0.1476 enc_loss_iou: 0.2368 dn_loss_cls: 0.0770 dn_loss_bbox: 0.1582 dn_loss_iou: 0.1948 d0.dn_loss_cls: 0.1615 d0.dn_loss_bbox: 0.3216 d0.dn_loss_iou: 0.3330 d1.dn_loss_cls: 0.1068 d1.dn_loss_bbox: 0.1958 d1.dn_loss_iou: 0.2244 d2.dn_loss_cls: 0.0879 d2.dn_loss_bbox: 0.1683 d2.dn_loss_iou: 0.2034 d3.dn_loss_cls: 0.0803 d3.dn_loss_bbox: 0.1602 d3.dn_loss_iou: 0.1969 d4.dn_loss_cls: 0.0781 d4.dn_loss_bbox: 0.1583 d4.dn_loss_iou: 0.1948 d1.loss_lmm_region: 0.0898 loss_lmm_image: 0.6882 2024/11/13 21:07:36 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 21:07:36 - mmengine - INFO - Iter(train) [141000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 5:01:18 time: 2.0062 data_time: 0.0189 memory: 34995 grad_norm: 26.3902 loss: 7.8061 loss_cls: 0.2347 loss_bbox: 0.1107 loss_iou: 0.2185 d0.loss_cls: 0.2709 d0.loss_bbox: 0.1230 d0.loss_iou: 0.2340 d1.loss_cls: 0.2505 d1.loss_bbox: 0.1149 d1.loss_iou: 0.2261 d2.loss_cls: 0.2435 d2.loss_bbox: 0.1123 d2.loss_iou: 0.2219 d3.loss_cls: 0.2376 d3.loss_bbox: 0.1105 d3.loss_iou: 0.2192 d4.loss_cls: 0.2356 d4.loss_bbox: 0.1102 d4.loss_iou: 0.2176 enc_loss_cls: 0.2769 enc_loss_bbox: 0.1337 enc_loss_iou: 0.2477 dn_loss_cls: 0.0598 dn_loss_bbox: 0.1404 dn_loss_iou: 0.1871 d0.dn_loss_cls: 0.1429 d0.dn_loss_bbox: 0.2850 d0.dn_loss_iou: 0.3248 d1.dn_loss_cls: 0.0908 d1.dn_loss_bbox: 0.1732 d1.dn_loss_iou: 0.2186 d2.dn_loss_cls: 0.0715 d2.dn_loss_bbox: 0.1516 d2.dn_loss_iou: 0.1981 d3.dn_loss_cls: 0.0640 d3.dn_loss_bbox: 0.1415 d3.dn_loss_iou: 0.1893 d4.dn_loss_cls: 0.0607 d4.dn_loss_bbox: 0.1403 d4.dn_loss_iou: 0.1871 d1.loss_lmm_region: 0.0849 loss_lmm_image: 0.7445 2024/11/13 21:10:56 - mmengine - INFO - Iter(train) [141100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:57:57 time: 1.9849 data_time: 0.0189 memory: 33510 grad_norm: 24.9380 loss: 9.4608 loss_cls: 0.2853 loss_bbox: 0.1546 loss_iou: 0.2754 d0.loss_cls: 0.3309 d0.loss_bbox: 0.1708 d0.loss_iou: 0.2847 d1.loss_cls: 0.3092 d1.loss_bbox: 0.1594 d1.loss_iou: 0.2763 d2.loss_cls: 0.2974 d2.loss_bbox: 0.1558 d2.loss_iou: 0.2739 d3.loss_cls: 0.2922 d3.loss_bbox: 0.1537 d3.loss_iou: 0.2710 d4.loss_cls: 0.2882 d4.loss_bbox: 0.1532 d4.loss_iou: 0.2744 enc_loss_cls: 0.3400 enc_loss_bbox: 0.1783 enc_loss_iou: 0.2979 dn_loss_cls: 0.0790 dn_loss_bbox: 0.1788 dn_loss_iou: 0.2192 d0.dn_loss_cls: 0.1708 d0.dn_loss_bbox: 0.3400 d0.dn_loss_iou: 0.3589 d1.dn_loss_cls: 0.1112 d1.dn_loss_bbox: 0.2136 d1.dn_loss_iou: 0.2485 d2.dn_loss_cls: 0.0904 d2.dn_loss_bbox: 0.1945 d2.dn_loss_iou: 0.2306 d3.dn_loss_cls: 0.0837 d3.dn_loss_bbox: 0.1829 
d3.dn_loss_iou: 0.2217 d4.dn_loss_cls: 0.0804 d4.dn_loss_bbox: 0.1787 d4.dn_loss_iou: 0.2192 d1.loss_lmm_region: 0.0970 loss_lmm_image: 0.7391 2024/11/13 21:14:16 - mmengine - INFO - Iter(train) [141200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:54:36 time: 2.0130 data_time: 0.0188 memory: 35021 grad_norm: 32.7470 loss: 9.7928 loss_cls: 0.3154 loss_bbox: 0.1661 loss_iou: 0.3157 d0.loss_cls: 0.3631 d0.loss_bbox: 0.1806 d0.loss_iou: 0.3334 d1.loss_cls: 0.3431 d1.loss_bbox: 0.1718 d1.loss_iou: 0.3211 d2.loss_cls: 0.3300 d2.loss_bbox: 0.1688 d2.loss_iou: 0.3173 d3.loss_cls: 0.3217 d3.loss_bbox: 0.1668 d3.loss_iou: 0.3157 d4.loss_cls: 0.3170 d4.loss_bbox: 0.1668 d4.loss_iou: 0.3175 enc_loss_cls: 0.3849 enc_loss_bbox: 0.1850 enc_loss_iou: 0.3426 dn_loss_cls: 0.0807 dn_loss_bbox: 0.1555 dn_loss_iou: 0.2180 d0.dn_loss_cls: 0.1685 d0.dn_loss_bbox: 0.2782 d0.dn_loss_iou: 0.3458 d1.dn_loss_cls: 0.1187 d1.dn_loss_bbox: 0.1814 d1.dn_loss_iou: 0.2441 d2.dn_loss_cls: 0.0952 d2.dn_loss_bbox: 0.1610 d2.dn_loss_iou: 0.2248 d3.dn_loss_cls: 0.0873 d3.dn_loss_bbox: 0.1555 d3.dn_loss_iou: 0.2188 d4.dn_loss_cls: 0.0820 d4.dn_loss_bbox: 0.1555 d4.dn_loss_iou: 0.2180 d1.loss_lmm_region: 0.0956 loss_lmm_image: 0.6641 2024/11/13 21:17:35 - mmengine - INFO - Iter(train) [141300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:51:15 time: 1.9696 data_time: 0.0188 memory: 34374 grad_norm: 21.8829 loss: 9.0100 loss_cls: 0.2451 loss_bbox: 0.1425 loss_iou: 0.2573 d0.loss_cls: 0.2810 d0.loss_bbox: 0.1558 d0.loss_iou: 0.2700 d1.loss_cls: 0.2656 d1.loss_bbox: 0.1479 d1.loss_iou: 0.2624 d2.loss_cls: 0.2572 d2.loss_bbox: 0.1443 d2.loss_iou: 0.2584 d3.loss_cls: 0.2495 d3.loss_bbox: 0.1425 d3.loss_iou: 0.2566 d4.loss_cls: 0.2458 d4.loss_bbox: 0.1426 d4.loss_iou: 0.2573 enc_loss_cls: 0.2879 enc_loss_bbox: 0.1632 enc_loss_iou: 0.2823 dn_loss_cls: 0.0935 dn_loss_bbox: 0.1806 dn_loss_iou: 0.2263 d0.dn_loss_cls: 0.1782 d0.dn_loss_bbox: 0.3283 d0.dn_loss_iou: 0.3625 d1.dn_loss_cls: 0.1245 d1.dn_loss_bbox: 0.2113 d1.dn_loss_iou: 0.2544 d2.dn_loss_cls: 0.1044 d2.dn_loss_bbox: 0.1904 d2.dn_loss_iou: 0.2352 d3.dn_loss_cls: 0.0980 d3.dn_loss_bbox: 0.1829 d3.dn_loss_iou: 0.2288 d4.dn_loss_cls: 0.0941 d4.dn_loss_bbox: 0.1807 d4.dn_loss_iou: 0.2264 d1.loss_lmm_region: 0.0936 loss_lmm_image: 0.7006 2024/11/13 21:20:55 - mmengine - INFO - Iter(train) [141400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:47:54 time: 2.0160 data_time: 0.0188 memory: 34963 grad_norm: 36.0083 loss: 7.7959 loss_cls: 0.2216 loss_bbox: 0.1302 loss_iou: 0.2140 d0.loss_cls: 0.2573 d0.loss_bbox: 0.1393 d0.loss_iou: 0.2226 d1.loss_cls: 0.2385 d1.loss_bbox: 0.1325 d1.loss_iou: 0.2169 d2.loss_cls: 0.2275 d2.loss_bbox: 0.1321 d2.loss_iou: 0.2159 d3.loss_cls: 0.2278 d3.loss_bbox: 0.1284 d3.loss_iou: 0.2147 d4.loss_cls: 0.2237 d4.loss_bbox: 0.1290 d4.loss_iou: 0.2147 enc_loss_cls: 0.2680 enc_loss_bbox: 0.1464 enc_loss_iou: 0.2346 dn_loss_cls: 0.0654 dn_loss_bbox: 0.1496 dn_loss_iou: 0.1905 d0.dn_loss_cls: 0.1371 d0.dn_loss_bbox: 0.2745 d0.dn_loss_iou: 0.3109 d1.dn_loss_cls: 0.0916 d1.dn_loss_bbox: 0.1744 d1.dn_loss_iou: 0.2151 d2.dn_loss_cls: 0.0791 d2.dn_loss_bbox: 0.1567 d2.dn_loss_iou: 0.1979 d3.dn_loss_cls: 0.0707 d3.dn_loss_bbox: 0.1512 d3.dn_loss_iou: 0.1926 d4.dn_loss_cls: 0.0672 d4.dn_loss_bbox: 0.1497 d4.dn_loss_iou: 0.1906 d1.loss_lmm_region: 0.1016 loss_lmm_image: 0.6940 2024/11/13 21:24:14 - mmengine - INFO - Iter(train) [141500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:44:33 time: 1.9978 data_time: 0.0192 memory: 35371 grad_norm: 23.5447 loss: 
9.1370 loss_cls: 0.2937 loss_bbox: 0.1548 loss_iou: 0.2793 d0.loss_cls: 0.3378 d0.loss_bbox: 0.1698 d0.loss_iou: 0.2976 d1.loss_cls: 0.3125 d1.loss_bbox: 0.1647 d1.loss_iou: 0.2890 d2.loss_cls: 0.3038 d2.loss_bbox: 0.1605 d2.loss_iou: 0.2843 d3.loss_cls: 0.3025 d3.loss_bbox: 0.1558 d3.loss_iou: 0.2795 d4.loss_cls: 0.2995 d4.loss_bbox: 0.1544 d4.loss_iou: 0.2780 enc_loss_cls: 0.3466 enc_loss_bbox: 0.1779 enc_loss_iou: 0.3096 dn_loss_cls: 0.0709 dn_loss_bbox: 0.1465 dn_loss_iou: 0.2057 d0.dn_loss_cls: 0.1507 d0.dn_loss_bbox: 0.2662 d0.dn_loss_iou: 0.3273 d1.dn_loss_cls: 0.1015 d1.dn_loss_bbox: 0.1746 d1.dn_loss_iou: 0.2321 d2.dn_loss_cls: 0.0830 d2.dn_loss_bbox: 0.1546 d2.dn_loss_iou: 0.2139 d3.dn_loss_cls: 0.0762 d3.dn_loss_bbox: 0.1485 d3.dn_loss_iou: 0.2081 d4.dn_loss_cls: 0.0724 d4.dn_loss_bbox: 0.1465 d4.dn_loss_iou: 0.2057 d1.loss_lmm_region: 0.1018 loss_lmm_image: 0.6991 2024/11/13 21:27:34 - mmengine - INFO - Iter(train) [141600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:41:12 time: 2.0141 data_time: 0.0188 memory: 33926 grad_norm: 31.6628 loss: 8.5903 loss_cls: 0.2593 loss_bbox: 0.1392 loss_iou: 0.2669 d0.loss_cls: 0.2960 d0.loss_bbox: 0.1549 d0.loss_iou: 0.2805 d1.loss_cls: 0.2749 d1.loss_bbox: 0.1485 d1.loss_iou: 0.2730 d2.loss_cls: 0.2688 d2.loss_bbox: 0.1419 d2.loss_iou: 0.2670 d3.loss_cls: 0.2650 d3.loss_bbox: 0.1380 d3.loss_iou: 0.2640 d4.loss_cls: 0.2605 d4.loss_bbox: 0.1398 d4.loss_iou: 0.2672 enc_loss_cls: 0.3147 enc_loss_bbox: 0.1543 enc_loss_iou: 0.2854 dn_loss_cls: 0.0674 dn_loss_bbox: 0.1469 dn_loss_iou: 0.2027 d0.dn_loss_cls: 0.1442 d0.dn_loss_bbox: 0.2834 d0.dn_loss_iou: 0.3300 d1.dn_loss_cls: 0.0964 d1.dn_loss_bbox: 0.1732 d1.dn_loss_iou: 0.2285 d2.dn_loss_cls: 0.0786 d2.dn_loss_bbox: 0.1545 d2.dn_loss_iou: 0.2098 d3.dn_loss_cls: 0.0719 d3.dn_loss_bbox: 0.1483 d3.dn_loss_iou: 0.2041 d4.dn_loss_cls: 0.0684 d4.dn_loss_bbox: 0.1469 d4.dn_loss_iou: 0.2027 d1.loss_lmm_region: 0.0797 loss_lmm_image: 0.6927 2024/11/13 21:30:54 - mmengine - INFO - Iter(train) [141700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:37:51 time: 2.0044 data_time: 0.0190 memory: 32085 grad_norm: 31.8682 loss: 7.4745 loss_cls: 0.2203 loss_bbox: 0.1105 loss_iou: 0.1878 d0.loss_cls: 0.2504 d0.loss_bbox: 0.1233 d0.loss_iou: 0.2006 d1.loss_cls: 0.2303 d1.loss_bbox: 0.1166 d1.loss_iou: 0.1932 d2.loss_cls: 0.2280 d2.loss_bbox: 0.1093 d2.loss_iou: 0.1874 d3.loss_cls: 0.2247 d3.loss_bbox: 0.1081 d3.loss_iou: 0.1869 d4.loss_cls: 0.2219 d4.loss_bbox: 0.1080 d4.loss_iou: 0.1869 enc_loss_cls: 0.2616 enc_loss_bbox: 0.1316 enc_loss_iou: 0.2151 dn_loss_cls: 0.0842 dn_loss_bbox: 0.1423 dn_loss_iou: 0.1836 d0.dn_loss_cls: 0.1502 d0.dn_loss_bbox: 0.2711 d0.dn_loss_iou: 0.3006 d1.dn_loss_cls: 0.1041 d1.dn_loss_bbox: 0.1681 d1.dn_loss_iou: 0.2043 d2.dn_loss_cls: 0.0910 d2.dn_loss_bbox: 0.1512 d2.dn_loss_iou: 0.1905 d3.dn_loss_cls: 0.0871 d3.dn_loss_bbox: 0.1437 d3.dn_loss_iou: 0.1852 d4.dn_loss_cls: 0.0848 d4.dn_loss_bbox: 0.1423 d4.dn_loss_iou: 0.1836 d1.loss_lmm_region: 0.0961 loss_lmm_image: 0.7077 2024/11/13 21:34:13 - mmengine - INFO - Iter(train) [141800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:34:30 time: 1.9907 data_time: 0.0189 memory: 33958 grad_norm: 25.5157 loss: 7.6615 loss_cls: 0.2296 loss_bbox: 0.1133 loss_iou: 0.1942 d0.loss_cls: 0.2583 d0.loss_bbox: 0.1272 d0.loss_iou: 0.2118 d1.loss_cls: 0.2462 d1.loss_bbox: 0.1169 d1.loss_iou: 0.2024 d2.loss_cls: 0.2439 d2.loss_bbox: 0.1133 d2.loss_iou: 0.1974 d3.loss_cls: 0.2352 d3.loss_bbox: 0.1107 d3.loss_iou: 0.1931 d4.loss_cls: 0.2290 
d4.loss_bbox: 0.1136 d4.loss_iou: 0.1944 enc_loss_cls: 0.2723 enc_loss_bbox: 0.1324 enc_loss_iou: 0.2240 dn_loss_cls: 0.0713 dn_loss_bbox: 0.1550 dn_loss_iou: 0.1869 d0.dn_loss_cls: 0.1413 d0.dn_loss_bbox: 0.2699 d0.dn_loss_iou: 0.3080 d1.dn_loss_cls: 0.0943 d1.dn_loss_bbox: 0.1775 d1.dn_loss_iou: 0.2110 d2.dn_loss_cls: 0.0803 d2.dn_loss_bbox: 0.1635 d2.dn_loss_iou: 0.1949 d3.dn_loss_cls: 0.0756 d3.dn_loss_bbox: 0.1561 d3.dn_loss_iou: 0.1887 d4.dn_loss_cls: 0.0728 d4.dn_loss_bbox: 0.1551 d4.dn_loss_iou: 0.1869 d1.loss_lmm_region: 0.0953 loss_lmm_image: 0.7182 2024/11/13 21:37:32 - mmengine - INFO - Iter(train) [141900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:31:09 time: 2.0141 data_time: 0.0188 memory: 35122 grad_norm: 30.5909 loss: 8.7528 loss_cls: 0.2674 loss_bbox: 0.1438 loss_iou: 0.2691 d0.loss_cls: 0.3035 d0.loss_bbox: 0.1507 d0.loss_iou: 0.2765 d1.loss_cls: 0.2850 d1.loss_bbox: 0.1444 d1.loss_iou: 0.2697 d2.loss_cls: 0.2764 d2.loss_bbox: 0.1427 d2.loss_iou: 0.2699 d3.loss_cls: 0.2657 d3.loss_bbox: 0.1442 d3.loss_iou: 0.2713 d4.loss_cls: 0.2671 d4.loss_bbox: 0.1426 d4.loss_iou: 0.2698 enc_loss_cls: 0.3130 enc_loss_bbox: 0.1570 enc_loss_iou: 0.2907 dn_loss_cls: 0.0736 dn_loss_bbox: 0.1547 dn_loss_iou: 0.2086 d0.dn_loss_cls: 0.1567 d0.dn_loss_bbox: 0.2923 d0.dn_loss_iou: 0.3464 d1.dn_loss_cls: 0.0987 d1.dn_loss_bbox: 0.1830 d1.dn_loss_iou: 0.2361 d2.dn_loss_cls: 0.0833 d2.dn_loss_bbox: 0.1634 d2.dn_loss_iou: 0.2171 d3.dn_loss_cls: 0.0758 d3.dn_loss_bbox: 0.1563 d3.dn_loss_iou: 0.2107 d4.dn_loss_cls: 0.0741 d4.dn_loss_bbox: 0.1548 d4.dn_loss_iou: 0.2086 d1.loss_lmm_region: 0.0914 loss_lmm_image: 0.6467 2024/11/13 21:40:52 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 21:40:52 - mmengine - INFO - Iter(train) [142000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:27:48 time: 1.9841 data_time: 0.0189 memory: 34831 grad_norm: 29.6010 loss: 8.9340 loss_cls: 0.2805 loss_bbox: 0.1485 loss_iou: 0.2867 d0.loss_cls: 0.3175 d0.loss_bbox: 0.1594 d0.loss_iou: 0.2981 d1.loss_cls: 0.2969 d1.loss_bbox: 0.1491 d1.loss_iou: 0.2908 d2.loss_cls: 0.2936 d2.loss_bbox: 0.1494 d2.loss_iou: 0.2857 d3.loss_cls: 0.2837 d3.loss_bbox: 0.1511 d3.loss_iou: 0.2893 d4.loss_cls: 0.2811 d4.loss_bbox: 0.1481 d4.loss_iou: 0.2857 enc_loss_cls: 0.3242 enc_loss_bbox: 0.1653 enc_loss_iou: 0.3076 dn_loss_cls: 0.0704 dn_loss_bbox: 0.1416 dn_loss_iou: 0.2027 d0.dn_loss_cls: 0.1442 d0.dn_loss_bbox: 0.2803 d0.dn_loss_iou: 0.3340 d1.dn_loss_cls: 0.0972 d1.dn_loss_bbox: 0.1713 d1.dn_loss_iou: 0.2285 d2.dn_loss_cls: 0.0809 d2.dn_loss_bbox: 0.1506 d2.dn_loss_iou: 0.2111 d3.dn_loss_cls: 0.0742 d3.dn_loss_bbox: 0.1427 d3.dn_loss_iou: 0.2042 d4.dn_loss_cls: 0.0711 d4.dn_loss_bbox: 0.1417 d4.dn_loss_iou: 0.2027 d1.loss_lmm_region: 0.0877 loss_lmm_image: 0.7045 2024/11/13 21:44:14 - mmengine - INFO - Iter(train) [142100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:24:27 time: 2.0248 data_time: 0.0189 memory: 35501 grad_norm: 26.2549 loss: 8.3216 loss_cls: 0.2492 loss_bbox: 0.1305 loss_iou: 0.2275 d0.loss_cls: 0.2918 d0.loss_bbox: 0.1390 d0.loss_iou: 0.2415 d1.loss_cls: 0.2734 d1.loss_bbox: 0.1376 d1.loss_iou: 0.2359 d2.loss_cls: 0.2660 d2.loss_bbox: 0.1266 d2.loss_iou: 0.2269 d3.loss_cls: 0.2585 d3.loss_bbox: 0.1266 d3.loss_iou: 0.2257 d4.loss_cls: 0.2532 d4.loss_bbox: 0.1284 d4.loss_iou: 0.2267 enc_loss_cls: 0.2983 enc_loss_bbox: 0.1451 enc_loss_iou: 0.2543 dn_loss_cls: 0.0709 dn_loss_bbox: 0.1576 dn_loss_iou: 0.2014 d0.dn_loss_cls: 0.1472 d0.dn_loss_bbox: 0.2846 d0.dn_loss_iou: 0.3304 
d1.dn_loss_cls: 0.0999 d1.dn_loss_bbox: 0.1868 d1.dn_loss_iou: 0.2269 d2.dn_loss_cls: 0.0822 d2.dn_loss_bbox: 0.1662 d2.dn_loss_iou: 0.2089 d3.dn_loss_cls: 0.0746 d3.dn_loss_bbox: 0.1595 d3.dn_loss_iou: 0.2033 d4.dn_loss_cls: 0.0718 d4.dn_loss_bbox: 0.1576 d4.dn_loss_iou: 0.2014 d1.loss_lmm_region: 0.0974 loss_lmm_image: 0.7305 2024/11/13 21:47:31 - mmengine - INFO - Iter(train) [142200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:21:06 time: 1.9742 data_time: 0.0188 memory: 34133 grad_norm: 25.4636 loss: 8.8435 loss_cls: 0.2717 loss_bbox: 0.1440 loss_iou: 0.2600 d0.loss_cls: 0.3072 d0.loss_bbox: 0.1476 d0.loss_iou: 0.2676 d1.loss_cls: 0.2907 d1.loss_bbox: 0.1405 d1.loss_iou: 0.2614 d2.loss_cls: 0.2805 d2.loss_bbox: 0.1411 d2.loss_iou: 0.2593 d3.loss_cls: 0.2746 d3.loss_bbox: 0.1436 d3.loss_iou: 0.2604 d4.loss_cls: 0.2725 d4.loss_bbox: 0.1420 d4.loss_iou: 0.2589 enc_loss_cls: 0.3183 enc_loss_bbox: 0.1577 enc_loss_iou: 0.2790 dn_loss_cls: 0.0881 dn_loss_bbox: 0.1550 dn_loss_iou: 0.2126 d0.dn_loss_cls: 0.1635 d0.dn_loss_bbox: 0.2771 d0.dn_loss_iou: 0.3308 d1.dn_loss_cls: 0.1151 d1.dn_loss_bbox: 0.1824 d1.dn_loss_iou: 0.2354 d2.dn_loss_cls: 0.0979 d2.dn_loss_bbox: 0.1662 d2.dn_loss_iou: 0.2211 d3.dn_loss_cls: 0.0925 d3.dn_loss_bbox: 0.1568 d3.dn_loss_iou: 0.2145 d4.dn_loss_cls: 0.0887 d4.dn_loss_bbox: 0.1551 d4.dn_loss_iou: 0.2126 d1.loss_lmm_region: 0.0903 loss_lmm_image: 0.7088 2024/11/13 21:50:52 - mmengine - INFO - Iter(train) [142300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:17:45 time: 1.9901 data_time: 0.0189 memory: 34515 grad_norm: 24.5060 loss: 8.4281 loss_cls: 0.2443 loss_bbox: 0.1382 loss_iou: 0.2238 d0.loss_cls: 0.2834 d0.loss_bbox: 0.1512 d0.loss_iou: 0.2365 d1.loss_cls: 0.2655 d1.loss_bbox: 0.1424 d1.loss_iou: 0.2287 d2.loss_cls: 0.2533 d2.loss_bbox: 0.1408 d2.loss_iou: 0.2242 d3.loss_cls: 0.2495 d3.loss_bbox: 0.1368 d3.loss_iou: 0.2230 d4.loss_cls: 0.2455 d4.loss_bbox: 0.1383 d4.loss_iou: 0.2238 enc_loss_cls: 0.2902 enc_loss_bbox: 0.1620 enc_loss_iou: 0.2509 dn_loss_cls: 0.0936 dn_loss_bbox: 0.1601 dn_loss_iou: 0.1951 d0.dn_loss_cls: 0.1800 d0.dn_loss_bbox: 0.2994 d0.dn_loss_iou: 0.3218 d1.dn_loss_cls: 0.1276 d1.dn_loss_bbox: 0.1899 d1.dn_loss_iou: 0.2212 d2.dn_loss_cls: 0.1093 d2.dn_loss_bbox: 0.1723 d2.dn_loss_iou: 0.2050 d3.dn_loss_cls: 0.1016 d3.dn_loss_bbox: 0.1637 d3.dn_loss_iou: 0.1977 d4.dn_loss_cls: 0.0959 d4.dn_loss_bbox: 0.1601 d4.dn_loss_iou: 0.1951 d1.loss_lmm_region: 0.0961 loss_lmm_image: 0.6903 2024/11/13 21:54:13 - mmengine - INFO - Iter(train) [142400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:14:25 time: 2.0257 data_time: 0.0190 memory: 34105 grad_norm: 36.1063 loss: 7.7438 loss_cls: 0.2467 loss_bbox: 0.1324 loss_iou: 0.2078 d0.loss_cls: 0.2817 d0.loss_bbox: 0.1365 d0.loss_iou: 0.2168 d1.loss_cls: 0.2638 d1.loss_bbox: 0.1308 d1.loss_iou: 0.2091 d2.loss_cls: 0.2599 d2.loss_bbox: 0.1283 d2.loss_iou: 0.2054 d3.loss_cls: 0.2542 d3.loss_bbox: 0.1299 d3.loss_iou: 0.2067 d4.loss_cls: 0.2474 d4.loss_bbox: 0.1348 d4.loss_iou: 0.2079 enc_loss_cls: 0.2869 enc_loss_bbox: 0.1468 enc_loss_iou: 0.2323 dn_loss_cls: 0.0629 dn_loss_bbox: 0.1400 dn_loss_iou: 0.1847 d0.dn_loss_cls: 0.1462 d0.dn_loss_bbox: 0.2620 d0.dn_loss_iou: 0.3070 d1.dn_loss_cls: 0.0946 d1.dn_loss_bbox: 0.1698 d1.dn_loss_iou: 0.2104 d2.dn_loss_cls: 0.0759 d2.dn_loss_bbox: 0.1495 d2.dn_loss_iou: 0.1929 d3.dn_loss_cls: 0.0676 d3.dn_loss_bbox: 0.1422 d3.dn_loss_iou: 0.1867 d4.dn_loss_cls: 0.0635 d4.dn_loss_bbox: 0.1400 d4.dn_loss_iou: 0.1848 d1.loss_lmm_region: 0.0807 loss_lmm_image: 0.6160 
2024/11/13 21:57:32 - mmengine - INFO - Iter(train) [142500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:11:04 time: 2.0015 data_time: 0.0188 memory: 34207 grad_norm: 27.1384 loss: 8.5626 loss_cls: 0.2817 loss_bbox: 0.1310 loss_iou: 0.2521 d0.loss_cls: 0.3205 d0.loss_bbox: 0.1487 d0.loss_iou: 0.2698 d1.loss_cls: 0.3031 d1.loss_bbox: 0.1360 d1.loss_iou: 0.2578 d2.loss_cls: 0.2949 d2.loss_bbox: 0.1348 d2.loss_iou: 0.2541 d3.loss_cls: 0.2830 d3.loss_bbox: 0.1335 d3.loss_iou: 0.2544 d4.loss_cls: 0.2822 d4.loss_bbox: 0.1312 d4.loss_iou: 0.2534 enc_loss_cls: 0.3313 enc_loss_bbox: 0.1539 enc_loss_iou: 0.2797 dn_loss_cls: 0.0647 dn_loss_bbox: 0.1397 dn_loss_iou: 0.1991 d0.dn_loss_cls: 0.1478 d0.dn_loss_bbox: 0.2787 d0.dn_loss_iou: 0.3289 d1.dn_loss_cls: 0.0949 d1.dn_loss_bbox: 0.1698 d1.dn_loss_iou: 0.2271 d2.dn_loss_cls: 0.0795 d2.dn_loss_bbox: 0.1468 d2.dn_loss_iou: 0.2070 d3.dn_loss_cls: 0.0724 d3.dn_loss_bbox: 0.1417 d3.dn_loss_iou: 0.2012 d4.dn_loss_cls: 0.0660 d4.dn_loss_bbox: 0.1397 d4.dn_loss_iou: 0.1992 d1.loss_lmm_region: 0.1004 loss_lmm_image: 0.6709 2024/11/13 22:00:53 - mmengine - INFO - Iter(train) [142600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:07:43 time: 2.0108 data_time: 0.0190 memory: 34487 grad_norm: 22.4375 loss: 9.0855 loss_cls: 0.2850 loss_bbox: 0.1471 loss_iou: 0.2911 d0.loss_cls: 0.3368 d0.loss_bbox: 0.1694 d0.loss_iou: 0.3121 d1.loss_cls: 0.3111 d1.loss_bbox: 0.1559 d1.loss_iou: 0.2966 d2.loss_cls: 0.2990 d2.loss_bbox: 0.1517 d2.loss_iou: 0.2950 d3.loss_cls: 0.2897 d3.loss_bbox: 0.1483 d3.loss_iou: 0.2930 d4.loss_cls: 0.2871 d4.loss_bbox: 0.1467 d4.loss_iou: 0.2916 enc_loss_cls: 0.3426 enc_loss_bbox: 0.1781 enc_loss_iou: 0.3278 dn_loss_cls: 0.0599 dn_loss_bbox: 0.1456 dn_loss_iou: 0.2014 d0.dn_loss_cls: 0.1506 d0.dn_loss_bbox: 0.2862 d0.dn_loss_iou: 0.3325 d1.dn_loss_cls: 0.0929 d1.dn_loss_bbox: 0.1730 d1.dn_loss_iou: 0.2280 d2.dn_loss_cls: 0.0725 d2.dn_loss_bbox: 0.1550 d2.dn_loss_iou: 0.2102 d3.dn_loss_cls: 0.0662 d3.dn_loss_bbox: 0.1476 d3.dn_loss_iou: 0.2038 d4.dn_loss_cls: 0.0612 d4.dn_loss_bbox: 0.1456 d4.dn_loss_iou: 0.2014 d1.loss_lmm_region: 0.1027 loss_lmm_image: 0.6935 2024/11/13 22:04:16 - mmengine - INFO - Iter(train) [142700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:04:22 time: 2.0130 data_time: 0.0189 memory: 34064 grad_norm: 23.0355 loss: 8.6731 loss_cls: 0.2851 loss_bbox: 0.1361 loss_iou: 0.2649 d0.loss_cls: 0.3305 d0.loss_bbox: 0.1539 d0.loss_iou: 0.2789 d1.loss_cls: 0.3064 d1.loss_bbox: 0.1420 d1.loss_iou: 0.2714 d2.loss_cls: 0.2959 d2.loss_bbox: 0.1380 d2.loss_iou: 0.2662 d3.loss_cls: 0.2857 d3.loss_bbox: 0.1381 d3.loss_iou: 0.2658 d4.loss_cls: 0.2850 d4.loss_bbox: 0.1355 d4.loss_iou: 0.2656 enc_loss_cls: 0.3379 enc_loss_bbox: 0.1634 enc_loss_iou: 0.2947 dn_loss_cls: 0.0719 dn_loss_bbox: 0.1392 dn_loss_iou: 0.1922 d0.dn_loss_cls: 0.1451 d0.dn_loss_bbox: 0.2626 d0.dn_loss_iou: 0.3124 d1.dn_loss_cls: 0.0951 d1.dn_loss_bbox: 0.1655 d1.dn_loss_iou: 0.2163 d2.dn_loss_cls: 0.0804 d2.dn_loss_bbox: 0.1462 d2.dn_loss_iou: 0.1996 d3.dn_loss_cls: 0.0744 d3.dn_loss_bbox: 0.1410 d3.dn_loss_iou: 0.1941 d4.dn_loss_cls: 0.0727 d4.dn_loss_bbox: 0.1390 d4.dn_loss_iou: 0.1923 d1.loss_lmm_region: 0.0907 loss_lmm_image: 0.7013 2024/11/13 22:07:36 - mmengine - INFO - Iter(train) [142800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 4:01:01 time: 2.0034 data_time: 0.0187 memory: 33705 grad_norm: 21.2174 loss: 8.2648 loss_cls: 0.2452 loss_bbox: 0.1242 loss_iou: 0.2266 d0.loss_cls: 0.2827 d0.loss_bbox: 0.1402 d0.loss_iou: 0.2421 d1.loss_cls: 0.2597 
d1.loss_bbox: 0.1316 d1.loss_iou: 0.2361 d2.loss_cls: 0.2543 d2.loss_bbox: 0.1289 d2.loss_iou: 0.2303 d3.loss_cls: 0.2491 d3.loss_bbox: 0.1257 d3.loss_iou: 0.2279 d4.loss_cls: 0.2462 d4.loss_bbox: 0.1253 d4.loss_iou: 0.2261 enc_loss_cls: 0.2847 enc_loss_bbox: 0.1519 enc_loss_iou: 0.2569 dn_loss_cls: 0.0707 dn_loss_bbox: 0.1594 dn_loss_iou: 0.2153 d0.dn_loss_cls: 0.1518 d0.dn_loss_bbox: 0.2894 d0.dn_loss_iou: 0.3400 d1.dn_loss_cls: 0.0984 d1.dn_loss_bbox: 0.1852 d1.dn_loss_iou: 0.2406 d2.dn_loss_cls: 0.0807 d2.dn_loss_bbox: 0.1678 d2.dn_loss_iou: 0.2240 d3.dn_loss_cls: 0.0736 d3.dn_loss_bbox: 0.1612 d3.dn_loss_iou: 0.2174 d4.dn_loss_cls: 0.0713 d4.dn_loss_bbox: 0.1595 d4.dn_loss_iou: 0.2154 d1.loss_lmm_region: 0.1213 loss_lmm_image: 0.6258 2024/11/13 22:10:56 - mmengine - INFO - Iter(train) [142900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:57:40 time: 2.0000 data_time: 0.0188 memory: 34118 grad_norm: 23.3121 loss: 7.6077 loss_cls: 0.2069 loss_bbox: 0.1214 loss_iou: 0.2105 d0.loss_cls: 0.2374 d0.loss_bbox: 0.1300 d0.loss_iou: 0.2216 d1.loss_cls: 0.2233 d1.loss_bbox: 0.1215 d1.loss_iou: 0.2127 d2.loss_cls: 0.2135 d2.loss_bbox: 0.1194 d2.loss_iou: 0.2087 d3.loss_cls: 0.2085 d3.loss_bbox: 0.1207 d3.loss_iou: 0.2099 d4.loss_cls: 0.2069 d4.loss_bbox: 0.1217 d4.loss_iou: 0.2113 enc_loss_cls: 0.2442 enc_loss_bbox: 0.1395 enc_loss_iou: 0.2315 dn_loss_cls: 0.0725 dn_loss_bbox: 0.1481 dn_loss_iou: 0.1860 d0.dn_loss_cls: 0.1459 d0.dn_loss_bbox: 0.2821 d0.dn_loss_iou: 0.3103 d1.dn_loss_cls: 0.0985 d1.dn_loss_bbox: 0.1722 d1.dn_loss_iou: 0.2101 d2.dn_loss_cls: 0.0824 d2.dn_loss_bbox: 0.1566 d2.dn_loss_iou: 0.1943 d3.dn_loss_cls: 0.0763 d3.dn_loss_bbox: 0.1495 d3.dn_loss_iou: 0.1881 d4.dn_loss_cls: 0.0738 d4.dn_loss_bbox: 0.1481 d4.dn_loss_iou: 0.1860 d1.loss_lmm_region: 0.0956 loss_lmm_image: 0.7101 2024/11/13 22:14:16 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 22:14:16 - mmengine - INFO - Iter(train) [143000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:54:19 time: 2.0061 data_time: 0.0188 memory: 33044 grad_norm: 23.2480 loss: 8.9152 loss_cls: 0.2743 loss_bbox: 0.1577 loss_iou: 0.2426 d0.loss_cls: 0.3180 d0.loss_bbox: 0.1651 d0.loss_iou: 0.2525 d1.loss_cls: 0.2985 d1.loss_bbox: 0.1573 d1.loss_iou: 0.2457 d2.loss_cls: 0.2889 d2.loss_bbox: 0.1537 d2.loss_iou: 0.2418 d3.loss_cls: 0.2793 d3.loss_bbox: 0.1559 d3.loss_iou: 0.2413 d4.loss_cls: 0.2736 d4.loss_bbox: 0.1585 d4.loss_iou: 0.2428 enc_loss_cls: 0.3312 enc_loss_bbox: 0.1751 enc_loss_iou: 0.2663 dn_loss_cls: 0.0969 dn_loss_bbox: 0.1733 dn_loss_iou: 0.2018 d0.dn_loss_cls: 0.1692 d0.dn_loss_bbox: 0.2861 d0.dn_loss_iou: 0.3166 d1.dn_loss_cls: 0.1241 d1.dn_loss_bbox: 0.1948 d1.dn_loss_iou: 0.2249 d2.dn_loss_cls: 0.1064 d2.dn_loss_bbox: 0.1820 d2.dn_loss_iou: 0.2093 d3.dn_loss_cls: 0.1011 d3.dn_loss_bbox: 0.1755 d3.dn_loss_iou: 0.2038 d4.dn_loss_cls: 0.0988 d4.dn_loss_bbox: 0.1733 d4.dn_loss_iou: 0.2017 d1.loss_lmm_region: 0.0872 loss_lmm_image: 0.6681 2024/11/13 22:17:36 - mmengine - INFO - Iter(train) [143100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:50:58 time: 1.9998 data_time: 0.0189 memory: 34526 grad_norm: 25.7101 loss: 8.4515 loss_cls: 0.2840 loss_bbox: 0.1261 loss_iou: 0.2424 d0.loss_cls: 0.3259 d0.loss_bbox: 0.1348 d0.loss_iou: 0.2573 d1.loss_cls: 0.3030 d1.loss_bbox: 0.1306 d1.loss_iou: 0.2486 d2.loss_cls: 0.2963 d2.loss_bbox: 0.1269 d2.loss_iou: 0.2457 d3.loss_cls: 0.2891 d3.loss_bbox: 0.1273 d3.loss_iou: 0.2430 d4.loss_cls: 0.2865 d4.loss_bbox: 0.1259 d4.loss_iou: 0.2426 enc_loss_cls: 
0.3329 enc_loss_bbox: 0.1431 enc_loss_iou: 0.2713 dn_loss_cls: 0.0736 dn_loss_bbox: 0.1343 dn_loss_iou: 0.1937 d0.dn_loss_cls: 0.1549 d0.dn_loss_bbox: 0.2521 d0.dn_loss_iou: 0.3180 d1.dn_loss_cls: 0.1026 d1.dn_loss_bbox: 0.1583 d1.dn_loss_iou: 0.2192 d2.dn_loss_cls: 0.0873 d2.dn_loss_bbox: 0.1409 d2.dn_loss_iou: 0.2014 d3.dn_loss_cls: 0.0797 d3.dn_loss_bbox: 0.1355 d3.dn_loss_iou: 0.1956 d4.dn_loss_cls: 0.0753 d4.dn_loss_bbox: 0.1344 d4.dn_loss_iou: 0.1938 d1.loss_lmm_region: 0.0836 loss_lmm_image: 0.7341 2024/11/13 22:20:56 - mmengine - INFO - Iter(train) [143200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:47:38 time: 1.9747 data_time: 0.0189 memory: 34139 grad_norm: 27.7340 loss: 8.3907 loss_cls: 0.2730 loss_bbox: 0.1116 loss_iou: 0.2233 d0.loss_cls: 0.3056 d0.loss_bbox: 0.1203 d0.loss_iou: 0.2420 d1.loss_cls: 0.2918 d1.loss_bbox: 0.1163 d1.loss_iou: 0.2299 d2.loss_cls: 0.2845 d2.loss_bbox: 0.1125 d2.loss_iou: 0.2279 d3.loss_cls: 0.2748 d3.loss_bbox: 0.1126 d3.loss_iou: 0.2252 d4.loss_cls: 0.2715 d4.loss_bbox: 0.1116 d4.loss_iou: 0.2235 enc_loss_cls: 0.3072 enc_loss_bbox: 0.1365 enc_loss_iou: 0.2586 dn_loss_cls: 0.1194 dn_loss_bbox: 0.1329 dn_loss_iou: 0.1978 d0.dn_loss_cls: 0.1898 d0.dn_loss_bbox: 0.2568 d0.dn_loss_iou: 0.3249 d1.dn_loss_cls: 0.1414 d1.dn_loss_bbox: 0.1561 d1.dn_loss_iou: 0.2218 d2.dn_loss_cls: 0.1282 d2.dn_loss_bbox: 0.1392 d2.dn_loss_iou: 0.2053 d3.dn_loss_cls: 0.1207 d3.dn_loss_bbox: 0.1331 d3.dn_loss_iou: 0.1998 d4.dn_loss_cls: 0.1154 d4.dn_loss_bbox: 0.1329 d4.dn_loss_iou: 0.1979 d1.loss_lmm_region: 0.0922 loss_lmm_image: 0.7251 2024/11/13 22:24:16 - mmengine - INFO - Iter(train) [143300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:44:17 time: 1.9876 data_time: 0.0187 memory: 35923 grad_norm: 24.9848 loss: 8.2906 loss_cls: 0.2417 loss_bbox: 0.1301 loss_iou: 0.2362 d0.loss_cls: 0.2766 d0.loss_bbox: 0.1441 d0.loss_iou: 0.2514 d1.loss_cls: 0.2536 d1.loss_bbox: 0.1345 d1.loss_iou: 0.2443 d2.loss_cls: 0.2458 d2.loss_bbox: 0.1330 d2.loss_iou: 0.2392 d3.loss_cls: 0.2457 d3.loss_bbox: 0.1310 d3.loss_iou: 0.2353 d4.loss_cls: 0.2422 d4.loss_bbox: 0.1308 d4.loss_iou: 0.2373 enc_loss_cls: 0.2875 enc_loss_bbox: 0.1540 enc_loss_iou: 0.2650 dn_loss_cls: 0.0625 dn_loss_bbox: 0.1723 dn_loss_iou: 0.2025 d0.dn_loss_cls: 0.1363 d0.dn_loss_bbox: 0.3259 d0.dn_loss_iou: 0.3340 d1.dn_loss_cls: 0.0869 d1.dn_loss_bbox: 0.2030 d1.dn_loss_iou: 0.2293 d2.dn_loss_cls: 0.0732 d2.dn_loss_bbox: 0.1835 d2.dn_loss_iou: 0.2119 d3.dn_loss_cls: 0.0673 d3.dn_loss_bbox: 0.1734 d3.dn_loss_iou: 0.2042 d4.dn_loss_cls: 0.0629 d4.dn_loss_bbox: 0.1724 d4.dn_loss_iou: 0.2026 d1.loss_lmm_region: 0.0875 loss_lmm_image: 0.6397 2024/11/13 22:27:35 - mmengine - INFO - Iter(train) [143400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:40:56 time: 2.0154 data_time: 0.0191 memory: 32799 grad_norm: 25.9434 loss: 8.7512 loss_cls: 0.2531 loss_bbox: 0.1534 loss_iou: 0.2391 d0.loss_cls: 0.2890 d0.loss_bbox: 0.1595 d0.loss_iou: 0.2535 d1.loss_cls: 0.2704 d1.loss_bbox: 0.1542 d1.loss_iou: 0.2438 d2.loss_cls: 0.2657 d2.loss_bbox: 0.1516 d2.loss_iou: 0.2389 d3.loss_cls: 0.2630 d3.loss_bbox: 0.1491 d3.loss_iou: 0.2362 d4.loss_cls: 0.2555 d4.loss_bbox: 0.1503 d4.loss_iou: 0.2384 enc_loss_cls: 0.2914 enc_loss_bbox: 0.1685 enc_loss_iou: 0.2621 dn_loss_cls: 0.0725 dn_loss_bbox: 0.1895 dn_loss_iou: 0.2150 d0.dn_loss_cls: 0.1544 d0.dn_loss_bbox: 0.3446 d0.dn_loss_iou: 0.3451 d1.dn_loss_cls: 0.1029 d1.dn_loss_bbox: 0.2211 d1.dn_loss_iou: 0.2409 d2.dn_loss_cls: 0.0841 d2.dn_loss_bbox: 0.1976 d2.dn_loss_iou: 0.2224 
d3.dn_loss_cls: 0.0764 d3.dn_loss_bbox: 0.1917 d3.dn_loss_iou: 0.2168 d4.dn_loss_cls: 0.0742 d4.dn_loss_bbox: 0.1895 d4.dn_loss_iou: 0.2150 d1.loss_lmm_region: 0.0833 loss_lmm_image: 0.6274 2024/11/13 22:30:55 - mmengine - INFO - Iter(train) [143500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:37:35 time: 2.0091 data_time: 0.0190 memory: 33543 grad_norm: 22.2280 loss: 7.4036 loss_cls: 0.2190 loss_bbox: 0.1075 loss_iou: 0.1969 d0.loss_cls: 0.2509 d0.loss_bbox: 0.1230 d0.loss_iou: 0.2102 d1.loss_cls: 0.2318 d1.loss_bbox: 0.1129 d1.loss_iou: 0.1999 d2.loss_cls: 0.2259 d2.loss_bbox: 0.1100 d2.loss_iou: 0.1965 d3.loss_cls: 0.2196 d3.loss_bbox: 0.1077 d3.loss_iou: 0.1968 d4.loss_cls: 0.2192 d4.loss_bbox: 0.1068 d4.loss_iou: 0.1959 enc_loss_cls: 0.2588 enc_loss_bbox: 0.1324 enc_loss_iou: 0.2209 dn_loss_cls: 0.0759 dn_loss_bbox: 0.1323 dn_loss_iou: 0.1760 d0.dn_loss_cls: 0.1514 d0.dn_loss_bbox: 0.2744 d0.dn_loss_iou: 0.3009 d1.dn_loss_cls: 0.1053 d1.dn_loss_bbox: 0.1582 d1.dn_loss_iou: 0.1976 d2.dn_loss_cls: 0.0880 d2.dn_loss_bbox: 0.1395 d2.dn_loss_iou: 0.1829 d3.dn_loss_cls: 0.0813 d3.dn_loss_bbox: 0.1346 d3.dn_loss_iou: 0.1780 d4.dn_loss_cls: 0.0769 d4.dn_loss_bbox: 0.1323 d4.dn_loss_iou: 0.1761 d1.loss_lmm_region: 0.0832 loss_lmm_image: 0.7166 2024/11/13 22:34:14 - mmengine - INFO - Iter(train) [143600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:34:14 time: 1.9961 data_time: 0.0189 memory: 33745 grad_norm: 24.5099 loss: 8.1685 loss_cls: 0.2823 loss_bbox: 0.1194 loss_iou: 0.2548 d0.loss_cls: 0.3151 d0.loss_bbox: 0.1284 d0.loss_iou: 0.2642 d1.loss_cls: 0.2931 d1.loss_bbox: 0.1222 d1.loss_iou: 0.2598 d2.loss_cls: 0.2910 d2.loss_bbox: 0.1186 d2.loss_iou: 0.2561 d3.loss_cls: 0.2842 d3.loss_bbox: 0.1203 d3.loss_iou: 0.2563 d4.loss_cls: 0.2827 d4.loss_bbox: 0.1200 d4.loss_iou: 0.2554 enc_loss_cls: 0.3154 enc_loss_bbox: 0.1384 enc_loss_iou: 0.2810 dn_loss_cls: 0.0888 dn_loss_bbox: 0.1128 dn_loss_iou: 0.1784 d0.dn_loss_cls: 0.1551 d0.dn_loss_bbox: 0.2198 d0.dn_loss_iou: 0.2905 d1.dn_loss_cls: 0.1146 d1.dn_loss_bbox: 0.1307 d1.dn_loss_iou: 0.1990 d2.dn_loss_cls: 0.0972 d2.dn_loss_bbox: 0.1197 d2.dn_loss_iou: 0.1854 d3.dn_loss_cls: 0.0920 d3.dn_loss_bbox: 0.1138 d3.dn_loss_iou: 0.1798 d4.dn_loss_cls: 0.0883 d4.dn_loss_bbox: 0.1128 d4.dn_loss_iou: 0.1784 d1.loss_lmm_region: 0.0884 loss_lmm_image: 0.6644 2024/11/13 22:37:32 - mmengine - INFO - Iter(train) [143700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:30:53 time: 1.9638 data_time: 0.0188 memory: 33527 grad_norm: 25.7334 loss: 7.4050 loss_cls: 0.2257 loss_bbox: 0.1135 loss_iou: 0.1863 d0.loss_cls: 0.2664 d0.loss_bbox: 0.1262 d0.loss_iou: 0.2024 d1.loss_cls: 0.2448 d1.loss_bbox: 0.1193 d1.loss_iou: 0.1913 d2.loss_cls: 0.2398 d2.loss_bbox: 0.1106 d2.loss_iou: 0.1844 d3.loss_cls: 0.2291 d3.loss_bbox: 0.1122 d3.loss_iou: 0.1856 d4.loss_cls: 0.2274 d4.loss_bbox: 0.1127 d4.loss_iou: 0.1854 enc_loss_cls: 0.2749 enc_loss_bbox: 0.1388 enc_loss_iou: 0.2220 dn_loss_cls: 0.0670 dn_loss_bbox: 0.1457 dn_loss_iou: 0.1648 d0.dn_loss_cls: 0.1438 d0.dn_loss_bbox: 0.2789 d0.dn_loss_iou: 0.2869 d1.dn_loss_cls: 0.0936 d1.dn_loss_bbox: 0.1730 d1.dn_loss_iou: 0.1923 d2.dn_loss_cls: 0.0765 d2.dn_loss_bbox: 0.1558 d2.dn_loss_iou: 0.1750 d3.dn_loss_cls: 0.0716 d3.dn_loss_bbox: 0.1480 d3.dn_loss_iou: 0.1675 d4.dn_loss_cls: 0.0683 d4.dn_loss_bbox: 0.1457 d4.dn_loss_iou: 0.1649 d1.loss_lmm_region: 0.0796 loss_lmm_image: 0.7074 2024/11/13 22:40:52 - mmengine - INFO - Iter(train) [143800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:27:32 time: 2.0060 data_time: 
0.0189 memory: 33619 grad_norm: 24.2431 loss: 8.0007 loss_cls: 0.2253 loss_bbox: 0.1441 loss_iou: 0.2292 d0.loss_cls: 0.2576 d0.loss_bbox: 0.1546 d0.loss_iou: 0.2398 d1.loss_cls: 0.2416 d1.loss_bbox: 0.1437 d1.loss_iou: 0.2342 d2.loss_cls: 0.2333 d2.loss_bbox: 0.1442 d2.loss_iou: 0.2320 d3.loss_cls: 0.2326 d3.loss_bbox: 0.1394 d3.loss_iou: 0.2286 d4.loss_cls: 0.2281 d4.loss_bbox: 0.1418 d4.loss_iou: 0.2288 enc_loss_cls: 0.2781 enc_loss_bbox: 0.1597 enc_loss_iou: 0.2514 dn_loss_cls: 0.0709 dn_loss_bbox: 0.1542 dn_loss_iou: 0.1841 d0.dn_loss_cls: 0.1409 d0.dn_loss_bbox: 0.2861 d0.dn_loss_iou: 0.3074 d1.dn_loss_cls: 0.0948 d1.dn_loss_bbox: 0.1810 d1.dn_loss_iou: 0.2117 d2.dn_loss_cls: 0.0775 d2.dn_loss_bbox: 0.1620 d2.dn_loss_iou: 0.1920 d3.dn_loss_cls: 0.0734 d3.dn_loss_bbox: 0.1563 d3.dn_loss_iou: 0.1863 d4.dn_loss_cls: 0.0717 d4.dn_loss_bbox: 0.1541 d4.dn_loss_iou: 0.1842 d1.loss_lmm_region: 0.0804 loss_lmm_image: 0.6639 2024/11/13 22:44:14 - mmengine - INFO - Iter(train) [143900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:24:11 time: 2.0069 data_time: 0.0190 memory: 34003 grad_norm: 30.7709 loss: 8.2760 loss_cls: 0.2872 loss_bbox: 0.1237 loss_iou: 0.2349 d0.loss_cls: 0.3182 d0.loss_bbox: 0.1374 d0.loss_iou: 0.2503 d1.loss_cls: 0.3017 d1.loss_bbox: 0.1321 d1.loss_iou: 0.2421 d2.loss_cls: 0.2938 d2.loss_bbox: 0.1273 d2.loss_iou: 0.2386 d3.loss_cls: 0.2885 d3.loss_bbox: 0.1254 d3.loss_iou: 0.2342 d4.loss_cls: 0.2853 d4.loss_bbox: 0.1266 d4.loss_iou: 0.2370 enc_loss_cls: 0.3274 enc_loss_bbox: 0.1458 enc_loss_iou: 0.2626 dn_loss_cls: 0.0894 dn_loss_bbox: 0.1278 dn_loss_iou: 0.1787 d0.dn_loss_cls: 0.1469 d0.dn_loss_bbox: 0.2468 d0.dn_loss_iou: 0.2987 d1.dn_loss_cls: 0.1093 d1.dn_loss_bbox: 0.1574 d1.dn_loss_iou: 0.2051 d2.dn_loss_cls: 0.0972 d2.dn_loss_bbox: 0.1375 d2.dn_loss_iou: 0.1866 d3.dn_loss_cls: 0.0926 d3.dn_loss_bbox: 0.1300 d3.dn_loss_iou: 0.1803 d4.dn_loss_cls: 0.0903 d4.dn_loss_bbox: 0.1278 d4.dn_loss_iou: 0.1787 d1.loss_lmm_region: 0.0901 loss_lmm_image: 0.6846 2024/11/13 22:47:33 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 22:47:33 - mmengine - INFO - Iter(train) [144000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:20:50 time: 1.9921 data_time: 0.0191 memory: 33936 grad_norm: 23.6078 loss: 8.7991 loss_cls: 0.2744 loss_bbox: 0.1441 loss_iou: 0.2664 d0.loss_cls: 0.3166 d0.loss_bbox: 0.1617 d0.loss_iou: 0.2852 d1.loss_cls: 0.2987 d1.loss_bbox: 0.1455 d1.loss_iou: 0.2675 d2.loss_cls: 0.2840 d2.loss_bbox: 0.1453 d2.loss_iou: 0.2647 d3.loss_cls: 0.2768 d3.loss_bbox: 0.1447 d3.loss_iou: 0.2657 d4.loss_cls: 0.2737 d4.loss_bbox: 0.1439 d4.loss_iou: 0.2648 enc_loss_cls: 0.3178 enc_loss_bbox: 0.1645 enc_loss_iou: 0.2967 dn_loss_cls: 0.0590 dn_loss_bbox: 0.1587 dn_loss_iou: 0.2048 d0.dn_loss_cls: 0.1461 d0.dn_loss_bbox: 0.2934 d0.dn_loss_iou: 0.3402 d1.dn_loss_cls: 0.0916 d1.dn_loss_bbox: 0.1895 d1.dn_loss_iou: 0.2337 d2.dn_loss_cls: 0.0716 d2.dn_loss_bbox: 0.1691 d2.dn_loss_iou: 0.2143 d3.dn_loss_cls: 0.0645 d3.dn_loss_bbox: 0.1599 d3.dn_loss_iou: 0.2063 d4.dn_loss_cls: 0.0587 d4.dn_loss_bbox: 0.1587 d4.dn_loss_iou: 0.2048 d1.loss_lmm_region: 0.0905 loss_lmm_image: 0.6811 2024/11/13 22:50:53 - mmengine - INFO - Iter(train) [144100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:17:29 time: 2.0073 data_time: 0.0191 memory: 34494 grad_norm: 29.2547 loss: 8.6247 loss_cls: 0.2732 loss_bbox: 0.1323 loss_iou: 0.2253 d0.loss_cls: 0.3127 d0.loss_bbox: 0.1439 d0.loss_iou: 0.2356 d1.loss_cls: 0.2987 d1.loss_bbox: 0.1345 d1.loss_iou: 0.2277 d2.loss_cls: 
0.2893 d2.loss_bbox: 0.1332 d2.loss_iou: 0.2262 d3.loss_cls: 0.2829 d3.loss_bbox: 0.1317 d3.loss_iou: 0.2229 d4.loss_cls: 0.2763 d4.loss_bbox: 0.1315 d4.loss_iou: 0.2245 enc_loss_cls: 0.3223 enc_loss_bbox: 0.1499 enc_loss_iou: 0.2482 dn_loss_cls: 0.0814 dn_loss_bbox: 0.1613 dn_loss_iou: 0.2027 d0.dn_loss_cls: 0.1724 d0.dn_loss_bbox: 0.3167 d0.dn_loss_iou: 0.3418 d1.dn_loss_cls: 0.1218 d1.dn_loss_bbox: 0.1919 d1.dn_loss_iou: 0.2308 d2.dn_loss_cls: 0.1016 d2.dn_loss_bbox: 0.1696 d2.dn_loss_iou: 0.2113 d3.dn_loss_cls: 0.0862 d3.dn_loss_bbox: 0.1623 d3.dn_loss_iou: 0.2048 d4.dn_loss_cls: 0.0810 d4.dn_loss_bbox: 0.1613 d4.dn_loss_iou: 0.2026 d1.loss_lmm_region: 0.0928 loss_lmm_image: 0.7076 2024/11/13 22:54:11 - mmengine - INFO - Iter(train) [144200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:14:08 time: 1.9914 data_time: 0.0189 memory: 32379 grad_norm: 30.4541 loss: 8.8438 loss_cls: 0.2696 loss_bbox: 0.1387 loss_iou: 0.2450 d0.loss_cls: 0.3060 d0.loss_bbox: 0.1435 d0.loss_iou: 0.2572 d1.loss_cls: 0.2831 d1.loss_bbox: 0.1430 d1.loss_iou: 0.2490 d2.loss_cls: 0.2746 d2.loss_bbox: 0.1397 d2.loss_iou: 0.2467 d3.loss_cls: 0.2697 d3.loss_bbox: 0.1389 d3.loss_iou: 0.2455 d4.loss_cls: 0.2700 d4.loss_bbox: 0.1383 d4.loss_iou: 0.2439 enc_loss_cls: 0.3084 enc_loss_bbox: 0.1515 enc_loss_iou: 0.2684 dn_loss_cls: 0.0990 dn_loss_bbox: 0.1633 dn_loss_iou: 0.2097 d0.dn_loss_cls: 0.1880 d0.dn_loss_bbox: 0.2897 d0.dn_loss_iou: 0.3424 d1.dn_loss_cls: 0.1370 d1.dn_loss_bbox: 0.1879 d1.dn_loss_iou: 0.2355 d2.dn_loss_cls: 0.1111 d2.dn_loss_bbox: 0.1695 d2.dn_loss_iou: 0.2167 d3.dn_loss_cls: 0.1026 d3.dn_loss_bbox: 0.1646 d3.dn_loss_iou: 0.2116 d4.dn_loss_cls: 0.1003 d4.dn_loss_bbox: 0.1633 d4.dn_loss_iou: 0.2097 d1.loss_lmm_region: 0.0997 loss_lmm_image: 0.7117 2024/11/13 22:57:30 - mmengine - INFO - Iter(train) [144300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:10:48 time: 1.9979 data_time: 0.0189 memory: 33816 grad_norm: 22.8411 loss: 8.4911 loss_cls: 0.2818 loss_bbox: 0.1269 loss_iou: 0.2692 d0.loss_cls: 0.3197 d0.loss_bbox: 0.1391 d0.loss_iou: 0.2794 d1.loss_cls: 0.3018 d1.loss_bbox: 0.1293 d1.loss_iou: 0.2725 d2.loss_cls: 0.2929 d2.loss_bbox: 0.1245 d2.loss_iou: 0.2698 d3.loss_cls: 0.2866 d3.loss_bbox: 0.1257 d3.loss_iou: 0.2686 d4.loss_cls: 0.2841 d4.loss_bbox: 0.1250 d4.loss_iou: 0.2681 enc_loss_cls: 0.3219 enc_loss_bbox: 0.1504 enc_loss_iou: 0.2940 dn_loss_cls: 0.0749 dn_loss_bbox: 0.1219 dn_loss_iou: 0.1933 d0.dn_loss_cls: 0.1518 d0.dn_loss_bbox: 0.2466 d0.dn_loss_iou: 0.3129 d1.dn_loss_cls: 0.1052 d1.dn_loss_bbox: 0.1473 d1.dn_loss_iou: 0.2150 d2.dn_loss_cls: 0.0841 d2.dn_loss_bbox: 0.1313 d2.dn_loss_iou: 0.2006 d3.dn_loss_cls: 0.0757 d3.dn_loss_bbox: 0.1230 d3.dn_loss_iou: 0.1945 d4.dn_loss_cls: 0.0760 d4.dn_loss_bbox: 0.1219 d4.dn_loss_iou: 0.1933 d1.loss_lmm_region: 0.0819 loss_lmm_image: 0.7088 2024/11/13 23:00:50 - mmengine - INFO - Iter(train) [144400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:07:27 time: 1.9854 data_time: 0.0190 memory: 35176 grad_norm: 25.0789 loss: 8.5950 loss_cls: 0.2721 loss_bbox: 0.1439 loss_iou: 0.2440 d0.loss_cls: 0.3214 d0.loss_bbox: 0.1566 d0.loss_iou: 0.2573 d1.loss_cls: 0.2881 d1.loss_bbox: 0.1492 d1.loss_iou: 0.2486 d2.loss_cls: 0.2778 d2.loss_bbox: 0.1473 d2.loss_iou: 0.2494 d3.loss_cls: 0.2754 d3.loss_bbox: 0.1461 d3.loss_iou: 0.2481 d4.loss_cls: 0.2727 d4.loss_bbox: 0.1443 d4.loss_iou: 0.2456 enc_loss_cls: 0.3219 enc_loss_bbox: 0.1666 enc_loss_iou: 0.2751 dn_loss_cls: 0.0859 dn_loss_bbox: 0.1413 dn_loss_iou: 0.1933 d0.dn_loss_cls: 0.1555 
d0.dn_loss_bbox: 0.2726 d0.dn_loss_iou: 0.3243 d1.dn_loss_cls: 0.1090 d1.dn_loss_bbox: 0.1692 d1.dn_loss_iou: 0.2212 d2.dn_loss_cls: 0.0949 d2.dn_loss_bbox: 0.1492 d2.dn_loss_iou: 0.2009 d3.dn_loss_cls: 0.0905 d3.dn_loss_bbox: 0.1418 d3.dn_loss_iou: 0.1941 d4.dn_loss_cls: 0.0890 d4.dn_loss_bbox: 0.1413 d4.dn_loss_iou: 0.1933 d1.loss_lmm_region: 0.0925 loss_lmm_image: 0.6836 2024/11/13 23:04:09 - mmengine - INFO - Iter(train) [144500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:04:06 time: 1.9914 data_time: 0.0191 memory: 33687 grad_norm: 23.8687 loss: 8.1101 loss_cls: 0.2199 loss_bbox: 0.1309 loss_iou: 0.1966 d0.loss_cls: 0.2535 d0.loss_bbox: 0.1439 d0.loss_iou: 0.2089 d1.loss_cls: 0.2361 d1.loss_bbox: 0.1352 d1.loss_iou: 0.2014 d2.loss_cls: 0.2311 d2.loss_bbox: 0.1291 d2.loss_iou: 0.1990 d3.loss_cls: 0.2244 d3.loss_bbox: 0.1297 d3.loss_iou: 0.1973 d4.loss_cls: 0.2200 d4.loss_bbox: 0.1304 d4.loss_iou: 0.1965 enc_loss_cls: 0.2636 enc_loss_bbox: 0.1485 enc_loss_iou: 0.2210 dn_loss_cls: 0.0861 dn_loss_bbox: 0.1861 dn_loss_iou: 0.2053 d0.dn_loss_cls: 0.1727 d0.dn_loss_bbox: 0.3465 d0.dn_loss_iou: 0.3444 d1.dn_loss_cls: 0.1196 d1.dn_loss_bbox: 0.2237 d1.dn_loss_iou: 0.2347 d2.dn_loss_cls: 0.0978 d2.dn_loss_bbox: 0.1967 d2.dn_loss_iou: 0.2148 d3.dn_loss_cls: 0.0892 d3.dn_loss_bbox: 0.1887 d3.dn_loss_iou: 0.2079 d4.dn_loss_cls: 0.0873 d4.dn_loss_bbox: 0.1861 d4.dn_loss_iou: 0.2053 d1.loss_lmm_region: 0.1012 loss_lmm_image: 0.5988 2024/11/13 23:07:30 - mmengine - INFO - Iter(train) [144600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 3:00:45 time: 2.0048 data_time: 0.0188 memory: 35803 grad_norm: 21.3640 loss: 7.9709 loss_cls: 0.2374 loss_bbox: 0.1318 loss_iou: 0.2356 d0.loss_cls: 0.2725 d0.loss_bbox: 0.1422 d0.loss_iou: 0.2474 d1.loss_cls: 0.2569 d1.loss_bbox: 0.1343 d1.loss_iou: 0.2399 d2.loss_cls: 0.2508 d2.loss_bbox: 0.1304 d2.loss_iou: 0.2332 d3.loss_cls: 0.2438 d3.loss_bbox: 0.1295 d3.loss_iou: 0.2338 d4.loss_cls: 0.2401 d4.loss_bbox: 0.1298 d4.loss_iou: 0.2332 enc_loss_cls: 0.2858 enc_loss_bbox: 0.1494 enc_loss_iou: 0.2605 dn_loss_cls: 0.0620 dn_loss_bbox: 0.1411 dn_loss_iou: 0.1877 d0.dn_loss_cls: 0.1400 d0.dn_loss_bbox: 0.2709 d0.dn_loss_iou: 0.3076 d1.dn_loss_cls: 0.0902 d1.dn_loss_bbox: 0.1645 d1.dn_loss_iou: 0.2086 d2.dn_loss_cls: 0.0741 d2.dn_loss_bbox: 0.1456 d2.dn_loss_iou: 0.1927 d3.dn_loss_cls: 0.0660 d3.dn_loss_bbox: 0.1414 d3.dn_loss_iou: 0.1887 d4.dn_loss_cls: 0.0623 d4.dn_loss_bbox: 0.1412 d4.dn_loss_iou: 0.1878 d1.loss_lmm_region: 0.0892 loss_lmm_image: 0.6909 2024/11/13 23:10:49 - mmengine - INFO - Iter(train) [144700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:57:24 time: 2.0035 data_time: 0.0189 memory: 34712 grad_norm: 25.1345 loss: 7.6722 loss_cls: 0.2419 loss_bbox: 0.0982 loss_iou: 0.2001 d0.loss_cls: 0.2748 d0.loss_bbox: 0.1106 d0.loss_iou: 0.2136 d1.loss_cls: 0.2537 d1.loss_bbox: 0.1080 d1.loss_iou: 0.2076 d2.loss_cls: 0.2462 d2.loss_bbox: 0.1016 d2.loss_iou: 0.2030 d3.loss_cls: 0.2437 d3.loss_bbox: 0.0991 d3.loss_iou: 0.2006 d4.loss_cls: 0.2441 d4.loss_bbox: 0.0977 d4.loss_iou: 0.2012 enc_loss_cls: 0.2851 enc_loss_bbox: 0.1198 enc_loss_iou: 0.2268 dn_loss_cls: 0.0926 dn_loss_bbox: 0.1305 dn_loss_iou: 0.1833 d0.dn_loss_cls: 0.1562 d0.dn_loss_bbox: 0.2599 d0.dn_loss_iou: 0.3131 d1.dn_loss_cls: 0.1149 d1.dn_loss_bbox: 0.1612 d1.dn_loss_iou: 0.2095 d2.dn_loss_cls: 0.0997 d2.dn_loss_bbox: 0.1399 d2.dn_loss_iou: 0.1912 d3.dn_loss_cls: 0.0964 d3.dn_loss_bbox: 0.1326 d3.dn_loss_iou: 0.1853 d4.dn_loss_cls: 0.0932 d4.dn_loss_bbox: 0.1305 d4.dn_loss_iou: 0.1833 
d1.loss_lmm_region: 0.0877 loss_lmm_image: 0.7336 2024/11/13 23:14:07 - mmengine - INFO - Iter(train) [144800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:54:03 time: 1.9758 data_time: 0.0189 memory: 34067 grad_norm: 24.9668 loss: 7.8436 loss_cls: 0.2349 loss_bbox: 0.1223 loss_iou: 0.2155 d0.loss_cls: 0.2753 d0.loss_bbox: 0.1304 d0.loss_iou: 0.2276 d1.loss_cls: 0.2536 d1.loss_bbox: 0.1236 d1.loss_iou: 0.2172 d2.loss_cls: 0.2447 d2.loss_bbox: 0.1206 d2.loss_iou: 0.2136 d3.loss_cls: 0.2381 d3.loss_bbox: 0.1213 d3.loss_iou: 0.2153 d4.loss_cls: 0.2362 d4.loss_bbox: 0.1216 d4.loss_iou: 0.2142 enc_loss_cls: 0.2774 enc_loss_bbox: 0.1403 enc_loss_iou: 0.2402 dn_loss_cls: 0.0736 dn_loss_bbox: 0.1452 dn_loss_iou: 0.1804 d0.dn_loss_cls: 0.1512 d0.dn_loss_bbox: 0.2882 d0.dn_loss_iou: 0.3173 d1.dn_loss_cls: 0.1003 d1.dn_loss_bbox: 0.1696 d1.dn_loss_iou: 0.2087 d2.dn_loss_cls: 0.0840 d2.dn_loss_bbox: 0.1506 d2.dn_loss_iou: 0.1883 d3.dn_loss_cls: 0.0782 d3.dn_loss_bbox: 0.1454 d3.dn_loss_iou: 0.1818 d4.dn_loss_cls: 0.0739 d4.dn_loss_bbox: 0.1451 d4.dn_loss_iou: 0.1804 d1.loss_lmm_region: 0.0876 loss_lmm_image: 0.7101 2024/11/13 23:17:25 - mmengine - INFO - Iter(train) [144900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:50:42 time: 1.9967 data_time: 0.0188 memory: 34532 grad_norm: 29.3764 loss: 9.0677 loss_cls: 0.2839 loss_bbox: 0.1462 loss_iou: 0.2629 d0.loss_cls: 0.3101 d0.loss_bbox: 0.1536 d0.loss_iou: 0.2756 d1.loss_cls: 0.3013 d1.loss_bbox: 0.1456 d1.loss_iou: 0.2658 d2.loss_cls: 0.2937 d2.loss_bbox: 0.1433 d2.loss_iou: 0.2610 d3.loss_cls: 0.2878 d3.loss_bbox: 0.1451 d3.loss_iou: 0.2623 d4.loss_cls: 0.2827 d4.loss_bbox: 0.1457 d4.loss_iou: 0.2629 enc_loss_cls: 0.3293 enc_loss_bbox: 0.1626 enc_loss_iou: 0.2882 dn_loss_cls: 0.0979 dn_loss_bbox: 0.1531 dn_loss_iou: 0.2159 d0.dn_loss_cls: 0.1657 d0.dn_loss_bbox: 0.2960 d0.dn_loss_iou: 0.3501 d1.dn_loss_cls: 0.1233 d1.dn_loss_bbox: 0.1777 d1.dn_loss_iou: 0.2415 d2.dn_loss_cls: 0.1081 d2.dn_loss_bbox: 0.1623 d2.dn_loss_iou: 0.2248 d3.dn_loss_cls: 0.1017 d3.dn_loss_bbox: 0.1558 d3.dn_loss_iou: 0.2184 d4.dn_loss_cls: 0.0984 d4.dn_loss_bbox: 0.1532 d4.dn_loss_iou: 0.2160 d1.loss_lmm_region: 0.1004 loss_lmm_image: 0.6978 2024/11/13 23:20:47 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 23:20:47 - mmengine - INFO - Iter(train) [145000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:47:21 time: 2.0041 data_time: 0.0190 memory: 33848 grad_norm: 23.9216 loss: 9.1932 loss_cls: 0.2917 loss_bbox: 0.1535 loss_iou: 0.2768 d0.loss_cls: 0.3300 d0.loss_bbox: 0.1619 d0.loss_iou: 0.2915 d1.loss_cls: 0.3108 d1.loss_bbox: 0.1560 d1.loss_iou: 0.2815 d2.loss_cls: 0.3051 d2.loss_bbox: 0.1539 d2.loss_iou: 0.2793 d3.loss_cls: 0.3038 d3.loss_bbox: 0.1526 d3.loss_iou: 0.2741 d4.loss_cls: 0.2935 d4.loss_bbox: 0.1540 d4.loss_iou: 0.2775 enc_loss_cls: 0.3458 enc_loss_bbox: 0.1681 enc_loss_iou: 0.3001 dn_loss_cls: 0.0727 dn_loss_bbox: 0.1574 dn_loss_iou: 0.2126 d0.dn_loss_cls: 0.1623 d0.dn_loss_bbox: 0.2884 d0.dn_loss_iou: 0.3438 d1.dn_loss_cls: 0.1065 d1.dn_loss_bbox: 0.1860 d1.dn_loss_iou: 0.2405 d2.dn_loss_cls: 0.0824 d2.dn_loss_bbox: 0.1683 d2.dn_loss_iou: 0.2229 d3.dn_loss_cls: 0.0749 d3.dn_loss_bbox: 0.1604 d3.dn_loss_iou: 0.2153 d4.dn_loss_cls: 0.0730 d4.dn_loss_bbox: 0.1575 d4.dn_loss_iou: 0.2126 d1.loss_lmm_region: 0.0905 loss_lmm_image: 0.7038 2024/11/13 23:24:08 - mmengine - INFO - Iter(train) [145100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:44:01 time: 2.0147 data_time: 0.0189 memory: 35533 grad_norm: 40.0520 loss: 
8.0971 loss_cls: 0.2611 loss_bbox: 0.1336 loss_iou: 0.2401 d0.loss_cls: 0.2915 d0.loss_bbox: 0.1425 d0.loss_iou: 0.2506 d1.loss_cls: 0.2691 d1.loss_bbox: 0.1391 d1.loss_iou: 0.2424 d2.loss_cls: 0.2722 d2.loss_bbox: 0.1298 d2.loss_iou: 0.2387 d3.loss_cls: 0.2678 d3.loss_bbox: 0.1298 d3.loss_iou: 0.2370 d4.loss_cls: 0.2629 d4.loss_bbox: 0.1319 d4.loss_iou: 0.2382 enc_loss_cls: 0.3002 enc_loss_bbox: 0.1508 enc_loss_iou: 0.2621 dn_loss_cls: 0.0642 dn_loss_bbox: 0.1235 dn_loss_iou: 0.1885 d0.dn_loss_cls: 0.1531 d0.dn_loss_bbox: 0.2535 d0.dn_loss_iou: 0.3139 d1.dn_loss_cls: 0.0968 d1.dn_loss_bbox: 0.1504 d1.dn_loss_iou: 0.2135 d2.dn_loss_cls: 0.0754 d2.dn_loss_bbox: 0.1323 d2.dn_loss_iou: 0.1972 d3.dn_loss_cls: 0.0680 d3.dn_loss_bbox: 0.1252 d3.dn_loss_iou: 0.1907 d4.dn_loss_cls: 0.0643 d4.dn_loss_bbox: 0.1235 d4.dn_loss_iou: 0.1884 d1.loss_lmm_region: 0.0965 loss_lmm_image: 0.6868 2024/11/13 23:27:28 - mmengine - INFO - Iter(train) [145200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:40:40 time: 1.9993 data_time: 0.0189 memory: 34252 grad_norm: 23.1894 loss: 8.0963 loss_cls: 0.2445 loss_bbox: 0.1323 loss_iou: 0.2128 d0.loss_cls: 0.2752 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2269 d1.loss_cls: 0.2535 d1.loss_bbox: 0.1413 d1.loss_iou: 0.2217 d2.loss_cls: 0.2539 d2.loss_bbox: 0.1310 d2.loss_iou: 0.2127 d3.loss_cls: 0.2464 d3.loss_bbox: 0.1339 d3.loss_iou: 0.2140 d4.loss_cls: 0.2464 d4.loss_bbox: 0.1311 d4.loss_iou: 0.2117 enc_loss_cls: 0.2895 enc_loss_bbox: 0.1508 enc_loss_iou: 0.2392 dn_loss_cls: 0.0655 dn_loss_bbox: 0.1605 dn_loss_iou: 0.1940 d0.dn_loss_cls: 0.1528 d0.dn_loss_bbox: 0.3033 d0.dn_loss_iou: 0.3256 d1.dn_loss_cls: 0.1006 d1.dn_loss_bbox: 0.1895 d1.dn_loss_iou: 0.2217 d2.dn_loss_cls: 0.0785 d2.dn_loss_bbox: 0.1700 d2.dn_loss_iou: 0.2037 d3.dn_loss_cls: 0.0705 d3.dn_loss_bbox: 0.1618 d3.dn_loss_iou: 0.1961 d4.dn_loss_cls: 0.0658 d4.dn_loss_bbox: 0.1605 d4.dn_loss_iou: 0.1941 d1.loss_lmm_region: 0.0783 loss_lmm_image: 0.6888 2024/11/13 23:30:47 - mmengine - INFO - Iter(train) [145300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:37:19 time: 2.0014 data_time: 0.0191 memory: 34634 grad_norm: 23.8490 loss: 8.4666 loss_cls: 0.2585 loss_bbox: 0.1239 loss_iou: 0.2297 d0.loss_cls: 0.3037 d0.loss_bbox: 0.1357 d0.loss_iou: 0.2445 d1.loss_cls: 0.2726 d1.loss_bbox: 0.1299 d1.loss_iou: 0.2406 d2.loss_cls: 0.2629 d2.loss_bbox: 0.1251 d2.loss_iou: 0.2341 d3.loss_cls: 0.2564 d3.loss_bbox: 0.1260 d3.loss_iou: 0.2335 d4.loss_cls: 0.2576 d4.loss_bbox: 0.1235 d4.loss_iou: 0.2302 enc_loss_cls: 0.3086 enc_loss_bbox: 0.1444 enc_loss_iou: 0.2579 dn_loss_cls: 0.0844 dn_loss_bbox: 0.1623 dn_loss_iou: 0.2044 d0.dn_loss_cls: 0.1669 d0.dn_loss_bbox: 0.3122 d0.dn_loss_iou: 0.3379 d1.dn_loss_cls: 0.1154 d1.dn_loss_bbox: 0.1954 d1.dn_loss_iou: 0.2315 d2.dn_loss_cls: 0.0959 d2.dn_loss_bbox: 0.1721 d2.dn_loss_iou: 0.2127 d3.dn_loss_cls: 0.0883 d3.dn_loss_bbox: 0.1640 d3.dn_loss_iou: 0.2057 d4.dn_loss_cls: 0.0849 d4.dn_loss_bbox: 0.1624 d4.dn_loss_iou: 0.2043 d1.loss_lmm_region: 0.0923 loss_lmm_image: 0.6744 2024/11/13 23:34:08 - mmengine - INFO - Iter(train) [145400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:33:58 time: 2.0013 data_time: 0.0190 memory: 33537 grad_norm: 26.2713 loss: 8.2103 loss_cls: 0.2497 loss_bbox: 0.1179 loss_iou: 0.2249 d0.loss_cls: 0.2829 d0.loss_bbox: 0.1300 d0.loss_iou: 0.2414 d1.loss_cls: 0.2635 d1.loss_bbox: 0.1256 d1.loss_iou: 0.2350 d2.loss_cls: 0.2598 d2.loss_bbox: 0.1171 d2.loss_iou: 0.2252 d3.loss_cls: 0.2517 d3.loss_bbox: 0.1194 d3.loss_iou: 0.2267 d4.loss_cls: 0.2527 
d4.loss_bbox: 0.1170 d4.loss_iou: 0.2249 enc_loss_cls: 0.2915 enc_loss_bbox: 0.1369 enc_loss_iou: 0.2516 dn_loss_cls: 0.0938 dn_loss_bbox: 0.1462 dn_loss_iou: 0.1988 d0.dn_loss_cls: 0.1643 d0.dn_loss_bbox: 0.2754 d0.dn_loss_iou: 0.3184 d1.dn_loss_cls: 0.1198 d1.dn_loss_bbox: 0.1756 d1.dn_loss_iou: 0.2228 d2.dn_loss_cls: 0.1044 d2.dn_loss_bbox: 0.1550 d2.dn_loss_iou: 0.2065 d3.dn_loss_cls: 0.1000 d3.dn_loss_bbox: 0.1474 d3.dn_loss_iou: 0.2000 d4.dn_loss_cls: 0.0967 d4.dn_loss_bbox: 0.1463 d4.dn_loss_iou: 0.1989 d1.loss_lmm_region: 0.0919 loss_lmm_image: 0.7025 2024/11/13 23:37:29 - mmengine - INFO - Iter(train) [145500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:30:37 time: 2.0321 data_time: 0.0190 memory: 34298 grad_norm: 28.3416 loss: 8.3501 loss_cls: 0.2938 loss_bbox: 0.1234 loss_iou: 0.2155 d0.loss_cls: 0.3258 d0.loss_bbox: 0.1389 d0.loss_iou: 0.2343 d1.loss_cls: 0.3038 d1.loss_bbox: 0.1298 d1.loss_iou: 0.2231 d2.loss_cls: 0.2996 d2.loss_bbox: 0.1246 d2.loss_iou: 0.2171 d3.loss_cls: 0.2951 d3.loss_bbox: 0.1247 d3.loss_iou: 0.2176 d4.loss_cls: 0.2951 d4.loss_bbox: 0.1231 d4.loss_iou: 0.2160 enc_loss_cls: 0.3271 enc_loss_bbox: 0.1499 enc_loss_iou: 0.2466 dn_loss_cls: 0.1023 dn_loss_bbox: 0.1329 dn_loss_iou: 0.1720 d0.dn_loss_cls: 0.1961 d0.dn_loss_bbox: 0.2711 d0.dn_loss_iou: 0.2964 d1.dn_loss_cls: 0.1319 d1.dn_loss_bbox: 0.1647 d1.dn_loss_iou: 0.1971 d2.dn_loss_cls: 0.1084 d2.dn_loss_bbox: 0.1444 d2.dn_loss_iou: 0.1804 d3.dn_loss_cls: 0.1033 d3.dn_loss_bbox: 0.1353 d3.dn_loss_iou: 0.1744 d4.dn_loss_cls: 0.1023 d4.dn_loss_bbox: 0.1329 d4.dn_loss_iou: 0.1720 d1.loss_lmm_region: 0.1163 loss_lmm_image: 0.6908 2024/11/13 23:40:49 - mmengine - INFO - Iter(train) [145600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:27:16 time: 1.9982 data_time: 0.0189 memory: 35385 grad_norm: 40.4108 loss: 7.6838 loss_cls: 0.2130 loss_bbox: 0.1226 loss_iou: 0.2075 d0.loss_cls: 0.2463 d0.loss_bbox: 0.1321 d0.loss_iou: 0.2213 d1.loss_cls: 0.2325 d1.loss_bbox: 0.1219 d1.loss_iou: 0.2102 d2.loss_cls: 0.2234 d2.loss_bbox: 0.1249 d2.loss_iou: 0.2088 d3.loss_cls: 0.2187 d3.loss_bbox: 0.1240 d3.loss_iou: 0.2082 d4.loss_cls: 0.2124 d4.loss_bbox: 0.1250 d4.loss_iou: 0.2084 enc_loss_cls: 0.2605 enc_loss_bbox: 0.1422 enc_loss_iou: 0.2357 dn_loss_cls: 0.0676 dn_loss_bbox: 0.1524 dn_loss_iou: 0.1901 d0.dn_loss_cls: 0.1425 d0.dn_loss_bbox: 0.3021 d0.dn_loss_iou: 0.3277 d1.dn_loss_cls: 0.0892 d1.dn_loss_bbox: 0.1849 d1.dn_loss_iou: 0.2193 d2.dn_loss_cls: 0.0737 d2.dn_loss_bbox: 0.1630 d2.dn_loss_iou: 0.1996 d3.dn_loss_cls: 0.0703 d3.dn_loss_bbox: 0.1546 d3.dn_loss_iou: 0.1923 d4.dn_loss_cls: 0.0677 d4.dn_loss_bbox: 0.1525 d4.dn_loss_iou: 0.1902 d1.loss_lmm_region: 0.0877 loss_lmm_image: 0.6570 2024/11/13 23:44:08 - mmengine - INFO - Iter(train) [145700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:23:55 time: 2.0140 data_time: 0.0191 memory: 34057 grad_norm: 38.3468 loss: 8.4969 loss_cls: 0.2684 loss_bbox: 0.1348 loss_iou: 0.2501 d0.loss_cls: 0.3158 d0.loss_bbox: 0.1438 d0.loss_iou: 0.2629 d1.loss_cls: 0.2880 d1.loss_bbox: 0.1405 d1.loss_iou: 0.2585 d2.loss_cls: 0.2772 d2.loss_bbox: 0.1402 d2.loss_iou: 0.2544 d3.loss_cls: 0.2693 d3.loss_bbox: 0.1383 d3.loss_iou: 0.2531 d4.loss_cls: 0.2690 d4.loss_bbox: 0.1350 d4.loss_iou: 0.2505 enc_loss_cls: 0.3163 enc_loss_bbox: 0.1506 enc_loss_iou: 0.2753 dn_loss_cls: 0.0685 dn_loss_bbox: 0.1467 dn_loss_iou: 0.1919 d0.dn_loss_cls: 0.1436 d0.dn_loss_bbox: 0.2674 d0.dn_loss_iou: 0.3128 d1.dn_loss_cls: 0.0970 d1.dn_loss_bbox: 0.1767 d1.dn_loss_iou: 0.2171 d2.dn_loss_cls: 0.0804 
d2.dn_loss_bbox: 0.1560 d2.dn_loss_iou: 0.2000 d3.dn_loss_cls: 0.0725 d3.dn_loss_bbox: 0.1492 d3.dn_loss_iou: 0.1943 d4.dn_loss_cls: 0.0697 d4.dn_loss_bbox: 0.1468 d4.dn_loss_iou: 0.1920 d1.loss_lmm_region: 0.0977 loss_lmm_image: 0.7247 2024/11/13 23:47:29 - mmengine - INFO - Iter(train) [145800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:20:35 time: 2.0227 data_time: 0.0193 memory: 33952 grad_norm: 47.9170 loss: 8.6798 loss_cls: 0.2629 loss_bbox: 0.1435 loss_iou: 0.2489 d0.loss_cls: 0.2985 d0.loss_bbox: 0.1542 d0.loss_iou: 0.2635 d1.loss_cls: 0.2835 d1.loss_bbox: 0.1463 d1.loss_iou: 0.2514 d2.loss_cls: 0.2775 d2.loss_bbox: 0.1421 d2.loss_iou: 0.2484 d3.loss_cls: 0.2710 d3.loss_bbox: 0.1417 d3.loss_iou: 0.2472 d4.loss_cls: 0.2649 d4.loss_bbox: 0.1440 d4.loss_iou: 0.2492 enc_loss_cls: 0.3049 enc_loss_bbox: 0.1569 enc_loss_iou: 0.2740 dn_loss_cls: 0.0825 dn_loss_bbox: 0.1501 dn_loss_iou: 0.2075 d0.dn_loss_cls: 0.1680 d0.dn_loss_bbox: 0.2825 d0.dn_loss_iou: 0.3384 d1.dn_loss_cls: 0.1121 d1.dn_loss_bbox: 0.1789 d1.dn_loss_iou: 0.2342 d2.dn_loss_cls: 0.0947 d2.dn_loss_bbox: 0.1564 d2.dn_loss_iou: 0.2142 d3.dn_loss_cls: 0.0895 d3.dn_loss_bbox: 0.1495 d3.dn_loss_iou: 0.2084 d4.dn_loss_cls: 0.0828 d4.dn_loss_bbox: 0.1500 d4.dn_loss_iou: 0.2075 d1.loss_lmm_region: 0.1197 loss_lmm_image: 0.6787 2024/11/13 23:50:51 - mmengine - INFO - Iter(train) [145900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:17:14 time: 2.0348 data_time: 0.0189 memory: 34872 grad_norm: 34.2533 loss: 8.7108 loss_cls: 0.2758 loss_bbox: 0.1339 loss_iou: 0.2270 d0.loss_cls: 0.3213 d0.loss_bbox: 0.1446 d0.loss_iou: 0.2361 d1.loss_cls: 0.2959 d1.loss_bbox: 0.1337 d1.loss_iou: 0.2272 d2.loss_cls: 0.2887 d2.loss_bbox: 0.1281 d2.loss_iou: 0.2241 d3.loss_cls: 0.2831 d3.loss_bbox: 0.1277 d3.loss_iou: 0.2238 d4.loss_cls: 0.2799 d4.loss_bbox: 0.1312 d4.loss_iou: 0.2253 enc_loss_cls: 0.3290 enc_loss_bbox: 0.1538 enc_loss_iou: 0.2489 dn_loss_cls: 0.0808 dn_loss_bbox: 0.1828 dn_loss_iou: 0.2057 d0.dn_loss_cls: 0.1544 d0.dn_loss_bbox: 0.3339 d0.dn_loss_iou: 0.3405 d1.dn_loss_cls: 0.1080 d1.dn_loss_bbox: 0.2169 d1.dn_loss_iou: 0.2340 d2.dn_loss_cls: 0.0905 d2.dn_loss_bbox: 0.1928 d2.dn_loss_iou: 0.2139 d3.dn_loss_cls: 0.0820 d3.dn_loss_bbox: 0.1858 d3.dn_loss_iou: 0.2086 d4.dn_loss_cls: 0.0808 d4.dn_loss_bbox: 0.1828 d4.dn_loss_iou: 0.2058 d1.loss_lmm_region: 0.0814 loss_lmm_image: 0.6907 2024/11/13 23:54:14 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/13 23:54:14 - mmengine - INFO - Iter(train) [146000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:13:53 time: 2.0137 data_time: 0.0188 memory: 35182 grad_norm: 22.6291 loss: 8.3991 loss_cls: 0.2676 loss_bbox: 0.1385 loss_iou: 0.2349 d0.loss_cls: 0.3071 d0.loss_bbox: 0.1437 d0.loss_iou: 0.2472 d1.loss_cls: 0.2849 d1.loss_bbox: 0.1391 d1.loss_iou: 0.2403 d2.loss_cls: 0.2761 d2.loss_bbox: 0.1379 d2.loss_iou: 0.2366 d3.loss_cls: 0.2711 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2339 d4.loss_cls: 0.2677 d4.loss_bbox: 0.1389 d4.loss_iou: 0.2347 enc_loss_cls: 0.3139 enc_loss_bbox: 0.1571 enc_loss_iou: 0.2637 dn_loss_cls: 0.0778 dn_loss_bbox: 0.1464 dn_loss_iou: 0.1913 d0.dn_loss_cls: 0.1639 d0.dn_loss_bbox: 0.2708 d0.dn_loss_iou: 0.3214 d1.dn_loss_cls: 0.1083 d1.dn_loss_bbox: 0.1712 d1.dn_loss_iou: 0.2178 d2.dn_loss_cls: 0.0878 d2.dn_loss_bbox: 0.1546 d2.dn_loss_iou: 0.2001 d3.dn_loss_cls: 0.0832 d3.dn_loss_bbox: 0.1478 d3.dn_loss_iou: 0.1934 d4.dn_loss_cls: 0.0787 d4.dn_loss_bbox: 0.1464 d4.dn_loss_iou: 0.1914 d1.loss_lmm_region: 0.0886 loss_lmm_image: 0.6883 
2024/11/13 23:57:32 - mmengine - INFO - Iter(train) [146100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:10:32 time: 2.0153 data_time: 0.0190 memory: 33755 grad_norm: 22.1549 loss: 8.2779 loss_cls: 0.2594 loss_bbox: 0.1327 loss_iou: 0.2089 d0.loss_cls: 0.2902 d0.loss_bbox: 0.1431 d0.loss_iou: 0.2217 d1.loss_cls: 0.2771 d1.loss_bbox: 0.1318 d1.loss_iou: 0.2122 d2.loss_cls: 0.2708 d2.loss_bbox: 0.1290 d2.loss_iou: 0.2060 d3.loss_cls: 0.2658 d3.loss_bbox: 0.1284 d3.loss_iou: 0.2063 d4.loss_cls: 0.2605 d4.loss_bbox: 0.1333 d4.loss_iou: 0.2084 enc_loss_cls: 0.3071 enc_loss_bbox: 0.1509 enc_loss_iou: 0.2364 dn_loss_cls: 0.0784 dn_loss_bbox: 0.1618 dn_loss_iou: 0.1957 d0.dn_loss_cls: 0.1654 d0.dn_loss_bbox: 0.3040 d0.dn_loss_iou: 0.3267 d1.dn_loss_cls: 0.1091 d1.dn_loss_bbox: 0.1952 d1.dn_loss_iou: 0.2241 d2.dn_loss_cls: 0.0913 d2.dn_loss_bbox: 0.1714 d2.dn_loss_iou: 0.2053 d3.dn_loss_cls: 0.0832 d3.dn_loss_bbox: 0.1642 d3.dn_loss_iou: 0.1977 d4.dn_loss_cls: 0.0800 d4.dn_loss_bbox: 0.1618 d4.dn_loss_iou: 0.1958 d1.loss_lmm_region: 0.1023 loss_lmm_image: 0.6845 2024/11/14 00:00:54 - mmengine - INFO - Iter(train) [146200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:07:11 time: 1.9930 data_time: 0.0189 memory: 34590 grad_norm: 27.9133 loss: 8.2615 loss_cls: 0.2400 loss_bbox: 0.1375 loss_iou: 0.2691 d0.loss_cls: 0.2847 d0.loss_bbox: 0.1484 d0.loss_iou: 0.2827 d1.loss_cls: 0.2586 d1.loss_bbox: 0.1452 d1.loss_iou: 0.2720 d2.loss_cls: 0.2523 d2.loss_bbox: 0.1424 d2.loss_iou: 0.2706 d3.loss_cls: 0.2457 d3.loss_bbox: 0.1381 d3.loss_iou: 0.2693 d4.loss_cls: 0.2424 d4.loss_bbox: 0.1379 d4.loss_iou: 0.2693 enc_loss_cls: 0.2910 enc_loss_bbox: 0.1581 enc_loss_iou: 0.2964 dn_loss_cls: 0.0591 dn_loss_bbox: 0.1319 dn_loss_iou: 0.1938 d0.dn_loss_cls: 0.1351 d0.dn_loss_bbox: 0.2656 d0.dn_loss_iou: 0.3257 d1.dn_loss_cls: 0.0860 d1.dn_loss_bbox: 0.1586 d1.dn_loss_iou: 0.2200 d2.dn_loss_cls: 0.0707 d2.dn_loss_bbox: 0.1407 d2.dn_loss_iou: 0.2020 d3.dn_loss_cls: 0.0639 d3.dn_loss_bbox: 0.1337 d3.dn_loss_iou: 0.1956 d4.dn_loss_cls: 0.0603 d4.dn_loss_bbox: 0.1319 d4.dn_loss_iou: 0.1938 d1.loss_lmm_region: 0.0771 loss_lmm_image: 0.6642 2024/11/14 00:04:15 - mmengine - INFO - Iter(train) [146300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:03:50 time: 2.0124 data_time: 0.0190 memory: 33027 grad_norm: 29.9186 loss: 7.6143 loss_cls: 0.2458 loss_bbox: 0.1058 loss_iou: 0.1949 d0.loss_cls: 0.2819 d0.loss_bbox: 0.1141 d0.loss_iou: 0.2071 d1.loss_cls: 0.2658 d1.loss_bbox: 0.1061 d1.loss_iou: 0.1974 d2.loss_cls: 0.2572 d2.loss_bbox: 0.1043 d2.loss_iou: 0.1960 d3.loss_cls: 0.2558 d3.loss_bbox: 0.1042 d3.loss_iou: 0.1942 d4.loss_cls: 0.2492 d4.loss_bbox: 0.1061 d4.loss_iou: 0.1952 enc_loss_cls: 0.2811 enc_loss_bbox: 0.1258 enc_loss_iou: 0.2206 dn_loss_cls: 0.0858 dn_loss_bbox: 0.1350 dn_loss_iou: 0.1848 d0.dn_loss_cls: 0.1693 d0.dn_loss_bbox: 0.2444 d0.dn_loss_iou: 0.2960 d1.dn_loss_cls: 0.1187 d1.dn_loss_bbox: 0.1612 d1.dn_loss_iou: 0.2090 d2.dn_loss_cls: 0.0978 d2.dn_loss_bbox: 0.1450 d2.dn_loss_iou: 0.1930 d3.dn_loss_cls: 0.0917 d3.dn_loss_bbox: 0.1363 d3.dn_loss_iou: 0.1866 d4.dn_loss_cls: 0.0875 d4.dn_loss_bbox: 0.1350 d4.dn_loss_iou: 0.1848 d1.loss_lmm_region: 0.1017 loss_lmm_image: 0.6420 2024/11/14 00:07:34 - mmengine - INFO - Iter(train) [146400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 2:00:30 time: 2.0021 data_time: 0.0190 memory: 34970 grad_norm: 37.0859 loss: 8.9951 loss_cls: 0.3031 loss_bbox: 0.1367 loss_iou: 0.2487 d0.loss_cls: 0.3384 d0.loss_bbox: 0.1457 d0.loss_iou: 0.2570 d1.loss_cls: 0.3162 
d1.loss_bbox: 0.1403 d1.loss_iou: 0.2514 d2.loss_cls: 0.3113 d2.loss_bbox: 0.1377 d2.loss_iou: 0.2465 d3.loss_cls: 0.3117 d3.loss_bbox: 0.1352 d3.loss_iou: 0.2448 d4.loss_cls: 0.3096 d4.loss_bbox: 0.1339 d4.loss_iou: 0.2438 enc_loss_cls: 0.3441 enc_loss_bbox: 0.1516 enc_loss_iou: 0.2694 dn_loss_cls: 0.1172 dn_loss_bbox: 0.1415 dn_loss_iou: 0.2001 d0.dn_loss_cls: 0.1958 d0.dn_loss_bbox: 0.2906 d0.dn_loss_iou: 0.3332 d1.dn_loss_cls: 0.1434 d1.dn_loss_bbox: 0.1765 d1.dn_loss_iou: 0.2271 d2.dn_loss_cls: 0.1247 d2.dn_loss_bbox: 0.1495 d2.dn_loss_iou: 0.2078 d3.dn_loss_cls: 0.1273 d3.dn_loss_bbox: 0.1424 d3.dn_loss_iou: 0.2014 d4.dn_loss_cls: 0.1177 d4.dn_loss_bbox: 0.1416 d4.dn_loss_iou: 0.2001 d1.loss_lmm_region: 0.0897 loss_lmm_image: 0.6906 2024/11/14 00:10:54 - mmengine - INFO - Iter(train) [146500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:57:09 time: 1.9992 data_time: 0.0189 memory: 33451 grad_norm: 23.1108 loss: 8.0564 loss_cls: 0.2642 loss_bbox: 0.1261 loss_iou: 0.2109 d0.loss_cls: 0.3018 d0.loss_bbox: 0.1473 d0.loss_iou: 0.2283 d1.loss_cls: 0.2819 d1.loss_bbox: 0.1351 d1.loss_iou: 0.2176 d2.loss_cls: 0.2733 d2.loss_bbox: 0.1313 d2.loss_iou: 0.2132 d3.loss_cls: 0.2671 d3.loss_bbox: 0.1278 d3.loss_iou: 0.2129 d4.loss_cls: 0.2648 d4.loss_bbox: 0.1271 d4.loss_iou: 0.2108 enc_loss_cls: 0.3104 enc_loss_bbox: 0.1585 enc_loss_iou: 0.2424 dn_loss_cls: 0.0799 dn_loss_bbox: 0.1282 dn_loss_iou: 0.1722 d0.dn_loss_cls: 0.1621 d0.dn_loss_bbox: 0.2696 d0.dn_loss_iou: 0.2985 d1.dn_loss_cls: 0.1173 d1.dn_loss_bbox: 0.1602 d1.dn_loss_iou: 0.1982 d2.dn_loss_cls: 0.0928 d2.dn_loss_bbox: 0.1385 d2.dn_loss_iou: 0.1815 d3.dn_loss_cls: 0.0853 d3.dn_loss_bbox: 0.1311 d3.dn_loss_iou: 0.1745 d4.dn_loss_cls: 0.0812 d4.dn_loss_bbox: 0.1283 d4.dn_loss_iou: 0.1722 d1.loss_lmm_region: 0.0989 loss_lmm_image: 0.7330 2024/11/14 00:14:15 - mmengine - INFO - Iter(train) [146600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:53:48 time: 2.0207 data_time: 0.0189 memory: 33155 grad_norm: 24.0678 loss: 8.0607 loss_cls: 0.2400 loss_bbox: 0.1313 loss_iou: 0.2101 d0.loss_cls: 0.2681 d0.loss_bbox: 0.1447 d0.loss_iou: 0.2261 d1.loss_cls: 0.2565 d1.loss_bbox: 0.1366 d1.loss_iou: 0.2155 d2.loss_cls: 0.2488 d2.loss_bbox: 0.1323 d2.loss_iou: 0.2140 d3.loss_cls: 0.2420 d3.loss_bbox: 0.1323 d3.loss_iou: 0.2109 d4.loss_cls: 0.2403 d4.loss_bbox: 0.1315 d4.loss_iou: 0.2095 enc_loss_cls: 0.2780 enc_loss_bbox: 0.1538 enc_loss_iou: 0.2409 dn_loss_cls: 0.0762 dn_loss_bbox: 0.1578 dn_loss_iou: 0.1898 d0.dn_loss_cls: 0.1454 d0.dn_loss_bbox: 0.2995 d0.dn_loss_iou: 0.3177 d1.dn_loss_cls: 0.1040 d1.dn_loss_bbox: 0.1896 d1.dn_loss_iou: 0.2182 d2.dn_loss_cls: 0.0850 d2.dn_loss_bbox: 0.1651 d2.dn_loss_iou: 0.1974 d3.dn_loss_cls: 0.0805 d3.dn_loss_bbox: 0.1590 d3.dn_loss_iou: 0.1915 d4.dn_loss_cls: 0.0776 d4.dn_loss_bbox: 0.1577 d4.dn_loss_iou: 0.1897 d1.loss_lmm_region: 0.0832 loss_lmm_image: 0.7122 2024/11/14 00:17:36 - mmengine - INFO - Iter(train) [146700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:50:27 time: 1.9764 data_time: 0.0189 memory: 34952 grad_norm: 27.0683 loss: 7.4831 loss_cls: 0.2466 loss_bbox: 0.1068 loss_iou: 0.2075 d0.loss_cls: 0.2666 d0.loss_bbox: 0.1224 d0.loss_iou: 0.2207 d1.loss_cls: 0.2547 d1.loss_bbox: 0.1196 d1.loss_iou: 0.2138 d2.loss_cls: 0.2525 d2.loss_bbox: 0.1107 d2.loss_iou: 0.2076 d3.loss_cls: 0.2507 d3.loss_bbox: 0.1088 d3.loss_iou: 0.2081 d4.loss_cls: 0.2500 d4.loss_bbox: 0.1066 d4.loss_iou: 0.2058 enc_loss_cls: 0.2798 enc_loss_bbox: 0.1304 enc_loss_iou: 0.2362 dn_loss_cls: 0.0862 dn_loss_bbox: 0.1186 
dn_loss_iou: 0.1607 d0.dn_loss_cls: 0.1535 d0.dn_loss_bbox: 0.2238 d0.dn_loss_iou: 0.2716 d1.dn_loss_cls: 0.1145 d1.dn_loss_bbox: 0.1423 d1.dn_loss_iou: 0.1849 d2.dn_loss_cls: 0.0985 d2.dn_loss_bbox: 0.1245 d2.dn_loss_iou: 0.1678 d3.dn_loss_cls: 0.0921 d3.dn_loss_bbox: 0.1201 d3.dn_loss_iou: 0.1624 d4.dn_loss_cls: 0.0884 d4.dn_loss_bbox: 0.1188 d4.dn_loss_iou: 0.1607 d1.loss_lmm_region: 0.0931 loss_lmm_image: 0.6949 2024/11/14 00:20:57 - mmengine - INFO - Iter(train) [146800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:47:06 time: 2.0077 data_time: 0.0190 memory: 33705 grad_norm: 21.7815 loss: 7.9012 loss_cls: 0.2548 loss_bbox: 0.1236 loss_iou: 0.2329 d0.loss_cls: 0.3020 d0.loss_bbox: 0.1301 d0.loss_iou: 0.2450 d1.loss_cls: 0.2730 d1.loss_bbox: 0.1245 d1.loss_iou: 0.2373 d2.loss_cls: 0.2644 d2.loss_bbox: 0.1238 d2.loss_iou: 0.2327 d3.loss_cls: 0.2572 d3.loss_bbox: 0.1241 d3.loss_iou: 0.2340 d4.loss_cls: 0.2564 d4.loss_bbox: 0.1231 d4.loss_iou: 0.2328 enc_loss_cls: 0.3095 enc_loss_bbox: 0.1373 enc_loss_iou: 0.2592 dn_loss_cls: 0.0685 dn_loss_bbox: 0.1243 dn_loss_iou: 0.1702 d0.dn_loss_cls: 0.1387 d0.dn_loss_bbox: 0.2537 d0.dn_loss_iou: 0.2948 d1.dn_loss_cls: 0.0942 d1.dn_loss_bbox: 0.1546 d1.dn_loss_iou: 0.1975 d2.dn_loss_cls: 0.0758 d2.dn_loss_bbox: 0.1328 d2.dn_loss_iou: 0.1781 d3.dn_loss_cls: 0.0718 d3.dn_loss_bbox: 0.1250 d3.dn_loss_iou: 0.1723 d4.dn_loss_cls: 0.0684 d4.dn_loss_bbox: 0.1243 d4.dn_loss_iou: 0.1703 d1.loss_lmm_region: 0.0981 loss_lmm_image: 0.7098 2024/11/14 00:24:18 - mmengine - INFO - Iter(train) [146900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:43:45 time: 2.0315 data_time: 0.0190 memory: 33846 grad_norm: 26.6078 loss: 7.9467 loss_cls: 0.2463 loss_bbox: 0.1155 loss_iou: 0.2429 d0.loss_cls: 0.2861 d0.loss_bbox: 0.1323 d0.loss_iou: 0.2535 d1.loss_cls: 0.2636 d1.loss_bbox: 0.1224 d1.loss_iou: 0.2468 d2.loss_cls: 0.2580 d2.loss_bbox: 0.1148 d2.loss_iou: 0.2428 d3.loss_cls: 0.2490 d3.loss_bbox: 0.1182 d3.loss_iou: 0.2457 d4.loss_cls: 0.2493 d4.loss_bbox: 0.1153 d4.loss_iou: 0.2424 enc_loss_cls: 0.2913 enc_loss_bbox: 0.1409 enc_loss_iou: 0.2652 dn_loss_cls: 0.0662 dn_loss_bbox: 0.1377 dn_loss_iou: 0.1838 d0.dn_loss_cls: 0.1339 d0.dn_loss_bbox: 0.2634 d0.dn_loss_iou: 0.3002 d1.dn_loss_cls: 0.0945 d1.dn_loss_bbox: 0.1642 d1.dn_loss_iou: 0.2082 d2.dn_loss_cls: 0.0787 d2.dn_loss_bbox: 0.1477 d2.dn_loss_iou: 0.1918 d3.dn_loss_cls: 0.0715 d3.dn_loss_bbox: 0.1393 d3.dn_loss_iou: 0.1859 d4.dn_loss_cls: 0.0667 d4.dn_loss_bbox: 0.1377 d4.dn_loss_iou: 0.1838 d1.loss_lmm_region: 0.0710 loss_lmm_image: 0.6779 2024/11/14 00:27:36 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/14 00:27:36 - mmengine - INFO - Iter(train) [147000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:40:24 time: 1.9744 data_time: 0.0190 memory: 35940 grad_norm: 23.5736 loss: 7.4809 loss_cls: 0.2418 loss_bbox: 0.1071 loss_iou: 0.2047 d0.loss_cls: 0.2815 d0.loss_bbox: 0.1116 d0.loss_iou: 0.2173 d1.loss_cls: 0.2661 d1.loss_bbox: 0.1065 d1.loss_iou: 0.2056 d2.loss_cls: 0.2547 d2.loss_bbox: 0.1065 d2.loss_iou: 0.2042 d3.loss_cls: 0.2469 d3.loss_bbox: 0.1071 d3.loss_iou: 0.2064 d4.loss_cls: 0.2454 d4.loss_bbox: 0.1075 d4.loss_iou: 0.2056 enc_loss_cls: 0.2957 enc_loss_bbox: 0.1189 enc_loss_iou: 0.2231 dn_loss_cls: 0.0664 dn_loss_bbox: 0.1396 dn_loss_iou: 0.1691 d0.dn_loss_cls: 0.1298 d0.dn_loss_bbox: 0.2558 d0.dn_loss_iou: 0.2819 d1.dn_loss_cls: 0.0853 d1.dn_loss_bbox: 0.1641 d1.dn_loss_iou: 0.1916 d2.dn_loss_cls: 0.0719 d2.dn_loss_bbox: 0.1443 d2.dn_loss_iou: 0.1754 
d3.dn_loss_cls: 0.0676 d3.dn_loss_bbox: 0.1406 d3.dn_loss_iou: 0.1707 d4.dn_loss_cls: 0.0658 d4.dn_loss_bbox: 0.1396 d4.dn_loss_iou: 0.1692 d1.loss_lmm_region: 0.0881 loss_lmm_image: 0.6999 2024/11/14 00:30:57 - mmengine - INFO - Iter(train) [147100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:37:04 time: 2.0278 data_time: 0.0190 memory: 33440 grad_norm: 24.7552 loss: 7.1268 loss_cls: 0.2129 loss_bbox: 0.1016 loss_iou: 0.1882 d0.loss_cls: 0.2550 d0.loss_bbox: 0.1087 d0.loss_iou: 0.1999 d1.loss_cls: 0.2324 d1.loss_bbox: 0.1035 d1.loss_iou: 0.1919 d2.loss_cls: 0.2247 d2.loss_bbox: 0.0996 d2.loss_iou: 0.1870 d3.loss_cls: 0.2203 d3.loss_bbox: 0.1007 d3.loss_iou: 0.1870 d4.loss_cls: 0.2169 d4.loss_bbox: 0.0995 d4.loss_iou: 0.1861 enc_loss_cls: 0.2630 enc_loss_bbox: 0.1172 enc_loss_iou: 0.2077 dn_loss_cls: 0.0851 dn_loss_bbox: 0.1229 dn_loss_iou: 0.1633 d0.dn_loss_cls: 0.1600 d0.dn_loss_bbox: 0.2394 d0.dn_loss_iou: 0.2750 d1.dn_loss_cls: 0.1133 d1.dn_loss_bbox: 0.1500 d1.dn_loss_iou: 0.1868 d2.dn_loss_cls: 0.0965 d2.dn_loss_bbox: 0.1303 d2.dn_loss_iou: 0.1701 d3.dn_loss_cls: 0.0897 d3.dn_loss_bbox: 0.1246 d3.dn_loss_iou: 0.1651 d4.dn_loss_cls: 0.0848 d4.dn_loss_bbox: 0.1230 d4.dn_loss_iou: 0.1634 d1.loss_lmm_region: 0.0929 loss_lmm_image: 0.6868 2024/11/14 00:34:21 - mmengine - INFO - Iter(train) [147200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:33:43 time: 2.0307 data_time: 0.0189 memory: 34732 grad_norm: 28.4319 loss: 8.1972 loss_cls: 0.2428 loss_bbox: 0.1231 loss_iou: 0.2131 d0.loss_cls: 0.2712 d0.loss_bbox: 0.1367 d0.loss_iou: 0.2243 d1.loss_cls: 0.2565 d1.loss_bbox: 0.1281 d1.loss_iou: 0.2143 d2.loss_cls: 0.2516 d2.loss_bbox: 0.1232 d2.loss_iou: 0.2108 d3.loss_cls: 0.2433 d3.loss_bbox: 0.1252 d3.loss_iou: 0.2119 d4.loss_cls: 0.2415 d4.loss_bbox: 0.1252 d4.loss_iou: 0.2148 enc_loss_cls: 0.2805 enc_loss_bbox: 0.1444 enc_loss_iou: 0.2337 dn_loss_cls: 0.0961 dn_loss_bbox: 0.1541 dn_loss_iou: 0.2053 d0.dn_loss_cls: 0.1637 d0.dn_loss_bbox: 0.2954 d0.dn_loss_iou: 0.3385 d1.dn_loss_cls: 0.1186 d1.dn_loss_bbox: 0.1865 d1.dn_loss_iou: 0.2336 d2.dn_loss_cls: 0.1017 d2.dn_loss_bbox: 0.1631 d2.dn_loss_iou: 0.2138 d3.dn_loss_cls: 0.0968 d3.dn_loss_bbox: 0.1548 d3.dn_loss_iou: 0.2070 d4.dn_loss_cls: 0.0954 d4.dn_loss_bbox: 0.1541 d4.dn_loss_iou: 0.2053 d1.loss_lmm_region: 0.1002 loss_lmm_image: 0.6969 2024/11/14 00:37:39 - mmengine - INFO - Iter(train) [147300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:30:22 time: 1.9843 data_time: 0.0190 memory: 33419 grad_norm: 30.9288 loss: 9.4330 loss_cls: 0.2938 loss_bbox: 0.1491 loss_iou: 0.2622 d0.loss_cls: 0.3344 d0.loss_bbox: 0.1727 d0.loss_iou: 0.2793 d1.loss_cls: 0.3126 d1.loss_bbox: 0.1551 d1.loss_iou: 0.2662 d2.loss_cls: 0.3086 d2.loss_bbox: 0.1534 d2.loss_iou: 0.2648 d3.loss_cls: 0.2992 d3.loss_bbox: 0.1524 d3.loss_iou: 0.2652 d4.loss_cls: 0.2969 d4.loss_bbox: 0.1482 d4.loss_iou: 0.2622 enc_loss_cls: 0.3505 enc_loss_bbox: 0.1821 enc_loss_iou: 0.2972 dn_loss_cls: 0.0725 dn_loss_bbox: 0.1853 dn_loss_iou: 0.2261 d0.dn_loss_cls: 0.1555 d0.dn_loss_bbox: 0.3324 d0.dn_loss_iou: 0.3631 d1.dn_loss_cls: 0.1077 d1.dn_loss_bbox: 0.2212 d1.dn_loss_iou: 0.2577 d2.dn_loss_cls: 0.0874 d2.dn_loss_bbox: 0.1987 d2.dn_loss_iou: 0.2369 d3.dn_loss_cls: 0.0805 d3.dn_loss_bbox: 0.1892 d3.dn_loss_iou: 0.2292 d4.dn_loss_cls: 0.0737 d4.dn_loss_bbox: 0.1852 d4.dn_loss_iou: 0.2261 d1.loss_lmm_region: 0.0945 loss_lmm_image: 0.7043 2024/11/14 00:41:01 - mmengine - INFO - Iter(train) [147400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:27:01 time: 2.0047 data_time: 
0.0190 memory: 32590 grad_norm: 24.1628 loss: 7.7077 loss_cls: 0.2340 loss_bbox: 0.1075 loss_iou: 0.2050 d0.loss_cls: 0.2698 d0.loss_bbox: 0.1213 d0.loss_iou: 0.2202 d1.loss_cls: 0.2542 d1.loss_bbox: 0.1105 d1.loss_iou: 0.2099 d2.loss_cls: 0.2418 d2.loss_bbox: 0.1116 d2.loss_iou: 0.2089 d3.loss_cls: 0.2390 d3.loss_bbox: 0.1088 d3.loss_iou: 0.2058 d4.loss_cls: 0.2367 d4.loss_bbox: 0.1071 d4.loss_iou: 0.2044 enc_loss_cls: 0.2798 enc_loss_bbox: 0.1295 enc_loss_iou: 0.2353 dn_loss_cls: 0.0851 dn_loss_bbox: 0.1373 dn_loss_iou: 0.1809 d0.dn_loss_cls: 0.1558 d0.dn_loss_bbox: 0.2769 d0.dn_loss_iou: 0.3100 d1.dn_loss_cls: 0.1090 d1.dn_loss_bbox: 0.1679 d1.dn_loss_iou: 0.2079 d2.dn_loss_cls: 0.0943 d2.dn_loss_bbox: 0.1469 d2.dn_loss_iou: 0.1894 d3.dn_loss_cls: 0.0897 d3.dn_loss_bbox: 0.1394 d3.dn_loss_iou: 0.1834 d4.dn_loss_cls: 0.0870 d4.dn_loss_bbox: 0.1373 d4.dn_loss_iou: 0.1811 d1.loss_lmm_region: 0.0879 loss_lmm_image: 0.6993 2024/11/14 00:44:24 - mmengine - INFO - Iter(train) [147500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:23:40 time: 2.0638 data_time: 0.0191 memory: 34216 grad_norm: 29.0049 loss: 8.5934 loss_cls: 0.2671 loss_bbox: 0.1177 loss_iou: 0.2114 d0.loss_cls: 0.2887 d0.loss_bbox: 0.1335 d0.loss_iou: 0.2308 d1.loss_cls: 0.2812 d1.loss_bbox: 0.1250 d1.loss_iou: 0.2213 d2.loss_cls: 0.2765 d2.loss_bbox: 0.1171 d2.loss_iou: 0.2138 d3.loss_cls: 0.2712 d3.loss_bbox: 0.1159 d3.loss_iou: 0.2133 d4.loss_cls: 0.2647 d4.loss_bbox: 0.1182 d4.loss_iou: 0.2129 enc_loss_cls: 0.2961 enc_loss_bbox: 0.1436 enc_loss_iou: 0.2483 dn_loss_cls: 0.1424 dn_loss_bbox: 0.1620 dn_loss_iou: 0.2013 d0.dn_loss_cls: 0.1997 d0.dn_loss_bbox: 0.2989 d0.dn_loss_iou: 0.3350 d1.dn_loss_cls: 0.1676 d1.dn_loss_bbox: 0.1926 d1.dn_loss_iou: 0.2312 d2.dn_loss_cls: 0.1510 d2.dn_loss_bbox: 0.1679 d2.dn_loss_iou: 0.2090 d3.dn_loss_cls: 0.1507 d3.dn_loss_bbox: 0.1632 d3.dn_loss_iou: 0.2036 d4.dn_loss_cls: 0.1464 d4.dn_loss_bbox: 0.1620 d4.dn_loss_iou: 0.2014 d1.loss_lmm_region: 0.0861 loss_lmm_image: 0.6529 2024/11/14 00:47:44 - mmengine - INFO - Iter(train) [147600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:20:20 time: 1.9877 data_time: 0.0190 memory: 32975 grad_norm: 25.6544 loss: 8.6315 loss_cls: 0.2744 loss_bbox: 0.1397 loss_iou: 0.2328 d0.loss_cls: 0.3091 d0.loss_bbox: 0.1484 d0.loss_iou: 0.2431 d1.loss_cls: 0.2902 d1.loss_bbox: 0.1434 d1.loss_iou: 0.2326 d2.loss_cls: 0.2852 d2.loss_bbox: 0.1414 d2.loss_iou: 0.2322 d3.loss_cls: 0.2812 d3.loss_bbox: 0.1396 d3.loss_iou: 0.2314 d4.loss_cls: 0.2733 d4.loss_bbox: 0.1402 d4.loss_iou: 0.2315 enc_loss_cls: 0.3254 enc_loss_bbox: 0.1556 enc_loss_iou: 0.2556 dn_loss_cls: 0.0747 dn_loss_bbox: 0.1664 dn_loss_iou: 0.2067 d0.dn_loss_cls: 0.1588 d0.dn_loss_bbox: 0.3109 d0.dn_loss_iou: 0.3405 d1.dn_loss_cls: 0.1029 d1.dn_loss_bbox: 0.1954 d1.dn_loss_iou: 0.2343 d2.dn_loss_cls: 0.0856 d2.dn_loss_bbox: 0.1758 d2.dn_loss_iou: 0.2163 d3.dn_loss_cls: 0.0801 d3.dn_loss_bbox: 0.1682 d3.dn_loss_iou: 0.2092 d4.dn_loss_cls: 0.0757 d4.dn_loss_bbox: 0.1663 d4.dn_loss_iou: 0.2069 d1.loss_lmm_region: 0.0938 loss_lmm_image: 0.6567 2024/11/14 00:51:04 - mmengine - INFO - Iter(train) [147700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:16:59 time: 2.0130 data_time: 0.0191 memory: 33985 grad_norm: 25.0191 loss: 8.6246 loss_cls: 0.2852 loss_bbox: 0.1215 loss_iou: 0.2374 d0.loss_cls: 0.3202 d0.loss_bbox: 0.1308 d0.loss_iou: 0.2483 d1.loss_cls: 0.3006 d1.loss_bbox: 0.1224 d1.loss_iou: 0.2395 d2.loss_cls: 0.2895 d2.loss_bbox: 0.1241 d2.loss_iou: 0.2399 d3.loss_cls: 0.2853 d3.loss_bbox: 0.1208 
d3.loss_iou: 0.2400 d4.loss_cls: 0.2828 d4.loss_bbox: 0.1218 d4.loss_iou: 0.2383 enc_loss_cls: 0.3260 enc_loss_bbox: 0.1372 enc_loss_iou: 0.2579 dn_loss_cls: 0.0977 dn_loss_bbox: 0.1480 dn_loss_iou: 0.2050 d0.dn_loss_cls: 0.1764 d0.dn_loss_bbox: 0.2846 d0.dn_loss_iou: 0.3316 d1.dn_loss_cls: 0.1264 d1.dn_loss_bbox: 0.1752 d1.dn_loss_iou: 0.2305 d2.dn_loss_cls: 0.1078 d2.dn_loss_bbox: 0.1587 d2.dn_loss_iou: 0.2142 d3.dn_loss_cls: 0.1005 d3.dn_loss_bbox: 0.1496 d3.dn_loss_iou: 0.2068 d4.dn_loss_cls: 0.0975 d4.dn_loss_bbox: 0.1480 d4.dn_loss_iou: 0.2048 d1.loss_lmm_region: 0.1108 loss_lmm_image: 0.6809 2024/11/14 00:54:22 - mmengine - INFO - Iter(train) [147800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:13:38 time: 1.9643 data_time: 0.0189 memory: 34228 grad_norm: 36.1135 loss: 8.7169 loss_cls: 0.2462 loss_bbox: 0.1541 loss_iou: 0.2543 d0.loss_cls: 0.2829 d0.loss_bbox: 0.1645 d0.loss_iou: 0.2675 d1.loss_cls: 0.2636 d1.loss_bbox: 0.1578 d1.loss_iou: 0.2598 d2.loss_cls: 0.2565 d2.loss_bbox: 0.1554 d2.loss_iou: 0.2558 d3.loss_cls: 0.2495 d3.loss_bbox: 0.1563 d3.loss_iou: 0.2556 d4.loss_cls: 0.2468 d4.loss_bbox: 0.1537 d4.loss_iou: 0.2547 enc_loss_cls: 0.2908 enc_loss_bbox: 0.1767 enc_loss_iou: 0.2828 dn_loss_cls: 0.0617 dn_loss_bbox: 0.1717 dn_loss_iou: 0.2094 d0.dn_loss_cls: 0.1508 d0.dn_loss_bbox: 0.3086 d0.dn_loss_iou: 0.3386 d1.dn_loss_cls: 0.0935 d1.dn_loss_bbox: 0.2011 d1.dn_loss_iou: 0.2360 d2.dn_loss_cls: 0.0730 d2.dn_loss_bbox: 0.1790 d2.dn_loss_iou: 0.2175 d3.dn_loss_cls: 0.0660 d3.dn_loss_bbox: 0.1730 d3.dn_loss_iou: 0.2112 d4.dn_loss_cls: 0.0627 d4.dn_loss_bbox: 0.1717 d4.dn_loss_iou: 0.2094 d1.loss_lmm_region: 0.0955 loss_lmm_image: 0.7012 2024/11/14 00:57:40 - mmengine - INFO - Iter(train) [147900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:10:17 time: 1.9961 data_time: 0.0189 memory: 34306 grad_norm: 22.8738 loss: 9.3103 loss_cls: 0.3151 loss_bbox: 0.1552 loss_iou: 0.2760 d0.loss_cls: 0.3616 d0.loss_bbox: 0.1713 d0.loss_iou: 0.2892 d1.loss_cls: 0.3386 d1.loss_bbox: 0.1579 d1.loss_iou: 0.2761 d2.loss_cls: 0.3292 d2.loss_bbox: 0.1534 d2.loss_iou: 0.2750 d3.loss_cls: 0.3205 d3.loss_bbox: 0.1538 d3.loss_iou: 0.2744 d4.loss_cls: 0.3180 d4.loss_bbox: 0.1516 d4.loss_iou: 0.2727 enc_loss_cls: 0.3714 enc_loss_bbox: 0.1738 enc_loss_iou: 0.3037 dn_loss_cls: 0.0783 dn_loss_bbox: 0.1579 dn_loss_iou: 0.2006 d0.dn_loss_cls: 0.1685 d0.dn_loss_bbox: 0.2975 d0.dn_loss_iou: 0.3344 d1.dn_loss_cls: 0.1144 d1.dn_loss_bbox: 0.1863 d1.dn_loss_iou: 0.2281 d2.dn_loss_cls: 0.0914 d2.dn_loss_bbox: 0.1659 d2.dn_loss_iou: 0.2090 d3.dn_loss_cls: 0.0839 d3.dn_loss_bbox: 0.1590 d3.dn_loss_iou: 0.2026 d4.dn_loss_cls: 0.0797 d4.dn_loss_bbox: 0.1579 d4.dn_loss_iou: 0.2006 d1.loss_lmm_region: 0.0939 loss_lmm_image: 0.6621 2024/11/14 01:00:58 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759 2024/11/14 01:00:58 - mmengine - INFO - Iter(train) [148000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:06:56 time: 1.9665 data_time: 0.0190 memory: 33438 grad_norm: 25.8349 loss: 7.9299 loss_cls: 0.2393 loss_bbox: 0.1207 loss_iou: 0.2409 d0.loss_cls: 0.2769 d0.loss_bbox: 0.1296 d0.loss_iou: 0.2536 d1.loss_cls: 0.2575 d1.loss_bbox: 0.1264 d1.loss_iou: 0.2498 d2.loss_cls: 0.2496 d2.loss_bbox: 0.1223 d2.loss_iou: 0.2435 d3.loss_cls: 0.2425 d3.loss_bbox: 0.1212 d3.loss_iou: 0.2422 d4.loss_cls: 0.2408 d4.loss_bbox: 0.1209 d4.loss_iou: 0.2410 enc_loss_cls: 0.2810 enc_loss_bbox: 0.1418 enc_loss_iou: 0.2698 dn_loss_cls: 0.0671 dn_loss_bbox: 0.1257 dn_loss_iou: 0.1925 d0.dn_loss_cls: 0.1423 
d0.dn_loss_bbox: 0.2383 d0.dn_loss_iou: 0.3125 d1.dn_loss_cls: 0.0928 d1.dn_loss_bbox: 0.1485 d1.dn_loss_iou: 0.2169 d2.dn_loss_cls: 0.0756 d2.dn_loss_bbox: 0.1312 d2.dn_loss_iou: 0.1990 d3.dn_loss_cls: 0.0699 d3.dn_loss_bbox: 0.1270 d3.dn_loss_iou: 0.1942 d4.dn_loss_cls: 0.0676 d4.dn_loss_bbox: 0.1256 d4.dn_loss_iou: 0.1925 d1.loss_lmm_region: 0.0762 loss_lmm_image: 0.7231 2024/11/14 01:04:16 - mmengine - INFO - Iter(train) [148100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:03:35 time: 1.9505 data_time: 0.0189 memory: 33212 grad_norm: 28.8032 loss: 8.9671 loss_cls: 0.2779 loss_bbox: 0.1299 loss_iou: 0.2382 d0.loss_cls: 0.3130 d0.loss_bbox: 0.1326 d0.loss_iou: 0.2446 d1.loss_cls: 0.2923 d1.loss_bbox: 0.1256 d1.loss_iou: 0.2415 d2.loss_cls: 0.2834 d2.loss_bbox: 0.1291 d2.loss_iou: 0.2425 d3.loss_cls: 0.2815 d3.loss_bbox: 0.1274 d3.loss_iou: 0.2366 d4.loss_cls: 0.2781 d4.loss_bbox: 0.1303 d4.loss_iou: 0.2387 enc_loss_cls: 0.3149 enc_loss_bbox: 0.1428 enc_loss_iou: 0.2520 dn_loss_cls: 0.1575 dn_loss_bbox: 0.1472 dn_loss_iou: 0.2095 d0.dn_loss_cls: 0.2152 d0.dn_loss_bbox: 0.2893 d0.dn_loss_iou: 0.3428 d1.dn_loss_cls: 0.1804 d1.dn_loss_bbox: 0.1756 d1.dn_loss_iou: 0.2367 d2.dn_loss_cls: 0.1586 d2.dn_loss_bbox: 0.1560 d2.dn_loss_iou: 0.2172 d3.dn_loss_cls: 0.1593 d3.dn_loss_bbox: 0.1493 d3.dn_loss_iou: 0.2117 d4.dn_loss_cls: 0.1503 d4.dn_loss_bbox: 0.1473 d4.dn_loss_iou: 0.2095 d1.loss_lmm_region: 0.1068 loss_lmm_image: 0.6943 2024/11/14 01:07:37 - mmengine - INFO - Iter(train) [148200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 1:00:14 time: 2.0077 data_time: 0.0191 memory: 33177 grad_norm: 25.0092 loss: 9.7941 loss_cls: 0.2891 loss_bbox: 0.1701 loss_iou: 0.3105 d0.loss_cls: 0.3257 d0.loss_bbox: 0.1864 d0.loss_iou: 0.3296 d1.loss_cls: 0.3067 d1.loss_bbox: 0.1770 d1.loss_iou: 0.3188 d2.loss_cls: 0.3032 d2.loss_bbox: 0.1725 d2.loss_iou: 0.3138 d3.loss_cls: 0.2920 d3.loss_bbox: 0.1729 d3.loss_iou: 0.3111 d4.loss_cls: 0.2918 d4.loss_bbox: 0.1697 d4.loss_iou: 0.3097 enc_loss_cls: 0.3423 enc_loss_bbox: 0.1896 enc_loss_iou: 0.3352 dn_loss_cls: 0.0810 dn_loss_bbox: 0.1669 dn_loss_iou: 0.2369 d0.dn_loss_cls: 0.1678 d0.dn_loss_bbox: 0.3029 d0.dn_loss_iou: 0.3679 d1.dn_loss_cls: 0.1179 d1.dn_loss_bbox: 0.1986 d1.dn_loss_iou: 0.2635 d2.dn_loss_cls: 0.0959 d2.dn_loss_bbox: 0.1796 d2.dn_loss_iou: 0.2457 d3.dn_loss_cls: 0.0897 d3.dn_loss_bbox: 0.1694 d3.dn_loss_iou: 0.2388 d4.dn_loss_cls: 0.0835 d4.dn_loss_bbox: 0.1669 d4.dn_loss_iou: 0.2370 d1.loss_lmm_region: 0.1117 loss_lmm_image: 0.6549 2024/11/14 01:10:57 - mmengine - INFO - Iter(train) [148300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:56:54 time: 2.0083 data_time: 0.0190 memory: 34653 grad_norm: 22.5522 loss: 8.9897 loss_cls: 0.2779 loss_bbox: 0.1506 loss_iou: 0.2535 d0.loss_cls: 0.3195 d0.loss_bbox: 0.1687 d0.loss_iou: 0.2719 d1.loss_cls: 0.3005 d1.loss_bbox: 0.1586 d1.loss_iou: 0.2618 d2.loss_cls: 0.2920 d2.loss_bbox: 0.1540 d2.loss_iou: 0.2582 d3.loss_cls: 0.2881 d3.loss_bbox: 0.1467 d3.loss_iou: 0.2542 d4.loss_cls: 0.2830 d4.loss_bbox: 0.1522 d4.loss_iou: 0.2539 enc_loss_cls: 0.3293 enc_loss_bbox: 0.1795 enc_loss_iou: 0.2876 dn_loss_cls: 0.0659 dn_loss_bbox: 0.1608 dn_loss_iou: 0.2151 d0.dn_loss_cls: 0.1566 d0.dn_loss_bbox: 0.3131 d0.dn_loss_iou: 0.3436 d1.dn_loss_cls: 0.0999 d1.dn_loss_bbox: 0.1969 d1.dn_loss_iou: 0.2423 d2.dn_loss_cls: 0.0779 d2.dn_loss_bbox: 0.1732 d2.dn_loss_iou: 0.2239 d3.dn_loss_cls: 0.0702 d3.dn_loss_bbox: 0.1623 d3.dn_loss_iou: 0.2175 d4.dn_loss_cls: 0.0667 d4.dn_loss_bbox: 0.1608 d4.dn_loss_iou: 0.2152 
d1.loss_lmm_region: 0.0988 loss_lmm_image: 0.6871
2024/11/14 01:14:18 - mmengine - INFO - Iter(train) [148400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:53:33 time: 2.0343 data_time: 0.0191 memory: 34176 grad_norm: 24.0518 loss: 8.5340 loss_cls: 0.2632 loss_bbox: 0.1379 loss_iou: 0.2770 d0.loss_cls: 0.3100 d0.loss_bbox: 0.1475 d0.loss_iou: 0.2870 d1.loss_cls: 0.2839 d1.loss_bbox: 0.1414 d1.loss_iou: 0.2811 d2.loss_cls: 0.2809 d2.loss_bbox: 0.1335 d2.loss_iou: 0.2713 d3.loss_cls: 0.2688 d3.loss_bbox: 0.1355 d3.loss_iou: 0.2743 d4.loss_cls: 0.2635 d4.loss_bbox: 0.1378 d4.loss_iou: 0.2764 enc_loss_cls: 0.3118 enc_loss_bbox: 0.1612 enc_loss_iou: 0.3030 dn_loss_cls: 0.0573 dn_loss_bbox: 0.1341 dn_loss_iou: 0.2007 d0.dn_loss_cls: 0.1272 d0.dn_loss_bbox: 0.2633 d0.dn_loss_iou: 0.3314 d1.dn_loss_cls: 0.0797 d1.dn_loss_bbox: 0.1628 d1.dn_loss_iou: 0.2269 d2.dn_loss_cls: 0.0660 d2.dn_loss_bbox: 0.1430 d2.dn_loss_iou: 0.2088 d3.dn_loss_cls: 0.0605 d3.dn_loss_bbox: 0.1369 d3.dn_loss_iou: 0.2031 d4.dn_loss_cls: 0.0583 d4.dn_loss_bbox: 0.1341 d4.dn_loss_iou: 0.2008 d1.loss_lmm_region: 0.0774 loss_lmm_image: 0.7149
2024/11/14 01:17:39 - mmengine - INFO - Iter(train) [148500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:50:12 time: 2.0048 data_time: 0.0190 memory: 33277 grad_norm: 35.6423 loss: 8.3095 loss_cls: 0.2574 loss_bbox: 0.1215 loss_iou: 0.2165 d0.loss_cls: 0.2916 d0.loss_bbox: 0.1389 d0.loss_iou: 0.2284 d1.loss_cls: 0.2780 d1.loss_bbox: 0.1260 d1.loss_iou: 0.2196 d2.loss_cls: 0.2668 d2.loss_bbox: 0.1238 d2.loss_iou: 0.2190 d3.loss_cls: 0.2626 d3.loss_bbox: 0.1216 d3.loss_iou: 0.2163 d4.loss_cls: 0.2578 d4.loss_bbox: 0.1218 d4.loss_iou: 0.2163 enc_loss_cls: 0.3040 enc_loss_bbox: 0.1424 enc_loss_iou: 0.2419 dn_loss_cls: 0.0998 dn_loss_bbox: 0.1418 dn_loss_iou: 0.1995 d0.dn_loss_cls: 0.1978 d0.dn_loss_bbox: 0.2668 d0.dn_loss_iou: 0.3244 d1.dn_loss_cls: 0.1371 d1.dn_loss_bbox: 0.1688 d1.dn_loss_iou: 0.2253 d2.dn_loss_cls: 0.1153 d2.dn_loss_bbox: 0.1499 d2.dn_loss_iou: 0.2073 d3.dn_loss_cls: 0.1065 d3.dn_loss_bbox: 0.1428 d3.dn_loss_iou: 0.2011 d4.dn_loss_cls: 0.1013 d4.dn_loss_bbox: 0.1417 d4.dn_loss_iou: 0.1994 d1.loss_lmm_region: 0.0996 loss_lmm_image: 0.7112
2024/11/14 01:21:00 - mmengine - INFO - Iter(train) [148600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:46:51 time: 2.0147 data_time: 0.0190 memory: 35321 grad_norm: 22.3288 loss: 7.6284 loss_cls: 0.2547 loss_bbox: 0.1240 loss_iou: 0.2143 d0.loss_cls: 0.2925 d0.loss_bbox: 0.1284 d0.loss_iou: 0.2259 d1.loss_cls: 0.2711 d1.loss_bbox: 0.1248 d1.loss_iou: 0.2185 d2.loss_cls: 0.2672 d2.loss_bbox: 0.1206 d2.loss_iou: 0.2146 d3.loss_cls: 0.2573 d3.loss_bbox: 0.1247 d3.loss_iou: 0.2144 d4.loss_cls: 0.2540 d4.loss_bbox: 0.1240 d4.loss_iou: 0.2140 enc_loss_cls: 0.3027 enc_loss_bbox: 0.1423 enc_loss_iou: 0.2421 dn_loss_cls: 0.0626 dn_loss_bbox: 0.1165 dn_loss_iou: 0.1692 d0.dn_loss_cls: 0.1372 d0.dn_loss_bbox: 0.2474 d0.dn_loss_iou: 0.2983 d1.dn_loss_cls: 0.0890 d1.dn_loss_bbox: 0.1457 d1.dn_loss_iou: 0.1965 d2.dn_loss_cls: 0.0715 d2.dn_loss_bbox: 0.1249 d2.dn_loss_iou: 0.1777 d3.dn_loss_cls: 0.0665 d3.dn_loss_bbox: 0.1181 d3.dn_loss_iou: 0.1712 d4.dn_loss_cls: 0.0630 d4.dn_loss_bbox: 0.1166 d4.dn_loss_iou: 0.1694 d1.loss_lmm_region: 0.0816 loss_lmm_image: 0.6733
2024/11/14 01:24:20 - mmengine - INFO - Iter(train) [148700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:43:30 time: 1.9910 data_time: 0.0191 memory: 34879 grad_norm: 23.8462 loss: 8.2606 loss_cls: 0.2589 loss_bbox: 0.1226 loss_iou: 0.2342 d0.loss_cls: 0.3007 d0.loss_bbox: 0.1327 d0.loss_iou: 0.2494 d1.loss_cls: 0.2801 d1.loss_bbox: 0.1251 d1.loss_iou: 0.2399 d2.loss_cls: 0.2707 d2.loss_bbox: 0.1256 d2.loss_iou: 0.2388 d3.loss_cls: 0.2647 d3.loss_bbox: 0.1247 d3.loss_iou: 0.2375 d4.loss_cls: 0.2583 d4.loss_bbox: 0.1240 d4.loss_iou: 0.2381 enc_loss_cls: 0.3064 enc_loss_bbox: 0.1446 enc_loss_iou: 0.2627 dn_loss_cls: 0.0743 dn_loss_bbox: 0.1393 dn_loss_iou: 0.1943 d0.dn_loss_cls: 0.1523 d0.dn_loss_bbox: 0.2597 d0.dn_loss_iou: 0.3256 d1.dn_loss_cls: 0.1026 d1.dn_loss_bbox: 0.1658 d1.dn_loss_iou: 0.2228 d2.dn_loss_cls: 0.0855 d2.dn_loss_bbox: 0.1480 d2.dn_loss_iou: 0.2036 d3.dn_loss_cls: 0.0798 d3.dn_loss_bbox: 0.1410 d3.dn_loss_iou: 0.1965 d4.dn_loss_cls: 0.0763 d4.dn_loss_bbox: 0.1393 d4.dn_loss_iou: 0.1944 d1.loss_lmm_region: 0.0927 loss_lmm_image: 0.7269
2024/11/14 01:27:41 - mmengine - INFO - Iter(train) [148800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:40:09 time: 2.0234 data_time: 0.0190 memory: 35070 grad_norm: 25.0134 loss: 7.3444 loss_cls: 0.2401 loss_bbox: 0.1063 loss_iou: 0.2210 d0.loss_cls: 0.2695 d0.loss_bbox: 0.1208 d0.loss_iou: 0.2352 d1.loss_cls: 0.2533 d1.loss_bbox: 0.1104 d1.loss_iou: 0.2265 d2.loss_cls: 0.2504 d2.loss_bbox: 0.1057 d2.loss_iou: 0.2196 d3.loss_cls: 0.2463 d3.loss_bbox: 0.1062 d3.loss_iou: 0.2212 d4.loss_cls: 0.2415 d4.loss_bbox: 0.1070 d4.loss_iou: 0.2213 enc_loss_cls: 0.2857 enc_loss_bbox: 0.1265 enc_loss_iou: 0.2474 dn_loss_cls: 0.0607 dn_loss_bbox: 0.1113 dn_loss_iou: 0.1677 d0.dn_loss_cls: 0.1314 d0.dn_loss_bbox: 0.2236 d0.dn_loss_iou: 0.2803 d1.dn_loss_cls: 0.0834 d1.dn_loss_bbox: 0.1314 d1.dn_loss_iou: 0.1887 d2.dn_loss_cls: 0.0697 d2.dn_loss_bbox: 0.1181 d2.dn_loss_iou: 0.1752 d3.dn_loss_cls: 0.0647 d3.dn_loss_bbox: 0.1133 d3.dn_loss_iou: 0.1695 d4.dn_loss_cls: 0.0618 d4.dn_loss_bbox: 0.1113 d4.dn_loss_iou: 0.1678 d1.loss_lmm_region: 0.0826 loss_lmm_image: 0.6702
2024/11/14 01:31:00 - mmengine - INFO - Iter(train) [148900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:36:49 time: 1.9914 data_time: 0.0190 memory: 33780 grad_norm: 31.5860 loss: 8.7746 loss_cls: 0.2849 loss_bbox: 0.1411 loss_iou: 0.2823 d0.loss_cls: 0.3313 d0.loss_bbox: 0.1505 d0.loss_iou: 0.2957 d1.loss_cls: 0.3050 d1.loss_bbox: 0.1427 d1.loss_iou: 0.2857 d2.loss_cls: 0.2950 d2.loss_bbox: 0.1408 d2.loss_iou: 0.2834 d3.loss_cls: 0.2919 d3.loss_bbox: 0.1407 d3.loss_iou: 0.2813 d4.loss_cls: 0.2872 d4.loss_bbox: 0.1401 d4.loss_iou: 0.2814 enc_loss_cls: 0.3407 enc_loss_bbox: 0.1607 enc_loss_iou: 0.3142 dn_loss_cls: 0.0655 dn_loss_bbox: 0.1229 dn_loss_iou: 0.2036 d0.dn_loss_cls: 0.1419 d0.dn_loss_bbox: 0.2373 d0.dn_loss_iou: 0.3323 d1.dn_loss_cls: 0.0941 d1.dn_loss_bbox: 0.1462 d1.dn_loss_iou: 0.2306 d2.dn_loss_cls: 0.0767 d2.dn_loss_bbox: 0.1291 d2.dn_loss_iou: 0.2115 d3.dn_loss_cls: 0.0713 d3.dn_loss_bbox: 0.1238 d3.dn_loss_iou: 0.2050 d4.dn_loss_cls: 0.0664 d4.dn_loss_bbox: 0.1229 d4.dn_loss_iou: 0.2036 d1.loss_lmm_region: 0.0995 loss_lmm_image: 0.7138
2024/11/14 01:34:20 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759
2024/11/14 01:34:20 - mmengine - INFO - Iter(train) [149000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:33:28 time: 1.9756 data_time: 0.0190 memory: 33790 grad_norm: 25.3175 loss: 7.7444 loss_cls: 0.2233 loss_bbox: 0.1145 loss_iou: 0.2192 d0.loss_cls: 0.2521 d0.loss_bbox: 0.1238 d0.loss_iou: 0.2323 d1.loss_cls: 0.2376 d1.loss_bbox: 0.1186 d1.loss_iou: 0.2261 d2.loss_cls: 0.2335 d2.loss_bbox: 0.1141 d2.loss_iou: 0.2205 d3.loss_cls: 0.2300 d3.loss_bbox: 0.1135 d3.loss_iou: 0.2201 d4.loss_cls: 0.2241 d4.loss_bbox: 0.1133 d4.loss_iou: 0.2189 enc_loss_cls: 0.2583 enc_loss_bbox: 0.1322 enc_loss_iou: 0.2444 dn_loss_cls: 0.0686 dn_loss_bbox: 0.1416 dn_loss_iou: 0.1984 d0.dn_loss_cls: 0.1473 d0.dn_loss_bbox: 0.2545 d0.dn_loss_iou: 0.3223 d1.dn_loss_cls: 0.0983 d1.dn_loss_bbox: 0.1693 d1.dn_loss_iou: 0.2225 d2.dn_loss_cls: 0.0778 d2.dn_loss_bbox: 0.1513 d2.dn_loss_iou: 0.2063 d3.dn_loss_cls: 0.0706 d3.dn_loss_bbox: 0.1435 d3.dn_loss_iou: 0.2004 d4.dn_loss_cls: 0.0690 d4.dn_loss_bbox: 0.1417 d4.dn_loss_iou: 0.1984 d1.loss_lmm_region: 0.0865 loss_lmm_image: 0.7057
2024/11/14 01:37:38 - mmengine - INFO - Iter(train) [149100/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:30:07 time: 2.0090 data_time: 0.0189 memory: 34519 grad_norm: 24.0568 loss: 7.5590 loss_cls: 0.2316 loss_bbox: 0.1025 loss_iou: 0.1945 d0.loss_cls: 0.2550 d0.loss_bbox: 0.1107 d0.loss_iou: 0.2069 d1.loss_cls: 0.2408 d1.loss_bbox: 0.1066 d1.loss_iou: 0.1962 d2.loss_cls: 0.2361 d2.loss_bbox: 0.1030 d2.loss_iou: 0.1938 d3.loss_cls: 0.2322 d3.loss_bbox: 0.1035 d3.loss_iou: 0.1940 d4.loss_cls: 0.2315 d4.loss_bbox: 0.1019 d4.loss_iou: 0.1942 enc_loss_cls: 0.2614 enc_loss_bbox: 0.1206 enc_loss_iou: 0.2212 dn_loss_cls: 0.0708 dn_loss_bbox: 0.1458 dn_loss_iou: 0.1966 d0.dn_loss_cls: 0.1556 d0.dn_loss_bbox: 0.2936 d0.dn_loss_iou: 0.3259 d1.dn_loss_cls: 0.1033 d1.dn_loss_bbox: 0.1814 d1.dn_loss_iou: 0.2233 d2.dn_loss_cls: 0.0845 d2.dn_loss_bbox: 0.1548 d2.dn_loss_iou: 0.2042 d3.dn_loss_cls: 0.0763 d3.dn_loss_bbox: 0.1481 d3.dn_loss_iou: 0.1986 d4.dn_loss_cls: 0.0720 d4.dn_loss_bbox: 0.1458 d4.dn_loss_iou: 0.1967 d1.loss_lmm_region: 0.0941 loss_lmm_image: 0.6492
2024/11/14 01:40:57 - mmengine - INFO - Iter(train) [149200/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:26:46 time: 1.9969 data_time: 0.0191 memory: 34489 grad_norm: 31.4364 loss: 6.9030 loss_cls: 0.2173 loss_bbox: 0.0956 loss_iou: 0.1715 d0.loss_cls: 0.2473 d0.loss_bbox: 0.1075 d0.loss_iou: 0.1913 d1.loss_cls: 0.2324 d1.loss_bbox: 0.0998 d1.loss_iou: 0.1829 d2.loss_cls: 0.2265 d2.loss_bbox: 0.0977 d2.loss_iou: 0.1757 d3.loss_cls: 0.2186 d3.loss_bbox: 0.0974 d3.loss_iou: 0.1727 d4.loss_cls: 0.2189 d4.loss_bbox: 0.0953 d4.loss_iou: 0.1716 enc_loss_cls: 0.2580 enc_loss_bbox: 0.1176 enc_loss_iou: 0.2052 dn_loss_cls: 0.0641 dn_loss_bbox: 0.1126 dn_loss_iou: 0.1659 d0.dn_loss_cls: 0.1444 d0.dn_loss_bbox: 0.2339 d0.dn_loss_iou: 0.2969 d1.dn_loss_cls: 0.0921 d1.dn_loss_bbox: 0.1352 d1.dn_loss_iou: 0.1916 d2.dn_loss_cls: 0.0736 d2.dn_loss_bbox: 0.1200 d2.dn_loss_iou: 0.1743 d3.dn_loss_cls: 0.0671 d3.dn_loss_bbox: 0.1142 d3.dn_loss_iou: 0.1677 d4.dn_loss_cls: 0.0644 d4.dn_loss_bbox: 0.1126 d4.dn_loss_iou: 0.1659 d1.loss_lmm_region: 0.0846 loss_lmm_image: 0.7211
2024/11/14 01:44:16 - mmengine - INFO - Iter(train) [149300/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:23:25 time: 1.9865 data_time: 0.0190 memory: 35183 grad_norm: 25.3426 loss: 8.5832 loss_cls: 0.2633 loss_bbox: 0.1312 loss_iou: 0.2498 d0.loss_cls: 0.3033 d0.loss_bbox: 0.1408 d0.loss_iou: 0.2615 d1.loss_cls: 0.2852 d1.loss_bbox: 0.1328 d1.loss_iou: 0.2532 d2.loss_cls: 0.2755 d2.loss_bbox: 0.1293 d2.loss_iou: 0.2482 d3.loss_cls: 0.2683 d3.loss_bbox: 0.1301 d3.loss_iou: 0.2486 d4.loss_cls: 0.2662 d4.loss_bbox: 0.1305 d4.loss_iou: 0.2502 enc_loss_cls: 0.3095 enc_loss_bbox: 0.1509 enc_loss_iou: 0.2740 dn_loss_cls: 0.0861 dn_loss_bbox: 0.1553 dn_loss_iou: 0.1945 d0.dn_loss_cls: 0.1645 d0.dn_loss_bbox: 0.2803 d0.dn_loss_iou: 0.3132 d1.dn_loss_cls: 0.1143 d1.dn_loss_bbox: 0.1775 d1.dn_loss_iou: 0.2162 d2.dn_loss_cls: 0.0976 d2.dn_loss_bbox: 0.1628 d2.dn_loss_iou: 0.2014 d3.dn_loss_cls: 0.0921 d3.dn_loss_bbox: 0.1565 d3.dn_loss_iou: 0.1959 d4.dn_loss_cls: 0.0878 d4.dn_loss_bbox: 0.1553 d4.dn_loss_iou: 0.1946 d1.loss_lmm_region: 0.1043 loss_lmm_image: 0.7306
2024/11/14 01:47:38 - mmengine - INFO - Iter(train) [149400/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:20:04 time: 2.0133 data_time: 0.0192 memory: 34556 grad_norm: 24.4302 loss: 8.6955 loss_cls: 0.2752 loss_bbox: 0.1370 loss_iou: 0.2410 d0.loss_cls: 0.3017 d0.loss_bbox: 0.1505 d0.loss_iou: 0.2542 d1.loss_cls: 0.2903 d1.loss_bbox: 0.1407 d1.loss_iou: 0.2488 d2.loss_cls: 0.2816 d2.loss_bbox: 0.1414 d2.loss_iou: 0.2457 d3.loss_cls: 0.2790 d3.loss_bbox: 0.1380 d3.loss_iou: 0.2405 d4.loss_cls: 0.2747 d4.loss_bbox: 0.1411 d4.loss_iou: 0.2414 enc_loss_cls: 0.3142 enc_loss_bbox: 0.1574 enc_loss_iou: 0.2648 dn_loss_cls: 0.0904 dn_loss_bbox: 0.1577 dn_loss_iou: 0.2040 d0.dn_loss_cls: 0.1660 d0.dn_loss_bbox: 0.3044 d0.dn_loss_iou: 0.3367 d1.dn_loss_cls: 0.1157 d1.dn_loss_bbox: 0.1869 d1.dn_loss_iou: 0.2294 d2.dn_loss_cls: 0.1004 d2.dn_loss_bbox: 0.1658 d2.dn_loss_iou: 0.2119 d3.dn_loss_cls: 0.0951 d3.dn_loss_bbox: 0.1596 d3.dn_loss_iou: 0.2062 d4.dn_loss_cls: 0.0905 d4.dn_loss_bbox: 0.1577 d4.dn_loss_iou: 0.2041 d1.loss_lmm_region: 0.0986 loss_lmm_image: 0.6554
2024/11/14 01:50:58 - mmengine - INFO - Iter(train) [149500/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:16:44 time: 1.9935 data_time: 0.0190 memory: 34592 grad_norm: 34.4189 loss: 8.9776 loss_cls: 0.3300 loss_bbox: 0.1272 loss_iou: 0.2472 d0.loss_cls: 0.3879 d0.loss_bbox: 0.1360 d0.loss_iou: 0.2582 d1.loss_cls: 0.3638 d1.loss_bbox: 0.1286 d1.loss_iou: 0.2482 d2.loss_cls: 0.3499 d2.loss_bbox: 0.1257 d2.loss_iou: 0.2446 d3.loss_cls: 0.3430 d3.loss_bbox: 0.1270 d3.loss_iou: 0.2461 d4.loss_cls: 0.3346 d4.loss_bbox: 0.1265 d4.loss_iou: 0.2458 enc_loss_cls: 0.3899 enc_loss_bbox: 0.1470 enc_loss_iou: 0.2762 dn_loss_cls: 0.1082 dn_loss_bbox: 0.1319 dn_loss_iou: 0.1868 d0.dn_loss_cls: 0.1935 d0.dn_loss_bbox: 0.2638 d0.dn_loss_iou: 0.3106 d1.dn_loss_cls: 0.1395 d1.dn_loss_bbox: 0.1615 d1.dn_loss_iou: 0.2122 d2.dn_loss_cls: 0.1199 d2.dn_loss_bbox: 0.1417 d2.dn_loss_iou: 0.1945 d3.dn_loss_cls: 0.1138 d3.dn_loss_bbox: 0.1337 d3.dn_loss_iou: 0.1885 d4.dn_loss_cls: 0.1100 d4.dn_loss_bbox: 0.1319 d4.dn_loss_iou: 0.1869 d1.loss_lmm_region: 0.1032 loss_lmm_image: 0.6622
2024/11/14 01:54:16 - mmengine - INFO - Iter(train) [149600/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:13:23 time: 2.0046 data_time: 0.0190 memory: 33597 grad_norm: 28.4985 loss: 7.6075 loss_cls: 0.2270 loss_bbox: 0.1092 loss_iou: 0.1966 d0.loss_cls: 0.2577 d0.loss_bbox: 0.1275 d0.loss_iou: 0.2161 d1.loss_cls: 0.2439 d1.loss_bbox: 0.1181 d1.loss_iou: 0.2072 d2.loss_cls: 0.2359 d2.loss_bbox: 0.1140 d2.loss_iou: 0.2004 d3.loss_cls: 0.2290 d3.loss_bbox: 0.1114 d3.loss_iou: 0.1970 d4.loss_cls: 0.2281 d4.loss_bbox: 0.1094 d4.loss_iou: 0.1966 enc_loss_cls: 0.2639 enc_loss_bbox: 0.1389 enc_loss_iou: 0.2311 dn_loss_cls: 0.0738 dn_loss_bbox: 0.1468 dn_loss_iou: 0.1771 d0.dn_loss_cls: 0.1512 d0.dn_loss_bbox: 0.2835 d0.dn_loss_iou: 0.3056 d1.dn_loss_cls: 0.1046 d1.dn_loss_bbox: 0.1823 d1.dn_loss_iou: 0.2061 d2.dn_loss_cls: 0.0848 d2.dn_loss_bbox: 0.1573 d2.dn_loss_iou: 0.1859 d3.dn_loss_cls: 0.0794 d3.dn_loss_bbox: 0.1484 d3.dn_loss_iou: 0.1793 d4.dn_loss_cls: 0.0746 d4.dn_loss_bbox: 0.1470 d4.dn_loss_iou: 0.1773 d1.loss_lmm_region: 0.1027 loss_lmm_image: 0.6806
2024/11/14 01:57:34 - mmengine - INFO - Iter(train) [149700/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:10:02 time: 1.9776 data_time: 0.0189 memory: 31884 grad_norm: 28.8687 loss: 7.7456 loss_cls: 0.2213 loss_bbox: 0.1246 loss_iou: 0.2015 d0.loss_cls: 0.2472 d0.loss_bbox: 0.1383 d0.loss_iou: 0.2165 d1.loss_cls: 0.2327 d1.loss_bbox: 0.1305 d1.loss_iou: 0.2082 d2.loss_cls: 0.2323 d2.loss_bbox: 0.1219 d2.loss_iou: 0.2028 d3.loss_cls: 0.2270 d3.loss_bbox: 0.1224 d3.loss_iou: 0.2009 d4.loss_cls: 0.2231 d4.loss_bbox: 0.1227 d4.loss_iou: 0.2011 enc_loss_cls: 0.2558 enc_loss_bbox: 0.1433 enc_loss_iou: 0.2249 dn_loss_cls: 0.0700 dn_loss_bbox: 0.1565 dn_loss_iou: 0.1928 d0.dn_loss_cls: 0.1468 d0.dn_loss_bbox: 0.2948 d0.dn_loss_iou: 0.3191 d1.dn_loss_cls: 0.1019 d1.dn_loss_bbox: 0.1834 d1.dn_loss_iou: 0.2158 d2.dn_loss_cls: 0.0840 d2.dn_loss_bbox: 0.1639 d2.dn_loss_iou: 0.1999 d3.dn_loss_cls: 0.0760 d3.dn_loss_bbox: 0.1577 d3.dn_loss_iou: 0.1943 d4.dn_loss_cls: 0.0710 d4.dn_loss_bbox: 0.1564 d4.dn_loss_iou: 0.1928 d1.loss_lmm_region: 0.0764 loss_lmm_image: 0.6933
2024/11/14 02:00:51 - mmengine - INFO - Iter(train) [149800/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:06:41 time: 1.9954 data_time: 0.0190 memory: 33562 grad_norm: 20.7603 loss: 8.6620 loss_cls: 0.2707 loss_bbox: 0.1459 loss_iou: 0.2586 d0.loss_cls: 0.3146 d0.loss_bbox: 0.1506 d0.loss_iou: 0.2677 d1.loss_cls: 0.2922 d1.loss_bbox: 0.1475 d1.loss_iou: 0.2600 d2.loss_cls: 0.2819 d2.loss_bbox: 0.1448 d2.loss_iou: 0.2610 d3.loss_cls: 0.2746 d3.loss_bbox: 0.1448 d3.loss_iou: 0.2585 d4.loss_cls: 0.2727 d4.loss_bbox: 0.1459 d4.loss_iou: 0.2595 enc_loss_cls: 0.3295 enc_loss_bbox: 0.1565 enc_loss_iou: 0.2755 dn_loss_cls: 0.0690 dn_loss_bbox: 0.1576 dn_loss_iou: 0.2055 d0.dn_loss_cls: 0.1432 d0.dn_loss_bbox: 0.2743 d0.dn_loss_iou: 0.3221 d1.dn_loss_cls: 0.0973 d1.dn_loss_bbox: 0.1831 d1.dn_loss_iou: 0.2297 d2.dn_loss_cls: 0.0792 d2.dn_loss_bbox: 0.1617 d2.dn_loss_iou: 0.2125 d3.dn_loss_cls: 0.0736 d3.dn_loss_bbox: 0.1583 d3.dn_loss_iou: 0.2072 d4.dn_loss_cls: 0.0698 d4.dn_loss_bbox: 0.1576 d4.dn_loss_iou: 0.2055 d1.loss_lmm_region: 0.0904 loss_lmm_image: 0.6514
2024/11/14 02:04:12 - mmengine - INFO - Iter(train) [149900/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:03:20 time: 2.0055 data_time: 0.0189 memory: 34187 grad_norm: 25.0704 loss: 7.5011 loss_cls: 0.2178 loss_bbox: 0.1214 loss_iou: 0.2099 d0.loss_cls: 0.2474 d0.loss_bbox: 0.1343 d0.loss_iou: 0.2287 d1.loss_cls: 0.2301 d1.loss_bbox: 0.1277 d1.loss_iou: 0.2188 d2.loss_cls: 0.2275 d2.loss_bbox: 0.1183 d2.loss_iou: 0.2096 d3.loss_cls: 0.2212 d3.loss_bbox: 0.1189 d3.loss_iou: 0.2093 d4.loss_cls: 0.2188 d4.loss_bbox: 0.1205 d4.loss_iou: 0.2102 enc_loss_cls: 0.2562 enc_loss_bbox: 0.1367 enc_loss_iou: 0.2399 dn_loss_cls: 0.0526 dn_loss_bbox: 0.1377 dn_loss_iou: 0.1879 d0.dn_loss_cls: 0.1300 d0.dn_loss_bbox: 0.2661 d0.dn_loss_iou: 0.3119 d1.dn_loss_cls: 0.0789 d1.dn_loss_bbox: 0.1653 d1.dn_loss_iou: 0.2137 d2.dn_loss_cls: 0.0621 d2.dn_loss_bbox: 0.1454 d2.dn_loss_iou: 0.1958 d3.dn_loss_cls: 0.0557 d3.dn_loss_bbox: 0.1386 d3.dn_loss_iou: 0.1896 d4.dn_loss_cls: 0.0531 d4.dn_loss_bbox: 0.1376 d4.dn_loss_iou: 0.1879 d1.loss_lmm_region: 0.0809 loss_lmm_image: 0.6872
2024/11/14 02:07:32 - mmengine - INFO - Exp name: grounding_dino_swin_l_20241110_092759
2024/11/14 02:07:32 - mmengine - INFO - Iter(train) [150000/150000] base_lr: 1.0000e-06 lr: 1.0000e-06 eta: 0:00:00 time: 2.0043 data_time: 0.0191 memory: 32220 grad_norm: 23.7986 loss: 8.7080 loss_cls: 0.2736 loss_bbox: 0.1368 loss_iou: 0.2183 d0.loss_cls: 0.3093 d0.loss_bbox: 0.1532 d0.loss_iou: 0.2329 d1.loss_cls: 0.2925 d1.loss_bbox: 0.1438 d1.loss_iou: 0.2227 d2.loss_cls: 0.2864 d2.loss_bbox: 0.1373 d2.loss_iou: 0.2174 d3.loss_cls: 0.2795 d3.loss_bbox: 0.1355 d3.loss_iou: 0.2172 d4.loss_cls: 0.2738 d4.loss_bbox: 0.1370 d4.loss_iou: 0.2186 enc_loss_cls: 0.3157 enc_loss_bbox: 0.1642 enc_loss_iou: 0.2512 dn_loss_cls: 0.0735 dn_loss_bbox: 0.1765 dn_loss_iou: 0.2056 d0.dn_loss_cls: 0.1596 d0.dn_loss_bbox: 0.3462 d0.dn_loss_iou: 0.3478 d1.dn_loss_cls: 0.1047 d1.dn_loss_bbox: 0.2122 d1.dn_loss_iou: 0.2362 d2.dn_loss_cls: 0.0840 d2.dn_loss_bbox: 0.1868 d2.dn_loss_iou: 0.2146 d3.dn_loss_cls: 0.0774 d3.dn_loss_bbox: 0.1781 d3.dn_loss_iou: 0.2078 d4.dn_loss_cls: 0.0745 d4.dn_loss_bbox: 0.1766 d4.dn_loss_iou: 0.2056 d1.loss_lmm_region: 0.1051 loss_lmm_image: 0.7181
2024/11/14 02:07:32 - mmengine - INFO - Saving checkpoint at 150000 iterations
2024/11/14 02:18:36 - mmengine - INFO - Iter(val) [100/602] eta: 0:48:42 time: 5.9556 data_time: 0.0018 memory: 7497
2024/11/14 02:28:52 - mmengine - INFO - Iter(val) [200/602] eta: 0:40:07 time: 6.2955 data_time: 0.0018 memory: 7528
2024/11/14 02:40:06 - mmengine - INFO - Iter(val) [300/602] eta: 0:31:24 time: 6.8255 data_time: 0.0018 memory: 7506
2024/11/14 02:52:18 - mmengine - INFO - Iter(val) [400/602] eta: 0:21:54 time: 7.4617 data_time: 0.0019 memory: 7528
2024/11/14 03:05:03 - mmengine - INFO - Iter(val) [500/602] eta: 0:11:27 time: 7.7833 data_time: 0.0018 memory: 7473
2024/11/14 03:18:43 - mmengine - INFO - Iter(val) [600/602] eta: 0:00:13 time: 8.3360 data_time: 0.0019 memory: 7522
2024/11/14 03:25:29 - mmengine - INFO - === 67 classes had less than 10000 detections! Outputting 10000 detections for each class will improve AP further. ===
2024/11/14 03:27:48 - mmengine - INFO - mAP_copypaste: {'AP': 0.5061346343647185, 'AP50': 0.6192047436740596, 'AP75': 0.5378175825124424, 'APs': 0.45266486267310585, 'APm': 0.6334216488852137, 'APl': 0.7221627169838374, 'APr': 0.417155678281297, 'APc': 0.46201831012327854, 'APf': 0.5612725746966883}
2024/11/14 03:28:13 - mmengine - INFO - Iter(val) [602/602] lvis_fixed_ap/AP: 0.5061 lvis_fixed_ap/AP50: 0.6192 lvis_fixed_ap/AP75: 0.5378 lvis_fixed_ap/APs: 0.4527 lvis_fixed_ap/APm: 0.6334 lvis_fixed_ap/APl: 0.7222 lvis_fixed_ap/APr: 0.4172 lvis_fixed_ap/APc: 0.4620 lvis_fixed_ap/APf: 0.5613 data_time: 0.0020 time: 6.9872
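The log above records two kinds of machine-readable lines: per-iteration `Iter(train) [N/150000] ... loss: X` entries and the final `mAP_copypaste: {...}` dict from the LVIS fixed-AP evaluation. Below is a minimal post-processing sketch (not produced by the training run itself) for pulling both out of a saved copy of this log; the file name `grounding_dino_swin_l_20241110_092759.log` is an assumption, so adjust the path to wherever the log was saved. It uses only the Python standard library.

```python
# Minimal sketch: extract per-iteration training loss and the final LVIS
# mAP_copypaste dict from a saved copy of this log. The LOG_PATH value is
# a hypothetical local file name, not something emitted by the run.
import ast
import re

LOG_PATH = "grounding_dino_swin_l_20241110_092759.log"  # assumed local copy

# "Iter(train) [148400/150000] ... loss: 8.5340 ..." -> (148400, 8.5340)
ITER_RE = re.compile(r"Iter\(train\) \[(\d+)/\d+\].*?loss: ([0-9.]+)")
# "mAP_copypaste: {'AP': 0.5061..., ...}" -> the metric dict as a literal
MAP_RE = re.compile(r"mAP_copypaste: (\{.*?\})")

losses = {}          # iteration -> total training loss
lvis_metrics = None  # last mAP_copypaste dict seen in the log

with open(LOG_PATH, encoding="utf-8") as f:
    for line in f:
        # finditer handles both one-entry-per-line logs and lines where
        # several entries were concatenated by copy/paste.
        for m in ITER_RE.finditer(line):
            losses[int(m.group(1))] = float(m.group(2))
        m = MAP_RE.search(line)
        if m:
            # The dict is printed with plain Python literals, so it can be
            # parsed safely with ast.literal_eval instead of eval.
            lvis_metrics = ast.literal_eval(m.group(1))

print(f"parsed {len(losses)} Iter(train) records")
if lvis_metrics is not None:
    print(f"LVIS fixed AP: {lvis_metrics['AP']:.4f}  APr: {lvis_metrics['APr']:.4f}")
```

Streaming the file line by line keeps memory constant even for a full 150k-iteration log, and the resulting `losses` dict can be fed directly into a plotting tool to inspect the loss curve over the final iterations shown above.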