01/26/2022 19:48:33 - INFO - codeparrot_training - Distributed environment: TPU Num processes: 8 Process index: 0 Local process index: 0 Device: xla:1 Use FP16 precision: False
01/26/2022 19:48:33 - WARNING - huggingface_hub.repository - Revision `royal-monkey-12` does not exist. Created and checked out branch `royal-monkey-12`.
01/26/2022 19:48:33 - WARNING - huggingface_hub.repository -
01/26/2022 19:48:41 - WARNING - datasets.builder - Using custom data configuration lvwerra___codeparrot-clean-train-a1efdd1059bd841d
01/26/2022 19:48:42 - WARNING - datasets.builder - Using custom data configuration lvwerra___codeparrot-clean-valid-a800eb55c299abc0
01/26/2022 19:49:16 - INFO - codeparrot_training - Step 0: {'lr': 0.0, 'samples': 192, 'steps': 0, 'loss/train': 1.3174253404140472}
01/26/2022 19:50:22 - INFO - codeparrot_training - Step 1: {'lr': 2.5e-07, 'samples': 384, 'steps': 1, 'loss/train': 1.098493069410324}
01/26/2022 19:51:32 - INFO - codeparrot_training - Step 2: {'lr': 5e-07, 'samples': 576, 'steps': 2, 'loss/train': 1.4475705027580261}
01/26/2022 19:51:35 - INFO - codeparrot_training - Step 3: {'lr': 7.5e-07, 'samples': 768, 'steps': 3, 'loss/train': 0.7283393740653992}
01/26/2022 19:51:38 - INFO - codeparrot_training - Step 4: {'lr': 1e-06, 'samples': 960, 'steps': 4, 'loss/train': 0.8016244769096375}
01/26/2022 19:51:41 - INFO - codeparrot_training - Step 5: {'lr': 1.25e-06, 'samples': 1152, 'steps': 5, 'loss/train': 1.1862211525440216}
01/26/2022 19:51:44 - INFO - codeparrot_training - Step 6: {'lr': 1.5e-06, 'samples': 1344, 'steps': 6, 'loss/train': 0.9115740358829498}
01/26/2022 19:51:47 - INFO - codeparrot_training - Step 7: {'lr': 1.75e-06, 'samples': 1536, 'steps': 7, 'loss/train': 0.7556531131267548}
01/26/2022 19:51:53 - INFO - codeparrot_training - Step 8: {'lr': 2e-06, 'samples': 1728, 'steps': 8, 'loss/train': 1.2928651571273804}
01/26/2022 19:51:56 - INFO - codeparrot_training - Step 9: {'lr': 2.25e-06, 'samples': 1920, 'steps': 9, 'loss/train': 1.7122818231582642}
01/26/2022 19:51:59 - INFO - codeparrot_training - Step 10: {'lr': 2.5e-06, 'samples': 2112, 'steps': 10, 'loss/train': 0.66591677069664}
01/26/2022 19:52:02 - INFO - codeparrot_training - Step 11: {'lr': 2.75e-06, 'samples': 2304, 'steps': 11, 'loss/train': 0.45707841217517853}
01/26/2022 19:52:06 - INFO - codeparrot_training - Step 12: {'lr': 3e-06, 'samples': 2496, 'steps': 12, 'loss/train': 1.0142653584480286}
01/26/2022 19:52:09 - INFO - codeparrot_training - Step 13: {'lr': 3.25e-06, 'samples': 2688, 'steps': 13, 'loss/train': 1.2159733772277832}
01/26/2022 19:52:12 - INFO - codeparrot_training - Step 14: {'lr': 3.5e-06, 'samples': 2880, 'steps': 14, 'loss/train': 0.8099969029426575}
01/26/2022 19:52:15 - INFO - codeparrot_training - Step 15: {'lr': 3.75e-06, 'samples': 3072, 'steps': 15, 'loss/train': 0.9018677473068237}
01/26/2022 19:52:18 - INFO - codeparrot_training - Step 16: {'lr': 4e-06, 'samples': 3264, 'steps': 16, 'loss/train': 1.4756002128124237}
01/26/2022 19:52:23 - INFO - codeparrot_training - Step 17: {'lr': 4.250000000000001e-06, 'samples': 3456, 'steps': 17, 'loss/train': 0.8930210173130035}
01/26/2022 19:52:26 - INFO - codeparrot_training - Step 18: {'lr': 4.5e-06, 'samples': 3648, 'steps': 18, 'loss/train': 1.3594367802143097}
01/26/2022 19:52:29 - INFO - codeparrot_training - Step 19: {'lr': 4.75e-06, 'samples': 3840, 'steps': 19, 'loss/train': 0.9841994047164917}
01/26/2022 19:52:32 - INFO - codeparrot_training - Step 20: {'lr': 5e-06, 'samples': 4032, 'steps': 20, 'loss/train': 0.5345936715602875}
01/26/2022 19:52:35 - INFO - codeparrot_training - Step 21: {'lr': 5.2500000000000006e-06, 'samples': 4224, 'steps': 21, 'loss/train': 0.637407973408699}
01/26/2022 19:52:38 - INFO - codeparrot_training - Step 22: {'lr': 5.5e-06, 'samples': 4416, 'steps': 22, 'loss/train': 1.2250346839427948}
01/26/2022 19:52:41 - INFO - codeparrot_training - Step 23: {'lr': 5.75e-06, 'samples': 4608, 'steps': 23, 'loss/train': 1.0428715646266937}
01/26/2022 19:52:45 - INFO - codeparrot_training - Step 24: {'lr': 6e-06, 'samples': 4800, 'steps': 24, 'loss/train': 0.9611641466617584}
01/26/2022 19:52:50 - INFO - codeparrot_training - Step 25: {'lr': 6.25e-06, 'samples': 4992, 'steps': 25, 'loss/train': 1.2642099559307098}
01/26/2022 19:52:53 - INFO - codeparrot_training - Step 26: {'lr': 6.5e-06, 'samples': 5184, 'steps': 26, 'loss/train': 0.38639961183071136}
01/26/2022 19:52:56 - INFO - codeparrot_training - Step 27: {'lr': 6.75e-06, 'samples': 5376, 'steps': 27, 'loss/train': 1.3766223192214966}
01/26/2022 19:52:59 - INFO - codeparrot_training - Step 28: {'lr': 7e-06, 'samples': 5568, 'steps': 28, 'loss/train': 1.0789436995983124}
01/26/2022 19:53:02 - INFO - codeparrot_training - Step 29: {'lr': 7.250000000000001e-06, 'samples': 5760, 'steps': 29, 'loss/train': 1.0117962956428528}
01/26/2022 19:53:05 - INFO - codeparrot_training - Step 30: {'lr': 7.5e-06, 'samples': 5952, 'steps': 30, 'loss/train': 0.4729790836572647}
01/26/2022 19:53:08 - INFO - codeparrot_training - Step 31: {'lr': 7.75e-06, 'samples': 6144, 'steps': 31, 'loss/train': 1.2678570449352264}
01/26/2022 19:53:12 - INFO - codeparrot_training - Step 32: {'lr': 8e-06, 'samples': 6336, 'steps': 32, 'loss/train': 1.0175764560699463}
01/26/2022 19:53:15 - INFO - codeparrot_training - Step 33: {'lr': 8.25e-06, 'samples': 6528, 'steps': 33, 'loss/train': 0.6667446792125702}
01/26/2022 19:53:19 - INFO - codeparrot_training - Step 34: {'lr': 8.500000000000002e-06, 'samples': 6720, 'steps': 34, 'loss/train': 1.0679191052913666}
01/26/2022 19:53:22 - INFO - codeparrot_training - Step 35: {'lr': 8.750000000000001e-06, 'samples': 6912, 'steps': 35, 'loss/train': 0.7404331415891647}
01/26/2022 19:53:25 - INFO - codeparrot_training - Step 36: {'lr': 9e-06, 'samples': 7104, 'steps': 36, 'loss/train': 1.0515027344226837}
01/26/2022 19:53:29 - INFO - codeparrot_training - Step 37: {'lr': 9.25e-06, 'samples': 7296, 'steps': 37, 'loss/train': 1.3058996200561523}
01/26/2022 19:53:32 - INFO - codeparrot_training - Step 38: {'lr': 9.5e-06, 'samples': 7488, 'steps': 38, 'loss/train': 1.1324551105499268}
01/26/2022 19:53:35 - INFO - codeparrot_training - Step 39: {'lr': 9.75e-06, 'samples': 7680, 'steps': 39, 'loss/train': 0.8902977705001831}
01/26/2022 19:53:38 - INFO - codeparrot_training - Step 40: {'lr': 1e-05, 'samples': 7872, 'steps': 40, 'loss/train': 0.5506274253129959}
01/26/2022 19:53:41 - INFO - codeparrot_training - Step 41: {'lr': 1.025e-05, 'samples': 8064, 'steps': 41, 'loss/train': 0.953863263130188}
01/26/2022 19:53:44 - INFO - codeparrot_training - Step 42: {'lr': 1.0500000000000001e-05, 'samples': 8256, 'steps': 42, 'loss/train': 1.2091366946697235}
01/26/2022 19:53:49 - INFO - codeparrot_training - Step 43: {'lr': 1.0749999999999999e-05, 'samples': 8448, 'steps': 43, 'loss/train': 0.6340087652206421}
01/26/2022 19:53:52 - INFO - codeparrot_training - Step 44: {'lr': 1.1e-05, 'samples': 8640, 'steps': 44, 'loss/train': 1.1415702402591705}
01/26/2022 19:53:55 - INFO - codeparrot_training - Step 45: {'lr': 1.1249999999999999e-05, 'samples': 8832, 'steps': 45, 'loss/train': 1.1699762642383575}
01/26/2022 19:53:58 - INFO - codeparrot_training - Step 46: {'lr': 1.15e-05, 'samples': 9024, 'steps': 46, 'loss/train': 0.519690066576004}
01/26/2022 19:54:01 - INFO - codeparrot_training - Step 47: {'lr': 1.1750000000000001e-05, 'samples': 9216, 'steps': 47, 'loss/train': 0.9116686284542084}
01/26/2022 19:54:04 - INFO - codeparrot_training - Step 48: {'lr': 1.2e-05, 'samples': 9408, 'steps': 48, 'loss/train': 0.8386119604110718}
01/26/2022 19:54:08 - INFO - codeparrot_training - Step 49: {'lr': 1.2250000000000001e-05, 'samples': 9600, 'steps': 49, 'loss/train': 1.3592749536037445}
01/26/2022 19:54:11 - INFO - codeparrot_training - Step 50: {'lr': 1.25e-05, 'samples': 9792, 'steps': 50, 'loss/train': 1.262882262468338}
01/26/2022 19:54:14 - INFO - codeparrot_training - Step 51: {'lr': 1.275e-05, 'samples': 9984, 'steps': 51, 'loss/train': 0.8111315667629242}
01/26/2022 19:54:19 - INFO - codeparrot_training - Step 52: {'lr': 1.3e-05, 'samples': 10176, 'steps': 52, 'loss/train': 0.9616953134536743}
01/26/2022 19:54:22 - INFO - codeparrot_training - Step 53: {'lr': 1.325e-05, 'samples': 10368, 'steps': 53, 'loss/train': 0.6483338177204132}
01/26/2022 19:54:26 - INFO - codeparrot_training - Step 54: {'lr': 1.35e-05, 'samples': 10560, 'steps': 54, 'loss/train': 1.130149930715561}
01/26/2022 19:54:29 - INFO - codeparrot_training - Step 55: {'lr': 1.375e-05, 'samples': 10752, 'steps': 55, 'loss/train': 0.6978979557752609}
01/26/2022 19:54:32 - INFO - codeparrot_training - Step 56: {'lr': 1.4e-05, 'samples': 10944, 'steps': 56, 'loss/train': 0.8231143355369568}
01/26/2022 19:54:35 - INFO - codeparrot_training - Step 57: {'lr': 1.425e-05, 'samples': 11136, 'steps': 57, 'loss/train': 1.9858249425888062}
01/26/2022 19:54:38 - INFO - codeparrot_training - Step 58: {'lr': 1.4500000000000002e-05, 'samples': 11328, 'steps': 58, 'loss/train': 1.0599297881126404}
01/26/2022 19:54:41 - INFO - codeparrot_training - Step 59: {'lr': 1.475e-05, 'samples': 11520, 'steps': 59, 'loss/train': 0.8293282985687256}
01/26/2022 19:54:44 - INFO - codeparrot_training - Step 60: {'lr': 1.5e-05, 'samples': 11712, 'steps': 60, 'loss/train': 0.4658845067024231}
01/26/2022 19:54:49 - INFO - codeparrot_training - Step 61: {'lr': 1.525e-05, 'samples': 11904, 'steps': 61, 'loss/train': 0.7947032153606415}
01/26/2022 19:54:52 - INFO - codeparrot_training - Step 62: {'lr': 1.55e-05, 'samples': 12096, 'steps': 62, 'loss/train': 0.481005996465683}
01/26/2022 19:54:56 - INFO - codeparrot_training - Step 63: {'lr': 1.575e-05, 'samples': 12288, 'steps': 63, 'loss/train': 0.9705469608306885}
01/26/2022 19:54:59 - INFO - codeparrot_training - Step 64: {'lr': 1.6e-05, 'samples': 12480, 'steps': 64, 'loss/train': 1.0617777407169342}
01/26/2022 19:55:02 - INFO - codeparrot_training - Step 65: {'lr': 1.6250000000000002e-05, 'samples': 12672, 'steps': 65, 'loss/train': 1.2117932438850403}
01/26/2022 19:55:05 - INFO - codeparrot_training - Step 66: {'lr': 1.65e-05, 'samples': 12864, 'steps': 66, 'loss/train': 1.173624336719513}
01/26/2022 19:55:08 - INFO - codeparrot_training - Step 67: {'lr': 1.675e-05, 'samples': 13056, 'steps': 67, 'loss/train': 1.6627777218818665}
01/26/2022 19:55:11 - INFO - codeparrot_training - Step 68: {'lr': 1.7000000000000003e-05, 'samples': 13248, 'steps': 68, 'loss/train': 1.043434739112854}
01/26/2022 19:55:14 - INFO - codeparrot_training - Step 69: {'lr': 1.7250000000000003e-05, 'samples': 13440, 'steps': 69, 'loss/train': 1.2198073267936707}
01/26/2022 19:55:19 - INFO - codeparrot_training - Step 70: {'lr': 1.7500000000000002e-05, 'samples': 13632, 'steps': 70, 'loss/train': 0.9751145839691162}
01/26/2022 19:55:22 - INFO - codeparrot_training - Step 71: {'lr': 1.7749999999999998e-05, 'samples': 13824, 'steps': 71, 'loss/train': 0.7835906445980072}
01/26/2022 19:55:25 - INFO - codeparrot_training - Step 72: {'lr': 1.8e-05, 'samples': 14016, 'steps': 72, 'loss/train': 0.7807894349098206}
01/26/2022 19:55:28 - INFO - codeparrot_training - Step 73: {'lr': 1.825e-05, 'samples': 14208, 'steps': 73, 'loss/train': 1.3491949439048767}
01/26/2022 19:55:31 - INFO - codeparrot_training - Step 74: {'lr': 1.85e-05, 'samples': 14400, 'steps': 74, 'loss/train': 1.1515992879867554}
01/26/2022 19:55:34 - INFO - codeparrot_training - Step 75: {'lr': 1.875e-05, 'samples': 14592, 'steps': 75, 'loss/train': 1.0031714737415314}
01/26/2022 19:55:38 - INFO - codeparrot_training - Step 76: {'lr': 1.9e-05, 'samples': 14784, 'steps': 76, 'loss/train': 1.1666240394115448}
01/26/2022 19:55:41 - INFO - codeparrot_training - Step 77: {'lr': 1.925e-05, 'samples': 14976, 'steps': 77, 'loss/train': 0.4633499085903168}
01/26/2022 19:55:44 - INFO - codeparrot_training - Step 78: {'lr': 1.95e-05, 'samples': 15168, 'steps': 78, 'loss/train': 1.052702933549881}
01/26/2022 19:55:49 - INFO - codeparrot_training - Step 79: {'lr': 1.975e-05, 'samples': 15360, 'steps': 79, 'loss/train': 0.7855435609817505}
01/26/2022 19:55:52 - INFO - codeparrot_training - Step 80: {'lr': 2e-05, 'samples': 15552, 'steps': 80, 'loss/train': 1.2250367403030396}
01/26/2022 19:55:56 - INFO - codeparrot_training - Step 81: {'lr': 2.025e-05, 'samples': 15744, 'steps': 81, 'loss/train': 0.906893402338028}
01/26/2022 19:55:59 - INFO - codeparrot_training - Step 82: {'lr': 2.05e-05, 'samples': 15936, 'steps': 82, 'loss/train': 0.9534100592136383}
01/26/2022 19:56:02 - INFO - codeparrot_training - Step 83: {'lr': 2.0750000000000003e-05, 'samples': 16128, 'steps': 83, 'loss/train': 1.1026679277420044}
01/26/2022 19:56:05 - INFO - codeparrot_training - Step 84: {'lr': 2.1000000000000002e-05, 'samples': 16320, 'steps': 84, 'loss/train': 0.35788920521736145}
01/26/2022 19:56:08 - INFO - codeparrot_training - Step 85: {'lr': 2.125e-05, 'samples': 16512, 'steps': 85, 'loss/train': 1.1877082586288452}
01/26/2022 19:56:11 - INFO - codeparrot_training - Step 86: {'lr': 2.1499999999999997e-05, 'samples': 16704, 'steps': 86, 'loss/train': 0.8088233470916748}
01/26/2022 19:56:16 - INFO - codeparrot_training - Step 87: {'lr': 2.175e-05, 'samples': 16896, 'steps': 87, 'loss/train': 1.3929831683635712}
01/26/2022 19:56:19 - INFO - codeparrot_training - Step 88: {'lr': 2.2e-05, 'samples': 17088, 'steps': 88, 'loss/train': 0.47710342705249786}
01/26/2022 19:56:22 - INFO - codeparrot_training - Step 89: {'lr': 2.225e-05, 'samples': 17280, 'steps': 89, 'loss/train': 0.763294905424118}
01/26/2022 19:56:25 - INFO - codeparrot_training - Step 90: {'lr': 2.2499999999999998e-05, 'samples': 17472, 'steps': 90, 'loss/train': 0.9009081423282623}
01/26/2022 19:56:28 - INFO - codeparrot_training - Step 91: {'lr': 2.275e-05, 'samples': 17664, 'steps': 91, 'loss/train': 1.0131529569625854}
01/26/2022 19:56:31 - INFO - codeparrot_training - Step 92: {'lr': 2.3e-05, 'samples': 17856, 'steps': 92, 'loss/train': 1.1171832382678986}
01/26/2022 19:56:34 - INFO - codeparrot_training - Step 93: {'lr': 2.325e-05, 'samples': 18048, 'steps': 93, 'loss/train': 1.1443756520748138}
01/26/2022 19:56:37 - INFO - codeparrot_training - Step 94: {'lr': 2.3500000000000002e-05, 'samples': 18240, 'steps': 94, 'loss/train': 1.0366325676441193}
01/26/2022 19:56:41 - INFO - codeparrot_training - Step 95: {'lr': 2.375e-05, 'samples': 18432, 'steps': 95, 'loss/train': 1.444101244211197}
01/26/2022 19:56:45 - INFO - codeparrot_training - Step 96: {'lr': 2.4e-05, 'samples': 18624, 'steps': 96, 'loss/train': 0.979115903377533}
01/26/2022 19:56:48 - INFO - codeparrot_training - Step 97: {'lr': 2.425e-05, 'samples': 18816, 'steps': 97, 'loss/train': 1.0008891820907593}
01/26/2022 19:56:51 - INFO - codeparrot_training - Step 98: {'lr': 2.4500000000000003e-05, 'samples': 19008, 'steps': 98, 'loss/train': 1.0458069741725922}
01/26/2022 19:56:54 - INFO - codeparrot_training - Step 99: {'lr': 2.4750000000000002e-05, 'samples': 19200, 'steps': 99, 'loss/train': 0.7854014039039612}
01/26/2022 19:56:58 - INFO - codeparrot_training - Step 100: {'lr': 2.5e-05, 'samples': 19392, 'steps': 100, 'loss/train': 0.9934608042240143}
01/26/2022 19:57:01 - INFO - codeparrot_training - Step 101: {'lr': 2.525e-05, 'samples': 19584, 'steps': 101, 'loss/train': 0.9963217377662659}
01/26/2022 19:57:04 - INFO - codeparrot_training - Step 102: {'lr': 2.55e-05, 'samples': 19776, 'steps': 102, 'loss/train': 0.7054498493671417}
01/26/2022 19:57:07 - INFO - codeparrot_training - Step 103: {'lr': 2.575e-05, 'samples': 19968, 'steps': 103, 'loss/train': 1.1544867753982544}
01/26/2022 19:57:10 - INFO - codeparrot_training - Step 104: {'lr': 2.6e-05, 'samples': 20160, 'steps': 104, 'loss/train': 0.8630503714084625}
01/26/2022 19:57:14 - INFO - codeparrot_training - Step 105: {'lr': 2.625e-05, 'samples': 20352, 'steps': 105, 'loss/train': 1.3325937688350677}
01/26/2022 19:57:18 - INFO - codeparrot_training - Step 106: {'lr': 2.65e-05, 'samples': 20544, 'steps': 106, 'loss/train': 0.945432722568512}
01/26/2022 19:57:21 - INFO - codeparrot_training - Step 107: {'lr': 2.675e-05, 'samples': 20736, 'steps': 107, 'loss/train': 0.8983843624591827}
01/26/2022 19:57:24 - INFO - codeparrot_training - Step 108: {'lr': 2.7e-05, 'samples': 20928, 'steps': 108, 'loss/train': 0.8008756935596466}
01/26/2022 19:57:27 - INFO - codeparrot_training - Step 109: {'lr': 2.725e-05, 'samples': 21120, 'steps': 109, 'loss/train': 1.5613712668418884}
01/26/2022 19:57:30 - INFO - codeparrot_training - Step 110: {'lr': 2.75e-05, 'samples': 21312, 'steps': 110, 'loss/train': 0.5566383004188538}
01/26/2022 19:57:33 - INFO - codeparrot_training - Step 111: {'lr': 2.775e-05, 'samples': 21504, 'steps': 111, 'loss/train': 0.6587662249803543}
01/26/2022 19:57:36 - INFO - codeparrot_training - Step 112: {'lr': 2.8e-05, 'samples': 21696, 'steps': 112, 'loss/train': 0.9899620413780212}
01/26/2022 19:57:39 - INFO - codeparrot_training - Step 113: {'lr': 2.8250000000000002e-05, 'samples': 21888, 'steps': 113, 'loss/train': 1.196788340806961}
01/26/2022 19:57:45 - INFO - codeparrot_training - Step 114: {'lr': 2.85e-05, 'samples': 22080, 'steps': 114, 'loss/train': 1.3232413530349731}
01/26/2022 19:57:48 - INFO - codeparrot_training - Step 115: {'lr': 2.875e-05, 'samples': 22272, 'steps': 115, 'loss/train': 1.111331284046173}
01/26/2022 19:57:51 - INFO - codeparrot_training - Step 116: {'lr': 2.9000000000000004e-05, 'samples': 22464, 'steps': 116, 'loss/train': 1.4940314590930939}
01/26/2022 19:57:54 - INFO - codeparrot_training - Step 117: {'lr': 2.9250000000000003e-05, 'samples': 22656, 'steps': 117, 'loss/train': 0.9729132950305939}
01/26/2022 19:57:57 - INFO - codeparrot_training - Step 118: {'lr': 2.95e-05, 'samples': 22848, 'steps': 118, 'loss/train': 1.5360177755355835}
01/26/2022 19:58:00 - INFO - codeparrot_training - Step 119: {'lr': 2.9749999999999998e-05, 'samples': 23040, 'steps': 119, 'loss/train': 1.4864407181739807}
01/26/2022 19:58:04 - INFO - codeparrot_training - Step 120: {'lr': 3e-05, 'samples': 23232, 'steps': 120, 'loss/train': 1.0524914860725403}
01/26/2022 19:58:07 - INFO - codeparrot_training - Step 121: {'lr': 3.025e-05, 'samples': 23424, 'steps': 121, 'loss/train': 0.8903714418411255}
01/26/2022 19:58:11 - INFO - codeparrot_training - Step 122: {'lr': 3.05e-05, 'samples': 23616, 'steps': 122, 'loss/train': 0.9781338572502136}
01/26/2022 19:58:15 - INFO - codeparrot_training - Step 123: {'lr': 3.075e-05, 'samples': 23808, 'steps': 123, 'loss/train': 0.5676954388618469}
01/26/2022 19:58:18 - INFO - codeparrot_training - Step 124: {'lr': 3.1e-05, 'samples': 24000, 'steps': 124, 'loss/train': 0.6250019073486328}
01/26/2022 19:58:21 - INFO - codeparrot_training - Step 125: {'lr': 3.125e-05, 'samples': 24192, 'steps': 125, 'loss/train': 1.1279764473438263}
01/26/2022 19:58:24 - INFO - codeparrot_training - Step 126: {'lr': 3.15e-05, 'samples': 24384, 'steps': 126, 'loss/train': 0.6120450049638748}
01/26/2022 19:58:27 - INFO - codeparrot_training - Step 127: {'lr': 3.175e-05, 'samples': 24576, 'steps': 127, 'loss/train': 0.6227641850709915}
01/26/2022 19:58:30 - INFO - codeparrot_training - Step 128: {'lr': 3.2e-05, 'samples': 24768, 'steps': 128, 'loss/train': 1.412695974111557}
01/26/2022 19:58:33 - INFO - codeparrot_training - Step 129: {'lr': 3.2250000000000005e-05, 'samples': 24960, 'steps': 129, 'loss/train': 1.011093556880951}
01/26/2022 19:58:36 - INFO - codeparrot_training - Step 130: {'lr': 3.2500000000000004e-05, 'samples': 25152, 'steps': 130, 'loss/train': 0.9026037454605103}
01/26/2022 19:58:42 - INFO - codeparrot_training - Step 131: {'lr': 3.275e-05, 'samples': 25344, 'steps': 131, 'loss/train': 1.2298235893249512}
01/26/2022 19:58:45 - INFO - codeparrot_training - Step 132: {'lr': 3.3e-05, 'samples': 25536, 'steps': 132, 'loss/train': 1.3381428122520447}
01/26/2022 19:58:48 - INFO - codeparrot_training - Step 133: {'lr': 3.325e-05, 'samples': 25728, 'steps': 133, 'loss/train': 1.043137639760971}
01/26/2022 19:58:51 - INFO - codeparrot_training - Step 134: {'lr': 3.35e-05, 'samples': 25920, 'steps': 134, 'loss/train': 1.189176231622696}
01/26/2022 19:58:54 - INFO - codeparrot_training - Step 135: {'lr': 3.375e-05, 'samples': 26112, 'steps': 135, 'loss/train': 1.3090053498744965}
01/26/2022 19:58:57 - INFO - codeparrot_training - Step 136: {'lr': 3.4000000000000007e-05, 'samples': 26304, 'steps': 136, 'loss/train': 1.3251706659793854}
01/26/2022 19:59:00 - INFO - codeparrot_training - Step 137: {'lr': 3.4250000000000006e-05, 'samples': 26496, 'steps': 137, 'loss/train': 0.9288864433765411}
01/26/2022 19:59:04 - INFO - codeparrot_training - Step 138: {'lr': 3.4500000000000005e-05, 'samples': 26688, 'steps': 138, 'loss/train': 1.0418122708797455}
01/26/2022 19:59:07 - INFO - codeparrot_training - Step 139: {'lr': 3.4750000000000004e-05, 'samples': 26880, 'steps': 139, 'loss/train': 0.9203312695026398}
01/26/2022 19:59:11 - INFO - codeparrot_training - Step 140: {'lr': 3.5000000000000004e-05, 'samples': 27072, 'steps': 140, 'loss/train': 0.9538873136043549}
01/26/2022 19:59:14 - INFO - codeparrot_training - Step 141: {'lr': 3.5249999999999996e-05, 'samples': 27264, 'steps': 141, 'loss/train': 0.9224672913551331}
01/26/2022 19:59:18 - INFO - codeparrot_training - Step 142: {'lr': 3.5499999999999996e-05, 'samples': 27456, 'steps': 142, 'loss/train': 0.6625053584575653}
01/26/2022 19:59:21 - INFO - codeparrot_training - Step 143: {'lr': 3.5749999999999995e-05, 'samples': 27648, 'steps': 143, 'loss/train': 0.9124108850955963}
01/26/2022 19:59:24 - INFO - codeparrot_training - Step 144: {'lr': 3.6e-05, 'samples': 27840, 'steps': 144, 'loss/train': 0.914698988199234}
01/26/2022 19:59:27 - INFO - codeparrot_training - Step 145: {'lr': 3.625e-05, 'samples': 28032, 'steps': 145, 'loss/train': 1.015060544013977}
01/26/2022 19:59:30 - INFO - codeparrot_training - Step 146: {'lr': 3.65e-05, 'samples': 28224, 'steps': 146, 'loss/train': 0.6130726039409637}
01/26/2022 19:59:33 - INFO - codeparrot_training - Step 147: {'lr': 3.675e-05, 'samples': 28416, 'steps': 147, 'loss/train': 1.1192330718040466}
01/26/2022 19:59:36 - INFO - codeparrot_training - Step 148: {'lr': 3.7e-05, 'samples': 28608, 'steps': 148, 'loss/train': 0.7602757215499878}
01/26/2022 19:59:41 - INFO - codeparrot_training - Step 149: {'lr': 3.725e-05, 'samples': 28800, 'steps': 149, 'loss/train': 0.9535870850086212}
01/26/2022 19:59:44 - INFO - codeparrot_training - Step 150: {'lr': 3.75e-05, 'samples': 28992, 'steps': 150, 'loss/train': 0.4617677181959152}
01/26/2022 19:59:47 - INFO - codeparrot_training - Step 151: {'lr': 3.775e-05, 'samples': 29184, 'steps': 151, 'loss/train': 1.0293497443199158}
01/26/2022 19:59:50 - INFO - codeparrot_training - Step 152: {'lr': 3.8e-05, 'samples': 29376, 'steps': 152, 'loss/train': 0.8444517552852631}
01/26/2022 19:59:53 - INFO - codeparrot_training - Step 153: {'lr': 3.825e-05, 'samples': 29568, 'steps': 153, 'loss/train': 0.5775855481624603}
01/26/2022 19:59:56 - INFO - codeparrot_training - Step 154: {'lr': 3.85e-05, 'samples': 29760, 'steps': 154, 'loss/train': 1.0447262227535248}
01/26/2022 20:00:00 - INFO - codeparrot_training - Step 155: {'lr': 3.875e-05, 'samples': 29952, 'steps': 155, 'loss/train': 1.02508082985878}
01/26/2022 20:00:03 - INFO - codeparrot_training - Step 156: {'lr': 3.9e-05, 'samples': 30144, 'steps': 156, 'loss/train': 0.9991528987884521}
01/26/2022 20:00:06 - INFO - codeparrot_training - Step 157: {'lr': 3.925e-05, 'samples': 30336, 'steps': 157, 'loss/train': 0.9459106028079987}
01/26/2022 20:00:11 - INFO - codeparrot_training - Step 158: {'lr': 3.95e-05, 'samples': 30528, 'steps': 158, 'loss/train': 0.8784597516059875}
01/26/2022 20:00:14 - INFO - codeparrot_training - Step 159: {'lr': 3.9750000000000004e-05, 'samples': 30720, 'steps': 159, 'loss/train': 0.8632999062538147}
01/26/2022 20:00:18 - INFO - codeparrot_training - Step 160: {'lr': 4e-05, 'samples': 30912, 'steps': 160, 'loss/train': 1.1137281954288483}
01/26/2022 20:00:21 - INFO - codeparrot_training - Step 161: {'lr': 4.025e-05, 'samples': 31104, 'steps': 161, 'loss/train': 0.729734480381012}
01/26/2022 20:00:24 - INFO - codeparrot_training - Step 162: {'lr': 4.05e-05, 'samples': 31296, 'steps': 162, 'loss/train': 0.4884277582168579}
01/26/2022 20:00:27 - INFO - codeparrot_training - Step 163: {'lr': 4.075e-05, 'samples': 31488, 'steps': 163, 'loss/train': 0.8900181949138641}
01/26/2022 20:00:30 - INFO - codeparrot_training - Step 164: {'lr': 4.1e-05, 'samples': 31680, 'steps': 164, 'loss/train': 0.6685371547937393}
01/26/2022 20:00:33 - INFO - codeparrot_training - Step 165: {'lr': 4.125e-05, 'samples': 31872, 'steps': 165, 'loss/train': 0.7850227653980255}
01/26/2022 20:00:36 - INFO - codeparrot_training - Step 166: {'lr': 4.1500000000000006e-05, 'samples': 32064, 'steps': 166, 'loss/train': 1.035389095544815}
01/26/2022 20:00:41 - INFO - codeparrot_training - Step 167: {'lr': 4.1750000000000005e-05, 'samples': 32256, 'steps': 167, 'loss/train': 1.4050793051719666}
01/26/2022 20:00:44 - INFO - codeparrot_training - Step 168: {'lr': 4.2000000000000004e-05, 'samples': 32448, 'steps': 168, 'loss/train': 0.31938090920448303}
01/26/2022 20:00:47 - INFO - codeparrot_training - Step 169: {'lr': 4.2250000000000004e-05, 'samples': 32640, 'steps': 169, 'loss/train': 0.9819863140583038}
01/26/2022 20:00:50 - INFO - codeparrot_training - Step 170: {'lr': 4.25e-05, 'samples': 32832, 'steps': 170, 'loss/train': 1.3573500216007233}
01/26/2022 20:00:53 - INFO - codeparrot_training - Step 171: {'lr': 4.275e-05, 'samples': 33024, 'steps': 171, 'loss/train': 0.6732581108808517}
01/26/2022 20:00:56 - INFO - codeparrot_training - Step 172: {'lr': 4.2999999999999995e-05, 'samples': 33216, 'steps': 172, 'loss/train': 1.555029273033142}
01/26/2022 20:01:00 - INFO - codeparrot_training - Step 173: {'lr': 4.325e-05, 'samples': 33408, 'steps': 173, 'loss/train': 1.3485970795154572}
01/26/2022 20:01:03 - INFO - codeparrot_training - Step 174: {'lr': 4.35e-05, 'samples': 33600, 'steps': 174, 'loss/train': 0.8738776445388794}
01/26/2022 20:01:06 - INFO - codeparrot_training - Step 175: {'lr': 4.375e-05, 'samples': 33792, 'steps': 175, 'loss/train': 0.763803094625473}
01/26/2022 20:01:10 - INFO - codeparrot_training - Step 176: {'lr': 4.4e-05, 'samples': 33984, 'steps': 176, 'loss/train': 0.7981584370136261}
01/26/2022 20:01:13 - INFO - codeparrot_training - Step 177: {'lr': 4.425e-05, 'samples': 34176, 'steps': 177, 'loss/train': 1.0527384281158447}
01/26/2022 20:01:16 - INFO - codeparrot_training - Step 178: {'lr': 4.45e-05, 'samples': 34368, 'steps': 178, 'loss/train': 1.026532530784607}
01/26/2022 20:01:19 - INFO - codeparrot_training - Step 179: {'lr': 4.475e-05, 'samples': 34560, 'steps': 179, 'loss/train': 1.1515803337097168}
01/26/2022 20:01:23 - INFO - codeparrot_training - Step 180: {'lr': 4.4999999999999996e-05, 'samples': 34752, 'steps': 180, 'loss/train': 0.6853055655956268}
01/26/2022 20:01:26 - INFO - codeparrot_training - Step 181: {'lr': 4.525e-05, 'samples': 34944, 'steps': 181, 'loss/train': 0.4972548186779022}
01/26/2022 20:01:29 - INFO - codeparrot_training - Step 182: {'lr': 4.55e-05, 'samples': 35136, 'steps': 182, 'loss/train': 0.8265936970710754}
01/26/2022 20:01:32 - INFO - codeparrot_training - Step 183: {'lr': 4.575e-05, 'samples': 35328, 'steps': 183, 'loss/train': 0.8314197063446045}
01/26/2022 20:01:37 - INFO - codeparrot_training - Step 184: {'lr': 4.6e-05, 'samples': 35520, 'steps': 184, 'loss/train': 1.1978421807289124}
01/26/2022 20:01:40 - INFO - codeparrot_training - Step 185: {'lr': 4.625e-05, 'samples': 35712, 'steps': 185, 'loss/train': 1.3686970174312592}
01/26/2022 20:01:43 - INFO - codeparrot_training - Step 186: {'lr': 4.65e-05, 'samples': 35904, 'steps': 186, 'loss/train': 0.7919806838035583}
01/26/2022 20:01:46 - INFO - codeparrot_training - Step 187: {'lr': 4.675e-05, 'samples': 36096, 'steps': 187, 'loss/train': 0.9425971806049347}
01/26/2022 20:01:50 - INFO - codeparrot_training - Step 188: {'lr': 4.7000000000000004e-05, 'samples': 36288, 'steps': 188, 'loss/train': 1.0793277025222778}
01/26/2022 20:01:53 - INFO - codeparrot_training - Step 189: {'lr': 4.725e-05, 'samples': 36480, 'steps': 189, 'loss/train': 0.9649386405944824}
01/26/2022 20:01:56 - INFO - codeparrot_training - Step 190: {'lr': 4.75e-05, 'samples': 36672, 'steps': 190, 'loss/train': 1.1041245460510254}
01/26/2022 20:01:59 - INFO - codeparrot_training - Step 191: {'lr': 4.775e-05, 'samples': 36864, 'steps': 191, 'loss/train': 0.9970739185810089}
01/26/2022 20:02:02 - INFO - codeparrot_training - Step 192: {'lr': 4.8e-05, 'samples': 37056, 'steps': 192, 'loss/train': 1.1216506361961365}
01/26/2022 20:02:07 - INFO - codeparrot_training - Step 193: {'lr': 4.825e-05, 'samples': 37248, 'steps': 193, 'loss/train': 0.9117452502250671}
01/26/2022 20:02:10 - INFO - codeparrot_training - Step 194: {'lr': 4.85e-05, 'samples': 37440, 'steps': 194, 'loss/train': 0.947819709777832}
01/26/2022 20:02:13 - INFO - codeparrot_training - Step 195: {'lr': 4.8750000000000006e-05, 'samples': 37632, 'steps': 195, 'loss/train': 0.9705623388290405}
01/26/2022 20:02:16 - INFO - codeparrot_training - Step 196: {'lr': 4.9000000000000005e-05, 'samples': 37824, 'steps': 196, 'loss/train': 0.750628262758255}
01/26/2022 20:02:19 - INFO - codeparrot_training - Step 197: {'lr': 4.9250000000000004e-05, 'samples': 38016, 'steps': 197, 'loss/train': 1.0953399538993835}
01/26/2022 20:02:22 - INFO - codeparrot_training - Step 198: {'lr': 4.9500000000000004e-05, 'samples': 38208, 'steps': 198, 'loss/train': 0.8188500702381134}
01/26/2022 20:02:25 - INFO - codeparrot_training - Step 199: {'lr': 4.975e-05, 'samples': 38400, 'steps': 199, 'loss/train': 1.3112203180789948}
01/26/2022 20:02:29 - INFO - codeparrot_training - Step 200: {'lr': 5e-05, 'samples': 38592, 'steps': 200, 'loss/train': 1.0671295523643494}
01/26/2022 20:02:32 - INFO - codeparrot_training - Step 201: {'lr': 5.025e-05, 'samples': 38784, 'steps': 201, 'loss/train': 1.0156736969947815}
01/26/2022 20:02:37 - INFO - codeparrot_training - Step 202: {'lr': 5.05e-05, 'samples': 38976, 'steps': 202, 'loss/train': 1.6629311442375183}
01/26/2022 20:02:41 - INFO - codeparrot_training - Step 203: {'lr': 5.075000000000001e-05, 'samples': 39168, 'steps': 203, 'loss/train': 1.3752846121788025}
01/26/2022 20:02:44 - INFO - codeparrot_training - Step 204: {'lr': 5.1e-05, 'samples': 39360, 'steps': 204, 'loss/train': 1.1547268331050873}
01/26/2022 20:02:47 - INFO - codeparrot_training - Step 205: {'lr': 5.125e-05, 'samples': 39552, 'steps': 205, 'loss/train': 0.9555048644542694}
01/26/2022 20:02:50 - INFO - codeparrot_training - Step 206: {'lr': 5.15e-05, 'samples': 39744, 'steps': 206, 'loss/train': 1.1384654939174652}
01/26/2022 20:02:53 - INFO - codeparrot_training - Step 207: {'lr': 5.175e-05, 'samples': 39936, 'steps': 207, 'loss/train': 1.1253110468387604}
01/26/2022 20:02:56 - INFO - codeparrot_training - Step 208: {'lr': 5.2e-05, 'samples': 40128, 'steps': 208, 'loss/train': 0.8209310173988342}
01/26/2022 20:02:59 - INFO - codeparrot_training - Step 209: {'lr': 5.2249999999999996e-05, 'samples': 40320, 'steps': 209, 'loss/train': 2.520827293395996}
01/26/2022 20:03:03 - INFO - codeparrot_training - Step 210: {'lr': 5.25e-05, 'samples': 40512, 'steps': 210, 'loss/train': 1.0695734024047852}
01/26/2022 20:03:07 - INFO - codeparrot_training - Step 211: {'lr': 5.275e-05, 'samples': 40704, 'steps': 211, 'loss/train': 0.47925974428653717}
01/26/2022 20:03:10 - INFO - codeparrot_training - Step 212: {'lr': 5.3e-05, 'samples': 40896, 'steps': 212, 'loss/train': 1.0081109404563904}
01/26/2022 20:03:13 - INFO - codeparrot_training - Step 213: {'lr': 5.325e-05, 'samples': 41088, 'steps': 213, 'loss/train': 0.9470605552196503}
01/26/2022 20:03:16 - INFO - codeparrot_training - Step 214: {'lr': 5.35e-05, 'samples': 41280, 'steps': 214, 'loss/train': 0.7130788117647171}
01/26/2022 20:03:19 - INFO - codeparrot_training - Step 215: {'lr': 5.375e-05, 'samples': 41472, 'steps': 215, 'loss/train': 1.2014243602752686}
01/26/2022 20:03:23 - INFO - codeparrot_training - Step 216: {'lr': 5.4e-05, 'samples': 41664, 'steps': 216, 'loss/train': 1.1142117083072662}
01/26/2022 20:03:26 - INFO - codeparrot_training - Step 217: {'lr': 5.4250000000000004e-05, 'samples': 41856, 'steps': 217, 'loss/train': 0.5922805666923523}
01/26/2022 20:03:29 - INFO - codeparrot_training - Step 218: {'lr': 5.45e-05, 'samples': 42048, 'steps': 218, 'loss/train': 0.5249330252408981}
01/26/2022 20:03:32 - INFO - codeparrot_training - Step 219: {'lr': 5.475e-05, 'samples': 42240, 'steps': 219, 'loss/train': 1.1915513277053833}
01/26/2022 20:03:37 - INFO - codeparrot_training - Step 220: {'lr': 5.5e-05, 'samples': 42432, 'steps': 220, 'loss/train': 0.35965168476104736}
01/26/2022 20:03:40 - INFO - codeparrot_training - Step 221: {'lr': 5.525e-05, 'samples': 42624, 'steps': 221, 'loss/train': 0.8307751715183258}
01/26/2022 20:03:43 - INFO - codeparrot_training - Step 222: {'lr': 5.55e-05, 'samples': 42816, 'steps': 222, 'loss/train': 0.5622230172157288}
01/26/2022 20:03:46 - INFO - codeparrot_training - Step 223: {'lr': 5.575e-05, 'samples': 43008, 'steps': 223, 'loss/train': 2.1728580594062805}
01/26/2022 20:03:49 - INFO - codeparrot_training - Step 224: {'lr': 5.6e-05, 'samples': 43200, 'steps': 224, 'loss/train': 1.3944549858570099}
01/26/2022 20:03:52 - INFO - codeparrot_training - Step 225: {'lr': 5.6250000000000005e-05, 'samples': 43392, 'steps': 225, 'loss/train': 1.1415264308452606}
01/26/2022 20:03:55 - INFO - codeparrot_training - Step 226: {'lr': 5.6500000000000005e-05, 'samples': 43584, 'steps': 226, 'loss/train': 1.3640662729740143}
01/26/2022 20:03:59 - INFO - codeparrot_training - Step 227: {'lr': 5.6750000000000004e-05, 'samples': 43776, 'steps': 227, 'loss/train': 1.2744399905204773}
01/26/2022 20:04:02 - INFO - codeparrot_training - Step 228: {'lr': 5.7e-05, 'samples': 43968, 'steps': 228, 'loss/train': 0.6094226539134979}
01/26/2022 20:04:06 - INFO - codeparrot_training - Step 229: {'lr': 5.725e-05, 'samples': 44160, 'steps': 229, 'loss/train': 0.8389751315116882}
01/26/2022 20:04:09 - INFO - codeparrot_training - Step 230: {'lr': 5.75e-05, 'samples': 44352, 'steps': 230, 'loss/train': 1.1073690354824066}
01/26/2022 20:04:12 - INFO - codeparrot_training - Step 231: {'lr': 5.775e-05, 'samples': 44544, 'steps': 231, 'loss/train': 0.6085668057203293}
01/26/2022 20:04:16 - INFO - codeparrot_training - Step 232: {'lr': 5.800000000000001e-05, 'samples': 44736, 'steps': 232, 'loss/train': 0.8861702978610992}
01/26/2022 20:04:19 - INFO - codeparrot_training - Step 233: {'lr': 5.8250000000000006e-05, 'samples': 44928, 'steps': 233, 'loss/train': 0.8692090809345245}
01/26/2022 20:04:22 - INFO - codeparrot_training - Step 234: {'lr': 5.8500000000000006e-05, 'samples': 45120, 'steps': 234, 'loss/train': 1.3772276937961578}
01/26/2022 20:04:25 - INFO - codeparrot_training - Step 235: {'lr': 5.875e-05, 'samples': 45312, 'steps': 235, 'loss/train': 0.7176054865121841}
01/26/2022 20:04:28 - INFO - codeparrot_training - Step 236: {'lr': 5.9e-05, 'samples': 45504, 'steps': 236, 'loss/train': 1.1665324866771698}
01/26/2022 20:04:34 - INFO - codeparrot_training - Step 237: {'lr': 5.925e-05, 'samples': 45696, 'steps': 237, 'loss/train': 1.10550257563591}
01/26/2022 20:04:37 - INFO - codeparrot_training - Step 238: {'lr': 5.9499999999999996e-05, 'samples': 45888, 'steps': 238, 'loss/train': 1.043763667345047}
01/26/2022 20:04:40 - INFO - codeparrot_training - Step 239: {'lr': 5.9749999999999995e-05, 'samples': 46080, 'steps': 239, 'loss/train': 0.9871982038021088}
01/26/2022 20:04:43 - INFO - codeparrot_training - Step 240: {'lr': 6e-05, 'samples': 46272, 'steps': 240, 'loss/train': 0.5332265049219131}
01/26/2022 20:04:46 - INFO - codeparrot_training - Step 241: {'lr': 6.025e-05, 'samples': 46464, 'steps': 241, 'loss/train': 1.2732048332691193}
01/26/2022 20:04:49 - INFO - codeparrot_training - Step 242: {'lr': 6.05e-05, 'samples': 46656, 'steps': 242, 'loss/train': 1.1088880598545074}
01/26/2022 20:04:53 - INFO - codeparrot_training - Step 243: {'lr': 6.075e-05, 'samples': 46848, 'steps': 243, 'loss/train': 1.1678805649280548}
01/26/2022 20:04:56 - INFO - codeparrot_training - Step 244: {'lr': 6.1e-05, 'samples': 47040, 'steps': 244, 'loss/train': 1.129733294248581}
01/26/2022 20:04:59 - INFO - codeparrot_training - Step 245: {'lr': 6.125e-05, 'samples': 47232, 'steps': 245, 'loss/train': 0.5138150900602341}
01/26/2022 20:05:03 - INFO - codeparrot_training - Step 246: {'lr': 6.15e-05, 'samples': 47424, 'steps': 246, 'loss/train': 1.1513821184635162}
01/26/2022 20:05:06 - INFO - codeparrot_training - Step 247: {'lr': 6.175e-05, 'samples': 47616, 'steps': 247, 'loss/train': 0.5100005865097046}
01/26/2022 20:05:10 - INFO - codeparrot_training - Step 248: {'lr': 6.2e-05, 'samples': 47808, 'steps': 248, 'loss/train': 0.7218698859214783}
01/26/2022 20:05:13 - INFO - codeparrot_training - Step 249: {'lr': 6.225e-05, 'samples': 48000, 'steps': 249, 'loss/train': 1.0485215485095978}
01/26/2022 20:05:16 - INFO - codeparrot_training - Step 250: {'lr': 6.25e-05, 'samples': 48192, 'steps': 250, 'loss/train': 0.43163707852363586}
01/26/2022 20:05:19 - INFO - codeparrot_training - Step 251: {'lr': 6.275000000000001e-05, 'samples': 48384, 'steps': 251, 'loss/train': 0.8172822296619415}
01/26/2022 20:05:22 - INFO - codeparrot_training - Step 252: {'lr': 6.3e-05, 'samples': 48576, 'steps': 252, 'loss/train': 0.6068518459796906}
01/26/2022 20:05:25 - INFO - codeparrot_training - Step 253: {'lr': 6.325e-05, 'samples': 48768, 'steps': 253, 'loss/train': 0.884998083114624}
01/26/2022 20:05:28 - INFO - codeparrot_training - Step 254: {'lr': 6.35e-05, 'samples': 48960, 'steps': 254, 'loss/train': 0.758701354265213}
01/26/2022 20:05:33 - INFO - codeparrot_training - Step 255: {'lr': 6.375e-05, 'samples': 49152, 'steps': 255, 'loss/train': 1.4007873237133026}
01/26/2022 20:05:37 - INFO - codeparrot_training - Step 256: {'lr': 6.4e-05, 'samples': 49344, 'steps': 256, 'loss/train': 0.7115031480789185}
01/26/2022 20:05:40 - INFO - codeparrot_training - Step 257: {'lr': 6.425e-05, 'samples': 49536, 'steps': 257, 'loss/train': 1.1868571043014526}
01/26/2022 20:05:43 - INFO - codeparrot_training - Step 258: {'lr': 6.450000000000001e-05, 'samples': 49728, 'steps': 258, 'loss/train': 0.49912266433238983}
01/26/2022 20:05:46 - INFO - codeparrot_training - Step 259: {'lr': 6.475e-05, 'samples': 49920, 'steps': 259, 'loss/train': 0.8214937448501587}
01/26/2022 20:05:49 - INFO - codeparrot_training - Step 260: {'lr': 6.500000000000001e-05, 'samples': 50112, 'steps': 260, 'loss/train': 1.1232694387435913}
01/26/2022 20:05:52 - INFO - codeparrot_training - Step 261: {'lr': 6.525e-05, 'samples': 50304, 'steps': 261, 'loss/train': 1.041410654783249}
01/26/2022 20:05:55 - INFO - codeparrot_training - Step 262: {'lr': 6.55e-05, 'samples': 50496, 'steps': 262, 'loss/train': 1.0826551616191864}
01/26/2022 20:05:58 - INFO - codeparrot_training - Step 263: {'lr': 6.575e-05, 'samples': 50688, 'steps': 263, 'loss/train': 1.3080483376979828}
01/26/2022 20:06:03 - INFO - codeparrot_training - Step 264: {'lr': 6.6e-05, 'samples': 50880, 'steps': 264, 'loss/train': 0.9459260702133179}
01/26/2022 20:06:06 - INFO - codeparrot_training - Step 265: {'lr': 6.625000000000001e-05, 'samples': 51072, 'steps': 265, 'loss/train': 1.602814257144928}
01/26/2022 20:06:09 - INFO - codeparrot_training - Step 266: {'lr': 6.65e-05, 'samples': 51264, 'steps': 266, 'loss/train': 0.9994533956050873}
01/26/2022 20:06:12 - INFO - codeparrot_training - Step 267: {'lr': 6.675000000000001e-05, 'samples': 51456, 'steps': 267, 'loss/train': 0.7647925615310669}
01/26/2022 20:06:15 - INFO - codeparrot_training - Step 268: {'lr': 6.7e-05, 'samples': 51648, 'steps': 268, 'loss/train': 1.1421605944633484}
01/26/2022 20:06:18 - INFO - codeparrot_training - Step 269: {'lr': 6.725000000000001e-05, 'samples': 51840, 'steps': 269, 'loss/train': 0.9605331122875214}
01/26/2022 20:06:22 - INFO - codeparrot_training - Step 270: {'lr': 6.75e-05, 'samples': 52032, 'steps': 270, 'loss/train': 0.8399298191070557}
01/26/2022 20:06:25 - INFO - codeparrot_training - Step 271: {'lr': 6.775000000000001e-05, 'samples': 52224, 'steps': 271, 'loss/train': 0.9594549536705017}
01/26/2022 20:06:28 - INFO - codeparrot_training - Step 272: {'lr': 6.800000000000001e-05, 'samples': 52416, 'steps': 272, 'loss/train': 1.1892599165439606}
01/26/2022 20:06:32 - INFO - codeparrot_training - Step 273: {'lr': 6.825e-05, 'samples': 52608, 'steps': 273, 'loss/train': 1.0380887389183044}
01/26/2022 20:06:35 - INFO - codeparrot_training - Step 274: {'lr': 6.850000000000001e-05, 'samples': 52800, 'steps': 274, 'loss/train': 0.5679173916578293}
01/26/2022 20:06:39 - INFO - codeparrot_training - Step 275: {'lr': 6.875e-05, 'samples': 52992, 'steps': 275, 'loss/train': 0.508046954870224}
01/26/2022 20:06:42 - INFO - codeparrot_training - Step 276: {'lr': 6.900000000000001e-05, 'samples': 53184, 'steps': 276, 'loss/train': 1.1423460245132446}
01/26/2022 20:06:45 - INFO - codeparrot_training - Step 277: {'lr': 6.925e-05, 'samples': 53376, 'steps': 277, 'loss/train': 0.7753852307796478}
01/26/2022 20:06:48 - INFO - codeparrot_training - Step 278: {'lr': 6.950000000000001e-05, 'samples': 53568, 'steps': 278, 'loss/train': 1.3239911198616028}
01/26/2022 20:06:51 - INFO - codeparrot_training - Step 279: {'lr': 6.975e-05, 'samples': 53760, 'steps': 279, 'loss/train': 1.1902539432048798}
01/26/2022 20:06:54 - INFO - codeparrot_training - Step 280: {'lr': 7.000000000000001e-05, 'samples': 53952, 'steps': 280, 'loss/train': 0.96241495013237}
01/26/2022 20:07:00 - INFO - codeparrot_training - Step 281: {'lr': 7.025000000000001e-05, 'samples': 54144, 'steps': 281, 'loss/train': 1.3379597067832947}
01/26/2022 20:07:03 - INFO - codeparrot_training - Step 282: {'lr': 7.049999999999999e-05, 'samples': 54336, 'steps': 282, 'loss/train': 1.1913694739341736}
01/26/2022 20:07:06 - INFO - codeparrot_training - Step 283: {'lr': 7.075e-05, 'samples': 54528, 'steps': 283, 'loss/train': 1.0873757600784302}
01/26/2022 20:07:09 - INFO - codeparrot_training - Step 284: {'lr': 7.099999999999999e-05, 'samples': 54720, 'steps': 284, 'loss/train': 1.291018009185791}
01/26/2022 20:07:12 - INFO - codeparrot_training - Step 285: {'lr': 7.125e-05, 'samples': 54912, 'steps': 285, 'loss/train': 0.7471304833889008}
01/26/2022 20:07:15 - INFO - codeparrot_training - Step 286: {'lr': 7.149999999999999e-05, 'samples': 55104, 'steps': 286, 'loss/train': 1.0613690614700317}
01/26/2022 20:07:18 - INFO - codeparrot_training - Step 287: {'lr': 7.175e-05, 'samples': 55296, 'steps': 287, 'loss/train': 0.9576367735862732}
01/26/2022 20:07:22 - INFO - codeparrot_training - Step 288: {'lr': 7.2e-05, 'samples': 55488, 'steps': 288, 'loss/train': 1.2826087474822998}
01/26/2022 20:07:25 - INFO - codeparrot_training - Step 289: {'lr': 7.225e-05, 'samples': 55680, 'steps': 289, 'loss/train': 0.972164511680603}
01/26/2022 20:07:29 - INFO - codeparrot_training - Step 290: {'lr': 7.25e-05, 'samples': 55872, 'steps': 290, 'loss/train': 0.40963388979434967}
01/26/2022 20:07:32 - INFO - codeparrot_training - Step 291: {'lr': 7.274999999999999e-05, 'samples': 56064, 'steps': 291, 'loss/train': 1.1230065822601318}
01/26/2022 20:07:35 - INFO - codeparrot_training - Step 292: {'lr': 7.3e-05, 'samples': 56256, 'steps': 292, 'loss/train': 0.8228602409362793}
01/26/2022 20:07:39 - INFO - codeparrot_training - Step 293: {'lr': 7.324999999999999e-05, 'samples': 56448, 'steps': 293, 'loss/train': 1.324948489665985}
01/26/2022 20:07:42 - INFO - codeparrot_training - Step 294: {'lr': 7.35e-05, 'samples': 56640, 'steps': 294, 'loss/train': 1.183780699968338}
01/26/2022 20:07:45 - INFO - codeparrot_training - Step 295: {'lr': 7.375e-05, 'samples': 56832, 'steps': 295, 'loss/train': 0.7068225145339966}
01/26/2022 20:07:48 - INFO - codeparrot_training - Step 296: {'lr': 7.4e-05, 'samples': 57024, 'steps': 296, 'loss/train': 1.191939800977707}
01/26/2022 20:07:51 - INFO - codeparrot_training - Step 297: {'lr': 7.425e-05, 'samples': 57216, 'steps': 297, 'loss/train': 1.0787363648414612}
01/26/2022 20:07:54 - INFO - codeparrot_training - Step 298: {'lr': 7.45e-05, 'samples': 57408, 'steps': 298, 'loss/train': 1.1144662499427795}
01/26/2022 20:07:59 - INFO - codeparrot_training - Step 299: {'lr': 7.475e-05, 'samples': 57600, 'steps': 299, 'loss/train': 0.8711251616477966}
01/26/2022 20:08:02 - INFO - codeparrot_training - Step 300: {'lr': 7.5e-05, 'samples': 57792, 'steps': 300, 'loss/train': 0.9427872598171234}
01/26/2022 20:08:05 - INFO - codeparrot_training - Step 301: {'lr': 7.525e-05, 'samples': 57984, 'steps': 301, 'loss/train': 0.9164088070392609}
01/26/2022 20:08:08 - INFO - codeparrot_training - Step 302: {'lr': 7.55e-05, 'samples': 58176, 'steps': 302, 'loss/train': 0.8746973276138306}
01/26/2022 20:08:11 - INFO - codeparrot_training - Step 303: {'lr': 7.575e-05, 'samples': 58368, 'steps': 303, 'loss/train': 0.9978926181793213}
01/26/2022 20:08:14 - INFO - codeparrot_training - Step 304: {'lr': 7.6e-05, 'samples': 58560, 'steps': 304, 'loss/train': 0.8042625188827515}
01/26/2022 20:08:17 - INFO - codeparrot_training - Step 305: {'lr': 7.625e-05, 'samples': 58752, 'steps': 305, 'loss/train': 0.7100889533758163}
01/26/2022 20:08:21 - INFO - codeparrot_training - Step 306: {'lr': 7.65e-05, 'samples': 58944, 'steps': 306, 'loss/train': 1.33299520611763}
01/26/2022 20:08:24 - INFO - codeparrot_training - Step 307: {'lr': 7.675e-05, 'samples': 59136, 'steps': 307, 'loss/train': 0.8356545567512512}
01/26/2022 20:08:28 - INFO - codeparrot_training - Step 308: {'lr': 7.7e-05, 'samples': 59328, 'steps': 308, 'loss/train': 0.751328855752945}
01/26/2022 20:08:31 - INFO - codeparrot_training - Step 309: {'lr': 7.725000000000001e-05, 'samples': 59520, 'steps': 309, 'loss/train': 1.005783587694168}
01/26/2022 20:08:34 - INFO - codeparrot_training - Step 310: {'lr': 7.75e-05, 'samples': 59712, 'steps': 310, 'loss/train': 1.2585538029670715}
01/26/2022 20:08:37 - INFO - codeparrot_training - Step 311: {'lr': 7.775e-05, 'samples': 59904, 'steps': 311, 'loss/train': 0.8243108689785004}
01/26/2022 20:08:41 - INFO - codeparrot_training - Step 312: {'lr': 7.8e-05, 'samples': 60096, 'steps': 312, 'loss/train': 0.9810410141944885}
01/26/2022 20:08:44 - INFO - codeparrot_training - Step 313: {'lr': 7.825e-05, 'samples': 60288, 'steps': 313, 'loss/train': 0.8090919256210327}
01/26/2022 20:08:47 - INFO - codeparrot_training - Step 314: {'lr': 7.85e-05, 'samples': 60480, 'steps': 314, 'loss/train': 1.1202751100063324}
01/26/2022 20:08:50 - INFO - codeparrot_training - Step 315: {'lr': 7.875e-05, 'samples': 60672, 'steps': 315, 'loss/train': 1.09663724899292}
01/26/2022 20:08:53 - INFO - codeparrot_training - Step 316: {'lr': 7.9e-05, 'samples': 60864, 'steps': 316, 'loss/train': 1.0598064959049225}
01/26/2022 20:08:59 - INFO - codeparrot_training - Step 317: {'lr': 7.925e-05, 'samples': 61056, 'steps': 317, 'loss/train': 1.0347656607627869}
01/26/2022 20:09:02 - INFO - codeparrot_training - Step 318: {'lr': 7.950000000000001e-05, 'samples': 61248, 'steps': 318, 'loss/train': 0.399854376912117}
01/26/2022 20:09:05 - INFO - codeparrot_training - Step 319: {'lr': 7.975e-05, 'samples': 61440, 'steps': 319, 'loss/train': 0.9313833117485046}
01/26/2022 20:09:08 - INFO - codeparrot_training - Step 320: {'lr': 8e-05, 'samples': 61632, 'steps': 320, 'loss/train': 0.39090071618556976}
01/26/2022 20:09:11 - INFO - codeparrot_training - Step 321: {'lr': 8.025e-05, 'samples': 61824, 'steps': 321, 'loss/train': 0.40278929471969604}
01/26/2022 20:09:14 - INFO - codeparrot_training - Step 322: {'lr': 8.05e-05, 'samples': 62016, 'steps': 322, 'loss/train': 0.8294789493083954}
01/26/2022 20:09:17 - INFO - codeparrot_training - Step 323: {'lr': 8.075e-05, 'samples': 62208, 'steps': 323, 'loss/train': 0.8699845969676971}
01/26/2022 20:09:21 - INFO - codeparrot_training - Step 324: {'lr': 8.1e-05, 'samples': 62400, 'steps': 324, 'loss/train': 1.1144334375858307}
01/26/2022 20:09:25 - INFO - codeparrot_training - Step 325: {'lr': 8.125000000000001e-05, 'samples': 62592, 'steps': 325, 'loss/train': 1.0010827481746674}
01/26/2022 20:09:28 - INFO - codeparrot_training - Step 326: {'lr': 8.15e-05, 'samples': 62784, 'steps': 326, 'loss/train': 0.8288448750972748}
01/26/2022 20:09:31 - INFO - codeparrot_training - Step 327: {'lr': 8.175000000000001e-05, 'samples': 62976, 'steps': 327, 'loss/train': 1.173732876777649}
01/26/2022 20:09:34 - INFO - codeparrot_training - Step 328: {'lr': 8.2e-05, 'samples': 63168, 'steps': 328, 'loss/train': 0.6803929656744003}
01/26/2022 20:09:38 - INFO - codeparrot_training - Step 329: {'lr': 8.225000000000001e-05, 'samples': 63360, 'steps': 329, 'loss/train': 1.233411580324173}
01/26/2022 20:09:41 - INFO - codeparrot_training - Step 330: {'lr': 8.25e-05, 'samples': 63552, 'steps': 330, 'loss/train': 0.6752614974975586}
01/26/2022 20:09:44 - INFO - codeparrot_training - Step 331: {'lr': 8.275e-05, 'samples': 63744, 'steps': 331, 'loss/train': 1.3333468437194824}
01/26/2022 20:09:47 - INFO - codeparrot_training - Step 332: {'lr': 8.300000000000001e-05, 'samples': 63936, 'steps': 332, 'loss/train': 0.6622332036495209}
01/26/2022 20:09:50 - INFO - codeparrot_training - Step 333: {'lr': 8.325e-05, 'samples': 64128, 'steps': 333, 'loss/train': 1.038160353899002}
01/26/2022 20:09:55 - INFO - codeparrot_training - Step 334: {'lr': 8.350000000000001e-05, 'samples': 64320, 'steps': 334, 'loss/train': 2.0789029598236084}
01/26/2022 20:09:58 - INFO - codeparrot_training - Step 335: {'lr': 8.375e-05, 'samples': 64512, 'steps': 335, 'loss/train': 1.0992600917816162}
01/26/2022 20:10:01 - INFO - codeparrot_training - Step 336: {'lr': 8.400000000000001e-05, 'samples': 64704, 'steps': 336, 'loss/train': 0.7307141125202179}
01/26/2022 20:10:04 - INFO - codeparrot_training - Step 337: {'lr': 8.425e-05, 'samples': 64896, 'steps': 337, 'loss/train': 0.755567729473114}
01/26/2022 20:10:07 - INFO - codeparrot_training - Step 338: {'lr': 8.450000000000001e-05, 'samples': 65088, 'steps': 338, 'loss/train': 1.1346741020679474}
01/26/2022 20:10:10 - INFO - codeparrot_training - Step 339: {'lr': 8.475000000000001e-05, 'samples': 65280, 'steps': 339, 'loss/train': 1.054144710302353}
01/26/2022 20:10:13 - INFO - codeparrot_training - Step 340: {'lr': 8.5e-05, 'samples': 65472, 'steps': 340, 'loss/train': 0.7578913271427155}
01/26/2022 20:10:17 - INFO - codeparrot_training - Step 341: {'lr': 8.525000000000001e-05, 'samples': 65664, 'steps': 341, 'loss/train': 0.9189346432685852}
01/26/2022 20:10:20 - INFO - codeparrot_training - Step 342: {'lr': 8.55e-05, 'samples': 65856, 'steps': 342, 'loss/train': 0.7414054423570633}
01/26/2022 20:10:27 - INFO - codeparrot_training - Step 343: {'lr': 8.575000000000001e-05, 'samples': 66048, 'steps': 343, 'loss/train': 0.747487485408783}
01/26/2022 20:10:30 - INFO - codeparrot_training - Step 344: {'lr': 8.599999999999999e-05, 'samples': 66240, 'steps': 344, 'loss/train': 0.6916420161724091}
01/26/2022 20:10:33 - INFO - codeparrot_training - Step 345: {'lr': 8.625e-05, 'samples': 66432, 'steps': 345, 'loss/train': 1.0344393253326416}
01/26/2022 20:10:37 - INFO - codeparrot_training - Step 346: {'lr': 8.65e-05, 'samples': 66624, 'steps': 346, 'loss/train': 2.085118353366852}
01/26/2022 20:10:40 - INFO - codeparrot_training - Step 347: {'lr': 8.675e-05, 'samples': 66816, 'steps': 347, 'loss/train': 1.888598084449768}
01/26/2022 20:10:43 - INFO - codeparrot_training - Step 348: {'lr': 8.7e-05, 'samples': 67008, 'steps': 348, 'loss/train': 0.7010593861341476}
01/26/2022 20:10:46 - INFO - codeparrot_training - Step 349: {'lr': 8.724999999999999e-05, 'samples': 67200, 'steps': 349, 'loss/train': 0.7900976836681366}
01/26/2022 20:10:49 - INFO - codeparrot_training - Step 350: {'lr': 8.75e-05, 'samples': 67392, 'steps': 350, 'loss/train': 1.0326433181762695}
01/26/2022 20:10:52 - INFO - codeparrot_training - Step 351: {'lr': 8.774999999999999e-05, 'samples': 67584, 'steps': 351, 'loss/train': 0.8805137872695923}
01/26/2022 20:10:57 - INFO - codeparrot_training - Step 352: {'lr': 8.8e-05, 'samples': 67776, 'steps': 352, 'loss/train': 0.9114089012145996}
01/26/2022 20:11:00 - INFO - codeparrot_training - Step 353: {'lr': 8.824999999999999e-05, 'samples': 67968, 'steps': 353, 'loss/train': 1.4836265444755554}
01/26/2022 20:11:03 - INFO - codeparrot_training - Step 354: {'lr': 8.85e-05, 'samples': 68160, 'steps': 354, 'loss/train': 0.7094464302062988}
01/26/2022 20:11:06 - INFO - codeparrot_training - Step 355: {'lr': 8.875e-05, 'samples': 68352, 'steps': 355, 'loss/train': 1.0316323935985565}
01/26/2022 20:11:09 - INFO - codeparrot_training - Step 356: {'lr': 8.9e-05, 'samples': 68544, 'steps': 356, 'loss/train': 0.8668911159038544}
01/26/2022 20:11:12 - INFO - codeparrot_training - Step 357: {'lr': 8.925e-05, 'samples': 68736, 'steps': 357, 'loss/train': 1.010850191116333}
01/26/2022 20:11:16 - INFO - codeparrot_training - Step 358: {'lr': 8.95e-05, 'samples': 68928, 'steps': 358, 'loss/train': 0.8607926666736603}
01/26/2022 20:11:19 - INFO - codeparrot_training - Step 359: {'lr': 8.975e-05, 'samples': 69120, 'steps': 359, 'loss/train': 0.9607502818107605}
01/26/2022 20:11:22 - INFO - codeparrot_training - Step 360: {'lr': 8.999999999999999e-05, 'samples': 69312, 'steps': 360, 'loss/train': 1.2990716993808746}
01/26/2022 20:11:27 - INFO - codeparrot_training - Step 361: {'lr': 9.025e-05, 'samples': 69504, 'steps': 361, 'loss/train': 0.7665486931800842}
01/26/2022 20:11:30 - INFO - codeparrot_training - Step 362: {'lr': 9.05e-05, 'samples': 69696, 'steps': 362, 'loss/train': 0.6945787221193314}
01/26/2022 20:11:33 - INFO - codeparrot_training - Step 363: {'lr': 9.075e-05, 'samples': 69888, 'steps': 363, 'loss/train': 1.2044677734375}
01/26/2022 20:11:36 - INFO - codeparrot_training - Step 364: {'lr': 9.1e-05, 'samples': 70080, 'steps': 364, 'loss/train': 1.199683964252472}
01/26/2022 20:11:39 - INFO - codeparrot_training - Step 365: {'lr': 9.125e-05, 'samples': 70272, 'steps': 365, 'loss/train': 0.3417014926671982}
01/26/2022 20:11:43 - INFO - codeparrot_training - Step 366: {'lr': 9.15e-05, 'samples': 70464, 'steps': 366, 'loss/train': 0.9123673439025879}
01/26/2022 20:11:46 - INFO - codeparrot_training - Step 367: {'lr': 9.175e-05, 'samples': 70656, 'steps': 367, 'loss/train': 1.1374095976352692}
01/26/2022 20:11:49 - INFO - codeparrot_training - Step 368: {'lr': 9.2e-05, 'samples': 70848, 'steps': 368, 'loss/train': 0.871574878692627}
01/26/2022 20:11:52 - INFO - codeparrot_training - Step 369: {'lr': 9.225e-05, 'samples': 71040, 'steps': 369, 'loss/train': 0.8438461124897003}
01/26/2022 20:11:56 - INFO - codeparrot_training - Step 370: {'lr': 9.25e-05, 'samples': 71232, 'steps': 370, 'loss/train': 1.0463526248931885}
01/26/2022 20:11:59 - INFO - codeparrot_training - Step 371: {'lr': 9.275e-05, 'samples': 71424, 'steps': 371, 'loss/train': 0.8958008587360382}
01/26/2022 20:12:03 - INFO - codeparrot_training - Step 372: {'lr': 9.3e-05, 'samples': 71616, 'steps': 372, 'loss/train': 1.0595494508743286}
01/26/2022 20:12:06 - INFO - codeparrot_training - Step 373: {'lr': 9.325e-05, 'samples': 71808, 'steps': 373, 'loss/train': 0.9859458804130554}
01/26/2022 20:12:09 - INFO - codeparrot_training - Step 374: {'lr': 9.35e-05, 'samples': 72000, 'steps': 374, 'loss/train': 0.7631303071975708}
01/26/2022 20:12:12 - INFO - codeparrot_training - Step 375: {'lr': 9.375e-05, 'samples': 72192, 'steps': 375, 'loss/train': 1.4473093450069427}
01/26/2022 20:12:15 - INFO - codeparrot_training - Step 376: {'lr': 9.400000000000001e-05, 'samples': 72384, 'steps': 376, 'loss/train': 0.7394805550575256}
01/26/2022 20:12:18 - INFO - codeparrot_training - Step 377: {'lr': 9.425e-05, 'samples': 72576, 'steps': 377, 'loss/train': 0.8891026675701141}
01/26/2022 20:12:21 - INFO - codeparrot_training - Step 378: {'lr': 9.45e-05, 'samples': 72768, 'steps': 378, 'loss/train': 0.9185190796852112}
01/26/2022 20:12:26 - INFO - codeparrot_training - Step 379: {'lr': 9.475e-05, 'samples': 72960, 'steps': 379, 'loss/train': 0.6343764960765839}
01/26/2022 20:12:29 - INFO - codeparrot_training - Step 380: {'lr': 9.5e-05, 'samples': 73152, 'steps': 380, 'loss/train': 0.8839260935783386}
01/26/2022 20:12:32 - INFO - codeparrot_training - Step 381: {'lr': 9.525e-05, 'samples': 73344, 'steps': 381, 'loss/train': 0.8907753825187683}
01/26/2022 20:12:35 - INFO - codeparrot_training - Step 382: {'lr': 9.55e-05, 'samples': 73536, 'steps': 382, 'loss/train': 1.2589702606201172}
01/26/2022 20:12:38 - INFO - codeparrot_training - Step 383: {'lr': 9.575000000000001e-05, 'samples': 73728, 'steps': 383, 'loss/train': 1.2859916388988495}
01/26/2022 20:12:41 - INFO - codeparrot_training - Step 384: {'lr': 9.6e-05, 'samples': 73920, 'steps': 384, 'loss/train': 0.6472349613904953}
01/26/2022 20:12:45 - INFO - codeparrot_training - Step 385: {'lr': 9.625000000000001e-05, 'samples': 74112, 'steps': 385, 'loss/train': 1.01359024643898}
01/26/2022 20:12:48 - INFO - codeparrot_training - Step 386: {'lr': 9.65e-05, 'samples': 74304, 'steps': 386, 'loss/train': 0.8292834162712097}
01/26/2022 20:12:51 - INFO - codeparrot_training - Step 387: {'lr': 9.675000000000001e-05, 'samples': 74496, 'steps': 387, 'loss/train': 1.038096696138382}
01/26/2022 20:12:56 - INFO - codeparrot_training - Step 388: {'lr': 9.7e-05, 'samples': 74688, 'steps': 388, 'loss/train': 1.0593030452728271}
01/26/2022 20:12:59 - INFO - codeparrot_training - Step 389: {'lr': 9.725e-05, 'samples': 74880, 'steps': 389, 'loss/train': 0.3904605209827423}
01/26/2022 20:13:02 - INFO - codeparrot_training - Step 390: {'lr': 9.750000000000001e-05, 'samples': 75072, 'steps': 390, 'loss/train': 1.214891105890274}
01/26/2022 20:13:06 - INFO - codeparrot_training - Step 391: {'lr': 9.775e-05, 'samples': 75264, 'steps': 391, 'loss/train': 1.1453385651111603}
01/26/2022 20:13:09 - INFO - codeparrot_training - Step 392: {'lr': 9.800000000000001e-05, 'samples': 75456, 'steps': 392, 'loss/train': 0.9596201777458191}
01/26/2022 20:13:12 - INFO - codeparrot_training - Step 393: {'lr': 9.825e-05, 'samples': 75648, 'steps': 393, 'loss/train': 1.13962984085083}
01/26/2022 20:13:15 - INFO - codeparrot_training - Step 394: {'lr': 9.850000000000001e-05, 'samples': 75840, 'steps': 394, 'loss/train': 0.9503010213375092}
01/26/2022 20:13:18 - INFO - codeparrot_training - Step 395: {'lr': 9.875e-05, 'samples': 76032, 'steps': 395, 'loss/train': 0.7186999171972275}
01/26/2022 20:13:21 - INFO - codeparrot_training - Step 396: {'lr': 9.900000000000001e-05, 'samples': 76224, 'steps': 396, 'loss/train': 0.5636555403470993}
01/26/2022 20:13:26 - INFO - codeparrot_training - Step 397: {'lr': 9.925000000000001e-05, 'samples': 76416, 'steps': 397, 'loss/train': 1.0697872638702393}
01/26/2022 20:13:29 - INFO - codeparrot_training - Step 398: {'lr': 9.95e-05, 'samples': 76608, 'steps': 398, 'loss/train': 0.7212703675031662}
01/26/2022 20:13:32 - INFO - codeparrot_training - Step 399: {'lr': 9.975000000000001e-05, 'samples': 76800, 'steps': 399, 'loss/train': 0.9841853678226471}
01/26/2022 20:13:35 - INFO - codeparrot_training - Step 400: {'lr': 0.0001, 'samples': 76992, 'steps': 400, 'loss/train': 1.08234241604805}
01/26/2022 20:13:38 - INFO - codeparrot_training - Step 401: {'lr': 0.00010025000000000001, 'samples': 77184, 'steps': 401, 'loss/train': 1.5624375343322754}
01/26/2022 20:13:41 - INFO - codeparrot_training - Step 402: {'lr': 0.0001005, 'samples': 77376, 'steps': 402, 'loss/train': 1.475949615240097}
01/26/2022 20:13:44 - INFO - codeparrot_training - Step 403: {'lr': 0.00010075000000000001, 'samples': 77568, 'steps': 403, 'loss/train': 1.1508514881134033}
01/26/2022 20:13:48 - INFO - codeparrot_training - Step 404: {'lr': 0.000101, 'samples': 77760, 'steps': 404, 'loss/train': 0.8824456036090851}
01/26/2022 20:13:53 - INFO - codeparrot_training - Step 405: {'lr': 0.00010125000000000001, 'samples': 77952, 'steps': 405, 'loss/train': 1.1006841659545898}
'samples': 78144, 'steps': 406, 'loss/train': 0.8000113070011139} 01/26/2022 20:13:59 - INFO - codeparrot_training - Step 407: {'lr': 0.00010174999999999999, 'samples': 78336, 'steps': 407, 'loss/train': 1.2521542310714722} 01/26/2022 20:14:02 - INFO - codeparrot_training - Step 408: {'lr': 0.000102, 'samples': 78528, 'steps': 408, 'loss/train': 0.6710378676652908} 01/26/2022 20:14:05 - INFO - codeparrot_training - Step 409: {'lr': 0.00010224999999999999, 'samples': 78720, 'steps': 409, 'loss/train': 0.7425864636898041} 01/26/2022 20:14:08 - INFO - codeparrot_training - Step 410: {'lr': 0.0001025, 'samples': 78912, 'steps': 410, 'loss/train': 0.8330551385879517} 01/26/2022 20:14:11 - INFO - codeparrot_training - Step 411: {'lr': 0.00010274999999999999, 'samples': 79104, 'steps': 411, 'loss/train': 1.99857759475708} 01/26/2022 20:14:15 - INFO - codeparrot_training - Step 412: {'lr': 0.000103, 'samples': 79296, 'steps': 412, 'loss/train': 0.495266318321228} 01/26/2022 20:14:18 - INFO - codeparrot_training - Step 413: {'lr': 0.00010325, 'samples': 79488, 'steps': 413, 'loss/train': 0.6426213383674622} 01/26/2022 20:14:22 - INFO - codeparrot_training - Step 414: {'lr': 0.0001035, 'samples': 79680, 'steps': 414, 'loss/train': 0.8009860217571259} 01/26/2022 20:14:25 - INFO - codeparrot_training - Step 415: {'lr': 0.00010375, 'samples': 79872, 'steps': 415, 'loss/train': 0.8425185978412628} 01/26/2022 20:14:28 - INFO - codeparrot_training - Step 416: {'lr': 0.000104, 'samples': 80064, 'steps': 416, 'loss/train': 1.3310015201568604} 01/26/2022 20:14:31 - INFO - codeparrot_training - Step 417: {'lr': 0.00010425, 'samples': 80256, 'steps': 417, 'loss/train': 1.3193172812461853} 01/26/2022 20:14:35 - INFO - codeparrot_training - Step 418: {'lr': 0.00010449999999999999, 'samples': 80448, 'steps': 418, 'loss/train': 0.6111249625682831} 01/26/2022 20:14:38 - INFO - codeparrot_training - Step 419: {'lr': 0.00010475, 'samples': 80640, 'steps': 419, 'loss/train': 1.204498440027237} 01/26/2022 20:14:41 - INFO - codeparrot_training - Step 420: {'lr': 0.000105, 'samples': 80832, 'steps': 420, 'loss/train': 1.018784075975418} 01/26/2022 20:14:44 - INFO - codeparrot_training - Step 421: {'lr': 0.00010525, 'samples': 81024, 'steps': 421, 'loss/train': 1.0076421797275543} 01/26/2022 20:14:47 - INFO - codeparrot_training - Step 422: {'lr': 0.0001055, 'samples': 81216, 'steps': 422, 'loss/train': 0.953014612197876} 01/26/2022 20:14:52 - INFO - codeparrot_training - Step 423: {'lr': 0.00010575, 'samples': 81408, 'steps': 423, 'loss/train': 1.0320123732089996} 01/26/2022 20:14:55 - INFO - codeparrot_training - Step 424: {'lr': 0.000106, 'samples': 81600, 'steps': 424, 'loss/train': 1.5705456733703613} 01/26/2022 20:14:58 - INFO - codeparrot_training - Step 425: {'lr': 0.00010625, 'samples': 81792, 'steps': 425, 'loss/train': 0.3477340489625931} 01/26/2022 20:15:01 - INFO - codeparrot_training - Step 426: {'lr': 0.0001065, 'samples': 81984, 'steps': 426, 'loss/train': 1.2109199166297913} 01/26/2022 20:15:04 - INFO - codeparrot_training - Step 427: {'lr': 0.00010675, 'samples': 82176, 'steps': 427, 'loss/train': 0.9677959978580475} 01/26/2022 20:15:07 - INFO - codeparrot_training - Step 428: {'lr': 0.000107, 'samples': 82368, 'steps': 428, 'loss/train': 0.8218532502651215} 01/26/2022 20:15:10 - INFO - codeparrot_training - Step 429: {'lr': 0.00010725, 'samples': 82560, 'steps': 429, 'loss/train': 0.8969594836235046} 01/26/2022 20:15:14 - INFO - codeparrot_training - Step 430: {'lr': 0.0001075, 'samples': 82752, 
'steps': 430, 'loss/train': 1.0345425009727478} 01/26/2022 20:15:17 - INFO - codeparrot_training - Step 431: {'lr': 0.00010775, 'samples': 82944, 'steps': 431, 'loss/train': 0.6871862411499023} 01/26/2022 20:15:21 - INFO - codeparrot_training - Step 432: {'lr': 0.000108, 'samples': 83136, 'steps': 432, 'loss/train': 1.0284851789474487} 01/26/2022 20:15:24 - INFO - codeparrot_training - Step 433: {'lr': 0.00010825, 'samples': 83328, 'steps': 433, 'loss/train': 0.4605419933795929} 01/26/2022 20:15:27 - INFO - codeparrot_training - Step 434: {'lr': 0.00010850000000000001, 'samples': 83520, 'steps': 434, 'loss/train': 0.8205590844154358} 01/26/2022 20:15:31 - INFO - codeparrot_training - Step 435: {'lr': 0.00010875, 'samples': 83712, 'steps': 435, 'loss/train': 0.9477976262569427} 01/26/2022 20:15:34 - INFO - codeparrot_training - Step 436: {'lr': 0.000109, 'samples': 83904, 'steps': 436, 'loss/train': 0.6931536644697189} 01/26/2022 20:15:37 - INFO - codeparrot_training - Step 437: {'lr': 0.00010925, 'samples': 84096, 'steps': 437, 'loss/train': 1.0598368048667908} 01/26/2022 20:15:40 - INFO - codeparrot_training - Step 438: {'lr': 0.0001095, 'samples': 84288, 'steps': 438, 'loss/train': 0.6807989180088043} 01/26/2022 20:15:43 - INFO - codeparrot_training - Step 439: {'lr': 0.00010975, 'samples': 84480, 'steps': 439, 'loss/train': 0.5363283008337021} 01/26/2022 20:15:49 - INFO - codeparrot_training - Step 440: {'lr': 0.00011, 'samples': 84672, 'steps': 440, 'loss/train': 1.1454049050807953} 01/26/2022 20:15:52 - INFO - codeparrot_training - Step 441: {'lr': 0.00011025, 'samples': 84864, 'steps': 441, 'loss/train': 0.7169860750436783} 01/26/2022 20:15:55 - INFO - codeparrot_training - Step 442: {'lr': 0.0001105, 'samples': 85056, 'steps': 442, 'loss/train': 0.7018677145242691} 01/26/2022 20:15:58 - INFO - codeparrot_training - Step 443: {'lr': 0.00011075000000000001, 'samples': 85248, 'steps': 443, 'loss/train': 1.0023103952407837} 01/26/2022 20:16:02 - INFO - codeparrot_training - Step 444: {'lr': 0.000111, 'samples': 85440, 'steps': 444, 'loss/train': 0.7472628504037857} 01/26/2022 20:16:05 - INFO - codeparrot_training - Step 445: {'lr': 0.00011125000000000001, 'samples': 85632, 'steps': 445, 'loss/train': 1.162532240152359} 01/26/2022 20:16:08 - INFO - codeparrot_training - Step 446: {'lr': 0.0001115, 'samples': 85824, 'steps': 446, 'loss/train': 1.272094488143921} 01/26/2022 20:16:11 - INFO - codeparrot_training - Step 447: {'lr': 0.00011175, 'samples': 86016, 'steps': 447, 'loss/train': 0.9917010962963104} 01/26/2022 20:16:14 - INFO - codeparrot_training - Step 448: {'lr': 0.000112, 'samples': 86208, 'steps': 448, 'loss/train': 0.7656463086605072} 01/26/2022 20:16:17 - INFO - codeparrot_training - Step 449: {'lr': 0.00011225, 'samples': 86400, 'steps': 449, 'loss/train': 0.4950437843799591} 01/26/2022 20:16:22 - INFO - codeparrot_training - Step 450: {'lr': 0.00011250000000000001, 'samples': 86592, 'steps': 450, 'loss/train': 0.9192585647106171} 01/26/2022 20:16:25 - INFO - codeparrot_training - Step 451: {'lr': 0.00011275, 'samples': 86784, 'steps': 451, 'loss/train': 0.6744060963392258} 01/26/2022 20:16:28 - INFO - codeparrot_training - Step 452: {'lr': 0.00011300000000000001, 'samples': 86976, 'steps': 452, 'loss/train': 0.7491473704576492} 01/26/2022 20:16:31 - INFO - codeparrot_training - Step 453: {'lr': 0.00011325, 'samples': 87168, 'steps': 453, 'loss/train': 0.7701495587825775} 01/26/2022 20:16:34 - INFO - codeparrot_training - Step 454: {'lr': 0.00011350000000000001, 'samples': 
87360, 'steps': 454, 'loss/train': 0.6129368394613266} 01/26/2022 20:16:37 - INFO - codeparrot_training - Step 455: {'lr': 0.00011375, 'samples': 87552, 'steps': 455, 'loss/train': 1.2186667621135712} 01/26/2022 20:16:40 - INFO - codeparrot_training - Step 456: {'lr': 0.000114, 'samples': 87744, 'steps': 456, 'loss/train': 0.970245748758316} 01/26/2022 20:16:44 - INFO - codeparrot_training - Step 457: {'lr': 0.00011425000000000001, 'samples': 87936, 'steps': 457, 'loss/train': 1.1947226822376251} 01/26/2022 20:16:48 - INFO - codeparrot_training - Step 458: {'lr': 0.0001145, 'samples': 88128, 'steps': 458, 'loss/train': 1.3567461669445038} 01/26/2022 20:16:51 - INFO - codeparrot_training - Step 459: {'lr': 0.00011475000000000001, 'samples': 88320, 'steps': 459, 'loss/train': 0.6487510353326797} 01/26/2022 20:16:54 - INFO - codeparrot_training - Step 460: {'lr': 0.000115, 'samples': 88512, 'steps': 460, 'loss/train': 0.8757333755493164} 01/26/2022 20:16:58 - INFO - codeparrot_training - Step 461: {'lr': 0.00011525000000000001, 'samples': 88704, 'steps': 461, 'loss/train': 1.011967420578003} 01/26/2022 20:17:01 - INFO - codeparrot_training - Step 462: {'lr': 0.0001155, 'samples': 88896, 'steps': 462, 'loss/train': 1.5802865624427795} 01/26/2022 20:17:04 - INFO - codeparrot_training - Step 463: {'lr': 0.00011575000000000001, 'samples': 89088, 'steps': 463, 'loss/train': 0.8010456562042236} 01/26/2022 20:17:07 - INFO - codeparrot_training - Step 464: {'lr': 0.00011600000000000001, 'samples': 89280, 'steps': 464, 'loss/train': 0.8695641160011292} 01/26/2022 20:17:10 - INFO - codeparrot_training - Step 465: {'lr': 0.00011625, 'samples': 89472, 'steps': 465, 'loss/train': 0.27032042294740677} 01/26/2022 20:17:13 - INFO - codeparrot_training - Step 466: {'lr': 0.00011650000000000001, 'samples': 89664, 'steps': 466, 'loss/train': 0.7990081608295441} 01/26/2022 20:17:19 - INFO - codeparrot_training - Step 467: {'lr': 0.00011675, 'samples': 89856, 'steps': 467, 'loss/train': 0.9661056697368622} 01/26/2022 20:17:22 - INFO - codeparrot_training - Step 468: {'lr': 0.00011700000000000001, 'samples': 90048, 'steps': 468, 'loss/train': 1.120609313249588} 01/26/2022 20:17:25 - INFO - codeparrot_training - Step 469: {'lr': 0.00011724999999999999, 'samples': 90240, 'steps': 469, 'loss/train': 0.47997334599494934} 01/26/2022 20:17:28 - INFO - codeparrot_training - Step 470: {'lr': 0.0001175, 'samples': 90432, 'steps': 470, 'loss/train': 0.836497038602829} 01/26/2022 20:17:31 - INFO - codeparrot_training - Step 471: {'lr': 0.00011775, 'samples': 90624, 'steps': 471, 'loss/train': 0.8853902220726013} 01/26/2022 20:17:34 - INFO - codeparrot_training - Step 472: {'lr': 0.000118, 'samples': 90816, 'steps': 472, 'loss/train': 0.6629137247800827} 01/26/2022 20:17:37 - INFO - codeparrot_training - Step 473: {'lr': 0.00011825, 'samples': 91008, 'steps': 473, 'loss/train': 1.1359244585037231} 01/26/2022 20:17:41 - INFO - codeparrot_training - Step 474: {'lr': 0.0001185, 'samples': 91200, 'steps': 474, 'loss/train': 0.8318636119365692} 01/26/2022 20:17:44 - INFO - codeparrot_training - Step 475: {'lr': 0.00011875, 'samples': 91392, 'steps': 475, 'loss/train': 0.49809399247169495} 01/26/2022 20:17:48 - INFO - codeparrot_training - Step 476: {'lr': 0.00011899999999999999, 'samples': 91584, 'steps': 476, 'loss/train': 1.2043863236904144} 01/26/2022 20:17:51 - INFO - codeparrot_training - Step 477: {'lr': 0.00011925, 'samples': 91776, 'steps': 477, 'loss/train': 0.8016407489776611} 01/26/2022 20:17:55 - INFO - 
codeparrot_training - Step 478: {'lr': 0.00011949999999999999, 'samples': 91968, 'steps': 478, 'loss/train': 0.9700524508953094} 01/26/2022 20:17:58 - INFO - codeparrot_training - Step 479: {'lr': 0.00011975, 'samples': 92160, 'steps': 479, 'loss/train': 0.6533531248569489} 01/26/2022 20:18:01 - INFO - codeparrot_training - Step 480: {'lr': 0.00012, 'samples': 92352, 'steps': 480, 'loss/train': 1.0995571911334991} 01/26/2022 20:18:04 - INFO - codeparrot_training - Step 481: {'lr': 0.00012025, 'samples': 92544, 'steps': 481, 'loss/train': 0.46675287187099457} 01/26/2022 20:18:07 - INFO - codeparrot_training - Step 482: {'lr': 0.0001205, 'samples': 92736, 'steps': 482, 'loss/train': 1.4225375354290009} 01/26/2022 20:18:10 - INFO - codeparrot_training - Step 483: {'lr': 0.00012075, 'samples': 92928, 'steps': 483, 'loss/train': 0.6436797380447388} 01/26/2022 20:18:13 - INFO - codeparrot_training - Step 484: {'lr': 0.000121, 'samples': 93120, 'steps': 484, 'loss/train': 0.9993196427822113} 01/26/2022 20:18:20 - INFO - codeparrot_training - Step 485: {'lr': 0.00012124999999999999, 'samples': 93312, 'steps': 485, 'loss/train': 0.7051179707050323} 01/26/2022 20:18:23 - INFO - codeparrot_training - Step 486: {'lr': 0.0001215, 'samples': 93504, 'steps': 486, 'loss/train': 0.8310778141021729} 01/26/2022 20:18:26 - INFO - codeparrot_training - Step 487: {'lr': 0.00012175, 'samples': 93696, 'steps': 487, 'loss/train': 0.7130374610424042} 01/26/2022 20:18:29 - INFO - codeparrot_training - Step 488: {'lr': 0.000122, 'samples': 93888, 'steps': 488, 'loss/train': 0.6525427848100662} 01/26/2022 20:18:32 - INFO - codeparrot_training - Step 489: {'lr': 0.00012225, 'samples': 94080, 'steps': 489, 'loss/train': 1.263034701347351} 01/26/2022 20:18:35 - INFO - codeparrot_training - Step 490: {'lr': 0.0001225, 'samples': 94272, 'steps': 490, 'loss/train': 1.2272007465362549} 01/26/2022 20:18:38 - INFO - codeparrot_training - Step 491: {'lr': 0.00012275, 'samples': 94464, 'steps': 491, 'loss/train': 0.3253694325685501} 01/26/2022 20:18:42 - INFO - codeparrot_training - Step 492: {'lr': 0.000123, 'samples': 94656, 'steps': 492, 'loss/train': 1.7392045855522156} 01/26/2022 20:18:45 - INFO - codeparrot_training - Step 493: {'lr': 0.00012325000000000001, 'samples': 94848, 'steps': 493, 'loss/train': 1.0487505197525024} 01/26/2022 20:18:49 - INFO - codeparrot_training - Step 494: {'lr': 0.0001235, 'samples': 95040, 'steps': 494, 'loss/train': 0.8816839456558228} 01/26/2022 20:18:53 - INFO - codeparrot_training - Step 495: {'lr': 0.00012375, 'samples': 95232, 'steps': 495, 'loss/train': 0.7144645750522614} 01/26/2022 20:18:56 - INFO - codeparrot_training - Step 496: {'lr': 0.000124, 'samples': 95424, 'steps': 496, 'loss/train': 0.7718562483787537} 01/26/2022 20:18:59 - INFO - codeparrot_training - Step 497: {'lr': 0.00012425, 'samples': 95616, 'steps': 497, 'loss/train': 1.0395937263965607} 01/26/2022 20:19:02 - INFO - codeparrot_training - Step 498: {'lr': 0.0001245, 'samples': 95808, 'steps': 498, 'loss/train': 0.9165495336055756} 01/26/2022 20:19:05 - INFO - codeparrot_training - Step 499: {'lr': 0.00012475, 'samples': 96000, 'steps': 499, 'loss/train': 0.5416628569364548} 01/26/2022 20:19:08 - INFO - codeparrot_training - Step 500: {'lr': 0.000125, 'samples': 96192, 'steps': 500, 'loss/train': 0.672015443444252} 01/26/2022 20:19:12 - INFO - codeparrot_training - Step 501: {'lr': 0.00012525, 'samples': 96384, 'steps': 501, 'loss/train': 1.9715461134910583} 01/26/2022 20:19:15 - INFO - codeparrot_training - Step 502: 
{'lr': 0.00012550000000000001, 'samples': 96576, 'steps': 502, 'loss/train': 0.8303739130496979} 01/26/2022 20:19:19 - INFO - codeparrot_training - Step 503: {'lr': 0.00012575, 'samples': 96768, 'steps': 503, 'loss/train': 1.1383517682552338} 01/26/2022 20:19:22 - INFO - codeparrot_training - Step 504: {'lr': 0.000126, 'samples': 96960, 'steps': 504, 'loss/train': 0.512078046798706} 01/26/2022 20:19:25 - INFO - codeparrot_training - Step 505: {'lr': 0.00012625, 'samples': 97152, 'steps': 505, 'loss/train': 0.5505821406841278} 01/26/2022 20:19:29 - INFO - codeparrot_training - Step 506: {'lr': 0.0001265, 'samples': 97344, 'steps': 506, 'loss/train': 0.6685009449720383} 01/26/2022 20:19:32 - INFO - codeparrot_training - Step 507: {'lr': 0.00012675, 'samples': 97536, 'steps': 507, 'loss/train': 0.5938424617052078} 01/26/2022 20:19:35 - INFO - codeparrot_training - Step 508: {'lr': 0.000127, 'samples': 97728, 'steps': 508, 'loss/train': 0.8808158934116364} 01/26/2022 20:19:38 - INFO - codeparrot_training - Step 509: {'lr': 0.00012725, 'samples': 97920, 'steps': 509, 'loss/train': 0.7746322453022003} 01/26/2022 20:19:41 - INFO - codeparrot_training - Step 510: {'lr': 0.0001275, 'samples': 98112, 'steps': 510, 'loss/train': 1.0879081785678864} 01/26/2022 20:19:44 - INFO - codeparrot_training - Step 511: {'lr': 0.00012775000000000002, 'samples': 98304, 'steps': 511, 'loss/train': 0.7731634676456451} 01/26/2022 20:19:49 - INFO - codeparrot_training - Step 512: {'lr': 0.000128, 'samples': 98496, 'steps': 512, 'loss/train': 0.6284303516149521} 01/26/2022 20:19:52 - INFO - codeparrot_training - Step 513: {'lr': 0.00012825, 'samples': 98688, 'steps': 513, 'loss/train': 0.27421581745147705} 01/26/2022 20:19:55 - INFO - codeparrot_training - Step 514: {'lr': 0.0001285, 'samples': 98880, 'steps': 514, 'loss/train': 0.9253816902637482} 01/26/2022 20:19:58 - INFO - codeparrot_training - Step 515: {'lr': 0.00012875, 'samples': 99072, 'steps': 515, 'loss/train': 1.2660154402256012} 01/26/2022 20:20:02 - INFO - codeparrot_training - Step 516: {'lr': 0.00012900000000000002, 'samples': 99264, 'steps': 516, 'loss/train': 0.8392927944660187} 01/26/2022 20:20:05 - INFO - codeparrot_training - Step 517: {'lr': 0.00012925, 'samples': 99456, 'steps': 517, 'loss/train': 1.019319087266922} 01/26/2022 20:20:08 - INFO - codeparrot_training - Step 518: {'lr': 0.0001295, 'samples': 99648, 'steps': 518, 'loss/train': 1.2289848625659943} 01/26/2022 20:20:11 - INFO - codeparrot_training - Step 519: {'lr': 0.00012975, 'samples': 99840, 'steps': 519, 'loss/train': 0.9302180707454681} 01/26/2022 20:20:14 - INFO - codeparrot_training - Step 520: {'lr': 0.00013000000000000002, 'samples': 100032, 'steps': 520, 'loss/train': 0.8552613258361816} 01/26/2022 20:20:20 - INFO - codeparrot_training - Step 521: {'lr': 0.00013025, 'samples': 100224, 'steps': 521, 'loss/train': 1.01572984457016} 01/26/2022 20:20:23 - INFO - codeparrot_training - Step 522: {'lr': 0.0001305, 'samples': 100416, 'steps': 522, 'loss/train': 0.7750627398490906} 01/26/2022 20:20:26 - INFO - codeparrot_training - Step 523: {'lr': 0.00013075, 'samples': 100608, 'steps': 523, 'loss/train': 1.4055988490581512} 01/26/2022 20:20:29 - INFO - codeparrot_training - Step 524: {'lr': 0.000131, 'samples': 100800, 'steps': 524, 'loss/train': 0.6941443383693695} 01/26/2022 20:20:32 - INFO - codeparrot_training - Step 525: {'lr': 0.00013125000000000002, 'samples': 100992, 'steps': 525, 'loss/train': 2.018996000289917} 01/26/2022 20:20:35 - INFO - codeparrot_training - Step 526: 
{'lr': 0.0001315, 'samples': 101184, 'steps': 526, 'loss/train': 1.4397898614406586} 01/26/2022 20:20:39 - INFO - codeparrot_training - Step 527: {'lr': 0.00013175, 'samples': 101376, 'steps': 527, 'loss/train': 0.9549647569656372} 01/26/2022 20:20:42 - INFO - codeparrot_training - Step 528: {'lr': 0.000132, 'samples': 101568, 'steps': 528, 'loss/train': 1.12039715051651} 01/26/2022 20:20:45 - INFO - codeparrot_training - Step 529: {'lr': 0.00013225000000000002, 'samples': 101760, 'steps': 529, 'loss/train': 0.8304433822631836} 01/26/2022 20:20:49 - INFO - codeparrot_training - Step 530: {'lr': 0.00013250000000000002, 'samples': 101952, 'steps': 530, 'loss/train': 1.0192952156066895} 01/26/2022 20:20:52 - INFO - codeparrot_training - Step 531: {'lr': 0.00013275, 'samples': 102144, 'steps': 531, 'loss/train': 1.0101014971733093} 01/26/2022 20:20:55 - INFO - codeparrot_training - Step 532: {'lr': 0.000133, 'samples': 102336, 'steps': 532, 'loss/train': 1.182517558336258} 01/26/2022 20:20:59 - INFO - codeparrot_training - Step 533: {'lr': 0.00013325, 'samples': 102528, 'steps': 533, 'loss/train': 0.7047746032476425} 01/26/2022 20:21:02 - INFO - codeparrot_training - Step 534: {'lr': 0.00013350000000000002, 'samples': 102720, 'steps': 534, 'loss/train': 0.9654974341392517} 01/26/2022 20:21:05 - INFO - codeparrot_training - Step 535: {'lr': 0.00013375, 'samples': 102912, 'steps': 535, 'loss/train': 1.4733890891075134} 01/26/2022 20:21:08 - INFO - codeparrot_training - Step 536: {'lr': 0.000134, 'samples': 103104, 'steps': 536, 'loss/train': 1.2371703386306763} 01/26/2022 20:21:11 - INFO - codeparrot_training - Step 537: {'lr': 0.00013425, 'samples': 103296, 'steps': 537, 'loss/train': 1.280784398317337} 01/26/2022 20:21:14 - INFO - codeparrot_training - Step 538: {'lr': 0.00013450000000000002, 'samples': 103488, 'steps': 538, 'loss/train': 1.0170392096042633} 01/26/2022 20:21:19 - INFO - codeparrot_training - Step 539: {'lr': 0.00013475000000000002, 'samples': 103680, 'steps': 539, 'loss/train': 1.1649912893772125} 01/26/2022 20:21:22 - INFO - codeparrot_training - Step 540: {'lr': 0.000135, 'samples': 103872, 'steps': 540, 'loss/train': 0.8566865622997284} 01/26/2022 20:21:25 - INFO - codeparrot_training - Step 541: {'lr': 0.00013525, 'samples': 104064, 'steps': 541, 'loss/train': 0.5552657693624496} 01/26/2022 20:21:28 - INFO - codeparrot_training - Step 542: {'lr': 0.00013550000000000001, 'samples': 104256, 'steps': 542, 'loss/train': 0.796043336391449} 01/26/2022 20:21:31 - INFO - codeparrot_training - Step 543: {'lr': 0.00013575000000000002, 'samples': 104448, 'steps': 543, 'loss/train': 0.7574986517429352} 01/26/2022 20:21:34 - INFO - codeparrot_training - Step 544: {'lr': 0.00013600000000000003, 'samples': 104640, 'steps': 544, 'loss/train': 0.6186468601226807} 01/26/2022 20:21:38 - INFO - codeparrot_training - Step 545: {'lr': 0.00013625, 'samples': 104832, 'steps': 545, 'loss/train': 0.9620187878608704} 01/26/2022 20:21:41 - INFO - codeparrot_training - Step 546: {'lr': 0.0001365, 'samples': 105024, 'steps': 546, 'loss/train': 0.9344134032726288} 01/26/2022 20:21:46 - INFO - codeparrot_training - Step 547: {'lr': 0.00013675000000000002, 'samples': 105216, 'steps': 547, 'loss/train': 0.6428420394659042} 01/26/2022 20:21:49 - INFO - codeparrot_training - Step 548: {'lr': 0.00013700000000000002, 'samples': 105408, 'steps': 548, 'loss/train': 1.1159744560718536} 01/26/2022 20:21:52 - INFO - codeparrot_training - Step 549: {'lr': 0.00013725, 'samples': 105600, 'steps': 549, 'loss/train': 
0.5158883482217789} 01/26/2022 20:21:56 - INFO - codeparrot_training - Step 550: {'lr': 0.0001375, 'samples': 105792, 'steps': 550, 'loss/train': 1.0909550786018372} 01/26/2022 20:21:59 - INFO - codeparrot_training - Step 551: {'lr': 0.00013775000000000001, 'samples': 105984, 'steps': 551, 'loss/train': 1.6916638612747192} 01/26/2022 20:22:02 - INFO - codeparrot_training - Step 552: {'lr': 0.00013800000000000002, 'samples': 106176, 'steps': 552, 'loss/train': 0.9012663066387177} 01/26/2022 20:22:05 - INFO - codeparrot_training - Step 553: {'lr': 0.00013825000000000003, 'samples': 106368, 'steps': 553, 'loss/train': 0.7789967358112335} 01/26/2022 20:22:08 - INFO - codeparrot_training - Step 554: {'lr': 0.0001385, 'samples': 106560, 'steps': 554, 'loss/train': 1.0671734511852264} 01/26/2022 20:22:11 - INFO - codeparrot_training - Step 555: {'lr': 0.00013875, 'samples': 106752, 'steps': 555, 'loss/train': 1.0245478749275208} 01/26/2022 20:22:16 - INFO - codeparrot_training - Step 556: {'lr': 0.00013900000000000002, 'samples': 106944, 'steps': 556, 'loss/train': 0.5020046085119247} 01/26/2022 20:22:19 - INFO - codeparrot_training - Step 557: {'lr': 0.00013925000000000002, 'samples': 107136, 'steps': 557, 'loss/train': 1.0817587673664093} 01/26/2022 20:22:22 - INFO - codeparrot_training - Step 558: {'lr': 0.0001395, 'samples': 107328, 'steps': 558, 'loss/train': 0.9696366190910339} 01/26/2022 20:22:25 - INFO - codeparrot_training - Step 559: {'lr': 0.00013975, 'samples': 107520, 'steps': 559, 'loss/train': 1.6461342573165894} 01/26/2022 20:22:28 - INFO - codeparrot_training - Step 560: {'lr': 0.00014000000000000001, 'samples': 107712, 'steps': 560, 'loss/train': 0.7242932170629501} 01/26/2022 20:22:31 - INFO - codeparrot_training - Step 561: {'lr': 0.00014025000000000002, 'samples': 107904, 'steps': 561, 'loss/train': 0.6898515522480011} 01/26/2022 20:22:34 - INFO - codeparrot_training - Step 562: {'lr': 0.00014050000000000003, 'samples': 108096, 'steps': 562, 'loss/train': 1.1182992160320282} 01/26/2022 20:22:38 - INFO - codeparrot_training - Step 563: {'lr': 0.00014074999999999998, 'samples': 108288, 'steps': 563, 'loss/train': 1.2305793464183807} 01/26/2022 20:22:41 - INFO - codeparrot_training - Step 564: {'lr': 0.00014099999999999998, 'samples': 108480, 'steps': 564, 'loss/train': 0.6293094009160995} 01/26/2022 20:22:46 - INFO - codeparrot_training - Step 565: {'lr': 0.00014125, 'samples': 108672, 'steps': 565, 'loss/train': 0.7494006603956223} 01/26/2022 20:22:49 - INFO - codeparrot_training - Step 566: {'lr': 0.0001415, 'samples': 108864, 'steps': 566, 'loss/train': 0.6767587065696716} 01/26/2022 20:22:52 - INFO - codeparrot_training - Step 567: {'lr': 0.00014175, 'samples': 109056, 'steps': 567, 'loss/train': 0.7286884635686874} 01/26/2022 20:22:55 - INFO - codeparrot_training - Step 568: {'lr': 0.00014199999999999998, 'samples': 109248, 'steps': 568, 'loss/train': 1.3158212900161743} 01/26/2022 20:22:58 - INFO - codeparrot_training - Step 569: {'lr': 0.00014225, 'samples': 109440, 'steps': 569, 'loss/train': 1.304736077785492} 01/26/2022 20:23:02 - INFO - codeparrot_training - Step 570: {'lr': 0.0001425, 'samples': 109632, 'steps': 570, 'loss/train': 0.7607653141021729} 01/26/2022 20:23:05 - INFO - codeparrot_training - Step 571: {'lr': 0.00014275, 'samples': 109824, 'steps': 571, 'loss/train': 1.084415227174759} 01/26/2022 20:23:08 - INFO - codeparrot_training - Step 572: {'lr': 0.00014299999999999998, 'samples': 110016, 'steps': 572, 'loss/train': 0.9314425885677338} 01/26/2022 
20:23:11 - INFO - codeparrot_training - Step 573: {'lr': 0.00014324999999999999, 'samples': 110208, 'steps': 573, 'loss/train': 0.8982188701629639} 01/26/2022 20:23:15 - INFO - codeparrot_training - Step 574: {'lr': 0.0001435, 'samples': 110400, 'steps': 574, 'loss/train': 1.0459118485450745} 01/26/2022 20:23:18 - INFO - codeparrot_training - Step 575: {'lr': 0.00014375, 'samples': 110592, 'steps': 575, 'loss/train': 1.24342542886734} 01/26/2022 20:23:22 - INFO - codeparrot_training - Step 576: {'lr': 0.000144, 'samples': 110784, 'steps': 576, 'loss/train': 1.0160776376724243} 01/26/2022 20:23:25 - INFO - codeparrot_training - Step 577: {'lr': 0.00014424999999999998, 'samples': 110976, 'steps': 577, 'loss/train': 1.1467996537685394} 01/26/2022 20:23:28 - INFO - codeparrot_training - Step 578: {'lr': 0.0001445, 'samples': 111168, 'steps': 578, 'loss/train': 0.8268345594406128} 01/26/2022 20:23:31 - INFO - codeparrot_training - Step 579: {'lr': 0.00014475, 'samples': 111360, 'steps': 579, 'loss/train': 0.873445987701416} 01/26/2022 20:23:34 - INFO - codeparrot_training - Step 580: {'lr': 0.000145, 'samples': 111552, 'steps': 580, 'loss/train': 0.45913079380989075} 01/26/2022 20:23:37 - INFO - codeparrot_training - Step 581: {'lr': 0.00014524999999999998, 'samples': 111744, 'steps': 581, 'loss/train': 0.8863986432552338} 01/26/2022 20:23:42 - INFO - codeparrot_training - Step 582: {'lr': 0.00014549999999999999, 'samples': 111936, 'steps': 582, 'loss/train': 0.8918125033378601} 01/26/2022 20:23:45 - INFO - codeparrot_training - Step 583: {'lr': 0.00014575, 'samples': 112128, 'steps': 583, 'loss/train': 1.0199389457702637} 01/26/2022 20:23:48 - INFO - codeparrot_training - Step 584: {'lr': 0.000146, 'samples': 112320, 'steps': 584, 'loss/train': 1.2752460837364197} 01/26/2022 20:23:51 - INFO - codeparrot_training - Step 585: {'lr': 0.00014625, 'samples': 112512, 'steps': 585, 'loss/train': 0.8354544639587402} 01/26/2022 20:23:54 - INFO - codeparrot_training - Step 586: {'lr': 0.00014649999999999998, 'samples': 112704, 'steps': 586, 'loss/train': 0.6896410435438156} 01/26/2022 20:23:58 - INFO - codeparrot_training - Step 587: {'lr': 0.00014675, 'samples': 112896, 'steps': 587, 'loss/train': 0.755020022392273} 01/26/2022 20:24:01 - INFO - codeparrot_training - Step 588: {'lr': 0.000147, 'samples': 113088, 'steps': 588, 'loss/train': 0.8740944564342499} 01/26/2022 20:24:04 - INFO - codeparrot_training - Step 589: {'lr': 0.00014725, 'samples': 113280, 'steps': 589, 'loss/train': 1.613927185535431} 01/26/2022 20:24:07 - INFO - codeparrot_training - Step 590: {'lr': 0.0001475, 'samples': 113472, 'steps': 590, 'loss/train': 0.8339685201644897} 01/26/2022 20:24:11 - INFO - codeparrot_training - Step 591: {'lr': 0.00014774999999999999, 'samples': 113664, 'steps': 591, 'loss/train': 0.899694174528122} 01/26/2022 20:24:15 - INFO - codeparrot_training - Step 592: {'lr': 0.000148, 'samples': 113856, 'steps': 592, 'loss/train': 1.1107328832149506} 01/26/2022 20:24:18 - INFO - codeparrot_training - Step 593: {'lr': 0.00014825, 'samples': 114048, 'steps': 593, 'loss/train': 0.875143826007843} 01/26/2022 20:24:21 - INFO - codeparrot_training - Step 594: {'lr': 0.0001485, 'samples': 114240, 'steps': 594, 'loss/train': 0.669572576880455} 01/26/2022 20:24:24 - INFO - codeparrot_training - Step 595: {'lr': 0.00014874999999999998, 'samples': 114432, 'steps': 595, 'loss/train': 1.0809493660926819} 01/26/2022 20:24:27 - INFO - codeparrot_training - Step 596: {'lr': 0.000149, 'samples': 114624, 'steps': 596, 
'loss/train': 0.47745783627033234} 01/26/2022 20:24:30 - INFO - codeparrot_training - Step 597: {'lr': 0.00014925, 'samples': 114816, 'steps': 597, 'loss/train': 1.8296976685523987} 01/26/2022 20:24:34 - INFO - codeparrot_training - Step 598: {'lr': 0.0001495, 'samples': 115008, 'steps': 598, 'loss/train': 0.5959167033433914} 01/26/2022 20:24:37 - INFO - codeparrot_training - Step 599: {'lr': 0.00014975, 'samples': 115200, 'steps': 599, 'loss/train': 0.926987886428833} 01/26/2022 20:24:40 - INFO - codeparrot_training - Step 600: {'lr': 0.00015, 'samples': 115392, 'steps': 600, 'loss/train': 0.6421291083097458} 01/26/2022 20:24:46 - INFO - codeparrot_training - Step 601: {'lr': 0.00015025, 'samples': 115584, 'steps': 601, 'loss/train': 1.1162900626659393} 01/26/2022 20:24:49 - INFO - codeparrot_training - Step 602: {'lr': 0.0001505, 'samples': 115776, 'steps': 602, 'loss/train': 1.2625629901885986} 01/26/2022 20:24:52 - INFO - codeparrot_training - Step 603: {'lr': 0.00015075, 'samples': 115968, 'steps': 603, 'loss/train': 0.8391094207763672} 01/26/2022 20:24:55 - INFO - codeparrot_training - Step 604: {'lr': 0.000151, 'samples': 116160, 'steps': 604, 'loss/train': 1.0422407984733582} 01/26/2022 20:24:58 - INFO - codeparrot_training - Step 605: {'lr': 0.00015125, 'samples': 116352, 'steps': 605, 'loss/train': 1.1536369621753693} 01/26/2022 20:25:01 - INFO - codeparrot_training - Step 606: {'lr': 0.0001515, 'samples': 116544, 'steps': 606, 'loss/train': 0.8908830285072327} 01/26/2022 20:25:04 - INFO - codeparrot_training - Step 607: {'lr': 0.00015175, 'samples': 116736, 'steps': 607, 'loss/train': 0.7630718350410461} 01/26/2022 20:25:08 - INFO - codeparrot_training - Step 608: {'lr': 0.000152, 'samples': 116928, 'steps': 608, 'loss/train': 1.165149986743927} 01/26/2022 20:25:12 - INFO - codeparrot_training - Step 609: {'lr': 0.00015225, 'samples': 117120, 'steps': 609, 'loss/train': 0.5184662640094757} 01/26/2022 20:25:15 - INFO - codeparrot_training - Step 610: {'lr': 0.0001525, 'samples': 117312, 'steps': 610, 'loss/train': 0.813170313835144} 01/26/2022 20:25:18 - INFO - codeparrot_training - Step 611: {'lr': 0.00015275, 'samples': 117504, 'steps': 611, 'loss/train': 0.8342358469963074} 01/26/2022 20:25:21 - INFO - codeparrot_training - Step 612: {'lr': 0.000153, 'samples': 117696, 'steps': 612, 'loss/train': 1.4845359027385712} 01/26/2022 20:25:25 - INFO - codeparrot_training - Step 613: {'lr': 0.00015325, 'samples': 117888, 'steps': 613, 'loss/train': 0.7532512843608856} 01/26/2022 20:25:28 - INFO - codeparrot_training - Step 614: {'lr': 0.0001535, 'samples': 118080, 'steps': 614, 'loss/train': 0.9865740537643433} 01/26/2022 20:25:31 - INFO - codeparrot_training - Step 615: {'lr': 0.00015375, 'samples': 118272, 'steps': 615, 'loss/train': 0.8163191378116608} 01/26/2022 20:25:34 - INFO - codeparrot_training - Step 616: {'lr': 0.000154, 'samples': 118464, 'steps': 616, 'loss/train': 1.3022699654102325} 01/26/2022 20:25:37 - INFO - codeparrot_training - Step 617: {'lr': 0.00015425, 'samples': 118656, 'steps': 617, 'loss/train': 1.1061480939388275} 01/26/2022 20:25:41 - INFO - codeparrot_training - Step 618: {'lr': 0.00015450000000000001, 'samples': 118848, 'steps': 618, 'loss/train': 1.2850266695022583} 01/26/2022 20:25:45 - INFO - codeparrot_training - Step 619: {'lr': 0.00015475, 'samples': 119040, 'steps': 619, 'loss/train': 1.0637675821781158} 01/26/2022 20:25:48 - INFO - codeparrot_training - Step 620: {'lr': 0.000155, 'samples': 119232, 'steps': 620, 'loss/train': 1.1182284951210022} 
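Note on the logged schedule: across these entries the learning rate rises by 2.5e-7 per optimizer step (1e-4 at step 400, 1.55e-4 at step 620) and 'samples' advances by 192 per step, i.e. a global batch of 192 sequences across the 8 TPU processes. The sketch below only reproduces those logged values under that linear-warmup assumption; the helper names and the ramp rate are read off the log, not taken from the codeparrot_training script itself.

# Minimal sketch, assuming lr = step * 2.5e-7 during warmup and a global batch of 192.
# Both constants are inferred from the entries above; they are not the training script's API.
def warmup_lr(step, lr_per_step=2.5e-7):
    # Linear warmup consistent with the logged 'lr' values.
    return step * lr_per_step

def samples_seen(step, global_batch=192):
    # 'samples' in the log counts sequences processed up to and including this step.
    return (step + 1) * global_batch

assert abs(warmup_lr(400) - 1e-4) < 1e-12     # matches "Step 400: {'lr': 0.0001, ...}"
assert abs(warmup_lr(620) - 1.55e-4) < 1e-12  # matches "Step 620: {'lr': 0.000155, ...}"
assert samples_seen(620) == 119232            # matches "'samples': 119232" at step 620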
01/26/2022 20:25:51 - INFO - codeparrot_training - Step 621: {'lr': 0.00015525, 'samples': 119424, 'steps': 621, 'loss/train': 0.6352360993623734} 01/26/2022 20:25:54 - INFO - codeparrot_training - Step 622: {'lr': 0.0001555, 'samples': 119616, 'steps': 622, 'loss/train': 1.0795150995254517} 01/26/2022 20:25:57 - INFO - codeparrot_training - Step 623: {'lr': 0.00015575, 'samples': 119808, 'steps': 623, 'loss/train': 0.8984178900718689} 01/26/2022 20:26:00 - INFO - codeparrot_training - Step 624: {'lr': 0.000156, 'samples': 120000, 'steps': 624, 'loss/train': 1.230320155620575} 01/26/2022 20:26:03 - INFO - codeparrot_training - Step 625: {'lr': 0.00015625, 'samples': 120192, 'steps': 625, 'loss/train': 0.9912900924682617} 01/26/2022 20:26:07 - INFO - codeparrot_training - Step 626: {'lr': 0.0001565, 'samples': 120384, 'steps': 626, 'loss/train': 0.7683042883872986} 01/26/2022 20:26:13 - INFO - codeparrot_training - Step 627: {'lr': 0.00015675000000000002, 'samples': 120576, 'steps': 627, 'loss/train': 1.0303488671779633} 01/26/2022 20:26:16 - INFO - codeparrot_training - Step 628: {'lr': 0.000157, 'samples': 120768, 'steps': 628, 'loss/train': 0.9553826451301575} 01/26/2022 20:26:19 - INFO - codeparrot_training - Step 629: {'lr': 0.00015725, 'samples': 120960, 'steps': 629, 'loss/train': 1.071675717830658} 01/26/2022 20:26:23 - INFO - codeparrot_training - Step 630: {'lr': 0.0001575, 'samples': 121152, 'steps': 630, 'loss/train': 3.7067281007766724} 01/26/2022 20:26:26 - INFO - codeparrot_training - Step 631: {'lr': 0.00015775, 'samples': 121344, 'steps': 631, 'loss/train': 0.9579127728939056} 01/26/2022 20:26:29 - INFO - codeparrot_training - Step 632: {'lr': 0.000158, 'samples': 121536, 'steps': 632, 'loss/train': 0.8656308352947235} 01/26/2022 20:26:32 - INFO - codeparrot_training - Step 633: {'lr': 0.00015825, 'samples': 121728, 'steps': 633, 'loss/train': 1.243520826101303} 01/26/2022 20:26:35 - INFO - codeparrot_training - Step 634: {'lr': 0.0001585, 'samples': 121920, 'steps': 634, 'loss/train': 0.5189113765954971} 01/26/2022 20:26:38 - INFO - codeparrot_training - Step 635: {'lr': 0.00015875, 'samples': 122112, 'steps': 635, 'loss/train': 1.0090940594673157} 01/26/2022 20:26:43 - INFO - codeparrot_training - Step 636: {'lr': 0.00015900000000000002, 'samples': 122304, 'steps': 636, 'loss/train': 1.1085612773895264} 01/26/2022 20:26:46 - INFO - codeparrot_training - Step 637: {'lr': 0.00015925, 'samples': 122496, 'steps': 637, 'loss/train': 1.3199002146720886} 01/26/2022 20:26:49 - INFO - codeparrot_training - Step 638: {'lr': 0.0001595, 'samples': 122688, 'steps': 638, 'loss/train': 1.4171862602233887} 01/26/2022 20:26:52 - INFO - codeparrot_training - Step 639: {'lr': 0.00015975, 'samples': 122880, 'steps': 639, 'loss/train': 0.4769306480884552} 01/26/2022 20:26:55 - INFO - codeparrot_training - Step 640: {'lr': 0.00016, 'samples': 123072, 'steps': 640, 'loss/train': 0.6695694476366043} 01/26/2022 20:26:58 - INFO - codeparrot_training - Step 641: {'lr': 0.00016025000000000002, 'samples': 123264, 'steps': 641, 'loss/train': 0.8001262843608856} 01/26/2022 20:27:02 - INFO - codeparrot_training - Step 642: {'lr': 0.0001605, 'samples': 123456, 'steps': 642, 'loss/train': 1.0185562670230865} 01/26/2022 20:27:05 - INFO - codeparrot_training - Step 643: {'lr': 0.00016075, 'samples': 123648, 'steps': 643, 'loss/train': 1.3380499184131622} 01/26/2022 20:27:08 - INFO - codeparrot_training - Step 644: {'lr': 0.000161, 'samples': 123840, 'steps': 644, 'loss/train': 0.9916056096553802} 01/26/2022 
20:27:14 - INFO - codeparrot_training - Step 645: {'lr': 0.00016125000000000002, 'samples': 124032, 'steps': 645, 'loss/train': 0.9142903089523315} 01/26/2022 20:27:17 - INFO - codeparrot_training - Step 646: {'lr': 0.0001615, 'samples': 124224, 'steps': 646, 'loss/train': 0.6464662849903107} 01/26/2022 20:27:20 - INFO - codeparrot_training - Step 647: {'lr': 0.00016175, 'samples': 124416, 'steps': 647, 'loss/train': 0.6285644620656967} 01/26/2022 20:27:23 - INFO - codeparrot_training - Step 648: {'lr': 0.000162, 'samples': 124608, 'steps': 648, 'loss/train': 0.8848017454147339} 01/26/2022 20:27:27 - INFO - codeparrot_training - Step 649: {'lr': 0.00016225000000000001, 'samples': 124800, 'steps': 649, 'loss/train': 0.6977881193161011} 01/26/2022 20:27:30 - INFO - codeparrot_training - Step 650: {'lr': 0.00016250000000000002, 'samples': 124992, 'steps': 650, 'loss/train': 0.8536549508571625} 01/26/2022 20:27:33 - INFO - codeparrot_training - Step 651: {'lr': 0.00016275, 'samples': 125184, 'steps': 651, 'loss/train': 0.6653050482273102} 01/26/2022 20:27:36 - INFO - codeparrot_training - Step 652: {'lr': 0.000163, 'samples': 125376, 'steps': 652, 'loss/train': 0.7839305698871613} 01/26/2022 20:27:39 - INFO - codeparrot_training - Step 653: {'lr': 0.00016325, 'samples': 125568, 'steps': 653, 'loss/train': 1.036845713853836} 01/26/2022 20:27:43 - INFO - codeparrot_training - Step 654: {'lr': 0.00016350000000000002, 'samples': 125760, 'steps': 654, 'loss/train': 1.2619302570819855} 01/26/2022 20:27:47 - INFO - codeparrot_training - Step 655: {'lr': 0.00016375000000000002, 'samples': 125952, 'steps': 655, 'loss/train': 1.097649335861206} 01/26/2022 20:27:50 - INFO - codeparrot_training - Step 656: {'lr': 0.000164, 'samples': 126144, 'steps': 656, 'loss/train': 1.4299048483371735} 01/26/2022 20:27:53 - INFO - codeparrot_training - Step 657: {'lr': 0.00016425, 'samples': 126336, 'steps': 657, 'loss/train': 0.698960691690445} 01/26/2022 20:27:56 - INFO - codeparrot_training - Step 658: {'lr': 0.00016450000000000001, 'samples': 126528, 'steps': 658, 'loss/train': 1.0095846354961395} 01/26/2022 20:27:59 - INFO - codeparrot_training - Step 659: {'lr': 0.00016475000000000002, 'samples': 126720, 'steps': 659, 'loss/train': 1.0719581544399261} 01/26/2022 20:28:02 - INFO - codeparrot_training - Step 660: {'lr': 0.000165, 'samples': 126912, 'steps': 660, 'loss/train': 1.2665534913539886} 01/26/2022 20:28:05 - INFO - codeparrot_training - Step 661: {'lr': 0.00016525, 'samples': 127104, 'steps': 661, 'loss/train': 1.1495090425014496} 01/26/2022 20:28:10 - INFO - codeparrot_training - Step 662: {'lr': 0.0001655, 'samples': 127296, 'steps': 662, 'loss/train': 0.6644181311130524} 01/26/2022 20:28:13 - INFO - codeparrot_training - Step 663: {'lr': 0.00016575000000000002, 'samples': 127488, 'steps': 663, 'loss/train': 1.0472686886787415} 01/26/2022 20:28:16 - INFO - codeparrot_training - Step 664: {'lr': 0.00016600000000000002, 'samples': 127680, 'steps': 664, 'loss/train': 1.1239711940288544} 01/26/2022 20:28:19 - INFO - codeparrot_training - Step 665: {'lr': 0.00016625, 'samples': 127872, 'steps': 665, 'loss/train': 1.3414275348186493} 01/26/2022 20:28:23 - INFO - codeparrot_training - Step 666: {'lr': 0.0001665, 'samples': 128064, 'steps': 666, 'loss/train': 0.7016846090555191} 01/26/2022 20:28:26 - INFO - codeparrot_training - Step 667: {'lr': 0.00016675000000000001, 'samples': 128256, 'steps': 667, 'loss/train': 0.8392623960971832} 01/26/2022 20:28:29 - INFO - codeparrot_training - Step 668: {'lr': 
0.00016700000000000002, 'samples': 128448, 'steps': 668, 'loss/train': 0.6881774067878723} 01/26/2022 20:28:32 - INFO - codeparrot_training - Step 669: {'lr': 0.00016725000000000003, 'samples': 128640, 'steps': 669, 'loss/train': 0.8654219806194305} 01/26/2022 20:28:35 - INFO - codeparrot_training - Step 670: {'lr': 0.0001675, 'samples': 128832, 'steps': 670, 'loss/train': 1.4148901104927063} 01/26/2022 20:28:41 - INFO - codeparrot_training - Step 671: {'lr': 0.00016775, 'samples': 129024, 'steps': 671, 'loss/train': 0.7871894538402557} 01/26/2022 20:28:44 - INFO - codeparrot_training - Step 672: {'lr': 0.00016800000000000002, 'samples': 129216, 'steps': 672, 'loss/train': 0.7191271483898163} 01/26/2022 20:28:48 - INFO - codeparrot_training - Step 673: {'lr': 0.00016825000000000002, 'samples': 129408, 'steps': 673, 'loss/train': 1.2848331034183502} 01/26/2022 20:28:51 - INFO - codeparrot_training - Step 674: {'lr': 0.0001685, 'samples': 129600, 'steps': 674, 'loss/train': 0.8642187416553497} 01/26/2022 20:28:54 - INFO - codeparrot_training - Step 675: {'lr': 0.00016875, 'samples': 129792, 'steps': 675, 'loss/train': 1.0577407479286194} 01/26/2022 20:28:57 - INFO - codeparrot_training - Step 676: {'lr': 0.00016900000000000002, 'samples': 129984, 'steps': 676, 'loss/train': 0.8613667488098145} 01/26/2022 20:29:00 - INFO - codeparrot_training - Step 677: {'lr': 0.00016925000000000002, 'samples': 130176, 'steps': 677, 'loss/train': 0.7029958516359329} 01/26/2022 20:29:03 - INFO - codeparrot_training - Step 678: {'lr': 0.00016950000000000003, 'samples': 130368, 'steps': 678, 'loss/train': 1.0868782997131348} 01/26/2022 20:29:06 - INFO - codeparrot_training - Step 679: {'lr': 0.00016975, 'samples': 130560, 'steps': 679, 'loss/train': 0.6318007707595825} 01/26/2022 20:29:11 - INFO - codeparrot_training - Step 680: {'lr': 0.00017, 'samples': 130752, 'steps': 680, 'loss/train': 0.7896529734134674} 01/26/2022 20:29:14 - INFO - codeparrot_training - Step 681: {'lr': 0.00017025000000000002, 'samples': 130944, 'steps': 681, 'loss/train': 0.7487976104021072} 01/26/2022 20:29:17 - INFO - codeparrot_training - Step 682: {'lr': 0.00017050000000000002, 'samples': 131136, 'steps': 682, 'loss/train': 0.5079184323549271} 01/26/2022 20:29:20 - INFO - codeparrot_training - Step 683: {'lr': 0.00017075, 'samples': 131328, 'steps': 683, 'loss/train': 0.7469399571418762} 01/26/2022 20:29:23 - INFO - codeparrot_training - Step 684: {'lr': 0.000171, 'samples': 131520, 'steps': 684, 'loss/train': 0.8566912114620209} 01/26/2022 20:29:27 - INFO - codeparrot_training - Step 685: {'lr': 0.00017125000000000002, 'samples': 131712, 'steps': 685, 'loss/train': 1.0669482350349426} 01/26/2022 20:29:30 - INFO - codeparrot_training - Step 686: {'lr': 0.00017150000000000002, 'samples': 131904, 'steps': 686, 'loss/train': 0.6893389374017715} 01/26/2022 20:29:33 - INFO - codeparrot_training - Step 687: {'lr': 0.00017175000000000003, 'samples': 132096, 'steps': 687, 'loss/train': 0.42208729684352875} 01/26/2022 20:29:36 - INFO - codeparrot_training - Step 688: {'lr': 0.00017199999999999998, 'samples': 132288, 'steps': 688, 'loss/train': 1.2730027735233307} 01/26/2022 20:29:40 - INFO - codeparrot_training - Step 689: {'lr': 0.00017224999999999999, 'samples': 132480, 'steps': 689, 'loss/train': 1.454077273607254} 01/26/2022 20:29:43 - INFO - codeparrot_training - Step 690: {'lr': 0.0001725, 'samples': 132672, 'steps': 690, 'loss/train': 1.051508903503418} 01/26/2022 20:29:47 - INFO - codeparrot_training - Step 691: {'lr': 0.00017275, 
'samples': 132864, 'steps': 691, 'loss/train': 0.35310687124729156} 01/26/2022 20:29:50 - INFO - codeparrot_training - Step 692: {'lr': 0.000173, 'samples': 133056, 'steps': 692, 'loss/train': 0.9737249314785004} 01/26/2022 20:29:53 - INFO - codeparrot_training - Step 693: {'lr': 0.00017324999999999998, 'samples': 133248, 'steps': 693, 'loss/train': 0.8442901074886322} 01/26/2022 20:29:56 - INFO - codeparrot_training - Step 694: {'lr': 0.0001735, 'samples': 133440, 'steps': 694, 'loss/train': 0.9881038963794708} 01/26/2022 20:29:59 - INFO - codeparrot_training - Step 695: {'lr': 0.00017375, 'samples': 133632, 'steps': 695, 'loss/train': 0.9713330268859863} 01/26/2022 20:30:02 - INFO - codeparrot_training - Step 696: {'lr': 0.000174, 'samples': 133824, 'steps': 696, 'loss/train': 1.1849716901779175} 01/26/2022 20:30:05 - INFO - codeparrot_training - Step 697: {'lr': 0.00017424999999999998, 'samples': 134016, 'steps': 697, 'loss/train': 0.9789444208145142} 01/26/2022 20:30:12 - INFO - codeparrot_training - Step 698: {'lr': 0.00017449999999999999, 'samples': 134208, 'steps': 698, 'loss/train': 0.8334293961524963} 01/26/2022 20:30:15 - INFO - codeparrot_training - Step 699: {'lr': 0.00017475, 'samples': 134400, 'steps': 699, 'loss/train': 0.8760148286819458} 01/26/2022 20:30:18 - INFO - codeparrot_training - Step 700: {'lr': 0.000175, 'samples': 134592, 'steps': 700, 'loss/train': 1.3783383071422577} 01/26/2022 20:30:21 - INFO - codeparrot_training - Step 701: {'lr': 0.00017525, 'samples': 134784, 'steps': 701, 'loss/train': 1.022957682609558} 01/26/2022 20:30:24 - INFO - codeparrot_training - Step 702: {'lr': 0.00017549999999999998, 'samples': 134976, 'steps': 702, 'loss/train': 1.0660897493362427} 01/26/2022 20:30:28 - INFO - codeparrot_training - Step 703: {'lr': 0.00017575, 'samples': 135168, 'steps': 703, 'loss/train': 1.036992073059082} 01/26/2022 20:30:31 - INFO - codeparrot_training - Step 704: {'lr': 0.000176, 'samples': 135360, 'steps': 704, 'loss/train': 1.1998098492622375} 01/26/2022 20:30:34 - INFO - codeparrot_training - Step 705: {'lr': 0.00017625, 'samples': 135552, 'steps': 705, 'loss/train': 0.8409687280654907} 01/26/2022 20:30:38 - INFO - codeparrot_training - Step 706: {'lr': 0.00017649999999999998, 'samples': 135744, 'steps': 706, 'loss/train': 0.6072161346673965} 01/26/2022 20:30:41 - INFO - codeparrot_training - Step 707: {'lr': 0.00017675, 'samples': 135936, 'steps': 707, 'loss/train': 1.4789616465568542} 01/26/2022 20:30:45 - INFO - codeparrot_training - Step 708: {'lr': 0.000177, 'samples': 136128, 'steps': 708, 'loss/train': 0.8612742125988007} 01/26/2022 20:30:48 - INFO - codeparrot_training - Step 709: {'lr': 0.00017725, 'samples': 136320, 'steps': 709, 'loss/train': 0.9710145592689514} 01/26/2022 20:30:51 - INFO - codeparrot_training - Step 710: {'lr': 0.0001775, 'samples': 136512, 'steps': 710, 'loss/train': 1.1347587704658508} 01/26/2022 20:30:54 - INFO - codeparrot_training - Step 711: {'lr': 0.00017774999999999998, 'samples': 136704, 'steps': 711, 'loss/train': 0.818534642457962} 01/26/2022 20:30:57 - INFO - codeparrot_training - Step 712: {'lr': 0.000178, 'samples': 136896, 'steps': 712, 'loss/train': 0.9411788284778595} 01/26/2022 20:31:00 - INFO - codeparrot_training - Step 713: {'lr': 0.00017825, 'samples': 137088, 'steps': 713, 'loss/train': 1.6106010675430298} 01/26/2022 20:31:03 - INFO - codeparrot_training - Step 714: {'lr': 0.0001785, 'samples': 137280, 'steps': 714, 'loss/train': 1.0063637495040894} 01/26/2022 20:31:08 - INFO - codeparrot_training - 
Step 715: {'lr': 0.00017875, 'samples': 137472, 'steps': 715, 'loss/train': 1.0250695645809174} 01/26/2022 20:31:11 - INFO - codeparrot_training - Step 716: {'lr': 0.000179, 'samples': 137664, 'steps': 716, 'loss/train': 1.235056310892105} 01/26/2022 20:31:14 - INFO - codeparrot_training - Step 717: {'lr': 0.00017925, 'samples': 137856, 'steps': 717, 'loss/train': 1.1703775227069855} 01/26/2022 20:31:17 - INFO - codeparrot_training - Step 718: {'lr': 0.0001795, 'samples': 138048, 'steps': 718, 'loss/train': 0.6291922777891159} 01/26/2022 20:31:20 - INFO - codeparrot_training - Step 719: {'lr': 0.00017975, 'samples': 138240, 'steps': 719, 'loss/train': 0.6772595196962357} 01/26/2022 20:31:23 - INFO - codeparrot_training - Step 720: {'lr': 0.00017999999999999998, 'samples': 138432, 'steps': 720, 'loss/train': 0.562017023563385} 01/26/2022 20:31:27 - INFO - codeparrot_training - Step 721: {'lr': 0.00018025, 'samples': 138624, 'steps': 721, 'loss/train': 0.9805403351783752} 01/26/2022 20:31:30 - INFO - codeparrot_training - Step 722: {'lr': 0.0001805, 'samples': 138816, 'steps': 722, 'loss/train': 0.3919403851032257} 01/26/2022 20:31:33 - INFO - codeparrot_training - Step 723: {'lr': 0.00018075, 'samples': 139008, 'steps': 723, 'loss/train': 0.8560447990894318} 01/26/2022 20:31:37 - INFO - codeparrot_training - Step 724: {'lr': 0.000181, 'samples': 139200, 'steps': 724, 'loss/train': 1.100720465183258} 01/26/2022 20:31:40 - INFO - codeparrot_training - Step 725: {'lr': 0.00018125, 'samples': 139392, 'steps': 725, 'loss/train': 0.7327403426170349} 01/26/2022 20:31:43 - INFO - codeparrot_training - Step 726: {'lr': 0.0001815, 'samples': 139584, 'steps': 726, 'loss/train': 1.1028411090373993} 01/26/2022 20:31:47 - INFO - codeparrot_training - Step 727: {'lr': 0.00018175, 'samples': 139776, 'steps': 727, 'loss/train': 1.0841010510921478} 01/26/2022 20:31:50 - INFO - codeparrot_training - Step 728: {'lr': 0.000182, 'samples': 139968, 'steps': 728, 'loss/train': 0.6613376587629318} 01/26/2022 20:31:53 - INFO - codeparrot_training - Step 729: {'lr': 0.00018225, 'samples': 140160, 'steps': 729, 'loss/train': 0.6293601393699646} 01/26/2022 20:31:56 - INFO - codeparrot_training - Step 730: {'lr': 0.0001825, 'samples': 140352, 'steps': 730, 'loss/train': 0.9111529290676117} 01/26/2022 20:31:59 - INFO - codeparrot_training - Step 731: {'lr': 0.00018275, 'samples': 140544, 'steps': 731, 'loss/train': 1.2561307847499847} 01/26/2022 20:32:02 - INFO - codeparrot_training - Step 732: {'lr': 0.000183, 'samples': 140736, 'steps': 732, 'loss/train': 0.8419937789440155} 01/26/2022 20:32:09 - INFO - codeparrot_training - Step 733: {'lr': 0.00018325, 'samples': 140928, 'steps': 733, 'loss/train': 0.9878305792808533} 01/26/2022 20:32:12 - INFO - codeparrot_training - Step 734: {'lr': 0.0001835, 'samples': 141120, 'steps': 734, 'loss/train': 1.305568814277649} 01/26/2022 20:32:16 - INFO - codeparrot_training - Step 735: {'lr': 0.00018375, 'samples': 141312, 'steps': 735, 'loss/train': 0.1390710510313511} 01/26/2022 20:32:19 - INFO - codeparrot_training - Step 736: {'lr': 0.000184, 'samples': 141504, 'steps': 736, 'loss/train': 0.7866142094135284} 01/26/2022 20:32:22 - INFO - codeparrot_training - Step 737: {'lr': 0.00018425, 'samples': 141696, 'steps': 737, 'loss/train': 1.1417504847049713} 01/26/2022 20:32:25 - INFO - codeparrot_training - Step 738: {'lr': 0.0001845, 'samples': 141888, 'steps': 738, 'loss/train': 0.8403282165527344} 01/26/2022 20:32:28 - INFO - codeparrot_training - Step 739: {'lr': 0.00018475, 
'samples': 142080, 'steps': 739, 'loss/train': 1.592317521572113} 01/26/2022 20:32:31 - INFO - codeparrot_training - Step 740: {'lr': 0.000185, 'samples': 142272, 'steps': 740, 'loss/train': 0.8372698724269867} 01/26/2022 20:32:34 - INFO - codeparrot_training - Step 741: {'lr': 0.00018525, 'samples': 142464, 'steps': 741, 'loss/train': 1.2187131643295288} 01/26/2022 20:32:38 - INFO - codeparrot_training - Step 742: {'lr': 0.0001855, 'samples': 142656, 'steps': 742, 'loss/train': 1.3483904600143433} 01/26/2022 20:32:42 - INFO - codeparrot_training - Step 743: {'lr': 0.00018575000000000002, 'samples': 142848, 'steps': 743, 'loss/train': 0.8372267782688141} 01/26/2022 20:32:45 - INFO - codeparrot_training - Step 744: {'lr': 0.000186, 'samples': 143040, 'steps': 744, 'loss/train': 0.9834917485713959} 01/26/2022 20:32:49 - INFO - codeparrot_training - Step 745: {'lr': 0.00018625, 'samples': 143232, 'steps': 745, 'loss/train': 0.7956083714962006} 01/26/2022 20:32:52 - INFO - codeparrot_training - Step 746: {'lr': 0.0001865, 'samples': 143424, 'steps': 746, 'loss/train': 0.7947530150413513} 01/26/2022 20:32:55 - INFO - codeparrot_training - Step 747: {'lr': 0.00018675, 'samples': 143616, 'steps': 747, 'loss/train': 1.4260998666286469} 01/26/2022 20:32:58 - INFO - codeparrot_training - Step 748: {'lr': 0.000187, 'samples': 143808, 'steps': 748, 'loss/train': 0.9100359678268433} 01/26/2022 20:33:01 - INFO - codeparrot_training - Step 749: {'lr': 0.00018725, 'samples': 144000, 'steps': 749, 'loss/train': 0.8670728802680969} 01/26/2022 20:33:04 - INFO - codeparrot_training - Step 750: {'lr': 0.0001875, 'samples': 144192, 'steps': 750, 'loss/train': 0.5285973697900772} 01/26/2022 20:33:07 - INFO - codeparrot_training - Step 751: {'lr': 0.00018775, 'samples': 144384, 'steps': 751, 'loss/train': 1.3827945291996002} 01/26/2022 20:33:14 - INFO - codeparrot_training - Step 752: {'lr': 0.00018800000000000002, 'samples': 144576, 'steps': 752, 'loss/train': 0.8902200758457184} 01/26/2022 20:33:17 - INFO - codeparrot_training - Step 753: {'lr': 0.00018825, 'samples': 144768, 'steps': 753, 'loss/train': 0.7913635075092316} 01/26/2022 20:33:20 - INFO - codeparrot_training - Step 754: {'lr': 0.0001885, 'samples': 144960, 'steps': 754, 'loss/train': 1.4179195761680603} 01/26/2022 20:33:23 - INFO - codeparrot_training - Step 755: {'lr': 0.00018875, 'samples': 145152, 'steps': 755, 'loss/train': 0.9423087537288666} 01/26/2022 20:33:26 - INFO - codeparrot_training - Step 756: {'lr': 0.000189, 'samples': 145344, 'steps': 756, 'loss/train': 0.9094258546829224} 01/26/2022 20:33:29 - INFO - codeparrot_training - Step 757: {'lr': 0.00018925, 'samples': 145536, 'steps': 757, 'loss/train': 1.1640411615371704} 01/26/2022 20:33:33 - INFO - codeparrot_training - Step 758: {'lr': 0.0001895, 'samples': 145728, 'steps': 758, 'loss/train': 1.0199125707149506} 01/26/2022 20:33:36 - INFO - codeparrot_training - Step 759: {'lr': 0.00018975, 'samples': 145920, 'steps': 759, 'loss/train': 1.3976908922195435} 01/26/2022 20:33:40 - INFO - codeparrot_training - Step 760: {'lr': 0.00019, 'samples': 146112, 'steps': 760, 'loss/train': 0.8301973342895508} 01/26/2022 20:33:43 - INFO - codeparrot_training - Step 761: {'lr': 0.00019025000000000002, 'samples': 146304, 'steps': 761, 'loss/train': 0.8738430440425873} 01/26/2022 20:33:46 - INFO - codeparrot_training - Step 762: {'lr': 0.0001905, 'samples': 146496, 'steps': 762, 'loss/train': 0.9114019274711609} 01/26/2022 20:33:50 - INFO - codeparrot_training - Step 763: {'lr': 0.00019075, 
'samples': 146688, 'steps': 763, 'loss/train': 1.0480453670024872} 01/26/2022 20:33:53 - INFO - codeparrot_training - Step 764: {'lr': 0.000191, 'samples': 146880, 'steps': 764, 'loss/train': 1.0675413608551025} 01/26/2022 20:33:56 - INFO - codeparrot_training - Step 765: {'lr': 0.00019125000000000001, 'samples': 147072, 'steps': 765, 'loss/train': 0.9275387227535248} 01/26/2022 20:33:59 - INFO - codeparrot_training - Step 766: {'lr': 0.00019150000000000002, 'samples': 147264, 'steps': 766, 'loss/train': 1.5479466319084167} 01/26/2022 20:34:02 - INFO - codeparrot_training - Step 767: {'lr': 0.00019175, 'samples': 147456, 'steps': 767, 'loss/train': 1.1427192091941833} 01/26/2022 20:34:05 - INFO - codeparrot_training - Step 768: {'lr': 0.000192, 'samples': 147648, 'steps': 768, 'loss/train': 1.0345340967178345} 01/26/2022 20:34:10 - INFO - codeparrot_training - Step 769: {'lr': 0.00019225, 'samples': 147840, 'steps': 769, 'loss/train': 0.7592909932136536} 01/26/2022 20:34:13 - INFO - codeparrot_training - Step 770: {'lr': 0.00019250000000000002, 'samples': 148032, 'steps': 770, 'loss/train': 1.3460177779197693} 01/26/2022 20:34:16 - INFO - codeparrot_training - Step 771: {'lr': 0.00019275, 'samples': 148224, 'steps': 771, 'loss/train': 1.425268292427063} 01/26/2022 20:34:19 - INFO - codeparrot_training - Step 772: {'lr': 0.000193, 'samples': 148416, 'steps': 772, 'loss/train': 1.2665028870105743} 01/26/2022 20:34:22 - INFO - codeparrot_training - Step 773: {'lr': 0.00019325, 'samples': 148608, 'steps': 773, 'loss/train': 1.3485895693302155} 01/26/2022 20:34:25 - INFO - codeparrot_training - Step 774: {'lr': 0.00019350000000000001, 'samples': 148800, 'steps': 774, 'loss/train': 0.8594979643821716} 01/26/2022 20:34:28 - INFO - codeparrot_training - Step 775: {'lr': 0.00019375000000000002, 'samples': 148992, 'steps': 775, 'loss/train': 0.8714957535266876} 01/26/2022 20:34:32 - INFO - codeparrot_training - Step 776: {'lr': 0.000194, 'samples': 149184, 'steps': 776, 'loss/train': 0.5841094404459} 01/26/2022 20:34:35 - INFO - codeparrot_training - Step 777: {'lr': 0.00019425, 'samples': 149376, 'steps': 777, 'loss/train': 1.1731106042861938} 01/26/2022 20:34:41 - INFO - codeparrot_training - Step 778: {'lr': 0.0001945, 'samples': 149568, 'steps': 778, 'loss/train': 0.7482174932956696} 01/26/2022 20:34:44 - INFO - codeparrot_training - Step 779: {'lr': 0.00019475000000000002, 'samples': 149760, 'steps': 779, 'loss/train': 1.1149235665798187} 01/26/2022 20:34:47 - INFO - codeparrot_training - Step 780: {'lr': 0.00019500000000000002, 'samples': 149952, 'steps': 780, 'loss/train': 0.935143768787384} 01/26/2022 20:34:50 - INFO - codeparrot_training - Step 781: {'lr': 0.00019525, 'samples': 150144, 'steps': 781, 'loss/train': 0.8500780463218689} 01/26/2022 20:34:53 - INFO - codeparrot_training - Step 782: {'lr': 0.0001955, 'samples': 150336, 'steps': 782, 'loss/train': 0.5289049744606018} 01/26/2022 20:34:56 - INFO - codeparrot_training - Step 783: {'lr': 0.00019575000000000001, 'samples': 150528, 'steps': 783, 'loss/train': 1.4131156504154205} 01/26/2022 20:35:00 - INFO - codeparrot_training - Step 784: {'lr': 0.00019600000000000002, 'samples': 150720, 'steps': 784, 'loss/train': 0.30913594365119934} 01/26/2022 20:35:03 - INFO - codeparrot_training - Step 785: {'lr': 0.00019625, 'samples': 150912, 'steps': 785, 'loss/train': 1.2493534684181213} 01/26/2022 20:35:06 - INFO - codeparrot_training - Step 786: {'lr': 0.0001965, 'samples': 151104, 'steps': 786, 'loss/train': 0.9694152474403381} 01/26/2022 
20:35:10 - INFO - codeparrot_training - Step 787: {'lr': 0.00019675, 'samples': 151296, 'steps': 787, 'loss/train': 0.3550092503428459} 01/26/2022 20:35:13 - INFO - codeparrot_training - Step 788: {'lr': 0.00019700000000000002, 'samples': 151488, 'steps': 788, 'loss/train': 0.690741777420044} 01/26/2022 20:35:16 - INFO - codeparrot_training - Step 789: {'lr': 0.00019725000000000002, 'samples': 151680, 'steps': 789, 'loss/train': 1.1352488100528717} 01/26/2022 20:35:20 - INFO - codeparrot_training - Step 790: {'lr': 0.0001975, 'samples': 151872, 'steps': 790, 'loss/train': 1.0375614166259766} 01/26/2022 20:35:23 - INFO - codeparrot_training - Step 791: {'lr': 0.00019775, 'samples': 152064, 'steps': 791, 'loss/train': 0.9023004770278931} 01/26/2022 20:35:26 - INFO - codeparrot_training - Step 792: {'lr': 0.00019800000000000002, 'samples': 152256, 'steps': 792, 'loss/train': 0.6497882902622223} 01/26/2022 20:35:29 - INFO - codeparrot_training - Step 793: {'lr': 0.00019825000000000002, 'samples': 152448, 'steps': 793, 'loss/train': 0.6133932173252106} 01/26/2022 20:35:32 - INFO - codeparrot_training - Step 794: {'lr': 0.00019850000000000003, 'samples': 152640, 'steps': 794, 'loss/train': 0.37911419570446014} 01/26/2022 20:35:35 - INFO - codeparrot_training - Step 795: {'lr': 0.00019875, 'samples': 152832, 'steps': 795, 'loss/train': 0.9305910766124725} 01/26/2022 20:35:42 - INFO - codeparrot_training - Step 796: {'lr': 0.000199, 'samples': 153024, 'steps': 796, 'loss/train': 1.0752443969249725} 01/26/2022 20:35:45 - INFO - codeparrot_training - Step 797: {'lr': 0.00019925000000000002, 'samples': 153216, 'steps': 797, 'loss/train': 0.7176533639431} 01/26/2022 20:35:48 - INFO - codeparrot_training - Step 798: {'lr': 0.00019950000000000002, 'samples': 153408, 'steps': 798, 'loss/train': 1.2106296122074127} 01/26/2022 20:35:51 - INFO - codeparrot_training - Step 799: {'lr': 0.00019975, 'samples': 153600, 'steps': 799, 'loss/train': 0.7819571793079376} 01/26/2022 20:35:54 - INFO - codeparrot_training - Step 800: {'lr': 0.0002, 'samples': 153792, 'steps': 800, 'loss/train': 1.0153715908527374} 01/26/2022 20:35:57 - INFO - codeparrot_training - Step 801: {'lr': 0.00020025000000000002, 'samples': 153984, 'steps': 801, 'loss/train': 1.2898483872413635} 01/26/2022 20:36:00 - INFO - codeparrot_training - Step 802: {'lr': 0.00020050000000000002, 'samples': 154176, 'steps': 802, 'loss/train': 1.1867233514785767} 01/26/2022 20:36:04 - INFO - codeparrot_training - Step 803: {'lr': 0.00020075000000000003, 'samples': 154368, 'steps': 803, 'loss/train': 0.6986654698848724} 01/26/2022 20:36:07 - INFO - codeparrot_training - Step 804: {'lr': 0.000201, 'samples': 154560, 'steps': 804, 'loss/train': 0.9793243110179901} 01/26/2022 20:36:11 - INFO - codeparrot_training - Step 805: {'lr': 0.00020125, 'samples': 154752, 'steps': 805, 'loss/train': 0.5471542775630951} 01/26/2022 20:36:15 - INFO - codeparrot_training - Step 806: {'lr': 0.00020150000000000002, 'samples': 154944, 'steps': 806, 'loss/train': 1.5339066982269287} 01/26/2022 20:36:18 - INFO - codeparrot_training - Step 807: {'lr': 0.00020175000000000003, 'samples': 155136, 'steps': 807, 'loss/train': 0.3004101812839508} 01/26/2022 20:36:21 - INFO - codeparrot_training - Step 808: {'lr': 0.000202, 'samples': 155328, 'steps': 808, 'loss/train': 0.9867880046367645} 01/26/2022 20:36:24 - INFO - codeparrot_training - Step 809: {'lr': 0.00020225, 'samples': 155520, 'steps': 809, 'loss/train': 1.2848010957241058} 01/26/2022 20:36:27 - INFO - codeparrot_training - 
Step 810: {'lr': 0.00020250000000000002, 'samples': 155712, 'steps': 810, 'loss/train': 1.0448269844055176} 01/26/2022 20:36:30 - INFO - codeparrot_training - Step 811: {'lr': 0.00020275000000000002, 'samples': 155904, 'steps': 811, 'loss/train': 1.1478981971740723} 01/26/2022 20:36:33 - INFO - codeparrot_training - Step 812: {'lr': 0.00020300000000000003, 'samples': 156096, 'steps': 812, 'loss/train': 0.8939153552055359} 01/26/2022 20:36:37 - INFO - codeparrot_training - Step 813: {'lr': 0.00020324999999999998, 'samples': 156288, 'steps': 813, 'loss/train': 0.6805943101644516} 01/26/2022 20:36:41 - INFO - codeparrot_training - Step 814: {'lr': 0.00020349999999999999, 'samples': 156480, 'steps': 814, 'loss/train': 0.5918071120977402} 01/26/2022 20:36:44 - INFO - codeparrot_training - Step 815: {'lr': 0.00020375, 'samples': 156672, 'steps': 815, 'loss/train': 0.7792027294635773} 01/26/2022 20:36:47 - INFO - codeparrot_training - Step 816: {'lr': 0.000204, 'samples': 156864, 'steps': 816, 'loss/train': 1.0537404119968414} 01/26/2022 20:36:51 - INFO - codeparrot_training - Step 817: {'lr': 0.00020425, 'samples': 157056, 'steps': 817, 'loss/train': 0.874046266078949} 01/26/2022 20:36:54 - INFO - codeparrot_training - Step 818: {'lr': 0.00020449999999999998, 'samples': 157248, 'steps': 818, 'loss/train': 0.8286419212818146} 01/26/2022 20:36:57 - INFO - codeparrot_training - Step 819: {'lr': 0.00020475, 'samples': 157440, 'steps': 819, 'loss/train': 1.1937158703804016} 01/26/2022 20:37:00 - INFO - codeparrot_training - Step 820: {'lr': 0.000205, 'samples': 157632, 'steps': 820, 'loss/train': 0.24175997078418732} 01/26/2022 20:37:03 - INFO - codeparrot_training - Step 821: {'lr': 0.00020525, 'samples': 157824, 'steps': 821, 'loss/train': 1.4559266567230225} 01/26/2022 20:37:07 - INFO - codeparrot_training - Step 822: {'lr': 0.00020549999999999998, 'samples': 158016, 'steps': 822, 'loss/train': 1.3116374015808105} 01/26/2022 20:37:11 - INFO - codeparrot_training - Step 823: {'lr': 0.00020575, 'samples': 158208, 'steps': 823, 'loss/train': 1.1885481476783752} 01/26/2022 20:37:14 - INFO - codeparrot_training - Step 824: {'lr': 0.000206, 'samples': 158400, 'steps': 824, 'loss/train': 0.7910436987876892} 01/26/2022 20:37:17 - INFO - codeparrot_training - Step 825: {'lr': 0.00020625, 'samples': 158592, 'steps': 825, 'loss/train': 1.0141403675079346} 01/26/2022 20:37:20 - INFO - codeparrot_training - Step 826: {'lr': 0.0002065, 'samples': 158784, 'steps': 826, 'loss/train': 1.0890448093414307} 01/26/2022 20:37:23 - INFO - codeparrot_training - Step 827: {'lr': 0.00020674999999999998, 'samples': 158976, 'steps': 827, 'loss/train': 1.073274314403534} 01/26/2022 20:37:26 - INFO - codeparrot_training - Step 828: {'lr': 0.000207, 'samples': 159168, 'steps': 828, 'loss/train': 1.4990364611148834} 01/26/2022 20:37:29 - INFO - codeparrot_training - Step 829: {'lr': 0.00020725, 'samples': 159360, 'steps': 829, 'loss/train': 1.4358133971691132} 01/26/2022 20:37:33 - INFO - codeparrot_training - Step 830: {'lr': 0.0002075, 'samples': 159552, 'steps': 830, 'loss/train': 0.9667293727397919} 01/26/2022 20:37:39 - INFO - codeparrot_training - Step 831: {'lr': 0.00020774999999999998, 'samples': 159744, 'steps': 831, 'loss/train': 1.0747352242469788} 01/26/2022 20:37:42 - INFO - codeparrot_training - Step 832: {'lr': 0.000208, 'samples': 159936, 'steps': 832, 'loss/train': 1.0136394202709198} 01/26/2022 20:37:45 - INFO - codeparrot_training - Step 833: {'lr': 0.00020825, 'samples': 160128, 'steps': 833, 'loss/train': 
1.1674889624118805} 01/26/2022 20:37:48 - INFO - codeparrot_training - Step 834: {'lr': 0.0002085, 'samples': 160320, 'steps': 834, 'loss/train': 1.2853999435901642} 01/26/2022 20:37:51 - INFO - codeparrot_training - Step 835: {'lr': 0.00020875, 'samples': 160512, 'steps': 835, 'loss/train': 1.1943403780460358} 01/26/2022 20:37:54 - INFO - codeparrot_training - Step 836: {'lr': 0.00020899999999999998, 'samples': 160704, 'steps': 836, 'loss/train': 1.020514190196991} 01/26/2022 20:37:58 - INFO - codeparrot_training - Step 837: {'lr': 0.00020925, 'samples': 160896, 'steps': 837, 'loss/train': 0.9671913385391235} 01/26/2022 20:38:01 - INFO - codeparrot_training - Step 838: {'lr': 0.0002095, 'samples': 161088, 'steps': 838, 'loss/train': 1.0849076807498932} 01/26/2022 20:38:04 - INFO - codeparrot_training - Step 839: {'lr': 0.00020975, 'samples': 161280, 'steps': 839, 'loss/train': 1.339478462934494} 01/26/2022 20:38:08 - INFO - codeparrot_training - Step 840: {'lr': 0.00021, 'samples': 161472, 'steps': 840, 'loss/train': 0.7180408984422684} 01/26/2022 20:38:11 - INFO - codeparrot_training - Step 841: {'lr': 0.00021025, 'samples': 161664, 'steps': 841, 'loss/train': 0.9816141128540039} 01/26/2022 20:38:15 - INFO - codeparrot_training - Step 842: {'lr': 0.0002105, 'samples': 161856, 'steps': 842, 'loss/train': 0.8072361946105957} 01/26/2022 20:38:18 - INFO - codeparrot_training - Step 843: {'lr': 0.00021075, 'samples': 162048, 'steps': 843, 'loss/train': 1.1432275772094727} 01/26/2022 20:38:21 - INFO - codeparrot_training - Step 844: {'lr': 0.000211, 'samples': 162240, 'steps': 844, 'loss/train': 1.3443118929862976} 01/26/2022 20:38:24 - INFO - codeparrot_training - Step 845: {'lr': 0.00021124999999999998, 'samples': 162432, 'steps': 845, 'loss/train': 0.9823727309703827} 01/26/2022 20:38:27 - INFO - codeparrot_training - Step 846: {'lr': 0.0002115, 'samples': 162624, 'steps': 846, 'loss/train': 1.2291758358478546} 01/26/2022 20:38:30 - INFO - codeparrot_training - Step 847: {'lr': 0.00021175, 'samples': 162816, 'steps': 847, 'loss/train': 0.8374214172363281} 01/26/2022 20:38:34 - INFO - codeparrot_training - Step 848: {'lr': 0.000212, 'samples': 163008, 'steps': 848, 'loss/train': 0.9853262901306152} 01/26/2022 20:38:38 - INFO - codeparrot_training - Step 849: {'lr': 0.00021225, 'samples': 163200, 'steps': 849, 'loss/train': 0.9183481335639954} 01/26/2022 20:38:41 - INFO - codeparrot_training - Step 850: {'lr': 0.0002125, 'samples': 163392, 'steps': 850, 'loss/train': 1.2009184956550598} 01/26/2022 20:38:44 - INFO - codeparrot_training - Step 851: {'lr': 0.00021275, 'samples': 163584, 'steps': 851, 'loss/train': 0.850340723991394} 01/26/2022 20:38:47 - INFO - codeparrot_training - Step 852: {'lr': 0.000213, 'samples': 163776, 'steps': 852, 'loss/train': 0.9754958152770996} 01/26/2022 20:38:51 - INFO - codeparrot_training - Step 853: {'lr': 0.00021325, 'samples': 163968, 'steps': 853, 'loss/train': 0.9106079041957855} 01/26/2022 20:38:54 - INFO - codeparrot_training - Step 854: {'lr': 0.0002135, 'samples': 164160, 'steps': 854, 'loss/train': 1.1068985760211945} 01/26/2022 20:38:57 - INFO - codeparrot_training - Step 855: {'lr': 0.00021375, 'samples': 164352, 'steps': 855, 'loss/train': 0.9738516211509705} 01/26/2022 20:39:00 - INFO - codeparrot_training - Step 856: {'lr': 0.000214, 'samples': 164544, 'steps': 856, 'loss/train': 1.3173284232616425} 01/26/2022 20:39:03 - INFO - codeparrot_training - Step 857: {'lr': 0.00021425, 'samples': 164736, 'steps': 857, 'loss/train': 0.8615550398826599} 
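Over this stretch the learning rate climbs by a constant 2.5e-7 per optimizer step (0.000191 at step 764, 0.0002 at step 800, 0.00021425 at step 857), which looks like a plain linear warmup. Below is a minimal sketch of that implied schedule, assuming a linear ramp and a hypothetical learning_rate() helper; neither the warmup horizon nor any later decay phase is visible in this excerpt, so this is a reading of the logged values, not the training script's actual scheduler.

# Implied schedule read off the log: lr(step) = 2.5e-7 * step.
# The 2.5e-7 increment comes from consecutive entries above; warmup length
# and decay shape are not shown in this excerpt and are deliberately omitted.
BASE_LR_PER_STEP = 2.5e-7

def learning_rate(step: int) -> float:
    """Hypothetical linear-warmup schedule matching the logged values."""
    return BASE_LR_PER_STEP * step

# Spot-checks against entries visible in the log.
assert abs(learning_rate(800) - 0.0002) < 1e-12
assert abs(learning_rate(857) - 0.00021425) < 1e-12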
01/26/2022 20:39:09 - INFO - codeparrot_training - Step 858: {'lr': 0.0002145, 'samples': 164928, 'steps': 858, 'loss/train': 1.5239757299423218} 01/26/2022 20:39:12 - INFO - codeparrot_training - Step 859: {'lr': 0.00021475, 'samples': 165120, 'steps': 859, 'loss/train': 0.7371685802936554} 01/26/2022 20:39:16 - INFO - codeparrot_training - Step 860: {'lr': 0.000215, 'samples': 165312, 'steps': 860, 'loss/train': 0.7635878920555115} 01/26/2022 20:39:19 - INFO - codeparrot_training - Step 861: {'lr': 0.00021525, 'samples': 165504, 'steps': 861, 'loss/train': 0.7282658368349075} 01/26/2022 20:39:22 - INFO - codeparrot_training - Step 862: {'lr': 0.0002155, 'samples': 165696, 'steps': 862, 'loss/train': 1.1572135090827942} 01/26/2022 20:39:25 - INFO - codeparrot_training - Step 863: {'lr': 0.00021575, 'samples': 165888, 'steps': 863, 'loss/train': 1.0142411291599274} 01/26/2022 20:39:28 - INFO - codeparrot_training - Step 864: {'lr': 0.000216, 'samples': 166080, 'steps': 864, 'loss/train': 0.6775917559862137} 01/26/2022 20:39:31 - INFO - codeparrot_training - Step 865: {'lr': 0.00021625, 'samples': 166272, 'steps': 865, 'loss/train': 1.0829683542251587} 01/26/2022 20:39:34 - INFO - codeparrot_training - Step 866: {'lr': 0.0002165, 'samples': 166464, 'steps': 866, 'loss/train': 0.8288234174251556} 01/26/2022 20:39:39 - INFO - codeparrot_training - Step 867: {'lr': 0.00021675, 'samples': 166656, 'steps': 867, 'loss/train': 1.6893358826637268} 01/26/2022 20:39:42 - INFO - codeparrot_training - Step 868: {'lr': 0.00021700000000000002, 'samples': 166848, 'steps': 868, 'loss/train': 1.5192065834999084} 01/26/2022 20:39:45 - INFO - codeparrot_training - Step 869: {'lr': 0.00021725, 'samples': 167040, 'steps': 869, 'loss/train': 0.832884281873703} 01/26/2022 20:39:48 - INFO - codeparrot_training - Step 870: {'lr': 0.0002175, 'samples': 167232, 'steps': 870, 'loss/train': 0.710791289806366} 01/26/2022 20:39:51 - INFO - codeparrot_training - Step 871: {'lr': 0.00021775, 'samples': 167424, 'steps': 871, 'loss/train': 1.0109756290912628} 01/26/2022 20:39:54 - INFO - codeparrot_training - Step 872: {'lr': 0.000218, 'samples': 167616, 'steps': 872, 'loss/train': 0.9577648043632507} 01/26/2022 20:39:58 - INFO - codeparrot_training - Step 873: {'lr': 0.00021825, 'samples': 167808, 'steps': 873, 'loss/train': 0.6962565332651138} 01/26/2022 20:40:01 - INFO - codeparrot_training - Step 874: {'lr': 0.0002185, 'samples': 168000, 'steps': 874, 'loss/train': 1.0033046007156372} 01/26/2022 20:40:04 - INFO - codeparrot_training - Step 875: {'lr': 0.00021875, 'samples': 168192, 'steps': 875, 'loss/train': 0.4613918960094452} 01/26/2022 20:40:10 - INFO - codeparrot_training - Step 876: {'lr': 0.000219, 'samples': 168384, 'steps': 876, 'loss/train': 0.6006049811840057} 01/26/2022 20:40:13 - INFO - codeparrot_training - Step 877: {'lr': 0.00021925000000000002, 'samples': 168576, 'steps': 877, 'loss/train': 1.1248971819877625} 01/26/2022 20:40:16 - INFO - codeparrot_training - Step 878: {'lr': 0.0002195, 'samples': 168768, 'steps': 878, 'loss/train': 0.8051652610301971} 01/26/2022 20:40:20 - INFO - codeparrot_training - Step 879: {'lr': 0.00021975, 'samples': 168960, 'steps': 879, 'loss/train': 0.6274931877851486} 01/26/2022 20:40:23 - INFO - codeparrot_training - Step 880: {'lr': 0.00022, 'samples': 169152, 'steps': 880, 'loss/train': 1.4137959480285645} 01/26/2022 20:40:26 - INFO - codeparrot_training - Step 881: {'lr': 0.00022025000000000001, 'samples': 169344, 'steps': 881, 'loss/train': 1.2263758778572083} 
01/26/2022 20:40:29 - INFO - codeparrot_training - Step 882: {'lr': 0.0002205, 'samples': 169536, 'steps': 882, 'loss/train': 0.9560001790523529} 01/26/2022 20:40:32 - INFO - codeparrot_training - Step 883: {'lr': 0.00022075, 'samples': 169728, 'steps': 883, 'loss/train': 1.1473515629768372} 01/26/2022 20:40:36 - INFO - codeparrot_training - Step 884: {'lr': 0.000221, 'samples': 169920, 'steps': 884, 'loss/train': 0.3859972804784775} 01/26/2022 20:40:40 - INFO - codeparrot_training - Step 885: {'lr': 0.00022125, 'samples': 170112, 'steps': 885, 'loss/train': 1.1466841399669647} 01/26/2022 20:40:43 - INFO - codeparrot_training - Step 886: {'lr': 0.00022150000000000002, 'samples': 170304, 'steps': 886, 'loss/train': 0.6745972484350204} 01/26/2022 20:40:46 - INFO - codeparrot_training - Step 887: {'lr': 0.00022175, 'samples': 170496, 'steps': 887, 'loss/train': 0.8258094191551208} 01/26/2022 20:40:49 - INFO - codeparrot_training - Step 888: {'lr': 0.000222, 'samples': 170688, 'steps': 888, 'loss/train': 0.4547792226076126} 01/26/2022 20:40:52 - INFO - codeparrot_training - Step 889: {'lr': 0.00022225, 'samples': 170880, 'steps': 889, 'loss/train': 1.1471035480499268} 01/26/2022 20:40:55 - INFO - codeparrot_training - Step 890: {'lr': 0.00022250000000000001, 'samples': 171072, 'steps': 890, 'loss/train': 1.7396194338798523} 01/26/2022 20:40:58 - INFO - codeparrot_training - Step 891: {'lr': 0.00022275000000000002, 'samples': 171264, 'steps': 891, 'loss/train': 0.4545985758304596} 01/26/2022 20:41:02 - INFO - codeparrot_training - Step 892: {'lr': 0.000223, 'samples': 171456, 'steps': 892, 'loss/train': 0.8459577262401581} 01/26/2022 20:41:06 - INFO - codeparrot_training - Step 893: {'lr': 0.00022325, 'samples': 171648, 'steps': 893, 'loss/train': 0.9746755957603455} 01/26/2022 20:41:09 - INFO - codeparrot_training - Step 894: {'lr': 0.0002235, 'samples': 171840, 'steps': 894, 'loss/train': 0.8435381054878235} 01/26/2022 20:41:12 - INFO - codeparrot_training - Step 895: {'lr': 0.00022375000000000002, 'samples': 172032, 'steps': 895, 'loss/train': 1.2720170617103577} 01/26/2022 20:41:15 - INFO - codeparrot_training - Step 896: {'lr': 0.000224, 'samples': 172224, 'steps': 896, 'loss/train': 0.9266642332077026} 01/26/2022 20:41:19 - INFO - codeparrot_training - Step 897: {'lr': 0.00022425, 'samples': 172416, 'steps': 897, 'loss/train': 1.1812182068824768} 01/26/2022 20:41:22 - INFO - codeparrot_training - Step 898: {'lr': 0.0002245, 'samples': 172608, 'steps': 898, 'loss/train': 0.930000901222229} 01/26/2022 20:41:25 - INFO - codeparrot_training - Step 899: {'lr': 0.00022475000000000001, 'samples': 172800, 'steps': 899, 'loss/train': 0.6613834798336029} 01/26/2022 20:41:28 - INFO - codeparrot_training - Step 900: {'lr': 0.00022500000000000002, 'samples': 172992, 'steps': 900, 'loss/train': 1.0311627388000488} 01/26/2022 20:41:31 - INFO - codeparrot_training - Step 901: {'lr': 0.00022525, 'samples': 173184, 'steps': 901, 'loss/train': 0.3564440757036209} 01/26/2022 20:41:38 - INFO - codeparrot_training - Step 902: {'lr': 0.0002255, 'samples': 173376, 'steps': 902, 'loss/train': 1.0167976319789886} 01/26/2022 20:41:41 - INFO - codeparrot_training - Step 903: {'lr': 0.00022575, 'samples': 173568, 'steps': 903, 'loss/train': 1.3098012506961823} 01/26/2022 20:41:44 - INFO - codeparrot_training - Step 904: {'lr': 0.00022600000000000002, 'samples': 173760, 'steps': 904, 'loss/train': 0.791702538728714} 01/26/2022 20:41:47 - INFO - codeparrot_training - Step 905: {'lr': 0.00022625000000000002, 'samples': 
173952, 'steps': 905, 'loss/train': 0.9221445322036743} 01/26/2022 20:41:50 - INFO - codeparrot_training - Step 906: {'lr': 0.0002265, 'samples': 174144, 'steps': 906, 'loss/train': 0.8842598497867584} 01/26/2022 20:41:53 - INFO - codeparrot_training - Step 907: {'lr': 0.00022675, 'samples': 174336, 'steps': 907, 'loss/train': 1.2631148099899292} 01/26/2022 20:41:56 - INFO - codeparrot_training - Step 908: {'lr': 0.00022700000000000002, 'samples': 174528, 'steps': 908, 'loss/train': 1.1954034268856049} 01/26/2022 20:42:00 - INFO - codeparrot_training - Step 909: {'lr': 0.00022725000000000002, 'samples': 174720, 'steps': 909, 'loss/train': 1.0482499301433563} 01/26/2022 20:42:03 - INFO - codeparrot_training - Step 910: {'lr': 0.0002275, 'samples': 174912, 'steps': 910, 'loss/train': 0.7730309665203094} 01/26/2022 20:42:07 - INFO - codeparrot_training - Step 911: {'lr': 0.00022775, 'samples': 175104, 'steps': 911, 'loss/train': 0.5209553986787796} 01/26/2022 20:42:10 - INFO - codeparrot_training - Step 912: {'lr': 0.000228, 'samples': 175296, 'steps': 912, 'loss/train': 1.046157717704773} 01/26/2022 20:42:13 - INFO - codeparrot_training - Step 913: {'lr': 0.00022825000000000002, 'samples': 175488, 'steps': 913, 'loss/train': 0.655177965760231} 01/26/2022 20:42:16 - INFO - codeparrot_training - Step 914: {'lr': 0.00022850000000000002, 'samples': 175680, 'steps': 914, 'loss/train': 0.6155382245779037} 01/26/2022 20:42:20 - INFO - codeparrot_training - Step 915: {'lr': 0.00022875, 'samples': 175872, 'steps': 915, 'loss/train': 1.552494764328003} 01/26/2022 20:42:23 - INFO - codeparrot_training - Step 916: {'lr': 0.000229, 'samples': 176064, 'steps': 916, 'loss/train': 0.7812283337116241} 01/26/2022 20:42:26 - INFO - codeparrot_training - Step 917: {'lr': 0.00022925000000000002, 'samples': 176256, 'steps': 917, 'loss/train': 1.674992859363556} 01/26/2022 20:42:29 - INFO - codeparrot_training - Step 918: {'lr': 0.00022950000000000002, 'samples': 176448, 'steps': 918, 'loss/train': 0.7351734638214111} 01/26/2022 20:42:32 - INFO - codeparrot_training - Step 919: {'lr': 0.00022975000000000003, 'samples': 176640, 'steps': 919, 'loss/train': 0.8693040311336517} 01/26/2022 20:42:38 - INFO - codeparrot_training - Step 920: {'lr': 0.00023, 'samples': 176832, 'steps': 920, 'loss/train': 0.5426577776670456} 01/26/2022 20:42:41 - INFO - codeparrot_training - Step 921: {'lr': 0.00023025, 'samples': 177024, 'steps': 921, 'loss/train': 1.1854038834571838} 01/26/2022 20:42:45 - INFO - codeparrot_training - Step 922: {'lr': 0.00023050000000000002, 'samples': 177216, 'steps': 922, 'loss/train': 1.2404606938362122} 01/26/2022 20:42:48 - INFO - codeparrot_training - Step 923: {'lr': 0.00023075000000000003, 'samples': 177408, 'steps': 923, 'loss/train': 0.4229538291692734} 01/26/2022 20:42:51 - INFO - codeparrot_training - Step 924: {'lr': 0.000231, 'samples': 177600, 'steps': 924, 'loss/train': 1.0949006080627441} 01/26/2022 20:42:54 - INFO - codeparrot_training - Step 925: {'lr': 0.00023125, 'samples': 177792, 'steps': 925, 'loss/train': 0.6338553428649902} 01/26/2022 20:42:57 - INFO - codeparrot_training - Step 926: {'lr': 0.00023150000000000002, 'samples': 177984, 'steps': 926, 'loss/train': 0.7405180782079697} 01/26/2022 20:43:00 - INFO - codeparrot_training - Step 927: {'lr': 0.00023175000000000002, 'samples': 178176, 'steps': 927, 'loss/train': 1.1272001266479492} 01/26/2022 20:43:03 - INFO - codeparrot_training - Step 928: {'lr': 0.00023200000000000003, 'samples': 178368, 'steps': 928, 'loss/train': 
0.3659774512052536} 01/26/2022 20:43:08 - INFO - codeparrot_training - Step 929: {'lr': 0.00023225, 'samples': 178560, 'steps': 929, 'loss/train': 0.8474507331848145} 01/26/2022 20:43:11 - INFO - codeparrot_training - Step 930: {'lr': 0.0002325, 'samples': 178752, 'steps': 930, 'loss/train': 0.7008123099803925} 01/26/2022 20:43:14 - INFO - codeparrot_training - Step 931: {'lr': 0.00023275000000000002, 'samples': 178944, 'steps': 931, 'loss/train': 0.6545947194099426} 01/26/2022 20:43:17 - INFO - codeparrot_training - Step 932: {'lr': 0.00023300000000000003, 'samples': 179136, 'steps': 932, 'loss/train': 1.0568382740020752} 01/26/2022 20:43:20 - INFO - codeparrot_training - Step 933: {'lr': 0.00023325, 'samples': 179328, 'steps': 933, 'loss/train': 0.4824586361646652} 01/26/2022 20:43:23 - INFO - codeparrot_training - Step 934: {'lr': 0.0002335, 'samples': 179520, 'steps': 934, 'loss/train': 0.3725056126713753} 01/26/2022 20:43:27 - INFO - codeparrot_training - Step 935: {'lr': 0.00023375000000000002, 'samples': 179712, 'steps': 935, 'loss/train': 0.4487442076206207} 01/26/2022 20:43:30 - INFO - codeparrot_training - Step 936: {'lr': 0.00023400000000000002, 'samples': 179904, 'steps': 936, 'loss/train': 0.9568904042243958} 01/26/2022 20:43:33 - INFO - codeparrot_training - Step 937: {'lr': 0.00023425000000000003, 'samples': 180096, 'steps': 937, 'loss/train': 1.1710163354873657} 01/26/2022 20:43:37 - INFO - codeparrot_training - Step 938: {'lr': 0.00023449999999999998, 'samples': 180288, 'steps': 938, 'loss/train': 1.0348673164844513} 01/26/2022 20:43:40 - INFO - codeparrot_training - Step 939: {'lr': 0.00023475, 'samples': 180480, 'steps': 939, 'loss/train': 1.2007386088371277} 01/26/2022 20:43:44 - INFO - codeparrot_training - Step 940: {'lr': 0.000235, 'samples': 180672, 'steps': 940, 'loss/train': 1.03763347864151} 01/26/2022 20:43:47 - INFO - codeparrot_training - Step 941: {'lr': 0.00023525, 'samples': 180864, 'steps': 941, 'loss/train': 1.4499427378177643} 01/26/2022 20:43:50 - INFO - codeparrot_training - Step 942: {'lr': 0.0002355, 'samples': 181056, 'steps': 942, 'loss/train': 1.1046107411384583} 01/26/2022 20:43:53 - INFO - codeparrot_training - Step 943: {'lr': 0.00023574999999999998, 'samples': 181248, 'steps': 943, 'loss/train': 0.8374215960502625} 01/26/2022 20:43:56 - INFO - codeparrot_training - Step 944: {'lr': 0.000236, 'samples': 181440, 'steps': 944, 'loss/train': 0.8754523694515228} 01/26/2022 20:43:59 - INFO - codeparrot_training - Step 945: {'lr': 0.00023625, 'samples': 181632, 'steps': 945, 'loss/train': 1.2402465641498566} 01/26/2022 20:44:04 - INFO - codeparrot_training - Step 946: {'lr': 0.0002365, 'samples': 181824, 'steps': 946, 'loss/train': 0.6099693328142166} 01/26/2022 20:44:07 - INFO - codeparrot_training - Step 947: {'lr': 0.00023674999999999998, 'samples': 182016, 'steps': 947, 'loss/train': 0.970650851726532} 01/26/2022 20:44:10 - INFO - codeparrot_training - Step 948: {'lr': 0.000237, 'samples': 182208, 'steps': 948, 'loss/train': 0.5941029936075211} 01/26/2022 20:44:13 - INFO - codeparrot_training - Step 949: {'lr': 0.00023725, 'samples': 182400, 'steps': 949, 'loss/train': 1.2262366712093353} 01/26/2022 20:44:16 - INFO - codeparrot_training - Step 950: {'lr': 0.0002375, 'samples': 182592, 'steps': 950, 'loss/train': 0.9913835227489471} 01/26/2022 20:44:19 - INFO - codeparrot_training - Step 951: {'lr': 0.00023775, 'samples': 182784, 'steps': 951, 'loss/train': 0.7844657599925995} 01/26/2022 20:44:23 - INFO - codeparrot_training - Step 952: {'lr': 
0.00023799999999999998, 'samples': 182976, 'steps': 952, 'loss/train': 1.0873497426509857} 01/26/2022 20:44:26 - INFO - codeparrot_training - Step 953: {'lr': 0.00023825, 'samples': 183168, 'steps': 953, 'loss/train': 0.8692873120307922} 01/26/2022 20:44:29 - INFO - codeparrot_training - Step 954: {'lr': 0.0002385, 'samples': 183360, 'steps': 954, 'loss/train': 0.6407058835029602} 01/26/2022 20:44:35 - INFO - codeparrot_training - Step 955: {'lr': 0.00023875, 'samples': 183552, 'steps': 955, 'loss/train': 1.2325417399406433} 01/26/2022 20:44:38 - INFO - codeparrot_training - Step 956: {'lr': 0.00023899999999999998, 'samples': 183744, 'steps': 956, 'loss/train': 0.9815036952495575} 01/26/2022 20:44:41 - INFO - codeparrot_training - Step 957: {'lr': 0.00023925, 'samples': 183936, 'steps': 957, 'loss/train': 1.0643342435359955} 01/26/2022 20:44:45 - INFO - codeparrot_training - Step 958: {'lr': 0.0002395, 'samples': 184128, 'steps': 958, 'loss/train': 0.570918470621109} 01/26/2022 20:44:48 - INFO - codeparrot_training - Step 959: {'lr': 0.00023975, 'samples': 184320, 'steps': 959, 'loss/train': 1.0236015021800995} 01/26/2022 20:44:51 - INFO - codeparrot_training - Step 960: {'lr': 0.00024, 'samples': 184512, 'steps': 960, 'loss/train': 1.0369458496570587} 01/26/2022 20:44:54 - INFO - codeparrot_training - Step 961: {'lr': 0.00024024999999999999, 'samples': 184704, 'steps': 961, 'loss/train': 0.9108691513538361} 01/26/2022 20:44:57 - INFO - codeparrot_training - Step 962: {'lr': 0.0002405, 'samples': 184896, 'steps': 962, 'loss/train': 0.8744374215602875} 01/26/2022 20:45:00 - INFO - codeparrot_training - Step 963: {'lr': 0.00024075, 'samples': 185088, 'steps': 963, 'loss/train': 0.7000968307256699} 01/26/2022 20:45:05 - INFO - codeparrot_training - Step 964: {'lr': 0.000241, 'samples': 185280, 'steps': 964, 'loss/train': 1.119715690612793} 01/26/2022 20:45:08 - INFO - codeparrot_training - Step 965: {'lr': 0.00024125, 'samples': 185472, 'steps': 965, 'loss/train': 1.2492819428443909} 01/26/2022 20:45:11 - INFO - codeparrot_training - Step 966: {'lr': 0.0002415, 'samples': 185664, 'steps': 966, 'loss/train': 1.006849229335785} 01/26/2022 20:45:14 - INFO - codeparrot_training - Step 967: {'lr': 0.00024175, 'samples': 185856, 'steps': 967, 'loss/train': 0.8514295220375061} 01/26/2022 20:45:17 - INFO - codeparrot_training - Step 968: {'lr': 0.000242, 'samples': 186048, 'steps': 968, 'loss/train': 0.9313877820968628} 01/26/2022 20:45:20 - INFO - codeparrot_training - Step 969: {'lr': 0.00024225, 'samples': 186240, 'steps': 969, 'loss/train': 0.772581160068512} 01/26/2022 20:45:23 - INFO - codeparrot_training - Step 970: {'lr': 0.00024249999999999999, 'samples': 186432, 'steps': 970, 'loss/train': 0.7749391794204712} 01/26/2022 20:45:26 - INFO - codeparrot_training - Step 971: {'lr': 0.00024275, 'samples': 186624, 'steps': 971, 'loss/train': 1.497967153787613} 01/26/2022 20:45:30 - INFO - codeparrot_training - Step 972: {'lr': 0.000243, 'samples': 186816, 'steps': 972, 'loss/train': 0.6079481989145279} 01/26/2022 20:45:34 - INFO - codeparrot_training - Step 973: {'lr': 0.00024325, 'samples': 187008, 'steps': 973, 'loss/train': 1.072899878025055} 01/26/2022 20:45:37 - INFO - codeparrot_training - Step 974: {'lr': 0.0002435, 'samples': 187200, 'steps': 974, 'loss/train': 0.8720228970050812} 01/26/2022 20:45:40 - INFO - codeparrot_training - Step 975: {'lr': 0.00024375, 'samples': 187392, 'steps': 975, 'loss/train': 0.4871988594532013} 01/26/2022 20:45:44 - INFO - codeparrot_training - Step 976: 
{'lr': 0.000244, 'samples': 187584, 'steps': 976, 'loss/train': 0.7950179278850555} 01/26/2022 20:45:47 - INFO - codeparrot_training - Step 977: {'lr': 0.00024425, 'samples': 187776, 'steps': 977, 'loss/train': 1.214974969625473} 01/26/2022 20:45:50 - INFO - codeparrot_training - Step 978: {'lr': 0.0002445, 'samples': 187968, 'steps': 978, 'loss/train': 0.663908064365387} 01/26/2022 20:45:53 - INFO - codeparrot_training - Step 979: {'lr': 0.00024475, 'samples': 188160, 'steps': 979, 'loss/train': 0.6878142356872559} 01/26/2022 20:45:56 - INFO - codeparrot_training - Step 980: {'lr': 0.000245, 'samples': 188352, 'steps': 980, 'loss/train': 0.900773674249649} 01/26/2022 20:46:02 - INFO - codeparrot_training - Step 981: {'lr': 0.00024525, 'samples': 188544, 'steps': 981, 'loss/train': 0.9617729187011719} 01/26/2022 20:46:05 - INFO - codeparrot_training - Step 982: {'lr': 0.0002455, 'samples': 188736, 'steps': 982, 'loss/train': 0.7470361590385437} 01/26/2022 20:46:09 - INFO - codeparrot_training - Step 983: {'lr': 0.00024575, 'samples': 188928, 'steps': 983, 'loss/train': 1.06081885099411} 01/26/2022 20:46:12 - INFO - codeparrot_training - Step 984: {'lr': 0.000246, 'samples': 189120, 'steps': 984, 'loss/train': 0.6491813957691193} 01/26/2022 20:46:15 - INFO - codeparrot_training - Step 985: {'lr': 0.00024625, 'samples': 189312, 'steps': 985, 'loss/train': 0.8732985556125641} 01/26/2022 20:46:18 - INFO - codeparrot_training - Step 986: {'lr': 0.00024650000000000003, 'samples': 189504, 'steps': 986, 'loss/train': 0.8752740919589996} 01/26/2022 20:46:21 - INFO - codeparrot_training - Step 987: {'lr': 0.00024675, 'samples': 189696, 'steps': 987, 'loss/train': 1.5399380922317505} 01/26/2022 20:46:24 - INFO - codeparrot_training - Step 988: {'lr': 0.000247, 'samples': 189888, 'steps': 988, 'loss/train': 0.9013247787952423} 01/26/2022 20:46:27 - INFO - codeparrot_training - Step 989: {'lr': 0.00024725, 'samples': 190080, 'steps': 989, 'loss/train': 1.1562097370624542} 01/26/2022 20:46:32 - INFO - codeparrot_training - Step 990: {'lr': 0.0002475, 'samples': 190272, 'steps': 990, 'loss/train': 0.6458780318498611} 01/26/2022 20:46:35 - INFO - codeparrot_training - Step 991: {'lr': 0.00024775, 'samples': 190464, 'steps': 991, 'loss/train': 1.0805221796035767} 01/26/2022 20:46:38 - INFO - codeparrot_training - Step 992: {'lr': 0.000248, 'samples': 190656, 'steps': 992, 'loss/train': 0.8056624531745911} 01/26/2022 20:46:41 - INFO - codeparrot_training - Step 993: {'lr': 0.00024825, 'samples': 190848, 'steps': 993, 'loss/train': 0.6836653500795364} 01/26/2022 20:46:44 - INFO - codeparrot_training - Step 994: {'lr': 0.0002485, 'samples': 191040, 'steps': 994, 'loss/train': 0.6399315297603607} 01/26/2022 20:46:48 - INFO - codeparrot_training - Step 995: {'lr': 0.00024875, 'samples': 191232, 'steps': 995, 'loss/train': 0.6765156090259552} 01/26/2022 20:46:51 - INFO - codeparrot_training - Step 996: {'lr': 0.000249, 'samples': 191424, 'steps': 996, 'loss/train': 1.0690458118915558} 01/26/2022 20:46:54 - INFO - codeparrot_training - Step 997: {'lr': 0.00024925, 'samples': 191616, 'steps': 997, 'loss/train': 0.9835208058357239} 01/26/2022 20:46:57 - INFO - codeparrot_training - Step 998: {'lr': 0.0002495, 'samples': 191808, 'steps': 998, 'loss/train': 1.2982279658317566} 01/26/2022 20:47:01 - INFO - codeparrot_training - Step 999: {'lr': 0.00024975, 'samples': 192000, 'steps': 999, 'loss/train': 1.1632397174835205} 01/26/2022 20:47:04 - INFO - codeparrot_training - Step 1000: {'lr': 0.00025, 'samples': 192192, 
'steps': 1000, 'loss/train': 1.7013718485832214} 01/26/2022 20:47:08 - INFO - codeparrot_training - Step 1001: {'lr': 0.00025025, 'samples': 192384, 'steps': 1001, 'loss/train': 0.9946185350418091} 01/26/2022 20:47:11 - INFO - codeparrot_training - Step 1002: {'lr': 0.0002505, 'samples': 192576, 'steps': 1002, 'loss/train': 1.195780634880066} 01/26/2022 20:47:14 - INFO - codeparrot_training - Step 1003: {'lr': 0.00025075, 'samples': 192768, 'steps': 1003, 'loss/train': 0.4232182949781418} 01/26/2022 20:47:17 - INFO - codeparrot_training - Step 1004: {'lr': 0.00025100000000000003, 'samples': 192960, 'steps': 1004, 'loss/train': 0.876092255115509} 01/26/2022 20:47:20 - INFO - codeparrot_training - Step 1005: {'lr': 0.00025124999999999995, 'samples': 193152, 'steps': 1005, 'loss/train': 0.7324695289134979} 01/26/2022 20:47:23 - INFO - codeparrot_training - Step 1006: {'lr': 0.0002515, 'samples': 193344, 'steps': 1006, 'loss/train': 0.9005085825920105} 01/26/2022 20:47:26 - INFO - codeparrot_training - Step 1007: {'lr': 0.00025174999999999997, 'samples': 193536, 'steps': 1007, 'loss/train': 0.7350870072841644} 01/26/2022 20:47:32 - INFO - codeparrot_training - Step 1008: {'lr': 0.000252, 'samples': 193728, 'steps': 1008, 'loss/train': 0.7800723016262054} 01/26/2022 20:47:36 - INFO - codeparrot_training - Step 1009: {'lr': 0.00025225, 'samples': 193920, 'steps': 1009, 'loss/train': 0.8280471861362457} 01/26/2022 20:47:39 - INFO - codeparrot_training - Step 1010: {'lr': 0.0002525, 'samples': 194112, 'steps': 1010, 'loss/train': 1.1435025036334991} 01/26/2022 20:47:42 - INFO - codeparrot_training - Step 1011: {'lr': 0.00025275, 'samples': 194304, 'steps': 1011, 'loss/train': 0.8306262195110321} 01/26/2022 20:47:45 - INFO - codeparrot_training - Step 1012: {'lr': 0.000253, 'samples': 194496, 'steps': 1012, 'loss/train': 0.9594416320323944} 01/26/2022 20:47:48 - INFO - codeparrot_training - Step 1013: {'lr': 0.00025325, 'samples': 194688, 'steps': 1013, 'loss/train': 0.9443064630031586} 01/26/2022 20:47:52 - INFO - codeparrot_training - Step 1014: {'lr': 0.0002535, 'samples': 194880, 'steps': 1014, 'loss/train': 0.9492665827274323} 01/26/2022 20:47:55 - INFO - codeparrot_training - Step 1015: {'lr': 0.00025374999999999996, 'samples': 195072, 'steps': 1015, 'loss/train': 1.1060684323310852} 01/26/2022 20:47:58 - INFO - codeparrot_training - Step 1016: {'lr': 0.000254, 'samples': 195264, 'steps': 1016, 'loss/train': 1.4302572906017303} 01/26/2022 20:48:02 - INFO - codeparrot_training - Step 1017: {'lr': 0.00025425, 'samples': 195456, 'steps': 1017, 'loss/train': 1.0134314596652985} 01/26/2022 20:48:05 - INFO - codeparrot_training - Step 1018: {'lr': 0.0002545, 'samples': 195648, 'steps': 1018, 'loss/train': 1.147715002298355} 01/26/2022 20:48:09 - INFO - codeparrot_training - Step 1019: {'lr': 0.00025475, 'samples': 195840, 'steps': 1019, 'loss/train': 0.834953248500824} 01/26/2022 20:48:12 - INFO - codeparrot_training - Step 1020: {'lr': 0.000255, 'samples': 196032, 'steps': 1020, 'loss/train': 0.7808264493942261} 01/26/2022 20:48:15 - INFO - codeparrot_training - Step 1021: {'lr': 0.00025525, 'samples': 196224, 'steps': 1021, 'loss/train': 0.19913220405578613} 01/26/2022 20:48:18 - INFO - codeparrot_training - Step 1022: {'lr': 0.00025550000000000003, 'samples': 196416, 'steps': 1022, 'loss/train': 1.3839197158813477} 01/26/2022 20:48:21 - INFO - codeparrot_training - Step 1023: {'lr': 0.00025575, 'samples': 196608, 'steps': 1023, 'loss/train': 0.697216808795929} 01/26/2022 20:48:24 - INFO - 
codeparrot_training - Step 1024: {'lr': 0.000256, 'samples': 196800, 'steps': 1024, 'loss/train': 0.7201445996761322} 01/26/2022 20:48:30 - INFO - codeparrot_training - Step 1025: {'lr': 0.00025624999999999997, 'samples': 196992, 'steps': 1025, 'loss/train': 0.802868127822876} 01/26/2022 20:48:33 - INFO - codeparrot_training - Step 1026: {'lr': 0.0002565, 'samples': 197184, 'steps': 1026, 'loss/train': 0.8758995831012726} 01/26/2022 20:48:37 - INFO - codeparrot_training - Step 1027: {'lr': 0.00025675, 'samples': 197376, 'steps': 1027, 'loss/train': 0.8112261593341827} 01/26/2022 20:48:40 - INFO - codeparrot_training - Step 1028: {'lr': 0.000257, 'samples': 197568, 'steps': 1028, 'loss/train': 1.1565943658351898} 01/26/2022 20:48:43 - INFO - codeparrot_training - Step 1029: {'lr': 0.00025725, 'samples': 197760, 'steps': 1029, 'loss/train': 1.0054325759410858} 01/26/2022 20:48:46 - INFO - codeparrot_training - Step 1030: {'lr': 0.0002575, 'samples': 197952, 'steps': 1030, 'loss/train': 1.1029196977615356} 01/26/2022 20:48:49 - INFO - codeparrot_training - Step 1031: {'lr': 0.00025775, 'samples': 198144, 'steps': 1031, 'loss/train': 0.7207678556442261} 01/26/2022 20:48:52 - INFO - codeparrot_training - Step 1032: {'lr': 0.00025800000000000004, 'samples': 198336, 'steps': 1032, 'loss/train': 0.9148613512516022} 01/26/2022 20:48:55 - INFO - codeparrot_training - Step 1033: {'lr': 0.00025824999999999996, 'samples': 198528, 'steps': 1033, 'loss/train': 0.9676394462585449} 01/26/2022 20:49:00 - INFO - codeparrot_training - Step 1034: {'lr': 0.0002585, 'samples': 198720, 'steps': 1034, 'loss/train': 0.6324540227651596} 01/26/2022 20:49:03 - INFO - codeparrot_training - Step 1035: {'lr': 0.00025875, 'samples': 198912, 'steps': 1035, 'loss/train': 0.6709214597940445} 01/26/2022 20:49:06 - INFO - codeparrot_training - Step 1036: {'lr': 0.000259, 'samples': 199104, 'steps': 1036, 'loss/train': 0.915786623954773} 01/26/2022 20:49:09 - INFO - codeparrot_training - Step 1037: {'lr': 0.00025925, 'samples': 199296, 'steps': 1037, 'loss/train': 0.8128197491168976} 01/26/2022 20:49:12 - INFO - codeparrot_training - Step 1038: {'lr': 0.0002595, 'samples': 199488, 'steps': 1038, 'loss/train': 0.7403267025947571} 01/26/2022 20:49:16 - INFO - codeparrot_training - Step 1039: {'lr': 0.00025975, 'samples': 199680, 'steps': 1039, 'loss/train': 0.88175368309021} 01/26/2022 20:49:19 - INFO - codeparrot_training - Step 1040: {'lr': 0.00026000000000000003, 'samples': 199872, 'steps': 1040, 'loss/train': 0.9358507096767426} 01/26/2022 20:49:22 - INFO - codeparrot_training - Step 1041: {'lr': 0.00026025, 'samples': 200064, 'steps': 1041, 'loss/train': 0.9998910427093506} 01/26/2022 20:49:25 - INFO - codeparrot_training - Step 1042: {'lr': 0.0002605, 'samples': 200256, 'steps': 1042, 'loss/train': 0.5532927811145782} 01/26/2022 20:49:30 - INFO - codeparrot_training - Step 1043: {'lr': 0.00026074999999999997, 'samples': 200448, 'steps': 1043, 'loss/train': 1.051554411649704} 01/26/2022 20:49:33 - INFO - codeparrot_training - Step 1044: {'lr': 0.000261, 'samples': 200640, 'steps': 1044, 'loss/train': 0.8074525594711304} 01/26/2022 20:49:36 - INFO - codeparrot_training - Step 1045: {'lr': 0.00026125, 'samples': 200832, 'steps': 1045, 'loss/train': 0.8834658265113831} 01/26/2022 20:49:39 - INFO - codeparrot_training - Step 1046: {'lr': 0.0002615, 'samples': 201024, 'steps': 1046, 'loss/train': 0.7426712214946747} 01/26/2022 20:49:42 - INFO - codeparrot_training - Step 1047: {'lr': 0.00026175, 'samples': 201216, 'steps': 1047, 
'loss/train': 1.4034993946552277} 01/26/2022 20:49:45 - INFO - codeparrot_training - Step 1048: {'lr': 0.000262, 'samples': 201408, 'steps': 1048, 'loss/train': 0.7991145551204681} 01/26/2022 20:49:48 - INFO - codeparrot_training - Step 1049: {'lr': 0.00026225, 'samples': 201600, 'steps': 1049, 'loss/train': 1.1421080231666565} 01/26/2022 20:49:52 - INFO - codeparrot_training - Step 1050: {'lr': 0.00026250000000000004, 'samples': 201792, 'steps': 1050, 'loss/train': 1.2328433096408844} 01/26/2022 20:49:56 - INFO - codeparrot_training - Step 1051: {'lr': 0.00026274999999999996, 'samples': 201984, 'steps': 1051, 'loss/train': 1.2644303441047668} 01/26/2022 20:49:59 - INFO - codeparrot_training - Step 1052: {'lr': 0.000263, 'samples': 202176, 'steps': 1052, 'loss/train': 1.3665103018283844} 01/26/2022 20:50:02 - INFO - codeparrot_training - Step 1053: {'lr': 0.00026325, 'samples': 202368, 'steps': 1053, 'loss/train': 1.5380902290344238} 01/26/2022 20:50:06 - INFO - codeparrot_training - Step 1054: {'lr': 0.0002635, 'samples': 202560, 'steps': 1054, 'loss/train': 0.954221248626709} 01/26/2022 20:50:09 - INFO - codeparrot_training - Step 1055: {'lr': 0.00026375, 'samples': 202752, 'steps': 1055, 'loss/train': 1.0929900705814362} 01/26/2022 20:50:12 - INFO - codeparrot_training - Step 1056: {'lr': 0.000264, 'samples': 202944, 'steps': 1056, 'loss/train': 0.8090872764587402} 01/26/2022 20:50:15 - INFO - codeparrot_training - Step 1057: {'lr': 0.00026425, 'samples': 203136, 'steps': 1057, 'loss/train': 0.904187947511673} 01/26/2022 20:50:18 - INFO - codeparrot_training - Step 1058: {'lr': 0.00026450000000000003, 'samples': 203328, 'steps': 1058, 'loss/train': 1.3563102185726166} 01/26/2022 20:50:21 - INFO - codeparrot_training - Step 1059: {'lr': 0.00026475, 'samples': 203520, 'steps': 1059, 'loss/train': 0.8256677091121674} 01/26/2022 20:50:28 - INFO - codeparrot_training - Step 1060: {'lr': 0.00026500000000000004, 'samples': 203712, 'steps': 1060, 'loss/train': 1.652735710144043} 01/26/2022 20:50:31 - INFO - codeparrot_training - Step 1061: {'lr': 0.00026524999999999997, 'samples': 203904, 'steps': 1061, 'loss/train': 1.290626049041748} 01/26/2022 20:50:34 - INFO - codeparrot_training - Step 1062: {'lr': 0.0002655, 'samples': 204096, 'steps': 1062, 'loss/train': 0.7257387936115265} 01/26/2022 20:50:37 - INFO - codeparrot_training - Step 1063: {'lr': 0.00026575, 'samples': 204288, 'steps': 1063, 'loss/train': 1.090696781873703} 01/26/2022 20:50:40 - INFO - codeparrot_training - Step 1064: {'lr': 0.000266, 'samples': 204480, 'steps': 1064, 'loss/train': 1.3471647799015045} 01/26/2022 20:50:43 - INFO - codeparrot_training - Step 1065: {'lr': 0.00026625, 'samples': 204672, 'steps': 1065, 'loss/train': 1.224743664264679} 01/26/2022 20:50:46 - INFO - codeparrot_training - Step 1066: {'lr': 0.0002665, 'samples': 204864, 'steps': 1066, 'loss/train': 0.9322084486484528} 01/26/2022 20:50:50 - INFO - codeparrot_training - Step 1067: {'lr': 0.00026675, 'samples': 205056, 'steps': 1067, 'loss/train': 1.0117403268814087} 01/26/2022 20:50:53 - INFO - codeparrot_training - Step 1068: {'lr': 0.00026700000000000004, 'samples': 205248, 'steps': 1068, 'loss/train': 1.013206422328949} 01/26/2022 20:50:57 - INFO - codeparrot_training - Step 1069: {'lr': 0.00026725, 'samples': 205440, 'steps': 1069, 'loss/train': 1.4741791784763336} 01/26/2022 20:51:00 - INFO - codeparrot_training - Step 1070: {'lr': 0.0002675, 'samples': 205632, 'steps': 1070, 'loss/train': 1.0951833128929138} 01/26/2022 20:51:03 - INFO - 
codeparrot_training - Step 1071: {'lr': 0.00026775, 'samples': 205824, 'steps': 1071, 'loss/train': 0.48377394676208496} 01/26/2022 20:51:07 - INFO - codeparrot_training - Step 1072: {'lr': 0.000268, 'samples': 206016, 'steps': 1072, 'loss/train': 0.8418543040752411} 01/26/2022 20:51:10 - INFO - codeparrot_training - Step 1073: {'lr': 0.00026825, 'samples': 206208, 'steps': 1073, 'loss/train': 0.6488857716321945} 01/26/2022 20:51:13 - INFO - codeparrot_training - Step 1074: {'lr': 0.0002685, 'samples': 206400, 'steps': 1074, 'loss/train': 0.3778975009918213} 01/26/2022 20:51:16 - INFO - codeparrot_training - Step 1075: {'lr': 0.00026875, 'samples': 206592, 'steps': 1075, 'loss/train': 1.105844110250473} 01/26/2022 20:51:19 - INFO - codeparrot_training - Step 1076: {'lr': 0.00026900000000000003, 'samples': 206784, 'steps': 1076, 'loss/train': 0.9468898773193359} 01/26/2022 20:51:22 - INFO - codeparrot_training - Step 1077: {'lr': 0.00026925, 'samples': 206976, 'steps': 1077, 'loss/train': 0.98614302277565} 01/26/2022 20:51:27 - INFO - codeparrot_training - Step 1078: {'lr': 0.00026950000000000005, 'samples': 207168, 'steps': 1078, 'loss/train': 0.49622131884098053} 01/26/2022 20:51:30 - INFO - codeparrot_training - Step 1079: {'lr': 0.00026974999999999997, 'samples': 207360, 'steps': 1079, 'loss/train': 0.7102539986371994} 01/26/2022 20:51:33 - INFO - codeparrot_training - Step 1080: {'lr': 0.00027, 'samples': 207552, 'steps': 1080, 'loss/train': 1.3239697515964508} 01/26/2022 20:51:36 - INFO - codeparrot_training - Step 1081: {'lr': 0.00027025, 'samples': 207744, 'steps': 1081, 'loss/train': 0.71196149289608} 01/26/2022 20:51:39 - INFO - codeparrot_training - Step 1082: {'lr': 0.0002705, 'samples': 207936, 'steps': 1082, 'loss/train': 1.1662839353084564} 01/26/2022 20:51:42 - INFO - codeparrot_training - Step 1083: {'lr': 0.00027075, 'samples': 208128, 'steps': 1083, 'loss/train': 0.24068096280097961} 01/26/2022 20:51:45 - INFO - codeparrot_training - Step 1084: {'lr': 0.00027100000000000003, 'samples': 208320, 'steps': 1084, 'loss/train': 0.9668920934200287} 01/26/2022 20:51:48 - INFO - codeparrot_training - Step 1085: {'lr': 0.00027125, 'samples': 208512, 'steps': 1085, 'loss/train': 0.518968865275383} 01/26/2022 20:51:52 - INFO - codeparrot_training - Step 1086: {'lr': 0.00027150000000000004, 'samples': 208704, 'steps': 1086, 'loss/train': 0.5857425928115845} 01/26/2022 20:51:58 - INFO - codeparrot_training - Step 1087: {'lr': 0.00027175, 'samples': 208896, 'steps': 1087, 'loss/train': 1.2705464959144592} 01/26/2022 20:52:01 - INFO - codeparrot_training - Step 1088: {'lr': 0.00027200000000000005, 'samples': 209088, 'steps': 1088, 'loss/train': 1.142454832792282} 01/26/2022 20:52:04 - INFO - codeparrot_training - Step 1089: {'lr': 0.00027225, 'samples': 209280, 'steps': 1089, 'loss/train': 1.1075121760368347} 01/26/2022 20:52:07 - INFO - codeparrot_training - Step 1090: {'lr': 0.0002725, 'samples': 209472, 'steps': 1090, 'loss/train': 0.8761851489543915} 01/26/2022 20:52:10 - INFO - codeparrot_training - Step 1091: {'lr': 0.00027275, 'samples': 209664, 'steps': 1091, 'loss/train': 1.0561780035495758} 01/26/2022 20:52:13 - INFO - codeparrot_training - Step 1092: {'lr': 0.000273, 'samples': 209856, 'steps': 1092, 'loss/train': 1.1019476652145386} 01/26/2022 20:52:17 - INFO - codeparrot_training - Step 1093: {'lr': 0.00027325, 'samples': 210048, 'steps': 1093, 'loss/train': 0.9809303283691406} 01/26/2022 20:52:20 - INFO - codeparrot_training - Step 1094: {'lr': 0.00027350000000000003, 
'samples': 210240, 'steps': 1094, 'loss/train': 0.5167847871780396} 01/26/2022 20:52:23 - INFO - codeparrot_training - Step 1095: {'lr': 0.00027375, 'samples': 210432, 'steps': 1095, 'loss/train': 0.6833093762397766} 01/26/2022 20:52:28 - INFO - codeparrot_training - Step 1096: {'lr': 0.00027400000000000005, 'samples': 210624, 'steps': 1096, 'loss/train': 0.26570820808410645} 01/26/2022 20:52:31 - INFO - codeparrot_training - Step 1097: {'lr': 0.00027425, 'samples': 210816, 'steps': 1097, 'loss/train': 0.3386870250105858} 01/26/2022 20:52:34 - INFO - codeparrot_training - Step 1098: {'lr': 0.0002745, 'samples': 211008, 'steps': 1098, 'loss/train': 0.688764363527298} 01/26/2022 20:52:37 - INFO - codeparrot_training - Step 1099: {'lr': 0.00027475, 'samples': 211200, 'steps': 1099, 'loss/train': 1.2169432640075684} 01/26/2022 20:52:40 - INFO - codeparrot_training - Step 1100: {'lr': 0.000275, 'samples': 211392, 'steps': 1100, 'loss/train': 0.256294347345829} 01/26/2022 20:52:43 - INFO - codeparrot_training - Step 1101: {'lr': 0.00027525, 'samples': 211584, 'steps': 1101, 'loss/train': 1.6608785390853882} 01/26/2022 20:52:47 - INFO - codeparrot_training - Step 1102: {'lr': 0.00027550000000000003, 'samples': 211776, 'steps': 1102, 'loss/train': 0.7106503844261169} 01/26/2022 20:52:50 - INFO - codeparrot_training - Step 1103: {'lr': 0.00027575, 'samples': 211968, 'steps': 1103, 'loss/train': 0.49972186982631683} 01/26/2022 20:52:56 - INFO - codeparrot_training - Step 1104: {'lr': 0.00027600000000000004, 'samples': 212160, 'steps': 1104, 'loss/train': 0.6209851205348969} 01/26/2022 20:52:59 - INFO - codeparrot_training - Step 1105: {'lr': 0.00027625, 'samples': 212352, 'steps': 1105, 'loss/train': 1.2214830815792084} 01/26/2022 20:53:02 - INFO - codeparrot_training - Step 1106: {'lr': 0.00027650000000000005, 'samples': 212544, 'steps': 1106, 'loss/train': 1.0984822511672974} 01/26/2022 20:53:05 - INFO - codeparrot_training - Step 1107: {'lr': 0.00027675, 'samples': 212736, 'steps': 1107, 'loss/train': 1.1388744413852692} 01/26/2022 20:53:09 - INFO - codeparrot_training - Step 1108: {'lr': 0.000277, 'samples': 212928, 'steps': 1108, 'loss/train': 0.9910741746425629} 01/26/2022 20:53:12 - INFO - codeparrot_training - Step 1109: {'lr': 0.00027725, 'samples': 213120, 'steps': 1109, 'loss/train': 0.3422721326351166} 01/26/2022 20:53:15 - INFO - codeparrot_training - Step 1110: {'lr': 0.0002775, 'samples': 213312, 'steps': 1110, 'loss/train': 0.6875036358833313} 01/26/2022 20:53:18 - INFO - codeparrot_training - Step 1111: {'lr': 0.00027775, 'samples': 213504, 'steps': 1111, 'loss/train': 0.35392677783966064} 01/26/2022 20:53:21 - INFO - codeparrot_training - Step 1112: {'lr': 0.00027800000000000004, 'samples': 213696, 'steps': 1112, 'loss/train': 1.1671156883239746} 01/26/2022 20:53:25 - INFO - codeparrot_training - Step 1113: {'lr': 0.00027825, 'samples': 213888, 'steps': 1113, 'loss/train': 0.35262975096702576} 01/26/2022 20:53:29 - INFO - codeparrot_training - Step 1114: {'lr': 0.00027850000000000005, 'samples': 214080, 'steps': 1114, 'loss/train': 1.2366746664047241} 01/26/2022 20:53:32 - INFO - codeparrot_training - Step 1115: {'lr': 0.00027875, 'samples': 214272, 'steps': 1115, 'loss/train': 0.650038942694664} 01/26/2022 20:53:35 - INFO - codeparrot_training - Step 1116: {'lr': 0.000279, 'samples': 214464, 'steps': 1116, 'loss/train': 0.6133878976106644} 01/26/2022 20:53:38 - INFO - codeparrot_training - Step 1117: {'lr': 0.00027925, 'samples': 214656, 'steps': 1117, 'loss/train': 
1.2080609500408173} 01/26/2022 20:53:41 - INFO - codeparrot_training - Step 1118: {'lr': 0.0002795, 'samples': 214848, 'steps': 1118, 'loss/train': 1.3723876476287842} 01/26/2022 20:53:44 - INFO - codeparrot_training - Step 1119: {'lr': 0.00027975, 'samples': 215040, 'steps': 1119, 'loss/train': 0.9902322292327881} 01/26/2022 20:53:47 - INFO - codeparrot_training - Step 1120: {'lr': 0.00028000000000000003, 'samples': 215232, 'steps': 1120, 'loss/train': 0.8555106818675995} 01/26/2022 20:53:51 - INFO - codeparrot_training - Step 1121: {'lr': 0.00028025, 'samples': 215424, 'steps': 1121, 'loss/train': 0.6759006679058075} 01/26/2022 20:53:55 - INFO - codeparrot_training - Step 1122: {'lr': 0.00028050000000000004, 'samples': 215616, 'steps': 1122, 'loss/train': 0.8378792703151703} 01/26/2022 20:53:58 - INFO - codeparrot_training - Step 1123: {'lr': 0.00028075, 'samples': 215808, 'steps': 1123, 'loss/train': 1.138431340456009} 01/26/2022 20:54:01 - INFO - codeparrot_training - Step 1124: {'lr': 0.00028100000000000005, 'samples': 216000, 'steps': 1124, 'loss/train': 1.1794633269309998} 01/26/2022 20:54:04 - INFO - codeparrot_training - Step 1125: {'lr': 0.00028125000000000003, 'samples': 216192, 'steps': 1125, 'loss/train': 0.6962398141622543} 01/26/2022 20:54:07 - INFO - codeparrot_training - Step 1126: {'lr': 0.00028149999999999996, 'samples': 216384, 'steps': 1126, 'loss/train': 1.3031834363937378} 01/26/2022 20:54:11 - INFO - codeparrot_training - Step 1127: {'lr': 0.00028175, 'samples': 216576, 'steps': 1127, 'loss/train': 0.5448819547891617} 01/26/2022 20:54:14 - INFO - codeparrot_training - Step 1128: {'lr': 0.00028199999999999997, 'samples': 216768, 'steps': 1128, 'loss/train': 0.7928208410739899} 01/26/2022 20:54:17 - INFO - codeparrot_training - Step 1129: {'lr': 0.00028225, 'samples': 216960, 'steps': 1129, 'loss/train': 0.8456499874591827} 01/26/2022 20:54:20 - INFO - codeparrot_training - Step 1130: {'lr': 0.0002825, 'samples': 217152, 'steps': 1130, 'loss/train': 0.6639377474784851} 01/26/2022 20:54:26 - INFO - codeparrot_training - Step 1131: {'lr': 0.00028275, 'samples': 217344, 'steps': 1131, 'loss/train': 1.1020688116550446} 01/26/2022 20:54:29 - INFO - codeparrot_training - Step 1132: {'lr': 0.000283, 'samples': 217536, 'steps': 1132, 'loss/train': 4.993828296661377} 01/26/2022 20:54:32 - INFO - codeparrot_training - Step 1133: {'lr': 0.00028325000000000003, 'samples': 217728, 'steps': 1133, 'loss/train': 0.9387196898460388} 01/26/2022 20:54:35 - INFO - codeparrot_training - Step 1134: {'lr': 0.0002835, 'samples': 217920, 'steps': 1134, 'loss/train': 0.9615424275398254} 01/26/2022 20:54:39 - INFO - codeparrot_training - Step 1135: {'lr': 0.00028375, 'samples': 218112, 'steps': 1135, 'loss/train': 1.0674448907375336} 01/26/2022 20:54:42 - INFO - codeparrot_training - Step 1136: {'lr': 0.00028399999999999996, 'samples': 218304, 'steps': 1136, 'loss/train': 1.068463146686554} 01/26/2022 20:54:45 - INFO - codeparrot_training - Step 1137: {'lr': 0.00028425, 'samples': 218496, 'steps': 1137, 'loss/train': 0.6541077196598053} 01/26/2022 20:54:48 - INFO - codeparrot_training - Step 1138: {'lr': 0.0002845, 'samples': 218688, 'steps': 1138, 'loss/train': 0.9804241955280304} 01/26/2022 20:54:51 - INFO - codeparrot_training - Step 1139: {'lr': 0.00028475, 'samples': 218880, 'steps': 1139, 'loss/train': 0.3464767411351204} 01/26/2022 20:54:56 - INFO - codeparrot_training - Step 1140: {'lr': 0.000285, 'samples': 219072, 'steps': 1140, 'loss/train': 1.2621967792510986} 01/26/2022 20:54:59 - 
INFO - codeparrot_training - Step 1141: {'lr': 0.00028525, 'samples': 219264, 'steps': 1141, 'loss/train': 0.7673389613628387} 01/26/2022 20:55:02 - INFO - codeparrot_training - Step 1142: {'lr': 0.0002855, 'samples': 219456, 'steps': 1142, 'loss/train': 1.1267178654670715} 01/26/2022 20:55:05 - INFO - codeparrot_training - Step 1143: {'lr': 0.00028575000000000003, 'samples': 219648, 'steps': 1143, 'loss/train': 0.8958519995212555} 01/26/2022 20:55:08 - INFO - codeparrot_training - Step 1144: {'lr': 0.00028599999999999996, 'samples': 219840, 'steps': 1144, 'loss/train': 2.1498432755470276} 01/26/2022 20:55:11 - INFO - codeparrot_training - Step 1145: {'lr': 0.00028625, 'samples': 220032, 'steps': 1145, 'loss/train': 1.0694282948970795} 01/26/2022 20:55:14 - INFO - codeparrot_training - Step 1146: {'lr': 0.00028649999999999997, 'samples': 220224, 'steps': 1146, 'loss/train': 1.1303611099720001} 01/26/2022 20:55:17 - INFO - codeparrot_training - Step 1147: {'lr': 0.00028675, 'samples': 220416, 'steps': 1147, 'loss/train': 0.9350847601890564} 01/26/2022 20:55:21 - INFO - codeparrot_training - Step 1148: {'lr': 0.000287, 'samples': 220608, 'steps': 1148, 'loss/train': 1.5029119849205017} 01/26/2022 20:55:25 - INFO - codeparrot_training - Step 1149: {'lr': 0.00028725, 'samples': 220800, 'steps': 1149, 'loss/train': 0.7740932106971741} 01/26/2022 20:55:28 - INFO - codeparrot_training - Step 1150: {'lr': 0.0002875, 'samples': 220992, 'steps': 1150, 'loss/train': 0.8809188008308411} 01/26/2022 20:55:31 - INFO - codeparrot_training - Step 1151: {'lr': 0.00028775000000000003, 'samples': 221184, 'steps': 1151, 'loss/train': 1.3801161646842957} 01/26/2022 20:55:35 - INFO - codeparrot_training - Step 1152: {'lr': 0.000288, 'samples': 221376, 'steps': 1152, 'loss/train': 0.6021814048290253} 01/26/2022 20:55:38 - INFO - codeparrot_training - Step 1153: {'lr': 0.00028825, 'samples': 221568, 'steps': 1153, 'loss/train': 0.8896468877792358} 01/26/2022 20:55:41 - INFO - codeparrot_training - Step 1154: {'lr': 0.00028849999999999997, 'samples': 221760, 'steps': 1154, 'loss/train': 1.0600275099277496} 01/26/2022 20:55:44 - INFO - codeparrot_training - Step 1155: {'lr': 0.00028875, 'samples': 221952, 'steps': 1155, 'loss/train': 1.1250623166561127} 01/26/2022 20:55:47 - INFO - codeparrot_training - Step 1156: {'lr': 0.000289, 'samples': 222144, 'steps': 1156, 'loss/train': 0.5853099524974823} 01/26/2022 20:55:54 - INFO - codeparrot_training - Step 1157: {'lr': 0.00028925, 'samples': 222336, 'steps': 1157, 'loss/train': 0.709986001253128} 01/26/2022 20:55:57 - INFO - codeparrot_training - Step 1158: {'lr': 0.0002895, 'samples': 222528, 'steps': 1158, 'loss/train': 0.5911504179239273} 01/26/2022 20:56:00 - INFO - codeparrot_training - Step 1159: {'lr': 0.00028975, 'samples': 222720, 'steps': 1159, 'loss/train': 0.4766962230205536} 01/26/2022 20:56:03 - INFO - codeparrot_training - Step 1160: {'lr': 0.00029, 'samples': 222912, 'steps': 1160, 'loss/train': 1.1779105961322784} 01/26/2022 20:56:07 - INFO - codeparrot_training - Step 1161: {'lr': 0.00029025000000000003, 'samples': 223104, 'steps': 1161, 'loss/train': 1.2804690599441528} 01/26/2022 20:56:10 - INFO - codeparrot_training - Step 1162: {'lr': 0.00029049999999999996, 'samples': 223296, 'steps': 1162, 'loss/train': 1.1594082713127136} 01/26/2022 20:56:13 - INFO - codeparrot_training - Step 1163: {'lr': 0.00029075, 'samples': 223488, 'steps': 1163, 'loss/train': 1.1964775621891022} 01/26/2022 20:56:16 - INFO - codeparrot_training - Step 1164: {'lr': 
0.00029099999999999997, 'samples': 223680, 'steps': 1164, 'loss/train': 0.9824680387973785} 01/26/2022 20:56:19 - INFO - codeparrot_training - Step 1165: {'lr': 0.00029125, 'samples': 223872, 'steps': 1165, 'loss/train': 0.8150842487812042} 01/26/2022 20:56:23 - INFO - codeparrot_training - Step 1166: {'lr': 0.0002915, 'samples': 224064, 'steps': 1166, 'loss/train': 1.152783751487732} 01/26/2022 20:56:27 - INFO - codeparrot_training - Step 1167: {'lr': 0.00029175, 'samples': 224256, 'steps': 1167, 'loss/train': 0.7861387431621552} 01/26/2022 20:56:30 - INFO - codeparrot_training - Step 1168: {'lr': 0.000292, 'samples': 224448, 'steps': 1168, 'loss/train': 0.24597644805908203} 01/26/2022 20:56:33 - INFO - codeparrot_training - Step 1169: {'lr': 0.00029225000000000003, 'samples': 224640, 'steps': 1169, 'loss/train': 0.9678133428096771} 01/26/2022 20:56:36 - INFO - codeparrot_training - Step 1170: {'lr': 0.0002925, 'samples': 224832, 'steps': 1170, 'loss/train': 1.0234933197498322} 01/26/2022 20:56:39 - INFO - codeparrot_training - Step 1171: {'lr': 0.00029275000000000004, 'samples': 225024, 'steps': 1171, 'loss/train': 0.9173187017440796} 01/26/2022 20:56:42 - INFO - codeparrot_training - Step 1172: {'lr': 0.00029299999999999997, 'samples': 225216, 'steps': 1172, 'loss/train': 0.8178969025611877} 01/26/2022 20:56:45 - INFO - codeparrot_training - Step 1173: {'lr': 0.00029325, 'samples': 225408, 'steps': 1173, 'loss/train': 0.6272235363721848} 01/26/2022 20:56:48 - INFO - codeparrot_training - Step 1174: {'lr': 0.0002935, 'samples': 225600, 'steps': 1174, 'loss/train': 0.7621327042579651} 01/26/2022 20:56:53 - INFO - codeparrot_training - Step 1175: {'lr': 0.00029375, 'samples': 225792, 'steps': 1175, 'loss/train': 1.562220811843872} 01/26/2022 20:56:56 - INFO - codeparrot_training - Step 1176: {'lr': 0.000294, 'samples': 225984, 'steps': 1176, 'loss/train': 1.083278328180313} 01/26/2022 20:56:59 - INFO - codeparrot_training - Step 1177: {'lr': 0.00029425, 'samples': 226176, 'steps': 1177, 'loss/train': 0.8461199104785919} 01/26/2022 20:57:02 - INFO - codeparrot_training - Step 1178: {'lr': 0.0002945, 'samples': 226368, 'steps': 1178, 'loss/train': 1.3932880461215973} 01/26/2022 20:57:05 - INFO - codeparrot_training - Step 1179: {'lr': 0.00029475000000000004, 'samples': 226560, 'steps': 1179, 'loss/train': 1.1243003904819489} 01/26/2022 20:57:09 - INFO - codeparrot_training - Step 1180: {'lr': 0.000295, 'samples': 226752, 'steps': 1180, 'loss/train': 1.2851622104644775} 01/26/2022 20:57:12 - INFO - codeparrot_training - Step 1181: {'lr': 0.00029525, 'samples': 226944, 'steps': 1181, 'loss/train': 0.5264954566955566} 01/26/2022 20:57:15 - INFO - codeparrot_training - Step 1182: {'lr': 0.00029549999999999997, 'samples': 227136, 'steps': 1182, 'loss/train': 1.279321700334549} 01/26/2022 20:57:18 - INFO - codeparrot_training - Step 1183: {'lr': 0.00029575, 'samples': 227328, 'steps': 1183, 'loss/train': 1.7727146744728088} 01/26/2022 20:57:24 - INFO - codeparrot_training - Step 1184: {'lr': 0.000296, 'samples': 227520, 'steps': 1184, 'loss/train': 0.11715855449438095} 01/26/2022 20:57:28 - INFO - codeparrot_training - Step 1185: {'lr': 0.00029625, 'samples': 227712, 'steps': 1185, 'loss/train': 0.7162847220897675} 01/26/2022 20:57:31 - INFO - codeparrot_training - Step 1186: {'lr': 0.0002965, 'samples': 227904, 'steps': 1186, 'loss/train': 0.5857726335525513} 01/26/2022 20:57:34 - INFO - codeparrot_training - Step 1187: {'lr': 0.00029675000000000003, 'samples': 228096, 'steps': 1187, 
'loss/train': 0.9531798362731934} 01/26/2022 20:57:37 - INFO - codeparrot_training - Step 1188: {'lr': 0.000297, 'samples': 228288, 'steps': 1188, 'loss/train': 0.8573541641235352} 01/26/2022 20:57:40 - INFO - codeparrot_training - Step 1189: {'lr': 0.00029725000000000004, 'samples': 228480, 'steps': 1189, 'loss/train': 1.2324146032333374} 01/26/2022 20:57:43 - INFO - codeparrot_training - Step 1190: {'lr': 0.00029749999999999997, 'samples': 228672, 'steps': 1190, 'loss/train': 0.836683988571167} 01/26/2022 20:57:46 - INFO - codeparrot_training - Step 1191: {'lr': 0.00029775, 'samples': 228864, 'steps': 1191, 'loss/train': 0.9754275977611542} 01/26/2022 20:57:51 - INFO - codeparrot_training - Step 1192: {'lr': 0.000298, 'samples': 229056, 'steps': 1192, 'loss/train': 1.3395634889602661} 01/26/2022 20:57:54 - INFO - codeparrot_training - Step 1193: {'lr': 0.00029825, 'samples': 229248, 'steps': 1193, 'loss/train': 1.1002348065376282} 01/26/2022 20:57:57 - INFO - codeparrot_training - Step 1194: {'lr': 0.0002985, 'samples': 229440, 'steps': 1194, 'loss/train': 0.7102044224739075} 01/26/2022 20:58:00 - INFO - codeparrot_training - Step 1195: {'lr': 0.00029875, 'samples': 229632, 'steps': 1195, 'loss/train': 0.33209938555955887} 01/26/2022 20:58:04 - INFO - codeparrot_training - Step 1196: {'lr': 0.000299, 'samples': 229824, 'steps': 1196, 'loss/train': 0.4997599124908447} 01/26/2022 20:58:07 - INFO - codeparrot_training - Step 1197: {'lr': 0.00029925000000000004, 'samples': 230016, 'steps': 1197, 'loss/train': 0.5510084331035614} 01/26/2022 20:58:10 - INFO - codeparrot_training - Step 1198: {'lr': 0.0002995, 'samples': 230208, 'steps': 1198, 'loss/train': 0.9658601582050323} 01/26/2022 20:58:13 - INFO - codeparrot_training - Step 1199: {'lr': 0.00029975000000000005, 'samples': 230400, 'steps': 1199, 'loss/train': 1.0016333162784576} 01/26/2022 20:58:16 - INFO - codeparrot_training - Step 1200: {'lr': 0.0003, 'samples': 230592, 'steps': 1200, 'loss/train': 0.25452665984630585} 01/26/2022 20:58:21 - INFO - codeparrot_training - Step 1201: {'lr': 0.00030025, 'samples': 230784, 'steps': 1201, 'loss/train': 1.086582988500595} 01/26/2022 20:58:24 - INFO - codeparrot_training - Step 1202: {'lr': 0.0003005, 'samples': 230976, 'steps': 1202, 'loss/train': 0.2490437775850296} 01/26/2022 20:58:27 - INFO - codeparrot_training - Step 1203: {'lr': 0.00030075, 'samples': 231168, 'steps': 1203, 'loss/train': 0.43811631202697754} 01/26/2022 20:58:30 - INFO - codeparrot_training - Step 1204: {'lr': 0.000301, 'samples': 231360, 'steps': 1204, 'loss/train': 0.5827521085739136} 01/26/2022 20:58:33 - INFO - codeparrot_training - Step 1205: {'lr': 0.00030125000000000003, 'samples': 231552, 'steps': 1205, 'loss/train': 0.5712075680494308} 01/26/2022 20:58:36 - INFO - codeparrot_training - Step 1206: {'lr': 0.0003015, 'samples': 231744, 'steps': 1206, 'loss/train': 1.046557903289795} 01/26/2022 20:58:39 - INFO - codeparrot_training - Step 1207: {'lr': 0.00030175000000000004, 'samples': 231936, 'steps': 1207, 'loss/train': 1.0815697610378265} 01/26/2022 20:58:42 - INFO - codeparrot_training - Step 1208: {'lr': 0.000302, 'samples': 232128, 'steps': 1208, 'loss/train': 0.1934211105108261} 01/26/2022 20:58:46 - INFO - codeparrot_training - Step 1209: {'lr': 0.00030225, 'samples': 232320, 'steps': 1209, 'loss/train': 1.0718797445297241} 01/26/2022 20:58:52 - INFO - codeparrot_training - Step 1210: {'lr': 0.0003025, 'samples': 232512, 'steps': 1210, 'loss/train': 1.093067318201065} 01/26/2022 20:58:55 - INFO - 
codeparrot_training - Step 1211: {'lr': 0.00030275, 'samples': 232704, 'steps': 1211, 'loss/train': 0.2502623051404953} 01/26/2022 20:58:58 - INFO - codeparrot_training - Step 1212: {'lr': 0.000303, 'samples': 232896, 'steps': 1212, 'loss/train': 1.9116675853729248} 01/26/2022 20:59:02 - INFO - codeparrot_training - Step 1213: {'lr': 0.00030325, 'samples': 233088, 'steps': 1213, 'loss/train': 0.36320026963949203} 01/26/2022 20:59:05 - INFO - codeparrot_training - Step 1214: {'lr': 0.0003035, 'samples': 233280, 'steps': 1214, 'loss/train': 0.9241034388542175} 01/26/2022 20:59:08 - INFO - codeparrot_training - Step 1215: {'lr': 0.00030375000000000004, 'samples': 233472, 'steps': 1215, 'loss/train': 0.7539133429527283} 01/26/2022 20:59:11 - INFO - codeparrot_training - Step 1216: {'lr': 0.000304, 'samples': 233664, 'steps': 1216, 'loss/train': 0.8841446936130524} 01/26/2022 20:59:14 - INFO - codeparrot_training - Step 1217: {'lr': 0.00030425000000000005, 'samples': 233856, 'steps': 1217, 'loss/train': 0.5467899888753891} 01/26/2022 20:59:17 - INFO - codeparrot_training - Step 1218: {'lr': 0.0003045, 'samples': 234048, 'steps': 1218, 'loss/train': 0.931069403886795} 01/26/2022 20:59:22 - INFO - codeparrot_training - Step 1219: {'lr': 0.00030475, 'samples': 234240, 'steps': 1219, 'loss/train': 0.5878031551837921} 01/26/2022 20:59:25 - INFO - codeparrot_training - Step 1220: {'lr': 0.000305, 'samples': 234432, 'steps': 1220, 'loss/train': 0.9409241080284119} 01/26/2022 20:59:28 - INFO - codeparrot_training - Step 1221: {'lr': 0.00030525, 'samples': 234624, 'steps': 1221, 'loss/train': 1.2209935784339905} 01/26/2022 20:59:31 - INFO - codeparrot_training - Step 1222: {'lr': 0.0003055, 'samples': 234816, 'steps': 1222, 'loss/train': 1.092833697795868} 01/26/2022 20:59:34 - INFO - codeparrot_training - Step 1223: {'lr': 0.00030575000000000003, 'samples': 235008, 'steps': 1223, 'loss/train': 0.9506379961967468} 01/26/2022 20:59:37 - INFO - codeparrot_training - Step 1224: {'lr': 0.000306, 'samples': 235200, 'steps': 1224, 'loss/train': 0.8738553822040558} 01/26/2022 20:59:41 - INFO - codeparrot_training - Step 1225: {'lr': 0.00030625000000000004, 'samples': 235392, 'steps': 1225, 'loss/train': 0.6989216208457947} 01/26/2022 20:59:44 - INFO - codeparrot_training - Step 1226: {'lr': 0.0003065, 'samples': 235584, 'steps': 1226, 'loss/train': 1.1033534109592438} 01/26/2022 20:59:50 - INFO - codeparrot_training - Step 1227: {'lr': 0.00030675, 'samples': 235776, 'steps': 1227, 'loss/train': 1.10317263007164} 01/26/2022 20:59:53 - INFO - codeparrot_training - Step 1228: {'lr': 0.000307, 'samples': 235968, 'steps': 1228, 'loss/train': 1.2997640669345856} 01/26/2022 20:59:56 - INFO - codeparrot_training - Step 1229: {'lr': 0.00030725, 'samples': 236160, 'steps': 1229, 'loss/train': 0.9714340567588806} 01/26/2022 20:59:59 - INFO - codeparrot_training - Step 1230: {'lr': 0.0003075, 'samples': 236352, 'steps': 1230, 'loss/train': 0.7407642602920532} 01/26/2022 21:00:02 - INFO - codeparrot_training - Step 1231: {'lr': 0.00030775, 'samples': 236544, 'steps': 1231, 'loss/train': 0.698677584528923} 01/26/2022 21:00:05 - INFO - codeparrot_training - Step 1232: {'lr': 0.000308, 'samples': 236736, 'steps': 1232, 'loss/train': 0.6075624525547028} 01/26/2022 21:00:09 - INFO - codeparrot_training - Step 1233: {'lr': 0.00030825000000000004, 'samples': 236928, 'steps': 1233, 'loss/train': 0.7729901969432831} 01/26/2022 21:00:12 - INFO - codeparrot_training - Step 1234: {'lr': 0.0003085, 'samples': 237120, 'steps': 1234, 
'loss/train': 1.0640463531017303} 01/26/2022 21:00:15 - INFO - codeparrot_training - Step 1235: {'lr': 0.00030875000000000005, 'samples': 237312, 'steps': 1235, 'loss/train': 0.8610347807407379} 01/26/2022 21:00:19 - INFO - codeparrot_training - Step 1236: {'lr': 0.00030900000000000003, 'samples': 237504, 'steps': 1236, 'loss/train': 1.2136048078536987} 01/26/2022 21:00:22 - INFO - codeparrot_training - Step 1237: {'lr': 0.00030925, 'samples': 237696, 'steps': 1237, 'loss/train': 0.8963712751865387} 01/26/2022 21:00:25 - INFO - codeparrot_training - Step 1238: {'lr': 0.0003095, 'samples': 237888, 'steps': 1238, 'loss/train': 0.3150174468755722} 01/26/2022 21:00:28 - INFO - codeparrot_training - Step 1239: {'lr': 0.00030975, 'samples': 238080, 'steps': 1239, 'loss/train': 0.7881652414798737} 01/26/2022 21:00:32 - INFO - codeparrot_training - Step 1240: {'lr': 0.00031, 'samples': 238272, 'steps': 1240, 'loss/train': 0.9741031229496002} 01/26/2022 21:00:35 - INFO - codeparrot_training - Step 1241: {'lr': 0.00031025000000000003, 'samples': 238464, 'steps': 1241, 'loss/train': 1.236857682466507} 01/26/2022 21:00:38 - INFO - codeparrot_training - Step 1242: {'lr': 0.0003105, 'samples': 238656, 'steps': 1242, 'loss/train': 0.39366231858730316} 01/26/2022 21:00:41 - INFO - codeparrot_training - Step 1243: {'lr': 0.00031075000000000005, 'samples': 238848, 'steps': 1243, 'loss/train': 0.8875214159488678} 01/26/2022 21:00:44 - INFO - codeparrot_training - Step 1244: {'lr': 0.000311, 'samples': 239040, 'steps': 1244, 'loss/train': 1.340793639421463} 01/26/2022 21:00:49 - INFO - codeparrot_training - Step 1245: {'lr': 0.00031125000000000006, 'samples': 239232, 'steps': 1245, 'loss/train': 0.6737084984779358} 01/26/2022 21:00:52 - INFO - codeparrot_training - Step 1246: {'lr': 0.0003115, 'samples': 239424, 'steps': 1246, 'loss/train': 1.1468097567558289} 01/26/2022 21:00:55 - INFO - codeparrot_training - Step 1247: {'lr': 0.00031175, 'samples': 239616, 'steps': 1247, 'loss/train': 0.9451137185096741} 01/26/2022 21:00:58 - INFO - codeparrot_training - Step 1248: {'lr': 0.000312, 'samples': 239808, 'steps': 1248, 'loss/train': 0.9751624166965485} 01/26/2022 21:01:01 - INFO - codeparrot_training - Step 1249: {'lr': 0.00031225000000000003, 'samples': 240000, 'steps': 1249, 'loss/train': 1.5141663551330566} 01/26/2022 21:01:04 - INFO - codeparrot_training - Step 1250: {'lr': 0.0003125, 'samples': 240192, 'steps': 1250, 'loss/train': 1.6736984252929688} 01/26/2022 21:01:07 - INFO - codeparrot_training - Step 1251: {'lr': 0.00031275, 'samples': 240384, 'steps': 1251, 'loss/train': 1.3493045568466187} 01/26/2022 21:01:11 - INFO - codeparrot_training - Step 1252: {'lr': 0.000313, 'samples': 240576, 'steps': 1252, 'loss/train': 1.2300075888633728} 01/26/2022 21:01:14 - INFO - codeparrot_training - Step 1253: {'lr': 0.00031325, 'samples': 240768, 'steps': 1253, 'loss/train': 0.7751044929027557} 01/26/2022 21:01:19 - INFO - codeparrot_training - Step 1254: {'lr': 0.00031350000000000003, 'samples': 240960, 'steps': 1254, 'loss/train': 0.534063309431076} 01/26/2022 21:01:22 - INFO - codeparrot_training - Step 1255: {'lr': 0.00031374999999999996, 'samples': 241152, 'steps': 1255, 'loss/train': 1.0949264466762543} 01/26/2022 21:01:26 - INFO - codeparrot_training - Step 1256: {'lr': 0.000314, 'samples': 241344, 'steps': 1256, 'loss/train': 0.8517526388168335} 01/26/2022 21:01:29 - INFO - codeparrot_training - Step 1257: {'lr': 0.00031424999999999997, 'samples': 241536, 'steps': 1257, 'loss/train': 0.951142430305481} 
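Throughout these records the learning rate rises by a constant 2.5e-7 per step (0.0003 at step 1200, 0.0004 at step 1600), i.e. the run is still inside a linear warm-up where lr(step) = 2.5e-7 * step. Values such as 0.00029099999999999997 or 0.00034449999999999997 are only the floating-point representation of those multiples of 2.5e-7, not a schedule glitch. The scheduler configuration itself is not part of this log, so the snippet below is a minimal sketch of a warm-up that reproduces these values; peak_lr and num_warmup_steps are hypothetical, and only their ratio (2.5e-7 per step) is constrained by the log.

import torch

# Minimal sketch (assumption): a linear warm-up whose per-step increment matches this log.
# Only the ratio peak_lr / num_warmup_steps = 2.5e-7 is visible above; the absolute values are hypothetical.
peak_lr = 5e-4
num_warmup_steps = 2000

model = torch.nn.Linear(4, 4)  # stand-in model, for illustration only
optimizer = torch.optim.AdamW(model.parameters(), lr=peak_lr)
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda step: min(step / num_warmup_steps, 1.0)
)

for _ in range(1600):
    optimizer.step()   # no-op update here; keeps the optimizer/scheduler call order valid
    scheduler.step()

print(scheduler.get_last_lr())  # [0.0004], matching the lr logged at Step 1600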
01/26/2022 21:01:32 - INFO - codeparrot_training - Step 1258: {'lr': 0.0003145, 'samples': 241728, 'steps': 1258, 'loss/train': 0.8981476128101349} 01/26/2022 21:01:35 - INFO - codeparrot_training - Step 1259: {'lr': 0.00031475, 'samples': 241920, 'steps': 1259, 'loss/train': 1.1605018973350525} 01/26/2022 21:01:38 - INFO - codeparrot_training - Step 1260: {'lr': 0.000315, 'samples': 242112, 'steps': 1260, 'loss/train': 1.2106269299983978} 01/26/2022 21:01:41 - INFO - codeparrot_training - Step 1261: {'lr': 0.00031525, 'samples': 242304, 'steps': 1261, 'loss/train': 0.47663457691669464} 01/26/2022 21:01:46 - INFO - codeparrot_training - Step 1262: {'lr': 0.0003155, 'samples': 242496, 'steps': 1262, 'loss/train': 0.8414538502693176} 01/26/2022 21:01:49 - INFO - codeparrot_training - Step 1263: {'lr': 0.00031575, 'samples': 242688, 'steps': 1263, 'loss/train': 0.8157608807086945} 01/26/2022 21:01:52 - INFO - codeparrot_training - Step 1264: {'lr': 0.000316, 'samples': 242880, 'steps': 1264, 'loss/train': 1.1345813870429993} 01/26/2022 21:01:55 - INFO - codeparrot_training - Step 1265: {'lr': 0.00031624999999999996, 'samples': 243072, 'steps': 1265, 'loss/train': 0.3450457602739334} 01/26/2022 21:01:58 - INFO - codeparrot_training - Step 1266: {'lr': 0.0003165, 'samples': 243264, 'steps': 1266, 'loss/train': 0.8551756739616394} 01/26/2022 21:02:01 - INFO - codeparrot_training - Step 1267: {'lr': 0.00031675, 'samples': 243456, 'steps': 1267, 'loss/train': 0.9590050578117371} 01/26/2022 21:02:04 - INFO - codeparrot_training - Step 1268: {'lr': 0.000317, 'samples': 243648, 'steps': 1268, 'loss/train': 0.4402073174715042} 01/26/2022 21:02:08 - INFO - codeparrot_training - Step 1269: {'lr': 0.00031725, 'samples': 243840, 'steps': 1269, 'loss/train': 0.9316028952598572} 01/26/2022 21:02:11 - INFO - codeparrot_training - Step 1270: {'lr': 0.0003175, 'samples': 244032, 'steps': 1270, 'loss/train': 1.153227835893631} 01/26/2022 21:02:15 - INFO - codeparrot_training - Step 1271: {'lr': 0.00031775, 'samples': 244224, 'steps': 1271, 'loss/train': 1.801672875881195} 01/26/2022 21:02:18 - INFO - codeparrot_training - Step 1272: {'lr': 0.00031800000000000003, 'samples': 244416, 'steps': 1272, 'loss/train': 0.8158755004405975} 01/26/2022 21:02:21 - INFO - codeparrot_training - Step 1273: {'lr': 0.00031825, 'samples': 244608, 'steps': 1273, 'loss/train': 1.1565444767475128} 01/26/2022 21:02:25 - INFO - codeparrot_training - Step 1274: {'lr': 0.0003185, 'samples': 244800, 'steps': 1274, 'loss/train': 0.6587251871824265} 01/26/2022 21:02:28 - INFO - codeparrot_training - Step 1275: {'lr': 0.00031874999999999997, 'samples': 244992, 'steps': 1275, 'loss/train': 1.1284300088882446} 01/26/2022 21:02:31 - INFO - codeparrot_training - Step 1276: {'lr': 0.000319, 'samples': 245184, 'steps': 1276, 'loss/train': 1.1241610944271088} 01/26/2022 21:02:34 - INFO - codeparrot_training - Step 1277: {'lr': 0.00031925, 'samples': 245376, 'steps': 1277, 'loss/train': 0.5060954242944717} 01/26/2022 21:02:37 - INFO - codeparrot_training - Step 1278: {'lr': 0.0003195, 'samples': 245568, 'steps': 1278, 'loss/train': 0.9319193065166473} 01/26/2022 21:02:40 - INFO - codeparrot_training - Step 1279: {'lr': 0.00031975, 'samples': 245760, 'steps': 1279, 'loss/train': 1.2106119096279144} 01/26/2022 21:02:44 - INFO - codeparrot_training - Step 1280: {'lr': 0.00032, 'samples': 245952, 'steps': 1280, 'loss/train': 1.040795624256134} 01/26/2022 21:02:48 - INFO - codeparrot_training - Step 1281: {'lr': 0.00032025, 'samples': 246144, 'steps': 
1281, 'loss/train': 0.630868211388588} 01/26/2022 21:02:51 - INFO - codeparrot_training - Step 1282: {'lr': 0.00032050000000000004, 'samples': 246336, 'steps': 1282, 'loss/train': 1.2275404930114746} 01/26/2022 21:02:54 - INFO - codeparrot_training - Step 1283: {'lr': 0.00032074999999999996, 'samples': 246528, 'steps': 1283, 'loss/train': 0.8045752644538879} 01/26/2022 21:02:57 - INFO - codeparrot_training - Step 1284: {'lr': 0.000321, 'samples': 246720, 'steps': 1284, 'loss/train': 1.0104618966579437} 01/26/2022 21:03:00 - INFO - codeparrot_training - Step 1285: {'lr': 0.00032125, 'samples': 246912, 'steps': 1285, 'loss/train': 0.7690997421741486} 01/26/2022 21:03:03 - INFO - codeparrot_training - Step 1286: {'lr': 0.0003215, 'samples': 247104, 'steps': 1286, 'loss/train': 1.0202703773975372} 01/26/2022 21:03:06 - INFO - codeparrot_training - Step 1287: {'lr': 0.00032175, 'samples': 247296, 'steps': 1287, 'loss/train': 1.2204847633838654} 01/26/2022 21:03:10 - INFO - codeparrot_training - Step 1288: {'lr': 0.000322, 'samples': 247488, 'steps': 1288, 'loss/train': 1.1590179204940796} 01/26/2022 21:03:16 - INFO - codeparrot_training - Step 1289: {'lr': 0.00032225, 'samples': 247680, 'steps': 1289, 'loss/train': 0.9954877495765686} 01/26/2022 21:03:19 - INFO - codeparrot_training - Step 1290: {'lr': 0.00032250000000000003, 'samples': 247872, 'steps': 1290, 'loss/train': 1.1254505217075348} 01/26/2022 21:03:23 - INFO - codeparrot_training - Step 1291: {'lr': 0.00032275, 'samples': 248064, 'steps': 1291, 'loss/train': 0.990976095199585} 01/26/2022 21:03:26 - INFO - codeparrot_training - Step 1292: {'lr': 0.000323, 'samples': 248256, 'steps': 1292, 'loss/train': 0.8347549438476562} 01/26/2022 21:03:29 - INFO - codeparrot_training - Step 1293: {'lr': 0.00032324999999999997, 'samples': 248448, 'steps': 1293, 'loss/train': 0.5836189985275269} 01/26/2022 21:03:32 - INFO - codeparrot_training - Step 1294: {'lr': 0.0003235, 'samples': 248640, 'steps': 1294, 'loss/train': 0.7854525446891785} 01/26/2022 21:03:35 - INFO - codeparrot_training - Step 1295: {'lr': 0.00032375, 'samples': 248832, 'steps': 1295, 'loss/train': 1.9712843298912048} 01/26/2022 21:03:38 - INFO - codeparrot_training - Step 1296: {'lr': 0.000324, 'samples': 249024, 'steps': 1296, 'loss/train': 1.9580798149108887} 01/26/2022 21:03:41 - INFO - codeparrot_training - Step 1297: {'lr': 0.00032425, 'samples': 249216, 'steps': 1297, 'loss/train': 1.2691041827201843} 01/26/2022 21:03:46 - INFO - codeparrot_training - Step 1298: {'lr': 0.00032450000000000003, 'samples': 249408, 'steps': 1298, 'loss/train': 1.062127411365509} 01/26/2022 21:03:49 - INFO - codeparrot_training - Step 1299: {'lr': 0.00032475, 'samples': 249600, 'steps': 1299, 'loss/train': 0.4577314406633377} 01/26/2022 21:03:52 - INFO - codeparrot_training - Step 1300: {'lr': 0.00032500000000000004, 'samples': 249792, 'steps': 1300, 'loss/train': 0.49498289823532104} 01/26/2022 21:03:55 - INFO - codeparrot_training - Step 1301: {'lr': 0.00032524999999999996, 'samples': 249984, 'steps': 1301, 'loss/train': 1.0439395308494568} 01/26/2022 21:03:58 - INFO - codeparrot_training - Step 1302: {'lr': 0.0003255, 'samples': 250176, 'steps': 1302, 'loss/train': 0.9454618692398071} 01/26/2022 21:04:02 - INFO - codeparrot_training - Step 1303: {'lr': 0.00032575, 'samples': 250368, 'steps': 1303, 'loss/train': 0.8306179940700531} 01/26/2022 21:04:05 - INFO - codeparrot_training - Step 1304: {'lr': 0.000326, 'samples': 250560, 'steps': 1304, 'loss/train': 1.0661275684833527} 01/26/2022 
21:04:08 - INFO - codeparrot_training - Step 1305: {'lr': 0.00032625, 'samples': 250752, 'steps': 1305, 'loss/train': 0.6971832364797592} 01/26/2022 21:04:11 - INFO - codeparrot_training - Step 1306: {'lr': 0.0003265, 'samples': 250944, 'steps': 1306, 'loss/train': 1.0274827480316162} 01/26/2022 21:04:17 - INFO - codeparrot_training - Step 1307: {'lr': 0.00032675, 'samples': 251136, 'steps': 1307, 'loss/train': 1.6470826864242554} 01/26/2022 21:04:20 - INFO - codeparrot_training - Step 1308: {'lr': 0.00032700000000000003, 'samples': 251328, 'steps': 1308, 'loss/train': 1.3024613857269287} 01/26/2022 21:04:23 - INFO - codeparrot_training - Step 1309: {'lr': 0.00032725, 'samples': 251520, 'steps': 1309, 'loss/train': 1.2962641417980194} 01/26/2022 21:04:27 - INFO - codeparrot_training - Step 1310: {'lr': 0.00032750000000000005, 'samples': 251712, 'steps': 1310, 'loss/train': 1.204667866230011} 01/26/2022 21:04:30 - INFO - codeparrot_training - Step 1311: {'lr': 0.00032774999999999997, 'samples': 251904, 'steps': 1311, 'loss/train': 0.6610483825206757} 01/26/2022 21:04:33 - INFO - codeparrot_training - Step 1312: {'lr': 0.000328, 'samples': 252096, 'steps': 1312, 'loss/train': 0.8699553608894348} 01/26/2022 21:04:36 - INFO - codeparrot_training - Step 1313: {'lr': 0.00032825, 'samples': 252288, 'steps': 1313, 'loss/train': 0.8802606761455536} 01/26/2022 21:04:39 - INFO - codeparrot_training - Step 1314: {'lr': 0.0003285, 'samples': 252480, 'steps': 1314, 'loss/train': 1.1301237344741821} 01/26/2022 21:04:42 - INFO - codeparrot_training - Step 1315: {'lr': 0.00032875, 'samples': 252672, 'steps': 1315, 'loss/train': 0.8648131191730499} 01/26/2022 21:04:47 - INFO - codeparrot_training - Step 1316: {'lr': 0.00032900000000000003, 'samples': 252864, 'steps': 1316, 'loss/train': 0.8390500545501709} 01/26/2022 21:04:50 - INFO - codeparrot_training - Step 1317: {'lr': 0.00032925, 'samples': 253056, 'steps': 1317, 'loss/train': 0.5135869234800339} 01/26/2022 21:04:53 - INFO - codeparrot_training - Step 1318: {'lr': 0.00032950000000000004, 'samples': 253248, 'steps': 1318, 'loss/train': 1.0603323876857758} 01/26/2022 21:04:56 - INFO - codeparrot_training - Step 1319: {'lr': 0.00032975, 'samples': 253440, 'steps': 1319, 'loss/train': 0.9542219638824463} 01/26/2022 21:04:59 - INFO - codeparrot_training - Step 1320: {'lr': 0.00033, 'samples': 253632, 'steps': 1320, 'loss/train': 1.0566553473472595} 01/26/2022 21:05:03 - INFO - codeparrot_training - Step 1321: {'lr': 0.00033025, 'samples': 253824, 'steps': 1321, 'loss/train': 1.3202703595161438} 01/26/2022 21:05:06 - INFO - codeparrot_training - Step 1322: {'lr': 0.0003305, 'samples': 254016, 'steps': 1322, 'loss/train': 1.001068890094757} 01/26/2022 21:05:09 - INFO - codeparrot_training - Step 1323: {'lr': 0.00033075, 'samples': 254208, 'steps': 1323, 'loss/train': 0.8870709836483002} 01/26/2022 21:05:13 - INFO - codeparrot_training - Step 1324: {'lr': 0.000331, 'samples': 254400, 'steps': 1324, 'loss/train': 1.2355210483074188} 01/26/2022 21:05:16 - INFO - codeparrot_training - Step 1325: {'lr': 0.00033125, 'samples': 254592, 'steps': 1325, 'loss/train': 1.148157387971878} 01/26/2022 21:05:19 - INFO - codeparrot_training - Step 1326: {'lr': 0.00033150000000000003, 'samples': 254784, 'steps': 1326, 'loss/train': 0.9794060289859772} 01/26/2022 21:05:23 - INFO - codeparrot_training - Step 1327: {'lr': 0.00033175, 'samples': 254976, 'steps': 1327, 'loss/train': 0.44070857763290405} 01/26/2022 21:05:26 - INFO - codeparrot_training - Step 1328: {'lr': 
0.00033200000000000005, 'samples': 255168, 'steps': 1328, 'loss/train': 0.8632693290710449} 01/26/2022 21:05:29 - INFO - codeparrot_training - Step 1329: {'lr': 0.00033224999999999997, 'samples': 255360, 'steps': 1329, 'loss/train': 0.9719774723052979} 01/26/2022 21:05:32 - INFO - codeparrot_training - Step 1330: {'lr': 0.0003325, 'samples': 255552, 'steps': 1330, 'loss/train': 3.8416589498519897} 01/26/2022 21:05:35 - INFO - codeparrot_training - Step 1331: {'lr': 0.00033275, 'samples': 255744, 'steps': 1331, 'loss/train': 1.1004060208797455} 01/26/2022 21:05:38 - INFO - codeparrot_training - Step 1332: {'lr': 0.000333, 'samples': 255936, 'steps': 1332, 'loss/train': 0.7440213114023209} 01/26/2022 21:05:44 - INFO - codeparrot_training - Step 1333: {'lr': 0.00033325, 'samples': 256128, 'steps': 1333, 'loss/train': 0.8231377601623535} 01/26/2022 21:05:47 - INFO - codeparrot_training - Step 1334: {'lr': 0.00033350000000000003, 'samples': 256320, 'steps': 1334, 'loss/train': 0.7276638597249985} 01/26/2022 21:05:51 - INFO - codeparrot_training - Step 1335: {'lr': 0.00033375, 'samples': 256512, 'steps': 1335, 'loss/train': 1.099365770816803} 01/26/2022 21:05:54 - INFO - codeparrot_training - Step 1336: {'lr': 0.00033400000000000004, 'samples': 256704, 'steps': 1336, 'loss/train': 1.4680210053920746} 01/26/2022 21:05:57 - INFO - codeparrot_training - Step 1337: {'lr': 0.00033425, 'samples': 256896, 'steps': 1337, 'loss/train': 1.1128609478473663} 01/26/2022 21:06:00 - INFO - codeparrot_training - Step 1338: {'lr': 0.00033450000000000005, 'samples': 257088, 'steps': 1338, 'loss/train': 0.822838693857193} 01/26/2022 21:06:03 - INFO - codeparrot_training - Step 1339: {'lr': 0.00033475, 'samples': 257280, 'steps': 1339, 'loss/train': 0.8494419157505035} 01/26/2022 21:06:06 - INFO - codeparrot_training - Step 1340: {'lr': 0.000335, 'samples': 257472, 'steps': 1340, 'loss/train': 0.7268175780773163} 01/26/2022 21:06:09 - INFO - codeparrot_training - Step 1341: {'lr': 0.00033525, 'samples': 257664, 'steps': 1341, 'loss/train': 0.868606299161911} 01/26/2022 21:06:14 - INFO - codeparrot_training - Step 1342: {'lr': 0.0003355, 'samples': 257856, 'steps': 1342, 'loss/train': 0.818723738193512} 01/26/2022 21:06:17 - INFO - codeparrot_training - Step 1343: {'lr': 0.00033575, 'samples': 258048, 'steps': 1343, 'loss/train': 1.5978684425354004} 01/26/2022 21:06:20 - INFO - codeparrot_training - Step 1344: {'lr': 0.00033600000000000004, 'samples': 258240, 'steps': 1344, 'loss/train': 0.5804193019866943} 01/26/2022 21:06:23 - INFO - codeparrot_training - Step 1345: {'lr': 0.00033625, 'samples': 258432, 'steps': 1345, 'loss/train': 1.0418038666248322} 01/26/2022 21:06:26 - INFO - codeparrot_training - Step 1346: {'lr': 0.00033650000000000005, 'samples': 258624, 'steps': 1346, 'loss/train': 0.8577019572257996} 01/26/2022 21:06:30 - INFO - codeparrot_training - Step 1347: {'lr': 0.00033675, 'samples': 258816, 'steps': 1347, 'loss/train': 0.6332640498876572} 01/26/2022 21:06:33 - INFO - codeparrot_training - Step 1348: {'lr': 0.000337, 'samples': 259008, 'steps': 1348, 'loss/train': 1.1220255196094513} 01/26/2022 21:06:36 - INFO - codeparrot_training - Step 1349: {'lr': 0.00033725, 'samples': 259200, 'steps': 1349, 'loss/train': 0.7499145269393921} 01/26/2022 21:06:39 - INFO - codeparrot_training - Step 1350: {'lr': 0.0003375, 'samples': 259392, 'steps': 1350, 'loss/train': 1.040358155965805} 01/26/2022 21:06:44 - INFO - codeparrot_training - Step 1351: {'lr': 0.00033775, 'samples': 259584, 'steps': 1351, 
'loss/train': 1.2843874096870422} 01/26/2022 21:06:47 - INFO - codeparrot_training - Step 1352: {'lr': 0.00033800000000000003, 'samples': 259776, 'steps': 1352, 'loss/train': 1.238112062215805} 01/26/2022 21:06:50 - INFO - codeparrot_training - Step 1353: {'lr': 0.00033825, 'samples': 259968, 'steps': 1353, 'loss/train': 0.3218713849782944} 01/26/2022 21:06:53 - INFO - codeparrot_training - Step 1354: {'lr': 0.00033850000000000004, 'samples': 260160, 'steps': 1354, 'loss/train': 0.7913881838321686} 01/26/2022 21:06:56 - INFO - codeparrot_training - Step 1355: {'lr': 0.00033875, 'samples': 260352, 'steps': 1355, 'loss/train': 0.6941387057304382} 01/26/2022 21:06:59 - INFO - codeparrot_training - Step 1356: {'lr': 0.00033900000000000005, 'samples': 260544, 'steps': 1356, 'loss/train': 0.8672609925270081} 01/26/2022 21:07:03 - INFO - codeparrot_training - Step 1357: {'lr': 0.00033925, 'samples': 260736, 'steps': 1357, 'loss/train': 0.4973887950181961} 01/26/2022 21:07:06 - INFO - codeparrot_training - Step 1358: {'lr': 0.0003395, 'samples': 260928, 'steps': 1358, 'loss/train': 0.8578560054302216} 01/26/2022 21:07:10 - INFO - codeparrot_training - Step 1359: {'lr': 0.00033975, 'samples': 261120, 'steps': 1359, 'loss/train': 0.753614991903305} 01/26/2022 21:07:14 - INFO - codeparrot_training - Step 1360: {'lr': 0.00034, 'samples': 261312, 'steps': 1360, 'loss/train': 0.7949243187904358} 01/26/2022 21:07:17 - INFO - codeparrot_training - Step 1361: {'lr': 0.00034025, 'samples': 261504, 'steps': 1361, 'loss/train': 1.0931392908096313} 01/26/2022 21:07:20 - INFO - codeparrot_training - Step 1362: {'lr': 0.00034050000000000004, 'samples': 261696, 'steps': 1362, 'loss/train': 0.7639859318733215} 01/26/2022 21:07:23 - INFO - codeparrot_training - Step 1363: {'lr': 0.00034075, 'samples': 261888, 'steps': 1363, 'loss/train': 1.110782414674759} 01/26/2022 21:07:26 - INFO - codeparrot_training - Step 1364: {'lr': 0.00034100000000000005, 'samples': 262080, 'steps': 1364, 'loss/train': 0.7246271520853043} 01/26/2022 21:07:29 - INFO - codeparrot_training - Step 1365: {'lr': 0.00034125000000000003, 'samples': 262272, 'steps': 1365, 'loss/train': 0.735277533531189} 01/26/2022 21:07:32 - INFO - codeparrot_training - Step 1366: {'lr': 0.0003415, 'samples': 262464, 'steps': 1366, 'loss/train': 1.0827614665031433} 01/26/2022 21:07:36 - INFO - codeparrot_training - Step 1367: {'lr': 0.00034175, 'samples': 262656, 'steps': 1367, 'loss/train': 0.8820734024047852} 01/26/2022 21:07:41 - INFO - codeparrot_training - Step 1368: {'lr': 0.000342, 'samples': 262848, 'steps': 1368, 'loss/train': 0.8954282104969025} 01/26/2022 21:07:44 - INFO - codeparrot_training - Step 1369: {'lr': 0.00034225, 'samples': 263040, 'steps': 1369, 'loss/train': 0.8317834138870239} 01/26/2022 21:07:47 - INFO - codeparrot_training - Step 1370: {'lr': 0.00034250000000000003, 'samples': 263232, 'steps': 1370, 'loss/train': 0.8572661876678467} 01/26/2022 21:07:51 - INFO - codeparrot_training - Step 1371: {'lr': 0.00034275, 'samples': 263424, 'steps': 1371, 'loss/train': 0.7083683162927628} 01/26/2022 21:07:54 - INFO - codeparrot_training - Step 1372: {'lr': 0.00034300000000000004, 'samples': 263616, 'steps': 1372, 'loss/train': 0.9145535230636597} 01/26/2022 21:07:57 - INFO - codeparrot_training - Step 1373: {'lr': 0.00034325, 'samples': 263808, 'steps': 1373, 'loss/train': 1.3301053941249847} 01/26/2022 21:08:00 - INFO - codeparrot_training - Step 1374: {'lr': 0.00034350000000000006, 'samples': 264000, 'steps': 1374, 'loss/train': 
1.1468211114406586} 01/26/2022 21:08:03 - INFO - codeparrot_training - Step 1375: {'lr': 0.00034375, 'samples': 264192, 'steps': 1375, 'loss/train': 0.7009869664907455} 01/26/2022 21:08:06 - INFO - codeparrot_training - Step 1376: {'lr': 0.00034399999999999996, 'samples': 264384, 'steps': 1376, 'loss/train': 0.9463578164577484} 01/26/2022 21:08:11 - INFO - codeparrot_training - Step 1377: {'lr': 0.00034425, 'samples': 264576, 'steps': 1377, 'loss/train': 1.1591105461120605} 01/26/2022 21:08:14 - INFO - codeparrot_training - Step 1378: {'lr': 0.00034449999999999997, 'samples': 264768, 'steps': 1378, 'loss/train': 0.96595099568367} 01/26/2022 21:08:17 - INFO - codeparrot_training - Step 1379: {'lr': 0.00034475, 'samples': 264960, 'steps': 1379, 'loss/train': 0.6593697667121887} 01/26/2022 21:08:20 - INFO - codeparrot_training - Step 1380: {'lr': 0.000345, 'samples': 265152, 'steps': 1380, 'loss/train': 0.8035639822483063} 01/26/2022 21:08:23 - INFO - codeparrot_training - Step 1381: {'lr': 0.00034525, 'samples': 265344, 'steps': 1381, 'loss/train': 0.8536484241485596} 01/26/2022 21:08:26 - INFO - codeparrot_training - Step 1382: {'lr': 0.0003455, 'samples': 265536, 'steps': 1382, 'loss/train': 1.4567998945713043} 01/26/2022 21:08:29 - INFO - codeparrot_training - Step 1383: {'lr': 0.00034575000000000003, 'samples': 265728, 'steps': 1383, 'loss/train': 1.1728605329990387} 01/26/2022 21:08:33 - INFO - codeparrot_training - Step 1384: {'lr': 0.000346, 'samples': 265920, 'steps': 1384, 'loss/train': 0.5751859098672867} 01/26/2022 21:08:36 - INFO - codeparrot_training - Step 1385: {'lr': 0.00034625, 'samples': 266112, 'steps': 1385, 'loss/train': 0.722655862569809} 01/26/2022 21:08:42 - INFO - codeparrot_training - Step 1386: {'lr': 0.00034649999999999997, 'samples': 266304, 'steps': 1386, 'loss/train': 0.9264733493328094} 01/26/2022 21:08:45 - INFO - codeparrot_training - Step 1387: {'lr': 0.00034675, 'samples': 266496, 'steps': 1387, 'loss/train': 0.6819387674331665} 01/26/2022 21:08:48 - INFO - codeparrot_training - Step 1388: {'lr': 0.000347, 'samples': 266688, 'steps': 1388, 'loss/train': 1.0372397303581238} 01/26/2022 21:08:51 - INFO - codeparrot_training - Step 1389: {'lr': 0.00034725, 'samples': 266880, 'steps': 1389, 'loss/train': 0.6640693992376328} 01/26/2022 21:08:55 - INFO - codeparrot_training - Step 1390: {'lr': 0.0003475, 'samples': 267072, 'steps': 1390, 'loss/train': 0.23234233260154724} 01/26/2022 21:08:58 - INFO - codeparrot_training - Step 1391: {'lr': 0.00034775, 'samples': 267264, 'steps': 1391, 'loss/train': 0.6895884722471237} 01/26/2022 21:09:01 - INFO - codeparrot_training - Step 1392: {'lr': 0.000348, 'samples': 267456, 'steps': 1392, 'loss/train': 0.13055146113038063} 01/26/2022 21:09:04 - INFO - codeparrot_training - Step 1393: {'lr': 0.00034825000000000004, 'samples': 267648, 'steps': 1393, 'loss/train': 1.0278124809265137} 01/26/2022 21:09:08 - INFO - codeparrot_training - Step 1394: {'lr': 0.00034849999999999996, 'samples': 267840, 'steps': 1394, 'loss/train': 1.3332147896289825} 01/26/2022 21:09:11 - INFO - codeparrot_training - Step 1395: {'lr': 0.00034875, 'samples': 268032, 'steps': 1395, 'loss/train': 1.1082002520561218} 01/26/2022 21:09:15 - INFO - codeparrot_training - Step 1396: {'lr': 0.00034899999999999997, 'samples': 268224, 'steps': 1396, 'loss/train': 1.2473546862602234} 01/26/2022 21:09:18 - INFO - codeparrot_training - Step 1397: {'lr': 0.00034925, 'samples': 268416, 'steps': 1397, 'loss/train': 1.0238586366176605} 01/26/2022 21:09:21 - INFO - 
codeparrot_training - Step 1398: {'lr': 0.0003495, 'samples': 268608, 'steps': 1398, 'loss/train': 0.8205553293228149} 01/26/2022 21:09:24 - INFO - codeparrot_training - Step 1399: {'lr': 0.00034975, 'samples': 268800, 'steps': 1399, 'loss/train': 1.0121375620365143} 01/26/2022 21:09:27 - INFO - codeparrot_training - Step 1400: {'lr': 0.00035, 'samples': 268992, 'steps': 1400, 'loss/train': 0.8104203343391418} 01/26/2022 21:09:30 - INFO - codeparrot_training - Step 1401: {'lr': 0.00035025000000000003, 'samples': 269184, 'steps': 1401, 'loss/train': 0.12986082583665848} 01/26/2022 21:09:33 - INFO - codeparrot_training - Step 1402: {'lr': 0.0003505, 'samples': 269376, 'steps': 1402, 'loss/train': 1.314984530210495} 01/26/2022 21:09:38 - INFO - codeparrot_training - Step 1403: {'lr': 0.00035075, 'samples': 269568, 'steps': 1403, 'loss/train': 0.2538214176893234} 01/26/2022 21:09:41 - INFO - codeparrot_training - Step 1404: {'lr': 0.00035099999999999997, 'samples': 269760, 'steps': 1404, 'loss/train': 1.0766460299491882} 01/26/2022 21:09:44 - INFO - codeparrot_training - Step 1405: {'lr': 0.00035125, 'samples': 269952, 'steps': 1405, 'loss/train': 0.2770286500453949} 01/26/2022 21:09:47 - INFO - codeparrot_training - Step 1406: {'lr': 0.0003515, 'samples': 270144, 'steps': 1406, 'loss/train': 1.054178237915039} 01/26/2022 21:09:50 - INFO - codeparrot_training - Step 1407: {'lr': 0.00035175, 'samples': 270336, 'steps': 1407, 'loss/train': 0.8579706251621246} 01/26/2022 21:09:53 - INFO - codeparrot_training - Step 1408: {'lr': 0.000352, 'samples': 270528, 'steps': 1408, 'loss/train': 1.0789376199245453} 01/26/2022 21:09:57 - INFO - codeparrot_training - Step 1409: {'lr': 0.00035225, 'samples': 270720, 'steps': 1409, 'loss/train': 1.3925848603248596} 01/26/2022 21:10:00 - INFO - codeparrot_training - Step 1410: {'lr': 0.0003525, 'samples': 270912, 'steps': 1410, 'loss/train': 0.5225864052772522} 01/26/2022 21:10:03 - INFO - codeparrot_training - Step 1411: {'lr': 0.00035275000000000004, 'samples': 271104, 'steps': 1411, 'loss/train': 1.2726842164993286} 01/26/2022 21:10:09 - INFO - codeparrot_training - Step 1412: {'lr': 0.00035299999999999996, 'samples': 271296, 'steps': 1412, 'loss/train': 0.7153510898351669} 01/26/2022 21:10:12 - INFO - codeparrot_training - Step 1413: {'lr': 0.00035325, 'samples': 271488, 'steps': 1413, 'loss/train': 0.40713250637054443} 01/26/2022 21:10:15 - INFO - codeparrot_training - Step 1414: {'lr': 0.0003535, 'samples': 271680, 'steps': 1414, 'loss/train': 0.7666701972484589} 01/26/2022 21:10:18 - INFO - codeparrot_training - Step 1415: {'lr': 0.00035375, 'samples': 271872, 'steps': 1415, 'loss/train': 1.1024984121322632} 01/26/2022 21:10:22 - INFO - codeparrot_training - Step 1416: {'lr': 0.000354, 'samples': 272064, 'steps': 1416, 'loss/train': 0.9069237112998962} 01/26/2022 21:10:25 - INFO - codeparrot_training - Step 1417: {'lr': 0.00035425, 'samples': 272256, 'steps': 1417, 'loss/train': 0.9518739581108093} 01/26/2022 21:10:28 - INFO - codeparrot_training - Step 1418: {'lr': 0.0003545, 'samples': 272448, 'steps': 1418, 'loss/train': 1.203192561864853} 01/26/2022 21:10:31 - INFO - codeparrot_training - Step 1419: {'lr': 0.00035475000000000003, 'samples': 272640, 'steps': 1419, 'loss/train': 1.001071572303772} 01/26/2022 21:10:34 - INFO - codeparrot_training - Step 1420: {'lr': 0.000355, 'samples': 272832, 'steps': 1420, 'loss/train': 1.5776689052581787} 01/26/2022 21:10:39 - INFO - codeparrot_training - Step 1421: {'lr': 0.00035525000000000004, 'samples': 273024, 
'steps': 1421, 'loss/train': 0.9579824209213257} 01/26/2022 21:10:42 - INFO - codeparrot_training - Step 1422: {'lr': 0.00035549999999999997, 'samples': 273216, 'steps': 1422, 'loss/train': 0.22726918011903763} 01/26/2022 21:10:45 - INFO - codeparrot_training - Step 1423: {'lr': 0.00035575, 'samples': 273408, 'steps': 1423, 'loss/train': 1.2481546998023987} 01/26/2022 21:10:48 - INFO - codeparrot_training - Step 1424: {'lr': 0.000356, 'samples': 273600, 'steps': 1424, 'loss/train': 0.850852757692337} 01/26/2022 21:10:51 - INFO - codeparrot_training - Step 1425: {'lr': 0.00035625, 'samples': 273792, 'steps': 1425, 'loss/train': 0.9370703101158142} 01/26/2022 21:10:55 - INFO - codeparrot_training - Step 1426: {'lr': 0.0003565, 'samples': 273984, 'steps': 1426, 'loss/train': 1.0206585824489594} 01/26/2022 21:10:58 - INFO - codeparrot_training - Step 1427: {'lr': 0.00035675, 'samples': 274176, 'steps': 1427, 'loss/train': 1.035037636756897} 01/26/2022 21:11:01 - INFO - codeparrot_training - Step 1428: {'lr': 0.000357, 'samples': 274368, 'steps': 1428, 'loss/train': 0.8349533379077911} 01/26/2022 21:11:05 - INFO - codeparrot_training - Step 1429: {'lr': 0.00035725000000000004, 'samples': 274560, 'steps': 1429, 'loss/train': 0.8950520753860474} 01/26/2022 21:11:08 - INFO - codeparrot_training - Step 1430: {'lr': 0.0003575, 'samples': 274752, 'steps': 1430, 'loss/train': 1.1159393191337585} 01/26/2022 21:11:11 - INFO - codeparrot_training - Step 1431: {'lr': 0.00035775, 'samples': 274944, 'steps': 1431, 'loss/train': 1.2659131586551666} 01/26/2022 21:11:15 - INFO - codeparrot_training - Step 1432: {'lr': 0.000358, 'samples': 275136, 'steps': 1432, 'loss/train': 1.0870972573757172} 01/26/2022 21:11:18 - INFO - codeparrot_training - Step 1433: {'lr': 0.00035825, 'samples': 275328, 'steps': 1433, 'loss/train': 0.8313236832618713} 01/26/2022 21:11:21 - INFO - codeparrot_training - Step 1434: {'lr': 0.0003585, 'samples': 275520, 'steps': 1434, 'loss/train': 1.0548158884048462} 01/26/2022 21:11:24 - INFO - codeparrot_training - Step 1435: {'lr': 0.00035875, 'samples': 275712, 'steps': 1435, 'loss/train': 0.6349529027938843} 01/26/2022 21:11:27 - INFO - codeparrot_training - Step 1436: {'lr': 0.000359, 'samples': 275904, 'steps': 1436, 'loss/train': 0.7269305884838104} 01/26/2022 21:11:30 - INFO - codeparrot_training - Step 1437: {'lr': 0.00035925000000000003, 'samples': 276096, 'steps': 1437, 'loss/train': 0.8836067318916321} 01/26/2022 21:11:36 - INFO - codeparrot_training - Step 1438: {'lr': 0.0003595, 'samples': 276288, 'steps': 1438, 'loss/train': 1.2289762794971466} 01/26/2022 21:11:40 - INFO - codeparrot_training - Step 1439: {'lr': 0.00035975000000000004, 'samples': 276480, 'steps': 1439, 'loss/train': 0.9617836475372314} 01/26/2022 21:11:43 - INFO - codeparrot_training - Step 1440: {'lr': 0.00035999999999999997, 'samples': 276672, 'steps': 1440, 'loss/train': 1.362577110528946} 01/26/2022 21:11:46 - INFO - codeparrot_training - Step 1441: {'lr': 0.00036025, 'samples': 276864, 'steps': 1441, 'loss/train': 0.7411259114742279} 01/26/2022 21:11:49 - INFO - codeparrot_training - Step 1442: {'lr': 0.0003605, 'samples': 277056, 'steps': 1442, 'loss/train': 1.1015606224536896} 01/26/2022 21:11:52 - INFO - codeparrot_training - Step 1443: {'lr': 0.00036075, 'samples': 277248, 'steps': 1443, 'loss/train': 1.2923219203948975} 01/26/2022 21:11:55 - INFO - codeparrot_training - Step 1444: {'lr': 0.000361, 'samples': 277440, 'steps': 1444, 'loss/train': 1.3723007440567017} 01/26/2022 21:11:58 - INFO - 
codeparrot_training - Step 1445: {'lr': 0.00036125, 'samples': 277632, 'steps': 1445, 'loss/train': 1.5804736018180847} 01/26/2022 21:12:01 - INFO - codeparrot_training - Step 1446: {'lr': 0.0003615, 'samples': 277824, 'steps': 1446, 'loss/train': 1.3025188744068146} 01/26/2022 21:12:06 - INFO - codeparrot_training - Step 1447: {'lr': 0.00036175000000000004, 'samples': 278016, 'steps': 1447, 'loss/train': 0.9075505435466766} 01/26/2022 21:12:09 - INFO - codeparrot_training - Step 1448: {'lr': 0.000362, 'samples': 278208, 'steps': 1448, 'loss/train': 0.576828807592392} 01/26/2022 21:12:13 - INFO - codeparrot_training - Step 1449: {'lr': 0.00036225000000000005, 'samples': 278400, 'steps': 1449, 'loss/train': 0.9907841384410858} 01/26/2022 21:12:16 - INFO - codeparrot_training - Step 1450: {'lr': 0.0003625, 'samples': 278592, 'steps': 1450, 'loss/train': 1.2009357511997223} 01/26/2022 21:12:19 - INFO - codeparrot_training - Step 1451: {'lr': 0.00036275, 'samples': 278784, 'steps': 1451, 'loss/train': 1.6536110043525696} 01/26/2022 21:12:22 - INFO - codeparrot_training - Step 1452: {'lr': 0.000363, 'samples': 278976, 'steps': 1452, 'loss/train': 1.003242552280426} 01/26/2022 21:12:25 - INFO - codeparrot_training - Step 1453: {'lr': 0.00036325, 'samples': 279168, 'steps': 1453, 'loss/train': 0.8086393475532532} 01/26/2022 21:12:28 - INFO - codeparrot_training - Step 1454: {'lr': 0.0003635, 'samples': 279360, 'steps': 1454, 'loss/train': 0.7789215445518494} 01/26/2022 21:12:33 - INFO - codeparrot_training - Step 1455: {'lr': 0.00036375000000000003, 'samples': 279552, 'steps': 1455, 'loss/train': 0.8268111348152161} 01/26/2022 21:12:36 - INFO - codeparrot_training - Step 1456: {'lr': 0.000364, 'samples': 279744, 'steps': 1456, 'loss/train': 1.1320781707763672} 01/26/2022 21:12:39 - INFO - codeparrot_training - Step 1457: {'lr': 0.00036425000000000004, 'samples': 279936, 'steps': 1457, 'loss/train': 0.30621258169412613} 01/26/2022 21:12:42 - INFO - codeparrot_training - Step 1458: {'lr': 0.0003645, 'samples': 280128, 'steps': 1458, 'loss/train': 0.2289910465478897} 01/26/2022 21:12:45 - INFO - codeparrot_training - Step 1459: {'lr': 0.00036475, 'samples': 280320, 'steps': 1459, 'loss/train': 0.6802295297384262} 01/26/2022 21:12:48 - INFO - codeparrot_training - Step 1460: {'lr': 0.000365, 'samples': 280512, 'steps': 1460, 'loss/train': 0.41221417486667633} 01/26/2022 21:12:51 - INFO - codeparrot_training - Step 1461: {'lr': 0.00036525, 'samples': 280704, 'steps': 1461, 'loss/train': 0.7212480157613754} 01/26/2022 21:12:55 - INFO - codeparrot_training - Step 1462: {'lr': 0.0003655, 'samples': 280896, 'steps': 1462, 'loss/train': 1.1107229590415955} 01/26/2022 21:12:58 - INFO - codeparrot_training - Step 1463: {'lr': 0.00036575, 'samples': 281088, 'steps': 1463, 'loss/train': 1.149871587753296} 01/26/2022 21:13:04 - INFO - codeparrot_training - Step 1464: {'lr': 0.000366, 'samples': 281280, 'steps': 1464, 'loss/train': 0.6629093885421753} 01/26/2022 21:13:07 - INFO - codeparrot_training - Step 1465: {'lr': 0.00036625000000000004, 'samples': 281472, 'steps': 1465, 'loss/train': 0.6574883311986923} 01/26/2022 21:13:10 - INFO - codeparrot_training - Step 1466: {'lr': 0.0003665, 'samples': 281664, 'steps': 1466, 'loss/train': 0.3459419757127762} 01/26/2022 21:13:13 - INFO - codeparrot_training - Step 1467: {'lr': 0.00036675000000000005, 'samples': 281856, 'steps': 1467, 'loss/train': 1.0028822422027588} 01/26/2022 21:13:16 - INFO - codeparrot_training - Step 1468: {'lr': 0.000367, 'samples': 282048, 
'steps': 1468, 'loss/train': 0.8441137075424194} 01/26/2022 21:13:19 - INFO - codeparrot_training - Step 1469: {'lr': 0.00036725, 'samples': 282240, 'steps': 1469, 'loss/train': 0.7084958553314209} 01/26/2022 21:13:23 - INFO - codeparrot_training - Step 1470: {'lr': 0.0003675, 'samples': 282432, 'steps': 1470, 'loss/train': 0.8485589325428009} 01/26/2022 21:13:26 - INFO - codeparrot_training - Step 1471: {'lr': 0.00036775, 'samples': 282624, 'steps': 1471, 'loss/train': 0.964582085609436} 01/26/2022 21:13:29 - INFO - codeparrot_training - Step 1472: {'lr': 0.000368, 'samples': 282816, 'steps': 1472, 'loss/train': 1.0355412662029266} 01/26/2022 21:13:33 - INFO - codeparrot_training - Step 1473: {'lr': 0.00036825000000000003, 'samples': 283008, 'steps': 1473, 'loss/train': 1.048643320798874} 01/26/2022 21:13:37 - INFO - codeparrot_training - Step 1474: {'lr': 0.0003685, 'samples': 283200, 'steps': 1474, 'loss/train': 1.1607147753238678} 01/26/2022 21:13:40 - INFO - codeparrot_training - Step 1475: {'lr': 0.00036875000000000005, 'samples': 283392, 'steps': 1475, 'loss/train': 0.9776473045349121} 01/26/2022 21:13:43 - INFO - codeparrot_training - Step 1476: {'lr': 0.000369, 'samples': 283584, 'steps': 1476, 'loss/train': 0.8770821690559387} 01/26/2022 21:13:46 - INFO - codeparrot_training - Step 1477: {'lr': 0.00036925, 'samples': 283776, 'steps': 1477, 'loss/train': 0.19134023040533066} 01/26/2022 21:13:49 - INFO - codeparrot_training - Step 1478: {'lr': 0.0003695, 'samples': 283968, 'steps': 1478, 'loss/train': 1.9430841207504272} 01/26/2022 21:13:52 - INFO - codeparrot_training - Step 1479: {'lr': 0.00036975, 'samples': 284160, 'steps': 1479, 'loss/train': 1.4303966760635376} 01/26/2022 21:13:55 - INFO - codeparrot_training - Step 1480: {'lr': 0.00037, 'samples': 284352, 'steps': 1480, 'loss/train': 0.836928516626358} 01/26/2022 21:13:59 - INFO - codeparrot_training - Step 1481: {'lr': 0.00037025000000000003, 'samples': 284544, 'steps': 1481, 'loss/train': 0.5673790723085403} 01/26/2022 21:14:03 - INFO - codeparrot_training - Step 1482: {'lr': 0.0003705, 'samples': 284736, 'steps': 1482, 'loss/train': 1.098906934261322} 01/26/2022 21:14:06 - INFO - codeparrot_training - Step 1483: {'lr': 0.00037075000000000004, 'samples': 284928, 'steps': 1483, 'loss/train': 0.23770031332969666} 01/26/2022 21:14:10 - INFO - codeparrot_training - Step 1484: {'lr': 0.000371, 'samples': 285120, 'steps': 1484, 'loss/train': 1.1671381294727325} 01/26/2022 21:14:13 - INFO - codeparrot_training - Step 1485: {'lr': 0.00037125000000000005, 'samples': 285312, 'steps': 1485, 'loss/train': 0.1803583838045597} 01/26/2022 21:14:16 - INFO - codeparrot_training - Step 1486: {'lr': 0.00037150000000000003, 'samples': 285504, 'steps': 1486, 'loss/train': 0.8468064665794373} 01/26/2022 21:14:19 - INFO - codeparrot_training - Step 1487: {'lr': 0.00037175, 'samples': 285696, 'steps': 1487, 'loss/train': 0.9230948388576508} 01/26/2022 21:14:22 - INFO - codeparrot_training - Step 1488: {'lr': 0.000372, 'samples': 285888, 'steps': 1488, 'loss/train': 0.6031788736581802} 01/26/2022 21:14:25 - INFO - codeparrot_training - Step 1489: {'lr': 0.00037225, 'samples': 286080, 'steps': 1489, 'loss/train': 0.8745079636573792} 01/26/2022 21:14:28 - INFO - codeparrot_training - Step 1490: {'lr': 0.0003725, 'samples': 286272, 'steps': 1490, 'loss/train': 0.7502643764019012} 01/26/2022 21:14:35 - INFO - codeparrot_training - Step 1491: {'lr': 0.00037275000000000003, 'samples': 286464, 'steps': 1491, 'loss/train': 0.8902399241924286} 01/26/2022 
21:14:38 - INFO - codeparrot_training - Step 1492: {'lr': 0.000373, 'samples': 286656, 'steps': 1492, 'loss/train': 0.8958548605442047} 01/26/2022 21:14:41 - INFO - codeparrot_training - Step 1493: {'lr': 0.00037325000000000005, 'samples': 286848, 'steps': 1493, 'loss/train': 1.2120806872844696} 01/26/2022 21:14:45 - INFO - codeparrot_training - Step 1494: {'lr': 0.0003735, 'samples': 287040, 'steps': 1494, 'loss/train': 1.110617458820343} 01/26/2022 21:14:48 - INFO - codeparrot_training - Step 1495: {'lr': 0.00037375000000000006, 'samples': 287232, 'steps': 1495, 'loss/train': 4.196964383125305} 01/26/2022 21:14:51 - INFO - codeparrot_training - Step 1496: {'lr': 0.000374, 'samples': 287424, 'steps': 1496, 'loss/train': 1.0553580522537231} 01/26/2022 21:14:54 - INFO - codeparrot_training - Step 1497: {'lr': 0.00037425, 'samples': 287616, 'steps': 1497, 'loss/train': 2.0101332664489746} 01/26/2022 21:14:57 - INFO - codeparrot_training - Step 1498: {'lr': 0.0003745, 'samples': 287808, 'steps': 1498, 'loss/train': 0.8249520063400269} 01/26/2022 21:15:00 - INFO - codeparrot_training - Step 1499: {'lr': 0.00037475000000000003, 'samples': 288000, 'steps': 1499, 'loss/train': 0.48685017228126526} 01/26/2022 21:15:05 - INFO - codeparrot_training - Step 1500: {'lr': 0.000375, 'samples': 288192, 'steps': 1500, 'loss/train': 1.4895641505718231} 01/26/2022 21:15:08 - INFO - codeparrot_training - Step 1501: {'lr': 0.00037525, 'samples': 288384, 'steps': 1501, 'loss/train': 0.2093707099556923} 01/26/2022 21:15:11 - INFO - codeparrot_training - Step 1502: {'lr': 0.0003755, 'samples': 288576, 'steps': 1502, 'loss/train': 0.5461329817771912} 01/26/2022 21:15:14 - INFO - codeparrot_training - Step 1503: {'lr': 0.00037575, 'samples': 288768, 'steps': 1503, 'loss/train': 1.0211555063724518} 01/26/2022 21:15:17 - INFO - codeparrot_training - Step 1504: {'lr': 0.00037600000000000003, 'samples': 288960, 'steps': 1504, 'loss/train': 1.2740503549575806} 01/26/2022 21:15:20 - INFO - codeparrot_training - Step 1505: {'lr': 0.00037624999999999996, 'samples': 289152, 'steps': 1505, 'loss/train': 0.6302264928817749} 01/26/2022 21:15:23 - INFO - codeparrot_training - Step 1506: {'lr': 0.0003765, 'samples': 289344, 'steps': 1506, 'loss/train': 1.5738256573677063} 01/26/2022 21:15:27 - INFO - codeparrot_training - Step 1507: {'lr': 0.00037674999999999997, 'samples': 289536, 'steps': 1507, 'loss/train': 0.9112080037593842} 01/26/2022 21:15:30 - INFO - codeparrot_training - Step 1508: {'lr': 0.000377, 'samples': 289728, 'steps': 1508, 'loss/train': 0.9607047736644745} 01/26/2022 21:15:34 - INFO - codeparrot_training - Step 1509: {'lr': 0.00037725, 'samples': 289920, 'steps': 1509, 'loss/train': 0.4343150854110718} 01/26/2022 21:15:37 - INFO - codeparrot_training - Step 1510: {'lr': 0.0003775, 'samples': 290112, 'steps': 1510, 'loss/train': 1.0953315496444702} 01/26/2022 21:15:40 - INFO - codeparrot_training - Step 1511: {'lr': 0.00037775, 'samples': 290304, 'steps': 1511, 'loss/train': 0.8842317759990692} 01/26/2022 21:15:44 - INFO - codeparrot_training - Step 1512: {'lr': 0.000378, 'samples': 290496, 'steps': 1512, 'loss/train': 0.7734381258487701} 01/26/2022 21:15:47 - INFO - codeparrot_training - Step 1513: {'lr': 0.00037825, 'samples': 290688, 'steps': 1513, 'loss/train': 0.5693178176879883} 01/26/2022 21:15:50 - INFO - codeparrot_training - Step 1514: {'lr': 0.0003785, 'samples': 290880, 'steps': 1514, 'loss/train': 0.7337125092744827} 01/26/2022 21:15:53 - INFO - codeparrot_training - Step 1515: {'lr': 
0.00037874999999999996, 'samples': 291072, 'steps': 1515, 'loss/train': 0.7267966568470001} 01/26/2022 21:15:56 - INFO - codeparrot_training - Step 1516: {'lr': 0.000379, 'samples': 291264, 'steps': 1516, 'loss/train': 1.007617861032486} 01/26/2022 21:15:59 - INFO - codeparrot_training - Step 1517: {'lr': 0.00037925, 'samples': 291456, 'steps': 1517, 'loss/train': 1.068414956331253} 01/26/2022 21:16:05 - INFO - codeparrot_training - Step 1518: {'lr': 0.0003795, 'samples': 291648, 'steps': 1518, 'loss/train': 0.7719941139221191} 01/26/2022 21:16:08 - INFO - codeparrot_training - Step 1519: {'lr': 0.00037975, 'samples': 291840, 'steps': 1519, 'loss/train': 1.0266875624656677} 01/26/2022 21:16:11 - INFO - codeparrot_training - Step 1520: {'lr': 0.00038, 'samples': 292032, 'steps': 1520, 'loss/train': 1.0779113173484802} 01/26/2022 21:16:14 - INFO - codeparrot_training - Step 1521: {'lr': 0.00038025, 'samples': 292224, 'steps': 1521, 'loss/train': 1.1040978133678436} 01/26/2022 21:16:17 - INFO - codeparrot_training - Step 1522: {'lr': 0.00038050000000000003, 'samples': 292416, 'steps': 1522, 'loss/train': 0.7414789348840714} 01/26/2022 21:16:21 - INFO - codeparrot_training - Step 1523: {'lr': 0.00038075, 'samples': 292608, 'steps': 1523, 'loss/train': 1.0996066331863403} 01/26/2022 21:16:24 - INFO - codeparrot_training - Step 1524: {'lr': 0.000381, 'samples': 292800, 'steps': 1524, 'loss/train': 0.7356545180082321} 01/26/2022 21:16:27 - INFO - codeparrot_training - Step 1525: {'lr': 0.00038124999999999997, 'samples': 292992, 'steps': 1525, 'loss/train': 0.6968617290258408} 01/26/2022 21:16:31 - INFO - codeparrot_training - Step 1526: {'lr': 0.0003815, 'samples': 293184, 'steps': 1526, 'loss/train': 0.9847961962223053} 01/26/2022 21:16:34 - INFO - codeparrot_training - Step 1527: {'lr': 0.00038175, 'samples': 293376, 'steps': 1527, 'loss/train': 1.3369852602481842} 01/26/2022 21:16:38 - INFO - codeparrot_training - Step 1528: {'lr': 0.000382, 'samples': 293568, 'steps': 1528, 'loss/train': 1.1806872189044952} 01/26/2022 21:16:41 - INFO - codeparrot_training - Step 1529: {'lr': 0.00038225, 'samples': 293760, 'steps': 1529, 'loss/train': 1.4961203634738922} 01/26/2022 21:16:44 - INFO - codeparrot_training - Step 1530: {'lr': 0.00038250000000000003, 'samples': 293952, 'steps': 1530, 'loss/train': 1.1804668307304382} 01/26/2022 21:16:47 - INFO - codeparrot_training - Step 1531: {'lr': 0.00038275, 'samples': 294144, 'steps': 1531, 'loss/train': 0.5334660708904266} 01/26/2022 21:16:50 - INFO - codeparrot_training - Step 1532: {'lr': 0.00038300000000000004, 'samples': 294336, 'steps': 1532, 'loss/train': 1.3035263121128082} 01/26/2022 21:16:53 - INFO - codeparrot_training - Step 1533: {'lr': 0.00038324999999999996, 'samples': 294528, 'steps': 1533, 'loss/train': 0.8394831418991089} 01/26/2022 21:16:56 - INFO - codeparrot_training - Step 1534: {'lr': 0.0003835, 'samples': 294720, 'steps': 1534, 'loss/train': 0.5574247241020203} 01/26/2022 21:17:01 - INFO - codeparrot_training - Step 1535: {'lr': 0.00038375, 'samples': 294912, 'steps': 1535, 'loss/train': 1.1966947317123413} 01/26/2022 21:17:04 - INFO - codeparrot_training - Step 1536: {'lr': 0.000384, 'samples': 295104, 'steps': 1536, 'loss/train': 0.9524939060211182} 01/26/2022 21:17:07 - INFO - codeparrot_training - Step 1537: {'lr': 0.00038425, 'samples': 295296, 'steps': 1537, 'loss/train': 0.9000881016254425} 01/26/2022 21:17:10 - INFO - codeparrot_training - Step 1538: {'lr': 0.0003845, 'samples': 295488, 'steps': 1538, 'loss/train': 
1.6516569256782532} 01/26/2022 21:17:14 - INFO - codeparrot_training - Step 1539: {'lr': 0.00038475, 'samples': 295680, 'steps': 1539, 'loss/train': 1.3666580021381378} 01/26/2022 21:17:17 - INFO - codeparrot_training - Step 1540: {'lr': 0.00038500000000000003, 'samples': 295872, 'steps': 1540, 'loss/train': 1.0497311353683472} 01/26/2022 21:17:20 - INFO - codeparrot_training - Step 1541: {'lr': 0.00038525, 'samples': 296064, 'steps': 1541, 'loss/train': 0.8218839168548584} 01/26/2022 21:17:23 - INFO - codeparrot_training - Step 1542: {'lr': 0.0003855, 'samples': 296256, 'steps': 1542, 'loss/train': 0.9426329433917999} 01/26/2022 21:17:26 - INFO - codeparrot_training - Step 1543: {'lr': 0.00038574999999999997, 'samples': 296448, 'steps': 1543, 'loss/train': 1.2239013612270355} 01/26/2022 21:17:31 - INFO - codeparrot_training - Step 1544: {'lr': 0.000386, 'samples': 296640, 'steps': 1544, 'loss/train': 0.5781396925449371} 01/26/2022 21:17:34 - INFO - codeparrot_training - Step 1545: {'lr': 0.00038625, 'samples': 296832, 'steps': 1545, 'loss/train': 0.8666767179965973} 01/26/2022 21:17:37 - INFO - codeparrot_training - Step 1546: {'lr': 0.0003865, 'samples': 297024, 'steps': 1546, 'loss/train': 0.18094372004270554} 01/26/2022 21:17:40 - INFO - codeparrot_training - Step 1547: {'lr': 0.00038675, 'samples': 297216, 'steps': 1547, 'loss/train': 0.9551449120044708} 01/26/2022 21:17:43 - INFO - codeparrot_training - Step 1548: {'lr': 0.00038700000000000003, 'samples': 297408, 'steps': 1548, 'loss/train': 0.5519776493310928} 01/26/2022 21:17:46 - INFO - codeparrot_training - Step 1549: {'lr': 0.00038725, 'samples': 297600, 'steps': 1549, 'loss/train': 1.1795437037944794} 01/26/2022 21:17:50 - INFO - codeparrot_training - Step 1550: {'lr': 0.00038750000000000004, 'samples': 297792, 'steps': 1550, 'loss/train': 1.1903632879257202} 01/26/2022 21:17:53 - INFO - codeparrot_training - Step 1551: {'lr': 0.00038774999999999997, 'samples': 297984, 'steps': 1551, 'loss/train': 1.211746484041214} 01/26/2022 21:17:56 - INFO - codeparrot_training - Step 1552: {'lr': 0.000388, 'samples': 298176, 'steps': 1552, 'loss/train': 1.132394939661026} 01/26/2022 21:18:02 - INFO - codeparrot_training - Step 1553: {'lr': 0.00038825, 'samples': 298368, 'steps': 1553, 'loss/train': 1.5255194306373596} 01/26/2022 21:18:05 - INFO - codeparrot_training - Step 1554: {'lr': 0.0003885, 'samples': 298560, 'steps': 1554, 'loss/train': 0.9148447215557098} 01/26/2022 21:18:09 - INFO - codeparrot_training - Step 1555: {'lr': 0.00038875, 'samples': 298752, 'steps': 1555, 'loss/train': 0.8803000152111053} 01/26/2022 21:18:12 - INFO - codeparrot_training - Step 1556: {'lr': 0.000389, 'samples': 298944, 'steps': 1556, 'loss/train': 1.1481870710849762} 01/26/2022 21:18:15 - INFO - codeparrot_training - Step 1557: {'lr': 0.00038925, 'samples': 299136, 'steps': 1557, 'loss/train': 0.44970184564590454} 01/26/2022 21:18:18 - INFO - codeparrot_training - Step 1558: {'lr': 0.00038950000000000003, 'samples': 299328, 'steps': 1558, 'loss/train': 1.0816842019557953} 01/26/2022 21:18:21 - INFO - codeparrot_training - Step 1559: {'lr': 0.00038975, 'samples': 299520, 'steps': 1559, 'loss/train': 0.6674492508172989} 01/26/2022 21:18:24 - INFO - codeparrot_training - Step 1560: {'lr': 0.00039000000000000005, 'samples': 299712, 'steps': 1560, 'loss/train': 0.9751677811145782} 01/26/2022 21:18:27 - INFO - codeparrot_training - Step 1561: {'lr': 0.00039024999999999997, 'samples': 299904, 'steps': 1561, 'loss/train': 0.43891793489456177} 01/26/2022 21:18:32 
- INFO - codeparrot_training - Step 1562: {'lr': 0.0003905, 'samples': 300096, 'steps': 1562, 'loss/train': 1.0710235834121704} 01/26/2022 21:18:35 - INFO - codeparrot_training - Step 1563: {'lr': 0.00039075, 'samples': 300288, 'steps': 1563, 'loss/train': 0.9703768193721771} 01/26/2022 21:18:38 - INFO - codeparrot_training - Step 1564: {'lr': 0.000391, 'samples': 300480, 'steps': 1564, 'loss/train': 1.481842964887619} 01/26/2022 21:18:41 - INFO - codeparrot_training - Step 1565: {'lr': 0.00039125, 'samples': 300672, 'steps': 1565, 'loss/train': 0.8194465041160583} 01/26/2022 21:18:44 - INFO - codeparrot_training - Step 1566: {'lr': 0.00039150000000000003, 'samples': 300864, 'steps': 1566, 'loss/train': 1.0495053827762604} 01/26/2022 21:18:47 - INFO - codeparrot_training - Step 1567: {'lr': 0.00039175, 'samples': 301056, 'steps': 1567, 'loss/train': 0.9151850938796997} 01/26/2022 21:18:51 - INFO - codeparrot_training - Step 1568: {'lr': 0.00039200000000000004, 'samples': 301248, 'steps': 1568, 'loss/train': 0.6693004667758942} 01/26/2022 21:18:54 - INFO - codeparrot_training - Step 1569: {'lr': 0.00039225, 'samples': 301440, 'steps': 1569, 'loss/train': 1.0176717638969421} 01/26/2022 21:18:57 - INFO - codeparrot_training - Step 1570: {'lr': 0.0003925, 'samples': 301632, 'steps': 1570, 'loss/train': 0.900753915309906} 01/26/2022 21:19:03 - INFO - codeparrot_training - Step 1571: {'lr': 0.00039275, 'samples': 301824, 'steps': 1571, 'loss/train': 0.5285711288452148} 01/26/2022 21:19:06 - INFO - codeparrot_training - Step 1572: {'lr': 0.000393, 'samples': 302016, 'steps': 1572, 'loss/train': 1.1502817869186401} 01/26/2022 21:19:09 - INFO - codeparrot_training - Step 1573: {'lr': 0.00039325, 'samples': 302208, 'steps': 1573, 'loss/train': 0.6094715595245361} 01/26/2022 21:19:12 - INFO - codeparrot_training - Step 1574: {'lr': 0.0003935, 'samples': 302400, 'steps': 1574, 'loss/train': 0.6199387907981873} 01/26/2022 21:19:15 - INFO - codeparrot_training - Step 1575: {'lr': 0.00039375, 'samples': 302592, 'steps': 1575, 'loss/train': 1.0760919749736786} 01/26/2022 21:19:19 - INFO - codeparrot_training - Step 1576: {'lr': 0.00039400000000000004, 'samples': 302784, 'steps': 1576, 'loss/train': 1.1562922596931458} 01/26/2022 21:19:22 - INFO - codeparrot_training - Step 1577: {'lr': 0.00039425, 'samples': 302976, 'steps': 1577, 'loss/train': 1.089037925004959} 01/26/2022 21:19:25 - INFO - codeparrot_training - Step 1578: {'lr': 0.00039450000000000005, 'samples': 303168, 'steps': 1578, 'loss/train': 1.0494210720062256} 01/26/2022 21:19:29 - INFO - codeparrot_training - Step 1579: {'lr': 0.00039474999999999997, 'samples': 303360, 'steps': 1579, 'loss/train': 0.9432246387004852} 01/26/2022 21:19:33 - INFO - codeparrot_training - Step 1580: {'lr': 0.000395, 'samples': 303552, 'steps': 1580, 'loss/train': 0.7735198438167572} 01/26/2022 21:19:36 - INFO - codeparrot_training - Step 1581: {'lr': 0.00039525, 'samples': 303744, 'steps': 1581, 'loss/train': 1.1471571028232574} 01/26/2022 21:19:39 - INFO - codeparrot_training - Step 1582: {'lr': 0.0003955, 'samples': 303936, 'steps': 1582, 'loss/train': 1.1255502998828888} 01/26/2022 21:19:42 - INFO - codeparrot_training - Step 1583: {'lr': 0.00039575, 'samples': 304128, 'steps': 1583, 'loss/train': 1.1929344534873962} 01/26/2022 21:19:45 - INFO - codeparrot_training - Step 1584: {'lr': 0.00039600000000000003, 'samples': 304320, 'steps': 1584, 'loss/train': 0.7604456841945648} 01/26/2022 21:19:48 - INFO - codeparrot_training - Step 1585: {'lr': 0.00039625, 
'samples': 304512, 'steps': 1585, 'loss/train': 0.886058896780014} 01/26/2022 21:19:51 - INFO - codeparrot_training - Step 1586: {'lr': 0.00039650000000000004, 'samples': 304704, 'steps': 1586, 'loss/train': 1.392673820257187} 01/26/2022 21:19:55 - INFO - codeparrot_training - Step 1587: {'lr': 0.00039675, 'samples': 304896, 'steps': 1587, 'loss/train': 1.0805667042732239} 01/26/2022 21:19:59 - INFO - codeparrot_training - Step 1588: {'lr': 0.00039700000000000005, 'samples': 305088, 'steps': 1588, 'loss/train': 0.7766793072223663} 01/26/2022 21:20:02 - INFO - codeparrot_training - Step 1589: {'lr': 0.00039725, 'samples': 305280, 'steps': 1589, 'loss/train': 0.9202056527137756} 01/26/2022 21:20:05 - INFO - codeparrot_training - Step 1590: {'lr': 0.0003975, 'samples': 305472, 'steps': 1590, 'loss/train': 0.42704272270202637} 01/26/2022 21:20:08 - INFO - codeparrot_training - Step 1591: {'lr': 0.00039775, 'samples': 305664, 'steps': 1591, 'loss/train': 0.937246710062027} 01/26/2022 21:20:12 - INFO - codeparrot_training - Step 1592: {'lr': 0.000398, 'samples': 305856, 'steps': 1592, 'loss/train': 1.04826021194458} 01/26/2022 21:20:15 - INFO - codeparrot_training - Step 1593: {'lr': 0.00039825, 'samples': 306048, 'steps': 1593, 'loss/train': 1.472885549068451} 01/26/2022 21:20:18 - INFO - codeparrot_training - Step 1594: {'lr': 0.00039850000000000004, 'samples': 306240, 'steps': 1594, 'loss/train': 0.7873425185680389} 01/26/2022 21:20:21 - INFO - codeparrot_training - Step 1595: {'lr': 0.00039875, 'samples': 306432, 'steps': 1595, 'loss/train': 0.7252905517816544} 01/26/2022 21:20:24 - INFO - codeparrot_training - Step 1596: {'lr': 0.00039900000000000005, 'samples': 306624, 'steps': 1596, 'loss/train': 1.0548094511032104} 01/26/2022 21:20:30 - INFO - codeparrot_training - Step 1597: {'lr': 0.00039925000000000003, 'samples': 306816, 'steps': 1597, 'loss/train': 1.1972896456718445} 01/26/2022 21:20:33 - INFO - codeparrot_training - Step 1598: {'lr': 0.0003995, 'samples': 307008, 'steps': 1598, 'loss/train': 0.8496110737323761} 01/26/2022 21:20:36 - INFO - codeparrot_training - Step 1599: {'lr': 0.00039975, 'samples': 307200, 'steps': 1599, 'loss/train': 0.8173715472221375} 01/26/2022 21:20:40 - INFO - codeparrot_training - Step 1600: {'lr': 0.0004, 'samples': 307392, 'steps': 1600, 'loss/train': 1.2981213927268982} 01/26/2022 21:20:43 - INFO - codeparrot_training - Step 1601: {'lr': 0.00040025, 'samples': 307584, 'steps': 1601, 'loss/train': 0.8052924871444702} 01/26/2022 21:20:46 - INFO - codeparrot_training - Step 1602: {'lr': 0.00040050000000000003, 'samples': 307776, 'steps': 1602, 'loss/train': 0.903513103723526} 01/26/2022 21:20:49 - INFO - codeparrot_training - Step 1603: {'lr': 0.00040075, 'samples': 307968, 'steps': 1603, 'loss/train': 0.8908604979515076} 01/26/2022 21:20:52 - INFO - codeparrot_training - Step 1604: {'lr': 0.00040100000000000004, 'samples': 308160, 'steps': 1604, 'loss/train': 1.2467265129089355} 01/26/2022 21:20:57 - INFO - codeparrot_training - Step 1605: {'lr': 0.00040125, 'samples': 308352, 'steps': 1605, 'loss/train': 1.1703225374221802} 01/26/2022 21:21:00 - INFO - codeparrot_training - Step 1606: {'lr': 0.00040150000000000006, 'samples': 308544, 'steps': 1606, 'loss/train': 0.9032231569290161} 01/26/2022 21:21:03 - INFO - codeparrot_training - Step 1607: {'lr': 0.00040175, 'samples': 308736, 'steps': 1607, 'loss/train': 0.570895716547966} 01/26/2022 21:21:06 - INFO - codeparrot_training - Step 1608: {'lr': 0.000402, 'samples': 308928, 'steps': 1608, 'loss/train': 
1.2200436294078827} 01/26/2022 21:21:09 - INFO - codeparrot_training - Step 1609: {'lr': 0.00040225, 'samples': 309120, 'steps': 1609, 'loss/train': 0.8977490365505219} 01/26/2022 21:21:12 - INFO - codeparrot_training - Step 1610: {'lr': 0.0004025, 'samples': 309312, 'steps': 1610, 'loss/train': 0.9952645003795624} 01/26/2022 21:21:15 - INFO - codeparrot_training - Step 1611: {'lr': 0.00040275, 'samples': 309504, 'steps': 1611, 'loss/train': 0.902794361114502} 01/26/2022 21:21:19 - INFO - codeparrot_training - Step 1612: {'lr': 0.00040300000000000004, 'samples': 309696, 'steps': 1612, 'loss/train': 0.9980608820915222} 01/26/2022 21:21:22 - INFO - codeparrot_training - Step 1613: {'lr': 0.00040325, 'samples': 309888, 'steps': 1613, 'loss/train': 1.1632069051265717} 01/26/2022 21:21:28 - INFO - codeparrot_training - Step 1614: {'lr': 0.00040350000000000005, 'samples': 310080, 'steps': 1614, 'loss/train': 0.8721232116222382} 01/26/2022 21:21:31 - INFO - codeparrot_training - Step 1615: {'lr': 0.00040375000000000003, 'samples': 310272, 'steps': 1615, 'loss/train': 0.6417164504528046} 01/26/2022 21:21:35 - INFO - codeparrot_training - Step 1616: {'lr': 0.000404, 'samples': 310464, 'steps': 1616, 'loss/train': 1.0448070466518402} 01/26/2022 21:21:38 - INFO - codeparrot_training - Step 1617: {'lr': 0.00040425, 'samples': 310656, 'steps': 1617, 'loss/train': 0.763612300157547} 01/26/2022 21:21:41 - INFO - codeparrot_training - Step 1618: {'lr': 0.0004045, 'samples': 310848, 'steps': 1618, 'loss/train': 0.9192868173122406} 01/26/2022 21:21:44 - INFO - codeparrot_training - Step 1619: {'lr': 0.00040475, 'samples': 311040, 'steps': 1619, 'loss/train': 0.8964953720569611} 01/26/2022 21:21:47 - INFO - codeparrot_training - Step 1620: {'lr': 0.00040500000000000003, 'samples': 311232, 'steps': 1620, 'loss/train': 0.4305136352777481} 01/26/2022 21:21:50 - INFO - codeparrot_training - Step 1621: {'lr': 0.00040525, 'samples': 311424, 'steps': 1621, 'loss/train': 1.2225679457187653} 01/26/2022 21:21:53 - INFO - codeparrot_training - Step 1622: {'lr': 0.00040550000000000004, 'samples': 311616, 'steps': 1622, 'loss/train': 0.8143474459648132} 01/26/2022 21:21:58 - INFO - codeparrot_training - Step 1623: {'lr': 0.00040575, 'samples': 311808, 'steps': 1623, 'loss/train': 0.9397753179073334} 01/26/2022 21:22:01 - INFO - codeparrot_training - Step 1624: {'lr': 0.00040600000000000006, 'samples': 312000, 'steps': 1624, 'loss/train': 0.9602738320827484} 01/26/2022 21:22:04 - INFO - codeparrot_training - Step 1625: {'lr': 0.00040625000000000004, 'samples': 312192, 'steps': 1625, 'loss/train': 1.1994936168193817} 01/26/2022 21:22:07 - INFO - codeparrot_training - Step 1626: {'lr': 0.00040649999999999996, 'samples': 312384, 'steps': 1626, 'loss/train': 0.852255642414093} 01/26/2022 21:22:10 - INFO - codeparrot_training - Step 1627: {'lr': 0.00040675, 'samples': 312576, 'steps': 1627, 'loss/train': 0.7201205044984818} 01/26/2022 21:22:13 - INFO - codeparrot_training - Step 1628: {'lr': 0.00040699999999999997, 'samples': 312768, 'steps': 1628, 'loss/train': 0.7200788408517838} 01/26/2022 21:22:17 - INFO - codeparrot_training - Step 1629: {'lr': 0.00040725, 'samples': 312960, 'steps': 1629, 'loss/train': 1.2039445638656616} 01/26/2022 21:22:20 - INFO - codeparrot_training - Step 1630: {'lr': 0.0004075, 'samples': 313152, 'steps': 1630, 'loss/train': 0.8602632880210876} 01/26/2022 21:22:23 - INFO - codeparrot_training - Step 1631: {'lr': 0.00040775, 'samples': 313344, 'steps': 1631, 'loss/train': 0.3987828344106674} 
01/26/2022 21:22:27 - INFO - codeparrot_training - Step 1632: {'lr': 0.000408, 'samples': 313536, 'steps': 1632, 'loss/train': 1.8231298327445984} 01/26/2022 21:22:30 - INFO - codeparrot_training - Step 1633: {'lr': 0.00040825000000000003, 'samples': 313728, 'steps': 1633, 'loss/train': 0.7313338816165924} 01/26/2022 21:22:33 - INFO - codeparrot_training - Step 1634: {'lr': 0.0004085, 'samples': 313920, 'steps': 1634, 'loss/train': 0.9621573686599731} 01/26/2022 21:22:37 - INFO - codeparrot_training - Step 1635: {'lr': 0.00040875, 'samples': 314112, 'steps': 1635, 'loss/train': 1.0393626987934113} 01/26/2022 21:22:40 - INFO - codeparrot_training - Step 1636: {'lr': 0.00040899999999999997, 'samples': 314304, 'steps': 1636, 'loss/train': 0.6312345117330551} 01/26/2022 21:22:43 - INFO - codeparrot_training - Step 1637: {'lr': 0.00040925, 'samples': 314496, 'steps': 1637, 'loss/train': 1.0294510424137115} 01/26/2022 21:22:46 - INFO - codeparrot_training - Step 1638: {'lr': 0.0004095, 'samples': 314688, 'steps': 1638, 'loss/train': 0.5636235326528549} 01/26/2022 21:22:49 - INFO - codeparrot_training - Step 1639: {'lr': 0.00040975, 'samples': 314880, 'steps': 1639, 'loss/train': 1.4927867352962494} 01/26/2022 21:22:52 - INFO - codeparrot_training - Step 1640: {'lr': 0.00041, 'samples': 315072, 'steps': 1640, 'loss/train': 1.0345790684223175} 01/26/2022 21:22:57 - INFO - codeparrot_training - Step 1641: {'lr': 0.00041025, 'samples': 315264, 'steps': 1641, 'loss/train': 0.38186345994472504} 01/26/2022 21:23:00 - INFO - codeparrot_training - Step 1642: {'lr': 0.0004105, 'samples': 315456, 'steps': 1642, 'loss/train': 0.9589768052101135} 01/26/2022 21:23:03 - INFO - codeparrot_training - Step 1643: {'lr': 0.00041075000000000004, 'samples': 315648, 'steps': 1643, 'loss/train': 1.0343111157417297} 01/26/2022 21:23:06 - INFO - codeparrot_training - Step 1644: {'lr': 0.00041099999999999996, 'samples': 315840, 'steps': 1644, 'loss/train': 0.7044095993041992} 01/26/2022 21:23:09 - INFO - codeparrot_training - Step 1645: {'lr': 0.00041125, 'samples': 316032, 'steps': 1645, 'loss/train': 0.8021068274974823} 01/26/2022 21:23:13 - INFO - codeparrot_training - Step 1646: {'lr': 0.0004115, 'samples': 316224, 'steps': 1646, 'loss/train': 0.7520883679389954} 01/26/2022 21:23:16 - INFO - codeparrot_training - Step 1647: {'lr': 0.00041175, 'samples': 316416, 'steps': 1647, 'loss/train': 0.47065356373786926} 01/26/2022 21:23:19 - INFO - codeparrot_training - Step 1648: {'lr': 0.000412, 'samples': 316608, 'steps': 1648, 'loss/train': 1.2325392365455627} 01/26/2022 21:23:24 - INFO - codeparrot_training - Step 1649: {'lr': 0.00041225, 'samples': 316800, 'steps': 1649, 'loss/train': 1.624911367893219} 01/26/2022 21:23:28 - INFO - codeparrot_training - Step 1650: {'lr': 0.0004125, 'samples': 316992, 'steps': 1650, 'loss/train': 0.9685000777244568} 01/26/2022 21:23:31 - INFO - codeparrot_training - Step 1651: {'lr': 0.00041275000000000003, 'samples': 317184, 'steps': 1651, 'loss/train': 0.7986305952072144} 01/26/2022 21:23:34 - INFO - codeparrot_training - Step 1652: {'lr': 0.000413, 'samples': 317376, 'steps': 1652, 'loss/train': 0.8410978317260742} 01/26/2022 21:23:37 - INFO - codeparrot_training - Step 1653: {'lr': 0.00041325, 'samples': 317568, 'steps': 1653, 'loss/train': 0.6380593925714493} 01/26/2022 21:23:40 - INFO - codeparrot_training - Step 1654: {'lr': 0.00041349999999999997, 'samples': 317760, 'steps': 1654, 'loss/train': 0.7542805373668671} 01/26/2022 21:23:43 - INFO - codeparrot_training - Step 1655: {'lr': 
0.00041375, 'samples': 317952, 'steps': 1655, 'loss/train': 0.8882549107074738} 01/26/2022 21:23:46 - INFO - codeparrot_training - Step 1656: {'lr': 0.000414, 'samples': 318144, 'steps': 1656, 'loss/train': 0.4313184767961502} 01/26/2022 21:23:50 - INFO - codeparrot_training - Step 1657: {'lr': 0.00041425, 'samples': 318336, 'steps': 1657, 'loss/train': 1.1223379075527191} 01/26/2022 21:23:54 - INFO - codeparrot_training - Step 1658: {'lr': 0.0004145, 'samples': 318528, 'steps': 1658, 'loss/train': 1.3333961963653564} 01/26/2022 21:23:57 - INFO - codeparrot_training - Step 1659: {'lr': 0.00041475, 'samples': 318720, 'steps': 1659, 'loss/train': 0.7927728295326233} 01/26/2022 21:24:00 - INFO - codeparrot_training - Step 1660: {'lr': 0.000415, 'samples': 318912, 'steps': 1660, 'loss/train': 1.2785732746124268} 01/26/2022 21:24:04 - INFO - codeparrot_training - Step 1661: {'lr': 0.00041525000000000004, 'samples': 319104, 'steps': 1661, 'loss/train': 0.15419504418969154} 01/26/2022 21:24:07 - INFO - codeparrot_training - Step 1662: {'lr': 0.00041549999999999996, 'samples': 319296, 'steps': 1662, 'loss/train': 1.1597148478031158} 01/26/2022 21:24:10 - INFO - codeparrot_training - Step 1663: {'lr': 0.00041575, 'samples': 319488, 'steps': 1663, 'loss/train': 1.2206933498382568} 01/26/2022 21:24:13 - INFO - codeparrot_training - Step 1664: {'lr': 0.000416, 'samples': 319680, 'steps': 1664, 'loss/train': 0.9429126977920532} 01/26/2022 21:24:16 - INFO - codeparrot_training - Step 1665: {'lr': 0.00041625, 'samples': 319872, 'steps': 1665, 'loss/train': 2.2514559030532837} 01/26/2022 21:24:19 - INFO - codeparrot_training - Step 1666: {'lr': 0.0004165, 'samples': 320064, 'steps': 1666, 'loss/train': 0.6221943497657776} 01/26/2022 21:24:24 - INFO - codeparrot_training - Step 1667: {'lr': 0.00041675, 'samples': 320256, 'steps': 1667, 'loss/train': 1.1753460466861725} 01/26/2022 21:24:27 - INFO - codeparrot_training - Step 1668: {'lr': 0.000417, 'samples': 320448, 'steps': 1668, 'loss/train': 1.175249844789505} 01/26/2022 21:24:30 - INFO - codeparrot_training - Step 1669: {'lr': 0.00041725000000000003, 'samples': 320640, 'steps': 1669, 'loss/train': 1.073170781135559} 01/26/2022 21:24:33 - INFO - codeparrot_training - Step 1670: {'lr': 0.0004175, 'samples': 320832, 'steps': 1670, 'loss/train': 0.5739630460739136} 01/26/2022 21:24:36 - INFO - codeparrot_training - Step 1671: {'lr': 0.00041775000000000004, 'samples': 321024, 'steps': 1671, 'loss/train': 0.6225871592760086} 01/26/2022 21:24:39 - INFO - codeparrot_training - Step 1672: {'lr': 0.00041799999999999997, 'samples': 321216, 'steps': 1672, 'loss/train': 0.7493513077497482} 01/26/2022 21:24:43 - INFO - codeparrot_training - Step 1673: {'lr': 0.00041825, 'samples': 321408, 'steps': 1673, 'loss/train': 1.1422923803329468} 01/26/2022 21:24:46 - INFO - codeparrot_training - Step 1674: {'lr': 0.0004185, 'samples': 321600, 'steps': 1674, 'loss/train': 0.8022510409355164} 01/26/2022 21:24:49 - INFO - codeparrot_training - Step 1675: {'lr': 0.00041875, 'samples': 321792, 'steps': 1675, 'loss/train': 0.9691024124622345} 01/26/2022 21:24:55 - INFO - codeparrot_training - Step 1676: {'lr': 0.000419, 'samples': 321984, 'steps': 1676, 'loss/train': 0.40006019175052643} 01/26/2022 21:24:58 - INFO - codeparrot_training - Step 1677: {'lr': 0.00041925, 'samples': 322176, 'steps': 1677, 'loss/train': 1.0653008222579956} 01/26/2022 21:25:02 - INFO - codeparrot_training - Step 1678: {'lr': 0.0004195, 'samples': 322368, 'steps': 1678, 'loss/train': 0.6026585251092911} 
01/26/2022 21:25:05 - INFO - codeparrot_training - Step 1679: {'lr': 0.00041975000000000004, 'samples': 322560, 'steps': 1679, 'loss/train': 1.0619997382164001} 01/26/2022 21:25:08 - INFO - codeparrot_training - Step 1680: {'lr': 0.00042, 'samples': 322752, 'steps': 1680, 'loss/train': 1.0160093307495117} 01/26/2022 21:25:11 - INFO - codeparrot_training - Step 1681: {'lr': 0.00042025, 'samples': 322944, 'steps': 1681, 'loss/train': 0.18223318085074425} 01/26/2022 21:25:14 - INFO - codeparrot_training - Step 1682: {'lr': 0.0004205, 'samples': 323136, 'steps': 1682, 'loss/train': 1.0753837823867798} 01/26/2022 21:25:17 - INFO - codeparrot_training - Step 1683: {'lr': 0.00042075, 'samples': 323328, 'steps': 1683, 'loss/train': 0.9113363027572632} 01/26/2022 21:25:20 - INFO - codeparrot_training - Step 1684: {'lr': 0.000421, 'samples': 323520, 'steps': 1684, 'loss/train': 0.8418412506580353} 01/26/2022 21:25:25 - INFO - codeparrot_training - Step 1685: {'lr': 0.00042125, 'samples': 323712, 'steps': 1685, 'loss/train': 0.8789848387241364} 01/26/2022 21:25:28 - INFO - codeparrot_training - Step 1686: {'lr': 0.0004215, 'samples': 323904, 'steps': 1686, 'loss/train': 0.8070947527885437} 01/26/2022 21:25:32 - INFO - codeparrot_training - Step 1687: {'lr': 0.00042175000000000003, 'samples': 324096, 'steps': 1687, 'loss/train': 1.122605949640274} 01/26/2022 21:25:35 - INFO - codeparrot_training - Step 1688: {'lr': 0.000422, 'samples': 324288, 'steps': 1688, 'loss/train': 1.000831425189972} 01/26/2022 21:25:38 - INFO - codeparrot_training - Step 1689: {'lr': 0.00042225000000000005, 'samples': 324480, 'steps': 1689, 'loss/train': 1.348246157169342} 01/26/2022 21:25:41 - INFO - codeparrot_training - Step 1690: {'lr': 0.00042249999999999997, 'samples': 324672, 'steps': 1690, 'loss/train': 0.4896686375141144} 01/26/2022 21:25:44 - INFO - codeparrot_training - Step 1691: {'lr': 0.00042275, 'samples': 324864, 'steps': 1691, 'loss/train': 1.15392404794693} 01/26/2022 21:25:47 - INFO - codeparrot_training - Step 1692: {'lr': 0.000423, 'samples': 325056, 'steps': 1692, 'loss/train': 1.1223683953285217} 01/26/2022 21:25:54 - INFO - codeparrot_training - Step 1693: {'lr': 0.00042325, 'samples': 325248, 'steps': 1693, 'loss/train': 1.1055345833301544} 01/26/2022 21:25:57 - INFO - codeparrot_training - Step 1694: {'lr': 0.0004235, 'samples': 325440, 'steps': 1694, 'loss/train': 0.9645794034004211} 01/26/2022 21:26:00 - INFO - codeparrot_training - Step 1695: {'lr': 0.00042375000000000003, 'samples': 325632, 'steps': 1695, 'loss/train': 1.160351425409317} 01/26/2022 21:26:03 - INFO - codeparrot_training - Step 1696: {'lr': 0.000424, 'samples': 325824, 'steps': 1696, 'loss/train': 1.1919098496437073} 01/26/2022 21:26:06 - INFO - codeparrot_training - Step 1697: {'lr': 0.00042425000000000004, 'samples': 326016, 'steps': 1697, 'loss/train': 0.35773663222789764} 01/26/2022 21:26:09 - INFO - codeparrot_training - Step 1698: {'lr': 0.0004245, 'samples': 326208, 'steps': 1698, 'loss/train': 0.9857082366943359} 01/26/2022 21:26:12 - INFO - codeparrot_training - Step 1699: {'lr': 0.00042475000000000005, 'samples': 326400, 'steps': 1699, 'loss/train': 0.6878446340560913} 01/26/2022 21:26:16 - INFO - codeparrot_training - Step 1700: {'lr': 0.000425, 'samples': 326592, 'steps': 1700, 'loss/train': 1.1664015054702759} 01/26/2022 21:26:19 - INFO - codeparrot_training - Step 1701: {'lr': 0.00042525, 'samples': 326784, 'steps': 1701, 'loss/train': 1.1706021130084991} 01/26/2022 21:26:23 - INFO - codeparrot_training - Step 1702: 
{'lr': 0.0004255, 'samples': 326976, 'steps': 1702, 'loss/train': 0.7951970100402832} 01/26/2022 21:26:26 - INFO - codeparrot_training - Step 1703: {'lr': 0.00042575, 'samples': 327168, 'steps': 1703, 'loss/train': 1.207459956407547} 01/26/2022 21:26:29 - INFO - codeparrot_training - Step 1704: {'lr': 0.000426, 'samples': 327360, 'steps': 1704, 'loss/train': 0.6888840794563293} 01/26/2022 21:26:32 - INFO - codeparrot_training - Step 1705: {'lr': 0.00042625000000000003, 'samples': 327552, 'steps': 1705, 'loss/train': 0.8381628692150116} 01/26/2022 21:26:36 - INFO - codeparrot_training - Step 1706: {'lr': 0.0004265, 'samples': 327744, 'steps': 1706, 'loss/train': 1.0316231846809387} 01/26/2022 21:26:39 - INFO - codeparrot_training - Step 1707: {'lr': 0.00042675000000000005, 'samples': 327936, 'steps': 1707, 'loss/train': 1.0781696140766144} 01/26/2022 21:26:42 - INFO - codeparrot_training - Step 1708: {'lr': 0.000427, 'samples': 328128, 'steps': 1708, 'loss/train': 1.189242660999298} 01/26/2022 21:26:45 - INFO - codeparrot_training - Step 1709: {'lr': 0.00042725, 'samples': 328320, 'steps': 1709, 'loss/train': 1.295003056526184} 01/26/2022 21:26:48 - INFO - codeparrot_training - Step 1710: {'lr': 0.0004275, 'samples': 328512, 'steps': 1710, 'loss/train': 0.6395526677370071} 01/26/2022 21:26:53 - INFO - codeparrot_training - Step 1711: {'lr': 0.00042775, 'samples': 328704, 'steps': 1711, 'loss/train': 1.3519046008586884} 01/26/2022 21:26:56 - INFO - codeparrot_training - Step 1712: {'lr': 0.000428, 'samples': 328896, 'steps': 1712, 'loss/train': 0.4921133816242218} 01/26/2022 21:26:59 - INFO - codeparrot_training - Step 1713: {'lr': 0.00042825000000000003, 'samples': 329088, 'steps': 1713, 'loss/train': 0.9713128209114075} 01/26/2022 21:27:02 - INFO - codeparrot_training - Step 1714: {'lr': 0.0004285, 'samples': 329280, 'steps': 1714, 'loss/train': 0.8321100175380707} 01/26/2022 21:27:05 - INFO - codeparrot_training - Step 1715: {'lr': 0.00042875000000000004, 'samples': 329472, 'steps': 1715, 'loss/train': 1.1703671514987946} 01/26/2022 21:27:08 - INFO - codeparrot_training - Step 1716: {'lr': 0.000429, 'samples': 329664, 'steps': 1716, 'loss/train': 0.8002711236476898} 01/26/2022 21:27:11 - INFO - codeparrot_training - Step 1717: {'lr': 0.00042925000000000005, 'samples': 329856, 'steps': 1717, 'loss/train': 1.1443277299404144} 01/26/2022 21:27:15 - INFO - codeparrot_training - Step 1718: {'lr': 0.0004295, 'samples': 330048, 'steps': 1718, 'loss/train': 1.6503835916519165} 01/26/2022 21:27:19 - INFO - codeparrot_training - Step 1719: {'lr': 0.00042975, 'samples': 330240, 'steps': 1719, 'loss/train': 0.4028375744819641} 01/26/2022 21:27:22 - INFO - codeparrot_training - Step 1720: {'lr': 0.00043, 'samples': 330432, 'steps': 1720, 'loss/train': 0.9084438979625702} 01/26/2022 21:27:26 - INFO - codeparrot_training - Step 1721: {'lr': 0.00043025, 'samples': 330624, 'steps': 1721, 'loss/train': 0.2540413811802864} 01/26/2022 21:27:29 - INFO - codeparrot_training - Step 1722: {'lr': 0.0004305, 'samples': 330816, 'steps': 1722, 'loss/train': 1.1666429042816162} 01/26/2022 21:27:32 - INFO - codeparrot_training - Step 1723: {'lr': 0.00043075000000000003, 'samples': 331008, 'steps': 1723, 'loss/train': 0.8956621885299683} 01/26/2022 21:27:35 - INFO - codeparrot_training - Step 1724: {'lr': 0.000431, 'samples': 331200, 'steps': 1724, 'loss/train': 0.5922983586788177} 01/26/2022 21:27:38 - INFO - codeparrot_training - Step 1725: {'lr': 0.00043125000000000005, 'samples': 331392, 'steps': 1725, 
'loss/train': 1.00207981467247} 01/26/2022 21:27:41 - INFO - codeparrot_training - Step 1726: {'lr': 0.0004315, 'samples': 331584, 'steps': 1726, 'loss/train': 0.9447910487651825} 01/26/2022 21:27:44 - INFO - codeparrot_training - Step 1727: {'lr': 0.00043175, 'samples': 331776, 'steps': 1727, 'loss/train': 1.1207318007946014} 01/26/2022 21:27:51 - INFO - codeparrot_training - Step 1728: {'lr': 0.000432, 'samples': 331968, 'steps': 1728, 'loss/train': 0.9733808040618896} 01/26/2022 21:27:54 - INFO - codeparrot_training - Step 1729: {'lr': 0.00043225, 'samples': 332160, 'steps': 1729, 'loss/train': 1.3409484028816223} 01/26/2022 21:27:57 - INFO - codeparrot_training - Step 1730: {'lr': 0.0004325, 'samples': 332352, 'steps': 1730, 'loss/train': 0.9002691507339478} 01/26/2022 21:28:00 - INFO - codeparrot_training - Step 1731: {'lr': 0.00043275000000000003, 'samples': 332544, 'steps': 1731, 'loss/train': 0.8659699559211731} 01/26/2022 21:28:03 - INFO - codeparrot_training - Step 1732: {'lr': 0.000433, 'samples': 332736, 'steps': 1732, 'loss/train': 1.156950294971466} 01/26/2022 21:28:06 - INFO - codeparrot_training - Step 1733: {'lr': 0.00043325000000000004, 'samples': 332928, 'steps': 1733, 'loss/train': 0.7196600139141083} 01/26/2022 21:28:09 - INFO - codeparrot_training - Step 1734: {'lr': 0.0004335, 'samples': 333120, 'steps': 1734, 'loss/train': 0.6546306610107422} 01/26/2022 21:28:13 - INFO - codeparrot_training - Step 1735: {'lr': 0.00043375000000000005, 'samples': 333312, 'steps': 1735, 'loss/train': 0.9295001327991486} 01/26/2022 21:28:16 - INFO - codeparrot_training - Step 1736: {'lr': 0.00043400000000000003, 'samples': 333504, 'steps': 1736, 'loss/train': 2.139705240726471} 01/26/2022 21:28:20 - INFO - codeparrot_training - Step 1737: {'lr': 0.00043425, 'samples': 333696, 'steps': 1737, 'loss/train': 0.6314667463302612} 01/26/2022 21:28:23 - INFO - codeparrot_training - Step 1738: {'lr': 0.0004345, 'samples': 333888, 'steps': 1738, 'loss/train': 0.887237548828125} 01/26/2022 21:28:27 - INFO - codeparrot_training - Step 1739: {'lr': 0.00043475, 'samples': 334080, 'steps': 1739, 'loss/train': 0.9448980689048767} 01/26/2022 21:28:30 - INFO - codeparrot_training - Step 1740: {'lr': 0.000435, 'samples': 334272, 'steps': 1740, 'loss/train': 0.6644312739372253} 01/26/2022 21:28:33 - INFO - codeparrot_training - Step 1741: {'lr': 0.00043525000000000004, 'samples': 334464, 'steps': 1741, 'loss/train': 0.9693324565887451} 01/26/2022 21:28:36 - INFO - codeparrot_training - Step 1742: {'lr': 0.0004355, 'samples': 334656, 'steps': 1742, 'loss/train': 0.9859629571437836} 01/26/2022 21:28:39 - INFO - codeparrot_training - Step 1743: {'lr': 0.00043575000000000005, 'samples': 334848, 'steps': 1743, 'loss/train': 0.8252474963665009} 01/26/2022 21:28:42 - INFO - codeparrot_training - Step 1744: {'lr': 0.000436, 'samples': 335040, 'steps': 1744, 'loss/train': 1.0943603217601776} 01/26/2022 21:28:47 - INFO - codeparrot_training - Step 1745: {'lr': 0.00043625000000000006, 'samples': 335232, 'steps': 1745, 'loss/train': 0.8645595610141754} 01/26/2022 21:28:50 - INFO - codeparrot_training - Step 1746: {'lr': 0.0004365, 'samples': 335424, 'steps': 1746, 'loss/train': 0.7886210381984711} 01/26/2022 21:28:53 - INFO - codeparrot_training - Step 1747: {'lr': 0.00043675, 'samples': 335616, 'steps': 1747, 'loss/train': 1.2167228758335114} 01/26/2022 21:28:56 - INFO - codeparrot_training - Step 1748: {'lr': 0.000437, 'samples': 335808, 'steps': 1748, 'loss/train': 0.8175221979618073} 01/26/2022 21:28:59 - INFO - 
codeparrot_training - Step 1749: {'lr': 0.00043725000000000003, 'samples': 336000, 'steps': 1749, 'loss/train': 0.7292230725288391} 01/26/2022 21:29:02 - INFO - codeparrot_training - Step 1750: {'lr': 0.0004375, 'samples': 336192, 'steps': 1750, 'loss/train': 1.10347580909729} 01/26/2022 21:29:06 - INFO - codeparrot_training - Step 1751: {'lr': 0.00043775, 'samples': 336384, 'steps': 1751, 'loss/train': 0.7977529764175415} 01/26/2022 21:29:09 - INFO - codeparrot_training - Step 1752: {'lr': 0.000438, 'samples': 336576, 'steps': 1752, 'loss/train': 0.7142423093318939} 01/26/2022 21:29:12 - INFO - codeparrot_training - Step 1753: {'lr': 0.00043825, 'samples': 336768, 'steps': 1753, 'loss/train': 1.27318874001503} 01/26/2022 21:29:18 - INFO - codeparrot_training - Step 1754: {'lr': 0.00043850000000000003, 'samples': 336960, 'steps': 1754, 'loss/train': 0.9715580642223358} 01/26/2022 21:29:21 - INFO - codeparrot_training - Step 1755: {'lr': 0.00043874999999999996, 'samples': 337152, 'steps': 1755, 'loss/train': 0.7470668256282806} 01/26/2022 21:29:24 - INFO - codeparrot_training - Step 1756: {'lr': 0.000439, 'samples': 337344, 'steps': 1756, 'loss/train': 1.004561573266983} 01/26/2022 21:29:27 - INFO - codeparrot_training - Step 1757: {'lr': 0.00043924999999999997, 'samples': 337536, 'steps': 1757, 'loss/train': 1.2235007286071777} 01/26/2022 21:29:30 - INFO - codeparrot_training - Step 1758: {'lr': 0.0004395, 'samples': 337728, 'steps': 1758, 'loss/train': 2.091886818408966} 01/26/2022 21:29:34 - INFO - codeparrot_training - Step 1759: {'lr': 0.00043975, 'samples': 337920, 'steps': 1759, 'loss/train': 0.7003630846738815} 01/26/2022 21:29:37 - INFO - codeparrot_training - Step 1760: {'lr': 0.00044, 'samples': 338112, 'steps': 1760, 'loss/train': 0.9811866581439972} 01/26/2022 21:29:40 - INFO - codeparrot_training - Step 1761: {'lr': 0.00044025, 'samples': 338304, 'steps': 1761, 'loss/train': 1.5664928555488586} 01/26/2022 21:29:43 - INFO - codeparrot_training - Step 1762: {'lr': 0.00044050000000000003, 'samples': 338496, 'steps': 1762, 'loss/train': 0.9035204350948334} 01/26/2022 21:29:47 - INFO - codeparrot_training - Step 1763: {'lr': 0.00044075, 'samples': 338688, 'steps': 1763, 'loss/train': 1.044754832983017} 01/26/2022 21:29:51 - INFO - codeparrot_training - Step 1764: {'lr': 0.000441, 'samples': 338880, 'steps': 1764, 'loss/train': 0.44676947593688965} 01/26/2022 21:29:54 - INFO - codeparrot_training - Step 1765: {'lr': 0.00044124999999999996, 'samples': 339072, 'steps': 1765, 'loss/train': 1.2584916651248932} 01/26/2022 21:29:57 - INFO - codeparrot_training - Step 1766: {'lr': 0.0004415, 'samples': 339264, 'steps': 1766, 'loss/train': 0.6115643084049225} 01/26/2022 21:30:00 - INFO - codeparrot_training - Step 1767: {'lr': 0.00044175, 'samples': 339456, 'steps': 1767, 'loss/train': 0.7946164906024933} 01/26/2022 21:30:03 - INFO - codeparrot_training - Step 1768: {'lr': 0.000442, 'samples': 339648, 'steps': 1768, 'loss/train': 0.886534720659256} 01/26/2022 21:30:06 - INFO - codeparrot_training - Step 1769: {'lr': 0.00044225, 'samples': 339840, 'steps': 1769, 'loss/train': 1.2899263501167297} 01/26/2022 21:30:09 - INFO - codeparrot_training - Step 1770: {'lr': 0.0004425, 'samples': 340032, 'steps': 1770, 'loss/train': 0.9034107327461243} 01/26/2022 21:30:13 - INFO - codeparrot_training - Step 1771: {'lr': 0.00044275, 'samples': 340224, 'steps': 1771, 'loss/train': 0.6806822866201401} 01/26/2022 21:30:19 - INFO - codeparrot_training - Step 1772: {'lr': 0.00044300000000000003, 'samples': 
340416, 'steps': 1772, 'loss/train': 1.2623277604579926} 01/26/2022 21:30:22 - INFO - codeparrot_training - Step 1773: {'lr': 0.00044325, 'samples': 340608, 'steps': 1773, 'loss/train': 0.43822503089904785} 01/26/2022 21:30:25 - INFO - codeparrot_training - Step 1774: {'lr': 0.0004435, 'samples': 340800, 'steps': 1774, 'loss/train': 0.7557626366615295} 01/26/2022 21:30:28 - INFO - codeparrot_training - Step 1775: {'lr': 0.00044374999999999997, 'samples': 340992, 'steps': 1775, 'loss/train': 1.0331142246723175} 01/26/2022 21:30:31 - INFO - codeparrot_training - Step 1776: {'lr': 0.000444, 'samples': 341184, 'steps': 1776, 'loss/train': 0.7802868783473969} 01/26/2022 21:30:34 - INFO - codeparrot_training - Step 1777: {'lr': 0.00044425, 'samples': 341376, 'steps': 1777, 'loss/train': 0.9548504054546356} 01/26/2022 21:30:37 - INFO - codeparrot_training - Step 1778: {'lr': 0.0004445, 'samples': 341568, 'steps': 1778, 'loss/train': 0.6218579113483429} 01/26/2022 21:30:41 - INFO - codeparrot_training - Step 1779: {'lr': 0.00044475, 'samples': 341760, 'steps': 1779, 'loss/train': 1.225717842578888} 01/26/2022 21:30:45 - INFO - codeparrot_training - Step 1780: {'lr': 0.00044500000000000003, 'samples': 341952, 'steps': 1780, 'loss/train': 0.8935568332672119} 01/26/2022 21:30:48 - INFO - codeparrot_training - Step 1781: {'lr': 0.00044525, 'samples': 342144, 'steps': 1781, 'loss/train': 1.4280221164226532} 01/26/2022 21:30:51 - INFO - codeparrot_training - Step 1782: {'lr': 0.00044550000000000004, 'samples': 342336, 'steps': 1782, 'loss/train': 0.7796886563301086} 01/26/2022 21:30:54 - INFO - codeparrot_training - Step 1783: {'lr': 0.00044574999999999997, 'samples': 342528, 'steps': 1783, 'loss/train': 0.900734156370163} 01/26/2022 21:30:57 - INFO - codeparrot_training - Step 1784: {'lr': 0.000446, 'samples': 342720, 'steps': 1784, 'loss/train': 1.3631338477134705} 01/26/2022 21:31:00 - INFO - codeparrot_training - Step 1785: {'lr': 0.00044625, 'samples': 342912, 'steps': 1785, 'loss/train': 0.9239045977592468} 01/26/2022 21:31:04 - INFO - codeparrot_training - Step 1786: {'lr': 0.0004465, 'samples': 343104, 'steps': 1786, 'loss/train': 0.8701066374778748} 01/26/2022 21:31:07 - INFO - codeparrot_training - Step 1787: {'lr': 0.00044675, 'samples': 343296, 'steps': 1787, 'loss/train': 0.844506025314331} 01/26/2022 21:31:10 - INFO - codeparrot_training - Step 1788: {'lr': 0.000447, 'samples': 343488, 'steps': 1788, 'loss/train': 0.8615960776805878} 01/26/2022 21:31:15 - INFO - codeparrot_training - Step 1789: {'lr': 0.00044725, 'samples': 343680, 'steps': 1789, 'loss/train': 1.0209888517856598} 01/26/2022 21:31:18 - INFO - codeparrot_training - Step 1790: {'lr': 0.00044750000000000004, 'samples': 343872, 'steps': 1790, 'loss/train': 0.7795795798301697} 01/26/2022 21:31:21 - INFO - codeparrot_training - Step 1791: {'lr': 0.00044775, 'samples': 344064, 'steps': 1791, 'loss/train': 1.035276710987091} 01/26/2022 21:31:24 - INFO - codeparrot_training - Step 1792: {'lr': 0.000448, 'samples': 344256, 'steps': 1792, 'loss/train': 1.2197923958301544} 01/26/2022 21:31:27 - INFO - codeparrot_training - Step 1793: {'lr': 0.00044824999999999997, 'samples': 344448, 'steps': 1793, 'loss/train': 0.5810040235519409} 01/26/2022 21:31:30 - INFO - codeparrot_training - Step 1794: {'lr': 0.0004485, 'samples': 344640, 'steps': 1794, 'loss/train': 0.9683815240859985} 01/26/2022 21:31:33 - INFO - codeparrot_training - Step 1795: {'lr': 0.00044875, 'samples': 344832, 'steps': 1795, 'loss/train': 0.3584947809576988} 01/26/2022 
21:31:37 - INFO - codeparrot_training - Step 1796: {'lr': 0.000449, 'samples': 345024, 'steps': 1796, 'loss/train': 0.965572714805603} 01/26/2022 21:31:40 - INFO - codeparrot_training - Step 1797: {'lr': 0.00044925, 'samples': 345216, 'steps': 1797, 'loss/train': 1.208339273929596} 01/26/2022 21:31:46 - INFO - codeparrot_training - Step 1798: {'lr': 0.00044950000000000003, 'samples': 345408, 'steps': 1798, 'loss/train': 0.9125329256057739} 01/26/2022 21:31:49 - INFO - codeparrot_training - Step 1799: {'lr': 0.00044975, 'samples': 345600, 'steps': 1799, 'loss/train': 0.1478106938302517} 01/26/2022 21:31:52 - INFO - codeparrot_training - Step 1800: {'lr': 0.00045000000000000004, 'samples': 345792, 'steps': 1800, 'loss/train': 1.5486858487129211} 01/26/2022 21:31:55 - INFO - codeparrot_training - Step 1801: {'lr': 0.00045024999999999997, 'samples': 345984, 'steps': 1801, 'loss/train': 0.8634694218635559} 01/26/2022 21:31:58 - INFO - codeparrot_training - Step 1802: {'lr': 0.0004505, 'samples': 346176, 'steps': 1802, 'loss/train': 0.6972474753856659} 01/26/2022 21:32:02 - INFO - codeparrot_training - Step 1803: {'lr': 0.00045075, 'samples': 346368, 'steps': 1803, 'loss/train': 0.710689902305603} 01/26/2022 21:32:05 - INFO - codeparrot_training - Step 1804: {'lr': 0.000451, 'samples': 346560, 'steps': 1804, 'loss/train': 0.7198485285043716} 01/26/2022 21:32:08 - INFO - codeparrot_training - Step 1805: {'lr': 0.00045125, 'samples': 346752, 'steps': 1805, 'loss/train': 0.8817155957221985} 01/26/2022 21:32:11 - INFO - codeparrot_training - Step 1806: {'lr': 0.0004515, 'samples': 346944, 'steps': 1806, 'loss/train': 0.6702062487602234} 01/26/2022 21:32:15 - INFO - codeparrot_training - Step 1807: {'lr': 0.00045175, 'samples': 347136, 'steps': 1807, 'loss/train': 1.1744391918182373} 01/26/2022 21:32:18 - INFO - codeparrot_training - Step 1808: {'lr': 0.00045200000000000004, 'samples': 347328, 'steps': 1808, 'loss/train': 0.7731001675128937} 01/26/2022 21:32:22 - INFO - codeparrot_training - Step 1809: {'lr': 0.00045225, 'samples': 347520, 'steps': 1809, 'loss/train': 1.4250400364398956} 01/26/2022 21:32:25 - INFO - codeparrot_training - Step 1810: {'lr': 0.00045250000000000005, 'samples': 347712, 'steps': 1810, 'loss/train': 0.5657884776592255} 01/26/2022 21:32:28 - INFO - codeparrot_training - Step 1811: {'lr': 0.00045275, 'samples': 347904, 'steps': 1811, 'loss/train': 1.0133686065673828} 01/26/2022 21:32:31 - INFO - codeparrot_training - Step 1812: {'lr': 0.000453, 'samples': 348096, 'steps': 1812, 'loss/train': 0.6502123922109604} 01/26/2022 21:32:34 - INFO - codeparrot_training - Step 1813: {'lr': 0.00045325, 'samples': 348288, 'steps': 1813, 'loss/train': 0.9410197734832764} 01/26/2022 21:32:37 - INFO - codeparrot_training - Step 1814: {'lr': 0.0004535, 'samples': 348480, 'steps': 1814, 'loss/train': 1.1278867721557617} 01/26/2022 21:32:42 - INFO - codeparrot_training - Step 1815: {'lr': 0.00045375, 'samples': 348672, 'steps': 1815, 'loss/train': 1.2114013731479645} 01/26/2022 21:32:45 - INFO - codeparrot_training - Step 1816: {'lr': 0.00045400000000000003, 'samples': 348864, 'steps': 1816, 'loss/train': 0.8608831465244293} 01/26/2022 21:32:48 - INFO - codeparrot_training - Step 1817: {'lr': 0.00045425, 'samples': 349056, 'steps': 1817, 'loss/train': 1.0036253929138184} 01/26/2022 21:32:51 - INFO - codeparrot_training - Step 1818: {'lr': 0.00045450000000000004, 'samples': 349248, 'steps': 1818, 'loss/train': 0.41098009049892426} 01/26/2022 21:32:54 - INFO - codeparrot_training - Step 1819: 
{'lr': 0.00045475, 'samples': 349440, 'steps': 1819, 'loss/train': 0.9145841002464294} 01/26/2022 21:32:57 - INFO - codeparrot_training - Step 1820: {'lr': 0.000455, 'samples': 349632, 'steps': 1820, 'loss/train': 1.5986244678497314} 01/26/2022 21:33:01 - INFO - codeparrot_training - Step 1821: {'lr': 0.00045525, 'samples': 349824, 'steps': 1821, 'loss/train': 0.7239570021629333} 01/26/2022 21:33:04 - INFO - codeparrot_training - Step 1822: {'lr': 0.0004555, 'samples': 350016, 'steps': 1822, 'loss/train': 1.2572510540485382} 01/26/2022 21:33:07 - INFO - codeparrot_training - Step 1823: {'lr': 0.00045575, 'samples': 350208, 'steps': 1823, 'loss/train': 0.2298322319984436} 01/26/2022 21:33:11 - INFO - codeparrot_training - Step 1824: {'lr': 0.000456, 'samples': 350400, 'steps': 1824, 'loss/train': 1.1167780458927155} 01/26/2022 21:33:14 - INFO - codeparrot_training - Step 1825: {'lr': 0.00045625, 'samples': 350592, 'steps': 1825, 'loss/train': 0.4124080538749695} 01/26/2022 21:33:18 - INFO - codeparrot_training - Step 1826: {'lr': 0.00045650000000000004, 'samples': 350784, 'steps': 1826, 'loss/train': 0.7638796269893646} 01/26/2022 21:33:21 - INFO - codeparrot_training - Step 1827: {'lr': 0.00045675, 'samples': 350976, 'steps': 1827, 'loss/train': 0.573961928486824} 01/26/2022 21:33:24 - INFO - codeparrot_training - Step 1828: {'lr': 0.00045700000000000005, 'samples': 351168, 'steps': 1828, 'loss/train': 1.0830810964107513} 01/26/2022 21:33:27 - INFO - codeparrot_training - Step 1829: {'lr': 0.00045725, 'samples': 351360, 'steps': 1829, 'loss/train': 0.6615297049283981} 01/26/2022 21:33:30 - INFO - codeparrot_training - Step 1830: {'lr': 0.0004575, 'samples': 351552, 'steps': 1830, 'loss/train': 0.9944534003734589} 01/26/2022 21:33:33 - INFO - codeparrot_training - Step 1831: {'lr': 0.00045775, 'samples': 351744, 'steps': 1831, 'loss/train': 0.7959812879562378} 01/26/2022 21:33:36 - INFO - codeparrot_training - Step 1832: {'lr': 0.000458, 'samples': 351936, 'steps': 1832, 'loss/train': 0.8877821266651154} 01/26/2022 21:33:43 - INFO - codeparrot_training - Step 1833: {'lr': 0.00045825, 'samples': 352128, 'steps': 1833, 'loss/train': 1.261086791753769} 01/26/2022 21:33:46 - INFO - codeparrot_training - Step 1834: {'lr': 0.00045850000000000003, 'samples': 352320, 'steps': 1834, 'loss/train': 0.5066499710083008} 01/26/2022 21:33:49 - INFO - codeparrot_training - Step 1835: {'lr': 0.00045875, 'samples': 352512, 'steps': 1835, 'loss/train': 1.0754070281982422} 01/26/2022 21:33:52 - INFO - codeparrot_training - Step 1836: {'lr': 0.00045900000000000004, 'samples': 352704, 'steps': 1836, 'loss/train': 0.831580638885498} 01/26/2022 21:33:55 - INFO - codeparrot_training - Step 1837: {'lr': 0.00045925, 'samples': 352896, 'steps': 1837, 'loss/train': 0.7616505324840546} 01/26/2022 21:33:58 - INFO - codeparrot_training - Step 1838: {'lr': 0.00045950000000000006, 'samples': 353088, 'steps': 1838, 'loss/train': 0.37710709869861603} 01/26/2022 21:34:02 - INFO - codeparrot_training - Step 1839: {'lr': 0.00045975, 'samples': 353280, 'steps': 1839, 'loss/train': 0.7463306486606598} 01/26/2022 21:34:05 - INFO - codeparrot_training - Step 1840: {'lr': 0.00046, 'samples': 353472, 'steps': 1840, 'loss/train': 0.8024594485759735} 01/26/2022 21:34:08 - INFO - codeparrot_training - Step 1841: {'lr': 0.00046025, 'samples': 353664, 'steps': 1841, 'loss/train': 1.3716038167476654} 01/26/2022 21:34:13 - INFO - codeparrot_training - Step 1842: {'lr': 0.0004605, 'samples': 353856, 'steps': 1842, 'loss/train': 
1.1901555061340332} 01/26/2022 21:34:16 - INFO - codeparrot_training - Step 1843: {'lr': 0.00046075, 'samples': 354048, 'steps': 1843, 'loss/train': 0.5672524273395538} 01/26/2022 21:34:19 - INFO - codeparrot_training - Step 1844: {'lr': 0.00046100000000000004, 'samples': 354240, 'steps': 1844, 'loss/train': 0.32982489466667175} 01/26/2022 21:34:22 - INFO - codeparrot_training - Step 1845: {'lr': 0.00046125, 'samples': 354432, 'steps': 1845, 'loss/train': 0.42343519628047943} 01/26/2022 21:34:25 - INFO - codeparrot_training - Step 1846: {'lr': 0.00046150000000000005, 'samples': 354624, 'steps': 1846, 'loss/train': 0.8662121593952179} 01/26/2022 21:34:28 - INFO - codeparrot_training - Step 1847: {'lr': 0.00046175000000000003, 'samples': 354816, 'steps': 1847, 'loss/train': 0.6935672610998154} 01/26/2022 21:34:31 - INFO - codeparrot_training - Step 1848: {'lr': 0.000462, 'samples': 355008, 'steps': 1848, 'loss/train': 0.9938570559024811} 01/26/2022 21:34:34 - INFO - codeparrot_training - Step 1849: {'lr': 0.00046225, 'samples': 355200, 'steps': 1849, 'loss/train': 1.0552408397197723} 01/26/2022 21:34:39 - INFO - codeparrot_training - Step 1850: {'lr': 0.0004625, 'samples': 355392, 'steps': 1850, 'loss/train': 0.8935852646827698} 01/26/2022 21:34:42 - INFO - codeparrot_training - Step 1851: {'lr': 0.00046275, 'samples': 355584, 'steps': 1851, 'loss/train': 1.531786859035492} 01/26/2022 21:34:45 - INFO - codeparrot_training - Step 1852: {'lr': 0.00046300000000000003, 'samples': 355776, 'steps': 1852, 'loss/train': 0.5282375514507294} 01/26/2022 21:34:48 - INFO - codeparrot_training - Step 1853: {'lr': 0.00046325, 'samples': 355968, 'steps': 1853, 'loss/train': 0.9514651894569397} 01/26/2022 21:34:52 - INFO - codeparrot_training - Step 1854: {'lr': 0.00046350000000000004, 'samples': 356160, 'steps': 1854, 'loss/train': 0.5179400593042374} 01/26/2022 21:34:55 - INFO - codeparrot_training - Step 1855: {'lr': 0.00046375, 'samples': 356352, 'steps': 1855, 'loss/train': 0.8588109612464905} 01/26/2022 21:34:58 - INFO - codeparrot_training - Step 1856: {'lr': 0.00046400000000000006, 'samples': 356544, 'steps': 1856, 'loss/train': 0.6994162648916245} 01/26/2022 21:35:01 - INFO - codeparrot_training - Step 1857: {'lr': 0.00046425, 'samples': 356736, 'steps': 1857, 'loss/train': 0.6678935587406158} 01/26/2022 21:35:04 - INFO - codeparrot_training - Step 1858: {'lr': 0.0004645, 'samples': 356928, 'steps': 1858, 'loss/train': 0.7436229139566422} 01/26/2022 21:35:11 - INFO - codeparrot_training - Step 1859: {'lr': 0.00046475, 'samples': 357120, 'steps': 1859, 'loss/train': 0.9596456587314606} 01/26/2022 21:35:14 - INFO - codeparrot_training - Step 1860: {'lr': 0.000465, 'samples': 357312, 'steps': 1860, 'loss/train': 1.4724598824977875} 01/26/2022 21:35:17 - INFO - codeparrot_training - Step 1861: {'lr': 0.00046525, 'samples': 357504, 'steps': 1861, 'loss/train': 0.9142586588859558} 01/26/2022 21:35:20 - INFO - codeparrot_training - Step 1862: {'lr': 0.00046550000000000004, 'samples': 357696, 'steps': 1862, 'loss/train': 0.3554471433162689} 01/26/2022 21:35:23 - INFO - codeparrot_training - Step 1863: {'lr': 0.00046575, 'samples': 357888, 'steps': 1863, 'loss/train': 0.8216174840927124} 01/26/2022 21:35:26 - INFO - codeparrot_training - Step 1864: {'lr': 0.00046600000000000005, 'samples': 358080, 'steps': 1864, 'loss/train': 1.3381338715553284} 01/26/2022 21:35:29 - INFO - codeparrot_training - Step 1865: {'lr': 0.00046625000000000003, 'samples': 358272, 'steps': 1865, 'loss/train': 0.659206286072731} 
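
[Note on the 'lr' column in the entries above: the value rises by exactly 2.5e-7 per step (0.00039650 at step 1586, 0.00040000 at step 1600, 0.00046625 at step 1865), i.e. a linear warmup toward a 5e-4 peak that is reached at step 2000 further down in this log, after which the rate only creeps downward (0.0004999999994645397 at step 2001), consistent with a long post-warmup decay such as a cosine schedule. The snippet below is a minimal sketch of the warmup phase only, assuming just the peak value and the 2000-step warmup length inferred from these numbers; the scheduler actually used by the training script is not shown in this log.]

# Minimal sketch (not the training script itself): reproduce the warmup 'lr'
# values logged above, assuming a linear warmup to a 5e-4 peak over 2000 steps.
PEAK_LR = 5e-4        # peak value observed at step 2000 later in this log
WARMUP_STEPS = 2000   # inferred from the 2.5e-7 per-step increment

def warmup_lr(step: int) -> float:
    """Learning rate during the warmup phase, matching the logged values
    up to float rounding (e.g. warmup_lr(1586) -> 0.0003965, warmup_lr(1600) -> 0.0004)."""
    return PEAK_LR * min(step, WARMUP_STEPS) / WARMUP_STEPS

[A schedule of this shape could be produced by, e.g., transformers.get_scheduler("cosine", optimizer=..., num_warmup_steps=2000, num_training_steps=...), but that is an assumption about the training configuration, not something this log records.]
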
01/26/2022 21:35:33 - INFO - codeparrot_training - Step 1866: {'lr': 0.0004665, 'samples': 358464, 'steps': 1866, 'loss/train': 0.6387936025857925} 01/26/2022 21:35:36 - INFO - codeparrot_training - Step 1867: {'lr': 0.00046675, 'samples': 358656, 'steps': 1867, 'loss/train': 0.9759268462657928} 01/26/2022 21:35:40 - INFO - codeparrot_training - Step 1868: {'lr': 0.000467, 'samples': 358848, 'steps': 1868, 'loss/train': 0.8318382203578949} 01/26/2022 21:35:43 - INFO - codeparrot_training - Step 1869: {'lr': 0.00046725, 'samples': 359040, 'steps': 1869, 'loss/train': 0.8856178522109985} 01/26/2022 21:35:46 - INFO - codeparrot_training - Step 1870: {'lr': 0.00046750000000000003, 'samples': 359232, 'steps': 1870, 'loss/train': 0.7402231693267822} 01/26/2022 21:35:50 - INFO - codeparrot_training - Step 1871: {'lr': 0.00046775, 'samples': 359424, 'steps': 1871, 'loss/train': 0.4504626989364624} 01/26/2022 21:35:53 - INFO - codeparrot_training - Step 1872: {'lr': 0.00046800000000000005, 'samples': 359616, 'steps': 1872, 'loss/train': 0.6901952773332596} 01/26/2022 21:35:56 - INFO - codeparrot_training - Step 1873: {'lr': 0.00046825, 'samples': 359808, 'steps': 1873, 'loss/train': 1.3196231424808502} 01/26/2022 21:35:59 - INFO - codeparrot_training - Step 1874: {'lr': 0.00046850000000000006, 'samples': 360000, 'steps': 1874, 'loss/train': 0.8008153438568115} 01/26/2022 21:36:02 - INFO - codeparrot_training - Step 1875: {'lr': 0.00046875, 'samples': 360192, 'steps': 1875, 'loss/train': 0.5548744797706604} 01/26/2022 21:36:05 - INFO - codeparrot_training - Step 1876: {'lr': 0.00046899999999999996, 'samples': 360384, 'steps': 1876, 'loss/train': 0.6174647659063339} 01/26/2022 21:36:12 - INFO - codeparrot_training - Step 1877: {'lr': 0.00046925, 'samples': 360576, 'steps': 1877, 'loss/train': 1.070265144109726} 01/26/2022 21:36:15 - INFO - codeparrot_training - Step 1878: {'lr': 0.0004695, 'samples': 360768, 'steps': 1878, 'loss/train': 1.1434812247753143} 01/26/2022 21:36:18 - INFO - codeparrot_training - Step 1879: {'lr': 0.00046975, 'samples': 360960, 'steps': 1879, 'loss/train': 0.7403541505336761} 01/26/2022 21:36:21 - INFO - codeparrot_training - Step 1880: {'lr': 0.00047, 'samples': 361152, 'steps': 1880, 'loss/train': 2.072912871837616} 01/26/2022 21:36:24 - INFO - codeparrot_training - Step 1881: {'lr': 0.00047025, 'samples': 361344, 'steps': 1881, 'loss/train': 0.5732850283384323} 01/26/2022 21:36:27 - INFO - codeparrot_training - Step 1882: {'lr': 0.0004705, 'samples': 361536, 'steps': 1882, 'loss/train': 0.8318330347537994} 01/26/2022 21:36:30 - INFO - codeparrot_training - Step 1883: {'lr': 0.00047075000000000003, 'samples': 361728, 'steps': 1883, 'loss/train': 0.30028827488422394} 01/26/2022 21:36:34 - INFO - codeparrot_training - Step 1884: {'lr': 0.000471, 'samples': 361920, 'steps': 1884, 'loss/train': 0.9006307125091553} 01/26/2022 21:36:37 - INFO - codeparrot_training - Step 1885: {'lr': 0.00047125, 'samples': 362112, 'steps': 1885, 'loss/train': 1.1136357486248016} 01/26/2022 21:36:41 - INFO - codeparrot_training - Step 1886: {'lr': 0.00047149999999999997, 'samples': 362304, 'steps': 1886, 'loss/train': 1.07257479429245} 01/26/2022 21:36:44 - INFO - codeparrot_training - Step 1887: {'lr': 0.00047175, 'samples': 362496, 'steps': 1887, 'loss/train': 1.3226381242275238} 01/26/2022 21:36:47 - INFO - codeparrot_training - Step 1888: {'lr': 0.000472, 'samples': 362688, 'steps': 1888, 'loss/train': 0.7470325380563736} 01/26/2022 21:36:50 - INFO - codeparrot_training - Step 1889: {'lr': 
0.00047225, 'samples': 362880, 'steps': 1889, 'loss/train': 1.038808822631836} 01/26/2022 21:36:54 - INFO - codeparrot_training - Step 1890: {'lr': 0.0004725, 'samples': 363072, 'steps': 1890, 'loss/train': 1.0341619849205017} 01/26/2022 21:36:57 - INFO - codeparrot_training - Step 1891: {'lr': 0.00047275, 'samples': 363264, 'steps': 1891, 'loss/train': 0.7290969640016556} 01/26/2022 21:37:00 - INFO - codeparrot_training - Step 1892: {'lr': 0.000473, 'samples': 363456, 'steps': 1892, 'loss/train': 0.8344210982322693} 01/26/2022 21:37:03 - INFO - codeparrot_training - Step 1893: {'lr': 0.00047325000000000004, 'samples': 363648, 'steps': 1893, 'loss/train': 0.7739488184452057} 01/26/2022 21:37:07 - INFO - codeparrot_training - Step 1894: {'lr': 0.00047349999999999996, 'samples': 363840, 'steps': 1894, 'loss/train': 1.6855548620224} 01/26/2022 21:37:11 - INFO - codeparrot_training - Step 1895: {'lr': 0.00047375, 'samples': 364032, 'steps': 1895, 'loss/train': 1.100717157125473} 01/26/2022 21:37:14 - INFO - codeparrot_training - Step 1896: {'lr': 0.000474, 'samples': 364224, 'steps': 1896, 'loss/train': 1.0391847789287567} 01/26/2022 21:37:17 - INFO - codeparrot_training - Step 1897: {'lr': 0.00047425, 'samples': 364416, 'steps': 1897, 'loss/train': 0.3638544827699661} 01/26/2022 21:37:20 - INFO - codeparrot_training - Step 1898: {'lr': 0.0004745, 'samples': 364608, 'steps': 1898, 'loss/train': 1.035841315984726} 01/26/2022 21:37:23 - INFO - codeparrot_training - Step 1899: {'lr': 0.00047475, 'samples': 364800, 'steps': 1899, 'loss/train': 1.0633786618709564} 01/26/2022 21:37:26 - INFO - codeparrot_training - Step 1900: {'lr': 0.000475, 'samples': 364992, 'steps': 1900, 'loss/train': 1.2380416989326477} 01/26/2022 21:37:29 - INFO - codeparrot_training - Step 1901: {'lr': 0.00047525000000000003, 'samples': 365184, 'steps': 1901, 'loss/train': 0.9911230802536011} 01/26/2022 21:37:33 - INFO - codeparrot_training - Step 1902: {'lr': 0.0004755, 'samples': 365376, 'steps': 1902, 'loss/train': 1.433401644229889} 01/26/2022 21:37:39 - INFO - codeparrot_training - Step 1903: {'lr': 0.00047575, 'samples': 365568, 'steps': 1903, 'loss/train': 0.6902095824480057} 01/26/2022 21:37:42 - INFO - codeparrot_training - Step 1904: {'lr': 0.00047599999999999997, 'samples': 365760, 'steps': 1904, 'loss/train': 0.07514088414609432} 01/26/2022 21:37:45 - INFO - codeparrot_training - Step 1905: {'lr': 0.00047625, 'samples': 365952, 'steps': 1905, 'loss/train': 0.9614972770214081} 01/26/2022 21:37:48 - INFO - codeparrot_training - Step 1906: {'lr': 0.0004765, 'samples': 366144, 'steps': 1906, 'loss/train': 1.0431278944015503} 01/26/2022 21:37:51 - INFO - codeparrot_training - Step 1907: {'lr': 0.00047675, 'samples': 366336, 'steps': 1907, 'loss/train': 0.8581434488296509} 01/26/2022 21:37:55 - INFO - codeparrot_training - Step 1908: {'lr': 0.000477, 'samples': 366528, 'steps': 1908, 'loss/train': 1.7163219451904297} 01/26/2022 21:37:58 - INFO - codeparrot_training - Step 1909: {'lr': 0.00047725, 'samples': 366720, 'steps': 1909, 'loss/train': 0.8825593292713165} 01/26/2022 21:38:01 - INFO - codeparrot_training - Step 1910: {'lr': 0.0004775, 'samples': 366912, 'steps': 1910, 'loss/train': 1.0761583149433136} 01/26/2022 21:38:04 - INFO - codeparrot_training - Step 1911: {'lr': 0.00047775000000000004, 'samples': 367104, 'steps': 1911, 'loss/train': 1.2633014917373657} 01/26/2022 21:38:09 - INFO - codeparrot_training - Step 1912: {'lr': 0.00047799999999999996, 'samples': 367296, 'steps': 1912, 'loss/train': 
1.1808938384056091} 01/26/2022 21:38:12 - INFO - codeparrot_training - Step 1913: {'lr': 0.00047825, 'samples': 367488, 'steps': 1913, 'loss/train': 0.677722156047821} 01/26/2022 21:38:15 - INFO - codeparrot_training - Step 1914: {'lr': 0.0004785, 'samples': 367680, 'steps': 1914, 'loss/train': 0.6951000988483429} 01/26/2022 21:38:18 - INFO - codeparrot_training - Step 1915: {'lr': 0.00047875, 'samples': 367872, 'steps': 1915, 'loss/train': 0.5601763129234314} 01/26/2022 21:38:21 - INFO - codeparrot_training - Step 1916: {'lr': 0.000479, 'samples': 368064, 'steps': 1916, 'loss/train': 1.130952537059784} 01/26/2022 21:38:24 - INFO - codeparrot_training - Step 1917: {'lr': 0.00047925, 'samples': 368256, 'steps': 1917, 'loss/train': 1.269955426454544} 01/26/2022 21:38:28 - INFO - codeparrot_training - Step 1918: {'lr': 0.0004795, 'samples': 368448, 'steps': 1918, 'loss/train': 0.8425226211547852} 01/26/2022 21:38:31 - INFO - codeparrot_training - Step 1919: {'lr': 0.00047975000000000003, 'samples': 368640, 'steps': 1919, 'loss/train': 0.6803317219018936} 01/26/2022 21:38:35 - INFO - codeparrot_training - Step 1920: {'lr': 0.00048, 'samples': 368832, 'steps': 1920, 'loss/train': 0.9387582242488861} 01/26/2022 21:38:38 - INFO - codeparrot_training - Step 1921: {'lr': 0.00048025000000000005, 'samples': 369024, 'steps': 1921, 'loss/train': 0.9797181487083435} 01/26/2022 21:38:41 - INFO - codeparrot_training - Step 1922: {'lr': 0.00048049999999999997, 'samples': 369216, 'steps': 1922, 'loss/train': 1.0194525718688965} 01/26/2022 21:38:44 - INFO - codeparrot_training - Step 1923: {'lr': 0.00048075, 'samples': 369408, 'steps': 1923, 'loss/train': 0.30047231912612915} 01/26/2022 21:38:48 - INFO - codeparrot_training - Step 1924: {'lr': 0.000481, 'samples': 369600, 'steps': 1924, 'loss/train': 0.983442485332489} 01/26/2022 21:38:51 - INFO - codeparrot_training - Step 1925: {'lr': 0.00048125, 'samples': 369792, 'steps': 1925, 'loss/train': 1.318665325641632} 01/26/2022 21:38:54 - INFO - codeparrot_training - Step 1926: {'lr': 0.0004815, 'samples': 369984, 'steps': 1926, 'loss/train': 1.295868694782257} 01/26/2022 21:38:57 - INFO - codeparrot_training - Step 1927: {'lr': 0.00048175000000000003, 'samples': 370176, 'steps': 1927, 'loss/train': 0.8086439073085785} 01/26/2022 21:39:00 - INFO - codeparrot_training - Step 1928: {'lr': 0.000482, 'samples': 370368, 'steps': 1928, 'loss/train': 1.0307956337928772} 01/26/2022 21:39:04 - INFO - codeparrot_training - Step 1929: {'lr': 0.00048225000000000004, 'samples': 370560, 'steps': 1929, 'loss/train': 1.4041475057601929} 01/26/2022 21:39:08 - INFO - codeparrot_training - Step 1930: {'lr': 0.0004825, 'samples': 370752, 'steps': 1930, 'loss/train': 0.9882602691650391} 01/26/2022 21:39:11 - INFO - codeparrot_training - Step 1931: {'lr': 0.00048275, 'samples': 370944, 'steps': 1931, 'loss/train': 1.1968180239200592} 01/26/2022 21:39:14 - INFO - codeparrot_training - Step 1932: {'lr': 0.000483, 'samples': 371136, 'steps': 1932, 'loss/train': 0.6910819709300995} 01/26/2022 21:39:17 - INFO - codeparrot_training - Step 1933: {'lr': 0.00048325, 'samples': 371328, 'steps': 1933, 'loss/train': 0.7734626233577728} 01/26/2022 21:39:20 - INFO - codeparrot_training - Step 1934: {'lr': 0.0004835, 'samples': 371520, 'steps': 1934, 'loss/train': 1.1351169347763062} 01/26/2022 21:39:23 - INFO - codeparrot_training - Step 1935: {'lr': 0.00048375, 'samples': 371712, 'steps': 1935, 'loss/train': 1.1465918719768524} 01/26/2022 21:39:26 - INFO - codeparrot_training - Step 1936: {'lr': 
0.000484, 'samples': 371904, 'steps': 1936, 'loss/train': 0.4684644788503647} 01/26/2022 21:39:30 - INFO - codeparrot_training - Step 1937: {'lr': 0.00048425000000000003, 'samples': 372096, 'steps': 1937, 'loss/train': 1.0882051885128021} 01/26/2022 21:39:36 - INFO - codeparrot_training - Step 1938: {'lr': 0.0004845, 'samples': 372288, 'steps': 1938, 'loss/train': 0.8394992351531982} 01/26/2022 21:39:39 - INFO - codeparrot_training - Step 1939: {'lr': 0.00048475000000000005, 'samples': 372480, 'steps': 1939, 'loss/train': 0.993206262588501} 01/26/2022 21:39:42 - INFO - codeparrot_training - Step 1940: {'lr': 0.00048499999999999997, 'samples': 372672, 'steps': 1940, 'loss/train': 0.18441668897867203} 01/26/2022 21:39:45 - INFO - codeparrot_training - Step 1941: {'lr': 0.00048525, 'samples': 372864, 'steps': 1941, 'loss/train': 0.7797511518001556} 01/26/2022 21:39:48 - INFO - codeparrot_training - Step 1942: {'lr': 0.0004855, 'samples': 373056, 'steps': 1942, 'loss/train': 0.6799484342336655} 01/26/2022 21:39:51 - INFO - codeparrot_training - Step 1943: {'lr': 0.00048575, 'samples': 373248, 'steps': 1943, 'loss/train': 0.7915197908878326} 01/26/2022 21:39:55 - INFO - codeparrot_training - Step 1944: {'lr': 0.000486, 'samples': 373440, 'steps': 1944, 'loss/train': 1.0098568797111511} 01/26/2022 21:39:58 - INFO - codeparrot_training - Step 1945: {'lr': 0.00048625000000000003, 'samples': 373632, 'steps': 1945, 'loss/train': 0.7834502756595612} 01/26/2022 21:40:01 - INFO - codeparrot_training - Step 1946: {'lr': 0.0004865, 'samples': 373824, 'steps': 1946, 'loss/train': 1.05072820186615} 01/26/2022 21:40:05 - INFO - codeparrot_training - Step 1947: {'lr': 0.00048675000000000004, 'samples': 374016, 'steps': 1947, 'loss/train': 0.49344009160995483} 01/26/2022 21:40:08 - INFO - codeparrot_training - Step 1948: {'lr': 0.000487, 'samples': 374208, 'steps': 1948, 'loss/train': 1.1734929084777832} 01/26/2022 21:40:11 - INFO - codeparrot_training - Step 1949: {'lr': 0.00048725000000000005, 'samples': 374400, 'steps': 1949, 'loss/train': 1.2145574390888214} 01/26/2022 21:40:15 - INFO - codeparrot_training - Step 1950: {'lr': 0.0004875, 'samples': 374592, 'steps': 1950, 'loss/train': 0.8924444317817688} 01/26/2022 21:40:18 - INFO - codeparrot_training - Step 1951: {'lr': 0.00048775, 'samples': 374784, 'steps': 1951, 'loss/train': 0.7068826854228973} 01/26/2022 21:40:21 - INFO - codeparrot_training - Step 1952: {'lr': 0.000488, 'samples': 374976, 'steps': 1952, 'loss/train': 0.9460795819759369} 01/26/2022 21:40:24 - INFO - codeparrot_training - Step 1953: {'lr': 0.00048825, 'samples': 375168, 'steps': 1953, 'loss/train': 0.7600442469120026} 01/26/2022 21:40:27 - INFO - codeparrot_training - Step 1954: {'lr': 0.0004885, 'samples': 375360, 'steps': 1954, 'loss/train': 0.8999834954738617} 01/26/2022 21:40:33 - INFO - codeparrot_training - Step 1955: {'lr': 0.00048875, 'samples': 375552, 'steps': 1955, 'loss/train': 0.8945177793502808} 01/26/2022 21:40:36 - INFO - codeparrot_training - Step 1956: {'lr': 0.000489, 'samples': 375744, 'steps': 1956, 'loss/train': 0.6711243689060211} 01/26/2022 21:40:40 - INFO - codeparrot_training - Step 1957: {'lr': 0.00048925, 'samples': 375936, 'steps': 1957, 'loss/train': 0.9730430245399475} 01/26/2022 21:40:43 - INFO - codeparrot_training - Step 1958: {'lr': 0.0004895, 'samples': 376128, 'steps': 1958, 'loss/train': 0.6753373593091965} 01/26/2022 21:40:46 - INFO - codeparrot_training - Step 1959: {'lr': 0.0004897500000000001, 'samples': 376320, 'steps': 1959, 'loss/train': 
0.7206808626651764} 01/26/2022 21:40:49 - INFO - codeparrot_training - Step 1960: {'lr': 0.00049, 'samples': 376512, 'steps': 1960, 'loss/train': 1.1024954617023468} 01/26/2022 21:40:52 - INFO - codeparrot_training - Step 1961: {'lr': 0.00049025, 'samples': 376704, 'steps': 1961, 'loss/train': 1.1701643764972687} 01/26/2022 21:40:55 - INFO - codeparrot_training - Step 1962: {'lr': 0.0004905, 'samples': 376896, 'steps': 1962, 'loss/train': 0.9999094605445862} 01/26/2022 21:40:58 - INFO - codeparrot_training - Step 1963: {'lr': 0.0004907500000000001, 'samples': 377088, 'steps': 1963, 'loss/train': 0.8964014947414398} 01/26/2022 21:41:03 - INFO - codeparrot_training - Step 1964: {'lr': 0.000491, 'samples': 377280, 'steps': 1964, 'loss/train': 0.8065761923789978} 01/26/2022 21:41:06 - INFO - codeparrot_training - Step 1965: {'lr': 0.00049125, 'samples': 377472, 'steps': 1965, 'loss/train': 0.9917560815811157} 01/26/2022 21:41:09 - INFO - codeparrot_training - Step 1966: {'lr': 0.0004915, 'samples': 377664, 'steps': 1966, 'loss/train': 1.1736750304698944} 01/26/2022 21:41:12 - INFO - codeparrot_training - Step 1967: {'lr': 0.00049175, 'samples': 377856, 'steps': 1967, 'loss/train': 0.8484614789485931} 01/26/2022 21:41:15 - INFO - codeparrot_training - Step 1968: {'lr': 0.000492, 'samples': 378048, 'steps': 1968, 'loss/train': 0.9249576330184937} 01/26/2022 21:41:18 - INFO - codeparrot_training - Step 1969: {'lr': 0.0004922500000000001, 'samples': 378240, 'steps': 1969, 'loss/train': 1.0223771631717682} 01/26/2022 21:41:22 - INFO - codeparrot_training - Step 1970: {'lr': 0.0004925, 'samples': 378432, 'steps': 1970, 'loss/train': 0.5959418714046478} 01/26/2022 21:41:25 - INFO - codeparrot_training - Step 1971: {'lr': 0.00049275, 'samples': 378624, 'steps': 1971, 'loss/train': 0.4839349687099457} 01/26/2022 21:41:28 - INFO - codeparrot_training - Step 1972: {'lr': 0.0004930000000000001, 'samples': 378816, 'steps': 1972, 'loss/train': 0.73919378221035} 01/26/2022 21:41:32 - INFO - codeparrot_training - Step 1973: {'lr': 0.00049325, 'samples': 379008, 'steps': 1973, 'loss/train': 0.8754311800003052} 01/26/2022 21:41:35 - INFO - codeparrot_training - Step 1974: {'lr': 0.0004935, 'samples': 379200, 'steps': 1974, 'loss/train': 0.2693951725959778} 01/26/2022 21:41:38 - INFO - codeparrot_training - Step 1975: {'lr': 0.00049375, 'samples': 379392, 'steps': 1975, 'loss/train': 0.8377140462398529} 01/26/2022 21:41:42 - INFO - codeparrot_training - Step 1976: {'lr': 0.000494, 'samples': 379584, 'steps': 1976, 'loss/train': 0.7758494317531586} 01/26/2022 21:41:45 - INFO - codeparrot_training - Step 1977: {'lr': 0.00049425, 'samples': 379776, 'steps': 1977, 'loss/train': 1.216726541519165} 01/26/2022 21:41:48 - INFO - codeparrot_training - Step 1978: {'lr': 0.0004945, 'samples': 379968, 'steps': 1978, 'loss/train': 0.9289956986904144} 01/26/2022 21:41:51 - INFO - codeparrot_training - Step 1979: {'lr': 0.0004947500000000001, 'samples': 380160, 'steps': 1979, 'loss/train': 0.8398751020431519} 01/26/2022 21:41:54 - INFO - codeparrot_training - Step 1980: {'lr': 0.000495, 'samples': 380352, 'steps': 1980, 'loss/train': 0.9946226477622986} 01/26/2022 21:42:01 - INFO - codeparrot_training - Step 1981: {'lr': 0.00049525, 'samples': 380544, 'steps': 1981, 'loss/train': 0.6382877826690674} 01/26/2022 21:42:04 - INFO - codeparrot_training - Step 1982: {'lr': 0.0004955, 'samples': 380736, 'steps': 1982, 'loss/train': 0.6531389057636261} 01/26/2022 21:42:07 - INFO - codeparrot_training - Step 1983: {'lr': 0.00049575, 
'samples': 380928, 'steps': 1983, 'loss/train': 1.2436447441577911} 01/26/2022 21:42:10 - INFO - codeparrot_training - Step 1984: {'lr': 0.000496, 'samples': 381120, 'steps': 1984, 'loss/train': 0.8173876404762268} 01/26/2022 21:42:13 - INFO - codeparrot_training - Step 1985: {'lr': 0.0004962500000000001, 'samples': 381312, 'steps': 1985, 'loss/train': 0.2629638612270355} 01/26/2022 21:42:16 - INFO - codeparrot_training - Step 1986: {'lr': 0.0004965, 'samples': 381504, 'steps': 1986, 'loss/train': 0.4969499856233597} 01/26/2022 21:42:19 - INFO - codeparrot_training - Step 1987: {'lr': 0.00049675, 'samples': 381696, 'steps': 1987, 'loss/train': 1.013057827949524} 01/26/2022 21:42:23 - INFO - codeparrot_training - Step 1988: {'lr': 0.000497, 'samples': 381888, 'steps': 1988, 'loss/train': 0.8126345872879028} 01/26/2022 21:42:26 - INFO - codeparrot_training - Step 1989: {'lr': 0.0004972500000000001, 'samples': 382080, 'steps': 1989, 'loss/train': 1.1256734132766724} 01/26/2022 21:42:30 - INFO - codeparrot_training - Step 1990: {'lr': 0.0004975, 'samples': 382272, 'steps': 1990, 'loss/train': 0.9168229401111603} 01/26/2022 21:42:33 - INFO - codeparrot_training - Step 1991: {'lr': 0.00049775, 'samples': 382464, 'steps': 1991, 'loss/train': 0.941860556602478} 01/26/2022 21:42:36 - INFO - codeparrot_training - Step 1992: {'lr': 0.000498, 'samples': 382656, 'steps': 1992, 'loss/train': 1.4965969026088715} 01/26/2022 21:42:40 - INFO - codeparrot_training - Step 1993: {'lr': 0.00049825, 'samples': 382848, 'steps': 1993, 'loss/train': 1.0781153440475464} 01/26/2022 21:42:43 - INFO - codeparrot_training - Step 1994: {'lr': 0.0004985, 'samples': 383040, 'steps': 1994, 'loss/train': 0.9627453088760376} 01/26/2022 21:42:46 - INFO - codeparrot_training - Step 1995: {'lr': 0.0004987500000000001, 'samples': 383232, 'steps': 1995, 'loss/train': 0.9109219908714294} 01/26/2022 21:42:49 - INFO - codeparrot_training - Step 1996: {'lr': 0.000499, 'samples': 383424, 'steps': 1996, 'loss/train': 0.13320894911885262} 01/26/2022 21:42:52 - INFO - codeparrot_training - Step 1997: {'lr': 0.00049925, 'samples': 383616, 'steps': 1997, 'loss/train': 1.204388827085495} 01/26/2022 21:42:55 - INFO - codeparrot_training - Step 1998: {'lr': 0.0004995, 'samples': 383808, 'steps': 1998, 'loss/train': 1.3314221799373627} 01/26/2022 21:43:00 - INFO - codeparrot_training - Step 1999: {'lr': 0.0004997500000000001, 'samples': 384000, 'steps': 1999, 'loss/train': 1.2572254836559296} 01/26/2022 21:43:00 - INFO - codeparrot_training - Evaluating and saving model checkpoint 01/26/2022 21:44:51 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py * [new branch] royal-monkey-12 -> royal-monkey-12 01/26/2022 21:46:08 - INFO - codeparrot_training - Step 2000: {'lr': 0.0005, 'samples': 384192, 'steps': 2000, 'loss/train': 1.2902482151985168} 01/26/2022 21:46:11 - INFO - codeparrot_training - Step 2001: {'lr': 0.0004999999994645397, 'samples': 384384, 'steps': 2001, 'loss/train': 1.012294739484787} 01/26/2022 21:46:14 - INFO - codeparrot_training - Step 2002: {'lr': 0.0004999999978581587, 'samples': 384576, 'steps': 2002, 'loss/train': 1.1903394162654877} 01/26/2022 21:46:17 - INFO - codeparrot_training - Step 2003: {'lr': 0.0004999999951808573, 'samples': 384768, 'steps': 2003, 'loss/train': 0.6301666796207428} 01/26/2022 21:46:21 - INFO - codeparrot_training - Step 2004: {'lr': 0.0004999999914326351, 'samples': 384960, 'steps': 2004, 'loss/train': 0.6268340349197388} 01/26/2022 21:46:24 - INFO - 
codeparrot_training - Step 2005: {'lr': 0.0004999999866134924, 'samples': 385152, 'steps': 2005, 'loss/train': 0.7916231453418732} 01/26/2022 21:46:27 - INFO - codeparrot_training - Step 2006: {'lr': 0.0004999999807234292, 'samples': 385344, 'steps': 2006, 'loss/train': 1.0635093748569489} 01/26/2022 21:46:30 - INFO - codeparrot_training - Step 2007: {'lr': 0.0004999999737624453, 'samples': 385536, 'steps': 2007, 'loss/train': 0.3961552530527115} 01/26/2022 21:46:36 - INFO - codeparrot_training - Step 2008: {'lr': 0.0004999999657305411, 'samples': 385728, 'steps': 2008, 'loss/train': 0.5662686377763748} 01/26/2022 21:46:40 - INFO - codeparrot_training - Step 2009: {'lr': 0.0004999999566277163, 'samples': 385920, 'steps': 2009, 'loss/train': 0.846624881029129} 01/26/2022 21:46:43 - INFO - codeparrot_training - Step 2010: {'lr': 0.0004999999464539711, 'samples': 386112, 'steps': 2010, 'loss/train': 1.4005944728851318} 01/26/2022 21:46:46 - INFO - codeparrot_training - Step 2011: {'lr': 0.0004999999352093055, 'samples': 386304, 'steps': 2011, 'loss/train': 1.2804512679576874} 01/26/2022 21:46:49 - INFO - codeparrot_training - Step 2012: {'lr': 0.0004999999228937196, 'samples': 386496, 'steps': 2012, 'loss/train': 0.5199491679668427} 01/26/2022 21:46:52 - INFO - codeparrot_training - Step 2013: {'lr': 0.0004999999095072135, 'samples': 386688, 'steps': 2013, 'loss/train': 1.3126805126667023} 01/26/2022 21:46:55 - INFO - codeparrot_training - Step 2014: {'lr': 0.0004999998950497869, 'samples': 386880, 'steps': 2014, 'loss/train': 0.7487647086381912} 01/26/2022 21:46:58 - INFO - codeparrot_training - Step 2015: {'lr': 0.0004999998795214404, 'samples': 387072, 'steps': 2015, 'loss/train': 1.3387062549591064} 01/26/2022 21:47:02 - INFO - codeparrot_training - Step 2016: {'lr': 0.0004999998629221736, 'samples': 387264, 'steps': 2016, 'loss/train': 0.8916180431842804} 01/26/2022 21:47:06 - INFO - codeparrot_training - Step 2017: {'lr': 0.0004999998452519869, 'samples': 387456, 'steps': 2017, 'loss/train': 0.6454151719808578} 01/26/2022 21:47:09 - INFO - codeparrot_training - Step 2018: {'lr': 0.0004999998265108802, 'samples': 387648, 'steps': 2018, 'loss/train': 1.089319109916687} 01/26/2022 21:47:13 - INFO - codeparrot_training - Step 2019: {'lr': 0.0004999998066988537, 'samples': 387840, 'steps': 2019, 'loss/train': 0.45093633234500885} 01/26/2022 21:47:16 - INFO - codeparrot_training - Step 2020: {'lr': 0.0004999997858159073, 'samples': 388032, 'steps': 2020, 'loss/train': 1.1280781924724579} 01/26/2022 21:47:19 - INFO - codeparrot_training - Step 2021: {'lr': 0.0004999997638620412, 'samples': 388224, 'steps': 2021, 'loss/train': 0.8910837471485138} 01/26/2022 21:47:22 - INFO - codeparrot_training - Step 2022: {'lr': 0.0004999997408372557, 'samples': 388416, 'steps': 2022, 'loss/train': 0.9995455741882324} 01/26/2022 21:47:25 - INFO - codeparrot_training - Step 2023: {'lr': 0.0004999997167415504, 'samples': 388608, 'steps': 2023, 'loss/train': 1.4466348588466644} 01/26/2022 21:47:28 - INFO - codeparrot_training - Step 2024: {'lr': 0.0004999996915749259, 'samples': 388800, 'steps': 2024, 'loss/train': 0.39474794268608093} 01/26/2022 21:47:33 - INFO - codeparrot_training - Step 2025: {'lr': 0.0004999996653373821, 'samples': 388992, 'steps': 2025, 'loss/train': 1.0281030535697937} 01/26/2022 21:47:36 - INFO - codeparrot_training - Step 2026: {'lr': 0.000499999638028919, 'samples': 389184, 'steps': 2026, 'loss/train': 1.1158581376075745} 01/26/2022 21:47:39 - INFO - codeparrot_training - Step 2027: 
{'lr': 0.0004999996096495369, 'samples': 389376, 'steps': 2027, 'loss/train': 0.8855677843093872} 01/26/2022 21:47:42 - INFO - codeparrot_training - Step 2028: {'lr': 0.0004999995801992359, 'samples': 389568, 'steps': 2028, 'loss/train': 0.6355200558900833} 01/26/2022 21:47:45 - INFO - codeparrot_training - Step 2029: {'lr': 0.000499999549678016, 'samples': 389760, 'steps': 2029, 'loss/train': 1.2554163336753845} 01/26/2022 21:47:49 - INFO - codeparrot_training - Step 2030: {'lr': 0.0004999995180858774, 'samples': 389952, 'steps': 2030, 'loss/train': 0.8659185469150543} 01/26/2022 21:47:52 - INFO - codeparrot_training - Step 2031: {'lr': 0.0004999994854228203, 'samples': 390144, 'steps': 2031, 'loss/train': 0.8121409714221954} 01/26/2022 21:47:55 - INFO - codeparrot_training - Step 2032: {'lr': 0.0004999994516888449, 'samples': 390336, 'steps': 2032, 'loss/train': 1.1874024868011475} 01/26/2022 21:47:58 - INFO - codeparrot_training - Step 2033: {'lr': 0.000499999416883951, 'samples': 390528, 'steps': 2033, 'loss/train': 0.8310914039611816} 01/26/2022 21:48:05 - INFO - codeparrot_training - Step 2034: {'lr': 0.0004999993810081391, 'samples': 390720, 'steps': 2034, 'loss/train': 0.8911593854427338} 01/26/2022 21:48:08 - INFO - codeparrot_training - Step 2035: {'lr': 0.0004999993440614092, 'samples': 390912, 'steps': 2035, 'loss/train': 0.10809960588812828} 01/26/2022 21:48:11 - INFO - codeparrot_training - Step 2036: {'lr': 0.0004999993060437616, 'samples': 391104, 'steps': 2036, 'loss/train': 0.48812802135944366} 01/26/2022 21:48:14 - INFO - codeparrot_training - Step 2037: {'lr': 0.0004999992669551962, 'samples': 391296, 'steps': 2037, 'loss/train': 0.7920154631137848} 01/26/2022 21:48:17 - INFO - codeparrot_training - Step 2038: {'lr': 0.0004999992267957135, 'samples': 391488, 'steps': 2038, 'loss/train': 0.7592490613460541} 01/26/2022 21:48:20 - INFO - codeparrot_training - Step 2039: {'lr': 0.0004999991855653134, 'samples': 391680, 'steps': 2039, 'loss/train': 0.7787691950798035} 01/26/2022 21:48:23 - INFO - codeparrot_training - Step 2040: {'lr': 0.0004999991432639963, 'samples': 391872, 'steps': 2040, 'loss/train': 0.2983754575252533} 01/26/2022 21:48:27 - INFO - codeparrot_training - Step 2041: {'lr': 0.0004999990998917621, 'samples': 392064, 'steps': 2041, 'loss/train': 1.7055848836898804} 01/26/2022 21:48:30 - INFO - codeparrot_training - Step 2042: {'lr': 0.0004999990554486111, 'samples': 392256, 'steps': 2042, 'loss/train': 1.7756046652793884} 01/26/2022 21:48:33 - INFO - codeparrot_training - Step 2043: {'lr': 0.0004999990099345436, 'samples': 392448, 'steps': 2043, 'loss/train': 1.3484628796577454} 01/26/2022 21:48:37 - INFO - codeparrot_training - Step 2044: {'lr': 0.0004999989633495597, 'samples': 392640, 'steps': 2044, 'loss/train': 0.9816133975982666} 01/26/2022 21:48:41 - INFO - codeparrot_training - Step 2045: {'lr': 0.0004999989156936597, 'samples': 392832, 'steps': 2045, 'loss/train': 0.14317631721496582} 01/26/2022 21:48:44 - INFO - codeparrot_training - Step 2046: {'lr': 0.0004999988669668437, 'samples': 393024, 'steps': 2046, 'loss/train': 1.108687698841095} 01/26/2022 21:48:47 - INFO - codeparrot_training - Step 2047: {'lr': 0.0004999988171691119, 'samples': 393216, 'steps': 2047, 'loss/train': 1.4060376584529877} 01/26/2022 21:48:50 - INFO - codeparrot_training - Step 2048: {'lr': 0.0004999987663004646, 'samples': 393408, 'steps': 2048, 'loss/train': 1.0836627781391144} 01/26/2022 21:48:53 - INFO - codeparrot_training - Step 2049: {'lr': 0.0004999987143609019, 
'samples': 393600, 'steps': 2049, 'loss/train': 1.599990427494049} 01/26/2022 21:48:56 - INFO - codeparrot_training - Step 2050: {'lr': 0.0004999986613504242, 'samples': 393792, 'steps': 2050, 'loss/train': 1.156786859035492} 01/26/2022 21:48:59 - INFO - codeparrot_training - Step 2051: {'lr': 0.0004999986072690315, 'samples': 393984, 'steps': 2051, 'loss/train': 0.950546532869339} 01/26/2022 21:49:04 - INFO - codeparrot_training - Step 2052: {'lr': 0.0004999985521167242, 'samples': 394176, 'steps': 2052, 'loss/train': 0.7436255514621735} 01/26/2022 21:49:07 - INFO - codeparrot_training - Step 2053: {'lr': 0.0004999984958935025, 'samples': 394368, 'steps': 2053, 'loss/train': 1.6797552108764648} 01/26/2022 21:49:10 - INFO - codeparrot_training - Step 2054: {'lr': 0.0004999984385993665, 'samples': 394560, 'steps': 2054, 'loss/train': 1.1232050657272339} 01/26/2022 21:49:13 - INFO - codeparrot_training - Step 2055: {'lr': 0.0004999983802343168, 'samples': 394752, 'steps': 2055, 'loss/train': 0.5053276866674423} 01/26/2022 21:49:16 - INFO - codeparrot_training - Step 2056: {'lr': 0.0004999983207983532, 'samples': 394944, 'steps': 2056, 'loss/train': 1.1689175963401794} 01/26/2022 21:49:19 - INFO - codeparrot_training - Step 2057: {'lr': 0.0004999982602914763, 'samples': 395136, 'steps': 2057, 'loss/train': 0.9626614451408386} 01/26/2022 21:49:23 - INFO - codeparrot_training - Step 2058: {'lr': 0.0004999981987136862, 'samples': 395328, 'steps': 2058, 'loss/train': 0.8688786327838898} 01/26/2022 21:49:26 - INFO - codeparrot_training - Step 2059: {'lr': 0.0004999981360649833, 'samples': 395520, 'steps': 2059, 'loss/train': 1.1762094497680664} 01/26/2022 21:49:29 - INFO - codeparrot_training - Step 2060: {'lr': 0.0004999980723453676, 'samples': 395712, 'steps': 2060, 'loss/train': 0.7672549188137054} 01/26/2022 21:49:35 - INFO - codeparrot_training - Step 2061: {'lr': 0.0004999980075548397, 'samples': 395904, 'steps': 2061, 'loss/train': 1.003443717956543} 01/26/2022 21:49:38 - INFO - codeparrot_training - Step 2062: {'lr': 0.0004999979416933997, 'samples': 396096, 'steps': 2062, 'loss/train': 0.6878953278064728} 01/26/2022 21:49:41 - INFO - codeparrot_training - Step 2063: {'lr': 0.0004999978747610478, 'samples': 396288, 'steps': 2063, 'loss/train': 1.0185282826423645} 01/26/2022 21:49:45 - INFO - codeparrot_training - Step 2064: {'lr': 0.0004999978067577843, 'samples': 396480, 'steps': 2064, 'loss/train': 1.2931655645370483} 01/26/2022 21:49:48 - INFO - codeparrot_training - Step 2065: {'lr': 0.0004999977376836098, 'samples': 396672, 'steps': 2065, 'loss/train': 0.9251795411109924} 01/26/2022 21:49:51 - INFO - codeparrot_training - Step 2066: {'lr': 0.0004999976675385243, 'samples': 396864, 'steps': 2066, 'loss/train': 0.5291837006807327} 01/26/2022 21:49:54 - INFO - codeparrot_training - Step 2067: {'lr': 0.0004999975963225282, 'samples': 397056, 'steps': 2067, 'loss/train': 0.1125844158232212} 01/26/2022 21:49:57 - INFO - codeparrot_training - Step 2068: {'lr': 0.0004999975240356217, 'samples': 397248, 'steps': 2068, 'loss/train': 1.1009353995323181} 01/26/2022 21:50:00 - INFO - codeparrot_training - Step 2069: {'lr': 0.0004999974506778053, 'samples': 397440, 'steps': 2069, 'loss/train': 0.9341163039207458} 01/26/2022 21:50:05 - INFO - codeparrot_training - Step 2070: {'lr': 0.0004999973762490792, 'samples': 397632, 'steps': 2070, 'loss/train': 0.9636965095996857} 01/26/2022 21:50:08 - INFO - codeparrot_training - Step 2071: {'lr': 0.0004999973007494436, 'samples': 397824, 'steps': 2071, 
'loss/train': 1.116022914648056} 01/26/2022 21:50:11 - INFO - codeparrot_training - Step 2072: {'lr': 0.000499997224178899, 'samples': 398016, 'steps': 2072, 'loss/train': 1.1989569067955017} 01/26/2022 21:50:14 - INFO - codeparrot_training - Step 2073: {'lr': 0.0004999971465374457, 'samples': 398208, 'steps': 2073, 'loss/train': 0.6034450381994247} 01/26/2022 21:50:18 - INFO - codeparrot_training - Step 2074: {'lr': 0.000499997067825084, 'samples': 398400, 'steps': 2074, 'loss/train': 0.16078270599246025} 01/26/2022 21:50:21 - INFO - codeparrot_training - Step 2075: {'lr': 0.0004999969880418142, 'samples': 398592, 'steps': 2075, 'loss/train': 1.4032876789569855} 01/26/2022 21:50:24 - INFO - codeparrot_training - Step 2076: {'lr': 0.0004999969071876367, 'samples': 398784, 'steps': 2076, 'loss/train': 0.9101490676403046} 01/26/2022 21:50:27 - INFO - codeparrot_training - Step 2077: {'lr': 0.0004999968252625519, 'samples': 398976, 'steps': 2077, 'loss/train': 0.8321006298065186} 01/26/2022 21:50:30 - INFO - codeparrot_training - Step 2078: {'lr': 0.00049999674226656, 'samples': 399168, 'steps': 2078, 'loss/train': 0.7387937754392624} 01/26/2022 21:50:35 - INFO - codeparrot_training - Step 2079: {'lr': 0.0004999966581996616, 'samples': 399360, 'steps': 2079, 'loss/train': 1.354985386133194} 01/26/2022 21:50:38 - INFO - codeparrot_training - Step 2080: {'lr': 0.0004999965730618567, 'samples': 399552, 'steps': 2080, 'loss/train': 0.9787302911281586} 01/26/2022 21:50:41 - INFO - codeparrot_training - Step 2081: {'lr': 0.000499996486853146, 'samples': 399744, 'steps': 2081, 'loss/train': 0.4693177789449692} 01/26/2022 21:50:44 - INFO - codeparrot_training - Step 2082: {'lr': 0.0004999963995735296, 'samples': 399936, 'steps': 2082, 'loss/train': 1.1204890608787537} 01/26/2022 21:50:47 - INFO - codeparrot_training - Step 2083: {'lr': 0.0004999963112230081, 'samples': 400128, 'steps': 2083, 'loss/train': 0.9240640997886658} 01/26/2022 21:50:50 - INFO - codeparrot_training - Step 2084: {'lr': 0.0004999962218015818, 'samples': 400320, 'steps': 2084, 'loss/train': 0.1406186856329441} 01/26/2022 21:50:54 - INFO - codeparrot_training - Step 2085: {'lr': 0.0004999961313092511, 'samples': 400512, 'steps': 2085, 'loss/train': 1.1738861203193665} 01/26/2022 21:50:57 - INFO - codeparrot_training - Step 2086: {'lr': 0.0004999960397460162, 'samples': 400704, 'steps': 2086, 'loss/train': 1.0778899490833282} 01/26/2022 21:51:03 - INFO - codeparrot_training - Step 2087: {'lr': 0.0004999959471118778, 'samples': 400896, 'steps': 2087, 'loss/train': 0.5370262116193771} 01/26/2022 21:51:06 - INFO - codeparrot_training - Step 2088: {'lr': 0.000499995853406836, 'samples': 401088, 'steps': 2088, 'loss/train': 0.9893450438976288} 01/26/2022 21:51:09 - INFO - codeparrot_training - Step 2089: {'lr': 0.0004999957586308914, 'samples': 401280, 'steps': 2089, 'loss/train': 0.6184823513031006} 01/26/2022 21:51:12 - INFO - codeparrot_training - Step 2090: {'lr': 0.0004999956627840445, 'samples': 401472, 'steps': 2090, 'loss/train': 0.9505895376205444} 01/26/2022 21:51:15 - INFO - codeparrot_training - Step 2091: {'lr': 0.0004999955658662954, 'samples': 401664, 'steps': 2091, 'loss/train': 0.595918133854866} 01/26/2022 21:51:19 - INFO - codeparrot_training - Step 2092: {'lr': 0.0004999954678776448, 'samples': 401856, 'steps': 2092, 'loss/train': 1.1441571414470673} 01/26/2022 21:51:22 - INFO - codeparrot_training - Step 2093: {'lr': 0.0004999953688180929, 'samples': 402048, 'steps': 2093, 'loss/train': 1.086691439151764} 
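
The entries above capture the end of the linear warmup: the learning rate reaches its peak of 5e-4 exactly at step 2000, where the run also evaluates, saves a checkpoint, and pushes the royal-monkey-12 branch to ncoop57/codeparrot-neo-125M-py, and from step 2001 onward the rate decays very slowly. The 'samples' counter grows by 192 per step, i.e. an effective batch of 192 sequences per optimizer step. The scheduler itself is not printed in this log, but the logged 'lr' values are consistent with a standard cosine-with-warmup schedule. The sketch below reproduces them under two assumptions that are inferred, not stated in the log: 2,000 warmup steps and a 50,000-step training horizon (the horizon is backed out from the post-warmup decay rate).

    import math

    def lr_at(step, peak_lr=5e-4, warmup_steps=2_000, total_steps=50_000):
        # Cosine decay with linear warmup. total_steps is an assumption
        # inferred from the decay rate seen after step 2000, not a value
        # read from this log excerpt.
        if step < warmup_steps:
            return peak_lr * step / warmup_steps
        progress = (step - warmup_steps) / (total_steps - warmup_steps)
        return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

    # lr_at(1960) -> 0.00049, lr_at(2000) -> 0.0005,
    # lr_at(2100) -> ~0.0004999946454, matching the 'lr' values logged above.

In practice a schedule like this would usually come from transformers' get_cosine_schedule_with_warmup (or get_scheduler) attached to the optimizer rather than a hand-rolled closed form; the function above is only meant to document what the logged numbers imply.
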
01/26/2022 21:51:25 - INFO - codeparrot_training - Step 2094: {'lr': 0.0004999952686876402, 'samples': 402240, 'steps': 2094, 'loss/train': 1.0486720204353333} 01/26/2022 21:51:28 - INFO - codeparrot_training - Step 2095: {'lr': 0.0004999951674862872, 'samples': 402432, 'steps': 2095, 'loss/train': 0.6069618612527847} 01/26/2022 21:51:32 - INFO - codeparrot_training - Step 2096: {'lr': 0.0004999950652140343, 'samples': 402624, 'steps': 2096, 'loss/train': 1.2156330943107605} 01/26/2022 21:51:36 - INFO - codeparrot_training - Step 2097: {'lr': 0.0004999949618708819, 'samples': 402816, 'steps': 2097, 'loss/train': 1.8087542653083801} 01/26/2022 21:51:39 - INFO - codeparrot_training - Step 2098: {'lr': 0.0004999948574568305, 'samples': 403008, 'steps': 2098, 'loss/train': 1.0872942209243774} 01/26/2022 21:51:42 - INFO - codeparrot_training - Step 2099: {'lr': 0.0004999947519718805, 'samples': 403200, 'steps': 2099, 'loss/train': 1.1872133016586304} 01/26/2022 21:51:45 - INFO - codeparrot_training - Step 2100: {'lr': 0.0004999946454160324, 'samples': 403392, 'steps': 2100, 'loss/train': 0.7825076580047607} 01/26/2022 21:51:48 - INFO - codeparrot_training - Step 2101: {'lr': 0.0004999945377892865, 'samples': 403584, 'steps': 2101, 'loss/train': 0.6369920074939728} 01/26/2022 21:51:51 - INFO - codeparrot_training - Step 2102: {'lr': 0.0004999944290916434, 'samples': 403776, 'steps': 2102, 'loss/train': 0.5938144773244858} 01/26/2022 21:51:54 - INFO - codeparrot_training - Step 2103: {'lr': 0.0004999943193231037, 'samples': 403968, 'steps': 2103, 'loss/train': 0.985668808221817} 01/26/2022 21:51:58 - INFO - codeparrot_training - Step 2104: {'lr': 0.0004999942084836675, 'samples': 404160, 'steps': 2104, 'loss/train': 0.3586886152625084} 01/26/2022 21:52:04 - INFO - codeparrot_training - Step 2105: {'lr': 0.0004999940965733356, 'samples': 404352, 'steps': 2105, 'loss/train': 1.094574898481369} 01/26/2022 21:52:07 - INFO - codeparrot_training - Step 2106: {'lr': 0.0004999939835921085, 'samples': 404544, 'steps': 2106, 'loss/train': 0.9500210881233215} 01/26/2022 21:52:10 - INFO - codeparrot_training - Step 2107: {'lr': 0.0004999938695399864, 'samples': 404736, 'steps': 2107, 'loss/train': 1.6433390378952026} 01/26/2022 21:52:13 - INFO - codeparrot_training - Step 2108: {'lr': 0.00049999375441697, 'samples': 404928, 'steps': 2108, 'loss/train': 1.1768105328083038} 01/26/2022 21:52:16 - INFO - codeparrot_training - Step 2109: {'lr': 0.0004999936382230597, 'samples': 405120, 'steps': 2109, 'loss/train': 0.801573783159256} 01/26/2022 21:52:20 - INFO - codeparrot_training - Step 2110: {'lr': 0.000499993520958256, 'samples': 405312, 'steps': 2110, 'loss/train': 1.2407980263233185} 01/26/2022 21:52:23 - INFO - codeparrot_training - Step 2111: {'lr': 0.0004999934026225595, 'samples': 405504, 'steps': 2111, 'loss/train': 0.840054988861084} 01/26/2022 21:52:26 - INFO - codeparrot_training - Step 2112: {'lr': 0.0004999932832159708, 'samples': 405696, 'steps': 2112, 'loss/train': 0.6450248658657074} 01/26/2022 21:52:30 - INFO - codeparrot_training - Step 2113: {'lr': 0.00049999316273849, 'samples': 405888, 'steps': 2113, 'loss/train': 0.5340508371591568} 01/26/2022 21:52:33 - INFO - codeparrot_training - Step 2114: {'lr': 0.0004999930411901181, 'samples': 406080, 'steps': 2114, 'loss/train': 1.0998678803443909} 01/26/2022 21:52:37 - INFO - codeparrot_training - Step 2115: {'lr': 0.0004999929185708551, 'samples': 406272, 'steps': 2115, 'loss/train': 0.7132131010293961} 01/26/2022 21:52:40 - INFO - 
codeparrot_training - Step 2116: {'lr': 0.000499992794880702, 'samples': 406464, 'steps': 2116, 'loss/train': 0.6058786511421204} 01/26/2022 21:52:43 - INFO - codeparrot_training - Step 2117: {'lr': 0.0004999926701196592, 'samples': 406656, 'steps': 2117, 'loss/train': 0.8524394631385803} 01/26/2022 21:52:46 - INFO - codeparrot_training - Step 2118: {'lr': 0.0004999925442877271, 'samples': 406848, 'steps': 2118, 'loss/train': 0.4678300470113754} 01/26/2022 21:52:49 - INFO - codeparrot_training - Step 2119: {'lr': 0.0004999924173849063, 'samples': 407040, 'steps': 2119, 'loss/train': 0.800188422203064} 01/26/2022 21:52:52 - INFO - codeparrot_training - Step 2120: {'lr': 0.0004999922894111975, 'samples': 407232, 'steps': 2120, 'loss/train': 0.63613660633564} 01/26/2022 21:52:55 - INFO - codeparrot_training - Step 2121: {'lr': 0.000499992160366601, 'samples': 407424, 'steps': 2121, 'loss/train': 1.033191829919815} 01/26/2022 21:53:00 - INFO - codeparrot_training - Step 2122: {'lr': 0.0004999920302511175, 'samples': 407616, 'steps': 2122, 'loss/train': 1.4258902072906494} 01/26/2022 21:53:03 - INFO - codeparrot_training - Step 2123: {'lr': 0.0004999918990647474, 'samples': 407808, 'steps': 2123, 'loss/train': 0.8511922359466553} 01/26/2022 21:53:06 - INFO - codeparrot_training - Step 2124: {'lr': 0.0004999917668074915, 'samples': 408000, 'steps': 2124, 'loss/train': 0.9408641159534454} 01/26/2022 21:53:09 - INFO - codeparrot_training - Step 2125: {'lr': 0.0004999916334793503, 'samples': 408192, 'steps': 2125, 'loss/train': 1.5075509548187256} 01/26/2022 21:53:12 - INFO - codeparrot_training - Step 2126: {'lr': 0.0004999914990803242, 'samples': 408384, 'steps': 2126, 'loss/train': 1.0471579134464264} 01/26/2022 21:53:15 - INFO - codeparrot_training - Step 2127: {'lr': 0.000499991363610414, 'samples': 408576, 'steps': 2127, 'loss/train': 0.7510868310928345} 01/26/2022 21:53:19 - INFO - codeparrot_training - Step 2128: {'lr': 0.0004999912270696202, 'samples': 408768, 'steps': 2128, 'loss/train': 0.922368049621582} 01/26/2022 21:53:22 - INFO - codeparrot_training - Step 2129: {'lr': 0.0004999910894579432, 'samples': 408960, 'steps': 2129, 'loss/train': 0.8984503448009491} 01/26/2022 21:53:26 - INFO - codeparrot_training - Step 2130: {'lr': 0.000499990950775384, 'samples': 409152, 'steps': 2130, 'loss/train': 1.272133469581604} 01/26/2022 21:53:29 - INFO - codeparrot_training - Step 2131: {'lr': 0.0004999908110219428, 'samples': 409344, 'steps': 2131, 'loss/train': 1.07924884557724} 01/26/2022 21:53:32 - INFO - codeparrot_training - Step 2132: {'lr': 0.0004999906701976203, 'samples': 409536, 'steps': 2132, 'loss/train': 0.7307192981243134} 01/26/2022 21:53:36 - INFO - codeparrot_training - Step 2133: {'lr': 0.0004999905283024172, 'samples': 409728, 'steps': 2133, 'loss/train': 0.9029307961463928} 01/26/2022 21:53:39 - INFO - codeparrot_training - Step 2134: {'lr': 0.0004999903853363341, 'samples': 409920, 'steps': 2134, 'loss/train': 0.8297041654586792} 01/26/2022 21:53:42 - INFO - codeparrot_training - Step 2135: {'lr': 0.0004999902412993715, 'samples': 410112, 'steps': 2135, 'loss/train': 0.9197361767292023} 01/26/2022 21:53:45 - INFO - codeparrot_training - Step 2136: {'lr': 0.0004999900961915302, 'samples': 410304, 'steps': 2136, 'loss/train': 0.994433730840683} 01/26/2022 21:53:48 - INFO - codeparrot_training - Step 2137: {'lr': 0.0004999899500128107, 'samples': 410496, 'steps': 2137, 'loss/train': 0.8735065162181854} 01/26/2022 21:53:51 - INFO - codeparrot_training - Step 2138: {'lr': 
0.0004999898027632135, 'samples': 410688, 'steps': 2138, 'loss/train': 1.1632116436958313} 01/26/2022 21:53:57 - INFO - codeparrot_training - Step 2139: {'lr': 0.0004999896544427394, 'samples': 410880, 'steps': 2139, 'loss/train': 0.8799234330654144} 01/26/2022 21:54:01 - INFO - codeparrot_training - Step 2140: {'lr': 0.0004999895050513891, 'samples': 411072, 'steps': 2140, 'loss/train': 0.668507918715477} 01/26/2022 21:54:04 - INFO - codeparrot_training - Step 2141: {'lr': 0.0004999893545891631, 'samples': 411264, 'steps': 2141, 'loss/train': 1.2403203248977661} 01/26/2022 21:54:07 - INFO - codeparrot_training - Step 2142: {'lr': 0.000499989203056062, 'samples': 411456, 'steps': 2142, 'loss/train': 0.9359115958213806} 01/26/2022 21:54:10 - INFO - codeparrot_training - Step 2143: {'lr': 0.0004999890504520866, 'samples': 411648, 'steps': 2143, 'loss/train': 0.6950445771217346} 01/26/2022 21:54:13 - INFO - codeparrot_training - Step 2144: {'lr': 0.0004999888967772375, 'samples': 411840, 'steps': 2144, 'loss/train': 0.945826381444931} 01/26/2022 21:54:16 - INFO - codeparrot_training - Step 2145: {'lr': 0.0004999887420315153, 'samples': 412032, 'steps': 2145, 'loss/train': 0.5417589247226715} 01/26/2022 21:54:19 - INFO - codeparrot_training - Step 2146: {'lr': 0.0004999885862149207, 'samples': 412224, 'steps': 2146, 'loss/train': 0.8147931396961212} 01/26/2022 21:54:23 - INFO - codeparrot_training - Step 2147: {'lr': 0.0004999884293274545, 'samples': 412416, 'steps': 2147, 'loss/train': 1.1996518671512604} 01/26/2022 21:54:27 - INFO - codeparrot_training - Step 2148: {'lr': 0.0004999882713691171, 'samples': 412608, 'steps': 2148, 'loss/train': 0.9248567819595337} 01/26/2022 21:54:30 - INFO - codeparrot_training - Step 2149: {'lr': 0.0004999881123399093, 'samples': 412800, 'steps': 2149, 'loss/train': 0.4881078153848648} 01/26/2022 21:54:33 - INFO - codeparrot_training - Step 2150: {'lr': 0.000499987952239832, 'samples': 412992, 'steps': 2150, 'loss/train': 1.90619695186615} 01/26/2022 21:54:37 - INFO - codeparrot_training - Step 2151: {'lr': 0.0004999877910688856, 'samples': 413184, 'steps': 2151, 'loss/train': 0.6582081913948059} 01/26/2022 21:54:40 - INFO - codeparrot_training - Step 2152: {'lr': 0.0004999876288270708, 'samples': 413376, 'steps': 2152, 'loss/train': 0.8744857013225555} 01/26/2022 21:54:43 - INFO - codeparrot_training - Step 2153: {'lr': 0.0004999874655143886, 'samples': 413568, 'steps': 2153, 'loss/train': 0.5345426201820374} 01/26/2022 21:54:46 - INFO - codeparrot_training - Step 2154: {'lr': 0.0004999873011308393, 'samples': 413760, 'steps': 2154, 'loss/train': 0.8760119676589966} 01/26/2022 21:54:49 - INFO - codeparrot_training - Step 2155: {'lr': 0.0004999871356764238, 'samples': 413952, 'steps': 2155, 'loss/train': 0.6077926307916641} 01/26/2022 21:54:52 - INFO - codeparrot_training - Step 2156: {'lr': 0.0004999869691511428, 'samples': 414144, 'steps': 2156, 'loss/train': 0.9738145172595978} 01/26/2022 21:54:56 - INFO - codeparrot_training - Step 2157: {'lr': 0.000499986801554997, 'samples': 414336, 'steps': 2157, 'loss/train': 1.0448395907878876} 01/26/2022 21:55:00 - INFO - codeparrot_training - Step 2158: {'lr': 0.0004999866328879871, 'samples': 414528, 'steps': 2158, 'loss/train': 0.9120277762413025} 01/26/2022 21:55:03 - INFO - codeparrot_training - Step 2159: {'lr': 0.0004999864631501139, 'samples': 414720, 'steps': 2159, 'loss/train': 1.2449660897254944} 01/26/2022 21:55:06 - INFO - codeparrot_training - Step 2160: {'lr': 0.000499986292341378, 'samples': 414912, 
'steps': 2160, 'loss/train': 1.3211356401443481} 01/26/2022 21:55:09 - INFO - codeparrot_training - Step 2161: {'lr': 0.0004999861204617803, 'samples': 415104, 'steps': 2161, 'loss/train': 0.4913983941078186} 01/26/2022 21:55:12 - INFO - codeparrot_training - Step 2162: {'lr': 0.0004999859475113213, 'samples': 415296, 'steps': 2162, 'loss/train': 0.7307425886392593} 01/26/2022 21:55:15 - INFO - codeparrot_training - Step 2163: {'lr': 0.0004999857734900021, 'samples': 415488, 'steps': 2163, 'loss/train': 1.0872806310653687} 01/26/2022 21:55:18 - INFO - codeparrot_training - Step 2164: {'lr': 0.000499985598397823, 'samples': 415680, 'steps': 2164, 'loss/train': 0.5287566035985947} 01/26/2022 21:55:22 - INFO - codeparrot_training - Step 2165: {'lr': 0.0004999854222347851, 'samples': 415872, 'steps': 2165, 'loss/train': 1.3536615371704102} 01/26/2022 21:55:28 - INFO - codeparrot_training - Step 2166: {'lr': 0.000499985245000889, 'samples': 416064, 'steps': 2166, 'loss/train': 0.611466720700264} 01/26/2022 21:55:31 - INFO - codeparrot_training - Step 2167: {'lr': 0.0004999850666961355, 'samples': 416256, 'steps': 2167, 'loss/train': 0.5897393077611923} 01/26/2022 21:55:34 - INFO - codeparrot_training - Step 2168: {'lr': 0.0004999848873205254, 'samples': 416448, 'steps': 2168, 'loss/train': 0.6915461719036102} 01/26/2022 21:55:37 - INFO - codeparrot_training - Step 2169: {'lr': 0.0004999847068740593, 'samples': 416640, 'steps': 2169, 'loss/train': 1.2371439635753632} 01/26/2022 21:55:40 - INFO - codeparrot_training - Step 2170: {'lr': 0.0004999845253567382, 'samples': 416832, 'steps': 2170, 'loss/train': 0.773919403553009} 01/26/2022 21:55:44 - INFO - codeparrot_training - Step 2171: {'lr': 0.0004999843427685627, 'samples': 417024, 'steps': 2171, 'loss/train': 1.0751231610774994} 01/26/2022 21:55:47 - INFO - codeparrot_training - Step 2172: {'lr': 0.0004999841591095337, 'samples': 417216, 'steps': 2172, 'loss/train': 0.9082628488540649} 01/26/2022 21:55:50 - INFO - codeparrot_training - Step 2173: {'lr': 0.0004999839743796519, 'samples': 417408, 'steps': 2173, 'loss/train': 0.5015641897916794} 01/26/2022 21:55:53 - INFO - codeparrot_training - Step 2174: {'lr': 0.0004999837885789182, 'samples': 417600, 'steps': 2174, 'loss/train': 0.7078596353530884} 01/26/2022 21:55:57 - INFO - codeparrot_training - Step 2175: {'lr': 0.0004999836017073332, 'samples': 417792, 'steps': 2175, 'loss/train': 0.09642519056797028} 01/26/2022 21:56:01 - INFO - codeparrot_training - Step 2176: {'lr': 0.000499983413764898, 'samples': 417984, 'steps': 2176, 'loss/train': 0.7258294969797134} 01/26/2022 21:56:04 - INFO - codeparrot_training - Step 2177: {'lr': 0.0004999832247516132, 'samples': 418176, 'steps': 2177, 'loss/train': 1.4385782182216644} 01/26/2022 21:56:07 - INFO - codeparrot_training - Step 2178: {'lr': 0.0004999830346674796, 'samples': 418368, 'steps': 2178, 'loss/train': 0.6410648971796036} 01/26/2022 21:56:10 - INFO - codeparrot_training - Step 2179: {'lr': 0.000499982843512498, 'samples': 418560, 'steps': 2179, 'loss/train': 1.230393648147583} 01/26/2022 21:56:13 - INFO - codeparrot_training - Step 2180: {'lr': 0.0004999826512866693, 'samples': 418752, 'steps': 2180, 'loss/train': 0.9883247315883636} 01/26/2022 21:56:16 - INFO - codeparrot_training - Step 2181: {'lr': 0.0004999824579899944, 'samples': 418944, 'steps': 2181, 'loss/train': 1.2229525744915009} 01/26/2022 21:56:19 - INFO - codeparrot_training - Step 2182: {'lr': 0.000499982263622474, 'samples': 419136, 'steps': 2182, 'loss/train': 
0.9424770176410675} 01/26/2022 21:56:23 - INFO - codeparrot_training - Step 2183: {'lr': 0.0004999820681841088, 'samples': 419328, 'steps': 2183, 'loss/train': 1.009579360485077} 01/26/2022 21:56:29 - INFO - codeparrot_training - Step 2184: {'lr': 0.0004999818716748999, 'samples': 419520, 'steps': 2184, 'loss/train': 0.7780390083789825} 01/26/2022 21:56:32 - INFO - codeparrot_training - Step 2185: {'lr': 0.0004999816740948481, 'samples': 419712, 'steps': 2185, 'loss/train': 1.1167762577533722} 01/26/2022 21:56:35 - INFO - codeparrot_training - Step 2186: {'lr': 0.0004999814754439542, 'samples': 419904, 'steps': 2186, 'loss/train': 0.5377397686243057} 01/26/2022 21:56:38 - INFO - codeparrot_training - Step 2187: {'lr': 0.000499981275722219, 'samples': 420096, 'steps': 2187, 'loss/train': 0.6784306168556213} 01/26/2022 21:56:41 - INFO - codeparrot_training - Step 2188: {'lr': 0.0004999810749296434, 'samples': 420288, 'steps': 2188, 'loss/train': 0.8639592826366425} 01/26/2022 21:56:44 - INFO - codeparrot_training - Step 2189: {'lr': 0.0004999808730662282, 'samples': 420480, 'steps': 2189, 'loss/train': 0.8963306844234467} 01/26/2022 21:56:47 - INFO - codeparrot_training - Step 2190: {'lr': 0.0004999806701319743, 'samples': 420672, 'steps': 2190, 'loss/train': 0.45171529054641724} 01/26/2022 21:56:51 - INFO - codeparrot_training - Step 2191: {'lr': 0.0004999804661268827, 'samples': 420864, 'steps': 2191, 'loss/train': 1.1417400240898132} 01/26/2022 21:56:55 - INFO - codeparrot_training - Step 2192: {'lr': 0.0004999802610509541, 'samples': 421056, 'steps': 2192, 'loss/train': 1.0332525372505188} 01/26/2022 21:56:58 - INFO - codeparrot_training - Step 2193: {'lr': 0.0004999800549041894, 'samples': 421248, 'steps': 2193, 'loss/train': 0.9972928762435913} 01/26/2022 21:57:01 - INFO - codeparrot_training - Step 2194: {'lr': 0.0004999798476865895, 'samples': 421440, 'steps': 2194, 'loss/train': 1.2369089126586914} 01/26/2022 21:57:04 - INFO - codeparrot_training - Step 2195: {'lr': 0.0004999796393981554, 'samples': 421632, 'steps': 2195, 'loss/train': 1.0950245261192322} 01/26/2022 21:57:08 - INFO - codeparrot_training - Step 2196: {'lr': 0.0004999794300388879, 'samples': 421824, 'steps': 2196, 'loss/train': 0.5078418999910355} 01/26/2022 21:57:11 - INFO - codeparrot_training - Step 2197: {'lr': 0.0004999792196087879, 'samples': 422016, 'steps': 2197, 'loss/train': 0.49921372532844543} 01/26/2022 21:57:14 - INFO - codeparrot_training - Step 2198: {'lr': 0.0004999790081078562, 'samples': 422208, 'steps': 2198, 'loss/train': 1.3541961014270782} 01/26/2022 21:57:17 - INFO - codeparrot_training - Step 2199: {'lr': 0.0004999787955360939, 'samples': 422400, 'steps': 2199, 'loss/train': 0.72906294465065} 01/26/2022 21:57:20 - INFO - codeparrot_training - Step 2200: {'lr': 0.0004999785818935018, 'samples': 422592, 'steps': 2200, 'loss/train': 0.9099203646183014} 01/26/2022 21:57:25 - INFO - codeparrot_training - Step 2201: {'lr': 0.0004999783671800808, 'samples': 422784, 'steps': 2201, 'loss/train': 0.26237753033638} 01/26/2022 21:57:28 - INFO - codeparrot_training - Step 2202: {'lr': 0.0004999781513958318, 'samples': 422976, 'steps': 2202, 'loss/train': 1.0568976402282715} 01/26/2022 21:57:31 - INFO - codeparrot_training - Step 2203: {'lr': 0.000499977934540756, 'samples': 423168, 'steps': 2203, 'loss/train': 2.1460066437721252} 01/26/2022 21:57:34 - INFO - codeparrot_training - Step 2204: {'lr': 0.0004999777166148539, 'samples': 423360, 'steps': 2204, 'loss/train': 0.6118212640285492} 01/26/2022 21:57:37 
- INFO - codeparrot_training - Step 2205: {'lr': 0.0004999774976181267, 'samples': 423552, 'steps': 2205, 'loss/train': 1.1418206691741943} 01/26/2022 21:57:40 - INFO - codeparrot_training - Step 2206: {'lr': 0.0004999772775505753, 'samples': 423744, 'steps': 2206, 'loss/train': 1.0342950224876404} 01/26/2022 21:57:44 - INFO - codeparrot_training - Step 2207: {'lr': 0.0004999770564122005, 'samples': 423936, 'steps': 2207, 'loss/train': 0.7250835299491882} 01/26/2022 21:57:47 - INFO - codeparrot_training - Step 2208: {'lr': 0.0004999768342030035, 'samples': 424128, 'steps': 2208, 'loss/train': 0.950308620929718} 01/26/2022 21:57:50 - INFO - codeparrot_training - Step 2209: {'lr': 0.0004999766109229851, 'samples': 424320, 'steps': 2209, 'loss/train': 0.8864114284515381} 01/26/2022 21:57:56 - INFO - codeparrot_training - Step 2210: {'lr': 0.0004999763865721463, 'samples': 424512, 'steps': 2210, 'loss/train': 1.4495744705200195} 01/26/2022 21:57:59 - INFO - codeparrot_training - Step 2211: {'lr': 0.000499976161150488, 'samples': 424704, 'steps': 2211, 'loss/train': 0.8796536028385162} 01/26/2022 21:58:02 - INFO - codeparrot_training - Step 2212: {'lr': 0.0004999759346580111, 'samples': 424896, 'steps': 2212, 'loss/train': 1.5318319201469421} 01/26/2022 21:58:05 - INFO - codeparrot_training - Step 2213: {'lr': 0.0004999757070947168, 'samples': 425088, 'steps': 2213, 'loss/train': 0.9910604953765869} 01/26/2022 21:58:09 - INFO - codeparrot_training - Step 2214: {'lr': 0.0004999754784606058, 'samples': 425280, 'steps': 2214, 'loss/train': 1.145070880651474} 01/26/2022 21:58:12 - INFO - codeparrot_training - Step 2215: {'lr': 0.0004999752487556794, 'samples': 425472, 'steps': 2215, 'loss/train': 1.178069919347763} 01/26/2022 21:58:15 - INFO - codeparrot_training - Step 2216: {'lr': 0.0004999750179799383, 'samples': 425664, 'steps': 2216, 'loss/train': 0.9223387241363525} 01/26/2022 21:58:18 - INFO - codeparrot_training - Step 2217: {'lr': 0.0004999747861333838, 'samples': 425856, 'steps': 2217, 'loss/train': 0.586457222700119} 01/26/2022 21:58:21 - INFO - codeparrot_training - Step 2218: {'lr': 0.0004999745532160164, 'samples': 426048, 'steps': 2218, 'loss/train': 0.9391205012798309} 01/26/2022 21:58:26 - INFO - codeparrot_training - Step 2219: {'lr': 0.0004999743192278377, 'samples': 426240, 'steps': 2219, 'loss/train': 0.289036363363266} 01/26/2022 21:58:29 - INFO - codeparrot_training - Step 2220: {'lr': 0.0004999740841688481, 'samples': 426432, 'steps': 2220, 'loss/train': 0.9155005216598511} 01/26/2022 21:58:32 - INFO - codeparrot_training - Step 2221: {'lr': 0.000499973848039049, 'samples': 426624, 'steps': 2221, 'loss/train': 1.0020636320114136} 01/26/2022 21:58:35 - INFO - codeparrot_training - Step 2222: {'lr': 0.0004999736108384414, 'samples': 426816, 'steps': 2222, 'loss/train': 0.9221923649311066} 01/26/2022 21:58:38 - INFO - codeparrot_training - Step 2223: {'lr': 0.0004999733725670261, 'samples': 427008, 'steps': 2223, 'loss/train': 2.542407989501953} 01/26/2022 21:58:41 - INFO - codeparrot_training - Step 2224: {'lr': 0.0004999731332248044, 'samples': 427200, 'steps': 2224, 'loss/train': 0.7836450040340424} 01/26/2022 21:58:45 - INFO - codeparrot_training - Step 2225: {'lr': 0.0004999728928117771, 'samples': 427392, 'steps': 2225, 'loss/train': 0.9659948945045471} 01/26/2022 21:58:48 - INFO - codeparrot_training - Step 2226: {'lr': 0.0004999726513279452, 'samples': 427584, 'steps': 2226, 'loss/train': 0.8156147003173828} 01/26/2022 21:58:52 - INFO - codeparrot_training - Step 2227: 
{'lr': 0.0004999724087733099, 'samples': 427776, 'steps': 2227, 'loss/train': 0.784356415271759} 01/26/2022 21:58:55 - INFO - codeparrot_training - Step 2228: {'lr': 0.0004999721651478723, 'samples': 427968, 'steps': 2228, 'loss/train': 0.7320916503667831} 01/26/2022 21:58:58 - INFO - codeparrot_training - Step 2229: {'lr': 0.0004999719204516332, 'samples': 428160, 'steps': 2229, 'loss/train': 1.7573259472846985} 01/26/2022 21:59:02 - INFO - codeparrot_training - Step 2230: {'lr': 0.0004999716746845937, 'samples': 428352, 'steps': 2230, 'loss/train': 1.090604156255722} 01/26/2022 21:59:05 - INFO - codeparrot_training - Step 2231: {'lr': 0.0004999714278467551, 'samples': 428544, 'steps': 2231, 'loss/train': 0.7671788334846497} 01/26/2022 21:59:08 - INFO - codeparrot_training - Step 2232: {'lr': 0.0004999711799381181, 'samples': 428736, 'steps': 2232, 'loss/train': 0.9610298573970795} 01/26/2022 21:59:11 - INFO - codeparrot_training - Step 2233: {'lr': 0.000499970930958684, 'samples': 428928, 'steps': 2233, 'loss/train': 1.201143890619278} 01/26/2022 21:59:14 - INFO - codeparrot_training - Step 2234: {'lr': 0.0004999706809084538, 'samples': 429120, 'steps': 2234, 'loss/train': 0.906295895576477} 01/26/2022 21:59:17 - INFO - codeparrot_training - Step 2235: {'lr': 0.0004999704297874287, 'samples': 429312, 'steps': 2235, 'loss/train': 0.6909526884555817} 01/26/2022 21:59:22 - INFO - codeparrot_training - Step 2236: {'lr': 0.0004999701775956095, 'samples': 429504, 'steps': 2236, 'loss/train': 0.15398437902331352} 01/26/2022 21:59:25 - INFO - codeparrot_training - Step 2237: {'lr': 0.0004999699243329975, 'samples': 429696, 'steps': 2237, 'loss/train': 0.9287404417991638} 01/26/2022 21:59:28 - INFO - codeparrot_training - Step 2238: {'lr': 0.0004999696699995937, 'samples': 429888, 'steps': 2238, 'loss/train': 1.3639012277126312} 01/26/2022 21:59:31 - INFO - codeparrot_training - Step 2239: {'lr': 0.0004999694145953992, 'samples': 430080, 'steps': 2239, 'loss/train': 0.8283425867557526} 01/26/2022 21:59:34 - INFO - codeparrot_training - Step 2240: {'lr': 0.0004999691581204152, 'samples': 430272, 'steps': 2240, 'loss/train': 0.6986379325389862} 01/26/2022 21:59:38 - INFO - codeparrot_training - Step 2241: {'lr': 0.0004999689005746426, 'samples': 430464, 'steps': 2241, 'loss/train': 1.4465905129909515} 01/26/2022 21:59:41 - INFO - codeparrot_training - Step 2242: {'lr': 0.0004999686419580827, 'samples': 430656, 'steps': 2242, 'loss/train': 1.2390041649341583} 01/26/2022 21:59:44 - INFO - codeparrot_training - Step 2243: {'lr': 0.0004999683822707364, 'samples': 430848, 'steps': 2243, 'loss/train': 1.6316227912902832} 01/26/2022 21:59:47 - INFO - codeparrot_training - Step 2244: {'lr': 0.0004999681215126049, 'samples': 431040, 'steps': 2244, 'loss/train': 0.7911624312400818} 01/26/2022 21:59:53 - INFO - codeparrot_training - Step 2245: {'lr': 0.0004999678596836894, 'samples': 431232, 'steps': 2245, 'loss/train': 0.8120863437652588} 01/26/2022 21:59:56 - INFO - codeparrot_training - Step 2246: {'lr': 0.000499967596783991, 'samples': 431424, 'steps': 2246, 'loss/train': 0.8041100800037384} 01/26/2022 21:59:59 - INFO - codeparrot_training - Step 2247: {'lr': 0.0004999673328135107, 'samples': 431616, 'steps': 2247, 'loss/train': 1.056616097688675} 01/26/2022 22:00:03 - INFO - codeparrot_training - Step 2248: {'lr': 0.0004999670677722498, 'samples': 431808, 'steps': 2248, 'loss/train': 1.6552199721336365} 01/26/2022 22:00:06 - INFO - codeparrot_training - Step 2249: {'lr': 0.0004999668016602094, 'samples': 
432000, 'steps': 2249, 'loss/train': 1.2691152691841125} 01/26/2022 22:00:09 - INFO - codeparrot_training - Step 2250: {'lr': 0.0004999665344773905, 'samples': 432192, 'steps': 2250, 'loss/train': 0.965964138507843} 01/26/2022 22:00:12 - INFO - codeparrot_training - Step 2251: {'lr': 0.0004999662662237943, 'samples': 432384, 'steps': 2251, 'loss/train': 0.8920717835426331} 01/26/2022 22:00:15 - INFO - codeparrot_training - Step 2252: {'lr': 0.0004999659968994221, 'samples': 432576, 'steps': 2252, 'loss/train': 1.0815414190292358} 01/26/2022 22:00:19 - INFO - codeparrot_training - Step 2253: {'lr': 0.0004999657265042748, 'samples': 432768, 'steps': 2253, 'loss/train': 0.622749388217926} 01/26/2022 22:00:23 - INFO - codeparrot_training - Step 2254: {'lr': 0.0004999654550383539, 'samples': 432960, 'steps': 2254, 'loss/train': 1.0661737024784088} 01/26/2022 22:00:26 - INFO - codeparrot_training - Step 2255: {'lr': 0.0004999651825016603, 'samples': 433152, 'steps': 2255, 'loss/train': 0.39630840718746185} 01/26/2022 22:00:29 - INFO - codeparrot_training - Step 2256: {'lr': 0.0004999649088941951, 'samples': 433344, 'steps': 2256, 'loss/train': 0.8672060966491699} 01/26/2022 22:00:32 - INFO - codeparrot_training - Step 2257: {'lr': 0.0004999646342159597, 'samples': 433536, 'steps': 2257, 'loss/train': 0.7789485454559326} 01/26/2022 22:00:35 - INFO - codeparrot_training - Step 2258: {'lr': 0.0004999643584669552, 'samples': 433728, 'steps': 2258, 'loss/train': 0.9987546801567078} 01/26/2022 22:00:38 - INFO - codeparrot_training - Step 2259: {'lr': 0.0004999640816471827, 'samples': 433920, 'steps': 2259, 'loss/train': 0.6824109256267548} 01/26/2022 22:00:42 - INFO - codeparrot_training - Step 2260: {'lr': 0.0004999638037566436, 'samples': 434112, 'steps': 2260, 'loss/train': 1.2807804644107819} 01/26/2022 22:00:45 - INFO - codeparrot_training - Step 2261: {'lr': 0.0004999635247953387, 'samples': 434304, 'steps': 2261, 'loss/train': 0.9396954774856567} 01/26/2022 22:00:49 - INFO - codeparrot_training - Step 2262: {'lr': 0.0004999632447632696, 'samples': 434496, 'steps': 2262, 'loss/train': 0.847042590379715} 01/26/2022 22:00:53 - INFO - codeparrot_training - Step 2263: {'lr': 0.0004999629636604372, 'samples': 434688, 'steps': 2263, 'loss/train': 1.0100138783454895} 01/26/2022 22:00:56 - INFO - codeparrot_training - Step 2264: {'lr': 0.0004999626814868429, 'samples': 434880, 'steps': 2264, 'loss/train': 0.8650005161762238} 01/26/2022 22:00:59 - INFO - codeparrot_training - Step 2265: {'lr': 0.0004999623982424879, 'samples': 435072, 'steps': 2265, 'loss/train': 1.617754876613617} 01/26/2022 22:01:02 - INFO - codeparrot_training - Step 2266: {'lr': 0.0004999621139273733, 'samples': 435264, 'steps': 2266, 'loss/train': 0.372613325715065} 01/26/2022 22:01:05 - INFO - codeparrot_training - Step 2267: {'lr': 0.0004999618285415004, 'samples': 435456, 'steps': 2267, 'loss/train': 0.7545748651027679} 01/26/2022 22:01:08 - INFO - codeparrot_training - Step 2268: {'lr': 0.0004999615420848704, 'samples': 435648, 'steps': 2268, 'loss/train': 0.4284180700778961} 01/26/2022 22:01:11 - INFO - codeparrot_training - Step 2269: {'lr': 0.0004999612545574845, 'samples': 435840, 'steps': 2269, 'loss/train': 0.1949484944343567} 01/26/2022 22:01:15 - INFO - codeparrot_training - Step 2270: {'lr': 0.000499960965959344, 'samples': 436032, 'steps': 2270, 'loss/train': 0.7842688858509064} 01/26/2022 22:01:21 - INFO - codeparrot_training - Step 2271: {'lr': 0.0004999606762904501, 'samples': 436224, 'steps': 2271, 'loss/train': 
0.9906761348247528} 01/26/2022 22:01:24 - INFO - codeparrot_training - Step 2272: {'lr': 0.000499960385550804, 'samples': 436416, 'steps': 2272, 'loss/train': 0.8732045888900757} 01/26/2022 22:01:27 - INFO - codeparrot_training - Step 2273: {'lr': 0.000499960093740407, 'samples': 436608, 'steps': 2273, 'loss/train': 0.7350258529186249} 01/26/2022 22:01:30 - INFO - codeparrot_training - Step 2274: {'lr': 0.0004999598008592603, 'samples': 436800, 'steps': 2274, 'loss/train': 1.1863957643508911} 01/26/2022 22:01:33 - INFO - codeparrot_training - Step 2275: {'lr': 0.0004999595069073653, 'samples': 436992, 'steps': 2275, 'loss/train': 0.4830196648836136} 01/26/2022 22:01:36 - INFO - codeparrot_training - Step 2276: {'lr': 0.0004999592118847229, 'samples': 437184, 'steps': 2276, 'loss/train': 0.6521430909633636} 01/26/2022 22:01:40 - INFO - codeparrot_training - Step 2277: {'lr': 0.0004999589157913348, 'samples': 437376, 'steps': 2277, 'loss/train': 0.9573456645011902} 01/26/2022 22:01:43 - INFO - codeparrot_training - Step 2278: {'lr': 0.0004999586186272021, 'samples': 437568, 'steps': 2278, 'loss/train': 0.6996997743844986} 01/26/2022 22:01:46 - INFO - codeparrot_training - Step 2279: {'lr': 0.000499958320392326, 'samples': 437760, 'steps': 2279, 'loss/train': 0.7678402662277222} 01/26/2022 22:01:50 - INFO - codeparrot_training - Step 2280: {'lr': 0.0004999580210867077, 'samples': 437952, 'steps': 2280, 'loss/train': 1.1326868534088135} 01/26/2022 22:01:53 - INFO - codeparrot_training - Step 2281: {'lr': 0.0004999577207103487, 'samples': 438144, 'steps': 2281, 'loss/train': 0.8403244614601135} 01/26/2022 22:01:57 - INFO - codeparrot_training - Step 2282: {'lr': 0.0004999574192632502, 'samples': 438336, 'steps': 2282, 'loss/train': 0.5290795862674713} 01/26/2022 22:02:00 - INFO - codeparrot_training - Step 2283: {'lr': 0.0004999571167454135, 'samples': 438528, 'steps': 2283, 'loss/train': 1.1055583655834198} 01/26/2022 22:02:03 - INFO - codeparrot_training - Step 2284: {'lr': 0.0004999568131568399, 'samples': 438720, 'steps': 2284, 'loss/train': 1.6332373023033142} 01/26/2022 22:02:06 - INFO - codeparrot_training - Step 2285: {'lr': 0.0004999565084975306, 'samples': 438912, 'steps': 2285, 'loss/train': 0.765064537525177} 01/26/2022 22:02:09 - INFO - codeparrot_training - Step 2286: {'lr': 0.0004999562027674871, 'samples': 439104, 'steps': 2286, 'loss/train': 0.9157280623912811} 01/26/2022 22:02:12 - INFO - codeparrot_training - Step 2287: {'lr': 0.0004999558959667105, 'samples': 439296, 'steps': 2287, 'loss/train': 0.9992413222789764} 01/26/2022 22:02:19 - INFO - codeparrot_training - Step 2288: {'lr': 0.0004999555880952023, 'samples': 439488, 'steps': 2288, 'loss/train': 1.1248255670070648} 01/26/2022 22:02:22 - INFO - codeparrot_training - Step 2289: {'lr': 0.0004999552791529637, 'samples': 439680, 'steps': 2289, 'loss/train': 0.9472548365592957} 01/26/2022 22:02:25 - INFO - codeparrot_training - Step 2290: {'lr': 0.000499954969139996, 'samples': 439872, 'steps': 2290, 'loss/train': 1.1027405261993408} 01/26/2022 22:02:28 - INFO - codeparrot_training - Step 2291: {'lr': 0.0004999546580563006, 'samples': 440064, 'steps': 2291, 'loss/train': 0.8699802160263062} 01/26/2022 22:02:31 - INFO - codeparrot_training - Step 2292: {'lr': 0.0004999543459018788, 'samples': 440256, 'steps': 2292, 'loss/train': 0.7391451448202133} 01/26/2022 22:02:34 - INFO - codeparrot_training - Step 2293: {'lr': 0.000499954032676732, 'samples': 440448, 'steps': 2293, 'loss/train': 0.9644017517566681} 01/26/2022 22:02:38 
- INFO - codeparrot_training - Step 2294: {'lr': 0.0004999537183808614, 'samples': 440640, 'steps': 2294, 'loss/train': 1.4602245390415192} 01/26/2022 22:02:41 - INFO - codeparrot_training - Step 2295: {'lr': 0.0004999534030142686, 'samples': 440832, 'steps': 2295, 'loss/train': 0.8340847492218018} 01/26/2022 22:02:44 - INFO - codeparrot_training - Step 2296: {'lr': 0.0004999530865769547, 'samples': 441024, 'steps': 2296, 'loss/train': 0.8711365163326263} 01/26/2022 22:02:49 - INFO - codeparrot_training - Step 2297: {'lr': 0.0004999527690689212, 'samples': 441216, 'steps': 2297, 'loss/train': 1.0435057282447815} 01/26/2022 22:02:53 - INFO - codeparrot_training - Step 2298: {'lr': 0.0004999524504901694, 'samples': 441408, 'steps': 2298, 'loss/train': 0.574559211730957} 01/26/2022 22:02:56 - INFO - codeparrot_training - Step 2299: {'lr': 0.0004999521308407006, 'samples': 441600, 'steps': 2299, 'loss/train': 0.7513264417648315} 01/26/2022 22:02:59 - INFO - codeparrot_training - Step 2300: {'lr': 0.0004999518101205162, 'samples': 441792, 'steps': 2300, 'loss/train': 1.1966418027877808} 01/26/2022 22:03:02 - INFO - codeparrot_training - Step 2301: {'lr': 0.0004999514883296176, 'samples': 441984, 'steps': 2301, 'loss/train': 1.759121596813202} 01/26/2022 22:03:05 - INFO - codeparrot_training - Step 2302: {'lr': 0.0004999511654680064, 'samples': 442176, 'steps': 2302, 'loss/train': 1.7775678634643555} 01/26/2022 22:03:08 - INFO - codeparrot_training - Step 2303: {'lr': 0.0004999508415356836, 'samples': 442368, 'steps': 2303, 'loss/train': 1.261641025543213} 01/26/2022 22:03:11 - INFO - codeparrot_training - Step 2304: {'lr': 0.0004999505165326509, 'samples': 442560, 'steps': 2304, 'loss/train': 0.6929381042718887} 01/26/2022 22:03:15 - INFO - codeparrot_training - Step 2305: {'lr': 0.0004999501904589095, 'samples': 442752, 'steps': 2305, 'loss/train': 0.6010587215423584} 01/26/2022 22:03:18 - INFO - codeparrot_training - Step 2306: {'lr': 0.0004999498633144608, 'samples': 442944, 'steps': 2306, 'loss/train': 6.815897226333618} 01/26/2022 22:03:22 - INFO - codeparrot_training - Step 2307: {'lr': 0.0004999495350993062, 'samples': 443136, 'steps': 2307, 'loss/train': 1.4664837419986725} 01/26/2022 22:03:25 - INFO - codeparrot_training - Step 2308: {'lr': 0.0004999492058134473, 'samples': 443328, 'steps': 2308, 'loss/train': 1.4236234724521637} 01/26/2022 22:03:28 - INFO - codeparrot_training - Step 2309: {'lr': 0.0004999488754568853, 'samples': 443520, 'steps': 2309, 'loss/train': 1.5943763852119446} 01/26/2022 22:03:31 - INFO - codeparrot_training - Step 2310: {'lr': 0.0004999485440296216, 'samples': 443712, 'steps': 2310, 'loss/train': 1.1796399056911469} 01/26/2022 22:03:35 - INFO - codeparrot_training - Step 2311: {'lr': 0.0004999482115316579, 'samples': 443904, 'steps': 2311, 'loss/train': 1.0818094611167908} 01/26/2022 22:03:38 - INFO - codeparrot_training - Step 2312: {'lr': 0.0004999478779629953, 'samples': 444096, 'steps': 2312, 'loss/train': 0.7817964255809784} 01/26/2022 22:03:41 - INFO - codeparrot_training - Step 2313: {'lr': 0.0004999475433236354, 'samples': 444288, 'steps': 2313, 'loss/train': 1.2095704972743988} 01/26/2022 22:03:44 - INFO - codeparrot_training - Step 2314: {'lr': 0.0004999472076135796, 'samples': 444480, 'steps': 2314, 'loss/train': 1.0692829191684723} 01/26/2022 22:03:47 - INFO - codeparrot_training - Step 2315: {'lr': 0.0004999468708328293, 'samples': 444672, 'steps': 2315, 'loss/train': 0.9813380241394043} 01/26/2022 22:03:53 - INFO - codeparrot_training - Step 
2316: {'lr': 0.0004999465329813859, 'samples': 444864, 'steps': 2316, 'loss/train': 0.6905350685119629} 01/26/2022 22:03:56 - INFO - codeparrot_training - Step 2317: {'lr': 0.000499946194059251, 'samples': 445056, 'steps': 2317, 'loss/train': 1.017122358083725} 01/26/2022 22:03:59 - INFO - codeparrot_training - Step 2318: {'lr': 0.000499945854066426, 'samples': 445248, 'steps': 2318, 'loss/train': 0.9423352181911469} 01/26/2022 22:04:03 - INFO - codeparrot_training - Step 2319: {'lr': 0.0004999455130029123, 'samples': 445440, 'steps': 2319, 'loss/train': 1.112172693014145} 01/26/2022 22:04:06 - INFO - codeparrot_training - Step 2320: {'lr': 0.0004999451708687113, 'samples': 445632, 'steps': 2320, 'loss/train': 0.7915323972702026} 01/26/2022 22:04:09 - INFO - codeparrot_training - Step 2321: {'lr': 0.0004999448276638247, 'samples': 445824, 'steps': 2321, 'loss/train': 0.609552651643753} 01/26/2022 22:04:12 - INFO - codeparrot_training - Step 2322: {'lr': 0.0004999444833882538, 'samples': 446016, 'steps': 2322, 'loss/train': 1.286918967962265} 01/26/2022 22:04:15 - INFO - codeparrot_training - Step 2323: {'lr': 0.000499944138042, 'samples': 446208, 'steps': 2323, 'loss/train': 1.3506639897823334} 01/26/2022 22:04:20 - INFO - codeparrot_training - Step 2324: {'lr': 0.000499943791625065, 'samples': 446400, 'steps': 2324, 'loss/train': 1.190936028957367} 01/26/2022 22:04:23 - INFO - codeparrot_training - Step 2325: {'lr': 0.0004999434441374501, 'samples': 446592, 'steps': 2325, 'loss/train': 0.7326725274324417} 01/26/2022 22:04:26 - INFO - codeparrot_training - Step 2326: {'lr': 0.0004999430955791569, 'samples': 446784, 'steps': 2326, 'loss/train': 0.5266271531581879} 01/26/2022 22:04:29 - INFO - codeparrot_training - Step 2327: {'lr': 0.0004999427459501868, 'samples': 446976, 'steps': 2327, 'loss/train': 0.9655237197875977} 01/26/2022 22:04:32 - INFO - codeparrot_training - Step 2328: {'lr': 0.0004999423952505414, 'samples': 447168, 'steps': 2328, 'loss/train': 1.104047566652298} 01/26/2022 22:04:36 - INFO - codeparrot_training - Step 2329: {'lr': 0.000499942043480222, 'samples': 447360, 'steps': 2329, 'loss/train': 1.0776964724063873} 01/26/2022 22:04:39 - INFO - codeparrot_training - Step 2330: {'lr': 0.0004999416906392303, 'samples': 447552, 'steps': 2330, 'loss/train': 1.113428682088852} 01/26/2022 22:04:42 - INFO - codeparrot_training - Step 2331: {'lr': 0.0004999413367275678, 'samples': 447744, 'steps': 2331, 'loss/train': 1.1981463432312012} 01/26/2022 22:04:45 - INFO - codeparrot_training - Step 2332: {'lr': 0.000499940981745236, 'samples': 447936, 'steps': 2332, 'loss/train': 1.0298221707344055} 01/26/2022 22:04:48 - INFO - codeparrot_training - Step 2333: {'lr': 0.0004999406256922365, 'samples': 448128, 'steps': 2333, 'loss/train': 0.7603691518306732} 01/26/2022 22:04:53 - INFO - codeparrot_training - Step 2334: {'lr': 0.0004999402685685705, 'samples': 448320, 'steps': 2334, 'loss/train': 1.071915864944458} 01/26/2022 22:04:56 - INFO - codeparrot_training - Step 2335: {'lr': 0.0004999399103742399, 'samples': 448512, 'steps': 2335, 'loss/train': 0.8818162679672241} 01/26/2022 22:04:59 - INFO - codeparrot_training - Step 2336: {'lr': 0.000499939551109246, 'samples': 448704, 'steps': 2336, 'loss/train': 1.1356034874916077} 01/26/2022 22:05:02 - INFO - codeparrot_training - Step 2337: {'lr': 0.0004999391907735905, 'samples': 448896, 'steps': 2337, 'loss/train': 0.747326597571373} 01/26/2022 22:05:05 - INFO - codeparrot_training - Step 2338: {'lr': 0.0004999388293672748, 'samples': 
449088, 'steps': 2338, 'loss/train': 0.6027696132659912} 01/26/2022 22:05:08 - INFO - codeparrot_training - Step 2339: {'lr': 0.0004999384668903006, 'samples': 449280, 'steps': 2339, 'loss/train': 1.2318427562713623} 01/26/2022 22:05:12 - INFO - codeparrot_training - Step 2340: {'lr': 0.0004999381033426693, 'samples': 449472, 'steps': 2340, 'loss/train': 1.1744778156280518} 01/26/2022 22:05:15 - INFO - codeparrot_training - Step 2341: {'lr': 0.0004999377387243827, 'samples': 449664, 'steps': 2341, 'loss/train': 0.513311505317688} 01/26/2022 22:05:21 - INFO - codeparrot_training - Step 2342: {'lr': 0.0004999373730354419, 'samples': 449856, 'steps': 2342, 'loss/train': 1.3369641602039337} 01/26/2022 22:05:24 - INFO - codeparrot_training - Step 2343: {'lr': 0.0004999370062758491, 'samples': 450048, 'steps': 2343, 'loss/train': 0.9288293123245239} 01/26/2022 22:05:27 - INFO - codeparrot_training - Step 2344: {'lr': 0.0004999366384456052, 'samples': 450240, 'steps': 2344, 'loss/train': 1.2162843346595764} 01/26/2022 22:05:30 - INFO - codeparrot_training - Step 2345: {'lr': 0.0004999362695447123, 'samples': 450432, 'steps': 2345, 'loss/train': 0.2624422609806061} 01/26/2022 22:05:33 - INFO - codeparrot_training - Step 2346: {'lr': 0.0004999358995731718, 'samples': 450624, 'steps': 2346, 'loss/train': 0.7912989556789398} 01/26/2022 22:05:37 - INFO - codeparrot_training - Step 2347: {'lr': 0.0004999355285309851, 'samples': 450816, 'steps': 2347, 'loss/train': 1.431153416633606} 01/26/2022 22:05:40 - INFO - codeparrot_training - Step 2348: {'lr': 0.0004999351564181541, 'samples': 451008, 'steps': 2348, 'loss/train': 1.153271108865738} 01/26/2022 22:05:43 - INFO - codeparrot_training - Step 2349: {'lr': 0.0004999347832346802, 'samples': 451200, 'steps': 2349, 'loss/train': 1.1508750915527344} 01/26/2022 22:05:46 - INFO - codeparrot_training - Step 2350: {'lr': 0.0004999344089805651, 'samples': 451392, 'steps': 2350, 'loss/train': 1.1868318915367126} 01/26/2022 22:05:50 - INFO - codeparrot_training - Step 2351: {'lr': 0.0004999340336558104, 'samples': 451584, 'steps': 2351, 'loss/train': 1.4442530572414398} 01/26/2022 22:05:54 - INFO - codeparrot_training - Step 2352: {'lr': 0.0004999336572604175, 'samples': 451776, 'steps': 2352, 'loss/train': 0.7693294286727905} 01/26/2022 22:05:57 - INFO - codeparrot_training - Step 2353: {'lr': 0.0004999332797943883, 'samples': 451968, 'steps': 2353, 'loss/train': 1.0882196724414825} 01/26/2022 22:06:00 - INFO - codeparrot_training - Step 2354: {'lr': 0.0004999329012577243, 'samples': 452160, 'steps': 2354, 'loss/train': 0.7885296642780304} 01/26/2022 22:06:03 - INFO - codeparrot_training - Step 2355: {'lr': 0.000499932521650427, 'samples': 452352, 'steps': 2355, 'loss/train': 0.5981163829565048} 01/26/2022 22:06:06 - INFO - codeparrot_training - Step 2356: {'lr': 0.0004999321409724982, 'samples': 452544, 'steps': 2356, 'loss/train': 1.2604880332946777} 01/26/2022 22:06:09 - INFO - codeparrot_training - Step 2357: {'lr': 0.0004999317592239395, 'samples': 452736, 'steps': 2357, 'loss/train': 0.8564072549343109} 01/26/2022 22:06:13 - INFO - codeparrot_training - Step 2358: {'lr': 0.0004999313764047525, 'samples': 452928, 'steps': 2358, 'loss/train': 0.8870100975036621} 01/26/2022 22:06:16 - INFO - codeparrot_training - Step 2359: {'lr': 0.0004999309925149388, 'samples': 453120, 'steps': 2359, 'loss/train': 1.5852343440055847} 01/26/2022 22:06:20 - INFO - codeparrot_training - Step 2360: {'lr': 0.0004999306075545002, 'samples': 453312, 'steps': 2360, 'loss/train': 
0.5727228373289108} 01/26/2022 22:06:23 - INFO - codeparrot_training - Step 2361: {'lr': 0.0004999302215234381, 'samples': 453504, 'steps': 2361, 'loss/train': 1.4677555561065674} 01/26/2022 22:06:27 - INFO - codeparrot_training - Step 2362: {'lr': 0.0004999298344217543, 'samples': 453696, 'steps': 2362, 'loss/train': 0.7242024689912796} 01/26/2022 22:06:30 - INFO - codeparrot_training - Step 2363: {'lr': 0.0004999294462494506, 'samples': 453888, 'steps': 2363, 'loss/train': 1.4942905604839325} 01/26/2022 22:06:33 - INFO - codeparrot_training - Step 2364: {'lr': 0.0004999290570065284, 'samples': 454080, 'steps': 2364, 'loss/train': 1.2041187286376953} 01/26/2022 22:06:36 - INFO - codeparrot_training - Step 2365: {'lr': 0.0004999286666929895, 'samples': 454272, 'steps': 2365, 'loss/train': 0.716061607003212} 01/26/2022 22:06:39 - INFO - codeparrot_training - Step 2366: {'lr': 0.0004999282753088356, 'samples': 454464, 'steps': 2366, 'loss/train': 0.9831347465515137} 01/26/2022 22:06:42 - INFO - codeparrot_training - Step 2367: {'lr': 0.0004999278828540682, 'samples': 454656, 'steps': 2367, 'loss/train': 0.6707087606191635} 01/26/2022 22:06:49 - INFO - codeparrot_training - Step 2368: {'lr': 0.0004999274893286893, 'samples': 454848, 'steps': 2368, 'loss/train': 1.0053615868091583} 01/26/2022 22:06:52 - INFO - codeparrot_training - Step 2369: {'lr': 0.0004999270947327003, 'samples': 455040, 'steps': 2369, 'loss/train': 0.976643443107605} 01/26/2022 22:06:55 - INFO - codeparrot_training - Step 2370: {'lr': 0.0004999266990661029, 'samples': 455232, 'steps': 2370, 'loss/train': 1.0536576211452484} 01/26/2022 22:06:58 - INFO - codeparrot_training - Step 2371: {'lr': 0.0004999263023288989, 'samples': 455424, 'steps': 2371, 'loss/train': 0.7989112436771393} 01/26/2022 22:07:02 - INFO - codeparrot_training - Step 2372: {'lr': 0.0004999259045210901, 'samples': 455616, 'steps': 2372, 'loss/train': 0.6341119408607483} 01/26/2022 22:07:05 - INFO - codeparrot_training - Step 2373: {'lr': 0.000499925505642678, 'samples': 455808, 'steps': 2373, 'loss/train': 0.8889137506484985} 01/26/2022 22:07:08 - INFO - codeparrot_training - Step 2374: {'lr': 0.0004999251056936645, 'samples': 456000, 'steps': 2374, 'loss/train': 0.9699170887470245} 01/26/2022 22:07:11 - INFO - codeparrot_training - Step 2375: {'lr': 0.000499924704674051, 'samples': 456192, 'steps': 2375, 'loss/train': 1.4518083035945892} 01/26/2022 22:07:14 - INFO - codeparrot_training - Step 2376: {'lr': 0.0004999243025838396, 'samples': 456384, 'steps': 2376, 'loss/train': 0.8003997802734375} 01/26/2022 22:07:19 - INFO - codeparrot_training - Step 2377: {'lr': 0.0004999238994230318, 'samples': 456576, 'steps': 2377, 'loss/train': 1.1449685096740723} 01/26/2022 22:07:22 - INFO - codeparrot_training - Step 2378: {'lr': 0.0004999234951916293, 'samples': 456768, 'steps': 2378, 'loss/train': 1.076916128396988} 01/26/2022 22:07:25 - INFO - codeparrot_training - Step 2379: {'lr': 0.0004999230898896341, 'samples': 456960, 'steps': 2379, 'loss/train': 1.0367165207862854} 01/26/2022 22:07:28 - INFO - codeparrot_training - Step 2380: {'lr': 0.0004999226835170476, 'samples': 457152, 'steps': 2380, 'loss/train': 1.2215034663677216} 01/26/2022 22:07:31 - INFO - codeparrot_training - Step 2381: {'lr': 0.0004999222760738717, 'samples': 457344, 'steps': 2381, 'loss/train': 1.1021158397197723} 01/26/2022 22:07:34 - INFO - codeparrot_training - Step 2382: {'lr': 0.0004999218675601081, 'samples': 457536, 'steps': 2382, 'loss/train': 0.9500536322593689} 01/26/2022 22:07:37 
- INFO - codeparrot_training - Step 2383: {'lr': 0.0004999214579757586, 'samples': 457728, 'steps': 2383, 'loss/train': 0.6538669466972351} 01/26/2022 22:07:40 - INFO - codeparrot_training - Step 2384: {'lr': 0.000499921047320825, 'samples': 457920, 'steps': 2384, 'loss/train': 0.8166100680828094} 01/26/2022 22:07:44 - INFO - codeparrot_training - Step 2385: {'lr': 0.000499920635595309, 'samples': 458112, 'steps': 2385, 'loss/train': 0.4824526011943817} 01/26/2022 22:07:48 - INFO - codeparrot_training - Step 2386: {'lr': 0.0004999202227992122, 'samples': 458304, 'steps': 2386, 'loss/train': 0.5278577506542206} 01/26/2022 22:07:51 - INFO - codeparrot_training - Step 2387: {'lr': 0.0004999198089325367, 'samples': 458496, 'steps': 2387, 'loss/train': 1.1520011723041534} 01/26/2022 22:07:54 - INFO - codeparrot_training - Step 2388: {'lr': 0.0004999193939952839, 'samples': 458688, 'steps': 2388, 'loss/train': 0.4489096999168396} 01/26/2022 22:07:57 - INFO - codeparrot_training - Step 2389: {'lr': 0.000499918977987456, 'samples': 458880, 'steps': 2389, 'loss/train': 0.6412830054759979} 01/26/2022 22:08:01 - INFO - codeparrot_training - Step 2390: {'lr': 0.0004999185609090544, 'samples': 459072, 'steps': 2390, 'loss/train': 0.8925353586673737} 01/26/2022 22:08:04 - INFO - codeparrot_training - Step 2391: {'lr': 0.0004999181427600811, 'samples': 459264, 'steps': 2391, 'loss/train': 0.7785822451114655} 01/26/2022 22:08:07 - INFO - codeparrot_training - Step 2392: {'lr': 0.0004999177235405378, 'samples': 459456, 'steps': 2392, 'loss/train': 0.9079448282718658} 01/26/2022 22:08:10 - INFO - codeparrot_training - Step 2393: {'lr': 0.0004999173032504264, 'samples': 459648, 'steps': 2393, 'loss/train': 1.1047098934650421} 01/26/2022 22:08:13 - INFO - codeparrot_training - Step 2394: {'lr': 0.0004999168818897486, 'samples': 459840, 'steps': 2394, 'loss/train': 1.014288067817688} 01/26/2022 22:08:19 - INFO - codeparrot_training - Step 2395: {'lr': 0.0004999164594585062, 'samples': 460032, 'steps': 2395, 'loss/train': 0.8196901381015778} 01/26/2022 22:08:22 - INFO - codeparrot_training - Step 2396: {'lr': 0.0004999160359567011, 'samples': 460224, 'steps': 2396, 'loss/train': 0.6962648034095764} 01/26/2022 22:08:26 - INFO - codeparrot_training - Step 2397: {'lr': 0.000499915611384335, 'samples': 460416, 'steps': 2397, 'loss/train': 0.9502690136432648} 01/26/2022 22:08:29 - INFO - codeparrot_training - Step 2398: {'lr': 0.0004999151857414099, 'samples': 460608, 'steps': 2398, 'loss/train': 1.260164737701416} 01/26/2022 22:08:32 - INFO - codeparrot_training - Step 2399: {'lr': 0.0004999147590279273, 'samples': 460800, 'steps': 2399, 'loss/train': 0.8665354549884796} 01/26/2022 22:08:35 - INFO - codeparrot_training - Step 2400: {'lr': 0.0004999143312438893, 'samples': 460992, 'steps': 2400, 'loss/train': 0.9794042408466339} 01/26/2022 22:08:38 - INFO - codeparrot_training - Step 2401: {'lr': 0.0004999139023892978, 'samples': 461184, 'steps': 2401, 'loss/train': 0.9445435702800751} 01/26/2022 22:08:41 - INFO - codeparrot_training - Step 2402: {'lr': 0.0004999134724641543, 'samples': 461376, 'steps': 2402, 'loss/train': 1.2808993756771088} 01/26/2022 22:08:46 - INFO - codeparrot_training - Step 2403: {'lr': 0.000499913041468461, 'samples': 461568, 'steps': 2403, 'loss/train': 0.9198321998119354} 01/26/2022 22:08:49 - INFO - codeparrot_training - Step 2404: {'lr': 0.0004999126094022195, 'samples': 461760, 'steps': 2404, 'loss/train': 1.0891768634319305} 01/26/2022 22:08:52 - INFO - codeparrot_training - Step 2405: 
{'lr': 0.0004999121762654318, 'samples': 461952, 'steps': 2405, 'loss/train': 0.3072686567902565} 01/26/2022 22:08:55 - INFO - codeparrot_training - Step 2406: {'lr': 0.0004999117420580996, 'samples': 462144, 'steps': 2406, 'loss/train': 0.7494112998247147} 01/26/2022 22:08:58 - INFO - codeparrot_training - Step 2407: {'lr': 0.0004999113067802249, 'samples': 462336, 'steps': 2407, 'loss/train': 0.8119820058345795} 01/26/2022 22:09:02 - INFO - codeparrot_training - Step 2408: {'lr': 0.0004999108704318095, 'samples': 462528, 'steps': 2408, 'loss/train': 0.5340041220188141} 01/26/2022 22:09:05 - INFO - codeparrot_training - Step 2409: {'lr': 0.0004999104330128553, 'samples': 462720, 'steps': 2409, 'loss/train': 0.7074038833379745} 01/26/2022 22:09:08 - INFO - codeparrot_training - Step 2410: {'lr': 0.0004999099945233641, 'samples': 462912, 'steps': 2410, 'loss/train': 1.1774276196956635} 01/26/2022 22:09:11 - INFO - codeparrot_training - Step 2411: {'lr': 0.000499909554963338, 'samples': 463104, 'steps': 2411, 'loss/train': 0.9650202691555023} 01/26/2022 22:09:17 - INFO - codeparrot_training - Step 2412: {'lr': 0.0004999091143327786, 'samples': 463296, 'steps': 2412, 'loss/train': 0.8228668570518494} 01/26/2022 22:09:20 - INFO - codeparrot_training - Step 2413: {'lr': 0.000499908672631688, 'samples': 463488, 'steps': 2413, 'loss/train': 0.83864825963974} 01/26/2022 22:09:23 - INFO - codeparrot_training - Step 2414: {'lr': 0.0004999082298600679, 'samples': 463680, 'steps': 2414, 'loss/train': 0.5115313231945038} 01/26/2022 22:09:26 - INFO - codeparrot_training - Step 2415: {'lr': 0.0004999077860179204, 'samples': 463872, 'steps': 2415, 'loss/train': 0.8694373369216919} 01/26/2022 22:09:29 - INFO - codeparrot_training - Step 2416: {'lr': 0.0004999073411052472, 'samples': 464064, 'steps': 2416, 'loss/train': 1.035866528749466} 01/26/2022 22:09:32 - INFO - codeparrot_training - Step 2417: {'lr': 0.0004999068951220503, 'samples': 464256, 'steps': 2417, 'loss/train': 1.163872629404068} 01/26/2022 22:09:36 - INFO - codeparrot_training - Step 2418: {'lr': 0.0004999064480683317, 'samples': 464448, 'steps': 2418, 'loss/train': 0.8919706642627716} 01/26/2022 22:09:39 - INFO - codeparrot_training - Step 2419: {'lr': 0.0004999059999440932, 'samples': 464640, 'steps': 2419, 'loss/train': 0.7717393040657043} 01/26/2022 22:09:42 - INFO - codeparrot_training - Step 2420: {'lr': 0.0004999055507493368, 'samples': 464832, 'steps': 2420, 'loss/train': 1.239013820886612} 01/26/2022 22:09:46 - INFO - codeparrot_training - Step 2421: {'lr': 0.0004999051004840642, 'samples': 465024, 'steps': 2421, 'loss/train': 0.4346838891506195} 01/26/2022 22:09:49 - INFO - codeparrot_training - Step 2422: {'lr': 0.0004999046491482777, 'samples': 465216, 'steps': 2422, 'loss/train': 0.7200586795806885} 01/26/2022 22:09:52 - INFO - codeparrot_training - Step 2423: {'lr': 0.000499904196741979, 'samples': 465408, 'steps': 2423, 'loss/train': 0.9794321358203888} 01/26/2022 22:09:56 - INFO - codeparrot_training - Step 2424: {'lr': 0.00049990374326517, 'samples': 465600, 'steps': 2424, 'loss/train': 1.2582945227622986} 01/26/2022 22:09:59 - INFO - codeparrot_training - Step 2425: {'lr': 0.0004999032887178527, 'samples': 465792, 'steps': 2425, 'loss/train': 0.6760444790124893} 01/26/2022 22:10:02 - INFO - codeparrot_training - Step 2426: {'lr': 0.000499902833100029, 'samples': 465984, 'steps': 2426, 'loss/train': 0.6432498693466187} 01/26/2022 22:10:05 - INFO - codeparrot_training - Step 2427: {'lr': 0.0004999023764117011, 'samples': 
466176, 'steps': 2427, 'loss/train': 0.9502222537994385} 01/26/2022 22:10:08 - INFO - codeparrot_training - Step 2428: {'lr': 0.0004999019186528708, 'samples': 466368, 'steps': 2428, 'loss/train': 0.978620320558548} 01/26/2022 22:10:11 - INFO - codeparrot_training - Step 2429: {'lr': 0.0004999014598235399, 'samples': 466560, 'steps': 2429, 'loss/train': 1.031268149614334} 01/26/2022 22:10:16 - INFO - codeparrot_training - Step 2430: {'lr': 0.0004999009999237105, 'samples': 466752, 'steps': 2430, 'loss/train': 0.8793404996395111} 01/26/2022 22:10:19 - INFO - codeparrot_training - Step 2431: {'lr': 0.0004999005389533846, 'samples': 466944, 'steps': 2431, 'loss/train': 0.5789824426174164} 01/26/2022 22:10:22 - INFO - codeparrot_training - Step 2432: {'lr': 0.0004999000769125642, 'samples': 467136, 'steps': 2432, 'loss/train': 0.8629926145076752} 01/26/2022 22:10:25 - INFO - codeparrot_training - Step 2433: {'lr': 0.0004998996138012512, 'samples': 467328, 'steps': 2433, 'loss/train': 0.6794450283050537} 01/26/2022 22:10:28 - INFO - codeparrot_training - Step 2434: {'lr': 0.0004998991496194475, 'samples': 467520, 'steps': 2434, 'loss/train': 0.38719770312309265} 01/26/2022 22:10:31 - INFO - codeparrot_training - Step 2435: {'lr': 0.0004998986843671552, 'samples': 467712, 'steps': 2435, 'loss/train': 1.0725580751895905} 01/26/2022 22:10:34 - INFO - codeparrot_training - Step 2436: {'lr': 0.0004998982180443764, 'samples': 467904, 'steps': 2436, 'loss/train': 1.165977269411087} 01/26/2022 22:10:38 - INFO - codeparrot_training - Step 2437: {'lr': 0.000499897750651113, 'samples': 468096, 'steps': 2437, 'loss/train': 0.8312303423881531} 01/26/2022 22:10:41 - INFO - codeparrot_training - Step 2438: {'lr': 0.0004998972821873668, 'samples': 468288, 'steps': 2438, 'loss/train': 1.1453656554222107} 01/26/2022 22:10:47 - INFO - codeparrot_training - Step 2439: {'lr': 0.0004998968126531402, 'samples': 468480, 'steps': 2439, 'loss/train': 0.7888818383216858} 01/26/2022 22:10:50 - INFO - codeparrot_training - Step 2440: {'lr': 0.0004998963420484349, 'samples': 468672, 'steps': 2440, 'loss/train': 0.4735616147518158} 01/26/2022 22:10:53 - INFO - codeparrot_training - Step 2441: {'lr': 0.0004998958703732532, 'samples': 468864, 'steps': 2441, 'loss/train': 1.2174751460552216} 01/26/2022 22:10:56 - INFO - codeparrot_training - Step 2442: {'lr': 0.0004998953976275966, 'samples': 469056, 'steps': 2442, 'loss/train': 0.5274702608585358} 01/26/2022 22:10:59 - INFO - codeparrot_training - Step 2443: {'lr': 0.0004998949238114677, 'samples': 469248, 'steps': 2443, 'loss/train': 0.8861036896705627} 01/26/2022 22:11:03 - INFO - codeparrot_training - Step 2444: {'lr': 0.0004998944489248683, 'samples': 469440, 'steps': 2444, 'loss/train': 0.8051759898662567} 01/26/2022 22:11:06 - INFO - codeparrot_training - Step 2445: {'lr': 0.0004998939729678004, 'samples': 469632, 'steps': 2445, 'loss/train': 0.9955472946166992} 01/26/2022 22:11:09 - INFO - codeparrot_training - Step 2446: {'lr': 0.000499893495940266, 'samples': 469824, 'steps': 2446, 'loss/train': 1.2473201751708984} 01/26/2022 22:11:12 - INFO - codeparrot_training - Step 2447: {'lr': 0.0004998930178422673, 'samples': 470016, 'steps': 2447, 'loss/train': 1.0641159117221832} 01/26/2022 22:11:16 - INFO - codeparrot_training - Step 2448: {'lr': 0.0004998925386738062, 'samples': 470208, 'steps': 2448, 'loss/train': 0.8777492344379425} 01/26/2022 22:11:20 - INFO - codeparrot_training - Step 2449: {'lr': 0.0004998920584348849, 'samples': 470400, 'steps': 2449, 'loss/train': 
1.5744094848632812} 01/26/2022 22:11:23 - INFO - codeparrot_training - Step 2450: {'lr': 0.0004998915771255053, 'samples': 470592, 'steps': 2450, 'loss/train': 0.32181400805711746} 01/26/2022 22:11:26 - INFO - codeparrot_training - Step 2451: {'lr': 0.0004998910947456696, 'samples': 470784, 'steps': 2451, 'loss/train': 0.9428379535675049} 01/26/2022 22:11:29 - INFO - codeparrot_training - Step 2452: {'lr': 0.0004998906112953797, 'samples': 470976, 'steps': 2452, 'loss/train': 1.1739367246627808} 01/26/2022 22:11:32 - INFO - codeparrot_training - Step 2453: {'lr': 0.0004998901267746379, 'samples': 471168, 'steps': 2453, 'loss/train': 0.7443417906761169} 01/26/2022 22:11:35 - INFO - codeparrot_training - Step 2454: {'lr': 0.0004998896411834461, 'samples': 471360, 'steps': 2454, 'loss/train': 0.9662229716777802} 01/26/2022 22:11:38 - INFO - codeparrot_training - Step 2455: {'lr': 0.0004998891545218063, 'samples': 471552, 'steps': 2455, 'loss/train': 0.8978563249111176} 01/26/2022 22:11:45 - INFO - codeparrot_training - Step 2456: {'lr': 0.0004998886667897209, 'samples': 471744, 'steps': 2456, 'loss/train': 1.1758093535900116} 01/26/2022 22:11:48 - INFO - codeparrot_training - Step 2457: {'lr': 0.0004998881779871917, 'samples': 471936, 'steps': 2457, 'loss/train': 1.2049357295036316} 01/26/2022 22:11:51 - INFO - codeparrot_training - Step 2458: {'lr': 0.0004998876881142208, 'samples': 472128, 'steps': 2458, 'loss/train': 0.8738523423671722} 01/26/2022 22:11:54 - INFO - codeparrot_training - Step 2459: {'lr': 0.0004998871971708106, 'samples': 472320, 'steps': 2459, 'loss/train': 0.8469813466072083} 01/26/2022 22:11:57 - INFO - codeparrot_training - Step 2460: {'lr': 0.0004998867051569627, 'samples': 472512, 'steps': 2460, 'loss/train': 1.1291041374206543} 01/26/2022 22:12:00 - INFO - codeparrot_training - Step 2461: {'lr': 0.0004998862120726798, 'samples': 472704, 'steps': 2461, 'loss/train': 1.104601263999939} 01/26/2022 22:12:04 - INFO - codeparrot_training - Step 2462: {'lr': 0.0004998857179179636, 'samples': 472896, 'steps': 2462, 'loss/train': 0.9954442083835602} 01/26/2022 22:12:07 - INFO - codeparrot_training - Step 2463: {'lr': 0.0004998852226928164, 'samples': 473088, 'steps': 2463, 'loss/train': 0.10541380941867828} 01/26/2022 22:12:10 - INFO - codeparrot_training - Step 2464: {'lr': 0.0004998847263972401, 'samples': 473280, 'steps': 2464, 'loss/train': 0.9742861390113831} 01/26/2022 22:12:14 - INFO - codeparrot_training - Step 2465: {'lr': 0.0004998842290312371, 'samples': 473472, 'steps': 2465, 'loss/train': 0.8619533479213715} 01/26/2022 22:12:17 - INFO - codeparrot_training - Step 2466: {'lr': 0.0004998837305948094, 'samples': 473664, 'steps': 2466, 'loss/train': 1.0078363716602325} 01/26/2022 22:12:20 - INFO - codeparrot_training - Step 2467: {'lr': 0.0004998832310879591, 'samples': 473856, 'steps': 2467, 'loss/train': 0.7206002622842789} 01/26/2022 22:12:24 - INFO - codeparrot_training - Step 2468: {'lr': 0.0004998827305106884, 'samples': 474048, 'steps': 2468, 'loss/train': 0.9333997964859009} 01/26/2022 22:12:27 - INFO - codeparrot_training - Step 2469: {'lr': 0.0004998822288629995, 'samples': 474240, 'steps': 2469, 'loss/train': 1.0863673388957977} 01/26/2022 22:12:30 - INFO - codeparrot_training - Step 2470: {'lr': 0.0004998817261448943, 'samples': 474432, 'steps': 2470, 'loss/train': 0.43822181224823} 01/26/2022 22:12:33 - INFO - codeparrot_training - Step 2471: {'lr': 0.0004998812223563754, 'samples': 474624, 'steps': 2471, 'loss/train': 0.6487643122673035} 01/26/2022 
22:12:36 - INFO - codeparrot_training - Step 2472: {'lr': 0.0004998807174974445, 'samples': 474816, 'steps': 2472, 'loss/train': 1.004664570093155} 01/26/2022 22:12:39 - INFO - codeparrot_training - Step 2473: {'lr': 0.0004998802115681039, 'samples': 475008, 'steps': 2473, 'loss/train': 1.5099944472312927} 01/26/2022 22:12:44 - INFO - codeparrot_training - Step 2474: {'lr': 0.000499879704568356, 'samples': 475200, 'steps': 2474, 'loss/train': 0.6042963266372681} 01/26/2022 22:12:47 - INFO - codeparrot_training - Step 2475: {'lr': 0.0004998791964982026, 'samples': 475392, 'steps': 2475, 'loss/train': 0.6023204773664474} 01/26/2022 22:12:50 - INFO - codeparrot_training - Step 2476: {'lr': 0.0004998786873576462, 'samples': 475584, 'steps': 2476, 'loss/train': 1.6431105136871338} 01/26/2022 22:12:53 - INFO - codeparrot_training - Step 2477: {'lr': 0.0004998781771466889, 'samples': 475776, 'steps': 2477, 'loss/train': 0.634314626455307} 01/26/2022 22:12:56 - INFO - codeparrot_training - Step 2478: {'lr': 0.0004998776658653327, 'samples': 475968, 'steps': 2478, 'loss/train': 0.9321169853210449} 01/26/2022 22:12:59 - INFO - codeparrot_training - Step 2479: {'lr': 0.00049987715351358, 'samples': 476160, 'steps': 2479, 'loss/train': 0.7996085286140442} 01/26/2022 22:13:03 - INFO - codeparrot_training - Step 2480: {'lr': 0.0004998766400914329, 'samples': 476352, 'steps': 2480, 'loss/train': 0.7158496677875519} 01/26/2022 22:13:06 - INFO - codeparrot_training - Step 2481: {'lr': 0.0004998761255988936, 'samples': 476544, 'steps': 2481, 'loss/train': 0.985247790813446} 01/26/2022 22:13:09 - INFO - codeparrot_training - Step 2482: {'lr': 0.0004998756100359643, 'samples': 476736, 'steps': 2482, 'loss/train': 0.8374587893486023} 01/26/2022 22:13:13 - INFO - codeparrot_training - Step 2483: {'lr': 0.0004998750934026474, 'samples': 476928, 'steps': 2483, 'loss/train': 1.2252114415168762} 01/26/2022 22:13:16 - INFO - codeparrot_training - Step 2484: {'lr': 0.0004998745756989448, 'samples': 477120, 'steps': 2484, 'loss/train': 0.8816327154636383} 01/26/2022 22:13:20 - INFO - codeparrot_training - Step 2485: {'lr': 0.0004998740569248588, 'samples': 477312, 'steps': 2485, 'loss/train': 0.6970432251691818} 01/26/2022 22:13:23 - INFO - codeparrot_training - Step 2486: {'lr': 0.0004998735370803917, 'samples': 477504, 'steps': 2486, 'loss/train': 1.1008096933364868} 01/26/2022 22:13:26 - INFO - codeparrot_training - Step 2487: {'lr': 0.0004998730161655459, 'samples': 477696, 'steps': 2487, 'loss/train': 1.1262676119804382} 01/26/2022 22:13:29 - INFO - codeparrot_training - Step 2488: {'lr': 0.0004998724941803232, 'samples': 477888, 'steps': 2488, 'loss/train': 0.23808791488409042} 01/26/2022 22:13:32 - INFO - codeparrot_training - Step 2489: {'lr': 0.0004998719711247262, 'samples': 478080, 'steps': 2489, 'loss/train': 0.9193916022777557} 01/26/2022 22:13:35 - INFO - codeparrot_training - Step 2490: {'lr': 0.0004998714469987571, 'samples': 478272, 'steps': 2490, 'loss/train': 1.3135632276535034} 01/26/2022 22:13:38 - INFO - codeparrot_training - Step 2491: {'lr': 0.000499870921802418, 'samples': 478464, 'steps': 2491, 'loss/train': 1.2750966846942902} 01/26/2022 22:13:45 - INFO - codeparrot_training - Step 2492: {'lr': 0.0004998703955357111, 'samples': 478656, 'steps': 2492, 'loss/train': 0.8539542853832245} 01/26/2022 22:13:48 - INFO - codeparrot_training - Step 2493: {'lr': 0.0004998698681986389, 'samples': 478848, 'steps': 2493, 'loss/train': 0.6856047213077545} 01/26/2022 22:13:51 - INFO - codeparrot_training - 
Step 2494: {'lr': 0.0004998693397912034, 'samples': 479040, 'steps': 2494, 'loss/train': 0.6763717085123062} 01/26/2022 22:13:54 - INFO - codeparrot_training - Step 2495: {'lr': 0.0004998688103134072, 'samples': 479232, 'steps': 2495, 'loss/train': 1.3162151277065277} 01/26/2022 22:13:57 - INFO - codeparrot_training - Step 2496: {'lr': 0.0004998682797652522, 'samples': 479424, 'steps': 2496, 'loss/train': 0.5659158825874329} 01/26/2022 22:14:00 - INFO - codeparrot_training - Step 2497: {'lr': 0.0004998677481467408, 'samples': 479616, 'steps': 2497, 'loss/train': 0.13923576101660728} 01/26/2022 22:14:04 - INFO - codeparrot_training - Step 2498: {'lr': 0.0004998672154578754, 'samples': 479808, 'steps': 2498, 'loss/train': 0.7427906692028046} 01/26/2022 22:14:07 - INFO - codeparrot_training - Step 2499: {'lr': 0.0004998666816986582, 'samples': 480000, 'steps': 2499, 'loss/train': 1.3431158065795898} 01/26/2022 22:14:10 - INFO - codeparrot_training - Step 2500: {'lr': 0.0004998661468690914, 'samples': 480192, 'steps': 2500, 'loss/train': 0.8359576463699341} 01/26/2022 22:14:14 - INFO - codeparrot_training - Step 2501: {'lr': 0.0004998656109691774, 'samples': 480384, 'steps': 2501, 'loss/train': 0.8113740384578705} 01/26/2022 22:14:17 - INFO - codeparrot_training - Step 2502: {'lr': 0.0004998650739989185, 'samples': 480576, 'steps': 2502, 'loss/train': 0.5125447511672974} 01/26/2022 22:14:20 - INFO - codeparrot_training - Step 2503: {'lr': 0.0004998645359583169, 'samples': 480768, 'steps': 2503, 'loss/train': 0.8431939780712128} 01/26/2022 22:14:24 - INFO - codeparrot_training - Step 2504: {'lr': 0.0004998639968473751, 'samples': 480960, 'steps': 2504, 'loss/train': 1.3737978637218475} 01/26/2022 22:14:27 - INFO - codeparrot_training - Step 2505: {'lr': 0.0004998634566660952, 'samples': 481152, 'steps': 2505, 'loss/train': 0.5326707512140274} 01/26/2022 22:14:30 - INFO - codeparrot_training - Step 2506: {'lr': 0.0004998629154144795, 'samples': 481344, 'steps': 2506, 'loss/train': 0.5578314810991287} 01/26/2022 22:14:33 - INFO - codeparrot_training - Step 2507: {'lr': 0.0004998623730925305, 'samples': 481536, 'steps': 2507, 'loss/train': 0.5641510784626007} 01/26/2022 22:14:36 - INFO - codeparrot_training - Step 2508: {'lr': 0.0004998618297002504, 'samples': 481728, 'steps': 2508, 'loss/train': 0.8165236115455627} 01/26/2022 22:14:39 - INFO - codeparrot_training - Step 2509: {'lr': 0.0004998612852376417, 'samples': 481920, 'steps': 2509, 'loss/train': 1.0531817972660065} 01/26/2022 22:14:44 - INFO - codeparrot_training - Step 2510: {'lr': 0.0004998607397047063, 'samples': 482112, 'steps': 2510, 'loss/train': 1.1464526653289795} 01/26/2022 22:14:47 - INFO - codeparrot_training - Step 2511: {'lr': 0.0004998601931014471, 'samples': 482304, 'steps': 2511, 'loss/train': 0.9474646747112274} 01/26/2022 22:14:50 - INFO - codeparrot_training - Step 2512: {'lr': 0.0004998596454278661, 'samples': 482496, 'steps': 2512, 'loss/train': 1.0706453919410706} 01/26/2022 22:14:53 - INFO - codeparrot_training - Step 2513: {'lr': 0.0004998590966839657, 'samples': 482688, 'steps': 2513, 'loss/train': 0.7668367624282837} 01/26/2022 22:14:56 - INFO - codeparrot_training - Step 2514: {'lr': 0.0004998585468697482, 'samples': 482880, 'steps': 2514, 'loss/train': 1.53949373960495} 01/26/2022 22:14:59 - INFO - codeparrot_training - Step 2515: {'lr': 0.0004998579959852161, 'samples': 483072, 'steps': 2515, 'loss/train': 0.9950288236141205} 01/26/2022 22:15:02 - INFO - codeparrot_training - Step 2516: {'lr': 
0.0004998574440303718, 'samples': 483264, 'steps': 2516, 'loss/train': 0.8041297495365143} 01/26/2022 22:15:06 - INFO - codeparrot_training - Step 2517: {'lr': 0.0004998568910052173, 'samples': 483456, 'steps': 2517, 'loss/train': 2.093239903450012} 01/26/2022 22:15:09 - INFO - codeparrot_training - Step 2518: {'lr': 0.0004998563369097554, 'samples': 483648, 'steps': 2518, 'loss/train': 0.8622440993785858} 01/26/2022 22:15:15 - INFO - codeparrot_training - Step 2519: {'lr': 0.0004998557817439882, 'samples': 483840, 'steps': 2519, 'loss/train': 0.4452754408121109} 01/26/2022 22:15:19 - INFO - codeparrot_training - Step 2520: {'lr': 0.0004998552255079182, 'samples': 484032, 'steps': 2520, 'loss/train': 0.393641397356987} 01/26/2022 22:15:22 - INFO - codeparrot_training - Step 2521: {'lr': 0.0004998546682015478, 'samples': 484224, 'steps': 2521, 'loss/train': 0.8870640993118286} 01/26/2022 22:15:25 - INFO - codeparrot_training - Step 2522: {'lr': 0.0004998541098248793, 'samples': 484416, 'steps': 2522, 'loss/train': 0.6159920543432236} 01/26/2022 22:15:28 - INFO - codeparrot_training - Step 2523: {'lr': 0.0004998535503779151, 'samples': 484608, 'steps': 2523, 'loss/train': 1.0276718437671661} 01/26/2022 22:15:31 - INFO - codeparrot_training - Step 2524: {'lr': 0.0004998529898606576, 'samples': 484800, 'steps': 2524, 'loss/train': 0.8099948465824127} 01/26/2022 22:15:34 - INFO - codeparrot_training - Step 2525: {'lr': 0.0004998524282731093, 'samples': 484992, 'steps': 2525, 'loss/train': 1.2438308000564575} 01/26/2022 22:15:37 - INFO - codeparrot_training - Step 2526: {'lr': 0.0004998518656152725, 'samples': 485184, 'steps': 2526, 'loss/train': 1.1360128819942474} 01/26/2022 22:15:42 - INFO - codeparrot_training - Step 2527: {'lr': 0.0004998513018871498, 'samples': 485376, 'steps': 2527, 'loss/train': 1.0774096548557281} 01/26/2022 22:15:45 - INFO - codeparrot_training - Step 2528: {'lr': 0.0004998507370887433, 'samples': 485568, 'steps': 2528, 'loss/train': 0.984403520822525} 01/26/2022 22:15:48 - INFO - codeparrot_training - Step 2529: {'lr': 0.0004998501712200555, 'samples': 485760, 'steps': 2529, 'loss/train': 1.128627598285675} 01/26/2022 22:15:51 - INFO - codeparrot_training - Step 2530: {'lr': 0.000499849604281089, 'samples': 485952, 'steps': 2530, 'loss/train': 1.1407291889190674} 01/26/2022 22:15:55 - INFO - codeparrot_training - Step 2531: {'lr': 0.0004998490362718462, 'samples': 486144, 'steps': 2531, 'loss/train': 0.8995281457901001} 01/26/2022 22:15:58 - INFO - codeparrot_training - Step 2532: {'lr': 0.0004998484671923293, 'samples': 486336, 'steps': 2532, 'loss/train': 1.158761590719223} 01/26/2022 22:16:01 - INFO - codeparrot_training - Step 2533: {'lr': 0.000499847897042541, 'samples': 486528, 'steps': 2533, 'loss/train': 0.4998553544282913} 01/26/2022 22:16:04 - INFO - codeparrot_training - Step 2534: {'lr': 0.0004998473258224837, 'samples': 486720, 'steps': 2534, 'loss/train': 0.6458668112754822} 01/26/2022 22:16:07 - INFO - codeparrot_training - Step 2535: {'lr': 0.0004998467535321597, 'samples': 486912, 'steps': 2535, 'loss/train': 0.7284467071294785} 01/26/2022 22:16:13 - INFO - codeparrot_training - Step 2536: {'lr': 0.0004998461801715716, 'samples': 487104, 'steps': 2536, 'loss/train': 0.865193098783493} 01/26/2022 22:16:16 - INFO - codeparrot_training - Step 2537: {'lr': 0.0004998456057407218, 'samples': 487296, 'steps': 2537, 'loss/train': 1.0324251651763916} 01/26/2022 22:16:19 - INFO - codeparrot_training - Step 2538: {'lr': 0.0004998450302396127, 'samples': 487488, 
'steps': 2538, 'loss/train': 1.018400251865387} 01/26/2022 22:16:22 - INFO - codeparrot_training - Step 2539: {'lr': 0.0004998444536682469, 'samples': 487680, 'steps': 2539, 'loss/train': 0.6251821517944336} 01/26/2022 22:16:25 - INFO - codeparrot_training - Step 2540: {'lr': 0.0004998438760266267, 'samples': 487872, 'steps': 2540, 'loss/train': 0.8866836726665497} 01/26/2022 22:16:28 - INFO - codeparrot_training - Step 2541: {'lr': 0.0004998432973147548, 'samples': 488064, 'steps': 2541, 'loss/train': 1.094898372888565} 01/26/2022 22:16:32 - INFO - codeparrot_training - Step 2542: {'lr': 0.0004998427175326335, 'samples': 488256, 'steps': 2542, 'loss/train': 0.9140185117721558} 01/26/2022 22:16:35 - INFO - codeparrot_training - Step 2543: {'lr': 0.0004998421366802653, 'samples': 488448, 'steps': 2543, 'loss/train': 1.1809421181678772} 01/26/2022 22:16:38 - INFO - codeparrot_training - Step 2544: {'lr': 0.0004998415547576527, 'samples': 488640, 'steps': 2544, 'loss/train': 1.1158947944641113} 01/26/2022 22:16:42 - INFO - codeparrot_training - Step 2545: {'lr': 0.0004998409717647983, 'samples': 488832, 'steps': 2545, 'loss/train': 1.105543076992035} 01/26/2022 22:16:45 - INFO - codeparrot_training - Step 2546: {'lr': 0.0004998403877017044, 'samples': 489024, 'steps': 2546, 'loss/train': 0.8482134640216827} 01/26/2022 22:16:49 - INFO - codeparrot_training - Step 2547: {'lr': 0.0004998398025683737, 'samples': 489216, 'steps': 2547, 'loss/train': 1.0570367574691772} 01/26/2022 22:16:52 - INFO - codeparrot_training - Step 2548: {'lr': 0.0004998392163648085, 'samples': 489408, 'steps': 2548, 'loss/train': 1.0010789036750793} 01/26/2022 22:16:55 - INFO - codeparrot_training - Step 2549: {'lr': 0.0004998386290910116, 'samples': 489600, 'steps': 2549, 'loss/train': 1.5020899772644043} 01/26/2022 22:16:58 - INFO - codeparrot_training - Step 2550: {'lr': 0.0004998380407469853, 'samples': 489792, 'steps': 2550, 'loss/train': 0.8121333718299866} 01/26/2022 22:17:01 - INFO - codeparrot_training - Step 2551: {'lr': 0.0004998374513327321, 'samples': 489984, 'steps': 2551, 'loss/train': 0.9117420315742493} 01/26/2022 22:17:04 - INFO - codeparrot_training - Step 2552: {'lr': 0.0004998368608482546, 'samples': 490176, 'steps': 2552, 'loss/train': 0.8093945682048798} 01/26/2022 22:17:07 - INFO - codeparrot_training - Step 2553: {'lr': 0.0004998362692935553, 'samples': 490368, 'steps': 2553, 'loss/train': 0.9055194854736328} 01/26/2022 22:17:12 - INFO - codeparrot_training - Step 2554: {'lr': 0.0004998356766686368, 'samples': 490560, 'steps': 2554, 'loss/train': 0.9581896662712097} 01/26/2022 22:17:15 - INFO - codeparrot_training - Step 2555: {'lr': 0.0004998350829735016, 'samples': 490752, 'steps': 2555, 'loss/train': 0.7039699852466583} 01/26/2022 22:17:18 - INFO - codeparrot_training - Step 2556: {'lr': 0.0004998344882081522, 'samples': 490944, 'steps': 2556, 'loss/train': 0.8121767342090607} 01/26/2022 22:17:21 - INFO - codeparrot_training - Step 2557: {'lr': 0.0004998338923725913, 'samples': 491136, 'steps': 2557, 'loss/train': 0.8274579048156738} 01/26/2022 22:17:24 - INFO - codeparrot_training - Step 2558: {'lr': 0.0004998332954668211, 'samples': 491328, 'steps': 2558, 'loss/train': 0.5095050483942032} 01/26/2022 22:17:27 - INFO - codeparrot_training - Step 2559: {'lr': 0.0004998326974908446, 'samples': 491520, 'steps': 2559, 'loss/train': 1.0777899026870728} 01/26/2022 22:17:31 - INFO - codeparrot_training - Step 2560: {'lr': 0.0004998320984446641, 'samples': 491712, 'steps': 2560, 'loss/train': 
1.0843445956707} 01/26/2022 22:17:34 - INFO - codeparrot_training - Step 2561: {'lr': 0.0004998314983282821, 'samples': 491904, 'steps': 2561, 'loss/train': 1.21025151014328} 01/26/2022 22:17:37 - INFO - codeparrot_training - Step 2562: {'lr': 0.0004998308971417015, 'samples': 492096, 'steps': 2562, 'loss/train': 0.5834202021360397} 01/26/2022 22:17:43 - INFO - codeparrot_training - Step 2563: {'lr': 0.0004998302948849246, 'samples': 492288, 'steps': 2563, 'loss/train': 0.893994927406311} 01/26/2022 22:17:46 - INFO - codeparrot_training - Step 2564: {'lr': 0.0004998296915579539, 'samples': 492480, 'steps': 2564, 'loss/train': 0.5401930510997772} 01/26/2022 22:17:49 - INFO - codeparrot_training - Step 2565: {'lr': 0.0004998290871607924, 'samples': 492672, 'steps': 2565, 'loss/train': 0.6258967369794846} 01/26/2022 22:17:53 - INFO - codeparrot_training - Step 2566: {'lr': 0.0004998284816934422, 'samples': 492864, 'steps': 2566, 'loss/train': 0.9740405380725861} 01/26/2022 22:17:56 - INFO - codeparrot_training - Step 2567: {'lr': 0.0004998278751559062, 'samples': 493056, 'steps': 2567, 'loss/train': 1.019394725561142} 01/26/2022 22:17:59 - INFO - codeparrot_training - Step 2568: {'lr': 0.0004998272675481868, 'samples': 493248, 'steps': 2568, 'loss/train': 0.8454435467720032} 01/26/2022 22:18:02 - INFO - codeparrot_training - Step 2569: {'lr': 0.0004998266588702869, 'samples': 493440, 'steps': 2569, 'loss/train': 0.8740995526313782} 01/26/2022 22:18:05 - INFO - codeparrot_training - Step 2570: {'lr': 0.0004998260491222088, 'samples': 493632, 'steps': 2570, 'loss/train': 0.6478859782218933} 01/26/2022 22:18:10 - INFO - codeparrot_training - Step 2571: {'lr': 0.0004998254383039552, 'samples': 493824, 'steps': 2571, 'loss/train': 0.7209407687187195} 01/26/2022 22:18:13 - INFO - codeparrot_training - Step 2572: {'lr': 0.0004998248264155288, 'samples': 494016, 'steps': 2572, 'loss/train': 1.033970296382904} 01/26/2022 22:18:16 - INFO - codeparrot_training - Step 2573: {'lr': 0.0004998242134569322, 'samples': 494208, 'steps': 2573, 'loss/train': 0.8972678482532501} 01/26/2022 22:18:19 - INFO - codeparrot_training - Step 2574: {'lr': 0.0004998235994281681, 'samples': 494400, 'steps': 2574, 'loss/train': 0.7267903983592987} 01/26/2022 22:18:22 - INFO - codeparrot_training - Step 2575: {'lr': 0.0004998229843292388, 'samples': 494592, 'steps': 2575, 'loss/train': 0.9600571990013123} 01/26/2022 22:18:25 - INFO - codeparrot_training - Step 2576: {'lr': 0.0004998223681601474, 'samples': 494784, 'steps': 2576, 'loss/train': 0.8661814033985138} 01/26/2022 22:18:29 - INFO - codeparrot_training - Step 2577: {'lr': 0.0004998217509208961, 'samples': 494976, 'steps': 2577, 'loss/train': 1.198028415441513} 01/26/2022 22:18:32 - INFO - codeparrot_training - Step 2578: {'lr': 0.0004998211326114878, 'samples': 495168, 'steps': 2578, 'loss/train': 0.7639254927635193} 01/26/2022 22:18:35 - INFO - codeparrot_training - Step 2579: {'lr': 0.0004998205132319252, 'samples': 495360, 'steps': 2579, 'loss/train': 0.3592715263366699} 01/26/2022 22:18:38 - INFO - codeparrot_training - Step 2580: {'lr': 0.0004998198927822108, 'samples': 495552, 'steps': 2580, 'loss/train': 1.1148167252540588} 01/26/2022 22:18:43 - INFO - codeparrot_training - Step 2581: {'lr': 0.0004998192712623472, 'samples': 495744, 'steps': 2581, 'loss/train': 0.9159150123596191} 01/26/2022 22:18:46 - INFO - codeparrot_training - Step 2582: {'lr': 0.0004998186486723373, 'samples': 495936, 'steps': 2582, 'loss/train': 0.9795024991035461} 01/26/2022 22:18:49 - 
INFO - codeparrot_training - Step 2583: {'lr': 0.0004998180250121836, 'samples': 496128, 'steps': 2583, 'loss/train': 0.8946020007133484} 01/26/2022 22:18:52 - INFO - codeparrot_training - Step 2584: {'lr': 0.0004998174002818887, 'samples': 496320, 'steps': 2584, 'loss/train': 0.43773436546325684} 01/26/2022 22:18:55 - INFO - codeparrot_training - Step 2585: {'lr': 0.0004998167744814555, 'samples': 496512, 'steps': 2585, 'loss/train': 0.9494539797306061} 01/26/2022 22:18:58 - INFO - codeparrot_training - Step 2586: {'lr': 0.0004998161476108864, 'samples': 496704, 'steps': 2586, 'loss/train': 0.7681335210800171} 01/26/2022 22:19:02 - INFO - codeparrot_training - Step 2587: {'lr': 0.0004998155196701845, 'samples': 496896, 'steps': 2587, 'loss/train': 1.0642088949680328} 01/26/2022 22:19:05 - INFO - codeparrot_training - Step 2588: {'lr': 0.000499814890659352, 'samples': 497088, 'steps': 2588, 'loss/train': 0.8771871328353882} 01/26/2022 22:19:09 - INFO - codeparrot_training - Step 2589: {'lr': 0.000499814260578392, 'samples': 497280, 'steps': 2589, 'loss/train': 1.198269546031952} 01/26/2022 22:19:12 - INFO - codeparrot_training - Step 2590: {'lr': 0.000499813629427307, 'samples': 497472, 'steps': 2590, 'loss/train': 1.0678884387016296} 01/26/2022 22:19:16 - INFO - codeparrot_training - Step 2591: {'lr': 0.0004998129972060998, 'samples': 497664, 'steps': 2591, 'loss/train': 0.6985900551080704} 01/26/2022 22:19:19 - INFO - codeparrot_training - Step 2592: {'lr': 0.000499812363914773, 'samples': 497856, 'steps': 2592, 'loss/train': 0.9977722764015198} 01/26/2022 22:19:22 - INFO - codeparrot_training - Step 2593: {'lr': 0.0004998117295533292, 'samples': 498048, 'steps': 2593, 'loss/train': 0.9548056125640869} 01/26/2022 22:19:25 - INFO - codeparrot_training - Step 2594: {'lr': 0.0004998110941217714, 'samples': 498240, 'steps': 2594, 'loss/train': 0.2368261143565178} 01/26/2022 22:19:28 - INFO - codeparrot_training - Step 2595: {'lr': 0.0004998104576201022, 'samples': 498432, 'steps': 2595, 'loss/train': 1.39950692653656} 01/26/2022 22:19:31 - INFO - codeparrot_training - Step 2596: {'lr': 0.0004998098200483243, 'samples': 498624, 'steps': 2596, 'loss/train': 1.028096616268158} 01/26/2022 22:19:34 - INFO - codeparrot_training - Step 2597: {'lr': 0.0004998091814064405, 'samples': 498816, 'steps': 2597, 'loss/train': 0.783336728811264} 01/26/2022 22:19:41 - INFO - codeparrot_training - Step 2598: {'lr': 0.0004998085416944534, 'samples': 499008, 'steps': 2598, 'loss/train': 1.000181883573532} 01/26/2022 22:19:44 - INFO - codeparrot_training - Step 2599: {'lr': 0.000499807900912366, 'samples': 499200, 'steps': 2599, 'loss/train': 1.013779878616333} 01/26/2022 22:19:47 - INFO - codeparrot_training - Step 2600: {'lr': 0.0004998072590601808, 'samples': 499392, 'steps': 2600, 'loss/train': 0.8479633033275604} 01/26/2022 22:19:50 - INFO - codeparrot_training - Step 2601: {'lr': 0.0004998066161379006, 'samples': 499584, 'steps': 2601, 'loss/train': 1.5794582962989807} 01/26/2022 22:19:53 - INFO - codeparrot_training - Step 2602: {'lr': 0.0004998059721455281, 'samples': 499776, 'steps': 2602, 'loss/train': 0.9690039753913879} 01/26/2022 22:19:56 - INFO - codeparrot_training - Step 2603: {'lr': 0.0004998053270830662, 'samples': 499968, 'steps': 2603, 'loss/train': 0.7170280963182449} 01/26/2022 22:20:00 - INFO - codeparrot_training - Step 2604: {'lr': 0.0004998046809505176, 'samples': 500160, 'steps': 2604, 'loss/train': 1.016732633113861} 01/26/2022 22:20:03 - INFO - codeparrot_training - Step 2605: {'lr': 
0.0004998040337478851, 'samples': 500352, 'steps': 2605, 'loss/train': 0.6672349870204926} 01/26/2022 22:20:06 - INFO - codeparrot_training - Step 2606: {'lr': 0.0004998033854751715, 'samples': 500544, 'steps': 2606, 'loss/train': 1.0359874963760376} 01/26/2022 22:20:10 - INFO - codeparrot_training - Step 2607: {'lr': 0.0004998027361323794, 'samples': 500736, 'steps': 2607, 'loss/train': 0.6321619749069214} 01/26/2022 22:20:13 - INFO - codeparrot_training - Step 2608: {'lr': 0.0004998020857195117, 'samples': 500928, 'steps': 2608, 'loss/train': 0.8926662504673004} 01/26/2022 22:20:16 - INFO - codeparrot_training - Step 2609: {'lr': 0.0004998014342365712, 'samples': 501120, 'steps': 2609, 'loss/train': 0.9400539100170135} 01/26/2022 22:20:20 - INFO - codeparrot_training - Step 2610: {'lr': 0.0004998007816835608, 'samples': 501312, 'steps': 2610, 'loss/train': 0.9577142894268036} 01/26/2022 22:20:23 - INFO - codeparrot_training - Step 2611: {'lr': 0.000499800128060483, 'samples': 501504, 'steps': 2611, 'loss/train': 0.9244851171970367} 01/26/2022 22:20:26 - INFO - codeparrot_training - Step 2612: {'lr': 0.0004997994733673409, 'samples': 501696, 'steps': 2612, 'loss/train': 1.5299354195594788} 01/26/2022 22:20:29 - INFO - codeparrot_training - Step 2613: {'lr': 0.000499798817604137, 'samples': 501888, 'steps': 2613, 'loss/train': 0.8185036182403564} 01/26/2022 22:20:32 - INFO - codeparrot_training - Step 2614: {'lr': 0.0004997981607708745, 'samples': 502080, 'steps': 2614, 'loss/train': 0.7679775059223175} 01/26/2022 22:20:38 - INFO - codeparrot_training - Step 2615: {'lr': 0.0004997975028675558, 'samples': 502272, 'steps': 2615, 'loss/train': 0.6767157465219498} 01/26/2022 22:20:41 - INFO - codeparrot_training - Step 2616: {'lr': 0.0004997968438941841, 'samples': 502464, 'steps': 2616, 'loss/train': 0.38108061254024506} 01/26/2022 22:20:45 - INFO - codeparrot_training - Step 2617: {'lr': 0.0004997961838507619, 'samples': 502656, 'steps': 2617, 'loss/train': 0.7412612736225128} 01/26/2022 22:20:48 - INFO - codeparrot_training - Step 2618: {'lr': 0.0004997955227372923, 'samples': 502848, 'steps': 2618, 'loss/train': 1.4096669554710388} 01/26/2022 22:20:51 - INFO - codeparrot_training - Step 2619: {'lr': 0.000499794860553778, 'samples': 503040, 'steps': 2619, 'loss/train': 0.9209230542182922} 01/26/2022 22:20:54 - INFO - codeparrot_training - Step 2620: {'lr': 0.0004997941973002216, 'samples': 503232, 'steps': 2620, 'loss/train': 1.2704803347587585} 01/26/2022 22:20:57 - INFO - codeparrot_training - Step 2621: {'lr': 0.0004997935329766265, 'samples': 503424, 'steps': 2621, 'loss/train': 0.47678254544734955} 01/26/2022 22:21:00 - INFO - codeparrot_training - Step 2622: {'lr': 0.000499792867582995, 'samples': 503616, 'steps': 2622, 'loss/train': 1.2074722945690155} 01/26/2022 22:21:03 - INFO - codeparrot_training - Step 2623: {'lr': 0.0004997922011193303, 'samples': 503808, 'steps': 2623, 'loss/train': 0.8973154127597809} 01/26/2022 22:21:08 - INFO - codeparrot_training - Step 2624: {'lr': 0.000499791533585635, 'samples': 504000, 'steps': 2624, 'loss/train': 0.9286564886569977} 01/26/2022 22:21:11 - INFO - codeparrot_training - Step 2625: {'lr': 0.0004997908649819122, 'samples': 504192, 'steps': 2625, 'loss/train': 1.3207147121429443} 01/26/2022 22:21:14 - INFO - codeparrot_training - Step 2626: {'lr': 0.0004997901953081646, 'samples': 504384, 'steps': 2626, 'loss/train': 0.7896087169647217} 01/26/2022 22:21:17 - INFO - codeparrot_training - Step 2627: {'lr': 0.0004997895245643951, 'samples': 
504576, 'steps': 2627, 'loss/train': 1.2086052596569061} 01/26/2022 22:21:20 - INFO - codeparrot_training - Step 2628: {'lr': 0.0004997888527506067, 'samples': 504768, 'steps': 2628, 'loss/train': 0.901053786277771} 01/26/2022 22:21:24 - INFO - codeparrot_training - Step 2629: {'lr': 0.000499788179866802, 'samples': 504960, 'steps': 2629, 'loss/train': 0.5188821405172348} 01/26/2022 22:21:27 - INFO - codeparrot_training - Step 2630: {'lr': 0.0004997875059129843, 'samples': 505152, 'steps': 2630, 'loss/train': 1.183224856853485} 01/26/2022 22:21:30 - INFO - codeparrot_training - Step 2631: {'lr': 0.000499786830889156, 'samples': 505344, 'steps': 2631, 'loss/train': 0.4331882447004318} 01/26/2022 22:21:33 - INFO - codeparrot_training - Step 2632: {'lr': 0.0004997861547953203, 'samples': 505536, 'steps': 2632, 'loss/train': 0.5412651300430298} 01/26/2022 22:21:37 - INFO - codeparrot_training - Step 2633: {'lr': 0.00049978547763148, 'samples': 505728, 'steps': 2633, 'loss/train': 0.7887166142463684} 01/26/2022 22:21:40 - INFO - codeparrot_training - Step 2634: {'lr': 0.0004997847993976381, 'samples': 505920, 'steps': 2634, 'loss/train': 0.8687497973442078} 01/26/2022 22:21:44 - INFO - codeparrot_training - Step 2635: {'lr': 0.0004997841200937975, 'samples': 506112, 'steps': 2635, 'loss/train': 0.5095852017402649} 01/26/2022 22:21:47 - INFO - codeparrot_training - Step 2636: {'lr': 0.0004997834397199609, 'samples': 506304, 'steps': 2636, 'loss/train': 0.9452239573001862} 01/26/2022 22:21:50 - INFO - codeparrot_training - Step 2637: {'lr': 0.0004997827582761315, 'samples': 506496, 'steps': 2637, 'loss/train': 1.1862452030181885} 01/26/2022 22:21:53 - INFO - codeparrot_training - Step 2638: {'lr': 0.0004997820757623119, 'samples': 506688, 'steps': 2638, 'loss/train': 1.2154799401760101} 01/26/2022 22:21:56 - INFO - codeparrot_training - Step 2639: {'lr': 0.0004997813921785054, 'samples': 506880, 'steps': 2639, 'loss/train': 1.129555732011795} 01/26/2022 22:21:59 - INFO - codeparrot_training - Step 2640: {'lr': 0.0004997807075247146, 'samples': 507072, 'steps': 2640, 'loss/train': 2.204172670841217} 01/26/2022 22:22:02 - INFO - codeparrot_training - Step 2641: {'lr': 0.0004997800218009426, 'samples': 507264, 'steps': 2641, 'loss/train': 1.0391161143779755} 01/26/2022 22:22:09 - INFO - codeparrot_training - Step 2642: {'lr': 0.0004997793350071923, 'samples': 507456, 'steps': 2642, 'loss/train': 0.8796599507331848} 01/26/2022 22:22:12 - INFO - codeparrot_training - Step 2643: {'lr': 0.0004997786471434666, 'samples': 507648, 'steps': 2643, 'loss/train': 1.1503953337669373} 01/26/2022 22:22:15 - INFO - codeparrot_training - Step 2644: {'lr': 0.0004997779582097686, 'samples': 507840, 'steps': 2644, 'loss/train': 1.272165298461914} 01/26/2022 22:22:18 - INFO - codeparrot_training - Step 2645: {'lr': 0.0004997772682061011, 'samples': 508032, 'steps': 2645, 'loss/train': 0.995927095413208} 01/26/2022 22:22:21 - INFO - codeparrot_training - Step 2646: {'lr': 0.000499776577132467, 'samples': 508224, 'steps': 2646, 'loss/train': 1.126908302307129} 01/26/2022 22:22:24 - INFO - codeparrot_training - Step 2647: {'lr': 0.0004997758849888693, 'samples': 508416, 'steps': 2647, 'loss/train': 1.788887858390808} 01/26/2022 22:22:28 - INFO - codeparrot_training - Step 2648: {'lr': 0.0004997751917753113, 'samples': 508608, 'steps': 2648, 'loss/train': 1.2055270671844482} 01/26/2022 22:22:31 - INFO - codeparrot_training - Step 2649: {'lr': 0.0004997744974917955, 'samples': 508800, 'steps': 2649, 'loss/train': 
1.0854520797729492} 01/26/2022 22:22:34 - INFO - codeparrot_training - Step 2650: {'lr': 0.0004997738021383252, 'samples': 508992, 'steps': 2650, 'loss/train': 0.8841123282909393} 01/26/2022 22:22:38 - INFO - codeparrot_training - Step 2651: {'lr': 0.000499773105714903, 'samples': 509184, 'steps': 2651, 'loss/train': 0.9092717170715332} 01/26/2022 22:22:42 - INFO - codeparrot_training - Step 2652: {'lr': 0.0004997724082215323, 'samples': 509376, 'steps': 2652, 'loss/train': 1.1935081779956818} 01/26/2022 22:22:45 - INFO - codeparrot_training - Step 2653: {'lr': 0.0004997717096582159, 'samples': 509568, 'steps': 2653, 'loss/train': 0.4794198274612427} 01/26/2022 22:22:48 - INFO - codeparrot_training - Step 2654: {'lr': 0.0004997710100249568, 'samples': 509760, 'steps': 2654, 'loss/train': 0.7424392551183701} 01/26/2022 22:22:51 - INFO - codeparrot_training - Step 2655: {'lr': 0.000499770309321758, 'samples': 509952, 'steps': 2655, 'loss/train': 1.132214069366455} 01/26/2022 22:22:54 - INFO - codeparrot_training - Step 2656: {'lr': 0.0004997696075486225, 'samples': 510144, 'steps': 2656, 'loss/train': 1.1322592198848724} 01/26/2022 22:22:57 - INFO - codeparrot_training - Step 2657: {'lr': 0.0004997689047055534, 'samples': 510336, 'steps': 2657, 'loss/train': 1.1283273696899414} 01/26/2022 22:23:00 - INFO - codeparrot_training - Step 2658: {'lr': 0.0004997682007925535, 'samples': 510528, 'steps': 2658, 'loss/train': 1.1260587573051453} 01/26/2022 22:23:05 - INFO - codeparrot_training - Step 2659: {'lr': 0.0004997674958096259, 'samples': 510720, 'steps': 2659, 'loss/train': 0.36270933598279953} 01/26/2022 22:23:08 - INFO - codeparrot_training - Step 2660: {'lr': 0.0004997667897567738, 'samples': 510912, 'steps': 2660, 'loss/train': 0.45319823920726776} 01/26/2022 22:23:11 - INFO - codeparrot_training - Step 2661: {'lr': 0.000499766082634, 'samples': 511104, 'steps': 2661, 'loss/train': 1.098320335149765} 01/26/2022 22:23:14 - INFO - codeparrot_training - Step 2662: {'lr': 0.0004997653744413076, 'samples': 511296, 'steps': 2662, 'loss/train': 0.8161561489105225} 01/26/2022 22:23:17 - INFO - codeparrot_training - Step 2663: {'lr': 0.0004997646651786996, 'samples': 511488, 'steps': 2663, 'loss/train': 0.6496269106864929} 01/26/2022 22:23:21 - INFO - codeparrot_training - Step 2664: {'lr': 0.0004997639548461792, 'samples': 511680, 'steps': 2664, 'loss/train': 0.9057336151599884} 01/26/2022 22:23:24 - INFO - codeparrot_training - Step 2665: {'lr': 0.0004997632434437493, 'samples': 511872, 'steps': 2665, 'loss/train': 0.8712016046047211} 01/26/2022 22:23:27 - INFO - codeparrot_training - Step 2666: {'lr': 0.0004997625309714129, 'samples': 512064, 'steps': 2666, 'loss/train': 0.8524197936058044} 01/26/2022 22:23:30 - INFO - codeparrot_training - Step 2667: {'lr': 0.0004997618174291732, 'samples': 512256, 'steps': 2667, 'loss/train': 0.9716092050075531} 01/26/2022 22:23:36 - INFO - codeparrot_training - Step 2668: {'lr': 0.0004997611028170332, 'samples': 512448, 'steps': 2668, 'loss/train': 0.8992140591144562} 01/26/2022 22:23:39 - INFO - codeparrot_training - Step 2669: {'lr': 0.000499760387134996, 'samples': 512640, 'steps': 2669, 'loss/train': 1.255696177482605} 01/26/2022 22:23:43 - INFO - codeparrot_training - Step 2670: {'lr': 0.0004997596703830645, 'samples': 512832, 'steps': 2670, 'loss/train': 1.0326242744922638} 01/26/2022 22:23:46 - INFO - codeparrot_training - Step 2671: {'lr': 0.0004997589525612418, 'samples': 513024, 'steps': 2671, 'loss/train': 0.6074943691492081} 01/26/2022 22:23:49 - 
INFO - codeparrot_training - Step 2672: {'lr': 0.0004997582336695312, 'samples': 513216, 'steps': 2672, 'loss/train': 0.7911580502986908} 01/26/2022 22:23:52 - INFO - codeparrot_training - Step 2673: {'lr': 0.0004997575137079355, 'samples': 513408, 'steps': 2673, 'loss/train': 0.8841857314109802} 01/26/2022 22:23:55 - INFO - codeparrot_training - Step 2674: {'lr': 0.0004997567926764581, 'samples': 513600, 'steps': 2674, 'loss/train': 0.6091492921113968} 01/26/2022 22:23:58 - INFO - codeparrot_training - Step 2675: {'lr': 0.0004997560705751018, 'samples': 513792, 'steps': 2675, 'loss/train': 1.1315706968307495} 01/26/2022 22:24:01 - INFO - codeparrot_training - Step 2676: {'lr': 0.0004997553474038698, 'samples': 513984, 'steps': 2676, 'loss/train': 1.5710667371749878} 01/26/2022 22:24:06 - INFO - codeparrot_training - Step 2677: {'lr': 0.0004997546231627652, 'samples': 514176, 'steps': 2677, 'loss/train': 0.8125653862953186} 01/26/2022 22:24:09 - INFO - codeparrot_training - Step 2678: {'lr': 0.0004997538978517912, 'samples': 514368, 'steps': 2678, 'loss/train': 0.6224167048931122} 01/26/2022 22:24:12 - INFO - codeparrot_training - Step 2679: {'lr': 0.0004997531714709506, 'samples': 514560, 'steps': 2679, 'loss/train': 1.019887089729309} 01/26/2022 22:24:15 - INFO - codeparrot_training - Step 2680: {'lr': 0.0004997524440202469, 'samples': 514752, 'steps': 2680, 'loss/train': 0.7227133512496948} 01/26/2022 22:24:18 - INFO - codeparrot_training - Step 2681: {'lr': 0.0004997517154996829, 'samples': 514944, 'steps': 2681, 'loss/train': 0.9434175789356232} 01/26/2022 22:24:22 - INFO - codeparrot_training - Step 2682: {'lr': 0.000499750985909262, 'samples': 515136, 'steps': 2682, 'loss/train': 0.876489669084549} 01/26/2022 22:24:25 - INFO - codeparrot_training - Step 2683: {'lr': 0.0004997502552489871, 'samples': 515328, 'steps': 2683, 'loss/train': 1.1285617053508759} 01/26/2022 22:24:28 - INFO - codeparrot_training - Step 2684: {'lr': 0.0004997495235188614, 'samples': 515520, 'steps': 2684, 'loss/train': 0.7932533025741577} 01/26/2022 22:24:31 - INFO - codeparrot_training - Step 2685: {'lr': 0.0004997487907188881, 'samples': 515712, 'steps': 2685, 'loss/train': 1.071141242980957} 01/26/2022 22:24:36 - INFO - codeparrot_training - Step 2686: {'lr': 0.0004997480568490702, 'samples': 515904, 'steps': 2686, 'loss/train': 1.248889535665512} 01/26/2022 22:24:39 - INFO - codeparrot_training - Step 2687: {'lr': 0.0004997473219094111, 'samples': 516096, 'steps': 2687, 'loss/train': 0.7478516399860382} 01/26/2022 22:24:42 - INFO - codeparrot_training - Step 2688: {'lr': 0.0004997465858999136, 'samples': 516288, 'steps': 2688, 'loss/train': 1.1512064337730408} 01/26/2022 22:24:45 - INFO - codeparrot_training - Step 2689: {'lr': 0.0004997458488205811, 'samples': 516480, 'steps': 2689, 'loss/train': 0.4710398465394974} 01/26/2022 22:24:48 - INFO - codeparrot_training - Step 2690: {'lr': 0.0004997451106714166, 'samples': 516672, 'steps': 2690, 'loss/train': 0.564058855175972} 01/26/2022 22:24:51 - INFO - codeparrot_training - Step 2691: {'lr': 0.0004997443714524235, 'samples': 516864, 'steps': 2691, 'loss/train': 0.8751346170902252} 01/26/2022 22:24:55 - INFO - codeparrot_training - Step 2692: {'lr': 0.0004997436311636046, 'samples': 517056, 'steps': 2692, 'loss/train': 0.6052373796701431} 01/26/2022 22:24:58 - INFO - codeparrot_training - Step 2693: {'lr': 0.0004997428898049635, 'samples': 517248, 'steps': 2693, 'loss/train': 0.9444292187690735} 01/26/2022 22:25:01 - INFO - codeparrot_training - Step 2694: 
{'lr': 0.0004997421473765031, 'samples': 517440, 'steps': 2694, 'loss/train': 0.8947279751300812} 01/26/2022 22:25:05 - INFO - codeparrot_training - Step 2695: {'lr': 0.0004997414038782266, 'samples': 517632, 'steps': 2695, 'loss/train': 0.9933700561523438} 01/26/2022 22:25:08 - INFO - codeparrot_training - Step 2696: {'lr': 0.0004997406593101373, 'samples': 517824, 'steps': 2696, 'loss/train': 1.3522795736789703} 01/26/2022 22:25:12 - INFO - codeparrot_training - Step 2697: {'lr': 0.0004997399136722383, 'samples': 518016, 'steps': 2697, 'loss/train': 1.099130541086197} 01/26/2022 22:25:15 - INFO - codeparrot_training - Step 2698: {'lr': 0.0004997391669645327, 'samples': 518208, 'steps': 2698, 'loss/train': 0.6339345127344131} 01/26/2022 22:25:18 - INFO - codeparrot_training - Step 2699: {'lr': 0.0004997384191870239, 'samples': 518400, 'steps': 2699, 'loss/train': 1.009850174188614} 01/26/2022 22:25:21 - INFO - codeparrot_training - Step 2700: {'lr': 0.000499737670339715, 'samples': 518592, 'steps': 2700, 'loss/train': 1.168801188468933} 01/26/2022 22:25:24 - INFO - codeparrot_training - Step 2701: {'lr': 0.0004997369204226093, 'samples': 518784, 'steps': 2701, 'loss/train': 0.6663587987422943} 01/26/2022 22:25:27 - INFO - codeparrot_training - Step 2702: {'lr': 0.0004997361694357098, 'samples': 518976, 'steps': 2702, 'loss/train': 1.129889041185379} 01/26/2022 22:25:33 - INFO - codeparrot_training - Step 2703: {'lr': 0.00049973541737902, 'samples': 519168, 'steps': 2703, 'loss/train': 1.066578984260559} 01/26/2022 22:25:36 - INFO - codeparrot_training - Step 2704: {'lr': 0.0004997346642525428, 'samples': 519360, 'steps': 2704, 'loss/train': 1.0903682112693787} 01/26/2022 22:25:39 - INFO - codeparrot_training - Step 2705: {'lr': 0.0004997339100562817, 'samples': 519552, 'steps': 2705, 'loss/train': 0.8677274286746979} 01/26/2022 22:25:42 - INFO - codeparrot_training - Step 2706: {'lr': 0.0004997331547902398, 'samples': 519744, 'steps': 2706, 'loss/train': 1.015476554632187} 01/26/2022 22:25:46 - INFO - codeparrot_training - Step 2707: {'lr': 0.0004997323984544204, 'samples': 519936, 'steps': 2707, 'loss/train': 0.7875263392925262} 01/26/2022 22:25:49 - INFO - codeparrot_training - Step 2708: {'lr': 0.0004997316410488267, 'samples': 520128, 'steps': 2708, 'loss/train': 0.8404641151428223} 01/26/2022 22:25:52 - INFO - codeparrot_training - Step 2709: {'lr': 0.0004997308825734619, 'samples': 520320, 'steps': 2709, 'loss/train': 0.6073380410671234} 01/26/2022 22:25:55 - INFO - codeparrot_training - Step 2710: {'lr': 0.0004997301230283294, 'samples': 520512, 'steps': 2710, 'loss/train': 0.7155093401670456} 01/26/2022 22:25:58 - INFO - codeparrot_training - Step 2711: {'lr': 0.0004997293624134322, 'samples': 520704, 'steps': 2711, 'loss/train': 0.7951323688030243} 01/26/2022 22:26:03 - INFO - codeparrot_training - Step 2712: {'lr': 0.0004997286007287738, 'samples': 520896, 'steps': 2712, 'loss/train': 1.0263630151748657} 01/26/2022 22:26:06 - INFO - codeparrot_training - Step 2713: {'lr': 0.0004997278379743574, 'samples': 521088, 'steps': 2713, 'loss/train': 1.0408628582954407} 01/26/2022 22:26:09 - INFO - codeparrot_training - Step 2714: {'lr': 0.0004997270741501861, 'samples': 521280, 'steps': 2714, 'loss/train': 1.0635490715503693} 01/26/2022 22:26:12 - INFO - codeparrot_training - Step 2715: {'lr': 0.0004997263092562634, 'samples': 521472, 'steps': 2715, 'loss/train': 1.4662517309188843} 01/26/2022 22:26:15 - INFO - codeparrot_training - Step 2716: {'lr': 0.0004997255432925926, 'samples': 
521664, 'steps': 2716, 'loss/train': 0.8487744033336639} 01/26/2022 22:26:18 - INFO - codeparrot_training - Step 2717: {'lr': 0.0004997247762591766, 'samples': 521856, 'steps': 2717, 'loss/train': 0.8276062309741974} 01/26/2022 22:26:21 - INFO - codeparrot_training - Step 2718: {'lr': 0.0004997240081560193, 'samples': 522048, 'steps': 2718, 'loss/train': 0.6998850703239441} 01/26/2022 22:26:25 - INFO - codeparrot_training - Step 2719: {'lr': 0.0004997232389831234, 'samples': 522240, 'steps': 2719, 'loss/train': 1.22300586104393} 01/26/2022 22:26:28 - INFO - codeparrot_training - Step 2720: {'lr': 0.0004997224687404926, 'samples': 522432, 'steps': 2720, 'loss/train': 1.3392615616321564} 01/26/2022 22:26:34 - INFO - codeparrot_training - Step 2721: {'lr': 0.0004997216974281299, 'samples': 522624, 'steps': 2721, 'loss/train': 1.176204353570938} 01/26/2022 22:26:37 - INFO - codeparrot_training - Step 2722: {'lr': 0.0004997209250460387, 'samples': 522816, 'steps': 2722, 'loss/train': 0.743754655122757} 01/26/2022 22:26:40 - INFO - codeparrot_training - Step 2723: {'lr': 0.0004997201515942225, 'samples': 523008, 'steps': 2723, 'loss/train': 0.8428615629673004} 01/26/2022 22:26:43 - INFO - codeparrot_training - Step 2724: {'lr': 0.0004997193770726844, 'samples': 523200, 'steps': 2724, 'loss/train': 0.687632605433464} 01/26/2022 22:26:47 - INFO - codeparrot_training - Step 2725: {'lr': 0.0004997186014814278, 'samples': 523392, 'steps': 2725, 'loss/train': 1.0862182974815369} 01/26/2022 22:26:50 - INFO - codeparrot_training - Step 2726: {'lr': 0.000499717824820456, 'samples': 523584, 'steps': 2726, 'loss/train': 0.8000016510486603} 01/26/2022 22:26:53 - INFO - codeparrot_training - Step 2727: {'lr': 0.0004997170470897723, 'samples': 523776, 'steps': 2727, 'loss/train': 0.6469288319349289} 01/26/2022 22:26:56 - INFO - codeparrot_training - Step 2728: {'lr': 0.0004997162682893801, 'samples': 523968, 'steps': 2728, 'loss/train': 0.9333581328392029} 01/26/2022 22:26:59 - INFO - codeparrot_training - Step 2729: {'lr': 0.0004997154884192827, 'samples': 524160, 'steps': 2729, 'loss/train': 0.3309767469763756} 01/26/2022 22:27:04 - INFO - codeparrot_training - Step 2730: {'lr': 0.0004997147074794835, 'samples': 524352, 'steps': 2730, 'loss/train': 1.356850504875183} 01/26/2022 22:27:07 - INFO - codeparrot_training - Step 2731: {'lr': 0.0004997139254699856, 'samples': 524544, 'steps': 2731, 'loss/train': 1.1640528738498688} 01/26/2022 22:27:10 - INFO - codeparrot_training - Step 2732: {'lr': 0.0004997131423907927, 'samples': 524736, 'steps': 2732, 'loss/train': 1.0417866110801697} 01/26/2022 22:27:13 - INFO - codeparrot_training - Step 2733: {'lr': 0.000499712358241908, 'samples': 524928, 'steps': 2733, 'loss/train': 0.8527174293994904} 01/26/2022 22:27:16 - INFO - codeparrot_training - Step 2734: {'lr': 0.0004997115730233349, 'samples': 525120, 'steps': 2734, 'loss/train': 1.151707112789154} 01/26/2022 22:27:19 - INFO - codeparrot_training - Step 2735: {'lr': 0.0004997107867350765, 'samples': 525312, 'steps': 2735, 'loss/train': 1.3621914088726044} 01/26/2022 22:27:22 - INFO - codeparrot_training - Step 2736: {'lr': 0.0004997099993771365, 'samples': 525504, 'steps': 2736, 'loss/train': 1.0067547261714935} 01/26/2022 22:27:26 - INFO - codeparrot_training - Step 2737: {'lr': 0.0004997092109495181, 'samples': 525696, 'steps': 2737, 'loss/train': 0.7126902043819427} 01/26/2022 22:27:29 - INFO - codeparrot_training - Step 2738: {'lr': 0.0004997084214522249, 'samples': 525888, 'steps': 2738, 'loss/train': 
1.3180483281612396} 01/26/2022 22:27:33 - INFO - codeparrot_training - Step 2739: {'lr': 0.0004997076308852599, 'samples': 526080, 'steps': 2739, 'loss/train': 0.8002457320690155} 01/26/2022 22:27:36 - INFO - codeparrot_training - Step 2740: {'lr': 0.0004997068392486268, 'samples': 526272, 'steps': 2740, 'loss/train': 1.2160926461219788} 01/26/2022 22:27:39 - INFO - codeparrot_training - Step 2741: {'lr': 0.0004997060465423288, 'samples': 526464, 'steps': 2741, 'loss/train': 1.079861730337143} 01/26/2022 22:27:43 - INFO - codeparrot_training - Step 2742: {'lr': 0.0004997052527663696, 'samples': 526656, 'steps': 2742, 'loss/train': 0.7405757904052734} 01/26/2022 22:27:46 - INFO - codeparrot_training - Step 2743: {'lr': 0.0004997044579207522, 'samples': 526848, 'steps': 2743, 'loss/train': 0.8756797313690186} 01/26/2022 22:27:49 - INFO - codeparrot_training - Step 2744: {'lr': 0.0004997036620054803, 'samples': 527040, 'steps': 2744, 'loss/train': 0.9109331667423248} 01/26/2022 22:27:52 - INFO - codeparrot_training - Step 2745: {'lr': 0.0004997028650205572, 'samples': 527232, 'steps': 2745, 'loss/train': 0.7113658636808395} 01/26/2022 22:27:55 - INFO - codeparrot_training - Step 2746: {'lr': 0.0004997020669659862, 'samples': 527424, 'steps': 2746, 'loss/train': 1.5273540616035461} 01/26/2022 22:28:01 - INFO - codeparrot_training - Step 2747: {'lr': 0.000499701267841771, 'samples': 527616, 'steps': 2747, 'loss/train': 1.414840042591095} 01/26/2022 22:28:04 - INFO - codeparrot_training - Step 2748: {'lr': 0.0004997004676479147, 'samples': 527808, 'steps': 2748, 'loss/train': 0.8727172315120697} 01/26/2022 22:28:07 - INFO - codeparrot_training - Step 2749: {'lr': 0.0004996996663844209, 'samples': 528000, 'steps': 2749, 'loss/train': 0.9442483484745026} 01/26/2022 22:28:11 - INFO - codeparrot_training - Step 2750: {'lr': 0.0004996988640512931, 'samples': 528192, 'steps': 2750, 'loss/train': 0.9026637375354767} 01/26/2022 22:28:14 - INFO - codeparrot_training - Step 2751: {'lr': 0.0004996980606485346, 'samples': 528384, 'steps': 2751, 'loss/train': 0.9002696871757507} 01/26/2022 22:28:17 - INFO - codeparrot_training - Step 2752: {'lr': 0.0004996972561761489, 'samples': 528576, 'steps': 2752, 'loss/train': 1.362278938293457} 01/26/2022 22:28:20 - INFO - codeparrot_training - Step 2753: {'lr': 0.0004996964506341395, 'samples': 528768, 'steps': 2753, 'loss/train': 1.1605354249477386} 01/26/2022 22:28:23 - INFO - codeparrot_training - Step 2754: {'lr': 0.0004996956440225098, 'samples': 528960, 'steps': 2754, 'loss/train': 0.6273291260004044} 01/26/2022 22:28:26 - INFO - codeparrot_training - Step 2755: {'lr': 0.0004996948363412631, 'samples': 529152, 'steps': 2755, 'loss/train': 0.36709584295749664} 01/26/2022 22:28:31 - INFO - codeparrot_training - Step 2756: {'lr': 0.0004996940275904031, 'samples': 529344, 'steps': 2756, 'loss/train': 0.8403642475605011} 01/26/2022 22:28:34 - INFO - codeparrot_training - Step 2757: {'lr': 0.0004996932177699332, 'samples': 529536, 'steps': 2757, 'loss/train': 1.1186150908470154} 01/26/2022 22:28:37 - INFO - codeparrot_training - Step 2758: {'lr': 0.0004996924068798569, 'samples': 529728, 'steps': 2758, 'loss/train': 0.24712546169757843} 01/26/2022 22:28:40 - INFO - codeparrot_training - Step 2759: {'lr': 0.0004996915949201775, 'samples': 529920, 'steps': 2759, 'loss/train': 1.0443961322307587} 01/26/2022 22:28:43 - INFO - codeparrot_training - Step 2760: {'lr': 0.0004996907818908987, 'samples': 530112, 'steps': 2760, 'loss/train': 0.5726265907287598} 01/26/2022 
22:28:46 - INFO - codeparrot_training - Step 2761: {'lr': 0.0004996899677920238, 'samples': 530304, 'steps': 2761, 'loss/train': 1.1546131074428558} 01/26/2022 22:28:50 - INFO - codeparrot_training - Step 2762: {'lr': 0.0004996891526235564, 'samples': 530496, 'steps': 2762, 'loss/train': 1.2615946233272552} 01/26/2022 22:28:53 - INFO - codeparrot_training - Step 2763: {'lr': 0.0004996883363854998, 'samples': 530688, 'steps': 2763, 'loss/train': 1.0436536967754364} 01/26/2022 22:28:56 - INFO - codeparrot_training - Step 2764: {'lr': 0.0004996875190778579, 'samples': 530880, 'steps': 2764, 'loss/train': 0.6226357072591782} 01/26/2022 22:29:02 - INFO - codeparrot_training - Step 2765: {'lr': 0.0004996867007006339, 'samples': 531072, 'steps': 2765, 'loss/train': 0.8888140618801117} 01/26/2022 22:29:05 - INFO - codeparrot_training - Step 2766: {'lr': 0.0004996858812538312, 'samples': 531264, 'steps': 2766, 'loss/train': 1.1428104937076569} 01/26/2022 22:29:08 - INFO - codeparrot_training - Step 2767: {'lr': 0.0004996850607374535, 'samples': 531456, 'steps': 2767, 'loss/train': 0.9044584929943085} 01/26/2022 22:29:11 - INFO - codeparrot_training - Step 2768: {'lr': 0.0004996842391515044, 'samples': 531648, 'steps': 2768, 'loss/train': 0.5077144056558609} 01/26/2022 22:29:14 - INFO - codeparrot_training - Step 2769: {'lr': 0.0004996834164959872, 'samples': 531840, 'steps': 2769, 'loss/train': 0.8690685331821442} 01/26/2022 22:29:18 - INFO - codeparrot_training - Step 2770: {'lr': 0.0004996825927709056, 'samples': 532032, 'steps': 2770, 'loss/train': 1.1424556374549866} 01/26/2022 22:29:21 - INFO - codeparrot_training - Step 2771: {'lr': 0.0004996817679762631, 'samples': 532224, 'steps': 2771, 'loss/train': 0.4624849855899811} 01/26/2022 22:29:24 - INFO - codeparrot_training - Step 2772: {'lr': 0.000499680942112063, 'samples': 532416, 'steps': 2772, 'loss/train': 0.7478476166725159} 01/26/2022 22:29:27 - INFO - codeparrot_training - Step 2773: {'lr': 0.0004996801151783092, 'samples': 532608, 'steps': 2773, 'loss/train': 0.6651541739702225} 01/26/2022 22:29:31 - INFO - codeparrot_training - Step 2774: {'lr': 0.000499679287175005, 'samples': 532800, 'steps': 2774, 'loss/train': 0.9218331277370453} 01/26/2022 22:29:35 - INFO - codeparrot_training - Step 2775: {'lr': 0.000499678458102154, 'samples': 532992, 'steps': 2775, 'loss/train': 0.8772390782833099} 01/26/2022 22:29:38 - INFO - codeparrot_training - Step 2776: {'lr': 0.0004996776279597598, 'samples': 533184, 'steps': 2776, 'loss/train': 1.0501557290554047} 01/26/2022 22:29:41 - INFO - codeparrot_training - Step 2777: {'lr': 0.0004996767967478259, 'samples': 533376, 'steps': 2777, 'loss/train': 1.0562025010585785} 01/26/2022 22:29:44 - INFO - codeparrot_training - Step 2778: {'lr': 0.0004996759644663559, 'samples': 533568, 'steps': 2778, 'loss/train': 0.9994540214538574} 01/26/2022 22:29:47 - INFO - codeparrot_training - Step 2779: {'lr': 0.0004996751311153535, 'samples': 533760, 'steps': 2779, 'loss/train': 0.8301956355571747} 01/26/2022 22:29:50 - INFO - codeparrot_training - Step 2780: {'lr': 0.0004996742966948219, 'samples': 533952, 'steps': 2780, 'loss/train': 0.8526880145072937} 01/26/2022 22:29:53 - INFO - codeparrot_training - Step 2781: {'lr': 0.000499673461204765, 'samples': 534144, 'steps': 2781, 'loss/train': 0.9846518933773041} 01/26/2022 22:29:57 - INFO - codeparrot_training - Step 2782: {'lr': 0.0004996726246451862, 'samples': 534336, 'steps': 2782, 'loss/train': 1.4533347487449646} 01/26/2022 22:30:01 - INFO - codeparrot_training 
- Step 2783: {'lr': 0.0004996717870160892, 'samples': 534528, 'steps': 2783, 'loss/train': 1.2505550980567932} 01/26/2022 22:30:04 - INFO - codeparrot_training - Step 2784: {'lr': 0.0004996709483174775, 'samples': 534720, 'steps': 2784, 'loss/train': 0.7311754524707794} 01/26/2022 22:30:07 - INFO - codeparrot_training - Step 2785: {'lr': 0.0004996701085493547, 'samples': 534912, 'steps': 2785, 'loss/train': 1.0561385750770569} 01/26/2022 22:30:10 - INFO - codeparrot_training - Step 2786: {'lr': 0.0004996692677117246, 'samples': 535104, 'steps': 2786, 'loss/train': 0.8680675327777863} 01/26/2022 22:30:14 - INFO - codeparrot_training - Step 2787: {'lr': 0.0004996684258045906, 'samples': 535296, 'steps': 2787, 'loss/train': 0.6945963352918625} 01/26/2022 22:30:17 - INFO - codeparrot_training - Step 2788: {'lr': 0.0004996675828279562, 'samples': 535488, 'steps': 2788, 'loss/train': 2.2626108527183533} 01/26/2022 22:30:20 - INFO - codeparrot_training - Step 2789: {'lr': 0.0004996667387818254, 'samples': 535680, 'steps': 2789, 'loss/train': 1.0695243179798126} 01/26/2022 22:30:23 - INFO - codeparrot_training - Step 2790: {'lr': 0.0004996658936662013, 'samples': 535872, 'steps': 2790, 'loss/train': 0.8650468289852142} 01/26/2022 22:30:26 - INFO - codeparrot_training - Step 2791: {'lr': 0.0004996650474810879, 'samples': 536064, 'steps': 2791, 'loss/train': 0.9307378828525543} 01/26/2022 22:30:30 - INFO - codeparrot_training - Step 2792: {'lr': 0.0004996642002264887, 'samples': 536256, 'steps': 2792, 'loss/train': 0.7944250702857971} 01/26/2022 22:30:34 - INFO - codeparrot_training - Step 2793: {'lr': 0.0004996633519024074, 'samples': 536448, 'steps': 2793, 'loss/train': 1.1947387754917145} 01/26/2022 22:30:37 - INFO - codeparrot_training - Step 2794: {'lr': 0.0004996625025088476, 'samples': 536640, 'steps': 2794, 'loss/train': 0.7485059201717377} 01/26/2022 22:30:40 - INFO - codeparrot_training - Step 2795: {'lr': 0.0004996616520458128, 'samples': 536832, 'steps': 2795, 'loss/train': 0.7903175354003906} 01/26/2022 22:30:43 - INFO - codeparrot_training - Step 2796: {'lr': 0.0004996608005133068, 'samples': 537024, 'steps': 2796, 'loss/train': 0.7815181016921997} 01/26/2022 22:30:46 - INFO - codeparrot_training - Step 2797: {'lr': 0.0004996599479113333, 'samples': 537216, 'steps': 2797, 'loss/train': 1.5178906917572021} 01/26/2022 22:30:49 - INFO - codeparrot_training - Step 2798: {'lr': 0.0004996590942398958, 'samples': 537408, 'steps': 2798, 'loss/train': 1.025696575641632} 01/26/2022 22:30:52 - INFO - codeparrot_training - Step 2799: {'lr': 0.0004996582394989979, 'samples': 537600, 'steps': 2799, 'loss/train': 0.7688591480255127} 01/26/2022 22:30:59 - INFO - codeparrot_training - Step 2800: {'lr': 0.0004996573836886434, 'samples': 537792, 'steps': 2800, 'loss/train': 1.0039652287960052} 01/26/2022 22:31:02 - INFO - codeparrot_training - Step 2801: {'lr': 0.0004996565268088362, 'samples': 537984, 'steps': 2801, 'loss/train': 0.9838553667068481} 01/26/2022 22:31:05 - INFO - codeparrot_training - Step 2802: {'lr': 0.0004996556688595794, 'samples': 538176, 'steps': 2802, 'loss/train': 1.0018908083438873} 01/26/2022 22:31:08 - INFO - codeparrot_training - Step 2803: {'lr': 0.0004996548098408772, 'samples': 538368, 'steps': 2803, 'loss/train': 0.7304205894470215} 01/26/2022 22:31:12 - INFO - codeparrot_training - Step 2804: {'lr': 0.0004996539497527329, 'samples': 538560, 'steps': 2804, 'loss/train': 0.265297494828701} 01/26/2022 22:31:15 - INFO - codeparrot_training - Step 2805: {'lr': 
0.0004996530885951505, 'samples': 538752, 'steps': 2805, 'loss/train': 0.14289868623018265} 01/26/2022 22:31:18 - INFO - codeparrot_training - Step 2806: {'lr': 0.0004996522263681335, 'samples': 538944, 'steps': 2806, 'loss/train': 1.116512954235077} 01/26/2022 22:31:21 - INFO - codeparrot_training - Step 2807: {'lr': 0.0004996513630716856, 'samples': 539136, 'steps': 2807, 'loss/train': 0.8697557151317596} 01/26/2022 22:31:24 - INFO - codeparrot_training - Step 2808: {'lr': 0.0004996504987058105, 'samples': 539328, 'steps': 2808, 'loss/train': 0.9390318095684052} 01/26/2022 22:31:29 - INFO - codeparrot_training - Step 2809: {'lr': 0.000499649633270512, 'samples': 539520, 'steps': 2809, 'loss/train': 0.7044840753078461} 01/26/2022 22:31:32 - INFO - codeparrot_training - Step 2810: {'lr': 0.0004996487667657938, 'samples': 539712, 'steps': 2810, 'loss/train': 0.6086350232362747} 01/26/2022 22:31:35 - INFO - codeparrot_training - Step 2811: {'lr': 0.0004996478991916595, 'samples': 539904, 'steps': 2811, 'loss/train': 0.9005656242370605} 01/26/2022 22:31:38 - INFO - codeparrot_training - Step 2812: {'lr': 0.0004996470305481127, 'samples': 540096, 'steps': 2812, 'loss/train': 0.6344811469316483} 01/26/2022 22:31:41 - INFO - codeparrot_training - Step 2813: {'lr': 0.0004996461608351575, 'samples': 540288, 'steps': 2813, 'loss/train': 1.0170718431472778} 01/26/2022 22:31:44 - INFO - codeparrot_training - Step 2814: {'lr': 0.0004996452900527974, 'samples': 540480, 'steps': 2814, 'loss/train': 0.7405312657356262} 01/26/2022 22:31:47 - INFO - codeparrot_training - Step 2815: {'lr': 0.0004996444182010361, 'samples': 540672, 'steps': 2815, 'loss/train': 1.110595464706421} 01/26/2022 22:31:51 - INFO - codeparrot_training - Step 2816: {'lr': 0.0004996435452798775, 'samples': 540864, 'steps': 2816, 'loss/train': 1.0085819363594055} 01/26/2022 22:31:54 - INFO - codeparrot_training - Step 2817: {'lr': 0.000499642671289325, 'samples': 541056, 'steps': 2817, 'loss/train': 1.1786141395568848} 01/26/2022 22:31:58 - INFO - codeparrot_training - Step 2818: {'lr': 0.0004996417962293828, 'samples': 541248, 'steps': 2818, 'loss/train': 0.8922220766544342} 01/26/2022 22:32:01 - INFO - codeparrot_training - Step 2819: {'lr': 0.0004996409201000543, 'samples': 541440, 'steps': 2819, 'loss/train': 0.5207226723432541} 01/26/2022 22:32:04 - INFO - codeparrot_training - Step 2820: {'lr': 0.0004996400429013434, 'samples': 541632, 'steps': 2820, 'loss/train': 0.4298195242881775} 01/26/2022 22:32:08 - INFO - codeparrot_training - Step 2821: {'lr': 0.0004996391646332537, 'samples': 541824, 'steps': 2821, 'loss/train': 0.5942118018865585} 01/26/2022 22:32:11 - INFO - codeparrot_training - Step 2822: {'lr': 0.0004996382852957892, 'samples': 542016, 'steps': 2822, 'loss/train': 0.9123173654079437} 01/26/2022 22:32:14 - INFO - codeparrot_training - Step 2823: {'lr': 0.0004996374048889536, 'samples': 542208, 'steps': 2823, 'loss/train': 1.0034546256065369} 01/26/2022 22:32:17 - INFO - codeparrot_training - Step 2824: {'lr': 0.0004996365234127506, 'samples': 542400, 'steps': 2824, 'loss/train': 0.827489823102951} 01/26/2022 22:32:20 - INFO - codeparrot_training - Step 2825: {'lr': 0.000499635640867184, 'samples': 542592, 'steps': 2825, 'loss/train': 0.5488094240427017} 01/26/2022 22:32:23 - INFO - codeparrot_training - Step 2826: {'lr': 0.0004996347572522575, 'samples': 542784, 'steps': 2826, 'loss/train': 0.6856733858585358} 01/26/2022 22:32:29 - INFO - codeparrot_training - Step 2827: {'lr': 0.000499633872567975, 'samples': 
542976, 'steps': 2827, 'loss/train': 1.2221630215644836} 01/26/2022 22:32:33 - INFO - codeparrot_training - Step 2828: {'lr': 0.0004996329868143404, 'samples': 543168, 'steps': 2828, 'loss/train': 0.9303854405879974} 01/26/2022 22:32:36 - INFO - codeparrot_training - Step 2829: {'lr': 0.0004996320999913572, 'samples': 543360, 'steps': 2829, 'loss/train': 0.5664921998977661} 01/26/2022 22:32:39 - INFO - codeparrot_training - Step 2830: {'lr': 0.0004996312120990293, 'samples': 543552, 'steps': 2830, 'loss/train': 1.2467233836650848} 01/26/2022 22:32:42 - INFO - codeparrot_training - Step 2831: {'lr': 0.0004996303231373607, 'samples': 543744, 'steps': 2831, 'loss/train': 0.3257194831967354} 01/26/2022 22:32:45 - INFO - codeparrot_training - Step 2832: {'lr': 0.000499629433106355, 'samples': 543936, 'steps': 2832, 'loss/train': 0.5207729190587997} 01/26/2022 22:32:48 - INFO - codeparrot_training - Step 2833: {'lr': 0.000499628542006016, 'samples': 544128, 'steps': 2833, 'loss/train': 0.7043555080890656} 01/26/2022 22:32:51 - INFO - codeparrot_training - Step 2834: {'lr': 0.0004996276498363477, 'samples': 544320, 'steps': 2834, 'loss/train': 0.7994174659252167} 01/26/2022 22:32:56 - INFO - codeparrot_training - Step 2835: {'lr': 0.0004996267565973538, 'samples': 544512, 'steps': 2835, 'loss/train': 0.9611524343490601} 01/26/2022 22:32:59 - INFO - codeparrot_training - Step 2836: {'lr': 0.0004996258622890381, 'samples': 544704, 'steps': 2836, 'loss/train': 1.2207193672657013} 01/26/2022 22:33:02 - INFO - codeparrot_training - Step 2837: {'lr': 0.0004996249669114045, 'samples': 544896, 'steps': 2837, 'loss/train': 0.9298156499862671} 01/26/2022 22:33:06 - INFO - codeparrot_training - Step 2838: {'lr': 0.0004996240704644568, 'samples': 545088, 'steps': 2838, 'loss/train': 1.3130445778369904} 01/26/2022 22:33:09 - INFO - codeparrot_training - Step 2839: {'lr': 0.0004996231729481989, 'samples': 545280, 'steps': 2839, 'loss/train': 0.6194036453962326} 01/26/2022 22:33:12 - INFO - codeparrot_training - Step 2840: {'lr': 0.0004996222743626345, 'samples': 545472, 'steps': 2840, 'loss/train': 0.9341537654399872} 01/26/2022 22:33:15 - INFO - codeparrot_training - Step 2841: {'lr': 0.0004996213747077675, 'samples': 545664, 'steps': 2841, 'loss/train': 0.847167044878006} 01/26/2022 22:33:18 - INFO - codeparrot_training - Step 2842: {'lr': 0.0004996204739836019, 'samples': 545856, 'steps': 2842, 'loss/train': 0.8448958396911621} 01/26/2022 22:33:21 - INFO - codeparrot_training - Step 2843: {'lr': 0.0004996195721901415, 'samples': 546048, 'steps': 2843, 'loss/train': 0.9048586785793304} 01/26/2022 22:33:27 - INFO - codeparrot_training - Step 2844: {'lr': 0.00049961866932739, 'samples': 546240, 'steps': 2844, 'loss/train': 1.0578806698322296} 01/26/2022 22:33:31 - INFO - codeparrot_training - Step 2845: {'lr': 0.0004996177653953514, 'samples': 546432, 'steps': 2845, 'loss/train': 0.7790518999099731} 01/26/2022 22:33:34 - INFO - codeparrot_training - Step 2846: {'lr': 0.0004996168603940296, 'samples': 546624, 'steps': 2846, 'loss/train': 0.7073230147361755} 01/26/2022 22:33:37 - INFO - codeparrot_training - Step 2847: {'lr': 0.0004996159543234285, 'samples': 546816, 'steps': 2847, 'loss/train': 1.1356099247932434} 01/26/2022 22:33:40 - INFO - codeparrot_training - Step 2848: {'lr': 0.0004996150471835518, 'samples': 547008, 'steps': 2848, 'loss/train': 1.1354980766773224} 01/26/2022 22:33:43 - INFO - codeparrot_training - Step 2849: {'lr': 0.0004996141389744035, 'samples': 547200, 'steps': 2849, 'loss/train': 
0.6644075810909271} 01/26/2022 22:33:46 - INFO - codeparrot_training - Step 2850: {'lr': 0.0004996132296959876, 'samples': 547392, 'steps': 2850, 'loss/train': 1.0119557976722717} 01/26/2022 22:33:49 - INFO - codeparrot_training - Step 2851: {'lr': 0.0004996123193483076, 'samples': 547584, 'steps': 2851, 'loss/train': 1.0886731445789337} 01/26/2022 22:33:53 - INFO - codeparrot_training - Step 2852: {'lr': 0.000499611407931368, 'samples': 547776, 'steps': 2852, 'loss/train': 0.5306823402643204} 01/26/2022 22:33:57 - INFO - codeparrot_training - Step 2853: {'lr': 0.0004996104954451722, 'samples': 547968, 'steps': 2853, 'loss/train': 1.0818226039409637} 01/26/2022 22:34:00 - INFO - codeparrot_training - Step 2854: {'lr': 0.0004996095818897245, 'samples': 548160, 'steps': 2854, 'loss/train': 0.9101642668247223} 01/26/2022 22:34:03 - INFO - codeparrot_training - Step 2855: {'lr': 0.0004996086672650284, 'samples': 548352, 'steps': 2855, 'loss/train': 1.2193906009197235} 01/26/2022 22:34:06 - INFO - codeparrot_training - Step 2856: {'lr': 0.0004996077515710881, 'samples': 548544, 'steps': 2856, 'loss/train': 0.7131706774234772} 01/26/2022 22:34:10 - INFO - codeparrot_training - Step 2857: {'lr': 0.0004996068348079075, 'samples': 548736, 'steps': 2857, 'loss/train': 0.9546933174133301} 01/26/2022 22:34:13 - INFO - codeparrot_training - Step 2858: {'lr': 0.0004996059169754904, 'samples': 548928, 'steps': 2858, 'loss/train': 1.1241235435009003} 01/26/2022 22:34:16 - INFO - codeparrot_training - Step 2859: {'lr': 0.0004996049980738409, 'samples': 549120, 'steps': 2859, 'loss/train': 0.639680877327919} 01/26/2022 22:34:19 - INFO - codeparrot_training - Step 2860: {'lr': 0.0004996040781029629, 'samples': 549312, 'steps': 2860, 'loss/train': 0.7452893704175949} 01/26/2022 22:34:23 - INFO - codeparrot_training - Step 2861: {'lr': 0.00049960315706286, 'samples': 549504, 'steps': 2861, 'loss/train': 0.7880368530750275} 01/26/2022 22:34:27 - INFO - codeparrot_training - Step 2862: {'lr': 0.0004996022349535367, 'samples': 549696, 'steps': 2862, 'loss/train': 0.5898650586605072} 01/26/2022 22:34:30 - INFO - codeparrot_training - Step 2863: {'lr': 0.0004996013117749967, 'samples': 549888, 'steps': 2863, 'loss/train': 1.0602720379829407} 01/26/2022 22:34:33 - INFO - codeparrot_training - Step 2864: {'lr': 0.0004996003875272438, 'samples': 550080, 'steps': 2864, 'loss/train': 1.5050806403160095} 01/26/2022 22:34:36 - INFO - codeparrot_training - Step 2865: {'lr': 0.0004995994622102821, 'samples': 550272, 'steps': 2865, 'loss/train': 0.9654529094696045} 01/26/2022 22:34:39 - INFO - codeparrot_training - Step 2866: {'lr': 0.0004995985358241156, 'samples': 550464, 'steps': 2866, 'loss/train': 1.0101422667503357} 01/26/2022 22:34:42 - INFO - codeparrot_training - Step 2867: {'lr': 0.0004995976083687482, 'samples': 550656, 'steps': 2867, 'loss/train': 0.8342483639717102} 01/26/2022 22:34:45 - INFO - codeparrot_training - Step 2868: {'lr': 0.000499596679844184, 'samples': 550848, 'steps': 2868, 'loss/train': 1.2566809058189392} 01/26/2022 22:34:49 - INFO - codeparrot_training - Step 2869: {'lr': 0.0004995957502504268, 'samples': 551040, 'steps': 2869, 'loss/train': 1.0069268345832825} 01/26/2022 22:34:55 - INFO - codeparrot_training - Step 2870: {'lr': 0.0004995948195874807, 'samples': 551232, 'steps': 2870, 'loss/train': 1.4726496040821075} 01/26/2022 22:34:58 - INFO - codeparrot_training - Step 2871: {'lr': 0.0004995938878553496, 'samples': 551424, 'steps': 2871, 'loss/train': 1.4239637553691864} 01/26/2022 22:35:01 
- INFO - codeparrot_training - Step 2872: {'lr': 0.0004995929550540376, 'samples': 551616, 'steps': 2872, 'loss/train': 0.6224964112043381} 01/26/2022 22:35:04 - INFO - codeparrot_training - Step 2873: {'lr': 0.0004995920211835485, 'samples': 551808, 'steps': 2873, 'loss/train': 1.1882835924625397} 01/26/2022 22:35:07 - INFO - codeparrot_training - Step 2874: {'lr': 0.0004995910862438866, 'samples': 552000, 'steps': 2874, 'loss/train': 1.323416143655777} 01/26/2022 22:35:10 - INFO - codeparrot_training - Step 2875: {'lr': 0.0004995901502350556, 'samples': 552192, 'steps': 2875, 'loss/train': 1.0730119943618774} 01/26/2022 22:35:14 - INFO - codeparrot_training - Step 2876: {'lr': 0.0004995892131570598, 'samples': 552384, 'steps': 2876, 'loss/train': 0.7371807396411896} 01/26/2022 22:35:17 - INFO - codeparrot_training - Step 2877: {'lr': 0.0004995882750099029, 'samples': 552576, 'steps': 2877, 'loss/train': 1.172472596168518} 01/26/2022 22:35:20 - INFO - codeparrot_training - Step 2878: {'lr': 0.0004995873357935892, 'samples': 552768, 'steps': 2878, 'loss/train': 0.9136722385883331} 01/26/2022 22:35:24 - INFO - codeparrot_training - Step 2879: {'lr': 0.0004995863955081226, 'samples': 552960, 'steps': 2879, 'loss/train': 1.1050791442394257} 01/26/2022 22:35:28 - INFO - codeparrot_training - Step 2880: {'lr': 0.0004995854541535071, 'samples': 553152, 'steps': 2880, 'loss/train': 0.7806945741176605} 01/26/2022 22:35:31 - INFO - codeparrot_training - Step 2881: {'lr': 0.0004995845117297468, 'samples': 553344, 'steps': 2881, 'loss/train': 0.9278761446475983} 01/26/2022 22:35:34 - INFO - codeparrot_training - Step 2882: {'lr': 0.0004995835682368457, 'samples': 553536, 'steps': 2882, 'loss/train': 1.3032511174678802} 01/26/2022 22:35:37 - INFO - codeparrot_training - Step 2883: {'lr': 0.0004995826236748078, 'samples': 553728, 'steps': 2883, 'loss/train': 1.438004583120346} 01/26/2022 22:35:40 - INFO - codeparrot_training - Step 2884: {'lr': 0.0004995816780436372, 'samples': 553920, 'steps': 2884, 'loss/train': 0.40261177718639374} 01/26/2022 22:35:43 - INFO - codeparrot_training - Step 2885: {'lr': 0.0004995807313433379, 'samples': 554112, 'steps': 2885, 'loss/train': 0.7305226027965546} 01/26/2022 22:35:46 - INFO - codeparrot_training - Step 2886: {'lr': 0.0004995797835739141, 'samples': 554304, 'steps': 2886, 'loss/train': 0.1302936002612114} 01/26/2022 22:35:50 - INFO - codeparrot_training - Step 2887: {'lr': 0.0004995788347353697, 'samples': 554496, 'steps': 2887, 'loss/train': 0.9029273092746735} 01/26/2022 22:35:54 - INFO - codeparrot_training - Step 2888: {'lr': 0.0004995778848277088, 'samples': 554688, 'steps': 2888, 'loss/train': 0.9550876915454865} 01/26/2022 22:35:57 - INFO - codeparrot_training - Step 2889: {'lr': 0.0004995769338509357, 'samples': 554880, 'steps': 2889, 'loss/train': 0.824156641960144} 01/26/2022 22:36:00 - INFO - codeparrot_training - Step 2890: {'lr': 0.000499575981805054, 'samples': 555072, 'steps': 2890, 'loss/train': 1.0670775175094604} 01/26/2022 22:36:04 - INFO - codeparrot_training - Step 2891: {'lr': 0.000499575028690068, 'samples': 555264, 'steps': 2891, 'loss/train': 1.2360831499099731} 01/26/2022 22:36:07 - INFO - codeparrot_training - Step 2892: {'lr': 0.000499574074505982, 'samples': 555456, 'steps': 2892, 'loss/train': 1.058509200811386} 01/26/2022 22:36:10 - INFO - codeparrot_training - Step 2893: {'lr': 0.0004995731192527999, 'samples': 555648, 'steps': 2893, 'loss/train': 0.7297503501176834} 01/26/2022 22:36:13 - INFO - codeparrot_training - Step 2894: 
{'lr': 0.0004995721629305258, 'samples': 555840, 'steps': 2894, 'loss/train': 0.9184623956680298} 01/26/2022 22:36:16 - INFO - codeparrot_training - Step 2895: {'lr': 0.0004995712055391638, 'samples': 556032, 'steps': 2895, 'loss/train': 0.7956283986568451} 01/26/2022 22:36:19 - INFO - codeparrot_training - Step 2896: {'lr': 0.000499570247078718, 'samples': 556224, 'steps': 2896, 'loss/train': 1.3067372739315033} 01/26/2022 22:36:24 - INFO - codeparrot_training - Step 2897: {'lr': 0.0004995692875491925, 'samples': 556416, 'steps': 2897, 'loss/train': 0.5108103007078171} 01/26/2022 22:36:27 - INFO - codeparrot_training - Step 2898: {'lr': 0.0004995683269505914, 'samples': 556608, 'steps': 2898, 'loss/train': 0.8355798125267029} 01/26/2022 22:36:30 - INFO - codeparrot_training - Step 2899: {'lr': 0.000499567365282919, 'samples': 556800, 'steps': 2899, 'loss/train': 0.7420223951339722} 01/26/2022 22:36:33 - INFO - codeparrot_training - Step 2900: {'lr': 0.000499566402546179, 'samples': 556992, 'steps': 2900, 'loss/train': 0.8526769280433655} 01/26/2022 22:36:36 - INFO - codeparrot_training - Step 2901: {'lr': 0.0004995654387403758, 'samples': 557184, 'steps': 2901, 'loss/train': 0.6503414511680603} 01/26/2022 22:36:40 - INFO - codeparrot_training - Step 2902: {'lr': 0.0004995644738655136, 'samples': 557376, 'steps': 2902, 'loss/train': 0.6415115296840668} 01/26/2022 22:36:43 - INFO - codeparrot_training - Step 2903: {'lr': 0.0004995635079215965, 'samples': 557568, 'steps': 2903, 'loss/train': 0.9660298526287079} 01/26/2022 22:36:46 - INFO - codeparrot_training - Step 2904: {'lr': 0.0004995625409086285, 'samples': 557760, 'steps': 2904, 'loss/train': 0.8822596371173859} 01/26/2022 22:36:53 - INFO - codeparrot_training - Step 2905: {'lr': 0.0004995615728266138, 'samples': 557952, 'steps': 2905, 'loss/train': 0.8182893991470337} 01/26/2022 22:36:56 - INFO - codeparrot_training - Step 2906: {'lr': 0.0004995606036755566, 'samples': 558144, 'steps': 2906, 'loss/train': 0.5154286623001099} 01/26/2022 22:36:59 - INFO - codeparrot_training - Step 2907: {'lr': 0.000499559633455461, 'samples': 558336, 'steps': 2907, 'loss/train': 0.8203851878643036} 01/26/2022 22:37:02 - INFO - codeparrot_training - Step 2908: {'lr': 0.0004995586621663312, 'samples': 558528, 'steps': 2908, 'loss/train': 0.8175567090511322} 01/26/2022 22:37:05 - INFO - codeparrot_training - Step 2909: {'lr': 0.0004995576898081713, 'samples': 558720, 'steps': 2909, 'loss/train': 1.1661034226417542} 01/26/2022 22:37:08 - INFO - codeparrot_training - Step 2910: {'lr': 0.0004995567163809855, 'samples': 558912, 'steps': 2910, 'loss/train': 0.4763499051332474} 01/26/2022 22:37:11 - INFO - codeparrot_training - Step 2911: {'lr': 0.000499555741884778, 'samples': 559104, 'steps': 2911, 'loss/train': 0.7199180871248245} 01/26/2022 22:37:14 - INFO - codeparrot_training - Step 2912: {'lr': 0.000499554766319553, 'samples': 559296, 'steps': 2912, 'loss/train': 1.0343948900699615} 01/26/2022 22:37:18 - INFO - codeparrot_training - Step 2913: {'lr': 0.0004995537896853146, 'samples': 559488, 'steps': 2913, 'loss/train': 0.6390861868858337} 01/26/2022 22:37:22 - INFO - codeparrot_training - Step 2914: {'lr': 0.0004995528119820669, 'samples': 559680, 'steps': 2914, 'loss/train': 1.080781638622284} 01/26/2022 22:37:25 - INFO - codeparrot_training - Step 2915: {'lr': 0.0004995518332098143, 'samples': 559872, 'steps': 2915, 'loss/train': 0.7522558271884918} 01/26/2022 22:37:29 - INFO - codeparrot_training - Step 2916: {'lr': 0.0004995508533685608, 'samples': 
560064, 'steps': 2916, 'loss/train': 1.3695071339607239} 01/26/2022 22:37:32 - INFO - codeparrot_training - Step 2917: {'lr': 0.0004995498724583107, 'samples': 560256, 'steps': 2917, 'loss/train': 0.969440370798111} 01/26/2022 22:37:35 - INFO - codeparrot_training - Step 2918: {'lr': 0.0004995488904790682, 'samples': 560448, 'steps': 2918, 'loss/train': 0.7805348038673401} 01/26/2022 22:37:38 - INFO - codeparrot_training - Step 2919: {'lr': 0.0004995479074308375, 'samples': 560640, 'steps': 2919, 'loss/train': 1.5623856782913208} 01/26/2022 22:37:41 - INFO - codeparrot_training - Step 2920: {'lr': 0.0004995469233136228, 'samples': 560832, 'steps': 2920, 'loss/train': 1.3009008765220642} 01/26/2022 22:37:44 - INFO - codeparrot_training - Step 2921: {'lr': 0.0004995459381274284, 'samples': 561024, 'steps': 2921, 'loss/train': 0.3281758725643158} 01/26/2022 22:37:48 - INFO - codeparrot_training - Step 2922: {'lr': 0.0004995449518722584, 'samples': 561216, 'steps': 2922, 'loss/train': 0.4804335683584213} 01/26/2022 22:37:54 - INFO - codeparrot_training - Step 2923: {'lr': 0.000499543964548117, 'samples': 561408, 'steps': 2923, 'loss/train': 0.5144355744123459} 01/26/2022 22:37:57 - INFO - codeparrot_training - Step 2924: {'lr': 0.0004995429761550086, 'samples': 561600, 'steps': 2924, 'loss/train': 1.121930480003357} 01/26/2022 22:38:00 - INFO - codeparrot_training - Step 2925: {'lr': 0.0004995419866929373, 'samples': 561792, 'steps': 2925, 'loss/train': 1.2779150605201721} 01/26/2022 22:38:03 - INFO - codeparrot_training - Step 2926: {'lr': 0.0004995409961619073, 'samples': 561984, 'steps': 2926, 'loss/train': 0.7397122532129288} 01/26/2022 22:38:06 - INFO - codeparrot_training - Step 2927: {'lr': 0.0004995400045619229, 'samples': 562176, 'steps': 2927, 'loss/train': 0.9617904424667358} 01/26/2022 22:38:09 - INFO - codeparrot_training - Step 2928: {'lr': 0.0004995390118929885, 'samples': 562368, 'steps': 2928, 'loss/train': 0.6050659418106079} 01/26/2022 22:38:13 - INFO - codeparrot_training - Step 2929: {'lr': 0.0004995380181551081, 'samples': 562560, 'steps': 2929, 'loss/train': 1.1505182683467865} 01/26/2022 22:38:16 - INFO - codeparrot_training - Step 2930: {'lr': 0.0004995370233482861, 'samples': 562752, 'steps': 2930, 'loss/train': 0.8492617607116699} 01/26/2022 22:38:21 - INFO - codeparrot_training - Step 2931: {'lr': 0.0004995360274725267, 'samples': 562944, 'steps': 2931, 'loss/train': 0.8612657189369202} 01/26/2022 22:38:24 - INFO - codeparrot_training - Step 2932: {'lr': 0.0004995350305278342, 'samples': 563136, 'steps': 2932, 'loss/train': 0.3368518799543381} 01/26/2022 22:38:28 - INFO - codeparrot_training - Step 2933: {'lr': 0.0004995340325142128, 'samples': 563328, 'steps': 2933, 'loss/train': 0.7988592088222504} 01/26/2022 22:38:31 - INFO - codeparrot_training - Step 2934: {'lr': 0.000499533033431667, 'samples': 563520, 'steps': 2934, 'loss/train': 0.6908591687679291} 01/26/2022 22:38:34 - INFO - codeparrot_training - Step 2935: {'lr': 0.0004995320332802008, 'samples': 563712, 'steps': 2935, 'loss/train': 0.8248486518859863} 01/26/2022 22:38:37 - INFO - codeparrot_training - Step 2936: {'lr': 0.0004995310320598187, 'samples': 563904, 'steps': 2936, 'loss/train': 1.5085832476615906} 01/26/2022 22:38:40 - INFO - codeparrot_training - Step 2937: {'lr': 0.0004995300297705248, 'samples': 564096, 'steps': 2937, 'loss/train': 0.9930601716041565} 01/26/2022 22:38:43 - INFO - codeparrot_training - Step 2938: {'lr': 0.0004995290264123235, 'samples': 564288, 'steps': 2938, 'loss/train': 
0.9806357324123383} 01/26/2022 22:38:46 - INFO - codeparrot_training - Step 2939: {'lr': 0.0004995280219852192, 'samples': 564480, 'steps': 2939, 'loss/train': 0.43083326518535614} 01/26/2022 22:38:49 - INFO - codeparrot_training - Step 2940: {'lr': 0.000499527016489216, 'samples': 564672, 'steps': 2940, 'loss/train': 1.6937531232833862} 01/26/2022 22:38:53 - INFO - codeparrot_training - Step 2941: {'lr': 0.0004995260099243182, 'samples': 564864, 'steps': 2941, 'loss/train': 1.688249409198761} 01/26/2022 22:38:57 - INFO - codeparrot_training - Step 2942: {'lr': 0.0004995250022905303, 'samples': 565056, 'steps': 2942, 'loss/train': 0.7330740541219711} 01/26/2022 22:39:00 - INFO - codeparrot_training - Step 2943: {'lr': 0.0004995239935878565, 'samples': 565248, 'steps': 2943, 'loss/train': 5.497656941413879} 01/26/2022 22:39:03 - INFO - codeparrot_training - Step 2944: {'lr': 0.0004995229838163012, 'samples': 565440, 'steps': 2944, 'loss/train': 0.9582376778125763} 01/26/2022 22:39:07 - INFO - codeparrot_training - Step 2945: {'lr': 0.0004995219729758687, 'samples': 565632, 'steps': 2945, 'loss/train': 0.800303041934967} 01/26/2022 22:39:10 - INFO - codeparrot_training - Step 2946: {'lr': 0.0004995209610665632, 'samples': 565824, 'steps': 2946, 'loss/train': 1.088929831981659} 01/26/2022 22:39:13 - INFO - codeparrot_training - Step 2947: {'lr': 0.0004995199480883892, 'samples': 566016, 'steps': 2947, 'loss/train': 0.8486848175525665} 01/26/2022 22:39:16 - INFO - codeparrot_training - Step 2948: {'lr': 0.0004995189340413509, 'samples': 566208, 'steps': 2948, 'loss/train': 1.141898900270462} 01/26/2022 22:39:19 - INFO - codeparrot_training - Step 2949: {'lr': 0.0004995179189254528, 'samples': 566400, 'steps': 2949, 'loss/train': 0.8297030925750732} 01/26/2022 22:39:22 - INFO - codeparrot_training - Step 2950: {'lr': 0.000499516902740699, 'samples': 566592, 'steps': 2950, 'loss/train': 0.8947453200817108} 01/26/2022 22:39:28 - INFO - codeparrot_training - Step 2951: {'lr': 0.0004995158854870942, 'samples': 566784, 'steps': 2951, 'loss/train': 1.2301333844661713} 01/26/2022 22:39:32 - INFO - codeparrot_training - Step 2952: {'lr': 0.0004995148671646426, 'samples': 566976, 'steps': 2952, 'loss/train': 0.4554039090871811} 01/26/2022 22:39:35 - INFO - codeparrot_training - Step 2953: {'lr': 0.0004995138477733484, 'samples': 567168, 'steps': 2953, 'loss/train': 0.6537792831659317} 01/26/2022 22:39:38 - INFO - codeparrot_training - Step 2954: {'lr': 0.0004995128273132161, 'samples': 567360, 'steps': 2954, 'loss/train': 0.7860420048236847} 01/26/2022 22:39:41 - INFO - codeparrot_training - Step 2955: {'lr': 0.0004995118057842502, 'samples': 567552, 'steps': 2955, 'loss/train': 0.8332033753395081} 01/26/2022 22:39:44 - INFO - codeparrot_training - Step 2956: {'lr': 0.0004995107831864549, 'samples': 567744, 'steps': 2956, 'loss/train': 0.9633582830429077} 01/26/2022 22:39:47 - INFO - codeparrot_training - Step 2957: {'lr': 0.0004995097595198346, 'samples': 567936, 'steps': 2957, 'loss/train': 0.5096389353275299} 01/26/2022 22:39:50 - INFO - codeparrot_training - Step 2958: {'lr': 0.0004995087347843938, 'samples': 568128, 'steps': 2958, 'loss/train': 0.9497101306915283} 01/26/2022 22:39:55 - INFO - codeparrot_training - Step 2959: {'lr': 0.0004995077089801368, 'samples': 568320, 'steps': 2959, 'loss/train': 0.9945984184741974} 01/26/2022 22:39:58 - INFO - codeparrot_training - Step 2960: {'lr': 0.0004995066821070679, 'samples': 568512, 'steps': 2960, 'loss/train': 0.793113648891449} 01/26/2022 22:40:01 - 
INFO - codeparrot_training - Step 2961: {'lr': 0.0004995056541651917, 'samples': 568704, 'steps': 2961, 'loss/train': 0.7215123921632767} 01/26/2022 22:40:04 - INFO - codeparrot_training - Step 2962: {'lr': 0.0004995046251545125, 'samples': 568896, 'steps': 2962, 'loss/train': 1.3325583636760712} 01/26/2022 22:40:07 - INFO - codeparrot_training - Step 2963: {'lr': 0.0004995035950750346, 'samples': 569088, 'steps': 2963, 'loss/train': 0.8373770713806152} 01/26/2022 22:40:11 - INFO - codeparrot_training - Step 2964: {'lr': 0.0004995025639267627, 'samples': 569280, 'steps': 2964, 'loss/train': 0.6363615542650223} 01/26/2022 22:40:14 - INFO - codeparrot_training - Step 2965: {'lr': 0.0004995015317097009, 'samples': 569472, 'steps': 2965, 'loss/train': 0.6023329049348831} 01/26/2022 22:40:17 - INFO - codeparrot_training - Step 2966: {'lr': 0.0004995004984238538, 'samples': 569664, 'steps': 2966, 'loss/train': 1.7117934823036194} 01/26/2022 22:40:20 - INFO - codeparrot_training - Step 2967: {'lr': 0.0004994994640692258, 'samples': 569856, 'steps': 2967, 'loss/train': 0.9588890075683594} 01/26/2022 22:40:25 - INFO - codeparrot_training - Step 2968: {'lr': 0.0004994984286458213, 'samples': 570048, 'steps': 2968, 'loss/train': 0.8250570595264435} 01/26/2022 22:40:28 - INFO - codeparrot_training - Step 2969: {'lr': 0.0004994973921536447, 'samples': 570240, 'steps': 2969, 'loss/train': 0.9486054182052612} 01/26/2022 22:40:31 - INFO - codeparrot_training - Step 2970: {'lr': 0.0004994963545927006, 'samples': 570432, 'steps': 2970, 'loss/train': 0.6970750093460083} 01/26/2022 22:40:34 - INFO - codeparrot_training - Step 2971: {'lr': 0.0004994953159629934, 'samples': 570624, 'steps': 2971, 'loss/train': 1.1082816123962402} 01/26/2022 22:40:37 - INFO - codeparrot_training - Step 2972: {'lr': 0.0004994942762645274, 'samples': 570816, 'steps': 2972, 'loss/train': 0.9577749967575073} 01/26/2022 22:40:40 - INFO - codeparrot_training - Step 2973: {'lr': 0.000499493235497307, 'samples': 571008, 'steps': 2973, 'loss/train': 1.0263490676879883} 01/26/2022 22:40:44 - INFO - codeparrot_training - Step 2974: {'lr': 0.000499492193661337, 'samples': 571200, 'steps': 2974, 'loss/train': 0.8902063965797424} 01/26/2022 22:40:47 - INFO - codeparrot_training - Step 2975: {'lr': 0.0004994911507566216, 'samples': 571392, 'steps': 2975, 'loss/train': 0.8638622760772705} 01/26/2022 22:40:50 - INFO - codeparrot_training - Step 2976: {'lr': 0.0004994901067831654, 'samples': 571584, 'steps': 2976, 'loss/train': 1.02662193775177} 01/26/2022 22:40:55 - INFO - codeparrot_training - Step 2977: {'lr': 0.0004994890617409728, 'samples': 571776, 'steps': 2977, 'loss/train': 1.3163869678974152} 01/26/2022 22:40:59 - INFO - codeparrot_training - Step 2978: {'lr': 0.0004994880156300482, 'samples': 571968, 'steps': 2978, 'loss/train': 0.4756520390510559} 01/26/2022 22:41:02 - INFO - codeparrot_training - Step 2979: {'lr': 0.0004994869684503962, 'samples': 572160, 'steps': 2979, 'loss/train': 1.4162181615829468} 01/26/2022 22:41:05 - INFO - codeparrot_training - Step 2980: {'lr': 0.0004994859202020212, 'samples': 572352, 'steps': 2980, 'loss/train': 1.0185375809669495} 01/26/2022 22:41:08 - INFO - codeparrot_training - Step 2981: {'lr': 0.0004994848708849279, 'samples': 572544, 'steps': 2981, 'loss/train': 0.8801330924034119} 01/26/2022 22:41:11 - INFO - codeparrot_training - Step 2982: {'lr': 0.0004994838204991205, 'samples': 572736, 'steps': 2982, 'loss/train': 1.0409952700138092} 01/26/2022 22:41:14 - INFO - codeparrot_training - Step 
2983: {'lr': 0.0004994827690446036, 'samples': 572928, 'steps': 2983, 'loss/train': 0.7141469568014145} 01/26/2022 22:41:17 - INFO - codeparrot_training - Step 2984: {'lr': 0.0004994817165213817, 'samples': 573120, 'steps': 2984, 'loss/train': 0.628266379237175} 01/26/2022 22:41:21 - INFO - codeparrot_training - Step 2985: {'lr': 0.0004994806629294594, 'samples': 573312, 'steps': 2985, 'loss/train': 0.8343723714351654} 01/26/2022 22:41:25 - INFO - codeparrot_training - Step 2986: {'lr': 0.0004994796082688413, 'samples': 573504, 'steps': 2986, 'loss/train': 0.8202827274799347} 01/26/2022 22:41:28 - INFO - codeparrot_training - Step 2987: {'lr': 0.0004994785525395316, 'samples': 573696, 'steps': 2987, 'loss/train': 1.2065679430961609} 01/26/2022 22:41:31 - INFO - codeparrot_training - Step 2988: {'lr': 0.0004994774957415351, 'samples': 573888, 'steps': 2988, 'loss/train': 0.6210755109786987} 01/26/2022 22:41:34 - INFO - codeparrot_training - Step 2989: {'lr': 0.0004994764378748562, 'samples': 574080, 'steps': 2989, 'loss/train': 0.9630583226680756} 01/26/2022 22:41:38 - INFO - codeparrot_training - Step 2990: {'lr': 0.0004994753789394994, 'samples': 574272, 'steps': 2990, 'loss/train': 1.7000622153282166} 01/26/2022 22:41:41 - INFO - codeparrot_training - Step 2991: {'lr': 0.0004994743189354694, 'samples': 574464, 'steps': 2991, 'loss/train': 0.513814240694046} 01/26/2022 22:41:44 - INFO - codeparrot_training - Step 2992: {'lr': 0.0004994732578627706, 'samples': 574656, 'steps': 2992, 'loss/train': 1.4843436777591705} 01/26/2022 22:41:47 - INFO - codeparrot_training - Step 2993: {'lr': 0.0004994721957214076, 'samples': 574848, 'steps': 2993, 'loss/train': 0.5911509990692139} 01/26/2022 22:41:50 - INFO - codeparrot_training - Step 2994: {'lr': 0.0004994711325113849, 'samples': 575040, 'steps': 2994, 'loss/train': 0.6512318551540375} 01/26/2022 22:41:55 - INFO - codeparrot_training - Step 2995: {'lr': 0.000499470068232707, 'samples': 575232, 'steps': 2995, 'loss/train': 0.5227906554937363} 01/26/2022 22:41:58 - INFO - codeparrot_training - Step 2996: {'lr': 0.0004994690028853787, 'samples': 575424, 'steps': 2996, 'loss/train': 0.84751296043396} 01/26/2022 22:42:01 - INFO - codeparrot_training - Step 2997: {'lr': 0.0004994679364694043, 'samples': 575616, 'steps': 2997, 'loss/train': 1.4486286342144012} 01/26/2022 22:42:04 - INFO - codeparrot_training - Step 2998: {'lr': 0.0004994668689847885, 'samples': 575808, 'steps': 2998, 'loss/train': 0.9349216818809509} 01/26/2022 22:42:07 - INFO - codeparrot_training - Step 2999: {'lr': 0.0004994658004315358, 'samples': 576000, 'steps': 2999, 'loss/train': 1.1882988810539246} 01/26/2022 22:42:10 - INFO - codeparrot_training - Step 3000: {'lr': 0.0004994647308096509, 'samples': 576192, 'steps': 3000, 'loss/train': 0.8685111701488495} 01/26/2022 22:42:14 - INFO - codeparrot_training - Step 3001: {'lr': 0.0004994636601191383, 'samples': 576384, 'steps': 3001, 'loss/train': 0.8306197822093964} 01/26/2022 22:42:17 - INFO - codeparrot_training - Step 3002: {'lr': 0.0004994625883600025, 'samples': 576576, 'steps': 3002, 'loss/train': 0.9463300108909607} 01/26/2022 22:42:20 - INFO - codeparrot_training - Step 3003: {'lr': 0.0004994615155322483, 'samples': 576768, 'steps': 3003, 'loss/train': 0.5510808080434799} 01/26/2022 22:42:26 - INFO - codeparrot_training - Step 3004: {'lr': 0.0004994604416358801, 'samples': 576960, 'steps': 3004, 'loss/train': 0.733802318572998} 01/26/2022 22:42:29 - INFO - codeparrot_training - Step 3005: {'lr': 0.0004994593666709027, 
'samples': 577152, 'steps': 3005, 'loss/train': 0.9910570085048676} 01/26/2022 22:42:32 - INFO - codeparrot_training - Step 3006: {'lr': 0.0004994582906373205, 'samples': 577344, 'steps': 3006, 'loss/train': 1.2708490490913391} 01/26/2022 22:42:36 - INFO - codeparrot_training - Step 3007: {'lr': 0.0004994572135351382, 'samples': 577536, 'steps': 3007, 'loss/train': 0.8326449394226074} 01/26/2022 22:42:39 - INFO - codeparrot_training - Step 3008: {'lr': 0.0004994561353643604, 'samples': 577728, 'steps': 3008, 'loss/train': 0.7453435957431793} 01/26/2022 22:42:42 - INFO - codeparrot_training - Step 3009: {'lr': 0.0004994550561249917, 'samples': 577920, 'steps': 3009, 'loss/train': 0.8280673921108246} 01/26/2022 22:42:45 - INFO - codeparrot_training - Step 3010: {'lr': 0.0004994539758170367, 'samples': 578112, 'steps': 3010, 'loss/train': 0.9592432379722595} 01/26/2022 22:42:48 - INFO - codeparrot_training - Step 3011: {'lr': 0.0004994528944405002, 'samples': 578304, 'steps': 3011, 'loss/train': 0.6853153556585312} 01/26/2022 22:42:53 - INFO - codeparrot_training - Step 3012: {'lr': 0.0004994518119953867, 'samples': 578496, 'steps': 3012, 'loss/train': 0.8472967743873596} 01/26/2022 22:42:56 - INFO - codeparrot_training - Step 3013: {'lr': 0.0004994507284817009, 'samples': 578688, 'steps': 3013, 'loss/train': 0.8156357109546661} 01/26/2022 22:42:59 - INFO - codeparrot_training - Step 3014: {'lr': 0.0004994496438994472, 'samples': 578880, 'steps': 3014, 'loss/train': 0.9628555476665497} 01/26/2022 22:43:02 - INFO - codeparrot_training - Step 3015: {'lr': 0.0004994485582486306, 'samples': 579072, 'steps': 3015, 'loss/train': 0.8839818835258484} 01/26/2022 22:43:05 - INFO - codeparrot_training - Step 3016: {'lr': 0.0004994474715292555, 'samples': 579264, 'steps': 3016, 'loss/train': 1.2006425857543945} 01/26/2022 22:43:08 - INFO - codeparrot_training - Step 3017: {'lr': 0.0004994463837413268, 'samples': 579456, 'steps': 3017, 'loss/train': 0.6012454479932785} 01/26/2022 22:43:12 - INFO - codeparrot_training - Step 3018: {'lr': 0.0004994452948848488, 'samples': 579648, 'steps': 3018, 'loss/train': 0.5971735417842865} 01/26/2022 22:43:15 - INFO - codeparrot_training - Step 3019: {'lr': 0.0004994442049598265, 'samples': 579840, 'steps': 3019, 'loss/train': 0.7201147377490997} 01/26/2022 22:43:18 - INFO - codeparrot_training - Step 3020: {'lr': 0.0004994431139662643, 'samples': 580032, 'steps': 3020, 'loss/train': 0.9368309676647186} 01/26/2022 22:43:22 - INFO - codeparrot_training - Step 3021: {'lr': 0.0004994420219041671, 'samples': 580224, 'steps': 3021, 'loss/train': 1.0248503386974335} 01/26/2022 22:43:25 - INFO - codeparrot_training - Step 3022: {'lr': 0.0004994409287735394, 'samples': 580416, 'steps': 3022, 'loss/train': 0.7647323906421661} 01/26/2022 22:43:29 - INFO - codeparrot_training - Step 3023: {'lr': 0.0004994398345743861, 'samples': 580608, 'steps': 3023, 'loss/train': 1.4494730830192566} 01/26/2022 22:43:32 - INFO - codeparrot_training - Step 3024: {'lr': 0.0004994387393067117, 'samples': 580800, 'steps': 3024, 'loss/train': 0.9752577245235443} 01/26/2022 22:43:35 - INFO - codeparrot_training - Step 3025: {'lr': 0.0004994376429705208, 'samples': 580992, 'steps': 3025, 'loss/train': 1.023252546787262} 01/26/2022 22:43:38 - INFO - codeparrot_training - Step 3026: {'lr': 0.0004994365455658185, 'samples': 581184, 'steps': 3026, 'loss/train': 0.6086786538362503} 01/26/2022 22:43:41 - INFO - codeparrot_training - Step 3027: {'lr': 0.000499435447092609, 'samples': 581376, 'steps': 3027, 
'loss/train': 0.5914618223905563} 01/26/2022 22:43:44 - INFO - codeparrot_training - Step 3028: {'lr': 0.0004994343475508974, 'samples': 581568, 'steps': 3028, 'loss/train': 0.8832364082336426} 01/26/2022 22:43:47 - INFO - codeparrot_training - Step 3029: {'lr': 0.0004994332469406882, 'samples': 581760, 'steps': 3029, 'loss/train': 0.7864811718463898} 01/26/2022 22:43:54 - INFO - codeparrot_training - Step 3030: {'lr': 0.0004994321452619863, 'samples': 581952, 'steps': 3030, 'loss/train': 1.0908188223838806} 01/26/2022 22:43:57 - INFO - codeparrot_training - Step 3031: {'lr': 0.0004994310425147962, 'samples': 582144, 'steps': 3031, 'loss/train': 1.1295273005962372} 01/26/2022 22:44:00 - INFO - codeparrot_training - Step 3032: {'lr': 0.0004994299386991227, 'samples': 582336, 'steps': 3032, 'loss/train': 1.2477806210517883} 01/26/2022 22:44:03 - INFO - codeparrot_training - Step 3033: {'lr': 0.0004994288338149705, 'samples': 582528, 'steps': 3033, 'loss/train': 0.3792816996574402} 01/26/2022 22:44:06 - INFO - codeparrot_training - Step 3034: {'lr': 0.0004994277278623445, 'samples': 582720, 'steps': 3034, 'loss/train': 1.033644050359726} 01/26/2022 22:44:09 - INFO - codeparrot_training - Step 3035: {'lr': 0.0004994266208412493, 'samples': 582912, 'steps': 3035, 'loss/train': 0.74165940284729} 01/26/2022 22:44:12 - INFO - codeparrot_training - Step 3036: {'lr': 0.0004994255127516895, 'samples': 583104, 'steps': 3036, 'loss/train': 0.7552266418933868} 01/26/2022 22:44:16 - INFO - codeparrot_training - Step 3037: {'lr': 0.0004994244035936701, 'samples': 583296, 'steps': 3037, 'loss/train': 1.265843778848648} 01/26/2022 22:44:19 - INFO - codeparrot_training - Step 3038: {'lr': 0.0004994232933671958, 'samples': 583488, 'steps': 3038, 'loss/train': 1.217609167098999} 01/26/2022 22:44:23 - INFO - codeparrot_training - Step 3039: {'lr': 0.0004994221820722713, 'samples': 583680, 'steps': 3039, 'loss/train': 0.7715681791305542} 01/26/2022 22:44:26 - INFO - codeparrot_training - Step 3040: {'lr': 0.0004994210697089013, 'samples': 583872, 'steps': 3040, 'loss/train': 0.5747961401939392} 01/26/2022 22:44:30 - INFO - codeparrot_training - Step 3041: {'lr': 0.0004994199562770907, 'samples': 584064, 'steps': 3041, 'loss/train': 1.1922541558742523} 01/26/2022 22:44:33 - INFO - codeparrot_training - Step 3042: {'lr': 0.0004994188417768443, 'samples': 584256, 'steps': 3042, 'loss/train': 0.9550997614860535} 01/26/2022 22:44:36 - INFO - codeparrot_training - Step 3043: {'lr': 0.0004994177262081666, 'samples': 584448, 'steps': 3043, 'loss/train': 0.650934487581253} 01/26/2022 22:44:39 - INFO - codeparrot_training - Step 3044: {'lr': 0.0004994166095710626, 'samples': 584640, 'steps': 3044, 'loss/train': 0.8712303042411804} 01/26/2022 22:44:42 - INFO - codeparrot_training - Step 3045: {'lr': 0.0004994154918655371, 'samples': 584832, 'steps': 3045, 'loss/train': 0.30411460250616074} 01/26/2022 22:44:45 - INFO - codeparrot_training - Step 3046: {'lr': 0.0004994143730915948, 'samples': 585024, 'steps': 3046, 'loss/train': 0.897725522518158} 01/26/2022 22:44:50 - INFO - codeparrot_training - Step 3047: {'lr': 0.0004994132532492406, 'samples': 585216, 'steps': 3047, 'loss/train': 0.9452016949653625} 01/26/2022 22:44:53 - INFO - codeparrot_training - Step 3048: {'lr': 0.0004994121323384791, 'samples': 585408, 'steps': 3048, 'loss/train': 1.1467872262001038} 01/26/2022 22:44:56 - INFO - codeparrot_training - Step 3049: {'lr': 0.0004994110103593154, 'samples': 585600, 'steps': 3049, 'loss/train': 1.1749460399150848} 
01/26/2022 22:44:59 - INFO - codeparrot_training - Step 3050: {'lr': 0.0004994098873117539, 'samples': 585792, 'steps': 3050, 'loss/train': 1.1754595935344696} 01/26/2022 22:45:02 - INFO - codeparrot_training - Step 3051: {'lr': 0.0004994087631957998, 'samples': 585984, 'steps': 3051, 'loss/train': 1.3142394125461578} 01/26/2022 22:45:05 - INFO - codeparrot_training - Step 3052: {'lr': 0.0004994076380114577, 'samples': 586176, 'steps': 3052, 'loss/train': 0.9467853605747223} 01/26/2022 22:45:08 - INFO - codeparrot_training - Step 3053: {'lr': 0.0004994065117587325, 'samples': 586368, 'steps': 3053, 'loss/train': 0.718153327703476} 01/26/2022 22:45:12 - INFO - codeparrot_training - Step 3054: {'lr': 0.0004994053844376289, 'samples': 586560, 'steps': 3054, 'loss/train': 1.3371729254722595} 01/26/2022 22:45:15 - INFO - codeparrot_training - Step 3055: {'lr': 0.000499404256048152, 'samples': 586752, 'steps': 3055, 'loss/train': 0.3835703730583191} 01/26/2022 22:45:21 - INFO - codeparrot_training - Step 3056: {'lr': 0.0004994031265903063, 'samples': 586944, 'steps': 3056, 'loss/train': 0.7479745745658875} 01/26/2022 22:45:24 - INFO - codeparrot_training - Step 3057: {'lr': 0.0004994019960640969, 'samples': 587136, 'steps': 3057, 'loss/train': 0.8235342800617218} 01/26/2022 22:45:27 - INFO - codeparrot_training - Step 3058: {'lr': 0.0004994008644695285, 'samples': 587328, 'steps': 3058, 'loss/train': 1.0746384859085083} 01/26/2022 22:45:30 - INFO - codeparrot_training - Step 3059: {'lr': 0.0004993997318066061, 'samples': 587520, 'steps': 3059, 'loss/train': 0.8414807617664337} 01/26/2022 22:45:34 - INFO - codeparrot_training - Step 3060: {'lr': 0.0004993985980753342, 'samples': 587712, 'steps': 3060, 'loss/train': 0.8642144501209259} 01/26/2022 22:45:37 - INFO - codeparrot_training - Step 3061: {'lr': 0.0004993974632757181, 'samples': 587904, 'steps': 3061, 'loss/train': 0.5456404387950897} 01/26/2022 22:45:40 - INFO - codeparrot_training - Step 3062: {'lr': 0.0004993963274077624, 'samples': 588096, 'steps': 3062, 'loss/train': 1.0538479685783386} 01/26/2022 22:45:43 - INFO - codeparrot_training - Step 3063: {'lr': 0.000499395190471472, 'samples': 588288, 'steps': 3063, 'loss/train': 1.0717209577560425} 01/26/2022 22:45:46 - INFO - codeparrot_training - Step 3064: {'lr': 0.0004993940524668518, 'samples': 588480, 'steps': 3064, 'loss/train': 0.6821800768375397} 01/26/2022 22:45:51 - INFO - codeparrot_training - Step 3065: {'lr': 0.0004993929133939067, 'samples': 588672, 'steps': 3065, 'loss/train': 0.9506807327270508} 01/26/2022 22:45:54 - INFO - codeparrot_training - Step 3066: {'lr': 0.0004993917732526416, 'samples': 588864, 'steps': 3066, 'loss/train': 1.3480643928050995} 01/26/2022 22:45:57 - INFO - codeparrot_training - Step 3067: {'lr': 0.0004993906320430613, 'samples': 589056, 'steps': 3067, 'loss/train': 0.7566342651844025} 01/26/2022 22:46:00 - INFO - codeparrot_training - Step 3068: {'lr': 0.0004993894897651706, 'samples': 589248, 'steps': 3068, 'loss/train': 1.2316966652870178} 01/26/2022 22:46:03 - INFO - codeparrot_training - Step 3069: {'lr': 0.0004993883464189747, 'samples': 589440, 'steps': 3069, 'loss/train': 0.8068084716796875} 01/26/2022 22:46:07 - INFO - codeparrot_training - Step 3070: {'lr': 0.0004993872020044781, 'samples': 589632, 'steps': 3070, 'loss/train': 0.6046827435493469} 01/26/2022 22:46:10 - INFO - codeparrot_training - Step 3071: {'lr': 0.0004993860565216861, 'samples': 589824, 'steps': 3071, 'loss/train': 0.9846154153347015} 01/26/2022 22:46:13 - INFO - 
codeparrot_training - Step 3072: {'lr': 0.0004993849099706034, 'samples': 590016, 'steps': 3072, 'loss/train': 0.9019624292850494} 01/26/2022 22:46:16 - INFO - codeparrot_training - Step 3073: {'lr': 0.0004993837623512349, 'samples': 590208, 'steps': 3073, 'loss/train': 0.7343018352985382} 01/26/2022 22:46:22 - INFO - codeparrot_training - Step 3074: {'lr': 0.0004993826136635856, 'samples': 590400, 'steps': 3074, 'loss/train': 1.166970044374466} 01/26/2022 22:46:25 - INFO - codeparrot_training - Step 3075: {'lr': 0.0004993814639076602, 'samples': 590592, 'steps': 3075, 'loss/train': 0.5759517252445221} 01/26/2022 22:46:28 - INFO - codeparrot_training - Step 3076: {'lr': 0.000499380313083464, 'samples': 590784, 'steps': 3076, 'loss/train': 1.059753030538559} 01/26/2022 22:46:32 - INFO - codeparrot_training - Step 3077: {'lr': 0.0004993791611910017, 'samples': 590976, 'steps': 3077, 'loss/train': 0.984149694442749} 01/26/2022 22:46:35 - INFO - codeparrot_training - Step 3078: {'lr': 0.0004993780082302782, 'samples': 591168, 'steps': 3078, 'loss/train': 1.2462731301784515} 01/26/2022 22:46:38 - INFO - codeparrot_training - Step 3079: {'lr': 0.0004993768542012985, 'samples': 591360, 'steps': 3079, 'loss/train': 0.8407773077487946} 01/26/2022 22:46:41 - INFO - codeparrot_training - Step 3080: {'lr': 0.0004993756991040675, 'samples': 591552, 'steps': 3080, 'loss/train': 0.4486342817544937} 01/26/2022 22:46:44 - INFO - codeparrot_training - Step 3081: {'lr': 0.0004993745429385903, 'samples': 591744, 'steps': 3081, 'loss/train': 0.6189307272434235} 01/26/2022 22:46:49 - INFO - codeparrot_training - Step 3082: {'lr': 0.0004993733857048717, 'samples': 591936, 'steps': 3082, 'loss/train': 0.9769656658172607} 01/26/2022 22:46:52 - INFO - codeparrot_training - Step 3083: {'lr': 0.0004993722274029167, 'samples': 592128, 'steps': 3083, 'loss/train': 0.8283707499504089} 01/26/2022 22:46:56 - INFO - codeparrot_training - Step 3084: {'lr': 0.0004993710680327301, 'samples': 592320, 'steps': 3084, 'loss/train': 0.7320514619350433} 01/26/2022 22:46:59 - INFO - codeparrot_training - Step 3085: {'lr': 0.0004993699075943172, 'samples': 592512, 'steps': 3085, 'loss/train': 0.423452228307724} 01/26/2022 22:47:02 - INFO - codeparrot_training - Step 3086: {'lr': 0.0004993687460876829, 'samples': 592704, 'steps': 3086, 'loss/train': 0.3219415470957756} 01/26/2022 22:47:05 - INFO - codeparrot_training - Step 3087: {'lr': 0.0004993675835128319, 'samples': 592896, 'steps': 3087, 'loss/train': 0.874818742275238} 01/26/2022 22:47:08 - INFO - codeparrot_training - Step 3088: {'lr': 0.0004993664198697694, 'samples': 593088, 'steps': 3088, 'loss/train': 1.689469814300537} 01/26/2022 22:47:11 - INFO - codeparrot_training - Step 3089: {'lr': 0.0004993652551585003, 'samples': 593280, 'steps': 3089, 'loss/train': 1.6341939568519592} 01/26/2022 22:47:14 - INFO - codeparrot_training - Step 3090: {'lr': 0.0004993640893790298, 'samples': 593472, 'steps': 3090, 'loss/train': 0.815530925989151} 01/26/2022 22:47:17 - INFO - codeparrot_training - Step 3091: {'lr': 0.0004993629225313625, 'samples': 593664, 'steps': 3091, 'loss/train': 1.3303383886814117} 01/26/2022 22:47:22 - INFO - codeparrot_training - Step 3092: {'lr': 0.0004993617546155037, 'samples': 593856, 'steps': 3092, 'loss/train': 1.0688976645469666} 01/26/2022 22:47:25 - INFO - codeparrot_training - Step 3093: {'lr': 0.0004993605856314584, 'samples': 594048, 'steps': 3093, 'loss/train': 1.028331309556961} 01/26/2022 22:47:28 - INFO - codeparrot_training - Step 3094: {'lr': 
0.0004993594155792315, 'samples': 594240, 'steps': 3094, 'loss/train': 0.38336697220802307} 01/26/2022 22:47:31 - INFO - codeparrot_training - Step 3095: {'lr': 0.000499358244458828, 'samples': 594432, 'steps': 3095, 'loss/train': 1.0424884557724} 01/26/2022 22:47:35 - INFO - codeparrot_training - Step 3096: {'lr': 0.0004993570722702529, 'samples': 594624, 'steps': 3096, 'loss/train': 0.7777439653873444} 01/26/2022 22:47:38 - INFO - codeparrot_training - Step 3097: {'lr': 0.0004993558990135115, 'samples': 594816, 'steps': 3097, 'loss/train': 0.7141478955745697} 01/26/2022 22:47:41 - INFO - codeparrot_training - Step 3098: {'lr': 0.0004993547246886084, 'samples': 595008, 'steps': 3098, 'loss/train': 1.195128858089447} 01/26/2022 22:47:44 - INFO - codeparrot_training - Step 3099: {'lr': 0.0004993535492955488, 'samples': 595200, 'steps': 3099, 'loss/train': 2.3510032296180725} 01/26/2022 22:47:47 - INFO - codeparrot_training - Step 3100: {'lr': 0.000499352372834338, 'samples': 595392, 'steps': 3100, 'loss/train': 1.9624511003494263} 01/26/2022 22:47:51 - INFO - codeparrot_training - Step 3101: {'lr': 0.0004993511953049807, 'samples': 595584, 'steps': 3101, 'loss/train': 0.7397764474153519} 01/26/2022 22:47:55 - INFO - codeparrot_training - Step 3102: {'lr': 0.000499350016707482, 'samples': 595776, 'steps': 3102, 'loss/train': 0.7210698276758194} 01/26/2022 22:47:58 - INFO - codeparrot_training - Step 3103: {'lr': 0.0004993488370418471, 'samples': 595968, 'steps': 3103, 'loss/train': 0.9575537145137787} 01/26/2022 22:48:01 - INFO - codeparrot_training - Step 3104: {'lr': 0.0004993476563080809, 'samples': 596160, 'steps': 3104, 'loss/train': 0.6006440967321396} 01/26/2022 22:48:04 - INFO - codeparrot_training - Step 3105: {'lr': 0.0004993464745061885, 'samples': 596352, 'steps': 3105, 'loss/train': 1.3471609354019165} 01/26/2022 22:48:07 - INFO - codeparrot_training - Step 3106: {'lr': 0.0004993452916361751, 'samples': 596544, 'steps': 3106, 'loss/train': 0.696561187505722} 01/26/2022 22:48:10 - INFO - codeparrot_training - Step 3107: {'lr': 0.0004993441076980455, 'samples': 596736, 'steps': 3107, 'loss/train': 0.9184426367282867} 01/26/2022 22:48:13 - INFO - codeparrot_training - Step 3108: {'lr': 0.0004993429226918051, 'samples': 596928, 'steps': 3108, 'loss/train': 0.8071390986442566} 01/26/2022 22:48:17 - INFO - codeparrot_training - Step 3109: {'lr': 0.0004993417366174586, 'samples': 597120, 'steps': 3109, 'loss/train': 0.8521171510219574} 01/26/2022 22:48:23 - INFO - codeparrot_training - Step 3110: {'lr': 0.0004993405494750113, 'samples': 597312, 'steps': 3110, 'loss/train': 1.3886156380176544} 01/26/2022 22:48:26 - INFO - codeparrot_training - Step 3111: {'lr': 0.0004993393612644683, 'samples': 597504, 'steps': 3111, 'loss/train': 2.4160265922546387} 01/26/2022 22:48:29 - INFO - codeparrot_training - Step 3112: {'lr': 0.0004993381719858347, 'samples': 597696, 'steps': 3112, 'loss/train': 0.9460723400115967} 01/26/2022 22:48:32 - INFO - codeparrot_training - Step 3113: {'lr': 0.0004993369816391156, 'samples': 597888, 'steps': 3113, 'loss/train': 0.8706209063529968} 01/26/2022 22:48:35 - INFO - codeparrot_training - Step 3114: {'lr': 0.0004993357902243158, 'samples': 598080, 'steps': 3114, 'loss/train': 0.9922253787517548} 01/26/2022 22:48:39 - INFO - codeparrot_training - Step 3115: {'lr': 0.0004993345977414408, 'samples': 598272, 'steps': 3115, 'loss/train': 1.067894160747528} 01/26/2022 22:48:42 - INFO - codeparrot_training - Step 3116: {'lr': 0.0004993334041904957, 'samples': 598464, 
'steps': 3116, 'loss/train': 0.6676531881093979} 01/26/2022 22:48:45 - INFO - codeparrot_training - Step 3117: {'lr': 0.0004993322095714853, 'samples': 598656, 'steps': 3117, 'loss/train': 0.9403817653656006} 01/26/2022 22:48:49 - INFO - codeparrot_training - Step 3118: {'lr': 0.0004993310138844149, 'samples': 598848, 'steps': 3118, 'loss/train': 0.9134982526302338} 01/26/2022 22:48:53 - INFO - codeparrot_training - Step 3119: {'lr': 0.0004993298171292896, 'samples': 599040, 'steps': 3119, 'loss/train': 0.7884234488010406} 01/26/2022 22:48:56 - INFO - codeparrot_training - Step 3120: {'lr': 0.0004993286193061145, 'samples': 599232, 'steps': 3120, 'loss/train': 0.43365754187107086} 01/26/2022 22:48:59 - INFO - codeparrot_training - Step 3121: {'lr': 0.0004993274204148949, 'samples': 599424, 'steps': 3121, 'loss/train': 0.9465415477752686} 01/26/2022 22:49:02 - INFO - codeparrot_training - Step 3122: {'lr': 0.0004993262204556356, 'samples': 599616, 'steps': 3122, 'loss/train': 0.773804783821106} 01/26/2022 22:49:05 - INFO - codeparrot_training - Step 3123: {'lr': 0.0004993250194283421, 'samples': 599808, 'steps': 3123, 'loss/train': 0.8454104661941528} 01/26/2022 22:49:08 - INFO - codeparrot_training - Step 3124: {'lr': 0.0004993238173330194, 'samples': 600000, 'steps': 3124, 'loss/train': 0.7748620212078094} 01/26/2022 22:49:11 - INFO - codeparrot_training - Step 3125: {'lr': 0.0004993226141696725, 'samples': 600192, 'steps': 3125, 'loss/train': 1.1729987561702728} 01/26/2022 22:49:14 - INFO - codeparrot_training - Step 3126: {'lr': 0.0004993214099383069, 'samples': 600384, 'steps': 3126, 'loss/train': 0.8386180400848389} 01/26/2022 22:49:19 - INFO - codeparrot_training - Step 3127: {'lr': 0.0004993202046389274, 'samples': 600576, 'steps': 3127, 'loss/train': 0.9724913835525513} 01/26/2022 22:49:22 - INFO - codeparrot_training - Step 3128: {'lr': 0.0004993189982715392, 'samples': 600768, 'steps': 3128, 'loss/train': 1.0183878242969513} 01/26/2022 22:49:25 - INFO - codeparrot_training - Step 3129: {'lr': 0.0004993177908361479, 'samples': 600960, 'steps': 3129, 'loss/train': 0.7240040749311447} 01/26/2022 22:49:29 - INFO - codeparrot_training - Step 3130: {'lr': 0.000499316582332758, 'samples': 601152, 'steps': 3130, 'loss/train': 0.586049348115921} 01/26/2022 22:49:32 - INFO - codeparrot_training - Step 3131: {'lr': 0.0004993153727613753, 'samples': 601344, 'steps': 3131, 'loss/train': 0.7087288051843643} 01/26/2022 22:49:35 - INFO - codeparrot_training - Step 3132: {'lr': 0.0004993141621220046, 'samples': 601536, 'steps': 3132, 'loss/train': 0.15447142347693443} 01/26/2022 22:49:38 - INFO - codeparrot_training - Step 3133: {'lr': 0.0004993129504146512, 'samples': 601728, 'steps': 3133, 'loss/train': 1.0285411477088928} 01/26/2022 22:49:41 - INFO - codeparrot_training - Step 3134: {'lr': 0.0004993117376393203, 'samples': 601920, 'steps': 3134, 'loss/train': 0.45526421070098877} 01/26/2022 22:49:44 - INFO - codeparrot_training - Step 3135: {'lr': 0.000499310523796017, 'samples': 602112, 'steps': 3135, 'loss/train': 0.7371034026145935} 01/26/2022 22:49:51 - INFO - codeparrot_training - Step 3136: {'lr': 0.0004993093088847466, 'samples': 602304, 'steps': 3136, 'loss/train': 1.020345389842987} 01/26/2022 22:49:54 - INFO - codeparrot_training - Step 3137: {'lr': 0.0004993080929055144, 'samples': 602496, 'steps': 3137, 'loss/train': 0.6957274675369263} 01/26/2022 22:49:57 - INFO - codeparrot_training - Step 3138: {'lr': 0.0004993068758583254, 'samples': 602688, 'steps': 3138, 'loss/train': 
0.4992561489343643} 01/26/2022 22:50:00 - INFO - codeparrot_training - Step 3139: {'lr': 0.0004993056577431849, 'samples': 602880, 'steps': 3139, 'loss/train': 0.9296811819076538} 01/26/2022 22:50:03 - INFO - codeparrot_training - Step 3140: {'lr': 0.0004993044385600982, 'samples': 603072, 'steps': 3140, 'loss/train': 0.7521571218967438} 01/26/2022 22:50:07 - INFO - codeparrot_training - Step 3141: {'lr': 0.0004993032183090704, 'samples': 603264, 'steps': 3141, 'loss/train': 0.18590404093265533} 01/26/2022 22:50:10 - INFO - codeparrot_training - Step 3142: {'lr': 0.0004993019969901069, 'samples': 603456, 'steps': 3142, 'loss/train': 1.1115323603153229} 01/26/2022 22:50:13 - INFO - codeparrot_training - Step 3143: {'lr': 0.0004993007746032126, 'samples': 603648, 'steps': 3143, 'loss/train': 1.8252160549163818} 01/26/2022 22:50:16 - INFO - codeparrot_training - Step 3144: {'lr': 0.000499299551148393, 'samples': 603840, 'steps': 3144, 'loss/train': 0.7734940946102142} 01/26/2022 22:50:20 - INFO - codeparrot_training - Step 3145: {'lr': 0.0004992983266256533, 'samples': 604032, 'steps': 3145, 'loss/train': 0.7274016737937927} 01/26/2022 22:50:24 - INFO - codeparrot_training - Step 3146: {'lr': 0.0004992971010349987, 'samples': 604224, 'steps': 3146, 'loss/train': 0.4508884996175766} 01/26/2022 22:50:27 - INFO - codeparrot_training - Step 3147: {'lr': 0.0004992958743764346, 'samples': 604416, 'steps': 3147, 'loss/train': 2.3917962312698364} 01/26/2022 22:50:30 - INFO - codeparrot_training - Step 3148: {'lr': 0.0004992946466499661, 'samples': 604608, 'steps': 3148, 'loss/train': 1.4330312311649323} 01/26/2022 22:50:33 - INFO - codeparrot_training - Step 3149: {'lr': 0.0004992934178555984, 'samples': 604800, 'steps': 3149, 'loss/train': 1.096650391817093} 01/26/2022 22:50:36 - INFO - codeparrot_training - Step 3150: {'lr': 0.000499292187993337, 'samples': 604992, 'steps': 3150, 'loss/train': 0.6916479170322418} 01/26/2022 22:50:39 - INFO - codeparrot_training - Step 3151: {'lr': 0.0004992909570631868, 'samples': 605184, 'steps': 3151, 'loss/train': 1.0674205720424652} 01/26/2022 22:50:42 - INFO - codeparrot_training - Step 3152: {'lr': 0.0004992897250651535, 'samples': 605376, 'steps': 3152, 'loss/train': 0.41068343818187714} 01/26/2022 22:50:49 - INFO - codeparrot_training - Step 3153: {'lr': 0.0004992884919992421, 'samples': 605568, 'steps': 3153, 'loss/train': 0.828798919916153} 01/26/2022 22:50:52 - INFO - codeparrot_training - Step 3154: {'lr': 0.000499287257865458, 'samples': 605760, 'steps': 3154, 'loss/train': 0.767394483089447} 01/26/2022 22:50:55 - INFO - codeparrot_training - Step 3155: {'lr': 0.0004992860226638064, 'samples': 605952, 'steps': 3155, 'loss/train': 1.0100817382335663} 01/26/2022 22:50:58 - INFO - codeparrot_training - Step 3156: {'lr': 0.0004992847863942927, 'samples': 606144, 'steps': 3156, 'loss/train': 1.291605681180954} 01/26/2022 22:51:01 - INFO - codeparrot_training - Step 3157: {'lr': 0.000499283549056922, 'samples': 606336, 'steps': 3157, 'loss/train': 0.8222542405128479} 01/26/2022 22:51:04 - INFO - codeparrot_training - Step 3158: {'lr': 0.0004992823106516999, 'samples': 606528, 'steps': 3158, 'loss/train': 0.1939530149102211} 01/26/2022 22:51:08 - INFO - codeparrot_training - Step 3159: {'lr': 0.0004992810711786314, 'samples': 606720, 'steps': 3159, 'loss/train': 0.8390203714370728} 01/26/2022 22:51:11 - INFO - codeparrot_training - Step 3160: {'lr': 0.000499279830637722, 'samples': 606912, 'steps': 3160, 'loss/train': 0.7266795784235001} 01/26/2022 22:51:14 - 
INFO - codeparrot_training - Step 3161: {'lr': 0.000499278589028977, 'samples': 607104, 'steps': 3161, 'loss/train': 0.7870489954948425} 01/26/2022 22:51:18 - INFO - codeparrot_training - Step 3162: {'lr': 0.0004992773463524016, 'samples': 607296, 'steps': 3162, 'loss/train': 0.9470200538635254} 01/26/2022 22:51:22 - INFO - codeparrot_training - Step 3163: {'lr': 0.0004992761026080013, 'samples': 607488, 'steps': 3163, 'loss/train': 0.8834607303142548} 01/26/2022 22:51:25 - INFO - codeparrot_training - Step 3164: {'lr': 0.0004992748577957812, 'samples': 607680, 'steps': 3164, 'loss/train': 0.6332361102104187} 01/26/2022 22:51:28 - INFO - codeparrot_training - Step 3165: {'lr': 0.0004992736119157469, 'samples': 607872, 'steps': 3165, 'loss/train': 0.7474916875362396} 01/26/2022 22:51:31 - INFO - codeparrot_training - Step 3166: {'lr': 0.0004992723649679035, 'samples': 608064, 'steps': 3166, 'loss/train': 0.5014024078845978} 01/26/2022 22:51:34 - INFO - codeparrot_training - Step 3167: {'lr': 0.0004992711169522565, 'samples': 608256, 'steps': 3167, 'loss/train': 0.8994901478290558} 01/26/2022 22:51:37 - INFO - codeparrot_training - Step 3168: {'lr': 0.0004992698678688111, 'samples': 608448, 'steps': 3168, 'loss/train': 0.9332343935966492} 01/26/2022 22:51:40 - INFO - codeparrot_training - Step 3169: {'lr': 0.0004992686177175728, 'samples': 608640, 'steps': 3169, 'loss/train': 0.857835978269577} 01/26/2022 22:51:44 - INFO - codeparrot_training - Step 3170: {'lr': 0.000499267366498547, 'samples': 608832, 'steps': 3170, 'loss/train': 0.47579504549503326} 01/26/2022 22:51:48 - INFO - codeparrot_training - Step 3171: {'lr': 0.0004992661142117388, 'samples': 609024, 'steps': 3171, 'loss/train': 0.9496922492980957} 01/26/2022 22:51:51 - INFO - codeparrot_training - Step 3172: {'lr': 0.0004992648608571537, 'samples': 609216, 'steps': 3172, 'loss/train': 0.809814065694809} 01/26/2022 22:51:54 - INFO - codeparrot_training - Step 3173: {'lr': 0.0004992636064347971, 'samples': 609408, 'steps': 3173, 'loss/train': 0.6980393528938293} 01/26/2022 22:51:58 - INFO - codeparrot_training - Step 3174: {'lr': 0.0004992623509446746, 'samples': 609600, 'steps': 3174, 'loss/train': 1.1265261769294739} 01/26/2022 22:52:01 - INFO - codeparrot_training - Step 3175: {'lr': 0.0004992610943867911, 'samples': 609792, 'steps': 3175, 'loss/train': 0.9538992941379547} 01/26/2022 22:52:04 - INFO - codeparrot_training - Step 3176: {'lr': 0.0004992598367611523, 'samples': 609984, 'steps': 3176, 'loss/train': 1.1681291162967682} 01/26/2022 22:52:07 - INFO - codeparrot_training - Step 3177: {'lr': 0.0004992585780677634, 'samples': 610176, 'steps': 3177, 'loss/train': 0.6945383548736572} 01/26/2022 22:52:10 - INFO - codeparrot_training - Step 3178: {'lr': 0.00049925731830663, 'samples': 610368, 'steps': 3178, 'loss/train': 0.8053547143936157} 01/26/2022 22:52:13 - INFO - codeparrot_training - Step 3179: {'lr': 0.0004992560574777574, 'samples': 610560, 'steps': 3179, 'loss/train': 0.6344894617795944} 01/26/2022 22:52:19 - INFO - codeparrot_training - Step 3180: {'lr': 0.000499254795581151, 'samples': 610752, 'steps': 3180, 'loss/train': 1.230169415473938} 01/26/2022 22:52:22 - INFO - codeparrot_training - Step 3181: {'lr': 0.0004992535326168162, 'samples': 610944, 'steps': 3181, 'loss/train': 1.0862579941749573} 01/26/2022 22:52:26 - INFO - codeparrot_training - Step 3182: {'lr': 0.0004992522685847583, 'samples': 611136, 'steps': 3182, 'loss/train': 1.017082393169403} 01/26/2022 22:52:29 - INFO - codeparrot_training - Step 3183: 
{'lr': 0.000499251003484983, 'samples': 611328, 'steps': 3183, 'loss/train': 1.0749970972537994} 01/26/2022 22:52:32 - INFO - codeparrot_training - Step 3184: {'lr': 0.0004992497373174955, 'samples': 611520, 'steps': 3184, 'loss/train': 0.6898059546947479} 01/26/2022 22:52:35 - INFO - codeparrot_training - Step 3185: {'lr': 0.0004992484700823012, 'samples': 611712, 'steps': 3185, 'loss/train': 0.7525937855243683} 01/26/2022 22:52:38 - INFO - codeparrot_training - Step 3186: {'lr': 0.0004992472017794057, 'samples': 611904, 'steps': 3186, 'loss/train': 1.0924937725067139} 01/26/2022 22:52:41 - INFO - codeparrot_training - Step 3187: {'lr': 0.0004992459324088143, 'samples': 612096, 'steps': 3187, 'loss/train': 0.6581974625587463} 01/26/2022 22:52:44 - INFO - codeparrot_training - Step 3188: {'lr': 0.0004992446619705324, 'samples': 612288, 'steps': 3188, 'loss/train': 0.718385249376297} 01/26/2022 22:52:49 - INFO - codeparrot_training - Step 3189: {'lr': 0.0004992433904645654, 'samples': 612480, 'steps': 3189, 'loss/train': 0.6422256678342819} 01/26/2022 22:52:52 - INFO - codeparrot_training - Step 3190: {'lr': 0.0004992421178909191, 'samples': 612672, 'steps': 3190, 'loss/train': 0.6329650729894638} 01/26/2022 22:52:55 - INFO - codeparrot_training - Step 3191: {'lr': 0.0004992408442495986, 'samples': 612864, 'steps': 3191, 'loss/train': 1.1672164499759674} 01/26/2022 22:52:58 - INFO - codeparrot_training - Step 3192: {'lr': 0.0004992395695406095, 'samples': 613056, 'steps': 3192, 'loss/train': 1.0504159927368164} 01/26/2022 22:53:02 - INFO - codeparrot_training - Step 3193: {'lr': 0.0004992382937639572, 'samples': 613248, 'steps': 3193, 'loss/train': 1.099656879901886} 01/26/2022 22:53:05 - INFO - codeparrot_training - Step 3194: {'lr': 0.0004992370169196472, 'samples': 613440, 'steps': 3194, 'loss/train': 1.2224359810352325} 01/26/2022 22:53:08 - INFO - codeparrot_training - Step 3195: {'lr': 0.000499235739007685, 'samples': 613632, 'steps': 3195, 'loss/train': 0.4157678335905075} 01/26/2022 22:53:11 - INFO - codeparrot_training - Step 3196: {'lr': 0.000499234460028076, 'samples': 613824, 'steps': 3196, 'loss/train': 1.1449446380138397} 01/26/2022 22:53:14 - INFO - codeparrot_training - Step 3197: {'lr': 0.0004992331799808258, 'samples': 614016, 'steps': 3197, 'loss/train': 0.12080072611570358} 01/26/2022 22:53:19 - INFO - codeparrot_training - Step 3198: {'lr': 0.0004992318988659396, 'samples': 614208, 'steps': 3198, 'loss/train': 1.22263303399086} 01/26/2022 22:53:22 - INFO - codeparrot_training - Step 3199: {'lr': 0.0004992306166834232, 'samples': 614400, 'steps': 3199, 'loss/train': 1.0527534484863281} 01/26/2022 22:53:25 - INFO - codeparrot_training - Step 3200: {'lr': 0.000499229333433282, 'samples': 614592, 'steps': 3200, 'loss/train': 0.7631523907184601} 01/26/2022 22:53:28 - INFO - codeparrot_training - Step 3201: {'lr': 0.0004992280491155214, 'samples': 614784, 'steps': 3201, 'loss/train': 1.382867306470871} 01/26/2022 22:53:32 - INFO - codeparrot_training - Step 3202: {'lr': 0.0004992267637301471, 'samples': 614976, 'steps': 3202, 'loss/train': 0.8780151307582855} 01/26/2022 22:53:35 - INFO - codeparrot_training - Step 3203: {'lr': 0.0004992254772771644, 'samples': 615168, 'steps': 3203, 'loss/train': 0.7238021492958069} 01/26/2022 22:53:38 - INFO - codeparrot_training - Step 3204: {'lr': 0.0004992241897565789, 'samples': 615360, 'steps': 3204, 'loss/train': 0.8594332337379456} 01/26/2022 22:53:41 - INFO - codeparrot_training - Step 3205: {'lr': 0.0004992229011683961, 'samples': 
615552, 'steps': 3205, 'loss/train': 0.7084581702947617} 01/26/2022 22:53:44 - INFO - codeparrot_training - Step 3206: {'lr': 0.0004992216115126216, 'samples': 615744, 'steps': 3206, 'loss/train': 0.598720595240593} 01/26/2022 22:53:49 - INFO - codeparrot_training - Step 3207: {'lr': 0.0004992203207892607, 'samples': 615936, 'steps': 3207, 'loss/train': 1.034243881702423} 01/26/2022 22:53:52 - INFO - codeparrot_training - Step 3208: {'lr': 0.0004992190289983192, 'samples': 616128, 'steps': 3208, 'loss/train': 1.057195007801056} 01/26/2022 22:53:55 - INFO - codeparrot_training - Step 3209: {'lr': 0.0004992177361398026, 'samples': 616320, 'steps': 3209, 'loss/train': 0.8341355323791504} 01/26/2022 22:53:58 - INFO - codeparrot_training - Step 3210: {'lr': 0.0004992164422137162, 'samples': 616512, 'steps': 3210, 'loss/train': 0.7004289329051971} 01/26/2022 22:54:01 - INFO - codeparrot_training - Step 3211: {'lr': 0.0004992151472200657, 'samples': 616704, 'steps': 3211, 'loss/train': 0.8968085646629333} 01/26/2022 22:54:04 - INFO - codeparrot_training - Step 3212: {'lr': 0.0004992138511588567, 'samples': 616896, 'steps': 3212, 'loss/train': 0.36174899339675903} 01/26/2022 22:54:07 - INFO - codeparrot_training - Step 3213: {'lr': 0.0004992125540300947, 'samples': 617088, 'steps': 3213, 'loss/train': 0.8428601324558258} 01/26/2022 22:54:11 - INFO - codeparrot_training - Step 3214: {'lr': 0.0004992112558337852, 'samples': 617280, 'steps': 3214, 'loss/train': 0.7951616048812866} 01/26/2022 22:54:17 - INFO - codeparrot_training - Step 3215: {'lr': 0.0004992099565699339, 'samples': 617472, 'steps': 3215, 'loss/train': 0.7562346160411835} 01/26/2022 22:54:20 - INFO - codeparrot_training - Step 3216: {'lr': 0.0004992086562385462, 'samples': 617664, 'steps': 3216, 'loss/train': 0.7515211701393127} 01/26/2022 22:54:23 - INFO - codeparrot_training - Step 3217: {'lr': 0.0004992073548396277, 'samples': 617856, 'steps': 3217, 'loss/train': 0.24278750270605087} 01/26/2022 22:54:26 - INFO - codeparrot_training - Step 3218: {'lr': 0.0004992060523731842, 'samples': 618048, 'steps': 3218, 'loss/train': 1.867811143398285} 01/26/2022 22:54:29 - INFO - codeparrot_training - Step 3219: {'lr': 0.0004992047488392209, 'samples': 618240, 'steps': 3219, 'loss/train': 1.5122644901275635} 01/26/2022 22:54:32 - INFO - codeparrot_training - Step 3220: {'lr': 0.0004992034442377437, 'samples': 618432, 'steps': 3220, 'loss/train': 1.156360924243927} 01/26/2022 22:54:36 - INFO - codeparrot_training - Step 3221: {'lr': 0.0004992021385687582, 'samples': 618624, 'steps': 3221, 'loss/train': 0.742856964468956} 01/26/2022 22:54:39 - INFO - codeparrot_training - Step 3222: {'lr': 0.0004992008318322697, 'samples': 618816, 'steps': 3222, 'loss/train': 0.862229973077774} 01/26/2022 22:54:42 - INFO - codeparrot_training - Step 3223: {'lr': 0.000499199524028284, 'samples': 619008, 'steps': 3223, 'loss/train': 0.5651630312204361} 01/26/2022 22:54:46 - INFO - codeparrot_training - Step 3224: {'lr': 0.0004991982151568066, 'samples': 619200, 'steps': 3224, 'loss/train': 0.7225781679153442} 01/26/2022 22:54:49 - INFO - codeparrot_training - Step 3225: {'lr': 0.0004991969052178433, 'samples': 619392, 'steps': 3225, 'loss/train': 0.613748162984848} 01/26/2022 22:54:53 - INFO - codeparrot_training - Step 3226: {'lr': 0.0004991955942113995, 'samples': 619584, 'steps': 3226, 'loss/train': 1.0407235622406006} 01/26/2022 22:54:56 - INFO - codeparrot_training - Step 3227: {'lr': 0.0004991942821374809, 'samples': 619776, 'steps': 3227, 'loss/train': 
1.10627943277359} 01/26/2022 22:54:59 - INFO - codeparrot_training - Step 3228: {'lr': 0.0004991929689960932, 'samples': 619968, 'steps': 3228, 'loss/train': 0.8286504149436951} 01/26/2022 22:55:02 - INFO - codeparrot_training - Step 3229: {'lr': 0.000499191654787242, 'samples': 620160, 'steps': 3229, 'loss/train': 1.2843478918075562} 01/26/2022 22:55:05 - INFO - codeparrot_training - Step 3230: {'lr': 0.0004991903395109328, 'samples': 620352, 'steps': 3230, 'loss/train': 0.7456908971071243} 01/26/2022 22:55:08 - INFO - codeparrot_training - Step 3231: {'lr': 0.0004991890231671712, 'samples': 620544, 'steps': 3231, 'loss/train': 0.6713365763425827} 01/26/2022 22:55:11 - INFO - codeparrot_training - Step 3232: {'lr': 0.0004991877057559631, 'samples': 620736, 'steps': 3232, 'loss/train': 0.6618391424417496} 01/26/2022 22:55:18 - INFO - codeparrot_training - Step 3233: {'lr': 0.0004991863872773139, 'samples': 620928, 'steps': 3233, 'loss/train': 1.7401928901672363} 01/26/2022 22:55:21 - INFO - codeparrot_training - Step 3234: {'lr': 0.0004991850677312295, 'samples': 621120, 'steps': 3234, 'loss/train': 1.6244856119155884} 01/26/2022 22:55:24 - INFO - codeparrot_training - Step 3235: {'lr': 0.0004991837471177152, 'samples': 621312, 'steps': 3235, 'loss/train': 0.7034331858158112} 01/26/2022 22:55:27 - INFO - codeparrot_training - Step 3236: {'lr': 0.000499182425436777, 'samples': 621504, 'steps': 3236, 'loss/train': 0.5511635988950729} 01/26/2022 22:55:30 - INFO - codeparrot_training - Step 3237: {'lr': 0.0004991811026884203, 'samples': 621696, 'steps': 3237, 'loss/train': 0.7116588056087494} 01/26/2022 22:55:33 - INFO - codeparrot_training - Step 3238: {'lr': 0.0004991797788726509, 'samples': 621888, 'steps': 3238, 'loss/train': 0.8795218169689178} 01/26/2022 22:55:37 - INFO - codeparrot_training - Step 3239: {'lr': 0.0004991784539894745, 'samples': 622080, 'steps': 3239, 'loss/train': 1.095743089914322} 01/26/2022 22:55:40 - INFO - codeparrot_training - Step 3240: {'lr': 0.0004991771280388967, 'samples': 622272, 'steps': 3240, 'loss/train': 0.6821416318416595} 01/26/2022 22:55:43 - INFO - codeparrot_training - Step 3241: {'lr': 0.0004991758010209232, 'samples': 622464, 'steps': 3241, 'loss/train': 0.6763386726379395} 01/26/2022 22:55:47 - INFO - codeparrot_training - Step 3242: {'lr': 0.0004991744729355598, 'samples': 622656, 'steps': 3242, 'loss/train': 0.3246803283691406} 01/26/2022 22:55:51 - INFO - codeparrot_training - Step 3243: {'lr': 0.0004991731437828119, 'samples': 622848, 'steps': 3243, 'loss/train': 0.7672356963157654} 01/26/2022 22:55:54 - INFO - codeparrot_training - Step 3244: {'lr': 0.0004991718135626855, 'samples': 623040, 'steps': 3244, 'loss/train': 0.7847853899002075} 01/26/2022 22:55:57 - INFO - codeparrot_training - Step 3245: {'lr': 0.0004991704822751861, 'samples': 623232, 'steps': 3245, 'loss/train': 0.7725111544132233} 01/26/2022 22:56:00 - INFO - codeparrot_training - Step 3246: {'lr': 0.0004991691499203195, 'samples': 623424, 'steps': 3246, 'loss/train': 0.5148628950119019} 01/26/2022 22:56:03 - INFO - codeparrot_training - Step 3247: {'lr': 0.0004991678164980914, 'samples': 623616, 'steps': 3247, 'loss/train': 1.1629799008369446} 01/26/2022 22:56:06 - INFO - codeparrot_training - Step 3248: {'lr': 0.0004991664820085074, 'samples': 623808, 'steps': 3248, 'loss/train': 1.293966919183731} 01/26/2022 22:56:09 - INFO - codeparrot_training - Step 3249: {'lr': 0.0004991651464515735, 'samples': 624000, 'steps': 3249, 'loss/train': 0.7499685734510422} 01/26/2022 22:56:13 
- INFO - codeparrot_training - Step 3250: {'lr': 0.0004991638098272951, 'samples': 624192, 'steps': 3250, 'loss/train': 0.6211171299219131} 01/26/2022 22:56:17 - INFO - codeparrot_training - Step 3251: {'lr': 0.000499162472135678, 'samples': 624384, 'steps': 3251, 'loss/train': 1.0435671508312225} 01/26/2022 22:56:20 - INFO - codeparrot_training - Step 3252: {'lr': 0.0004991611333767281, 'samples': 624576, 'steps': 3252, 'loss/train': 0.46713294088840485} 01/26/2022 22:56:23 - INFO - codeparrot_training - Step 3253: {'lr': 0.000499159793550451, 'samples': 624768, 'steps': 3253, 'loss/train': 1.1375735700130463} 01/26/2022 22:56:26 - INFO - codeparrot_training - Step 3254: {'lr': 0.0004991584526568524, 'samples': 624960, 'steps': 3254, 'loss/train': 1.091998815536499} 01/26/2022 22:56:30 - INFO - codeparrot_training - Step 3255: {'lr': 0.0004991571106959383, 'samples': 625152, 'steps': 3255, 'loss/train': 0.7007830291986465} 01/26/2022 22:56:33 - INFO - codeparrot_training - Step 3256: {'lr': 0.000499155767667714, 'samples': 625344, 'steps': 3256, 'loss/train': 1.35159632563591} 01/26/2022 22:56:36 - INFO - codeparrot_training - Step 3257: {'lr': 0.0004991544235721857, 'samples': 625536, 'steps': 3257, 'loss/train': 0.9233076274394989} 01/26/2022 22:56:39 - INFO - codeparrot_training - Step 3258: {'lr': 0.0004991530784093589, 'samples': 625728, 'steps': 3258, 'loss/train': 0.5696717351675034} 01/26/2022 22:56:42 - INFO - codeparrot_training - Step 3259: {'lr': 0.0004991517321792394, 'samples': 625920, 'steps': 3259, 'loss/train': 0.553167924284935} 01/26/2022 22:56:48 - INFO - codeparrot_training - Step 3260: {'lr': 0.000499150384881833, 'samples': 626112, 'steps': 3260, 'loss/train': 0.6710781902074814} 01/26/2022 22:56:52 - INFO - codeparrot_training - Step 3261: {'lr': 0.0004991490365171454, 'samples': 626304, 'steps': 3261, 'loss/train': 0.788628101348877} 01/26/2022 22:56:55 - INFO - codeparrot_training - Step 3262: {'lr': 0.0004991476870851825, 'samples': 626496, 'steps': 3262, 'loss/train': 1.2765026986598969} 01/26/2022 22:56:58 - INFO - codeparrot_training - Step 3263: {'lr': 0.0004991463365859501, 'samples': 626688, 'steps': 3263, 'loss/train': 0.8398063480854034} 01/26/2022 22:57:01 - INFO - codeparrot_training - Step 3264: {'lr': 0.0004991449850194538, 'samples': 626880, 'steps': 3264, 'loss/train': 0.8649659156799316} 01/26/2022 22:57:04 - INFO - codeparrot_training - Step 3265: {'lr': 0.0004991436323856995, 'samples': 627072, 'steps': 3265, 'loss/train': 0.9510187804698944} 01/26/2022 22:57:07 - INFO - codeparrot_training - Step 3266: {'lr': 0.0004991422786846931, 'samples': 627264, 'steps': 3266, 'loss/train': 0.75941401720047} 01/26/2022 22:57:11 - INFO - codeparrot_training - Step 3267: {'lr': 0.0004991409239164401, 'samples': 627456, 'steps': 3267, 'loss/train': 0.49139830470085144} 01/26/2022 22:57:15 - INFO - codeparrot_training - Step 3268: {'lr': 0.0004991395680809467, 'samples': 627648, 'steps': 3268, 'loss/train': 1.1411290168762207} 01/26/2022 22:57:18 - INFO - codeparrot_training - Step 3269: {'lr': 0.0004991382111782183, 'samples': 627840, 'steps': 3269, 'loss/train': 0.9157446920871735} 01/26/2022 22:57:21 - INFO - codeparrot_training - Step 3270: {'lr': 0.0004991368532082611, 'samples': 628032, 'steps': 3270, 'loss/train': 0.4292723536491394} 01/26/2022 22:57:25 - INFO - codeparrot_training - Step 3271: {'lr': 0.0004991354941710806, 'samples': 628224, 'steps': 3271, 'loss/train': 2.7014687061309814} 01/26/2022 22:57:28 - INFO - codeparrot_training - Step 3272: 
{'lr': 0.0004991341340666828, 'samples': 628416, 'steps': 3272, 'loss/train': 1.1478630602359772} 01/26/2022 22:57:31 - INFO - codeparrot_training - Step 3273: {'lr': 0.0004991327728950736, 'samples': 628608, 'steps': 3273, 'loss/train': 0.9948282837867737} 01/26/2022 22:57:34 - INFO - codeparrot_training - Step 3274: {'lr': 0.0004991314106562586, 'samples': 628800, 'steps': 3274, 'loss/train': 1.244753658771515} 01/26/2022 22:57:37 - INFO - codeparrot_training - Step 3275: {'lr': 0.0004991300473502437, 'samples': 628992, 'steps': 3275, 'loss/train': 0.9703348875045776} 01/26/2022 22:57:40 - INFO - codeparrot_training - Step 3276: {'lr': 0.0004991286829770348, 'samples': 629184, 'steps': 3276, 'loss/train': 1.1810760498046875} 01/26/2022 22:57:45 - INFO - codeparrot_training - Step 3277: {'lr': 0.0004991273175366378, 'samples': 629376, 'steps': 3277, 'loss/train': 0.7311785817146301} 01/26/2022 22:57:48 - INFO - codeparrot_training - Step 3278: {'lr': 0.0004991259510290584, 'samples': 629568, 'steps': 3278, 'loss/train': 1.0104859471321106} 01/26/2022 22:57:51 - INFO - codeparrot_training - Step 3279: {'lr': 0.0004991245834543025, 'samples': 629760, 'steps': 3279, 'loss/train': 0.5066119283437729} 01/26/2022 22:57:54 - INFO - codeparrot_training - Step 3280: {'lr': 0.0004991232148123761, 'samples': 629952, 'steps': 3280, 'loss/train': 0.7334842532873154} 01/26/2022 22:57:57 - INFO - codeparrot_training - Step 3281: {'lr': 0.0004991218451032849, 'samples': 630144, 'steps': 3281, 'loss/train': 0.8897055387496948} 01/26/2022 22:58:00 - INFO - codeparrot_training - Step 3282: {'lr': 0.0004991204743270348, 'samples': 630336, 'steps': 3282, 'loss/train': 0.6245409697294235} 01/26/2022 22:58:04 - INFO - codeparrot_training - Step 3283: {'lr': 0.0004991191024836317, 'samples': 630528, 'steps': 3283, 'loss/train': 0.5372527688741684} 01/26/2022 22:58:07 - INFO - codeparrot_training - Step 3284: {'lr': 0.0004991177295730815, 'samples': 630720, 'steps': 3284, 'loss/train': 1.0876726806163788} 01/26/2022 22:58:10 - INFO - codeparrot_training - Step 3285: {'lr': 0.0004991163555953901, 'samples': 630912, 'steps': 3285, 'loss/train': 0.8222662210464478} 01/26/2022 22:58:14 - INFO - codeparrot_training - Step 3286: {'lr': 0.0004991149805505632, 'samples': 631104, 'steps': 3286, 'loss/train': 0.9346046447753906} 01/26/2022 22:58:17 - INFO - codeparrot_training - Step 3287: {'lr': 0.0004991136044386069, 'samples': 631296, 'steps': 3287, 'loss/train': 1.152842938899994} 01/26/2022 22:58:21 - INFO - codeparrot_training - Step 3288: {'lr': 0.0004991122272595271, 'samples': 631488, 'steps': 3288, 'loss/train': 1.1643860042095184} 01/26/2022 22:58:24 - INFO - codeparrot_training - Step 3289: {'lr': 0.0004991108490133296, 'samples': 631680, 'steps': 3289, 'loss/train': 0.14109507948160172} 01/26/2022 22:58:27 - INFO - codeparrot_training - Step 3290: {'lr': 0.0004991094697000202, 'samples': 631872, 'steps': 3290, 'loss/train': 0.5784157812595367} 01/26/2022 22:58:30 - INFO - codeparrot_training - Step 3291: {'lr': 0.000499108089319605, 'samples': 632064, 'steps': 3291, 'loss/train': 0.5820433795452118} 01/26/2022 22:58:33 - INFO - codeparrot_training - Step 3292: {'lr': 0.0004991067078720899, 'samples': 632256, 'steps': 3292, 'loss/train': 1.153558373451233} 01/26/2022 22:58:36 - INFO - codeparrot_training - Step 3293: {'lr': 0.0004991053253574807, 'samples': 632448, 'steps': 3293, 'loss/train': 0.9854021966457367} 01/26/2022 22:58:39 - INFO - codeparrot_training - Step 3294: {'lr': 0.0004991039417757833, 
'samples': 632640, 'steps': 3294, 'loss/train': 1.1021654605865479} 01/26/2022 22:58:46 - INFO - codeparrot_training - Step 3295: {'lr': 0.0004991025571270039, 'samples': 632832, 'steps': 3295, 'loss/train': 0.7969307899475098} 01/26/2022 22:58:49 - INFO - codeparrot_training - Step 3296: {'lr': 0.000499101171411148, 'samples': 633024, 'steps': 3296, 'loss/train': 0.9652641713619232} 01/26/2022 22:58:52 - INFO - codeparrot_training - Step 3297: {'lr': 0.000499099784628222, 'samples': 633216, 'steps': 3297, 'loss/train': 1.0579898357391357} 01/26/2022 22:58:55 - INFO - codeparrot_training - Step 3298: {'lr': 0.0004990983967782316, 'samples': 633408, 'steps': 3298, 'loss/train': 0.84233558177948} 01/26/2022 22:58:58 - INFO - codeparrot_training - Step 3299: {'lr': 0.0004990970078611827, 'samples': 633600, 'steps': 3299, 'loss/train': 0.9046323001384735} 01/26/2022 22:59:02 - INFO - codeparrot_training - Step 3300: {'lr': 0.0004990956178770814, 'samples': 633792, 'steps': 3300, 'loss/train': 0.6016707122325897} 01/26/2022 22:59:05 - INFO - codeparrot_training - Step 3301: {'lr': 0.0004990942268259335, 'samples': 633984, 'steps': 3301, 'loss/train': 0.8941959142684937} 01/26/2022 22:59:08 - INFO - codeparrot_training - Step 3302: {'lr': 0.000499092834707745, 'samples': 634176, 'steps': 3302, 'loss/train': 0.7073316425085068} 01/26/2022 22:59:11 - INFO - codeparrot_training - Step 3303: {'lr': 0.000499091441522522, 'samples': 634368, 'steps': 3303, 'loss/train': 0.42678697407245636} 01/26/2022 22:59:16 - INFO - codeparrot_training - Step 3304: {'lr': 0.0004990900472702702, 'samples': 634560, 'steps': 3304, 'loss/train': 0.8518743216991425} 01/26/2022 22:59:19 - INFO - codeparrot_training - Step 3305: {'lr': 0.0004990886519509959, 'samples': 634752, 'steps': 3305, 'loss/train': 0.6334079504013062} 01/26/2022 22:59:22 - INFO - codeparrot_training - Step 3306: {'lr': 0.0004990872555647048, 'samples': 634944, 'steps': 3306, 'loss/train': 0.7495883703231812} 01/26/2022 22:59:25 - INFO - codeparrot_training - Step 3307: {'lr': 0.0004990858581114029, 'samples': 635136, 'steps': 3307, 'loss/train': 1.156492531299591} 01/26/2022 22:59:28 - INFO - codeparrot_training - Step 3308: {'lr': 0.0004990844595910965, 'samples': 635328, 'steps': 3308, 'loss/train': 0.7994627952575684} 01/26/2022 22:59:31 - INFO - codeparrot_training - Step 3309: {'lr': 0.0004990830600037912, 'samples': 635520, 'steps': 3309, 'loss/train': 0.7568854987621307} 01/26/2022 22:59:35 - INFO - codeparrot_training - Step 3310: {'lr': 0.0004990816593494933, 'samples': 635712, 'steps': 3310, 'loss/train': 0.8288282454013824} 01/26/2022 22:59:38 - INFO - codeparrot_training - Step 3311: {'lr': 0.0004990802576282085, 'samples': 635904, 'steps': 3311, 'loss/train': 0.9526894390583038} 01/26/2022 22:59:41 - INFO - codeparrot_training - Step 3312: {'lr': 0.0004990788548399431, 'samples': 636096, 'steps': 3312, 'loss/train': 0.4023294299840927} 01/26/2022 22:59:47 - INFO - codeparrot_training - Step 3313: {'lr': 0.0004990774509847029, 'samples': 636288, 'steps': 3313, 'loss/train': 0.8199418187141418} 01/26/2022 22:59:50 - INFO - codeparrot_training - Step 3314: {'lr': 0.0004990760460624941, 'samples': 636480, 'steps': 3314, 'loss/train': 0.7004515081644058} 01/26/2022 22:59:54 - INFO - codeparrot_training - Step 3315: {'lr': 0.0004990746400733225, 'samples': 636672, 'steps': 3315, 'loss/train': 0.6201003938913345} 01/26/2022 22:59:57 - INFO - codeparrot_training - Step 3316: {'lr': 0.0004990732330171943, 'samples': 636864, 'steps': 3316, 
'loss/train': 0.5750999003648758} 01/26/2022 23:00:00 - INFO - codeparrot_training - Step 3317: {'lr': 0.0004990718248941154, 'samples': 637056, 'steps': 3317, 'loss/train': 0.9877484142780304} 01/26/2022 23:00:03 - INFO - codeparrot_training - Step 3318: {'lr': 0.0004990704157040919, 'samples': 637248, 'steps': 3318, 'loss/train': 0.7921568155288696} 01/26/2022 23:00:06 - INFO - codeparrot_training - Step 3319: {'lr': 0.0004990690054471299, 'samples': 637440, 'steps': 3319, 'loss/train': 0.2495512068271637} 01/26/2022 23:00:09 - INFO - codeparrot_training - Step 3320: {'lr': 0.0004990675941232354, 'samples': 637632, 'steps': 3320, 'loss/train': 0.45810627937316895} 01/26/2022 23:00:14 - INFO - codeparrot_training - Step 3321: {'lr': 0.0004990661817324142, 'samples': 637824, 'steps': 3321, 'loss/train': 0.8280715048313141} 01/26/2022 23:00:17 - INFO - codeparrot_training - Step 3322: {'lr': 0.0004990647682746727, 'samples': 638016, 'steps': 3322, 'loss/train': 0.9451010227203369} 01/26/2022 23:00:20 - INFO - codeparrot_training - Step 3323: {'lr': 0.0004990633537500169, 'samples': 638208, 'steps': 3323, 'loss/train': 1.004376232624054} 01/26/2022 23:00:23 - INFO - codeparrot_training - Step 3324: {'lr': 0.0004990619381584527, 'samples': 638400, 'steps': 3324, 'loss/train': 1.8370290398597717} 01/26/2022 23:00:26 - INFO - codeparrot_training - Step 3325: {'lr': 0.0004990605214999862, 'samples': 638592, 'steps': 3325, 'loss/train': 0.833541065454483} 01/26/2022 23:00:29 - INFO - codeparrot_training - Step 3326: {'lr': 0.0004990591037746236, 'samples': 638784, 'steps': 3326, 'loss/train': 0.47015243768692017} 01/26/2022 23:00:32 - INFO - codeparrot_training - Step 3327: {'lr': 0.0004990576849823708, 'samples': 638976, 'steps': 3327, 'loss/train': 0.9850050508975983} 01/26/2022 23:00:36 - INFO - codeparrot_training - Step 3328: {'lr': 0.000499056265123234, 'samples': 639168, 'steps': 3328, 'loss/train': 0.664535254240036} 01/26/2022 23:00:39 - INFO - codeparrot_training - Step 3329: {'lr': 0.0004990548441972193, 'samples': 639360, 'steps': 3329, 'loss/train': 0.7856253683567047} 01/26/2022 23:00:43 - INFO - codeparrot_training - Step 3330: {'lr': 0.0004990534222043325, 'samples': 639552, 'steps': 3330, 'loss/train': 0.6515641361474991} 01/26/2022 23:00:46 - INFO - codeparrot_training - Step 3331: {'lr': 0.0004990519991445803, 'samples': 639744, 'steps': 3331, 'loss/train': 1.196165531873703} 01/26/2022 23:00:49 - INFO - codeparrot_training - Step 3332: {'lr': 0.0004990505750179682, 'samples': 639936, 'steps': 3332, 'loss/train': 0.753687858581543} 01/26/2022 23:00:53 - INFO - codeparrot_training - Step 3333: {'lr': 0.0004990491498245024, 'samples': 640128, 'steps': 3333, 'loss/train': 0.6127645969390869} 01/26/2022 23:00:56 - INFO - codeparrot_training - Step 3334: {'lr': 0.0004990477235641893, 'samples': 640320, 'steps': 3334, 'loss/train': 0.924448549747467} 01/26/2022 23:00:59 - INFO - codeparrot_training - Step 3335: {'lr': 0.0004990462962370347, 'samples': 640512, 'steps': 3335, 'loss/train': 0.578912079334259} 01/26/2022 23:01:02 - INFO - codeparrot_training - Step 3336: {'lr': 0.0004990448678430451, 'samples': 640704, 'steps': 3336, 'loss/train': 0.9342729449272156} 01/26/2022 23:01:05 - INFO - codeparrot_training - Step 3337: {'lr': 0.0004990434383822261, 'samples': 640896, 'steps': 3337, 'loss/train': 0.8032689392566681} 01/26/2022 23:01:08 - INFO - codeparrot_training - Step 3338: {'lr': 0.0004990420078545843, 'samples': 641088, 'steps': 3338, 'loss/train': 1.036234974861145} 
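The per-step loss above is noisy (single steps range from roughly 0.11 to 2.7 within this section alone), so the curve is easier to read after parsing the log into structured records and smoothing. A minimal sketch, assuming the entries keep the "Step N: {...}" format shown here and that the log is saved to a file (the path "training.log" is a placeholder, not a name taken from the run):

import ast
import re

# Matches entries of the form:
#   ... - INFO - codeparrot_training - Step 3095: {'lr': ..., 'samples': ..., 'steps': ..., 'loss/train': ...}
ENTRY_RE = re.compile(r"Step (\d+): (\{[^}]*\})")

def parse_log(path="training.log"):
    """Yield (step, lr, samples, loss) tuples from a codeparrot_training log."""
    with open(path) as f:
        text = f.read()
    for match in ENTRY_RE.finditer(text):
        step = int(match.group(1))
        record = ast.literal_eval(match.group(2))  # the logged dict is a valid Python literal
        yield step, record["lr"], record["samples"], record["loss/train"]

def smooth(values, beta=0.98):
    """Exponential moving average; the usual way to read a noisy training-loss curve."""
    avg, out = None, []
    for v in values:
        avg = v if avg is None else beta * avg + (1 - beta) * v
        out.append(avg)
    return out

Smoothing with beta around 0.98 averages over roughly the last 50 steps, which is enough to separate the slow drift of the training loss from the step-to-step scatter visible above.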
01/26/2022 23:01:14 - INFO - codeparrot_training - Step 3339: {'lr': 0.0004990405762601254, 'samples': 641280, 'steps': 3339, 'loss/train': 0.8272891044616699} 01/26/2022 23:01:18 - INFO - codeparrot_training - Step 3340: {'lr': 0.000499039143598856, 'samples': 641472, 'steps': 3340, 'loss/train': 0.8729260861873627} 01/26/2022 23:01:21 - INFO - codeparrot_training - Step 3341: {'lr': 0.0004990377098707818, 'samples': 641664, 'steps': 3341, 'loss/train': 0.62966188788414} 01/26/2022 23:01:24 - INFO - codeparrot_training - Step 3342: {'lr': 0.0004990362750759092, 'samples': 641856, 'steps': 3342, 'loss/train': 0.7245334088802338} 01/26/2022 23:01:27 - INFO - codeparrot_training - Step 3343: {'lr': 0.0004990348392142443, 'samples': 642048, 'steps': 3343, 'loss/train': 0.696128323674202} 01/26/2022 23:01:30 - INFO - codeparrot_training - Step 3344: {'lr': 0.0004990334022857932, 'samples': 642240, 'steps': 3344, 'loss/train': 1.436158150434494} 01/26/2022 23:01:33 - INFO - codeparrot_training - Step 3345: {'lr': 0.0004990319642905619, 'samples': 642432, 'steps': 3345, 'loss/train': 0.9431849420070648} 01/26/2022 23:01:36 - INFO - codeparrot_training - Step 3346: {'lr': 0.000499030525228557, 'samples': 642624, 'steps': 3346, 'loss/train': 1.2131484746932983} 01/26/2022 23:01:41 - INFO - codeparrot_training - Step 3347: {'lr': 0.0004990290850997843, 'samples': 642816, 'steps': 3347, 'loss/train': 1.1203909814357758} 01/26/2022 23:01:44 - INFO - codeparrot_training - Step 3348: {'lr': 0.0004990276439042501, 'samples': 643008, 'steps': 3348, 'loss/train': 0.9579396843910217} 01/26/2022 23:01:47 - INFO - codeparrot_training - Step 3349: {'lr': 0.0004990262016419606, 'samples': 643200, 'steps': 3349, 'loss/train': 0.6519352197647095} 01/26/2022 23:01:51 - INFO - codeparrot_training - Step 3350: {'lr': 0.0004990247583129218, 'samples': 643392, 'steps': 3350, 'loss/train': 0.7867145240306854} 01/26/2022 23:01:54 - INFO - codeparrot_training - Step 3351: {'lr': 0.00049902331391714, 'samples': 643584, 'steps': 3351, 'loss/train': 1.2604939341545105} 01/26/2022 23:01:57 - INFO - codeparrot_training - Step 3352: {'lr': 0.0004990218684546216, 'samples': 643776, 'steps': 3352, 'loss/train': 0.8498329818248749} 01/26/2022 23:02:00 - INFO - codeparrot_training - Step 3353: {'lr': 0.0004990204219253724, 'samples': 643968, 'steps': 3353, 'loss/train': 1.2933508157730103} 01/26/2022 23:02:03 - INFO - codeparrot_training - Step 3354: {'lr': 0.0004990189743293989, 'samples': 644160, 'steps': 3354, 'loss/train': 0.7443737089633942} 01/26/2022 23:02:06 - INFO - codeparrot_training - Step 3355: {'lr': 0.0004990175256667071, 'samples': 644352, 'steps': 3355, 'loss/train': 0.32971034198999405} 01/26/2022 23:02:09 - INFO - codeparrot_training - Step 3356: {'lr': 0.0004990160759373033, 'samples': 644544, 'steps': 3356, 'loss/train': 1.2139603793621063} 01/26/2022 23:02:14 - INFO - codeparrot_training - Step 3357: {'lr': 0.0004990146251411938, 'samples': 644736, 'steps': 3357, 'loss/train': 1.3296758830547333} 01/26/2022 23:02:17 - INFO - codeparrot_training - Step 3358: {'lr': 0.0004990131732783846, 'samples': 644928, 'steps': 3358, 'loss/train': 0.5510297119617462} 01/26/2022 23:02:20 - INFO - codeparrot_training - Step 3359: {'lr': 0.000499011720348882, 'samples': 645120, 'steps': 3359, 'loss/train': 1.0653780698776245} 01/26/2022 23:02:23 - INFO - codeparrot_training - Step 3360: {'lr': 0.0004990102663526924, 'samples': 645312, 'steps': 3360, 'loss/train': 1.0396090149879456} 01/26/2022 23:02:26 - INFO - 
codeparrot_training - Step 3361: {'lr': 0.0004990088112898219, 'samples': 645504, 'steps': 3361, 'loss/train': 0.6450718939304352} 01/26/2022 23:02:29 - INFO - codeparrot_training - Step 3362: {'lr': 0.0004990073551602766, 'samples': 645696, 'steps': 3362, 'loss/train': 0.9418070018291473} 01/26/2022 23:02:33 - INFO - codeparrot_training - Step 3363: {'lr': 0.000499005897964063, 'samples': 645888, 'steps': 3363, 'loss/train': 0.8711890876293182} 01/26/2022 23:02:36 - INFO - codeparrot_training - Step 3364: {'lr': 0.0004990044397011871, 'samples': 646080, 'steps': 3364, 'loss/train': 0.8437177240848541} 01/26/2022 23:02:42 - INFO - codeparrot_training - Step 3365: {'lr': 0.0004990029803716552, 'samples': 646272, 'steps': 3365, 'loss/train': 0.741241917014122} 01/26/2022 23:02:45 - INFO - codeparrot_training - Step 3366: {'lr': 0.0004990015199754736, 'samples': 646464, 'steps': 3366, 'loss/train': 0.9914049804210663} 01/26/2022 23:02:48 - INFO - codeparrot_training - Step 3367: {'lr': 0.0004990000585126486, 'samples': 646656, 'steps': 3367, 'loss/train': 1.1680242419242859} 01/26/2022 23:02:51 - INFO - codeparrot_training - Step 3368: {'lr': 0.0004989985959831865, 'samples': 646848, 'steps': 3368, 'loss/train': 0.8023242652416229} 01/26/2022 23:02:54 - INFO - codeparrot_training - Step 3369: {'lr': 0.0004989971323870934, 'samples': 647040, 'steps': 3369, 'loss/train': 1.1617513597011566} 01/26/2022 23:02:58 - INFO - codeparrot_training - Step 3370: {'lr': 0.0004989956677243757, 'samples': 647232, 'steps': 3370, 'loss/train': 0.5355340987443924} 01/26/2022 23:03:01 - INFO - codeparrot_training - Step 3371: {'lr': 0.0004989942019950395, 'samples': 647424, 'steps': 3371, 'loss/train': 0.8531793057918549} 01/26/2022 23:03:04 - INFO - codeparrot_training - Step 3372: {'lr': 0.0004989927351990912, 'samples': 647616, 'steps': 3372, 'loss/train': 0.894323319196701} 01/26/2022 23:03:07 - INFO - codeparrot_training - Step 3373: {'lr': 0.0004989912673365373, 'samples': 647808, 'steps': 3373, 'loss/train': 0.7593604624271393} 01/26/2022 23:03:11 - INFO - codeparrot_training - Step 3374: {'lr': 0.0004989897984073837, 'samples': 648000, 'steps': 3374, 'loss/train': 0.7157657593488693} 01/26/2022 23:03:15 - INFO - codeparrot_training - Step 3375: {'lr': 0.000498988328411637, 'samples': 648192, 'steps': 3375, 'loss/train': 1.0219756364822388} 01/26/2022 23:03:18 - INFO - codeparrot_training - Step 3376: {'lr': 0.0004989868573493032, 'samples': 648384, 'steps': 3376, 'loss/train': 0.7305191159248352} 01/26/2022 23:03:21 - INFO - codeparrot_training - Step 3377: {'lr': 0.0004989853852203889, 'samples': 648576, 'steps': 3377, 'loss/train': 0.7612543702125549} 01/26/2022 23:03:24 - INFO - codeparrot_training - Step 3378: {'lr': 0.0004989839120249002, 'samples': 648768, 'steps': 3378, 'loss/train': 0.8555670082569122} 01/26/2022 23:03:27 - INFO - codeparrot_training - Step 3379: {'lr': 0.0004989824377628435, 'samples': 648960, 'steps': 3379, 'loss/train': 0.9577430784702301} 01/26/2022 23:03:30 - INFO - codeparrot_training - Step 3380: {'lr': 0.0004989809624342251, 'samples': 649152, 'steps': 3380, 'loss/train': 1.2073418498039246} 01/26/2022 23:03:33 - INFO - codeparrot_training - Step 3381: {'lr': 0.0004989794860390513, 'samples': 649344, 'steps': 3381, 'loss/train': 1.1189488470554352} 01/26/2022 23:03:37 - INFO - codeparrot_training - Step 3382: {'lr': 0.0004989780085773285, 'samples': 649536, 'steps': 3382, 'loss/train': 1.0950740575790405} 01/26/2022 23:03:42 - INFO - codeparrot_training - Step 3383: 
{'lr': 0.0004989765300490628, 'samples': 649728, 'steps': 3383, 'loss/train': 1.1203240156173706} 01/26/2022 23:03:45 - INFO - codeparrot_training - Step 3384: {'lr': 0.0004989750504542609, 'samples': 649920, 'steps': 3384, 'loss/train': 0.7238738536834717} 01/26/2022 23:03:48 - INFO - codeparrot_training - Step 3385: {'lr': 0.0004989735697929289, 'samples': 650112, 'steps': 3385, 'loss/train': 1.8045287132263184} 01/26/2022 23:03:51 - INFO - codeparrot_training - Step 3386: {'lr': 0.0004989720880650731, 'samples': 650304, 'steps': 3386, 'loss/train': 2.176514983177185} 01/26/2022 23:03:55 - INFO - codeparrot_training - Step 3387: {'lr': 0.0004989706052707, 'samples': 650496, 'steps': 3387, 'loss/train': 0.8496333360671997} 01/26/2022 23:03:58 - INFO - codeparrot_training - Step 3388: {'lr': 0.0004989691214098158, 'samples': 650688, 'steps': 3388, 'loss/train': 0.9432154297828674} 01/26/2022 23:04:01 - INFO - codeparrot_training - Step 3389: {'lr': 0.0004989676364824271, 'samples': 650880, 'steps': 3389, 'loss/train': 1.0780064463615417} 01/26/2022 23:04:04 - INFO - codeparrot_training - Step 3390: {'lr': 0.00049896615048854, 'samples': 651072, 'steps': 3390, 'loss/train': 0.7276835292577744} 01/26/2022 23:04:07 - INFO - codeparrot_training - Step 3391: {'lr': 0.000498964663428161, 'samples': 651264, 'steps': 3391, 'loss/train': 1.243201196193695} 01/26/2022 23:04:13 - INFO - codeparrot_training - Step 3392: {'lr': 0.0004989631753012964, 'samples': 651456, 'steps': 3392, 'loss/train': 0.5782046914100647} 01/26/2022 23:04:16 - INFO - codeparrot_training - Step 3393: {'lr': 0.0004989616861079527, 'samples': 651648, 'steps': 3393, 'loss/train': 0.8366221189498901} 01/26/2022 23:04:19 - INFO - codeparrot_training - Step 3394: {'lr': 0.0004989601958481361, 'samples': 651840, 'steps': 3394, 'loss/train': 0.8463922441005707} 01/26/2022 23:04:23 - INFO - codeparrot_training - Step 3395: {'lr': 0.000498958704521853, 'samples': 652032, 'steps': 3395, 'loss/train': 0.933262288570404} 01/26/2022 23:04:26 - INFO - codeparrot_training - Step 3396: {'lr': 0.00049895721212911, 'samples': 652224, 'steps': 3396, 'loss/train': 0.6880873739719391} 01/26/2022 23:04:29 - INFO - codeparrot_training - Step 3397: {'lr': 0.0004989557186699133, 'samples': 652416, 'steps': 3397, 'loss/train': 0.28998812288045883} 01/26/2022 23:04:32 - INFO - codeparrot_training - Step 3398: {'lr': 0.0004989542241442695, 'samples': 652608, 'steps': 3398, 'loss/train': 0.9190465807914734} 01/26/2022 23:04:35 - INFO - codeparrot_training - Step 3399: {'lr': 0.0004989527285521846, 'samples': 652800, 'steps': 3399, 'loss/train': 0.7667238414287567} 01/26/2022 23:04:38 - INFO - codeparrot_training - Step 3400: {'lr': 0.0004989512318936654, 'samples': 652992, 'steps': 3400, 'loss/train': 1.3397120833396912} 01/26/2022 23:04:43 - INFO - codeparrot_training - Step 3401: {'lr': 0.0004989497341687182, 'samples': 653184, 'steps': 3401, 'loss/train': 0.6665136069059372} 01/26/2022 23:04:46 - INFO - codeparrot_training - Step 3402: {'lr': 0.0004989482353773494, 'samples': 653376, 'steps': 3402, 'loss/train': 1.1426078975200653} 01/26/2022 23:04:49 - INFO - codeparrot_training - Step 3403: {'lr': 0.0004989467355195653, 'samples': 653568, 'steps': 3403, 'loss/train': 1.0712773203849792} 01/26/2022 23:04:52 - INFO - codeparrot_training - Step 3404: {'lr': 0.0004989452345953725, 'samples': 653760, 'steps': 3404, 'loss/train': 1.0524961352348328} 01/26/2022 23:04:55 - INFO - codeparrot_training - Step 3405: {'lr': 0.0004989437326047774, 'samples': 
653952, 'steps': 3405, 'loss/train': 0.4569498896598816} 01/26/2022 23:04:58 - INFO - codeparrot_training - Step 3406: {'lr': 0.0004989422295477863, 'samples': 654144, 'steps': 3406, 'loss/train': 0.9809224605560303} 01/26/2022 23:05:02 - INFO - codeparrot_training - Step 3407: {'lr': 0.0004989407254244058, 'samples': 654336, 'steps': 3407, 'loss/train': 1.0962432324886322} 01/26/2022 23:05:05 - INFO - codeparrot_training - Step 3408: {'lr': 0.0004989392202346424, 'samples': 654528, 'steps': 3408, 'loss/train': 1.469344675540924} 01/26/2022 23:05:08 - INFO - codeparrot_training - Step 3409: {'lr': 0.0004989377139785022, 'samples': 654720, 'steps': 3409, 'loss/train': 0.9950678050518036} 01/26/2022 23:05:12 - INFO - codeparrot_training - Step 3410: {'lr': 0.000498936206655992, 'samples': 654912, 'steps': 3410, 'loss/train': 0.9152162075042725} 01/26/2022 23:05:16 - INFO - codeparrot_training - Step 3411: {'lr': 0.0004989346982671181, 'samples': 655104, 'steps': 3411, 'loss/train': 0.7493961006402969} 01/26/2022 23:05:19 - INFO - codeparrot_training - Step 3412: {'lr': 0.0004989331888118869, 'samples': 655296, 'steps': 3412, 'loss/train': 0.9478351771831512} 01/26/2022 23:05:22 - INFO - codeparrot_training - Step 3413: {'lr': 0.0004989316782903052, 'samples': 655488, 'steps': 3413, 'loss/train': 0.6196504980325699} 01/26/2022 23:05:25 - INFO - codeparrot_training - Step 3414: {'lr': 0.0004989301667023791, 'samples': 655680, 'steps': 3414, 'loss/train': 0.6791446208953857} 01/26/2022 23:05:28 - INFO - codeparrot_training - Step 3415: {'lr': 0.0004989286540481152, 'samples': 655872, 'steps': 3415, 'loss/train': 0.9134921729564667} 01/26/2022 23:05:31 - INFO - codeparrot_training - Step 3416: {'lr': 0.00049892714032752, 'samples': 656064, 'steps': 3416, 'loss/train': 1.0181048512458801} 01/26/2022 23:05:34 - INFO - codeparrot_training - Step 3417: {'lr': 0.0004989256255406001, 'samples': 656256, 'steps': 3417, 'loss/train': 0.8543981909751892} 01/26/2022 23:05:38 - INFO - codeparrot_training - Step 3418: {'lr': 0.0004989241096873617, 'samples': 656448, 'steps': 3418, 'loss/train': 1.5064562559127808} 01/26/2022 23:05:44 - INFO - codeparrot_training - Step 3419: {'lr': 0.0004989225927678115, 'samples': 656640, 'steps': 3419, 'loss/train': 0.8290984332561493} 01/26/2022 23:05:47 - INFO - codeparrot_training - Step 3420: {'lr': 0.000498921074781956, 'samples': 656832, 'steps': 3420, 'loss/train': 0.8913812935352325} 01/26/2022 23:05:50 - INFO - codeparrot_training - Step 3421: {'lr': 0.0004989195557298016, 'samples': 657024, 'steps': 3421, 'loss/train': 0.7166685461997986} 01/26/2022 23:05:53 - INFO - codeparrot_training - Step 3422: {'lr': 0.0004989180356113549, 'samples': 657216, 'steps': 3422, 'loss/train': 0.6726335138082504} 01/26/2022 23:05:56 - INFO - codeparrot_training - Step 3423: {'lr': 0.0004989165144266224, 'samples': 657408, 'steps': 3423, 'loss/train': 1.4689233005046844} 01/26/2022 23:05:59 - INFO - codeparrot_training - Step 3424: {'lr': 0.0004989149921756105, 'samples': 657600, 'steps': 3424, 'loss/train': 0.9013668894767761} 01/26/2022 23:06:03 - INFO - codeparrot_training - Step 3425: {'lr': 0.0004989134688583259, 'samples': 657792, 'steps': 3425, 'loss/train': 0.785958856344223} 01/26/2022 23:06:06 - INFO - codeparrot_training - Step 3426: {'lr': 0.000498911944474775, 'samples': 657984, 'steps': 3426, 'loss/train': 0.8202915787696838} 01/26/2022 23:06:09 - INFO - codeparrot_training - Step 3427: {'lr': 0.0004989104190249643, 'samples': 658176, 'steps': 3427, 'loss/train': 
1.0005434453487396} 01/26/2022 23:06:13 - INFO - codeparrot_training - Step 3428: {'lr': 0.0004989088925089005, 'samples': 658368, 'steps': 3428, 'loss/train': 0.9423141181468964} 01/26/2022 23:06:16 - INFO - codeparrot_training - Step 3429: {'lr': 0.00049890736492659, 'samples': 658560, 'steps': 3429, 'loss/train': 1.3740594685077667} 01/26/2022 23:06:19 - INFO - codeparrot_training - Step 3430: {'lr': 0.0004989058362780394, 'samples': 658752, 'steps': 3430, 'loss/train': 1.3195534944534302} 01/26/2022 23:06:23 - INFO - codeparrot_training - Step 3431: {'lr': 0.0004989043065632552, 'samples': 658944, 'steps': 3431, 'loss/train': 0.9550465643405914} 01/26/2022 23:06:26 - INFO - codeparrot_training - Step 3432: {'lr': 0.0004989027757822441, 'samples': 659136, 'steps': 3432, 'loss/train': 0.4924685060977936} 01/26/2022 23:06:29 - INFO - codeparrot_training - Step 3433: {'lr': 0.0004989012439350124, 'samples': 659328, 'steps': 3433, 'loss/train': 0.750621110200882} 01/26/2022 23:06:32 - INFO - codeparrot_training - Step 3434: {'lr': 0.0004988997110215668, 'samples': 659520, 'steps': 3434, 'loss/train': 0.9732092320919037} 01/26/2022 23:06:35 - INFO - codeparrot_training - Step 3435: {'lr': 0.0004988981770419141, 'samples': 659712, 'steps': 3435, 'loss/train': 1.3945174813270569} 01/26/2022 23:06:38 - INFO - codeparrot_training - Step 3436: {'lr': 0.0004988966419960605, 'samples': 659904, 'steps': 3436, 'loss/train': 0.5546827465295792} 01/26/2022 23:06:43 - INFO - codeparrot_training - Step 3437: {'lr': 0.0004988951058840127, 'samples': 660096, 'steps': 3437, 'loss/train': 1.1871053874492645} 01/26/2022 23:06:46 - INFO - codeparrot_training - Step 3438: {'lr': 0.0004988935687057773, 'samples': 660288, 'steps': 3438, 'loss/train': 0.8423788547515869} 01/26/2022 23:06:49 - INFO - codeparrot_training - Step 3439: {'lr': 0.0004988920304613609, 'samples': 660480, 'steps': 3439, 'loss/train': 0.6588937193155289} 01/26/2022 23:06:52 - INFO - codeparrot_training - Step 3440: {'lr': 0.00049889049115077, 'samples': 660672, 'steps': 3440, 'loss/train': 1.0389866530895233} 01/26/2022 23:06:55 - INFO - codeparrot_training - Step 3441: {'lr': 0.0004988889507740113, 'samples': 660864, 'steps': 3441, 'loss/train': 0.21414439380168915} 01/26/2022 23:06:59 - INFO - codeparrot_training - Step 3442: {'lr': 0.0004988874093310914, 'samples': 661056, 'steps': 3442, 'loss/train': 1.1915831565856934} 01/26/2022 23:07:02 - INFO - codeparrot_training - Step 3443: {'lr': 0.000498885866822017, 'samples': 661248, 'steps': 3443, 'loss/train': 0.8059501647949219} 01/26/2022 23:07:05 - INFO - codeparrot_training - Step 3444: {'lr': 0.0004988843232467944, 'samples': 661440, 'steps': 3444, 'loss/train': 0.261421836912632} 01/26/2022 23:07:11 - INFO - codeparrot_training - Step 3445: {'lr': 0.0004988827786054304, 'samples': 661632, 'steps': 3445, 'loss/train': 0.6394649147987366} 01/26/2022 23:07:14 - INFO - codeparrot_training - Step 3446: {'lr': 0.0004988812328979317, 'samples': 661824, 'steps': 3446, 'loss/train': 1.0005225241184235} 01/26/2022 23:07:17 - INFO - codeparrot_training - Step 3447: {'lr': 0.0004988796861243046, 'samples': 662016, 'steps': 3447, 'loss/train': 0.422732949256897} 01/26/2022 23:07:21 - INFO - codeparrot_training - Step 3448: {'lr': 0.0004988781382845562, 'samples': 662208, 'steps': 3448, 'loss/train': 1.0667278468608856} 01/26/2022 23:07:24 - INFO - codeparrot_training - Step 3449: {'lr': 0.0004988765893786929, 'samples': 662400, 'steps': 3449, 'loss/train': 1.2116923034191132} 01/26/2022 23:07:27 - 
INFO - codeparrot_training - Step 3450: {'lr': 0.0004988750394067211, 'samples': 662592, 'steps': 3450, 'loss/train': 0.6238459646701813} 01/26/2022 23:07:30 - INFO - codeparrot_training - Step 3451: {'lr': 0.0004988734883686479, 'samples': 662784, 'steps': 3451, 'loss/train': 0.7555444836616516} 01/26/2022 23:07:33 - INFO - codeparrot_training - Step 3452: {'lr': 0.0004988719362644795, 'samples': 662976, 'steps': 3452, 'loss/train': 0.910333514213562} 01/26/2022 23:07:36 - INFO - codeparrot_training - Step 3453: {'lr': 0.0004988703830942228, 'samples': 663168, 'steps': 3453, 'loss/train': 0.5283038914203644} 01/26/2022 23:07:41 - INFO - codeparrot_training - Step 3454: {'lr': 0.0004988688288578845, 'samples': 663360, 'steps': 3454, 'loss/train': 0.8638629019260406} 01/26/2022 23:07:44 - INFO - codeparrot_training - Step 3455: {'lr': 0.0004988672735554711, 'samples': 663552, 'steps': 3455, 'loss/train': 0.8743799328804016} 01/26/2022 23:07:47 - INFO - codeparrot_training - Step 3456: {'lr': 0.0004988657171869893, 'samples': 663744, 'steps': 3456, 'loss/train': 0.9962558448314667} 01/26/2022 23:07:50 - INFO - codeparrot_training - Step 3457: {'lr': 0.0004988641597524458, 'samples': 663936, 'steps': 3457, 'loss/train': 0.7042672634124756} 01/26/2022 23:07:53 - INFO - codeparrot_training - Step 3458: {'lr': 0.0004988626012518473, 'samples': 664128, 'steps': 3458, 'loss/train': 0.8665454685688019} 01/26/2022 23:07:56 - INFO - codeparrot_training - Step 3459: {'lr': 0.0004988610416852004, 'samples': 664320, 'steps': 3459, 'loss/train': 1.0815126299858093} 01/26/2022 23:07:59 - INFO - codeparrot_training - Step 3460: {'lr': 0.0004988594810525118, 'samples': 664512, 'steps': 3460, 'loss/train': 0.9468939900398254} 01/26/2022 23:08:03 - INFO - codeparrot_training - Step 3461: {'lr': 0.0004988579193537883, 'samples': 664704, 'steps': 3461, 'loss/train': 1.0378364324569702} 01/26/2022 23:08:06 - INFO - codeparrot_training - Step 3462: {'lr': 0.0004988563565890364, 'samples': 664896, 'steps': 3462, 'loss/train': 1.0310713648796082} 01/26/2022 23:08:12 - INFO - codeparrot_training - Step 3463: {'lr': 0.000498854792758263, 'samples': 665088, 'steps': 3463, 'loss/train': 0.9088804721832275} 01/26/2022 23:08:15 - INFO - codeparrot_training - Step 3464: {'lr': 0.0004988532278614745, 'samples': 665280, 'steps': 3464, 'loss/train': 1.0673575401306152} 01/26/2022 23:08:19 - INFO - codeparrot_training - Step 3465: {'lr': 0.0004988516618986779, 'samples': 665472, 'steps': 3465, 'loss/train': 0.8007838726043701} 01/26/2022 23:08:22 - INFO - codeparrot_training - Step 3466: {'lr': 0.0004988500948698799, 'samples': 665664, 'steps': 3466, 'loss/train': 1.3248580992221832} 01/26/2022 23:08:25 - INFO - codeparrot_training - Step 3467: {'lr': 0.000498848526775087, 'samples': 665856, 'steps': 3467, 'loss/train': 1.2212061882019043} 01/26/2022 23:08:28 - INFO - codeparrot_training - Step 3468: {'lr': 0.0004988469576143059, 'samples': 666048, 'steps': 3468, 'loss/train': 0.8970865309238434} 01/26/2022 23:08:31 - INFO - codeparrot_training - Step 3469: {'lr': 0.0004988453873875437, 'samples': 666240, 'steps': 3469, 'loss/train': 0.6108669340610504} 01/26/2022 23:08:34 - INFO - codeparrot_training - Step 3470: {'lr': 0.0004988438160948068, 'samples': 666432, 'steps': 3470, 'loss/train': 1.2089430391788483} 01/26/2022 23:08:38 - INFO - codeparrot_training - Step 3471: {'lr': 0.000498842243736102, 'samples': 666624, 'steps': 3471, 'loss/train': 0.11250380426645279} 01/26/2022 23:08:42 - INFO - codeparrot_training - Step 
3472: {'lr': 0.000498840670311436, 'samples': 666816, 'steps': 3472, 'loss/train': 0.6885092407464981} 01/26/2022 23:08:45 - INFO - codeparrot_training - Step 3473: {'lr': 0.0004988390958208156, 'samples': 667008, 'steps': 3473, 'loss/train': 1.0614052712917328} 01/26/2022 23:08:48 - INFO - codeparrot_training - Step 3474: {'lr': 0.0004988375202642475, 'samples': 667200, 'steps': 3474, 'loss/train': 0.7545152306556702} 01/26/2022 23:08:51 - INFO - codeparrot_training - Step 3475: {'lr': 0.0004988359436417385, 'samples': 667392, 'steps': 3475, 'loss/train': 0.7210916429758072} 01/26/2022 23:08:54 - INFO - codeparrot_training - Step 3476: {'lr': 0.0004988343659532954, 'samples': 667584, 'steps': 3476, 'loss/train': 0.5656009912490845} 01/26/2022 23:08:58 - INFO - codeparrot_training - Step 3477: {'lr': 0.0004988327871989249, 'samples': 667776, 'steps': 3477, 'loss/train': 0.1403643786907196} 01/26/2022 23:09:01 - INFO - codeparrot_training - Step 3478: {'lr': 0.0004988312073786336, 'samples': 667968, 'steps': 3478, 'loss/train': 0.86668261885643} 01/26/2022 23:09:04 - INFO - codeparrot_training - Step 3479: {'lr': 0.0004988296264924286, 'samples': 668160, 'steps': 3479, 'loss/train': 0.9628993570804596} 01/26/2022 23:09:07 - INFO - codeparrot_training - Step 3480: {'lr': 0.0004988280445403164, 'samples': 668352, 'steps': 3480, 'loss/train': 1.0496500432491302} 01/26/2022 23:09:12 - INFO - codeparrot_training - Step 3481: {'lr': 0.0004988264615223038, 'samples': 668544, 'steps': 3481, 'loss/train': 1.039917916059494} 01/26/2022 23:09:15 - INFO - codeparrot_training - Step 3482: {'lr': 0.0004988248774383978, 'samples': 668736, 'steps': 3482, 'loss/train': 0.7871565520763397} 01/26/2022 23:09:18 - INFO - codeparrot_training - Step 3483: {'lr': 0.0004988232922886049, 'samples': 668928, 'steps': 3483, 'loss/train': 0.6844160109758377} 01/26/2022 23:09:22 - INFO - codeparrot_training - Step 3484: {'lr': 0.0004988217060729321, 'samples': 669120, 'steps': 3484, 'loss/train': 0.7865399122238159} 01/26/2022 23:09:25 - INFO - codeparrot_training - Step 3485: {'lr': 0.0004988201187913861, 'samples': 669312, 'steps': 3485, 'loss/train': 0.2008211687207222} 01/26/2022 23:09:28 - INFO - codeparrot_training - Step 3486: {'lr': 0.0004988185304439737, 'samples': 669504, 'steps': 3486, 'loss/train': 0.8412896990776062} 01/26/2022 23:09:31 - INFO - codeparrot_training - Step 3487: {'lr': 0.0004988169410307018, 'samples': 669696, 'steps': 3487, 'loss/train': 0.9697532951831818} 01/26/2022 23:09:34 - INFO - codeparrot_training - Step 3488: {'lr': 0.0004988153505515771, 'samples': 669888, 'steps': 3488, 'loss/train': 1.728425681591034} 01/26/2022 23:09:39 - INFO - codeparrot_training - Step 3489: {'lr': 0.0004988137590066064, 'samples': 670080, 'steps': 3489, 'loss/train': 1.2039638757705688} 01/26/2022 23:09:42 - INFO - codeparrot_training - Step 3490: {'lr': 0.0004988121663957966, 'samples': 670272, 'steps': 3490, 'loss/train': 1.3312985301017761} 01/26/2022 23:09:45 - INFO - codeparrot_training - Step 3491: {'lr': 0.0004988105727191546, 'samples': 670464, 'steps': 3491, 'loss/train': 0.5942160040140152} 01/26/2022 23:09:48 - INFO - codeparrot_training - Step 3492: {'lr': 0.0004988089779766869, 'samples': 670656, 'steps': 3492, 'loss/train': 1.276122808456421} 01/26/2022 23:09:51 - INFO - codeparrot_training - Step 3493: {'lr': 0.0004988073821684006, 'samples': 670848, 'steps': 3493, 'loss/train': 1.0278385877609253} 01/26/2022 23:09:54 - INFO - codeparrot_training - Step 3494: {'lr': 0.0004988057852943025, 
'samples': 671040, 'steps': 3494, 'loss/train': 1.0297375917434692} 01/26/2022 23:09:57 - INFO - codeparrot_training - Step 3495: {'lr': 0.0004988041873543995, 'samples': 671232, 'steps': 3495, 'loss/train': 0.7507414519786835} 01/26/2022 23:10:01 - INFO - codeparrot_training - Step 3496: {'lr': 0.0004988025883486983, 'samples': 671424, 'steps': 3496, 'loss/train': 0.5614122301340103} 01/26/2022 23:10:04 - INFO - codeparrot_training - Step 3497: {'lr': 0.0004988009882772058, 'samples': 671616, 'steps': 3497, 'loss/train': 1.155599445104599} 01/26/2022 23:10:10 - INFO - codeparrot_training - Step 3498: {'lr': 0.0004987993871399289, 'samples': 671808, 'steps': 3498, 'loss/train': 1.1550453007221222} 01/26/2022 23:10:13 - INFO - codeparrot_training - Step 3499: {'lr': 0.0004987977849368744, 'samples': 672000, 'steps': 3499, 'loss/train': 1.4273580014705658} 01/26/2022 23:10:16 - INFO - codeparrot_training - Step 3500: {'lr': 0.0004987961816680492, 'samples': 672192, 'steps': 3500, 'loss/train': 0.76985102891922} 01/26/2022 23:10:20 - INFO - codeparrot_training - Step 3501: {'lr': 0.0004987945773334602, 'samples': 672384, 'steps': 3501, 'loss/train': 0.6798747181892395} 01/26/2022 23:10:23 - INFO - codeparrot_training - Step 3502: {'lr': 0.0004987929719331142, 'samples': 672576, 'steps': 3502, 'loss/train': 0.9337702095508575} 01/26/2022 23:10:26 - INFO - codeparrot_training - Step 3503: {'lr': 0.0004987913654670181, 'samples': 672768, 'steps': 3503, 'loss/train': 1.9593796133995056} 01/26/2022 23:10:29 - INFO - codeparrot_training - Step 3504: {'lr': 0.0004987897579351787, 'samples': 672960, 'steps': 3504, 'loss/train': 0.6814756840467453} 01/26/2022 23:10:32 - INFO - codeparrot_training - Step 3505: {'lr': 0.0004987881493376032, 'samples': 673152, 'steps': 3505, 'loss/train': 0.6682507395744324} 01/26/2022 23:10:35 - INFO - codeparrot_training - Step 3506: {'lr': 0.0004987865396742981, 'samples': 673344, 'steps': 3506, 'loss/train': 0.9115349650382996} 01/26/2022 23:10:40 - INFO - codeparrot_training - Step 3507: {'lr': 0.0004987849289452705, 'samples': 673536, 'steps': 3507, 'loss/train': 0.43426497280597687} 01/26/2022 23:10:43 - INFO - codeparrot_training - Step 3508: {'lr': 0.0004987833171505272, 'samples': 673728, 'steps': 3508, 'loss/train': 0.46005673706531525} 01/26/2022 23:10:47 - INFO - codeparrot_training - Step 3509: {'lr': 0.0004987817042900753, 'samples': 673920, 'steps': 3509, 'loss/train': 0.42020469903945923} 01/26/2022 23:10:50 - INFO - codeparrot_training - Step 3510: {'lr': 0.0004987800903639216, 'samples': 674112, 'steps': 3510, 'loss/train': 1.0017637610435486} 01/26/2022 23:10:53 - INFO - codeparrot_training - Step 3511: {'lr': 0.0004987784753720728, 'samples': 674304, 'steps': 3511, 'loss/train': 0.6124793887138367} 01/26/2022 23:10:56 - INFO - codeparrot_training - Step 3512: {'lr': 0.0004987768593145362, 'samples': 674496, 'steps': 3512, 'loss/train': 0.6916363835334778} 01/26/2022 23:10:59 - INFO - codeparrot_training - Step 3513: {'lr': 0.0004987752421913185, 'samples': 674688, 'steps': 3513, 'loss/train': 0.4831202030181885} 01/26/2022 23:11:02 - INFO - codeparrot_training - Step 3514: {'lr': 0.0004987736240024264, 'samples': 674880, 'steps': 3514, 'loss/train': 0.576916292309761} 01/26/2022 23:11:05 - INFO - codeparrot_training - Step 3515: {'lr': 0.0004987720047478673, 'samples': 675072, 'steps': 3515, 'loss/train': 1.0383233428001404} 01/26/2022 23:11:10 - INFO - codeparrot_training - Step 3516: {'lr': 0.000498770384427648, 'samples': 675264, 'steps': 3516, 
'loss/train': 0.8653996288776398} 01/26/2022 23:11:13 - INFO - codeparrot_training - Step 3517: {'lr': 0.0004987687630417753, 'samples': 675456, 'steps': 3517, 'loss/train': 1.0176049768924713} 01/26/2022 23:11:16 - INFO - codeparrot_training - Step 3518: {'lr': 0.0004987671405902562, 'samples': 675648, 'steps': 3518, 'loss/train': 1.8130323886871338} 01/26/2022 23:11:19 - INFO - codeparrot_training - Step 3519: {'lr': 0.0004987655170730976, 'samples': 675840, 'steps': 3519, 'loss/train': 0.9789048135280609} 01/26/2022 23:11:22 - INFO - codeparrot_training - Step 3520: {'lr': 0.0004987638924903066, 'samples': 676032, 'steps': 3520, 'loss/train': 0.5999565124511719} 01/26/2022 23:11:26 - INFO - codeparrot_training - Step 3521: {'lr': 0.00049876226684189, 'samples': 676224, 'steps': 3521, 'loss/train': 0.546031728386879} 01/26/2022 23:11:29 - INFO - codeparrot_training - Step 3522: {'lr': 0.0004987606401278549, 'samples': 676416, 'steps': 3522, 'loss/train': 0.807611882686615} 01/26/2022 23:11:32 - INFO - codeparrot_training - Step 3523: {'lr': 0.0004987590123482082, 'samples': 676608, 'steps': 3523, 'loss/train': 0.7037413269281387} 01/26/2022 23:11:35 - INFO - codeparrot_training - Step 3524: {'lr': 0.0004987573835029569, 'samples': 676800, 'steps': 3524, 'loss/train': 0.6685998290777206} 01/26/2022 23:11:41 - INFO - codeparrot_training - Step 3525: {'lr': 0.0004987557535921079, 'samples': 676992, 'steps': 3525, 'loss/train': 0.7950996458530426} 01/26/2022 23:11:44 - INFO - codeparrot_training - Step 3526: {'lr': 0.0004987541226156683, 'samples': 677184, 'steps': 3526, 'loss/train': 0.8862772285938263} 01/26/2022 23:11:48 - INFO - codeparrot_training - Step 3527: {'lr': 0.0004987524905736451, 'samples': 677376, 'steps': 3527, 'loss/train': 1.1140244007110596} 01/26/2022 23:11:51 - INFO - codeparrot_training - Step 3528: {'lr': 0.000498750857466045, 'samples': 677568, 'steps': 3528, 'loss/train': 0.9616683125495911} 01/26/2022 23:11:54 - INFO - codeparrot_training - Step 3529: {'lr': 0.0004987492232928753, 'samples': 677760, 'steps': 3529, 'loss/train': 0.7830295264720917} 01/26/2022 23:11:57 - INFO - codeparrot_training - Step 3530: {'lr': 0.000498747588054143, 'samples': 677952, 'steps': 3530, 'loss/train': 0.8430259823799133} 01/26/2022 23:12:00 - INFO - codeparrot_training - Step 3531: {'lr': 0.0004987459517498549, 'samples': 678144, 'steps': 3531, 'loss/train': 0.43376424908638} 01/26/2022 23:12:03 - INFO - codeparrot_training - Step 3532: {'lr': 0.0004987443143800182, 'samples': 678336, 'steps': 3532, 'loss/train': 0.4295644015073776} 01/26/2022 23:12:06 - INFO - codeparrot_training - Step 3533: {'lr': 0.0004987426759446398, 'samples': 678528, 'steps': 3533, 'loss/train': 1.1131397187709808} 01/26/2022 23:12:11 - INFO - codeparrot_training - Step 3534: {'lr': 0.0004987410364437269, 'samples': 678720, 'steps': 3534, 'loss/train': 0.8183103203773499} 01/26/2022 23:12:14 - INFO - codeparrot_training - Step 3535: {'lr': 0.0004987393958772862, 'samples': 678912, 'steps': 3535, 'loss/train': 1.202182799577713} 01/26/2022 23:12:17 - INFO - codeparrot_training - Step 3536: {'lr': 0.0004987377542453251, 'samples': 679104, 'steps': 3536, 'loss/train': 1.0004086196422577} 01/26/2022 23:12:20 - INFO - codeparrot_training - Step 3537: {'lr': 0.0004987361115478502, 'samples': 679296, 'steps': 3537, 'loss/train': 0.7032854408025742} 01/26/2022 23:12:23 - INFO - codeparrot_training - Step 3538: {'lr': 0.000498734467784869, 'samples': 679488, 'steps': 3538, 'loss/train': 0.9592151641845703} 01/26/2022 
23:12:27 - INFO - codeparrot_training - Step 3539: {'lr': 0.0004987328229563883, 'samples': 679680, 'steps': 3539, 'loss/train': 1.4277535378932953} 01/26/2022 23:12:30 - INFO - codeparrot_training - Step 3540: {'lr': 0.0004987311770624151, 'samples': 679872, 'steps': 3540, 'loss/train': 0.3747042864561081} 01/26/2022 23:12:33 - INFO - codeparrot_training - Step 3541: {'lr': 0.0004987295301029565, 'samples': 680064, 'steps': 3541, 'loss/train': 0.8155333399772644} 01/26/2022 23:12:37 - INFO - codeparrot_training - Step 3542: {'lr': 0.0004987278820780196, 'samples': 680256, 'steps': 3542, 'loss/train': 0.4623965620994568} 01/26/2022 23:12:40 - INFO - codeparrot_training - Step 3543: {'lr': 0.0004987262329876114, 'samples': 680448, 'steps': 3543, 'loss/train': 0.7306657433509827} 01/26/2022 23:12:43 - INFO - codeparrot_training - Step 3544: {'lr': 0.000498724582831739, 'samples': 680640, 'steps': 3544, 'loss/train': 0.8007720708847046} 01/26/2022 23:12:47 - INFO - codeparrot_training - Step 3545: {'lr': 0.0004987229316104095, 'samples': 680832, 'steps': 3545, 'loss/train': 1.1258904933929443} 01/26/2022 23:12:50 - INFO - codeparrot_training - Step 3546: {'lr': 0.00049872127932363, 'samples': 681024, 'steps': 3546, 'loss/train': 0.5097497552633286} 01/26/2022 23:12:53 - INFO - codeparrot_training - Step 3547: {'lr': 0.0004987196259714074, 'samples': 681216, 'steps': 3547, 'loss/train': 0.6768006384372711} 01/26/2022 23:12:56 - INFO - codeparrot_training - Step 3548: {'lr': 0.000498717971553749, 'samples': 681408, 'steps': 3548, 'loss/train': 0.6267824023962021} 01/26/2022 23:12:59 - INFO - codeparrot_training - Step 3549: {'lr': 0.0004987163160706617, 'samples': 681600, 'steps': 3549, 'loss/train': 0.8343434035778046} 01/26/2022 23:13:02 - INFO - codeparrot_training - Step 3550: {'lr': 0.0004987146595221527, 'samples': 681792, 'steps': 3550, 'loss/train': 1.1337076127529144} 01/26/2022 23:13:07 - INFO - codeparrot_training - Step 3551: {'lr': 0.0004987130019082291, 'samples': 681984, 'steps': 3551, 'loss/train': 1.0472382009029388} 01/26/2022 23:13:10 - INFO - codeparrot_training - Step 3552: {'lr': 0.000498711343228898, 'samples': 682176, 'steps': 3552, 'loss/train': 1.3223441541194916} 01/26/2022 23:13:14 - INFO - codeparrot_training - Step 3553: {'lr': 0.0004987096834841665, 'samples': 682368, 'steps': 3553, 'loss/train': 0.39941637217998505} 01/26/2022 23:13:17 - INFO - codeparrot_training - Step 3554: {'lr': 0.0004987080226740416, 'samples': 682560, 'steps': 3554, 'loss/train': 0.7795284390449524} 01/26/2022 23:13:20 - INFO - codeparrot_training - Step 3555: {'lr': 0.0004987063607985305, 'samples': 682752, 'steps': 3555, 'loss/train': 2.2629974484443665} 01/26/2022 23:13:23 - INFO - codeparrot_training - Step 3556: {'lr': 0.0004987046978576404, 'samples': 682944, 'steps': 3556, 'loss/train': 1.0906520783901215} 01/26/2022 23:13:26 - INFO - codeparrot_training - Step 3557: {'lr': 0.0004987030338513783, 'samples': 683136, 'steps': 3557, 'loss/train': 1.1148298680782318} 01/26/2022 23:13:29 - INFO - codeparrot_training - Step 3558: {'lr': 0.0004987013687797514, 'samples': 683328, 'steps': 3558, 'loss/train': 0.5412906557321548} 01/26/2022 23:13:32 - INFO - codeparrot_training - Step 3559: {'lr': 0.0004986997026427668, 'samples': 683520, 'steps': 3559, 'loss/train': 0.8675212562084198} 01/26/2022 23:13:39 - INFO - codeparrot_training - Step 3560: {'lr': 0.0004986980354404316, 'samples': 683712, 'steps': 3560, 'loss/train': 1.3221834897994995} 01/26/2022 23:13:42 - INFO - codeparrot_training 
- Step 3561: {'lr': 0.000498696367172753, 'samples': 683904, 'steps': 3561, 'loss/train': 1.3358522951602936} 01/26/2022 23:13:45 - INFO - codeparrot_training - Step 3562: {'lr': 0.0004986946978397382, 'samples': 684096, 'steps': 3562, 'loss/train': 0.4751574844121933} 01/26/2022 23:13:48 - INFO - codeparrot_training - Step 3563: {'lr': 0.0004986930274413942, 'samples': 684288, 'steps': 3563, 'loss/train': 0.9189262390136719} 01/26/2022 23:13:52 - INFO - codeparrot_training - Step 3564: {'lr': 0.0004986913559777283, 'samples': 684480, 'steps': 3564, 'loss/train': 0.6495265066623688} 01/26/2022 23:13:55 - INFO - codeparrot_training - Step 3565: {'lr': 0.0004986896834487477, 'samples': 684672, 'steps': 3565, 'loss/train': 1.0174423456192017} 01/26/2022 23:13:58 - INFO - codeparrot_training - Step 3566: {'lr': 0.0004986880098544593, 'samples': 684864, 'steps': 3566, 'loss/train': 0.976658821105957} 01/26/2022 23:14:01 - INFO - codeparrot_training - Step 3567: {'lr': 0.0004986863351948705, 'samples': 685056, 'steps': 3567, 'loss/train': 0.9586613774299622} 01/26/2022 23:14:04 - INFO - codeparrot_training - Step 3568: {'lr': 0.0004986846594699883, 'samples': 685248, 'steps': 3568, 'loss/train': 1.1336173117160797} 01/26/2022 23:14:09 - INFO - codeparrot_training - Step 3569: {'lr': 0.0004986829826798202, 'samples': 685440, 'steps': 3569, 'loss/train': 1.366103321313858} 01/26/2022 23:14:12 - INFO - codeparrot_training - Step 3570: {'lr': 0.0004986813048243729, 'samples': 685632, 'steps': 3570, 'loss/train': 0.9043460190296173} 01/26/2022 23:14:15 - INFO - codeparrot_training - Step 3571: {'lr': 0.000498679625903654, 'samples': 685824, 'steps': 3571, 'loss/train': 0.6891858726739883} 01/26/2022 23:14:18 - INFO - codeparrot_training - Step 3572: {'lr': 0.0004986779459176706, 'samples': 686016, 'steps': 3572, 'loss/train': 1.3364554345607758} 01/26/2022 23:14:21 - INFO - codeparrot_training - Step 3573: {'lr': 0.0004986762648664298, 'samples': 686208, 'steps': 3573, 'loss/train': 0.6674574762582779} 01/26/2022 23:14:25 - INFO - codeparrot_training - Step 3574: {'lr': 0.0004986745827499389, 'samples': 686400, 'steps': 3574, 'loss/train': 2.249790608882904} 01/26/2022 23:14:28 - INFO - codeparrot_training - Step 3575: {'lr': 0.0004986728995682049, 'samples': 686592, 'steps': 3575, 'loss/train': 0.915315181016922} 01/26/2022 23:14:31 - INFO - codeparrot_training - Step 3576: {'lr': 0.0004986712153212352, 'samples': 686784, 'steps': 3576, 'loss/train': 1.1725983917713165} 01/26/2022 23:14:37 - INFO - codeparrot_training - Step 3577: {'lr': 0.0004986695300090371, 'samples': 686976, 'steps': 3577, 'loss/train': 0.9421500563621521} 01/26/2022 23:14:40 - INFO - codeparrot_training - Step 3578: {'lr': 0.0004986678436316175, 'samples': 687168, 'steps': 3578, 'loss/train': 1.0411916971206665} 01/26/2022 23:14:43 - INFO - codeparrot_training - Step 3579: {'lr': 0.000498666156188984, 'samples': 687360, 'steps': 3579, 'loss/train': 1.7173989415168762} 01/26/2022 23:14:46 - INFO - codeparrot_training - Step 3580: {'lr': 0.0004986644676811436, 'samples': 687552, 'steps': 3580, 'loss/train': 0.6146013289690018} 01/26/2022 23:14:50 - INFO - codeparrot_training - Step 3581: {'lr': 0.0004986627781081035, 'samples': 687744, 'steps': 3581, 'loss/train': 1.1088705360889435} 01/26/2022 23:14:53 - INFO - codeparrot_training - Step 3582: {'lr': 0.0004986610874698712, 'samples': 687936, 'steps': 3582, 'loss/train': 0.5077158361673355} 01/26/2022 23:14:56 - INFO - codeparrot_training - Step 3583: {'lr': 
0.0004986593957664536, 'samples': 688128, 'steps': 3583, 'loss/train': 0.625507190823555} 01/26/2022 23:14:59 - INFO - codeparrot_training - Step 3584: {'lr': 0.0004986577029978581, 'samples': 688320, 'steps': 3584, 'loss/train': 1.083919197320938} 01/26/2022 23:15:02 - INFO - codeparrot_training - Step 3585: {'lr': 0.000498656009164092, 'samples': 688512, 'steps': 3585, 'loss/train': 0.4094402343034744} 01/26/2022 23:15:07 - INFO - codeparrot_training - Step 3586: {'lr': 0.0004986543142651625, 'samples': 688704, 'steps': 3586, 'loss/train': 0.9997166097164154} 01/26/2022 23:15:10 - INFO - codeparrot_training - Step 3587: {'lr': 0.0004986526183010769, 'samples': 688896, 'steps': 3587, 'loss/train': 0.5691585838794708} 01/26/2022 23:15:13 - INFO - codeparrot_training - Step 3588: {'lr': 0.0004986509212718425, 'samples': 689088, 'steps': 3588, 'loss/train': 0.44922034442424774} 01/26/2022 23:15:16 - INFO - codeparrot_training - Step 3589: {'lr': 0.0004986492231774664, 'samples': 689280, 'steps': 3589, 'loss/train': 1.1283352375030518} 01/26/2022 23:15:19 - INFO - codeparrot_training - Step 3590: {'lr': 0.0004986475240179559, 'samples': 689472, 'steps': 3590, 'loss/train': 0.8099814355373383} 01/26/2022 23:15:22 - INFO - codeparrot_training - Step 3591: {'lr': 0.0004986458237933185, 'samples': 689664, 'steps': 3591, 'loss/train': 0.9982381761074066} 01/26/2022 23:15:25 - INFO - codeparrot_training - Step 3592: {'lr': 0.0004986441225035614, 'samples': 689856, 'steps': 3592, 'loss/train': 1.1407145261764526} 01/26/2022 23:15:29 - INFO - codeparrot_training - Step 3593: {'lr': 0.0004986424201486918, 'samples': 690048, 'steps': 3593, 'loss/train': 1.1117941439151764} 01/26/2022 23:15:32 - INFO - codeparrot_training - Step 3594: {'lr': 0.000498640716728717, 'samples': 690240, 'steps': 3594, 'loss/train': 0.3931679427623749} 01/26/2022 23:15:36 - INFO - codeparrot_training - Step 3595: {'lr': 0.0004986390122436443, 'samples': 690432, 'steps': 3595, 'loss/train': 0.48713448643684387} 01/26/2022 23:15:39 - INFO - codeparrot_training - Step 3596: {'lr': 0.000498637306693481, 'samples': 690624, 'steps': 3596, 'loss/train': 1.0181270241737366} 01/26/2022 23:15:42 - INFO - codeparrot_training - Step 3597: {'lr': 0.0004986356000782345, 'samples': 690816, 'steps': 3597, 'loss/train': 1.0641497075557709} 01/26/2022 23:15:46 - INFO - codeparrot_training - Step 3598: {'lr': 0.0004986338923979119, 'samples': 691008, 'steps': 3598, 'loss/train': 0.4452457129955292} 01/26/2022 23:15:49 - INFO - codeparrot_training - Step 3599: {'lr': 0.0004986321836525209, 'samples': 691200, 'steps': 3599, 'loss/train': 0.41190366446971893} 01/26/2022 23:15:52 - INFO - codeparrot_training - Step 3600: {'lr': 0.0004986304738420684, 'samples': 691392, 'steps': 3600, 'loss/train': 0.8729208111763} 01/26/2022 23:15:55 - INFO - codeparrot_training - Step 3601: {'lr': 0.0004986287629665619, 'samples': 691584, 'steps': 3601, 'loss/train': 0.8319123387336731} 01/26/2022 23:15:58 - INFO - codeparrot_training - Step 3602: {'lr': 0.0004986270510260087, 'samples': 691776, 'steps': 3602, 'loss/train': 0.9446446001529694} 01/26/2022 23:16:01 - INFO - codeparrot_training - Step 3603: {'lr': 0.0004986253380204163, 'samples': 691968, 'steps': 3603, 'loss/train': 1.3737148940563202} 01/26/2022 23:16:07 - INFO - codeparrot_training - Step 3604: {'lr': 0.0004986236239497918, 'samples': 692160, 'steps': 3604, 'loss/train': 0.7350283563137054} 01/26/2022 23:16:11 - INFO - codeparrot_training - Step 3605: {'lr': 0.0004986219088141426, 'samples': 
692352, 'steps': 3605, 'loss/train': 1.0546115934848785} 01/26/2022 23:16:14 - INFO - codeparrot_training - Step 3606: {'lr': 0.0004986201926134761, 'samples': 692544, 'steps': 3606, 'loss/train': 0.9937415421009064} 01/26/2022 23:16:17 - INFO - codeparrot_training - Step 3607: {'lr': 0.0004986184753477998, 'samples': 692736, 'steps': 3607, 'loss/train': 0.9873069226741791} 01/26/2022 23:16:20 - INFO - codeparrot_training - Step 3608: {'lr': 0.0004986167570171208, 'samples': 692928, 'steps': 3608, 'loss/train': 0.6657139509916306} 01/26/2022 23:16:23 - INFO - codeparrot_training - Step 3609: {'lr': 0.0004986150376214465, 'samples': 693120, 'steps': 3609, 'loss/train': 0.8554791212081909} 01/26/2022 23:16:26 - INFO - codeparrot_training - Step 3610: {'lr': 0.0004986133171607844, 'samples': 693312, 'steps': 3610, 'loss/train': 1.1330173015594482} 01/26/2022 23:16:29 - INFO - codeparrot_training - Step 3611: {'lr': 0.0004986115956351417, 'samples': 693504, 'steps': 3611, 'loss/train': 0.34765197336673737} 01/26/2022 23:16:33 - INFO - codeparrot_training - Step 3612: {'lr': 0.000498609873044526, 'samples': 693696, 'steps': 3612, 'loss/train': 0.8094501793384552} 01/26/2022 23:16:37 - INFO - codeparrot_training - Step 3613: {'lr': 0.0004986081493889444, 'samples': 693888, 'steps': 3613, 'loss/train': 1.1447930932044983} 01/26/2022 23:16:40 - INFO - codeparrot_training - Step 3614: {'lr': 0.0004986064246684046, 'samples': 694080, 'steps': 3614, 'loss/train': 0.7717787325382233} 01/26/2022 23:16:43 - INFO - codeparrot_training - Step 3615: {'lr': 0.0004986046988829136, 'samples': 694272, 'steps': 3615, 'loss/train': 0.963932454586029} 01/26/2022 23:16:46 - INFO - codeparrot_training - Step 3616: {'lr': 0.0004986029720324791, 'samples': 694464, 'steps': 3616, 'loss/train': 0.7372507899999619} 01/26/2022 23:16:50 - INFO - codeparrot_training - Step 3617: {'lr': 0.0004986012441171085, 'samples': 694656, 'steps': 3617, 'loss/train': 1.1633365452289581} 01/26/2022 23:16:53 - INFO - codeparrot_training - Step 3618: {'lr': 0.000498599515136809, 'samples': 694848, 'steps': 3618, 'loss/train': 0.9011070728302002} 01/26/2022 23:16:56 - INFO - codeparrot_training - Step 3619: {'lr': 0.0004985977850915882, 'samples': 695040, 'steps': 3619, 'loss/train': 0.8045866191387177} 01/26/2022 23:16:59 - INFO - codeparrot_training - Step 3620: {'lr': 0.0004985960539814534, 'samples': 695232, 'steps': 3620, 'loss/train': 1.0367517471313477} 01/26/2022 23:17:02 - INFO - codeparrot_training - Step 3621: {'lr': 0.000498594321806412, 'samples': 695424, 'steps': 3621, 'loss/train': 0.5073650032281876} 01/26/2022 23:17:14 - INFO - codeparrot_training - Step 3622: {'lr': 0.0004985925885664716, 'samples': 695616, 'steps': 3622, 'loss/train': 1.3225988745689392} 01/26/2022 23:17:17 - INFO - codeparrot_training - Step 3623: {'lr': 0.0004985908542616393, 'samples': 695808, 'steps': 3623, 'loss/train': 0.911689281463623} 01/26/2022 23:17:21 - INFO - codeparrot_training - Step 3624: {'lr': 0.0004985891188919229, 'samples': 696000, 'steps': 3624, 'loss/train': 1.0117626786231995} 01/26/2022 23:17:24 - INFO - codeparrot_training - Step 3625: {'lr': 0.0004985873824573296, 'samples': 696192, 'steps': 3625, 'loss/train': 1.0375434458255768} 01/26/2022 23:17:27 - INFO - codeparrot_training - Step 3626: {'lr': 0.0004985856449578667, 'samples': 696384, 'steps': 3626, 'loss/train': 0.6563795953989029} 01/26/2022 23:17:30 - INFO - codeparrot_training - Step 3627: {'lr': 0.0004985839063935421, 'samples': 696576, 'steps': 3627, 'loss/train': 
1.262283056974411} 01/26/2022 23:17:33 - INFO - codeparrot_training - Step 3628: {'lr': 0.0004985821667643628, 'samples': 696768, 'steps': 3628, 'loss/train': 0.7780786156654358} 01/26/2022 23:17:36 - INFO - codeparrot_training - Step 3629: {'lr': 0.0004985804260703364, 'samples': 696960, 'steps': 3629, 'loss/train': 1.1060768365859985} 01/26/2022 23:17:39 - INFO - codeparrot_training - Step 3630: {'lr': 0.0004985786843114706, 'samples': 697152, 'steps': 3630, 'loss/train': 0.675699770450592} 01/26/2022 23:17:44 - INFO - codeparrot_training - Step 3631: {'lr': 0.0004985769414877725, 'samples': 697344, 'steps': 3631, 'loss/train': 0.7946071028709412} 01/26/2022 23:17:47 - INFO - codeparrot_training - Step 3632: {'lr': 0.0004985751975992497, 'samples': 697536, 'steps': 3632, 'loss/train': 1.1245575249195099} 01/26/2022 23:17:50 - INFO - codeparrot_training - Step 3633: {'lr': 0.0004985734526459098, 'samples': 697728, 'steps': 3633, 'loss/train': 0.960069090127945} 01/26/2022 23:17:53 - INFO - codeparrot_training - Step 3634: {'lr': 0.0004985717066277601, 'samples': 697920, 'steps': 3634, 'loss/train': 0.672697439789772} 01/26/2022 23:17:56 - INFO - codeparrot_training - Step 3635: {'lr': 0.0004985699595448081, 'samples': 698112, 'steps': 3635, 'loss/train': 1.1461104154586792} 01/26/2022 23:17:59 - INFO - codeparrot_training - Step 3636: {'lr': 0.0004985682113970613, 'samples': 698304, 'steps': 3636, 'loss/train': 1.190938800573349} 01/26/2022 23:18:03 - INFO - codeparrot_training - Step 3637: {'lr': 0.0004985664621845273, 'samples': 698496, 'steps': 3637, 'loss/train': 1.096005767583847} 01/26/2022 23:18:06 - INFO - codeparrot_training - Step 3638: {'lr': 0.0004985647119072135, 'samples': 698688, 'steps': 3638, 'loss/train': 1.0905040204524994} 01/26/2022 23:18:10 - INFO - codeparrot_training - Step 3639: {'lr': 0.0004985629605651273, 'samples': 698880, 'steps': 3639, 'loss/train': 1.0089076459407806} 01/26/2022 23:18:13 - INFO - codeparrot_training - Step 3640: {'lr': 0.0004985612081582763, 'samples': 699072, 'steps': 3640, 'loss/train': 0.8841873407363892} 01/26/2022 23:18:16 - INFO - codeparrot_training - Step 3641: {'lr': 0.0004985594546866682, 'samples': 699264, 'steps': 3641, 'loss/train': 0.9438735544681549} 01/26/2022 23:18:20 - INFO - codeparrot_training - Step 3642: {'lr': 0.0004985577001503102, 'samples': 699456, 'steps': 3642, 'loss/train': 0.9265691041946411} 01/26/2022 23:18:23 - INFO - codeparrot_training - Step 3643: {'lr': 0.0004985559445492099, 'samples': 699648, 'steps': 3643, 'loss/train': 0.9146486520767212} 01/26/2022 23:18:26 - INFO - codeparrot_training - Step 3644: {'lr': 0.0004985541878833749, 'samples': 699840, 'steps': 3644, 'loss/train': 1.0394000709056854} 01/26/2022 23:18:29 - INFO - codeparrot_training - Step 3645: {'lr': 0.0004985524301528127, 'samples': 700032, 'steps': 3645, 'loss/train': 0.9431679546833038} 01/26/2022 23:18:32 - INFO - codeparrot_training - Step 3646: {'lr': 0.0004985506713575307, 'samples': 700224, 'steps': 3646, 'loss/train': 0.7472154200077057} 01/26/2022 23:18:35 - INFO - codeparrot_training - Step 3647: {'lr': 0.0004985489114975368, 'samples': 700416, 'steps': 3647, 'loss/train': 0.7634583413600922} 01/26/2022 23:18:41 - INFO - codeparrot_training - Step 3648: {'lr': 0.0004985471505728381, 'samples': 700608, 'steps': 3648, 'loss/train': 0.9807338118553162} 01/26/2022 23:18:44 - INFO - codeparrot_training - Step 3649: {'lr': 0.0004985453885834423, 'samples': 700800, 'steps': 3649, 'loss/train': 0.5948038548231125} 01/26/2022 23:18:47 
- INFO - codeparrot_training - Step 3650: {'lr': 0.0004985436255293571, 'samples': 700992, 'steps': 3650, 'loss/train': 0.336322121322155} 01/26/2022 23:18:51 - INFO - codeparrot_training - Step 3651: {'lr': 0.0004985418614105898, 'samples': 701184, 'steps': 3651, 'loss/train': 0.9844797849655151} 01/26/2022 23:18:54 - INFO - codeparrot_training - Step 3652: {'lr': 0.0004985400962271482, 'samples': 701376, 'steps': 3652, 'loss/train': 0.9375941455364227} 01/26/2022 23:18:57 - INFO - codeparrot_training - Step 3653: {'lr': 0.0004985383299790397, 'samples': 701568, 'steps': 3653, 'loss/train': 0.8276392221450806} 01/26/2022 23:19:00 - INFO - codeparrot_training - Step 3654: {'lr': 0.0004985365626662719, 'samples': 701760, 'steps': 3654, 'loss/train': 1.0705838799476624} 01/26/2022 23:19:03 - INFO - codeparrot_training - Step 3655: {'lr': 0.0004985347942888524, 'samples': 701952, 'steps': 3655, 'loss/train': 0.6470370143651962} 01/26/2022 23:19:06 - INFO - codeparrot_training - Step 3656: {'lr': 0.0004985330248467888, 'samples': 702144, 'steps': 3656, 'loss/train': 0.7450163662433624} 01/26/2022 23:19:11 - INFO - codeparrot_training - Step 3657: {'lr': 0.0004985312543400886, 'samples': 702336, 'steps': 3657, 'loss/train': 1.1274625360965729} 01/26/2022 23:19:14 - INFO - codeparrot_training - Step 3658: {'lr': 0.0004985294827687594, 'samples': 702528, 'steps': 3658, 'loss/train': 0.07253782823681831} 01/26/2022 23:19:17 - INFO - codeparrot_training - Step 3659: {'lr': 0.0004985277101328088, 'samples': 702720, 'steps': 3659, 'loss/train': 0.9502056241035461} 01/26/2022 23:19:20 - INFO - codeparrot_training - Step 3660: {'lr': 0.0004985259364322445, 'samples': 702912, 'steps': 3660, 'loss/train': 0.8285295367240906} 01/26/2022 23:19:23 - INFO - codeparrot_training - Step 3661: {'lr': 0.0004985241616670739, 'samples': 703104, 'steps': 3661, 'loss/train': 0.6813288778066635} 01/26/2022 23:19:26 - INFO - codeparrot_training - Step 3662: {'lr': 0.0004985223858373048, 'samples': 703296, 'steps': 3662, 'loss/train': 0.8577052652835846} 01/26/2022 23:19:30 - INFO - codeparrot_training - Step 3663: {'lr': 0.0004985206089429447, 'samples': 703488, 'steps': 3663, 'loss/train': 0.7957110106945038} 01/26/2022 23:19:33 - INFO - codeparrot_training - Step 3664: {'lr': 0.0004985188309840012, 'samples': 703680, 'steps': 3664, 'loss/train': 0.6844681799411774} 01/26/2022 23:19:36 - INFO - codeparrot_training - Step 3665: {'lr': 0.0004985170519604819, 'samples': 703872, 'steps': 3665, 'loss/train': 0.9190719723701477} 01/26/2022 23:19:40 - INFO - codeparrot_training - Step 3666: {'lr': 0.0004985152718723944, 'samples': 704064, 'steps': 3666, 'loss/train': 0.8057535588741302} 01/26/2022 23:19:43 - INFO - codeparrot_training - Step 3667: {'lr': 0.0004985134907197466, 'samples': 704256, 'steps': 3667, 'loss/train': 0.8867939114570618} 01/26/2022 23:19:46 - INFO - codeparrot_training - Step 3668: {'lr': 0.0004985117085025458, 'samples': 704448, 'steps': 3668, 'loss/train': 0.6047841757535934} 01/26/2022 23:19:50 - INFO - codeparrot_training - Step 3669: {'lr': 0.0004985099252207998, 'samples': 704640, 'steps': 3669, 'loss/train': 0.9221009910106659} 01/26/2022 23:19:53 - INFO - codeparrot_training - Step 3670: {'lr': 0.0004985081408745161, 'samples': 704832, 'steps': 3670, 'loss/train': 1.0662732124328613} 01/26/2022 23:19:56 - INFO - codeparrot_training - Step 3671: {'lr': 0.0004985063554637025, 'samples': 705024, 'steps': 3671, 'loss/train': 1.324745625257492} 01/26/2022 23:19:59 - INFO - codeparrot_training - Step 
3672: {'lr': 0.0004985045689883665, 'samples': 705216, 'steps': 3672, 'loss/train': 0.5143138021230698} 01/26/2022 23:20:02 - INFO - codeparrot_training - Step 3673: {'lr': 0.0004985027814485159, 'samples': 705408, 'steps': 3673, 'loss/train': 0.9413469135761261} 01/26/2022 23:20:05 - INFO - codeparrot_training - Step 3674: {'lr': 0.0004985009928441584, 'samples': 705600, 'steps': 3674, 'loss/train': 1.208923727273941} 01/26/2022 23:20:10 - INFO - codeparrot_training - Step 3675: {'lr': 0.0004984992031753014, 'samples': 705792, 'steps': 3675, 'loss/train': 0.076290562748909} 01/26/2022 23:20:13 - INFO - codeparrot_training - Step 3676: {'lr': 0.0004984974124419528, 'samples': 705984, 'steps': 3676, 'loss/train': 0.8439990878105164} 01/26/2022 23:20:16 - INFO - codeparrot_training - Step 3677: {'lr': 0.0004984956206441201, 'samples': 706176, 'steps': 3677, 'loss/train': 1.0292843878269196} 01/26/2022 23:20:20 - INFO - codeparrot_training - Step 3678: {'lr': 0.0004984938277818112, 'samples': 706368, 'steps': 3678, 'loss/train': 0.7091749906539917} 01/26/2022 23:20:23 - INFO - codeparrot_training - Step 3679: {'lr': 0.0004984920338550335, 'samples': 706560, 'steps': 3679, 'loss/train': 1.2433115243911743} 01/26/2022 23:20:26 - INFO - codeparrot_training - Step 3680: {'lr': 0.0004984902388637949, 'samples': 706752, 'steps': 3680, 'loss/train': 1.1573538780212402} 01/26/2022 23:20:29 - INFO - codeparrot_training - Step 3681: {'lr': 0.0004984884428081031, 'samples': 706944, 'steps': 3681, 'loss/train': 0.6394312530755997} 01/26/2022 23:20:32 - INFO - codeparrot_training - Step 3682: {'lr': 0.0004984866456879657, 'samples': 707136, 'steps': 3682, 'loss/train': 0.9051013290882111} 01/26/2022 23:20:35 - INFO - codeparrot_training - Step 3683: {'lr': 0.0004984848475033903, 'samples': 707328, 'steps': 3683, 'loss/train': 0.8951087594032288} 01/26/2022 23:20:42 - INFO - codeparrot_training - Step 3684: {'lr': 0.0004984830482543847, 'samples': 707520, 'steps': 3684, 'loss/train': 1.006563663482666} 01/26/2022 23:20:45 - INFO - codeparrot_training - Step 3685: {'lr': 0.0004984812479409568, 'samples': 707712, 'steps': 3685, 'loss/train': 0.6751502752304077} 01/26/2022 23:20:48 - INFO - codeparrot_training - Step 3686: {'lr': 0.000498479446563114, 'samples': 707904, 'steps': 3686, 'loss/train': 1.1661420464515686} 01/26/2022 23:20:51 - INFO - codeparrot_training - Step 3687: {'lr': 0.0004984776441208642, 'samples': 708096, 'steps': 3687, 'loss/train': 0.8337242603302002} 01/26/2022 23:20:54 - INFO - codeparrot_training - Step 3688: {'lr': 0.000498475840614215, 'samples': 708288, 'steps': 3688, 'loss/train': 0.6630101054906845} 01/26/2022 23:20:58 - INFO - codeparrot_training - Step 3689: {'lr': 0.0004984740360431742, 'samples': 708480, 'steps': 3689, 'loss/train': 0.7735016047954559} 01/26/2022 23:21:01 - INFO - codeparrot_training - Step 3690: {'lr': 0.0004984722304077496, 'samples': 708672, 'steps': 3690, 'loss/train': 1.0463787317276} 01/26/2022 23:21:04 - INFO - codeparrot_training - Step 3691: {'lr': 0.0004984704237079489, 'samples': 708864, 'steps': 3691, 'loss/train': 1.0303286612033844} 01/26/2022 23:21:08 - INFO - codeparrot_training - Step 3692: {'lr': 0.0004984686159437798, 'samples': 709056, 'steps': 3692, 'loss/train': 1.3746975660324097} 01/26/2022 23:21:11 - INFO - codeparrot_training - Step 3693: {'lr': 0.00049846680711525, 'samples': 709248, 'steps': 3693, 'loss/train': 1.111884891986847} 01/26/2022 23:21:15 - INFO - codeparrot_training - Step 3694: {'lr': 0.0004984649972223673, 
'samples': 709440, 'steps': 3694, 'loss/train': 0.9374993741512299} 01/26/2022 23:21:18 - INFO - codeparrot_training - Step 3695: {'lr': 0.0004984631862651395, 'samples': 709632, 'steps': 3695, 'loss/train': 0.8234342336654663} 01/26/2022 23:21:21 - INFO - codeparrot_training - Step 3696: {'lr': 0.0004984613742435742, 'samples': 709824, 'steps': 3696, 'loss/train': 1.1671507358551025} 01/26/2022 23:21:24 - INFO - codeparrot_training - Step 3697: {'lr': 0.0004984595611576793, 'samples': 710016, 'steps': 3697, 'loss/train': 1.3755386173725128} 01/26/2022 23:21:27 - INFO - codeparrot_training - Step 3698: {'lr': 0.0004984577470074625, 'samples': 710208, 'steps': 3698, 'loss/train': 0.8951567709445953} 01/26/2022 23:21:30 - INFO - codeparrot_training - Step 3699: {'lr': 0.0004984559317929317, 'samples': 710400, 'steps': 3699, 'loss/train': 0.08389728516340256} 01/26/2022 23:21:33 - INFO - codeparrot_training - Step 3700: {'lr': 0.0004984541155140946, 'samples': 710592, 'steps': 3700, 'loss/train': 0.3605138137936592} 01/26/2022 23:21:38 - INFO - codeparrot_training - Step 3701: {'lr': 0.0004984522981709589, 'samples': 710784, 'steps': 3701, 'loss/train': 1.125177651643753} 01/26/2022 23:21:41 - INFO - codeparrot_training - Step 3702: {'lr': 0.0004984504797635324, 'samples': 710976, 'steps': 3702, 'loss/train': 0.5316541492938995} 01/26/2022 23:21:44 - INFO - codeparrot_training - Step 3703: {'lr': 0.000498448660291823, 'samples': 711168, 'steps': 3703, 'loss/train': 1.1036417484283447} 01/26/2022 23:21:47 - INFO - codeparrot_training - Step 3704: {'lr': 0.0004984468397558384, 'samples': 711360, 'steps': 3704, 'loss/train': 0.3453776612877846} 01/26/2022 23:21:50 - INFO - codeparrot_training - Step 3705: {'lr': 0.0004984450181555864, 'samples': 711552, 'steps': 3705, 'loss/train': 0.8637085855007172} 01/26/2022 23:21:54 - INFO - codeparrot_training - Step 3706: {'lr': 0.0004984431954910749, 'samples': 711744, 'steps': 3706, 'loss/train': 1.2165125012397766} 01/26/2022 23:21:57 - INFO - codeparrot_training - Step 3707: {'lr': 0.0004984413717623117, 'samples': 711936, 'steps': 3707, 'loss/train': 1.0253919661045074} 01/26/2022 23:22:00 - INFO - codeparrot_training - Step 3708: {'lr': 0.0004984395469693044, 'samples': 712128, 'steps': 3708, 'loss/train': 0.9327195882797241} 01/26/2022 23:22:03 - INFO - codeparrot_training - Step 3709: {'lr': 0.000498437721112061, 'samples': 712320, 'steps': 3709, 'loss/train': 1.2881193459033966} 01/26/2022 23:22:09 - INFO - codeparrot_training - Step 3710: {'lr': 0.0004984358941905894, 'samples': 712512, 'steps': 3710, 'loss/train': 0.642622634768486} 01/26/2022 23:22:12 - INFO - codeparrot_training - Step 3711: {'lr': 0.0004984340662048972, 'samples': 712704, 'steps': 3711, 'loss/train': 0.7749103009700775} 01/26/2022 23:22:16 - INFO - codeparrot_training - Step 3712: {'lr': 0.0004984322371549924, 'samples': 712896, 'steps': 3712, 'loss/train': 0.182144645601511} 01/26/2022 23:22:19 - INFO - codeparrot_training - Step 3713: {'lr': 0.0004984304070408828, 'samples': 713088, 'steps': 3713, 'loss/train': 0.6407246589660645} 01/26/2022 23:22:22 - INFO - codeparrot_training - Step 3714: {'lr': 0.0004984285758625761, 'samples': 713280, 'steps': 3714, 'loss/train': 1.0897387862205505} 01/26/2022 23:22:25 - INFO - codeparrot_training - Step 3715: {'lr': 0.0004984267436200805, 'samples': 713472, 'steps': 3715, 'loss/train': 0.8288021385669708} 01/26/2022 23:22:28 - INFO - codeparrot_training - Step 3716: {'lr': 0.0004984249103134035, 'samples': 713664, 'steps': 3716, 
'loss/train': 1.1546903550624847} 01/26/2022 23:22:31 - INFO - codeparrot_training - Step 3717: {'lr': 0.000498423075942553, 'samples': 713856, 'steps': 3717, 'loss/train': 0.602114349603653} 01/26/2022 23:22:34 - INFO - codeparrot_training - Step 3718: {'lr': 0.0004984212405075369, 'samples': 714048, 'steps': 3718, 'loss/train': 0.4431838095188141} 01/26/2022 23:22:39 - INFO - codeparrot_training - Step 3719: {'lr': 0.0004984194040083632, 'samples': 714240, 'steps': 3719, 'loss/train': 0.9682289063930511} 01/26/2022 23:22:42 - INFO - codeparrot_training - Step 3720: {'lr': 0.0004984175664450397, 'samples': 714432, 'steps': 3720, 'loss/train': 0.5904970765113831} 01/26/2022 23:22:45 - INFO - codeparrot_training - Step 3721: {'lr': 0.0004984157278175741, 'samples': 714624, 'steps': 3721, 'loss/train': 0.8838883638381958} 01/26/2022 23:22:48 - INFO - codeparrot_training - Step 3722: {'lr': 0.0004984138881259744, 'samples': 714816, 'steps': 3722, 'loss/train': 0.4775403141975403} 01/26/2022 23:22:51 - INFO - codeparrot_training - Step 3723: {'lr': 0.0004984120473702486, 'samples': 715008, 'steps': 3723, 'loss/train': 0.7145626991987228} 01/26/2022 23:22:55 - INFO - codeparrot_training - Step 3724: {'lr': 0.0004984102055504044, 'samples': 715200, 'steps': 3724, 'loss/train': 1.0590972304344177} 01/26/2022 23:22:58 - INFO - codeparrot_training - Step 3725: {'lr': 0.0004984083626664497, 'samples': 715392, 'steps': 3725, 'loss/train': 0.6633213311433792} 01/26/2022 23:23:01 - INFO - codeparrot_training - Step 3726: {'lr': 0.0004984065187183925, 'samples': 715584, 'steps': 3726, 'loss/train': 0.352688692510128} 01/26/2022 23:23:04 - INFO - codeparrot_training - Step 3727: {'lr': 0.0004984046737062407, 'samples': 715776, 'steps': 3727, 'loss/train': 1.2124418914318085} 01/26/2022 23:23:11 - INFO - codeparrot_training - Step 3728: {'lr': 0.0004984028276300021, 'samples': 715968, 'steps': 3728, 'loss/train': 0.8313522040843964} 01/26/2022 23:23:14 - INFO - codeparrot_training - Step 3729: {'lr': 0.0004984009804896846, 'samples': 716160, 'steps': 3729, 'loss/train': 1.6890597939491272} 01/26/2022 23:23:17 - INFO - codeparrot_training - Step 3730: {'lr': 0.0004983991322852963, 'samples': 716352, 'steps': 3730, 'loss/train': 0.9736234545707703} 01/26/2022 23:23:20 - INFO - codeparrot_training - Step 3731: {'lr': 0.000498397283016845, 'samples': 716544, 'steps': 3731, 'loss/train': 1.0694261491298676} 01/26/2022 23:23:23 - INFO - codeparrot_training - Step 3732: {'lr': 0.0004983954326843386, 'samples': 716736, 'steps': 3732, 'loss/train': 1.0857925415039062} 01/26/2022 23:23:26 - INFO - codeparrot_training - Step 3733: {'lr': 0.000498393581287785, 'samples': 716928, 'steps': 3733, 'loss/train': 0.574594035744667} 01/26/2022 23:23:30 - INFO - codeparrot_training - Step 3734: {'lr': 0.0004983917288271921, 'samples': 717120, 'steps': 3734, 'loss/train': 0.7173343598842621} 01/26/2022 23:23:33 - INFO - codeparrot_training - Step 3735: {'lr': 0.0004983898753025681, 'samples': 717312, 'steps': 3735, 'loss/train': 0.6105244159698486} 01/26/2022 23:23:36 - INFO - codeparrot_training - Step 3736: {'lr': 0.0004983880207139205, 'samples': 717504, 'steps': 3736, 'loss/train': 0.5013457685709} 01/26/2022 23:23:40 - INFO - codeparrot_training - Step 3737: {'lr': 0.0004983861650612577, 'samples': 717696, 'steps': 3737, 'loss/train': 0.9963938891887665} 01/26/2022 23:23:43 - INFO - codeparrot_training - Step 3738: {'lr': 0.0004983843083445873, 'samples': 717888, 'steps': 3738, 'loss/train': 0.8196808397769928} 
01/26/2022 23:23:47 - INFO - codeparrot_training - Step 3739: {'lr': 0.0004983824505639175, 'samples': 718080, 'steps': 3739, 'loss/train': 1.0600842833518982} 01/26/2022 23:23:50 - INFO - codeparrot_training - Step 3740: {'lr': 0.000498380591719256, 'samples': 718272, 'steps': 3740, 'loss/train': 0.8824985325336456} 01/26/2022 23:23:53 - INFO - codeparrot_training - Step 3741: {'lr': 0.0004983787318106111, 'samples': 718464, 'steps': 3741, 'loss/train': 0.995827853679657} 01/26/2022 23:23:56 - INFO - codeparrot_training - Step 3742: {'lr': 0.0004983768708379905, 'samples': 718656, 'steps': 3742, 'loss/train': 1.3336502015590668} 01/26/2022 23:23:59 - INFO - codeparrot_training - Step 3743: {'lr': 0.0004983750088014023, 'samples': 718848, 'steps': 3743, 'loss/train': 1.1187188029289246} 01/26/2022 23:24:02 - INFO - codeparrot_training - Step 3744: {'lr': 0.0004983731457008544, 'samples': 719040, 'steps': 3744, 'loss/train': 0.714343249797821} 01/26/2022 23:24:05 - INFO - codeparrot_training - Step 3745: {'lr': 0.0004983712815363548, 'samples': 719232, 'steps': 3745, 'loss/train': 1.1926473677158356} 01/26/2022 23:24:10 - INFO - codeparrot_training - Step 3746: {'lr': 0.0004983694163079115, 'samples': 719424, 'steps': 3746, 'loss/train': 1.6023852825164795} 01/26/2022 23:24:13 - INFO - codeparrot_training - Step 3747: {'lr': 0.0004983675500155325, 'samples': 719616, 'steps': 3747, 'loss/train': 0.9113150238990784} 01/26/2022 23:24:16 - INFO - codeparrot_training - Step 3748: {'lr': 0.0004983656826592258, 'samples': 719808, 'steps': 3748, 'loss/train': 0.8016282320022583} 01/26/2022 23:24:19 - INFO - codeparrot_training - Step 3749: {'lr': 0.0004983638142389993, 'samples': 720000, 'steps': 3749, 'loss/train': 0.8759652078151703} 01/26/2022 23:24:22 - INFO - codeparrot_training - Step 3750: {'lr': 0.000498361944754861, 'samples': 720192, 'steps': 3750, 'loss/train': 1.1177653670310974} 01/26/2022 23:24:26 - INFO - codeparrot_training - Step 3751: {'lr': 0.0004983600742068192, 'samples': 720384, 'steps': 3751, 'loss/train': 1.3080310821533203} 01/26/2022 23:24:29 - INFO - codeparrot_training - Step 3752: {'lr': 0.0004983582025948816, 'samples': 720576, 'steps': 3752, 'loss/train': 1.16409033536911} 01/26/2022 23:24:32 - INFO - codeparrot_training - Step 3753: {'lr': 0.0004983563299190564, 'samples': 720768, 'steps': 3753, 'loss/train': 1.0002285540103912} 01/26/2022 23:24:35 - INFO - codeparrot_training - Step 3754: {'lr': 0.0004983544561793515, 'samples': 720960, 'steps': 3754, 'loss/train': 0.472054660320282} 01/26/2022 23:24:42 - INFO - codeparrot_training - Step 3755: {'lr': 0.000498352581375775, 'samples': 721152, 'steps': 3755, 'loss/train': 0.7076060771942139} 01/26/2022 23:24:45 - INFO - codeparrot_training - Step 3756: {'lr': 0.0004983507055083349, 'samples': 721344, 'steps': 3756, 'loss/train': 0.649177148938179} 01/26/2022 23:24:48 - INFO - codeparrot_training - Step 3757: {'lr': 0.0004983488285770391, 'samples': 721536, 'steps': 3757, 'loss/train': 0.6854746788740158} 01/26/2022 23:24:51 - INFO - codeparrot_training - Step 3758: {'lr': 0.000498346950581896, 'samples': 721728, 'steps': 3758, 'loss/train': 1.1404270827770233} 01/26/2022 23:24:54 - INFO - codeparrot_training - Step 3759: {'lr': 0.0004983450715229132, 'samples': 721920, 'steps': 3759, 'loss/train': 0.2544739097356796} 01/26/2022 23:24:57 - INFO - codeparrot_training - Step 3760: {'lr': 0.000498343191400099, 'samples': 722112, 'steps': 3760, 'loss/train': 0.6262269169092178} 01/26/2022 23:25:00 - INFO - 
codeparrot_training - Step 3761: {'lr': 0.0004983413102134616, 'samples': 722304, 'steps': 3761, 'loss/train': 0.7196769118309021} 01/26/2022 23:25:04 - INFO - codeparrot_training - Step 3762: {'lr': 0.0004983394279630088, 'samples': 722496, 'steps': 3762, 'loss/train': 1.1562857329845428} 01/26/2022 23:25:07 - INFO - codeparrot_training - Step 3763: {'lr': 0.0004983375446487488, 'samples': 722688, 'steps': 3763, 'loss/train': 0.9136307537555695} 01/26/2022 23:25:11 - INFO - codeparrot_training - Step 3764: {'lr': 0.0004983356602706895, 'samples': 722880, 'steps': 3764, 'loss/train': 1.0532222092151642} 01/26/2022 23:25:14 - INFO - codeparrot_training - Step 3765: {'lr': 0.0004983337748288391, 'samples': 723072, 'steps': 3765, 'loss/train': 1.1372467875480652} 01/26/2022 23:25:17 - INFO - codeparrot_training - Step 3766: {'lr': 0.0004983318883232058, 'samples': 723264, 'steps': 3766, 'loss/train': 0.9381504356861115} 01/26/2022 23:25:21 - INFO - codeparrot_training - Step 3767: {'lr': 0.0004983300007537974, 'samples': 723456, 'steps': 3767, 'loss/train': 0.201431505382061} 01/26/2022 23:25:24 - INFO - codeparrot_training - Step 3768: {'lr': 0.0004983281121206222, 'samples': 723648, 'steps': 3768, 'loss/train': 1.5608893632888794} 01/26/2022 23:25:27 - INFO - codeparrot_training - Step 3769: {'lr': 0.0004983262224236882, 'samples': 723840, 'steps': 3769, 'loss/train': 1.5201348066329956} 01/26/2022 23:25:30 - INFO - codeparrot_training - Step 3770: {'lr': 0.0004983243316630035, 'samples': 724032, 'steps': 3770, 'loss/train': 0.18016504123806953} 01/26/2022 23:25:33 - INFO - codeparrot_training - Step 3771: {'lr': 0.0004983224398385762, 'samples': 724224, 'steps': 3771, 'loss/train': 0.559074729681015} 01/26/2022 23:25:36 - INFO - codeparrot_training - Step 3772: {'lr': 0.0004983205469504144, 'samples': 724416, 'steps': 3772, 'loss/train': 0.9533350467681885} 01/26/2022 23:25:41 - INFO - codeparrot_training - Step 3773: {'lr': 0.0004983186529985263, 'samples': 724608, 'steps': 3773, 'loss/train': 0.7845650017261505} 01/26/2022 23:25:44 - INFO - codeparrot_training - Step 3774: {'lr': 0.00049831675798292, 'samples': 724800, 'steps': 3774, 'loss/train': 0.6259867250919342} 01/26/2022 23:25:47 - INFO - codeparrot_training - Step 3775: {'lr': 0.0004983148619036034, 'samples': 724992, 'steps': 3775, 'loss/train': 0.5132587552070618} 01/26/2022 23:25:50 - INFO - codeparrot_training - Step 3776: {'lr': 0.0004983129647605849, 'samples': 725184, 'steps': 3776, 'loss/train': 0.7691565155982971} 01/26/2022 23:25:53 - INFO - codeparrot_training - Step 3777: {'lr': 0.0004983110665538724, 'samples': 725376, 'steps': 3777, 'loss/train': 0.9134242236614227} 01/26/2022 23:25:56 - INFO - codeparrot_training - Step 3778: {'lr': 0.0004983091672834742, 'samples': 725568, 'steps': 3778, 'loss/train': 0.5658659934997559} 01/26/2022 23:26:00 - INFO - codeparrot_training - Step 3779: {'lr': 0.0004983072669493985, 'samples': 725760, 'steps': 3779, 'loss/train': 0.6930532157421112} 01/26/2022 23:26:03 - INFO - codeparrot_training - Step 3780: {'lr': 0.0004983053655516531, 'samples': 725952, 'steps': 3780, 'loss/train': 0.2844412699341774} 01/26/2022 23:26:07 - INFO - codeparrot_training - Step 3781: {'lr': 0.0004983034630902465, 'samples': 726144, 'steps': 3781, 'loss/train': 0.8435737788677216} 01/26/2022 23:26:10 - INFO - codeparrot_training - Step 3782: {'lr': 0.0004983015595651867, 'samples': 726336, 'steps': 3782, 'loss/train': 1.0499356985092163} 01/26/2022 23:26:13 - INFO - codeparrot_training - Step 3783: 
{'lr': 0.0004982996549764817, 'samples': 726528, 'steps': 3783, 'loss/train': 0.883103996515274} 01/26/2022 23:26:17 - INFO - codeparrot_training - Step 3784: {'lr': 0.0004982977493241399, 'samples': 726720, 'steps': 3784, 'loss/train': 0.8023152351379395} 01/26/2022 23:26:20 - INFO - codeparrot_training - Step 3785: {'lr': 0.0004982958426081695, 'samples': 726912, 'steps': 3785, 'loss/train': 0.6796708703041077} 01/26/2022 23:26:23 - INFO - codeparrot_training - Step 3786: {'lr': 0.0004982939348285784, 'samples': 727104, 'steps': 3786, 'loss/train': 0.7646068632602692} 01/26/2022 23:26:26 - INFO - codeparrot_training - Step 3787: {'lr': 0.000498292025985375, 'samples': 727296, 'steps': 3787, 'loss/train': 0.7292510122060776} 01/26/2022 23:26:29 - INFO - codeparrot_training - Step 3788: {'lr': 0.0004982901160785675, 'samples': 727488, 'steps': 3788, 'loss/train': 1.101166695356369} 01/26/2022 23:26:32 - INFO - codeparrot_training - Step 3789: {'lr': 0.0004982882051081639, 'samples': 727680, 'steps': 3789, 'loss/train': 0.3748445212841034} 01/26/2022 23:26:38 - INFO - codeparrot_training - Step 3790: {'lr': 0.0004982862930741725, 'samples': 727872, 'steps': 3790, 'loss/train': 0.524399533867836} 01/26/2022 23:26:42 - INFO - codeparrot_training - Step 3791: {'lr': 0.0004982843799766014, 'samples': 728064, 'steps': 3791, 'loss/train': 0.8300400674343109} 01/26/2022 23:26:45 - INFO - codeparrot_training - Step 3792: {'lr': 0.0004982824658154589, 'samples': 728256, 'steps': 3792, 'loss/train': 0.7660401463508606} 01/26/2022 23:26:48 - INFO - codeparrot_training - Step 3793: {'lr': 0.000498280550590753, 'samples': 728448, 'steps': 3793, 'loss/train': 0.6218583136796951} 01/26/2022 23:26:51 - INFO - codeparrot_training - Step 3794: {'lr': 0.0004982786343024923, 'samples': 728640, 'steps': 3794, 'loss/train': 0.9506357610225677} 01/26/2022 23:26:54 - INFO - codeparrot_training - Step 3795: {'lr': 0.0004982767169506847, 'samples': 728832, 'steps': 3795, 'loss/train': 0.6598942279815674} 01/26/2022 23:26:57 - INFO - codeparrot_training - Step 3796: {'lr': 0.0004982747985353384, 'samples': 729024, 'steps': 3796, 'loss/train': 0.9875503778457642} 01/26/2022 23:27:00 - INFO - codeparrot_training - Step 3797: {'lr': 0.0004982728790564616, 'samples': 729216, 'steps': 3797, 'loss/train': 0.6458108872175217} 01/26/2022 23:27:04 - INFO - codeparrot_training - Step 3798: {'lr': 0.0004982709585140629, 'samples': 729408, 'steps': 3798, 'loss/train': 0.7914699018001556} 01/26/2022 23:27:08 - INFO - codeparrot_training - Step 3799: {'lr': 0.0004982690369081501, 'samples': 729600, 'steps': 3799, 'loss/train': 2.0117000341415405} 01/26/2022 23:27:11 - INFO - codeparrot_training - Step 3800: {'lr': 0.0004982671142387316, 'samples': 729792, 'steps': 3800, 'loss/train': 0.9555622637271881} 01/26/2022 23:27:14 - INFO - codeparrot_training - Step 3801: {'lr': 0.0004982651905058156, 'samples': 729984, 'steps': 3801, 'loss/train': 0.8480411767959595} 01/26/2022 23:27:18 - INFO - codeparrot_training - Step 3802: {'lr': 0.0004982632657094104, 'samples': 730176, 'steps': 3802, 'loss/train': 0.8004260659217834} 01/26/2022 23:27:21 - INFO - codeparrot_training - Step 3803: {'lr': 0.0004982613398495241, 'samples': 730368, 'steps': 3803, 'loss/train': 0.7299167811870575} 01/26/2022 23:27:24 - INFO - codeparrot_training - Step 3804: {'lr': 0.0004982594129261652, 'samples': 730560, 'steps': 3804, 'loss/train': 0.42847582697868347} 01/26/2022 23:27:27 - INFO - codeparrot_training - Step 3805: {'lr': 0.0004982574849393416, 
'samples': 730752, 'steps': 3805, 'loss/train': 1.6515052914619446} 01/26/2022 23:27:30 - INFO - codeparrot_training - Step 3806: {'lr': 0.000498255555889062, 'samples': 730944, 'steps': 3806, 'loss/train': 0.8789466619491577} 01/26/2022 23:27:33 - INFO - codeparrot_training - Step 3807: {'lr': 0.0004982536257753343, 'samples': 731136, 'steps': 3807, 'loss/train': 1.0199354588985443} 01/26/2022 23:27:39 - INFO - codeparrot_training - Step 3808: {'lr': 0.0004982516945981669, 'samples': 731328, 'steps': 3808, 'loss/train': 0.48241089284420013} 01/26/2022 23:27:43 - INFO - codeparrot_training - Step 3809: {'lr': 0.0004982497623575681, 'samples': 731520, 'steps': 3809, 'loss/train': 0.6943470686674118} 01/26/2022 23:27:46 - INFO - codeparrot_training - Step 3810: {'lr': 0.0004982478290535461, 'samples': 731712, 'steps': 3810, 'loss/train': 0.5331630706787109} 01/26/2022 23:27:49 - INFO - codeparrot_training - Step 3811: {'lr': 0.0004982458946861093, 'samples': 731904, 'steps': 3811, 'loss/train': 1.8078829050064087} 01/26/2022 23:27:52 - INFO - codeparrot_training - Step 3812: {'lr': 0.0004982439592552658, 'samples': 732096, 'steps': 3812, 'loss/train': 0.6396533399820328} 01/26/2022 23:27:55 - INFO - codeparrot_training - Step 3813: {'lr': 0.0004982420227610242, 'samples': 732288, 'steps': 3813, 'loss/train': 0.884215235710144} 01/26/2022 23:27:58 - INFO - codeparrot_training - Step 3814: {'lr': 0.0004982400852033924, 'samples': 732480, 'steps': 3814, 'loss/train': 0.7795093953609467} 01/26/2022 23:28:01 - INFO - codeparrot_training - Step 3815: {'lr': 0.000498238146582379, 'samples': 732672, 'steps': 3815, 'loss/train': 0.5720730274915695} 01/26/2022 23:28:05 - INFO - codeparrot_training - Step 3816: {'lr': 0.0004982362068979921, 'samples': 732864, 'steps': 3816, 'loss/train': 0.6843110471963882} 01/26/2022 23:28:09 - INFO - codeparrot_training - Step 3817: {'lr': 0.0004982342661502403, 'samples': 733056, 'steps': 3817, 'loss/train': 0.4506169259548187} 01/26/2022 23:28:12 - INFO - codeparrot_training - Step 3818: {'lr': 0.0004982323243391315, 'samples': 733248, 'steps': 3818, 'loss/train': 1.4059887528419495} 01/26/2022 23:28:15 - INFO - codeparrot_training - Step 3819: {'lr': 0.0004982303814646745, 'samples': 733440, 'steps': 3819, 'loss/train': 0.973318487405777} 01/26/2022 23:28:18 - INFO - codeparrot_training - Step 3820: {'lr': 0.0004982284375268772, 'samples': 733632, 'steps': 3820, 'loss/train': 0.41735270619392395} 01/26/2022 23:28:22 - INFO - codeparrot_training - Step 3821: {'lr': 0.0004982264925257481, 'samples': 733824, 'steps': 3821, 'loss/train': 0.6220707446336746} 01/26/2022 23:28:25 - INFO - codeparrot_training - Step 3822: {'lr': 0.0004982245464612955, 'samples': 734016, 'steps': 3822, 'loss/train': 0.766078770160675} 01/26/2022 23:28:28 - INFO - codeparrot_training - Step 3823: {'lr': 0.0004982225993335279, 'samples': 734208, 'steps': 3823, 'loss/train': 0.9947437047958374} 01/26/2022 23:28:31 - INFO - codeparrot_training - Step 3824: {'lr': 0.0004982206511424534, 'samples': 734400, 'steps': 3824, 'loss/train': 0.7373889684677124} 01/26/2022 23:28:34 - INFO - codeparrot_training - Step 3825: {'lr': 0.0004982187018880805, 'samples': 734592, 'steps': 3825, 'loss/train': 0.755209743976593} 01/26/2022 23:28:39 - INFO - codeparrot_training - Step 3826: {'lr': 0.0004982167515704174, 'samples': 734784, 'steps': 3826, 'loss/train': 0.5668278783559799} 01/26/2022 23:28:42 - INFO - codeparrot_training - Step 3827: {'lr': 0.0004982148001894727, 'samples': 734976, 'steps': 3827, 
'loss/train': 0.8495799601078033} 01/26/2022 23:28:45 - INFO - codeparrot_training - Step 3828: {'lr': 0.0004982128477452546, 'samples': 735168, 'steps': 3828, 'loss/train': 0.834953784942627} 01/26/2022 23:28:48 - INFO - codeparrot_training - Step 3829: {'lr': 0.0004982108942377713, 'samples': 735360, 'steps': 3829, 'loss/train': 0.8760692775249481} 01/26/2022 23:28:51 - INFO - codeparrot_training - Step 3830: {'lr': 0.0004982089396670316, 'samples': 735552, 'steps': 3830, 'loss/train': 0.6115702539682388} 01/26/2022 23:28:54 - INFO - codeparrot_training - Step 3831: {'lr': 0.0004982069840330435, 'samples': 735744, 'steps': 3831, 'loss/train': 1.2871561646461487} 01/26/2022 23:28:58 - INFO - codeparrot_training - Step 3832: {'lr': 0.0004982050273358154, 'samples': 735936, 'steps': 3832, 'loss/train': 1.538468062877655} 01/26/2022 23:29:01 - INFO - codeparrot_training - Step 3833: {'lr': 0.0004982030695753558, 'samples': 736128, 'steps': 3833, 'loss/train': 0.7517512142658234} 01/26/2022 23:29:07 - INFO - codeparrot_training - Step 3834: {'lr': 0.0004982011107516732, 'samples': 736320, 'steps': 3834, 'loss/train': 0.7341686189174652} 01/26/2022 23:29:10 - INFO - codeparrot_training - Step 3835: {'lr': 0.0004981991508647757, 'samples': 736512, 'steps': 3835, 'loss/train': 0.3357127010822296} 01/26/2022 23:29:13 - INFO - codeparrot_training - Step 3836: {'lr': 0.0004981971899146719, 'samples': 736704, 'steps': 3836, 'loss/train': 1.001909852027893} 01/26/2022 23:29:16 - INFO - codeparrot_training - Step 3837: {'lr': 0.0004981952279013702, 'samples': 736896, 'steps': 3837, 'loss/train': 1.1640489399433136} 01/26/2022 23:29:19 - INFO - codeparrot_training - Step 3838: {'lr': 0.0004981932648248789, 'samples': 737088, 'steps': 3838, 'loss/train': 0.37886425852775574} 01/26/2022 23:29:22 - INFO - codeparrot_training - Step 3839: {'lr': 0.0004981913006852065, 'samples': 737280, 'steps': 3839, 'loss/train': 1.3285256624221802} 01/26/2022 23:29:26 - INFO - codeparrot_training - Step 3840: {'lr': 0.0004981893354823614, 'samples': 737472, 'steps': 3840, 'loss/train': 1.0405485033988953} 01/26/2022 23:29:29 - INFO - codeparrot_training - Step 3841: {'lr': 0.000498187369216352, 'samples': 737664, 'steps': 3841, 'loss/train': 1.4288856983184814} 01/26/2022 23:29:32 - INFO - codeparrot_training - Step 3842: {'lr': 0.0004981854018871867, 'samples': 737856, 'steps': 3842, 'loss/train': 0.8018413782119751} 01/26/2022 23:29:36 - INFO - codeparrot_training - Step 3843: {'lr': 0.0004981834334948738, 'samples': 738048, 'steps': 3843, 'loss/train': 0.8442158997058868} 01/26/2022 23:29:40 - INFO - codeparrot_training - Step 3844: {'lr': 0.0004981814640394221, 'samples': 738240, 'steps': 3844, 'loss/train': 0.9670072495937347} 01/26/2022 23:29:43 - INFO - codeparrot_training - Step 3845: {'lr': 0.0004981794935208397, 'samples': 738432, 'steps': 3845, 'loss/train': 0.7715116739273071} 01/26/2022 23:29:46 - INFO - codeparrot_training - Step 3846: {'lr': 0.0004981775219391352, 'samples': 738624, 'steps': 3846, 'loss/train': 0.5654731392860413} 01/26/2022 23:29:49 - INFO - codeparrot_training - Step 3847: {'lr': 0.000498175549294317, 'samples': 738816, 'steps': 3847, 'loss/train': 0.6793680489063263} 01/26/2022 23:29:52 - INFO - codeparrot_training - Step 3848: {'lr': 0.0004981735755863934, 'samples': 739008, 'steps': 3848, 'loss/train': 0.7453990280628204} 01/26/2022 23:29:56 - INFO - codeparrot_training - Step 3849: {'lr': 0.0004981716008153732, 'samples': 739200, 'steps': 3849, 'loss/train': 0.7034454345703125} 
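Each entry above follows the same `Step N: {'lr': ..., 'samples': ..., 'steps': ..., 'loss/train': ...}` shape, and in this stretch of the run `samples` advances by 192 per step (i.e. `samples == 192 * (steps + 1)`). Below is a minimal Python sketch for pulling those per-step metrics back out of a raw log like this one; the helper and the `train.log` filename are assumptions for illustration, not part of the codeparrot_training script itself.

```python
# Minimal sketch (assumed helper, not part of codeparrot_training itself):
# extract per-step metrics from log lines shaped like
#   "... - Step 3849: {'lr': ..., 'samples': ..., 'steps': ..., 'loss/train': ...}"
import ast
import re

# DOTALL lets a single entry span a wrapped line break inside the dict
STEP_RE = re.compile(r"Step (\d+): (\{.*?\})", re.DOTALL)

def parse_log(text):
    """Yield (step, metrics) pairs for every complete 'Step N: {...}' entry."""
    for match in STEP_RE.finditer(text):
        step = int(match.group(1))
        # The dicts are Python reprs (single quotes), so use literal_eval, not json
        metrics = ast.literal_eval(match.group(2))
        yield step, metrics

if __name__ == "__main__":
    with open("train.log") as f:  # hypothetical path to the raw log
        for step, metrics in parse_log(f.read()):
            # e.g. print step, learning rate, and training loss for plotting
            print(step, metrics["lr"], metrics["loss/train"])
```

Truncated trailing entries (a dict without its closing brace) simply do not match the pattern and are skipped, so the sketch can be run on a partial log as well.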
01/26/2022 23:29:59 - INFO - codeparrot_training - Step 3850: {'lr': 0.0004981696249812646, 'samples': 739392, 'steps': 3850, 'loss/train': 0.9948690533638} 01/26/2022 23:30:02 - INFO - codeparrot_training - Step 3851: {'lr': 0.0004981676480840761, 'samples': 739584, 'steps': 3851, 'loss/train': 0.9899130463600159} 01/26/2022 23:30:08 - INFO - codeparrot_training - Step 3852: {'lr': 0.0004981656701238162, 'samples': 739776, 'steps': 3852, 'loss/train': 1.073481559753418} 01/26/2022 23:30:11 - INFO - codeparrot_training - Step 3853: {'lr': 0.0004981636911004934, 'samples': 739968, 'steps': 3853, 'loss/train': 1.003449261188507} 01/26/2022 23:30:14 - INFO - codeparrot_training - Step 3854: {'lr': 0.0004981617110141162, 'samples': 740160, 'steps': 3854, 'loss/train': 1.117401659488678} 01/26/2022 23:30:17 - INFO - codeparrot_training - Step 3855: {'lr': 0.000498159729864693, 'samples': 740352, 'steps': 3855, 'loss/train': 0.5249563157558441} 01/26/2022 23:30:21 - INFO - codeparrot_training - Step 3856: {'lr': 0.0004981577476522323, 'samples': 740544, 'steps': 3856, 'loss/train': 0.9319711625576019} 01/26/2022 23:30:24 - INFO - codeparrot_training - Step 3857: {'lr': 0.0004981557643767426, 'samples': 740736, 'steps': 3857, 'loss/train': 0.9188098311424255} 01/26/2022 23:30:27 - INFO - codeparrot_training - Step 3858: {'lr': 0.0004981537800382323, 'samples': 740928, 'steps': 3858, 'loss/train': 0.9925062954425812} 01/26/2022 23:30:30 - INFO - codeparrot_training - Step 3859: {'lr': 0.0004981517946367102, 'samples': 741120, 'steps': 3859, 'loss/train': 0.5786432325839996} 01/26/2022 23:30:33 - INFO - codeparrot_training - Step 3860: {'lr': 0.0004981498081721845, 'samples': 741312, 'steps': 3860, 'loss/train': 1.0692836344242096} 01/26/2022 23:30:37 - INFO - codeparrot_training - Step 3861: {'lr': 0.0004981478206446638, 'samples': 741504, 'steps': 3861, 'loss/train': 1.0235053896903992} 01/26/2022 23:30:41 - INFO - codeparrot_training - Step 3862: {'lr': 0.0004981458320541567, 'samples': 741696, 'steps': 3862, 'loss/train': 1.0538764894008636} 01/26/2022 23:30:44 - INFO - codeparrot_training - Step 3863: {'lr': 0.0004981438424006716, 'samples': 741888, 'steps': 3863, 'loss/train': 0.7906354665756226} 01/26/2022 23:30:47 - INFO - codeparrot_training - Step 3864: {'lr': 0.0004981418516842171, 'samples': 742080, 'steps': 3864, 'loss/train': 0.6484892070293427} 01/26/2022 23:30:50 - INFO - codeparrot_training - Step 3865: {'lr': 0.0004981398599048018, 'samples': 742272, 'steps': 3865, 'loss/train': 0.627307265996933} 01/26/2022 23:30:53 - INFO - codeparrot_training - Step 3866: {'lr': 0.000498137867062434, 'samples': 742464, 'steps': 3866, 'loss/train': 0.8429844081401825} 01/26/2022 23:30:56 - INFO - codeparrot_training - Step 3867: {'lr': 0.0004981358731571223, 'samples': 742656, 'steps': 3867, 'loss/train': 0.9455378651618958} 01/26/2022 23:30:59 - INFO - codeparrot_training - Step 3868: {'lr': 0.0004981338781888755, 'samples': 742848, 'steps': 3868, 'loss/train': 0.8855392634868622} 01/26/2022 23:31:03 - INFO - codeparrot_training - Step 3869: {'lr': 0.0004981318821577018, 'samples': 743040, 'steps': 3869, 'loss/train': 0.6129806488752365} 01/26/2022 23:31:07 - INFO - codeparrot_training - Step 3870: {'lr': 0.00049812988506361, 'samples': 743232, 'steps': 3870, 'loss/train': 0.676503449678421} 01/26/2022 23:31:10 - INFO - codeparrot_training - Step 3871: {'lr': 0.0004981278869066085, 'samples': 743424, 'steps': 3871, 'loss/train': 0.8119095861911774} 01/26/2022 23:31:13 - INFO - 
codeparrot_training - Step 3872: {'lr': 0.000498125887686706, 'samples': 743616, 'steps': 3872, 'loss/train': 0.6815033107995987} 01/26/2022 23:31:17 - INFO - codeparrot_training - Step 3873: {'lr': 0.0004981238874039109, 'samples': 743808, 'steps': 3873, 'loss/train': 0.8968636393547058} 01/26/2022 23:31:20 - INFO - codeparrot_training - Step 3874: {'lr': 0.0004981218860582319, 'samples': 744000, 'steps': 3874, 'loss/train': 1.1665104031562805} 01/26/2022 23:31:23 - INFO - codeparrot_training - Step 3875: {'lr': 0.0004981198836496775, 'samples': 744192, 'steps': 3875, 'loss/train': 1.029586136341095} 01/26/2022 23:31:26 - INFO - codeparrot_training - Step 3876: {'lr': 0.0004981178801782563, 'samples': 744384, 'steps': 3876, 'loss/train': 0.4661898761987686} 01/26/2022 23:31:29 - INFO - codeparrot_training - Step 3877: {'lr': 0.000498115875643977, 'samples': 744576, 'steps': 3877, 'loss/train': 1.3692617118358612} 01/26/2022 23:31:32 - INFO - codeparrot_training - Step 3878: {'lr': 0.0004981138700468479, 'samples': 744768, 'steps': 3878, 'loss/train': 1.0751698315143585} 01/26/2022 23:31:37 - INFO - codeparrot_training - Step 3879: {'lr': 0.0004981118633868779, 'samples': 744960, 'steps': 3879, 'loss/train': 0.6203278452157974} 01/26/2022 23:31:40 - INFO - codeparrot_training - Step 3880: {'lr': 0.0004981098556640755, 'samples': 745152, 'steps': 3880, 'loss/train': 1.1889275908470154} 01/26/2022 23:31:43 - INFO - codeparrot_training - Step 3881: {'lr': 0.0004981078468784491, 'samples': 745344, 'steps': 3881, 'loss/train': 1.5341938734054565} 01/26/2022 23:31:46 - INFO - codeparrot_training - Step 3882: {'lr': 0.0004981058370300076, 'samples': 745536, 'steps': 3882, 'loss/train': 0.45420464873313904} 01/26/2022 23:31:49 - INFO - codeparrot_training - Step 3883: {'lr': 0.0004981038261187594, 'samples': 745728, 'steps': 3883, 'loss/train': 1.0273961126804352} 01/26/2022 23:31:52 - INFO - codeparrot_training - Step 3884: {'lr': 0.0004981018141447133, 'samples': 745920, 'steps': 3884, 'loss/train': 0.5881200134754181} 01/26/2022 23:31:55 - INFO - codeparrot_training - Step 3885: {'lr': 0.0004980998011078776, 'samples': 746112, 'steps': 3885, 'loss/train': 0.732703909277916} 01/26/2022 23:31:59 - INFO - codeparrot_training - Step 3886: {'lr': 0.0004980977870082613, 'samples': 746304, 'steps': 3886, 'loss/train': 0.6988644897937775} 01/26/2022 23:32:02 - INFO - codeparrot_training - Step 3887: {'lr': 0.0004980957718458729, 'samples': 746496, 'steps': 3887, 'loss/train': 0.9826968312263489} 01/26/2022 23:32:08 - INFO - codeparrot_training - Step 3888: {'lr': 0.0004980937556207207, 'samples': 746688, 'steps': 3888, 'loss/train': 0.764199435710907} 01/26/2022 23:32:11 - INFO - codeparrot_training - Step 3889: {'lr': 0.0004980917383328139, 'samples': 746880, 'steps': 3889, 'loss/train': 0.5022792667150497} 01/26/2022 23:32:14 - INFO - codeparrot_training - Step 3890: {'lr': 0.0004980897199821609, 'samples': 747072, 'steps': 3890, 'loss/train': 1.014471709728241} 01/26/2022 23:32:17 - INFO - codeparrot_training - Step 3891: {'lr': 0.0004980877005687701, 'samples': 747264, 'steps': 3891, 'loss/train': 1.2192294895648956} 01/26/2022 23:32:20 - INFO - codeparrot_training - Step 3892: {'lr': 0.0004980856800926506, 'samples': 747456, 'steps': 3892, 'loss/train': 1.1142738461494446} 01/26/2022 23:32:23 - INFO - codeparrot_training - Step 3893: {'lr': 0.0004980836585538107, 'samples': 747648, 'steps': 3893, 'loss/train': 1.093270182609558} 01/26/2022 23:32:27 - INFO - codeparrot_training - Step 3894: {'lr': 
0.0004980816359522592, 'samples': 747840, 'steps': 3894, 'loss/train': 0.7418079525232315} 01/26/2022 23:32:30 - INFO - codeparrot_training - Step 3895: {'lr': 0.0004980796122880048, 'samples': 748032, 'steps': 3895, 'loss/train': 0.86576709151268} 01/26/2022 23:32:34 - INFO - codeparrot_training - Step 3896: {'lr': 0.000498077587561056, 'samples': 748224, 'steps': 3896, 'loss/train': 1.0761189758777618} 01/26/2022 23:32:37 - INFO - codeparrot_training - Step 3897: {'lr': 0.0004980755617714216, 'samples': 748416, 'steps': 3897, 'loss/train': 0.8532220423221588} 01/26/2022 23:32:41 - INFO - codeparrot_training - Step 3898: {'lr': 0.0004980735349191104, 'samples': 748608, 'steps': 3898, 'loss/train': 1.6118881702423096} 01/26/2022 23:32:44 - INFO - codeparrot_training - Step 3899: {'lr': 0.0004980715070041308, 'samples': 748800, 'steps': 3899, 'loss/train': 1.3194540739059448} 01/26/2022 23:32:47 - INFO - codeparrot_training - Step 3900: {'lr': 0.0004980694780264917, 'samples': 748992, 'steps': 3900, 'loss/train': 0.9668495357036591} 01/26/2022 23:32:50 - INFO - codeparrot_training - Step 3901: {'lr': 0.0004980674479862018, 'samples': 749184, 'steps': 3901, 'loss/train': 0.42522579431533813} 01/26/2022 23:32:53 - INFO - codeparrot_training - Step 3902: {'lr': 0.0004980654168832697, 'samples': 749376, 'steps': 3902, 'loss/train': 0.7754543423652649} 01/26/2022 23:32:56 - INFO - codeparrot_training - Step 3903: {'lr': 0.0004980633847177041, 'samples': 749568, 'steps': 3903, 'loss/train': 1.220658391714096} 01/26/2022 23:32:59 - INFO - codeparrot_training - Step 3904: {'lr': 0.0004980613514895135, 'samples': 749760, 'steps': 3904, 'loss/train': 0.9515006840229034} 01/26/2022 23:33:04 - INFO - codeparrot_training - Step 3905: {'lr': 0.0004980593171987072, 'samples': 749952, 'steps': 3905, 'loss/train': 0.9227783381938934} 01/26/2022 23:33:07 - INFO - codeparrot_training - Step 3906: {'lr': 0.0004980572818452934, 'samples': 750144, 'steps': 3906, 'loss/train': 0.9343565404415131} 01/26/2022 23:33:10 - INFO - codeparrot_training - Step 3907: {'lr': 0.0004980552454292809, 'samples': 750336, 'steps': 3907, 'loss/train': 0.7269952744245529} 01/26/2022 23:33:13 - INFO - codeparrot_training - Step 3908: {'lr': 0.0004980532079506786, 'samples': 750528, 'steps': 3908, 'loss/train': 0.5910067856311798} 01/26/2022 23:33:16 - INFO - codeparrot_training - Step 3909: {'lr': 0.0004980511694094951, 'samples': 750720, 'steps': 3909, 'loss/train': 0.907954216003418} 01/26/2022 23:33:20 - INFO - codeparrot_training - Step 3910: {'lr': 0.0004980491298057392, 'samples': 750912, 'steps': 3910, 'loss/train': 0.8685242235660553} 01/26/2022 23:33:23 - INFO - codeparrot_training - Step 3911: {'lr': 0.0004980470891394194, 'samples': 751104, 'steps': 3911, 'loss/train': 1.433293730020523} 01/26/2022 23:33:26 - INFO - codeparrot_training - Step 3912: {'lr': 0.0004980450474105448, 'samples': 751296, 'steps': 3912, 'loss/train': 0.8301834762096405} 01/26/2022 23:33:29 - INFO - codeparrot_training - Step 3913: {'lr': 0.000498043004619124, 'samples': 751488, 'steps': 3913, 'loss/train': 0.46711157262325287} 01/26/2022 23:33:35 - INFO - codeparrot_training - Step 3914: {'lr': 0.0004980409607651656, 'samples': 751680, 'steps': 3914, 'loss/train': 0.8855275511741638} 01/26/2022 23:33:39 - INFO - codeparrot_training - Step 3915: {'lr': 0.0004980389158486786, 'samples': 751872, 'steps': 3915, 'loss/train': 0.6846596002578735} 01/26/2022 23:33:42 - INFO - codeparrot_training - Step 3916: {'lr': 0.0004980368698696716, 'samples': 
752064, 'steps': 3916, 'loss/train': 1.1028982400894165} 01/26/2022 23:33:45 - INFO - codeparrot_training - Step 3917: {'lr': 0.0004980348228281534, 'samples': 752256, 'steps': 3917, 'loss/train': 0.9805440008640289} 01/26/2022 23:33:48 - INFO - codeparrot_training - Step 3918: {'lr': 0.0004980327747241329, 'samples': 752448, 'steps': 3918, 'loss/train': 1.0723249018192291} 01/26/2022 23:33:51 - INFO - codeparrot_training - Step 3919: {'lr': 0.0004980307255576185, 'samples': 752640, 'steps': 3919, 'loss/train': 0.8895979821681976} 01/26/2022 23:33:54 - INFO - codeparrot_training - Step 3920: {'lr': 0.0004980286753286195, 'samples': 752832, 'steps': 3920, 'loss/train': 1.154020607471466} 01/26/2022 23:33:57 - INFO - codeparrot_training - Step 3921: {'lr': 0.0004980266240371443, 'samples': 753024, 'steps': 3921, 'loss/train': 0.5551009476184845} 01/26/2022 23:34:01 - INFO - codeparrot_training - Step 3922: {'lr': 0.0004980245716832018, 'samples': 753216, 'steps': 3922, 'loss/train': 0.9515096247196198} 01/26/2022 23:34:05 - INFO - codeparrot_training - Step 3923: {'lr': 0.0004980225182668008, 'samples': 753408, 'steps': 3923, 'loss/train': 0.3102077767252922} 01/26/2022 23:34:08 - INFO - codeparrot_training - Step 3924: {'lr': 0.00049802046378795, 'samples': 753600, 'steps': 3924, 'loss/train': 1.0312245190143585} 01/26/2022 23:34:12 - INFO - codeparrot_training - Step 3925: {'lr': 0.0004980184082466583, 'samples': 753792, 'steps': 3925, 'loss/train': 0.8025231063365936} 01/26/2022 23:34:15 - INFO - codeparrot_training - Step 3926: {'lr': 0.0004980163516429346, 'samples': 753984, 'steps': 3926, 'loss/train': 0.8290510475635529} 01/26/2022 23:34:18 - INFO - codeparrot_training - Step 3927: {'lr': 0.0004980142939767876, 'samples': 754176, 'steps': 3927, 'loss/train': 0.705356240272522} 01/26/2022 23:34:21 - INFO - codeparrot_training - Step 3928: {'lr': 0.000498012235248226, 'samples': 754368, 'steps': 3928, 'loss/train': 1.0376103222370148} 01/26/2022 23:34:24 - INFO - codeparrot_training - Step 3929: {'lr': 0.0004980101754572589, 'samples': 754560, 'steps': 3929, 'loss/train': 0.9729749858379364} 01/26/2022 23:34:27 - INFO - codeparrot_training - Step 3930: {'lr': 0.0004980081146038948, 'samples': 754752, 'steps': 3930, 'loss/train': 0.693305566906929} 01/26/2022 23:34:30 - INFO - codeparrot_training - Step 3931: {'lr': 0.0004980060526881429, 'samples': 754944, 'steps': 3931, 'loss/train': 1.020486295223236} 01/26/2022 23:34:35 - INFO - codeparrot_training - Step 3932: {'lr': 0.0004980039897100115, 'samples': 755136, 'steps': 3932, 'loss/train': 0.7755397260189056} 01/26/2022 23:34:39 - INFO - codeparrot_training - Step 3933: {'lr': 0.0004980019256695101, 'samples': 755328, 'steps': 3933, 'loss/train': 0.9548532664775848} 01/26/2022 23:34:42 - INFO - codeparrot_training - Step 3934: {'lr': 0.000497999860566647, 'samples': 755520, 'steps': 3934, 'loss/train': 1.214694231748581} 01/26/2022 23:34:45 - INFO - codeparrot_training - Step 3935: {'lr': 0.0004979977944014313, 'samples': 755712, 'steps': 3935, 'loss/train': 0.6181241869926453} 01/26/2022 23:34:48 - INFO - codeparrot_training - Step 3936: {'lr': 0.0004979957271738718, 'samples': 755904, 'steps': 3936, 'loss/train': 0.7836324870586395} 01/26/2022 23:34:51 - INFO - codeparrot_training - Step 3937: {'lr': 0.0004979936588839773, 'samples': 756096, 'steps': 3937, 'loss/train': 0.48641417920589447} 01/26/2022 23:34:54 - INFO - codeparrot_training - Step 3938: {'lr': 0.0004979915895317567, 'samples': 756288, 'steps': 3938, 'loss/train': 
0.8558112680912018} 01/26/2022 23:34:57 - INFO - codeparrot_training - Step 3939: {'lr': 0.000497989519117219, 'samples': 756480, 'steps': 3939, 'loss/train': 1.140969157218933} 01/26/2022 23:35:01 - INFO - codeparrot_training - Step 3940: {'lr': 0.0004979874476403729, 'samples': 756672, 'steps': 3940, 'loss/train': 1.260134071111679} 01/26/2022 23:35:05 - INFO - codeparrot_training - Step 3941: {'lr': 0.0004979853751012273, 'samples': 756864, 'steps': 3941, 'loss/train': 0.8369384407997131} 01/26/2022 23:35:08 - INFO - codeparrot_training - Step 3942: {'lr': 0.0004979833014997911, 'samples': 757056, 'steps': 3942, 'loss/train': 1.183288961648941} 01/26/2022 23:35:11 - INFO - codeparrot_training - Step 3943: {'lr': 0.0004979812268360731, 'samples': 757248, 'steps': 3943, 'loss/train': 0.8140276372432709} 01/26/2022 23:35:15 - INFO - codeparrot_training - Step 3944: {'lr': 0.0004979791511100823, 'samples': 757440, 'steps': 3944, 'loss/train': 0.9455628991127014} 01/26/2022 23:35:18 - INFO - codeparrot_training - Step 3945: {'lr': 0.0004979770743218276, 'samples': 757632, 'steps': 3945, 'loss/train': 0.5517093390226364} 01/26/2022 23:35:21 - INFO - codeparrot_training - Step 3946: {'lr': 0.0004979749964713179, 'samples': 757824, 'steps': 3946, 'loss/train': 1.061017781496048} 01/26/2022 23:35:24 - INFO - codeparrot_training - Step 3947: {'lr': 0.000497972917558562, 'samples': 758016, 'steps': 3947, 'loss/train': 0.7622946202754974} 01/26/2022 23:35:27 - INFO - codeparrot_training - Step 3948: {'lr': 0.0004979708375835688, 'samples': 758208, 'steps': 3948, 'loss/train': 1.557452380657196} 01/26/2022 23:35:30 - INFO - codeparrot_training - Step 3949: {'lr': 0.0004979687565463475, 'samples': 758400, 'steps': 3949, 'loss/train': 0.9414726197719574} 01/26/2022 23:35:35 - INFO - codeparrot_training - Step 3950: {'lr': 0.0004979666744469065, 'samples': 758592, 'steps': 3950, 'loss/train': 1.0767672657966614} 01/26/2022 23:35:39 - INFO - codeparrot_training - Step 3951: {'lr': 0.0004979645912852552, 'samples': 758784, 'steps': 3951, 'loss/train': 1.6584420204162598} 01/26/2022 23:35:42 - INFO - codeparrot_training - Step 3952: {'lr': 0.0004979625070614022, 'samples': 758976, 'steps': 3952, 'loss/train': 1.1661506295204163} 01/26/2022 23:35:45 - INFO - codeparrot_training - Step 3953: {'lr': 0.0004979604217753566, 'samples': 759168, 'steps': 3953, 'loss/train': 0.9775739908218384} 01/26/2022 23:35:48 - INFO - codeparrot_training - Step 3954: {'lr': 0.0004979583354271273, 'samples': 759360, 'steps': 3954, 'loss/train': 0.6838332116603851} 01/26/2022 23:35:51 - INFO - codeparrot_training - Step 3955: {'lr': 0.0004979562480167232, 'samples': 759552, 'steps': 3955, 'loss/train': 0.962743878364563} 01/26/2022 23:35:54 - INFO - codeparrot_training - Step 3956: {'lr': 0.0004979541595441534, 'samples': 759744, 'steps': 3956, 'loss/train': 0.7820748388767242} 01/26/2022 23:35:57 - INFO - codeparrot_training - Step 3957: {'lr': 0.0004979520700094265, 'samples': 759936, 'steps': 3957, 'loss/train': 1.2323706150054932} 01/26/2022 23:36:01 - INFO - codeparrot_training - Step 3958: {'lr': 0.0004979499794125518, 'samples': 760128, 'steps': 3958, 'loss/train': 0.8136936128139496} 01/26/2022 23:36:06 - INFO - codeparrot_training - Step 3959: {'lr': 0.0004979478877535382, 'samples': 760320, 'steps': 3959, 'loss/train': 1.6207884550094604} 01/26/2022 23:36:09 - INFO - codeparrot_training - Step 3960: {'lr': 0.0004979457950323945, 'samples': 760512, 'steps': 3960, 'loss/train': 1.1098946034908295} 01/26/2022 23:36:12 - 
INFO - codeparrot_training - Step 3961: {'lr': 0.0004979437012491297, 'samples': 760704, 'steps': 3961, 'loss/train': 0.5499571412801743} 01/26/2022 23:36:15 - INFO - codeparrot_training - Step 3962: {'lr': 0.0004979416064037528, 'samples': 760896, 'steps': 3962, 'loss/train': 0.7738769352436066} 01/26/2022 23:36:18 - INFO - codeparrot_training - Step 3963: {'lr': 0.0004979395104962728, 'samples': 761088, 'steps': 3963, 'loss/train': 1.0436354577541351} 01/26/2022 23:36:21 - INFO - codeparrot_training - Step 3964: {'lr': 0.0004979374135266987, 'samples': 761280, 'steps': 3964, 'loss/train': 0.6387320011854172} 01/26/2022 23:36:24 - INFO - codeparrot_training - Step 3965: {'lr': 0.0004979353154950394, 'samples': 761472, 'steps': 3965, 'loss/train': 0.7929292023181915} 01/26/2022 23:36:28 - INFO - codeparrot_training - Step 3966: {'lr': 0.0004979332164013041, 'samples': 761664, 'steps': 3966, 'loss/train': 0.5118876993656158} 01/26/2022 23:36:31 - INFO - codeparrot_training - Step 3967: {'lr': 0.0004979311162455015, 'samples': 761856, 'steps': 3967, 'loss/train': 1.4334106743335724} 01/26/2022 23:36:35 - INFO - codeparrot_training - Step 3968: {'lr': 0.0004979290150276407, 'samples': 762048, 'steps': 3968, 'loss/train': 1.043969303369522} 01/26/2022 23:36:38 - INFO - codeparrot_training - Step 3969: {'lr': 0.0004979269127477308, 'samples': 762240, 'steps': 3969, 'loss/train': 0.6244706064462662} 01/26/2022 23:36:41 - INFO - codeparrot_training - Step 3970: {'lr': 0.0004979248094057806, 'samples': 762432, 'steps': 3970, 'loss/train': 0.6331766992807388} 01/26/2022 23:36:45 - INFO - codeparrot_training - Step 3971: {'lr': 0.0004979227050017994, 'samples': 762624, 'steps': 3971, 'loss/train': 1.0366674363613129} 01/26/2022 23:36:48 - INFO - codeparrot_training - Step 3972: {'lr': 0.000497920599535796, 'samples': 762816, 'steps': 3972, 'loss/train': 0.7720125317573547} 01/26/2022 23:36:51 - INFO - codeparrot_training - Step 3973: {'lr': 0.0004979184930077794, 'samples': 763008, 'steps': 3973, 'loss/train': 1.0295783579349518} 01/26/2022 23:36:54 - INFO - codeparrot_training - Step 3974: {'lr': 0.0004979163854177588, 'samples': 763200, 'steps': 3974, 'loss/train': 0.7898235619068146} 01/26/2022 23:36:57 - INFO - codeparrot_training - Step 3975: {'lr': 0.0004979142767657432, 'samples': 763392, 'steps': 3975, 'loss/train': 0.9591449797153473} 01/26/2022 23:37:00 - INFO - codeparrot_training - Step 3976: {'lr': 0.0004979121670517413, 'samples': 763584, 'steps': 3976, 'loss/train': 0.8581961095333099} 01/26/2022 23:37:05 - INFO - codeparrot_training - Step 3977: {'lr': 0.0004979100562757626, 'samples': 763776, 'steps': 3977, 'loss/train': 0.9693990647792816} 01/26/2022 23:37:08 - INFO - codeparrot_training - Step 3978: {'lr': 0.0004979079444378159, 'samples': 763968, 'steps': 3978, 'loss/train': 0.7731925249099731} 01/26/2022 23:37:11 - INFO - codeparrot_training - Step 3979: {'lr': 0.0004979058315379103, 'samples': 764160, 'steps': 3979, 'loss/train': 0.9952047765254974} 01/26/2022 23:37:14 - INFO - codeparrot_training - Step 3980: {'lr': 0.0004979037175760548, 'samples': 764352, 'steps': 3980, 'loss/train': 0.8215713500976562} 01/26/2022 23:37:17 - INFO - codeparrot_training - Step 3981: {'lr': 0.0004979016025522586, 'samples': 764544, 'steps': 3981, 'loss/train': 0.42449015378952026} 01/26/2022 23:37:21 - INFO - codeparrot_training - Step 3982: {'lr': 0.0004978994864665305, 'samples': 764736, 'steps': 3982, 'loss/train': 0.7290050089359283} 01/26/2022 23:37:24 - INFO - codeparrot_training - Step 
3983: {'lr': 0.0004978973693188797, 'samples': 764928, 'steps': 3983, 'loss/train': 1.0577080249786377} 01/26/2022 23:37:27 - INFO - codeparrot_training - Step 3984: {'lr': 0.0004978952511093155, 'samples': 765120, 'steps': 3984, 'loss/train': 1.721834421157837} 01/26/2022 23:37:30 - INFO - codeparrot_training - Step 3985: {'lr': 0.0004978931318378465, 'samples': 765312, 'steps': 3985, 'loss/train': 1.035116046667099} 01/26/2022 23:37:35 - INFO - codeparrot_training - Step 3986: {'lr': 0.0004978910115044822, 'samples': 765504, 'steps': 3986, 'loss/train': 1.2025447189807892} 01/26/2022 23:37:38 - INFO - codeparrot_training - Step 3987: {'lr': 0.0004978888901092315, 'samples': 765696, 'steps': 3987, 'loss/train': 0.6157684922218323} 01/26/2022 23:37:41 - INFO - codeparrot_training - Step 3988: {'lr': 0.0004978867676521035, 'samples': 765888, 'steps': 3988, 'loss/train': 1.0940046608448029} 01/26/2022 23:37:45 - INFO - codeparrot_training - Step 3989: {'lr': 0.0004978846441331073, 'samples': 766080, 'steps': 3989, 'loss/train': 1.308286339044571} 01/26/2022 23:37:48 - INFO - codeparrot_training - Step 3990: {'lr': 0.000497882519552252, 'samples': 766272, 'steps': 3990, 'loss/train': 0.9947578310966492} 01/26/2022 23:37:51 - INFO - codeparrot_training - Step 3991: {'lr': 0.0004978803939095466, 'samples': 766464, 'steps': 3991, 'loss/train': 0.7385606467723846} 01/26/2022 23:37:54 - INFO - codeparrot_training - Step 3992: {'lr': 0.0004978782672050004, 'samples': 766656, 'steps': 3992, 'loss/train': 0.8506863713264465} 01/26/2022 23:37:57 - INFO - codeparrot_training - Step 3993: {'lr': 0.0004978761394386224, 'samples': 766848, 'steps': 3993, 'loss/train': 0.47996753454208374} 01/26/2022 23:38:00 - INFO - codeparrot_training - Step 3994: {'lr': 0.0004978740106104218, 'samples': 767040, 'steps': 3994, 'loss/train': 0.7131772041320801} 01/26/2022 23:38:05 - INFO - codeparrot_training - Step 3995: {'lr': 0.0004978718807204076, 'samples': 767232, 'steps': 3995, 'loss/train': 1.070930689573288} 01/26/2022 23:38:08 - INFO - codeparrot_training - Step 3996: {'lr': 0.0004978697497685889, 'samples': 767424, 'steps': 3996, 'loss/train': 1.4254731237888336} 01/26/2022 23:38:11 - INFO - codeparrot_training - Step 3997: {'lr': 0.0004978676177549749, 'samples': 767616, 'steps': 3997, 'loss/train': 1.0729676485061646} 01/26/2022 23:38:14 - INFO - codeparrot_training - Step 3998: {'lr': 0.0004978654846795748, 'samples': 767808, 'steps': 3998, 'loss/train': 0.7836130857467651} 01/26/2022 23:38:17 - INFO - codeparrot_training - Step 3999: {'lr': 0.0004978633505423976, 'samples': 768000, 'steps': 3999, 'loss/train': 0.6604420244693756} 01/26/2022 23:38:17 - INFO - codeparrot_training - Evaluating and saving model checkpoint 01/26/2022 23:38:35 - WARNING - huggingface_hub.repository - Several commits (2) will be pushed upstream. 01/26/2022 23:38:35 - WARNING - huggingface_hub.repository - The progress bars may be unreliable. 
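The huggingface_hub warnings above mark the step-4000 checkpoint: the script evaluates, commits the model, and pushes it to the ncoop57/codeparrot-neo-125M-py repo (the push confirmation appears in the next log line). Below is a minimal sketch of the kind of loop that would produce entries like these; it is not the actual codeparrot_training script, and eval_fn, repo_dir and save_checkpoint_steps are illustrative names. The 192-samples-per-step increment is read off the 'samples' field in the surrounding entries.

import logging
from huggingface_hub import Repository  # source of the "huggingface_hub.repository" warnings seen here

logger = logging.getLogger("codeparrot_training")

def training_loop(model, optimizer, lr_scheduler, train_loader, eval_fn,
                  repo_dir="checkpoint_repo", save_checkpoint_steps=4000):
    # eval_fn, repo_dir and save_checkpoint_steps are illustrative, not taken from the original script.
    repo = Repository(repo_dir)          # assumes repo_dir is an already-cloned Hub repo
    samples_per_step = 192               # inferred from the 'samples' column (it grows by 192 per step)
    for step, batch in enumerate(train_loader):
        loss = model(batch, labels=batch).loss
        loss.backward()
        optimizer.step()
        lr_scheduler.step()
        optimizer.zero_grad()
        # Log the same metrics dict that appears in every entry of this log.
        logger.info(f"Step {step}: " + str({
            "lr": lr_scheduler.get_last_lr()[0],
            "samples": (step + 1) * samples_per_step,
            "steps": step,
            "loss/train": loss.item(),
        }))
        if step > 0 and step % save_checkpoint_steps == 0:
            logger.info("Evaluating and saving model checkpoint")
            eval_fn(model)
            model.save_pretrained(repo_dir)
            # A non-blocking push lets training continue while the upload runs;
            # unpushed checkpoint commits then accumulate, which is what the
            # "Several commits (2) will be pushed upstream" warning reports.
            repo.push_to_hub(commit_message=f"step {step}", blocking=False)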
01/26/2022 23:39:12 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py 9ecaf4a..da12221 royal-monkey-12 -> royal-monkey-12 01/26/2022 23:39:16 - INFO - codeparrot_training - Step 4000: {'lr': 0.0004978612153434526, 'samples': 768192, 'steps': 4000, 'loss/train': 1.2772589027881622} 01/26/2022 23:39:19 - INFO - codeparrot_training - Step 4001: {'lr': 0.0004978590790827488, 'samples': 768384, 'steps': 4001, 'loss/train': 0.8840435743331909} 01/26/2022 23:39:22 - INFO - codeparrot_training - Step 4002: {'lr': 0.0004978569417602955, 'samples': 768576, 'steps': 4002, 'loss/train': 0.721100851893425} 01/26/2022 23:39:27 - INFO - codeparrot_training - Step 4003: {'lr': 0.0004978548033761017, 'samples': 768768, 'steps': 4003, 'loss/train': 0.932602196931839} 01/26/2022 23:39:30 - INFO - codeparrot_training - Step 4004: {'lr': 0.0004978526639301766, 'samples': 768960, 'steps': 4004, 'loss/train': 1.3107247352600098} 01/26/2022 23:39:33 - INFO - codeparrot_training - Step 4005: {'lr': 0.0004978505234225294, 'samples': 769152, 'steps': 4005, 'loss/train': 0.9157354831695557} 01/26/2022 23:39:37 - INFO - codeparrot_training - Step 4006: {'lr': 0.0004978483818531693, 'samples': 769344, 'steps': 4006, 'loss/train': 1.0819831788539886} 01/26/2022 23:39:40 - INFO - codeparrot_training - Step 4007: {'lr': 0.0004978462392221054, 'samples': 769536, 'steps': 4007, 'loss/train': 0.5205494463443756} 01/26/2022 23:39:43 - INFO - codeparrot_training - Step 4008: {'lr': 0.0004978440955293468, 'samples': 769728, 'steps': 4008, 'loss/train': 0.6775269359350204} 01/26/2022 23:39:46 - INFO - codeparrot_training - Step 4009: {'lr': 0.000497841950774903, 'samples': 769920, 'steps': 4009, 'loss/train': 0.9375828802585602} 01/26/2022 23:39:49 - INFO - codeparrot_training - Step 4010: {'lr': 0.0004978398049587828, 'samples': 770112, 'steps': 4010, 'loss/train': 0.3183499351143837} 01/26/2022 23:39:52 - INFO - codeparrot_training - Step 4011: {'lr': 0.0004978376580809957, 'samples': 770304, 'steps': 4011, 'loss/train': 1.1100340783596039} 01/26/2022 23:39:57 - INFO - codeparrot_training - Step 4012: {'lr': 0.0004978355101415507, 'samples': 770496, 'steps': 4012, 'loss/train': 0.9437098503112793} 01/26/2022 23:40:01 - INFO - codeparrot_training - Step 4013: {'lr': 0.0004978333611404571, 'samples': 770688, 'steps': 4013, 'loss/train': 0.6379410177469254} 01/26/2022 23:40:04 - INFO - codeparrot_training - Step 4014: {'lr': 0.0004978312110777241, 'samples': 770880, 'steps': 4014, 'loss/train': 0.6716441363096237} 01/26/2022 23:40:07 - INFO - codeparrot_training - Step 4015: {'lr': 0.0004978290599533609, 'samples': 771072, 'steps': 4015, 'loss/train': 1.1299696862697601} 01/26/2022 23:40:10 - INFO - codeparrot_training - Step 4016: {'lr': 0.0004978269077673766, 'samples': 771264, 'steps': 4016, 'loss/train': 1.7829644680023193} 01/26/2022 23:40:13 - INFO - codeparrot_training - Step 4017: {'lr': 0.0004978247545197806, 'samples': 771456, 'steps': 4017, 'loss/train': 0.6675313711166382} 01/26/2022 23:40:16 - INFO - codeparrot_training - Step 4018: {'lr': 0.0004978226002105821, 'samples': 771648, 'steps': 4018, 'loss/train': 0.9305848181247711} 01/26/2022 23:40:19 - INFO - codeparrot_training - Step 4019: {'lr': 0.0004978204448397902, 'samples': 771840, 'steps': 4019, 'loss/train': 0.9519038200378418} 01/26/2022 23:40:23 - INFO - codeparrot_training - Step 4020: {'lr': 0.0004978182884074142, 'samples': 772032, 'steps': 4020, 'loss/train': 0.8829478025436401} 01/26/2022 23:40:26 - INFO 
- codeparrot_training - Step 4021: {'lr': 0.0004978161309134633, 'samples': 772224, 'steps': 4021, 'loss/train': 1.0805622339248657} 01/26/2022 23:40:31 - INFO - codeparrot_training - Step 4022: {'lr': 0.0004978139723579469, 'samples': 772416, 'steps': 4022, 'loss/train': 1.1087295413017273} 01/26/2022 23:40:34 - INFO - codeparrot_training - Step 4023: {'lr': 0.0004978118127408741, 'samples': 772608, 'steps': 4023, 'loss/train': 1.006752222776413} 01/26/2022 23:40:38 - INFO - codeparrot_training - Step 4024: {'lr': 0.0004978096520622541, 'samples': 772800, 'steps': 4024, 'loss/train': 0.9731187522411346} 01/26/2022 23:40:41 - INFO - codeparrot_training - Step 4025: {'lr': 0.0004978074903220964, 'samples': 772992, 'steps': 4025, 'loss/train': 0.8995282351970673} 01/26/2022 23:40:44 - INFO - codeparrot_training - Step 4026: {'lr': 0.0004978053275204099, 'samples': 773184, 'steps': 4026, 'loss/train': 0.9191463589668274} 01/26/2022 23:40:47 - INFO - codeparrot_training - Step 4027: {'lr': 0.0004978031636572042, 'samples': 773376, 'steps': 4027, 'loss/train': 0.44516538083553314} 01/26/2022 23:40:50 - INFO - codeparrot_training - Step 4028: {'lr': 0.0004978009987324884, 'samples': 773568, 'steps': 4028, 'loss/train': 0.3797793388366699} 01/26/2022 23:40:53 - INFO - codeparrot_training - Step 4029: {'lr': 0.0004977988327462718, 'samples': 773760, 'steps': 4029, 'loss/train': 0.8683596253395081} 01/26/2022 23:40:58 - INFO - codeparrot_training - Step 4030: {'lr': 0.0004977966656985637, 'samples': 773952, 'steps': 4030, 'loss/train': 1.2682644724845886} 01/26/2022 23:41:01 - INFO - codeparrot_training - Step 4031: {'lr': 0.0004977944975893733, 'samples': 774144, 'steps': 4031, 'loss/train': 0.7197783440351486} 01/26/2022 23:41:04 - INFO - codeparrot_training - Step 4032: {'lr': 0.00049779232841871, 'samples': 774336, 'steps': 4032, 'loss/train': 0.3036152645945549} 01/26/2022 23:41:07 - INFO - codeparrot_training - Step 4033: {'lr': 0.0004977901581865831, 'samples': 774528, 'steps': 4033, 'loss/train': 0.7276606857776642} 01/26/2022 23:41:10 - INFO - codeparrot_training - Step 4034: {'lr': 0.0004977879868930018, 'samples': 774720, 'steps': 4034, 'loss/train': 1.0405435860157013} 01/26/2022 23:41:13 - INFO - codeparrot_training - Step 4035: {'lr': 0.0004977858145379754, 'samples': 774912, 'steps': 4035, 'loss/train': 1.1217760741710663} 01/26/2022 23:41:17 - INFO - codeparrot_training - Step 4036: {'lr': 0.0004977836411215133, 'samples': 775104, 'steps': 4036, 'loss/train': 1.6915043592453003} 01/26/2022 23:41:20 - INFO - codeparrot_training - Step 4037: {'lr': 0.0004977814666436248, 'samples': 775296, 'steps': 4037, 'loss/train': 1.2210614383220673} 01/26/2022 23:41:23 - INFO - codeparrot_training - Step 4038: {'lr': 0.0004977792911043191, 'samples': 775488, 'steps': 4038, 'loss/train': 1.139863908290863} 01/26/2022 23:41:28 - INFO - codeparrot_training - Step 4039: {'lr': 0.0004977771145036056, 'samples': 775680, 'steps': 4039, 'loss/train': 0.12391293793916702} 01/26/2022 23:41:31 - INFO - codeparrot_training - Step 4040: {'lr': 0.0004977749368414937, 'samples': 775872, 'steps': 4040, 'loss/train': 0.6916334331035614} 01/26/2022 23:41:34 - INFO - codeparrot_training - Step 4041: {'lr': 0.0004977727581179926, 'samples': 776064, 'steps': 4041, 'loss/train': 1.0480160415172577} 01/26/2022 23:41:38 - INFO - codeparrot_training - Step 4042: {'lr': 0.0004977705783331117, 'samples': 776256, 'steps': 4042, 'loss/train': 0.8125028014183044} 01/26/2022 23:41:41 - INFO - codeparrot_training - Step 4043: 
{'lr': 0.0004977683974868603, 'samples': 776448, 'steps': 4043, 'loss/train': 0.9144650995731354} 01/26/2022 23:41:44 - INFO - codeparrot_training - Step 4044: {'lr': 0.0004977662155792478, 'samples': 776640, 'steps': 4044, 'loss/train': 0.8468934595584869} 01/26/2022 23:41:47 - INFO - codeparrot_training - Step 4045: {'lr': 0.0004977640326102834, 'samples': 776832, 'steps': 4045, 'loss/train': 0.2643071115016937} 01/26/2022 23:41:50 - INFO - codeparrot_training - Step 4046: {'lr': 0.0004977618485799767, 'samples': 777024, 'steps': 4046, 'loss/train': 0.6079893708229065} 01/26/2022 23:41:53 - INFO - codeparrot_training - Step 4047: {'lr': 0.0004977596634883368, 'samples': 777216, 'steps': 4047, 'loss/train': 0.9408487379550934} 01/26/2022 23:41:58 - INFO - codeparrot_training - Step 4048: {'lr': 0.0004977574773353732, 'samples': 777408, 'steps': 4048, 'loss/train': 1.0293994545936584} 01/26/2022 23:42:01 - INFO - codeparrot_training - Step 4049: {'lr': 0.0004977552901210952, 'samples': 777600, 'steps': 4049, 'loss/train': 0.9664385318756104} 01/26/2022 23:42:04 - INFO - codeparrot_training - Step 4050: {'lr': 0.0004977531018455124, 'samples': 777792, 'steps': 4050, 'loss/train': 0.6691076159477234} 01/26/2022 23:42:07 - INFO - codeparrot_training - Step 4051: {'lr': 0.0004977509125086338, 'samples': 777984, 'steps': 4051, 'loss/train': 0.7741096615791321} 01/26/2022 23:42:10 - INFO - codeparrot_training - Step 4052: {'lr': 0.000497748722110469, 'samples': 778176, 'steps': 4052, 'loss/train': 0.8022007048130035} 01/26/2022 23:42:13 - INFO - codeparrot_training - Step 4053: {'lr': 0.0004977465306510273, 'samples': 778368, 'steps': 4053, 'loss/train': 0.6400416344404221} 01/26/2022 23:42:17 - INFO - codeparrot_training - Step 4054: {'lr': 0.0004977443381303182, 'samples': 778560, 'steps': 4054, 'loss/train': 0.9124862551689148} 01/26/2022 23:42:20 - INFO - codeparrot_training - Step 4055: {'lr': 0.000497742144548351, 'samples': 778752, 'steps': 4055, 'loss/train': 0.5060795992612839} 01/26/2022 23:42:23 - INFO - codeparrot_training - Step 4056: {'lr': 0.0004977399499051351, 'samples': 778944, 'steps': 4056, 'loss/train': 1.018628865480423} 01/26/2022 23:42:27 - INFO - codeparrot_training - Step 4057: {'lr': 0.0004977377542006799, 'samples': 779136, 'steps': 4057, 'loss/train': 1.3472035825252533} 01/26/2022 23:42:30 - INFO - codeparrot_training - Step 4058: {'lr': 0.0004977355574349949, 'samples': 779328, 'steps': 4058, 'loss/train': 0.89895960688591} 01/26/2022 23:42:34 - INFO - codeparrot_training - Step 4059: {'lr': 0.0004977333596080894, 'samples': 779520, 'steps': 4059, 'loss/train': 0.33905457705259323} 01/26/2022 23:42:37 - INFO - codeparrot_training - Step 4060: {'lr': 0.0004977311607199729, 'samples': 779712, 'steps': 4060, 'loss/train': 1.1015218198299408} 01/26/2022 23:42:40 - INFO - codeparrot_training - Step 4061: {'lr': 0.0004977289607706547, 'samples': 779904, 'steps': 4061, 'loss/train': 0.9617807865142822} 01/26/2022 23:42:43 - INFO - codeparrot_training - Step 4062: {'lr': 0.0004977267597601443, 'samples': 780096, 'steps': 4062, 'loss/train': 1.0049347579479218} 01/26/2022 23:42:46 - INFO - codeparrot_training - Step 4063: {'lr': 0.0004977245576884511, 'samples': 780288, 'steps': 4063, 'loss/train': 0.701876699924469} 01/26/2022 23:42:49 - INFO - codeparrot_training - Step 4064: {'lr': 0.0004977223545555847, 'samples': 780480, 'steps': 4064, 'loss/train': 0.5152295976877213} 01/26/2022 23:42:52 - INFO - codeparrot_training - Step 4065: {'lr': 0.0004977201503615543, 
'samples': 780672, 'steps': 4065, 'loss/train': 0.6925360858440399} 01/26/2022 23:42:58 - INFO - codeparrot_training - Step 4066: {'lr': 0.0004977179451063694, 'samples': 780864, 'steps': 4066, 'loss/train': 0.9561848044395447} 01/26/2022 23:43:01 - INFO - codeparrot_training - Step 4067: {'lr': 0.0004977157387900395, 'samples': 781056, 'steps': 4067, 'loss/train': 0.40022653341293335} 01/26/2022 23:43:04 - INFO - codeparrot_training - Step 4068: {'lr': 0.0004977135314125741, 'samples': 781248, 'steps': 4068, 'loss/train': 0.7587995231151581} 01/26/2022 23:43:07 - INFO - codeparrot_training - Step 4069: {'lr': 0.0004977113229739825, 'samples': 781440, 'steps': 4069, 'loss/train': 0.1082068495452404} 01/26/2022 23:43:10 - INFO - codeparrot_training - Step 4070: {'lr': 0.0004977091134742743, 'samples': 781632, 'steps': 4070, 'loss/train': 1.0726988017559052} 01/26/2022 23:43:13 - INFO - codeparrot_training - Step 4071: {'lr': 0.0004977069029134588, 'samples': 781824, 'steps': 4071, 'loss/train': 1.2519980370998383} 01/26/2022 23:43:17 - INFO - codeparrot_training - Step 4072: {'lr': 0.0004977046912915458, 'samples': 782016, 'steps': 4072, 'loss/train': 0.8162739872932434} 01/26/2022 23:43:20 - INFO - codeparrot_training - Step 4073: {'lr': 0.0004977024786085444, 'samples': 782208, 'steps': 4073, 'loss/train': 0.7974226176738739} 01/26/2022 23:43:23 - INFO - codeparrot_training - Step 4074: {'lr': 0.0004977002648644642, 'samples': 782400, 'steps': 4074, 'loss/train': 0.5486471056938171} 01/26/2022 23:43:27 - INFO - codeparrot_training - Step 4075: {'lr': 0.0004976980500593149, 'samples': 782592, 'steps': 4075, 'loss/train': 0.9146117269992828} 01/26/2022 23:43:30 - INFO - codeparrot_training - Step 4076: {'lr': 0.0004976958341931057, 'samples': 782784, 'steps': 4076, 'loss/train': 0.33824698626995087} 01/26/2022 23:43:33 - INFO - codeparrot_training - Step 4077: {'lr': 0.0004976936172658462, 'samples': 782976, 'steps': 4077, 'loss/train': 0.5542143434286118} 01/26/2022 23:43:37 - INFO - codeparrot_training - Step 4078: {'lr': 0.0004976913992775459, 'samples': 783168, 'steps': 4078, 'loss/train': 0.5380898863077164} 01/26/2022 23:43:40 - INFO - codeparrot_training - Step 4079: {'lr': 0.0004976891802282143, 'samples': 783360, 'steps': 4079, 'loss/train': 1.0717923045158386} 01/26/2022 23:43:43 - INFO - codeparrot_training - Step 4080: {'lr': 0.0004976869601178609, 'samples': 783552, 'steps': 4080, 'loss/train': 1.0646085441112518} 01/26/2022 23:43:46 - INFO - codeparrot_training - Step 4081: {'lr': 0.0004976847389464952, 'samples': 783744, 'steps': 4081, 'loss/train': 0.5765394866466522} 01/26/2022 23:43:49 - INFO - codeparrot_training - Step 4082: {'lr': 0.0004976825167141268, 'samples': 783936, 'steps': 4082, 'loss/train': 0.9304335415363312} 01/26/2022 23:43:52 - INFO - codeparrot_training - Step 4083: {'lr': 0.000497680293420765, 'samples': 784128, 'steps': 4083, 'loss/train': 0.6233063042163849} 01/26/2022 23:43:57 - INFO - codeparrot_training - Step 4084: {'lr': 0.0004976780690664196, 'samples': 784320, 'steps': 4084, 'loss/train': 1.4184984862804413} 01/26/2022 23:44:00 - INFO - codeparrot_training - Step 4085: {'lr': 0.0004976758436511, 'samples': 784512, 'steps': 4085, 'loss/train': 0.9929869472980499} 01/26/2022 23:44:03 - INFO - codeparrot_training - Step 4086: {'lr': 0.0004976736171748156, 'samples': 784704, 'steps': 4086, 'loss/train': 1.144044041633606} 01/26/2022 23:44:07 - INFO - codeparrot_training - Step 4087: {'lr': 0.0004976713896375762, 'samples': 784896, 'steps': 4087, 
'loss/train': 0.5358744263648987} 01/26/2022 23:44:10 - INFO - codeparrot_training - Step 4088: {'lr': 0.0004976691610393911, 'samples': 785088, 'steps': 4088, 'loss/train': 0.8040524125099182} 01/26/2022 23:44:13 - INFO - codeparrot_training - Step 4089: {'lr': 0.0004976669313802701, 'samples': 785280, 'steps': 4089, 'loss/train': 1.3938780426979065} 01/26/2022 23:44:16 - INFO - codeparrot_training - Step 4090: {'lr': 0.0004976647006602225, 'samples': 785472, 'steps': 4090, 'loss/train': 0.4934360682964325} 01/26/2022 23:44:19 - INFO - codeparrot_training - Step 4091: {'lr': 0.0004976624688792581, 'samples': 785664, 'steps': 4091, 'loss/train': 0.7671352028846741} 01/26/2022 23:44:22 - INFO - codeparrot_training - Step 4092: {'lr': 0.0004976602360373861, 'samples': 785856, 'steps': 4092, 'loss/train': 0.6154375076293945} 01/26/2022 23:44:27 - INFO - codeparrot_training - Step 4093: {'lr': 0.0004976580021346164, 'samples': 786048, 'steps': 4093, 'loss/train': 1.4636439979076385} 01/26/2022 23:44:30 - INFO - codeparrot_training - Step 4094: {'lr': 0.0004976557671709585, 'samples': 786240, 'steps': 4094, 'loss/train': 0.8187973201274872} 01/26/2022 23:44:33 - INFO - codeparrot_training - Step 4095: {'lr': 0.0004976535311464219, 'samples': 786432, 'steps': 4095, 'loss/train': 0.41239728033542633} 01/26/2022 23:44:36 - INFO - codeparrot_training - Step 4096: {'lr': 0.0004976512940610162, 'samples': 786624, 'steps': 4096, 'loss/train': 0.6183968782424927} 01/26/2022 23:44:39 - INFO - codeparrot_training - Step 4097: {'lr': 0.0004976490559147511, 'samples': 786816, 'steps': 4097, 'loss/train': 0.6674591302871704} 01/26/2022 23:44:42 - INFO - codeparrot_training - Step 4098: {'lr': 0.0004976468167076359, 'samples': 787008, 'steps': 4098, 'loss/train': 1.2501537501811981} 01/26/2022 23:44:46 - INFO - codeparrot_training - Step 4099: {'lr': 0.0004976445764396805, 'samples': 787200, 'steps': 4099, 'loss/train': 0.6338312029838562} 01/26/2022 23:44:49 - INFO - codeparrot_training - Step 4100: {'lr': 0.0004976423351108943, 'samples': 787392, 'steps': 4100, 'loss/train': 0.6123462617397308} 01/26/2022 23:44:54 - INFO - codeparrot_training - Step 4101: {'lr': 0.0004976400927212871, 'samples': 787584, 'steps': 4101, 'loss/train': 0.9185439348220825} 01/26/2022 23:44:57 - INFO - codeparrot_training - Step 4102: {'lr': 0.0004976378492708681, 'samples': 787776, 'steps': 4102, 'loss/train': 1.276155263185501} 01/26/2022 23:45:00 - INFO - codeparrot_training - Step 4103: {'lr': 0.0004976356047596475, 'samples': 787968, 'steps': 4103, 'loss/train': 1.0947949290275574} 01/26/2022 23:45:03 - INFO - codeparrot_training - Step 4104: {'lr': 0.0004976333591876344, 'samples': 788160, 'steps': 4104, 'loss/train': 0.904954344034195} 01/26/2022 23:45:06 - INFO - codeparrot_training - Step 4105: {'lr': 0.0004976311125548387, 'samples': 788352, 'steps': 4105, 'loss/train': 0.5101485103368759} 01/26/2022 23:45:10 - INFO - codeparrot_training - Step 4106: {'lr': 0.00049762886486127, 'samples': 788544, 'steps': 4106, 'loss/train': 0.6014846563339233} 01/26/2022 23:45:13 - INFO - codeparrot_training - Step 4107: {'lr': 0.0004976266161069379, 'samples': 788736, 'steps': 4107, 'loss/train': 0.7939410209655762} 01/26/2022 23:45:16 - INFO - codeparrot_training - Step 4108: {'lr': 0.0004976243662918518, 'samples': 788928, 'steps': 4108, 'loss/train': 0.8927273154258728} 01/26/2022 23:45:19 - INFO - codeparrot_training - Step 4109: {'lr': 0.0004976221154160217, 'samples': 789120, 'steps': 4109, 'loss/train': 0.9040933549404144} 
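Every training entry in this log has the same shape: a timestamp, the codeparrot_training logger name, and a Python dict literal with 'lr', 'samples', 'steps' and 'loss/train'. A small self-contained helper like the sketch below (not part of the training script; the file path and function name are placeholders) can recover those series from the raw log for plotting or averaging.

import ast
import re

# Matches e.g. "Step 4100: {'lr': 0.0004976423351108943, 'samples': 787392, 'steps': 4100, 'loss/train': 0.6123462617397308}"
STEP_RE = re.compile(r"Step \d+: (\{[^}]*\})")

def parse_metrics(log_text):
    """Return the per-step metric dicts ('lr', 'samples', 'steps', 'loss/train') in log order."""
    return [ast.literal_eval(match.group(1)) for match in STEP_RE.finditer(log_text)]

# Example: a smoothed view of the loss between steps 4000 and 4100 of this run.
# metrics = parse_metrics(open("training.log").read())   # "training.log" is a placeholder path
# window = [m["loss/train"] for m in metrics if 4000 <= m["steps"] < 4100]
# print(len(window), sum(window) / len(window))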
01/26/2022 23:45:23 - INFO - codeparrot_training - Step 4110: {'lr': 0.0004976198634794571, 'samples': 789312, 'steps': 4110, 'loss/train': 1.3373994827270508} 01/26/2022 23:45:27 - INFO - codeparrot_training - Step 4111: {'lr': 0.0004976176104821675, 'samples': 789504, 'steps': 4111, 'loss/train': 0.9225672483444214} 01/26/2022 23:45:30 - INFO - codeparrot_training - Step 4112: {'lr': 0.0004976153564241628, 'samples': 789696, 'steps': 4112, 'loss/train': 0.8254786133766174} 01/26/2022 23:45:33 - INFO - codeparrot_training - Step 4113: {'lr': 0.0004976131013054526, 'samples': 789888, 'steps': 4113, 'loss/train': 0.9444302022457123} 01/26/2022 23:45:36 - INFO - codeparrot_training - Step 4114: {'lr': 0.0004976108451260464, 'samples': 790080, 'steps': 4114, 'loss/train': 1.0143995583057404} 01/26/2022 23:45:39 - INFO - codeparrot_training - Step 4115: {'lr': 0.000497608587885954, 'samples': 790272, 'steps': 4115, 'loss/train': 0.4704884737730026} 01/26/2022 23:45:42 - INFO - codeparrot_training - Step 4116: {'lr': 0.0004976063295851849, 'samples': 790464, 'steps': 4116, 'loss/train': 0.5225976258516312} 01/26/2022 23:45:45 - INFO - codeparrot_training - Step 4117: {'lr': 0.000497604070223749, 'samples': 790656, 'steps': 4117, 'loss/train': 0.9745803773403168} 01/26/2022 23:45:49 - INFO - codeparrot_training - Step 4118: {'lr': 0.0004976018098016559, 'samples': 790848, 'steps': 4118, 'loss/train': 1.004689782857895} 01/26/2022 23:45:54 - INFO - codeparrot_training - Step 4119: {'lr': 0.0004975995483189153, 'samples': 791040, 'steps': 4119, 'loss/train': 0.821289986371994} 01/26/2022 23:45:57 - INFO - codeparrot_training - Step 4120: {'lr': 0.0004975972857755368, 'samples': 791232, 'steps': 4120, 'loss/train': 0.6787645518779755} 01/26/2022 23:46:00 - INFO - codeparrot_training - Step 4121: {'lr': 0.0004975950221715302, 'samples': 791424, 'steps': 4121, 'loss/train': 0.558518797159195} 01/26/2022 23:46:03 - INFO - codeparrot_training - Step 4122: {'lr': 0.0004975927575069051, 'samples': 791616, 'steps': 4122, 'loss/train': 0.6002683639526367} 01/26/2022 23:46:07 - INFO - codeparrot_training - Step 4123: {'lr': 0.0004975904917816713, 'samples': 791808, 'steps': 4123, 'loss/train': 1.20290145277977} 01/26/2022 23:46:10 - INFO - codeparrot_training - Step 4124: {'lr': 0.0004975882249958385, 'samples': 792000, 'steps': 4124, 'loss/train': 0.9254344403743744} 01/26/2022 23:46:13 - INFO - codeparrot_training - Step 4125: {'lr': 0.0004975859571494162, 'samples': 792192, 'steps': 4125, 'loss/train': 1.0770629346370697} 01/26/2022 23:46:16 - INFO - codeparrot_training - Step 4126: {'lr': 0.0004975836882424143, 'samples': 792384, 'steps': 4126, 'loss/train': 0.45770788192749023} 01/26/2022 23:46:19 - INFO - codeparrot_training - Step 4127: {'lr': 0.0004975814182748426, 'samples': 792576, 'steps': 4127, 'loss/train': 0.8026349544525146} 01/26/2022 23:46:24 - INFO - codeparrot_training - Step 4128: {'lr': 0.0004975791472467108, 'samples': 792768, 'steps': 4128, 'loss/train': 0.8542385101318359} 01/26/2022 23:46:27 - INFO - codeparrot_training - Step 4129: {'lr': 0.0004975768751580283, 'samples': 792960, 'steps': 4129, 'loss/train': 1.0628069043159485} 01/26/2022 23:46:30 - INFO - codeparrot_training - Step 4130: {'lr': 0.0004975746020088052, 'samples': 793152, 'steps': 4130, 'loss/train': 0.9547072649002075} 01/26/2022 23:46:33 - INFO - codeparrot_training - Step 4131: {'lr': 0.0004975723277990512, 'samples': 793344, 'steps': 4131, 'loss/train': 0.7897018790245056} 01/26/2022 23:46:36 - INFO - 
codeparrot_training - Step 4132: {'lr': 0.0004975700525287758, 'samples': 793536, 'steps': 4132, 'loss/train': 0.5878535360097885} 01/26/2022 23:46:40 - INFO - codeparrot_training - Step 4133: {'lr': 0.0004975677761979891, 'samples': 793728, 'steps': 4133, 'loss/train': 1.266051560640335} 01/26/2022 23:46:43 - INFO - codeparrot_training - Step 4134: {'lr': 0.0004975654988067005, 'samples': 793920, 'steps': 4134, 'loss/train': 0.8177362382411957} 01/26/2022 23:46:46 - INFO - codeparrot_training - Step 4135: {'lr': 0.00049756322035492, 'samples': 794112, 'steps': 4135, 'loss/train': 0.2738635540008545} 01/26/2022 23:46:49 - INFO - codeparrot_training - Step 4136: {'lr': 0.0004975609408426572, 'samples': 794304, 'steps': 4136, 'loss/train': 0.7404840141534805} 01/26/2022 23:46:55 - INFO - codeparrot_training - Step 4137: {'lr': 0.000497558660269922, 'samples': 794496, 'steps': 4137, 'loss/train': 0.8524323105812073} 01/26/2022 23:46:58 - INFO - codeparrot_training - Step 4138: {'lr': 0.0004975563786367241, 'samples': 794688, 'steps': 4138, 'loss/train': 0.7786914110183716} 01/26/2022 23:47:01 - INFO - codeparrot_training - Step 4139: {'lr': 0.0004975540959430732, 'samples': 794880, 'steps': 4139, 'loss/train': 0.20960108935832977} 01/26/2022 23:47:04 - INFO - codeparrot_training - Step 4140: {'lr': 0.0004975518121889793, 'samples': 795072, 'steps': 4140, 'loss/train': 0.8394540846347809} 01/26/2022 23:47:07 - INFO - codeparrot_training - Step 4141: {'lr': 0.000497549527374452, 'samples': 795264, 'steps': 4141, 'loss/train': 1.0288716852664948} 01/26/2022 23:47:10 - INFO - codeparrot_training - Step 4142: {'lr': 0.000497547241499501, 'samples': 795456, 'steps': 4142, 'loss/train': 0.5797905474901199} 01/26/2022 23:47:14 - INFO - codeparrot_training - Step 4143: {'lr': 0.0004975449545641364, 'samples': 795648, 'steps': 4143, 'loss/train': 0.6004155278205872} 01/26/2022 23:47:17 - INFO - codeparrot_training - Step 4144: {'lr': 0.0004975426665683678, 'samples': 795840, 'steps': 4144, 'loss/train': 1.0281884372234344} 01/26/2022 23:47:20 - INFO - codeparrot_training - Step 4145: {'lr': 0.000497540377512205, 'samples': 796032, 'steps': 4145, 'loss/train': 0.9763956964015961} 01/26/2022 23:47:23 - INFO - codeparrot_training - Step 4146: {'lr': 0.0004975380873956577, 'samples': 796224, 'steps': 4146, 'loss/train': 0.7178889513015747} 01/26/2022 23:47:28 - INFO - codeparrot_training - Step 4147: {'lr': 0.0004975357962187359, 'samples': 796416, 'steps': 4147, 'loss/train': 0.5354655683040619} 01/26/2022 23:47:31 - INFO - codeparrot_training - Step 4148: {'lr': 0.0004975335039814493, 'samples': 796608, 'steps': 4148, 'loss/train': 1.4709855616092682} 01/26/2022 23:47:35 - INFO - codeparrot_training - Step 4149: {'lr': 0.0004975312106838079, 'samples': 796800, 'steps': 4149, 'loss/train': 0.4221367835998535} 01/26/2022 23:47:38 - INFO - codeparrot_training - Step 4150: {'lr': 0.0004975289163258214, 'samples': 796992, 'steps': 4150, 'loss/train': 0.3280765861272812} 01/26/2022 23:47:41 - INFO - codeparrot_training - Step 4151: {'lr': 0.0004975266209074995, 'samples': 797184, 'steps': 4151, 'loss/train': 0.7372308075428009} 01/26/2022 23:47:44 - INFO - codeparrot_training - Step 4152: {'lr': 0.0004975243244288522, 'samples': 797376, 'steps': 4152, 'loss/train': 0.9792649447917938} 01/26/2022 23:47:47 - INFO - codeparrot_training - Step 4153: {'lr': 0.0004975220268898893, 'samples': 797568, 'steps': 4153, 'loss/train': 1.0738156735897064} 01/26/2022 23:47:50 - INFO - codeparrot_training - Step 4154: {'lr': 
0.0004975197282906207, 'samples': 797760, 'steps': 4154, 'loss/train': 1.0334748029708862} 01/26/2022 23:47:55 - INFO - codeparrot_training - Step 4155: {'lr': 0.0004975174286310562, 'samples': 797952, 'steps': 4155, 'loss/train': 1.284747451543808} 01/26/2022 23:47:58 - INFO - codeparrot_training - Step 4156: {'lr': 0.0004975151279112054, 'samples': 798144, 'steps': 4156, 'loss/train': 1.1183123588562012} 01/26/2022 23:48:01 - INFO - codeparrot_training - Step 4157: {'lr': 0.0004975128261310787, 'samples': 798336, 'steps': 4157, 'loss/train': 0.21499959379434586} 01/26/2022 23:48:04 - INFO - codeparrot_training - Step 4158: {'lr': 0.0004975105232906854, 'samples': 798528, 'steps': 4158, 'loss/train': 0.9238461256027222} 01/26/2022 23:48:08 - INFO - codeparrot_training - Step 4159: {'lr': 0.0004975082193900357, 'samples': 798720, 'steps': 4159, 'loss/train': 0.6936242133378983} 01/26/2022 23:48:11 - INFO - codeparrot_training - Step 4160: {'lr': 0.0004975059144291394, 'samples': 798912, 'steps': 4160, 'loss/train': 0.9805659055709839} 01/26/2022 23:48:14 - INFO - codeparrot_training - Step 4161: {'lr': 0.0004975036084080063, 'samples': 799104, 'steps': 4161, 'loss/train': 0.9104909598827362} 01/26/2022 23:48:17 - INFO - codeparrot_training - Step 4162: {'lr': 0.0004975013013266464, 'samples': 799296, 'steps': 4162, 'loss/train': 1.1004058420658112} 01/26/2022 23:48:20 - INFO - codeparrot_training - Step 4163: {'lr': 0.0004974989931850695, 'samples': 799488, 'steps': 4163, 'loss/train': 0.7261201590299606} 01/26/2022 23:48:25 - INFO - codeparrot_training - Step 4164: {'lr': 0.0004974966839832855, 'samples': 799680, 'steps': 4164, 'loss/train': 0.8086262047290802} 01/26/2022 23:48:28 - INFO - codeparrot_training - Step 4165: {'lr': 0.0004974943737213042, 'samples': 799872, 'steps': 4165, 'loss/train': 0.9609712064266205} 01/26/2022 23:48:32 - INFO - codeparrot_training - Step 4166: {'lr': 0.0004974920623991356, 'samples': 800064, 'steps': 4166, 'loss/train': 1.4015211760997772} 01/26/2022 23:48:35 - INFO - codeparrot_training - Step 4167: {'lr': 0.0004974897500167898, 'samples': 800256, 'steps': 4167, 'loss/train': 1.0761483907699585} 01/26/2022 23:48:38 - INFO - codeparrot_training - Step 4168: {'lr': 0.0004974874365742763, 'samples': 800448, 'steps': 4168, 'loss/train': 1.4085007309913635} 01/26/2022 23:48:41 - INFO - codeparrot_training - Step 4169: {'lr': 0.0004974851220716053, 'samples': 800640, 'steps': 4169, 'loss/train': 1.1052669882774353} 01/26/2022 23:48:44 - INFO - codeparrot_training - Step 4170: {'lr': 0.0004974828065087867, 'samples': 800832, 'steps': 4170, 'loss/train': 0.8475576639175415} 01/26/2022 23:48:47 - INFO - codeparrot_training - Step 4171: {'lr': 0.0004974804898858302, 'samples': 801024, 'steps': 4171, 'loss/train': 1.0296071469783783} 01/26/2022 23:48:50 - INFO - codeparrot_training - Step 4172: {'lr': 0.0004974781722027459, 'samples': 801216, 'steps': 4172, 'loss/train': 0.9455699622631073} 01/26/2022 23:48:55 - INFO - codeparrot_training - Step 4173: {'lr': 0.0004974758534595436, 'samples': 801408, 'steps': 4173, 'loss/train': 1.0535179674625397} 01/26/2022 23:48:58 - INFO - codeparrot_training - Step 4174: {'lr': 0.0004974735336562335, 'samples': 801600, 'steps': 4174, 'loss/train': 1.016805499792099} 01/26/2022 23:49:01 - INFO - codeparrot_training - Step 4175: {'lr': 0.0004974712127928252, 'samples': 801792, 'steps': 4175, 'loss/train': 1.091554194688797} 01/26/2022 23:49:04 - INFO - codeparrot_training - Step 4176: {'lr': 0.000497468890869329, 'samples': 
801984, 'steps': 4176, 'loss/train': 0.6676788479089737} 01/26/2022 23:49:08 - INFO - codeparrot_training - Step 4177: {'lr': 0.0004974665678857545, 'samples': 802176, 'steps': 4177, 'loss/train': 0.47011126577854156} 01/26/2022 23:49:11 - INFO - codeparrot_training - Step 4178: {'lr': 0.0004974642438421118, 'samples': 802368, 'steps': 4178, 'loss/train': 0.9783768653869629} 01/26/2022 23:49:14 - INFO - codeparrot_training - Step 4179: {'lr': 0.0004974619187384109, 'samples': 802560, 'steps': 4179, 'loss/train': 1.319285362958908} 01/26/2022 23:49:17 - INFO - codeparrot_training - Step 4180: {'lr': 0.0004974595925746618, 'samples': 802752, 'steps': 4180, 'loss/train': 0.6020345985889435} 01/26/2022 23:49:20 - INFO - codeparrot_training - Step 4181: {'lr': 0.0004974572653508742, 'samples': 802944, 'steps': 4181, 'loss/train': 0.9198919236660004} 01/26/2022 23:49:25 - INFO - codeparrot_training - Step 4182: {'lr': 0.0004974549370670584, 'samples': 803136, 'steps': 4182, 'loss/train': 0.4004551023244858} 01/26/2022 23:49:28 - INFO - codeparrot_training - Step 4183: {'lr': 0.0004974526077232242, 'samples': 803328, 'steps': 4183, 'loss/train': 1.1850870251655579} 01/26/2022 23:49:31 - INFO - codeparrot_training - Step 4184: {'lr': 0.0004974502773193815, 'samples': 803520, 'steps': 4184, 'loss/train': 0.9816985130310059} 01/26/2022 23:49:34 - INFO - codeparrot_training - Step 4185: {'lr': 0.0004974479458555405, 'samples': 803712, 'steps': 4185, 'loss/train': 0.6522116214036942} 01/26/2022 23:49:37 - INFO - codeparrot_training - Step 4186: {'lr': 0.000497445613331711, 'samples': 803904, 'steps': 4186, 'loss/train': 0.8881154358386993} 01/26/2022 23:49:40 - INFO - codeparrot_training - Step 4187: {'lr': 0.0004974432797479032, 'samples': 804096, 'steps': 4187, 'loss/train': 1.056018054485321} 01/26/2022 23:49:44 - INFO - codeparrot_training - Step 4188: {'lr': 0.0004974409451041268, 'samples': 804288, 'steps': 4188, 'loss/train': 0.4770399034023285} 01/26/2022 23:49:47 - INFO - codeparrot_training - Step 4189: {'lr': 0.0004974386094003921, 'samples': 804480, 'steps': 4189, 'loss/train': 0.7322945147752762} 01/26/2022 23:49:50 - INFO - codeparrot_training - Step 4190: {'lr': 0.0004974362726367089, 'samples': 804672, 'steps': 4190, 'loss/train': 0.7078633904457092} 01/26/2022 23:49:54 - INFO - codeparrot_training - Step 4191: {'lr': 0.0004974339348130873, 'samples': 804864, 'steps': 4191, 'loss/train': 0.8855026066303253} 01/26/2022 23:49:57 - INFO - codeparrot_training - Step 4192: {'lr': 0.0004974315959295373, 'samples': 805056, 'steps': 4192, 'loss/train': 0.6886641383171082} 01/26/2022 23:50:01 - INFO - codeparrot_training - Step 4193: {'lr': 0.0004974292559860688, 'samples': 805248, 'steps': 4193, 'loss/train': 0.8293375074863434} 01/26/2022 23:50:04 - INFO - codeparrot_training - Step 4194: {'lr': 0.0004974269149826921, 'samples': 805440, 'steps': 4194, 'loss/train': 0.7212063521146774} 01/26/2022 23:50:07 - INFO - codeparrot_training - Step 4195: {'lr': 0.0004974245729194169, 'samples': 805632, 'steps': 4195, 'loss/train': 0.730066180229187} 01/26/2022 23:50:10 - INFO - codeparrot_training - Step 4196: {'lr': 0.0004974222297962535, 'samples': 805824, 'steps': 4196, 'loss/train': 0.8033663034439087} 01/26/2022 23:50:13 - INFO - codeparrot_training - Step 4197: {'lr': 0.0004974198856132118, 'samples': 806016, 'steps': 4197, 'loss/train': 0.8297343850135803} 01/26/2022 23:50:16 - INFO - codeparrot_training - Step 4198: {'lr': 0.0004974175403703019, 'samples': 806208, 'steps': 4198, 'loss/train': 
1.306143879890442} 01/26/2022 23:50:19 - INFO - codeparrot_training - Step 4199: {'lr': 0.0004974151940675338, 'samples': 806400, 'steps': 4199, 'loss/train': 1.2323322594165802} 01/26/2022 23:50:25 - INFO - codeparrot_training - Step 4200: {'lr': 0.0004974128467049176, 'samples': 806592, 'steps': 4200, 'loss/train': 1.1601318418979645} 01/26/2022 23:50:28 - INFO - codeparrot_training - Step 4201: {'lr': 0.0004974104982824632, 'samples': 806784, 'steps': 4201, 'loss/train': 0.7441605627536774} 01/26/2022 23:50:31 - INFO - codeparrot_training - Step 4202: {'lr': 0.0004974081488001809, 'samples': 806976, 'steps': 4202, 'loss/train': 0.7756482660770416} 01/26/2022 23:50:34 - INFO - codeparrot_training - Step 4203: {'lr': 0.0004974057982580806, 'samples': 807168, 'steps': 4203, 'loss/train': 1.2825631499290466} 01/26/2022 23:50:37 - INFO - codeparrot_training - Step 4204: {'lr': 0.0004974034466561725, 'samples': 807360, 'steps': 4204, 'loss/train': 1.7076825499534607} 01/26/2022 23:50:40 - INFO - codeparrot_training - Step 4205: {'lr': 0.0004974010939944667, 'samples': 807552, 'steps': 4205, 'loss/train': 0.5591617226600647} 01/26/2022 23:50:43 - INFO - codeparrot_training - Step 4206: {'lr': 0.0004973987402729729, 'samples': 807744, 'steps': 4206, 'loss/train': 1.0574645698070526} 01/26/2022 23:50:47 - INFO - codeparrot_training - Step 4207: {'lr': 0.0004973963854917016, 'samples': 807936, 'steps': 4207, 'loss/train': 0.8675772249698639} 01/26/2022 23:50:50 - INFO - codeparrot_training - Step 4208: {'lr': 0.0004973940296506627, 'samples': 808128, 'steps': 4208, 'loss/train': 0.9477838575839996} 01/26/2022 23:50:54 - INFO - codeparrot_training - Step 4209: {'lr': 0.0004973916727498664, 'samples': 808320, 'steps': 4209, 'loss/train': 1.2293379306793213} 01/26/2022 23:50:57 - INFO - codeparrot_training - Step 4210: {'lr': 0.0004973893147893227, 'samples': 808512, 'steps': 4210, 'loss/train': 1.2390592396259308} 01/26/2022 23:51:00 - INFO - codeparrot_training - Step 4211: {'lr': 0.0004973869557690417, 'samples': 808704, 'steps': 4211, 'loss/train': 1.5186637043952942} 01/26/2022 23:51:04 - INFO - codeparrot_training - Step 4212: {'lr': 0.0004973845956890336, 'samples': 808896, 'steps': 4212, 'loss/train': 0.4894462823867798} 01/26/2022 23:51:07 - INFO - codeparrot_training - Step 4213: {'lr': 0.0004973822345493084, 'samples': 809088, 'steps': 4213, 'loss/train': 0.8664330840110779} 01/26/2022 23:51:10 - INFO - codeparrot_training - Step 4214: {'lr': 0.0004973798723498762, 'samples': 809280, 'steps': 4214, 'loss/train': 0.5829682946205139} 01/26/2022 23:51:13 - INFO - codeparrot_training - Step 4215: {'lr': 0.0004973775090907473, 'samples': 809472, 'steps': 4215, 'loss/train': 0.5618193447589874} 01/26/2022 23:51:16 - INFO - codeparrot_training - Step 4216: {'lr': 0.0004973751447719316, 'samples': 809664, 'steps': 4216, 'loss/train': 0.7617257237434387} 01/26/2022 23:51:21 - INFO - codeparrot_training - Step 4217: {'lr': 0.0004973727793934394, 'samples': 809856, 'steps': 4217, 'loss/train': 0.724897563457489} 01/26/2022 23:51:24 - INFO - codeparrot_training - Step 4218: {'lr': 0.0004973704129552808, 'samples': 810048, 'steps': 4218, 'loss/train': 0.6089994013309479} 01/26/2022 23:51:27 - INFO - codeparrot_training - Step 4219: {'lr': 0.0004973680454574657, 'samples': 810240, 'steps': 4219, 'loss/train': 0.9128605127334595} 01/26/2022 23:51:30 - INFO - codeparrot_training - Step 4220: {'lr': 0.0004973656769000046, 'samples': 810432, 'steps': 4220, 'loss/train': 1.1537082195281982} 01/26/2022 
23:51:33 - INFO - codeparrot_training - Step 4221: {'lr': 0.0004973633072829075, 'samples': 810624, 'steps': 4221, 'loss/train': 0.47289061546325684} 01/26/2022 23:51:37 - INFO - codeparrot_training - Step 4222: {'lr': 0.0004973609366061845, 'samples': 810816, 'steps': 4222, 'loss/train': 0.7365460395812988} 01/26/2022 23:51:40 - INFO - codeparrot_training - Step 4223: {'lr': 0.0004973585648698457, 'samples': 811008, 'steps': 4223, 'loss/train': 0.7732673585414886} 01/26/2022 23:51:43 - INFO - codeparrot_training - Step 4224: {'lr': 0.0004973561920739015, 'samples': 811200, 'steps': 4224, 'loss/train': 0.6346185207366943} 01/26/2022 23:51:46 - INFO - codeparrot_training - Step 4225: {'lr': 0.0004973538182183618, 'samples': 811392, 'steps': 4225, 'loss/train': 0.8652702569961548} 01/26/2022 23:51:49 - INFO - codeparrot_training - Step 4226: {'lr': 0.000497351443303237, 'samples': 811584, 'steps': 4226, 'loss/train': 1.4073538184165955} 01/26/2022 23:51:54 - INFO - codeparrot_training - Step 4227: {'lr': 0.0004973490673285372, 'samples': 811776, 'steps': 4227, 'loss/train': 0.43933045864105225} 01/26/2022 23:51:57 - INFO - codeparrot_training - Step 4228: {'lr': 0.0004973466902942723, 'samples': 811968, 'steps': 4228, 'loss/train': 0.9941853582859039} 01/26/2022 23:52:00 - INFO - codeparrot_training - Step 4229: {'lr': 0.0004973443122004529, 'samples': 812160, 'steps': 4229, 'loss/train': 0.46848835051059723} 01/26/2022 23:52:04 - INFO - codeparrot_training - Step 4230: {'lr': 0.0004973419330470891, 'samples': 812352, 'steps': 4230, 'loss/train': 0.6798611730337143} 01/26/2022 23:52:07 - INFO - codeparrot_training - Step 4231: {'lr': 0.0004973395528341908, 'samples': 812544, 'steps': 4231, 'loss/train': 0.8406854867935181} 01/26/2022 23:52:10 - INFO - codeparrot_training - Step 4232: {'lr': 0.0004973371715617685, 'samples': 812736, 'steps': 4232, 'loss/train': 2.0641048550605774} 01/26/2022 23:52:13 - INFO - codeparrot_training - Step 4233: {'lr': 0.0004973347892298322, 'samples': 812928, 'steps': 4233, 'loss/train': 1.0966025590896606} 01/26/2022 23:52:16 - INFO - codeparrot_training - Step 4234: {'lr': 0.0004973324058383924, 'samples': 813120, 'steps': 4234, 'loss/train': 0.661454513669014} 01/26/2022 23:52:21 - INFO - codeparrot_training - Step 4235: {'lr': 0.0004973300213874589, 'samples': 813312, 'steps': 4235, 'loss/train': 0.3935379534959793} 01/26/2022 23:52:24 - INFO - codeparrot_training - Step 4236: {'lr': 0.0004973276358770422, 'samples': 813504, 'steps': 4236, 'loss/train': 0.9501077234745026} 01/26/2022 23:52:27 - INFO - codeparrot_training - Step 4237: {'lr': 0.0004973252493071525, 'samples': 813696, 'steps': 4237, 'loss/train': 0.8998911380767822} 01/26/2022 23:52:30 - INFO - codeparrot_training - Step 4238: {'lr': 0.0004973228616777999, 'samples': 813888, 'steps': 4238, 'loss/train': 0.7878801226615906} 01/26/2022 23:52:33 - INFO - codeparrot_training - Step 4239: {'lr': 0.0004973204729889946, 'samples': 814080, 'steps': 4239, 'loss/train': 0.6122044175863266} 01/26/2022 23:52:36 - INFO - codeparrot_training - Step 4240: {'lr': 0.0004973180832407472, 'samples': 814272, 'steps': 4240, 'loss/train': 1.5207231044769287} 01/26/2022 23:52:39 - INFO - codeparrot_training - Step 4241: {'lr': 0.0004973156924330674, 'samples': 814464, 'steps': 4241, 'loss/train': 0.9133970439434052} 01/26/2022 23:52:43 - INFO - codeparrot_training - Step 4242: {'lr': 0.0004973133005659658, 'samples': 814656, 'steps': 4242, 'loss/train': 0.7415607869625092} 01/26/2022 23:52:46 - INFO - 
codeparrot_training - Step 4243: {'lr': 0.0004973109076394526, 'samples': 814848, 'steps': 4243, 'loss/train': 0.31982284784317017} 01/26/2022 23:52:51 - INFO - codeparrot_training - Step 4244: {'lr': 0.0004973085136535379, 'samples': 815040, 'steps': 4244, 'loss/train': 1.1002086997032166} 01/26/2022 23:52:54 - INFO - codeparrot_training - Step 4245: {'lr': 0.000497306118608232, 'samples': 815232, 'steps': 4245, 'loss/train': 1.2978757917881012} 01/26/2022 23:52:57 - INFO - codeparrot_training - Step 4246: {'lr': 0.0004973037225035454, 'samples': 815424, 'steps': 4246, 'loss/train': 1.024344563484192} 01/26/2022 23:53:00 - INFO - codeparrot_training - Step 4247: {'lr': 0.0004973013253394881, 'samples': 815616, 'steps': 4247, 'loss/train': 1.106774926185608} 01/26/2022 23:53:03 - INFO - codeparrot_training - Step 4248: {'lr': 0.0004972989271160705, 'samples': 815808, 'steps': 4248, 'loss/train': 0.9135290086269379} 01/26/2022 23:53:06 - INFO - codeparrot_training - Step 4249: {'lr': 0.0004972965278333028, 'samples': 816000, 'steps': 4249, 'loss/train': 0.7108882069587708} 01/26/2022 23:53:10 - INFO - codeparrot_training - Step 4250: {'lr': 0.0004972941274911952, 'samples': 816192, 'steps': 4250, 'loss/train': 1.266545444726944} 01/26/2022 23:53:13 - INFO - codeparrot_training - Step 4251: {'lr': 0.0004972917260897583, 'samples': 816384, 'steps': 4251, 'loss/train': 1.1872006952762604} 01/26/2022 23:53:16 - INFO - codeparrot_training - Step 4252: {'lr': 0.0004972893236290019, 'samples': 816576, 'steps': 4252, 'loss/train': 0.8439594805240631} 01/26/2022 23:53:20 - INFO - codeparrot_training - Step 4253: {'lr': 0.0004972869201089367, 'samples': 816768, 'steps': 4253, 'loss/train': 1.2183307707309723} 01/26/2022 23:53:24 - INFO - codeparrot_training - Step 4254: {'lr': 0.0004972845155295729, 'samples': 816960, 'steps': 4254, 'loss/train': 0.6883189827203751} 01/26/2022 23:53:27 - INFO - codeparrot_training - Step 4255: {'lr': 0.0004972821098909207, 'samples': 817152, 'steps': 4255, 'loss/train': 0.9576145112514496} 01/26/2022 23:53:30 - INFO - codeparrot_training - Step 4256: {'lr': 0.0004972797031929904, 'samples': 817344, 'steps': 4256, 'loss/train': 0.7859797775745392} 01/26/2022 23:53:33 - INFO - codeparrot_training - Step 4257: {'lr': 0.0004972772954357924, 'samples': 817536, 'steps': 4257, 'loss/train': 0.6408076733350754} 01/26/2022 23:53:36 - INFO - codeparrot_training - Step 4258: {'lr': 0.0004972748866193371, 'samples': 817728, 'steps': 4258, 'loss/train': 0.34810855239629745} 01/26/2022 23:53:39 - INFO - codeparrot_training - Step 4259: {'lr': 0.0004972724767436346, 'samples': 817920, 'steps': 4259, 'loss/train': 1.1734924614429474} 01/26/2022 23:53:42 - INFO - codeparrot_training - Step 4260: {'lr': 0.0004972700658086954, 'samples': 818112, 'steps': 4260, 'loss/train': 0.7680357098579407} 01/26/2022 23:53:46 - INFO - codeparrot_training - Step 4261: {'lr': 0.0004972676538145298, 'samples': 818304, 'steps': 4261, 'loss/train': 1.1241542994976044} 01/26/2022 23:53:50 - INFO - codeparrot_training - Step 4262: {'lr': 0.0004972652407611479, 'samples': 818496, 'steps': 4262, 'loss/train': 0.9061572253704071} 01/26/2022 23:53:53 - INFO - codeparrot_training - Step 4263: {'lr': 0.0004972628266485604, 'samples': 818688, 'steps': 4263, 'loss/train': 0.8318447470664978} 01/26/2022 23:53:56 - INFO - codeparrot_training - Step 4264: {'lr': 0.0004972604114767774, 'samples': 818880, 'steps': 4264, 'loss/train': 0.7256346791982651} 01/26/2022 23:53:59 - INFO - codeparrot_training - Step 4265: 
{'lr': 0.0004972579952458092, 'samples': 819072, 'steps': 4265, 'loss/train': 0.730308473110199} 01/26/2022 23:54:03 - INFO - codeparrot_training - Step 4266: {'lr': 0.0004972555779556664, 'samples': 819264, 'steps': 4266, 'loss/train': 0.7622151374816895} 01/26/2022 23:54:06 - INFO - codeparrot_training - Step 4267: {'lr': 0.0004972531596063592, 'samples': 819456, 'steps': 4267, 'loss/train': 0.9673137366771698} 01/26/2022 23:54:09 - INFO - codeparrot_training - Step 4268: {'lr': 0.000497250740197898, 'samples': 819648, 'steps': 4268, 'loss/train': 1.1534693241119385} 01/26/2022 23:54:12 - INFO - codeparrot_training - Step 4269: {'lr': 0.0004972483197302931, 'samples': 819840, 'steps': 4269, 'loss/train': 1.5483570098876953} 01/26/2022 23:54:17 - INFO - codeparrot_training - Step 4270: {'lr': 0.0004972458982035548, 'samples': 820032, 'steps': 4270, 'loss/train': 0.9143301844596863} 01/26/2022 23:54:20 - INFO - codeparrot_training - Step 4271: {'lr': 0.0004972434756176937, 'samples': 820224, 'steps': 4271, 'loss/train': 1.7905810475349426} 01/26/2022 23:54:23 - INFO - codeparrot_training - Step 4272: {'lr': 0.0004972410519727201, 'samples': 820416, 'steps': 4272, 'loss/train': 0.8045686483383179} 01/26/2022 23:54:27 - INFO - codeparrot_training - Step 4273: {'lr': 0.0004972386272686443, 'samples': 820608, 'steps': 4273, 'loss/train': 0.7673881351947784} 01/26/2022 23:54:30 - INFO - codeparrot_training - Step 4274: {'lr': 0.0004972362015054767, 'samples': 820800, 'steps': 4274, 'loss/train': 1.08675616979599} 01/26/2022 23:54:33 - INFO - codeparrot_training - Step 4275: {'lr': 0.0004972337746832278, 'samples': 820992, 'steps': 4275, 'loss/train': 0.6246713250875473} 01/26/2022 23:54:36 - INFO - codeparrot_training - Step 4276: {'lr': 0.0004972313468019077, 'samples': 821184, 'steps': 4276, 'loss/train': 0.7421243637800217} 01/26/2022 23:54:39 - INFO - codeparrot_training - Step 4277: {'lr': 0.0004972289178615273, 'samples': 821376, 'steps': 4277, 'loss/train': 0.8379755616188049} 01/26/2022 23:54:42 - INFO - codeparrot_training - Step 4278: {'lr': 0.0004972264878620965, 'samples': 821568, 'steps': 4278, 'loss/train': 0.5359567254781723} 01/26/2022 23:54:47 - INFO - codeparrot_training - Step 4279: {'lr': 0.000497224056803626, 'samples': 821760, 'steps': 4279, 'loss/train': 0.6169435679912567} 01/26/2022 23:54:50 - INFO - codeparrot_training - Step 4280: {'lr': 0.0004972216246861262, 'samples': 821952, 'steps': 4280, 'loss/train': 1.0803429186344147} 01/26/2022 23:54:53 - INFO - codeparrot_training - Step 4281: {'lr': 0.0004972191915096074, 'samples': 822144, 'steps': 4281, 'loss/train': 0.5582659542560577} 01/26/2022 23:54:56 - INFO - codeparrot_training - Step 4282: {'lr': 0.0004972167572740801, 'samples': 822336, 'steps': 4282, 'loss/train': 0.33793866634368896} 01/26/2022 23:54:59 - INFO - codeparrot_training - Step 4283: {'lr': 0.0004972143219795547, 'samples': 822528, 'steps': 4283, 'loss/train': 1.0247913300991058} 01/26/2022 23:55:03 - INFO - codeparrot_training - Step 4284: {'lr': 0.0004972118856260416, 'samples': 822720, 'steps': 4284, 'loss/train': 0.5137513875961304} 01/26/2022 23:55:06 - INFO - codeparrot_training - Step 4285: {'lr': 0.0004972094482135514, 'samples': 822912, 'steps': 4285, 'loss/train': 1.1728418469429016} 01/26/2022 23:55:09 - INFO - codeparrot_training - Step 4286: {'lr': 0.0004972070097420943, 'samples': 823104, 'steps': 4286, 'loss/train': 0.6137052476406097} 01/26/2022 23:55:12 - INFO - codeparrot_training - Step 4287: {'lr': 0.0004972045702116809, 
'samples': 823296, 'steps': 4287, 'loss/train': 1.0179985463619232} 01/26/2022 23:55:16 - INFO - codeparrot_training - Step 4288: {'lr': 0.0004972021296223217, 'samples': 823488, 'steps': 4288, 'loss/train': 0.8508589267730713} 01/26/2022 23:55:20 - INFO - codeparrot_training - Step 4289: {'lr': 0.0004971996879740271, 'samples': 823680, 'steps': 4289, 'loss/train': 0.7671812474727631} 01/26/2022 23:55:23 - INFO - codeparrot_training - Step 4290: {'lr': 0.0004971972452668074, 'samples': 823872, 'steps': 4290, 'loss/train': 0.9138025045394897} 01/26/2022 23:55:26 - INFO - codeparrot_training - Step 4291: {'lr': 0.0004971948015006732, 'samples': 824064, 'steps': 4291, 'loss/train': 0.950187474489212} 01/26/2022 23:55:29 - INFO - codeparrot_training - Step 4292: {'lr': 0.000497192356675635, 'samples': 824256, 'steps': 4292, 'loss/train': 0.7367661148309708} 01/26/2022 23:55:32 - INFO - codeparrot_training - Step 4293: {'lr': 0.0004971899107917033, 'samples': 824448, 'steps': 4293, 'loss/train': 0.8757580518722534} 01/26/2022 23:55:35 - INFO - codeparrot_training - Step 4294: {'lr': 0.0004971874638488884, 'samples': 824640, 'steps': 4294, 'loss/train': 0.9382392168045044} 01/26/2022 23:55:38 - INFO - codeparrot_training - Step 4295: {'lr': 0.000497185015847201, 'samples': 824832, 'steps': 4295, 'loss/train': 1.1467476189136505} 01/26/2022 23:55:41 - INFO - codeparrot_training - Step 4296: {'lr': 0.0004971825667866515, 'samples': 825024, 'steps': 4296, 'loss/train': 0.82261061668396} 01/26/2022 23:55:46 - INFO - codeparrot_training - Step 4297: {'lr': 0.0004971801166672502, 'samples': 825216, 'steps': 4297, 'loss/train': 0.7389131337404251} 01/26/2022 23:55:49 - INFO - codeparrot_training - Step 4298: {'lr': 0.0004971776654890079, 'samples': 825408, 'steps': 4298, 'loss/train': 0.5161141902208328} 01/26/2022 23:55:52 - INFO - codeparrot_training - Step 4299: {'lr': 0.000497175213251935, 'samples': 825600, 'steps': 4299, 'loss/train': 0.376567080616951} 01/26/2022 23:55:55 - INFO - codeparrot_training - Step 4300: {'lr': 0.0004971727599560418, 'samples': 825792, 'steps': 4300, 'loss/train': 0.7528911530971527} 01/26/2022 23:55:59 - INFO - codeparrot_training - Step 4301: {'lr': 0.0004971703056013392, 'samples': 825984, 'steps': 4301, 'loss/train': 0.922217845916748} 01/26/2022 23:56:02 - INFO - codeparrot_training - Step 4302: {'lr': 0.0004971678501878374, 'samples': 826176, 'steps': 4302, 'loss/train': 1.0294829607009888} 01/26/2022 23:56:05 - INFO - codeparrot_training - Step 4303: {'lr': 0.000497165393715547, 'samples': 826368, 'steps': 4303, 'loss/train': 0.8816482722759247} 01/26/2022 23:56:08 - INFO - codeparrot_training - Step 4304: {'lr': 0.0004971629361844785, 'samples': 826560, 'steps': 4304, 'loss/train': 1.4842769801616669} 01/26/2022 23:56:11 - INFO - codeparrot_training - Step 4305: {'lr': 0.0004971604775946425, 'samples': 826752, 'steps': 4305, 'loss/train': 1.170231968164444} 01/26/2022 23:56:17 - INFO - codeparrot_training - Step 4306: {'lr': 0.0004971580179460495, 'samples': 826944, 'steps': 4306, 'loss/train': 0.8313776850700378} 01/26/2022 23:56:20 - INFO - codeparrot_training - Step 4307: {'lr': 0.0004971555572387101, 'samples': 827136, 'steps': 4307, 'loss/train': 0.5604393035173416} 01/26/2022 23:56:23 - INFO - codeparrot_training - Step 4308: {'lr': 0.0004971530954726346, 'samples': 827328, 'steps': 4308, 'loss/train': 1.2220383882522583} 01/26/2022 23:56:26 - INFO - codeparrot_training - Step 4309: {'lr': 0.0004971506326478339, 'samples': 827520, 'steps': 4309, 
'loss/train': 0.683408260345459} 01/26/2022 23:56:29 - INFO - codeparrot_training - Step 4310: {'lr': 0.0004971481687643184, 'samples': 827712, 'steps': 4310, 'loss/train': 0.4280122071504593} 01/26/2022 23:56:32 - INFO - codeparrot_training - Step 4311: {'lr': 0.0004971457038220984, 'samples': 827904, 'steps': 4311, 'loss/train': 0.433548241853714} 01/26/2022 23:56:35 - INFO - codeparrot_training - Step 4312: {'lr': 0.0004971432378211849, 'samples': 828096, 'steps': 4312, 'loss/train': 0.8101196587085724} 01/26/2022 23:56:39 - INFO - codeparrot_training - Step 4313: {'lr': 0.0004971407707615881, 'samples': 828288, 'steps': 4313, 'loss/train': 0.777549147605896} 01/26/2022 23:56:42 - INFO - codeparrot_training - Step 4314: {'lr': 0.0004971383026433189, 'samples': 828480, 'steps': 4314, 'loss/train': 0.589007243514061} 01/26/2022 23:56:46 - INFO - codeparrot_training - Step 4315: {'lr': 0.0004971358334663875, 'samples': 828672, 'steps': 4315, 'loss/train': 0.8365578353404999} 01/26/2022 23:56:49 - INFO - codeparrot_training - Step 4316: {'lr': 0.0004971333632308047, 'samples': 828864, 'steps': 4316, 'loss/train': 0.8434689044952393} 01/26/2022 23:56:52 - INFO - codeparrot_training - Step 4317: {'lr': 0.000497130891936581, 'samples': 829056, 'steps': 4317, 'loss/train': 1.1094758212566376} 01/26/2022 23:56:56 - INFO - codeparrot_training - Step 4318: {'lr': 0.0004971284195837271, 'samples': 829248, 'steps': 4318, 'loss/train': 0.8324873149394989} 01/26/2022 23:56:59 - INFO - codeparrot_training - Step 4319: {'lr': 0.0004971259461722536, 'samples': 829440, 'steps': 4319, 'loss/train': 0.42545078694820404} 01/26/2022 23:57:02 - INFO - codeparrot_training - Step 4320: {'lr': 0.0004971234717021708, 'samples': 829632, 'steps': 4320, 'loss/train': 0.6683817207813263} 01/26/2022 23:57:05 - INFO - codeparrot_training - Step 4321: {'lr': 0.0004971209961734897, 'samples': 829824, 'steps': 4321, 'loss/train': 0.9468327462673187} 01/26/2022 23:57:08 - INFO - codeparrot_training - Step 4322: {'lr': 0.0004971185195862207, 'samples': 830016, 'steps': 4322, 'loss/train': 0.6052544564008713} 01/26/2022 23:57:11 - INFO - codeparrot_training - Step 4323: {'lr': 0.0004971160419403744, 'samples': 830208, 'steps': 4323, 'loss/train': 0.6708842664957047} 01/26/2022 23:57:17 - INFO - codeparrot_training - Step 4324: {'lr': 0.0004971135632359614, 'samples': 830400, 'steps': 4324, 'loss/train': 1.2518320083618164} 01/26/2022 23:57:20 - INFO - codeparrot_training - Step 4325: {'lr': 0.0004971110834729925, 'samples': 830592, 'steps': 4325, 'loss/train': 0.9015273749828339} 01/26/2022 23:57:23 - INFO - codeparrot_training - Step 4326: {'lr': 0.0004971086026514781, 'samples': 830784, 'steps': 4326, 'loss/train': 1.261090636253357} 01/26/2022 23:57:26 - INFO - codeparrot_training - Step 4327: {'lr': 0.0004971061207714289, 'samples': 830976, 'steps': 4327, 'loss/train': 0.9261849224567413} 01/26/2022 23:57:29 - INFO - codeparrot_training - Step 4328: {'lr': 0.0004971036378328556, 'samples': 831168, 'steps': 4328, 'loss/train': 0.7833089232444763} 01/26/2022 23:57:32 - INFO - codeparrot_training - Step 4329: {'lr': 0.0004971011538357687, 'samples': 831360, 'steps': 4329, 'loss/train': 0.9280033707618713} 01/26/2022 23:57:35 - INFO - codeparrot_training - Step 4330: {'lr': 0.000497098668780179, 'samples': 831552, 'steps': 4330, 'loss/train': 1.1203572750091553} 01/26/2022 23:57:39 - INFO - codeparrot_training - Step 4331: {'lr': 0.000497096182666097, 'samples': 831744, 'steps': 4331, 'loss/train': 0.6260233819484711} 
01/26/2022 23:57:44 - INFO - codeparrot_training - Step 4332: {'lr': 0.0004970936954935334, 'samples': 831936, 'steps': 4332, 'loss/train': 1.3559931814670563} 01/26/2022 23:57:47 - INFO - codeparrot_training - Step 4333: {'lr': 0.0004970912072624989, 'samples': 832128, 'steps': 4333, 'loss/train': 0.8992406129837036} 01/26/2022 23:57:50 - INFO - codeparrot_training - Step 4334: {'lr': 0.0004970887179730041, 'samples': 832320, 'steps': 4334, 'loss/train': 1.120208591222763} 01/26/2022 23:57:53 - INFO - codeparrot_training - Step 4335: {'lr': 0.0004970862276250599, 'samples': 832512, 'steps': 4335, 'loss/train': 0.9092339873313904} 01/26/2022 23:57:56 - INFO - codeparrot_training - Step 4336: {'lr': 0.0004970837362186766, 'samples': 832704, 'steps': 4336, 'loss/train': 1.079437494277954} 01/26/2022 23:58:00 - INFO - codeparrot_training - Step 4337: {'lr': 0.0004970812437538649, 'samples': 832896, 'steps': 4337, 'loss/train': 0.9379414021968842} 01/26/2022 23:58:03 - INFO - codeparrot_training - Step 4338: {'lr': 0.0004970787502306357, 'samples': 833088, 'steps': 4338, 'loss/train': 0.6098039746284485} 01/26/2022 23:58:06 - INFO - codeparrot_training - Step 4339: {'lr': 0.0004970762556489996, 'samples': 833280, 'steps': 4339, 'loss/train': 1.348998785018921} 01/26/2022 23:58:09 - INFO - codeparrot_training - Step 4340: {'lr': 0.0004970737600089673, 'samples': 833472, 'steps': 4340, 'loss/train': 0.9456009864807129} 01/26/2022 23:58:12 - INFO - codeparrot_training - Step 4341: {'lr': 0.0004970712633105496, 'samples': 833664, 'steps': 4341, 'loss/train': 0.956061601638794} 01/26/2022 23:58:17 - INFO - codeparrot_training - Step 4342: {'lr': 0.0004970687655537568, 'samples': 833856, 'steps': 4342, 'loss/train': 1.3711810111999512} 01/26/2022 23:58:20 - INFO - codeparrot_training - Step 4343: {'lr': 0.0004970662667386, 'samples': 834048, 'steps': 4343, 'loss/train': 0.9497353434562683} 01/26/2022 23:58:23 - INFO - codeparrot_training - Step 4344: {'lr': 0.0004970637668650898, 'samples': 834240, 'steps': 4344, 'loss/train': 0.4767637699842453} 01/26/2022 23:58:26 - INFO - codeparrot_training - Step 4345: {'lr': 0.0004970612659332368, 'samples': 834432, 'steps': 4345, 'loss/train': 0.8307383358478546} 01/26/2022 23:58:29 - INFO - codeparrot_training - Step 4346: {'lr': 0.0004970587639430518, 'samples': 834624, 'steps': 4346, 'loss/train': 1.0952051281929016} 01/26/2022 23:58:33 - INFO - codeparrot_training - Step 4347: {'lr': 0.0004970562608945455, 'samples': 834816, 'steps': 4347, 'loss/train': 0.9355016648769379} 01/26/2022 23:58:36 - INFO - codeparrot_training - Step 4348: {'lr': 0.0004970537567877286, 'samples': 835008, 'steps': 4348, 'loss/train': 0.8229217529296875} 01/26/2022 23:58:39 - INFO - codeparrot_training - Step 4349: {'lr': 0.000497051251622612, 'samples': 835200, 'steps': 4349, 'loss/train': 1.1469561159610748} 01/26/2022 23:58:42 - INFO - codeparrot_training - Step 4350: {'lr': 0.0004970487453992062, 'samples': 835392, 'steps': 4350, 'loss/train': 1.153117686510086} 01/26/2022 23:58:47 - INFO - codeparrot_training - Step 4351: {'lr': 0.000497046238117522, 'samples': 835584, 'steps': 4351, 'loss/train': 0.7794833779335022} 01/26/2022 23:58:50 - INFO - codeparrot_training - Step 4352: {'lr': 0.0004970437297775702, 'samples': 835776, 'steps': 4352, 'loss/train': 1.2507768273353577} 01/26/2022 23:58:54 - INFO - codeparrot_training - Step 4353: {'lr': 0.0004970412203793614, 'samples': 835968, 'steps': 4353, 'loss/train': 0.8246333599090576} 01/26/2022 23:58:57 - INFO - 
codeparrot_training - Step 4354: {'lr': 0.0004970387099229066, 'samples': 836160, 'steps': 4354, 'loss/train': 1.0113607048988342} 01/26/2022 23:59:00 - INFO - codeparrot_training - Step 4355: {'lr': 0.0004970361984082163, 'samples': 836352, 'steps': 4355, 'loss/train': 1.0464424788951874} 01/26/2022 23:59:03 - INFO - codeparrot_training - Step 4356: {'lr': 0.0004970336858353014, 'samples': 836544, 'steps': 4356, 'loss/train': 1.1610645353794098} 01/26/2022 23:59:06 - INFO - codeparrot_training - Step 4357: {'lr': 0.0004970311722041727, 'samples': 836736, 'steps': 4357, 'loss/train': 0.7517681121826172} 01/26/2022 23:59:09 - INFO - codeparrot_training - Step 4358: {'lr': 0.0004970286575148408, 'samples': 836928, 'steps': 4358, 'loss/train': 0.0807645320892334} 01/26/2022 23:59:12 - INFO - codeparrot_training - Step 4359: {'lr': 0.0004970261417673165, 'samples': 837120, 'steps': 4359, 'loss/train': 1.2625409960746765} 01/26/2022 23:59:17 - INFO - codeparrot_training - Step 4360: {'lr': 0.0004970236249616109, 'samples': 837312, 'steps': 4360, 'loss/train': 0.9889122247695923} 01/26/2022 23:59:20 - INFO - codeparrot_training - Step 4361: {'lr': 0.0004970211070977344, 'samples': 837504, 'steps': 4361, 'loss/train': 0.4664599746465683} 01/26/2022 23:59:23 - INFO - codeparrot_training - Step 4362: {'lr': 0.0004970185881756979, 'samples': 837696, 'steps': 4362, 'loss/train': 0.40002813935279846} 01/26/2022 23:59:26 - INFO - codeparrot_training - Step 4363: {'lr': 0.0004970160681955121, 'samples': 837888, 'steps': 4363, 'loss/train': 0.9445239007472992} 01/26/2022 23:59:30 - INFO - codeparrot_training - Step 4364: {'lr': 0.0004970135471571881, 'samples': 838080, 'steps': 4364, 'loss/train': 0.6014495640993118} 01/26/2022 23:59:33 - INFO - codeparrot_training - Step 4365: {'lr': 0.0004970110250607364, 'samples': 838272, 'steps': 4365, 'loss/train': 0.7273515164852142} 01/26/2022 23:59:36 - INFO - codeparrot_training - Step 4366: {'lr': 0.000497008501906168, 'samples': 838464, 'steps': 4366, 'loss/train': 1.4587800800800323} 01/26/2022 23:59:39 - INFO - codeparrot_training - Step 4367: {'lr': 0.0004970059776934935, 'samples': 838656, 'steps': 4367, 'loss/train': 0.1730901449918747} 01/26/2022 23:59:42 - INFO - codeparrot_training - Step 4368: {'lr': 0.0004970034524227238, 'samples': 838848, 'steps': 4368, 'loss/train': 1.0060526132583618} 01/26/2022 23:59:47 - INFO - codeparrot_training - Step 4369: {'lr': 0.0004970009260938698, 'samples': 839040, 'steps': 4369, 'loss/train': 0.864978700876236} 01/26/2022 23:59:50 - INFO - codeparrot_training - Step 4370: {'lr': 0.0004969983987069423, 'samples': 839232, 'steps': 4370, 'loss/train': 1.1103917062282562} 01/26/2022 23:59:53 - INFO - codeparrot_training - Step 4371: {'lr': 0.000496995870261952, 'samples': 839424, 'steps': 4371, 'loss/train': 0.4660354256629944} 01/26/2022 23:59:56 - INFO - codeparrot_training - Step 4372: {'lr': 0.0004969933407589098, 'samples': 839616, 'steps': 4372, 'loss/train': 0.7206534147262573} 01/26/2022 23:59:59 - INFO - codeparrot_training - Step 4373: {'lr': 0.0004969908101978267, 'samples': 839808, 'steps': 4373, 'loss/train': 1.0008127391338348} 01/27/2022 00:00:02 - INFO - codeparrot_training - Step 4374: {'lr': 0.0004969882785787133, 'samples': 840000, 'steps': 4374, 'loss/train': 1.1602018475532532} 01/27/2022 00:00:05 - INFO - codeparrot_training - Step 4375: {'lr': 0.0004969857459015807, 'samples': 840192, 'steps': 4375, 'loss/train': 0.2811736688017845} 01/27/2022 00:00:09 - INFO - codeparrot_training - Step 4376: 
{'lr': 0.0004969832121664394, 'samples': 840384, 'steps': 4376, 'loss/train': 1.3094521164894104} 01/27/2022 00:00:14 - INFO - codeparrot_training - Step 4377: {'lr': 0.0004969806773733004, 'samples': 840576, 'steps': 4377, 'loss/train': 0.5255046486854553} 01/27/2022 00:00:17 - INFO - codeparrot_training - Step 4378: {'lr': 0.0004969781415221748, 'samples': 840768, 'steps': 4378, 'loss/train': 0.8005363941192627} 01/27/2022 00:00:20 - INFO - codeparrot_training - Step 4379: {'lr': 0.0004969756046130731, 'samples': 840960, 'steps': 4379, 'loss/train': 0.992594450712204} 01/27/2022 00:00:23 - INFO - codeparrot_training - Step 4380: {'lr': 0.0004969730666460065, 'samples': 841152, 'steps': 4380, 'loss/train': 0.6427193731069565} 01/27/2022 00:00:26 - INFO - codeparrot_training - Step 4381: {'lr': 0.0004969705276209856, 'samples': 841344, 'steps': 4381, 'loss/train': 0.9201199114322662} 01/27/2022 00:00:29 - INFO - codeparrot_training - Step 4382: {'lr': 0.0004969679875380214, 'samples': 841536, 'steps': 4382, 'loss/train': 0.4914444833993912} 01/27/2022 00:00:33 - INFO - codeparrot_training - Step 4383: {'lr': 0.0004969654463971247, 'samples': 841728, 'steps': 4383, 'loss/train': 1.0151227712631226} 01/27/2022 00:00:36 - INFO - codeparrot_training - Step 4384: {'lr': 0.0004969629041983065, 'samples': 841920, 'steps': 4384, 'loss/train': 0.6443781852722168} 01/27/2022 00:00:39 - INFO - codeparrot_training - Step 4385: {'lr': 0.0004969603609415777, 'samples': 842112, 'steps': 4385, 'loss/train': 0.8228707909584045} 01/27/2022 00:00:43 - INFO - codeparrot_training - Step 4386: {'lr': 0.000496957816626949, 'samples': 842304, 'steps': 4386, 'loss/train': 0.7782959640026093} 01/27/2022 00:00:47 - INFO - codeparrot_training - Step 4387: {'lr': 0.0004969552712544316, 'samples': 842496, 'steps': 4387, 'loss/train': 0.4162811189889908} 01/27/2022 00:00:50 - INFO - codeparrot_training - Step 4388: {'lr': 0.0004969527248240361, 'samples': 842688, 'steps': 4388, 'loss/train': 1.2257954478263855} 01/27/2022 00:00:53 - INFO - codeparrot_training - Step 4389: {'lr': 0.0004969501773357736, 'samples': 842880, 'steps': 4389, 'loss/train': 0.8608929812908173} 01/27/2022 00:00:56 - INFO - codeparrot_training - Step 4390: {'lr': 0.000496947628789655, 'samples': 843072, 'steps': 4390, 'loss/train': 0.7993662357330322} 01/27/2022 00:00:59 - INFO - codeparrot_training - Step 4391: {'lr': 0.000496945079185691, 'samples': 843264, 'steps': 4391, 'loss/train': 1.0445149540901184} 01/27/2022 00:01:02 - INFO - codeparrot_training - Step 4392: {'lr': 0.0004969425285238928, 'samples': 843456, 'steps': 4392, 'loss/train': 0.9577012360095978} 01/27/2022 00:01:05 - INFO - codeparrot_training - Step 4393: {'lr': 0.0004969399768042713, 'samples': 843648, 'steps': 4393, 'loss/train': 1.041018784046173} 01/27/2022 00:01:09 - INFO - codeparrot_training - Step 4394: {'lr': 0.0004969374240268373, 'samples': 843840, 'steps': 4394, 'loss/train': 1.0327244997024536} 01/27/2022 00:01:14 - INFO - codeparrot_training - Step 4395: {'lr': 0.0004969348701916018, 'samples': 844032, 'steps': 4395, 'loss/train': 1.0541380941867828} 01/27/2022 00:01:17 - INFO - codeparrot_training - Step 4396: {'lr': 0.0004969323152985756, 'samples': 844224, 'steps': 4396, 'loss/train': 1.0795272588729858} 01/27/2022 00:01:20 - INFO - codeparrot_training - Step 4397: {'lr': 0.0004969297593477699, 'samples': 844416, 'steps': 4397, 'loss/train': 0.22319097071886063} 01/27/2022 00:01:23 - INFO - codeparrot_training - Step 4398: {'lr': 0.0004969272023391955, 
'samples': 844608, 'steps': 4398, 'loss/train': 0.6978833824396133} 01/27/2022 00:01:26 - INFO - codeparrot_training - Step 4399: {'lr': 0.0004969246442728633, 'samples': 844800, 'steps': 4399, 'loss/train': 0.7101884186267853} 01/27/2022 00:01:29 - INFO - codeparrot_training - Step 4400: {'lr': 0.0004969220851487844, 'samples': 844992, 'steps': 4400, 'loss/train': 0.566977322101593} 01/27/2022 00:01:33 - INFO - codeparrot_training - Step 4401: {'lr': 0.0004969195249669697, 'samples': 845184, 'steps': 4401, 'loss/train': 0.6602832823991776} 01/27/2022 00:01:36 - INFO - codeparrot_training - Step 4402: {'lr': 0.0004969169637274301, 'samples': 845376, 'steps': 4402, 'loss/train': 0.779666393995285} 01/27/2022 00:01:39 - INFO - codeparrot_training - Step 4403: {'lr': 0.0004969144014301767, 'samples': 845568, 'steps': 4403, 'loss/train': 1.3942926228046417} 01/27/2022 00:01:43 - INFO - codeparrot_training - Step 4404: {'lr': 0.0004969118380752205, 'samples': 845760, 'steps': 4404, 'loss/train': 0.9443361461162567} 01/27/2022 00:01:47 - INFO - codeparrot_training - Step 4405: {'lr': 0.0004969092736625722, 'samples': 845952, 'steps': 4405, 'loss/train': 0.4104815125465393} 01/27/2022 00:01:50 - INFO - codeparrot_training - Step 4406: {'lr': 0.000496906708192243, 'samples': 846144, 'steps': 4406, 'loss/train': 0.6523527055978775} 01/27/2022 00:01:53 - INFO - codeparrot_training - Step 4407: {'lr': 0.000496904141664244, 'samples': 846336, 'steps': 4407, 'loss/train': 0.9952062964439392} 01/27/2022 00:01:56 - INFO - codeparrot_training - Step 4408: {'lr': 0.0004969015740785859, 'samples': 846528, 'steps': 4408, 'loss/train': 0.5774247497320175} 01/27/2022 00:01:59 - INFO - codeparrot_training - Step 4409: {'lr': 0.00049689900543528, 'samples': 846720, 'steps': 4409, 'loss/train': 0.9276637136936188} 01/27/2022 00:02:02 - INFO - codeparrot_training - Step 4410: {'lr': 0.0004968964357343371, 'samples': 846912, 'steps': 4410, 'loss/train': 0.7382834404706955} 01/27/2022 00:02:05 - INFO - codeparrot_training - Step 4411: {'lr': 0.0004968938649757682, 'samples': 847104, 'steps': 4411, 'loss/train': 0.9865886270999908} 01/27/2022 00:02:09 - INFO - codeparrot_training - Step 4412: {'lr': 0.0004968912931595845, 'samples': 847296, 'steps': 4412, 'loss/train': 0.6190544664859772} 01/27/2022 00:02:14 - INFO - codeparrot_training - Step 4413: {'lr': 0.0004968887202857968, 'samples': 847488, 'steps': 4413, 'loss/train': 0.8950621783733368} 01/27/2022 00:02:17 - INFO - codeparrot_training - Step 4414: {'lr': 0.0004968861463544163, 'samples': 847680, 'steps': 4414, 'loss/train': 1.2721519768238068} 01/27/2022 00:02:20 - INFO - codeparrot_training - Step 4415: {'lr': 0.0004968835713654538, 'samples': 847872, 'steps': 4415, 'loss/train': 0.8783634603023529} 01/27/2022 00:02:23 - INFO - codeparrot_training - Step 4416: {'lr': 0.0004968809953189206, 'samples': 848064, 'steps': 4416, 'loss/train': 0.7269645631313324} 01/27/2022 00:02:26 - INFO - codeparrot_training - Step 4417: {'lr': 0.0004968784182148276, 'samples': 848256, 'steps': 4417, 'loss/train': 1.096371978521347} 01/27/2022 00:02:29 - INFO - codeparrot_training - Step 4418: {'lr': 0.0004968758400531859, 'samples': 848448, 'steps': 4418, 'loss/train': 1.1006981134414673} 01/27/2022 00:02:32 - INFO - codeparrot_training - Step 4419: {'lr': 0.0004968732608340064, 'samples': 848640, 'steps': 4419, 'loss/train': 0.9857710003852844} 01/27/2022 00:02:36 - INFO - codeparrot_training - Step 4420: {'lr': 0.0004968706805573002, 'samples': 848832, 'steps': 4420, 
'loss/train': 0.8444927930831909} 01/27/2022 00:02:39 - INFO - codeparrot_training - Step 4421: {'lr': 0.0004968680992230785, 'samples': 849024, 'steps': 4421, 'loss/train': 1.3203396499156952} 01/27/2022 00:02:43 - INFO - codeparrot_training - Step 4422: {'lr': 0.0004968655168313522, 'samples': 849216, 'steps': 4422, 'loss/train': 1.281595230102539} 01/27/2022 00:02:46 - INFO - codeparrot_training - Step 4423: {'lr': 0.0004968629333821324, 'samples': 849408, 'steps': 4423, 'loss/train': 0.7429156601428986} 01/27/2022 00:02:49 - INFO - codeparrot_training - Step 4424: {'lr': 0.0004968603488754302, 'samples': 849600, 'steps': 4424, 'loss/train': 1.3449249565601349} 01/27/2022 00:02:53 - INFO - codeparrot_training - Step 4425: {'lr': 0.0004968577633112566, 'samples': 849792, 'steps': 4425, 'loss/train': 0.892963707447052} 01/27/2022 00:02:56 - INFO - codeparrot_training - Step 4426: {'lr': 0.0004968551766896228, 'samples': 849984, 'steps': 4426, 'loss/train': 1.0419965386390686} 01/27/2022 00:02:59 - INFO - codeparrot_training - Step 4427: {'lr': 0.0004968525890105399, 'samples': 850176, 'steps': 4427, 'loss/train': 0.8751453459262848} 01/27/2022 00:03:02 - INFO - codeparrot_training - Step 4428: {'lr': 0.0004968500002740187, 'samples': 850368, 'steps': 4428, 'loss/train': 0.660454273223877} 01/27/2022 00:03:05 - INFO - codeparrot_training - Step 4429: {'lr': 0.0004968474104800706, 'samples': 850560, 'steps': 4429, 'loss/train': 0.639846682548523} 01/27/2022 00:03:08 - INFO - codeparrot_training - Step 4430: {'lr': 0.0004968448196287066, 'samples': 850752, 'steps': 4430, 'loss/train': 0.8478149771690369} 01/27/2022 00:03:13 - INFO - codeparrot_training - Step 4431: {'lr': 0.0004968422277199377, 'samples': 850944, 'steps': 4431, 'loss/train': 0.7088775783777237} 01/27/2022 00:03:16 - INFO - codeparrot_training - Step 4432: {'lr': 0.000496839634753775, 'samples': 851136, 'steps': 4432, 'loss/train': 0.9446540772914886} 01/27/2022 00:03:19 - INFO - codeparrot_training - Step 4433: {'lr': 0.0004968370407302299, 'samples': 851328, 'steps': 4433, 'loss/train': 0.9296148419380188} 01/27/2022 00:03:23 - INFO - codeparrot_training - Step 4434: {'lr': 0.0004968344456493132, 'samples': 851520, 'steps': 4434, 'loss/train': 1.0613470673561096} 01/27/2022 00:03:26 - INFO - codeparrot_training - Step 4435: {'lr': 0.000496831849511036, 'samples': 851712, 'steps': 4435, 'loss/train': 0.9723226726055145} 01/27/2022 00:03:29 - INFO - codeparrot_training - Step 4436: {'lr': 0.0004968292523154096, 'samples': 851904, 'steps': 4436, 'loss/train': 0.5345800369977951} 01/27/2022 00:03:32 - INFO - codeparrot_training - Step 4437: {'lr': 0.0004968266540624452, 'samples': 852096, 'steps': 4437, 'loss/train': 0.8505198955535889} 01/27/2022 00:03:35 - INFO - codeparrot_training - Step 4438: {'lr': 0.0004968240547521536, 'samples': 852288, 'steps': 4438, 'loss/train': 0.9027020931243896} 01/27/2022 00:03:40 - INFO - codeparrot_training - Step 4439: {'lr': 0.0004968214543845463, 'samples': 852480, 'steps': 4439, 'loss/train': 0.7688876688480377} 01/27/2022 00:03:43 - INFO - codeparrot_training - Step 4440: {'lr': 0.0004968188529596341, 'samples': 852672, 'steps': 4440, 'loss/train': 1.127624273300171} 01/27/2022 00:03:46 - INFO - codeparrot_training - Step 4441: {'lr': 0.0004968162504774284, 'samples': 852864, 'steps': 4441, 'loss/train': 0.9695869088172913} 01/27/2022 00:03:49 - INFO - codeparrot_training - Step 4442: {'lr': 0.0004968136469379403, 'samples': 853056, 'steps': 4442, 'loss/train': 0.41753706336021423} 
01/27/2022 00:03:52 - INFO - codeparrot_training - Step 4443: {'lr': 0.0004968110423411808, 'samples': 853248, 'steps': 4443, 'loss/train': 0.6902837455272675} 01/27/2022 00:03:55 - INFO - codeparrot_training - Step 4444: {'lr': 0.0004968084366871612, 'samples': 853440, 'steps': 4444, 'loss/train': 0.8094327449798584} 01/27/2022 00:03:59 - INFO - codeparrot_training - Step 4445: {'lr': 0.0004968058299758926, 'samples': 853632, 'steps': 4445, 'loss/train': 0.4814504384994507} 01/27/2022 00:04:02 - INFO - codeparrot_training - Step 4446: {'lr': 0.0004968032222073863, 'samples': 853824, 'steps': 4446, 'loss/train': 1.0904746055603027} 01/27/2022 00:04:05 - INFO - codeparrot_training - Step 4447: {'lr': 0.0004968006133816532, 'samples': 854016, 'steps': 4447, 'loss/train': 0.775661051273346} 01/27/2022 00:04:08 - INFO - codeparrot_training - Step 4448: {'lr': 0.0004967980034987048, 'samples': 854208, 'steps': 4448, 'loss/train': 0.9060638844966888} 01/27/2022 00:04:13 - INFO - codeparrot_training - Step 4449: {'lr': 0.0004967953925585521, 'samples': 854400, 'steps': 4449, 'loss/train': 1.2165805399417877} 01/27/2022 00:04:16 - INFO - codeparrot_training - Step 4450: {'lr': 0.0004967927805612063, 'samples': 854592, 'steps': 4450, 'loss/train': 0.7680396437644958} 01/27/2022 00:04:19 - INFO - codeparrot_training - Step 4451: {'lr': 0.0004967901675066784, 'samples': 854784, 'steps': 4451, 'loss/train': 0.8665350079536438} 01/27/2022 00:04:22 - INFO - codeparrot_training - Step 4452: {'lr': 0.0004967875533949801, 'samples': 854976, 'steps': 4452, 'loss/train': 1.0388399362564087} 01/27/2022 00:04:25 - INFO - codeparrot_training - Step 4453: {'lr': 0.000496784938226122, 'samples': 855168, 'steps': 4453, 'loss/train': 0.9771183729171753} 01/27/2022 00:04:28 - INFO - codeparrot_training - Step 4454: {'lr': 0.0004967823220001158, 'samples': 855360, 'steps': 4454, 'loss/train': 0.6356800496578217} 01/27/2022 00:04:31 - INFO - codeparrot_training - Step 4455: {'lr': 0.0004967797047169724, 'samples': 855552, 'steps': 4455, 'loss/train': 0.7922734022140503} 01/27/2022 00:04:35 - INFO - codeparrot_training - Step 4456: {'lr': 0.0004967770863767031, 'samples': 855744, 'steps': 4456, 'loss/train': 1.1056855022907257} 01/27/2022 00:04:40 - INFO - codeparrot_training - Step 4457: {'lr': 0.0004967744669793192, 'samples': 855936, 'steps': 4457, 'loss/train': 1.3729758560657501} 01/27/2022 00:04:43 - INFO - codeparrot_training - Step 4458: {'lr': 0.0004967718465248317, 'samples': 856128, 'steps': 4458, 'loss/train': 0.8818501532077789} 01/27/2022 00:04:46 - INFO - codeparrot_training - Step 4459: {'lr': 0.000496769225013252, 'samples': 856320, 'steps': 4459, 'loss/train': 1.191847801208496} 01/27/2022 00:04:49 - INFO - codeparrot_training - Step 4460: {'lr': 0.0004967666024445913, 'samples': 856512, 'steps': 4460, 'loss/train': 0.5824467837810516} 01/27/2022 00:04:52 - INFO - codeparrot_training - Step 4461: {'lr': 0.000496763978818861, 'samples': 856704, 'steps': 4461, 'loss/train': 0.8129179179668427} 01/27/2022 00:04:55 - INFO - codeparrot_training - Step 4462: {'lr': 0.000496761354136072, 'samples': 856896, 'steps': 4462, 'loss/train': 1.1676211059093475} 01/27/2022 00:04:58 - INFO - codeparrot_training - Step 4463: {'lr': 0.0004967587283962358, 'samples': 857088, 'steps': 4463, 'loss/train': 1.4113693535327911} 01/27/2022 00:05:02 - INFO - codeparrot_training - Step 4464: {'lr': 0.0004967561015993635, 'samples': 857280, 'steps': 4464, 'loss/train': 0.716547042131424} 01/27/2022 00:05:05 - INFO - 
codeparrot_training - Step 4465: {'lr': 0.0004967534737454665, 'samples': 857472, 'steps': 4465, 'loss/train': 0.7871385812759399} 01/27/2022 00:05:09 - INFO - codeparrot_training - Step 4466: {'lr': 0.000496750844834556, 'samples': 857664, 'steps': 4466, 'loss/train': 1.0276413559913635} 01/27/2022 00:05:13 - INFO - codeparrot_training - Step 4467: {'lr': 0.000496748214866643, 'samples': 857856, 'steps': 4467, 'loss/train': 0.719674363732338} 01/27/2022 00:05:16 - INFO - codeparrot_training - Step 4468: {'lr': 0.0004967455838417392, 'samples': 858048, 'steps': 4468, 'loss/train': 0.8438231348991394} 01/27/2022 00:05:19 - INFO - codeparrot_training - Step 4469: {'lr': 0.0004967429517598556, 'samples': 858240, 'steps': 4469, 'loss/train': 0.7433875948190689} 01/27/2022 00:05:22 - INFO - codeparrot_training - Step 4470: {'lr': 0.0004967403186210036, 'samples': 858432, 'steps': 4470, 'loss/train': 1.0653120875358582} 01/27/2022 00:05:25 - INFO - codeparrot_training - Step 4471: {'lr': 0.0004967376844251944, 'samples': 858624, 'steps': 4471, 'loss/train': 0.701850950717926} 01/27/2022 00:05:28 - INFO - codeparrot_training - Step 4472: {'lr': 0.0004967350491724392, 'samples': 858816, 'steps': 4472, 'loss/train': 1.0906981229782104} 01/27/2022 00:05:31 - INFO - codeparrot_training - Step 4473: {'lr': 0.0004967324128627495, 'samples': 859008, 'steps': 4473, 'loss/train': 0.8885300159454346} 01/27/2022 00:05:35 - INFO - codeparrot_training - Step 4474: {'lr': 0.0004967297754961365, 'samples': 859200, 'steps': 4474, 'loss/train': 0.7083751559257507} 01/27/2022 00:05:40 - INFO - codeparrot_training - Step 4475: {'lr': 0.0004967271370726115, 'samples': 859392, 'steps': 4475, 'loss/train': 0.6831809431314468} 01/27/2022 00:05:43 - INFO - codeparrot_training - Step 4476: {'lr': 0.0004967244975921857, 'samples': 859584, 'steps': 4476, 'loss/train': 0.6388809084892273} 01/27/2022 00:05:46 - INFO - codeparrot_training - Step 4477: {'lr': 0.0004967218570548706, 'samples': 859776, 'steps': 4477, 'loss/train': 0.7387153655290604} 01/27/2022 00:05:49 - INFO - codeparrot_training - Step 4478: {'lr': 0.0004967192154606774, 'samples': 859968, 'steps': 4478, 'loss/train': 0.9559193551540375} 01/27/2022 00:05:52 - INFO - codeparrot_training - Step 4479: {'lr': 0.0004967165728096172, 'samples': 860160, 'steps': 4479, 'loss/train': 0.9553640484809875} 01/27/2022 00:05:56 - INFO - codeparrot_training - Step 4480: {'lr': 0.0004967139291017018, 'samples': 860352, 'steps': 4480, 'loss/train': 0.7830926477909088} 01/27/2022 00:05:59 - INFO - codeparrot_training - Step 4481: {'lr': 0.0004967112843369423, 'samples': 860544, 'steps': 4481, 'loss/train': 0.8501486778259277} 01/27/2022 00:06:02 - INFO - codeparrot_training - Step 4482: {'lr': 0.0004967086385153499, 'samples': 860736, 'steps': 4482, 'loss/train': 1.069366067647934} 01/27/2022 00:06:05 - INFO - codeparrot_training - Step 4483: {'lr': 0.0004967059916369359, 'samples': 860928, 'steps': 4483, 'loss/train': 0.7832656502723694} 01/27/2022 00:06:10 - INFO - codeparrot_training - Step 4484: {'lr': 0.000496703343701712, 'samples': 861120, 'steps': 4484, 'loss/train': 0.6884204149246216} 01/27/2022 00:06:13 - INFO - codeparrot_training - Step 4485: {'lr': 0.0004967006947096892, 'samples': 861312, 'steps': 4485, 'loss/train': 0.6564039140939713} 01/27/2022 00:06:16 - INFO - codeparrot_training - Step 4486: {'lr': 0.0004966980446608789, 'samples': 861504, 'steps': 4486, 'loss/train': 0.6043923050165176} 01/27/2022 00:06:19 - INFO - codeparrot_training - Step 4487: {'lr': 
0.0004966953935552925, 'samples': 861696, 'steps': 4487, 'loss/train': 1.794855773448944} 01/27/2022 00:06:22 - INFO - codeparrot_training - Step 4488: {'lr': 0.0004966927413929415, 'samples': 861888, 'steps': 4488, 'loss/train': 0.7247919738292694} 01/27/2022 00:06:25 - INFO - codeparrot_training - Step 4489: {'lr': 0.0004966900881738371, 'samples': 862080, 'steps': 4489, 'loss/train': 1.1537608802318573} 01/27/2022 00:06:28 - INFO - codeparrot_training - Step 4490: {'lr': 0.0004966874338979907, 'samples': 862272, 'steps': 4490, 'loss/train': 0.622882068157196} 01/27/2022 00:06:32 - INFO - codeparrot_training - Step 4491: {'lr': 0.0004966847785654136, 'samples': 862464, 'steps': 4491, 'loss/train': 0.5239217430353165} 01/27/2022 00:06:35 - INFO - codeparrot_training - Step 4492: {'lr': 0.0004966821221761173, 'samples': 862656, 'steps': 4492, 'loss/train': 0.8109391629695892} 01/27/2022 00:06:39 - INFO - codeparrot_training - Step 4493: {'lr': 0.0004966794647301131, 'samples': 862848, 'steps': 4493, 'loss/train': 0.749193012714386} 01/27/2022 00:06:42 - INFO - codeparrot_training - Step 4494: {'lr': 0.0004966768062274125, 'samples': 863040, 'steps': 4494, 'loss/train': 0.6427338570356369} 01/27/2022 00:06:45 - INFO - codeparrot_training - Step 4495: {'lr': 0.0004966741466680266, 'samples': 863232, 'steps': 4495, 'loss/train': 0.48299261927604675} 01/27/2022 00:06:48 - INFO - codeparrot_training - Step 4496: {'lr': 0.000496671486051967, 'samples': 863424, 'steps': 4496, 'loss/train': 0.9633139371871948} 01/27/2022 00:06:52 - INFO - codeparrot_training - Step 4497: {'lr': 0.0004966688243792452, 'samples': 863616, 'steps': 4497, 'loss/train': 0.7283810377120972} 01/27/2022 00:06:55 - INFO - codeparrot_training - Step 4498: {'lr': 0.0004966661616498724, 'samples': 863808, 'steps': 4498, 'loss/train': 1.2083295285701752} 01/27/2022 00:06:58 - INFO - codeparrot_training - Step 4499: {'lr': 0.0004966634978638601, 'samples': 864000, 'steps': 4499, 'loss/train': 1.364411473274231} 01/27/2022 00:07:01 - INFO - codeparrot_training - Step 4500: {'lr': 0.0004966608330212198, 'samples': 864192, 'steps': 4500, 'loss/train': 0.6392533332109451} 01/27/2022 00:07:04 - INFO - codeparrot_training - Step 4501: {'lr': 0.0004966581671219627, 'samples': 864384, 'steps': 4501, 'loss/train': 0.777204304933548} 01/27/2022 00:07:09 - INFO - codeparrot_training - Step 4502: {'lr': 0.0004966555001661004, 'samples': 864576, 'steps': 4502, 'loss/train': 1.1817198693752289} 01/27/2022 00:07:12 - INFO - codeparrot_training - Step 4503: {'lr': 0.0004966528321536442, 'samples': 864768, 'steps': 4503, 'loss/train': 1.0759036839008331} 01/27/2022 00:07:15 - INFO - codeparrot_training - Step 4504: {'lr': 0.0004966501630846057, 'samples': 864960, 'steps': 4504, 'loss/train': 0.7193944305181503} 01/27/2022 00:07:18 - INFO - codeparrot_training - Step 4505: {'lr': 0.000496647492958996, 'samples': 865152, 'steps': 4505, 'loss/train': 1.0801239609718323} 01/27/2022 00:07:21 - INFO - codeparrot_training - Step 4506: {'lr': 0.000496644821776827, 'samples': 865344, 'steps': 4506, 'loss/train': 0.9585309326648712} 01/27/2022 00:07:24 - INFO - codeparrot_training - Step 4507: {'lr': 0.0004966421495381098, 'samples': 865536, 'steps': 4507, 'loss/train': 0.8006413578987122} 01/27/2022 00:07:28 - INFO - codeparrot_training - Step 4508: {'lr': 0.0004966394762428559, 'samples': 865728, 'steps': 4508, 'loss/train': 1.479098618030548} 01/27/2022 00:07:31 - INFO - codeparrot_training - Step 4509: {'lr': 0.0004966368018910768, 'samples': 865920, 
'steps': 4509, 'loss/train': 0.8778758347034454} 01/27/2022 00:07:34 - INFO - codeparrot_training - Step 4510: {'lr': 0.000496634126482784, 'samples': 866112, 'steps': 4510, 'loss/train': 0.6156168133020401} 01/27/2022 00:07:39 - INFO - codeparrot_training - Step 4511: {'lr': 0.000496631450017989, 'samples': 866304, 'steps': 4511, 'loss/train': 0.9835388660430908} 01/27/2022 00:07:42 - INFO - codeparrot_training - Step 4512: {'lr': 0.0004966287724967032, 'samples': 866496, 'steps': 4512, 'loss/train': 0.8577397763729095} 01/27/2022 00:07:45 - INFO - codeparrot_training - Step 4513: {'lr': 0.0004966260939189379, 'samples': 866688, 'steps': 4513, 'loss/train': 0.7593384683132172} 01/27/2022 00:07:48 - INFO - codeparrot_training - Step 4514: {'lr': 0.0004966234142847048, 'samples': 866880, 'steps': 4514, 'loss/train': 1.236937254667282} 01/27/2022 00:07:52 - INFO - codeparrot_training - Step 4515: {'lr': 0.0004966207335940153, 'samples': 867072, 'steps': 4515, 'loss/train': 0.29751840233802795} 01/27/2022 00:07:55 - INFO - codeparrot_training - Step 4516: {'lr': 0.0004966180518468808, 'samples': 867264, 'steps': 4516, 'loss/train': 0.6501522660255432} 01/27/2022 00:07:58 - INFO - codeparrot_training - Step 4517: {'lr': 0.000496615369043313, 'samples': 867456, 'steps': 4517, 'loss/train': 0.9232717752456665} 01/27/2022 00:08:01 - INFO - codeparrot_training - Step 4518: {'lr': 0.0004966126851833233, 'samples': 867648, 'steps': 4518, 'loss/train': 1.3116298913955688} 01/27/2022 00:08:04 - INFO - codeparrot_training - Step 4519: {'lr': 0.0004966100002669231, 'samples': 867840, 'steps': 4519, 'loss/train': 0.8335847854614258} 01/27/2022 00:08:09 - INFO - codeparrot_training - Step 4520: {'lr': 0.0004966073142941239, 'samples': 868032, 'steps': 4520, 'loss/train': 1.3738740384578705} 01/27/2022 00:08:12 - INFO - codeparrot_training - Step 4521: {'lr': 0.0004966046272649372, 'samples': 868224, 'steps': 4521, 'loss/train': 1.000776082277298} 01/27/2022 00:08:15 - INFO - codeparrot_training - Step 4522: {'lr': 0.0004966019391793748, 'samples': 868416, 'steps': 4522, 'loss/train': 0.09669575840234756} 01/27/2022 00:08:18 - INFO - codeparrot_training - Step 4523: {'lr': 0.0004965992500374479, 'samples': 868608, 'steps': 4523, 'loss/train': 0.8742348253726959} 01/27/2022 00:08:21 - INFO - codeparrot_training - Step 4524: {'lr': 0.0004965965598391682, 'samples': 868800, 'steps': 4524, 'loss/train': 1.0451313257217407} 01/27/2022 00:08:25 - INFO - codeparrot_training - Step 4525: {'lr': 0.000496593868584547, 'samples': 868992, 'steps': 4525, 'loss/train': 1.0474567115306854} 01/27/2022 00:08:28 - INFO - codeparrot_training - Step 4526: {'lr': 0.0004965911762735961, 'samples': 869184, 'steps': 4526, 'loss/train': 0.9529868066310883} 01/27/2022 00:08:31 - INFO - codeparrot_training - Step 4527: {'lr': 0.0004965884829063268, 'samples': 869376, 'steps': 4527, 'loss/train': 0.3894535303115845} 01/27/2022 00:08:35 - INFO - codeparrot_training - Step 4528: {'lr': 0.0004965857884827508, 'samples': 869568, 'steps': 4528, 'loss/train': 0.972162276506424} 01/27/2022 00:08:38 - INFO - codeparrot_training - Step 4529: {'lr': 0.0004965830930028795, 'samples': 869760, 'steps': 4529, 'loss/train': 0.5538148283958435} 01/27/2022 00:08:42 - INFO - codeparrot_training - Step 4530: {'lr': 0.0004965803964667246, 'samples': 869952, 'steps': 4530, 'loss/train': 1.0173731446266174} 01/27/2022 00:08:45 - INFO - codeparrot_training - Step 4531: {'lr': 0.0004965776988742976, 'samples': 870144, 'steps': 4531, 'loss/train': 
0.7572234570980072} 01/27/2022 00:08:48 - INFO - codeparrot_training - Step 4532: {'lr': 0.00049657500022561, 'samples': 870336, 'steps': 4532, 'loss/train': 1.0075095891952515} 01/27/2022 00:08:51 - INFO - codeparrot_training - Step 4533: {'lr': 0.0004965723005206734, 'samples': 870528, 'steps': 4533, 'loss/train': 1.1663819253444672} 01/27/2022 00:08:54 - INFO - codeparrot_training - Step 4534: {'lr': 0.0004965695997594993, 'samples': 870720, 'steps': 4534, 'loss/train': 1.433705896139145} 01/27/2022 00:08:57 - INFO - codeparrot_training - Step 4535: {'lr': 0.0004965668979420994, 'samples': 870912, 'steps': 4535, 'loss/train': 1.1577147245407104} 01/27/2022 00:09:00 - INFO - codeparrot_training - Step 4536: {'lr': 0.0004965641950684852, 'samples': 871104, 'steps': 4536, 'loss/train': 1.336251586675644} 01/27/2022 00:09:06 - INFO - codeparrot_training - Step 4537: {'lr': 0.0004965614911386683, 'samples': 871296, 'steps': 4537, 'loss/train': 0.7707619071006775} 01/27/2022 00:09:09 - INFO - codeparrot_training - Step 4538: {'lr': 0.0004965587861526602, 'samples': 871488, 'steps': 4538, 'loss/train': 0.6193422228097916} 01/27/2022 00:09:12 - INFO - codeparrot_training - Step 4539: {'lr': 0.0004965560801104726, 'samples': 871680, 'steps': 4539, 'loss/train': 1.3050081431865692} 01/27/2022 00:09:15 - INFO - codeparrot_training - Step 4540: {'lr': 0.000496553373012117, 'samples': 871872, 'steps': 4540, 'loss/train': 5.120829463005066} 01/27/2022 00:09:18 - INFO - codeparrot_training - Step 4541: {'lr': 0.0004965506648576052, 'samples': 872064, 'steps': 4541, 'loss/train': 0.4620320498943329} 01/27/2022 00:09:21 - INFO - codeparrot_training - Step 4542: {'lr': 0.0004965479556469485, 'samples': 872256, 'steps': 4542, 'loss/train': 0.4662015438079834} 01/27/2022 00:09:24 - INFO - codeparrot_training - Step 4543: {'lr': 0.0004965452453801586, 'samples': 872448, 'steps': 4543, 'loss/train': 1.05718794465065} 01/27/2022 00:09:27 - INFO - codeparrot_training - Step 4544: {'lr': 0.0004965425340572472, 'samples': 872640, 'steps': 4544, 'loss/train': 0.752721905708313} 01/27/2022 00:09:31 - INFO - codeparrot_training - Step 4545: {'lr': 0.0004965398216782258, 'samples': 872832, 'steps': 4545, 'loss/train': 1.3991345465183258} 01/27/2022 00:09:35 - INFO - codeparrot_training - Step 4546: {'lr': 0.0004965371082431062, 'samples': 873024, 'steps': 4546, 'loss/train': 1.0766872465610504} 01/27/2022 00:09:38 - INFO - codeparrot_training - Step 4547: {'lr': 0.0004965343937519, 'samples': 873216, 'steps': 4547, 'loss/train': 0.8957100212574005} 01/27/2022 00:09:41 - INFO - codeparrot_training - Step 4548: {'lr': 0.0004965316782046186, 'samples': 873408, 'steps': 4548, 'loss/train': 1.0470613539218903} 01/27/2022 00:09:44 - INFO - codeparrot_training - Step 4549: {'lr': 0.0004965289616012739, 'samples': 873600, 'steps': 4549, 'loss/train': 0.6171463876962662} 01/27/2022 00:09:47 - INFO - codeparrot_training - Step 4550: {'lr': 0.0004965262439418772, 'samples': 873792, 'steps': 4550, 'loss/train': 0.902805358171463} 01/27/2022 00:09:51 - INFO - codeparrot_training - Step 4551: {'lr': 0.0004965235252264405, 'samples': 873984, 'steps': 4551, 'loss/train': 1.0565557479858398} 01/27/2022 00:09:54 - INFO - codeparrot_training - Step 4552: {'lr': 0.0004965208054549753, 'samples': 874176, 'steps': 4552, 'loss/train': 1.0069093108177185} 01/27/2022 00:09:57 - INFO - codeparrot_training - Step 4553: {'lr': 0.0004965180846274931, 'samples': 874368, 'steps': 4553, 'loss/train': 1.0305477976799011} 01/27/2022 00:10:00 - INFO 
- codeparrot_training - Step 4554: {'lr': 0.0004965153627440058, 'samples': 874560, 'steps': 4554, 'loss/train': 1.1483242213726044} 01/27/2022 00:10:05 - INFO - codeparrot_training - Step 4555: {'lr': 0.000496512639804525, 'samples': 874752, 'steps': 4555, 'loss/train': 0.8648707866668701} 01/27/2022 00:10:08 - INFO - codeparrot_training - Step 4556: {'lr': 0.0004965099158090624, 'samples': 874944, 'steps': 4556, 'loss/train': 0.966820478439331} 01/27/2022 00:10:11 - INFO - codeparrot_training - Step 4557: {'lr': 0.0004965071907576294, 'samples': 875136, 'steps': 4557, 'loss/train': 0.7823950946331024} 01/27/2022 00:10:15 - INFO - codeparrot_training - Step 4558: {'lr': 0.000496504464650238, 'samples': 875328, 'steps': 4558, 'loss/train': 0.951241672039032} 01/27/2022 00:10:18 - INFO - codeparrot_training - Step 4559: {'lr': 0.0004965017374868997, 'samples': 875520, 'steps': 4559, 'loss/train': 0.6711113601922989} 01/27/2022 00:10:21 - INFO - codeparrot_training - Step 4560: {'lr': 0.0004964990092676262, 'samples': 875712, 'steps': 4560, 'loss/train': 0.7800466418266296} 01/27/2022 00:10:24 - INFO - codeparrot_training - Step 4561: {'lr': 0.0004964962799924293, 'samples': 875904, 'steps': 4561, 'loss/train': 1.5154834985733032} 01/27/2022 00:10:27 - INFO - codeparrot_training - Step 4562: {'lr': 0.0004964935496613206, 'samples': 876096, 'steps': 4562, 'loss/train': 0.8512544631958008} 01/27/2022 00:10:30 - INFO - codeparrot_training - Step 4563: {'lr': 0.0004964908182743117, 'samples': 876288, 'steps': 4563, 'loss/train': 1.0764916241168976} 01/27/2022 00:10:35 - INFO - codeparrot_training - Step 4564: {'lr': 0.0004964880858314146, 'samples': 876480, 'steps': 4564, 'loss/train': 1.0849956572055817} 01/27/2022 00:10:38 - INFO - codeparrot_training - Step 4565: {'lr': 0.0004964853523326406, 'samples': 876672, 'steps': 4565, 'loss/train': 0.7123946696519852} 01/27/2022 00:10:41 - INFO - codeparrot_training - Step 4566: {'lr': 0.0004964826177780017, 'samples': 876864, 'steps': 4566, 'loss/train': 0.1526653692126274} 01/27/2022 00:10:44 - INFO - codeparrot_training - Step 4567: {'lr': 0.0004964798821675096, 'samples': 877056, 'steps': 4567, 'loss/train': 0.9255441427230835} 01/27/2022 00:10:47 - INFO - codeparrot_training - Step 4568: {'lr': 0.0004964771455011758, 'samples': 877248, 'steps': 4568, 'loss/train': 0.5782732665538788} 01/27/2022 00:10:51 - INFO - codeparrot_training - Step 4569: {'lr': 0.0004964744077790123, 'samples': 877440, 'steps': 4569, 'loss/train': 0.9785381555557251} 01/27/2022 00:10:54 - INFO - codeparrot_training - Step 4570: {'lr': 0.0004964716690010306, 'samples': 877632, 'steps': 4570, 'loss/train': 1.2055164277553558} 01/27/2022 00:10:57 - INFO - codeparrot_training - Step 4571: {'lr': 0.0004964689291672427, 'samples': 877824, 'steps': 4571, 'loss/train': 0.9585228860378265} 01/27/2022 00:11:00 - INFO - codeparrot_training - Step 4572: {'lr': 0.00049646618827766, 'samples': 878016, 'steps': 4572, 'loss/train': 0.9428175687789917} 01/27/2022 00:11:05 - INFO - codeparrot_training - Step 4573: {'lr': 0.0004964634463322945, 'samples': 878208, 'steps': 4573, 'loss/train': 0.8275305032730103} 01/27/2022 00:11:08 - INFO - codeparrot_training - Step 4574: {'lr': 0.0004964607033311579, 'samples': 878400, 'steps': 4574, 'loss/train': 1.0095592439174652} 01/27/2022 00:11:11 - INFO - codeparrot_training - Step 4575: {'lr': 0.0004964579592742618, 'samples': 878592, 'steps': 4575, 'loss/train': 0.10170872882008553} 01/27/2022 00:11:14 - INFO - codeparrot_training - Step 4576: 
{'lr': 0.000496455214161618, 'samples': 878784, 'steps': 4576, 'loss/train': 0.957490861415863} 01/27/2022 00:11:17 - INFO - codeparrot_training - Step 4577: {'lr': 0.0004964524679932385, 'samples': 878976, 'steps': 4577, 'loss/train': 0.9475832283496857} 01/27/2022 00:11:21 - INFO - codeparrot_training - Step 4578: {'lr': 0.0004964497207691349, 'samples': 879168, 'steps': 4578, 'loss/train': 1.7077245712280273} 01/27/2022 00:11:24 - INFO - codeparrot_training - Step 4579: {'lr': 0.0004964469724893188, 'samples': 879360, 'steps': 4579, 'loss/train': 1.1741195619106293} 01/27/2022 00:11:27 - INFO - codeparrot_training - Step 4580: {'lr': 0.0004964442231538023, 'samples': 879552, 'steps': 4580, 'loss/train': 0.7858286798000336} 01/27/2022 00:11:30 - INFO - codeparrot_training - Step 4581: {'lr': 0.0004964414727625968, 'samples': 879744, 'steps': 4581, 'loss/train': 0.8149469196796417} 01/27/2022 00:11:35 - INFO - codeparrot_training - Step 4582: {'lr': 0.0004964387213157143, 'samples': 879936, 'steps': 4582, 'loss/train': 0.3735726848244667} 01/27/2022 00:11:38 - INFO - codeparrot_training - Step 4583: {'lr': 0.0004964359688131667, 'samples': 880128, 'steps': 4583, 'loss/train': 0.9922443330287933} 01/27/2022 00:11:41 - INFO - codeparrot_training - Step 4584: {'lr': 0.0004964332152549657, 'samples': 880320, 'steps': 4584, 'loss/train': 0.666193887591362} 01/27/2022 00:11:44 - INFO - codeparrot_training - Step 4585: {'lr': 0.0004964304606411229, 'samples': 880512, 'steps': 4585, 'loss/train': 0.9572897851467133} 01/27/2022 00:11:48 - INFO - codeparrot_training - Step 4586: {'lr': 0.0004964277049716503, 'samples': 880704, 'steps': 4586, 'loss/train': 1.575671911239624} 01/27/2022 00:11:51 - INFO - codeparrot_training - Step 4587: {'lr': 0.0004964249482465597, 'samples': 880896, 'steps': 4587, 'loss/train': 0.6467206478118896} 01/27/2022 00:11:54 - INFO - codeparrot_training - Step 4588: {'lr': 0.0004964221904658629, 'samples': 881088, 'steps': 4588, 'loss/train': 0.8389574289321899} 01/27/2022 00:11:57 - INFO - codeparrot_training - Step 4589: {'lr': 0.0004964194316295716, 'samples': 881280, 'steps': 4589, 'loss/train': 0.7263224869966507} 01/27/2022 00:12:02 - INFO - codeparrot_training - Step 4590: {'lr': 0.0004964166717376978, 'samples': 881472, 'steps': 4590, 'loss/train': 1.033930242061615} 01/27/2022 00:12:05 - INFO - codeparrot_training - Step 4591: {'lr': 0.0004964139107902531, 'samples': 881664, 'steps': 4591, 'loss/train': 0.8311435282230377} 01/27/2022 00:12:08 - INFO - codeparrot_training - Step 4592: {'lr': 0.0004964111487872495, 'samples': 881856, 'steps': 4592, 'loss/train': 0.9894022643566132} 01/27/2022 00:12:11 - INFO - codeparrot_training - Step 4593: {'lr': 0.0004964083857286988, 'samples': 882048, 'steps': 4593, 'loss/train': 1.2341094017028809} 01/27/2022 00:12:14 - INFO - codeparrot_training - Step 4594: {'lr': 0.0004964056216146129, 'samples': 882240, 'steps': 4594, 'loss/train': 1.0763654708862305} 01/27/2022 00:12:17 - INFO - codeparrot_training - Step 4595: {'lr': 0.0004964028564450034, 'samples': 882432, 'steps': 4595, 'loss/train': 1.0295953452587128} 01/27/2022 00:12:20 - INFO - codeparrot_training - Step 4596: {'lr': 0.0004964000902198824, 'samples': 882624, 'steps': 4596, 'loss/train': 0.7806967198848724} 01/27/2022 00:12:24 - INFO - codeparrot_training - Step 4597: {'lr': 0.0004963973229392617, 'samples': 882816, 'steps': 4597, 'loss/train': 0.5852890759706497} 01/27/2022 00:12:27 - INFO - codeparrot_training - Step 4598: {'lr': 0.0004963945546031529, 
'samples': 883008, 'steps': 4598, 'loss/train': 1.5504178404808044} 01/27/2022 00:12:31 - INFO - codeparrot_training - Step 4599: {'lr': 0.0004963917852115683, 'samples': 883200, 'steps': 4599, 'loss/train': 1.3088673949241638} 01/27/2022 00:12:34 - INFO - codeparrot_training - Step 4600: {'lr': 0.0004963890147645194, 'samples': 883392, 'steps': 4600, 'loss/train': 0.9868986010551453} 01/27/2022 00:12:37 - INFO - codeparrot_training - Step 4601: {'lr': 0.0004963862432620183, 'samples': 883584, 'steps': 4601, 'loss/train': 0.7159115821123123} 01/27/2022 00:12:40 - INFO - codeparrot_training - Step 4602: {'lr': 0.0004963834707040767, 'samples': 883776, 'steps': 4602, 'loss/train': 0.6606528013944626} 01/27/2022 00:12:44 - INFO - codeparrot_training - Step 4603: {'lr': 0.0004963806970907066, 'samples': 883968, 'steps': 4603, 'loss/train': 1.3228847980499268} 01/27/2022 00:12:47 - INFO - codeparrot_training - Step 4604: {'lr': 0.0004963779224219197, 'samples': 884160, 'steps': 4604, 'loss/train': 1.3982521891593933} 01/27/2022 00:12:50 - INFO - codeparrot_training - Step 4605: {'lr': 0.0004963751466977281, 'samples': 884352, 'steps': 4605, 'loss/train': 1.7250730991363525} 01/27/2022 00:12:53 - INFO - codeparrot_training - Step 4606: {'lr': 0.0004963723699181437, 'samples': 884544, 'steps': 4606, 'loss/train': 0.9387743175029755} 01/27/2022 00:12:56 - INFO - codeparrot_training - Step 4607: {'lr': 0.0004963695920831781, 'samples': 884736, 'steps': 4607, 'loss/train': 0.8633173406124115} 01/27/2022 00:13:01 - INFO - codeparrot_training - Step 4608: {'lr': 0.0004963668131928436, 'samples': 884928, 'steps': 4608, 'loss/train': 0.654402494430542} 01/27/2022 00:13:04 - INFO - codeparrot_training - Step 4609: {'lr': 0.0004963640332471518, 'samples': 885120, 'steps': 4609, 'loss/train': 0.8019990921020508} 01/27/2022 00:13:07 - INFO - codeparrot_training - Step 4610: {'lr': 0.0004963612522461147, 'samples': 885312, 'steps': 4610, 'loss/train': 0.5975612103939056} 01/27/2022 00:13:10 - INFO - codeparrot_training - Step 4611: {'lr': 0.0004963584701897443, 'samples': 885504, 'steps': 4611, 'loss/train': 0.636740192770958} 01/27/2022 00:13:13 - INFO - codeparrot_training - Step 4612: {'lr': 0.0004963556870780523, 'samples': 885696, 'steps': 4612, 'loss/train': 1.2743381559848785} 01/27/2022 00:13:16 - INFO - codeparrot_training - Step 4613: {'lr': 0.0004963529029110509, 'samples': 885888, 'steps': 4613, 'loss/train': 1.1530618965625763} 01/27/2022 00:13:19 - INFO - codeparrot_training - Step 4614: {'lr': 0.0004963501176887519, 'samples': 886080, 'steps': 4614, 'loss/train': 1.1171431839466095} 01/27/2022 00:13:23 - INFO - codeparrot_training - Step 4615: {'lr': 0.000496347331411167, 'samples': 886272, 'steps': 4615, 'loss/train': 0.778814971446991} 01/27/2022 00:13:26 - INFO - codeparrot_training - Step 4616: {'lr': 0.0004963445440783086, 'samples': 886464, 'steps': 4616, 'loss/train': 0.7502430975437164} 01/27/2022 00:13:31 - INFO - codeparrot_training - Step 4617: {'lr': 0.0004963417556901882, 'samples': 886656, 'steps': 4617, 'loss/train': 0.9808393120765686} 01/27/2022 00:13:34 - INFO - codeparrot_training - Step 4618: {'lr': 0.0004963389662468182, 'samples': 886848, 'steps': 4618, 'loss/train': 0.7524392902851105} 01/27/2022 00:13:37 - INFO - codeparrot_training - Step 4619: {'lr': 0.0004963361757482101, 'samples': 887040, 'steps': 4619, 'loss/train': 0.5300348103046417} 01/27/2022 00:13:40 - INFO - codeparrot_training - Step 4620: {'lr': 0.000496333384194376, 'samples': 887232, 'steps': 4620, 
'loss/train': 0.914741724729538} 01/27/2022 00:13:43 - INFO - codeparrot_training - Step 4621: {'lr': 0.000496330591585328, 'samples': 887424, 'steps': 4621, 'loss/train': 1.5971880555152893} 01/27/2022 00:13:46 - INFO - codeparrot_training - Step 4622: {'lr': 0.0004963277979210779, 'samples': 887616, 'steps': 4622, 'loss/train': 0.7655216753482819} 01/27/2022 00:13:50 - INFO - codeparrot_training - Step 4623: {'lr': 0.0004963250032016379, 'samples': 887808, 'steps': 4623, 'loss/train': 0.7457107454538345} 01/27/2022 00:13:53 - INFO - codeparrot_training - Step 4624: {'lr': 0.0004963222074270197, 'samples': 888000, 'steps': 4624, 'loss/train': 1.0535828769207} 01/27/2022 00:13:56 - INFO - codeparrot_training - Step 4625: {'lr': 0.0004963194105972353, 'samples': 888192, 'steps': 4625, 'loss/train': 0.7036985903978348} 01/27/2022 00:14:01 - INFO - codeparrot_training - Step 4626: {'lr': 0.0004963166127122969, 'samples': 888384, 'steps': 4626, 'loss/train': 0.9010710418224335} 01/27/2022 00:14:04 - INFO - codeparrot_training - Step 4627: {'lr': 0.0004963138137722161, 'samples': 888576, 'steps': 4627, 'loss/train': 0.9996485710144043} 01/27/2022 00:14:07 - INFO - codeparrot_training - Step 4628: {'lr': 0.0004963110137770054, 'samples': 888768, 'steps': 4628, 'loss/train': 0.6410195678472519} 01/27/2022 00:14:10 - INFO - codeparrot_training - Step 4629: {'lr': 0.0004963082127266764, 'samples': 888960, 'steps': 4629, 'loss/train': 0.5960507690906525} 01/27/2022 00:14:13 - INFO - codeparrot_training - Step 4630: {'lr': 0.0004963054106212414, 'samples': 889152, 'steps': 4630, 'loss/train': 0.750140905380249} 01/27/2022 00:14:16 - INFO - codeparrot_training - Step 4631: {'lr': 0.000496302607460712, 'samples': 889344, 'steps': 4631, 'loss/train': 1.213489294052124} 01/27/2022 00:14:19 - INFO - codeparrot_training - Step 4632: {'lr': 0.0004962998032451005, 'samples': 889536, 'steps': 4632, 'loss/train': 0.8354296982288361} 01/27/2022 00:14:23 - INFO - codeparrot_training - Step 4633: {'lr': 0.0004962969979744189, 'samples': 889728, 'steps': 4633, 'loss/train': 0.9597411453723907} 01/27/2022 00:14:26 - INFO - codeparrot_training - Step 4634: {'lr': 0.0004962941916486791, 'samples': 889920, 'steps': 4634, 'loss/train': 0.69196517765522} 01/27/2022 00:14:30 - INFO - codeparrot_training - Step 4635: {'lr': 0.0004962913842678934, 'samples': 890112, 'steps': 4635, 'loss/train': 1.0265707969665527} 01/27/2022 00:14:33 - INFO - codeparrot_training - Step 4636: {'lr': 0.0004962885758320734, 'samples': 890304, 'steps': 4636, 'loss/train': 0.8120384216308594} 01/27/2022 00:14:36 - INFO - codeparrot_training - Step 4637: {'lr': 0.0004962857663412314, 'samples': 890496, 'steps': 4637, 'loss/train': 0.8389788866043091} 01/27/2022 00:14:40 - INFO - codeparrot_training - Step 4638: {'lr': 0.0004962829557953794, 'samples': 890688, 'steps': 4638, 'loss/train': 0.9401789903640747} 01/27/2022 00:14:43 - INFO - codeparrot_training - Step 4639: {'lr': 0.0004962801441945293, 'samples': 890880, 'steps': 4639, 'loss/train': 0.9959884285926819} 01/27/2022 00:14:46 - INFO - codeparrot_training - Step 4640: {'lr': 0.0004962773315386935, 'samples': 891072, 'steps': 4640, 'loss/train': 0.6978537440299988} 01/27/2022 00:14:49 - INFO - codeparrot_training - Step 4641: {'lr': 0.0004962745178278837, 'samples': 891264, 'steps': 4641, 'loss/train': 0.725190594792366} 01/27/2022 00:14:52 - INFO - codeparrot_training - Step 4642: {'lr': 0.000496271703062112, 'samples': 891456, 'steps': 4642, 'loss/train': 0.550032377243042} 01/27/2022 
00:14:55 - INFO - codeparrot_training - Step 4643: {'lr': 0.0004962688872413906, 'samples': 891648, 'steps': 4643, 'loss/train': 0.8472450971603394} 01/27/2022 00:15:00 - INFO - codeparrot_training - Step 4644: {'lr': 0.0004962660703657315, 'samples': 891840, 'steps': 4644, 'loss/train': 0.9837723970413208} 01/27/2022 00:15:04 - INFO - codeparrot_training - Step 4645: {'lr': 0.0004962632524351467, 'samples': 892032, 'steps': 4645, 'loss/train': 0.40105949342250824} 01/27/2022 00:15:07 - INFO - codeparrot_training - Step 4646: {'lr': 0.0004962604334496483, 'samples': 892224, 'steps': 4646, 'loss/train': 1.1366229951381683} 01/27/2022 00:15:10 - INFO - codeparrot_training - Step 4647: {'lr': 0.0004962576134092485, 'samples': 892416, 'steps': 4647, 'loss/train': 0.8859321177005768} 01/27/2022 00:15:13 - INFO - codeparrot_training - Step 4648: {'lr': 0.0004962547923139592, 'samples': 892608, 'steps': 4648, 'loss/train': 0.322323739528656} 01/27/2022 00:15:16 - INFO - codeparrot_training - Step 4649: {'lr': 0.0004962519701637926, 'samples': 892800, 'steps': 4649, 'loss/train': 1.1854309737682343} 01/27/2022 00:15:19 - INFO - codeparrot_training - Step 4650: {'lr': 0.0004962491469587607, 'samples': 892992, 'steps': 4650, 'loss/train': 0.8912545144557953} 01/27/2022 00:15:22 - INFO - codeparrot_training - Step 4651: {'lr': 0.0004962463226988758, 'samples': 893184, 'steps': 4651, 'loss/train': 0.5283988863229752} 01/27/2022 00:15:27 - INFO - codeparrot_training - Step 4652: {'lr': 0.0004962434973841497, 'samples': 893376, 'steps': 4652, 'loss/train': 0.7634910643100739} 01/27/2022 00:15:30 - INFO - codeparrot_training - Step 4653: {'lr': 0.0004962406710145946, 'samples': 893568, 'steps': 4653, 'loss/train': 0.660570502281189} 01/27/2022 00:15:33 - INFO - codeparrot_training - Step 4654: {'lr': 0.0004962378435902228, 'samples': 893760, 'steps': 4654, 'loss/train': 0.875801146030426} 01/27/2022 00:15:37 - INFO - codeparrot_training - Step 4655: {'lr': 0.0004962350151110461, 'samples': 893952, 'steps': 4655, 'loss/train': 0.7810496091842651} 01/27/2022 00:15:40 - INFO - codeparrot_training - Step 4656: {'lr': 0.0004962321855770769, 'samples': 894144, 'steps': 4656, 'loss/train': 0.6789182871580124} 01/27/2022 00:15:43 - INFO - codeparrot_training - Step 4657: {'lr': 0.0004962293549883273, 'samples': 894336, 'steps': 4657, 'loss/train': 0.5510556399822235} 01/27/2022 00:15:46 - INFO - codeparrot_training - Step 4658: {'lr': 0.0004962265233448092, 'samples': 894528, 'steps': 4658, 'loss/train': 1.0247858762741089} 01/27/2022 00:15:49 - INFO - codeparrot_training - Step 4659: {'lr': 0.0004962236906465349, 'samples': 894720, 'steps': 4659, 'loss/train': 0.7675975263118744} 01/27/2022 00:15:52 - INFO - codeparrot_training - Step 4660: {'lr': 0.0004962208568935164, 'samples': 894912, 'steps': 4660, 'loss/train': 0.7470409423112869} 01/27/2022 00:15:57 - INFO - codeparrot_training - Step 4661: {'lr': 0.000496218022085766, 'samples': 895104, 'steps': 4661, 'loss/train': 0.5871432423591614} 01/27/2022 00:16:00 - INFO - codeparrot_training - Step 4662: {'lr': 0.0004962151862232958, 'samples': 895296, 'steps': 4662, 'loss/train': 1.3834701776504517} 01/27/2022 00:16:04 - INFO - codeparrot_training - Step 4663: {'lr': 0.000496212349306118, 'samples': 895488, 'steps': 4663, 'loss/train': 0.13673239946365356} 01/27/2022 00:16:07 - INFO - codeparrot_training - Step 4664: {'lr': 0.0004962095113342445, 'samples': 895680, 'steps': 4664, 'loss/train': 0.709673747420311} 01/27/2022 00:16:10 - INFO - codeparrot_training 
- Step 4665: {'lr': 0.0004962066723076878, 'samples': 895872, 'steps': 4665, 'loss/train': 0.9789771437644958} 01/27/2022 00:16:13 - INFO - codeparrot_training - Step 4666: {'lr': 0.0004962038322264598, 'samples': 896064, 'steps': 4666, 'loss/train': 1.1926880478858948} 01/27/2022 00:16:16 - INFO - codeparrot_training - Step 4667: {'lr': 0.0004962009910905728, 'samples': 896256, 'steps': 4667, 'loss/train': 0.8343027234077454} 01/27/2022 00:16:19 - INFO - codeparrot_training - Step 4668: {'lr': 0.0004961981489000389, 'samples': 896448, 'steps': 4668, 'loss/train': 0.8921383917331696} 01/27/2022 00:16:22 - INFO - codeparrot_training - Step 4669: {'lr': 0.0004961953056548703, 'samples': 896640, 'steps': 4669, 'loss/train': 0.888984739780426} 01/27/2022 00:16:27 - INFO - codeparrot_training - Step 4670: {'lr': 0.0004961924613550793, 'samples': 896832, 'steps': 4670, 'loss/train': 0.8915638625621796} 01/27/2022 00:16:30 - INFO - codeparrot_training - Step 4671: {'lr': 0.0004961896160006778, 'samples': 897024, 'steps': 4671, 'loss/train': 0.8091963529586792} 01/27/2022 00:16:34 - INFO - codeparrot_training - Step 4672: {'lr': 0.0004961867695916782, 'samples': 897216, 'steps': 4672, 'loss/train': 1.112051546573639} 01/27/2022 00:16:37 - INFO - codeparrot_training - Step 4673: {'lr': 0.0004961839221280927, 'samples': 897408, 'steps': 4673, 'loss/train': 0.8377510607242584} 01/27/2022 00:16:40 - INFO - codeparrot_training - Step 4674: {'lr': 0.0004961810736099334, 'samples': 897600, 'steps': 4674, 'loss/train': 0.05567766912281513} 01/27/2022 00:16:43 - INFO - codeparrot_training - Step 4675: {'lr': 0.0004961782240372126, 'samples': 897792, 'steps': 4675, 'loss/train': 0.9511981308460236} 01/27/2022 00:16:46 - INFO - codeparrot_training - Step 4676: {'lr': 0.0004961753734099425, 'samples': 897984, 'steps': 4676, 'loss/train': 0.7730490267276764} 01/27/2022 00:16:49 - INFO - codeparrot_training - Step 4677: {'lr': 0.0004961725217281352, 'samples': 898176, 'steps': 4677, 'loss/train': 1.8118067979812622} 01/27/2022 00:16:53 - INFO - codeparrot_training - Step 4678: {'lr': 0.0004961696689918029, 'samples': 898368, 'steps': 4678, 'loss/train': 1.1940688490867615} 01/27/2022 00:16:57 - INFO - codeparrot_training - Step 4679: {'lr': 0.0004961668152009581, 'samples': 898560, 'steps': 4679, 'loss/train': 0.6913723200559616} 01/27/2022 00:17:00 - INFO - codeparrot_training - Step 4680: {'lr': 0.0004961639603556127, 'samples': 898752, 'steps': 4680, 'loss/train': 0.8223822712898254} 01/27/2022 00:17:03 - INFO - codeparrot_training - Step 4681: {'lr': 0.0004961611044557792, 'samples': 898944, 'steps': 4681, 'loss/train': 0.9421108067035675} 01/27/2022 00:17:06 - INFO - codeparrot_training - Step 4682: {'lr': 0.0004961582475014695, 'samples': 899136, 'steps': 4682, 'loss/train': 1.078751653432846} 01/27/2022 00:17:10 - INFO - codeparrot_training - Step 4683: {'lr': 0.0004961553894926961, 'samples': 899328, 'steps': 4683, 'loss/train': 0.6569390147924423} 01/27/2022 00:17:13 - INFO - codeparrot_training - Step 4684: {'lr': 0.0004961525304294712, 'samples': 899520, 'steps': 4684, 'loss/train': 0.9878751039505005} 01/27/2022 00:17:16 - INFO - codeparrot_training - Step 4685: {'lr': 0.000496149670311807, 'samples': 899712, 'steps': 4685, 'loss/train': 0.6850486546754837} 01/27/2022 00:17:19 - INFO - codeparrot_training - Step 4686: {'lr': 0.0004961468091397158, 'samples': 899904, 'steps': 4686, 'loss/train': 0.8876096606254578} 01/27/2022 00:17:22 - INFO - codeparrot_training - Step 4687: {'lr': 
0.0004961439469132098, 'samples': 900096, 'steps': 4687, 'loss/train': 1.1196255683898926} 01/27/2022 00:17:27 - INFO - codeparrot_training - Step 4688: {'lr': 0.0004961410836323014, 'samples': 900288, 'steps': 4688, 'loss/train': 1.2280381321907043} 01/27/2022 00:17:31 - INFO - codeparrot_training - Step 4689: {'lr': 0.0004961382192970027, 'samples': 900480, 'steps': 4689, 'loss/train': 0.6953539252281189} 01/27/2022 00:17:34 - INFO - codeparrot_training - Step 4690: {'lr': 0.0004961353539073258, 'samples': 900672, 'steps': 4690, 'loss/train': 0.5278088450431824} 01/27/2022 00:17:37 - INFO - codeparrot_training - Step 4691: {'lr': 0.0004961324874632835, 'samples': 900864, 'steps': 4691, 'loss/train': 1.0932819843292236} 01/27/2022 00:17:40 - INFO - codeparrot_training - Step 4692: {'lr': 0.0004961296199648877, 'samples': 901056, 'steps': 4692, 'loss/train': 0.882615476846695} 01/27/2022 00:17:43 - INFO - codeparrot_training - Step 4693: {'lr': 0.0004961267514121507, 'samples': 901248, 'steps': 4693, 'loss/train': 0.5934814810752869} 01/27/2022 00:17:46 - INFO - codeparrot_training - Step 4694: {'lr': 0.0004961238818050849, 'samples': 901440, 'steps': 4694, 'loss/train': 0.5707316100597382} 01/27/2022 00:17:49 - INFO - codeparrot_training - Step 4695: {'lr': 0.0004961210111437026, 'samples': 901632, 'steps': 4695, 'loss/train': 0.822721391916275} 01/27/2022 00:17:53 - INFO - codeparrot_training - Step 4696: {'lr': 0.0004961181394280159, 'samples': 901824, 'steps': 4696, 'loss/train': 0.3982510417699814} 01/27/2022 00:17:57 - INFO - codeparrot_training - Step 4697: {'lr': 0.0004961152666580373, 'samples': 902016, 'steps': 4697, 'loss/train': 0.5826927423477173} 01/27/2022 00:18:00 - INFO - codeparrot_training - Step 4698: {'lr': 0.0004961123928337791, 'samples': 902208, 'steps': 4698, 'loss/train': 0.9754900932312012} 01/27/2022 00:18:03 - INFO - codeparrot_training - Step 4699: {'lr': 0.0004961095179552535, 'samples': 902400, 'steps': 4699, 'loss/train': 0.7586562037467957} 01/27/2022 00:18:06 - INFO - codeparrot_training - Step 4700: {'lr': 0.0004961066420224729, 'samples': 902592, 'steps': 4700, 'loss/train': 0.834577739238739} 01/27/2022 00:18:10 - INFO - codeparrot_training - Step 4701: {'lr': 0.0004961037650354496, 'samples': 902784, 'steps': 4701, 'loss/train': 0.5121902972459793} 01/27/2022 00:18:13 - INFO - codeparrot_training - Step 4702: {'lr': 0.0004961008869941959, 'samples': 902976, 'steps': 4702, 'loss/train': 0.8198444545269012} 01/27/2022 00:18:16 - INFO - codeparrot_training - Step 4703: {'lr': 0.0004960980078987241, 'samples': 903168, 'steps': 4703, 'loss/train': 0.6128533780574799} 01/27/2022 00:18:19 - INFO - codeparrot_training - Step 4704: {'lr': 0.0004960951277490467, 'samples': 903360, 'steps': 4704, 'loss/train': 1.061902016401291} 01/27/2022 00:18:22 - INFO - codeparrot_training - Step 4705: {'lr': 0.0004960922465451758, 'samples': 903552, 'steps': 4705, 'loss/train': 0.8978462219238281} 01/27/2022 00:18:29 - INFO - codeparrot_training - Step 4706: {'lr': 0.0004960893642871239, 'samples': 903744, 'steps': 4706, 'loss/train': 0.9600379765033722} 01/27/2022 00:18:32 - INFO - codeparrot_training - Step 4707: {'lr': 0.0004960864809749034, 'samples': 903936, 'steps': 4707, 'loss/train': 0.8784620761871338} 01/27/2022 00:18:35 - INFO - codeparrot_training - Step 4708: {'lr': 0.0004960835966085264, 'samples': 904128, 'steps': 4708, 'loss/train': 0.30206436663866043} 01/27/2022 00:18:39 - INFO - codeparrot_training - Step 4709: {'lr': 0.0004960807111880055, 'samples': 
904320, 'steps': 4709, 'loss/train': 1.155686616897583} 01/27/2022 00:18:42 - INFO - codeparrot_training - Step 4710: {'lr': 0.000496077824713353, 'samples': 904512, 'steps': 4710, 'loss/train': 0.9288280606269836} 01/27/2022 00:18:45 - INFO - codeparrot_training - Step 4711: {'lr': 0.0004960749371845812, 'samples': 904704, 'steps': 4711, 'loss/train': 0.82766193151474} 01/27/2022 00:18:48 - INFO - codeparrot_training - Step 4712: {'lr': 0.0004960720486017025, 'samples': 904896, 'steps': 4712, 'loss/train': 0.9226456582546234} 01/27/2022 00:18:51 - INFO - codeparrot_training - Step 4713: {'lr': 0.0004960691589647292, 'samples': 905088, 'steps': 4713, 'loss/train': 0.6810197532176971} 01/27/2022 00:18:56 - INFO - codeparrot_training - Step 4714: {'lr': 0.0004960662682736739, 'samples': 905280, 'steps': 4714, 'loss/train': 0.6639811098575592} 01/27/2022 00:18:59 - INFO - codeparrot_training - Step 4715: {'lr': 0.0004960633765285487, 'samples': 905472, 'steps': 4715, 'loss/train': 0.8629632890224457} 01/27/2022 00:19:02 - INFO - codeparrot_training - Step 4716: {'lr': 0.0004960604837293663, 'samples': 905664, 'steps': 4716, 'loss/train': 0.916792094707489} 01/27/2022 00:19:05 - INFO - codeparrot_training - Step 4717: {'lr': 0.0004960575898761388, 'samples': 905856, 'steps': 4717, 'loss/train': 0.8688374161720276} 01/27/2022 00:19:08 - INFO - codeparrot_training - Step 4718: {'lr': 0.0004960546949688788, 'samples': 906048, 'steps': 4718, 'loss/train': 1.0163571238517761} 01/27/2022 00:19:11 - INFO - codeparrot_training - Step 4719: {'lr': 0.0004960517990075985, 'samples': 906240, 'steps': 4719, 'loss/train': 1.0745339691638947} 01/27/2022 00:19:15 - INFO - codeparrot_training - Step 4720: {'lr': 0.0004960489019923105, 'samples': 906432, 'steps': 4720, 'loss/train': 0.6541410237550735} 01/27/2022 00:19:18 - INFO - codeparrot_training - Step 4721: {'lr': 0.0004960460039230271, 'samples': 906624, 'steps': 4721, 'loss/train': 0.792168527841568} 01/27/2022 00:19:21 - INFO - codeparrot_training - Step 4722: {'lr': 0.0004960431047997608, 'samples': 906816, 'steps': 4722, 'loss/train': 0.9630715548992157} 01/27/2022 00:19:25 - INFO - codeparrot_training - Step 4723: {'lr': 0.0004960402046225239, 'samples': 907008, 'steps': 4723, 'loss/train': 0.8948393762111664} 01/27/2022 00:19:28 - INFO - codeparrot_training - Step 4724: {'lr': 0.0004960373033913289, 'samples': 907200, 'steps': 4724, 'loss/train': 0.674195408821106} 01/27/2022 00:19:31 - INFO - codeparrot_training - Step 4725: {'lr': 0.0004960344011061882, 'samples': 907392, 'steps': 4725, 'loss/train': 0.5702462643384933} 01/27/2022 00:19:35 - INFO - codeparrot_training - Step 4726: {'lr': 0.0004960314977671144, 'samples': 907584, 'steps': 4726, 'loss/train': 0.7028428763151169} 01/27/2022 00:19:38 - INFO - codeparrot_training - Step 4727: {'lr': 0.0004960285933741196, 'samples': 907776, 'steps': 4727, 'loss/train': 1.0260201394557953} 01/27/2022 00:19:41 - INFO - codeparrot_training - Step 4728: {'lr': 0.0004960256879272166, 'samples': 907968, 'steps': 4728, 'loss/train': 0.5375488847494125} 01/27/2022 00:19:44 - INFO - codeparrot_training - Step 4729: {'lr': 0.0004960227814264175, 'samples': 908160, 'steps': 4729, 'loss/train': 0.5806394219398499} 01/27/2022 00:19:47 - INFO - codeparrot_training - Step 4730: {'lr': 0.0004960198738717351, 'samples': 908352, 'steps': 4730, 'loss/train': 0.9816100895404816} 01/27/2022 00:19:50 - INFO - codeparrot_training - Step 4731: {'lr': 0.0004960169652631815, 'samples': 908544, 'steps': 4731, 'loss/train': 
0.8115139603614807} 01/27/2022 00:19:56 - INFO - codeparrot_training - Step 4732: {'lr': 0.0004960140556007695, 'samples': 908736, 'steps': 4732, 'loss/train': 0.4336967021226883} 01/27/2022 00:19:59 - INFO - codeparrot_training - Step 4733: {'lr': 0.0004960111448845114, 'samples': 908928, 'steps': 4733, 'loss/train': 0.7362978905439377} 01/27/2022 00:20:02 - INFO - codeparrot_training - Step 4734: {'lr': 0.0004960082331144195, 'samples': 909120, 'steps': 4734, 'loss/train': 1.0761151313781738} 01/27/2022 00:20:05 - INFO - codeparrot_training - Step 4735: {'lr': 0.0004960053202905066, 'samples': 909312, 'steps': 4735, 'loss/train': 1.0377857387065887} 01/27/2022 00:20:09 - INFO - codeparrot_training - Step 4736: {'lr': 0.0004960024064127849, 'samples': 909504, 'steps': 4736, 'loss/train': 1.100013256072998} 01/27/2022 00:20:12 - INFO - codeparrot_training - Step 4737: {'lr': 0.0004959994914812671, 'samples': 909696, 'steps': 4737, 'loss/train': 0.7682605683803558} 01/27/2022 00:20:15 - INFO - codeparrot_training - Step 4738: {'lr': 0.0004959965754959656, 'samples': 909888, 'steps': 4738, 'loss/train': 0.5501359552145004} 01/27/2022 00:20:18 - INFO - codeparrot_training - Step 4739: {'lr': 0.0004959936584568928, 'samples': 910080, 'steps': 4739, 'loss/train': 0.9216068387031555} 01/27/2022 00:20:21 - INFO - codeparrot_training - Step 4740: {'lr': 0.0004959907403640614, 'samples': 910272, 'steps': 4740, 'loss/train': 0.28534356504678726} 01/27/2022 00:20:25 - INFO - codeparrot_training - Step 4741: {'lr': 0.0004959878212174837, 'samples': 910464, 'steps': 4741, 'loss/train': 0.9911748468875885} 01/27/2022 00:20:29 - INFO - codeparrot_training - Step 4742: {'lr': 0.0004959849010171723, 'samples': 910656, 'steps': 4742, 'loss/train': 1.353675127029419} 01/27/2022 00:20:32 - INFO - codeparrot_training - Step 4743: {'lr': 0.0004959819797631397, 'samples': 910848, 'steps': 4743, 'loss/train': 0.9704238474369049} 01/27/2022 00:20:35 - INFO - codeparrot_training - Step 4744: {'lr': 0.0004959790574553984, 'samples': 911040, 'steps': 4744, 'loss/train': 1.0456754565238953} 01/27/2022 00:20:38 - INFO - codeparrot_training - Step 4745: {'lr': 0.000495976134093961, 'samples': 911232, 'steps': 4745, 'loss/train': 1.5165655016899109} 01/27/2022 00:20:41 - INFO - codeparrot_training - Step 4746: {'lr': 0.0004959732096788398, 'samples': 911424, 'steps': 4746, 'loss/train': 0.8259406685829163} 01/27/2022 00:20:44 - INFO - codeparrot_training - Step 4747: {'lr': 0.0004959702842100475, 'samples': 911616, 'steps': 4747, 'loss/train': 1.006917804479599} 01/27/2022 00:20:47 - INFO - codeparrot_training - Step 4748: {'lr': 0.0004959673576875967, 'samples': 911808, 'steps': 4748, 'loss/train': 1.0294341444969177} 01/27/2022 00:20:51 - INFO - codeparrot_training - Step 4749: {'lr': 0.0004959644301114998, 'samples': 912000, 'steps': 4749, 'loss/train': 0.759185403585434} 01/27/2022 00:20:57 - INFO - codeparrot_training - Step 4750: {'lr': 0.0004959615014817694, 'samples': 912192, 'steps': 4750, 'loss/train': 0.8160866796970367} 01/27/2022 00:21:00 - INFO - codeparrot_training - Step 4751: {'lr': 0.000495958571798418, 'samples': 912384, 'steps': 4751, 'loss/train': 0.36804961413145065} 01/27/2022 00:21:03 - INFO - codeparrot_training - Step 4752: {'lr': 0.0004959556410614582, 'samples': 912576, 'steps': 4752, 'loss/train': 0.6180883347988129} 01/27/2022 00:21:06 - INFO - codeparrot_training - Step 4753: {'lr': 0.0004959527092709026, 'samples': 912768, 'steps': 4753, 'loss/train': 0.8933899104595184} 01/27/2022 
00:21:09 - INFO - codeparrot_training - Step 4754: {'lr': 0.0004959497764267636, 'samples': 912960, 'steps': 4754, 'loss/train': 1.0021984577178955} 01/27/2022 00:21:13 - INFO - codeparrot_training - Step 4755: {'lr': 0.0004959468425290537, 'samples': 913152, 'steps': 4755, 'loss/train': 0.7144480794668198} 01/27/2022 00:21:16 - INFO - codeparrot_training - Step 4756: {'lr': 0.0004959439075777858, 'samples': 913344, 'steps': 4756, 'loss/train': 1.0549521446228027} 01/27/2022 00:21:19 - INFO - codeparrot_training - Step 4757: {'lr': 0.0004959409715729723, 'samples': 913536, 'steps': 4757, 'loss/train': 0.5898215174674988} 01/27/2022 00:21:22 - INFO - codeparrot_training - Step 4758: {'lr': 0.0004959380345146258, 'samples': 913728, 'steps': 4758, 'loss/train': 0.7665644288063049} 01/27/2022 00:21:27 - INFO - codeparrot_training - Step 4759: {'lr': 0.0004959350964027588, 'samples': 913920, 'steps': 4759, 'loss/train': 0.5111132562160492} 01/27/2022 00:21:30 - INFO - codeparrot_training - Step 4760: {'lr': 0.000495932157237384, 'samples': 914112, 'steps': 4760, 'loss/train': 0.8086941540241241} 01/27/2022 00:21:33 - INFO - codeparrot_training - Step 4761: {'lr': 0.0004959292170185139, 'samples': 914304, 'steps': 4761, 'loss/train': 0.9197696149349213} 01/27/2022 00:21:36 - INFO - codeparrot_training - Step 4762: {'lr': 0.0004959262757461611, 'samples': 914496, 'steps': 4762, 'loss/train': 0.6545912325382233} 01/27/2022 00:21:39 - INFO - codeparrot_training - Step 4763: {'lr': 0.0004959233334203382, 'samples': 914688, 'steps': 4763, 'loss/train': 0.8185981214046478} 01/27/2022 00:21:42 - INFO - codeparrot_training - Step 4764: {'lr': 0.0004959203900410579, 'samples': 914880, 'steps': 4764, 'loss/train': 1.0430143475532532} 01/27/2022 00:21:46 - INFO - codeparrot_training - Step 4765: {'lr': 0.0004959174456083327, 'samples': 915072, 'steps': 4765, 'loss/train': 1.1118747889995575} 01/27/2022 00:21:49 - INFO - codeparrot_training - Step 4766: {'lr': 0.0004959145001221752, 'samples': 915264, 'steps': 4766, 'loss/train': 0.8531017005443573} 01/27/2022 00:21:52 - INFO - codeparrot_training - Step 4767: {'lr': 0.0004959115535825982, 'samples': 915456, 'steps': 4767, 'loss/train': 1.3114643096923828} 01/27/2022 00:21:56 - INFO - codeparrot_training - Step 4768: {'lr': 0.000495908605989614, 'samples': 915648, 'steps': 4768, 'loss/train': 1.14733225107193} 01/27/2022 00:22:00 - INFO - codeparrot_training - Step 4769: {'lr': 0.0004959056573432357, 'samples': 915840, 'steps': 4769, 'loss/train': 0.09760264679789543} 01/27/2022 00:22:03 - INFO - codeparrot_training - Step 4770: {'lr': 0.0004959027076434754, 'samples': 916032, 'steps': 4770, 'loss/train': 1.1459956169128418} 01/27/2022 00:22:06 - INFO - codeparrot_training - Step 4771: {'lr': 0.000495899756890346, 'samples': 916224, 'steps': 4771, 'loss/train': 1.0883512794971466} 01/27/2022 00:22:09 - INFO - codeparrot_training - Step 4772: {'lr': 0.0004958968050838603, 'samples': 916416, 'steps': 4772, 'loss/train': 0.8369905650615692} 01/27/2022 00:22:12 - INFO - codeparrot_training - Step 4773: {'lr': 0.0004958938522240306, 'samples': 916608, 'steps': 4773, 'loss/train': 1.0204364955425262} 01/27/2022 00:22:15 - INFO - codeparrot_training - Step 4774: {'lr': 0.0004958908983108697, 'samples': 916800, 'steps': 4774, 'loss/train': 0.9762943089008331} 01/27/2022 00:22:18 - INFO - codeparrot_training - Step 4775: {'lr': 0.0004958879433443903, 'samples': 916992, 'steps': 4775, 'loss/train': 0.9000146985054016} 01/27/2022 00:22:22 - INFO - codeparrot_training 
- Step 4776: {'lr': 0.0004958849873246051, 'samples': 917184, 'steps': 4776, 'loss/train': 0.9862204492092133} 01/27/2022 00:22:28 - INFO - codeparrot_training - Step 4777: {'lr': 0.0004958820302515268, 'samples': 917376, 'steps': 4777, 'loss/train': 1.2084486186504364} 01/27/2022 00:22:31 - INFO - codeparrot_training - Step 4778: {'lr': 0.0004958790721251678, 'samples': 917568, 'steps': 4778, 'loss/train': 0.941179096698761} 01/27/2022 00:22:34 - INFO - codeparrot_training - Step 4779: {'lr': 0.000495876112945541, 'samples': 917760, 'steps': 4779, 'loss/train': 1.211145043373108} 01/27/2022 00:22:38 - INFO - codeparrot_training - Step 4780: {'lr': 0.0004958731527126589, 'samples': 917952, 'steps': 4780, 'loss/train': 0.719131126999855} 01/27/2022 00:22:41 - INFO - codeparrot_training - Step 4781: {'lr': 0.0004958701914265344, 'samples': 918144, 'steps': 4781, 'loss/train': 1.5486595630645752} 01/27/2022 00:22:44 - INFO - codeparrot_training - Step 4782: {'lr': 0.0004958672290871799, 'samples': 918336, 'steps': 4782, 'loss/train': 0.47536732256412506} 01/27/2022 00:22:47 - INFO - codeparrot_training - Step 4783: {'lr': 0.0004958642656946084, 'samples': 918528, 'steps': 4783, 'loss/train': 0.5963964611291885} 01/27/2022 00:22:50 - INFO - codeparrot_training - Step 4784: {'lr': 0.0004958613012488324, 'samples': 918720, 'steps': 4784, 'loss/train': 0.777581512928009} 01/27/2022 00:22:53 - INFO - codeparrot_training - Step 4785: {'lr': 0.0004958583357498647, 'samples': 918912, 'steps': 4785, 'loss/train': 0.8439410626888275} 01/27/2022 00:22:58 - INFO - codeparrot_training - Step 4786: {'lr': 0.000495855369197718, 'samples': 919104, 'steps': 4786, 'loss/train': 1.0146812796592712} 01/27/2022 00:23:01 - INFO - codeparrot_training - Step 4787: {'lr': 0.0004958524015924048, 'samples': 919296, 'steps': 4787, 'loss/train': 0.8145301043987274} 01/27/2022 00:23:04 - INFO - codeparrot_training - Step 4788: {'lr': 0.0004958494329339382, 'samples': 919488, 'steps': 4788, 'loss/train': 0.8349961638450623} 01/27/2022 00:23:07 - INFO - codeparrot_training - Step 4789: {'lr': 0.0004958464632223306, 'samples': 919680, 'steps': 4789, 'loss/train': 0.9189355373382568} 01/27/2022 00:23:10 - INFO - codeparrot_training - Step 4790: {'lr': 0.0004958434924575947, 'samples': 919872, 'steps': 4790, 'loss/train': 1.06044602394104} 01/27/2022 00:23:14 - INFO - codeparrot_training - Step 4791: {'lr': 0.0004958405206397434, 'samples': 920064, 'steps': 4791, 'loss/train': 0.2795732617378235} 01/27/2022 00:23:17 - INFO - codeparrot_training - Step 4792: {'lr': 0.0004958375477687896, 'samples': 920256, 'steps': 4792, 'loss/train': 1.303646296262741} 01/27/2022 00:23:20 - INFO - codeparrot_training - Step 4793: {'lr': 0.0004958345738447456, 'samples': 920448, 'steps': 4793, 'loss/train': 1.0234647989273071} 01/27/2022 00:23:24 - INFO - codeparrot_training - Step 4794: {'lr': 0.0004958315988676244, 'samples': 920640, 'steps': 4794, 'loss/train': 0.35209813714027405} 01/27/2022 00:23:28 - INFO - codeparrot_training - Step 4795: {'lr': 0.0004958286228374387, 'samples': 920832, 'steps': 4795, 'loss/train': 1.1514249444007874} 01/27/2022 00:23:31 - INFO - codeparrot_training - Step 4796: {'lr': 0.0004958256457542011, 'samples': 921024, 'steps': 4796, 'loss/train': 0.9770607054233551} 01/27/2022 00:23:34 - INFO - codeparrot_training - Step 4797: {'lr': 0.0004958226676179246, 'samples': 921216, 'steps': 4797, 'loss/train': 0.8808944821357727} 01/27/2022 00:23:37 - INFO - codeparrot_training - Step 4798: {'lr': 
0.0004958196884286218, 'samples': 921408, 'steps': 4798, 'loss/train': 0.8222105205059052} 01/27/2022 00:23:40 - INFO - codeparrot_training - Step 4799: {'lr': 0.0004958167081863057, 'samples': 921600, 'steps': 4799, 'loss/train': 1.0780322849750519} 01/27/2022 00:23:43 - INFO - codeparrot_training - Step 4800: {'lr': 0.0004958137268909887, 'samples': 921792, 'steps': 4800, 'loss/train': 0.8777478933334351} 01/27/2022 00:23:46 - INFO - codeparrot_training - Step 4801: {'lr': 0.0004958107445426838, 'samples': 921984, 'steps': 4801, 'loss/train': 0.7284628003835678} 01/27/2022 00:23:50 - INFO - codeparrot_training - Step 4802: {'lr': 0.0004958077611414037, 'samples': 922176, 'steps': 4802, 'loss/train': 1.2250187695026398} 01/27/2022 00:23:54 - INFO - codeparrot_training - Step 4803: {'lr': 0.0004958047766871612, 'samples': 922368, 'steps': 4803, 'loss/train': 0.7157261073589325} 01/27/2022 00:23:57 - INFO - codeparrot_training - Step 4804: {'lr': 0.000495801791179969, 'samples': 922560, 'steps': 4804, 'loss/train': 0.6598362922668457} 01/27/2022 00:24:00 - INFO - codeparrot_training - Step 4805: {'lr': 0.0004957988046198401, 'samples': 922752, 'steps': 4805, 'loss/train': 0.9299048781394958} 01/27/2022 00:24:04 - INFO - codeparrot_training - Step 4806: {'lr': 0.0004957958170067872, 'samples': 922944, 'steps': 4806, 'loss/train': 0.4805036187171936} 01/27/2022 00:24:07 - INFO - codeparrot_training - Step 4807: {'lr': 0.000495792828340823, 'samples': 923136, 'steps': 4807, 'loss/train': 0.70143261551857} 01/27/2022 00:24:10 - INFO - codeparrot_training - Step 4808: {'lr': 0.0004957898386219603, 'samples': 923328, 'steps': 4808, 'loss/train': 1.2158813774585724} 01/27/2022 00:24:13 - INFO - codeparrot_training - Step 4809: {'lr': 0.0004957868478502121, 'samples': 923520, 'steps': 4809, 'loss/train': 1.1931433081626892} 01/27/2022 00:24:16 - INFO - codeparrot_training - Step 4810: {'lr': 0.0004957838560255911, 'samples': 923712, 'steps': 4810, 'loss/train': 0.8408704698085785} 01/27/2022 00:24:19 - INFO - codeparrot_training - Step 4811: {'lr': 0.0004957808631481101, 'samples': 923904, 'steps': 4811, 'loss/train': 1.1081664562225342} 01/27/2022 00:24:25 - INFO - codeparrot_training - Step 4812: {'lr': 0.0004957778692177819, 'samples': 924096, 'steps': 4812, 'loss/train': 1.1409084498882294} 01/27/2022 00:24:28 - INFO - codeparrot_training - Step 4813: {'lr': 0.0004957748742346193, 'samples': 924288, 'steps': 4813, 'loss/train': 0.4018996059894562} 01/27/2022 00:24:31 - INFO - codeparrot_training - Step 4814: {'lr': 0.0004957718781986352, 'samples': 924480, 'steps': 4814, 'loss/train': 1.3421134650707245} 01/27/2022 00:24:34 - INFO - codeparrot_training - Step 4815: {'lr': 0.0004957688811098425, 'samples': 924672, 'steps': 4815, 'loss/train': 0.9733855426311493} 01/27/2022 00:24:37 - INFO - codeparrot_training - Step 4816: {'lr': 0.0004957658829682539, 'samples': 924864, 'steps': 4816, 'loss/train': 0.9653363227844238} 01/27/2022 00:24:40 - INFO - codeparrot_training - Step 4817: {'lr': 0.0004957628837738823, 'samples': 925056, 'steps': 4817, 'loss/train': 1.1099712252616882} 01/27/2022 00:24:43 - INFO - codeparrot_training - Step 4818: {'lr': 0.0004957598835267405, 'samples': 925248, 'steps': 4818, 'loss/train': 1.106886237859726} 01/27/2022 00:24:47 - INFO - codeparrot_training - Step 4819: {'lr': 0.0004957568822268415, 'samples': 925440, 'steps': 4819, 'loss/train': 0.917710930109024} 01/27/2022 00:24:50 - INFO - codeparrot_training - Step 4820: {'lr': 0.000495753879874198, 'samples': 925632, 
'steps': 4820, 'loss/train': 0.6008580029010773} 01/27/2022 00:24:54 - INFO - codeparrot_training - Step 4821: {'lr': 0.0004957508764688227, 'samples': 925824, 'steps': 4821, 'loss/train': 1.5898540019989014} 01/27/2022 00:24:58 - INFO - codeparrot_training - Step 4822: {'lr': 0.000495747872010729, 'samples': 926016, 'steps': 4822, 'loss/train': 0.8025794327259064} 01/27/2022 00:25:01 - INFO - codeparrot_training - Step 4823: {'lr': 0.0004957448664999293, 'samples': 926208, 'steps': 4823, 'loss/train': 0.9501464366912842} 01/27/2022 00:25:04 - INFO - codeparrot_training - Step 4824: {'lr': 0.0004957418599364367, 'samples': 926400, 'steps': 4824, 'loss/train': 0.6514131277799606} 01/27/2022 00:25:07 - INFO - codeparrot_training - Step 4825: {'lr': 0.000495738852320264, 'samples': 926592, 'steps': 4825, 'loss/train': 0.525568038225174} 01/27/2022 00:25:10 - INFO - codeparrot_training - Step 4826: {'lr': 0.000495735843651424, 'samples': 926784, 'steps': 4826, 'loss/train': 1.24069145321846} 01/27/2022 00:25:13 - INFO - codeparrot_training - Step 4827: {'lr': 0.0004957328339299297, 'samples': 926976, 'steps': 4827, 'loss/train': 0.7268161475658417} 01/27/2022 00:25:17 - INFO - codeparrot_training - Step 4828: {'lr': 0.0004957298231557939, 'samples': 927168, 'steps': 4828, 'loss/train': 0.5175927132368088} 01/27/2022 00:25:22 - INFO - codeparrot_training - Step 4829: {'lr': 0.0004957268113290297, 'samples': 927360, 'steps': 4829, 'loss/train': 1.5261547565460205} 01/27/2022 00:25:25 - INFO - codeparrot_training - Step 4830: {'lr': 0.0004957237984496499, 'samples': 927552, 'steps': 4830, 'loss/train': 0.5456887632608414} 01/27/2022 00:25:28 - INFO - codeparrot_training - Step 4831: {'lr': 0.0004957207845176673, 'samples': 927744, 'steps': 4831, 'loss/train': 0.942411482334137} 01/27/2022 00:25:31 - INFO - codeparrot_training - Step 4832: {'lr': 0.0004957177695330948, 'samples': 927936, 'steps': 4832, 'loss/train': 0.7555948197841644} 01/27/2022 00:25:35 - INFO - codeparrot_training - Step 4833: {'lr': 0.0004957147534959455, 'samples': 928128, 'steps': 4833, 'loss/train': 1.104974627494812} 01/27/2022 00:25:38 - INFO - codeparrot_training - Step 4834: {'lr': 0.0004957117364062321, 'samples': 928320, 'steps': 4834, 'loss/train': 1.394470363855362} 01/27/2022 00:25:41 - INFO - codeparrot_training - Step 4835: {'lr': 0.0004957087182639678, 'samples': 928512, 'steps': 4835, 'loss/train': 1.0150971114635468} 01/27/2022 00:25:44 - INFO - codeparrot_training - Step 4836: {'lr': 0.0004957056990691653, 'samples': 928704, 'steps': 4836, 'loss/train': 1.091951608657837} 01/27/2022 00:25:47 - INFO - codeparrot_training - Step 4837: {'lr': 0.0004957026788218377, 'samples': 928896, 'steps': 4837, 'loss/train': 2.007048726081848} 01/27/2022 00:25:52 - INFO - codeparrot_training - Step 4838: {'lr': 0.0004956996575219977, 'samples': 929088, 'steps': 4838, 'loss/train': 1.3690402507781982} 01/27/2022 00:25:55 - INFO - codeparrot_training - Step 4839: {'lr': 0.0004956966351696584, 'samples': 929280, 'steps': 4839, 'loss/train': 1.4007951021194458} 01/27/2022 00:25:58 - INFO - codeparrot_training - Step 4840: {'lr': 0.0004956936117648329, 'samples': 929472, 'steps': 4840, 'loss/train': 1.0917760133743286} 01/27/2022 00:26:01 - INFO - codeparrot_training - Step 4841: {'lr': 0.0004956905873075338, 'samples': 929664, 'steps': 4841, 'loss/train': 1.1146464049816132} 01/27/2022 00:26:04 - INFO - codeparrot_training - Step 4842: {'lr': 0.0004956875617977743, 'samples': 929856, 'steps': 4842, 'loss/train': 
0.7428535223007202} 01/27/2022 00:26:07 - INFO - codeparrot_training - Step 4843: {'lr': 0.0004956845352355674, 'samples': 930048, 'steps': 4843, 'loss/train': 0.8186894953250885} 01/27/2022 00:26:10 - INFO - codeparrot_training - Step 4844: {'lr': 0.0004956815076209257, 'samples': 930240, 'steps': 4844, 'loss/train': 0.9392916262149811} 01/27/2022 00:26:14 - INFO - codeparrot_training - Step 4845: {'lr': 0.0004956784789538626, 'samples': 930432, 'steps': 4845, 'loss/train': 0.9382995665073395} 01/27/2022 00:26:17 - INFO - codeparrot_training - Step 4846: {'lr': 0.000495675449234391, 'samples': 930624, 'steps': 4846, 'loss/train': 1.3783289194107056} 01/27/2022 00:26:21 - INFO - codeparrot_training - Step 4847: {'lr': 0.0004956724184625237, 'samples': 930816, 'steps': 4847, 'loss/train': 1.4460124969482422} 01/27/2022 00:26:24 - INFO - codeparrot_training - Step 4848: {'lr': 0.0004956693866382738, 'samples': 931008, 'steps': 4848, 'loss/train': 0.7372614294290543} 01/27/2022 00:26:28 - INFO - codeparrot_training - Step 4849: {'lr': 0.0004956663537616542, 'samples': 931200, 'steps': 4849, 'loss/train': 0.6464224755764008} 01/27/2022 00:26:31 - INFO - codeparrot_training - Step 4850: {'lr': 0.000495663319832678, 'samples': 931392, 'steps': 4850, 'loss/train': 1.0935460925102234} 01/27/2022 00:26:34 - INFO - codeparrot_training - Step 4851: {'lr': 0.0004956602848513581, 'samples': 931584, 'steps': 4851, 'loss/train': 0.8639451563358307} 01/27/2022 00:26:37 - INFO - codeparrot_training - Step 4852: {'lr': 0.0004956572488177075, 'samples': 931776, 'steps': 4852, 'loss/train': 0.9275532066822052} 01/27/2022 00:26:40 - INFO - codeparrot_training - Step 4853: {'lr': 0.0004956542117317393, 'samples': 931968, 'steps': 4853, 'loss/train': 0.5070791244506836} 01/27/2022 00:26:43 - INFO - codeparrot_training - Step 4854: {'lr': 0.0004956511735934665, 'samples': 932160, 'steps': 4854, 'loss/train': 0.9137432277202606} 01/27/2022 00:26:46 - INFO - codeparrot_training - Step 4855: {'lr': 0.000495648134402902, 'samples': 932352, 'steps': 4855, 'loss/train': 0.843874454498291} 01/27/2022 00:26:52 - INFO - codeparrot_training - Step 4856: {'lr': 0.0004956450941600589, 'samples': 932544, 'steps': 4856, 'loss/train': 0.7809615433216095} 01/27/2022 00:26:55 - INFO - codeparrot_training - Step 4857: {'lr': 0.0004956420528649504, 'samples': 932736, 'steps': 4857, 'loss/train': 0.5442244559526443} 01/27/2022 00:26:58 - INFO - codeparrot_training - Step 4858: {'lr': 0.0004956390105175892, 'samples': 932928, 'steps': 4858, 'loss/train': 0.43363259732723236} 01/27/2022 00:27:01 - INFO - codeparrot_training - Step 4859: {'lr': 0.0004956359671179885, 'samples': 933120, 'steps': 4859, 'loss/train': 0.5904340445995331} 01/27/2022 00:27:05 - INFO - codeparrot_training - Step 4860: {'lr': 0.0004956329226661612, 'samples': 933312, 'steps': 4860, 'loss/train': 1.028090000152588} 01/27/2022 00:27:08 - INFO - codeparrot_training - Step 4861: {'lr': 0.0004956298771621206, 'samples': 933504, 'steps': 4861, 'loss/train': 1.455135852098465} 01/27/2022 00:27:11 - INFO - codeparrot_training - Step 4862: {'lr': 0.0004956268306058795, 'samples': 933696, 'steps': 4862, 'loss/train': 0.9368384778499603} 01/27/2022 00:27:14 - INFO - codeparrot_training - Step 4863: {'lr': 0.0004956237829974511, 'samples': 933888, 'steps': 4863, 'loss/train': 1.1544573605060577} 01/27/2022 00:27:17 - INFO - codeparrot_training - Step 4864: {'lr': 0.0004956207343368485, 'samples': 934080, 'steps': 4864, 'loss/train': 0.6431125849485397} 01/27/2022 00:27:22 
- INFO - codeparrot_training - Step 4865: {'lr': 0.0004956176846240845, 'samples': 934272, 'steps': 4865, 'loss/train': 0.5093989670276642} 01/27/2022 00:27:25 - INFO - codeparrot_training - Step 4866: {'lr': 0.0004956146338591725, 'samples': 934464, 'steps': 4866, 'loss/train': 0.7220514714717865} 01/27/2022 00:27:28 - INFO - codeparrot_training - Step 4867: {'lr': 0.0004956115820421253, 'samples': 934656, 'steps': 4867, 'loss/train': 0.3980863094329834} 01/27/2022 00:27:31 - INFO - codeparrot_training - Step 4868: {'lr': 0.000495608529172956, 'samples': 934848, 'steps': 4868, 'loss/train': 0.9944806694984436} 01/27/2022 00:27:34 - INFO - codeparrot_training - Step 4869: {'lr': 0.000495605475251678, 'samples': 935040, 'steps': 4869, 'loss/train': 1.039499044418335} 01/27/2022 00:27:37 - INFO - codeparrot_training - Step 4870: {'lr': 0.000495602420278304, 'samples': 935232, 'steps': 4870, 'loss/train': 0.681800052523613} 01/27/2022 00:27:41 - INFO - codeparrot_training - Step 4871: {'lr': 0.0004955993642528471, 'samples': 935424, 'steps': 4871, 'loss/train': 0.9275394380092621} 01/27/2022 00:27:44 - INFO - codeparrot_training - Step 4872: {'lr': 0.0004955963071753206, 'samples': 935616, 'steps': 4872, 'loss/train': 0.9731263518333435} 01/27/2022 00:27:49 - INFO - codeparrot_training - Step 4873: {'lr': 0.0004955932490457375, 'samples': 935808, 'steps': 4873, 'loss/train': 1.462528109550476} 01/27/2022 00:27:52 - INFO - codeparrot_training - Step 4874: {'lr': 0.0004955901898641109, 'samples': 936000, 'steps': 4874, 'loss/train': 0.6530923694372177} 01/27/2022 00:27:55 - INFO - codeparrot_training - Step 4875: {'lr': 0.000495587129630454, 'samples': 936192, 'steps': 4875, 'loss/train': 0.41603171825408936} 01/27/2022 00:27:59 - INFO - codeparrot_training - Step 4876: {'lr': 0.0004955840683447797, 'samples': 936384, 'steps': 4876, 'loss/train': 0.6516867130994797} 01/27/2022 00:28:02 - INFO - codeparrot_training - Step 4877: {'lr': 0.0004955810060071012, 'samples': 936576, 'steps': 4877, 'loss/train': 1.4544979333877563} 01/27/2022 00:28:05 - INFO - codeparrot_training - Step 4878: {'lr': 0.0004955779426174318, 'samples': 936768, 'steps': 4878, 'loss/train': 0.9632821083068848} 01/27/2022 00:28:08 - INFO - codeparrot_training - Step 4879: {'lr': 0.0004955748781757844, 'samples': 936960, 'steps': 4879, 'loss/train': 1.1087526977062225} 01/27/2022 00:28:11 - INFO - codeparrot_training - Step 4880: {'lr': 0.0004955718126821722, 'samples': 937152, 'steps': 4880, 'loss/train': 0.6709884256124496} 01/27/2022 00:28:14 - INFO - codeparrot_training - Step 4881: {'lr': 0.0004955687461366083, 'samples': 937344, 'steps': 4881, 'loss/train': 0.7313467115163803} 01/27/2022 00:28:19 - INFO - codeparrot_training - Step 4882: {'lr': 0.000495565678539106, 'samples': 937536, 'steps': 4882, 'loss/train': 0.7391556054353714} 01/27/2022 00:28:22 - INFO - codeparrot_training - Step 4883: {'lr': 0.0004955626098896782, 'samples': 937728, 'steps': 4883, 'loss/train': 0.8687232434749603} 01/27/2022 00:28:25 - INFO - codeparrot_training - Step 4884: {'lr': 0.0004955595401883381, 'samples': 937920, 'steps': 4884, 'loss/train': 0.5523279458284378} 01/27/2022 00:28:28 - INFO - codeparrot_training - Step 4885: {'lr': 0.0004955564694350989, 'samples': 938112, 'steps': 4885, 'loss/train': 1.0853704512119293} 01/27/2022 00:28:31 - INFO - codeparrot_training - Step 4886: {'lr': 0.0004955533976299739, 'samples': 938304, 'steps': 4886, 'loss/train': 0.6225558668375015} 01/27/2022 00:28:34 - INFO - codeparrot_training - Step 4887: 
{'lr': 0.000495550324772976, 'samples': 938496, 'steps': 4887, 'loss/train': 0.5521571338176727} 01/27/2022 00:28:38 - INFO - codeparrot_training - Step 4888: {'lr': 0.0004955472508641186, 'samples': 938688, 'steps': 4888, 'loss/train': 1.183691829442978} 01/27/2022 00:28:41 - INFO - codeparrot_training - Step 4889: {'lr': 0.0004955441759034146, 'samples': 938880, 'steps': 4889, 'loss/train': 1.3187878131866455} 01/27/2022 00:28:44 - INFO - codeparrot_training - Step 4890: {'lr': 0.0004955410998908774, 'samples': 939072, 'steps': 4890, 'loss/train': 1.0246459543704987} 01/27/2022 00:28:48 - INFO - codeparrot_training - Step 4891: {'lr': 0.0004955380228265201, 'samples': 939264, 'steps': 4891, 'loss/train': 0.8545865714550018} 01/27/2022 00:28:51 - INFO - codeparrot_training - Step 4892: {'lr': 0.0004955349447103559, 'samples': 939456, 'steps': 4892, 'loss/train': 1.0014809668064117} 01/27/2022 00:28:55 - INFO - codeparrot_training - Step 4893: {'lr': 0.000495531865542398, 'samples': 939648, 'steps': 4893, 'loss/train': 0.1452586054801941} 01/27/2022 00:28:58 - INFO - codeparrot_training - Step 4894: {'lr': 0.0004955287853226594, 'samples': 939840, 'steps': 4894, 'loss/train': 0.7606708109378815} 01/27/2022 00:29:01 - INFO - codeparrot_training - Step 4895: {'lr': 0.0004955257040511534, 'samples': 940032, 'steps': 4895, 'loss/train': 1.0899037420749664} 01/27/2022 00:29:04 - INFO - codeparrot_training - Step 4896: {'lr': 0.0004955226217278934, 'samples': 940224, 'steps': 4896, 'loss/train': 0.5715540200471878} 01/27/2022 00:29:07 - INFO - codeparrot_training - Step 4897: {'lr': 0.0004955195383528926, 'samples': 940416, 'steps': 4897, 'loss/train': 0.6550563275814056} 01/27/2022 00:29:10 - INFO - codeparrot_training - Step 4898: {'lr': 0.0004955164539261638, 'samples': 940608, 'steps': 4898, 'loss/train': 0.8216629028320312} 01/27/2022 00:29:13 - INFO - codeparrot_training - Step 4899: {'lr': 0.0004955133684477205, 'samples': 940800, 'steps': 4899, 'loss/train': 1.270383596420288} 01/27/2022 00:29:19 - INFO - codeparrot_training - Step 4900: {'lr': 0.000495510281917576, 'samples': 940992, 'steps': 4900, 'loss/train': 0.7454513758420944} 01/27/2022 00:29:22 - INFO - codeparrot_training - Step 4901: {'lr': 0.0004955071943357433, 'samples': 941184, 'steps': 4901, 'loss/train': 1.4932959973812103} 01/27/2022 00:29:25 - INFO - codeparrot_training - Step 4902: {'lr': 0.0004955041057022358, 'samples': 941376, 'steps': 4902, 'loss/train': 0.6002615690231323} 01/27/2022 00:29:28 - INFO - codeparrot_training - Step 4903: {'lr': 0.0004955010160170667, 'samples': 941568, 'steps': 4903, 'loss/train': 1.1445910334587097} 01/27/2022 00:29:31 - INFO - codeparrot_training - Step 4904: {'lr': 0.0004954979252802491, 'samples': 941760, 'steps': 4904, 'loss/train': 0.7517771422863007} 01/27/2022 00:29:35 - INFO - codeparrot_training - Step 4905: {'lr': 0.0004954948334917965, 'samples': 941952, 'steps': 4905, 'loss/train': 1.1157358288764954} 01/27/2022 00:29:38 - INFO - codeparrot_training - Step 4906: {'lr': 0.0004954917406517218, 'samples': 942144, 'steps': 4906, 'loss/train': 0.2829056605696678} 01/27/2022 00:29:41 - INFO - codeparrot_training - Step 4907: {'lr': 0.0004954886467600386, 'samples': 942336, 'steps': 4907, 'loss/train': 0.6919961124658585} 01/27/2022 00:29:45 - INFO - codeparrot_training - Step 4908: {'lr': 0.0004954855518167599, 'samples': 942528, 'steps': 4908, 'loss/train': 0.8967989087104797} 01/27/2022 00:29:49 - INFO - codeparrot_training - Step 4909: {'lr': 0.000495482455821899, 'samples': 
942720, 'steps': 4909, 'loss/train': 1.0038146674633026} 01/27/2022 00:29:52 - INFO - codeparrot_training - Step 4910: {'lr': 0.0004954793587754694, 'samples': 942912, 'steps': 4910, 'loss/train': 0.7265303581953049} 01/27/2022 00:29:55 - INFO - codeparrot_training - Step 4911: {'lr': 0.000495476260677484, 'samples': 943104, 'steps': 4911, 'loss/train': 0.2867288812994957} 01/27/2022 00:29:58 - INFO - codeparrot_training - Step 4912: {'lr': 0.0004954731615279563, 'samples': 943296, 'steps': 4912, 'loss/train': 0.8271078765392303} 01/27/2022 00:30:01 - INFO - codeparrot_training - Step 4913: {'lr': 0.0004954700613268995, 'samples': 943488, 'steps': 4913, 'loss/train': 0.628940686583519} 01/27/2022 00:30:04 - INFO - codeparrot_training - Step 4914: {'lr': 0.0004954669600743269, 'samples': 943680, 'steps': 4914, 'loss/train': 1.6911964416503906} 01/27/2022 00:30:08 - INFO - codeparrot_training - Step 4915: {'lr': 0.0004954638577702519, 'samples': 943872, 'steps': 4915, 'loss/train': 1.3583308160305023} 01/27/2022 00:30:11 - INFO - codeparrot_training - Step 4916: {'lr': 0.0004954607544146875, 'samples': 944064, 'steps': 4916, 'loss/train': 0.7337427735328674} 01/27/2022 00:30:15 - INFO - codeparrot_training - Step 4917: {'lr': 0.0004954576500076472, 'samples': 944256, 'steps': 4917, 'loss/train': 1.1248127818107605} 01/27/2022 00:30:18 - INFO - codeparrot_training - Step 4918: {'lr': 0.0004954545445491444, 'samples': 944448, 'steps': 4918, 'loss/train': 0.8780820965766907} 01/27/2022 00:30:21 - INFO - codeparrot_training - Step 4919: {'lr': 0.0004954514380391921, 'samples': 944640, 'steps': 4919, 'loss/train': 0.8421580195426941} 01/27/2022 00:30:24 - INFO - codeparrot_training - Step 4920: {'lr': 0.0004954483304778039, 'samples': 944832, 'steps': 4920, 'loss/train': 1.2779438495635986} 01/27/2022 00:30:28 - INFO - codeparrot_training - Step 4921: {'lr': 0.0004954452218649929, 'samples': 945024, 'steps': 4921, 'loss/train': 1.0471968054771423} 01/27/2022 00:30:31 - INFO - codeparrot_training - Step 4922: {'lr': 0.0004954421122007727, 'samples': 945216, 'steps': 4922, 'loss/train': 0.8313927948474884} 01/27/2022 00:30:34 - INFO - codeparrot_training - Step 4923: {'lr': 0.0004954390014851563, 'samples': 945408, 'steps': 4923, 'loss/train': 0.8067930936813354} 01/27/2022 00:30:37 - INFO - codeparrot_training - Step 4924: {'lr': 0.0004954358897181571, 'samples': 945600, 'steps': 4924, 'loss/train': 0.6742963492870331} 01/27/2022 00:30:40 - INFO - codeparrot_training - Step 4925: {'lr': 0.0004954327768997885, 'samples': 945792, 'steps': 4925, 'loss/train': 0.7208969593048096} 01/27/2022 00:30:45 - INFO - codeparrot_training - Step 4926: {'lr': 0.0004954296630300638, 'samples': 945984, 'steps': 4926, 'loss/train': 0.8760599792003632} 01/27/2022 00:30:48 - INFO - codeparrot_training - Step 4927: {'lr': 0.0004954265481089965, 'samples': 946176, 'steps': 4927, 'loss/train': 1.0978595316410065} 01/27/2022 00:30:51 - INFO - codeparrot_training - Step 4928: {'lr': 0.0004954234321365998, 'samples': 946368, 'steps': 4928, 'loss/train': 0.1871832199394703} 01/27/2022 00:30:54 - INFO - codeparrot_training - Step 4929: {'lr': 0.0004954203151128868, 'samples': 946560, 'steps': 4929, 'loss/train': 1.2625381350517273} 01/27/2022 00:30:57 - INFO - codeparrot_training - Step 4930: {'lr': 0.0004954171970378713, 'samples': 946752, 'steps': 4930, 'loss/train': 1.0319899320602417} 01/27/2022 00:31:01 - INFO - codeparrot_training - Step 4931: {'lr': 0.0004954140779115664, 'samples': 946944, 'steps': 4931, 'loss/train': 
2.2932018041610718} 01/27/2022 00:31:04 - INFO - codeparrot_training - Step 4932: {'lr': 0.0004954109577339856, 'samples': 947136, 'steps': 4932, 'loss/train': 0.46784743666648865} 01/27/2022 00:31:07 - INFO - codeparrot_training - Step 4933: {'lr': 0.0004954078365051421, 'samples': 947328, 'steps': 4933, 'loss/train': 0.958724856376648} 01/27/2022 00:31:10 - INFO - codeparrot_training - Step 4934: {'lr': 0.0004954047142250494, 'samples': 947520, 'steps': 4934, 'loss/train': 0.5950245559215546} 01/27/2022 00:31:15 - INFO - codeparrot_training - Step 4935: {'lr': 0.0004954015908937208, 'samples': 947712, 'steps': 4935, 'loss/train': 0.18143795058131218} 01/27/2022 00:31:19 - INFO - codeparrot_training - Step 4936: {'lr': 0.0004953984665111697, 'samples': 947904, 'steps': 4936, 'loss/train': 1.116322249174118} 01/27/2022 00:31:22 - INFO - codeparrot_training - Step 4937: {'lr': 0.0004953953410774095, 'samples': 948096, 'steps': 4937, 'loss/train': 1.3112629652023315} 01/27/2022 00:31:25 - INFO - codeparrot_training - Step 4938: {'lr': 0.0004953922145924535, 'samples': 948288, 'steps': 4938, 'loss/train': 0.43383586406707764} 01/27/2022 00:31:28 - INFO - codeparrot_training - Step 4939: {'lr': 0.0004953890870563153, 'samples': 948480, 'steps': 4939, 'loss/train': 0.5383943617343903} 01/27/2022 00:31:31 - INFO - codeparrot_training - Step 4940: {'lr': 0.0004953859584690081, 'samples': 948672, 'steps': 4940, 'loss/train': 0.9284209907054901} 01/27/2022 00:31:34 - INFO - codeparrot_training - Step 4941: {'lr': 0.0004953828288305454, 'samples': 948864, 'steps': 4941, 'loss/train': 0.6111751198768616} 01/27/2022 00:31:38 - INFO - codeparrot_training - Step 4942: {'lr': 0.0004953796981409407, 'samples': 949056, 'steps': 4942, 'loss/train': 0.732147753238678} 01/27/2022 00:31:41 - INFO - codeparrot_training - Step 4943: {'lr': 0.0004953765664002071, 'samples': 949248, 'steps': 4943, 'loss/train': 0.32419509440660477} 01/27/2022 00:31:45 - INFO - codeparrot_training - Step 4944: {'lr': 0.0004953734336083582, 'samples': 949440, 'steps': 4944, 'loss/train': 1.0334928631782532} 01/27/2022 00:31:48 - INFO - codeparrot_training - Step 4945: {'lr': 0.0004953702997654076, 'samples': 949632, 'steps': 4945, 'loss/train': 0.8923453688621521} 01/27/2022 00:31:52 - INFO - codeparrot_training - Step 4946: {'lr': 0.0004953671648713683, 'samples': 949824, 'steps': 4946, 'loss/train': 0.8973584175109863} 01/27/2022 00:31:55 - INFO - codeparrot_training - Step 4947: {'lr': 0.0004953640289262542, 'samples': 950016, 'steps': 4947, 'loss/train': 0.8718738555908203} 01/27/2022 00:31:58 - INFO - codeparrot_training - Step 4948: {'lr': 0.0004953608919300784, 'samples': 950208, 'steps': 4948, 'loss/train': 0.8859338164329529} 01/27/2022 00:32:01 - INFO - codeparrot_training - Step 4949: {'lr': 0.0004953577538828546, 'samples': 950400, 'steps': 4949, 'loss/train': 0.9371341466903687} 01/27/2022 00:32:04 - INFO - codeparrot_training - Step 4950: {'lr': 0.0004953546147845959, 'samples': 950592, 'steps': 4950, 'loss/train': 1.228497326374054} 01/27/2022 00:32:07 - INFO - codeparrot_training - Step 4951: {'lr': 0.0004953514746353161, 'samples': 950784, 'steps': 4951, 'loss/train': 0.7837587296962738} 01/27/2022 00:32:11 - INFO - codeparrot_training - Step 4952: {'lr': 0.0004953483334350283, 'samples': 950976, 'steps': 4952, 'loss/train': 1.2446864247322083} 01/27/2022 00:32:16 - INFO - codeparrot_training - Step 4953: {'lr': 0.0004953451911837463, 'samples': 951168, 'steps': 4953, 'loss/train': 1.775108277797699} 01/27/2022 
00:32:19 - INFO - codeparrot_training - Step 4954: {'lr': 0.0004953420478814834, 'samples': 951360, 'steps': 4954, 'loss/train': 0.6683933436870575} 01/27/2022 00:32:22 - INFO - codeparrot_training - Step 4955: {'lr': 0.000495338903528253, 'samples': 951552, 'steps': 4955, 'loss/train': 1.3803775906562805} 01/27/2022 00:32:26 - INFO - codeparrot_training - Step 4956: {'lr': 0.0004953357581240686, 'samples': 951744, 'steps': 4956, 'loss/train': 0.8443701267242432} 01/27/2022 00:32:29 - INFO - codeparrot_training - Step 4957: {'lr': 0.0004953326116689438, 'samples': 951936, 'steps': 4957, 'loss/train': 1.1462601721286774} 01/27/2022 00:32:32 - INFO - codeparrot_training - Step 4958: {'lr': 0.000495329464162892, 'samples': 952128, 'steps': 4958, 'loss/train': 0.9192580282688141} 01/27/2022 00:32:35 - INFO - codeparrot_training - Step 4959: {'lr': 0.0004953263156059266, 'samples': 952320, 'steps': 4959, 'loss/train': 0.7915876507759094} 01/27/2022 00:32:38 - INFO - codeparrot_training - Step 4960: {'lr': 0.0004953231659980613, 'samples': 952512, 'steps': 4960, 'loss/train': 1.10595703125} 01/27/2022 00:32:41 - INFO - codeparrot_training - Step 4961: {'lr': 0.0004953200153393094, 'samples': 952704, 'steps': 4961, 'loss/train': 1.222559541463852} 01/27/2022 00:32:46 - INFO - codeparrot_training - Step 4962: {'lr': 0.0004953168636296845, 'samples': 952896, 'steps': 4962, 'loss/train': 0.2418917790055275} 01/27/2022 00:32:49 - INFO - codeparrot_training - Step 4963: {'lr': 0.0004953137108691999, 'samples': 953088, 'steps': 4963, 'loss/train': 0.6505532115697861} 01/27/2022 00:32:52 - INFO - codeparrot_training - Step 4964: {'lr': 0.0004953105570578693, 'samples': 953280, 'steps': 4964, 'loss/train': 0.5530595183372498} 01/27/2022 00:32:55 - INFO - codeparrot_training - Step 4965: {'lr': 0.0004953074021957063, 'samples': 953472, 'steps': 4965, 'loss/train': 0.6431242525577545} 01/27/2022 00:32:59 - INFO - codeparrot_training - Step 4966: {'lr': 0.0004953042462827242, 'samples': 953664, 'steps': 4966, 'loss/train': 1.1224029064178467} 01/27/2022 00:33:02 - INFO - codeparrot_training - Step 4967: {'lr': 0.0004953010893189365, 'samples': 953856, 'steps': 4967, 'loss/train': 0.9259015917778015} 01/27/2022 00:33:05 - INFO - codeparrot_training - Step 4968: {'lr': 0.000495297931304357, 'samples': 954048, 'steps': 4968, 'loss/train': 0.7878577709197998} 01/27/2022 00:33:08 - INFO - codeparrot_training - Step 4969: {'lr': 0.000495294772238999, 'samples': 954240, 'steps': 4969, 'loss/train': 0.5581444054841995} 01/27/2022 00:33:11 - INFO - codeparrot_training - Step 4970: {'lr': 0.000495291612122876, 'samples': 954432, 'steps': 4970, 'loss/train': 0.6931990832090378} 01/27/2022 00:33:16 - INFO - codeparrot_training - Step 4971: {'lr': 0.0004952884509560017, 'samples': 954624, 'steps': 4971, 'loss/train': 0.7869375050067902} 01/27/2022 00:33:19 - INFO - codeparrot_training - Step 4972: {'lr': 0.0004952852887383895, 'samples': 954816, 'steps': 4972, 'loss/train': 0.41494596004486084} 01/27/2022 00:33:22 - INFO - codeparrot_training - Step 4973: {'lr': 0.0004952821254700531, 'samples': 955008, 'steps': 4973, 'loss/train': 1.3406065106391907} 01/27/2022 00:33:25 - INFO - codeparrot_training - Step 4974: {'lr': 0.0004952789611510059, 'samples': 955200, 'steps': 4974, 'loss/train': 0.8880027830600739} 01/27/2022 00:33:28 - INFO - codeparrot_training - Step 4975: {'lr': 0.0004952757957812615, 'samples': 955392, 'steps': 4975, 'loss/train': 1.1286392211914062} 01/27/2022 00:33:31 - INFO - codeparrot_training - Step 
4976: {'lr': 0.0004952726293608335, 'samples': 955584, 'steps': 4976, 'loss/train': 0.8646506667137146} 01/27/2022 00:33:35 - INFO - codeparrot_training - Step 4977: {'lr': 0.0004952694618897354, 'samples': 955776, 'steps': 4977, 'loss/train': 1.2194461226463318} 01/27/2022 00:33:38 - INFO - codeparrot_training - Step 4978: {'lr': 0.0004952662933679809, 'samples': 955968, 'steps': 4978, 'loss/train': 0.7595382928848267} 01/27/2022 00:33:41 - INFO - codeparrot_training - Step 4979: {'lr': 0.0004952631237955835, 'samples': 956160, 'steps': 4979, 'loss/train': 1.1463501155376434} 01/27/2022 00:33:46 - INFO - codeparrot_training - Step 4980: {'lr': 0.0004952599531725567, 'samples': 956352, 'steps': 4980, 'loss/train': 0.9391747713088989} 01/27/2022 00:33:49 - INFO - codeparrot_training - Step 4981: {'lr': 0.0004952567814989141, 'samples': 956544, 'steps': 4981, 'loss/train': 0.7705775499343872} 01/27/2022 00:33:53 - INFO - codeparrot_training - Step 4982: {'lr': 0.0004952536087746693, 'samples': 956736, 'steps': 4982, 'loss/train': 1.1717070043087006} 01/27/2022 00:33:56 - INFO - codeparrot_training - Step 4983: {'lr': 0.000495250434999836, 'samples': 956928, 'steps': 4983, 'loss/train': 0.6640839725732803} 01/27/2022 00:33:59 - INFO - codeparrot_training - Step 4984: {'lr': 0.0004952472601744277, 'samples': 957120, 'steps': 4984, 'loss/train': 1.069059133529663} 01/27/2022 00:34:02 - INFO - codeparrot_training - Step 4985: {'lr': 0.000495244084298458, 'samples': 957312, 'steps': 4985, 'loss/train': 0.6541790664196014} 01/27/2022 00:34:05 - INFO - codeparrot_training - Step 4986: {'lr': 0.0004952409073719405, 'samples': 957504, 'steps': 4986, 'loss/train': 0.33056674897670746} 01/27/2022 00:34:08 - INFO - codeparrot_training - Step 4987: {'lr': 0.0004952377293948888, 'samples': 957696, 'steps': 4987, 'loss/train': 1.2186055183410645} 01/27/2022 00:34:11 - INFO - codeparrot_training - Step 4988: {'lr': 0.0004952345503673166, 'samples': 957888, 'steps': 4988, 'loss/train': 0.8504677712917328} 01/27/2022 00:34:16 - INFO - codeparrot_training - Step 4989: {'lr': 0.0004952313702892375, 'samples': 958080, 'steps': 4989, 'loss/train': 1.3194620311260223} 01/27/2022 00:34:19 - INFO - codeparrot_training - Step 4990: {'lr': 0.0004952281891606649, 'samples': 958272, 'steps': 4990, 'loss/train': 1.0387072563171387} 01/27/2022 00:34:22 - INFO - codeparrot_training - Step 4991: {'lr': 0.0004952250069816127, 'samples': 958464, 'steps': 4991, 'loss/train': 0.873846173286438} 01/27/2022 00:34:25 - INFO - codeparrot_training - Step 4992: {'lr': 0.0004952218237520945, 'samples': 958656, 'steps': 4992, 'loss/train': 1.2310647368431091} 01/27/2022 00:34:28 - INFO - codeparrot_training - Step 4993: {'lr': 0.0004952186394721239, 'samples': 958848, 'steps': 4993, 'loss/train': 0.9336924254894257} 01/27/2022 00:34:31 - INFO - codeparrot_training - Step 4994: {'lr': 0.0004952154541417144, 'samples': 959040, 'steps': 4994, 'loss/train': 0.11516109108924866} 01/27/2022 00:34:35 - INFO - codeparrot_training - Step 4995: {'lr': 0.0004952122677608798, 'samples': 959232, 'steps': 4995, 'loss/train': 0.6941595822572708} 01/27/2022 00:34:38 - INFO - codeparrot_training - Step 4996: {'lr': 0.0004952090803296337, 'samples': 959424, 'steps': 4996, 'loss/train': 0.7862244844436646} 01/27/2022 00:34:42 - INFO - codeparrot_training - Step 4997: {'lr': 0.0004952058918479899, 'samples': 959616, 'steps': 4997, 'loss/train': 0.8863954246044159} 01/27/2022 00:34:45 - INFO - codeparrot_training - Step 4998: {'lr': 0.0004952027023159617, 
'samples': 959808, 'steps': 4998, 'loss/train': 0.6749326139688492} 01/27/2022 00:34:48 - INFO - codeparrot_training - Step 4999: {'lr': 0.0004951995117335631, 'samples': 960000, 'steps': 4999, 'loss/train': 0.9696817696094513} 01/27/2022 00:34:51 - INFO - codeparrot_training - Step 5000: {'lr': 0.0004951963201008077, 'samples': 960192, 'steps': 5000, 'loss/train': 0.6660680025815964} 01/27/2022 00:34:55 - INFO - codeparrot_training - Step 5001: {'lr': 0.000495193127417709, 'samples': 960384, 'steps': 5001, 'loss/train': 0.8052612841129303} 01/27/2022 00:34:58 - INFO - codeparrot_training - Step 5002: {'lr': 0.0004951899336842809, 'samples': 960576, 'steps': 5002, 'loss/train': 0.9412990808486938} 01/27/2022 00:35:01 - INFO - codeparrot_training - Step 5003: {'lr': 0.0004951867389005369, 'samples': 960768, 'steps': 5003, 'loss/train': 0.645723357796669} 01/27/2022 00:35:04 - INFO - codeparrot_training - Step 5004: {'lr': 0.0004951835430664908, 'samples': 960960, 'steps': 5004, 'loss/train': 0.5360070616006851} 01/27/2022 00:35:07 - INFO - codeparrot_training - Step 5005: {'lr': 0.0004951803461821562, 'samples': 961152, 'steps': 5005, 'loss/train': 1.2135450839996338} 01/27/2022 00:35:12 - INFO - codeparrot_training - Step 5006: {'lr': 0.0004951771482475469, 'samples': 961344, 'steps': 5006, 'loss/train': 0.8647854030132294} 01/27/2022 00:35:15 - INFO - codeparrot_training - Step 5007: {'lr': 0.0004951739492626766, 'samples': 961536, 'steps': 5007, 'loss/train': 0.8698221445083618} 01/27/2022 00:35:18 - INFO - codeparrot_training - Step 5008: {'lr': 0.0004951707492275589, 'samples': 961728, 'steps': 5008, 'loss/train': 0.9634445607662201} 01/27/2022 00:35:21 - INFO - codeparrot_training - Step 5009: {'lr': 0.0004951675481422075, 'samples': 961920, 'steps': 5009, 'loss/train': 1.127245455980301} 01/27/2022 00:35:24 - INFO - codeparrot_training - Step 5010: {'lr': 0.0004951643460066363, 'samples': 962112, 'steps': 5010, 'loss/train': 0.8100754022598267} 01/27/2022 00:35:27 - INFO - codeparrot_training - Step 5011: {'lr': 0.0004951611428208589, 'samples': 962304, 'steps': 5011, 'loss/train': 1.534365177154541} 01/27/2022 00:35:31 - INFO - codeparrot_training - Step 5012: {'lr': 0.0004951579385848889, 'samples': 962496, 'steps': 5012, 'loss/train': 1.1759101152420044} 01/27/2022 00:35:34 - INFO - codeparrot_training - Step 5013: {'lr': 0.0004951547332987401, 'samples': 962688, 'steps': 5013, 'loss/train': 0.7812463045120239} 01/27/2022 00:35:37 - INFO - codeparrot_training - Step 5014: {'lr': 0.0004951515269624265, 'samples': 962880, 'steps': 5014, 'loss/train': 1.106719046831131} 01/27/2022 00:35:43 - INFO - codeparrot_training - Step 5015: {'lr': 0.0004951483195759614, 'samples': 963072, 'steps': 5015, 'loss/train': 0.4857836812734604} 01/27/2022 00:35:46 - INFO - codeparrot_training - Step 5016: {'lr': 0.0004951451111393588, 'samples': 963264, 'steps': 5016, 'loss/train': 0.5077658146619797} 01/27/2022 00:35:49 - INFO - codeparrot_training - Step 5017: {'lr': 0.0004951419016526324, 'samples': 963456, 'steps': 5017, 'loss/train': 0.6269199103116989} 01/27/2022 00:35:53 - INFO - codeparrot_training - Step 5018: {'lr': 0.0004951386911157959, 'samples': 963648, 'steps': 5018, 'loss/train': 0.9144598245620728} 01/27/2022 00:35:56 - INFO - codeparrot_training - Step 5019: {'lr': 0.0004951354795288631, 'samples': 963840, 'steps': 5019, 'loss/train': 0.8757360577583313} 01/27/2022 00:35:59 - INFO - codeparrot_training - Step 5020: {'lr': 0.0004951322668918477, 'samples': 964032, 'steps': 5020, 
'loss/train': 0.7320632189512253} 01/27/2022 00:36:02 - INFO - codeparrot_training - Step 5021: {'lr': 0.0004951290532047637, 'samples': 964224, 'steps': 5021, 'loss/train': 1.001897782087326} 01/27/2022 00:36:05 - INFO - codeparrot_training - Step 5022: {'lr': 0.0004951258384676244, 'samples': 964416, 'steps': 5022, 'loss/train': 0.8489722609519958} 01/27/2022 00:36:08 - INFO - codeparrot_training - Step 5023: {'lr': 0.0004951226226804441, 'samples': 964608, 'steps': 5023, 'loss/train': 0.5535428524017334} 01/27/2022 00:36:13 - INFO - codeparrot_training - Step 5024: {'lr': 0.0004951194058432361, 'samples': 964800, 'steps': 5024, 'loss/train': 1.0927357971668243} 01/27/2022 00:36:16 - INFO - codeparrot_training - Step 5025: {'lr': 0.0004951161879560146, 'samples': 964992, 'steps': 5025, 'loss/train': 1.156048983335495} 01/27/2022 00:36:19 - INFO - codeparrot_training - Step 5026: {'lr': 0.000495112969018793, 'samples': 965184, 'steps': 5026, 'loss/train': 0.9512095749378204} 01/27/2022 00:36:22 - INFO - codeparrot_training - Step 5027: {'lr': 0.0004951097490315853, 'samples': 965376, 'steps': 5027, 'loss/train': 1.1079666316509247} 01/27/2022 00:36:25 - INFO - codeparrot_training - Step 5028: {'lr': 0.0004951065279944054, 'samples': 965568, 'steps': 5028, 'loss/train': 0.5361309796571732} 01/27/2022 00:36:28 - INFO - codeparrot_training - Step 5029: {'lr': 0.0004951033059072668, 'samples': 965760, 'steps': 5029, 'loss/train': 1.2049629092216492} 01/27/2022 00:36:32 - INFO - codeparrot_training - Step 5030: {'lr': 0.0004951000827701836, 'samples': 965952, 'steps': 5030, 'loss/train': 0.7746974229812622} 01/27/2022 00:36:35 - INFO - codeparrot_training - Step 5031: {'lr': 0.0004950968585831694, 'samples': 966144, 'steps': 5031, 'loss/train': 0.8055372834205627} 01/27/2022 00:36:38 - INFO - codeparrot_training - Step 5032: {'lr': 0.0004950936333462381, 'samples': 966336, 'steps': 5032, 'loss/train': 1.1944166421890259} 01/27/2022 00:36:42 - INFO - codeparrot_training - Step 5033: {'lr': 0.0004950904070594036, 'samples': 966528, 'steps': 5033, 'loss/train': 0.9735545217990875} 01/27/2022 00:36:45 - INFO - codeparrot_training - Step 5034: {'lr': 0.0004950871797226795, 'samples': 966720, 'steps': 5034, 'loss/train': 0.9429353177547455} 01/27/2022 00:36:48 - INFO - codeparrot_training - Step 5035: {'lr': 0.0004950839513360798, 'samples': 966912, 'steps': 5035, 'loss/train': 0.7062765955924988} 01/27/2022 00:36:52 - INFO - codeparrot_training - Step 5036: {'lr': 0.0004950807218996182, 'samples': 967104, 'steps': 5036, 'loss/train': 0.5496766716241837} 01/27/2022 00:36:55 - INFO - codeparrot_training - Step 5037: {'lr': 0.0004950774914133086, 'samples': 967296, 'steps': 5037, 'loss/train': 0.7347442209720612} 01/27/2022 00:36:58 - INFO - codeparrot_training - Step 5038: {'lr': 0.0004950742598771649, 'samples': 967488, 'steps': 5038, 'loss/train': 1.3007748126983643} 01/27/2022 00:37:01 - INFO - codeparrot_training - Step 5039: {'lr': 0.0004950710272912009, 'samples': 967680, 'steps': 5039, 'loss/train': 0.731753334403038} 01/27/2022 00:37:04 - INFO - codeparrot_training - Step 5040: {'lr': 0.0004950677936554305, 'samples': 967872, 'steps': 5040, 'loss/train': 1.090220868587494} 01/27/2022 00:37:07 - INFO - codeparrot_training - Step 5041: {'lr': 0.0004950645589698674, 'samples': 968064, 'steps': 5041, 'loss/train': 0.7718667984008789} 01/27/2022 00:37:13 - INFO - codeparrot_training - Step 5042: {'lr': 0.0004950613232345256, 'samples': 968256, 'steps': 5042, 'loss/train': 0.320449098944664} 
01/27/2022 00:37:16 - INFO - codeparrot_training - Step 5043: {'lr': 0.0004950580864494188, 'samples': 968448, 'steps': 5043, 'loss/train': 0.6004405170679092} 01/27/2022 00:37:19 - INFO - codeparrot_training - Step 5044: {'lr': 0.0004950548486145611, 'samples': 968640, 'steps': 5044, 'loss/train': 0.814173549413681} 01/27/2022 00:37:22 - INFO - codeparrot_training - Step 5045: {'lr': 0.0004950516097299662, 'samples': 968832, 'steps': 5045, 'loss/train': 0.6195739209651947} 01/27/2022 00:37:25 - INFO - codeparrot_training - Step 5046: {'lr': 0.000495048369795648, 'samples': 969024, 'steps': 5046, 'loss/train': 1.1597760915756226} 01/27/2022 00:37:28 - INFO - codeparrot_training - Step 5047: {'lr': 0.0004950451288116204, 'samples': 969216, 'steps': 5047, 'loss/train': 0.8055923581123352} 01/27/2022 00:37:32 - INFO - codeparrot_training - Step 5048: {'lr': 0.0004950418867778973, 'samples': 969408, 'steps': 5048, 'loss/train': 1.1872000694274902} 01/27/2022 00:37:35 - INFO - codeparrot_training - Step 5049: {'lr': 0.0004950386436944925, 'samples': 969600, 'steps': 5049, 'loss/train': 1.0092271864414215} 01/27/2022 00:37:38 - INFO - codeparrot_training - Step 5050: {'lr': 0.0004950353995614201, 'samples': 969792, 'steps': 5050, 'loss/train': 0.826736569404602} 01/27/2022 00:37:43 - INFO - codeparrot_training - Step 5051: {'lr': 0.0004950321543786937, 'samples': 969984, 'steps': 5051, 'loss/train': 1.0925350785255432} 01/27/2022 00:37:46 - INFO - codeparrot_training - Step 5052: {'lr': 0.0004950289081463273, 'samples': 970176, 'steps': 5052, 'loss/train': 1.0135610103607178} 01/27/2022 00:37:49 - INFO - codeparrot_training - Step 5053: {'lr': 0.0004950256608643351, 'samples': 970368, 'steps': 5053, 'loss/train': 0.21328595280647278} 01/27/2022 00:37:52 - INFO - codeparrot_training - Step 5054: {'lr': 0.0004950224125327307, 'samples': 970560, 'steps': 5054, 'loss/train': 0.9890313148498535} 01/27/2022 00:37:55 - INFO - codeparrot_training - Step 5055: {'lr': 0.000495019163151528, 'samples': 970752, 'steps': 5055, 'loss/train': 0.8549079895019531} 01/27/2022 00:37:58 - INFO - codeparrot_training - Step 5056: {'lr': 0.0004950159127207411, 'samples': 970944, 'steps': 5056, 'loss/train': 1.0344591736793518} 01/27/2022 00:38:01 - INFO - codeparrot_training - Step 5057: {'lr': 0.0004950126612403838, 'samples': 971136, 'steps': 5057, 'loss/train': 1.129236102104187} 01/27/2022 00:38:05 - INFO - codeparrot_training - Step 5058: {'lr': 0.00049500940871047, 'samples': 971328, 'steps': 5058, 'loss/train': 0.7098401337862015} 01/27/2022 00:38:08 - INFO - codeparrot_training - Step 5059: {'lr': 0.0004950061551310138, 'samples': 971520, 'steps': 5059, 'loss/train': 0.8370873034000397} 01/27/2022 00:38:14 - INFO - codeparrot_training - Step 5060: {'lr': 0.0004950029005020289, 'samples': 971712, 'steps': 5060, 'loss/train': 0.4835786372423172} 01/27/2022 00:38:17 - INFO - codeparrot_training - Step 5061: {'lr': 0.0004949996448235294, 'samples': 971904, 'steps': 5061, 'loss/train': 0.9706891179084778} 01/27/2022 00:38:20 - INFO - codeparrot_training - Step 5062: {'lr': 0.0004949963880955293, 'samples': 972096, 'steps': 5062, 'loss/train': 1.0558454096317291} 01/27/2022 00:38:23 - INFO - codeparrot_training - Step 5063: {'lr': 0.0004949931303180424, 'samples': 972288, 'steps': 5063, 'loss/train': 0.6600755900144577} 01/27/2022 00:38:26 - INFO - codeparrot_training - Step 5064: {'lr': 0.0004949898714910828, 'samples': 972480, 'steps': 5064, 'loss/train': 0.8326624631881714} 01/27/2022 00:38:30 - INFO - 
codeparrot_training - Step 5065: {'lr': 0.0004949866116146643, 'samples': 972672, 'steps': 5065, 'loss/train': 0.7476957142353058} 01/27/2022 00:38:33 - INFO - codeparrot_training - Step 5066: {'lr': 0.000494983350688801, 'samples': 972864, 'steps': 5066, 'loss/train': 0.5822794586420059} 01/27/2022 00:38:36 - INFO - codeparrot_training - Step 5067: {'lr': 0.0004949800887135067, 'samples': 973056, 'steps': 5067, 'loss/train': 0.9614718854427338} 01/27/2022 00:38:39 - INFO - codeparrot_training - Step 5068: {'lr': 0.0004949768256887956, 'samples': 973248, 'steps': 5068, 'loss/train': 1.2199425995349884} 01/27/2022 00:38:43 - INFO - codeparrot_training - Step 5069: {'lr': 0.0004949735616146816, 'samples': 973440, 'steps': 5069, 'loss/train': 0.5660669356584549} 01/27/2022 00:38:47 - INFO - codeparrot_training - Step 5070: {'lr': 0.0004949702964911787, 'samples': 973632, 'steps': 5070, 'loss/train': 0.918015718460083} 01/27/2022 00:38:50 - INFO - codeparrot_training - Step 5071: {'lr': 0.0004949670303183006, 'samples': 973824, 'steps': 5071, 'loss/train': 0.7792814075946808} 01/27/2022 00:38:53 - INFO - codeparrot_training - Step 5072: {'lr': 0.0004949637630960618, 'samples': 974016, 'steps': 5072, 'loss/train': 0.7274968028068542} 01/27/2022 00:38:56 - INFO - codeparrot_training - Step 5073: {'lr': 0.0004949604948244758, 'samples': 974208, 'steps': 5073, 'loss/train': 0.3239792659878731} 01/27/2022 00:38:59 - INFO - codeparrot_training - Step 5074: {'lr': 0.0004949572255035569, 'samples': 974400, 'steps': 5074, 'loss/train': 0.7221634536981583} 01/27/2022 00:39:02 - INFO - codeparrot_training - Step 5075: {'lr': 0.0004949539551333191, 'samples': 974592, 'steps': 5075, 'loss/train': 1.052912324666977} 01/27/2022 00:39:05 - INFO - codeparrot_training - Step 5076: {'lr': 0.0004949506837137763, 'samples': 974784, 'steps': 5076, 'loss/train': 1.071209192276001} 01/27/2022 00:39:08 - INFO - codeparrot_training - Step 5077: {'lr': 0.0004949474112449424, 'samples': 974976, 'steps': 5077, 'loss/train': 1.140217512845993} 01/27/2022 00:39:13 - INFO - codeparrot_training - Step 5078: {'lr': 0.0004949441377268318, 'samples': 975168, 'steps': 5078, 'loss/train': 0.6983357816934586} 01/27/2022 00:39:16 - INFO - codeparrot_training - Step 5079: {'lr': 0.0004949408631594582, 'samples': 975360, 'steps': 5079, 'loss/train': 0.7304432094097137} 01/27/2022 00:39:19 - INFO - codeparrot_training - Step 5080: {'lr': 0.0004949375875428357, 'samples': 975552, 'steps': 5080, 'loss/train': 0.7216688990592957} 01/27/2022 00:39:22 - INFO - codeparrot_training - Step 5081: {'lr': 0.0004949343108769784, 'samples': 975744, 'steps': 5081, 'loss/train': 1.1922003328800201} 01/27/2022 00:39:25 - INFO - codeparrot_training - Step 5082: {'lr': 0.0004949310331619002, 'samples': 975936, 'steps': 5082, 'loss/train': 1.2782810926437378} 01/27/2022 00:39:29 - INFO - codeparrot_training - Step 5083: {'lr': 0.0004949277543976153, 'samples': 976128, 'steps': 5083, 'loss/train': 0.5897771716117859} 01/27/2022 00:39:32 - INFO - codeparrot_training - Step 5084: {'lr': 0.0004949244745841377, 'samples': 976320, 'steps': 5084, 'loss/train': 0.8826940655708313} 01/27/2022 00:39:35 - INFO - codeparrot_training - Step 5085: {'lr': 0.0004949211937214814, 'samples': 976512, 'steps': 5085, 'loss/train': 1.7480455040931702} 01/27/2022 00:39:38 - INFO - codeparrot_training - Step 5086: {'lr': 0.0004949179118096604, 'samples': 976704, 'steps': 5086, 'loss/train': 1.1464395225048065} 01/27/2022 00:39:43 - INFO - codeparrot_training - Step 5087: {'lr': 
0.0004949146288486889, 'samples': 976896, 'steps': 5087, 'loss/train': 0.5755658894777298} 01/27/2022 00:39:46 - INFO - codeparrot_training - Step 5088: {'lr': 0.0004949113448385809, 'samples': 977088, 'steps': 5088, 'loss/train': 1.1803382635116577} 01/27/2022 00:39:49 - INFO - codeparrot_training - Step 5089: {'lr': 0.0004949080597793505, 'samples': 977280, 'steps': 5089, 'loss/train': 0.6827146410942078} 01/27/2022 00:39:53 - INFO - codeparrot_training - Step 5090: {'lr': 0.0004949047736710116, 'samples': 977472, 'steps': 5090, 'loss/train': 0.6475501656532288} 01/27/2022 00:39:56 - INFO - codeparrot_training - Step 5091: {'lr': 0.0004949014865135786, 'samples': 977664, 'steps': 5091, 'loss/train': 0.7743928134441376} 01/27/2022 00:39:59 - INFO - codeparrot_training - Step 5092: {'lr': 0.0004948981983070652, 'samples': 977856, 'steps': 5092, 'loss/train': 0.6227437555789948} 01/27/2022 00:40:02 - INFO - codeparrot_training - Step 5093: {'lr': 0.0004948949090514858, 'samples': 978048, 'steps': 5093, 'loss/train': 0.6974231600761414} 01/27/2022 00:40:05 - INFO - codeparrot_training - Step 5094: {'lr': 0.0004948916187468544, 'samples': 978240, 'steps': 5094, 'loss/train': 0.45488977432250977} 01/27/2022 00:40:08 - INFO - codeparrot_training - Step 5095: {'lr': 0.000494888327393185, 'samples': 978432, 'steps': 5095, 'loss/train': 0.9375851154327393} 01/27/2022 00:40:13 - INFO - codeparrot_training - Step 5096: {'lr': 0.0004948850349904919, 'samples': 978624, 'steps': 5096, 'loss/train': 0.20755831897258759} 01/27/2022 00:40:16 - INFO - codeparrot_training - Step 5097: {'lr': 0.000494881741538789, 'samples': 978816, 'steps': 5097, 'loss/train': 0.6137009114027023} 01/27/2022 00:40:19 - INFO - codeparrot_training - Step 5098: {'lr': 0.0004948784470380904, 'samples': 979008, 'steps': 5098, 'loss/train': 0.5504492372274399} 01/27/2022 00:40:22 - INFO - codeparrot_training - Step 5099: {'lr': 0.0004948751514884103, 'samples': 979200, 'steps': 5099, 'loss/train': 1.2064440250396729} 01/27/2022 00:40:25 - INFO - codeparrot_training - Step 5100: {'lr': 0.0004948718548897628, 'samples': 979392, 'steps': 5100, 'loss/train': 0.8865601122379303} 01/27/2022 00:40:28 - INFO - codeparrot_training - Step 5101: {'lr': 0.0004948685572421621, 'samples': 979584, 'steps': 5101, 'loss/train': 0.27745456248521805} 01/27/2022 00:40:32 - INFO - codeparrot_training - Step 5102: {'lr': 0.0004948652585456222, 'samples': 979776, 'steps': 5102, 'loss/train': 0.8021194338798523} 01/27/2022 00:40:35 - INFO - codeparrot_training - Step 5103: {'lr': 0.0004948619588001574, 'samples': 979968, 'steps': 5103, 'loss/train': 0.3882061690092087} 01/27/2022 00:40:38 - INFO - codeparrot_training - Step 5104: {'lr': 0.0004948586580057816, 'samples': 980160, 'steps': 5104, 'loss/train': 0.7403820008039474} 01/27/2022 00:40:42 - INFO - codeparrot_training - Step 5105: {'lr': 0.0004948553561625091, 'samples': 980352, 'steps': 5105, 'loss/train': 0.8722383677959442} 01/27/2022 00:40:46 - INFO - codeparrot_training - Step 5106: {'lr': 0.000494852053270354, 'samples': 980544, 'steps': 5106, 'loss/train': 0.8381020724773407} 01/27/2022 00:40:49 - INFO - codeparrot_training - Step 5107: {'lr': 0.0004948487493293305, 'samples': 980736, 'steps': 5107, 'loss/train': 0.9682667255401611} 01/27/2022 00:40:52 - INFO - codeparrot_training - Step 5108: {'lr': 0.0004948454443394527, 'samples': 980928, 'steps': 5108, 'loss/train': 1.229188710451126} 01/27/2022 00:40:55 - INFO - codeparrot_training - Step 5109: {'lr': 0.0004948421383007347, 'samples': 
981120, 'steps': 5109, 'loss/train': 1.2268009185791016} 01/27/2022 00:40:58 - INFO - codeparrot_training - Step 5110: {'lr': 0.0004948388312131908, 'samples': 981312, 'steps': 5110, 'loss/train': 1.3365297317504883} 01/27/2022 00:41:01 - INFO - codeparrot_training - Step 5111: {'lr': 0.0004948355230768349, 'samples': 981504, 'steps': 5111, 'loss/train': 1.182130515575409} 01/27/2022 00:41:04 - INFO - codeparrot_training - Step 5112: {'lr': 0.0004948322138916816, 'samples': 981696, 'steps': 5112, 'loss/train': 0.23837844282388687} 01/27/2022 00:41:08 - INFO - codeparrot_training - Step 5113: {'lr': 0.0004948289036577447, 'samples': 981888, 'steps': 5113, 'loss/train': 0.5495970100164413} 01/27/2022 00:41:13 - INFO - codeparrot_training - Step 5114: {'lr': 0.0004948255923750385, 'samples': 982080, 'steps': 5114, 'loss/train': 1.0951071381568909} 01/27/2022 00:41:16 - INFO - codeparrot_training - Step 5115: {'lr': 0.0004948222800435773, 'samples': 982272, 'steps': 5115, 'loss/train': 0.9722296893596649} 01/27/2022 00:41:19 - INFO - codeparrot_training - Step 5116: {'lr': 0.0004948189666633752, 'samples': 982464, 'steps': 5116, 'loss/train': 1.0196075141429901} 01/27/2022 00:41:22 - INFO - codeparrot_training - Step 5117: {'lr': 0.0004948156522344463, 'samples': 982656, 'steps': 5117, 'loss/train': 1.0056976675987244} 01/27/2022 00:41:25 - INFO - codeparrot_training - Step 5118: {'lr': 0.0004948123367568049, 'samples': 982848, 'steps': 5118, 'loss/train': 1.0539316534996033} 01/27/2022 00:41:28 - INFO - codeparrot_training - Step 5119: {'lr': 0.0004948090202304652, 'samples': 983040, 'steps': 5119, 'loss/train': 0.8074873387813568} 01/27/2022 00:41:31 - INFO - codeparrot_training - Step 5120: {'lr': 0.0004948057026554415, 'samples': 983232, 'steps': 5120, 'loss/train': 0.8228709697723389} 01/27/2022 00:41:35 - INFO - codeparrot_training - Step 5121: {'lr': 0.0004948023840317477, 'samples': 983424, 'steps': 5121, 'loss/train': 0.4293624758720398} 01/27/2022 00:41:39 - INFO - codeparrot_training - Step 5122: {'lr': 0.0004947990643593983, 'samples': 983616, 'steps': 5122, 'loss/train': 1.0747005343437195} 01/27/2022 00:41:43 - INFO - codeparrot_training - Step 5123: {'lr': 0.0004947957436384076, 'samples': 983808, 'steps': 5123, 'loss/train': 0.4760216474533081} 01/27/2022 00:41:46 - INFO - codeparrot_training - Step 5124: {'lr': 0.0004947924218687894, 'samples': 984000, 'steps': 5124, 'loss/train': 1.0039696991443634} 01/27/2022 00:41:49 - INFO - codeparrot_training - Step 5125: {'lr': 0.0004947890990505585, 'samples': 984192, 'steps': 5125, 'loss/train': 0.4982391893863678} 01/27/2022 00:41:52 - INFO - codeparrot_training - Step 5126: {'lr': 0.0004947857751837286, 'samples': 984384, 'steps': 5126, 'loss/train': 0.5157832056283951} 01/27/2022 00:41:55 - INFO - codeparrot_training - Step 5127: {'lr': 0.0004947824502683142, 'samples': 984576, 'steps': 5127, 'loss/train': 0.8302628695964813} 01/27/2022 00:41:58 - INFO - codeparrot_training - Step 5128: {'lr': 0.0004947791243043296, 'samples': 984768, 'steps': 5128, 'loss/train': 1.0008039772510529} 01/27/2022 00:42:02 - INFO - codeparrot_training - Step 5129: {'lr': 0.0004947757972917889, 'samples': 984960, 'steps': 5129, 'loss/train': 1.1321770548820496} 01/27/2022 00:42:05 - INFO - codeparrot_training - Step 5130: {'lr': 0.0004947724692307064, 'samples': 985152, 'steps': 5130, 'loss/train': 1.1079138815402985} 01/27/2022 00:42:09 - INFO - codeparrot_training - Step 5131: {'lr': 0.0004947691401210963, 'samples': 985344, 'steps': 5131, 
'loss/train': 0.6504739969968796} 01/27/2022 00:42:12 - INFO - codeparrot_training - Step 5132: {'lr': 0.0004947658099629731, 'samples': 985536, 'steps': 5132, 'loss/train': 1.234673649072647} 01/27/2022 00:42:16 - INFO - codeparrot_training - Step 5133: {'lr': 0.0004947624787563507, 'samples': 985728, 'steps': 5133, 'loss/train': 0.8651351630687714} 01/27/2022 00:42:19 - INFO - codeparrot_training - Step 5134: {'lr': 0.0004947591465012436, 'samples': 985920, 'steps': 5134, 'loss/train': 0.8709989190101624} 01/27/2022 00:42:22 - INFO - codeparrot_training - Step 5135: {'lr': 0.0004947558131976661, 'samples': 986112, 'steps': 5135, 'loss/train': 0.9472754001617432} 01/27/2022 00:42:25 - INFO - codeparrot_training - Step 5136: {'lr': 0.0004947524788456324, 'samples': 986304, 'steps': 5136, 'loss/train': 0.300875723361969} 01/27/2022 00:42:28 - INFO - codeparrot_training - Step 5137: {'lr': 0.0004947491434451569, 'samples': 986496, 'steps': 5137, 'loss/train': 0.7861379384994507} 01/27/2022 00:42:31 - INFO - codeparrot_training - Step 5138: {'lr': 0.0004947458069962537, 'samples': 986688, 'steps': 5138, 'loss/train': 0.5822822749614716} 01/27/2022 00:42:34 - INFO - codeparrot_training - Step 5139: {'lr': 0.0004947424694989371, 'samples': 986880, 'steps': 5139, 'loss/train': 1.2048709988594055} 01/27/2022 00:42:40 - INFO - codeparrot_training - Step 5140: {'lr': 0.0004947391309532216, 'samples': 987072, 'steps': 5140, 'loss/train': 0.4116116166114807} 01/27/2022 00:42:43 - INFO - codeparrot_training - Step 5141: {'lr': 0.0004947357913591213, 'samples': 987264, 'steps': 5141, 'loss/train': 0.7688612937927246} 01/27/2022 00:42:46 - INFO - codeparrot_training - Step 5142: {'lr': 0.0004947324507166505, 'samples': 987456, 'steps': 5142, 'loss/train': 1.2503845989704132} 01/27/2022 00:42:49 - INFO - codeparrot_training - Step 5143: {'lr': 0.0004947291090258238, 'samples': 987648, 'steps': 5143, 'loss/train': 0.8692565560340881} 01/27/2022 00:42:53 - INFO - codeparrot_training - Step 5144: {'lr': 0.0004947257662866551, 'samples': 987840, 'steps': 5144, 'loss/train': 0.1287854164838791} 01/27/2022 00:42:56 - INFO - codeparrot_training - Step 5145: {'lr': 0.0004947224224991591, 'samples': 988032, 'steps': 5145, 'loss/train': 0.637305423617363} 01/27/2022 00:42:59 - INFO - codeparrot_training - Step 5146: {'lr': 0.0004947190776633499, 'samples': 988224, 'steps': 5146, 'loss/train': 0.8504159152507782} 01/27/2022 00:43:02 - INFO - codeparrot_training - Step 5147: {'lr': 0.0004947157317792418, 'samples': 988416, 'steps': 5147, 'loss/train': 0.31168200820684433} 01/27/2022 00:43:05 - INFO - codeparrot_training - Step 5148: {'lr': 0.0004947123848468493, 'samples': 988608, 'steps': 5148, 'loss/train': 1.0832512378692627} 01/27/2022 00:43:11 - INFO - codeparrot_training - Step 5149: {'lr': 0.0004947090368661866, 'samples': 988800, 'steps': 5149, 'loss/train': 0.5411324054002762} 01/27/2022 00:43:14 - INFO - codeparrot_training - Step 5150: {'lr': 0.0004947056878372681, 'samples': 988992, 'steps': 5150, 'loss/train': 0.7487791478633881} 01/27/2022 00:43:17 - INFO - codeparrot_training - Step 5151: {'lr': 0.0004947023377601082, 'samples': 989184, 'steps': 5151, 'loss/train': 0.6366070210933685} 01/27/2022 00:43:20 - INFO - codeparrot_training - Step 5152: {'lr': 0.0004946989866347211, 'samples': 989376, 'steps': 5152, 'loss/train': 1.2111272513866425} 01/27/2022 00:43:23 - INFO - codeparrot_training - Step 5153: {'lr': 0.0004946956344611212, 'samples': 989568, 'steps': 5153, 'loss/train': 1.0021287202835083} 
01/27/2022 00:43:26 - INFO - codeparrot_training - Step 5154: {'lr': 0.000494692281239323, 'samples': 989760, 'steps': 5154, 'loss/train': 0.8296122550964355} 01/27/2022 00:43:29 - INFO - codeparrot_training - Step 5155: {'lr': 0.0004946889269693408, 'samples': 989952, 'steps': 5155, 'loss/train': 0.9274095296859741} 01/27/2022 00:43:33 - INFO - codeparrot_training - Step 5156: {'lr': 0.0004946855716511888, 'samples': 990144, 'steps': 5156, 'loss/train': 2.062955617904663} 01/27/2022 00:43:36 - INFO - codeparrot_training - Step 5157: {'lr': 0.0004946822152848816, 'samples': 990336, 'steps': 5157, 'loss/train': 1.812056064605713} 01/27/2022 00:43:40 - INFO - codeparrot_training - Step 5158: {'lr': 0.0004946788578704335, 'samples': 990528, 'steps': 5158, 'loss/train': 1.2700426876544952} 01/27/2022 00:43:43 - INFO - codeparrot_training - Step 5159: {'lr': 0.0004946754994078588, 'samples': 990720, 'steps': 5159, 'loss/train': 1.0783852636814117} 01/27/2022 00:43:46 - INFO - codeparrot_training - Step 5160: {'lr': 0.000494672139897172, 'samples': 990912, 'steps': 5160, 'loss/train': 1.1019156575202942} 01/27/2022 00:43:50 - INFO - codeparrot_training - Step 5161: {'lr': 0.0004946687793383874, 'samples': 991104, 'steps': 5161, 'loss/train': 0.9208860397338867} 01/27/2022 00:43:53 - INFO - codeparrot_training - Step 5162: {'lr': 0.0004946654177315194, 'samples': 991296, 'steps': 5162, 'loss/train': 0.32167869061231613} 01/27/2022 00:43:56 - INFO - codeparrot_training - Step 5163: {'lr': 0.0004946620550765826, 'samples': 991488, 'steps': 5163, 'loss/train': 1.1110846996307373} 01/27/2022 00:43:59 - INFO - codeparrot_training - Step 5164: {'lr': 0.0004946586913735911, 'samples': 991680, 'steps': 5164, 'loss/train': 0.7833567559719086} 01/27/2022 00:44:02 - INFO - codeparrot_training - Step 5165: {'lr': 0.0004946553266225595, 'samples': 991872, 'steps': 5165, 'loss/train': 0.8261927962303162} 01/27/2022 00:44:05 - INFO - codeparrot_training - Step 5166: {'lr': 0.0004946519608235022, 'samples': 992064, 'steps': 5166, 'loss/train': 1.5439825057983398} 01/27/2022 00:44:10 - INFO - codeparrot_training - Step 5167: {'lr': 0.0004946485939764336, 'samples': 992256, 'steps': 5167, 'loss/train': 1.08121919631958} 01/27/2022 00:44:13 - INFO - codeparrot_training - Step 5168: {'lr': 0.000494645226081368, 'samples': 992448, 'steps': 5168, 'loss/train': 0.9271784126758575} 01/27/2022 00:44:17 - INFO - codeparrot_training - Step 5169: {'lr': 0.00049464185713832, 'samples': 992640, 'steps': 5169, 'loss/train': 0.8192310333251953} 01/27/2022 00:44:20 - INFO - codeparrot_training - Step 5170: {'lr': 0.000494638487147304, 'samples': 992832, 'steps': 5170, 'loss/train': 0.8627199232578278} 01/27/2022 00:44:23 - INFO - codeparrot_training - Step 5171: {'lr': 0.0004946351161083344, 'samples': 993024, 'steps': 5171, 'loss/train': 0.7063748091459274} 01/27/2022 00:44:26 - INFO - codeparrot_training - Step 5172: {'lr': 0.0004946317440214257, 'samples': 993216, 'steps': 5172, 'loss/train': 1.029202401638031} 01/27/2022 00:44:29 - INFO - codeparrot_training - Step 5173: {'lr': 0.000494628370886592, 'samples': 993408, 'steps': 5173, 'loss/train': 0.9102608263492584} 01/27/2022 00:44:32 - INFO - codeparrot_training - Step 5174: {'lr': 0.0004946249967038483, 'samples': 993600, 'steps': 5174, 'loss/train': 0.20660559833049774} 01/27/2022 00:44:35 - INFO - codeparrot_training - Step 5175: {'lr': 0.0004946216214732088, 'samples': 993792, 'steps': 5175, 'loss/train': 0.6163818687200546} 01/27/2022 00:44:40 - INFO - 
codeparrot_training - Step 5176: {'lr': 0.0004946182451946878, 'samples': 993984, 'steps': 5176, 'loss/train': 0.9675976932048798} 01/27/2022 00:44:43 - INFO - codeparrot_training - Step 5177: {'lr': 0.0004946148678683001, 'samples': 994176, 'steps': 5177, 'loss/train': 0.8471893072128296} 01/27/2022 00:44:46 - INFO - codeparrot_training - Step 5178: {'lr': 0.0004946114894940599, 'samples': 994368, 'steps': 5178, 'loss/train': 0.6854356974363327} 01/27/2022 00:44:49 - INFO - codeparrot_training - Step 5179: {'lr': 0.0004946081100719817, 'samples': 994560, 'steps': 5179, 'loss/train': 0.7808744609355927} 01/27/2022 00:44:52 - INFO - codeparrot_training - Step 5180: {'lr': 0.00049460472960208, 'samples': 994752, 'steps': 5180, 'loss/train': 1.1207892000675201} 01/27/2022 00:44:56 - INFO - codeparrot_training - Step 5181: {'lr': 0.0004946013480843694, 'samples': 994944, 'steps': 5181, 'loss/train': 1.183348685503006} 01/27/2022 00:44:59 - INFO - codeparrot_training - Step 5182: {'lr': 0.0004945979655188642, 'samples': 995136, 'steps': 5182, 'loss/train': 0.9436107873916626} 01/27/2022 00:45:02 - INFO - codeparrot_training - Step 5183: {'lr': 0.0004945945819055791, 'samples': 995328, 'steps': 5183, 'loss/train': 0.534300684928894} 01/27/2022 00:45:05 - INFO - codeparrot_training - Step 5184: {'lr': 0.0004945911972445284, 'samples': 995520, 'steps': 5184, 'loss/train': 1.1894933581352234} 01/27/2022 00:45:10 - INFO - codeparrot_training - Step 5185: {'lr': 0.0004945878115357267, 'samples': 995712, 'steps': 5185, 'loss/train': 1.1787810623645782} 01/27/2022 00:45:13 - INFO - codeparrot_training - Step 5186: {'lr': 0.0004945844247791886, 'samples': 995904, 'steps': 5186, 'loss/train': 0.9497879147529602} 01/27/2022 00:45:17 - INFO - codeparrot_training - Step 5187: {'lr': 0.0004945810369749283, 'samples': 996096, 'steps': 5187, 'loss/train': 0.8530630767345428} 01/27/2022 00:45:20 - INFO - codeparrot_training - Step 5188: {'lr': 0.0004945776481229605, 'samples': 996288, 'steps': 5188, 'loss/train': 0.31409718096256256} 01/27/2022 00:45:23 - INFO - codeparrot_training - Step 5189: {'lr': 0.0004945742582232999, 'samples': 996480, 'steps': 5189, 'loss/train': 1.0700317919254303} 01/27/2022 00:45:26 - INFO - codeparrot_training - Step 5190: {'lr': 0.0004945708672759606, 'samples': 996672, 'steps': 5190, 'loss/train': 0.660377562046051} 01/27/2022 00:45:29 - INFO - codeparrot_training - Step 5191: {'lr': 0.0004945674752809575, 'samples': 996864, 'steps': 5191, 'loss/train': 0.517800584435463} 01/27/2022 00:45:32 - INFO - codeparrot_training - Step 5192: {'lr': 0.000494564082238305, 'samples': 997056, 'steps': 5192, 'loss/train': 0.7789634764194489} 01/27/2022 00:45:37 - INFO - codeparrot_training - Step 5193: {'lr': 0.0004945606881480176, 'samples': 997248, 'steps': 5193, 'loss/train': 1.2775113880634308} 01/27/2022 00:45:40 - INFO - codeparrot_training - Step 5194: {'lr': 0.0004945572930101098, 'samples': 997440, 'steps': 5194, 'loss/train': 2.312151074409485} 01/27/2022 00:45:43 - INFO - codeparrot_training - Step 5195: {'lr': 0.0004945538968245964, 'samples': 997632, 'steps': 5195, 'loss/train': 0.6572871208190918} 01/27/2022 00:45:46 - INFO - codeparrot_training - Step 5196: {'lr': 0.0004945504995914917, 'samples': 997824, 'steps': 5196, 'loss/train': 0.9626956880092621} 01/27/2022 00:45:49 - INFO - codeparrot_training - Step 5197: {'lr': 0.0004945471013108102, 'samples': 998016, 'steps': 5197, 'loss/train': 1.890772819519043} 01/27/2022 00:45:52 - INFO - codeparrot_training - Step 5198: {'lr': 
0.0004945437019825668, 'samples': 998208, 'steps': 5198, 'loss/train': 1.1859493553638458} 01/27/2022 00:45:55 - INFO - codeparrot_training - Step 5199: {'lr': 0.0004945403016067756, 'samples': 998400, 'steps': 5199, 'loss/train': 0.4673551172018051} 01/27/2022 00:45:59 - INFO - codeparrot_training - Step 5200: {'lr': 0.0004945369001834514, 'samples': 998592, 'steps': 5200, 'loss/train': 0.6784869879484177} 01/27/2022 00:46:02 - INFO - codeparrot_training - Step 5201: {'lr': 0.0004945334977126089, 'samples': 998784, 'steps': 5201, 'loss/train': 0.5384026765823364} 01/27/2022 00:46:06 - INFO - codeparrot_training - Step 5202: {'lr': 0.0004945300941942624, 'samples': 998976, 'steps': 5202, 'loss/train': 0.5426927804946899} 01/27/2022 00:46:09 - INFO - codeparrot_training - Step 5203: {'lr': 0.0004945266896284268, 'samples': 999168, 'steps': 5203, 'loss/train': 0.6765743494033813} 01/27/2022 00:46:13 - INFO - codeparrot_training - Step 5204: {'lr': 0.0004945232840151164, 'samples': 999360, 'steps': 5204, 'loss/train': 1.403884470462799} 01/27/2022 00:46:16 - INFO - codeparrot_training - Step 5205: {'lr': 0.0004945198773543459, 'samples': 999552, 'steps': 5205, 'loss/train': 0.88702592253685} 01/27/2022 00:46:19 - INFO - codeparrot_training - Step 5206: {'lr': 0.0004945164696461299, 'samples': 999744, 'steps': 5206, 'loss/train': 1.17251917719841} 01/27/2022 00:46:22 - INFO - codeparrot_training - Step 5207: {'lr': 0.000494513060890483, 'samples': 999936, 'steps': 5207, 'loss/train': 0.8939831256866455} 01/27/2022 00:46:25 - INFO - codeparrot_training - Step 5208: {'lr': 0.0004945096510874197, 'samples': 1000128, 'steps': 5208, 'loss/train': 1.1310645639896393} 01/27/2022 00:46:28 - INFO - codeparrot_training - Step 5209: {'lr': 0.0004945062402369548, 'samples': 1000320, 'steps': 5209, 'loss/train': 0.934337854385376} 01/27/2022 00:46:31 - INFO - codeparrot_training - Step 5210: {'lr': 0.0004945028283391028, 'samples': 1000512, 'steps': 5210, 'loss/train': 0.5702621340751648} 01/27/2022 00:46:36 - INFO - codeparrot_training - Step 5211: {'lr': 0.0004944994153938783, 'samples': 1000704, 'steps': 5211, 'loss/train': 1.533934235572815} 01/27/2022 00:46:39 - INFO - codeparrot_training - Step 5212: {'lr': 0.0004944960014012959, 'samples': 1000896, 'steps': 5212, 'loss/train': 0.82228884100914} 01/27/2022 00:46:42 - INFO - codeparrot_training - Step 5213: {'lr': 0.0004944925863613704, 'samples': 1001088, 'steps': 5213, 'loss/train': 0.808757096529007} 01/27/2022 00:46:45 - INFO - codeparrot_training - Step 5214: {'lr': 0.0004944891702741161, 'samples': 1001280, 'steps': 5214, 'loss/train': 1.2545742988586426} 01/27/2022 00:46:49 - INFO - codeparrot_training - Step 5215: {'lr': 0.0004944857531395479, 'samples': 1001472, 'steps': 5215, 'loss/train': 0.9180120527744293} 01/27/2022 00:46:52 - INFO - codeparrot_training - Step 5216: {'lr': 0.0004944823349576805, 'samples': 1001664, 'steps': 5216, 'loss/train': 1.2417646944522858} 01/27/2022 00:46:55 - INFO - codeparrot_training - Step 5217: {'lr': 0.0004944789157285283, 'samples': 1001856, 'steps': 5217, 'loss/train': 1.0344763398170471} 01/27/2022 00:46:58 - INFO - codeparrot_training - Step 5218: {'lr': 0.0004944754954521061, 'samples': 1002048, 'steps': 5218, 'loss/train': 1.021317332983017} 01/27/2022 00:47:01 - INFO - codeparrot_training - Step 5219: {'lr': 0.0004944720741284285, 'samples': 1002240, 'steps': 5219, 'loss/train': 0.6006308197975159} 01/27/2022 00:47:07 - INFO - codeparrot_training - Step 5220: {'lr': 0.00049446865175751, 'samples': 
1002432, 'steps': 5220, 'loss/train': 0.9779736399650574} 01/27/2022 00:47:10 - INFO - codeparrot_training - Step 5221: {'lr': 0.0004944652283393656, 'samples': 1002624, 'steps': 5221, 'loss/train': 1.2266523241996765} 01/27/2022 00:47:13 - INFO - codeparrot_training - Step 5222: {'lr': 0.0004944618038740098, 'samples': 1002816, 'steps': 5222, 'loss/train': 0.1288113221526146} 01/27/2022 00:47:16 - INFO - codeparrot_training - Step 5223: {'lr': 0.0004944583783614571, 'samples': 1003008, 'steps': 5223, 'loss/train': 1.1055023074150085} 01/27/2022 00:47:19 - INFO - codeparrot_training - Step 5224: {'lr': 0.0004944549518017225, 'samples': 1003200, 'steps': 5224, 'loss/train': 1.0770920813083649} 01/27/2022 00:47:22 - INFO - codeparrot_training - Step 5225: {'lr': 0.0004944515241948204, 'samples': 1003392, 'steps': 5225, 'loss/train': 0.6596703082323074} 01/27/2022 00:47:25 - INFO - codeparrot_training - Step 5226: {'lr': 0.0004944480955407656, 'samples': 1003584, 'steps': 5226, 'loss/train': 0.8575578331947327} 01/27/2022 00:47:29 - INFO - codeparrot_training - Step 5227: {'lr': 0.0004944446658395728, 'samples': 1003776, 'steps': 5227, 'loss/train': 0.5534832179546356} 01/27/2022 00:47:32 - INFO - codeparrot_training - Step 5228: {'lr': 0.0004944412350912567, 'samples': 1003968, 'steps': 5228, 'loss/train': 0.6883066892623901} 01/27/2022 00:47:36 - INFO - codeparrot_training - Step 5229: {'lr': 0.000494437803295832, 'samples': 1004160, 'steps': 5229, 'loss/train': 1.0872275233268738} 01/27/2022 00:47:40 - INFO - codeparrot_training - Step 5230: {'lr': 0.0004944343704533133, 'samples': 1004352, 'steps': 5230, 'loss/train': 0.87831249833107} 01/27/2022 00:47:43 - INFO - codeparrot_training - Step 5231: {'lr': 0.0004944309365637154, 'samples': 1004544, 'steps': 5231, 'loss/train': 0.7233273983001709} 01/27/2022 00:47:46 - INFO - codeparrot_training - Step 5232: {'lr': 0.000494427501627053, 'samples': 1004736, 'steps': 5232, 'loss/train': 0.9471874237060547} 01/27/2022 00:47:49 - INFO - codeparrot_training - Step 5233: {'lr': 0.0004944240656433407, 'samples': 1004928, 'steps': 5233, 'loss/train': 0.9999295771121979} 01/27/2022 00:47:52 - INFO - codeparrot_training - Step 5234: {'lr': 0.0004944206286125935, 'samples': 1005120, 'steps': 5234, 'loss/train': 0.7317471206188202} 01/27/2022 00:47:55 - INFO - codeparrot_training - Step 5235: {'lr': 0.0004944171905348258, 'samples': 1005312, 'steps': 5235, 'loss/train': 0.7857035100460052} 01/27/2022 00:47:58 - INFO - codeparrot_training - Step 5236: {'lr': 0.0004944137514100525, 'samples': 1005504, 'steps': 5236, 'loss/train': 0.7457680553197861} 01/27/2022 00:48:02 - INFO - codeparrot_training - Step 5237: {'lr': 0.0004944103112382883, 'samples': 1005696, 'steps': 5237, 'loss/train': 0.8658075034618378} 01/27/2022 00:48:06 - INFO - codeparrot_training - Step 5238: {'lr': 0.0004944068700195479, 'samples': 1005888, 'steps': 5238, 'loss/train': 0.7779465615749359} 01/27/2022 00:48:09 - INFO - codeparrot_training - Step 5239: {'lr': 0.0004944034277538462, 'samples': 1006080, 'steps': 5239, 'loss/train': 0.9814800024032593} 01/27/2022 00:48:12 - INFO - codeparrot_training - Step 5240: {'lr': 0.0004943999844411977, 'samples': 1006272, 'steps': 5240, 'loss/train': 0.9334609508514404} 01/27/2022 00:48:15 - INFO - codeparrot_training - Step 5241: {'lr': 0.0004943965400816173, 'samples': 1006464, 'steps': 5241, 'loss/train': 0.6705514937639236} 01/27/2022 00:48:19 - INFO - codeparrot_training - Step 5242: {'lr': 0.0004943930946751197, 'samples': 1006656, 
'steps': 5242, 'loss/train': 0.878639817237854} 01/27/2022 00:48:22 - INFO - codeparrot_training - Step 5243: {'lr': 0.0004943896482217197, 'samples': 1006848, 'steps': 5243, 'loss/train': 0.7468528300523758} 01/27/2022 00:48:25 - INFO - codeparrot_training - Step 5244: {'lr': 0.0004943862007214322, 'samples': 1007040, 'steps': 5244, 'loss/train': 0.9830920994281769} 01/27/2022 00:48:28 - INFO - codeparrot_training - Step 5245: {'lr': 0.0004943827521742716, 'samples': 1007232, 'steps': 5245, 'loss/train': 0.6447456032037735} 01/27/2022 00:48:34 - INFO - codeparrot_training - Step 5246: {'lr': 0.000494379302580253, 'samples': 1007424, 'steps': 5246, 'loss/train': 0.822100818157196} 01/27/2022 00:48:37 - INFO - codeparrot_training - Step 5247: {'lr': 0.000494375851939391, 'samples': 1007616, 'steps': 5247, 'loss/train': 1.0480502843856812} 01/27/2022 00:48:40 - INFO - codeparrot_training - Step 5248: {'lr': 0.0004943724002517005, 'samples': 1007808, 'steps': 5248, 'loss/train': 0.9253580868244171} 01/27/2022 00:48:43 - INFO - codeparrot_training - Step 5249: {'lr': 0.0004943689475171962, 'samples': 1008000, 'steps': 5249, 'loss/train': 1.8875994086265564} 01/27/2022 00:48:46 - INFO - codeparrot_training - Step 5250: {'lr': 0.000494365493735893, 'samples': 1008192, 'steps': 5250, 'loss/train': 0.6957137435674667} 01/27/2022 00:48:49 - INFO - codeparrot_training - Step 5251: {'lr': 0.0004943620389078055, 'samples': 1008384, 'steps': 5251, 'loss/train': 0.2114662081003189} 01/27/2022 00:48:52 - INFO - codeparrot_training - Step 5252: {'lr': 0.0004943585830329487, 'samples': 1008576, 'steps': 5252, 'loss/train': 1.2011256515979767} 01/27/2022 00:48:56 - INFO - codeparrot_training - Step 5253: {'lr': 0.0004943551261113373, 'samples': 1008768, 'steps': 5253, 'loss/train': 0.8026490807533264} 01/27/2022 00:48:59 - INFO - codeparrot_training - Step 5254: {'lr': 0.0004943516681429861, 'samples': 1008960, 'steps': 5254, 'loss/train': 0.8870516717433929} 01/27/2022 00:49:03 - INFO - codeparrot_training - Step 5255: {'lr': 0.0004943482091279101, 'samples': 1009152, 'steps': 5255, 'loss/train': 0.7076024562120438} 01/27/2022 00:49:06 - INFO - codeparrot_training - Step 5256: {'lr': 0.0004943447490661238, 'samples': 1009344, 'steps': 5256, 'loss/train': 0.718863308429718} 01/27/2022 00:49:09 - INFO - codeparrot_training - Step 5257: {'lr': 0.0004943412879576422, 'samples': 1009536, 'steps': 5257, 'loss/train': 0.8394177854061127} 01/27/2022 00:49:13 - INFO - codeparrot_training - Step 5258: {'lr': 0.0004943378258024802, 'samples': 1009728, 'steps': 5258, 'loss/train': 0.7363982051610947} 01/27/2022 00:49:16 - INFO - codeparrot_training - Step 5259: {'lr': 0.0004943343626006524, 'samples': 1009920, 'steps': 5259, 'loss/train': 1.190596729516983} 01/27/2022 00:49:19 - INFO - codeparrot_training - Step 5260: {'lr': 0.000494330898352174, 'samples': 1010112, 'steps': 5260, 'loss/train': 0.5365058183670044} 01/27/2022 00:49:22 - INFO - codeparrot_training - Step 5261: {'lr': 0.0004943274330570594, 'samples': 1010304, 'steps': 5261, 'loss/train': 0.6916020959615707} 01/27/2022 00:49:25 - INFO - codeparrot_training - Step 5262: {'lr': 0.0004943239667153237, 'samples': 1010496, 'steps': 5262, 'loss/train': 0.6019612848758698} 01/27/2022 00:49:28 - INFO - codeparrot_training - Step 5263: {'lr': 0.0004943204993269818, 'samples': 1010688, 'steps': 5263, 'loss/train': 1.074790209531784} 01/27/2022 00:49:33 - INFO - codeparrot_training - Step 5264: {'lr': 0.0004943170308920483, 'samples': 1010880, 'steps': 5264, 
'loss/train': 1.5678103566169739} 01/27/2022 00:49:37 - INFO - codeparrot_training - Step 5265: {'lr': 0.0004943135614105384, 'samples': 1011072, 'steps': 5265, 'loss/train': 0.7248430252075195} 01/27/2022 00:49:40 - INFO - codeparrot_training - Step 5266: {'lr': 0.0004943100908824667, 'samples': 1011264, 'steps': 5266, 'loss/train': 0.8760175108909607} 01/27/2022 00:49:43 - INFO - codeparrot_training - Step 5267: {'lr': 0.0004943066193078482, 'samples': 1011456, 'steps': 5267, 'loss/train': 1.6341934204101562} 01/27/2022 00:49:46 - INFO - codeparrot_training - Step 5268: {'lr': 0.0004943031466866976, 'samples': 1011648, 'steps': 5268, 'loss/train': 0.6594547480344772} 01/27/2022 00:49:49 - INFO - codeparrot_training - Step 5269: {'lr': 0.00049429967301903, 'samples': 1011840, 'steps': 5269, 'loss/train': 0.7372903525829315} 01/27/2022 00:49:52 - INFO - codeparrot_training - Step 5270: {'lr': 0.0004942961983048601, 'samples': 1012032, 'steps': 5270, 'loss/train': 0.7550294995307922} 01/27/2022 00:49:55 - INFO - codeparrot_training - Step 5271: {'lr': 0.0004942927225442029, 'samples': 1012224, 'steps': 5271, 'loss/train': 0.7179592698812485} 01/27/2022 00:49:59 - INFO - codeparrot_training - Step 5272: {'lr': 0.0004942892457370732, 'samples': 1012416, 'steps': 5272, 'loss/train': 0.7756651639938354} 01/27/2022 00:50:03 - INFO - codeparrot_training - Step 5273: {'lr': 0.000494285767883486, 'samples': 1012608, 'steps': 5273, 'loss/train': 0.24393611401319504} 01/27/2022 00:50:06 - INFO - codeparrot_training - Step 5274: {'lr': 0.0004942822889834562, 'samples': 1012800, 'steps': 5274, 'loss/train': 0.9433778822422028} 01/27/2022 00:50:10 - INFO - codeparrot_training - Step 5275: {'lr': 0.0004942788090369985, 'samples': 1012992, 'steps': 5275, 'loss/train': 0.7258710712194443} 01/27/2022 00:50:13 - INFO - codeparrot_training - Step 5276: {'lr': 0.0004942753280441281, 'samples': 1013184, 'steps': 5276, 'loss/train': 0.5512968599796295} 01/27/2022 00:50:16 - INFO - codeparrot_training - Step 5277: {'lr': 0.0004942718460048596, 'samples': 1013376, 'steps': 5277, 'loss/train': 1.0754300951957703} 01/27/2022 00:50:19 - INFO - codeparrot_training - Step 5278: {'lr': 0.0004942683629192082, 'samples': 1013568, 'steps': 5278, 'loss/train': 0.8511764109134674} 01/27/2022 00:50:22 - INFO - codeparrot_training - Step 5279: {'lr': 0.0004942648787871886, 'samples': 1013760, 'steps': 5279, 'loss/train': 0.3067754879593849} 01/27/2022 00:50:25 - INFO - codeparrot_training - Step 5280: {'lr': 0.000494261393608816, 'samples': 1013952, 'steps': 5280, 'loss/train': 0.37505216896533966} 01/27/2022 00:50:28 - INFO - codeparrot_training - Step 5281: {'lr': 0.0004942579073841049, 'samples': 1014144, 'steps': 5281, 'loss/train': 1.0094541013240814} 01/27/2022 00:50:33 - INFO - codeparrot_training - Step 5282: {'lr': 0.0004942544201130706, 'samples': 1014336, 'steps': 5282, 'loss/train': 0.8114448487758636} 01/27/2022 00:50:36 - INFO - codeparrot_training - Step 5283: {'lr': 0.000494250931795728, 'samples': 1014528, 'steps': 5283, 'loss/train': 0.7446841299533844} 01/27/2022 00:50:39 - INFO - codeparrot_training - Step 5284: {'lr': 0.0004942474424320919, 'samples': 1014720, 'steps': 5284, 'loss/train': 0.9408541023731232} 01/27/2022 00:50:42 - INFO - codeparrot_training - Step 5285: {'lr': 0.0004942439520221774, 'samples': 1014912, 'steps': 5285, 'loss/train': 0.9262778162956238} 01/27/2022 00:50:45 - INFO - codeparrot_training - Step 5286: {'lr': 0.0004942404605659991, 'samples': 1015104, 'steps': 5286, 'loss/train': 
1.1299797892570496} 01/27/2022 00:50:49 - INFO - codeparrot_training - Step 5287: {'lr': 0.0004942369680635724, 'samples': 1015296, 'steps': 5287, 'loss/train': 0.5791256278753281} 01/27/2022 00:50:52 - INFO - codeparrot_training - Step 5288: {'lr': 0.0004942334745149122, 'samples': 1015488, 'steps': 5288, 'loss/train': 1.0120967030525208} 01/27/2022 00:50:55 - INFO - codeparrot_training - Step 5289: {'lr': 0.0004942299799200332, 'samples': 1015680, 'steps': 5289, 'loss/train': 0.8534795343875885} 01/27/2022 00:50:58 - INFO - codeparrot_training - Step 5290: {'lr': 0.0004942264842789506, 'samples': 1015872, 'steps': 5290, 'loss/train': 1.101803719997406} 01/27/2022 00:51:03 - INFO - codeparrot_training - Step 5291: {'lr': 0.0004942229875916792, 'samples': 1016064, 'steps': 5291, 'loss/train': 0.8310179114341736} 01/27/2022 00:51:06 - INFO - codeparrot_training - Step 5292: {'lr': 0.0004942194898582341, 'samples': 1016256, 'steps': 5292, 'loss/train': 0.558097779750824} 01/27/2022 00:51:09 - INFO - codeparrot_training - Step 5293: {'lr': 0.0004942159910786303, 'samples': 1016448, 'steps': 5293, 'loss/train': 0.7714150249958038} 01/27/2022 00:51:12 - INFO - codeparrot_training - Step 5294: {'lr': 0.0004942124912528827, 'samples': 1016640, 'steps': 5294, 'loss/train': 0.37356336414813995} 01/27/2022 00:51:15 - INFO - codeparrot_training - Step 5295: {'lr': 0.0004942089903810064, 'samples': 1016832, 'steps': 5295, 'loss/train': 1.4673011898994446} 01/27/2022 00:51:18 - INFO - codeparrot_training - Step 5296: {'lr': 0.0004942054884630162, 'samples': 1017024, 'steps': 5296, 'loss/train': 0.8468267619609833} 01/27/2022 00:51:22 - INFO - codeparrot_training - Step 5297: {'lr': 0.0004942019854989274, 'samples': 1017216, 'steps': 5297, 'loss/train': 0.6213961690664291} 01/27/2022 00:51:25 - INFO - codeparrot_training - Step 5298: {'lr': 0.0004941984814887546, 'samples': 1017408, 'steps': 5298, 'loss/train': 0.5143189877271652} 01/27/2022 00:51:28 - INFO - codeparrot_training - Step 5299: {'lr': 0.0004941949764325133, 'samples': 1017600, 'steps': 5299, 'loss/train': 1.0621272325515747} 01/27/2022 00:51:33 - INFO - codeparrot_training - Step 5300: {'lr': 0.0004941914703302181, 'samples': 1017792, 'steps': 5300, 'loss/train': 0.6837655752897263} 01/27/2022 00:51:36 - INFO - codeparrot_training - Step 5301: {'lr': 0.0004941879631818843, 'samples': 1017984, 'steps': 5301, 'loss/train': 1.2677130103111267} 01/27/2022 00:51:39 - INFO - codeparrot_training - Step 5302: {'lr': 0.0004941844549875267, 'samples': 1018176, 'steps': 5302, 'loss/train': 0.8056283891201019} 01/27/2022 00:51:42 - INFO - codeparrot_training - Step 5303: {'lr': 0.0004941809457471605, 'samples': 1018368, 'steps': 5303, 'loss/train': 0.9893189370632172} 01/27/2022 00:51:46 - INFO - codeparrot_training - Step 5304: {'lr': 0.0004941774354608006, 'samples': 1018560, 'steps': 5304, 'loss/train': 1.0934996008872986} 01/27/2022 00:51:49 - INFO - codeparrot_training - Step 5305: {'lr': 0.0004941739241284621, 'samples': 1018752, 'steps': 5305, 'loss/train': 0.858322262763977} 01/27/2022 00:51:52 - INFO - codeparrot_training - Step 5306: {'lr': 0.0004941704117501601, 'samples': 1018944, 'steps': 5306, 'loss/train': 0.6873732805252075} 01/27/2022 00:51:55 - INFO - codeparrot_training - Step 5307: {'lr': 0.0004941668983259095, 'samples': 1019136, 'steps': 5307, 'loss/train': 1.793824017047882} 01/27/2022 00:51:58 - INFO - codeparrot_training - Step 5308: {'lr': 0.0004941633838557256, 'samples': 1019328, 'steps': 5308, 'loss/train': 
0.8351019322872162} 01/27/2022 00:52:03 - INFO - codeparrot_training - Step 5309: {'lr': 0.0004941598683396232, 'samples': 1019520, 'steps': 5309, 'loss/train': 1.1342563033103943} 01/27/2022 00:52:06 - INFO - codeparrot_training - Step 5310: {'lr': 0.0004941563517776174, 'samples': 1019712, 'steps': 5310, 'loss/train': 0.8659357130527496} 01/27/2022 00:52:09 - INFO - codeparrot_training - Step 5311: {'lr': 0.0004941528341697234, 'samples': 1019904, 'steps': 5311, 'loss/train': 1.151748239994049} 01/27/2022 00:52:12 - INFO - codeparrot_training - Step 5312: {'lr': 0.0004941493155159562, 'samples': 1020096, 'steps': 5312, 'loss/train': 1.01390540599823} 01/27/2022 00:52:15 - INFO - codeparrot_training - Step 5313: {'lr': 0.0004941457958163308, 'samples': 1020288, 'steps': 5313, 'loss/train': 0.8374475240707397} 01/27/2022 00:52:18 - INFO - codeparrot_training - Step 5314: {'lr': 0.0004941422750708623, 'samples': 1020480, 'steps': 5314, 'loss/train': 1.121747463941574} 01/27/2022 00:52:21 - INFO - codeparrot_training - Step 5315: {'lr': 0.0004941387532795659, 'samples': 1020672, 'steps': 5315, 'loss/train': 0.8426987528800964} 01/27/2022 00:52:24 - INFO - codeparrot_training - Step 5316: {'lr': 0.0004941352304424566, 'samples': 1020864, 'steps': 5316, 'loss/train': 1.092704325914383} 01/27/2022 00:52:28 - INFO - codeparrot_training - Step 5317: {'lr': 0.0004941317065595495, 'samples': 1021056, 'steps': 5317, 'loss/train': 0.8662800192832947} 01/27/2022 00:52:32 - INFO - codeparrot_training - Step 5318: {'lr': 0.0004941281816308596, 'samples': 1021248, 'steps': 5318, 'loss/train': 1.055805265903473} 01/27/2022 00:52:35 - INFO - codeparrot_training - Step 5319: {'lr': 0.0004941246556564021, 'samples': 1021440, 'steps': 5319, 'loss/train': 0.4658552259206772} 01/27/2022 00:52:38 - INFO - codeparrot_training - Step 5320: {'lr': 0.0004941211286361922, 'samples': 1021632, 'steps': 5320, 'loss/train': 1.250751793384552} 01/27/2022 00:52:41 - INFO - codeparrot_training - Step 5321: {'lr': 0.0004941176005702448, 'samples': 1021824, 'steps': 5321, 'loss/train': 1.3796458840370178} 01/27/2022 00:52:45 - INFO - codeparrot_training - Step 5322: {'lr': 0.0004941140714585752, 'samples': 1022016, 'steps': 5322, 'loss/train': 0.6600645035505295} 01/27/2022 00:52:48 - INFO - codeparrot_training - Step 5323: {'lr': 0.0004941105413011984, 'samples': 1022208, 'steps': 5323, 'loss/train': 0.7773055136203766} 01/27/2022 00:52:51 - INFO - codeparrot_training - Step 5324: {'lr': 0.0004941070100981295, 'samples': 1022400, 'steps': 5324, 'loss/train': 0.8840253353118896} 01/27/2022 00:52:54 - INFO - codeparrot_training - Step 5325: {'lr': 0.0004941034778493837, 'samples': 1022592, 'steps': 5325, 'loss/train': 0.693273514509201} 01/27/2022 00:52:57 - INFO - codeparrot_training - Step 5326: {'lr': 0.0004940999445549762, 'samples': 1022784, 'steps': 5326, 'loss/train': 0.9170462787151337} 01/27/2022 00:53:02 - INFO - codeparrot_training - Step 5327: {'lr': 0.0004940964102149219, 'samples': 1022976, 'steps': 5327, 'loss/train': 0.6725068241357803} 01/27/2022 00:53:05 - INFO - codeparrot_training - Step 5328: {'lr': 0.0004940928748292363, 'samples': 1023168, 'steps': 5328, 'loss/train': 1.388189435005188} 01/27/2022 00:53:09 - INFO - codeparrot_training - Step 5329: {'lr': 0.0004940893383979341, 'samples': 1023360, 'steps': 5329, 'loss/train': 1.2298143804073334} 01/27/2022 00:53:12 - INFO - codeparrot_training - Step 5330: {'lr': 0.0004940858009210308, 'samples': 1023552, 'steps': 5330, 'loss/train': 1.0168286561965942} 
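For readers who want to work with this log rather than eyeball it, every per-step record above follows the same pattern: a timestamp, a log level, the codeparrot_training logger name, and a Python-style dict with 'lr', 'samples', 'steps', and 'loss/train'. The snippet below is a minimal parsing sketch, not part of the training run itself; the regex is inferred from the visible record format, and the file name training.log plus the particular summary statistics are assumptions for illustration only.

    # Illustrative sketch: parse records of the form
    # "MM/DD/YYYY HH:MM:SS - LEVEL - codeparrot_training - Step N: {...}"
    # and print a few summary numbers. "training.log" is an assumed file name.
    import ast
    import re
    from statistics import mean

    RECORD = re.compile(
        r"(\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}) - (\w+) - codeparrot_training"
        r" - Step (\d+): (\{.*?\})",
        re.DOTALL,  # tolerate records that are wrapped across physical lines
    )

    def parse_log(text):
        """Yield (step, metrics_dict) for every training record found in the raw text."""
        for _timestamp, _level, step, payload in RECORD.findall(text):
            yield int(step), ast.literal_eval(payload)

    if __name__ == "__main__":
        with open("training.log") as f:  # assumed file name
            records = dict(parse_log(f.read()))
        losses = [m["loss/train"] for m in records.values()]
        print(f"steps parsed: {len(records)}")
        print(f"mean loss:    {mean(losses):.4f}")
        print(f"last lr:      {records[max(records)]['lr']:.3e}")

Run against the text of this log, the script would report the number of step records it found, the mean of 'loss/train' over those records, and the learning rate at the latest parsed step; swapping the print statements for a plot of loss and lr versus step is a natural extension.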
01/27/2022 00:53:15 - INFO - codeparrot_training - Step 5331: {'lr': 0.0004940822623985414, 'samples': 1023744, 'steps': 5331, 'loss/train': 0.8379828035831451} 01/27/2022 00:53:18 - INFO - codeparrot_training - Step 5332: {'lr': 0.0004940787228304811, 'samples': 1023936, 'steps': 5332, 'loss/train': 0.5073303133249283} 01/27/2022 00:53:21 - INFO - codeparrot_training - Step 5333: {'lr': 0.0004940751822168651, 'samples': 1024128, 'steps': 5333, 'loss/train': 1.12864351272583} 01/27/2022 00:53:24 - INFO - codeparrot_training - Step 5334: {'lr': 0.0004940716405577086, 'samples': 1024320, 'steps': 5334, 'loss/train': 0.6170992702245712} 01/27/2022 00:53:29 - INFO - codeparrot_training - Step 5335: {'lr': 0.0004940680978530265, 'samples': 1024512, 'steps': 5335, 'loss/train': 1.170753836631775} 01/27/2022 00:53:32 - INFO - codeparrot_training - Step 5336: {'lr': 0.0004940645541028343, 'samples': 1024704, 'steps': 5336, 'loss/train': 0.5778009742498398} 01/27/2022 00:53:35 - INFO - codeparrot_training - Step 5337: {'lr': 0.0004940610093071469, 'samples': 1024896, 'steps': 5337, 'loss/train': 0.6424407362937927} 01/27/2022 00:53:38 - INFO - codeparrot_training - Step 5338: {'lr': 0.0004940574634659798, 'samples': 1025088, 'steps': 5338, 'loss/train': 0.9023905098438263} 01/27/2022 00:53:41 - INFO - codeparrot_training - Step 5339: {'lr': 0.000494053916579348, 'samples': 1025280, 'steps': 5339, 'loss/train': 1.5037182569503784} 01/27/2022 00:53:45 - INFO - codeparrot_training - Step 5340: {'lr': 0.0004940503686472667, 'samples': 1025472, 'steps': 5340, 'loss/train': 0.9630154967308044} 01/27/2022 00:53:48 - INFO - codeparrot_training - Step 5341: {'lr': 0.0004940468196697511, 'samples': 1025664, 'steps': 5341, 'loss/train': 1.1870900988578796} 01/27/2022 00:53:51 - INFO - codeparrot_training - Step 5342: {'lr': 0.0004940432696468164, 'samples': 1025856, 'steps': 5342, 'loss/train': 0.721158429980278} 01/27/2022 00:53:54 - INFO - codeparrot_training - Step 5343: {'lr': 0.0004940397185784778, 'samples': 1026048, 'steps': 5343, 'loss/train': 0.7759350836277008} 01/27/2022 00:53:59 - INFO - codeparrot_training - Step 5344: {'lr': 0.0004940361664647506, 'samples': 1026240, 'steps': 5344, 'loss/train': 0.8501045107841492} 01/27/2022 00:54:02 - INFO - codeparrot_training - Step 5345: {'lr': 0.0004940326133056499, 'samples': 1026432, 'steps': 5345, 'loss/train': 1.481533706188202} 01/27/2022 00:54:05 - INFO - codeparrot_training - Step 5346: {'lr': 0.000494029059101191, 'samples': 1026624, 'steps': 5346, 'loss/train': 1.0454780459403992} 01/27/2022 00:54:08 - INFO - codeparrot_training - Step 5347: {'lr': 0.0004940255038513891, 'samples': 1026816, 'steps': 5347, 'loss/train': 0.8516926467418671} 01/27/2022 00:54:12 - INFO - codeparrot_training - Step 5348: {'lr': 0.0004940219475562593, 'samples': 1027008, 'steps': 5348, 'loss/train': 0.988204836845398} 01/27/2022 00:54:15 - INFO - codeparrot_training - Step 5349: {'lr': 0.0004940183902158172, 'samples': 1027200, 'steps': 5349, 'loss/train': 1.2282470762729645} 01/27/2022 00:54:18 - INFO - codeparrot_training - Step 5350: {'lr': 0.0004940148318300777, 'samples': 1027392, 'steps': 5350, 'loss/train': 1.032161682844162} 01/27/2022 00:54:21 - INFO - codeparrot_training - Step 5351: {'lr': 0.0004940112723990561, 'samples': 1027584, 'steps': 5351, 'loss/train': 1.124710589647293} 01/27/2022 00:54:24 - INFO - codeparrot_training - Step 5352: {'lr': 0.0004940077119227678, 'samples': 1027776, 'steps': 5352, 'loss/train': 0.7442672699689865} 01/27/2022 00:54:28 - 
INFO - codeparrot_training - Step 5353: {'lr': 0.0004940041504012279, 'samples': 1027968, 'steps': 5353, 'loss/train': 0.9910921454429626} 01/27/2022 00:54:32 - INFO - codeparrot_training - Step 5354: {'lr': 0.0004940005878344517, 'samples': 1028160, 'steps': 5354, 'loss/train': 0.7462829947471619} 01/27/2022 00:54:35 - INFO - codeparrot_training - Step 5355: {'lr': 0.0004939970242224544, 'samples': 1028352, 'steps': 5355, 'loss/train': 0.6867945492267609} 01/27/2022 00:54:38 - INFO - codeparrot_training - Step 5356: {'lr': 0.0004939934595652513, 'samples': 1028544, 'steps': 5356, 'loss/train': 0.1100204810500145} 01/27/2022 00:54:41 - INFO - codeparrot_training - Step 5357: {'lr': 0.0004939898938628578, 'samples': 1028736, 'steps': 5357, 'loss/train': 0.4141118824481964} 01/27/2022 00:54:44 - INFO - codeparrot_training - Step 5358: {'lr': 0.000493986327115289, 'samples': 1028928, 'steps': 5358, 'loss/train': 0.9952563643455505} 01/27/2022 00:54:47 - INFO - codeparrot_training - Step 5359: {'lr': 0.0004939827593225602, 'samples': 1029120, 'steps': 5359, 'loss/train': 0.6502967029809952} 01/27/2022 00:54:51 - INFO - codeparrot_training - Step 5360: {'lr': 0.0004939791904846869, 'samples': 1029312, 'steps': 5360, 'loss/train': 0.768248587846756} 01/27/2022 00:54:54 - INFO - codeparrot_training - Step 5361: {'lr': 0.0004939756206016841, 'samples': 1029504, 'steps': 5361, 'loss/train': 0.5971511900424957} 01/27/2022 00:54:58 - INFO - codeparrot_training - Step 5362: {'lr': 0.0004939720496735672, 'samples': 1029696, 'steps': 5362, 'loss/train': 0.5343149900436401} 01/27/2022 00:55:02 - INFO - codeparrot_training - Step 5363: {'lr': 0.0004939684777003516, 'samples': 1029888, 'steps': 5363, 'loss/train': 0.7532182037830353} 01/27/2022 00:55:05 - INFO - codeparrot_training - Step 5364: {'lr': 0.0004939649046820524, 'samples': 1030080, 'steps': 5364, 'loss/train': 0.6543085277080536} 01/27/2022 00:55:08 - INFO - codeparrot_training - Step 5365: {'lr': 0.0004939613306186851, 'samples': 1030272, 'steps': 5365, 'loss/train': 0.7264645993709564} 01/27/2022 00:55:11 - INFO - codeparrot_training - Step 5366: {'lr': 0.0004939577555102649, 'samples': 1030464, 'steps': 5366, 'loss/train': 1.2369433343410492} 01/27/2022 00:55:14 - INFO - codeparrot_training - Step 5367: {'lr': 0.0004939541793568072, 'samples': 1030656, 'steps': 5367, 'loss/train': 0.8641359508037567} 01/27/2022 00:55:17 - INFO - codeparrot_training - Step 5368: {'lr': 0.000493950602158327, 'samples': 1030848, 'steps': 5368, 'loss/train': 0.6847621947526932} 01/27/2022 00:55:20 - INFO - codeparrot_training - Step 5369: {'lr': 0.0004939470239148403, 'samples': 1031040, 'steps': 5369, 'loss/train': 0.7160131931304932} 01/27/2022 00:55:24 - INFO - codeparrot_training - Step 5370: {'lr': 0.0004939434446263617, 'samples': 1031232, 'steps': 5370, 'loss/train': 1.0526277422904968} 01/27/2022 00:55:29 - INFO - codeparrot_training - Step 5371: {'lr': 0.000493939864292907, 'samples': 1031424, 'steps': 5371, 'loss/train': 1.431679755449295} 01/27/2022 00:55:32 - INFO - codeparrot_training - Step 5372: {'lr': 0.0004939362829144913, 'samples': 1031616, 'steps': 5372, 'loss/train': 1.0063089430332184} 01/27/2022 00:55:35 - INFO - codeparrot_training - Step 5373: {'lr': 0.00049393270049113, 'samples': 1031808, 'steps': 5373, 'loss/train': 0.78043994307518} 01/27/2022 00:55:38 - INFO - codeparrot_training - Step 5374: {'lr': 0.0004939291170228385, 'samples': 1032000, 'steps': 5374, 'loss/train': 0.6897190511226654} 01/27/2022 00:55:42 - INFO - 
codeparrot_training - Step 5375: {'lr': 0.0004939255325096321, 'samples': 1032192, 'steps': 5375, 'loss/train': 0.7810074090957642} 01/27/2022 00:55:45 - INFO - codeparrot_training - Step 5376: {'lr': 0.0004939219469515262, 'samples': 1032384, 'steps': 5376, 'loss/train': 0.902347594499588} 01/27/2022 00:55:48 - INFO - codeparrot_training - Step 5377: {'lr': 0.0004939183603485363, 'samples': 1032576, 'steps': 5377, 'loss/train': 0.6359347254037857} 01/27/2022 00:55:51 - INFO - codeparrot_training - Step 5378: {'lr': 0.0004939147727006773, 'samples': 1032768, 'steps': 5378, 'loss/train': 0.9974945783615112} 01/27/2022 00:55:56 - INFO - codeparrot_training - Step 5379: {'lr': 0.000493911184007965, 'samples': 1032960, 'steps': 5379, 'loss/train': 1.134657382965088} 01/27/2022 00:55:59 - INFO - codeparrot_training - Step 5380: {'lr': 0.0004939075942704147, 'samples': 1033152, 'steps': 5380, 'loss/train': 1.0010448396205902} 01/27/2022 00:56:02 - INFO - codeparrot_training - Step 5381: {'lr': 0.0004939040034880416, 'samples': 1033344, 'steps': 5381, 'loss/train': 0.9744475185871124} 01/27/2022 00:56:05 - INFO - codeparrot_training - Step 5382: {'lr': 0.0004939004116608612, 'samples': 1033536, 'steps': 5382, 'loss/train': 1.16484135389328} 01/27/2022 00:56:09 - INFO - codeparrot_training - Step 5383: {'lr': 0.000493896818788889, 'samples': 1033728, 'steps': 5383, 'loss/train': 0.6347774416208267} 01/27/2022 00:56:12 - INFO - codeparrot_training - Step 5384: {'lr': 0.0004938932248721401, 'samples': 1033920, 'steps': 5384, 'loss/train': 1.213702529668808} 01/27/2022 00:56:15 - INFO - codeparrot_training - Step 5385: {'lr': 0.0004938896299106302, 'samples': 1034112, 'steps': 5385, 'loss/train': 1.21448814868927} 01/27/2022 00:56:18 - INFO - codeparrot_training - Step 5386: {'lr': 0.0004938860339043746, 'samples': 1034304, 'steps': 5386, 'loss/train': 1.2338314354419708} 01/27/2022 00:56:21 - INFO - codeparrot_training - Step 5387: {'lr': 0.0004938824368533886, 'samples': 1034496, 'steps': 5387, 'loss/train': 0.7382543385028839} 01/27/2022 00:56:25 - INFO - codeparrot_training - Step 5388: {'lr': 0.0004938788387576878, 'samples': 1034688, 'steps': 5388, 'loss/train': 0.9653415083885193} 01/27/2022 00:56:29 - INFO - codeparrot_training - Step 5389: {'lr': 0.0004938752396172873, 'samples': 1034880, 'steps': 5389, 'loss/train': 0.9590345621109009} 01/27/2022 00:56:32 - INFO - codeparrot_training - Step 5390: {'lr': 0.0004938716394322028, 'samples': 1035072, 'steps': 5390, 'loss/train': 1.1591531038284302} 01/27/2022 00:56:35 - INFO - codeparrot_training - Step 5391: {'lr': 0.0004938680382024497, 'samples': 1035264, 'steps': 5391, 'loss/train': 0.8427523076534271} 01/27/2022 00:56:38 - INFO - codeparrot_training - Step 5392: {'lr': 0.0004938644359280433, 'samples': 1035456, 'steps': 5392, 'loss/train': 0.9192105531692505} 01/27/2022 00:56:41 - INFO - codeparrot_training - Step 5393: {'lr': 0.000493860832608999, 'samples': 1035648, 'steps': 5393, 'loss/train': 1.0776862800121307} 01/27/2022 00:56:44 - INFO - codeparrot_training - Step 5394: {'lr': 0.0004938572282453326, 'samples': 1035840, 'steps': 5394, 'loss/train': 0.39555229246616364} 01/27/2022 00:56:47 - INFO - codeparrot_training - Step 5395: {'lr': 0.000493853622837059, 'samples': 1036032, 'steps': 5395, 'loss/train': 0.5684731900691986} 01/27/2022 00:56:50 - INFO - codeparrot_training - Step 5396: {'lr': 0.000493850016384194, 'samples': 1036224, 'steps': 5396, 'loss/train': 0.6466943174600601} 01/27/2022 00:56:55 - INFO - codeparrot_training - 
Step 5397: {'lr': 0.000493846408886753, 'samples': 1036416, 'steps': 5397, 'loss/train': 0.8909723460674286} 01/27/2022 00:56:58 - INFO - codeparrot_training - Step 5398: {'lr': 0.0004938428003447514, 'samples': 1036608, 'steps': 5398, 'loss/train': 0.8359588980674744} 01/27/2022 00:57:01 - INFO - codeparrot_training - Step 5399: {'lr': 0.0004938391907582046, 'samples': 1036800, 'steps': 5399, 'loss/train': 1.2466420233249664} 01/27/2022 00:57:04 - INFO - codeparrot_training - Step 5400: {'lr': 0.0004938355801271282, 'samples': 1036992, 'steps': 5400, 'loss/train': 0.42668670415878296} 01/27/2022 00:57:08 - INFO - codeparrot_training - Step 5401: {'lr': 0.0004938319684515375, 'samples': 1037184, 'steps': 5401, 'loss/train': 0.7939328849315643} 01/27/2022 00:57:11 - INFO - codeparrot_training - Step 5402: {'lr': 0.0004938283557314483, 'samples': 1037376, 'steps': 5402, 'loss/train': 1.0741088390350342} 01/27/2022 00:57:14 - INFO - codeparrot_training - Step 5403: {'lr': 0.0004938247419668757, 'samples': 1037568, 'steps': 5403, 'loss/train': 0.7926310300827026} 01/27/2022 00:57:17 - INFO - codeparrot_training - Step 5404: {'lr': 0.0004938211271578352, 'samples': 1037760, 'steps': 5404, 'loss/train': 0.9707221984863281} 01/27/2022 00:57:20 - INFO - codeparrot_training - Step 5405: {'lr': 0.0004938175113043426, 'samples': 1037952, 'steps': 5405, 'loss/train': 1.038948655128479} 01/27/2022 00:57:25 - INFO - codeparrot_training - Step 5406: {'lr': 0.0004938138944064131, 'samples': 1038144, 'steps': 5406, 'loss/train': 1.0120647847652435} 01/27/2022 00:57:28 - INFO - codeparrot_training - Step 5407: {'lr': 0.0004938102764640624, 'samples': 1038336, 'steps': 5407, 'loss/train': 0.9464127123355865} 01/27/2022 00:57:31 - INFO - codeparrot_training - Step 5408: {'lr': 0.0004938066574773058, 'samples': 1038528, 'steps': 5408, 'loss/train': 1.4582170844078064} 01/27/2022 00:57:35 - INFO - codeparrot_training - Step 5409: {'lr': 0.000493803037446159, 'samples': 1038720, 'steps': 5409, 'loss/train': 1.4783585965633392} 01/27/2022 00:57:38 - INFO - codeparrot_training - Step 5410: {'lr': 0.0004937994163706374, 'samples': 1038912, 'steps': 5410, 'loss/train': 0.7715415358543396} 01/27/2022 00:57:41 - INFO - codeparrot_training - Step 5411: {'lr': 0.0004937957942507564, 'samples': 1039104, 'steps': 5411, 'loss/train': 1.0054458975791931} 01/27/2022 00:57:44 - INFO - codeparrot_training - Step 5412: {'lr': 0.0004937921710865317, 'samples': 1039296, 'steps': 5412, 'loss/train': 0.7571540772914886} 01/27/2022 00:57:47 - INFO - codeparrot_training - Step 5413: {'lr': 0.0004937885468779787, 'samples': 1039488, 'steps': 5413, 'loss/train': 1.151543766260147} 01/27/2022 00:57:50 - INFO - codeparrot_training - Step 5414: {'lr': 0.000493784921625113, 'samples': 1039680, 'steps': 5414, 'loss/train': 1.8367883563041687} 01/27/2022 00:57:55 - INFO - codeparrot_training - Step 5415: {'lr': 0.0004937812953279502, 'samples': 1039872, 'steps': 5415, 'loss/train': 0.34823445975780487} 01/27/2022 00:57:59 - INFO - codeparrot_training - Step 5416: {'lr': 0.0004937776679865057, 'samples': 1040064, 'steps': 5416, 'loss/train': 0.6815518587827682} 01/27/2022 00:58:02 - INFO - codeparrot_training - Step 5417: {'lr': 0.000493774039600795, 'samples': 1040256, 'steps': 5417, 'loss/train': 0.26778361201286316} 01/27/2022 00:58:05 - INFO - codeparrot_training - Step 5418: {'lr': 0.0004937704101708338, 'samples': 1040448, 'steps': 5418, 'loss/train': 1.0805065333843231} 01/27/2022 00:58:08 - INFO - codeparrot_training - Step 5419: 
{'lr': 0.0004937667796966374, 'samples': 1040640, 'steps': 5419, 'loss/train': 1.439744085073471} 01/27/2022 00:58:11 - INFO - codeparrot_training - Step 5420: {'lr': 0.0004937631481782218, 'samples': 1040832, 'steps': 5420, 'loss/train': 1.7872501015663147} 01/27/2022 00:58:14 - INFO - codeparrot_training - Step 5421: {'lr': 0.000493759515615602, 'samples': 1041024, 'steps': 5421, 'loss/train': 0.6549527943134308} 01/27/2022 00:58:17 - INFO - codeparrot_training - Step 5422: {'lr': 0.000493755882008794, 'samples': 1041216, 'steps': 5422, 'loss/train': 1.0962396562099457} 01/27/2022 00:58:21 - INFO - codeparrot_training - Step 5423: {'lr': 0.0004937522473578132, 'samples': 1041408, 'steps': 5423, 'loss/train': 1.1066783666610718} 01/27/2022 00:58:25 - INFO - codeparrot_training - Step 5424: {'lr': 0.0004937486116626752, 'samples': 1041600, 'steps': 5424, 'loss/train': 0.8417403995990753} 01/27/2022 00:58:28 - INFO - codeparrot_training - Step 5425: {'lr': 0.0004937449749233954, 'samples': 1041792, 'steps': 5425, 'loss/train': 0.9527096450328827} 01/27/2022 00:58:31 - INFO - codeparrot_training - Step 5426: {'lr': 0.0004937413371399897, 'samples': 1041984, 'steps': 5426, 'loss/train': 0.5159370750188828} 01/27/2022 00:58:34 - INFO - codeparrot_training - Step 5427: {'lr': 0.0004937376983124734, 'samples': 1042176, 'steps': 5427, 'loss/train': 1.1205935776233673} 01/27/2022 00:58:38 - INFO - codeparrot_training - Step 5428: {'lr': 0.0004937340584408622, 'samples': 1042368, 'steps': 5428, 'loss/train': 0.780012309551239} 01/27/2022 00:58:41 - INFO - codeparrot_training - Step 5429: {'lr': 0.0004937304175251717, 'samples': 1042560, 'steps': 5429, 'loss/train': 1.5699752569198608} 01/27/2022 00:58:44 - INFO - codeparrot_training - Step 5430: {'lr': 0.0004937267755654174, 'samples': 1042752, 'steps': 5430, 'loss/train': 0.9231610000133514} 01/27/2022 00:58:47 - INFO - codeparrot_training - Step 5431: {'lr': 0.0004937231325616152, 'samples': 1042944, 'steps': 5431, 'loss/train': 0.4362407773733139} 01/27/2022 00:58:50 - INFO - codeparrot_training - Step 5432: {'lr': 0.0004937194885137803, 'samples': 1043136, 'steps': 5432, 'loss/train': 0.6324776262044907} 01/27/2022 00:58:55 - INFO - codeparrot_training - Step 5433: {'lr': 0.0004937158434219286, 'samples': 1043328, 'steps': 5433, 'loss/train': 1.0851335227489471} 01/27/2022 00:58:58 - INFO - codeparrot_training - Step 5434: {'lr': 0.0004937121972860755, 'samples': 1043520, 'steps': 5434, 'loss/train': 1.0822890400886536} 01/27/2022 00:59:02 - INFO - codeparrot_training - Step 5435: {'lr': 0.0004937085501062369, 'samples': 1043712, 'steps': 5435, 'loss/train': 1.000714123249054} 01/27/2022 00:59:05 - INFO - codeparrot_training - Step 5436: {'lr': 0.0004937049018824282, 'samples': 1043904, 'steps': 5436, 'loss/train': 1.0224141776561737} 01/27/2022 00:59:08 - INFO - codeparrot_training - Step 5437: {'lr': 0.000493701252614665, 'samples': 1044096, 'steps': 5437, 'loss/train': 0.5396161526441574} 01/27/2022 00:59:11 - INFO - codeparrot_training - Step 5438: {'lr': 0.0004936976023029631, 'samples': 1044288, 'steps': 5438, 'loss/train': 0.8262980282306671} 01/27/2022 00:59:14 - INFO - codeparrot_training - Step 5439: {'lr': 0.000493693950947338, 'samples': 1044480, 'steps': 5439, 'loss/train': 0.5498960763216019} 01/27/2022 00:59:17 - INFO - codeparrot_training - Step 5440: {'lr': 0.0004936902985478055, 'samples': 1044672, 'steps': 5440, 'loss/train': 1.9294120073318481} 01/27/2022 00:59:22 - INFO - codeparrot_training - Step 5441: {'lr': 
0.000493686645104381, 'samples': 1044864, 'steps': 5441, 'loss/train': 0.7356658726930618} 01/27/2022 00:59:25 - INFO - codeparrot_training - Step 5442: {'lr': 0.0004936829906170804, 'samples': 1045056, 'steps': 5442, 'loss/train': 0.6585300117731094} 01/27/2022 00:59:28 - INFO - codeparrot_training - Step 5443: {'lr': 0.0004936793350859192, 'samples': 1045248, 'steps': 5443, 'loss/train': 0.8956536054611206} 01/27/2022 00:59:31 - INFO - codeparrot_training - Step 5444: {'lr': 0.0004936756785109131, 'samples': 1045440, 'steps': 5444, 'loss/train': 0.851976066827774} 01/27/2022 00:59:34 - INFO - codeparrot_training - Step 5445: {'lr': 0.0004936720208920778, 'samples': 1045632, 'steps': 5445, 'loss/train': 1.1174467206001282} 01/27/2022 00:59:38 - INFO - codeparrot_training - Step 5446: {'lr': 0.0004936683622294289, 'samples': 1045824, 'steps': 5446, 'loss/train': 0.5981727987527847} 01/27/2022 00:59:41 - INFO - codeparrot_training - Step 5447: {'lr': 0.0004936647025229822, 'samples': 1046016, 'steps': 5447, 'loss/train': 1.278153419494629} 01/27/2022 00:59:44 - INFO - codeparrot_training - Step 5448: {'lr': 0.0004936610417727532, 'samples': 1046208, 'steps': 5448, 'loss/train': 1.0054152309894562} 01/27/2022 00:59:47 - INFO - codeparrot_training - Step 5449: {'lr': 0.0004936573799787575, 'samples': 1046400, 'steps': 5449, 'loss/train': 1.069281667470932} 01/27/2022 00:59:52 - INFO - codeparrot_training - Step 5450: {'lr': 0.0004936537171410112, 'samples': 1046592, 'steps': 5450, 'loss/train': 0.9157505929470062} 01/27/2022 00:59:55 - INFO - codeparrot_training - Step 5451: {'lr': 0.0004936500532595297, 'samples': 1046784, 'steps': 5451, 'loss/train': 1.0445912182331085} 01/27/2022 00:59:58 - INFO - codeparrot_training - Step 5452: {'lr': 0.0004936463883343287, 'samples': 1046976, 'steps': 5452, 'loss/train': 1.1098119020462036} 01/27/2022 01:00:02 - INFO - codeparrot_training - Step 5453: {'lr': 0.000493642722365424, 'samples': 1047168, 'steps': 5453, 'loss/train': 1.0391857624053955} 01/27/2022 01:00:05 - INFO - codeparrot_training - Step 5454: {'lr': 0.0004936390553528313, 'samples': 1047360, 'steps': 5454, 'loss/train': 1.0665278434753418} 01/27/2022 01:00:08 - INFO - codeparrot_training - Step 5455: {'lr': 0.0004936353872965661, 'samples': 1047552, 'steps': 5455, 'loss/train': 0.963932454586029} 01/27/2022 01:00:11 - INFO - codeparrot_training - Step 5456: {'lr': 0.0004936317181966443, 'samples': 1047744, 'steps': 5456, 'loss/train': 0.7667041718959808} 01/27/2022 01:00:14 - INFO - codeparrot_training - Step 5457: {'lr': 0.0004936280480530816, 'samples': 1047936, 'steps': 5457, 'loss/train': 1.1040500700473785} 01/27/2022 01:00:17 - INFO - codeparrot_training - Step 5458: {'lr': 0.0004936243768658937, 'samples': 1048128, 'steps': 5458, 'loss/train': 0.6105572730302811} 01/27/2022 01:00:22 - INFO - codeparrot_training - Step 5459: {'lr': 0.0004936207046350963, 'samples': 1048320, 'steps': 5459, 'loss/train': 1.0320798754692078} 01/27/2022 01:00:25 - INFO - codeparrot_training - Step 5460: {'lr': 0.0004936170313607053, 'samples': 1048512, 'steps': 5460, 'loss/train': 0.5990137159824371} 01/27/2022 01:00:28 - INFO - codeparrot_training - Step 5461: {'lr': 0.0004936133570427361, 'samples': 1048704, 'steps': 5461, 'loss/train': 1.0396022200584412} 01/27/2022 01:00:32 - INFO - codeparrot_training - Step 5462: {'lr': 0.0004936096816812046, 'samples': 1048896, 'steps': 5462, 'loss/train': 0.4787595570087433} 01/27/2022 01:00:35 - INFO - codeparrot_training - Step 5463: {'lr': 
0.0004936060052761268, 'samples': 1049088, 'steps': 5463, 'loss/train': 0.8142186105251312} 01/27/2022 01:00:38 - INFO - codeparrot_training - Step 5464: {'lr': 0.0004936023278275182, 'samples': 1049280, 'steps': 5464, 'loss/train': 0.8242056369781494} 01/27/2022 01:00:41 - INFO - codeparrot_training - Step 5465: {'lr': 0.0004935986493353944, 'samples': 1049472, 'steps': 5465, 'loss/train': 1.2670840322971344} 01/27/2022 01:00:44 - INFO - codeparrot_training - Step 5466: {'lr': 0.0004935949697997715, 'samples': 1049664, 'steps': 5466, 'loss/train': 1.2428186237812042} 01/27/2022 01:00:47 - INFO - codeparrot_training - Step 5467: {'lr': 0.000493591289220665, 'samples': 1049856, 'steps': 5467, 'loss/train': 0.8091789186000824} 01/27/2022 01:00:52 - INFO - codeparrot_training - Step 5468: {'lr': 0.0004935876075980908, 'samples': 1050048, 'steps': 5468, 'loss/train': 0.8508340716362} 01/27/2022 01:00:55 - INFO - codeparrot_training - Step 5469: {'lr': 0.0004935839249320647, 'samples': 1050240, 'steps': 5469, 'loss/train': 0.4662923365831375} 01/27/2022 01:00:58 - INFO - codeparrot_training - Step 5470: {'lr': 0.0004935802412226024, 'samples': 1050432, 'steps': 5470, 'loss/train': 0.7793891429901123} 01/27/2022 01:01:01 - INFO - codeparrot_training - Step 5471: {'lr': 0.0004935765564697195, 'samples': 1050624, 'steps': 5471, 'loss/train': 1.0336825847625732} 01/27/2022 01:01:05 - INFO - codeparrot_training - Step 5472: {'lr': 0.0004935728706734322, 'samples': 1050816, 'steps': 5472, 'loss/train': 0.889893651008606} 01/27/2022 01:01:08 - INFO - codeparrot_training - Step 5473: {'lr': 0.000493569183833756, 'samples': 1051008, 'steps': 5473, 'loss/train': 0.7369284331798553} 01/27/2022 01:01:11 - INFO - codeparrot_training - Step 5474: {'lr': 0.0004935654959507068, 'samples': 1051200, 'steps': 5474, 'loss/train': 0.8595706522464752} 01/27/2022 01:01:14 - INFO - codeparrot_training - Step 5475: {'lr': 0.0004935618070243003, 'samples': 1051392, 'steps': 5475, 'loss/train': 1.328216403722763} 01/27/2022 01:01:17 - INFO - codeparrot_training - Step 5476: {'lr': 0.0004935581170545523, 'samples': 1051584, 'steps': 5476, 'loss/train': 0.3854576647281647} 01/27/2022 01:01:22 - INFO - codeparrot_training - Step 5477: {'lr': 0.0004935544260414787, 'samples': 1051776, 'steps': 5477, 'loss/train': 0.23161715269088745} 01/27/2022 01:01:25 - INFO - codeparrot_training - Step 5478: {'lr': 0.0004935507339850953, 'samples': 1051968, 'steps': 5478, 'loss/train': 0.8014706075191498} 01/27/2022 01:01:29 - INFO - codeparrot_training - Step 5479: {'lr': 0.0004935470408854179, 'samples': 1052160, 'steps': 5479, 'loss/train': 0.7586125731468201} 01/27/2022 01:01:32 - INFO - codeparrot_training - Step 5480: {'lr': 0.0004935433467424624, 'samples': 1052352, 'steps': 5480, 'loss/train': 0.9132244884967804} 01/27/2022 01:01:35 - INFO - codeparrot_training - Step 5481: {'lr': 0.0004935396515562444, 'samples': 1052544, 'steps': 5481, 'loss/train': 0.8676745891571045} 01/27/2022 01:01:38 - INFO - codeparrot_training - Step 5482: {'lr': 0.0004935359553267798, 'samples': 1052736, 'steps': 5482, 'loss/train': 0.6028604954481125} 01/27/2022 01:01:41 - INFO - codeparrot_training - Step 5483: {'lr': 0.0004935322580540847, 'samples': 1052928, 'steps': 5483, 'loss/train': 0.8776095807552338} 01/27/2022 01:01:44 - INFO - codeparrot_training - Step 5484: {'lr': 0.0004935285597381747, 'samples': 1053120, 'steps': 5484, 'loss/train': 2.143848180770874} 01/27/2022 01:01:49 - INFO - codeparrot_training - Step 5485: {'lr': 
0.0004935248603790656, 'samples': 1053312, 'steps': 5485, 'loss/train': 0.9772946834564209} 01/27/2022 01:01:52 - INFO - codeparrot_training - Step 5486: {'lr': 0.0004935211599767733, 'samples': 1053504, 'steps': 5486, 'loss/train': 1.0938246846199036} 01/27/2022 01:01:55 - INFO - codeparrot_training - Step 5487: {'lr': 0.0004935174585313138, 'samples': 1053696, 'steps': 5487, 'loss/train': 1.1557316780090332} 01/27/2022 01:01:58 - INFO - codeparrot_training - Step 5488: {'lr': 0.0004935137560427027, 'samples': 1053888, 'steps': 5488, 'loss/train': 0.4970819056034088} 01/27/2022 01:02:01 - INFO - codeparrot_training - Step 5489: {'lr': 0.000493510052510956, 'samples': 1054080, 'steps': 5489, 'loss/train': 1.0017674267292023} 01/27/2022 01:02:05 - INFO - codeparrot_training - Step 5490: {'lr': 0.0004935063479360897, 'samples': 1054272, 'steps': 5490, 'loss/train': 1.2768966257572174} 01/27/2022 01:02:08 - INFO - codeparrot_training - Step 5491: {'lr': 0.0004935026423181194, 'samples': 1054464, 'steps': 5491, 'loss/train': 0.8217293322086334} 01/27/2022 01:02:11 - INFO - codeparrot_training - Step 5492: {'lr': 0.0004934989356570611, 'samples': 1054656, 'steps': 5492, 'loss/train': 1.0866258144378662} 01/27/2022 01:02:14 - INFO - codeparrot_training - Step 5493: {'lr': 0.0004934952279529308, 'samples': 1054848, 'steps': 5493, 'loss/train': 1.1826252043247223} 01/27/2022 01:02:19 - INFO - codeparrot_training - Step 5494: {'lr': 0.0004934915192057441, 'samples': 1055040, 'steps': 5494, 'loss/train': 1.3658278584480286} 01/27/2022 01:02:22 - INFO - codeparrot_training - Step 5495: {'lr': 0.0004934878094155172, 'samples': 1055232, 'steps': 5495, 'loss/train': 1.22800675034523} 01/27/2022 01:02:25 - INFO - codeparrot_training - Step 5496: {'lr': 0.0004934840985822657, 'samples': 1055424, 'steps': 5496, 'loss/train': 0.8760682046413422} 01/27/2022 01:02:28 - INFO - codeparrot_training - Step 5497: {'lr': 0.0004934803867060058, 'samples': 1055616, 'steps': 5497, 'loss/train': 0.8025699555873871} 01/27/2022 01:02:31 - INFO - codeparrot_training - Step 5498: {'lr': 0.0004934766737867531, 'samples': 1055808, 'steps': 5498, 'loss/train': 0.8014595210552216} 01/27/2022 01:02:34 - INFO - codeparrot_training - Step 5499: {'lr': 0.0004934729598245237, 'samples': 1056000, 'steps': 5499, 'loss/train': 0.809843122959137} 01/27/2022 01:02:37 - INFO - codeparrot_training - Step 5500: {'lr': 0.0004934692448193334, 'samples': 1056192, 'steps': 5500, 'loss/train': 0.9814674854278564} 01/27/2022 01:02:41 - INFO - codeparrot_training - Step 5501: {'lr': 0.0004934655287711982, 'samples': 1056384, 'steps': 5501, 'loss/train': 0.9587212800979614} 01/27/2022 01:02:44 - INFO - codeparrot_training - Step 5502: {'lr': 0.0004934618116801341, 'samples': 1056576, 'steps': 5502, 'loss/train': 0.8292405009269714} 01/27/2022 01:02:48 - INFO - codeparrot_training - Step 5503: {'lr': 0.0004934580935461567, 'samples': 1056768, 'steps': 5503, 'loss/train': 0.9329614341259003} 01/27/2022 01:02:51 - INFO - codeparrot_training - Step 5504: {'lr': 0.0004934543743692822, 'samples': 1056960, 'steps': 5504, 'loss/train': 0.5180657207965851} 01/27/2022 01:02:54 - INFO - codeparrot_training - Step 5505: {'lr': 0.0004934506541495265, 'samples': 1057152, 'steps': 5505, 'loss/train': 0.7517238557338715} 01/27/2022 01:02:58 - INFO - codeparrot_training - Step 5506: {'lr': 0.0004934469328869056, 'samples': 1057344, 'steps': 5506, 'loss/train': 0.7131989747285843} 01/27/2022 01:03:01 - INFO - codeparrot_training - Step 5507: {'lr': 
0.0004934432105814352, 'samples': 1057536, 'steps': 5507, 'loss/train': 0.873621940612793} 01/27/2022 01:03:04 - INFO - codeparrot_training - Step 5508: {'lr': 0.0004934394872331314, 'samples': 1057728, 'steps': 5508, 'loss/train': 1.1747479140758514} 01/27/2022 01:03:07 - INFO - codeparrot_training - Step 5509: {'lr': 0.0004934357628420101, 'samples': 1057920, 'steps': 5509, 'loss/train': 0.7743418514728546} 01/27/2022 01:03:10 - INFO - codeparrot_training - Step 5510: {'lr': 0.0004934320374080874, 'samples': 1058112, 'steps': 5510, 'loss/train': 0.8424888253211975} 01/27/2022 01:03:13 - INFO - codeparrot_training - Step 5511: {'lr': 0.000493428310931379, 'samples': 1058304, 'steps': 5511, 'loss/train': 0.7202334702014923} 01/27/2022 01:03:19 - INFO - codeparrot_training - Step 5512: {'lr': 0.0004934245834119013, 'samples': 1058496, 'steps': 5512, 'loss/train': 0.9077649414539337} 01/27/2022 01:03:22 - INFO - codeparrot_training - Step 5513: {'lr': 0.0004934208548496697, 'samples': 1058688, 'steps': 5513, 'loss/train': 0.6846790462732315} 01/27/2022 01:03:25 - INFO - codeparrot_training - Step 5514: {'lr': 0.0004934171252447006, 'samples': 1058880, 'steps': 5514, 'loss/train': 0.9848185479640961} 01/27/2022 01:03:28 - INFO - codeparrot_training - Step 5515: {'lr': 0.0004934133945970097, 'samples': 1059072, 'steps': 5515, 'loss/train': 0.9674980938434601} 01/27/2022 01:03:31 - INFO - codeparrot_training - Step 5516: {'lr': 0.0004934096629066133, 'samples': 1059264, 'steps': 5516, 'loss/train': 1.068875402212143} 01/27/2022 01:03:34 - INFO - codeparrot_training - Step 5517: {'lr': 0.000493405930173527, 'samples': 1059456, 'steps': 5517, 'loss/train': 0.8562102913856506} 01/27/2022 01:03:37 - INFO - codeparrot_training - Step 5518: {'lr': 0.0004934021963977671, 'samples': 1059648, 'steps': 5518, 'loss/train': 0.666742354631424} 01/27/2022 01:03:41 - INFO - codeparrot_training - Step 5519: {'lr': 0.0004933984615793494, 'samples': 1059840, 'steps': 5519, 'loss/train': 0.6245390474796295} 01/27/2022 01:03:44 - INFO - codeparrot_training - Step 5520: {'lr': 0.0004933947257182901, 'samples': 1060032, 'steps': 5520, 'loss/train': 1.0170432329177856} 01/27/2022 01:03:48 - INFO - codeparrot_training - Step 5521: {'lr': 0.000493390988814605, 'samples': 1060224, 'steps': 5521, 'loss/train': 0.7355740070343018} 01/27/2022 01:03:51 - INFO - codeparrot_training - Step 5522: {'lr': 0.0004933872508683101, 'samples': 1060416, 'steps': 5522, 'loss/train': 0.752920925617218} 01/27/2022 01:03:54 - INFO - codeparrot_training - Step 5523: {'lr': 0.0004933835118794217, 'samples': 1060608, 'steps': 5523, 'loss/train': 0.5990312397480011} 01/27/2022 01:03:58 - INFO - codeparrot_training - Step 5524: {'lr': 0.0004933797718479555, 'samples': 1060800, 'steps': 5524, 'loss/train': 0.9821886420249939} 01/27/2022 01:04:01 - INFO - codeparrot_training - Step 5525: {'lr': 0.0004933760307739277, 'samples': 1060992, 'steps': 5525, 'loss/train': 1.106119304895401} 01/27/2022 01:04:04 - INFO - codeparrot_training - Step 5526: {'lr': 0.0004933722886573542, 'samples': 1061184, 'steps': 5526, 'loss/train': 0.7905083298683167} 01/27/2022 01:04:07 - INFO - codeparrot_training - Step 5527: {'lr': 0.0004933685454982511, 'samples': 1061376, 'steps': 5527, 'loss/train': 0.939765214920044} 01/27/2022 01:04:10 - INFO - codeparrot_training - Step 5528: {'lr': 0.0004933648012966344, 'samples': 1061568, 'steps': 5528, 'loss/train': 1.173948973417282} 01/27/2022 01:04:13 - INFO - codeparrot_training - Step 5529: {'lr': 0.0004933610560525203, 
'samples': 1061760, 'steps': 5529, 'loss/train': 0.8380131125450134} 01/27/2022 01:04:18 - INFO - codeparrot_training - Step 5530: {'lr': 0.0004933573097659246, 'samples': 1061952, 'steps': 5530, 'loss/train': 0.9526005685329437} 01/27/2022 01:04:22 - INFO - codeparrot_training - Step 5531: {'lr': 0.0004933535624368634, 'samples': 1062144, 'steps': 5531, 'loss/train': 1.0481337010860443} 01/27/2022 01:04:25 - INFO - codeparrot_training - Step 5532: {'lr': 0.0004933498140653529, 'samples': 1062336, 'steps': 5532, 'loss/train': 0.6184575855731964} 01/27/2022 01:04:28 - INFO - codeparrot_training - Step 5533: {'lr': 0.0004933460646514092, 'samples': 1062528, 'steps': 5533, 'loss/train': 0.7701111137866974} 01/27/2022 01:04:31 - INFO - codeparrot_training - Step 5534: {'lr': 0.000493342314195048, 'samples': 1062720, 'steps': 5534, 'loss/train': 1.0392004251480103} 01/27/2022 01:04:34 - INFO - codeparrot_training - Step 5535: {'lr': 0.0004933385626962858, 'samples': 1062912, 'steps': 5535, 'loss/train': 0.9228495955467224} 01/27/2022 01:04:37 - INFO - codeparrot_training - Step 5536: {'lr': 0.0004933348101551383, 'samples': 1063104, 'steps': 5536, 'loss/train': 0.6989863961935043} 01/27/2022 01:04:40 - INFO - codeparrot_training - Step 5537: {'lr': 0.0004933310565716218, 'samples': 1063296, 'steps': 5537, 'loss/train': 1.1593808233737946} 01/27/2022 01:04:44 - INFO - codeparrot_training - Step 5538: {'lr': 0.0004933273019457524, 'samples': 1063488, 'steps': 5538, 'loss/train': 1.0563824772834778} 01/27/2022 01:04:48 - INFO - codeparrot_training - Step 5539: {'lr': 0.0004933235462775459, 'samples': 1063680, 'steps': 5539, 'loss/train': 0.33505024015903473} 01/27/2022 01:04:51 - INFO - codeparrot_training - Step 5540: {'lr': 0.0004933197895670187, 'samples': 1063872, 'steps': 5540, 'loss/train': 0.9564119875431061} 01/27/2022 01:04:54 - INFO - codeparrot_training - Step 5541: {'lr': 0.0004933160318141869, 'samples': 1064064, 'steps': 5541, 'loss/train': 0.9101941287517548} 01/27/2022 01:04:58 - INFO - codeparrot_training - Step 5542: {'lr': 0.0004933122730190663, 'samples': 1064256, 'steps': 5542, 'loss/train': 1.1498801708221436} 01/27/2022 01:05:01 - INFO - codeparrot_training - Step 5543: {'lr': 0.0004933085131816733, 'samples': 1064448, 'steps': 5543, 'loss/train': 1.4911058843135834} 01/27/2022 01:05:04 - INFO - codeparrot_training - Step 5544: {'lr': 0.0004933047523020239, 'samples': 1064640, 'steps': 5544, 'loss/train': 0.9126071333885193} 01/27/2022 01:05:07 - INFO - codeparrot_training - Step 5545: {'lr': 0.0004933009903801341, 'samples': 1064832, 'steps': 5545, 'loss/train': 1.1080713272094727} 01/27/2022 01:05:10 - INFO - codeparrot_training - Step 5546: {'lr': 0.0004932972274160202, 'samples': 1065024, 'steps': 5546, 'loss/train': 0.8899206519126892} 01/27/2022 01:05:14 - INFO - codeparrot_training - Step 5547: {'lr': 0.0004932934634096982, 'samples': 1065216, 'steps': 5547, 'loss/train': 0.926663339138031} 01/27/2022 01:05:18 - INFO - codeparrot_training - Step 5548: {'lr': 0.0004932896983611843, 'samples': 1065408, 'steps': 5548, 'loss/train': 0.9625017642974854} 01/27/2022 01:05:21 - INFO - codeparrot_training - Step 5549: {'lr': 0.0004932859322704944, 'samples': 1065600, 'steps': 5549, 'loss/train': 1.0855249464511871} 01/27/2022 01:05:24 - INFO - codeparrot_training - Step 5550: {'lr': 0.000493282165137645, 'samples': 1065792, 'steps': 5550, 'loss/train': 1.0397650301456451} 01/27/2022 01:05:27 - INFO - codeparrot_training - Step 5551: {'lr': 0.0004932783969626521, 'samples': 
1065984, 'steps': 5551, 'loss/train': 0.7294128388166428} 01/27/2022 01:05:30 - INFO - codeparrot_training - Step 5552: {'lr': 0.0004932746277455317, 'samples': 1066176, 'steps': 5552, 'loss/train': 1.0364496409893036} 01/27/2022 01:05:33 - INFO - codeparrot_training - Step 5553: {'lr': 0.0004932708574863, 'samples': 1066368, 'steps': 5553, 'loss/train': 1.1569565534591675} 01/27/2022 01:05:36 - INFO - codeparrot_training - Step 5554: {'lr': 0.0004932670861849733, 'samples': 1066560, 'steps': 5554, 'loss/train': 1.0685987770557404} 01/27/2022 01:05:40 - INFO - codeparrot_training - Step 5555: {'lr': 0.0004932633138415675, 'samples': 1066752, 'steps': 5555, 'loss/train': 0.860051304101944} 01/27/2022 01:05:45 - INFO - codeparrot_training - Step 5556: {'lr': 0.000493259540456099, 'samples': 1066944, 'steps': 5556, 'loss/train': 0.9960667490959167} 01/27/2022 01:05:48 - INFO - codeparrot_training - Step 5557: {'lr': 0.0004932557660285839, 'samples': 1067136, 'steps': 5557, 'loss/train': 0.13456523045897484} 01/27/2022 01:05:51 - INFO - codeparrot_training - Step 5558: {'lr': 0.0004932519905590383, 'samples': 1067328, 'steps': 5558, 'loss/train': 0.9153513014316559} 01/27/2022 01:05:54 - INFO - codeparrot_training - Step 5559: {'lr': 0.0004932482140474785, 'samples': 1067520, 'steps': 5559, 'loss/train': 1.0133355259895325} 01/27/2022 01:05:58 - INFO - codeparrot_training - Step 5560: {'lr': 0.0004932444364939204, 'samples': 1067712, 'steps': 5560, 'loss/train': 1.0242209136486053} 01/27/2022 01:06:01 - INFO - codeparrot_training - Step 5561: {'lr': 0.0004932406578983806, 'samples': 1067904, 'steps': 5561, 'loss/train': 0.9986436367034912} 01/27/2022 01:06:04 - INFO - codeparrot_training - Step 5562: {'lr': 0.0004932368782608749, 'samples': 1068096, 'steps': 5562, 'loss/train': 1.095296323299408} 01/27/2022 01:06:07 - INFO - codeparrot_training - Step 5563: {'lr': 0.0004932330975814198, 'samples': 1068288, 'steps': 5563, 'loss/train': 1.0995244681835175} 01/27/2022 01:06:10 - INFO - codeparrot_training - Step 5564: {'lr': 0.0004932293158600312, 'samples': 1068480, 'steps': 5564, 'loss/train': 0.9253827631473541} 01/27/2022 01:06:15 - INFO - codeparrot_training - Step 5565: {'lr': 0.0004932255330967255, 'samples': 1068672, 'steps': 5565, 'loss/train': 1.039421796798706} 01/27/2022 01:06:18 - INFO - codeparrot_training - Step 5566: {'lr': 0.0004932217492915189, 'samples': 1068864, 'steps': 5566, 'loss/train': 0.6294238418340683} 01/27/2022 01:06:21 - INFO - codeparrot_training - Step 5567: {'lr': 0.0004932179644444274, 'samples': 1069056, 'steps': 5567, 'loss/train': 1.296176701784134} 01/27/2022 01:06:24 - INFO - codeparrot_training - Step 5568: {'lr': 0.0004932141785554676, 'samples': 1069248, 'steps': 5568, 'loss/train': 0.9701499044895172} 01/27/2022 01:06:27 - INFO - codeparrot_training - Step 5569: {'lr': 0.0004932103916246553, 'samples': 1069440, 'steps': 5569, 'loss/train': 0.7712600827217102} 01/27/2022 01:06:30 - INFO - codeparrot_training - Step 5570: {'lr': 0.000493206603652007, 'samples': 1069632, 'steps': 5570, 'loss/train': 0.8678340017795563} 01/27/2022 01:06:33 - INFO - codeparrot_training - Step 5571: {'lr': 0.0004932028146375388, 'samples': 1069824, 'steps': 5571, 'loss/train': 1.1661474108695984} 01/27/2022 01:06:37 - INFO - codeparrot_training - Step 5572: {'lr': 0.000493199024581267, 'samples': 1070016, 'steps': 5572, 'loss/train': 0.4251181483268738} 01/27/2022 01:06:40 - INFO - codeparrot_training - Step 5573: {'lr': 0.0004931952334832077, 'samples': 1070208, 'steps': 
5573, 'loss/train': 1.3509310483932495} 01/27/2022 01:06:45 - INFO - codeparrot_training - Step 5574: {'lr': 0.0004931914413433773, 'samples': 1070400, 'steps': 5574, 'loss/train': 1.0167522132396698} 01/27/2022 01:06:48 - INFO - codeparrot_training - Step 5575: {'lr': 0.0004931876481617921, 'samples': 1070592, 'steps': 5575, 'loss/train': 0.9823505580425262} 01/27/2022 01:06:51 - INFO - codeparrot_training - Step 5576: {'lr': 0.0004931838539384681, 'samples': 1070784, 'steps': 5576, 'loss/train': 0.0695789773017168} 01/27/2022 01:06:54 - INFO - codeparrot_training - Step 5577: {'lr': 0.0004931800586734218, 'samples': 1070976, 'steps': 5577, 'loss/train': 0.5319532603025436} 01/27/2022 01:06:57 - INFO - codeparrot_training - Step 5578: {'lr': 0.0004931762623666692, 'samples': 1071168, 'steps': 5578, 'loss/train': 0.8117820918560028} 01/27/2022 01:07:01 - INFO - codeparrot_training - Step 5579: {'lr': 0.0004931724650182268, 'samples': 1071360, 'steps': 5579, 'loss/train': 0.6779577881097794} 01/27/2022 01:07:04 - INFO - codeparrot_training - Step 5580: {'lr': 0.0004931686666281108, 'samples': 1071552, 'steps': 5580, 'loss/train': 0.9603763818740845} 01/27/2022 01:07:07 - INFO - codeparrot_training - Step 5581: {'lr': 0.0004931648671963373, 'samples': 1071744, 'steps': 5581, 'loss/train': 1.211286038160324} 01/27/2022 01:07:10 - INFO - codeparrot_training - Step 5582: {'lr': 0.000493161066722923, 'samples': 1071936, 'steps': 5582, 'loss/train': 1.2914857864379883} 01/27/2022 01:07:14 - INFO - codeparrot_training - Step 5583: {'lr': 0.0004931572652078837, 'samples': 1072128, 'steps': 5583, 'loss/train': 0.8429335355758667} 01/27/2022 01:07:18 - INFO - codeparrot_training - Step 5584: {'lr': 0.0004931534626512359, 'samples': 1072320, 'steps': 5584, 'loss/train': 0.9584689736366272} 01/27/2022 01:07:21 - INFO - codeparrot_training - Step 5585: {'lr': 0.0004931496590529959, 'samples': 1072512, 'steps': 5585, 'loss/train': 0.4917397052049637} 01/27/2022 01:07:24 - INFO - codeparrot_training - Step 5586: {'lr': 0.0004931458544131799, 'samples': 1072704, 'steps': 5586, 'loss/train': 0.6082343459129333} 01/27/2022 01:07:27 - INFO - codeparrot_training - Step 5587: {'lr': 0.0004931420487318044, 'samples': 1072896, 'steps': 5587, 'loss/train': 0.5101063996553421} 01/27/2022 01:07:30 - INFO - codeparrot_training - Step 5588: {'lr': 0.0004931382420088855, 'samples': 1073088, 'steps': 5588, 'loss/train': 0.8046934604644775} 01/27/2022 01:07:33 - INFO - codeparrot_training - Step 5589: {'lr': 0.0004931344342444396, 'samples': 1073280, 'steps': 5589, 'loss/train': 0.6621893048286438} 01/27/2022 01:07:36 - INFO - codeparrot_training - Step 5590: {'lr': 0.000493130625438483, 'samples': 1073472, 'steps': 5590, 'loss/train': 0.6065195202827454} 01/27/2022 01:07:40 - INFO - codeparrot_training - Step 5591: {'lr': 0.000493126815591032, 'samples': 1073664, 'steps': 5591, 'loss/train': 1.231829434633255} 01/27/2022 01:07:44 - INFO - codeparrot_training - Step 5592: {'lr': 0.0004931230047021028, 'samples': 1073856, 'steps': 5592, 'loss/train': 1.1140200197696686} 01/27/2022 01:07:47 - INFO - codeparrot_training - Step 5593: {'lr': 0.000493119192771712, 'samples': 1074048, 'steps': 5593, 'loss/train': 0.7436311841011047} 01/27/2022 01:07:50 - INFO - codeparrot_training - Step 5594: {'lr': 0.0004931153797998757, 'samples': 1074240, 'steps': 5594, 'loss/train': 1.020599216222763} 01/27/2022 01:07:53 - INFO - codeparrot_training - Step 5595: {'lr': 0.0004931115657866103, 'samples': 1074432, 'steps': 5595, 'loss/train': 
0.6187772154808044} 01/27/2022 01:07:57 - INFO - codeparrot_training - Step 5596: {'lr': 0.0004931077507319322, 'samples': 1074624, 'steps': 5596, 'loss/train': 0.5941933840513229} 01/27/2022 01:08:00 - INFO - codeparrot_training - Step 5597: {'lr': 0.0004931039346358577, 'samples': 1074816, 'steps': 5597, 'loss/train': 0.76202791929245} 01/27/2022 01:08:03 - INFO - codeparrot_training - Step 5598: {'lr': 0.0004931001174984032, 'samples': 1075008, 'steps': 5598, 'loss/train': 1.1062496602535248} 01/27/2022 01:08:06 - INFO - codeparrot_training - Step 5599: {'lr': 0.0004930962993195848, 'samples': 1075200, 'steps': 5599, 'loss/train': 0.48596589267253876} 01/27/2022 01:08:09 - INFO - codeparrot_training - Step 5600: {'lr': 0.0004930924800994192, 'samples': 1075392, 'steps': 5600, 'loss/train': 0.2770431786775589} 01/27/2022 01:08:14 - INFO - codeparrot_training - Step 5601: {'lr': 0.0004930886598379225, 'samples': 1075584, 'steps': 5601, 'loss/train': 1.079896241426468} 01/27/2022 01:08:17 - INFO - codeparrot_training - Step 5602: {'lr': 0.0004930848385351112, 'samples': 1075776, 'steps': 5602, 'loss/train': 0.6264656782150269} 01/27/2022 01:08:20 - INFO - codeparrot_training - Step 5603: {'lr': 0.0004930810161910017, 'samples': 1075968, 'steps': 5603, 'loss/train': 1.097726047039032} 01/27/2022 01:08:23 - INFO - codeparrot_training - Step 5604: {'lr': 0.0004930771928056102, 'samples': 1076160, 'steps': 5604, 'loss/train': 1.108136236667633} 01/27/2022 01:08:26 - INFO - codeparrot_training - Step 5605: {'lr': 0.0004930733683789533, 'samples': 1076352, 'steps': 5605, 'loss/train': 0.9671033620834351} 01/27/2022 01:08:29 - INFO - codeparrot_training - Step 5606: {'lr': 0.0004930695429110473, 'samples': 1076544, 'steps': 5606, 'loss/train': 0.9902356266975403} 01/27/2022 01:08:32 - INFO - codeparrot_training - Step 5607: {'lr': 0.0004930657164019085, 'samples': 1076736, 'steps': 5607, 'loss/train': 0.9796362519264221} 01/27/2022 01:08:36 - INFO - codeparrot_training - Step 5608: {'lr': 0.0004930618888515534, 'samples': 1076928, 'steps': 5608, 'loss/train': 0.19600781053304672} 01/27/2022 01:08:39 - INFO - codeparrot_training - Step 5609: {'lr': 0.0004930580602599983, 'samples': 1077120, 'steps': 5609, 'loss/train': 0.8110741674900055} 01/27/2022 01:08:44 - INFO - codeparrot_training - Step 5610: {'lr': 0.0004930542306272596, 'samples': 1077312, 'steps': 5610, 'loss/train': 0.6153388023376465} 01/27/2022 01:08:47 - INFO - codeparrot_training - Step 5611: {'lr': 0.0004930503999533538, 'samples': 1077504, 'steps': 5611, 'loss/train': 0.6148265451192856} 01/27/2022 01:08:50 - INFO - codeparrot_training - Step 5612: {'lr': 0.0004930465682382973, 'samples': 1077696, 'steps': 5612, 'loss/train': 0.8954994678497314} 01/27/2022 01:08:53 - INFO - codeparrot_training - Step 5613: {'lr': 0.0004930427354821064, 'samples': 1077888, 'steps': 5613, 'loss/train': 1.5420109033584595} 01/27/2022 01:08:56 - INFO - codeparrot_training - Step 5614: {'lr': 0.0004930389016847977, 'samples': 1078080, 'steps': 5614, 'loss/train': 0.8882982730865479} 01/27/2022 01:08:59 - INFO - codeparrot_training - Step 5615: {'lr': 0.0004930350668463874, 'samples': 1078272, 'steps': 5615, 'loss/train': 0.26497384160757065} 01/27/2022 01:09:03 - INFO - codeparrot_training - Step 5616: {'lr': 0.0004930312309668922, 'samples': 1078464, 'steps': 5616, 'loss/train': 1.0066474378108978} 01/27/2022 01:09:06 - INFO - codeparrot_training - Step 5617: {'lr': 0.0004930273940463283, 'samples': 1078656, 'steps': 5617, 'loss/train': 
1.0068918764591217} 01/27/2022 01:09:10 - INFO - codeparrot_training - Step 5618: {'lr': 0.0004930235560847121, 'samples': 1078848, 'steps': 5618, 'loss/train': 0.4024966210126877} 01/27/2022 01:09:13 - INFO - codeparrot_training - Step 5619: {'lr': 0.0004930197170820603, 'samples': 1079040, 'steps': 5619, 'loss/train': 1.2004051208496094} 01/27/2022 01:09:17 - INFO - codeparrot_training - Step 5620: {'lr': 0.0004930158770383891, 'samples': 1079232, 'steps': 5620, 'loss/train': 1.3708434104919434} 01/27/2022 01:09:20 - INFO - codeparrot_training - Step 5621: {'lr': 0.0004930120359537153, 'samples': 1079424, 'steps': 5621, 'loss/train': 0.9293768405914307} 01/27/2022 01:09:23 - INFO - codeparrot_training - Step 5622: {'lr': 0.0004930081938280548, 'samples': 1079616, 'steps': 5622, 'loss/train': 1.251639872789383} 01/27/2022 01:09:26 - INFO - codeparrot_training - Step 5623: {'lr': 0.0004930043506614245, 'samples': 1079808, 'steps': 5623, 'loss/train': 0.654700443148613} 01/27/2022 01:09:29 - INFO - codeparrot_training - Step 5624: {'lr': 0.0004930005064538406, 'samples': 1080000, 'steps': 5624, 'loss/train': 0.7958073019981384} 01/27/2022 01:09:32 - INFO - codeparrot_training - Step 5625: {'lr': 0.0004929966612053199, 'samples': 1080192, 'steps': 5625, 'loss/train': 1.042179822921753} 01/27/2022 01:09:36 - INFO - codeparrot_training - Step 5626: {'lr': 0.0004929928149158785, 'samples': 1080384, 'steps': 5626, 'loss/train': 0.6342465877532959} 01/27/2022 01:09:40 - INFO - codeparrot_training - Step 5627: {'lr': 0.0004929889675855332, 'samples': 1080576, 'steps': 5627, 'loss/train': 0.059149956330657005} 01/27/2022 01:09:43 - INFO - codeparrot_training - Step 5628: {'lr': 0.0004929851192143001, 'samples': 1080768, 'steps': 5628, 'loss/train': 1.0667179226875305} 01/27/2022 01:09:46 - INFO - codeparrot_training - Step 5629: {'lr': 0.0004929812698021961, 'samples': 1080960, 'steps': 5629, 'loss/train': 0.6396634429693222} 01/27/2022 01:09:49 - INFO - codeparrot_training - Step 5630: {'lr': 0.0004929774193492373, 'samples': 1081152, 'steps': 5630, 'loss/train': 1.2367088198661804} 01/27/2022 01:09:52 - INFO - codeparrot_training - Step 5631: {'lr': 0.0004929735678554406, 'samples': 1081344, 'steps': 5631, 'loss/train': 0.8742882013320923} 01/27/2022 01:09:56 - INFO - codeparrot_training - Step 5632: {'lr': 0.0004929697153208221, 'samples': 1081536, 'steps': 5632, 'loss/train': 0.8497355282306671} 01/27/2022 01:09:59 - INFO - codeparrot_training - Step 5633: {'lr': 0.0004929658617453986, 'samples': 1081728, 'steps': 5633, 'loss/train': 1.0795779526233673} 01/27/2022 01:10:02 - INFO - codeparrot_training - Step 5634: {'lr': 0.0004929620071291865, 'samples': 1081920, 'steps': 5634, 'loss/train': 0.7871828377246857} 01/27/2022 01:10:05 - INFO - codeparrot_training - Step 5635: {'lr': 0.0004929581514722023, 'samples': 1082112, 'steps': 5635, 'loss/train': 1.03280907869339} 01/27/2022 01:10:10 - INFO - codeparrot_training - Step 5636: {'lr': 0.0004929542947744625, 'samples': 1082304, 'steps': 5636, 'loss/train': 1.2207087278366089} 01/27/2022 01:10:13 - INFO - codeparrot_training - Step 5637: {'lr': 0.0004929504370359837, 'samples': 1082496, 'steps': 5637, 'loss/train': 0.7762577533721924} 01/27/2022 01:10:16 - INFO - codeparrot_training - Step 5638: {'lr': 0.0004929465782567824, 'samples': 1082688, 'steps': 5638, 'loss/train': 0.22473366558551788} 01/27/2022 01:10:19 - INFO - codeparrot_training - Step 5639: {'lr': 0.000492942718436875, 'samples': 1082880, 'steps': 5639, 'loss/train': 
0.7115877270698547} 01/27/2022 01:10:23 - INFO - codeparrot_training - Step 5640: {'lr': 0.0004929388575762782, 'samples': 1083072, 'steps': 5640, 'loss/train': 0.6975263804197311} 01/27/2022 01:10:26 - INFO - codeparrot_training - Step 5641: {'lr': 0.0004929349956750085, 'samples': 1083264, 'steps': 5641, 'loss/train': 0.8174146413803101} 01/27/2022 01:10:29 - INFO - codeparrot_training - Step 5642: {'lr': 0.0004929311327330823, 'samples': 1083456, 'steps': 5642, 'loss/train': 1.1809204816818237} 01/27/2022 01:10:32 - INFO - codeparrot_training - Step 5643: {'lr': 0.0004929272687505163, 'samples': 1083648, 'steps': 5643, 'loss/train': 0.7941537201404572} 01/27/2022 01:10:35 - INFO - codeparrot_training - Step 5644: {'lr': 0.0004929234037273271, 'samples': 1083840, 'steps': 5644, 'loss/train': 0.7892582416534424} 01/27/2022 01:10:40 - INFO - codeparrot_training - Step 5645: {'lr': 0.0004929195376635311, 'samples': 1084032, 'steps': 5645, 'loss/train': 0.584923043847084} 01/27/2022 01:10:43 - INFO - codeparrot_training - Step 5646: {'lr': 0.000492915670559145, 'samples': 1084224, 'steps': 5646, 'loss/train': 0.6825436949729919} 01/27/2022 01:10:46 - INFO - codeparrot_training - Step 5647: {'lr': 0.0004929118024141853, 'samples': 1084416, 'steps': 5647, 'loss/train': 0.9741248488426208} 01/27/2022 01:10:49 - INFO - codeparrot_training - Step 5648: {'lr': 0.0004929079332286685, 'samples': 1084608, 'steps': 5648, 'loss/train': 0.9501141607761383} 01/27/2022 01:10:52 - INFO - codeparrot_training - Step 5649: {'lr': 0.0004929040630026112, 'samples': 1084800, 'steps': 5649, 'loss/train': 1.2835651338100433} 01/27/2022 01:10:55 - INFO - codeparrot_training - Step 5650: {'lr': 0.0004929001917360302, 'samples': 1084992, 'steps': 5650, 'loss/train': 0.9714953005313873} 01/27/2022 01:10:59 - INFO - codeparrot_training - Step 5651: {'lr': 0.0004928963194289419, 'samples': 1085184, 'steps': 5651, 'loss/train': 0.7946405410766602} 01/27/2022 01:11:02 - INFO - codeparrot_training - Step 5652: {'lr': 0.0004928924460813627, 'samples': 1085376, 'steps': 5652, 'loss/train': 0.9591336250305176} 01/27/2022 01:11:05 - INFO - codeparrot_training - Step 5653: {'lr': 0.0004928885716933096, 'samples': 1085568, 'steps': 5653, 'loss/train': 0.7312550246715546} 01/27/2022 01:11:10 - INFO - codeparrot_training - Step 5654: {'lr': 0.0004928846962647988, 'samples': 1085760, 'steps': 5654, 'loss/train': 0.9202224612236023} 01/27/2022 01:11:13 - INFO - codeparrot_training - Step 5655: {'lr': 0.0004928808197958472, 'samples': 1085952, 'steps': 5655, 'loss/train': 1.0118515491485596} 01/27/2022 01:11:16 - INFO - codeparrot_training - Step 5656: {'lr': 0.0004928769422864712, 'samples': 1086144, 'steps': 5656, 'loss/train': 1.2848760187625885} 01/27/2022 01:11:19 - INFO - codeparrot_training - Step 5657: {'lr': 0.0004928730637366877, 'samples': 1086336, 'steps': 5657, 'loss/train': 0.9254516959190369} 01/27/2022 01:11:22 - INFO - codeparrot_training - Step 5658: {'lr': 0.000492869184146513, 'samples': 1086528, 'steps': 5658, 'loss/train': 1.1855173408985138} 01/27/2022 01:11:25 - INFO - codeparrot_training - Step 5659: {'lr': 0.0004928653035159638, 'samples': 1086720, 'steps': 5659, 'loss/train': 1.8844115138053894} 01/27/2022 01:11:28 - INFO - codeparrot_training - Step 5660: {'lr': 0.0004928614218450568, 'samples': 1086912, 'steps': 5660, 'loss/train': 0.6226985156536102} 01/27/2022 01:11:32 - INFO - codeparrot_training - Step 5661: {'lr': 0.0004928575391338085, 'samples': 1087104, 'steps': 5661, 'loss/train': 
0.7607380449771881} 01/27/2022 01:11:35 - INFO - codeparrot_training - Step 5662: {'lr': 0.0004928536553822357, 'samples': 1087296, 'steps': 5662, 'loss/train': 0.7067456245422363} 01/27/2022 01:11:40 - INFO - codeparrot_training - Step 5663: {'lr': 0.0004928497705903549, 'samples': 1087488, 'steps': 5663, 'loss/train': 0.7739646434783936} 01/27/2022 01:11:43 - INFO - codeparrot_training - Step 5664: {'lr': 0.0004928458847581828, 'samples': 1087680, 'steps': 5664, 'loss/train': 0.7803888022899628} 01/27/2022 01:11:46 - INFO - codeparrot_training - Step 5665: {'lr': 0.0004928419978857361, 'samples': 1087872, 'steps': 5665, 'loss/train': 0.9956881105899811} 01/27/2022 01:11:49 - INFO - codeparrot_training - Step 5666: {'lr': 0.0004928381099730314, 'samples': 1088064, 'steps': 5666, 'loss/train': 1.1120034456253052} 01/27/2022 01:11:52 - INFO - codeparrot_training - Step 5667: {'lr': 0.0004928342210200853, 'samples': 1088256, 'steps': 5667, 'loss/train': 0.7935058772563934} 01/27/2022 01:11:56 - INFO - codeparrot_training - Step 5668: {'lr': 0.0004928303310269145, 'samples': 1088448, 'steps': 5668, 'loss/train': 0.20463687926530838} 01/27/2022 01:11:59 - INFO - codeparrot_training - Step 5669: {'lr': 0.0004928264399935357, 'samples': 1088640, 'steps': 5669, 'loss/train': 0.672861322760582} 01/27/2022 01:12:02 - INFO - codeparrot_training - Step 5670: {'lr': 0.0004928225479199655, 'samples': 1088832, 'steps': 5670, 'loss/train': 1.1102770864963531} 01/27/2022 01:12:05 - INFO - codeparrot_training - Step 5671: {'lr': 0.0004928186548062206, 'samples': 1089024, 'steps': 5671, 'loss/train': 1.1332717537879944} 01/27/2022 01:12:10 - INFO - codeparrot_training - Step 5672: {'lr': 0.0004928147606523179, 'samples': 1089216, 'steps': 5672, 'loss/train': 0.9638663828372955} 01/27/2022 01:12:13 - INFO - codeparrot_training - Step 5673: {'lr': 0.0004928108654582736, 'samples': 1089408, 'steps': 5673, 'loss/train': 1.196021854877472} 01/27/2022 01:12:16 - INFO - codeparrot_training - Step 5674: {'lr': 0.0004928069692241048, 'samples': 1089600, 'steps': 5674, 'loss/train': 1.1393100321292877} 01/27/2022 01:12:19 - INFO - codeparrot_training - Step 5675: {'lr': 0.000492803071949828, 'samples': 1089792, 'steps': 5675, 'loss/train': 0.8968163430690765} 01/27/2022 01:12:22 - INFO - codeparrot_training - Step 5676: {'lr': 0.0004927991736354599, 'samples': 1089984, 'steps': 5676, 'loss/train': 0.7299248725175858} 01/27/2022 01:12:25 - INFO - codeparrot_training - Step 5677: {'lr': 0.0004927952742810173, 'samples': 1090176, 'steps': 5677, 'loss/train': 0.7778525948524475} 01/27/2022 01:12:29 - INFO - codeparrot_training - Step 5678: {'lr': 0.0004927913738865167, 'samples': 1090368, 'steps': 5678, 'loss/train': 1.011057436466217} 01/27/2022 01:12:32 - INFO - codeparrot_training - Step 5679: {'lr': 0.0004927874724519751, 'samples': 1090560, 'steps': 5679, 'loss/train': 0.7636504769325256} 01/27/2022 01:12:35 - INFO - codeparrot_training - Step 5680: {'lr': 0.000492783569977409, 'samples': 1090752, 'steps': 5680, 'loss/train': 0.8287596702575684} 01/27/2022 01:12:40 - INFO - codeparrot_training - Step 5681: {'lr': 0.0004927796664628353, 'samples': 1090944, 'steps': 5681, 'loss/train': 1.0291254222393036} 01/27/2022 01:12:43 - INFO - codeparrot_training - Step 5682: {'lr': 0.0004927757619082704, 'samples': 1091136, 'steps': 5682, 'loss/train': 0.5398550480604172} 01/27/2022 01:12:46 - INFO - codeparrot_training - Step 5683: {'lr': 0.0004927718563137313, 'samples': 1091328, 'steps': 5683, 'loss/train': 
0.9377852976322174} 01/27/2022 01:12:50 - INFO - codeparrot_training - Step 5684: {'lr': 0.0004927679496792347, 'samples': 1091520, 'steps': 5684, 'loss/train': 1.1504453122615814} 01/27/2022 01:12:53 - INFO - codeparrot_training - Step 5685: {'lr': 0.0004927640420047973, 'samples': 1091712, 'steps': 5685, 'loss/train': 0.5730797052383423} 01/27/2022 01:12:56 - INFO - codeparrot_training - Step 5686: {'lr': 0.0004927601332904358, 'samples': 1091904, 'steps': 5686, 'loss/train': 0.6796579509973526} 01/27/2022 01:12:59 - INFO - codeparrot_training - Step 5687: {'lr': 0.0004927562235361669, 'samples': 1092096, 'steps': 5687, 'loss/train': 0.5895860195159912} 01/27/2022 01:13:02 - INFO - codeparrot_training - Step 5688: {'lr': 0.0004927523127420076, 'samples': 1092288, 'steps': 5688, 'loss/train': 0.6215015351772308} 01/27/2022 01:13:05 - INFO - codeparrot_training - Step 5689: {'lr': 0.0004927484009079743, 'samples': 1092480, 'steps': 5689, 'loss/train': 0.9453877508640289} 01/27/2022 01:13:10 - INFO - codeparrot_training - Step 5690: {'lr': 0.000492744488034084, 'samples': 1092672, 'steps': 5690, 'loss/train': 0.8696480691432953} 01/27/2022 01:13:13 - INFO - codeparrot_training - Step 5691: {'lr': 0.0004927405741203534, 'samples': 1092864, 'steps': 5691, 'loss/train': 1.1086261868476868} 01/27/2022 01:13:16 - INFO - codeparrot_training - Step 5692: {'lr': 0.0004927366591667993, 'samples': 1093056, 'steps': 5692, 'loss/train': 0.7632568180561066} 01/27/2022 01:13:19 - INFO - codeparrot_training - Step 5693: {'lr': 0.0004927327431734383, 'samples': 1093248, 'steps': 5693, 'loss/train': 0.9782018959522247} 01/27/2022 01:13:22 - INFO - codeparrot_training - Step 5694: {'lr': 0.0004927288261402875, 'samples': 1093440, 'steps': 5694, 'loss/train': 2.2917993664741516} 01/27/2022 01:13:26 - INFO - codeparrot_training - Step 5695: {'lr': 0.0004927249080673633, 'samples': 1093632, 'steps': 5695, 'loss/train': 0.6863045990467072} 01/27/2022 01:13:29 - INFO - codeparrot_training - Step 5696: {'lr': 0.0004927209889546828, 'samples': 1093824, 'steps': 5696, 'loss/train': 0.999163806438446} 01/27/2022 01:13:32 - INFO - codeparrot_training - Step 5697: {'lr': 0.0004927170688022625, 'samples': 1094016, 'steps': 5697, 'loss/train': 0.8166477978229523} 01/27/2022 01:13:35 - INFO - codeparrot_training - Step 5698: {'lr': 0.0004927131476101195, 'samples': 1094208, 'steps': 5698, 'loss/train': 2.511584758758545} 01/27/2022 01:13:39 - INFO - codeparrot_training - Step 5699: {'lr': 0.0004927092253782704, 'samples': 1094400, 'steps': 5699, 'loss/train': 0.9999098181724548} 01/27/2022 01:13:43 - INFO - codeparrot_training - Step 5700: {'lr': 0.0004927053021067321, 'samples': 1094592, 'steps': 5700, 'loss/train': 0.8259532749652863} 01/27/2022 01:13:46 - INFO - codeparrot_training - Step 5701: {'lr': 0.0004927013777955212, 'samples': 1094784, 'steps': 5701, 'loss/train': 0.76128950715065} 01/27/2022 01:13:49 - INFO - codeparrot_training - Step 5702: {'lr': 0.0004926974524446548, 'samples': 1094976, 'steps': 5702, 'loss/train': 0.8034748435020447} 01/27/2022 01:13:52 - INFO - codeparrot_training - Step 5703: {'lr': 0.0004926935260541496, 'samples': 1095168, 'steps': 5703, 'loss/train': 0.5197899788618088} 01/27/2022 01:13:55 - INFO - codeparrot_training - Step 5704: {'lr': 0.0004926895986240222, 'samples': 1095360, 'steps': 5704, 'loss/train': 0.8987273275852203} 01/27/2022 01:13:58 - INFO - codeparrot_training - Step 5705: {'lr': 0.0004926856701542898, 'samples': 1095552, 'steps': 5705, 'loss/train': 
1.1249169409275055} 01/27/2022 01:14:01 - INFO - codeparrot_training - Step 5706: {'lr': 0.000492681740644969, 'samples': 1095744, 'steps': 5706, 'loss/train': 1.0011306703090668} 01/27/2022 01:14:05 - INFO - codeparrot_training - Step 5707: {'lr': 0.0004926778100960767, 'samples': 1095936, 'steps': 5707, 'loss/train': 1.1432439386844635} 01/27/2022 01:14:09 - INFO - codeparrot_training - Step 5708: {'lr': 0.0004926738785076297, 'samples': 1096128, 'steps': 5708, 'loss/train': 0.45650777220726013} 01/27/2022 01:14:12 - INFO - codeparrot_training - Step 5709: {'lr': 0.0004926699458796448, 'samples': 1096320, 'steps': 5709, 'loss/train': 0.6890246272087097} 01/27/2022 01:14:16 - INFO - codeparrot_training - Step 5710: {'lr': 0.0004926660122121391, 'samples': 1096512, 'steps': 5710, 'loss/train': 1.2083974778652191} 01/27/2022 01:14:19 - INFO - codeparrot_training - Step 5711: {'lr': 0.0004926620775051291, 'samples': 1096704, 'steps': 5711, 'loss/train': 0.8678120970726013} 01/27/2022 01:14:22 - INFO - codeparrot_training - Step 5712: {'lr': 0.0004926581417586318, 'samples': 1096896, 'steps': 5712, 'loss/train': 1.235972821712494} 01/27/2022 01:14:25 - INFO - codeparrot_training - Step 5713: {'lr': 0.0004926542049726642, 'samples': 1097088, 'steps': 5713, 'loss/train': 0.6119488924741745} 01/27/2022 01:14:28 - INFO - codeparrot_training - Step 5714: {'lr': 0.0004926502671472429, 'samples': 1097280, 'steps': 5714, 'loss/train': 1.0915912985801697} 01/27/2022 01:14:31 - INFO - codeparrot_training - Step 5715: {'lr': 0.000492646328282385, 'samples': 1097472, 'steps': 5715, 'loss/train': 0.8344210088253021} 01/27/2022 01:14:36 - INFO - codeparrot_training - Step 5716: {'lr': 0.0004926423883781073, 'samples': 1097664, 'steps': 5716, 'loss/train': 0.8116436898708344} 01/27/2022 01:14:40 - INFO - codeparrot_training - Step 5717: {'lr': 0.0004926384474344265, 'samples': 1097856, 'steps': 5717, 'loss/train': 0.4136364161968231} 01/27/2022 01:14:43 - INFO - codeparrot_training - Step 5718: {'lr': 0.0004926345054513598, 'samples': 1098048, 'steps': 5718, 'loss/train': 0.8021824657917023} 01/27/2022 01:14:46 - INFO - codeparrot_training - Step 5719: {'lr': 0.0004926305624289238, 'samples': 1098240, 'steps': 5719, 'loss/train': 1.0375877022743225} 01/27/2022 01:14:49 - INFO - codeparrot_training - Step 5720: {'lr': 0.0004926266183671356, 'samples': 1098432, 'steps': 5720, 'loss/train': 0.7406782060861588} 01/27/2022 01:14:52 - INFO - codeparrot_training - Step 5721: {'lr': 0.000492622673266012, 'samples': 1098624, 'steps': 5721, 'loss/train': 0.9665404558181763} 01/27/2022 01:14:55 - INFO - codeparrot_training - Step 5722: {'lr': 0.0004926187271255698, 'samples': 1098816, 'steps': 5722, 'loss/train': 0.86870938539505} 01/27/2022 01:14:58 - INFO - codeparrot_training - Step 5723: {'lr': 0.0004926147799458262, 'samples': 1099008, 'steps': 5723, 'loss/train': 1.0114547610282898} 01/27/2022 01:15:02 - INFO - codeparrot_training - Step 5724: {'lr': 0.0004926108317267979, 'samples': 1099200, 'steps': 5724, 'loss/train': 0.5722859054803848} 01/27/2022 01:15:06 - INFO - codeparrot_training - Step 5725: {'lr': 0.0004926068824685017, 'samples': 1099392, 'steps': 5725, 'loss/train': 1.4014276564121246} 01/27/2022 01:15:09 - INFO - codeparrot_training - Step 5726: {'lr': 0.0004926029321709548, 'samples': 1099584, 'steps': 5726, 'loss/train': 1.1389858424663544} 01/27/2022 01:15:12 - INFO - codeparrot_training - Step 5727: {'lr': 0.0004925989808341738, 'samples': 1099776, 'steps': 5727, 'loss/train': 
0.5859709829092026} 01/27/2022 01:15:15 - INFO - codeparrot_training - Step 5728: {'lr': 0.0004925950284581759, 'samples': 1099968, 'steps': 5728, 'loss/train': 0.9746486842632294} 01/27/2022 01:15:19 - INFO - codeparrot_training - Step 5729: {'lr': 0.0004925910750429779, 'samples': 1100160, 'steps': 5729, 'loss/train': 1.0073795914649963} 01/27/2022 01:15:22 - INFO - codeparrot_training - Step 5730: {'lr': 0.0004925871205885968, 'samples': 1100352, 'steps': 5730, 'loss/train': 0.7980017066001892} 01/27/2022 01:15:25 - INFO - codeparrot_training - Step 5731: {'lr': 0.0004925831650950495, 'samples': 1100544, 'steps': 5731, 'loss/train': 0.30635641515254974} 01/27/2022 01:15:28 - INFO - codeparrot_training - Step 5732: {'lr': 0.000492579208562353, 'samples': 1100736, 'steps': 5732, 'loss/train': 1.0460370182991028} 01/27/2022 01:15:31 - INFO - codeparrot_training - Step 5733: {'lr': 0.0004925752509905241, 'samples': 1100928, 'steps': 5733, 'loss/train': 0.5406961888074875} 01/27/2022 01:15:36 - INFO - codeparrot_training - Step 5734: {'lr': 0.0004925712923795799, 'samples': 1101120, 'steps': 5734, 'loss/train': 1.0669217705726624} 01/27/2022 01:15:39 - INFO - codeparrot_training - Step 5735: {'lr': 0.0004925673327295374, 'samples': 1101312, 'steps': 5735, 'loss/train': 1.1509699523448944} 01/27/2022 01:15:42 - INFO - codeparrot_training - Step 5736: {'lr': 0.0004925633720404132, 'samples': 1101504, 'steps': 5736, 'loss/train': 1.2285049259662628} 01/27/2022 01:15:45 - INFO - codeparrot_training - Step 5737: {'lr': 0.0004925594103122248, 'samples': 1101696, 'steps': 5737, 'loss/train': 0.6886437982320786} 01/27/2022 01:15:48 - INFO - codeparrot_training - Step 5738: {'lr': 0.0004925554475449888, 'samples': 1101888, 'steps': 5738, 'loss/train': 1.1330832839012146} 01/27/2022 01:15:52 - INFO - codeparrot_training - Step 5739: {'lr': 0.0004925514837387223, 'samples': 1102080, 'steps': 5739, 'loss/train': 0.4933837652206421} 01/27/2022 01:15:55 - INFO - codeparrot_training - Step 5740: {'lr': 0.0004925475188934423, 'samples': 1102272, 'steps': 5740, 'loss/train': 1.264536201953888} 01/27/2022 01:15:58 - INFO - codeparrot_training - Step 5741: {'lr': 0.0004925435530091656, 'samples': 1102464, 'steps': 5741, 'loss/train': 0.3933396488428116} 01/27/2022 01:16:03 - INFO - codeparrot_training - Step 5742: {'lr': 0.0004925395860859096, 'samples': 1102656, 'steps': 5742, 'loss/train': 0.6463459879159927} 01/27/2022 01:16:06 - INFO - codeparrot_training - Step 5743: {'lr': 0.0004925356181236908, 'samples': 1102848, 'steps': 5743, 'loss/train': 1.1315801739692688} 01/27/2022 01:16:09 - INFO - codeparrot_training - Step 5744: {'lr': 0.0004925316491225265, 'samples': 1103040, 'steps': 5744, 'loss/train': 0.46687518060207367} 01/27/2022 01:16:12 - INFO - codeparrot_training - Step 5745: {'lr': 0.0004925276790824336, 'samples': 1103232, 'steps': 5745, 'loss/train': 1.0301997363567352} 01/27/2022 01:16:16 - INFO - codeparrot_training - Step 5746: {'lr': 0.0004925237080034291, 'samples': 1103424, 'steps': 5746, 'loss/train': 1.2801786661148071} 01/27/2022 01:16:19 - INFO - codeparrot_training - Step 5747: {'lr': 0.0004925197358855301, 'samples': 1103616, 'steps': 5747, 'loss/train': 0.7005579024553299} 01/27/2022 01:16:22 - INFO - codeparrot_training - Step 5748: {'lr': 0.0004925157627287536, 'samples': 1103808, 'steps': 5748, 'loss/train': 0.7934202253818512} 01/27/2022 01:16:25 - INFO - codeparrot_training - Step 5749: {'lr': 0.0004925117885331166, 'samples': 1104000, 'steps': 5749, 'loss/train': 
1.2141571640968323} 01/27/2022 01:16:28 - INFO - codeparrot_training - Step 5750: {'lr': 0.000492507813298636, 'samples': 1104192, 'steps': 5750, 'loss/train': 0.9346729516983032} 01/27/2022 01:16:33 - INFO - codeparrot_training - Step 5751: {'lr': 0.000492503837025329, 'samples': 1104384, 'steps': 5751, 'loss/train': 0.7756345868110657} 01/27/2022 01:16:36 - INFO - codeparrot_training - Step 5752: {'lr': 0.0004924998597132125, 'samples': 1104576, 'steps': 5752, 'loss/train': 0.8816147446632385} 01/27/2022 01:16:39 - INFO - codeparrot_training - Step 5753: {'lr': 0.0004924958813623037, 'samples': 1104768, 'steps': 5753, 'loss/train': 1.0384391248226166} 01/27/2022 01:16:42 - INFO - codeparrot_training - Step 5754: {'lr': 0.0004924919019726195, 'samples': 1104960, 'steps': 5754, 'loss/train': 1.145852655172348} 01/27/2022 01:16:45 - INFO - codeparrot_training - Step 5755: {'lr': 0.000492487921544177, 'samples': 1105152, 'steps': 5755, 'loss/train': 1.2282185554504395} 01/27/2022 01:16:48 - INFO - codeparrot_training - Step 5756: {'lr': 0.0004924839400769932, 'samples': 1105344, 'steps': 5756, 'loss/train': 0.9020050764083862} 01/27/2022 01:16:51 - INFO - codeparrot_training - Step 5757: {'lr': 0.0004924799575710852, 'samples': 1105536, 'steps': 5757, 'loss/train': 1.0144824385643005} 01/27/2022 01:16:55 - INFO - codeparrot_training - Step 5758: {'lr': 0.0004924759740264701, 'samples': 1105728, 'steps': 5758, 'loss/train': 0.767365962266922} 01/27/2022 01:16:58 - INFO - codeparrot_training - Step 5759: {'lr': 0.000492471989443165, 'samples': 1105920, 'steps': 5759, 'loss/train': 0.5698802322149277} 01/27/2022 01:17:03 - INFO - codeparrot_training - Step 5760: {'lr': 0.0004924680038211868, 'samples': 1106112, 'steps': 5760, 'loss/train': 1.033988893032074} 01/27/2022 01:17:06 - INFO - codeparrot_training - Step 5761: {'lr': 0.0004924640171605526, 'samples': 1106304, 'steps': 5761, 'loss/train': 0.5890437662601471} 01/27/2022 01:17:09 - INFO - codeparrot_training - Step 5762: {'lr': 0.0004924600294612796, 'samples': 1106496, 'steps': 5762, 'loss/train': 1.2000714540481567} 01/27/2022 01:17:12 - INFO - codeparrot_training - Step 5763: {'lr': 0.0004924560407233848, 'samples': 1106688, 'steps': 5763, 'loss/train': 0.5321519672870636} 01/27/2022 01:17:15 - INFO - codeparrot_training - Step 5764: {'lr': 0.0004924520509468854, 'samples': 1106880, 'steps': 5764, 'loss/train': 0.577097699046135} 01/27/2022 01:17:18 - INFO - codeparrot_training - Step 5765: {'lr': 0.0004924480601317982, 'samples': 1107072, 'steps': 5765, 'loss/train': 0.47630757093429565} 01/27/2022 01:17:22 - INFO - codeparrot_training - Step 5766: {'lr': 0.0004924440682781407, 'samples': 1107264, 'steps': 5766, 'loss/train': 0.8991141021251678} 01/27/2022 01:17:25 - INFO - codeparrot_training - Step 5767: {'lr': 0.0004924400753859297, 'samples': 1107456, 'steps': 5767, 'loss/train': 1.1498947441577911} 01/27/2022 01:17:28 - INFO - codeparrot_training - Step 5768: {'lr': 0.0004924360814551825, 'samples': 1107648, 'steps': 5768, 'loss/train': 0.5866316556930542} 01/27/2022 01:17:33 - INFO - codeparrot_training - Step 5769: {'lr': 0.000492432086485916, 'samples': 1107840, 'steps': 5769, 'loss/train': 0.6854530870914459} 01/27/2022 01:17:36 - INFO - codeparrot_training - Step 5770: {'lr': 0.0004924280904781475, 'samples': 1108032, 'steps': 5770, 'loss/train': 0.7551125586032867} 01/27/2022 01:17:39 - INFO - codeparrot_training - Step 5771: {'lr': 0.0004924240934318939, 'samples': 1108224, 'steps': 5771, 'loss/train': 0.7346796691417694} 
01/27/2022 01:17:42 - INFO - codeparrot_training - Step 5772: {'lr': 0.0004924200953471727, 'samples': 1108416, 'steps': 5772, 'loss/train': 1.2565562725067139} 01/27/2022 01:17:45 - INFO - codeparrot_training - Step 5773: {'lr': 0.0004924160962240005, 'samples': 1108608, 'steps': 5773, 'loss/train': 1.1708578169345856} 01/27/2022 01:17:48 - INFO - codeparrot_training - Step 5774: {'lr': 0.0004924120960623949, 'samples': 1108800, 'steps': 5774, 'loss/train': 0.9176222383975983} 01/27/2022 01:17:52 - INFO - codeparrot_training - Step 5775: {'lr': 0.0004924080948623729, 'samples': 1108992, 'steps': 5775, 'loss/train': 0.9013220071792603} 01/27/2022 01:17:55 - INFO - codeparrot_training - Step 5776: {'lr': 0.0004924040926239515, 'samples': 1109184, 'steps': 5776, 'loss/train': 0.9115279912948608} 01/27/2022 01:17:58 - INFO - codeparrot_training - Step 5777: {'lr': 0.000492400089347148, 'samples': 1109376, 'steps': 5777, 'loss/train': 0.9697399735450745} 01/27/2022 01:18:03 - INFO - codeparrot_training - Step 5778: {'lr': 0.0004923960850319794, 'samples': 1109568, 'steps': 5778, 'loss/train': 0.9244847595691681} 01/27/2022 01:18:06 - INFO - codeparrot_training - Step 5779: {'lr': 0.000492392079678463, 'samples': 1109760, 'steps': 5779, 'loss/train': 0.6689833402633667} 01/27/2022 01:18:09 - INFO - codeparrot_training - Step 5780: {'lr': 0.0004923880732866159, 'samples': 1109952, 'steps': 5780, 'loss/train': 1.2687252759933472} 01/27/2022 01:18:12 - INFO - codeparrot_training - Step 5781: {'lr': 0.0004923840658564553, 'samples': 1110144, 'steps': 5781, 'loss/train': 0.5990207344293594} 01/27/2022 01:18:15 - INFO - codeparrot_training - Step 5782: {'lr': 0.0004923800573879983, 'samples': 1110336, 'steps': 5782, 'loss/train': 1.212381273508072} 01/27/2022 01:18:18 - INFO - codeparrot_training - Step 5783: {'lr': 0.000492376047881262, 'samples': 1110528, 'steps': 5783, 'loss/train': 0.19919990748167038} 01/27/2022 01:18:22 - INFO - codeparrot_training - Step 5784: {'lr': 0.0004923720373362638, 'samples': 1110720, 'steps': 5784, 'loss/train': 0.6787098348140717} 01/27/2022 01:18:25 - INFO - codeparrot_training - Step 5785: {'lr': 0.0004923680257530207, 'samples': 1110912, 'steps': 5785, 'loss/train': 0.15477299317717552} 01/27/2022 01:18:28 - INFO - codeparrot_training - Step 5786: {'lr': 0.0004923640131315499, 'samples': 1111104, 'steps': 5786, 'loss/train': 1.1425308287143707} 01/27/2022 01:18:33 - INFO - codeparrot_training - Step 5787: {'lr': 0.0004923599994718687, 'samples': 1111296, 'steps': 5787, 'loss/train': 1.4837938249111176} 01/27/2022 01:18:36 - INFO - codeparrot_training - Step 5788: {'lr': 0.0004923559847739941, 'samples': 1111488, 'steps': 5788, 'loss/train': 0.8358966708183289} 01/27/2022 01:18:39 - INFO - codeparrot_training - Step 5789: {'lr': 0.0004923519690379436, 'samples': 1111680, 'steps': 5789, 'loss/train': 1.1459766626358032} 01/27/2022 01:18:42 - INFO - codeparrot_training - Step 5790: {'lr': 0.0004923479522637341, 'samples': 1111872, 'steps': 5790, 'loss/train': 0.7394654899835587} 01/27/2022 01:18:45 - INFO - codeparrot_training - Step 5791: {'lr': 0.0004923439344513829, 'samples': 1112064, 'steps': 5791, 'loss/train': 0.9758017659187317} 01/27/2022 01:18:49 - INFO - codeparrot_training - Step 5792: {'lr': 0.0004923399156009073, 'samples': 1112256, 'steps': 5792, 'loss/train': 0.8608639240264893} 01/27/2022 01:18:52 - INFO - codeparrot_training - Step 5793: {'lr': 0.0004923358957123245, 'samples': 1112448, 'steps': 5793, 'loss/train': 0.3680007755756378} 01/27/2022 
01:18:55 - INFO - codeparrot_training - Step 5794: {'lr': 0.0004923318747856515, 'samples': 1112640, 'steps': 5794, 'loss/train': 0.8738178312778473} 01/27/2022 01:18:58 - INFO - codeparrot_training - Step 5795: {'lr': 0.0004923278528209059, 'samples': 1112832, 'steps': 5795, 'loss/train': 1.114964783191681} 01/27/2022 01:19:02 - INFO - codeparrot_training - Step 5796: {'lr': 0.0004923238298181047, 'samples': 1113024, 'steps': 5796, 'loss/train': 0.4895630478858948} 01/27/2022 01:19:06 - INFO - codeparrot_training - Step 5797: {'lr': 0.0004923198057772651, 'samples': 1113216, 'steps': 5797, 'loss/train': 0.8937214314937592} 01/27/2022 01:19:09 - INFO - codeparrot_training - Step 5798: {'lr': 0.0004923157806984044, 'samples': 1113408, 'steps': 5798, 'loss/train': 0.7699221968650818} 01/27/2022 01:19:12 - INFO - codeparrot_training - Step 5799: {'lr': 0.0004923117545815398, 'samples': 1113600, 'steps': 5799, 'loss/train': 0.8412430286407471} 01/27/2022 01:19:15 - INFO - codeparrot_training - Step 5800: {'lr': 0.0004923077274266886, 'samples': 1113792, 'steps': 5800, 'loss/train': 1.2527315318584442} 01/27/2022 01:19:18 - INFO - codeparrot_training - Step 5801: {'lr': 0.0004923036992338681, 'samples': 1113984, 'steps': 5801, 'loss/train': 1.0592527091503143} 01/27/2022 01:19:21 - INFO - codeparrot_training - Step 5802: {'lr': 0.0004922996700030954, 'samples': 1114176, 'steps': 5802, 'loss/train': 0.7249799072742462} 01/27/2022 01:19:24 - INFO - codeparrot_training - Step 5803: {'lr': 0.000492295639734388, 'samples': 1114368, 'steps': 5803, 'loss/train': 0.6454092711210251} 01/27/2022 01:19:27 - INFO - codeparrot_training - Step 5804: {'lr': 0.0004922916084277629, 'samples': 1114560, 'steps': 5804, 'loss/train': 0.9611206948757172} 01/27/2022 01:19:33 - INFO - codeparrot_training - Step 5805: {'lr': 0.0004922875760832375, 'samples': 1114752, 'steps': 5805, 'loss/train': 1.03008171916008} 01/27/2022 01:19:36 - INFO - codeparrot_training - Step 5806: {'lr': 0.000492283542700829, 'samples': 1114944, 'steps': 5806, 'loss/train': 1.141666442155838} 01/27/2022 01:19:39 - INFO - codeparrot_training - Step 5807: {'lr': 0.0004922795082805549, 'samples': 1115136, 'steps': 5807, 'loss/train': 0.5394528061151505} 01/27/2022 01:19:42 - INFO - codeparrot_training - Step 5808: {'lr': 0.0004922754728224322, 'samples': 1115328, 'steps': 5808, 'loss/train': 0.5180284827947617} 01/27/2022 01:19:45 - INFO - codeparrot_training - Step 5809: {'lr': 0.0004922714363264783, 'samples': 1115520, 'steps': 5809, 'loss/train': 1.1852556467056274} 01/27/2022 01:19:49 - INFO - codeparrot_training - Step 5810: {'lr': 0.0004922673987927106, 'samples': 1115712, 'steps': 5810, 'loss/train': 0.816545695066452} 01/27/2022 01:19:52 - INFO - codeparrot_training - Step 5811: {'lr': 0.0004922633602211462, 'samples': 1115904, 'steps': 5811, 'loss/train': 0.12732313200831413} 01/27/2022 01:19:55 - INFO - codeparrot_training - Step 5812: {'lr': 0.0004922593206118025, 'samples': 1116096, 'steps': 5812, 'loss/train': 0.913758784532547} 01/27/2022 01:20:00 - INFO - codeparrot_training - Step 5813: {'lr': 0.0004922552799646968, 'samples': 1116288, 'steps': 5813, 'loss/train': 0.7391301691532135} 01/27/2022 01:20:03 - INFO - codeparrot_training - Step 5814: {'lr': 0.0004922512382798463, 'samples': 1116480, 'steps': 5814, 'loss/train': 0.6331376284360886} 01/27/2022 01:20:06 - INFO - codeparrot_training - Step 5815: {'lr': 0.0004922471955572686, 'samples': 1116672, 'steps': 5815, 'loss/train': 0.3442368060350418} 01/27/2022 01:20:09 - INFO - 
codeparrot_training - Step 5816: {'lr': 0.0004922431517969808, 'samples': 1116864, 'steps': 5816, 'loss/train': 1.3411907851696014} 01/27/2022 01:20:12 - INFO - codeparrot_training - Step 5817: {'lr': 0.0004922391069990002, 'samples': 1117056, 'steps': 5817, 'loss/train': 0.4590401351451874} 01/27/2022 01:20:15 - INFO - codeparrot_training - Step 5818: {'lr': 0.0004922350611633442, 'samples': 1117248, 'steps': 5818, 'loss/train': 0.7714360356330872} 01/27/2022 01:20:19 - INFO - codeparrot_training - Step 5819: {'lr': 0.0004922310142900302, 'samples': 1117440, 'steps': 5819, 'loss/train': 0.3512250781059265} 01/27/2022 01:20:22 - INFO - codeparrot_training - Step 5820: {'lr': 0.0004922269663790753, 'samples': 1117632, 'steps': 5820, 'loss/train': 0.9678540229797363} 01/27/2022 01:20:25 - INFO - codeparrot_training - Step 5821: {'lr': 0.0004922229174304971, 'samples': 1117824, 'steps': 5821, 'loss/train': 0.7855994403362274} 01/27/2022 01:20:30 - INFO - codeparrot_training - Step 5822: {'lr': 0.0004922188674443128, 'samples': 1118016, 'steps': 5822, 'loss/train': 0.7578282952308655} 01/27/2022 01:20:33 - INFO - codeparrot_training - Step 5823: {'lr': 0.0004922148164205398, 'samples': 1118208, 'steps': 5823, 'loss/train': 0.5822552293539047} 01/27/2022 01:20:36 - INFO - codeparrot_training - Step 5824: {'lr': 0.0004922107643591954, 'samples': 1118400, 'steps': 5824, 'loss/train': 0.9964213371276855} 01/27/2022 01:20:39 - INFO - codeparrot_training - Step 5825: {'lr': 0.000492206711260297, 'samples': 1118592, 'steps': 5825, 'loss/train': 1.0387970209121704} 01/27/2022 01:20:42 - INFO - codeparrot_training - Step 5826: {'lr': 0.000492202657123862, 'samples': 1118784, 'steps': 5826, 'loss/train': 2.821325182914734} 01/27/2022 01:20:45 - INFO - codeparrot_training - Step 5827: {'lr': 0.0004921986019499078, 'samples': 1118976, 'steps': 5827, 'loss/train': 1.0988933444023132} 01/27/2022 01:20:48 - INFO - codeparrot_training - Step 5828: {'lr': 0.0004921945457384516, 'samples': 1119168, 'steps': 5828, 'loss/train': 0.7488349378108978} 01/27/2022 01:20:52 - INFO - codeparrot_training - Step 5829: {'lr': 0.0004921904884895108, 'samples': 1119360, 'steps': 5829, 'loss/train': 0.7922382652759552} 01/27/2022 01:20:55 - INFO - codeparrot_training - Step 5830: {'lr': 0.000492186430203103, 'samples': 1119552, 'steps': 5830, 'loss/train': 1.0144742131233215} 01/27/2022 01:20:59 - INFO - codeparrot_training - Step 5831: {'lr': 0.0004921823708792453, 'samples': 1119744, 'steps': 5831, 'loss/train': 1.01655113697052} 01/27/2022 01:21:02 - INFO - codeparrot_training - Step 5832: {'lr': 0.0004921783105179552, 'samples': 1119936, 'steps': 5832, 'loss/train': 0.9938004612922668} 01/27/2022 01:21:05 - INFO - codeparrot_training - Step 5833: {'lr': 0.0004921742491192502, 'samples': 1120128, 'steps': 5833, 'loss/train': 0.34569941461086273} 01/27/2022 01:21:09 - INFO - codeparrot_training - Step 5834: {'lr': 0.0004921701866831477, 'samples': 1120320, 'steps': 5834, 'loss/train': 0.836155503988266} 01/27/2022 01:21:12 - INFO - codeparrot_training - Step 5835: {'lr': 0.000492166123209665, 'samples': 1120512, 'steps': 5835, 'loss/train': 0.7126468420028687} 01/27/2022 01:21:15 - INFO - codeparrot_training - Step 5836: {'lr': 0.0004921620586988193, 'samples': 1120704, 'steps': 5836, 'loss/train': 0.7256429046392441} 01/27/2022 01:21:18 - INFO - codeparrot_training - Step 5837: {'lr': 0.0004921579931506285, 'samples': 1120896, 'steps': 5837, 'loss/train': 0.7341084480285645} 01/27/2022 01:21:21 - INFO - codeparrot_training 
- Step 5838: {'lr': 0.0004921539265651096, 'samples': 1121088, 'steps': 5838, 'loss/train': 0.8236607015132904} 01/27/2022 01:21:24 - INFO - codeparrot_training - Step 5839: {'lr': 0.0004921498589422803, 'samples': 1121280, 'steps': 5839, 'loss/train': 1.1045056879520416} 01/27/2022 01:21:30 - INFO - codeparrot_training - Step 5840: {'lr': 0.0004921457902821578, 'samples': 1121472, 'steps': 5840, 'loss/train': 0.9618598222732544} 01/27/2022 01:21:33 - INFO - codeparrot_training - Step 5841: {'lr': 0.0004921417205847597, 'samples': 1121664, 'steps': 5841, 'loss/train': 0.9128601551055908} 01/27/2022 01:21:37 - INFO - codeparrot_training - Step 5842: {'lr': 0.0004921376498501032, 'samples': 1121856, 'steps': 5842, 'loss/train': 1.0442813336849213} 01/27/2022 01:21:40 - INFO - codeparrot_training - Step 5843: {'lr': 0.000492133578078206, 'samples': 1122048, 'steps': 5843, 'loss/train': 0.6117445081472397} 01/27/2022 01:21:43 - INFO - codeparrot_training - Step 5844: {'lr': 0.0004921295052690855, 'samples': 1122240, 'steps': 5844, 'loss/train': 0.5128631293773651} 01/27/2022 01:21:46 - INFO - codeparrot_training - Step 5845: {'lr': 0.000492125431422759, 'samples': 1122432, 'steps': 5845, 'loss/train': 0.6906043142080307} 01/27/2022 01:21:49 - INFO - codeparrot_training - Step 5846: {'lr': 0.0004921213565392441, 'samples': 1122624, 'steps': 5846, 'loss/train': 1.3197895288467407} 01/27/2022 01:21:52 - INFO - codeparrot_training - Step 5847: {'lr': 0.000492117280618558, 'samples': 1122816, 'steps': 5847, 'loss/train': 0.3206147029995918} 01/27/2022 01:21:55 - INFO - codeparrot_training - Step 5848: {'lr': 0.0004921132036607186, 'samples': 1123008, 'steps': 5848, 'loss/train': 0.9042716324329376} 01/27/2022 01:22:00 - INFO - codeparrot_training - Step 5849: {'lr': 0.0004921091256657429, 'samples': 1123200, 'steps': 5849, 'loss/train': 0.8231405317783356} 01/27/2022 01:22:03 - INFO - codeparrot_training - Step 5850: {'lr': 0.0004921050466336487, 'samples': 1123392, 'steps': 5850, 'loss/train': 0.8839819729328156} 01/27/2022 01:22:06 - INFO - codeparrot_training - Step 5851: {'lr': 0.0004921009665644535, 'samples': 1123584, 'steps': 5851, 'loss/train': 0.9857703745365143} 01/27/2022 01:22:09 - INFO - codeparrot_training - Step 5852: {'lr': 0.0004920968854581745, 'samples': 1123776, 'steps': 5852, 'loss/train': 1.2525494992733002} 01/27/2022 01:22:12 - INFO - codeparrot_training - Step 5853: {'lr': 0.0004920928033148292, 'samples': 1123968, 'steps': 5853, 'loss/train': 1.3082900047302246} 01/27/2022 01:22:16 - INFO - codeparrot_training - Step 5854: {'lr': 0.0004920887201344353, 'samples': 1124160, 'steps': 5854, 'loss/train': 0.8569275140762329} 01/27/2022 01:22:19 - INFO - codeparrot_training - Step 5855: {'lr': 0.0004920846359170103, 'samples': 1124352, 'steps': 5855, 'loss/train': 0.9443888962268829} 01/27/2022 01:22:22 - INFO - codeparrot_training - Step 5856: {'lr': 0.0004920805506625714, 'samples': 1124544, 'steps': 5856, 'loss/train': 0.5392933487892151} 01/27/2022 01:22:28 - INFO - codeparrot_training - Step 5857: {'lr': 0.0004920764643711364, 'samples': 1124736, 'steps': 5857, 'loss/train': 1.0580498278141022} 01/27/2022 01:22:31 - INFO - codeparrot_training - Step 5858: {'lr': 0.0004920723770427226, 'samples': 1124928, 'steps': 5858, 'loss/train': 0.879225879907608} 01/27/2022 01:22:34 - INFO - codeparrot_training - Step 5859: {'lr': 0.0004920682886773478, 'samples': 1125120, 'steps': 5859, 'loss/train': 0.7002574503421783} 01/27/2022 01:22:37 - INFO - codeparrot_training - Step 5860: 
{'lr': 0.000492064199275029, 'samples': 1125312, 'steps': 5860, 'loss/train': 0.800474613904953} 01/27/2022 01:22:40 - INFO - codeparrot_training - Step 5861: {'lr': 0.0004920601088357844, 'samples': 1125504, 'steps': 5861, 'loss/train': 0.1851087100803852} 01/27/2022 01:22:43 - INFO - codeparrot_training - Step 5862: {'lr': 0.0004920560173596309, 'samples': 1125696, 'steps': 5862, 'loss/train': 1.0077109336853027} 01/27/2022 01:22:46 - INFO - codeparrot_training - Step 5863: {'lr': 0.0004920519248465864, 'samples': 1125888, 'steps': 5863, 'loss/train': 1.1799747347831726} 01/27/2022 01:22:50 - INFO - codeparrot_training - Step 5864: {'lr': 0.0004920478312966683, 'samples': 1126080, 'steps': 5864, 'loss/train': 0.9638159573078156} 01/27/2022 01:22:53 - INFO - codeparrot_training - Step 5865: {'lr': 0.0004920437367098941, 'samples': 1126272, 'steps': 5865, 'loss/train': 1.1976197361946106} 01/27/2022 01:22:56 - INFO - codeparrot_training - Step 5866: {'lr': 0.0004920396410862815, 'samples': 1126464, 'steps': 5866, 'loss/train': 5.9527119398117065} 01/27/2022 01:23:00 - INFO - codeparrot_training - Step 5867: {'lr': 0.0004920355444258479, 'samples': 1126656, 'steps': 5867, 'loss/train': 0.9419665932655334} 01/27/2022 01:23:03 - INFO - codeparrot_training - Step 5868: {'lr': 0.0004920314467286108, 'samples': 1126848, 'steps': 5868, 'loss/train': 0.8910782039165497} 01/27/2022 01:23:07 - INFO - codeparrot_training - Step 5869: {'lr': 0.0004920273479945878, 'samples': 1127040, 'steps': 5869, 'loss/train': 1.1802368760108948} 01/27/2022 01:23:10 - INFO - codeparrot_training - Step 5870: {'lr': 0.0004920232482237966, 'samples': 1127232, 'steps': 5870, 'loss/train': 0.8981773853302002} 01/27/2022 01:23:13 - INFO - codeparrot_training - Step 5871: {'lr': 0.0004920191474162547, 'samples': 1127424, 'steps': 5871, 'loss/train': 1.1009838581085205} 01/27/2022 01:23:16 - INFO - codeparrot_training - Step 5872: {'lr': 0.0004920150455719795, 'samples': 1127616, 'steps': 5872, 'loss/train': 0.709509551525116} 01/27/2022 01:23:19 - INFO - codeparrot_training - Step 5873: {'lr': 0.0004920109426909887, 'samples': 1127808, 'steps': 5873, 'loss/train': 0.9728664457798004} 01/27/2022 01:23:22 - INFO - codeparrot_training - Step 5874: {'lr': 0.0004920068387733, 'samples': 1128000, 'steps': 5874, 'loss/train': 1.3111307322978973} 01/27/2022 01:23:25 - INFO - codeparrot_training - Step 5875: {'lr': 0.0004920027338189307, 'samples': 1128192, 'steps': 5875, 'loss/train': 0.833376556634903} 01/27/2022 01:23:30 - INFO - codeparrot_training - Step 5876: {'lr': 0.0004919986278278986, 'samples': 1128384, 'steps': 5876, 'loss/train': 0.8077880144119263} 01/27/2022 01:23:33 - INFO - codeparrot_training - Step 5877: {'lr': 0.0004919945208002212, 'samples': 1128576, 'steps': 5877, 'loss/train': 0.672213077545166} 01/27/2022 01:23:36 - INFO - codeparrot_training - Step 5878: {'lr': 0.0004919904127359162, 'samples': 1128768, 'steps': 5878, 'loss/train': 0.775629848241806} 01/27/2022 01:23:39 - INFO - codeparrot_training - Step 5879: {'lr': 0.000491986303635001, 'samples': 1128960, 'steps': 5879, 'loss/train': 1.0484599471092224} 01/27/2022 01:23:42 - INFO - codeparrot_training - Step 5880: {'lr': 0.0004919821934974933, 'samples': 1129152, 'steps': 5880, 'loss/train': 0.3161552846431732} 01/27/2022 01:23:46 - INFO - codeparrot_training - Step 5881: {'lr': 0.0004919780823234108, 'samples': 1129344, 'steps': 5881, 'loss/train': 1.1396743655204773} 01/27/2022 01:23:49 - INFO - codeparrot_training - Step 5882: {'lr': 
0.000491973970112771, 'samples': 1129536, 'steps': 5882, 'loss/train': 0.3890655040740967} 01/27/2022 01:23:52 - INFO - codeparrot_training - Step 5883: {'lr': 0.0004919698568655916, 'samples': 1129728, 'steps': 5883, 'loss/train': 0.9291070103645325} 01/27/2022 01:23:58 - INFO - codeparrot_training - Step 5884: {'lr': 0.0004919657425818901, 'samples': 1129920, 'steps': 5884, 'loss/train': 1.1539075076580048} 01/27/2022 01:24:01 - INFO - codeparrot_training - Step 5885: {'lr': 0.0004919616272616842, 'samples': 1130112, 'steps': 5885, 'loss/train': 0.2180122286081314} 01/27/2022 01:24:04 - INFO - codeparrot_training - Step 5886: {'lr': 0.0004919575109049915, 'samples': 1130304, 'steps': 5886, 'loss/train': 0.759196937084198} 01/27/2022 01:24:07 - INFO - codeparrot_training - Step 5887: {'lr': 0.0004919533935118296, 'samples': 1130496, 'steps': 5887, 'loss/train': 0.8594117760658264} 01/27/2022 01:24:10 - INFO - codeparrot_training - Step 5888: {'lr': 0.0004919492750822163, 'samples': 1130688, 'steps': 5888, 'loss/train': 0.8095922470092773} 01/27/2022 01:24:13 - INFO - codeparrot_training - Step 5889: {'lr': 0.0004919451556161692, 'samples': 1130880, 'steps': 5889, 'loss/train': 0.9852256178855896} 01/27/2022 01:24:17 - INFO - codeparrot_training - Step 5890: {'lr': 0.0004919410351137058, 'samples': 1131072, 'steps': 5890, 'loss/train': 0.7036311775445938} 01/27/2022 01:24:20 - INFO - codeparrot_training - Step 5891: {'lr': 0.0004919369135748438, 'samples': 1131264, 'steps': 5891, 'loss/train': 1.5173825025558472} 01/27/2022 01:24:23 - INFO - codeparrot_training - Step 5892: {'lr': 0.0004919327909996008, 'samples': 1131456, 'steps': 5892, 'loss/train': 0.6817954480648041} 01/27/2022 01:24:26 - INFO - codeparrot_training - Step 5893: {'lr': 0.0004919286673879948, 'samples': 1131648, 'steps': 5893, 'loss/train': 0.9980740249156952} 01/27/2022 01:24:31 - INFO - codeparrot_training - Step 5894: {'lr': 0.000491924542740043, 'samples': 1131840, 'steps': 5894, 'loss/train': 0.3837336301803589} 01/27/2022 01:24:34 - INFO - codeparrot_training - Step 5895: {'lr': 0.0004919204170557634, 'samples': 1132032, 'steps': 5895, 'loss/train': 0.9161497950553894} 01/27/2022 01:24:37 - INFO - codeparrot_training - Step 5896: {'lr': 0.0004919162903351734, 'samples': 1132224, 'steps': 5896, 'loss/train': 0.9028942286968231} 01/27/2022 01:24:40 - INFO - codeparrot_training - Step 5897: {'lr': 0.000491912162578291, 'samples': 1132416, 'steps': 5897, 'loss/train': 1.2400311827659607} 01/27/2022 01:24:43 - INFO - codeparrot_training - Step 5898: {'lr': 0.0004919080337851336, 'samples': 1132608, 'steps': 5898, 'loss/train': 1.3756625354290009} 01/27/2022 01:24:46 - INFO - codeparrot_training - Step 5899: {'lr': 0.000491903903955719, 'samples': 1132800, 'steps': 5899, 'loss/train': 0.8077528774738312} 01/27/2022 01:24:50 - INFO - codeparrot_training - Step 5900: {'lr': 0.0004918997730900649, 'samples': 1132992, 'steps': 5900, 'loss/train': 1.0166500210762024} 01/27/2022 01:24:53 - INFO - codeparrot_training - Step 5901: {'lr': 0.000491895641188189, 'samples': 1133184, 'steps': 5901, 'loss/train': 0.8172473609447479} 01/27/2022 01:24:56 - INFO - codeparrot_training - Step 5902: {'lr': 0.000491891508250109, 'samples': 1133376, 'steps': 5902, 'loss/train': 0.4539901614189148} 01/27/2022 01:25:00 - INFO - codeparrot_training - Step 5903: {'lr': 0.0004918873742758426, 'samples': 1133568, 'steps': 5903, 'loss/train': 0.9048154056072235} 01/27/2022 01:25:03 - INFO - codeparrot_training - Step 5904: {'lr': 
0.0004918832392654074, 'samples': 1133760, 'steps': 5904, 'loss/train': 0.5832163095474243} 01/27/2022 01:25:07 - INFO - codeparrot_training - Step 5905: {'lr': 0.0004918791032188214, 'samples': 1133952, 'steps': 5905, 'loss/train': 0.9354490041732788} 01/27/2022 01:25:10 - INFO - codeparrot_training - Step 5906: {'lr': 0.0004918749661361019, 'samples': 1134144, 'steps': 5906, 'loss/train': 0.41834887862205505} 01/27/2022 01:25:13 - INFO - codeparrot_training - Step 5907: {'lr': 0.000491870828017267, 'samples': 1134336, 'steps': 5907, 'loss/train': 0.6627690643072128} 01/27/2022 01:25:16 - INFO - codeparrot_training - Step 5908: {'lr': 0.0004918666888623342, 'samples': 1134528, 'steps': 5908, 'loss/train': 0.5592893958091736} 01/27/2022 01:25:19 - INFO - codeparrot_training - Step 5909: {'lr': 0.0004918625486713214, 'samples': 1134720, 'steps': 5909, 'loss/train': 0.4884107708930969} 01/27/2022 01:25:22 - INFO - codeparrot_training - Step 5910: {'lr': 0.0004918584074442462, 'samples': 1134912, 'steps': 5910, 'loss/train': 0.4042762666940689} 01/27/2022 01:25:28 - INFO - codeparrot_training - Step 5911: {'lr': 0.0004918542651811263, 'samples': 1135104, 'steps': 5911, 'loss/train': 0.7186342924833298} 01/27/2022 01:25:31 - INFO - codeparrot_training - Step 5912: {'lr': 0.0004918501218819796, 'samples': 1135296, 'steps': 5912, 'loss/train': 0.6176833212375641} 01/27/2022 01:25:34 - INFO - codeparrot_training - Step 5913: {'lr': 0.0004918459775468238, 'samples': 1135488, 'steps': 5913, 'loss/train': 0.7182808220386505} 01/27/2022 01:25:37 - INFO - codeparrot_training - Step 5914: {'lr': 0.0004918418321756766, 'samples': 1135680, 'steps': 5914, 'loss/train': 0.6809637397527695} 01/27/2022 01:25:40 - INFO - codeparrot_training - Step 5915: {'lr': 0.0004918376857685557, 'samples': 1135872, 'steps': 5915, 'loss/train': 0.8915555477142334} 01/27/2022 01:25:43 - INFO - codeparrot_training - Step 5916: {'lr': 0.000491833538325479, 'samples': 1136064, 'steps': 5916, 'loss/train': 0.8004566431045532} 01/27/2022 01:25:47 - INFO - codeparrot_training - Step 5917: {'lr': 0.0004918293898464643, 'samples': 1136256, 'steps': 5917, 'loss/train': 1.0044544637203217} 01/27/2022 01:25:50 - INFO - codeparrot_training - Step 5918: {'lr': 0.0004918252403315292, 'samples': 1136448, 'steps': 5918, 'loss/train': 1.0296278893947601} 01/27/2022 01:25:53 - INFO - codeparrot_training - Step 5919: {'lr': 0.0004918210897806916, 'samples': 1136640, 'steps': 5919, 'loss/train': 0.7947912812232971} 01/27/2022 01:25:57 - INFO - codeparrot_training - Step 5920: {'lr': 0.0004918169381939692, 'samples': 1136832, 'steps': 5920, 'loss/train': 0.7713005840778351} 01/27/2022 01:26:00 - INFO - codeparrot_training - Step 5921: {'lr': 0.0004918127855713799, 'samples': 1137024, 'steps': 5921, 'loss/train': 0.42193400859832764} 01/27/2022 01:26:04 - INFO - codeparrot_training - Step 5922: {'lr': 0.0004918086319129413, 'samples': 1137216, 'steps': 5922, 'loss/train': 0.8021037876605988} 01/27/2022 01:26:07 - INFO - codeparrot_training - Step 5923: {'lr': 0.0004918044772186714, 'samples': 1137408, 'steps': 5923, 'loss/train': 0.8656665086746216} 01/27/2022 01:26:10 - INFO - codeparrot_training - Step 5924: {'lr': 0.0004918003214885877, 'samples': 1137600, 'steps': 5924, 'loss/train': 0.9308280944824219} 01/27/2022 01:26:13 - INFO - codeparrot_training - Step 5925: {'lr': 0.0004917961647227084, 'samples': 1137792, 'steps': 5925, 'loss/train': 0.8462794125080109} 01/27/2022 01:26:16 - INFO - codeparrot_training - Step 5926: {'lr': 
0.0004917920069210511, 'samples': 1137984, 'steps': 5926, 'loss/train': 0.9949426352977753} 01/27/2022 01:26:19 - INFO - codeparrot_training - Step 5927: {'lr': 0.0004917878480836336, 'samples': 1138176, 'steps': 5927, 'loss/train': 0.6819029599428177} 01/27/2022 01:26:22 - INFO - codeparrot_training - Step 5928: {'lr': 0.0004917836882104737, 'samples': 1138368, 'steps': 5928, 'loss/train': 0.9051776826381683} 01/27/2022 01:26:27 - INFO - codeparrot_training - Step 5929: {'lr': 0.0004917795273015892, 'samples': 1138560, 'steps': 5929, 'loss/train': 0.18653736636042595} 01/27/2022 01:26:30 - INFO - codeparrot_training - Step 5930: {'lr': 0.0004917753653569981, 'samples': 1138752, 'steps': 5930, 'loss/train': 1.3077355027198792} 01/27/2022 01:26:33 - INFO - codeparrot_training - Step 5931: {'lr': 0.000491771202376718, 'samples': 1138944, 'steps': 5931, 'loss/train': 0.48296669125556946} 01/27/2022 01:26:36 - INFO - codeparrot_training - Step 5932: {'lr': 0.000491767038360767, 'samples': 1139136, 'steps': 5932, 'loss/train': 0.16821619123220444} 01/27/2022 01:26:40 - INFO - codeparrot_training - Step 5933: {'lr': 0.0004917628733091626, 'samples': 1139328, 'steps': 5933, 'loss/train': 0.9137832820415497} 01/27/2022 01:26:43 - INFO - codeparrot_training - Step 5934: {'lr': 0.000491758707221923, 'samples': 1139520, 'steps': 5934, 'loss/train': 0.43596839904785156} 01/27/2022 01:26:46 - INFO - codeparrot_training - Step 5935: {'lr': 0.0004917545400990657, 'samples': 1139712, 'steps': 5935, 'loss/train': 0.8321780562400818} 01/27/2022 01:26:49 - INFO - codeparrot_training - Step 5936: {'lr': 0.0004917503719406087, 'samples': 1139904, 'steps': 5936, 'loss/train': 1.1725132763385773} 01/27/2022 01:26:52 - INFO - codeparrot_training - Step 5937: {'lr': 0.00049174620274657, 'samples': 1140096, 'steps': 5937, 'loss/train': 1.15487739443779} 01/27/2022 01:26:58 - INFO - codeparrot_training - Step 5938: {'lr': 0.0004917420325169673, 'samples': 1140288, 'steps': 5938, 'loss/train': 0.8387462496757507} 01/27/2022 01:27:01 - INFO - codeparrot_training - Step 5939: {'lr': 0.0004917378612518185, 'samples': 1140480, 'steps': 5939, 'loss/train': 1.2521882057189941} 01/27/2022 01:27:04 - INFO - codeparrot_training - Step 5940: {'lr': 0.0004917336889511414, 'samples': 1140672, 'steps': 5940, 'loss/train': 0.5880907773971558} 01/27/2022 01:27:07 - INFO - codeparrot_training - Step 5941: {'lr': 0.0004917295156149539, 'samples': 1140864, 'steps': 5941, 'loss/train': 1.3937951624393463} 01/27/2022 01:27:10 - INFO - codeparrot_training - Step 5942: {'lr': 0.000491725341243274, 'samples': 1141056, 'steps': 5942, 'loss/train': 0.905097484588623} 01/27/2022 01:27:13 - INFO - codeparrot_training - Step 5943: {'lr': 0.0004917211658361196, 'samples': 1141248, 'steps': 5943, 'loss/train': 1.1747694611549377} 01/27/2022 01:27:17 - INFO - codeparrot_training - Step 5944: {'lr': 0.0004917169893935083, 'samples': 1141440, 'steps': 5944, 'loss/train': 0.2602001801133156} 01/27/2022 01:27:20 - INFO - codeparrot_training - Step 5945: {'lr': 0.0004917128119154582, 'samples': 1141632, 'steps': 5945, 'loss/train': 1.4339193999767303} 01/27/2022 01:27:23 - INFO - codeparrot_training - Step 5946: {'lr': 0.0004917086334019872, 'samples': 1141824, 'steps': 5946, 'loss/train': 0.6126915067434311} 01/27/2022 01:27:27 - INFO - codeparrot_training - Step 5947: {'lr': 0.0004917044538531131, 'samples': 1142016, 'steps': 5947, 'loss/train': 1.2185064554214478} 01/27/2022 01:27:30 - INFO - codeparrot_training - Step 5948: {'lr': 
0.000491700273268854, 'samples': 1142208, 'steps': 5948, 'loss/train': 1.0312707424163818} 01/27/2022 01:27:34 - INFO - codeparrot_training - Step 5949: {'lr': 0.0004916960916492276, 'samples': 1142400, 'steps': 5949, 'loss/train': 0.8356442749500275} 01/27/2022 01:27:37 - INFO - codeparrot_training - Step 5950: {'lr': 0.0004916919089942519, 'samples': 1142592, 'steps': 5950, 'loss/train': 0.9691996872425079} 01/27/2022 01:27:40 - INFO - codeparrot_training - Step 5951: {'lr': 0.0004916877253039448, 'samples': 1142784, 'steps': 5951, 'loss/train': 0.921790212392807} 01/27/2022 01:27:43 - INFO - codeparrot_training - Step 5952: {'lr': 0.0004916835405783242, 'samples': 1142976, 'steps': 5952, 'loss/train': 0.747716635465622} 01/27/2022 01:27:46 - INFO - codeparrot_training - Step 5953: {'lr': 0.0004916793548174081, 'samples': 1143168, 'steps': 5953, 'loss/train': 1.088217169046402} 01/27/2022 01:27:49 - INFO - codeparrot_training - Step 5954: {'lr': 0.0004916751680212145, 'samples': 1143360, 'steps': 5954, 'loss/train': 0.9205394983291626} 01/27/2022 01:27:53 - INFO - codeparrot_training - Step 5955: {'lr': 0.000491670980189761, 'samples': 1143552, 'steps': 5955, 'loss/train': 0.8024406731128693} 01/27/2022 01:27:57 - INFO - codeparrot_training - Step 5956: {'lr': 0.0004916667913230659, 'samples': 1143744, 'steps': 5956, 'loss/train': 0.5471832454204559} 01/27/2022 01:28:00 - INFO - codeparrot_training - Step 5957: {'lr': 0.000491662601421147, 'samples': 1143936, 'steps': 5957, 'loss/train': 1.0969154834747314} 01/27/2022 01:28:03 - INFO - codeparrot_training - Step 5958: {'lr': 0.0004916584104840222, 'samples': 1144128, 'steps': 5958, 'loss/train': 1.4486732482910156} 01/27/2022 01:28:06 - INFO - codeparrot_training - Step 5959: {'lr': 0.0004916542185117095, 'samples': 1144320, 'steps': 5959, 'loss/train': 0.6395173072814941} 01/27/2022 01:28:10 - INFO - codeparrot_training - Step 5960: {'lr': 0.0004916500255042268, 'samples': 1144512, 'steps': 5960, 'loss/train': 0.8952089846134186} 01/27/2022 01:28:13 - INFO - codeparrot_training - Step 5961: {'lr': 0.0004916458314615923, 'samples': 1144704, 'steps': 5961, 'loss/train': 0.8460304141044617} 01/27/2022 01:28:16 - INFO - codeparrot_training - Step 5962: {'lr': 0.0004916416363838237, 'samples': 1144896, 'steps': 5962, 'loss/train': 0.7959212064743042} 01/27/2022 01:28:19 - INFO - codeparrot_training - Step 5963: {'lr': 0.000491637440270939, 'samples': 1145088, 'steps': 5963, 'loss/train': 0.9367941319942474} 01/27/2022 01:28:22 - INFO - codeparrot_training - Step 5964: {'lr': 0.0004916332431229562, 'samples': 1145280, 'steps': 5964, 'loss/train': 0.7599012851715088} 01/27/2022 01:28:28 - INFO - codeparrot_training - Step 5965: {'lr': 0.0004916290449398934, 'samples': 1145472, 'steps': 5965, 'loss/train': 0.8075355291366577} 01/27/2022 01:28:31 - INFO - codeparrot_training - Step 5966: {'lr': 0.0004916248457217686, 'samples': 1145664, 'steps': 5966, 'loss/train': 1.1928081214427948} 01/27/2022 01:28:34 - INFO - codeparrot_training - Step 5967: {'lr': 0.0004916206454685995, 'samples': 1145856, 'steps': 5967, 'loss/train': 0.20665942132472992} 01/27/2022 01:28:37 - INFO - codeparrot_training - Step 5968: {'lr': 0.0004916164441804044, 'samples': 1146048, 'steps': 5968, 'loss/train': 1.3763956725597382} 01/27/2022 01:28:41 - INFO - codeparrot_training - Step 5969: {'lr': 0.0004916122418572011, 'samples': 1146240, 'steps': 5969, 'loss/train': 1.206805408000946} 01/27/2022 01:28:44 - INFO - codeparrot_training - Step 5970: {'lr': 
0.0004916080384990077, 'samples': 1146432, 'steps': 5970, 'loss/train': 0.7197115570306778} 01/27/2022 01:28:47 - INFO - codeparrot_training - Step 5971: {'lr': 0.0004916038341058423, 'samples': 1146624, 'steps': 5971, 'loss/train': 0.5604524463415146} 01/27/2022 01:28:50 - INFO - codeparrot_training - Step 5972: {'lr': 0.0004915996286777226, 'samples': 1146816, 'steps': 5972, 'loss/train': 1.1775434911251068} 01/27/2022 01:28:53 - INFO - codeparrot_training - Step 5973: {'lr': 0.0004915954222146669, 'samples': 1147008, 'steps': 5973, 'loss/train': 0.3150007277727127} 01/27/2022 01:28:58 - INFO - codeparrot_training - Step 5974: {'lr': 0.0004915912147166932, 'samples': 1147200, 'steps': 5974, 'loss/train': 0.9859269261360168} 01/27/2022 01:29:01 - INFO - codeparrot_training - Step 5975: {'lr': 0.0004915870061838193, 'samples': 1147392, 'steps': 5975, 'loss/train': 1.090939611196518} 01/27/2022 01:29:04 - INFO - codeparrot_training - Step 5976: {'lr': 0.0004915827966160634, 'samples': 1147584, 'steps': 5976, 'loss/train': 0.7837338745594025} 01/27/2022 01:29:07 - INFO - codeparrot_training - Step 5977: {'lr': 0.0004915785860134436, 'samples': 1147776, 'steps': 5977, 'loss/train': 0.41621553897857666} 01/27/2022 01:29:10 - INFO - codeparrot_training - Step 5978: {'lr': 0.0004915743743759779, 'samples': 1147968, 'steps': 5978, 'loss/train': 0.8034381866455078} 01/27/2022 01:29:13 - INFO - codeparrot_training - Step 5979: {'lr': 0.0004915701617036842, 'samples': 1148160, 'steps': 5979, 'loss/train': 0.47881655395030975} 01/27/2022 01:29:16 - INFO - codeparrot_training - Step 5980: {'lr': 0.0004915659479965806, 'samples': 1148352, 'steps': 5980, 'loss/train': 1.1428370475769043} 01/27/2022 01:29:20 - INFO - codeparrot_training - Step 5981: {'lr': 0.0004915617332546852, 'samples': 1148544, 'steps': 5981, 'loss/train': 0.6230627149343491} 01/27/2022 01:29:23 - INFO - codeparrot_training - Step 5982: {'lr': 0.0004915575174780161, 'samples': 1148736, 'steps': 5982, 'loss/train': 0.38649125397205353} 01/27/2022 01:29:28 - INFO - codeparrot_training - Step 5983: {'lr': 0.0004915533006665912, 'samples': 1148928, 'steps': 5983, 'loss/train': 0.6079123020172119} 01/27/2022 01:29:31 - INFO - codeparrot_training - Step 5984: {'lr': 0.0004915490828204287, 'samples': 1149120, 'steps': 5984, 'loss/train': 0.9055639207363129} 01/27/2022 01:29:34 - INFO - codeparrot_training - Step 5985: {'lr': 0.0004915448639395466, 'samples': 1149312, 'steps': 5985, 'loss/train': 0.8659065663814545} 01/27/2022 01:29:38 - INFO - codeparrot_training - Step 5986: {'lr': 0.0004915406440239631, 'samples': 1149504, 'steps': 5986, 'loss/train': 1.0837918817996979} 01/27/2022 01:29:41 - INFO - codeparrot_training - Step 5987: {'lr': 0.0004915364230736961, 'samples': 1149696, 'steps': 5987, 'loss/train': 0.861989289522171} 01/27/2022 01:29:44 - INFO - codeparrot_training - Step 5988: {'lr': 0.0004915322010887637, 'samples': 1149888, 'steps': 5988, 'loss/train': 1.079150140285492} 01/27/2022 01:29:47 - INFO - codeparrot_training - Step 5989: {'lr': 0.0004915279780691843, 'samples': 1150080, 'steps': 5989, 'loss/train': 0.7141554951667786} 01/27/2022 01:29:50 - INFO - codeparrot_training - Step 5990: {'lr': 0.0004915237540149755, 'samples': 1150272, 'steps': 5990, 'loss/train': 1.0042881667613983} 01/27/2022 01:29:53 - INFO - codeparrot_training - Step 5991: {'lr': 0.0004915195289261557, 'samples': 1150464, 'steps': 5991, 'loss/train': 1.0117742121219635} 01/27/2022 01:29:58 - INFO - codeparrot_training - Step 5992: {'lr': 
0.0004915153028027429, 'samples': 1150656, 'steps': 5992, 'loss/train': 1.0333560705184937} 01/27/2022 01:30:01 - INFO - codeparrot_training - Step 5993: {'lr': 0.0004915110756447552, 'samples': 1150848, 'steps': 5993, 'loss/train': 0.6415847092866898} 01/27/2022 01:30:04 - INFO - codeparrot_training - Step 5994: {'lr': 0.0004915068474522109, 'samples': 1151040, 'steps': 5994, 'loss/train': 0.4053955078125} 01/27/2022 01:30:07 - INFO - codeparrot_training - Step 5995: {'lr': 0.0004915026182251278, 'samples': 1151232, 'steps': 5995, 'loss/train': 1.1155361831188202} 01/27/2022 01:30:10 - INFO - codeparrot_training - Step 5996: {'lr': 0.0004914983879635242, 'samples': 1151424, 'steps': 5996, 'loss/train': 1.0008392035961151} 01/27/2022 01:30:14 - INFO - codeparrot_training - Step 5997: {'lr': 0.0004914941566674183, 'samples': 1151616, 'steps': 5997, 'loss/train': 0.9563375115394592} 01/27/2022 01:30:17 - INFO - codeparrot_training - Step 5998: {'lr': 0.0004914899243368279, 'samples': 1151808, 'steps': 5998, 'loss/train': 0.758020430803299} 01/27/2022 01:30:20 - INFO - codeparrot_training - Step 5999: {'lr': 0.0004914856909717715, 'samples': 1152000, 'steps': 5999, 'loss/train': 0.8879449367523193}
01/27/2022 01:30:20 - INFO - codeparrot_training - Evaluating and saving model checkpoint
01/27/2022 01:30:38 - WARNING - huggingface_hub.repository - Several commits (3) will be pushed upstream.
01/27/2022 01:30:38 - WARNING - huggingface_hub.repository - The progress bars may be unreliable.
01/27/2022 01:31:13 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py da12221..8aa5ec4 royal-monkey-12 -> royal-monkey-12
01/27/2022 01:31:17 - INFO - codeparrot_training - Step 6000: {'lr': 0.0004914814565722671, 'samples': 1152192, 'steps': 6000, 'loss/train': 0.47781872749328613} 01/27/2022 01:31:22 - INFO - codeparrot_training - Step 6001: {'lr': 0.0004914772211383327, 'samples': 1152384, 'steps': 6001, 'loss/train': 0.7370853871107101} 01/27/2022 01:31:25 - INFO - codeparrot_training - Step 6002: {'lr': 0.0004914729846699867, 'samples': 1152576, 'steps': 6002, 'loss/train': 1.0314266681671143} 01/27/2022 01:31:29 - INFO - codeparrot_training - Step 6003: {'lr': 0.000491468747167247, 'samples': 1152768, 'steps': 6003, 'loss/train': 0.5577990710735321} 01/27/2022 01:31:32 - INFO - codeparrot_training - Step 6004: {'lr': 0.0004914645086301319, 'samples': 1152960, 'steps': 6004, 'loss/train': 0.9060070216655731} 01/27/2022 01:31:35 - INFO - codeparrot_training - Step 6005: {'lr': 0.0004914602690586596, 'samples': 1153152, 'steps': 6005, 'loss/train': 1.1503175497055054} 01/27/2022 01:31:38 - INFO - codeparrot_training - Step 6006: {'lr': 0.0004914560284528481, 'samples': 1153344, 'steps': 6006, 'loss/train': 0.6653635650873184} 01/27/2022 01:31:41 - INFO - codeparrot_training - Step 6007: {'lr': 0.0004914517868127156, 'samples': 1153536, 'steps': 6007, 'loss/train': 0.8128127753734589} 01/27/2022 01:31:44 - INFO - codeparrot_training - Step 6008: {'lr': 0.0004914475441382804, 'samples': 1153728, 'steps': 6008, 'loss/train': 0.7923440337181091} 01/27/2022 01:31:50 - INFO - codeparrot_training - Step 6009: {'lr': 0.0004914433004295605, 'samples': 1153920, 'steps': 6009, 'loss/train': 0.6783936470746994} 01/27/2022 01:31:53 - INFO - codeparrot_training - Step 6010: {'lr': 0.0004914390556865743, 'samples': 1154112, 'steps': 6010, 'loss/train': 1.2388244569301605} 01/27/2022 01:31:56 - INFO - codeparrot_training - Step 6011: {'lr': 0.0004914348099093398, 'samples': 1154304, 'steps': 6011, 'loss/train': 0.7230310589075089}
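A note on the 'lr' and 'samples' values logged around this checkpoint: the learning rate decays smoothly just below 5e-4 while 'samples' grows by exactly 192 per step. The short sketch below is illustrative only and is not taken from the training script (which this log does not show): the helper name lr_at and the constants PEAK_LR, WARMUP_STEPS, TOTAL_STEPS and SAMPLES_PER_STEP are assumptions chosen because they reproduce the logged numbers, not settings confirmed anywhere in the log.

```python
import math

# Assumed schedule parameters (inferred from the logged values, not from any config):
PEAK_LR = 5e-4          # learning rate the decay appears to start from
WARMUP_STEPS = 2_000    # assumed linear warmup length
TOTAL_STEPS = 50_000    # assumed training horizon for the cosine decay
SAMPLES_PER_STEP = 192  # the 'samples' counter grows by 192 per logged step

def lr_at(step: int) -> float:
    """Linear warmup followed by a half-cosine decay to zero (illustrative sketch)."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    progress = (step - WARMUP_STEPS) / (TOTAL_STEPS - WARMUP_STEPS)
    return PEAK_LR * 0.5 * (1.0 + math.cos(math.pi * progress))

for step in (5750, 6000, 6187):
    print(step, f"lr~{lr_at(step):.10f}", f"samples={SAMPLES_PER_STEP * (step + 1)}")
# 6000 lr~0.0004914815 samples=1152192   (logged: lr 0.0004914814565..., samples 1152192)
```

Under these assumptions the formula reproduces the learning rates logged in this section almost exactly and the 'samples' column exactly, which matches the behaviour of a standard cosine-with-warmup scheduler (for example transformers' get_cosine_schedule_with_warmup with its defaults); the log itself never names the scheduler, so treat this as a plausible reconstruction rather than the run's actual configuration.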
01/27/2022 01:31:59 - INFO - codeparrot_training - Step 6012: {'lr': 0.0004914305630978751, 'samples': 1154496, 'steps': 6012, 'loss/train': 0.3604404777288437} 01/27/2022 01:32:02 - INFO - codeparrot_training - Step 6013: {'lr': 0.0004914263152521987, 'samples': 1154688, 'steps': 6013, 'loss/train': 0.868224173784256} 01/27/2022 01:32:06 - INFO - codeparrot_training - Step 6014: {'lr': 0.0004914220663723286, 'samples': 1154880, 'steps': 6014, 'loss/train': 0.30920009315013885} 01/27/2022 01:32:09 - INFO - codeparrot_training - Step 6015: {'lr': 0.0004914178164582829, 'samples': 1155072, 'steps': 6015, 'loss/train': 0.844325065612793} 01/27/2022 01:32:12 - INFO - codeparrot_training - Step 6016: {'lr': 0.0004914135655100801, 'samples': 1155264, 'steps': 6016, 'loss/train': 0.7064405232667923} 01/27/2022 01:32:15 - INFO - codeparrot_training - Step 6017: {'lr': 0.0004914093135277381, 'samples': 1155456, 'steps': 6017, 'loss/train': 0.7322920113801956} 01/27/2022 01:32:19 - INFO - codeparrot_training - Step 6018: {'lr': 0.0004914050605112753, 'samples': 1155648, 'steps': 6018, 'loss/train': 1.3764962553977966} 01/27/2022 01:32:23 - INFO - codeparrot_training - Step 6019: {'lr': 0.00049140080646071, 'samples': 1155840, 'steps': 6019, 'loss/train': 0.7494382560253143} 01/27/2022 01:32:26 - INFO - codeparrot_training - Step 6020: {'lr': 0.0004913965513760601, 'samples': 1156032, 'steps': 6020, 'loss/train': 0.9572587609291077} 01/27/2022 01:32:29 - INFO - codeparrot_training - Step 6021: {'lr': 0.0004913922952573442, 'samples': 1156224, 'steps': 6021, 'loss/train': 0.9611290991306305} 01/27/2022 01:32:32 - INFO - codeparrot_training - Step 6022: {'lr': 0.0004913880381045803, 'samples': 1156416, 'steps': 6022, 'loss/train': 0.61031574010849} 01/27/2022 01:32:35 - INFO - codeparrot_training - Step 6023: {'lr': 0.0004913837799177867, 'samples': 1156608, 'steps': 6023, 'loss/train': 1.847208559513092} 01/27/2022 01:32:38 - INFO - codeparrot_training - Step 6024: {'lr': 0.0004913795206969815, 'samples': 1156800, 'steps': 6024, 'loss/train': 0.6278470605611801} 01/27/2022 01:32:42 - INFO - codeparrot_training - Step 6025: {'lr': 0.0004913752604421833, 'samples': 1156992, 'steps': 6025, 'loss/train': 1.6419541239738464} 01/27/2022 01:32:45 - INFO - codeparrot_training - Step 6026: {'lr': 0.0004913709991534099, 'samples': 1157184, 'steps': 6026, 'loss/train': 0.6583697497844696} 01/27/2022 01:32:49 - INFO - codeparrot_training - Step 6027: {'lr': 0.00049136673683068, 'samples': 1157376, 'steps': 6027, 'loss/train': 0.7293607145547867} 01/27/2022 01:32:52 - INFO - codeparrot_training - Step 6028: {'lr': 0.0004913624734740115, 'samples': 1157568, 'steps': 6028, 'loss/train': 0.9262633323669434} 01/27/2022 01:32:55 - INFO - codeparrot_training - Step 6029: {'lr': 0.0004913582090834229, 'samples': 1157760, 'steps': 6029, 'loss/train': 0.9719429612159729} 01/27/2022 01:32:59 - INFO - codeparrot_training - Step 6030: {'lr': 0.0004913539436589323, 'samples': 1157952, 'steps': 6030, 'loss/train': 1.0582362413406372} 01/27/2022 01:33:02 - INFO - codeparrot_training - Step 6031: {'lr': 0.0004913496772005581, 'samples': 1158144, 'steps': 6031, 'loss/train': 0.9699996113777161} 01/27/2022 01:33:05 - INFO - codeparrot_training - Step 6032: {'lr': 0.0004913454097083185, 'samples': 1158336, 'steps': 6032, 'loss/train': 1.2441707253456116} 01/27/2022 01:33:08 - INFO - codeparrot_training - Step 6033: {'lr': 0.0004913411411822318, 'samples': 1158528, 'steps': 
6033, 'loss/train': 0.6149754077196121} 01/27/2022 01:33:11 - INFO - codeparrot_training - Step 6034: {'lr': 0.0004913368716223162, 'samples': 1158720, 'steps': 6034, 'loss/train': 0.6109058707952499} 01/27/2022 01:33:14 - INFO - codeparrot_training - Step 6035: {'lr': 0.0004913326010285902, 'samples': 1158912, 'steps': 6035, 'loss/train': 1.2850828170776367} 01/27/2022 01:33:19 - INFO - codeparrot_training - Step 6036: {'lr': 0.0004913283294010719, 'samples': 1159104, 'steps': 6036, 'loss/train': 0.24895741045475006} 01/27/2022 01:33:22 - INFO - codeparrot_training - Step 6037: {'lr': 0.0004913240567397797, 'samples': 1159296, 'steps': 6037, 'loss/train': 1.0583360195159912} 01/27/2022 01:33:25 - INFO - codeparrot_training - Step 6038: {'lr': 0.0004913197830447319, 'samples': 1159488, 'steps': 6038, 'loss/train': 1.0275395214557648} 01/27/2022 01:33:28 - INFO - codeparrot_training - Step 6039: {'lr': 0.0004913155083159467, 'samples': 1159680, 'steps': 6039, 'loss/train': 0.6769727468490601} 01/27/2022 01:33:31 - INFO - codeparrot_training - Step 6040: {'lr': 0.0004913112325534426, 'samples': 1159872, 'steps': 6040, 'loss/train': 0.6013341397047043} 01/27/2022 01:33:34 - INFO - codeparrot_training - Step 6041: {'lr': 0.0004913069557572376, 'samples': 1160064, 'steps': 6041, 'loss/train': 0.9155062437057495} 01/27/2022 01:33:37 - INFO - codeparrot_training - Step 6042: {'lr': 0.0004913026779273504, 'samples': 1160256, 'steps': 6042, 'loss/train': 0.6997316926717758} 01/27/2022 01:33:41 - INFO - codeparrot_training - Step 6043: {'lr': 0.0004912983990637992, 'samples': 1160448, 'steps': 6043, 'loss/train': 1.1326244473457336} 01/27/2022 01:33:47 - INFO - codeparrot_training - Step 6044: {'lr': 0.0004912941191666021, 'samples': 1160640, 'steps': 6044, 'loss/train': 1.2101694345474243} 01/27/2022 01:33:50 - INFO - codeparrot_training - Step 6045: {'lr': 0.0004912898382357777, 'samples': 1160832, 'steps': 6045, 'loss/train': 3.641731023788452} 01/27/2022 01:33:53 - INFO - codeparrot_training - Step 6046: {'lr': 0.0004912855562713443, 'samples': 1161024, 'steps': 6046, 'loss/train': 1.4224253296852112} 01/27/2022 01:33:56 - INFO - codeparrot_training - Step 6047: {'lr': 0.0004912812732733201, 'samples': 1161216, 'steps': 6047, 'loss/train': 0.5901898294687271} 01/27/2022 01:33:59 - INFO - codeparrot_training - Step 6048: {'lr': 0.0004912769892417236, 'samples': 1161408, 'steps': 6048, 'loss/train': 1.0473860800266266} 01/27/2022 01:34:02 - INFO - codeparrot_training - Step 6049: {'lr': 0.000491272704176573, 'samples': 1161600, 'steps': 6049, 'loss/train': 0.9029603898525238} 01/27/2022 01:34:05 - INFO - codeparrot_training - Step 6050: {'lr': 0.0004912684180778869, 'samples': 1161792, 'steps': 6050, 'loss/train': 0.7695573270320892} 01/27/2022 01:34:09 - INFO - codeparrot_training - Step 6051: {'lr': 0.0004912641309456834, 'samples': 1161984, 'steps': 6051, 'loss/train': 0.6422866433858871} 01/27/2022 01:34:12 - INFO - codeparrot_training - Step 6052: {'lr': 0.000491259842779981, 'samples': 1162176, 'steps': 6052, 'loss/train': 0.9152500927448273} 01/27/2022 01:34:15 - INFO - codeparrot_training - Step 6053: {'lr': 0.0004912555535807981, 'samples': 1162368, 'steps': 6053, 'loss/train': 0.91745924949646} 01/27/2022 01:34:20 - INFO - codeparrot_training - Step 6054: {'lr': 0.0004912512633481529, 'samples': 1162560, 'steps': 6054, 'loss/train': 0.8383043110370636} 01/27/2022 01:34:23 - INFO - codeparrot_training - Step 6055: {'lr': 0.0004912469720820639, 'samples': 1162752, 'steps': 6055, 
'loss/train': 0.9232521057128906} 01/27/2022 01:34:26 - INFO - codeparrot_training - Step 6056: {'lr': 0.0004912426797825495, 'samples': 1162944, 'steps': 6056, 'loss/train': 1.8707664012908936} 01/27/2022 01:34:29 - INFO - codeparrot_training - Step 6057: {'lr': 0.0004912383864496281, 'samples': 1163136, 'steps': 6057, 'loss/train': 1.1027304232120514} 01/27/2022 01:34:32 - INFO - codeparrot_training - Step 6058: {'lr': 0.0004912340920833182, 'samples': 1163328, 'steps': 6058, 'loss/train': 0.8153014183044434} 01/27/2022 01:34:35 - INFO - codeparrot_training - Step 6059: {'lr': 0.0004912297966836378, 'samples': 1163520, 'steps': 6059, 'loss/train': 0.9799639284610748} 01/27/2022 01:34:38 - INFO - codeparrot_training - Step 6060: {'lr': 0.0004912255002506057, 'samples': 1163712, 'steps': 6060, 'loss/train': 0.7983030080795288} 01/27/2022 01:34:42 - INFO - codeparrot_training - Step 6061: {'lr': 0.00049122120278424, 'samples': 1163904, 'steps': 6061, 'loss/train': 0.45533743500709534} 01/27/2022 01:34:46 - INFO - codeparrot_training - Step 6062: {'lr': 0.0004912169042845595, 'samples': 1164096, 'steps': 6062, 'loss/train': 0.9434081017971039} 01/27/2022 01:34:49 - INFO - codeparrot_training - Step 6063: {'lr': 0.0004912126047515821, 'samples': 1164288, 'steps': 6063, 'loss/train': 0.5123435854911804} 01/27/2022 01:34:52 - INFO - codeparrot_training - Step 6064: {'lr': 0.0004912083041853267, 'samples': 1164480, 'steps': 6064, 'loss/train': 0.6385739296674728} 01/27/2022 01:34:56 - INFO - codeparrot_training - Step 6065: {'lr': 0.0004912040025858114, 'samples': 1164672, 'steps': 6065, 'loss/train': 0.4284241944551468} 01/27/2022 01:34:59 - INFO - codeparrot_training - Step 6066: {'lr': 0.0004911996999530548, 'samples': 1164864, 'steps': 6066, 'loss/train': 0.7748903632164001} 01/27/2022 01:35:02 - INFO - codeparrot_training - Step 6067: {'lr': 0.0004911953962870754, 'samples': 1165056, 'steps': 6067, 'loss/train': 0.8712100982666016} 01/27/2022 01:35:05 - INFO - codeparrot_training - Step 6068: {'lr': 0.0004911910915878913, 'samples': 1165248, 'steps': 6068, 'loss/train': 1.1309532523155212} 01/27/2022 01:35:08 - INFO - codeparrot_training - Step 6069: {'lr': 0.0004911867858555212, 'samples': 1165440, 'steps': 6069, 'loss/train': 0.9333480298519135} 01/27/2022 01:35:11 - INFO - codeparrot_training - Step 6070: {'lr': 0.0004911824790899836, 'samples': 1165632, 'steps': 6070, 'loss/train': 0.4439709931612015} 01/27/2022 01:35:17 - INFO - codeparrot_training - Step 6071: {'lr': 0.0004911781712912968, 'samples': 1165824, 'steps': 6071, 'loss/train': 0.44669562578201294} 01/27/2022 01:35:20 - INFO - codeparrot_training - Step 6072: {'lr': 0.0004911738624594793, 'samples': 1166016, 'steps': 6072, 'loss/train': 1.1735219657421112} 01/27/2022 01:35:23 - INFO - codeparrot_training - Step 6073: {'lr': 0.0004911695525945494, 'samples': 1166208, 'steps': 6073, 'loss/train': 0.6576834172010422} 01/27/2022 01:35:26 - INFO - codeparrot_training - Step 6074: {'lr': 0.0004911652416965259, 'samples': 1166400, 'steps': 6074, 'loss/train': 0.9435459673404694} 01/27/2022 01:35:29 - INFO - codeparrot_training - Step 6075: {'lr': 0.000491160929765427, 'samples': 1166592, 'steps': 6075, 'loss/train': 0.8050745129585266} 01/27/2022 01:35:33 - INFO - codeparrot_training - Step 6076: {'lr': 0.0004911566168012714, 'samples': 1166784, 'steps': 6076, 'loss/train': 0.9714757204055786} 01/27/2022 01:35:36 - INFO - codeparrot_training - Step 6077: {'lr': 0.0004911523028040772, 'samples': 1166976, 'steps': 6077, 'loss/train': 
0.789343535900116} 01/27/2022 01:35:39 - INFO - codeparrot_training - Step 6078: {'lr': 0.0004911479877738633, 'samples': 1167168, 'steps': 6078, 'loss/train': 0.41696441173553467} 01/27/2022 01:35:42 - INFO - codeparrot_training - Step 6079: {'lr': 0.0004911436717106478, 'samples': 1167360, 'steps': 6079, 'loss/train': 0.8040503561496735} 01/27/2022 01:35:47 - INFO - codeparrot_training - Step 6080: {'lr': 0.0004911393546144495, 'samples': 1167552, 'steps': 6080, 'loss/train': 0.48819686472415924} 01/27/2022 01:35:50 - INFO - codeparrot_training - Step 6081: {'lr': 0.0004911350364852868, 'samples': 1167744, 'steps': 6081, 'loss/train': 0.8775950968265533} 01/27/2022 01:35:53 - INFO - codeparrot_training - Step 6082: {'lr': 0.0004911307173231782, 'samples': 1167936, 'steps': 6082, 'loss/train': 0.8804233968257904} 01/27/2022 01:35:56 - INFO - codeparrot_training - Step 6083: {'lr': 0.000491126397128142, 'samples': 1168128, 'steps': 6083, 'loss/train': 0.7251869291067123} 01/27/2022 01:35:59 - INFO - codeparrot_training - Step 6084: {'lr': 0.0004911220759001971, 'samples': 1168320, 'steps': 6084, 'loss/train': 1.0273682177066803} 01/27/2022 01:36:02 - INFO - codeparrot_training - Step 6085: {'lr': 0.0004911177536393616, 'samples': 1168512, 'steps': 6085, 'loss/train': 0.8009030520915985} 01/27/2022 01:36:05 - INFO - codeparrot_training - Step 6086: {'lr': 0.0004911134303456543, 'samples': 1168704, 'steps': 6086, 'loss/train': 1.618733525276184} 01/27/2022 01:36:09 - INFO - codeparrot_training - Step 6087: {'lr': 0.0004911091060190937, 'samples': 1168896, 'steps': 6087, 'loss/train': 0.7631807327270508} 01/27/2022 01:36:12 - INFO - codeparrot_training - Step 6088: {'lr': 0.0004911047806596981, 'samples': 1169088, 'steps': 6088, 'loss/train': 0.8910266160964966} 01/27/2022 01:36:18 - INFO - codeparrot_training - Step 6089: {'lr': 0.0004911004542674863, 'samples': 1169280, 'steps': 6089, 'loss/train': 0.9057877063751221} 01/27/2022 01:36:21 - INFO - codeparrot_training - Step 6090: {'lr': 0.0004910961268424766, 'samples': 1169472, 'steps': 6090, 'loss/train': 0.8887301981449127} 01/27/2022 01:36:24 - INFO - codeparrot_training - Step 6091: {'lr': 0.0004910917983846877, 'samples': 1169664, 'steps': 6091, 'loss/train': 1.258389562368393} 01/27/2022 01:36:27 - INFO - codeparrot_training - Step 6092: {'lr': 0.0004910874688941381, 'samples': 1169856, 'steps': 6092, 'loss/train': 1.1894950568675995} 01/27/2022 01:36:30 - INFO - codeparrot_training - Step 6093: {'lr': 0.0004910831383708464, 'samples': 1170048, 'steps': 6093, 'loss/train': 0.7038432955741882} 01/27/2022 01:36:33 - INFO - codeparrot_training - Step 6094: {'lr': 0.000491078806814831, 'samples': 1170240, 'steps': 6094, 'loss/train': 0.7837424576282501} 01/27/2022 01:36:36 - INFO - codeparrot_training - Step 6095: {'lr': 0.0004910744742261106, 'samples': 1170432, 'steps': 6095, 'loss/train': 0.35940442979335785} 01/27/2022 01:36:40 - INFO - codeparrot_training - Step 6096: {'lr': 0.0004910701406047037, 'samples': 1170624, 'steps': 6096, 'loss/train': 1.023830384016037} 01/27/2022 01:36:43 - INFO - codeparrot_training - Step 6097: {'lr': 0.0004910658059506289, 'samples': 1170816, 'steps': 6097, 'loss/train': 0.6115923821926117} 01/27/2022 01:36:47 - INFO - codeparrot_training - Step 6098: {'lr': 0.0004910614702639045, 'samples': 1171008, 'steps': 6098, 'loss/train': 0.9743988811969757} 01/27/2022 01:36:50 - INFO - codeparrot_training - Step 6099: {'lr': 0.0004910571335445496, 'samples': 1171200, 'steps': 6099, 'loss/train': 
0.9877602159976959} 01/27/2022 01:36:53 - INFO - codeparrot_training - Step 6100: {'lr': 0.0004910527957925823, 'samples': 1171392, 'steps': 6100, 'loss/train': 0.8897913694381714} 01/27/2022 01:36:57 - INFO - codeparrot_training - Step 6101: {'lr': 0.0004910484570080215, 'samples': 1171584, 'steps': 6101, 'loss/train': 0.8537431061267853} 01/27/2022 01:37:00 - INFO - codeparrot_training - Step 6102: {'lr': 0.0004910441171908855, 'samples': 1171776, 'steps': 6102, 'loss/train': 0.4798547476530075} 01/27/2022 01:37:03 - INFO - codeparrot_training - Step 6103: {'lr': 0.0004910397763411931, 'samples': 1171968, 'steps': 6103, 'loss/train': 0.7010283172130585} 01/27/2022 01:37:06 - INFO - codeparrot_training - Step 6104: {'lr': 0.000491035434458963, 'samples': 1172160, 'steps': 6104, 'loss/train': 1.1217038333415985} 01/27/2022 01:37:09 - INFO - codeparrot_training - Step 6105: {'lr': 0.0004910310915442135, 'samples': 1172352, 'steps': 6105, 'loss/train': 0.548427164554596} 01/27/2022 01:37:14 - INFO - codeparrot_training - Step 6106: {'lr': 0.0004910267475969633, 'samples': 1172544, 'steps': 6106, 'loss/train': 0.30546513944864273} 01/27/2022 01:37:17 - INFO - codeparrot_training - Step 6107: {'lr': 0.000491022402617231, 'samples': 1172736, 'steps': 6107, 'loss/train': 0.9987663924694061} 01/27/2022 01:37:20 - INFO - codeparrot_training - Step 6108: {'lr': 0.0004910180566050354, 'samples': 1172928, 'steps': 6108, 'loss/train': 0.8330307304859161} 01/27/2022 01:37:23 - INFO - codeparrot_training - Step 6109: {'lr': 0.0004910137095603949, 'samples': 1173120, 'steps': 6109, 'loss/train': 0.9698396623134613} 01/27/2022 01:37:26 - INFO - codeparrot_training - Step 6110: {'lr': 0.0004910093614833282, 'samples': 1173312, 'steps': 6110, 'loss/train': 0.7614223659038544} 01/27/2022 01:37:29 - INFO - codeparrot_training - Step 6111: {'lr': 0.000491005012373854, 'samples': 1173504, 'steps': 6111, 'loss/train': 0.49341000616550446} 01/27/2022 01:37:33 - INFO - codeparrot_training - Step 6112: {'lr': 0.0004910006622319908, 'samples': 1173696, 'steps': 6112, 'loss/train': 0.9399229288101196} 01/27/2022 01:37:36 - INFO - codeparrot_training - Step 6113: {'lr': 0.0004909963110577573, 'samples': 1173888, 'steps': 6113, 'loss/train': 0.8165886104106903} 01/27/2022 01:37:39 - INFO - codeparrot_training - Step 6114: {'lr': 0.000490991958851172, 'samples': 1174080, 'steps': 6114, 'loss/train': 0.38434669375419617} 01/27/2022 01:37:42 - INFO - codeparrot_training - Step 6115: {'lr': 0.0004909876056122538, 'samples': 1174272, 'steps': 6115, 'loss/train': 0.9941025674343109} 01/27/2022 01:37:48 - INFO - codeparrot_training - Step 6116: {'lr': 0.0004909832513410213, 'samples': 1174464, 'steps': 6116, 'loss/train': 1.3212283551692963} 01/27/2022 01:37:51 - INFO - codeparrot_training - Step 6117: {'lr': 0.000490978896037493, 'samples': 1174656, 'steps': 6117, 'loss/train': 0.7264234721660614} 01/27/2022 01:37:54 - INFO - codeparrot_training - Step 6118: {'lr': 0.0004909745397016876, 'samples': 1174848, 'steps': 6118, 'loss/train': 0.755801796913147} 01/27/2022 01:37:57 - INFO - codeparrot_training - Step 6119: {'lr': 0.0004909701823336238, 'samples': 1175040, 'steps': 6119, 'loss/train': 1.0162360668182373} 01/27/2022 01:38:00 - INFO - codeparrot_training - Step 6120: {'lr': 0.0004909658239333202, 'samples': 1175232, 'steps': 6120, 'loss/train': 1.2872189283370972} 01/27/2022 01:38:03 - INFO - codeparrot_training - Step 6121: {'lr': 0.0004909614645007956, 'samples': 1175424, 'steps': 6121, 'loss/train': 
1.175863355398178} 01/27/2022 01:38:06 - INFO - codeparrot_training - Step 6122: {'lr': 0.0004909571040360686, 'samples': 1175616, 'steps': 6122, 'loss/train': 0.654901921749115} 01/27/2022 01:38:10 - INFO - codeparrot_training - Step 6123: {'lr': 0.0004909527425391579, 'samples': 1175808, 'steps': 6123, 'loss/train': 0.9899008870124817} 01/27/2022 01:38:14 - INFO - codeparrot_training - Step 6124: {'lr': 0.0004909483800100822, 'samples': 1176000, 'steps': 6124, 'loss/train': 0.2181759774684906} 01/27/2022 01:38:17 - INFO - codeparrot_training - Step 6125: {'lr': 0.00049094401644886, 'samples': 1176192, 'steps': 6125, 'loss/train': 1.0365383327007294} 01/27/2022 01:38:20 - INFO - codeparrot_training - Step 6126: {'lr': 0.0004909396518555102, 'samples': 1176384, 'steps': 6126, 'loss/train': 0.5053049772977829} 01/27/2022 01:38:24 - INFO - codeparrot_training - Step 6127: {'lr': 0.0004909352862300514, 'samples': 1176576, 'steps': 6127, 'loss/train': 1.1664391458034515} 01/27/2022 01:38:27 - INFO - codeparrot_training - Step 6128: {'lr': 0.0004909309195725024, 'samples': 1176768, 'steps': 6128, 'loss/train': 2.480421781539917} 01/27/2022 01:38:30 - INFO - codeparrot_training - Step 6129: {'lr': 0.0004909265518828819, 'samples': 1176960, 'steps': 6129, 'loss/train': 1.2931541204452515} 01/27/2022 01:38:33 - INFO - codeparrot_training - Step 6130: {'lr': 0.0004909221831612085, 'samples': 1177152, 'steps': 6130, 'loss/train': 1.027052879333496} 01/27/2022 01:38:36 - INFO - codeparrot_training - Step 6131: {'lr': 0.000490917813407501, 'samples': 1177344, 'steps': 6131, 'loss/train': 0.36363615095615387} 01/27/2022 01:38:39 - INFO - codeparrot_training - Step 6132: {'lr': 0.0004909134426217779, 'samples': 1177536, 'steps': 6132, 'loss/train': 0.24080215394496918} 01/27/2022 01:38:45 - INFO - codeparrot_training - Step 6133: {'lr': 0.0004909090708040583, 'samples': 1177728, 'steps': 6133, 'loss/train': 0.8456514179706573} 01/27/2022 01:38:48 - INFO - codeparrot_training - Step 6134: {'lr': 0.0004909046979543608, 'samples': 1177920, 'steps': 6134, 'loss/train': 0.9166979491710663} 01/27/2022 01:38:51 - INFO - codeparrot_training - Step 6135: {'lr': 0.000490900324072704, 'samples': 1178112, 'steps': 6135, 'loss/train': 0.5678919106721878} 01/27/2022 01:38:54 - INFO - codeparrot_training - Step 6136: {'lr': 0.0004908959491591065, 'samples': 1178304, 'steps': 6136, 'loss/train': 1.3796371221542358} 01/27/2022 01:38:57 - INFO - codeparrot_training - Step 6137: {'lr': 0.0004908915732135874, 'samples': 1178496, 'steps': 6137, 'loss/train': 1.176431804895401} 01/27/2022 01:39:00 - INFO - codeparrot_training - Step 6138: {'lr': 0.0004908871962361654, 'samples': 1178688, 'steps': 6138, 'loss/train': 0.8589780628681183} 01/27/2022 01:39:03 - INFO - codeparrot_training - Step 6139: {'lr': 0.0004908828182268591, 'samples': 1178880, 'steps': 6139, 'loss/train': 1.0000331103801727} 01/27/2022 01:39:07 - INFO - codeparrot_training - Step 6140: {'lr': 0.0004908784391856872, 'samples': 1179072, 'steps': 6140, 'loss/train': 1.0158064663410187} 01/27/2022 01:39:10 - INFO - codeparrot_training - Step 6141: {'lr': 0.0004908740591126686, 'samples': 1179264, 'steps': 6141, 'loss/train': 1.0287701189517975} 01/27/2022 01:39:14 - INFO - codeparrot_training - Step 6142: {'lr': 0.000490869678007822, 'samples': 1179456, 'steps': 6142, 'loss/train': 1.027391105890274} 01/27/2022 01:39:17 - INFO - codeparrot_training - Step 6143: {'lr': 0.0004908652958711663, 'samples': 1179648, 'steps': 6143, 'loss/train': 0.7177703082561493} 
01/27/2022 01:39:20 - INFO - codeparrot_training - Step 6144: {'lr': 0.00049086091270272, 'samples': 1179840, 'steps': 6144, 'loss/train': 0.8170986771583557} 01/27/2022 01:39:24 - INFO - codeparrot_training - Step 6145: {'lr': 0.0004908565285025021, 'samples': 1180032, 'steps': 6145, 'loss/train': 1.3868275880813599} 01/27/2022 01:39:27 - INFO - codeparrot_training - Step 6146: {'lr': 0.0004908521432705312, 'samples': 1180224, 'steps': 6146, 'loss/train': 0.9680404365062714} 01/27/2022 01:39:30 - INFO - codeparrot_training - Step 6147: {'lr': 0.0004908477570068263, 'samples': 1180416, 'steps': 6147, 'loss/train': 0.9965924620628357} 01/27/2022 01:39:33 - INFO - codeparrot_training - Step 6148: {'lr': 0.0004908433697114062, 'samples': 1180608, 'steps': 6148, 'loss/train': 0.9805591106414795} 01/27/2022 01:39:36 - INFO - codeparrot_training - Step 6149: {'lr': 0.0004908389813842894, 'samples': 1180800, 'steps': 6149, 'loss/train': 1.4852460622787476} 01/27/2022 01:39:41 - INFO - codeparrot_training - Step 6150: {'lr': 0.0004908345920254949, 'samples': 1180992, 'steps': 6150, 'loss/train': 1.1526761949062347} 01/27/2022 01:39:44 - INFO - codeparrot_training - Step 6151: {'lr': 0.0004908302016350416, 'samples': 1181184, 'steps': 6151, 'loss/train': 0.6943336129188538} 01/27/2022 01:39:47 - INFO - codeparrot_training - Step 6152: {'lr': 0.0004908258102129481, 'samples': 1181376, 'steps': 6152, 'loss/train': 1.3978728353977203} 01/27/2022 01:39:50 - INFO - codeparrot_training - Step 6153: {'lr': 0.0004908214177592334, 'samples': 1181568, 'steps': 6153, 'loss/train': 0.7458503544330597} 01/27/2022 01:39:53 - INFO - codeparrot_training - Step 6154: {'lr': 0.000490817024273916, 'samples': 1181760, 'steps': 6154, 'loss/train': 1.2987273037433624} 01/27/2022 01:39:56 - INFO - codeparrot_training - Step 6155: {'lr': 0.0004908126297570152, 'samples': 1181952, 'steps': 6155, 'loss/train': 1.0178053379058838} 01/27/2022 01:39:59 - INFO - codeparrot_training - Step 6156: {'lr': 0.0004908082342085494, 'samples': 1182144, 'steps': 6156, 'loss/train': 0.5909083932638168} 01/27/2022 01:40:03 - INFO - codeparrot_training - Step 6157: {'lr': 0.0004908038376285375, 'samples': 1182336, 'steps': 6157, 'loss/train': 0.6540604680776596} 01/27/2022 01:40:06 - INFO - codeparrot_training - Step 6158: {'lr': 0.0004907994400169986, 'samples': 1182528, 'steps': 6158, 'loss/train': 1.0311257243156433} 01/27/2022 01:40:10 - INFO - codeparrot_training - Step 6159: {'lr': 0.0004907950413739514, 'samples': 1182720, 'steps': 6159, 'loss/train': 0.6169803589582443} 01/27/2022 01:40:13 - INFO - codeparrot_training - Step 6160: {'lr': 0.0004907906416994146, 'samples': 1182912, 'steps': 6160, 'loss/train': 1.0381777882575989} 01/27/2022 01:40:17 - INFO - codeparrot_training - Step 6161: {'lr': 0.0004907862409934071, 'samples': 1183104, 'steps': 6161, 'loss/train': 1.4350807964801788} 01/27/2022 01:40:20 - INFO - codeparrot_training - Step 6162: {'lr': 0.0004907818392559479, 'samples': 1183296, 'steps': 6162, 'loss/train': 1.1148098409175873} 01/27/2022 01:40:23 - INFO - codeparrot_training - Step 6163: {'lr': 0.0004907774364870557, 'samples': 1183488, 'steps': 6163, 'loss/train': 1.7836014032363892} 01/27/2022 01:40:26 - INFO - codeparrot_training - Step 6164: {'lr': 0.0004907730326867495, 'samples': 1183680, 'steps': 6164, 'loss/train': 1.347621113061905} 01/27/2022 01:40:29 - INFO - codeparrot_training - Step 6165: {'lr': 0.0004907686278550479, 'samples': 1183872, 'steps': 6165, 'loss/train': 0.8545809388160706} 01/27/2022 
01:40:32 - INFO - codeparrot_training - Step 6166: {'lr': 0.0004907642219919701, 'samples': 1184064, 'steps': 6166, 'loss/train': 1.1683538854122162} 01/27/2022 01:40:35 - INFO - codeparrot_training - Step 6167: {'lr': 0.0004907598150975348, 'samples': 1184256, 'steps': 6167, 'loss/train': 0.5366539657115936} 01/27/2022 01:40:39 - INFO - codeparrot_training - Step 6168: {'lr': 0.0004907554071717609, 'samples': 1184448, 'steps': 6168, 'loss/train': 0.8106778264045715} 01/27/2022 01:40:44 - INFO - codeparrot_training - Step 6169: {'lr': 0.0004907509982146673, 'samples': 1184640, 'steps': 6169, 'loss/train': 0.8894577920436859} 01/27/2022 01:40:47 - INFO - codeparrot_training - Step 6170: {'lr': 0.0004907465882262728, 'samples': 1184832, 'steps': 6170, 'loss/train': 0.8647693991661072} 01/27/2022 01:40:51 - INFO - codeparrot_training - Step 6171: {'lr': 0.0004907421772065965, 'samples': 1185024, 'steps': 6171, 'loss/train': 0.7515610456466675} 01/27/2022 01:40:54 - INFO - codeparrot_training - Step 6172: {'lr': 0.000490737765155657, 'samples': 1185216, 'steps': 6172, 'loss/train': 1.2062107622623444} 01/27/2022 01:40:57 - INFO - codeparrot_training - Step 6173: {'lr': 0.0004907333520734734, 'samples': 1185408, 'steps': 6173, 'loss/train': 0.8442074954509735} 01/27/2022 01:41:00 - INFO - codeparrot_training - Step 6174: {'lr': 0.0004907289379600646, 'samples': 1185600, 'steps': 6174, 'loss/train': 0.1701122336089611} 01/27/2022 01:41:03 - INFO - codeparrot_training - Step 6175: {'lr': 0.0004907245228154495, 'samples': 1185792, 'steps': 6175, 'loss/train': 0.4349469691514969} 01/27/2022 01:41:06 - INFO - codeparrot_training - Step 6176: {'lr': 0.0004907201066396469, 'samples': 1185984, 'steps': 6176, 'loss/train': 1.0908653140068054} 01/27/2022 01:41:11 - INFO - codeparrot_training - Step 6177: {'lr': 0.0004907156894326758, 'samples': 1186176, 'steps': 6177, 'loss/train': 0.5364468991756439} 01/27/2022 01:41:14 - INFO - codeparrot_training - Step 6178: {'lr': 0.0004907112711945552, 'samples': 1186368, 'steps': 6178, 'loss/train': 0.7573724091053009} 01/27/2022 01:41:17 - INFO - codeparrot_training - Step 6179: {'lr': 0.000490706851925304, 'samples': 1186560, 'steps': 6179, 'loss/train': 0.6439138948917389} 01/27/2022 01:41:20 - INFO - codeparrot_training - Step 6180: {'lr': 0.0004907024316249408, 'samples': 1186752, 'steps': 6180, 'loss/train': 0.7770249545574188} 01/27/2022 01:41:23 - INFO - codeparrot_training - Step 6181: {'lr': 0.0004906980102934852, 'samples': 1186944, 'steps': 6181, 'loss/train': 0.7470451891422272} 01/27/2022 01:41:27 - INFO - codeparrot_training - Step 6182: {'lr': 0.0004906935879309555, 'samples': 1187136, 'steps': 6182, 'loss/train': 1.2192634642124176} 01/27/2022 01:41:30 - INFO - codeparrot_training - Step 6183: {'lr': 0.0004906891645373709, 'samples': 1187328, 'steps': 6183, 'loss/train': 0.2948988452553749} 01/27/2022 01:41:33 - INFO - codeparrot_training - Step 6184: {'lr': 0.0004906847401127504, 'samples': 1187520, 'steps': 6184, 'loss/train': 0.7693221867084503} 01/27/2022 01:41:36 - INFO - codeparrot_training - Step 6185: {'lr': 0.0004906803146571129, 'samples': 1187712, 'steps': 6185, 'loss/train': 0.7290471643209457} 01/27/2022 01:41:41 - INFO - codeparrot_training - Step 6186: {'lr': 0.0004906758881704774, 'samples': 1187904, 'steps': 6186, 'loss/train': 0.9266629815101624} 01/27/2022 01:41:44 - INFO - codeparrot_training - Step 6187: {'lr': 0.0004906714606528628, 'samples': 1188096, 'steps': 6187, 'loss/train': 0.8888923823833466} 01/27/2022 01:41:47 - 
INFO - codeparrot_training - Step 6188: {'lr': 0.0004906670321042881, 'samples': 1188288, 'steps': 6188, 'loss/train': 0.5966765284538269} 01/27/2022 01:41:50 - INFO - codeparrot_training - Step 6189: {'lr': 0.0004906626025247722, 'samples': 1188480, 'steps': 6189, 'loss/train': 0.6689232587814331} 01/27/2022 01:41:53 - INFO - codeparrot_training - Step 6190: {'lr': 0.000490658171914334, 'samples': 1188672, 'steps': 6190, 'loss/train': 0.9070932269096375} 01/27/2022 01:41:56 - INFO - codeparrot_training - Step 6191: {'lr': 0.0004906537402729928, 'samples': 1188864, 'steps': 6191, 'loss/train': 0.7688755095005035} 01/27/2022 01:42:00 - INFO - codeparrot_training - Step 6192: {'lr': 0.0004906493076007675, 'samples': 1189056, 'steps': 6192, 'loss/train': 0.09757475182414055} 01/27/2022 01:42:03 - INFO - codeparrot_training - Step 6193: {'lr': 0.0004906448738976768, 'samples': 1189248, 'steps': 6193, 'loss/train': 1.5708882808685303} 01/27/2022 01:42:06 - INFO - codeparrot_training - Step 6194: {'lr': 0.0004906404391637397, 'samples': 1189440, 'steps': 6194, 'loss/train': 0.22054266929626465} 01/27/2022 01:42:11 - INFO - codeparrot_training - Step 6195: {'lr': 0.0004906360033989758, 'samples': 1189632, 'steps': 6195, 'loss/train': 0.8792781829833984} 01/27/2022 01:42:15 - INFO - codeparrot_training - Step 6196: {'lr': 0.0004906315666034034, 'samples': 1189824, 'steps': 6196, 'loss/train': 0.8019262254238129} 01/27/2022 01:42:18 - INFO - codeparrot_training - Step 6197: {'lr': 0.0004906271287770418, 'samples': 1190016, 'steps': 6197, 'loss/train': 0.9724822640419006} 01/27/2022 01:42:21 - INFO - codeparrot_training - Step 6198: {'lr': 0.00049062268991991, 'samples': 1190208, 'steps': 6198, 'loss/train': 0.342559352517128} 01/27/2022 01:42:24 - INFO - codeparrot_training - Step 6199: {'lr': 0.0004906182500320269, 'samples': 1190400, 'steps': 6199, 'loss/train': 0.48370175063610077} 01/27/2022 01:42:27 - INFO - codeparrot_training - Step 6200: {'lr': 0.0004906138091134118, 'samples': 1190592, 'steps': 6200, 'loss/train': 0.5260859280824661} 01/27/2022 01:42:30 - INFO - codeparrot_training - Step 6201: {'lr': 0.0004906093671640836, 'samples': 1190784, 'steps': 6201, 'loss/train': 1.0458130538463593} 01/27/2022 01:42:34 - INFO - codeparrot_training - Step 6202: {'lr': 0.0004906049241840612, 'samples': 1190976, 'steps': 6202, 'loss/train': 1.0333931744098663} 01/27/2022 01:42:37 - INFO - codeparrot_training - Step 6203: {'lr': 0.0004906004801733635, 'samples': 1191168, 'steps': 6203, 'loss/train': 1.0876603424549103} 01/27/2022 01:42:41 - INFO - codeparrot_training - Step 6204: {'lr': 0.0004905960351320099, 'samples': 1191360, 'steps': 6204, 'loss/train': 1.6694644689559937} 01/27/2022 01:42:45 - INFO - codeparrot_training - Step 6205: {'lr': 0.0004905915890600194, 'samples': 1191552, 'steps': 6205, 'loss/train': 1.2936885952949524} 01/27/2022 01:42:48 - INFO - codeparrot_training - Step 6206: {'lr': 0.0004905871419574107, 'samples': 1191744, 'steps': 6206, 'loss/train': 1.6470108032226562} 01/27/2022 01:42:51 - INFO - codeparrot_training - Step 6207: {'lr': 0.0004905826938242032, 'samples': 1191936, 'steps': 6207, 'loss/train': 0.7209721058607101} 01/27/2022 01:42:54 - INFO - codeparrot_training - Step 6208: {'lr': 0.0004905782446604158, 'samples': 1192128, 'steps': 6208, 'loss/train': 0.6441299468278885} 01/27/2022 01:42:57 - INFO - codeparrot_training - Step 6209: {'lr': 0.0004905737944660676, 'samples': 1192320, 'steps': 6209, 'loss/train': 0.6953828483819962} 01/27/2022 01:43:00 - INFO - 
codeparrot_training - Step 6210: {'lr': 0.0004905693432411777, 'samples': 1192512, 'steps': 6210, 'loss/train': 0.653590053319931} 01/27/2022 01:43:03 - INFO - codeparrot_training - Step 6211: {'lr': 0.0004905648909857652, 'samples': 1192704, 'steps': 6211, 'loss/train': 0.8504708111286163} 01/27/2022 01:43:07 - INFO - codeparrot_training - Step 6212: {'lr': 0.0004905604376998489, 'samples': 1192896, 'steps': 6212, 'loss/train': 0.7008648812770844} 01/27/2022 01:43:12 - INFO - codeparrot_training - Step 6213: {'lr': 0.0004905559833834482, 'samples': 1193088, 'steps': 6213, 'loss/train': 0.34200435876846313} 01/27/2022 01:43:15 - INFO - codeparrot_training - Step 6214: {'lr': 0.000490551528036582, 'samples': 1193280, 'steps': 6214, 'loss/train': 0.5273130387067795} 01/27/2022 01:43:18 - INFO - codeparrot_training - Step 6215: {'lr': 0.0004905470716592695, 'samples': 1193472, 'steps': 6215, 'loss/train': 0.8150880932807922} 01/27/2022 01:43:22 - INFO - codeparrot_training - Step 6216: {'lr': 0.0004905426142515296, 'samples': 1193664, 'steps': 6216, 'loss/train': 0.8427024185657501} 01/27/2022 01:43:25 - INFO - codeparrot_training - Step 6217: {'lr': 0.0004905381558133817, 'samples': 1193856, 'steps': 6217, 'loss/train': 0.5512809455394745} 01/27/2022 01:43:28 - INFO - codeparrot_training - Step 6218: {'lr': 0.0004905336963448446, 'samples': 1194048, 'steps': 6218, 'loss/train': 0.4310515522956848} 01/27/2022 01:43:31 - INFO - codeparrot_training - Step 6219: {'lr': 0.0004905292358459375, 'samples': 1194240, 'steps': 6219, 'loss/train': 1.1258452534675598} 01/27/2022 01:43:34 - INFO - codeparrot_training - Step 6220: {'lr': 0.0004905247743166796, 'samples': 1194432, 'steps': 6220, 'loss/train': 1.1239492893218994} 01/27/2022 01:43:37 - INFO - codeparrot_training - Step 6221: {'lr': 0.0004905203117570899, 'samples': 1194624, 'steps': 6221, 'loss/train': 0.8251465559005737} 01/27/2022 01:43:42 - INFO - codeparrot_training - Step 6222: {'lr': 0.0004905158481671876, 'samples': 1194816, 'steps': 6222, 'loss/train': 0.4816325604915619} 01/27/2022 01:43:45 - INFO - codeparrot_training - Step 6223: {'lr': 0.0004905113835469918, 'samples': 1195008, 'steps': 6223, 'loss/train': 0.6792735010385513} 01/27/2022 01:43:48 - INFO - codeparrot_training - Step 6224: {'lr': 0.0004905069178965214, 'samples': 1195200, 'steps': 6224, 'loss/train': 1.322471022605896} 01/27/2022 01:43:51 - INFO - codeparrot_training - Step 6225: {'lr': 0.0004905024512157959, 'samples': 1195392, 'steps': 6225, 'loss/train': 1.1241303384304047} 01/27/2022 01:43:54 - INFO - codeparrot_training - Step 6226: {'lr': 0.0004904979835048343, 'samples': 1195584, 'steps': 6226, 'loss/train': 1.465214878320694} 01/27/2022 01:43:57 - INFO - codeparrot_training - Step 6227: {'lr': 0.0004904935147636557, 'samples': 1195776, 'steps': 6227, 'loss/train': 0.5610690861940384} 01/27/2022 01:44:01 - INFO - codeparrot_training - Step 6228: {'lr': 0.0004904890449922792, 'samples': 1195968, 'steps': 6228, 'loss/train': 0.9656420946121216} 01/27/2022 01:44:04 - INFO - codeparrot_training - Step 6229: {'lr': 0.0004904845741907241, 'samples': 1196160, 'steps': 6229, 'loss/train': 0.9589585661888123} 01/27/2022 01:44:07 - INFO - codeparrot_training - Step 6230: {'lr': 0.0004904801023590094, 'samples': 1196352, 'steps': 6230, 'loss/train': 1.0149517357349396} 01/27/2022 01:44:12 - INFO - codeparrot_training - Step 6231: {'lr': 0.0004904756294971541, 'samples': 1196544, 'steps': 6231, 'loss/train': 0.9709258675575256} 01/27/2022 01:44:15 - INFO - 
codeparrot_training - Step 6232: {'lr': 0.0004904711556051778, 'samples': 1196736, 'steps': 6232, 'loss/train': 0.48598824441432953} 01/27/2022 01:44:18 - INFO - codeparrot_training - Step 6233: {'lr': 0.0004904666806830992, 'samples': 1196928, 'steps': 6233, 'loss/train': 0.9721422493457794} 01/27/2022 01:44:21 - INFO - codeparrot_training - Step 6234: {'lr': 0.0004904622047309379, 'samples': 1197120, 'steps': 6234, 'loss/train': 1.1393048465251923} 01/27/2022 01:44:24 - INFO - codeparrot_training - Step 6235: {'lr': 0.0004904577277487129, 'samples': 1197312, 'steps': 6235, 'loss/train': 0.45363248884677887} 01/27/2022 01:44:27 - INFO - codeparrot_training - Step 6236: {'lr': 0.0004904532497364432, 'samples': 1197504, 'steps': 6236, 'loss/train': 0.9061096608638763} 01/27/2022 01:44:30 - INFO - codeparrot_training - Step 6237: {'lr': 0.0004904487706941481, 'samples': 1197696, 'steps': 6237, 'loss/train': 0.8363232314586639} 01/27/2022 01:44:34 - INFO - codeparrot_training - Step 6238: {'lr': 0.000490444290621847, 'samples': 1197888, 'steps': 6238, 'loss/train': 1.1272292733192444} 01/27/2022 01:44:40 - INFO - codeparrot_training - Step 6239: {'lr': 0.0004904398095195588, 'samples': 1198080, 'steps': 6239, 'loss/train': 0.559835359454155} 01/27/2022 01:44:43 - INFO - codeparrot_training - Step 6240: {'lr': 0.0004904353273873028, 'samples': 1198272, 'steps': 6240, 'loss/train': 0.6205233782529831} 01/27/2022 01:44:46 - INFO - codeparrot_training - Step 6241: {'lr': 0.0004904308442250983, 'samples': 1198464, 'steps': 6241, 'loss/train': 1.138074517250061} 01/27/2022 01:44:49 - INFO - codeparrot_training - Step 6242: {'lr': 0.0004904263600329643, 'samples': 1198656, 'steps': 6242, 'loss/train': 0.8401991128921509} 01/27/2022 01:44:52 - INFO - codeparrot_training - Step 6243: {'lr': 0.0004904218748109201, 'samples': 1198848, 'steps': 6243, 'loss/train': 0.9191627204418182} 01/27/2022 01:44:56 - INFO - codeparrot_training - Step 6244: {'lr': 0.000490417388558985, 'samples': 1199040, 'steps': 6244, 'loss/train': 0.674313023686409} 01/27/2022 01:44:59 - INFO - codeparrot_training - Step 6245: {'lr': 0.0004904129012771782, 'samples': 1199232, 'steps': 6245, 'loss/train': 0.8085736334323883} 01/27/2022 01:45:02 - INFO - codeparrot_training - Step 6246: {'lr': 0.0004904084129655188, 'samples': 1199424, 'steps': 6246, 'loss/train': 1.0011175274848938} 01/27/2022 01:45:05 - INFO - codeparrot_training - Step 6247: {'lr': 0.000490403923624026, 'samples': 1199616, 'steps': 6247, 'loss/train': 1.0562097430229187} 01/27/2022 01:45:09 - INFO - codeparrot_training - Step 6248: {'lr': 0.0004903994332527193, 'samples': 1199808, 'steps': 6248, 'loss/train': 0.5065501481294632} 01/27/2022 01:45:13 - INFO - codeparrot_training - Step 6249: {'lr': 0.0004903949418516178, 'samples': 1200000, 'steps': 6249, 'loss/train': 0.9304390847682953} 01/27/2022 01:45:16 - INFO - codeparrot_training - Step 6250: {'lr': 0.0004903904494207405, 'samples': 1200192, 'steps': 6250, 'loss/train': 0.907720685005188} 01/27/2022 01:45:19 - INFO - codeparrot_training - Step 6251: {'lr': 0.000490385955960107, 'samples': 1200384, 'steps': 6251, 'loss/train': 0.9448528289794922} 01/27/2022 01:45:22 - INFO - codeparrot_training - Step 6252: {'lr': 0.0004903814614697363, 'samples': 1200576, 'steps': 6252, 'loss/train': 0.9290369153022766} 01/27/2022 01:45:25 - INFO - codeparrot_training - Step 6253: {'lr': 0.0004903769659496478, 'samples': 1200768, 'steps': 6253, 'loss/train': 0.5731484591960907} 01/27/2022 01:45:28 - INFO - 
codeparrot_training - Step 6254: {'lr': 0.0004903724693998607, 'samples': 1200960, 'steps': 6254, 'loss/train': 1.1125912070274353} 01/27/2022 01:45:31 - INFO - codeparrot_training - Step 6255: {'lr': 0.0004903679718203942, 'samples': 1201152, 'steps': 6255, 'loss/train': 0.8849154710769653} 01/27/2022 01:45:35 - INFO - codeparrot_training - Step 6256: {'lr': 0.0004903634732112678, 'samples': 1201344, 'steps': 6256, 'loss/train': 0.802963525056839} 01/27/2022 01:45:39 - INFO - codeparrot_training - Step 6257: {'lr': 0.0004903589735725004, 'samples': 1201536, 'steps': 6257, 'loss/train': 1.820286512374878} 01/27/2022 01:45:43 - INFO - codeparrot_training - Step 6258: {'lr': 0.0004903544729041116, 'samples': 1201728, 'steps': 6258, 'loss/train': 0.7406523674726486} 01/27/2022 01:45:46 - INFO - codeparrot_training - Step 6259: {'lr': 0.0004903499712061206, 'samples': 1201920, 'steps': 6259, 'loss/train': 0.7685896754264832} 01/27/2022 01:45:49 - INFO - codeparrot_training - Step 6260: {'lr': 0.0004903454684785465, 'samples': 1202112, 'steps': 6260, 'loss/train': 0.8351544141769409} 01/27/2022 01:45:52 - INFO - codeparrot_training - Step 6261: {'lr': 0.0004903409647214088, 'samples': 1202304, 'steps': 6261, 'loss/train': 0.693569540977478} 01/27/2022 01:45:55 - INFO - codeparrot_training - Step 6262: {'lr': 0.0004903364599347268, 'samples': 1202496, 'steps': 6262, 'loss/train': 0.7981361746788025} 01/27/2022 01:45:58 - INFO - codeparrot_training - Step 6263: {'lr': 0.0004903319541185196, 'samples': 1202688, 'steps': 6263, 'loss/train': 0.9505898952484131} 01/27/2022 01:46:01 - INFO - codeparrot_training - Step 6264: {'lr': 0.0004903274472728067, 'samples': 1202880, 'steps': 6264, 'loss/train': 0.7416847050189972} 01/27/2022 01:46:05 - INFO - codeparrot_training - Step 6265: {'lr': 0.0004903229393976073, 'samples': 1203072, 'steps': 6265, 'loss/train': 0.8544180393218994} 01/27/2022 01:46:09 - INFO - codeparrot_training - Step 6266: {'lr': 0.0004903184304929408, 'samples': 1203264, 'steps': 6266, 'loss/train': 0.8650773167610168} 01/27/2022 01:46:12 - INFO - codeparrot_training - Step 6267: {'lr': 0.0004903139205588264, 'samples': 1203456, 'steps': 6267, 'loss/train': 1.2087790668010712} 01/27/2022 01:46:15 - INFO - codeparrot_training - Step 6268: {'lr': 0.0004903094095952834, 'samples': 1203648, 'steps': 6268, 'loss/train': 0.6772246956825256} 01/27/2022 01:46:19 - INFO - codeparrot_training - Step 6269: {'lr': 0.0004903048976023313, 'samples': 1203840, 'steps': 6269, 'loss/train': 0.9589842259883881} 01/27/2022 01:46:22 - INFO - codeparrot_training - Step 6270: {'lr': 0.0004903003845799893, 'samples': 1204032, 'steps': 6270, 'loss/train': 0.8273256719112396} 01/27/2022 01:46:25 - INFO - codeparrot_training - Step 6271: {'lr': 0.0004902958705282767, 'samples': 1204224, 'steps': 6271, 'loss/train': 1.0198016166687012} 01/27/2022 01:46:28 - INFO - codeparrot_training - Step 6272: {'lr': 0.000490291355447213, 'samples': 1204416, 'steps': 6272, 'loss/train': 0.8764869868755341} 01/27/2022 01:46:31 - INFO - codeparrot_training - Step 6273: {'lr': 0.0004902868393368174, 'samples': 1204608, 'steps': 6273, 'loss/train': 0.716747984290123} 01/27/2022 01:46:34 - INFO - codeparrot_training - Step 6274: {'lr': 0.0004902823221971092, 'samples': 1204800, 'steps': 6274, 'loss/train': 1.2991863191127777} 01/27/2022 01:46:39 - INFO - codeparrot_training - Step 6275: {'lr': 0.000490277804028108, 'samples': 1204992, 'steps': 6275, 'loss/train': 0.7081300914287567} 01/27/2022 01:46:42 - INFO - 
codeparrot_training - Step 6276: {'lr': 0.0004902732848298328, 'samples': 1205184, 'steps': 6276, 'loss/train': 1.1482901573181152} 01/27/2022 01:46:46 - INFO - codeparrot_training - Step 6277: {'lr': 0.0004902687646023032, 'samples': 1205376, 'steps': 6277, 'loss/train': 0.5641093701124191} 01/27/2022 01:46:49 - INFO - codeparrot_training - Step 6278: {'lr': 0.0004902642433455385, 'samples': 1205568, 'steps': 6278, 'loss/train': 1.364085853099823} 01/27/2022 01:46:52 - INFO - codeparrot_training - Step 6279: {'lr': 0.0004902597210595581, 'samples': 1205760, 'steps': 6279, 'loss/train': 0.884040355682373} 01/27/2022 01:46:55 - INFO - codeparrot_training - Step 6280: {'lr': 0.0004902551977443813, 'samples': 1205952, 'steps': 6280, 'loss/train': 1.109858751296997} 01/27/2022 01:46:58 - INFO - codeparrot_training - Step 6281: {'lr': 0.0004902506734000276, 'samples': 1206144, 'steps': 6281, 'loss/train': 0.7247392237186432} 01/27/2022 01:47:01 - INFO - codeparrot_training - Step 6282: {'lr': 0.0004902461480265163, 'samples': 1206336, 'steps': 6282, 'loss/train': 1.2366293370723724} 01/27/2022 01:47:04 - INFO - codeparrot_training - Step 6283: {'lr': 0.0004902416216238667, 'samples': 1206528, 'steps': 6283, 'loss/train': 0.44683948159217834} 01/27/2022 01:47:09 - INFO - codeparrot_training - Step 6284: {'lr': 0.0004902370941920984, 'samples': 1206720, 'steps': 6284, 'loss/train': 1.0987411737442017} 01/27/2022 01:47:12 - INFO - codeparrot_training - Step 6285: {'lr': 0.0004902325657312306, 'samples': 1206912, 'steps': 6285, 'loss/train': 0.7974940538406372} 01/27/2022 01:47:15 - INFO - codeparrot_training - Step 6286: {'lr': 0.0004902280362412828, 'samples': 1207104, 'steps': 6286, 'loss/train': 1.2908906936645508} 01/27/2022 01:47:19 - INFO - codeparrot_training - Step 6287: {'lr': 0.0004902235057222743, 'samples': 1207296, 'steps': 6287, 'loss/train': 0.857167661190033} 01/27/2022 01:47:22 - INFO - codeparrot_training - Step 6288: {'lr': 0.0004902189741742246, 'samples': 1207488, 'steps': 6288, 'loss/train': 0.49467524886131287} 01/27/2022 01:47:25 - INFO - codeparrot_training - Step 6289: {'lr': 0.0004902144415971532, 'samples': 1207680, 'steps': 6289, 'loss/train': 0.6572093367576599} 01/27/2022 01:47:28 - INFO - codeparrot_training - Step 6290: {'lr': 0.0004902099079910794, 'samples': 1207872, 'steps': 6290, 'loss/train': 1.1197626292705536} 01/27/2022 01:47:31 - INFO - codeparrot_training - Step 6291: {'lr': 0.0004902053733560225, 'samples': 1208064, 'steps': 6291, 'loss/train': 0.7839319109916687} 01/27/2022 01:47:36 - INFO - codeparrot_training - Step 6292: {'lr': 0.0004902008376920021, 'samples': 1208256, 'steps': 6292, 'loss/train': 0.6571007966995239} 01/27/2022 01:47:39 - INFO - codeparrot_training - Step 6293: {'lr': 0.0004901963009990376, 'samples': 1208448, 'steps': 6293, 'loss/train': 0.5284258872270584} 01/27/2022 01:47:42 - INFO - codeparrot_training - Step 6294: {'lr': 0.0004901917632771485, 'samples': 1208640, 'steps': 6294, 'loss/train': 0.5736896395683289} 01/27/2022 01:47:45 - INFO - codeparrot_training - Step 6295: {'lr': 0.000490187224526354, 'samples': 1208832, 'steps': 6295, 'loss/train': 0.8860796391963959} 01/27/2022 01:47:48 - INFO - codeparrot_training - Step 6296: {'lr': 0.0004901826847466738, 'samples': 1209024, 'steps': 6296, 'loss/train': 0.9753461480140686} 01/27/2022 01:47:51 - INFO - codeparrot_training - Step 6297: {'lr': 0.0004901781439381272, 'samples': 1209216, 'steps': 6297, 'loss/train': 0.6709851175546646} 01/27/2022 01:47:54 - INFO - 
codeparrot_training - Step 6298: {'lr': 0.0004901736021007337, 'samples': 1209408, 'steps': 6298, 'loss/train': 0.2797527238726616} 01/27/2022 01:47:58 - INFO - codeparrot_training - Step 6299: {'lr': 0.0004901690592345127, 'samples': 1209600, 'steps': 6299, 'loss/train': 0.8093985021114349} 01/27/2022 01:48:01 - INFO - codeparrot_training - Step 6300: {'lr': 0.0004901645153394838, 'samples': 1209792, 'steps': 6300, 'loss/train': 0.8963547348976135} 01/27/2022 01:48:06 - INFO - codeparrot_training - Step 6301: {'lr': 0.0004901599704156664, 'samples': 1209984, 'steps': 6301, 'loss/train': 1.256255954504013} 01/27/2022 01:48:09 - INFO - codeparrot_training - Step 6302: {'lr': 0.00049015542446308, 'samples': 1210176, 'steps': 6302, 'loss/train': 1.0209924280643463} 01/27/2022 01:48:12 - INFO - codeparrot_training - Step 6303: {'lr': 0.0004901508774817438, 'samples': 1210368, 'steps': 6303, 'loss/train': 0.9816356599330902} 01/27/2022 01:48:15 - INFO - codeparrot_training - Step 6304: {'lr': 0.0004901463294716776, 'samples': 1210560, 'steps': 6304, 'loss/train': 1.813110888004303} 01/27/2022 01:48:18 - INFO - codeparrot_training - Step 6305: {'lr': 0.0004901417804329008, 'samples': 1210752, 'steps': 6305, 'loss/train': 0.7715764045715332} 01/27/2022 01:48:22 - INFO - codeparrot_training - Step 6306: {'lr': 0.0004901372303654329, 'samples': 1210944, 'steps': 6306, 'loss/train': 0.9392834007740021} 01/27/2022 01:48:25 - INFO - codeparrot_training - Step 6307: {'lr': 0.0004901326792692934, 'samples': 1211136, 'steps': 6307, 'loss/train': 0.7637546360492706} 01/27/2022 01:48:28 - INFO - codeparrot_training - Step 6308: {'lr': 0.0004901281271445016, 'samples': 1211328, 'steps': 6308, 'loss/train': 0.8036229014396667} 01/27/2022 01:48:31 - INFO - codeparrot_training - Step 6309: {'lr': 0.0004901235739910772, 'samples': 1211520, 'steps': 6309, 'loss/train': 0.10124886408448219} 01/27/2022 01:48:35 - INFO - codeparrot_training - Step 6310: {'lr': 0.0004901190198090397, 'samples': 1211712, 'steps': 6310, 'loss/train': 0.7289289236068726} 01/27/2022 01:48:39 - INFO - codeparrot_training - Step 6311: {'lr': 0.0004901144645984086, 'samples': 1211904, 'steps': 6311, 'loss/train': 0.9587049186229706} 01/27/2022 01:48:42 - INFO - codeparrot_training - Step 6312: {'lr': 0.0004901099083592034, 'samples': 1212096, 'steps': 6312, 'loss/train': 0.7408390939235687} 01/27/2022 01:48:45 - INFO - codeparrot_training - Step 6313: {'lr': 0.0004901053510914434, 'samples': 1212288, 'steps': 6313, 'loss/train': 1.1274677217006683} 01/27/2022 01:48:48 - INFO - codeparrot_training - Step 6314: {'lr': 0.0004901007927951485, 'samples': 1212480, 'steps': 6314, 'loss/train': 0.9992886185646057} 01/27/2022 01:48:51 - INFO - codeparrot_training - Step 6315: {'lr': 0.000490096233470338, 'samples': 1212672, 'steps': 6315, 'loss/train': 0.875158041715622} 01/27/2022 01:48:54 - INFO - codeparrot_training - Step 6316: {'lr': 0.0004900916731170314, 'samples': 1212864, 'steps': 6316, 'loss/train': 1.038465678691864} 01/27/2022 01:48:57 - INFO - codeparrot_training - Step 6317: {'lr': 0.0004900871117352484, 'samples': 1213056, 'steps': 6317, 'loss/train': 0.9835432469844818} 01/27/2022 01:49:01 - INFO - codeparrot_training - Step 6318: {'lr': 0.0004900825493250084, 'samples': 1213248, 'steps': 6318, 'loss/train': 0.8029257953166962} 01/27/2022 01:49:06 - INFO - codeparrot_training - Step 6319: {'lr': 0.000490077985886331, 'samples': 1213440, 'steps': 6319, 'loss/train': 0.4269631505012512} 01/27/2022 01:49:09 - INFO - codeparrot_training 
- Step 6320: {'lr': 0.0004900734214192358, 'samples': 1213632, 'steps': 6320, 'loss/train': 1.1525818705558777} 01/27/2022 01:49:12 - INFO - codeparrot_training - Step 6321: {'lr': 0.0004900688559237422, 'samples': 1213824, 'steps': 6321, 'loss/train': 0.5676162242889404} 01/27/2022 01:49:15 - INFO - codeparrot_training - Step 6322: {'lr': 0.0004900642893998699, 'samples': 1214016, 'steps': 6322, 'loss/train': 0.9837022125720978} 01/27/2022 01:49:18 - INFO - codeparrot_training - Step 6323: {'lr': 0.0004900597218476385, 'samples': 1214208, 'steps': 6323, 'loss/train': 1.2447884380817413} 01/27/2022 01:49:22 - INFO - codeparrot_training - Step 6324: {'lr': 0.0004900551532670673, 'samples': 1214400, 'steps': 6324, 'loss/train': 0.8098407089710236} 01/27/2022 01:49:25 - INFO - codeparrot_training - Step 6325: {'lr': 0.0004900505836581763, 'samples': 1214592, 'steps': 6325, 'loss/train': 0.822954922914505} 01/27/2022 01:49:28 - INFO - codeparrot_training - Step 6326: {'lr': 0.0004900460130209845, 'samples': 1214784, 'steps': 6326, 'loss/train': 1.0564486384391785} 01/27/2022 01:49:32 - INFO - codeparrot_training - Step 6327: {'lr': 0.000490041441355512, 'samples': 1214976, 'steps': 6327, 'loss/train': 0.3163149654865265} 01/27/2022 01:49:35 - INFO - codeparrot_training - Step 6328: {'lr': 0.0004900368686617783, 'samples': 1215168, 'steps': 6328, 'loss/train': 0.8436737358570099} 01/27/2022 01:49:39 - INFO - codeparrot_training - Step 6329: {'lr': 0.0004900322949398026, 'samples': 1215360, 'steps': 6329, 'loss/train': 0.9765004813671112} 01/27/2022 01:49:42 - INFO - codeparrot_training - Step 6330: {'lr': 0.000490027720189605, 'samples': 1215552, 'steps': 6330, 'loss/train': 0.9460907578468323} 01/27/2022 01:49:45 - INFO - codeparrot_training - Step 6331: {'lr': 0.0004900231444112047, 'samples': 1215744, 'steps': 6331, 'loss/train': 1.0319190323352814} 01/27/2022 01:49:48 - INFO - codeparrot_training - Step 6332: {'lr': 0.0004900185676046214, 'samples': 1215936, 'steps': 6332, 'loss/train': 0.9604050815105438} 01/27/2022 01:49:51 - INFO - codeparrot_training - Step 6333: {'lr': 0.0004900139897698751, 'samples': 1216128, 'steps': 6333, 'loss/train': 0.3333669975399971} 01/27/2022 01:49:54 - INFO - codeparrot_training - Step 6334: {'lr': 0.0004900094109069848, 'samples': 1216320, 'steps': 6334, 'loss/train': 0.5880054384469986} 01/27/2022 01:49:57 - INFO - codeparrot_training - Step 6335: {'lr': 0.0004900048310159705, 'samples': 1216512, 'steps': 6335, 'loss/train': 1.0420709252357483} 01/27/2022 01:50:02 - INFO - codeparrot_training - Step 6336: {'lr': 0.0004900002500968516, 'samples': 1216704, 'steps': 6336, 'loss/train': 0.8928120732307434} 01/27/2022 01:50:05 - INFO - codeparrot_training - Step 6337: {'lr': 0.000489995668149648, 'samples': 1216896, 'steps': 6337, 'loss/train': 1.5697431564331055} 01/27/2022 01:50:08 - INFO - codeparrot_training - Step 6338: {'lr': 0.0004899910851743791, 'samples': 1217088, 'steps': 6338, 'loss/train': 0.7910953760147095} 01/27/2022 01:50:11 - INFO - codeparrot_training - Step 6339: {'lr': 0.0004899865011710646, 'samples': 1217280, 'steps': 6339, 'loss/train': 0.3770630210638046} 01/27/2022 01:50:14 - INFO - codeparrot_training - Step 6340: {'lr': 0.0004899819161397241, 'samples': 1217472, 'steps': 6340, 'loss/train': 1.0980540812015533} 01/27/2022 01:50:17 - INFO - codeparrot_training - Step 6341: {'lr': 0.0004899773300803774, 'samples': 1217664, 'steps': 6341, 'loss/train': 0.9882853031158447} 01/27/2022 01:50:21 - INFO - codeparrot_training - Step 6342: 
{'lr': 0.0004899727429930438, 'samples': 1217856, 'steps': 6342, 'loss/train': 1.1649073362350464} 01/27/2022 01:50:24 - INFO - codeparrot_training - Step 6343: {'lr': 0.0004899681548777434, 'samples': 1218048, 'steps': 6343, 'loss/train': 1.0546450316905975} 01/27/2022 01:50:27 - INFO - codeparrot_training - Step 6344: {'lr': 0.0004899635657344954, 'samples': 1218240, 'steps': 6344, 'loss/train': 0.7757726311683655} 01/27/2022 01:50:32 - INFO - codeparrot_training - Step 6345: {'lr': 0.0004899589755633198, 'samples': 1218432, 'steps': 6345, 'loss/train': 0.7066180855035782} 01/27/2022 01:50:35 - INFO - codeparrot_training - Step 6346: {'lr': 0.0004899543843642362, 'samples': 1218624, 'steps': 6346, 'loss/train': 0.6311254352331161} 01/27/2022 01:50:38 - INFO - codeparrot_training - Step 6347: {'lr': 0.0004899497921372641, 'samples': 1218816, 'steps': 6347, 'loss/train': 1.29383584856987} 01/27/2022 01:50:41 - INFO - codeparrot_training - Step 6348: {'lr': 0.0004899451988824233, 'samples': 1219008, 'steps': 6348, 'loss/train': 1.4691520035266876} 01/27/2022 01:50:45 - INFO - codeparrot_training - Step 6349: {'lr': 0.0004899406045997336, 'samples': 1219200, 'steps': 6349, 'loss/train': 1.0379295945167542} 01/27/2022 01:50:48 - INFO - codeparrot_training - Step 6350: {'lr': 0.0004899360092892143, 'samples': 1219392, 'steps': 6350, 'loss/train': 0.6476236581802368} 01/27/2022 01:50:51 - INFO - codeparrot_training - Step 6351: {'lr': 0.0004899314129508855, 'samples': 1219584, 'steps': 6351, 'loss/train': 0.7151519805192947} 01/27/2022 01:50:54 - INFO - codeparrot_training - Step 6352: {'lr': 0.0004899268155847667, 'samples': 1219776, 'steps': 6352, 'loss/train': 0.9221700131893158} 01/27/2022 01:50:58 - INFO - codeparrot_training - Step 6353: {'lr': 0.0004899222171908776, 'samples': 1219968, 'steps': 6353, 'loss/train': 0.625466912984848} 01/27/2022 01:51:02 - INFO - codeparrot_training - Step 6354: {'lr': 0.0004899176177692379, 'samples': 1220160, 'steps': 6354, 'loss/train': 1.0663008391857147} 01/27/2022 01:51:05 - INFO - codeparrot_training - Step 6355: {'lr': 0.0004899130173198672, 'samples': 1220352, 'steps': 6355, 'loss/train': 0.4866520017385483} 01/27/2022 01:51:08 - INFO - codeparrot_training - Step 6356: {'lr': 0.0004899084158427855, 'samples': 1220544, 'steps': 6356, 'loss/train': 2.801897406578064} 01/27/2022 01:51:11 - INFO - codeparrot_training - Step 6357: {'lr': 0.0004899038133380121, 'samples': 1220736, 'steps': 6357, 'loss/train': 1.0881783664226532} 01/27/2022 01:51:14 - INFO - codeparrot_training - Step 6358: {'lr': 0.0004898992098055671, 'samples': 1220928, 'steps': 6358, 'loss/train': 0.8174291253089905} 01/27/2022 01:51:17 - INFO - codeparrot_training - Step 6359: {'lr': 0.00048989460524547, 'samples': 1221120, 'steps': 6359, 'loss/train': 1.0130333304405212} 01/27/2022 01:51:20 - INFO - codeparrot_training - Step 6360: {'lr': 0.0004898899996577407, 'samples': 1221312, 'steps': 6360, 'loss/train': 0.42961983382701874} 01/27/2022 01:51:24 - INFO - codeparrot_training - Step 6361: {'lr': 0.0004898853930423987, 'samples': 1221504, 'steps': 6361, 'loss/train': 1.0390426218509674} 01/27/2022 01:51:29 - INFO - codeparrot_training - Step 6362: {'lr': 0.0004898807853994639, 'samples': 1221696, 'steps': 6362, 'loss/train': 1.0948165655136108} 01/27/2022 01:51:32 - INFO - codeparrot_training - Step 6363: {'lr': 0.000489876176728956, 'samples': 1221888, 'steps': 6363, 'loss/train': 0.8479156494140625} 01/27/2022 01:51:35 - INFO - codeparrot_training - Step 6364: {'lr': 
0.0004898715670308947, 'samples': 1222080, 'steps': 6364, 'loss/train': 0.7434628307819366} 01/27/2022 01:51:38 - INFO - codeparrot_training - Step 6365: {'lr': 0.0004898669563052997, 'samples': 1222272, 'steps': 6365, 'loss/train': 0.9276902675628662} 01/27/2022 01:51:42 - INFO - codeparrot_training - Step 6366: {'lr': 0.0004898623445521909, 'samples': 1222464, 'steps': 6366, 'loss/train': 0.8814190328121185} 01/27/2022 01:51:45 - INFO - codeparrot_training - Step 6367: {'lr': 0.000489857731771588, 'samples': 1222656, 'steps': 6367, 'loss/train': 0.5963395088911057} 01/27/2022 01:51:48 - INFO - codeparrot_training - Step 6368: {'lr': 0.0004898531179635108, 'samples': 1222848, 'steps': 6368, 'loss/train': 1.3502325117588043} 01/27/2022 01:51:51 - INFO - codeparrot_training - Step 6369: {'lr': 0.0004898485031279788, 'samples': 1223040, 'steps': 6369, 'loss/train': 0.3504873365163803} 01/27/2022 01:51:54 - INFO - codeparrot_training - Step 6370: {'lr': 0.0004898438872650121, 'samples': 1223232, 'steps': 6370, 'loss/train': 0.4020184278488159} 01/27/2022 01:51:58 - INFO - codeparrot_training - Step 6371: {'lr': 0.0004898392703746304, 'samples': 1223424, 'steps': 6371, 'loss/train': 0.9466051161289215} 01/27/2022 01:52:02 - INFO - codeparrot_training - Step 6372: {'lr': 0.0004898346524568533, 'samples': 1223616, 'steps': 6372, 'loss/train': 2.2318882942199707} 01/27/2022 01:52:05 - INFO - codeparrot_training - Step 6373: {'lr': 0.0004898300335117008, 'samples': 1223808, 'steps': 6373, 'loss/train': 0.6629572659730911} 01/27/2022 01:52:08 - INFO - codeparrot_training - Step 6374: {'lr': 0.0004898254135391926, 'samples': 1224000, 'steps': 6374, 'loss/train': 0.855831116437912} 01/27/2022 01:52:11 - INFO - codeparrot_training - Step 6375: {'lr': 0.0004898207925393485, 'samples': 1224192, 'steps': 6375, 'loss/train': 1.2702929377555847} 01/27/2022 01:52:14 - INFO - codeparrot_training - Step 6376: {'lr': 0.0004898161705121882, 'samples': 1224384, 'steps': 6376, 'loss/train': 0.470152884721756} 01/27/2022 01:52:17 - INFO - codeparrot_training - Step 6377: {'lr': 0.0004898115474577315, 'samples': 1224576, 'steps': 6377, 'loss/train': 0.8749406039714813} 01/27/2022 01:52:21 - INFO - codeparrot_training - Step 6378: {'lr': 0.0004898069233759985, 'samples': 1224768, 'steps': 6378, 'loss/train': 1.016590029001236} 01/27/2022 01:52:24 - INFO - codeparrot_training - Step 6379: {'lr': 0.0004898022982670085, 'samples': 1224960, 'steps': 6379, 'loss/train': 1.147790014743805} 01/27/2022 01:52:29 - INFO - codeparrot_training - Step 6380: {'lr': 0.0004897976721307818, 'samples': 1225152, 'steps': 6380, 'loss/train': 0.7446692883968353} 01/27/2022 01:52:32 - INFO - codeparrot_training - Step 6381: {'lr': 0.000489793044967338, 'samples': 1225344, 'steps': 6381, 'loss/train': 0.37768301367759705} 01/27/2022 01:52:35 - INFO - codeparrot_training - Step 6382: {'lr': 0.0004897884167766969, 'samples': 1225536, 'steps': 6382, 'loss/train': 1.0531463921070099} 01/27/2022 01:52:38 - INFO - codeparrot_training - Step 6383: {'lr': 0.0004897837875588784, 'samples': 1225728, 'steps': 6383, 'loss/train': 1.1602638959884644} 01/27/2022 01:52:41 - INFO - codeparrot_training - Step 6384: {'lr': 0.0004897791573139022, 'samples': 1225920, 'steps': 6384, 'loss/train': 0.8275417685508728} 01/27/2022 01:52:44 - INFO - codeparrot_training - Step 6385: {'lr': 0.0004897745260417884, 'samples': 1226112, 'steps': 6385, 'loss/train': 1.218506634235382} 01/27/2022 01:52:48 - INFO - codeparrot_training - Step 6386: {'lr': 
0.0004897698937425566, 'samples': 1226304, 'steps': 6386, 'loss/train': 0.6680924445390701} 01/27/2022 01:52:51 - INFO - codeparrot_training - Step 6387: {'lr': 0.0004897652604162266, 'samples': 1226496, 'steps': 6387, 'loss/train': 0.14738300442695618} 01/27/2022 01:52:54 - INFO - codeparrot_training - Step 6388: {'lr': 0.0004897606260628184, 'samples': 1226688, 'steps': 6388, 'loss/train': 1.195718139410019} 01/27/2022 01:52:58 - INFO - codeparrot_training - Step 6389: {'lr': 0.0004897559906823518, 'samples': 1226880, 'steps': 6389, 'loss/train': 0.9865776300430298} 01/27/2022 01:53:01 - INFO - codeparrot_training - Step 6390: {'lr': 0.0004897513542748468, 'samples': 1227072, 'steps': 6390, 'loss/train': 0.9448831379413605} 01/27/2022 01:53:05 - INFO - codeparrot_training - Step 6391: {'lr': 0.0004897467168403231, 'samples': 1227264, 'steps': 6391, 'loss/train': 0.9922038316726685} 01/27/2022 01:53:08 - INFO - codeparrot_training - Step 6392: {'lr': 0.0004897420783788006, 'samples': 1227456, 'steps': 6392, 'loss/train': 0.777029424905777} 01/27/2022 01:53:11 - INFO - codeparrot_training - Step 6393: {'lr': 0.0004897374388902991, 'samples': 1227648, 'steps': 6393, 'loss/train': 1.197772890329361} 01/27/2022 01:53:14 - INFO - codeparrot_training - Step 6394: {'lr': 0.0004897327983748385, 'samples': 1227840, 'steps': 6394, 'loss/train': 0.7630469799041748} 01/27/2022 01:53:17 - INFO - codeparrot_training - Step 6395: {'lr': 0.0004897281568324387, 'samples': 1228032, 'steps': 6395, 'loss/train': 0.2219494879245758} 01/27/2022 01:53:20 - INFO - codeparrot_training - Step 6396: {'lr': 0.0004897235142631197, 'samples': 1228224, 'steps': 6396, 'loss/train': 0.9247861504554749} 01/27/2022 01:53:25 - INFO - codeparrot_training - Step 6397: {'lr': 0.0004897188706669012, 'samples': 1228416, 'steps': 6397, 'loss/train': 0.8578087985515594} 01/27/2022 01:53:28 - INFO - codeparrot_training - Step 6398: {'lr': 0.0004897142260438032, 'samples': 1228608, 'steps': 6398, 'loss/train': 1.0645051896572113} 01/27/2022 01:53:32 - INFO - codeparrot_training - Step 6399: {'lr': 0.0004897095803938456, 'samples': 1228800, 'steps': 6399, 'loss/train': 0.8590836524963379} 01/27/2022 01:53:35 - INFO - codeparrot_training - Step 6400: {'lr': 0.0004897049337170483, 'samples': 1228992, 'steps': 6400, 'loss/train': 1.3283278942108154} 01/27/2022 01:53:38 - INFO - codeparrot_training - Step 6401: {'lr': 0.0004897002860134311, 'samples': 1229184, 'steps': 6401, 'loss/train': 1.0214281976222992} 01/27/2022 01:53:41 - INFO - codeparrot_training - Step 6402: {'lr': 0.0004896956372830141, 'samples': 1229376, 'steps': 6402, 'loss/train': 0.9596795439720154} 01/27/2022 01:53:44 - INFO - codeparrot_training - Step 6403: {'lr': 0.000489690987525817, 'samples': 1229568, 'steps': 6403, 'loss/train': 1.237148255109787} 01/27/2022 01:53:47 - INFO - codeparrot_training - Step 6404: {'lr': 0.0004896863367418598, 'samples': 1229760, 'steps': 6404, 'loss/train': 0.5733175277709961} 01/27/2022 01:53:50 - INFO - codeparrot_training - Step 6405: {'lr': 0.0004896816849311625, 'samples': 1229952, 'steps': 6405, 'loss/train': 0.7737717926502228} 01/27/2022 01:53:55 - INFO - codeparrot_training - Step 6406: {'lr': 0.000489677032093745, 'samples': 1230144, 'steps': 6406, 'loss/train': 0.5153376460075378} 01/27/2022 01:53:58 - INFO - codeparrot_training - Step 6407: {'lr': 0.0004896723782296272, 'samples': 1230336, 'steps': 6407, 'loss/train': 0.44370049238204956} 01/27/2022 01:54:01 - INFO - codeparrot_training - Step 6408: {'lr': 
0.0004896677233388289, 'samples': 1230528, 'steps': 6408, 'loss/train': 0.8270233869552612} 01/27/2022 01:54:04 - INFO - codeparrot_training - Step 6409: {'lr': 0.0004896630674213703, 'samples': 1230720, 'steps': 6409, 'loss/train': 0.6444699615240097} 01/27/2022 01:54:07 - INFO - codeparrot_training - Step 6410: {'lr': 0.0004896584104772712, 'samples': 1230912, 'steps': 6410, 'loss/train': 0.7060408294200897} 01/27/2022 01:54:11 - INFO - codeparrot_training - Step 6411: {'lr': 0.0004896537525065516, 'samples': 1231104, 'steps': 6411, 'loss/train': 1.3443508744239807} 01/27/2022 01:54:14 - INFO - codeparrot_training - Step 6412: {'lr': 0.0004896490935092314, 'samples': 1231296, 'steps': 6412, 'loss/train': 0.6481138318777084} 01/27/2022 01:54:17 - INFO - codeparrot_training - Step 6413: {'lr': 0.0004896444334853305, 'samples': 1231488, 'steps': 6413, 'loss/train': 0.6932360082864761} 01/27/2022 01:54:20 - INFO - codeparrot_training - Step 6414: {'lr': 0.000489639772434869, 'samples': 1231680, 'steps': 6414, 'loss/train': 0.9841288626194} 01/27/2022 01:54:25 - INFO - codeparrot_training - Step 6415: {'lr': 0.0004896351103578669, 'samples': 1231872, 'steps': 6415, 'loss/train': 1.1706828474998474} 01/27/2022 01:54:28 - INFO - codeparrot_training - Step 6416: {'lr': 0.0004896304472543439, 'samples': 1232064, 'steps': 6416, 'loss/train': 0.8027427792549133} 01/27/2022 01:54:31 - INFO - codeparrot_training - Step 6417: {'lr': 0.0004896257831243204, 'samples': 1232256, 'steps': 6417, 'loss/train': 1.3880745470523834} 01/27/2022 01:54:34 - INFO - codeparrot_training - Step 6418: {'lr': 0.0004896211179678159, 'samples': 1232448, 'steps': 6418, 'loss/train': 0.6649356186389923} 01/27/2022 01:54:37 - INFO - codeparrot_training - Step 6419: {'lr': 0.0004896164517848508, 'samples': 1232640, 'steps': 6419, 'loss/train': 0.21961062401533127} 01/27/2022 01:54:40 - INFO - codeparrot_training - Step 6420: {'lr': 0.0004896117845754448, 'samples': 1232832, 'steps': 6420, 'loss/train': 0.8001343309879303} 01/27/2022 01:54:43 - INFO - codeparrot_training - Step 6421: {'lr': 0.0004896071163396179, 'samples': 1233024, 'steps': 6421, 'loss/train': 0.6537343561649323} 01/27/2022 01:54:47 - INFO - codeparrot_training - Step 6422: {'lr': 0.0004896024470773904, 'samples': 1233216, 'steps': 6422, 'loss/train': 0.6879144161939621} 01/27/2022 01:54:50 - INFO - codeparrot_training - Step 6423: {'lr': 0.000489597776788782, 'samples': 1233408, 'steps': 6423, 'loss/train': 1.1737562119960785} 01/27/2022 01:54:55 - INFO - codeparrot_training - Step 6424: {'lr': 0.0004895931054738128, 'samples': 1233600, 'steps': 6424, 'loss/train': 0.48177333176136017} 01/27/2022 01:54:58 - INFO - codeparrot_training - Step 6425: {'lr': 0.0004895884331325028, 'samples': 1233792, 'steps': 6425, 'loss/train': 0.3756950944662094} 01/27/2022 01:55:01 - INFO - codeparrot_training - Step 6426: {'lr': 0.0004895837597648721, 'samples': 1233984, 'steps': 6426, 'loss/train': 1.2978003323078156} 01/27/2022 01:55:05 - INFO - codeparrot_training - Step 6427: {'lr': 0.0004895790853709406, 'samples': 1234176, 'steps': 6427, 'loss/train': 0.7325103878974915} 01/27/2022 01:55:08 - INFO - codeparrot_training - Step 6428: {'lr': 0.0004895744099507284, 'samples': 1234368, 'steps': 6428, 'loss/train': 1.236417979001999} 01/27/2022 01:55:11 - INFO - codeparrot_training - Step 6429: {'lr': 0.0004895697335042555, 'samples': 1234560, 'steps': 6429, 'loss/train': 0.9336467385292053} 01/27/2022 01:55:14 - INFO - codeparrot_training - Step 6430: {'lr': 
0.0004895650560315419, 'samples': 1234752, 'steps': 6430, 'loss/train': 1.0424746870994568} 01/27/2022 01:55:17 - INFO - codeparrot_training - Step 6431: {'lr': 0.0004895603775326077, 'samples': 1234944, 'steps': 6431, 'loss/train': 0.8279817402362823} 01/27/2022 01:55:20 - INFO - codeparrot_training - Step 6432: {'lr': 0.0004895556980074729, 'samples': 1235136, 'steps': 6432, 'loss/train': 0.9919312298297882} 01/27/2022 01:55:25 - INFO - codeparrot_training - Step 6433: {'lr': 0.0004895510174561576, 'samples': 1235328, 'steps': 6433, 'loss/train': 0.6704083979129791} 01/27/2022 01:55:28 - INFO - codeparrot_training - Step 6434: {'lr': 0.0004895463358786818, 'samples': 1235520, 'steps': 6434, 'loss/train': 0.5560906827449799} 01/27/2022 01:55:31 - INFO - codeparrot_training - Step 6435: {'lr': 0.0004895416532750655, 'samples': 1235712, 'steps': 6435, 'loss/train': 0.7407025694847107} 01/27/2022 01:55:34 - INFO - codeparrot_training - Step 6436: {'lr': 0.0004895369696453289, 'samples': 1235904, 'steps': 6436, 'loss/train': 0.7004330456256866} 01/27/2022 01:55:37 - INFO - codeparrot_training - Step 6437: {'lr': 0.0004895322849894918, 'samples': 1236096, 'steps': 6437, 'loss/train': 0.8578828275203705} 01/27/2022 01:55:40 - INFO - codeparrot_training - Step 6438: {'lr': 0.0004895275993075747, 'samples': 1236288, 'steps': 6438, 'loss/train': 1.0512960255146027} 01/27/2022 01:55:44 - INFO - codeparrot_training - Step 6439: {'lr': 0.0004895229125995973, 'samples': 1236480, 'steps': 6439, 'loss/train': 1.0300231575965881} 01/27/2022 01:55:47 - INFO - codeparrot_training - Step 6440: {'lr': 0.0004895182248655798, 'samples': 1236672, 'steps': 6440, 'loss/train': 0.8893972635269165} 01/27/2022 01:55:50 - INFO - codeparrot_training - Step 6441: {'lr': 0.0004895135361055422, 'samples': 1236864, 'steps': 6441, 'loss/train': 0.9833247363567352} 01/27/2022 01:55:55 - INFO - codeparrot_training - Step 6442: {'lr': 0.0004895088463195049, 'samples': 1237056, 'steps': 6442, 'loss/train': 1.1429655253887177} 01/27/2022 01:55:58 - INFO - codeparrot_training - Step 6443: {'lr': 0.0004895041555074875, 'samples': 1237248, 'steps': 6443, 'loss/train': 0.9779397547245026} 01/27/2022 01:56:02 - INFO - codeparrot_training - Step 6444: {'lr': 0.0004894994636695105, 'samples': 1237440, 'steps': 6444, 'loss/train': 0.8032970130443573} 01/27/2022 01:56:05 - INFO - codeparrot_training - Step 6445: {'lr': 0.0004894947708055938, 'samples': 1237632, 'steps': 6445, 'loss/train': 0.6996442526578903} 01/27/2022 01:56:08 - INFO - codeparrot_training - Step 6446: {'lr': 0.0004894900769157576, 'samples': 1237824, 'steps': 6446, 'loss/train': 0.1781008578836918} 01/27/2022 01:56:11 - INFO - codeparrot_training - Step 6447: {'lr': 0.0004894853820000219, 'samples': 1238016, 'steps': 6447, 'loss/train': 0.9408718049526215} 01/27/2022 01:56:14 - INFO - codeparrot_training - Step 6448: {'lr': 0.000489480686058407, 'samples': 1238208, 'steps': 6448, 'loss/train': 0.9601345360279083} 01/27/2022 01:56:17 - INFO - codeparrot_training - Step 6449: {'lr': 0.0004894759890909326, 'samples': 1238400, 'steps': 6449, 'loss/train': 0.33476782590150833} 01/27/2022 01:56:22 - INFO - codeparrot_training - Step 6450: {'lr': 0.0004894712910976193, 'samples': 1238592, 'steps': 6450, 'loss/train': 0.8222501277923584} 01/27/2022 01:56:25 - INFO - codeparrot_training - Step 6451: {'lr': 0.000489466592078487, 'samples': 1238784, 'steps': 6451, 'loss/train': 1.0382649600505829} 01/27/2022 01:56:28 - INFO - codeparrot_training - Step 6452: {'lr': 
0.0004894618920335558, 'samples': 1238976, 'steps': 6452, 'loss/train': 1.0739319026470184} 01/27/2022 01:56:31 - INFO - codeparrot_training - Step 6453: {'lr': 0.000489457190962846, 'samples': 1239168, 'steps': 6453, 'loss/train': 0.5725400000810623} 01/27/2022 01:56:34 - INFO - codeparrot_training - Step 6454: {'lr': 0.0004894524888663776, 'samples': 1239360, 'steps': 6454, 'loss/train': 0.34680236130952835} 01/27/2022 01:56:37 - INFO - codeparrot_training - Step 6455: {'lr': 0.0004894477857441707, 'samples': 1239552, 'steps': 6455, 'loss/train': 0.9272817671298981} 01/27/2022 01:56:41 - INFO - codeparrot_training - Step 6456: {'lr': 0.0004894430815962456, 'samples': 1239744, 'steps': 6456, 'loss/train': 1.060455322265625} 01/27/2022 01:56:44 - INFO - codeparrot_training - Step 6457: {'lr': 0.0004894383764226224, 'samples': 1239936, 'steps': 6457, 'loss/train': 0.8561077415943146} 01/27/2022 01:56:47 - INFO - codeparrot_training - Step 6458: {'lr': 0.0004894336702233212, 'samples': 1240128, 'steps': 6458, 'loss/train': 2.5242764353752136} 01/27/2022 01:56:51 - INFO - codeparrot_training - Step 6459: {'lr': 0.0004894289629983621, 'samples': 1240320, 'steps': 6459, 'loss/train': 0.9798576235771179} 01/27/2022 01:56:54 - INFO - codeparrot_training - Step 6460: {'lr': 0.0004894242547477654, 'samples': 1240512, 'steps': 6460, 'loss/train': 1.1058398187160492} 01/27/2022 01:56:58 - INFO - codeparrot_training - Step 6461: {'lr': 0.0004894195454715512, 'samples': 1240704, 'steps': 6461, 'loss/train': 0.5937058478593826} 01/27/2022 01:57:01 - INFO - codeparrot_training - Step 6462: {'lr': 0.0004894148351697398, 'samples': 1240896, 'steps': 6462, 'loss/train': 0.6234152913093567} 01/27/2022 01:57:04 - INFO - codeparrot_training - Step 6463: {'lr': 0.0004894101238423512, 'samples': 1241088, 'steps': 6463, 'loss/train': 1.1815871000289917} 01/27/2022 01:57:07 - INFO - codeparrot_training - Step 6464: {'lr': 0.0004894054114894055, 'samples': 1241280, 'steps': 6464, 'loss/train': 0.45416316390037537} 01/27/2022 01:57:10 - INFO - codeparrot_training - Step 6465: {'lr': 0.0004894006981109232, 'samples': 1241472, 'steps': 6465, 'loss/train': 0.8721913397312164} 01/27/2022 01:57:13 - INFO - codeparrot_training - Step 6466: {'lr': 0.0004893959837069243, 'samples': 1241664, 'steps': 6466, 'loss/train': 0.958389401435852} 01/27/2022 01:57:17 - INFO - codeparrot_training - Step 6467: {'lr': 0.0004893912682774291, 'samples': 1241856, 'steps': 6467, 'loss/train': 0.8280833959579468} 01/27/2022 01:57:22 - INFO - codeparrot_training - Step 6468: {'lr': 0.0004893865518224576, 'samples': 1242048, 'steps': 6468, 'loss/train': 0.9247764945030212} 01/27/2022 01:57:25 - INFO - codeparrot_training - Step 6469: {'lr': 0.0004893818343420302, 'samples': 1242240, 'steps': 6469, 'loss/train': 0.7083458304405212} 01/27/2022 01:57:28 - INFO - codeparrot_training - Step 6470: {'lr': 0.000489377115836167, 'samples': 1242432, 'steps': 6470, 'loss/train': 1.0501765608787537} 01/27/2022 01:57:31 - INFO - codeparrot_training - Step 6471: {'lr': 0.0004893723963048882, 'samples': 1242624, 'steps': 6471, 'loss/train': 1.229216605424881} 01/27/2022 01:57:34 - INFO - codeparrot_training - Step 6472: {'lr': 0.0004893676757482142, 'samples': 1242816, 'steps': 6472, 'loss/train': 1.051376223564148} 01/27/2022 01:57:37 - INFO - codeparrot_training - Step 6473: {'lr': 0.0004893629541661649, 'samples': 1243008, 'steps': 6473, 'loss/train': 0.6711908429861069} 01/27/2022 01:57:40 - INFO - codeparrot_training - Step 6474: {'lr': 
0.0004893582315587608, 'samples': 1243200, 'steps': 6474, 'loss/train': 0.7444881945848465} 01/27/2022 01:57:44 - INFO - codeparrot_training - Step 6475: {'lr': 0.0004893535079260221, 'samples': 1243392, 'steps': 6475, 'loss/train': 1.1054498255252838} 01/27/2022 01:57:47 - INFO - codeparrot_training - Step 6476: {'lr': 0.000489348783267969, 'samples': 1243584, 'steps': 6476, 'loss/train': 0.636889636516571} 01/27/2022 01:57:51 - INFO - codeparrot_training - Step 6477: {'lr': 0.0004893440575846215, 'samples': 1243776, 'steps': 6477, 'loss/train': 0.6710321009159088} 01/27/2022 01:57:54 - INFO - codeparrot_training - Step 6478: {'lr': 0.0004893393308760002, 'samples': 1243968, 'steps': 6478, 'loss/train': 0.9872595369815826} 01/27/2022 01:57:58 - INFO - codeparrot_training - Step 6479: {'lr': 0.0004893346031421253, 'samples': 1244160, 'steps': 6479, 'loss/train': 0.7846276760101318} 01/27/2022 01:58:01 - INFO - codeparrot_training - Step 6480: {'lr': 0.0004893298743830168, 'samples': 1244352, 'steps': 6480, 'loss/train': 1.3248554170131683} 01/27/2022 01:58:04 - INFO - codeparrot_training - Step 6481: {'lr': 0.0004893251445986952, 'samples': 1244544, 'steps': 6481, 'loss/train': 0.7045775055885315} 01/27/2022 01:58:07 - INFO - codeparrot_training - Step 6482: {'lr': 0.0004893204137891807, 'samples': 1244736, 'steps': 6482, 'loss/train': 0.6059805303812027} 01/27/2022 01:58:10 - INFO - codeparrot_training - Step 6483: {'lr': 0.0004893156819544935, 'samples': 1244928, 'steps': 6483, 'loss/train': 0.9456937909126282} 01/27/2022 01:58:13 - INFO - codeparrot_training - Step 6484: {'lr': 0.0004893109490946539, 'samples': 1245120, 'steps': 6484, 'loss/train': 1.3156598210334778} 01/27/2022 01:58:18 - INFO - codeparrot_training - Step 6485: {'lr': 0.0004893062152096821, 'samples': 1245312, 'steps': 6485, 'loss/train': 0.9093525409698486} 01/27/2022 01:58:21 - INFO - codeparrot_training - Step 6486: {'lr': 0.0004893014802995985, 'samples': 1245504, 'steps': 6486, 'loss/train': 0.9557449221611023} 01/27/2022 01:58:24 - INFO - codeparrot_training - Step 6487: {'lr': 0.0004892967443644235, 'samples': 1245696, 'steps': 6487, 'loss/train': 0.8069811165332794} 01/27/2022 01:58:27 - INFO - codeparrot_training - Step 6488: {'lr': 0.0004892920074041771, 'samples': 1245888, 'steps': 6488, 'loss/train': 0.9993148148059845} 01/27/2022 01:58:30 - INFO - codeparrot_training - Step 6489: {'lr': 0.0004892872694188797, 'samples': 1246080, 'steps': 6489, 'loss/train': 1.0697342455387115} 01/27/2022 01:58:33 - INFO - codeparrot_training - Step 6490: {'lr': 0.0004892825304085517, 'samples': 1246272, 'steps': 6490, 'loss/train': 0.728005975484848} 01/27/2022 01:58:37 - INFO - codeparrot_training - Step 6491: {'lr': 0.0004892777903732133, 'samples': 1246464, 'steps': 6491, 'loss/train': 1.4892234206199646} 01/27/2022 01:58:40 - INFO - codeparrot_training - Step 6492: {'lr': 0.0004892730493128848, 'samples': 1246656, 'steps': 6492, 'loss/train': 0.9387146830558777} 01/27/2022 01:58:43 - INFO - codeparrot_training - Step 6493: {'lr': 0.0004892683072275865, 'samples': 1246848, 'steps': 6493, 'loss/train': 0.9668287932872772} 01/27/2022 01:58:47 - INFO - codeparrot_training - Step 6494: {'lr': 0.0004892635641173389, 'samples': 1247040, 'steps': 6494, 'loss/train': 0.8239660263061523} 01/27/2022 01:58:50 - INFO - codeparrot_training - Step 6495: {'lr': 0.0004892588199821619, 'samples': 1247232, 'steps': 6495, 'loss/train': 0.9839179515838623} 01/27/2022 01:58:53 - INFO - codeparrot_training - Step 6496: {'lr': 
0.0004892540748220763, 'samples': 1247424, 'steps': 6496, 'loss/train': 1.1027733385562897} 01/27/2022 01:58:57 - INFO - codeparrot_training - Step 6497: {'lr': 0.0004892493286371022, 'samples': 1247616, 'steps': 6497, 'loss/train': 0.6908422708511353} 01/27/2022 01:59:00 - INFO - codeparrot_training - Step 6498: {'lr': 0.00048924458142726, 'samples': 1247808, 'steps': 6498, 'loss/train': 0.960543304681778} 01/27/2022 01:59:03 - INFO - codeparrot_training - Step 6499: {'lr': 0.0004892398331925698, 'samples': 1248000, 'steps': 6499, 'loss/train': 0.854502260684967} 01/27/2022 01:59:06 - INFO - codeparrot_training - Step 6500: {'lr': 0.0004892350839330522, 'samples': 1248192, 'steps': 6500, 'loss/train': 0.5628131479024887} 01/27/2022 01:59:09 - INFO - codeparrot_training - Step 6501: {'lr': 0.0004892303336487275, 'samples': 1248384, 'steps': 6501, 'loss/train': 0.9196599125862122} 01/27/2022 01:59:12 - INFO - codeparrot_training - Step 6502: {'lr': 0.000489225582339616, 'samples': 1248576, 'steps': 6502, 'loss/train': 1.1944470405578613} 01/27/2022 01:59:17 - INFO - codeparrot_training - Step 6503: {'lr': 0.000489220830005738, 'samples': 1248768, 'steps': 6503, 'loss/train': 0.7738680839538574} 01/27/2022 01:59:21 - INFO - codeparrot_training - Step 6504: {'lr': 0.0004892160766471141, 'samples': 1248960, 'steps': 6504, 'loss/train': 0.7838071882724762} 01/27/2022 01:59:24 - INFO - codeparrot_training - Step 6505: {'lr': 0.0004892113222637643, 'samples': 1249152, 'steps': 6505, 'loss/train': 0.7663211524486542} 01/27/2022 01:59:27 - INFO - codeparrot_training - Step 6506: {'lr': 0.0004892065668557093, 'samples': 1249344, 'steps': 6506, 'loss/train': 0.9826212823390961} 01/27/2022 01:59:30 - INFO - codeparrot_training - Step 6507: {'lr': 0.0004892018104229692, 'samples': 1249536, 'steps': 6507, 'loss/train': 0.9168035387992859} 01/27/2022 01:59:33 - INFO - codeparrot_training - Step 6508: {'lr': 0.0004891970529655646, 'samples': 1249728, 'steps': 6508, 'loss/train': 0.5090121030807495} 01/27/2022 01:59:36 - INFO - codeparrot_training - Step 6509: {'lr': 0.0004891922944835158, 'samples': 1249920, 'steps': 6509, 'loss/train': 0.6441204696893692} 01/27/2022 01:59:39 - INFO - codeparrot_training - Step 6510: {'lr': 0.000489187534976843, 'samples': 1250112, 'steps': 6510, 'loss/train': 0.31325604021549225} 01/27/2022 01:59:43 - INFO - codeparrot_training - Step 6511: {'lr': 0.0004891827744455668, 'samples': 1250304, 'steps': 6511, 'loss/train': 1.4462987780570984} 01/27/2022 01:59:47 - INFO - codeparrot_training - Step 6512: {'lr': 0.0004891780128897077, 'samples': 1250496, 'steps': 6512, 'loss/train': 0.5415111780166626} 01/27/2022 01:59:50 - INFO - codeparrot_training - Step 6513: {'lr': 0.0004891732503092858, 'samples': 1250688, 'steps': 6513, 'loss/train': 0.6589534431695938} 01/27/2022 01:59:54 - INFO - codeparrot_training - Step 6514: {'lr': 0.0004891684867043216, 'samples': 1250880, 'steps': 6514, 'loss/train': 1.204281896352768} 01/27/2022 01:59:57 - INFO - codeparrot_training - Step 6515: {'lr': 0.0004891637220748356, 'samples': 1251072, 'steps': 6515, 'loss/train': 0.827670693397522} 01/27/2022 02:00:00 - INFO - codeparrot_training - Step 6516: {'lr': 0.0004891589564208482, 'samples': 1251264, 'steps': 6516, 'loss/train': 0.6771835684776306} 01/27/2022 02:00:03 - INFO - codeparrot_training - Step 6517: {'lr': 0.0004891541897423798, 'samples': 1251456, 'steps': 6517, 'loss/train': 0.7315075993537903} 01/27/2022 02:00:06 - INFO - codeparrot_training - Step 6518: {'lr': 
0.0004891494220394507, 'samples': 1251648, 'steps': 6518, 'loss/train': 0.5140803158283234} 01/27/2022 02:00:09 - INFO - codeparrot_training - Step 6519: {'lr': 0.0004891446533120815, 'samples': 1251840, 'steps': 6519, 'loss/train': 0.5347931385040283} 01/27/2022 02:00:14 - INFO - codeparrot_training - Step 6520: {'lr': 0.0004891398835602925, 'samples': 1252032, 'steps': 6520, 'loss/train': 1.0911515951156616} 01/27/2022 02:00:17 - INFO - codeparrot_training - Step 6521: {'lr': 0.0004891351127841041, 'samples': 1252224, 'steps': 6521, 'loss/train': 1.0136068761348724} 01/27/2022 02:00:20 - INFO - codeparrot_training - Step 6522: {'lr': 0.0004891303409835369, 'samples': 1252416, 'steps': 6522, 'loss/train': 0.7049199789762497} 01/27/2022 02:00:23 - INFO - codeparrot_training - Step 6523: {'lr': 0.0004891255681586113, 'samples': 1252608, 'steps': 6523, 'loss/train': 1.1698864996433258} 01/27/2022 02:00:26 - INFO - codeparrot_training - Step 6524: {'lr': 0.0004891207943093476, 'samples': 1252800, 'steps': 6524, 'loss/train': 0.982132226228714} 01/27/2022 02:00:29 - INFO - codeparrot_training - Step 6525: {'lr': 0.0004891160194357663, 'samples': 1252992, 'steps': 6525, 'loss/train': 1.0428886413574219} 01/27/2022 02:00:33 - INFO - codeparrot_training - Step 6526: {'lr': 0.0004891112435378881, 'samples': 1253184, 'steps': 6526, 'loss/train': 1.0000210404396057} 01/27/2022 02:00:36 - INFO - codeparrot_training - Step 6527: {'lr': 0.0004891064666157331, 'samples': 1253376, 'steps': 6527, 'loss/train': 1.0078984200954437} 01/27/2022 02:00:39 - INFO - codeparrot_training - Step 6528: {'lr': 0.0004891016886693219, 'samples': 1253568, 'steps': 6528, 'loss/train': 1.1133740544319153} 01/27/2022 02:00:44 - INFO - codeparrot_training - Step 6529: {'lr': 0.0004890969096986751, 'samples': 1253760, 'steps': 6529, 'loss/train': 0.7746005952358246} 01/27/2022 02:00:47 - INFO - codeparrot_training - Step 6530: {'lr': 0.000489092129703813, 'samples': 1253952, 'steps': 6530, 'loss/train': 0.6856762021780014} 01/27/2022 02:00:50 - INFO - codeparrot_training - Step 6531: {'lr': 0.0004890873486847561, 'samples': 1254144, 'steps': 6531, 'loss/train': 1.2229767143726349} 01/27/2022 02:00:54 - INFO - codeparrot_training - Step 6532: {'lr': 0.000489082566641525, 'samples': 1254336, 'steps': 6532, 'loss/train': 1.08779838681221} 01/27/2022 02:00:57 - INFO - codeparrot_training - Step 6533: {'lr': 0.00048907778357414, 'samples': 1254528, 'steps': 6533, 'loss/train': 0.9024446904659271} 01/27/2022 02:01:00 - INFO - codeparrot_training - Step 6534: {'lr': 0.0004890729994826218, 'samples': 1254720, 'steps': 6534, 'loss/train': 0.6247542053461075} 01/27/2022 02:01:03 - INFO - codeparrot_training - Step 6535: {'lr': 0.0004890682143669908, 'samples': 1254912, 'steps': 6535, 'loss/train': 1.0485840439796448} 01/27/2022 02:01:06 - INFO - codeparrot_training - Step 6536: {'lr': 0.0004890634282272673, 'samples': 1255104, 'steps': 6536, 'loss/train': 0.6727143824100494} 01/27/2022 02:01:09 - INFO - codeparrot_training - Step 6537: {'lr': 0.0004890586410634722, 'samples': 1255296, 'steps': 6537, 'loss/train': 1.1756085455417633} 01/27/2022 02:01:14 - INFO - codeparrot_training - Step 6538: {'lr': 0.0004890538528756256, 'samples': 1255488, 'steps': 6538, 'loss/train': 0.8074637353420258} 01/27/2022 02:01:17 - INFO - codeparrot_training - Step 6539: {'lr': 0.0004890490636637484, 'samples': 1255680, 'steps': 6539, 'loss/train': 0.9574673473834991} 01/27/2022 02:01:20 - INFO - codeparrot_training - Step 6540: {'lr': 
0.0004890442734278608, 'samples': 1255872, 'steps': 6540, 'loss/train': 0.9732928276062012} 01/27/2022 02:01:23 - INFO - codeparrot_training - Step 6541: {'lr': 0.0004890394821679834, 'samples': 1256064, 'steps': 6541, 'loss/train': 1.1469668447971344} 01/27/2022 02:01:27 - INFO - codeparrot_training - Step 6542: {'lr': 0.0004890346898841369, 'samples': 1256256, 'steps': 6542, 'loss/train': 1.0162839889526367} 01/27/2022 02:01:30 - INFO - codeparrot_training - Step 6543: {'lr': 0.0004890298965763416, 'samples': 1256448, 'steps': 6543, 'loss/train': 0.26059046387672424} 01/27/2022 02:01:33 - INFO - codeparrot_training - Step 6544: {'lr': 0.0004890251022446181, 'samples': 1256640, 'steps': 6544, 'loss/train': 0.976361095905304} 01/27/2022 02:01:36 - INFO - codeparrot_training - Step 6545: {'lr': 0.000489020306888987, 'samples': 1256832, 'steps': 6545, 'loss/train': 0.890012115240097} 01/27/2022 02:01:39 - INFO - codeparrot_training - Step 6546: {'lr': 0.0004890155105094688, 'samples': 1257024, 'steps': 6546, 'loss/train': 1.0784519612789154} 01/27/2022 02:01:44 - INFO - codeparrot_training - Step 6547: {'lr': 0.0004890107131060841, 'samples': 1257216, 'steps': 6547, 'loss/train': 0.9499726295471191} 01/27/2022 02:01:47 - INFO - codeparrot_training - Step 6548: {'lr': 0.0004890059146788532, 'samples': 1257408, 'steps': 6548, 'loss/train': 0.847395122051239} 01/27/2022 02:01:51 - INFO - codeparrot_training - Step 6549: {'lr': 0.000489001115227797, 'samples': 1257600, 'steps': 6549, 'loss/train': 0.5186022520065308} 01/27/2022 02:01:54 - INFO - codeparrot_training - Step 6550: {'lr': 0.000488996314752936, 'samples': 1257792, 'steps': 6550, 'loss/train': 0.6793401092290878} 01/27/2022 02:01:57 - INFO - codeparrot_training - Step 6551: {'lr': 0.0004889915132542906, 'samples': 1257984, 'steps': 6551, 'loss/train': 0.20366017520427704} 01/27/2022 02:02:00 - INFO - codeparrot_training - Step 6552: {'lr': 0.0004889867107318814, 'samples': 1258176, 'steps': 6552, 'loss/train': 0.9197715818881989} 01/27/2022 02:02:03 - INFO - codeparrot_training - Step 6553: {'lr': 0.0004889819071857291, 'samples': 1258368, 'steps': 6553, 'loss/train': 1.3038400411605835} 01/27/2022 02:02:06 - INFO - codeparrot_training - Step 6554: {'lr': 0.0004889771026158541, 'samples': 1258560, 'steps': 6554, 'loss/train': 0.8200859427452087} 01/27/2022 02:02:11 - INFO - codeparrot_training - Step 6555: {'lr': 0.0004889722970222772, 'samples': 1258752, 'steps': 6555, 'loss/train': 1.2632518708705902} 01/27/2022 02:02:14 - INFO - codeparrot_training - Step 6556: {'lr': 0.0004889674904050188, 'samples': 1258944, 'steps': 6556, 'loss/train': 1.143616408109665} 01/27/2022 02:02:17 - INFO - codeparrot_training - Step 6557: {'lr': 0.0004889626827640994, 'samples': 1259136, 'steps': 6557, 'loss/train': 0.818926602602005} 01/27/2022 02:02:20 - INFO - codeparrot_training - Step 6558: {'lr': 0.00048895787409954, 'samples': 1259328, 'steps': 6558, 'loss/train': 1.2681356370449066} 01/27/2022 02:02:23 - INFO - codeparrot_training - Step 6559: {'lr': 0.0004889530644113608, 'samples': 1259520, 'steps': 6559, 'loss/train': 0.1986890807747841} 01/27/2022 02:02:26 - INFO - codeparrot_training - Step 6560: {'lr': 0.0004889482536995825, 'samples': 1259712, 'steps': 6560, 'loss/train': 1.3844054639339447} 01/27/2022 02:02:30 - INFO - codeparrot_training - Step 6561: {'lr': 0.0004889434419642259, 'samples': 1259904, 'steps': 6561, 'loss/train': 1.0948154926300049} 01/27/2022 02:02:33 - INFO - codeparrot_training - Step 6562: {'lr': 
0.0004889386292053114, 'samples': 1260096, 'steps': 6562, 'loss/train': 0.834183543920517} 01/27/2022 02:02:36 - INFO - codeparrot_training - Step 6563: {'lr': 0.0004889338154228596, 'samples': 1260288, 'steps': 6563, 'loss/train': 0.5842142701148987} 01/27/2022 02:02:40 - INFO - codeparrot_training - Step 6564: {'lr': 0.0004889290006168913, 'samples': 1260480, 'steps': 6564, 'loss/train': 0.5780097395181656} 01/27/2022 02:02:43 - INFO - codeparrot_training - Step 6565: {'lr': 0.0004889241847874271, 'samples': 1260672, 'steps': 6565, 'loss/train': 0.6958589851856232} 01/27/2022 02:02:46 - INFO - codeparrot_training - Step 6566: {'lr': 0.0004889193679344874, 'samples': 1260864, 'steps': 6566, 'loss/train': 1.0658779442310333} 01/27/2022 02:02:50 - INFO - codeparrot_training - Step 6567: {'lr': 0.0004889145500580932, 'samples': 1261056, 'steps': 6567, 'loss/train': 0.17144571617245674} 01/27/2022 02:02:53 - INFO - codeparrot_training - Step 6568: {'lr': 0.0004889097311582647, 'samples': 1261248, 'steps': 6568, 'loss/train': 0.8573758006095886} 01/27/2022 02:02:56 - INFO - codeparrot_training - Step 6569: {'lr': 0.000488904911235023, 'samples': 1261440, 'steps': 6569, 'loss/train': 1.3570860922336578} 01/27/2022 02:02:59 - INFO - codeparrot_training - Step 6570: {'lr': 0.0004889000902883883, 'samples': 1261632, 'steps': 6570, 'loss/train': 0.4657786041498184} 01/27/2022 02:03:02 - INFO - codeparrot_training - Step 6571: {'lr': 0.0004888952683183816, 'samples': 1261824, 'steps': 6571, 'loss/train': 0.8432925939559937} 01/27/2022 02:03:05 - INFO - codeparrot_training - Step 6572: {'lr': 0.0004888904453250233, 'samples': 1262016, 'steps': 6572, 'loss/train': 1.7137506008148193} 01/27/2022 02:03:11 - INFO - codeparrot_training - Step 6573: {'lr': 0.0004888856213083343, 'samples': 1262208, 'steps': 6573, 'loss/train': 0.8368186354637146} 01/27/2022 02:03:14 - INFO - codeparrot_training - Step 6574: {'lr': 0.0004888807962683353, 'samples': 1262400, 'steps': 6574, 'loss/train': 1.292923092842102} 01/27/2022 02:03:17 - INFO - codeparrot_training - Step 6575: {'lr': 0.0004888759702050466, 'samples': 1262592, 'steps': 6575, 'loss/train': 0.8966459333896637} 01/27/2022 02:03:20 - INFO - codeparrot_training - Step 6576: {'lr': 0.0004888711431184892, 'samples': 1262784, 'steps': 6576, 'loss/train': 0.7926495373249054} 01/27/2022 02:03:23 - INFO - codeparrot_training - Step 6577: {'lr': 0.0004888663150086835, 'samples': 1262976, 'steps': 6577, 'loss/train': 5.32520055770874} 01/27/2022 02:03:26 - INFO - codeparrot_training - Step 6578: {'lr': 0.0004888614858756505, 'samples': 1263168, 'steps': 6578, 'loss/train': 0.9044597446918488} 01/27/2022 02:03:29 - INFO - codeparrot_training - Step 6579: {'lr': 0.0004888566557194107, 'samples': 1263360, 'steps': 6579, 'loss/train': 0.6631735414266586} 01/27/2022 02:03:32 - INFO - codeparrot_training - Step 6580: {'lr': 0.0004888518245399849, 'samples': 1263552, 'steps': 6580, 'loss/train': 0.7358801364898682} 01/27/2022 02:03:36 - INFO - codeparrot_training - Step 6581: {'lr': 0.0004888469923373937, 'samples': 1263744, 'steps': 6581, 'loss/train': 1.168412446975708} 01/27/2022 02:03:40 - INFO - codeparrot_training - Step 6582: {'lr': 0.0004888421591116578, 'samples': 1263936, 'steps': 6582, 'loss/train': 1.6964954137802124} 01/27/2022 02:03:43 - INFO - codeparrot_training - Step 6583: {'lr': 0.000488837324862798, 'samples': 1264128, 'steps': 6583, 'loss/train': 0.662755161523819} 01/27/2022 02:03:47 - INFO - codeparrot_training - Step 6584: {'lr': 
0.0004888324895908349, 'samples': 1264320, 'steps': 6584, 'loss/train': 1.0327047407627106} 01/27/2022 02:03:50 - INFO - codeparrot_training - Step 6585: {'lr': 0.0004888276532957892, 'samples': 1264512, 'steps': 6585, 'loss/train': 0.7540208101272583} 01/27/2022 02:03:53 - INFO - codeparrot_training - Step 6586: {'lr': 0.0004888228159776818, 'samples': 1264704, 'steps': 6586, 'loss/train': 0.7608446180820465} 01/27/2022 02:03:56 - INFO - codeparrot_training - Step 6587: {'lr': 0.0004888179776365331, 'samples': 1264896, 'steps': 6587, 'loss/train': 1.0355992019176483} 01/27/2022 02:03:59 - INFO - codeparrot_training - Step 6588: {'lr': 0.0004888131382723641, 'samples': 1265088, 'steps': 6588, 'loss/train': 1.655299186706543} 01/27/2022 02:04:02 - INFO - codeparrot_training - Step 6589: {'lr': 0.0004888082978851954, 'samples': 1265280, 'steps': 6589, 'loss/train': 0.46945448219776154} 01/27/2022 02:04:05 - INFO - codeparrot_training - Step 6590: {'lr': 0.000488803456475048, 'samples': 1265472, 'steps': 6590, 'loss/train': 0.7213195413351059} 01/27/2022 02:04:10 - INFO - codeparrot_training - Step 6591: {'lr': 0.0004887986140419422, 'samples': 1265664, 'steps': 6591, 'loss/train': 0.884341299533844} 01/27/2022 02:04:13 - INFO - codeparrot_training - Step 6592: {'lr': 0.000488793770585899, 'samples': 1265856, 'steps': 6592, 'loss/train': 0.9822398722171783} 01/27/2022 02:04:16 - INFO - codeparrot_training - Step 6593: {'lr': 0.0004887889261069392, 'samples': 1266048, 'steps': 6593, 'loss/train': 0.896996945142746} 01/27/2022 02:04:19 - INFO - codeparrot_training - Step 6594: {'lr': 0.0004887840806050834, 'samples': 1266240, 'steps': 6594, 'loss/train': 0.8056856989860535} 01/27/2022 02:04:22 - INFO - codeparrot_training - Step 6595: {'lr': 0.0004887792340803524, 'samples': 1266432, 'steps': 6595, 'loss/train': 0.6007370799779892} 01/27/2022 02:04:26 - INFO - codeparrot_training - Step 6596: {'lr': 0.000488774386532767, 'samples': 1266624, 'steps': 6596, 'loss/train': 0.8024649918079376} 01/27/2022 02:04:29 - INFO - codeparrot_training - Step 6597: {'lr': 0.0004887695379623481, 'samples': 1266816, 'steps': 6597, 'loss/train': 0.13057564571499825} 01/27/2022 02:04:32 - INFO - codeparrot_training - Step 6598: {'lr': 0.000488764688369116, 'samples': 1267008, 'steps': 6598, 'loss/train': 1.2483475506305695} 01/27/2022 02:04:35 - INFO - codeparrot_training - Step 6599: {'lr': 0.000488759837753092, 'samples': 1267200, 'steps': 6599, 'loss/train': 0.9934367537498474} 01/27/2022 02:04:40 - INFO - codeparrot_training - Step 6600: {'lr': 0.0004887549861142967, 'samples': 1267392, 'steps': 6600, 'loss/train': 0.9496249258518219} 01/27/2022 02:04:43 - INFO - codeparrot_training - Step 6601: {'lr': 0.0004887501334527507, 'samples': 1267584, 'steps': 6601, 'loss/train': 1.3809682130813599} 01/27/2022 02:04:46 - INFO - codeparrot_training - Step 6602: {'lr': 0.000488745279768475, 'samples': 1267776, 'steps': 6602, 'loss/train': 0.8102195262908936} 01/27/2022 02:04:49 - INFO - codeparrot_training - Step 6603: {'lr': 0.0004887404250614904, 'samples': 1267968, 'steps': 6603, 'loss/train': 1.190598338842392} 01/27/2022 02:04:53 - INFO - codeparrot_training - Step 6604: {'lr': 0.0004887355693318176, 'samples': 1268160, 'steps': 6604, 'loss/train': 0.9634336531162262} 01/27/2022 02:04:56 - INFO - codeparrot_training - Step 6605: {'lr': 0.0004887307125794775, 'samples': 1268352, 'steps': 6605, 'loss/train': 0.5986471921205521} 01/27/2022 02:04:59 - INFO - codeparrot_training - Step 6606: {'lr': 
0.0004887258548044907, 'samples': 1268544, 'steps': 6606, 'loss/train': 1.1086763441562653} 01/27/2022 02:05:02 - INFO - codeparrot_training - Step 6607: {'lr': 0.0004887209960068782, 'samples': 1268736, 'steps': 6607, 'loss/train': 0.5359965115785599} 01/27/2022 02:05:07 - INFO - codeparrot_training - Step 6608: {'lr': 0.0004887161361866607, 'samples': 1268928, 'steps': 6608, 'loss/train': 1.086944192647934} 01/27/2022 02:05:10 - INFO - codeparrot_training - Step 6609: {'lr': 0.0004887112753438592, 'samples': 1269120, 'steps': 6609, 'loss/train': 1.0441670715808868} 01/27/2022 02:05:13 - INFO - codeparrot_training - Step 6610: {'lr': 0.0004887064134784943, 'samples': 1269312, 'steps': 6610, 'loss/train': 1.1853637397289276} 01/27/2022 02:05:16 - INFO - codeparrot_training - Step 6611: {'lr': 0.0004887015505905869, 'samples': 1269504, 'steps': 6611, 'loss/train': 1.1245638728141785} 01/27/2022 02:05:19 - INFO - codeparrot_training - Step 6612: {'lr': 0.0004886966866801579, 'samples': 1269696, 'steps': 6612, 'loss/train': 0.991584062576294} 01/27/2022 02:05:23 - INFO - codeparrot_training - Step 6613: {'lr': 0.0004886918217472281, 'samples': 1269888, 'steps': 6613, 'loss/train': 0.8827115893363953} 01/27/2022 02:05:26 - INFO - codeparrot_training - Step 6614: {'lr': 0.0004886869557918183, 'samples': 1270080, 'steps': 6614, 'loss/train': 0.6280835419893265} 01/27/2022 02:05:29 - INFO - codeparrot_training - Step 6615: {'lr': 0.0004886820888139494, 'samples': 1270272, 'steps': 6615, 'loss/train': 0.4925963133573532} 01/27/2022 02:05:32 - INFO - codeparrot_training - Step 6616: {'lr': 0.0004886772208136422, 'samples': 1270464, 'steps': 6616, 'loss/train': 0.2796691283583641} 01/27/2022 02:05:35 - INFO - codeparrot_training - Step 6617: {'lr': 0.0004886723517909176, 'samples': 1270656, 'steps': 6617, 'loss/train': 0.12185989692807198} 01/27/2022 02:05:40 - INFO - codeparrot_training - Step 6618: {'lr': 0.0004886674817457964, 'samples': 1270848, 'steps': 6618, 'loss/train': 0.7687055468559265} 01/27/2022 02:05:43 - INFO - codeparrot_training - Step 6619: {'lr': 0.0004886626106782995, 'samples': 1271040, 'steps': 6619, 'loss/train': 0.7522337436676025} 01/27/2022 02:05:46 - INFO - codeparrot_training - Step 6620: {'lr': 0.0004886577385884478, 'samples': 1271232, 'steps': 6620, 'loss/train': 1.524717628955841} 01/27/2022 02:05:49 - INFO - codeparrot_training - Step 6621: {'lr': 0.0004886528654762621, 'samples': 1271424, 'steps': 6621, 'loss/train': 0.680208295583725} 01/27/2022 02:05:52 - INFO - codeparrot_training - Step 6622: {'lr': 0.0004886479913417633, 'samples': 1271616, 'steps': 6622, 'loss/train': 1.0072157084941864} 01/27/2022 02:05:55 - INFO - codeparrot_training - Step 6623: {'lr': 0.0004886431161849722, 'samples': 1271808, 'steps': 6623, 'loss/train': 1.0078071355819702} 01/27/2022 02:05:59 - INFO - codeparrot_training - Step 6624: {'lr': 0.0004886382400059099, 'samples': 1272000, 'steps': 6624, 'loss/train': 0.4809514582157135} 01/27/2022 02:06:02 - INFO - codeparrot_training - Step 6625: {'lr': 0.0004886333628045972, 'samples': 1272192, 'steps': 6625, 'loss/train': 0.5804079920053482} 01/27/2022 02:06:05 - INFO - codeparrot_training - Step 6626: {'lr': 0.0004886284845810548, 'samples': 1272384, 'steps': 6626, 'loss/train': 0.5820657312870026} 01/27/2022 02:06:11 - INFO - codeparrot_training - Step 6627: {'lr': 0.0004886236053353038, 'samples': 1272576, 'steps': 6627, 'loss/train': 0.7543715536594391} 01/27/2022 02:06:14 - INFO - codeparrot_training - Step 6628: {'lr': 
0.000488618725067365, 'samples': 1272768, 'steps': 6628, 'loss/train': 0.683508574962616} 01/27/2022 02:06:17 - INFO - codeparrot_training - Step 6629: {'lr': 0.0004886138437772594, 'samples': 1272960, 'steps': 6629, 'loss/train': 0.5970571339130402} 01/27/2022 02:06:20 - INFO - codeparrot_training - Step 6630: {'lr': 0.0004886089614650078, 'samples': 1273152, 'steps': 6630, 'loss/train': 0.7080196291208267} 01/27/2022 02:06:23 - INFO - codeparrot_training - Step 6631: {'lr': 0.0004886040781306313, 'samples': 1273344, 'steps': 6631, 'loss/train': 1.186943918466568} 01/27/2022 02:06:26 - INFO - codeparrot_training - Step 6632: {'lr': 0.0004885991937741506, 'samples': 1273536, 'steps': 6632, 'loss/train': 1.2469097971916199} 01/27/2022 02:06:29 - INFO - codeparrot_training - Step 6633: {'lr': 0.0004885943083955868, 'samples': 1273728, 'steps': 6633, 'loss/train': 0.25698602199554443} 01/27/2022 02:06:33 - INFO - codeparrot_training - Step 6634: {'lr': 0.0004885894219949607, 'samples': 1273920, 'steps': 6634, 'loss/train': 0.631616860628128} 01/27/2022 02:06:37 - INFO - codeparrot_training - Step 6635: {'lr': 0.0004885845345722932, 'samples': 1274112, 'steps': 6635, 'loss/train': 1.021652340888977} 01/27/2022 02:06:40 - INFO - codeparrot_training - Step 6636: {'lr': 0.0004885796461276055, 'samples': 1274304, 'steps': 6636, 'loss/train': 0.8845481872558594} 01/27/2022 02:06:44 - INFO - codeparrot_training - Step 6637: {'lr': 0.0004885747566609182, 'samples': 1274496, 'steps': 6637, 'loss/train': 0.483024537563324} 01/27/2022 02:06:47 - INFO - codeparrot_training - Step 6638: {'lr': 0.0004885698661722524, 'samples': 1274688, 'steps': 6638, 'loss/train': 0.33550219237804413} 01/27/2022 02:06:50 - INFO - codeparrot_training - Step 6639: {'lr': 0.0004885649746616291, 'samples': 1274880, 'steps': 6639, 'loss/train': 0.7700864374637604} 01/27/2022 02:06:53 - INFO - codeparrot_training - Step 6640: {'lr': 0.0004885600821290692, 'samples': 1275072, 'steps': 6640, 'loss/train': 1.1363525390625} 01/27/2022 02:06:56 - INFO - codeparrot_training - Step 6641: {'lr': 0.0004885551885745937, 'samples': 1275264, 'steps': 6641, 'loss/train': 0.9799226224422455} 01/27/2022 02:06:59 - INFO - codeparrot_training - Step 6642: {'lr': 0.0004885502939982235, 'samples': 1275456, 'steps': 6642, 'loss/train': 0.5913839936256409} 01/27/2022 02:07:02 - INFO - codeparrot_training - Step 6643: {'lr': 0.0004885453983999795, 'samples': 1275648, 'steps': 6643, 'loss/train': 0.9257690906524658} 01/27/2022 02:07:07 - INFO - codeparrot_training - Step 6644: {'lr': 0.0004885405017798828, 'samples': 1275840, 'steps': 6644, 'loss/train': 0.9529519379138947} 01/27/2022 02:07:10 - INFO - codeparrot_training - Step 6645: {'lr': 0.0004885356041379544, 'samples': 1276032, 'steps': 6645, 'loss/train': 0.6981494128704071} 01/27/2022 02:07:13 - INFO - codeparrot_training - Step 6646: {'lr': 0.0004885307054742151, 'samples': 1276224, 'steps': 6646, 'loss/train': 0.12722302973270416} 01/27/2022 02:07:16 - INFO - codeparrot_training - Step 6647: {'lr': 0.0004885258057886861, 'samples': 1276416, 'steps': 6647, 'loss/train': 0.9677424430847168} 01/27/2022 02:07:19 - INFO - codeparrot_training - Step 6648: {'lr': 0.0004885209050813882, 'samples': 1276608, 'steps': 6648, 'loss/train': 0.4831143468618393} 01/27/2022 02:07:23 - INFO - codeparrot_training - Step 6649: {'lr': 0.0004885160033523426, 'samples': 1276800, 'steps': 6649, 'loss/train': 0.8085573613643646} 01/27/2022 02:07:26 - INFO - codeparrot_training - Step 6650: {'lr': 
0.0004885111006015701, 'samples': 1276992, 'steps': 6650, 'loss/train': 0.5013598948717117} 01/27/2022 02:07:29 - INFO - codeparrot_training - Step 6651: {'lr': 0.0004885061968290919, 'samples': 1277184, 'steps': 6651, 'loss/train': 0.7380141019821167} 01/27/2022 02:07:32 - INFO - codeparrot_training - Step 6652: {'lr': 0.0004885012920349287, 'samples': 1277376, 'steps': 6652, 'loss/train': 1.233612209558487} 01/27/2022 02:07:37 - INFO - codeparrot_training - Step 6653: {'lr': 0.0004884963862191018, 'samples': 1277568, 'steps': 6653, 'loss/train': 0.8556992411613464} 01/27/2022 02:07:40 - INFO - codeparrot_training - Step 6654: {'lr': 0.0004884914793816321, 'samples': 1277760, 'steps': 6654, 'loss/train': 0.9091439545154572} 01/27/2022 02:07:44 - INFO - codeparrot_training - Step 6655: {'lr': 0.0004884865715225407, 'samples': 1277952, 'steps': 6655, 'loss/train': 1.69394052028656} 01/27/2022 02:07:47 - INFO - codeparrot_training - Step 6656: {'lr': 0.0004884816626418484, 'samples': 1278144, 'steps': 6656, 'loss/train': 0.5499929040670395} 01/27/2022 02:07:50 - INFO - codeparrot_training - Step 6657: {'lr': 0.0004884767527395765, 'samples': 1278336, 'steps': 6657, 'loss/train': 0.6538130789995193} 01/27/2022 02:07:53 - INFO - codeparrot_training - Step 6658: {'lr': 0.0004884718418157459, 'samples': 1278528, 'steps': 6658, 'loss/train': 0.7545157670974731} 01/27/2022 02:07:56 - INFO - codeparrot_training - Step 6659: {'lr': 0.0004884669298703775, 'samples': 1278720, 'steps': 6659, 'loss/train': 0.8040609955787659} 01/27/2022 02:07:59 - INFO - codeparrot_training - Step 6660: {'lr': 0.0004884620169034927, 'samples': 1278912, 'steps': 6660, 'loss/train': 0.7279563993215561} 01/27/2022 02:08:02 - INFO - codeparrot_training - Step 6661: {'lr': 0.0004884571029151123, 'samples': 1279104, 'steps': 6661, 'loss/train': 0.8929872214794159} 01/27/2022 02:08:06 - INFO - codeparrot_training - Step 6662: {'lr': 0.0004884521879052573, 'samples': 1279296, 'steps': 6662, 'loss/train': 0.8648751676082611} 01/27/2022 02:08:10 - INFO - codeparrot_training - Step 6663: {'lr': 0.000488447271873949, 'samples': 1279488, 'steps': 6663, 'loss/train': 0.8579853773117065} 01/27/2022 02:08:13 - INFO - codeparrot_training - Step 6664: {'lr': 0.0004884423548212082, 'samples': 1279680, 'steps': 6664, 'loss/train': 1.232727438211441} 01/27/2022 02:08:16 - INFO - codeparrot_training - Step 6665: {'lr': 0.000488437436747056, 'samples': 1279872, 'steps': 6665, 'loss/train': 1.1011701822280884} 01/27/2022 02:08:20 - INFO - codeparrot_training - Step 6666: {'lr': 0.0004884325176515137, 'samples': 1280064, 'steps': 6666, 'loss/train': 1.1592672765254974} 01/27/2022 02:08:23 - INFO - codeparrot_training - Step 6667: {'lr': 0.000488427597534602, 'samples': 1280256, 'steps': 6667, 'loss/train': 1.4817025065422058} 01/27/2022 02:08:26 - INFO - codeparrot_training - Step 6668: {'lr': 0.0004884226763963423, 'samples': 1280448, 'steps': 6668, 'loss/train': 0.7612740397453308} 01/27/2022 02:08:29 - INFO - codeparrot_training - Step 6669: {'lr': 0.0004884177542367556, 'samples': 1280640, 'steps': 6669, 'loss/train': 1.092908799648285} 01/27/2022 02:08:32 - INFO - codeparrot_training - Step 6670: {'lr': 0.0004884128310558628, 'samples': 1280832, 'steps': 6670, 'loss/train': 0.9632634222507477} 01/27/2022 02:08:36 - INFO - codeparrot_training - Step 6671: {'lr': 0.0004884079068536853, 'samples': 1281024, 'steps': 6671, 'loss/train': 0.690290629863739} 01/27/2022 02:08:40 - INFO - codeparrot_training - Step 6672: {'lr': 0.000488402981630244, 
'samples': 1281216, 'steps': 6672, 'loss/train': 0.8148079812526703} 01/27/2022 02:08:43 - INFO - codeparrot_training - Step 6673: {'lr': 0.00048839805538556, 'samples': 1281408, 'steps': 6673, 'loss/train': 0.5239747166633606} 01/27/2022 02:08:46 - INFO - codeparrot_training - Step 6674: {'lr': 0.0004883931281196544, 'samples': 1281600, 'steps': 6674, 'loss/train': 0.860966831445694} 01/27/2022 02:08:49 - INFO - codeparrot_training - Step 6675: {'lr': 0.0004883881998325484, 'samples': 1281792, 'steps': 6675, 'loss/train': 0.9549036920070648} 01/27/2022 02:08:52 - INFO - codeparrot_training - Step 6676: {'lr': 0.000488383270524263, 'samples': 1281984, 'steps': 6676, 'loss/train': 0.8045755326747894} 01/27/2022 02:08:55 - INFO - codeparrot_training - Step 6677: {'lr': 0.0004883783401948194, 'samples': 1282176, 'steps': 6677, 'loss/train': 1.1578138768672943} 01/27/2022 02:08:59 - INFO - codeparrot_training - Step 6678: {'lr': 0.0004883734088442387, 'samples': 1282368, 'steps': 6678, 'loss/train': 0.822446197271347} 01/27/2022 02:09:02 - INFO - codeparrot_training - Step 6679: {'lr': 0.0004883684764725419, 'samples': 1282560, 'steps': 6679, 'loss/train': 0.7246659100055695} 01/27/2022 02:09:06 - INFO - codeparrot_training - Step 6680: {'lr': 0.0004883635430797502, 'samples': 1282752, 'steps': 6680, 'loss/train': 1.0948496460914612} 01/27/2022 02:09:09 - INFO - codeparrot_training - Step 6681: {'lr': 0.000488358608665885, 'samples': 1282944, 'steps': 6681, 'loss/train': 0.7586478888988495} 01/27/2022 02:09:12 - INFO - codeparrot_training - Step 6682: {'lr': 0.000488353673230967, 'samples': 1283136, 'steps': 6682, 'loss/train': 1.138339251279831} 01/27/2022 02:09:16 - INFO - codeparrot_training - Step 6683: {'lr': 0.0004883487367750177, 'samples': 1283328, 'steps': 6683, 'loss/train': 1.3007468283176422} 01/27/2022 02:09:19 - INFO - codeparrot_training - Step 6684: {'lr': 0.0004883437992980581, 'samples': 1283520, 'steps': 6684, 'loss/train': 0.8846204280853271} 01/27/2022 02:09:22 - INFO - codeparrot_training - Step 6685: {'lr': 0.0004883388608001093, 'samples': 1283712, 'steps': 6685, 'loss/train': 1.072122037410736} 01/27/2022 02:09:25 - INFO - codeparrot_training - Step 6686: {'lr': 0.0004883339212811924, 'samples': 1283904, 'steps': 6686, 'loss/train': 0.8004401922225952} 01/27/2022 02:09:28 - INFO - codeparrot_training - Step 6687: {'lr': 0.0004883289807413288, 'samples': 1284096, 'steps': 6687, 'loss/train': 1.054635375738144} 01/27/2022 02:09:31 - INFO - codeparrot_training - Step 6688: {'lr': 0.0004883240391805394, 'samples': 1284288, 'steps': 6688, 'loss/train': 0.9190465807914734} 01/27/2022 02:09:36 - INFO - codeparrot_training - Step 6689: {'lr': 0.0004883190965988455, 'samples': 1284480, 'steps': 6689, 'loss/train': 0.6107935309410095} 01/27/2022 02:09:40 - INFO - codeparrot_training - Step 6690: {'lr': 0.0004883141529962683, 'samples': 1284672, 'steps': 6690, 'loss/train': 0.6180095672607422} 01/27/2022 02:09:43 - INFO - codeparrot_training - Step 6691: {'lr': 0.000488309208372829, 'samples': 1284864, 'steps': 6691, 'loss/train': 0.8079845309257507} 01/27/2022 02:09:46 - INFO - codeparrot_training - Step 6692: {'lr': 0.0004883042627285488, 'samples': 1285056, 'steps': 6692, 'loss/train': 0.3181171864271164} 01/27/2022 02:09:49 - INFO - codeparrot_training - Step 6693: {'lr': 0.0004882993160634487, 'samples': 1285248, 'steps': 6693, 'loss/train': 0.9765525162220001} 01/27/2022 02:09:52 - INFO - codeparrot_training - Step 6694: {'lr': 0.0004882943683775499, 'samples': 1285440, 
'steps': 6694, 'loss/train': 1.3519660234451294} 01/27/2022 02:09:55 - INFO - codeparrot_training - Step 6695: {'lr': 0.0004882894196708738, 'samples': 1285632, 'steps': 6695, 'loss/train': 0.8673959970474243} 01/27/2022 02:09:58 - INFO - codeparrot_training - Step 6696: {'lr': 0.0004882844699434415, 'samples': 1285824, 'steps': 6696, 'loss/train': 0.810850203037262} 01/27/2022 02:10:03 - INFO - codeparrot_training - Step 6697: {'lr': 0.0004882795191952741, 'samples': 1286016, 'steps': 6697, 'loss/train': 0.8156163990497589} 01/27/2022 02:10:06 - INFO - codeparrot_training - Step 6698: {'lr': 0.0004882745674263931, 'samples': 1286208, 'steps': 6698, 'loss/train': 0.9614773392677307} 01/27/2022 02:10:09 - INFO - codeparrot_training - Step 6699: {'lr': 0.00048826961463681936, 'samples': 1286400, 'steps': 6699, 'loss/train': 1.053586184978485} 01/27/2022 02:10:12 - INFO - codeparrot_training - Step 6700: {'lr': 0.00048826466082657426, 'samples': 1286592, 'steps': 6700, 'loss/train': 1.1067797541618347} 01/27/2022 02:10:15 - INFO - codeparrot_training - Step 6701: {'lr': 0.000488259705995679, 'samples': 1286784, 'steps': 6701, 'loss/train': 0.7217751145362854} 01/27/2022 02:10:19 - INFO - codeparrot_training - Step 6702: {'lr': 0.0004882547501441549, 'samples': 1286976, 'steps': 6702, 'loss/train': 0.5510267615318298} 01/27/2022 02:10:22 - INFO - codeparrot_training - Step 6703: {'lr': 0.000488249793272023, 'samples': 1287168, 'steps': 6703, 'loss/train': 0.7786273062229156} 01/27/2022 02:10:25 - INFO - codeparrot_training - Step 6704: {'lr': 0.0004882448353793048, 'samples': 1287360, 'steps': 6704, 'loss/train': 0.7843199372291565} 01/27/2022 02:10:28 - INFO - codeparrot_training - Step 6705: {'lr': 0.0004882398764660212, 'samples': 1287552, 'steps': 6705, 'loss/train': 1.0519318878650665} 01/27/2022 02:10:33 - INFO - codeparrot_training - Step 6706: {'lr': 0.00048823491653219366, 'samples': 1287744, 'steps': 6706, 'loss/train': 1.1435075998306274} 01/27/2022 02:10:36 - INFO - codeparrot_training - Step 6707: {'lr': 0.00048822995557784343, 'samples': 1287936, 'steps': 6707, 'loss/train': 1.0241499245166779} 01/27/2022 02:10:39 - INFO - codeparrot_training - Step 6708: {'lr': 0.00048822499360299165, 'samples': 1288128, 'steps': 6708, 'loss/train': 1.132986456155777} 01/27/2022 02:10:43 - INFO - codeparrot_training - Step 6709: {'lr': 0.00048822003060765973, 'samples': 1288320, 'steps': 6709, 'loss/train': 0.9712487161159515} 01/27/2022 02:10:46 - INFO - codeparrot_training - Step 6710: {'lr': 0.00048821506659186875, 'samples': 1288512, 'steps': 6710, 'loss/train': 0.7402374297380447} 01/27/2022 02:10:49 - INFO - codeparrot_training - Step 6711: {'lr': 0.0004882101015556402, 'samples': 1288704, 'steps': 6711, 'loss/train': 0.7740326821804047} 01/27/2022 02:10:52 - INFO - codeparrot_training - Step 6712: {'lr': 0.00048820513549899507, 'samples': 1288896, 'steps': 6712, 'loss/train': 0.38138021528720856} 01/27/2022 02:10:55 - INFO - codeparrot_training - Step 6713: {'lr': 0.00048820016842195487, 'samples': 1289088, 'steps': 6713, 'loss/train': 0.6734624058008194} 01/27/2022 02:10:58 - INFO - codeparrot_training - Step 6714: {'lr': 0.0004881952003245408, 'samples': 1289280, 'steps': 6714, 'loss/train': 1.162510871887207} 01/27/2022 02:11:03 - INFO - codeparrot_training - Step 6715: {'lr': 0.00048819023120677405, 'samples': 1289472, 'steps': 6715, 'loss/train': 0.8175325691699982} 01/27/2022 02:11:06 - INFO - codeparrot_training - Step 6716: {'lr': 0.000488185261068676, 'samples': 1289664, 'steps': 
6716, 'loss/train': 1.0132586359977722} 01/27/2022 02:11:09 - INFO - codeparrot_training - Step 6717: {'lr': 0.000488180289910268, 'samples': 1289856, 'steps': 6717, 'loss/train': 0.8231652975082397} 01/27/2022 02:11:12 - INFO - codeparrot_training - Step 6718: {'lr': 0.0004881753177315711, 'samples': 1290048, 'steps': 6718, 'loss/train': 0.664407268166542} 01/27/2022 02:11:15 - INFO - codeparrot_training - Step 6719: {'lr': 0.0004881703445326069, 'samples': 1290240, 'steps': 6719, 'loss/train': 0.5449823141098022} 01/27/2022 02:11:18 - INFO - codeparrot_training - Step 6720: {'lr': 0.0004881653703133966, 'samples': 1290432, 'steps': 6720, 'loss/train': 1.0220445692539215} 01/27/2022 02:11:22 - INFO - codeparrot_training - Step 6721: {'lr': 0.00048816039507396135, 'samples': 1290624, 'steps': 6721, 'loss/train': 0.9606027603149414} 01/27/2022 02:11:25 - INFO - codeparrot_training - Step 6722: {'lr': 0.00048815541881432273, 'samples': 1290816, 'steps': 6722, 'loss/train': 0.701797753572464} 01/27/2022 02:11:28 - INFO - codeparrot_training - Step 6723: {'lr': 0.00048815044153450185, 'samples': 1291008, 'steps': 6723, 'loss/train': 0.6461458057165146} 01/27/2022 02:11:32 - INFO - codeparrot_training - Step 6724: {'lr': 0.00048814546323452013, 'samples': 1291200, 'steps': 6724, 'loss/train': 0.7167414575815201} 01/27/2022 02:11:35 - INFO - codeparrot_training - Step 6725: {'lr': 0.0004881404839143988, 'samples': 1291392, 'steps': 6725, 'loss/train': 0.8552539944648743} 01/27/2022 02:11:39 - INFO - codeparrot_training - Step 6726: {'lr': 0.00048813550357415937, 'samples': 1291584, 'steps': 6726, 'loss/train': 0.9355792701244354} 01/27/2022 02:11:42 - INFO - codeparrot_training - Step 6727: {'lr': 0.00048813052221382294, 'samples': 1291776, 'steps': 6727, 'loss/train': 0.8041671216487885} 01/27/2022 02:11:45 - INFO - codeparrot_training - Step 6728: {'lr': 0.000488125539833411, 'samples': 1291968, 'steps': 6728, 'loss/train': 0.9062909781932831} 01/27/2022 02:11:48 - INFO - codeparrot_training - Step 6729: {'lr': 0.0004881205564329449, 'samples': 1292160, 'steps': 6729, 'loss/train': 0.8260897994041443} 01/27/2022 02:11:51 - INFO - codeparrot_training - Step 6730: {'lr': 0.00048811557201244594, 'samples': 1292352, 'steps': 6730, 'loss/train': 0.9453388452529907} 01/27/2022 02:11:54 - INFO - codeparrot_training - Step 6731: {'lr': 0.0004881105865719355, 'samples': 1292544, 'steps': 6731, 'loss/train': 0.9347445666790009} 01/27/2022 02:11:57 - INFO - codeparrot_training - Step 6732: {'lr': 0.00048810560011143485, 'samples': 1292736, 'steps': 6732, 'loss/train': 0.6445549428462982} 01/27/2022 02:12:03 - INFO - codeparrot_training - Step 6733: {'lr': 0.0004881006126309654, 'samples': 1292928, 'steps': 6733, 'loss/train': 0.3110135793685913} 01/27/2022 02:12:06 - INFO - codeparrot_training - Step 6734: {'lr': 0.00048809562413054864, 'samples': 1293120, 'steps': 6734, 'loss/train': 0.8696798980236053} 01/27/2022 02:12:09 - INFO - codeparrot_training - Step 6735: {'lr': 0.00048809063461020575, 'samples': 1293312, 'steps': 6735, 'loss/train': 1.0858007669448853} 01/27/2022 02:12:12 - INFO - codeparrot_training - Step 6736: {'lr': 0.0004880856440699582, 'samples': 1293504, 'steps': 6736, 'loss/train': 1.0629251897335052} 01/27/2022 02:12:15 - INFO - codeparrot_training - Step 6737: {'lr': 0.00048808065250982737, 'samples': 1293696, 'steps': 6737, 'loss/train': 0.6884329319000244} 01/27/2022 02:12:18 - INFO - codeparrot_training - Step 6738: {'lr': 0.0004880756599298346, 'samples': 1293888, 'steps': 6738, 
'loss/train': 0.8268263339996338} 01/27/2022 02:12:21 - INFO - codeparrot_training - Step 6739: {'lr': 0.0004880706663300013, 'samples': 1294080, 'steps': 6739, 'loss/train': 0.5889210999011993} 01/27/2022 02:12:25 - INFO - codeparrot_training - Step 6740: {'lr': 0.0004880656717103489, 'samples': 1294272, 'steps': 6740, 'loss/train': 0.832165002822876} 01/27/2022 02:12:28 - INFO - codeparrot_training - Step 6741: {'lr': 0.00048806067607089866, 'samples': 1294464, 'steps': 6741, 'loss/train': 0.6995381712913513} 01/27/2022 02:12:33 - INFO - codeparrot_training - Step 6742: {'lr': 0.00048805567941167215, 'samples': 1294656, 'steps': 6742, 'loss/train': 0.7155459523200989} 01/27/2022 02:12:36 - INFO - codeparrot_training - Step 6743: {'lr': 0.0004880506817326907, 'samples': 1294848, 'steps': 6743, 'loss/train': 0.7418476492166519} 01/27/2022 02:12:39 - INFO - codeparrot_training - Step 6744: {'lr': 0.0004880456830339757, 'samples': 1295040, 'steps': 6744, 'loss/train': 0.8577222526073456} 01/27/2022 02:12:42 - INFO - codeparrot_training - Step 6745: {'lr': 0.00048804068331554864, 'samples': 1295232, 'steps': 6745, 'loss/train': 1.2998214662075043} 01/27/2022 02:12:45 - INFO - codeparrot_training - Step 6746: {'lr': 0.00048803568257743083, 'samples': 1295424, 'steps': 6746, 'loss/train': 0.7378292977809906} 01/27/2022 02:12:48 - INFO - codeparrot_training - Step 6747: {'lr': 0.00048803068081964375, 'samples': 1295616, 'steps': 6747, 'loss/train': 0.9957001805305481} 01/27/2022 02:12:51 - INFO - codeparrot_training - Step 6748: {'lr': 0.00048802567804220875, 'samples': 1295808, 'steps': 6748, 'loss/train': 0.8076088428497314} 01/27/2022 02:12:55 - INFO - codeparrot_training - Step 6749: {'lr': 0.0004880206742451474, 'samples': 1296000, 'steps': 6749, 'loss/train': 0.9982054531574249} 01/27/2022 02:13:00 - INFO - codeparrot_training - Step 6750: {'lr': 0.0004880156694284811, 'samples': 1296192, 'steps': 6750, 'loss/train': 0.8239316940307617} 01/27/2022 02:13:03 - INFO - codeparrot_training - Step 6751: {'lr': 0.00048801066359223117, 'samples': 1296384, 'steps': 6751, 'loss/train': 0.7210314273834229} 01/27/2022 02:13:06 - INFO - codeparrot_training - Step 6752: {'lr': 0.00048800565673641917, 'samples': 1296576, 'steps': 6752, 'loss/train': 1.108259528875351} 01/27/2022 02:13:09 - INFO - codeparrot_training - Step 6753: {'lr': 0.00048800064886106654, 'samples': 1296768, 'steps': 6753, 'loss/train': 0.09922444447875023} 01/27/2022 02:13:12 - INFO - codeparrot_training - Step 6754: {'lr': 0.0004879956399661947, 'samples': 1296960, 'steps': 6754, 'loss/train': 0.8554156422615051} 01/27/2022 02:13:15 - INFO - codeparrot_training - Step 6755: {'lr': 0.000487990630051825, 'samples': 1297152, 'steps': 6755, 'loss/train': 0.7722710072994232} 01/27/2022 02:13:19 - INFO - codeparrot_training - Step 6756: {'lr': 0.00048798561911797913, 'samples': 1297344, 'steps': 6756, 'loss/train': 0.9197956323623657} 01/27/2022 02:13:22 - INFO - codeparrot_training - Step 6757: {'lr': 0.0004879806071646784, 'samples': 1297536, 'steps': 6757, 'loss/train': 0.42100994288921356} 01/27/2022 02:13:25 - INFO - codeparrot_training - Step 6758: {'lr': 0.00048797559419194427, 'samples': 1297728, 'steps': 6758, 'loss/train': 0.6547435373067856} 01/27/2022 02:13:29 - INFO - codeparrot_training - Step 6759: {'lr': 0.00048797058019979837, 'samples': 1297920, 'steps': 6759, 'loss/train': 0.5668638199567795} 01/27/2022 02:13:32 - INFO - codeparrot_training - Step 6760: {'lr': 0.00048796556518826195, 'samples': 1298112, 'steps': 6760, 
'loss/train': 0.7382581382989883} 01/27/2022 02:13:36 - INFO - codeparrot_training - Step 6761: {'lr': 0.00048796054915735664, 'samples': 1298304, 'steps': 6761, 'loss/train': 1.0505560040473938} 01/27/2022 02:13:39 - INFO - codeparrot_training - Step 6762: {'lr': 0.00048795553210710397, 'samples': 1298496, 'steps': 6762, 'loss/train': 0.9044670760631561} 01/27/2022 02:13:42 - INFO - codeparrot_training - Step 6763: {'lr': 0.00048795051403752534, 'samples': 1298688, 'steps': 6763, 'loss/train': 0.10930637642741203} 01/27/2022 02:13:45 - INFO - codeparrot_training - Step 6764: {'lr': 0.0004879454949486422, 'samples': 1298880, 'steps': 6764, 'loss/train': 0.5790136903524399} 01/27/2022 02:13:48 - INFO - codeparrot_training - Step 6765: {'lr': 0.00048794047484047615, 'samples': 1299072, 'steps': 6765, 'loss/train': 0.5834757685661316} 01/27/2022 02:13:51 - INFO - codeparrot_training - Step 6766: {'lr': 0.00048793545371304863, 'samples': 1299264, 'steps': 6766, 'loss/train': 0.6385616362094879} 01/27/2022 02:13:54 - INFO - codeparrot_training - Step 6767: {'lr': 0.0004879304315663813, 'samples': 1299456, 'steps': 6767, 'loss/train': 0.9688006639480591} 01/27/2022 02:13:59 - INFO - codeparrot_training - Step 6768: {'lr': 0.00048792540840049544, 'samples': 1299648, 'steps': 6768, 'loss/train': 0.7424716204404831} 01/27/2022 02:14:02 - INFO - codeparrot_training - Step 6769: {'lr': 0.00048792038421541266, 'samples': 1299840, 'steps': 6769, 'loss/train': 0.44919079542160034} 01/27/2022 02:14:05 - INFO - codeparrot_training - Step 6770: {'lr': 0.0004879153590111546, 'samples': 1300032, 'steps': 6770, 'loss/train': 0.8141908049583435} 01/27/2022 02:14:08 - INFO - codeparrot_training - Step 6771: {'lr': 0.0004879103327877426, 'samples': 1300224, 'steps': 6771, 'loss/train': 0.8182716965675354} 01/27/2022 02:14:11 - INFO - codeparrot_training - Step 6772: {'lr': 0.0004879053055451983, 'samples': 1300416, 'steps': 6772, 'loss/train': 1.0989909768104553} 01/27/2022 02:14:15 - INFO - codeparrot_training - Step 6773: {'lr': 0.00048790027728354323, 'samples': 1300608, 'steps': 6773, 'loss/train': 0.7422907054424286} 01/27/2022 02:14:18 - INFO - codeparrot_training - Step 6774: {'lr': 0.0004878952480027989, 'samples': 1300800, 'steps': 6774, 'loss/train': 0.6659541428089142} 01/27/2022 02:14:21 - INFO - codeparrot_training - Step 6775: {'lr': 0.0004878902177029869, 'samples': 1300992, 'steps': 6775, 'loss/train': 0.7765791714191437} 01/27/2022 02:14:24 - INFO - codeparrot_training - Step 6776: {'lr': 0.0004878851863841287, 'samples': 1301184, 'steps': 6776, 'loss/train': 1.0224737226963043} 01/27/2022 02:14:29 - INFO - codeparrot_training - Step 6777: {'lr': 0.00048788015404624597, 'samples': 1301376, 'steps': 6777, 'loss/train': 0.7557822167873383} 01/27/2022 02:14:32 - INFO - codeparrot_training - Step 6778: {'lr': 0.0004878751206893601, 'samples': 1301568, 'steps': 6778, 'loss/train': 0.4874608665704727} 01/27/2022 02:14:35 - INFO - codeparrot_training - Step 6779: {'lr': 0.0004878700863134928, 'samples': 1301760, 'steps': 6779, 'loss/train': 1.1433389782905579} 01/27/2022 02:14:38 - INFO - codeparrot_training - Step 6780: {'lr': 0.00048786505091866564, 'samples': 1301952, 'steps': 6780, 'loss/train': 1.0132285952568054} 01/27/2022 02:14:41 - INFO - codeparrot_training - Step 6781: {'lr': 0.0004878600145049001, 'samples': 1302144, 'steps': 6781, 'loss/train': 0.7418465316295624} 01/27/2022 02:14:44 - INFO - codeparrot_training - Step 6782: {'lr': 0.0004878549770722177, 'samples': 1302336, 'steps': 6782, 
'loss/train': 0.7210374623537064} 01/27/2022 02:14:47 - INFO - codeparrot_training - Step 6783: {'lr': 0.0004878499386206402, 'samples': 1302528, 'steps': 6783, 'loss/train': 0.8375597298145294} 01/27/2022 02:14:51 - INFO - codeparrot_training - Step 6784: {'lr': 0.000487844899150189, 'samples': 1302720, 'steps': 6784, 'loss/train': 0.8239235579967499} 01/27/2022 02:14:56 - INFO - codeparrot_training - Step 6785: {'lr': 0.0004878398586608859, 'samples': 1302912, 'steps': 6785, 'loss/train': 0.9818212687969208} 01/27/2022 02:14:59 - INFO - codeparrot_training - Step 6786: {'lr': 0.0004878348171527523, 'samples': 1303104, 'steps': 6786, 'loss/train': 0.4559542089700699} 01/27/2022 02:15:02 - INFO - codeparrot_training - Step 6787: {'lr': 0.0004878297746258099, 'samples': 1303296, 'steps': 6787, 'loss/train': 1.3120242655277252} 01/27/2022 02:15:05 - INFO - codeparrot_training - Step 6788: {'lr': 0.0004878247310800802, 'samples': 1303488, 'steps': 6788, 'loss/train': 0.7124752253293991} 01/27/2022 02:15:08 - INFO - codeparrot_training - Step 6789: {'lr': 0.0004878196865155849, 'samples': 1303680, 'steps': 6789, 'loss/train': 0.9205513000488281} 01/27/2022 02:15:12 - INFO - codeparrot_training - Step 6790: {'lr': 0.0004878146409323456, 'samples': 1303872, 'steps': 6790, 'loss/train': 0.77308389544487} 01/27/2022 02:15:15 - INFO - codeparrot_training - Step 6791: {'lr': 0.00048780959433038386, 'samples': 1304064, 'steps': 6791, 'loss/train': 1.0311209857463837} 01/27/2022 02:15:18 - INFO - codeparrot_training - Step 6792: {'lr': 0.00048780454670972127, 'samples': 1304256, 'steps': 6792, 'loss/train': 0.58047254383564} 01/27/2022 02:15:21 - INFO - codeparrot_training - Step 6793: {'lr': 0.00048779949807037967, 'samples': 1304448, 'steps': 6793, 'loss/train': 0.8117494583129883} 01/27/2022 02:15:26 - INFO - codeparrot_training - Step 6794: {'lr': 0.0004877944484123804, 'samples': 1304640, 'steps': 6794, 'loss/train': 0.6208746582269669} 01/27/2022 02:15:29 - INFO - codeparrot_training - Step 6795: {'lr': 0.00048778939773574525, 'samples': 1304832, 'steps': 6795, 'loss/train': 1.2636407017707825} 01/27/2022 02:15:32 - INFO - codeparrot_training - Step 6796: {'lr': 0.0004877843460404959, 'samples': 1305024, 'steps': 6796, 'loss/train': 0.9115739464759827} 01/27/2022 02:15:35 - INFO - codeparrot_training - Step 6797: {'lr': 0.00048777929332665385, 'samples': 1305216, 'steps': 6797, 'loss/train': 1.0065896809101105} 01/27/2022 02:15:38 - INFO - codeparrot_training - Step 6798: {'lr': 0.00048777423959424083, 'samples': 1305408, 'steps': 6798, 'loss/train': 0.8217456936836243} 01/27/2022 02:15:41 - INFO - codeparrot_training - Step 6799: {'lr': 0.00048776918484327847, 'samples': 1305600, 'steps': 6799, 'loss/train': 0.24896860867738724} 01/27/2022 02:15:44 - INFO - codeparrot_training - Step 6800: {'lr': 0.0004877641290737884, 'samples': 1305792, 'steps': 6800, 'loss/train': 1.0300321877002716} 01/27/2022 02:15:48 - INFO - codeparrot_training - Step 6801: {'lr': 0.0004877590722857923, 'samples': 1305984, 'steps': 6801, 'loss/train': 0.8161667883396149} 01/27/2022 02:15:51 - INFO - codeparrot_training - Step 6802: {'lr': 0.00048775401447931187, 'samples': 1306176, 'steps': 6802, 'loss/train': 0.6594642698764801} 01/27/2022 02:15:55 - INFO - codeparrot_training - Step 6803: {'lr': 0.0004877489556543687, 'samples': 1306368, 'steps': 6803, 'loss/train': 0.9519185721874237} 01/27/2022 02:15:58 - INFO - codeparrot_training - Step 6804: {'lr': 0.00048774389581098454, 'samples': 1306560, 'steps': 6804, 
'loss/train': 0.7715279459953308} 01/27/2022 02:16:01 - INFO - codeparrot_training - Step 6805: {'lr': 0.00048773883494918096, 'samples': 1306752, 'steps': 6805, 'loss/train': 0.8106026351451874} 01/27/2022 02:16:04 - INFO - codeparrot_training - Step 6806: {'lr': 0.0004877337730689797, 'samples': 1306944, 'steps': 6806, 'loss/train': 0.8461604118347168} 01/27/2022 02:16:08 - INFO - codeparrot_training - Step 6807: {'lr': 0.00048772871017040256, 'samples': 1307136, 'steps': 6807, 'loss/train': 0.3622007891535759} 01/27/2022 02:16:11 - INFO - codeparrot_training - Step 6808: {'lr': 0.000487723646253471, 'samples': 1307328, 'steps': 6808, 'loss/train': 1.4785711169242859} 01/27/2022 02:16:14 - INFO - codeparrot_training - Step 6809: {'lr': 0.00048771858131820684, 'samples': 1307520, 'steps': 6809, 'loss/train': 0.9877144396305084} 01/27/2022 02:16:17 - INFO - codeparrot_training - Step 6810: {'lr': 0.0004877135153646318, 'samples': 1307712, 'steps': 6810, 'loss/train': 0.8114174008369446} 01/27/2022 02:16:20 - INFO - codeparrot_training - Step 6811: {'lr': 0.0004877084483927675, 'samples': 1307904, 'steps': 6811, 'loss/train': 0.7965520620346069} 01/27/2022 02:16:25 - INFO - codeparrot_training - Step 6812: {'lr': 0.00048770338040263574, 'samples': 1308096, 'steps': 6812, 'loss/train': 0.8616982698440552} 01/27/2022 02:16:29 - INFO - codeparrot_training - Step 6813: {'lr': 0.00048769831139425815, 'samples': 1308288, 'steps': 6813, 'loss/train': 0.8130699992179871} 01/27/2022 02:16:32 - INFO - codeparrot_training - Step 6814: {'lr': 0.0004876932413676565, 'samples': 1308480, 'steps': 6814, 'loss/train': 0.8895336985588074} 01/27/2022 02:16:35 - INFO - codeparrot_training - Step 6815: {'lr': 0.0004876881703228524, 'samples': 1308672, 'steps': 6815, 'loss/train': 1.1168972253799438} 01/27/2022 02:16:38 - INFO - codeparrot_training - Step 6816: {'lr': 0.0004876830982598677, 'samples': 1308864, 'steps': 6816, 'loss/train': 0.557303711771965} 01/27/2022 02:16:41 - INFO - codeparrot_training - Step 6817: {'lr': 0.0004876780251787241, 'samples': 1309056, 'steps': 6817, 'loss/train': 0.1080353669822216} 01/27/2022 02:16:44 - INFO - codeparrot_training - Step 6818: {'lr': 0.0004876729510794433, 'samples': 1309248, 'steps': 6818, 'loss/train': 1.5937371253967285} 01/27/2022 02:16:47 - INFO - codeparrot_training - Step 6819: {'lr': 0.00048766787596204704, 'samples': 1309440, 'steps': 6819, 'loss/train': 0.765739381313324} 01/27/2022 02:16:50 - INFO - codeparrot_training - Step 6820: {'lr': 0.000487662799826557, 'samples': 1309632, 'steps': 6820, 'loss/train': 0.5433285981416702} 01/27/2022 02:16:55 - INFO - codeparrot_training - Step 6821: {'lr': 0.00048765772267299513, 'samples': 1309824, 'steps': 6821, 'loss/train': 0.9978998601436615} 01/27/2022 02:16:58 - INFO - codeparrot_training - Step 6822: {'lr': 0.00048765264450138297, 'samples': 1310016, 'steps': 6822, 'loss/train': 0.7482652366161346} 01/27/2022 02:17:01 - INFO - codeparrot_training - Step 6823: {'lr': 0.00048764756531174237, 'samples': 1310208, 'steps': 6823, 'loss/train': 0.8272646069526672} 01/27/2022 02:17:04 - INFO - codeparrot_training - Step 6824: {'lr': 0.000487642485104095, 'samples': 1310400, 'steps': 6824, 'loss/train': 0.84530308842659} 01/27/2022 02:17:07 - INFO - codeparrot_training - Step 6825: {'lr': 0.0004876374038784627, 'samples': 1310592, 'steps': 6825, 'loss/train': 0.1545145846903324} 01/27/2022 02:17:11 - INFO - codeparrot_training - Step 6826: {'lr': 0.0004876323216348673, 'samples': 1310784, 'steps': 6826, 
'loss/train': 1.2357537746429443} 01/27/2022 02:17:14 - INFO - codeparrot_training - Step 6827: {'lr': 0.0004876272383733304, 'samples': 1310976, 'steps': 6827, 'loss/train': 0.9077078998088837} 01/27/2022 02:17:17 - INFO - codeparrot_training - Step 6828: {'lr': 0.0004876221540938739, 'samples': 1311168, 'steps': 6828, 'loss/train': 0.6957989037036896} 01/27/2022 02:17:20 - INFO - codeparrot_training - Step 6829: {'lr': 0.00048761706879651956, 'samples': 1311360, 'steps': 6829, 'loss/train': 1.0914216935634613} 01/27/2022 02:17:25 - INFO - codeparrot_training - Step 6830: {'lr': 0.00048761198248128913, 'samples': 1311552, 'steps': 6830, 'loss/train': 0.5295973420143127} 01/27/2022 02:17:28 - INFO - codeparrot_training - Step 6831: {'lr': 0.00048760689514820444, 'samples': 1311744, 'steps': 6831, 'loss/train': 0.8706179559230804} 01/27/2022 02:17:32 - INFO - codeparrot_training - Step 6832: {'lr': 0.0004876018067972872, 'samples': 1311936, 'steps': 6832, 'loss/train': 0.6770144551992416} 01/27/2022 02:17:35 - INFO - codeparrot_training - Step 6833: {'lr': 0.00048759671742855935, 'samples': 1312128, 'steps': 6833, 'loss/train': 0.729890450835228} 01/27/2022 02:17:38 - INFO - codeparrot_training - Step 6834: {'lr': 0.00048759162704204253, 'samples': 1312320, 'steps': 6834, 'loss/train': 0.8027603924274445} 01/27/2022 02:17:41 - INFO - codeparrot_training - Step 6835: {'lr': 0.0004875865356377587, 'samples': 1312512, 'steps': 6835, 'loss/train': 1.0414401590824127} 01/27/2022 02:17:44 - INFO - codeparrot_training - Step 6836: {'lr': 0.0004875814432157295, 'samples': 1312704, 'steps': 6836, 'loss/train': 0.46516183018684387} 01/27/2022 02:17:47 - INFO - codeparrot_training - Step 6837: {'lr': 0.0004875763497759769, 'samples': 1312896, 'steps': 6837, 'loss/train': 0.47715671360492706} 01/27/2022 02:17:50 - INFO - codeparrot_training - Step 6838: {'lr': 0.00048757125531852263, 'samples': 1313088, 'steps': 6838, 'loss/train': 1.05403670668602} 01/27/2022 02:17:55 - INFO - codeparrot_training - Step 6839: {'lr': 0.00048756615984338857, 'samples': 1313280, 'steps': 6839, 'loss/train': 1.1835665702819824} 01/27/2022 02:17:58 - INFO - codeparrot_training - Step 6840: {'lr': 0.0004875610633505965, 'samples': 1313472, 'steps': 6840, 'loss/train': 0.8430866897106171} 01/27/2022 02:18:01 - INFO - codeparrot_training - Step 6841: {'lr': 0.00048755596584016824, 'samples': 1313664, 'steps': 6841, 'loss/train': 0.8379228115081787} 01/27/2022 02:18:05 - INFO - codeparrot_training - Step 6842: {'lr': 0.0004875508673121257, 'samples': 1313856, 'steps': 6842, 'loss/train': 1.53638756275177} 01/27/2022 02:18:08 - INFO - codeparrot_training - Step 6843: {'lr': 0.00048754576776649066, 'samples': 1314048, 'steps': 6843, 'loss/train': 0.9216309785842896} 01/27/2022 02:18:11 - INFO - codeparrot_training - Step 6844: {'lr': 0.000487540667203285, 'samples': 1314240, 'steps': 6844, 'loss/train': 0.46154429018497467} 01/27/2022 02:18:14 - INFO - codeparrot_training - Step 6845: {'lr': 0.0004875355656225305, 'samples': 1314432, 'steps': 6845, 'loss/train': 0.44841237366199493} 01/27/2022 02:18:17 - INFO - codeparrot_training - Step 6846: {'lr': 0.0004875304630242491, 'samples': 1314624, 'steps': 6846, 'loss/train': 0.8102006614208221} 01/27/2022 02:18:20 - INFO - codeparrot_training - Step 6847: {'lr': 0.00048752535940846267, 'samples': 1314816, 'steps': 6847, 'loss/train': 0.9251637160778046} 01/27/2022 02:18:25 - INFO - codeparrot_training - Step 6848: {'lr': 0.0004875202547751929, 'samples': 1315008, 'steps': 6848, 
'loss/train': 1.027937650680542} 01/27/2022 02:18:28 - INFO - codeparrot_training - Step 6849: {'lr': 0.00048751514912446185, 'samples': 1315200, 'steps': 6849, 'loss/train': 0.9637649059295654} 01/27/2022 02:18:31 - INFO - codeparrot_training - Step 6850: {'lr': 0.0004875100424562914, 'samples': 1315392, 'steps': 6850, 'loss/train': 1.351087510585785} 01/27/2022 02:18:34 - INFO - codeparrot_training - Step 6851: {'lr': 0.0004875049347707032, 'samples': 1315584, 'steps': 6851, 'loss/train': 0.5251942723989487} 01/27/2022 02:18:37 - INFO - codeparrot_training - Step 6852: {'lr': 0.00048749982606771934, 'samples': 1315776, 'steps': 6852, 'loss/train': 0.7124141603708267} 01/27/2022 02:18:40 - INFO - codeparrot_training - Step 6853: {'lr': 0.00048749471634736163, 'samples': 1315968, 'steps': 6853, 'loss/train': 0.7693744003772736} 01/27/2022 02:18:43 - INFO - codeparrot_training - Step 6854: {'lr': 0.0004874896056096521, 'samples': 1316160, 'steps': 6854, 'loss/train': 1.0794534087181091} 01/27/2022 02:18:47 - INFO - codeparrot_training - Step 6855: {'lr': 0.0004874844938546123, 'samples': 1316352, 'steps': 6855, 'loss/train': 1.363237202167511} 01/27/2022 02:18:52 - INFO - codeparrot_training - Step 6856: {'lr': 0.0004874793810822644, 'samples': 1316544, 'steps': 6856, 'loss/train': 0.6347203999757767} 01/27/2022 02:18:55 - INFO - codeparrot_training - Step 6857: {'lr': 0.00048747426729263036, 'samples': 1316736, 'steps': 6857, 'loss/train': 0.9766808152198792} 01/27/2022 02:18:58 - INFO - codeparrot_training - Step 6858: {'lr': 0.0004874691524857318, 'samples': 1316928, 'steps': 6858, 'loss/train': 0.7621654272079468} 01/27/2022 02:19:01 - INFO - codeparrot_training - Step 6859: {'lr': 0.00048746403666159087, 'samples': 1317120, 'steps': 6859, 'loss/train': 1.2997123897075653} 01/27/2022 02:19:04 - INFO - codeparrot_training - Step 6860: {'lr': 0.0004874589198202294, 'samples': 1317312, 'steps': 6860, 'loss/train': 0.8042740523815155} 01/27/2022 02:19:07 - INFO - codeparrot_training - Step 6861: {'lr': 0.0004874538019616693, 'samples': 1317504, 'steps': 6861, 'loss/train': 0.8954091668128967} 01/27/2022 02:19:11 - INFO - codeparrot_training - Step 6862: {'lr': 0.0004874486830859326, 'samples': 1317696, 'steps': 6862, 'loss/train': 0.6632251292467117} 01/27/2022 02:19:14 - INFO - codeparrot_training - Step 6863: {'lr': 0.0004874435631930411, 'samples': 1317888, 'steps': 6863, 'loss/train': 0.9462738633155823} 01/27/2022 02:19:17 - INFO - codeparrot_training - Step 6864: {'lr': 0.0004874384422830167, 'samples': 1318080, 'steps': 6864, 'loss/train': 0.8796991109848022} 01/27/2022 02:19:21 - INFO - codeparrot_training - Step 6865: {'lr': 0.0004874333203558815, 'samples': 1318272, 'steps': 6865, 'loss/train': 1.0961297750473022} 01/27/2022 02:19:25 - INFO - codeparrot_training - Step 6866: {'lr': 0.0004874281974116573, 'samples': 1318464, 'steps': 6866, 'loss/train': 0.7298089116811752} 01/27/2022 02:19:28 - INFO - codeparrot_training - Step 6867: {'lr': 0.0004874230734503661, 'samples': 1318656, 'steps': 6867, 'loss/train': 0.7841052711009979} 01/27/2022 02:19:31 - INFO - codeparrot_training - Step 6868: {'lr': 0.00048741794847202984, 'samples': 1318848, 'steps': 6868, 'loss/train': 0.7274097204208374} 01/27/2022 02:19:34 - INFO - codeparrot_training - Step 6869: {'lr': 0.00048741282247667054, 'samples': 1319040, 'steps': 6869, 'loss/train': 1.78056138753891} 01/27/2022 02:19:37 - INFO - codeparrot_training - Step 6870: {'lr': 0.00048740769546431, 'samples': 1319232, 'steps': 6870, 
'loss/train': 0.3070768341422081} 01/27/2022 02:19:40 - INFO - codeparrot_training - Step 6871: {'lr': 0.0004874025674349704, 'samples': 1319424, 'steps': 6871, 'loss/train': 1.3687639832496643} 01/27/2022 02:19:43 - INFO - codeparrot_training - Step 6872: {'lr': 0.00048739743838867344, 'samples': 1319616, 'steps': 6872, 'loss/train': 1.2204015254974365} 01/27/2022 02:19:47 - INFO - codeparrot_training - Step 6873: {'lr': 0.0004873923083254413, 'samples': 1319808, 'steps': 6873, 'loss/train': 0.8689853847026825} 01/27/2022 02:19:51 - INFO - codeparrot_training - Step 6874: {'lr': 0.0004873871772452959, 'samples': 1320000, 'steps': 6874, 'loss/train': 0.5550268739461899} 01/27/2022 02:19:54 - INFO - codeparrot_training - Step 6875: {'lr': 0.00048738204514825917, 'samples': 1320192, 'steps': 6875, 'loss/train': 0.9543957710266113} 01/27/2022 02:19:57 - INFO - codeparrot_training - Step 6876: {'lr': 0.0004873769120343532, 'samples': 1320384, 'steps': 6876, 'loss/train': 1.7374202013015747} 01/27/2022 02:20:00 - INFO - codeparrot_training - Step 6877: {'lr': 0.0004873717779035999, 'samples': 1320576, 'steps': 6877, 'loss/train': 0.8980308473110199} 01/27/2022 02:20:04 - INFO - codeparrot_training - Step 6878: {'lr': 0.00048736664275602124, 'samples': 1320768, 'steps': 6878, 'loss/train': 0.9711992740631104} 01/27/2022 02:20:07 - INFO - codeparrot_training - Step 6879: {'lr': 0.00048736150659163925, 'samples': 1320960, 'steps': 6879, 'loss/train': 0.9547217488288879} 01/27/2022 02:20:10 - INFO - codeparrot_training - Step 6880: {'lr': 0.000487356369410476, 'samples': 1321152, 'steps': 6880, 'loss/train': 1.2662831246852875} 01/27/2022 02:20:13 - INFO - codeparrot_training - Step 6881: {'lr': 0.00048735123121255335, 'samples': 1321344, 'steps': 6881, 'loss/train': 0.8130010664463043} 01/27/2022 02:20:16 - INFO - codeparrot_training - Step 6882: {'lr': 0.0004873460919978935, 'samples': 1321536, 'steps': 6882, 'loss/train': 0.8536979556083679} 01/27/2022 02:20:21 - INFO - codeparrot_training - Step 6883: {'lr': 0.00048734095176651825, 'samples': 1321728, 'steps': 6883, 'loss/train': 0.988105058670044} 01/27/2022 02:20:24 - INFO - codeparrot_training - Step 6884: {'lr': 0.00048733581051844976, 'samples': 1321920, 'steps': 6884, 'loss/train': 1.0294859111309052} 01/27/2022 02:20:27 - INFO - codeparrot_training - Step 6885: {'lr': 0.0004873306682537101, 'samples': 1322112, 'steps': 6885, 'loss/train': 0.7264756411314011} 01/27/2022 02:20:30 - INFO - codeparrot_training - Step 6886: {'lr': 0.0004873255249723211, 'samples': 1322304, 'steps': 6886, 'loss/train': 0.8957484662532806} 01/27/2022 02:20:33 - INFO - codeparrot_training - Step 6887: {'lr': 0.000487320380674305, 'samples': 1322496, 'steps': 6887, 'loss/train': 0.3570317253470421} 01/27/2022 02:20:37 - INFO - codeparrot_training - Step 6888: {'lr': 0.0004873152353596837, 'samples': 1322688, 'steps': 6888, 'loss/train': 0.8385611772537231} 01/27/2022 02:20:40 - INFO - codeparrot_training - Step 6889: {'lr': 0.00048731008902847927, 'samples': 1322880, 'steps': 6889, 'loss/train': 0.5840651392936707} 01/27/2022 02:20:43 - INFO - codeparrot_training - Step 6890: {'lr': 0.0004873049416807138, 'samples': 1323072, 'steps': 6890, 'loss/train': 2.072129487991333} 01/27/2022 02:20:46 - INFO - codeparrot_training - Step 6891: {'lr': 0.00048729979331640927, 'samples': 1323264, 'steps': 6891, 'loss/train': 1.1536560952663422} 01/27/2022 02:20:51 - INFO - codeparrot_training - Step 6892: {'lr': 0.0004872946439355879, 'samples': 1323456, 'steps': 6892, 
'loss/train': 0.8930999636650085} 01/27/2022 02:20:54 - INFO - codeparrot_training - Step 6893: {'lr': 0.0004872894935382715, 'samples': 1323648, 'steps': 6893, 'loss/train': 0.9914895594120026} 01/27/2022 02:20:58 - INFO - codeparrot_training - Step 6894: {'lr': 0.00048728434212448233, 'samples': 1323840, 'steps': 6894, 'loss/train': 0.8735700845718384} 01/27/2022 02:21:01 - INFO - codeparrot_training - Step 6895: {'lr': 0.0004872791896942423, 'samples': 1324032, 'steps': 6895, 'loss/train': 0.9301161468029022} 01/27/2022 02:21:04 - INFO - codeparrot_training - Step 6896: {'lr': 0.0004872740362475737, 'samples': 1324224, 'steps': 6896, 'loss/train': 0.9783981442451477} 01/27/2022 02:21:07 - INFO - codeparrot_training - Step 6897: {'lr': 0.00048726888178449835, 'samples': 1324416, 'steps': 6897, 'loss/train': 1.1579960882663727} 01/27/2022 02:21:10 - INFO - codeparrot_training - Step 6898: {'lr': 0.00048726372630503845, 'samples': 1324608, 'steps': 6898, 'loss/train': 0.7753594815731049} 01/27/2022 02:21:13 - INFO - codeparrot_training - Step 6899: {'lr': 0.00048725856980921616, 'samples': 1324800, 'steps': 6899, 'loss/train': 1.0033454596996307} 01/27/2022 02:21:18 - INFO - codeparrot_training - Step 6900: {'lr': 0.0004872534122970535, 'samples': 1324992, 'steps': 6900, 'loss/train': 0.8167688548564911} 01/27/2022 02:21:21 - INFO - codeparrot_training - Step 6901: {'lr': 0.00048724825376857253, 'samples': 1325184, 'steps': 6901, 'loss/train': 0.749948188662529} 01/27/2022 02:21:24 - INFO - codeparrot_training - Step 6902: {'lr': 0.0004872430942237953, 'samples': 1325376, 'steps': 6902, 'loss/train': 0.682202160358429} 01/27/2022 02:21:27 - INFO - codeparrot_training - Step 6903: {'lr': 0.0004872379336627441, 'samples': 1325568, 'steps': 6903, 'loss/train': 0.4897422194480896} 01/27/2022 02:21:30 - INFO - codeparrot_training - Step 6904: {'lr': 0.0004872327720854409, 'samples': 1325760, 'steps': 6904, 'loss/train': 0.6852402091026306} 01/27/2022 02:21:34 - INFO - codeparrot_training - Step 6905: {'lr': 0.0004872276094919078, 'samples': 1325952, 'steps': 6905, 'loss/train': 0.9611290991306305} 01/27/2022 02:21:37 - INFO - codeparrot_training - Step 6906: {'lr': 0.00048722244588216695, 'samples': 1326144, 'steps': 6906, 'loss/train': 0.8107783198356628} 01/27/2022 02:21:40 - INFO - codeparrot_training - Step 6907: {'lr': 0.00048721728125624054, 'samples': 1326336, 'steps': 6907, 'loss/train': 0.5393281280994415} 01/27/2022 02:21:43 - INFO - codeparrot_training - Step 6908: {'lr': 0.0004872121156141506, 'samples': 1326528, 'steps': 6908, 'loss/train': 0.8354665338993073} 01/27/2022 02:21:48 - INFO - codeparrot_training - Step 6909: {'lr': 0.0004872069489559192, 'samples': 1326720, 'steps': 6909, 'loss/train': 1.3277118802070618} 01/27/2022 02:21:52 - INFO - codeparrot_training - Step 6910: {'lr': 0.00048720178128156856, 'samples': 1326912, 'steps': 6910, 'loss/train': 0.71421217918396} 01/27/2022 02:21:55 - INFO - codeparrot_training - Step 6911: {'lr': 0.00048719661259112086, 'samples': 1327104, 'steps': 6911, 'loss/train': 0.8693853914737701} 01/27/2022 02:21:58 - INFO - codeparrot_training - Step 6912: {'lr': 0.0004871914428845982, 'samples': 1327296, 'steps': 6912, 'loss/train': 0.4054265320301056} 01/27/2022 02:22:01 - INFO - codeparrot_training - Step 6913: {'lr': 0.0004871862721620227, 'samples': 1327488, 'steps': 6913, 'loss/train': 1.1594705879688263} 01/27/2022 02:22:04 - INFO - codeparrot_training - Step 6914: {'lr': 0.0004871811004234165, 'samples': 1327680, 'steps': 6914, 
'loss/train': 1.0822385251522064} 01/27/2022 02:22:07 - INFO - codeparrot_training - Step 6915: {'lr': 0.0004871759276688018, 'samples': 1327872, 'steps': 6915, 'loss/train': 0.8146585822105408} 01/27/2022 02:22:10 - INFO - codeparrot_training - Step 6916: {'lr': 0.00048717075389820074, 'samples': 1328064, 'steps': 6916, 'loss/train': 1.1661287248134613} 01/27/2022 02:22:14 - INFO - codeparrot_training - Step 6917: {'lr': 0.0004871655791116355, 'samples': 1328256, 'steps': 6917, 'loss/train': 1.1251651346683502} 01/27/2022 02:22:18 - INFO - codeparrot_training - Step 6918: {'lr': 0.00048716040330912816, 'samples': 1328448, 'steps': 6918, 'loss/train': 0.11792537569999695} 01/27/2022 02:22:21 - INFO - codeparrot_training - Step 6919: {'lr': 0.000487155226490701, 'samples': 1328640, 'steps': 6919, 'loss/train': 0.9940372109413147} 01/27/2022 02:22:24 - INFO - codeparrot_training - Step 6920: {'lr': 0.0004871500486563761, 'samples': 1328832, 'steps': 6920, 'loss/train': 0.7794888317584991} 01/27/2022 02:22:27 - INFO - codeparrot_training - Step 6921: {'lr': 0.00048714486980617577, 'samples': 1329024, 'steps': 6921, 'loss/train': 0.9339427649974823} 01/27/2022 02:22:31 - INFO - codeparrot_training - Step 6922: {'lr': 0.00048713968994012216, 'samples': 1329216, 'steps': 6922, 'loss/train': 0.738516166806221} 01/27/2022 02:22:34 - INFO - codeparrot_training - Step 6923: {'lr': 0.00048713450905823736, 'samples': 1329408, 'steps': 6923, 'loss/train': 0.7939860820770264} 01/27/2022 02:22:37 - INFO - codeparrot_training - Step 6924: {'lr': 0.0004871293271605436, 'samples': 1329600, 'steps': 6924, 'loss/train': 0.8238635659217834} 01/27/2022 02:22:40 - INFO - codeparrot_training - Step 6925: {'lr': 0.00048712414424706315, 'samples': 1329792, 'steps': 6925, 'loss/train': 0.8935233056545258} 01/27/2022 02:22:43 - INFO - codeparrot_training - Step 6926: {'lr': 0.0004871189603178181, 'samples': 1329984, 'steps': 6926, 'loss/train': 0.9838247001171112} 01/27/2022 02:22:48 - INFO - codeparrot_training - Step 6927: {'lr': 0.00048711377537283073, 'samples': 1330176, 'steps': 6927, 'loss/train': 0.9644960761070251} 01/27/2022 02:22:51 - INFO - codeparrot_training - Step 6928: {'lr': 0.0004871085894121233, 'samples': 1330368, 'steps': 6928, 'loss/train': 0.5559437423944473} 01/27/2022 02:22:54 - INFO - codeparrot_training - Step 6929: {'lr': 0.00048710340243571796, 'samples': 1330560, 'steps': 6929, 'loss/train': 0.7549387514591217} 01/27/2022 02:22:57 - INFO - codeparrot_training - Step 6930: {'lr': 0.0004870982144436369, 'samples': 1330752, 'steps': 6930, 'loss/train': 0.8022176921367645} 01/27/2022 02:23:00 - INFO - codeparrot_training - Step 6931: {'lr': 0.0004870930254359023, 'samples': 1330944, 'steps': 6931, 'loss/train': 0.6206554770469666} 01/27/2022 02:23:03 - INFO - codeparrot_training - Step 6932: {'lr': 0.00048708783541253655, 'samples': 1331136, 'steps': 6932, 'loss/train': 1.146945834159851} 01/27/2022 02:23:06 - INFO - codeparrot_training - Step 6933: {'lr': 0.0004870826443735618, 'samples': 1331328, 'steps': 6933, 'loss/train': 1.0790859460830688} 01/27/2022 02:23:10 - INFO - codeparrot_training - Step 6934: {'lr': 0.0004870774523190003, 'samples': 1331520, 'steps': 6934, 'loss/train': 0.8990480303764343} 01/27/2022 02:23:13 - INFO - codeparrot_training - Step 6935: {'lr': 0.00048707225924887423, 'samples': 1331712, 'steps': 6935, 'loss/train': 0.959615170955658} 01/27/2022 02:23:18 - INFO - codeparrot_training - Step 6936: {'lr': 0.0004870670651632059, 'samples': 1331904, 'steps': 6936, 
'loss/train': 1.1425579190254211} 01/27/2022 02:23:21 - INFO - codeparrot_training - Step 6937: {'lr': 0.0004870618700620175, 'samples': 1332096, 'steps': 6937, 'loss/train': 0.28838521242141724} 01/27/2022 02:23:24 - INFO - codeparrot_training - Step 6938: {'lr': 0.0004870566739453314, 'samples': 1332288, 'steps': 6938, 'loss/train': 1.0710580945014954} 01/27/2022 02:23:27 - INFO - codeparrot_training - Step 6939: {'lr': 0.00048705147681316974, 'samples': 1332480, 'steps': 6939, 'loss/train': 0.808917224407196} 01/27/2022 02:23:30 - INFO - codeparrot_training - Step 6940: {'lr': 0.00048704627866555486, 'samples': 1332672, 'steps': 6940, 'loss/train': 0.7804137468338013} 01/27/2022 02:23:33 - INFO - codeparrot_training - Step 6941: {'lr': 0.00048704107950250887, 'samples': 1332864, 'steps': 6941, 'loss/train': 0.8897470235824585} 01/27/2022 02:23:37 - INFO - codeparrot_training - Step 6942: {'lr': 0.0004870358793240543, 'samples': 1333056, 'steps': 6942, 'loss/train': 0.7380708754062653} 01/27/2022 02:23:40 - INFO - codeparrot_training - Step 6943: {'lr': 0.00048703067813021323, 'samples': 1333248, 'steps': 6943, 'loss/train': 0.8530322313308716} 01/27/2022 02:23:43 - INFO - codeparrot_training - Step 6944: {'lr': 0.000487025475921008, 'samples': 1333440, 'steps': 6944, 'loss/train': 0.7525066137313843} 01/27/2022 02:23:47 - INFO - codeparrot_training - Step 6945: {'lr': 0.0004870202726964609, 'samples': 1333632, 'steps': 6945, 'loss/train': 0.6434643119573593} 01/27/2022 02:23:51 - INFO - codeparrot_training - Step 6946: {'lr': 0.0004870150684565943, 'samples': 1333824, 'steps': 6946, 'loss/train': 1.2878297567367554} 01/27/2022 02:23:54 - INFO - codeparrot_training - Step 6947: {'lr': 0.00048700986320143026, 'samples': 1334016, 'steps': 6947, 'loss/train': 1.2422113716602325} 01/27/2022 02:23:57 - INFO - codeparrot_training - Step 6948: {'lr': 0.0004870046569309913, 'samples': 1334208, 'steps': 6948, 'loss/train': 0.6873690783977509} 01/27/2022 02:24:00 - INFO - codeparrot_training - Step 6949: {'lr': 0.0004869994496452996, 'samples': 1334400, 'steps': 6949, 'loss/train': 0.7349257618188858} 01/27/2022 02:24:03 - INFO - codeparrot_training - Step 6950: {'lr': 0.0004869942413443776, 'samples': 1334592, 'steps': 6950, 'loss/train': 1.4240873157978058} 01/27/2022 02:24:06 - INFO - codeparrot_training - Step 6951: {'lr': 0.0004869890320282475, 'samples': 1334784, 'steps': 6951, 'loss/train': 0.712337538599968} 01/27/2022 02:24:09 - INFO - codeparrot_training - Step 6952: {'lr': 0.0004869838216969316, 'samples': 1334976, 'steps': 6952, 'loss/train': 1.2628947794437408} 01/27/2022 02:24:13 - INFO - codeparrot_training - Step 6953: {'lr': 0.0004869786103504523, 'samples': 1335168, 'steps': 6953, 'loss/train': 1.1976524591445923} 01/27/2022 02:24:18 - INFO - codeparrot_training - Step 6954: {'lr': 0.0004869733979888319, 'samples': 1335360, 'steps': 6954, 'loss/train': 0.9914439618587494} 01/27/2022 02:24:21 - INFO - codeparrot_training - Step 6955: {'lr': 0.00048696818461209265, 'samples': 1335552, 'steps': 6955, 'loss/train': 1.04831662774086} 01/27/2022 02:24:24 - INFO - codeparrot_training - Step 6956: {'lr': 0.0004869629702202569, 'samples': 1335744, 'steps': 6956, 'loss/train': 1.2932659685611725} 01/27/2022 02:24:27 - INFO - codeparrot_training - Step 6957: {'lr': 0.0004869577548133471, 'samples': 1335936, 'steps': 6957, 'loss/train': 1.17335906624794} 01/27/2022 02:24:30 - INFO - codeparrot_training - Step 6958: {'lr': 0.00048695253839138553, 'samples': 1336128, 'steps': 6958, 
'loss/train': 0.9331876337528229} 01/27/2022 02:24:33 - INFO - codeparrot_training - Step 6959: {'lr': 0.0004869473209543945, 'samples': 1336320, 'steps': 6959, 'loss/train': 0.8770639300346375} 01/27/2022 02:24:36 - INFO - codeparrot_training - Step 6960: {'lr': 0.00048694210250239646, 'samples': 1336512, 'steps': 6960, 'loss/train': 0.5495187789201736} 01/27/2022 02:24:40 - INFO - codeparrot_training - Step 6961: {'lr': 0.0004869368830354136, 'samples': 1336704, 'steps': 6961, 'loss/train': 0.8347639739513397} 01/27/2022 02:24:43 - INFO - codeparrot_training - Step 6962: {'lr': 0.00048693166255346843, 'samples': 1336896, 'steps': 6962, 'loss/train': 0.9809499084949493} 01/27/2022 02:24:47 - INFO - codeparrot_training - Step 6963: {'lr': 0.0004869264410565832, 'samples': 1337088, 'steps': 6963, 'loss/train': 1.434400051832199} 01/27/2022 02:24:50 - INFO - codeparrot_training - Step 6964: {'lr': 0.00048692121854478033, 'samples': 1337280, 'steps': 6964, 'loss/train': 0.7243424355983734} 01/27/2022 02:24:54 - INFO - codeparrot_training - Step 6965: {'lr': 0.00048691599501808223, 'samples': 1337472, 'steps': 6965, 'loss/train': 0.7071366012096405} 01/27/2022 02:24:57 - INFO - codeparrot_training - Step 6966: {'lr': 0.0004869107704765112, 'samples': 1337664, 'steps': 6966, 'loss/train': 0.6760248094797134} 01/27/2022 02:25:00 - INFO - codeparrot_training - Step 6967: {'lr': 0.00048690554492008967, 'samples': 1337856, 'steps': 6967, 'loss/train': 0.6175736635923386} 01/27/2022 02:25:03 - INFO - codeparrot_training - Step 6968: {'lr': 0.00048690031834884004, 'samples': 1338048, 'steps': 6968, 'loss/train': 1.13234943151474} 01/27/2022 02:25:06 - INFO - codeparrot_training - Step 6969: {'lr': 0.0004868950907627846, 'samples': 1338240, 'steps': 6969, 'loss/train': 0.9063209295272827} 01/27/2022 02:25:09 - INFO - codeparrot_training - Step 6970: {'lr': 0.00048688986216194585, 'samples': 1338432, 'steps': 6970, 'loss/train': 0.8041714131832123} 01/27/2022 02:25:12 - INFO - codeparrot_training - Step 6971: {'lr': 0.0004868846325463462, 'samples': 1338624, 'steps': 6971, 'loss/train': 0.7641225457191467} 01/27/2022 02:25:17 - INFO - codeparrot_training - Step 6972: {'lr': 0.000486879401916008, 'samples': 1338816, 'steps': 6972, 'loss/train': 0.9604758024215698} 01/27/2022 02:25:20 - INFO - codeparrot_training - Step 6973: {'lr': 0.0004868741702709536, 'samples': 1339008, 'steps': 6973, 'loss/train': 0.668142780661583} 01/27/2022 02:25:23 - INFO - codeparrot_training - Step 6974: {'lr': 0.0004868689376112055, 'samples': 1339200, 'steps': 6974, 'loss/train': 0.6860837191343307} 01/27/2022 02:25:26 - INFO - codeparrot_training - Step 6975: {'lr': 0.000486863703936786, 'samples': 1339392, 'steps': 6975, 'loss/train': 0.5790587514638901} 01/27/2022 02:25:30 - INFO - codeparrot_training - Step 6976: {'lr': 0.0004868584692477178, 'samples': 1339584, 'steps': 6976, 'loss/train': 0.4451320767402649} 01/27/2022 02:25:33 - INFO - codeparrot_training - Step 6977: {'lr': 0.000486853233544023, 'samples': 1339776, 'steps': 6977, 'loss/train': 1.0665097832679749} 01/27/2022 02:25:36 - INFO - codeparrot_training - Step 6978: {'lr': 0.0004868479968257241, 'samples': 1339968, 'steps': 6978, 'loss/train': 0.8840561807155609} 01/27/2022 02:25:39 - INFO - codeparrot_training - Step 6979: {'lr': 0.0004868427590928437, 'samples': 1340160, 'steps': 6979, 'loss/train': 1.196132630109787} 01/27/2022 02:25:42 - INFO - codeparrot_training - Step 6980: {'lr': 0.0004868375203454041, 'samples': 1340352, 'steps': 6980, 'loss/train': 
0.47554895281791687} 01/27/2022 02:25:47 - INFO - codeparrot_training - Step 6981: {'lr': 0.0004868322805834278, 'samples': 1340544, 'steps': 6981, 'loss/train': 0.854915052652359} 01/27/2022 02:25:50 - INFO - codeparrot_training - Step 6982: {'lr': 0.0004868270398069371, 'samples': 1340736, 'steps': 6982, 'loss/train': 0.9370940029621124} 01/27/2022 02:25:53 - INFO - codeparrot_training - Step 6983: {'lr': 0.0004868217980159546, 'samples': 1340928, 'steps': 6983, 'loss/train': 0.8963030576705933} 01/27/2022 02:25:56 - INFO - codeparrot_training - Step 6984: {'lr': 0.0004868165552105028, 'samples': 1341120, 'steps': 6984, 'loss/train': 0.5493615120649338} 01/27/2022 02:25:59 - INFO - codeparrot_training - Step 6985: {'lr': 0.000486811311390604, 'samples': 1341312, 'steps': 6985, 'loss/train': 0.9538052380084991} 01/27/2022 02:26:03 - INFO - codeparrot_training - Step 6986: {'lr': 0.0004868060665562808, 'samples': 1341504, 'steps': 6986, 'loss/train': 0.47645916044712067} 01/27/2022 02:26:06 - INFO - codeparrot_training - Step 6987: {'lr': 0.0004868008207075555, 'samples': 1341696, 'steps': 6987, 'loss/train': 1.286945790052414} 01/27/2022 02:26:09 - INFO - codeparrot_training - Step 6988: {'lr': 0.0004867955738444508, 'samples': 1341888, 'steps': 6988, 'loss/train': 0.7811146080493927} 01/27/2022 02:26:14 - INFO - codeparrot_training - Step 6989: {'lr': 0.000486790325966989, 'samples': 1342080, 'steps': 6989, 'loss/train': 0.9919016361236572} 01/27/2022 02:26:17 - INFO - codeparrot_training - Step 6990: {'lr': 0.0004867850770751926, 'samples': 1342272, 'steps': 6990, 'loss/train': 0.8156549334526062} 01/27/2022 02:26:20 - INFO - codeparrot_training - Step 6991: {'lr': 0.00048677982716908416, 'samples': 1342464, 'steps': 6991, 'loss/train': 0.9930292367935181} 01/27/2022 02:26:23 - INFO - codeparrot_training - Step 6992: {'lr': 0.0004867745762486861, 'samples': 1342656, 'steps': 6992, 'loss/train': 0.31894782185554504} 01/27/2022 02:26:26 - INFO - codeparrot_training - Step 6993: {'lr': 0.0004867693243140209, 'samples': 1342848, 'steps': 6993, 'loss/train': 0.916204959154129} 01/27/2022 02:26:30 - INFO - codeparrot_training - Step 6994: {'lr': 0.0004867640713651112, 'samples': 1343040, 'steps': 6994, 'loss/train': 1.5870124697685242} 01/27/2022 02:26:33 - INFO - codeparrot_training - Step 6995: {'lr': 0.0004867588174019794, 'samples': 1343232, 'steps': 6995, 'loss/train': 0.5113737434148788} 01/27/2022 02:26:36 - INFO - codeparrot_training - Step 6996: {'lr': 0.00048675356242464785, 'samples': 1343424, 'steps': 6996, 'loss/train': 1.136296570301056} 01/27/2022 02:26:39 - INFO - codeparrot_training - Step 6997: {'lr': 0.0004867483064331394, 'samples': 1343616, 'steps': 6997, 'loss/train': 0.7357611358165741} 01/27/2022 02:26:46 - INFO - codeparrot_training - Step 6998: {'lr': 0.00048674304942747626, 'samples': 1343808, 'steps': 6998, 'loss/train': 1.1846646666526794} 01/27/2022 02:26:49 - INFO - codeparrot_training - Step 6999: {'lr': 0.0004867377914076811, 'samples': 1344000, 'steps': 6999, 'loss/train': 0.47043129801750183} 01/27/2022 02:26:52 - INFO - codeparrot_training - Step 7000: {'lr': 0.00048673253237377644, 'samples': 1344192, 'steps': 7000, 'loss/train': 0.7253011018037796} 01/27/2022 02:26:55 - INFO - codeparrot_training - Step 7001: {'lr': 0.00048672727232578476, 'samples': 1344384, 'steps': 7001, 'loss/train': 0.8574353456497192} 01/27/2022 02:26:58 - INFO - codeparrot_training - Step 7002: {'lr': 0.0004867220112637286, 'samples': 1344576, 'steps': 7002, 'loss/train': 
0.10855559259653091} 01/27/2022 02:27:02 - INFO - codeparrot_training - Step 7003: {'lr': 0.00048671674918763055, 'samples': 1344768, 'steps': 7003, 'loss/train': 0.30647237598896027} 01/27/2022 02:27:05 - INFO - codeparrot_training - Step 7004: {'lr': 0.00048671148609751307, 'samples': 1344960, 'steps': 7004, 'loss/train': 1.659025490283966} 01/27/2022 02:27:08 - INFO - codeparrot_training - Step 7005: {'lr': 0.0004867062219933988, 'samples': 1345152, 'steps': 7005, 'loss/train': 0.9169272780418396} 01/27/2022 02:27:11 - INFO - codeparrot_training - Step 7006: {'lr': 0.00048670095687531023, 'samples': 1345344, 'steps': 7006, 'loss/train': 0.6714145392179489} 01/27/2022 02:27:16 - INFO - codeparrot_training - Step 7007: {'lr': 0.0004866956907432699, 'samples': 1345536, 'steps': 7007, 'loss/train': 0.7523446083068848} 01/27/2022 02:27:19 - INFO - codeparrot_training - Step 7008: {'lr': 0.00048669042359730043, 'samples': 1345728, 'steps': 7008, 'loss/train': 1.3790034055709839} 01/27/2022 02:27:22 - INFO - codeparrot_training - Step 7009: {'lr': 0.00048668515543742426, 'samples': 1345920, 'steps': 7009, 'loss/train': 0.0898800864815712} 01/27/2022 02:27:25 - INFO - codeparrot_training - Step 7010: {'lr': 0.0004866798862636641, 'samples': 1346112, 'steps': 7010, 'loss/train': 1.0381988883018494} 01/27/2022 02:27:28 - INFO - codeparrot_training - Step 7011: {'lr': 0.0004866746160760425, 'samples': 1346304, 'steps': 7011, 'loss/train': 0.8489904999732971} 01/27/2022 02:27:31 - INFO - codeparrot_training - Step 7012: {'lr': 0.0004866693448745819, 'samples': 1346496, 'steps': 7012, 'loss/train': 1.1078031957149506} 01/27/2022 02:27:34 - INFO - codeparrot_training - Step 7013: {'lr': 0.000486664072659305, 'samples': 1346688, 'steps': 7013, 'loss/train': 1.2477551400661469} 01/27/2022 02:27:38 - INFO - codeparrot_training - Step 7014: {'lr': 0.0004866587994302344, 'samples': 1346880, 'steps': 7014, 'loss/train': 0.9978205561637878} 01/27/2022 02:27:42 - INFO - codeparrot_training - Step 7015: {'lr': 0.0004866535251873926, 'samples': 1347072, 'steps': 7015, 'loss/train': 1.0439000129699707} 01/27/2022 02:27:45 - INFO - codeparrot_training - Step 7016: {'lr': 0.0004866482499308023, 'samples': 1347264, 'steps': 7016, 'loss/train': 0.5422042608261108} 01/27/2022 02:27:48 - INFO - codeparrot_training - Step 7017: {'lr': 0.000486642973660486, 'samples': 1347456, 'steps': 7017, 'loss/train': 0.7565733790397644} 01/27/2022 02:27:51 - INFO - codeparrot_training - Step 7018: {'lr': 0.00048663769637646636, 'samples': 1347648, 'steps': 7018, 'loss/train': 0.583957850933075} 01/27/2022 02:27:55 - INFO - codeparrot_training - Step 7019: {'lr': 0.000486632418078766, 'samples': 1347840, 'steps': 7019, 'loss/train': 0.8524070084095001} 01/27/2022 02:27:58 - INFO - codeparrot_training - Step 7020: {'lr': 0.0004866271387674075, 'samples': 1348032, 'steps': 7020, 'loss/train': 0.9440391361713409} 01/27/2022 02:28:01 - INFO - codeparrot_training - Step 7021: {'lr': 0.00048662185844241347, 'samples': 1348224, 'steps': 7021, 'loss/train': 0.9328208863735199} 01/27/2022 02:28:04 - INFO - codeparrot_training - Step 7022: {'lr': 0.00048661657710380647, 'samples': 1348416, 'steps': 7022, 'loss/train': 0.45343583822250366} 01/27/2022 02:28:07 - INFO - codeparrot_training - Step 7023: {'lr': 0.00048661129475160926, 'samples': 1348608, 'steps': 7023, 'loss/train': 0.6629778295755386} 01/27/2022 02:28:13 - INFO - codeparrot_training - Step 7024: {'lr': 0.00048660601138584436, 'samples': 1348800, 'steps': 7024, 'loss/train': 
0.7623049914836884} 01/27/2022 02:28:16 - INFO - codeparrot_training - Step 7025: {'lr': 0.00048660072700653446, 'samples': 1348992, 'steps': 7025, 'loss/train': 1.171348214149475} 01/27/2022 02:28:20 - INFO - codeparrot_training - Step 7026: {'lr': 0.0004865954416137022, 'samples': 1349184, 'steps': 7026, 'loss/train': 0.4315495043992996} 01/27/2022 02:28:23 - INFO - codeparrot_training - Step 7027: {'lr': 0.0004865901552073701, 'samples': 1349376, 'steps': 7027, 'loss/train': 0.9421841204166412} 01/27/2022 02:28:26 - INFO - codeparrot_training - Step 7028: {'lr': 0.00048658486778756097, 'samples': 1349568, 'steps': 7028, 'loss/train': 1.1052480340003967} 01/27/2022 02:28:29 - INFO - codeparrot_training - Step 7029: {'lr': 0.00048657957935429734, 'samples': 1349760, 'steps': 7029, 'loss/train': 0.6264131516218185} 01/27/2022 02:28:32 - INFO - codeparrot_training - Step 7030: {'lr': 0.000486574289907602, 'samples': 1349952, 'steps': 7030, 'loss/train': 0.8489170968532562} 01/27/2022 02:28:35 - INFO - codeparrot_training - Step 7031: {'lr': 0.0004865689994474974, 'samples': 1350144, 'steps': 7031, 'loss/train': 1.0780152082443237} 01/27/2022 02:28:38 - INFO - codeparrot_training - Step 7032: {'lr': 0.00048656370797400643, 'samples': 1350336, 'steps': 7032, 'loss/train': 0.7796342968940735} 01/27/2022 02:28:43 - INFO - codeparrot_training - Step 7033: {'lr': 0.00048655841548715163, 'samples': 1350528, 'steps': 7033, 'loss/train': 0.7483496367931366} 01/27/2022 02:28:46 - INFO - codeparrot_training - Step 7034: {'lr': 0.00048655312198695567, 'samples': 1350720, 'steps': 7034, 'loss/train': 0.47734974324703217} 01/27/2022 02:28:49 - INFO - codeparrot_training - Step 7035: {'lr': 0.00048654782747344126, 'samples': 1350912, 'steps': 7035, 'loss/train': 0.9437796771526337} 01/27/2022 02:28:52 - INFO - codeparrot_training - Step 7036: {'lr': 0.00048654253194663113, 'samples': 1351104, 'steps': 7036, 'loss/train': 0.775248259305954} 01/27/2022 02:28:55 - INFO - codeparrot_training - Step 7037: {'lr': 0.0004865372354065478, 'samples': 1351296, 'steps': 7037, 'loss/train': 0.7034461051225662} 01/27/2022 02:28:59 - INFO - codeparrot_training - Step 7038: {'lr': 0.00048653193785321415, 'samples': 1351488, 'steps': 7038, 'loss/train': 0.7066497802734375} 01/27/2022 02:29:02 - INFO - codeparrot_training - Step 7039: {'lr': 0.00048652663928665273, 'samples': 1351680, 'steps': 7039, 'loss/train': 0.8305737376213074} 01/27/2022 02:29:05 - INFO - codeparrot_training - Step 7040: {'lr': 0.00048652133970688633, 'samples': 1351872, 'steps': 7040, 'loss/train': 1.1145063042640686} 01/27/2022 02:29:08 - INFO - codeparrot_training - Step 7041: {'lr': 0.0004865160391139376, 'samples': 1352064, 'steps': 7041, 'loss/train': 1.368796706199646} 01/27/2022 02:29:14 - INFO - codeparrot_training - Step 7042: {'lr': 0.0004865107375078293, 'samples': 1352256, 'steps': 7042, 'loss/train': 0.8053771555423737} 01/27/2022 02:29:17 - INFO - codeparrot_training - Step 7043: {'lr': 0.000486505434888584, 'samples': 1352448, 'steps': 7043, 'loss/train': 1.0086980760097504} 01/27/2022 02:29:20 - INFO - codeparrot_training - Step 7044: {'lr': 0.0004865001312562246, 'samples': 1352640, 'steps': 7044, 'loss/train': 0.8749274611473083} 01/27/2022 02:29:24 - INFO - codeparrot_training - Step 7045: {'lr': 0.0004864948266107737, 'samples': 1352832, 'steps': 7045, 'loss/train': 0.8332645297050476} 01/27/2022 02:29:27 - INFO - codeparrot_training - Step 7046: {'lr': 0.0004864895209522541, 'samples': 1353024, 'steps': 7046, 'loss/train': 
0.5742085129022598} 01/27/2022 02:29:30 - INFO - codeparrot_training - Step 7047: {'lr': 0.00048648421428068843, 'samples': 1353216, 'steps': 7047, 'loss/train': 0.43846888840198517} 01/27/2022 02:29:33 - INFO - codeparrot_training - Step 7048: {'lr': 0.0004864789065960995, 'samples': 1353408, 'steps': 7048, 'loss/train': 1.1164068281650543} 01/27/2022 02:29:36 - INFO - codeparrot_training - Step 7049: {'lr': 0.00048647359789851, 'samples': 1353600, 'steps': 7049, 'loss/train': 1.0281902253627777} 01/27/2022 02:29:40 - INFO - codeparrot_training - Step 7050: {'lr': 0.00048646828818794274, 'samples': 1353792, 'steps': 7050, 'loss/train': 1.2328515350818634} 01/27/2022 02:29:44 - INFO - codeparrot_training - Step 7051: {'lr': 0.00048646297746442044, 'samples': 1353984, 'steps': 7051, 'loss/train': 0.8365853726863861} 01/27/2022 02:29:47 - INFO - codeparrot_training - Step 7052: {'lr': 0.0004864576657279658, 'samples': 1354176, 'steps': 7052, 'loss/train': 0.9214855134487152} 01/27/2022 02:29:50 - INFO - codeparrot_training - Step 7053: {'lr': 0.0004864523529786016, 'samples': 1354368, 'steps': 7053, 'loss/train': 0.5031356960535049} 01/27/2022 02:29:53 - INFO - codeparrot_training - Step 7054: {'lr': 0.0004864470392163506, 'samples': 1354560, 'steps': 7054, 'loss/train': 1.1025226414203644} 01/27/2022 02:29:56 - INFO - codeparrot_training - Step 7055: {'lr': 0.0004864417244412355, 'samples': 1354752, 'steps': 7055, 'loss/train': 0.531793087720871} 01/27/2022 02:29:59 - INFO - codeparrot_training - Step 7056: {'lr': 0.0004864364086532792, 'samples': 1354944, 'steps': 7056, 'loss/train': 0.8962186574935913} 01/27/2022 02:30:03 - INFO - codeparrot_training - Step 7057: {'lr': 0.00048643109185250445, 'samples': 1355136, 'steps': 7057, 'loss/train': 0.7702920734882355} 01/27/2022 02:30:06 - INFO - codeparrot_training - Step 7058: {'lr': 0.0004864257740389338, 'samples': 1355328, 'steps': 7058, 'loss/train': 0.5575747489929199} 01/27/2022 02:30:10 - INFO - codeparrot_training - Step 7059: {'lr': 0.00048642045521259044, 'samples': 1355520, 'steps': 7059, 'loss/train': 1.5289160013198853} 01/27/2022 02:30:13 - INFO - codeparrot_training - Step 7060: {'lr': 0.0004864151353734968, 'samples': 1355712, 'steps': 7060, 'loss/train': 0.8109969198703766} 01/27/2022 02:30:17 - INFO - codeparrot_training - Step 7061: {'lr': 0.0004864098145216758, 'samples': 1355904, 'steps': 7061, 'loss/train': 1.117045283317566} 01/27/2022 02:30:20 - INFO - codeparrot_training - Step 7062: {'lr': 0.0004864044926571503, 'samples': 1356096, 'steps': 7062, 'loss/train': 0.6586803495883942} 01/27/2022 02:30:23 - INFO - codeparrot_training - Step 7063: {'lr': 0.00048639916977994286, 'samples': 1356288, 'steps': 7063, 'loss/train': 0.9248684942722321} 01/27/2022 02:30:26 - INFO - codeparrot_training - Step 7064: {'lr': 0.0004863938458900765, 'samples': 1356480, 'steps': 7064, 'loss/train': 1.0296691060066223} 01/27/2022 02:30:29 - INFO - codeparrot_training - Step 7065: {'lr': 0.000486388520987574, 'samples': 1356672, 'steps': 7065, 'loss/train': 0.9621586203575134} 01/27/2022 02:30:32 - INFO - codeparrot_training - Step 7066: {'lr': 0.0004863831950724582, 'samples': 1356864, 'steps': 7066, 'loss/train': 1.2906014621257782} 01/27/2022 02:30:35 - INFO - codeparrot_training - Step 7067: {'lr': 0.00048637786814475175, 'samples': 1357056, 'steps': 7067, 'loss/train': 0.8971104025840759} 01/27/2022 02:30:42 - INFO - codeparrot_training - Step 7068: {'lr': 0.0004863725402044776, 'samples': 1357248, 'steps': 7068, 'loss/train': 
0.9786682426929474} 01/27/2022 02:30:45 - INFO - codeparrot_training - Step 7069: {'lr': 0.00048636721125165855, 'samples': 1357440, 'steps': 7069, 'loss/train': 0.6176658868789673} 01/27/2022 02:30:48 - INFO - codeparrot_training - Step 7070: {'lr': 0.0004863618812863174, 'samples': 1357632, 'steps': 7070, 'loss/train': 1.3479672074317932} 01/27/2022 02:30:51 - INFO - codeparrot_training - Step 7071: {'lr': 0.0004863565503084771, 'samples': 1357824, 'steps': 7071, 'loss/train': 0.881901204586029} 01/27/2022 02:30:54 - INFO - codeparrot_training - Step 7072: {'lr': 0.0004863512183181603, 'samples': 1358016, 'steps': 7072, 'loss/train': 0.7578834593296051} 01/27/2022 02:30:58 - INFO - codeparrot_training - Step 7073: {'lr': 0.0004863458853153899, 'samples': 1358208, 'steps': 7073, 'loss/train': 0.8719675540924072} 01/27/2022 02:31:01 - INFO - codeparrot_training - Step 7074: {'lr': 0.00048634055130018886, 'samples': 1358400, 'steps': 7074, 'loss/train': 1.6821556091308594} 01/27/2022 02:31:04 - INFO - codeparrot_training - Step 7075: {'lr': 0.00048633521627257993, 'samples': 1358592, 'steps': 7075, 'loss/train': 0.6519093811511993} 01/27/2022 02:31:07 - INFO - codeparrot_training - Step 7076: {'lr': 0.00048632988023258596, 'samples': 1358784, 'steps': 7076, 'loss/train': 0.9689817130565643} 01/27/2022 02:31:11 - INFO - codeparrot_training - Step 7077: {'lr': 0.0004863245431802298, 'samples': 1358976, 'steps': 7077, 'loss/train': 0.9395830929279327} 01/27/2022 02:31:14 - INFO - codeparrot_training - Step 7078: {'lr': 0.0004863192051155344, 'samples': 1359168, 'steps': 7078, 'loss/train': 0.920956939458847} 01/27/2022 02:31:18 - INFO - codeparrot_training - Step 7079: {'lr': 0.0004863138660385225, 'samples': 1359360, 'steps': 7079, 'loss/train': 0.8000333905220032} 01/27/2022 02:31:21 - INFO - codeparrot_training - Step 7080: {'lr': 0.00048630852594921703, 'samples': 1359552, 'steps': 7080, 'loss/train': 0.8500598967075348} 01/27/2022 02:31:24 - INFO - codeparrot_training - Step 7081: {'lr': 0.00048630318484764093, 'samples': 1359744, 'steps': 7081, 'loss/train': 0.6295649260282516} 01/27/2022 02:31:27 - INFO - codeparrot_training - Step 7082: {'lr': 0.000486297842733817, 'samples': 1359936, 'steps': 7082, 'loss/train': 1.562120497226715} 01/27/2022 02:31:30 - INFO - codeparrot_training - Step 7083: {'lr': 0.0004862924996077682, 'samples': 1360128, 'steps': 7083, 'loss/train': 0.9840722680091858} 01/27/2022 02:31:33 - INFO - codeparrot_training - Step 7084: {'lr': 0.0004862871554695173, 'samples': 1360320, 'steps': 7084, 'loss/train': 0.6818490028381348} 01/27/2022 02:31:37 - INFO - codeparrot_training - Step 7085: {'lr': 0.00048628181031908725, 'samples': 1360512, 'steps': 7085, 'loss/train': 0.6218547374010086} 01/27/2022 02:31:41 - INFO - codeparrot_training - Step 7086: {'lr': 0.00048627646415650094, 'samples': 1360704, 'steps': 7086, 'loss/train': 0.6161086857318878} 01/27/2022 02:31:44 - INFO - codeparrot_training - Step 7087: {'lr': 0.0004862711169817813, 'samples': 1360896, 'steps': 7087, 'loss/train': 1.2349650263786316} 01/27/2022 02:31:47 - INFO - codeparrot_training - Step 7088: {'lr': 0.0004862657687949512, 'samples': 1361088, 'steps': 7088, 'loss/train': 1.1284984052181244} 01/27/2022 02:31:51 - INFO - codeparrot_training - Step 7089: {'lr': 0.0004862604195960336, 'samples': 1361280, 'steps': 7089, 'loss/train': 0.7644109725952148} 01/27/2022 02:31:54 - INFO - codeparrot_training - Step 7090: {'lr': 0.00048625506938505136, 'samples': 1361472, 'steps': 7090, 'loss/train': 
0.7567308247089386} 01/27/2022 02:31:57 - INFO - codeparrot_training - Step 7091: {'lr': 0.00048624971816202747, 'samples': 1361664, 'steps': 7091, 'loss/train': 0.8294719755649567} 01/27/2022 02:32:00 - INFO - codeparrot_training - Step 7092: {'lr': 0.0004862443659269848, 'samples': 1361856, 'steps': 7092, 'loss/train': 0.7796919643878937} 01/27/2022 02:32:03 - INFO - codeparrot_training - Step 7093: {'lr': 0.00048623901267994625, 'samples': 1362048, 'steps': 7093, 'loss/train': 0.8870155513286591} 01/27/2022 02:32:07 - INFO - codeparrot_training - Step 7094: {'lr': 0.00048623365842093483, 'samples': 1362240, 'steps': 7094, 'loss/train': 0.5835224837064743} 01/27/2022 02:32:11 - INFO - codeparrot_training - Step 7095: {'lr': 0.00048622830314997334, 'samples': 1362432, 'steps': 7095, 'loss/train': 1.2750446498394012} 01/27/2022 02:32:14 - INFO - codeparrot_training - Step 7096: {'lr': 0.0004862229468670849, 'samples': 1362624, 'steps': 7096, 'loss/train': 1.1415424346923828} 01/27/2022 02:32:17 - INFO - codeparrot_training - Step 7097: {'lr': 0.0004862175895722923, 'samples': 1362816, 'steps': 7097, 'loss/train': 0.5570892244577408} 01/27/2022 02:32:20 - INFO - codeparrot_training - Step 7098: {'lr': 0.0004862122312656186, 'samples': 1363008, 'steps': 7098, 'loss/train': 0.8135865926742554} 01/27/2022 02:32:23 - INFO - codeparrot_training - Step 7099: {'lr': 0.0004862068719470867, 'samples': 1363200, 'steps': 7099, 'loss/train': 0.780255138874054} 01/27/2022 02:32:26 - INFO - codeparrot_training - Step 7100: {'lr': 0.00048620151161671955, 'samples': 1363392, 'steps': 7100, 'loss/train': 0.9656323492527008} 01/27/2022 02:32:29 - INFO - codeparrot_training - Step 7101: {'lr': 0.0004861961502745401, 'samples': 1363584, 'steps': 7101, 'loss/train': 0.9002433121204376} 01/27/2022 02:32:33 - INFO - codeparrot_training - Step 7102: {'lr': 0.00048619078792057135, 'samples': 1363776, 'steps': 7102, 'loss/train': 0.9724436402320862} 01/27/2022 02:32:39 - INFO - codeparrot_training - Step 7103: {'lr': 0.00048618542455483625, 'samples': 1363968, 'steps': 7103, 'loss/train': 1.3247029781341553} 01/27/2022 02:32:42 - INFO - codeparrot_training - Step 7104: {'lr': 0.0004861800601773579, 'samples': 1364160, 'steps': 7104, 'loss/train': 0.75620037317276} 01/27/2022 02:32:46 - INFO - codeparrot_training - Step 7105: {'lr': 0.00048617469478815905, 'samples': 1364352, 'steps': 7105, 'loss/train': 0.8808561265468597} 01/27/2022 02:32:49 - INFO - codeparrot_training - Step 7106: {'lr': 0.00048616932838726286, 'samples': 1364544, 'steps': 7106, 'loss/train': 0.6935335993766785} 01/27/2022 02:32:52 - INFO - codeparrot_training - Step 7107: {'lr': 0.0004861639609746923, 'samples': 1364736, 'steps': 7107, 'loss/train': 1.7614440321922302} 01/27/2022 02:32:55 - INFO - codeparrot_training - Step 7108: {'lr': 0.0004861585925504702, 'samples': 1364928, 'steps': 7108, 'loss/train': 0.5752880573272705} 01/27/2022 02:32:58 - INFO - codeparrot_training - Step 7109: {'lr': 0.00048615322311461973, 'samples': 1365120, 'steps': 7109, 'loss/train': 1.5272546410560608} 01/27/2022 02:33:01 - INFO - codeparrot_training - Step 7110: {'lr': 0.0004861478526671639, 'samples': 1365312, 'steps': 7110, 'loss/train': 0.7469163984060287} 01/27/2022 02:33:04 - INFO - codeparrot_training - Step 7111: {'lr': 0.0004861424812081256, 'samples': 1365504, 'steps': 7111, 'loss/train': 0.22386742383241653} 01/27/2022 02:33:08 - INFO - codeparrot_training - Step 7112: {'lr': 0.0004861371087375279, 'samples': 1365696, 'steps': 7112, 'loss/train': 
0.8690055012702942} 01/27/2022 02:33:12 - INFO - codeparrot_training - Step 7113: {'lr': 0.0004861317352553938, 'samples': 1365888, 'steps': 7113, 'loss/train': 1.1219244003295898} 01/27/2022 02:33:15 - INFO - codeparrot_training - Step 7114: {'lr': 0.0004861263607617463, 'samples': 1366080, 'steps': 7114, 'loss/train': 1.2382054030895233} 01/27/2022 02:33:18 - INFO - codeparrot_training - Step 7115: {'lr': 0.00048612098525660855, 'samples': 1366272, 'steps': 7115, 'loss/train': 0.5857314616441727} 01/27/2022 02:33:22 - INFO - codeparrot_training - Step 7116: {'lr': 0.00048611560874000335, 'samples': 1366464, 'steps': 7116, 'loss/train': 0.934104323387146} 01/27/2022 02:33:25 - INFO - codeparrot_training - Step 7117: {'lr': 0.000486110231211954, 'samples': 1366656, 'steps': 7117, 'loss/train': 0.9740291833877563} 01/27/2022 02:33:28 - INFO - codeparrot_training - Step 7118: {'lr': 0.0004861048526724833, 'samples': 1366848, 'steps': 7118, 'loss/train': 1.2945263385772705} 01/27/2022 02:33:31 - INFO - codeparrot_training - Step 7119: {'lr': 0.00048609947312161435, 'samples': 1367040, 'steps': 7119, 'loss/train': 0.7893727719783783} 01/27/2022 02:33:34 - INFO - codeparrot_training - Step 7120: {'lr': 0.0004860940925593703, 'samples': 1367232, 'steps': 7120, 'loss/train': 0.8361429870128632} 01/27/2022 02:33:37 - INFO - codeparrot_training - Step 7121: {'lr': 0.0004860887109857741, 'samples': 1367424, 'steps': 7121, 'loss/train': 0.8348410427570343} 01/27/2022 02:33:43 - INFO - codeparrot_training - Step 7122: {'lr': 0.0004860833284008488, 'samples': 1367616, 'steps': 7122, 'loss/train': 0.7578361630439758} 01/27/2022 02:33:46 - INFO - codeparrot_training - Step 7123: {'lr': 0.00048607794480461753, 'samples': 1367808, 'steps': 7123, 'loss/train': 0.7895668745040894} 01/27/2022 02:33:50 - INFO - codeparrot_training - Step 7124: {'lr': 0.00048607256019710327, 'samples': 1368000, 'steps': 7124, 'loss/train': 1.1694990992546082} 01/27/2022 02:33:53 - INFO - codeparrot_training - Step 7125: {'lr': 0.0004860671745783292, 'samples': 1368192, 'steps': 7125, 'loss/train': 0.7647014558315277} 01/27/2022 02:33:56 - INFO - codeparrot_training - Step 7126: {'lr': 0.0004860617879483182, 'samples': 1368384, 'steps': 7126, 'loss/train': 0.9836832582950592} 01/27/2022 02:33:59 - INFO - codeparrot_training - Step 7127: {'lr': 0.0004860564003070935, 'samples': 1368576, 'steps': 7127, 'loss/train': 0.6352794617414474} 01/27/2022 02:34:02 - INFO - codeparrot_training - Step 7128: {'lr': 0.00048605101165467813, 'samples': 1368768, 'steps': 7128, 'loss/train': 0.9221675097942352} 01/27/2022 02:34:05 - INFO - codeparrot_training - Step 7129: {'lr': 0.00048604562199109524, 'samples': 1368960, 'steps': 7129, 'loss/train': 0.6002705097198486} 01/27/2022 02:34:10 - INFO - codeparrot_training - Step 7130: {'lr': 0.00048604023131636784, 'samples': 1369152, 'steps': 7130, 'loss/train': 1.0208000242710114} 01/27/2022 02:34:13 - INFO - codeparrot_training - Step 7131: {'lr': 0.00048603483963051896, 'samples': 1369344, 'steps': 7131, 'loss/train': 1.2856314182281494} 01/27/2022 02:34:16 - INFO - codeparrot_training - Step 7132: {'lr': 0.0004860294469335719, 'samples': 1369536, 'steps': 7132, 'loss/train': 0.7396314293146133} 01/27/2022 02:34:19 - INFO - codeparrot_training - Step 7133: {'lr': 0.00048602405322554956, 'samples': 1369728, 'steps': 7133, 'loss/train': 0.7963255047798157} 01/27/2022 02:34:22 - INFO - codeparrot_training - Step 7134: {'lr': 0.00048601865850647516, 'samples': 1369920, 'steps': 7134, 'loss/train': 
0.8722293376922607} 01/27/2022 02:34:26 - INFO - codeparrot_training - Step 7135: {'lr': 0.0004860132627763717, 'samples': 1370112, 'steps': 7135, 'loss/train': 0.770694226026535} 01/27/2022 02:34:29 - INFO - codeparrot_training - Step 7136: {'lr': 0.0004860078660352625, 'samples': 1370304, 'steps': 7136, 'loss/train': 0.8171348869800568} 01/27/2022 02:34:32 - INFO - codeparrot_training - Step 7137: {'lr': 0.0004860024682831704, 'samples': 1370496, 'steps': 7137, 'loss/train': 0.8375372886657715} 01/27/2022 02:34:35 - INFO - codeparrot_training - Step 7138: {'lr': 0.0004859970695201187, 'samples': 1370688, 'steps': 7138, 'loss/train': 0.8991466462612152} 01/27/2022 02:34:40 - INFO - codeparrot_training - Step 7139: {'lr': 0.00048599166974613053, 'samples': 1370880, 'steps': 7139, 'loss/train': 1.1681253612041473} 01/27/2022 02:34:43 - INFO - codeparrot_training - Step 7140: {'lr': 0.000485986268961229, 'samples': 1371072, 'steps': 7140, 'loss/train': 0.9684716463088989} 01/27/2022 02:34:46 - INFO - codeparrot_training - Step 7141: {'lr': 0.0004859808671654372, 'samples': 1371264, 'steps': 7141, 'loss/train': 0.6611077934503555} 01/27/2022 02:34:49 - INFO - codeparrot_training - Step 7142: {'lr': 0.00048597546435877824, 'samples': 1371456, 'steps': 7142, 'loss/train': 0.9416018128395081} 01/27/2022 02:34:52 - INFO - codeparrot_training - Step 7143: {'lr': 0.0004859700605412754, 'samples': 1371648, 'steps': 7143, 'loss/train': 1.1977836191654205} 01/27/2022 02:34:55 - INFO - codeparrot_training - Step 7144: {'lr': 0.0004859646557129517, 'samples': 1371840, 'steps': 7144, 'loss/train': 1.1153182089328766} 01/27/2022 02:34:58 - INFO - codeparrot_training - Step 7145: {'lr': 0.0004859592498738304, 'samples': 1372032, 'steps': 7145, 'loss/train': 0.8368942737579346} 01/27/2022 02:35:02 - INFO - codeparrot_training - Step 7146: {'lr': 0.00048595384302393453, 'samples': 1372224, 'steps': 7146, 'loss/train': 1.1730632185935974} 01/27/2022 02:35:05 - INFO - codeparrot_training - Step 7147: {'lr': 0.00048594843516328734, 'samples': 1372416, 'steps': 7147, 'loss/train': 0.5384864956140518} 01/27/2022 02:35:11 - INFO - codeparrot_training - Step 7148: {'lr': 0.000485943026291912, 'samples': 1372608, 'steps': 7148, 'loss/train': 1.071938306093216} 01/27/2022 02:35:14 - INFO - codeparrot_training - Step 7149: {'lr': 0.0004859376164098317, 'samples': 1372800, 'steps': 7149, 'loss/train': 0.980717808008194} 01/27/2022 02:35:17 - INFO - codeparrot_training - Step 7150: {'lr': 0.0004859322055170695, 'samples': 1372992, 'steps': 7150, 'loss/train': 1.0840903222560883} 01/27/2022 02:35:21 - INFO - codeparrot_training - Step 7151: {'lr': 0.00048592679361364867, 'samples': 1373184, 'steps': 7151, 'loss/train': 0.8125215768814087} 01/27/2022 02:35:24 - INFO - codeparrot_training - Step 7152: {'lr': 0.00048592138069959235, 'samples': 1373376, 'steps': 7152, 'loss/train': 0.9388226866722107} 01/27/2022 02:35:27 - INFO - codeparrot_training - Step 7153: {'lr': 0.0004859159667749238, 'samples': 1373568, 'steps': 7153, 'loss/train': 0.9804883003234863} 01/27/2022 02:35:30 - INFO - codeparrot_training - Step 7154: {'lr': 0.000485910551839666, 'samples': 1373760, 'steps': 7154, 'loss/train': 1.2771230041980743} 01/27/2022 02:35:33 - INFO - codeparrot_training - Step 7155: {'lr': 0.0004859051358938425, 'samples': 1373952, 'steps': 7155, 'loss/train': 0.22658801078796387} 01/27/2022 02:35:36 - INFO - codeparrot_training - Step 7156: {'lr': 0.00048589971893747626, 'samples': 1374144, 'steps': 7156, 'loss/train': 
1.171486347913742} 01/27/2022 02:35:41 - INFO - codeparrot_training - Step 7157: {'lr': 0.0004858943009705905, 'samples': 1374336, 'steps': 7157, 'loss/train': 1.109773725271225} 01/27/2022 02:35:44 - INFO - codeparrot_training - Step 7158: {'lr': 0.00048588888199320847, 'samples': 1374528, 'steps': 7158, 'loss/train': 1.4451899528503418} 01/27/2022 02:35:47 - INFO - codeparrot_training - Step 7159: {'lr': 0.0004858834620053534, 'samples': 1374720, 'steps': 7159, 'loss/train': 1.5316398739814758} 01/27/2022 02:35:50 - INFO - codeparrot_training - Step 7160: {'lr': 0.0004858780410070484, 'samples': 1374912, 'steps': 7160, 'loss/train': 0.48436814546585083} 01/27/2022 02:35:53 - INFO - codeparrot_training - Step 7161: {'lr': 0.0004858726189983168, 'samples': 1375104, 'steps': 7161, 'loss/train': 0.5391566008329391} 01/27/2022 02:35:57 - INFO - codeparrot_training - Step 7162: {'lr': 0.00048586719597918185, 'samples': 1375296, 'steps': 7162, 'loss/train': 1.3225091099739075} 01/27/2022 02:36:00 - INFO - codeparrot_training - Step 7163: {'lr': 0.0004858617719496667, 'samples': 1375488, 'steps': 7163, 'loss/train': 1.1502639949321747} 01/27/2022 02:36:03 - INFO - codeparrot_training - Step 7164: {'lr': 0.0004858563469097946, 'samples': 1375680, 'steps': 7164, 'loss/train': 0.8957589268684387} 01/27/2022 02:36:06 - INFO - codeparrot_training - Step 7165: {'lr': 0.0004858509208595888, 'samples': 1375872, 'steps': 7165, 'loss/train': 0.8289715647697449} 01/27/2022 02:36:11 - INFO - codeparrot_training - Step 7166: {'lr': 0.0004858454937990726, 'samples': 1376064, 'steps': 7166, 'loss/train': 0.8547893464565277} 01/27/2022 02:36:14 - INFO - codeparrot_training - Step 7167: {'lr': 0.0004858400657282691, 'samples': 1376256, 'steps': 7167, 'loss/train': 0.9085818529129028} 01/27/2022 02:36:17 - INFO - codeparrot_training - Step 7168: {'lr': 0.00048583463664720174, 'samples': 1376448, 'steps': 7168, 'loss/train': 0.5989874750375748} 01/27/2022 02:36:20 - INFO - codeparrot_training - Step 7169: {'lr': 0.00048582920655589366, 'samples': 1376640, 'steps': 7169, 'loss/train': 0.6566199213266373} 01/27/2022 02:36:23 - INFO - codeparrot_training - Step 7170: {'lr': 0.0004858237754543681, 'samples': 1376832, 'steps': 7170, 'loss/train': 1.011367678642273} 01/27/2022 02:36:26 - INFO - codeparrot_training - Step 7171: {'lr': 0.0004858183433426484, 'samples': 1377024, 'steps': 7171, 'loss/train': 1.2293698489665985} 01/27/2022 02:36:29 - INFO - codeparrot_training - Step 7172: {'lr': 0.0004858129102207578, 'samples': 1377216, 'steps': 7172, 'loss/train': 1.2130979597568512} 01/27/2022 02:36:33 - INFO - codeparrot_training - Step 7173: {'lr': 0.00048580747608871955, 'samples': 1377408, 'steps': 7173, 'loss/train': 0.6708489060401917} 01/27/2022 02:36:39 - INFO - codeparrot_training - Step 7174: {'lr': 0.000485802040946557, 'samples': 1377600, 'steps': 7174, 'loss/train': 1.077446311712265} 01/27/2022 02:36:42 - INFO - codeparrot_training - Step 7175: {'lr': 0.00048579660479429335, 'samples': 1377792, 'steps': 7175, 'loss/train': 0.8104543089866638} 01/27/2022 02:36:45 - INFO - codeparrot_training - Step 7176: {'lr': 0.00048579116763195184, 'samples': 1377984, 'steps': 7176, 'loss/train': 0.9992428421974182} 01/27/2022 02:36:48 - INFO - codeparrot_training - Step 7177: {'lr': 0.00048578572945955594, 'samples': 1378176, 'steps': 7177, 'loss/train': 0.5624056309461594} 01/27/2022 02:36:51 - INFO - codeparrot_training - Step 7178: {'lr': 0.00048578029027712883, 'samples': 1378368, 'steps': 7178, 'loss/train': 
1.34758660197258} 01/27/2022 02:36:54 - INFO - codeparrot_training - Step 7179: {'lr': 0.0004857748500846938, 'samples': 1378560, 'steps': 7179, 'loss/train': 0.8387890756130219} 01/27/2022 02:36:58 - INFO - codeparrot_training - Step 7180: {'lr': 0.0004857694088822742, 'samples': 1378752, 'steps': 7180, 'loss/train': 0.918486088514328} 01/27/2022 02:37:01 - INFO - codeparrot_training - Step 7181: {'lr': 0.00048576396666989333, 'samples': 1378944, 'steps': 7181, 'loss/train': 0.9395235478878021} 01/27/2022 02:37:04 - INFO - codeparrot_training - Step 7182: {'lr': 0.0004857585234475745, 'samples': 1379136, 'steps': 7182, 'loss/train': 0.7742874920368195} 01/27/2022 02:37:08 - INFO - codeparrot_training - Step 7183: {'lr': 0.00048575307921534095, 'samples': 1379328, 'steps': 7183, 'loss/train': 0.5382783114910126} 01/27/2022 02:37:12 - INFO - codeparrot_training - Step 7184: {'lr': 0.0004857476339732161, 'samples': 1379520, 'steps': 7184, 'loss/train': 0.5918804258108139} 01/27/2022 02:37:15 - INFO - codeparrot_training - Step 7185: {'lr': 0.0004857421877212233, 'samples': 1379712, 'steps': 7185, 'loss/train': 0.9641759097576141} 01/27/2022 02:37:18 - INFO - codeparrot_training - Step 7186: {'lr': 0.00048573674045938577, 'samples': 1379904, 'steps': 7186, 'loss/train': 0.7925241887569427} 01/27/2022 02:37:21 - INFO - codeparrot_training - Step 7187: {'lr': 0.00048573129218772686, 'samples': 1380096, 'steps': 7187, 'loss/train': 0.526454284787178} 01/27/2022 02:37:24 - INFO - codeparrot_training - Step 7188: {'lr': 0.00048572584290627, 'samples': 1380288, 'steps': 7188, 'loss/train': 1.760161578655243} 01/27/2022 02:37:27 - INFO - codeparrot_training - Step 7189: {'lr': 0.00048572039261503855, 'samples': 1380480, 'steps': 7189, 'loss/train': 0.2658250853419304} 01/27/2022 02:37:30 - INFO - codeparrot_training - Step 7190: {'lr': 0.00048571494131405567, 'samples': 1380672, 'steps': 7190, 'loss/train': 0.9239501953125} 01/27/2022 02:37:34 - INFO - codeparrot_training - Step 7191: {'lr': 0.0004857094890033449, 'samples': 1380864, 'steps': 7191, 'loss/train': 0.48016244173049927} 01/27/2022 02:37:40 - INFO - codeparrot_training - Step 7192: {'lr': 0.0004857040356829295, 'samples': 1381056, 'steps': 7192, 'loss/train': 0.6303853690624237} 01/27/2022 02:37:43 - INFO - codeparrot_training - Step 7193: {'lr': 0.00048569858135283285, 'samples': 1381248, 'steps': 7193, 'loss/train': 1.4303488433361053} 01/27/2022 02:37:46 - INFO - codeparrot_training - Step 7194: {'lr': 0.00048569312601307827, 'samples': 1381440, 'steps': 7194, 'loss/train': 0.8217730522155762} 01/27/2022 02:37:49 - INFO - codeparrot_training - Step 7195: {'lr': 0.00048568766966368925, 'samples': 1381632, 'steps': 7195, 'loss/train': 0.9150716364383698} 01/27/2022 02:37:52 - INFO - codeparrot_training - Step 7196: {'lr': 0.00048568221230468905, 'samples': 1381824, 'steps': 7196, 'loss/train': 0.9793284237384796} 01/27/2022 02:37:56 - INFO - codeparrot_training - Step 7197: {'lr': 0.0004856767539361011, 'samples': 1382016, 'steps': 7197, 'loss/train': 0.8352527618408203} 01/27/2022 02:37:59 - INFO - codeparrot_training - Step 7198: {'lr': 0.0004856712945579488, 'samples': 1382208, 'steps': 7198, 'loss/train': 0.8949092030525208} 01/27/2022 02:38:02 - INFO - codeparrot_training - Step 7199: {'lr': 0.00048566583417025553, 'samples': 1382400, 'steps': 7199, 'loss/train': 0.9513883888721466} 01/27/2022 02:38:05 - INFO - codeparrot_training - Step 7200: {'lr': 0.00048566037277304465, 'samples': 1382592, 'steps': 7200, 'loss/train': 
0.851531445980072} 01/27/2022 02:38:09 - INFO - codeparrot_training - Step 7201: {'lr': 0.00048565491036633946, 'samples': 1382784, 'steps': 7201, 'loss/train': 0.8293958008289337} 01/27/2022 02:38:13 - INFO - codeparrot_training - Step 7202: {'lr': 0.00048564944695016356, 'samples': 1382976, 'steps': 7202, 'loss/train': 0.8302811086177826} 01/27/2022 02:38:16 - INFO - codeparrot_training - Step 7203: {'lr': 0.00048564398252454026, 'samples': 1383168, 'steps': 7203, 'loss/train': 0.9188760817050934} 01/27/2022 02:38:19 - INFO - codeparrot_training - Step 7204: {'lr': 0.0004856385170894929, 'samples': 1383360, 'steps': 7204, 'loss/train': 0.7149804532527924} 01/27/2022 02:38:22 - INFO - codeparrot_training - Step 7205: {'lr': 0.00048563305064504503, 'samples': 1383552, 'steps': 7205, 'loss/train': 0.8834766447544098} 01/27/2022 02:38:25 - INFO - codeparrot_training - Step 7206: {'lr': 0.00048562758319121996, 'samples': 1383744, 'steps': 7206, 'loss/train': 0.9260527789592743} 01/27/2022 02:38:28 - INFO - codeparrot_training - Step 7207: {'lr': 0.00048562211472804115, 'samples': 1383936, 'steps': 7207, 'loss/train': 0.9478211402893066} 01/27/2022 02:38:31 - INFO - codeparrot_training - Step 7208: {'lr': 0.000485616645255532, 'samples': 1384128, 'steps': 7208, 'loss/train': 1.1998566091060638} 01/27/2022 02:38:36 - INFO - codeparrot_training - Step 7209: {'lr': 0.00048561117477371595, 'samples': 1384320, 'steps': 7209, 'loss/train': 0.7985098958015442} 01/27/2022 02:38:39 - INFO - codeparrot_training - Step 7210: {'lr': 0.0004856057032826165, 'samples': 1384512, 'steps': 7210, 'loss/train': 0.8149127662181854} 01/27/2022 02:38:42 - INFO - codeparrot_training - Step 7211: {'lr': 0.000485600230782257, 'samples': 1384704, 'steps': 7211, 'loss/train': 0.6917067468166351} 01/27/2022 02:38:45 - INFO - codeparrot_training - Step 7212: {'lr': 0.00048559475727266086, 'samples': 1384896, 'steps': 7212, 'loss/train': 0.564648762345314} 01/27/2022 02:38:49 - INFO - codeparrot_training - Step 7213: {'lr': 0.00048558928275385167, 'samples': 1385088, 'steps': 7213, 'loss/train': 0.8113807439804077} 01/27/2022 02:38:52 - INFO - codeparrot_training - Step 7214: {'lr': 0.00048558380722585283, 'samples': 1385280, 'steps': 7214, 'loss/train': 1.0141671001911163} 01/27/2022 02:38:55 - INFO - codeparrot_training - Step 7215: {'lr': 0.00048557833068868766, 'samples': 1385472, 'steps': 7215, 'loss/train': 0.7053898125886917} 01/27/2022 02:38:58 - INFO - codeparrot_training - Step 7216: {'lr': 0.00048557285314237975, 'samples': 1385664, 'steps': 7216, 'loss/train': 0.5675181895494461} 01/27/2022 02:39:01 - INFO - codeparrot_training - Step 7217: {'lr': 0.0004855673745869526, 'samples': 1385856, 'steps': 7217, 'loss/train': 0.9294824302196503} 01/27/2022 02:39:06 - INFO - codeparrot_training - Step 7218: {'lr': 0.00048556189502242956, 'samples': 1386048, 'steps': 7218, 'loss/train': 0.7917294502258301} 01/27/2022 02:39:09 - INFO - codeparrot_training - Step 7219: {'lr': 0.00048555641444883424, 'samples': 1386240, 'steps': 7219, 'loss/train': 1.2046110033988953} 01/27/2022 02:39:12 - INFO - codeparrot_training - Step 7220: {'lr': 0.00048555093286618996, 'samples': 1386432, 'steps': 7220, 'loss/train': 0.30069950222969055} 01/27/2022 02:39:15 - INFO - codeparrot_training - Step 7221: {'lr': 0.00048554545027452035, 'samples': 1386624, 'steps': 7221, 'loss/train': 0.9083172082901001} 01/27/2022 02:39:18 - INFO - codeparrot_training - Step 7222: {'lr': 0.00048553996667384877, 'samples': 1386816, 'steps': 7222, 
'loss/train': 0.9044561684131622} 01/27/2022 02:39:22 - INFO - codeparrot_training - Step 7223: {'lr': 0.00048553448206419876, 'samples': 1387008, 'steps': 7223, 'loss/train': 1.359682023525238} 01/27/2022 02:39:25 - INFO - codeparrot_training - Step 7224: {'lr': 0.0004855289964455938, 'samples': 1387200, 'steps': 7224, 'loss/train': 0.11162293329834938} 01/27/2022 02:39:28 - INFO - codeparrot_training - Step 7225: {'lr': 0.0004855235098180575, 'samples': 1387392, 'steps': 7225, 'loss/train': 1.1983477771282196} 01/27/2022 02:39:31 - INFO - codeparrot_training - Step 7226: {'lr': 0.00048551802218161315, 'samples': 1387584, 'steps': 7226, 'loss/train': 0.8712676763534546} 01/27/2022 02:39:37 - INFO - codeparrot_training - Step 7227: {'lr': 0.00048551253353628444, 'samples': 1387776, 'steps': 7227, 'loss/train': 0.7402443587779999} 01/27/2022 02:39:41 - INFO - codeparrot_training - Step 7228: {'lr': 0.0004855070438820949, 'samples': 1387968, 'steps': 7228, 'loss/train': 1.421418160200119} 01/27/2022 02:39:44 - INFO - codeparrot_training - Step 7229: {'lr': 0.0004855015532190679, 'samples': 1388160, 'steps': 7229, 'loss/train': 0.8778181672096252} 01/27/2022 02:39:47 - INFO - codeparrot_training - Step 7230: {'lr': 0.0004854960615472269, 'samples': 1388352, 'steps': 7230, 'loss/train': 0.7675503194332123} 01/27/2022 02:39:50 - INFO - codeparrot_training - Step 7231: {'lr': 0.0004854905688665957, 'samples': 1388544, 'steps': 7231, 'loss/train': 0.9739656150341034} 01/27/2022 02:39:53 - INFO - codeparrot_training - Step 7232: {'lr': 0.00048548507517719766, 'samples': 1388736, 'steps': 7232, 'loss/train': 0.49656200408935547} 01/27/2022 02:39:56 - INFO - codeparrot_training - Step 7233: {'lr': 0.00048547958047905635, 'samples': 1388928, 'steps': 7233, 'loss/train': 0.362677238881588} 01/27/2022 02:39:59 - INFO - codeparrot_training - Step 7234: {'lr': 0.00048547408477219524, 'samples': 1389120, 'steps': 7234, 'loss/train': 0.9763565361499786} 01/27/2022 02:40:04 - INFO - codeparrot_training - Step 7235: {'lr': 0.00048546858805663797, 'samples': 1389312, 'steps': 7235, 'loss/train': 0.9646850824356079} 01/27/2022 02:40:07 - INFO - codeparrot_training - Step 7236: {'lr': 0.000485463090332408, 'samples': 1389504, 'steps': 7236, 'loss/train': 1.0945149958133698} 01/27/2022 02:40:10 - INFO - codeparrot_training - Step 7237: {'lr': 0.0004854575915995289, 'samples': 1389696, 'steps': 7237, 'loss/train': 0.6508594751358032} 01/27/2022 02:40:13 - INFO - codeparrot_training - Step 7238: {'lr': 0.0004854520918580243, 'samples': 1389888, 'steps': 7238, 'loss/train': 0.8337837159633636} 01/27/2022 02:40:17 - INFO - codeparrot_training - Step 7239: {'lr': 0.00048544659110791766, 'samples': 1390080, 'steps': 7239, 'loss/train': 0.48940470814704895} 01/27/2022 02:40:20 - INFO - codeparrot_training - Step 7240: {'lr': 0.0004854410893492326, 'samples': 1390272, 'steps': 7240, 'loss/train': 1.0890962183475494} 01/27/2022 02:40:23 - INFO - codeparrot_training - Step 7241: {'lr': 0.00048543558658199266, 'samples': 1390464, 'steps': 7241, 'loss/train': 0.9697920978069305} 01/27/2022 02:40:26 - INFO - codeparrot_training - Step 7242: {'lr': 0.0004854300828062215, 'samples': 1390656, 'steps': 7242, 'loss/train': 0.7279404848814011} 01/27/2022 02:40:29 - INFO - codeparrot_training - Step 7243: {'lr': 0.0004854245780219425, 'samples': 1390848, 'steps': 7243, 'loss/train': 0.3527924492955208} 01/27/2022 02:40:34 - INFO - codeparrot_training - Step 7244: {'lr': 0.00048541907222917946, 'samples': 1391040, 'steps': 7244, 
'loss/train': 1.117811769247055} 01/27/2022 02:40:37 - INFO - codeparrot_training - Step 7245: {'lr': 0.0004854135654279558, 'samples': 1391232, 'steps': 7245, 'loss/train': 0.8589212894439697} 01/27/2022 02:40:40 - INFO - codeparrot_training - Step 7246: {'lr': 0.0004854080576182952, 'samples': 1391424, 'steps': 7246, 'loss/train': 0.830737978219986} 01/27/2022 02:40:43 - INFO - codeparrot_training - Step 7247: {'lr': 0.00048540254880022126, 'samples': 1391616, 'steps': 7247, 'loss/train': 1.8092681765556335} 01/27/2022 02:40:46 - INFO - codeparrot_training - Step 7248: {'lr': 0.00048539703897375753, 'samples': 1391808, 'steps': 7248, 'loss/train': 0.8973172903060913} 01/27/2022 02:40:49 - INFO - codeparrot_training - Step 7249: {'lr': 0.0004853915281389276, 'samples': 1392000, 'steps': 7249, 'loss/train': 0.8261363804340363} 01/27/2022 02:40:53 - INFO - codeparrot_training - Step 7250: {'lr': 0.0004853860162957552, 'samples': 1392192, 'steps': 7250, 'loss/train': 0.48151735961437225} 01/27/2022 02:40:56 - INFO - codeparrot_training - Step 7251: {'lr': 0.00048538050344426375, 'samples': 1392384, 'steps': 7251, 'loss/train': 1.1716758906841278} 01/27/2022 02:40:59 - INFO - codeparrot_training - Step 7252: {'lr': 0.0004853749895844771, 'samples': 1392576, 'steps': 7252, 'loss/train': 0.31808873265981674} 01/27/2022 02:41:05 - INFO - codeparrot_training - Step 7253: {'lr': 0.00048536947471641855, 'samples': 1392768, 'steps': 7253, 'loss/train': 1.1636156737804413} 01/27/2022 02:41:08 - INFO - codeparrot_training - Step 7254: {'lr': 0.00048536395884011207, 'samples': 1392960, 'steps': 7254, 'loss/train': 0.78193598985672} 01/27/2022 02:41:11 - INFO - codeparrot_training - Step 7255: {'lr': 0.00048535844195558104, 'samples': 1393152, 'steps': 7255, 'loss/train': 0.6144669502973557} 01/27/2022 02:41:14 - INFO - codeparrot_training - Step 7256: {'lr': 0.0004853529240628493, 'samples': 1393344, 'steps': 7256, 'loss/train': 1.0364746749401093} 01/27/2022 02:41:18 - INFO - codeparrot_training - Step 7257: {'lr': 0.0004853474051619402, 'samples': 1393536, 'steps': 7257, 'loss/train': 0.8198201358318329} 01/27/2022 02:41:21 - INFO - codeparrot_training - Step 7258: {'lr': 0.0004853418852528776, 'samples': 1393728, 'steps': 7258, 'loss/train': 0.8953089416027069} 01/27/2022 02:41:24 - INFO - codeparrot_training - Step 7259: {'lr': 0.00048533636433568505, 'samples': 1393920, 'steps': 7259, 'loss/train': 1.184931367635727} 01/27/2022 02:41:27 - INFO - codeparrot_training - Step 7260: {'lr': 0.00048533084241038637, 'samples': 1394112, 'steps': 7260, 'loss/train': 0.5852290838956833} 01/27/2022 02:41:30 - INFO - codeparrot_training - Step 7261: {'lr': 0.00048532531947700496, 'samples': 1394304, 'steps': 7261, 'loss/train': 0.4461022764444351} 01/27/2022 02:41:35 - INFO - codeparrot_training - Step 7262: {'lr': 0.00048531979553556473, 'samples': 1394496, 'steps': 7262, 'loss/train': 1.1873425841331482} 01/27/2022 02:41:38 - INFO - codeparrot_training - Step 7263: {'lr': 0.0004853142705860891, 'samples': 1394688, 'steps': 7263, 'loss/train': 0.7780596613883972} 01/27/2022 02:41:41 - INFO - codeparrot_training - Step 7264: {'lr': 0.00048530874462860194, 'samples': 1394880, 'steps': 7264, 'loss/train': 0.5725524723529816} 01/27/2022 02:41:44 - INFO - codeparrot_training - Step 7265: {'lr': 0.0004853032176631268, 'samples': 1395072, 'steps': 7265, 'loss/train': 0.21875260770320892} 01/27/2022 02:41:47 - INFO - codeparrot_training - Step 7266: {'lr': 0.0004852976896896874, 'samples': 1395264, 'steps': 7266, 
'loss/train': 0.5948147624731064} 01/27/2022 02:41:51 - INFO - codeparrot_training - Step 7267: {'lr': 0.0004852921607083074, 'samples': 1395456, 'steps': 7267, 'loss/train': 1.0360039472579956} 01/27/2022 02:41:54 - INFO - codeparrot_training - Step 7268: {'lr': 0.00048528663071901047, 'samples': 1395648, 'steps': 7268, 'loss/train': 0.6374850422143936} 01/27/2022 02:41:57 - INFO - codeparrot_training - Step 7269: {'lr': 0.00048528109972182043, 'samples': 1395840, 'steps': 7269, 'loss/train': 0.7811686992645264} 01/27/2022 02:42:00 - INFO - codeparrot_training - Step 7270: {'lr': 0.0004852755677167607, 'samples': 1396032, 'steps': 7270, 'loss/train': 0.8204046785831451} 01/27/2022 02:42:06 - INFO - codeparrot_training - Step 7271: {'lr': 0.00048527003470385534, 'samples': 1396224, 'steps': 7271, 'loss/train': 0.7934105694293976} 01/27/2022 02:42:09 - INFO - codeparrot_training - Step 7272: {'lr': 0.0004852645006831278, 'samples': 1396416, 'steps': 7272, 'loss/train': 0.6562551856040955} 01/27/2022 02:42:12 - INFO - codeparrot_training - Step 7273: {'lr': 0.00048525896565460177, 'samples': 1396608, 'steps': 7273, 'loss/train': 1.2135429382324219} 01/27/2022 02:42:16 - INFO - codeparrot_training - Step 7274: {'lr': 0.00048525342961830106, 'samples': 1396800, 'steps': 7274, 'loss/train': 0.9604936838150024} 01/27/2022 02:42:19 - INFO - codeparrot_training - Step 7275: {'lr': 0.0004852478925742494, 'samples': 1396992, 'steps': 7275, 'loss/train': 0.3485950380563736} 01/27/2022 02:42:22 - INFO - codeparrot_training - Step 7276: {'lr': 0.0004852423545224704, 'samples': 1397184, 'steps': 7276, 'loss/train': 5.467963099479675} 01/27/2022 02:42:25 - INFO - codeparrot_training - Step 7277: {'lr': 0.00048523681546298793, 'samples': 1397376, 'steps': 7277, 'loss/train': 0.9809500873088837} 01/27/2022 02:42:28 - INFO - codeparrot_training - Step 7278: {'lr': 0.0004852312753958256, 'samples': 1397568, 'steps': 7278, 'loss/train': 0.933300107717514} 01/27/2022 02:42:31 - INFO - codeparrot_training - Step 7279: {'lr': 0.00048522573432100715, 'samples': 1397760, 'steps': 7279, 'loss/train': 1.1888924539089203} 01/27/2022 02:42:36 - INFO - codeparrot_training - Step 7280: {'lr': 0.0004852201922385564, 'samples': 1397952, 'steps': 7280, 'loss/train': 0.7871535122394562} 01/27/2022 02:42:39 - INFO - codeparrot_training - Step 7281: {'lr': 0.000485214649148497, 'samples': 1398144, 'steps': 7281, 'loss/train': 1.43331840634346} 01/27/2022 02:42:42 - INFO - codeparrot_training - Step 7282: {'lr': 0.00048520910505085274, 'samples': 1398336, 'steps': 7282, 'loss/train': 0.948998898267746} 01/27/2022 02:42:45 - INFO - codeparrot_training - Step 7283: {'lr': 0.0004852035599456474, 'samples': 1398528, 'steps': 7283, 'loss/train': 1.21121147274971} 01/27/2022 02:42:48 - INFO - codeparrot_training - Step 7284: {'lr': 0.0004851980138329046, 'samples': 1398720, 'steps': 7284, 'loss/train': 1.0929210484027863} 01/27/2022 02:42:51 - INFO - codeparrot_training - Step 7285: {'lr': 0.00048519246671264825, 'samples': 1398912, 'steps': 7285, 'loss/train': 1.3122289180755615} 01/27/2022 02:42:55 - INFO - codeparrot_training - Step 7286: {'lr': 0.0004851869185849021, 'samples': 1399104, 'steps': 7286, 'loss/train': 0.7660429179668427} 01/27/2022 02:42:58 - INFO - codeparrot_training - Step 7287: {'lr': 0.0004851813694496898, 'samples': 1399296, 'steps': 7287, 'loss/train': 0.8382098078727722} 01/27/2022 02:43:02 - INFO - codeparrot_training - Step 7288: {'lr': 0.00048517581930703526, 'samples': 1399488, 'steps': 7288, 
'loss/train': 0.843529611825943} 01/27/2022 02:43:05 - INFO - codeparrot_training - Step 7289: {'lr': 0.0004851702681569621, 'samples': 1399680, 'steps': 7289, 'loss/train': 1.0134811699390411} 01/27/2022 02:43:09 - INFO - codeparrot_training - Step 7290: {'lr': 0.0004851647159994943, 'samples': 1399872, 'steps': 7290, 'loss/train': 1.3332189917564392} 01/27/2022 02:43:12 - INFO - codeparrot_training - Step 7291: {'lr': 0.00048515916283465546, 'samples': 1400064, 'steps': 7291, 'loss/train': 1.1392799019813538} 01/27/2022 02:43:15 - INFO - codeparrot_training - Step 7292: {'lr': 0.00048515360866246943, 'samples': 1400256, 'steps': 7292, 'loss/train': 0.9729919731616974} 01/27/2022 02:43:18 - INFO - codeparrot_training - Step 7293: {'lr': 0.00048514805348296, 'samples': 1400448, 'steps': 7293, 'loss/train': 1.197712004184723} 01/27/2022 02:43:21 - INFO - codeparrot_training - Step 7294: {'lr': 0.000485142497296151, 'samples': 1400640, 'steps': 7294, 'loss/train': 1.0482417047023773} 01/27/2022 02:43:24 - INFO - codeparrot_training - Step 7295: {'lr': 0.00048513694010206623, 'samples': 1400832, 'steps': 7295, 'loss/train': 0.8413304686546326} 01/27/2022 02:43:27 - INFO - codeparrot_training - Step 7296: {'lr': 0.0004851313819007295, 'samples': 1401024, 'steps': 7296, 'loss/train': 0.9136654436588287} 01/27/2022 02:43:34 - INFO - codeparrot_training - Step 7297: {'lr': 0.0004851258226921645, 'samples': 1401216, 'steps': 7297, 'loss/train': 0.4334966093301773} 01/27/2022 02:43:37 - INFO - codeparrot_training - Step 7298: {'lr': 0.0004851202624763952, 'samples': 1401408, 'steps': 7298, 'loss/train': 0.8237462639808655} 01/27/2022 02:43:40 - INFO - codeparrot_training - Step 7299: {'lr': 0.0004851147012534453, 'samples': 1401600, 'steps': 7299, 'loss/train': 0.8898056745529175} 01/27/2022 02:43:43 - INFO - codeparrot_training - Step 7300: {'lr': 0.00048510913902333875, 'samples': 1401792, 'steps': 7300, 'loss/train': 0.17112409695982933} 01/27/2022 02:43:46 - INFO - codeparrot_training - Step 7301: {'lr': 0.0004851035757860992, 'samples': 1401984, 'steps': 7301, 'loss/train': 1.7052642703056335} 01/27/2022 02:43:49 - INFO - codeparrot_training - Step 7302: {'lr': 0.0004850980115417507, 'samples': 1402176, 'steps': 7302, 'loss/train': 0.3891821801662445} 01/27/2022 02:43:52 - INFO - codeparrot_training - Step 7303: {'lr': 0.0004850924462903169, 'samples': 1402368, 'steps': 7303, 'loss/train': 0.6523185968399048} 01/27/2022 02:43:56 - INFO - codeparrot_training - Step 7304: {'lr': 0.0004850868800318218, 'samples': 1402560, 'steps': 7304, 'loss/train': 0.8841257393360138} 01/27/2022 02:43:59 - INFO - codeparrot_training - Step 7305: {'lr': 0.00048508131276628905, 'samples': 1402752, 'steps': 7305, 'loss/train': 0.8582388460636139} 01/27/2022 02:44:03 - INFO - codeparrot_training - Step 7306: {'lr': 0.0004850757444937426, 'samples': 1402944, 'steps': 7306, 'loss/train': 1.3971551656723022} 01/27/2022 02:44:06 - INFO - codeparrot_training - Step 7307: {'lr': 0.00048507017521420636, 'samples': 1403136, 'steps': 7307, 'loss/train': 0.993514358997345} 01/27/2022 02:44:09 - INFO - codeparrot_training - Step 7308: {'lr': 0.0004850646049277041, 'samples': 1403328, 'steps': 7308, 'loss/train': 0.814734935760498} 01/27/2022 02:44:12 - INFO - codeparrot_training - Step 7309: {'lr': 0.00048505903363425974, 'samples': 1403520, 'steps': 7309, 'loss/train': 0.5830088406801224} 01/27/2022 02:44:16 - INFO - codeparrot_training - Step 7310: {'lr': 0.0004850534613338972, 'samples': 1403712, 'steps': 7310, 
'loss/train': 0.879513680934906} 01/27/2022 02:44:19 - INFO - codeparrot_training - Step 7311: {'lr': 0.00048504788802664013, 'samples': 1403904, 'steps': 7311, 'loss/train': 0.9589834213256836} 01/27/2022 02:44:22 - INFO - codeparrot_training - Step 7312: {'lr': 0.00048504231371251255, 'samples': 1404096, 'steps': 7312, 'loss/train': 0.4147553890943527} 01/27/2022 02:44:25 - INFO - codeparrot_training - Step 7313: {'lr': 0.0004850367383915384, 'samples': 1404288, 'steps': 7313, 'loss/train': 1.16361603140831} 01/27/2022 02:44:28 - INFO - codeparrot_training - Step 7314: {'lr': 0.00048503116206374147, 'samples': 1404480, 'steps': 7314, 'loss/train': 0.7775613069534302} 01/27/2022 02:44:35 - INFO - codeparrot_training - Step 7315: {'lr': 0.00048502558472914573, 'samples': 1404672, 'steps': 7315, 'loss/train': 0.6063434332609177} 01/27/2022 02:44:38 - INFO - codeparrot_training - Step 7316: {'lr': 0.00048502000638777487, 'samples': 1404864, 'steps': 7316, 'loss/train': 0.8781678378582001} 01/27/2022 02:44:41 - INFO - codeparrot_training - Step 7317: {'lr': 0.000485014427039653, 'samples': 1405056, 'steps': 7317, 'loss/train': 0.8322115838527679} 01/27/2022 02:44:44 - INFO - codeparrot_training - Step 7318: {'lr': 0.00048500884668480407, 'samples': 1405248, 'steps': 7318, 'loss/train': 0.701438382267952} 01/27/2022 02:44:47 - INFO - codeparrot_training - Step 7319: {'lr': 0.00048500326532325167, 'samples': 1405440, 'steps': 7319, 'loss/train': 0.9533161818981171} 01/27/2022 02:44:50 - INFO - codeparrot_training - Step 7320: {'lr': 0.00048499768295502, 'samples': 1405632, 'steps': 7320, 'loss/train': 0.6466104090213776} 01/27/2022 02:44:53 - INFO - codeparrot_training - Step 7321: {'lr': 0.0004849920995801329, 'samples': 1405824, 'steps': 7321, 'loss/train': 0.6047532856464386} 01/27/2022 02:44:57 - INFO - codeparrot_training - Step 7322: {'lr': 0.00048498651519861426, 'samples': 1406016, 'steps': 7322, 'loss/train': 0.7578590512275696} 01/27/2022 02:45:00 - INFO - codeparrot_training - Step 7323: {'lr': 0.00048498092981048797, 'samples': 1406208, 'steps': 7323, 'loss/train': 0.7248327881097794} 01/27/2022 02:45:04 - INFO - codeparrot_training - Step 7324: {'lr': 0.000484975343415778, 'samples': 1406400, 'steps': 7324, 'loss/train': 1.054452896118164} 01/27/2022 02:45:07 - INFO - codeparrot_training - Step 7325: {'lr': 0.00048496975601450835, 'samples': 1406592, 'steps': 7325, 'loss/train': 0.8841010630130768} 01/27/2022 02:45:10 - INFO - codeparrot_training - Step 7326: {'lr': 0.0004849641676067027, 'samples': 1406784, 'steps': 7326, 'loss/train': 1.0455904304981232} 01/27/2022 02:45:13 - INFO - codeparrot_training - Step 7327: {'lr': 0.0004849585781923853, 'samples': 1406976, 'steps': 7327, 'loss/train': 1.361135333776474} 01/27/2022 02:45:17 - INFO - codeparrot_training - Step 7328: {'lr': 0.00048495298777157994, 'samples': 1407168, 'steps': 7328, 'loss/train': 0.5858708918094635} 01/27/2022 02:45:20 - INFO - codeparrot_training - Step 7329: {'lr': 0.00048494739634431057, 'samples': 1407360, 'steps': 7329, 'loss/train': 0.13189785182476044} 01/27/2022 02:45:23 - INFO - codeparrot_training - Step 7330: {'lr': 0.00048494180391060114, 'samples': 1407552, 'steps': 7330, 'loss/train': 0.7030125707387924} 01/27/2022 02:45:26 - INFO - codeparrot_training - Step 7331: {'lr': 0.0004849362104704756, 'samples': 1407744, 'steps': 7331, 'loss/train': 0.8124829530715942} 01/27/2022 02:45:30 - INFO - codeparrot_training - Step 7332: {'lr': 0.00048493061602395803, 'samples': 1407936, 'steps': 7332, 
'loss/train': 0.7680040597915649} 01/27/2022 02:45:34 - INFO - codeparrot_training - Step 7333: {'lr': 0.0004849250205710722, 'samples': 1408128, 'steps': 7333, 'loss/train': 1.1382178366184235} 01/27/2022 02:45:37 - INFO - codeparrot_training - Step 7334: {'lr': 0.0004849194241118423, 'samples': 1408320, 'steps': 7334, 'loss/train': 0.2979234382510185} 01/27/2022 02:45:40 - INFO - codeparrot_training - Step 7335: {'lr': 0.0004849138266462921, 'samples': 1408512, 'steps': 7335, 'loss/train': 0.9626844227313995} 01/27/2022 02:45:43 - INFO - codeparrot_training - Step 7336: {'lr': 0.0004849082281744457, 'samples': 1408704, 'steps': 7336, 'loss/train': 0.1680743806064129} 01/27/2022 02:45:46 - INFO - codeparrot_training - Step 7337: {'lr': 0.00048490262869632693, 'samples': 1408896, 'steps': 7337, 'loss/train': 1.3151284754276276} 01/27/2022 02:45:49 - INFO - codeparrot_training - Step 7338: {'lr': 0.00048489702821196003, 'samples': 1409088, 'steps': 7338, 'loss/train': 0.6462138444185257} 01/27/2022 02:45:53 - INFO - codeparrot_training - Step 7339: {'lr': 0.0004848914267213688, 'samples': 1409280, 'steps': 7339, 'loss/train': 0.9196175336837769} 01/27/2022 02:45:56 - INFO - codeparrot_training - Step 7340: {'lr': 0.00048488582422457726, 'samples': 1409472, 'steps': 7340, 'loss/train': 0.9286051690578461} 01/27/2022 02:46:00 - INFO - codeparrot_training - Step 7341: {'lr': 0.0004848802207216094, 'samples': 1409664, 'steps': 7341, 'loss/train': 0.9110896289348602} 01/27/2022 02:46:03 - INFO - codeparrot_training - Step 7342: {'lr': 0.0004848746162124894, 'samples': 1409856, 'steps': 7342, 'loss/train': 0.9776924550533295} 01/27/2022 02:46:06 - INFO - codeparrot_training - Step 7343: {'lr': 0.00048486901069724097, 'samples': 1410048, 'steps': 7343, 'loss/train': 1.2665501832962036} 01/27/2022 02:46:10 - INFO - codeparrot_training - Step 7344: {'lr': 0.0004848634041758884, 'samples': 1410240, 'steps': 7344, 'loss/train': 0.39908283948898315} 01/27/2022 02:46:13 - INFO - codeparrot_training - Step 7345: {'lr': 0.00048485779664845553, 'samples': 1410432, 'steps': 7345, 'loss/train': 0.9722552597522736} 01/27/2022 02:46:16 - INFO - codeparrot_training - Step 7346: {'lr': 0.0004848521881149664, 'samples': 1410624, 'steps': 7346, 'loss/train': 0.6061641722917557} 01/27/2022 02:46:19 - INFO - codeparrot_training - Step 7347: {'lr': 0.00048484657857544513, 'samples': 1410816, 'steps': 7347, 'loss/train': 0.7069055736064911} 01/27/2022 02:46:22 - INFO - codeparrot_training - Step 7348: {'lr': 0.0004848409680299156, 'samples': 1411008, 'steps': 7348, 'loss/train': 0.4757213294506073} 01/27/2022 02:46:25 - INFO - codeparrot_training - Step 7349: {'lr': 0.00048483535647840206, 'samples': 1411200, 'steps': 7349, 'loss/train': 0.8757341802120209} 01/27/2022 02:46:31 - INFO - codeparrot_training - Step 7350: {'lr': 0.00048482974392092827, 'samples': 1411392, 'steps': 7350, 'loss/train': 0.6308289617300034} 01/27/2022 02:46:35 - INFO - codeparrot_training - Step 7351: {'lr': 0.0004848241303575185, 'samples': 1411584, 'steps': 7351, 'loss/train': 0.6154752373695374} 01/27/2022 02:46:38 - INFO - codeparrot_training - Step 7352: {'lr': 0.0004848185157881968, 'samples': 1411776, 'steps': 7352, 'loss/train': 0.6524317860603333} 01/27/2022 02:46:41 - INFO - codeparrot_training - Step 7353: {'lr': 0.0004848129002129871, 'samples': 1411968, 'steps': 7353, 'loss/train': 0.45624491572380066} 01/27/2022 02:46:44 - INFO - codeparrot_training - Step 7354: {'lr': 0.0004848072836319134, 'samples': 1412160, 'steps': 7354, 
'loss/train': 1.149328351020813} 01/27/2022 02:46:47 - INFO - codeparrot_training - Step 7355: {'lr': 0.000484801666045, 'samples': 1412352, 'steps': 7355, 'loss/train': 1.2803572118282318} 01/27/2022 02:46:51 - INFO - codeparrot_training - Step 7356: {'lr': 0.0004847960474522707, 'samples': 1412544, 'steps': 7356, 'loss/train': 1.2761204838752747} 01/27/2022 02:46:54 - INFO - codeparrot_training - Step 7357: {'lr': 0.00048479042785374974, 'samples': 1412736, 'steps': 7357, 'loss/train': 1.2681745290756226} 01/27/2022 02:46:57 - INFO - codeparrot_training - Step 7358: {'lr': 0.0004847848072494611, 'samples': 1412928, 'steps': 7358, 'loss/train': 0.790545791387558} 01/27/2022 02:47:01 - INFO - codeparrot_training - Step 7359: {'lr': 0.0004847791856394289, 'samples': 1413120, 'steps': 7359, 'loss/train': 0.8758202791213989} 01/27/2022 02:47:04 - INFO - codeparrot_training - Step 7360: {'lr': 0.00048477356302367724, 'samples': 1413312, 'steps': 7360, 'loss/train': 1.1310904026031494} 01/27/2022 02:47:08 - INFO - codeparrot_training - Step 7361: {'lr': 0.00048476793940223026, 'samples': 1413504, 'steps': 7361, 'loss/train': 0.45941850543022156} 01/27/2022 02:47:11 - INFO - codeparrot_training - Step 7362: {'lr': 0.0004847623147751119, 'samples': 1413696, 'steps': 7362, 'loss/train': 0.9022846519947052} 01/27/2022 02:47:14 - INFO - codeparrot_training - Step 7363: {'lr': 0.00048475668914234636, 'samples': 1413888, 'steps': 7363, 'loss/train': 1.2007428109645844} 01/27/2022 02:47:17 - INFO - codeparrot_training - Step 7364: {'lr': 0.0004847510625039577, 'samples': 1414080, 'steps': 7364, 'loss/train': 0.8997186720371246} 01/27/2022 02:47:20 - INFO - codeparrot_training - Step 7365: {'lr': 0.00048474543485997005, 'samples': 1414272, 'steps': 7365, 'loss/train': 0.7460137456655502} 01/27/2022 02:47:23 - INFO - codeparrot_training - Step 7366: {'lr': 0.00048473980621040744, 'samples': 1414464, 'steps': 7366, 'loss/train': 1.0779094398021698} 01/27/2022 02:47:26 - INFO - codeparrot_training - Step 7367: {'lr': 0.00048473417655529405, 'samples': 1414656, 'steps': 7367, 'loss/train': 0.9877134561538696} 01/27/2022 02:47:31 - INFO - codeparrot_training - Step 7368: {'lr': 0.000484728545894654, 'samples': 1414848, 'steps': 7368, 'loss/train': 0.39769497513771057} 01/27/2022 02:47:34 - INFO - codeparrot_training - Step 7369: {'lr': 0.00048472291422851135, 'samples': 1415040, 'steps': 7369, 'loss/train': 1.1692872047424316} 01/27/2022 02:47:37 - INFO - codeparrot_training - Step 7370: {'lr': 0.00048471728155689034, 'samples': 1415232, 'steps': 7370, 'loss/train': 0.7253526002168655} 01/27/2022 02:47:40 - INFO - codeparrot_training - Step 7371: {'lr': 0.000484711647879815, 'samples': 1415424, 'steps': 7371, 'loss/train': 0.7175628393888474} 01/27/2022 02:47:44 - INFO - codeparrot_training - Step 7372: {'lr': 0.00048470601319730946, 'samples': 1415616, 'steps': 7372, 'loss/train': 0.44634345173835754} 01/27/2022 02:47:47 - INFO - codeparrot_training - Step 7373: {'lr': 0.00048470037750939795, 'samples': 1415808, 'steps': 7373, 'loss/train': 0.09811719506978989} 01/27/2022 02:47:50 - INFO - codeparrot_training - Step 7374: {'lr': 0.0004846947408161045, 'samples': 1416000, 'steps': 7374, 'loss/train': 0.7880159318447113} 01/27/2022 02:47:53 - INFO - codeparrot_training - Step 7375: {'lr': 0.0004846891031174533, 'samples': 1416192, 'steps': 7375, 'loss/train': 0.2223331555724144} 01/27/2022 02:47:56 - INFO - codeparrot_training - Step 7376: {'lr': 0.00048468346441346853, 'samples': 1416384, 'steps': 7376, 
'loss/train': 0.9273259341716766} 01/27/2022 02:48:02 - INFO - codeparrot_training - Step 7377: {'lr': 0.00048467782470417434, 'samples': 1416576, 'steps': 7377, 'loss/train': 1.1987287402153015} 01/27/2022 02:48:05 - INFO - codeparrot_training - Step 7378: {'lr': 0.0004846721839895948, 'samples': 1416768, 'steps': 7378, 'loss/train': 0.613423615694046} 01/27/2022 02:48:09 - INFO - codeparrot_training - Step 7379: {'lr': 0.00048466654226975414, 'samples': 1416960, 'steps': 7379, 'loss/train': 0.9286536276340485} 01/27/2022 02:48:12 - INFO - codeparrot_training - Step 7380: {'lr': 0.00048466089954467663, 'samples': 1417152, 'steps': 7380, 'loss/train': 0.7912326157093048} 01/27/2022 02:48:15 - INFO - codeparrot_training - Step 7381: {'lr': 0.0004846552558143863, 'samples': 1417344, 'steps': 7381, 'loss/train': 1.1114821135997772} 01/27/2022 02:48:18 - INFO - codeparrot_training - Step 7382: {'lr': 0.00048464961107890734, 'samples': 1417536, 'steps': 7382, 'loss/train': 1.1361466348171234} 01/27/2022 02:48:21 - INFO - codeparrot_training - Step 7383: {'lr': 0.00048464396533826396, 'samples': 1417728, 'steps': 7383, 'loss/train': 1.0355281233787537} 01/27/2022 02:48:24 - INFO - codeparrot_training - Step 7384: {'lr': 0.0004846383185924803, 'samples': 1417920, 'steps': 7384, 'loss/train': 1.3906664550304413} 01/27/2022 02:48:27 - INFO - codeparrot_training - Step 7385: {'lr': 0.0004846326708415806, 'samples': 1418112, 'steps': 7385, 'loss/train': 1.920116901397705} 01/27/2022 02:48:32 - INFO - codeparrot_training - Step 7386: {'lr': 0.00048462702208558906, 'samples': 1418304, 'steps': 7386, 'loss/train': 1.0277037620544434} 01/27/2022 02:48:35 - INFO - codeparrot_training - Step 7387: {'lr': 0.0004846213723245299, 'samples': 1418496, 'steps': 7387, 'loss/train': 0.11119873076677322} 01/27/2022 02:48:38 - INFO - codeparrot_training - Step 7388: {'lr': 0.00048461572155842725, 'samples': 1418688, 'steps': 7388, 'loss/train': 0.9350940585136414} 01/27/2022 02:48:41 - INFO - codeparrot_training - Step 7389: {'lr': 0.0004846100697873054, 'samples': 1418880, 'steps': 7389, 'loss/train': 0.5406285971403122} 01/27/2022 02:48:45 - INFO - codeparrot_training - Step 7390: {'lr': 0.0004846044170111884, 'samples': 1419072, 'steps': 7390, 'loss/train': 0.9187577962875366} 01/27/2022 02:48:48 - INFO - codeparrot_training - Step 7391: {'lr': 0.00048459876323010063, 'samples': 1419264, 'steps': 7391, 'loss/train': 1.0725900828838348} 01/27/2022 02:48:51 - INFO - codeparrot_training - Step 7392: {'lr': 0.00048459310844406624, 'samples': 1419456, 'steps': 7392, 'loss/train': 0.5978888422250748} 01/27/2022 02:48:54 - INFO - codeparrot_training - Step 7393: {'lr': 0.0004845874526531095, 'samples': 1419648, 'steps': 7393, 'loss/train': 0.47578418254852295} 01/27/2022 02:48:58 - INFO - codeparrot_training - Step 7394: {'lr': 0.0004845817958572546, 'samples': 1419840, 'steps': 7394, 'loss/train': 0.7740282118320465} 01/27/2022 02:49:02 - INFO - codeparrot_training - Step 7395: {'lr': 0.0004845761380565257, 'samples': 1420032, 'steps': 7395, 'loss/train': 0.9308974742889404} 01/27/2022 02:49:05 - INFO - codeparrot_training - Step 7396: {'lr': 0.0004845704792509472, 'samples': 1420224, 'steps': 7396, 'loss/train': 0.09026267006993294} 01/27/2022 02:49:08 - INFO - codeparrot_training - Step 7397: {'lr': 0.0004845648194405432, 'samples': 1420416, 'steps': 7397, 'loss/train': 0.2683112919330597} 01/27/2022 02:49:11 - INFO - codeparrot_training - Step 7398: {'lr': 0.00048455915862533804, 'samples': 1420608, 'steps': 7398, 
'loss/train': 1.1278045177459717} 01/27/2022 02:49:14 - INFO - codeparrot_training - Step 7399: {'lr': 0.0004845534968053559, 'samples': 1420800, 'steps': 7399, 'loss/train': 0.5719740986824036} 01/27/2022 02:49:17 - INFO - codeparrot_training - Step 7400: {'lr': 0.0004845478339806211, 'samples': 1420992, 'steps': 7400, 'loss/train': 0.7768061757087708} 01/27/2022 02:49:21 - INFO - codeparrot_training - Step 7401: {'lr': 0.0004845421701511578, 'samples': 1421184, 'steps': 7401, 'loss/train': 0.602475643157959} 01/27/2022 02:49:24 - INFO - codeparrot_training - Step 7402: {'lr': 0.0004845365053169903, 'samples': 1421376, 'steps': 7402, 'loss/train': 1.047468513250351} 01/27/2022 02:49:30 - INFO - codeparrot_training - Step 7403: {'lr': 0.0004845308394781429, 'samples': 1421568, 'steps': 7403, 'loss/train': 0.6765099763870239} 01/27/2022 02:49:33 - INFO - codeparrot_training - Step 7404: {'lr': 0.0004845251726346399, 'samples': 1421760, 'steps': 7404, 'loss/train': 0.433450385928154} 01/27/2022 02:49:36 - INFO - codeparrot_training - Step 7405: {'lr': 0.0004845195047865055, 'samples': 1421952, 'steps': 7405, 'loss/train': 0.7588719427585602} 01/27/2022 02:49:39 - INFO - codeparrot_training - Step 7406: {'lr': 0.00048451383593376394, 'samples': 1422144, 'steps': 7406, 'loss/train': 1.4630169868469238} 01/27/2022 02:49:42 - INFO - codeparrot_training - Step 7407: {'lr': 0.0004845081660764397, 'samples': 1422336, 'steps': 7407, 'loss/train': 0.8095904588699341} 01/27/2022 02:49:46 - INFO - codeparrot_training - Step 7408: {'lr': 0.0004845024952145569, 'samples': 1422528, 'steps': 7408, 'loss/train': 0.9640055000782013} 01/27/2022 02:49:49 - INFO - codeparrot_training - Step 7409: {'lr': 0.00048449682334813983, 'samples': 1422720, 'steps': 7409, 'loss/train': 0.8125799596309662} 01/27/2022 02:49:52 - INFO - codeparrot_training - Step 7410: {'lr': 0.00048449115047721286, 'samples': 1422912, 'steps': 7410, 'loss/train': 0.615950345993042} 01/27/2022 02:49:55 - INFO - codeparrot_training - Step 7411: {'lr': 0.00048448547660180034, 'samples': 1423104, 'steps': 7411, 'loss/train': 0.0705069825053215} 01/27/2022 02:49:59 - INFO - codeparrot_training - Step 7412: {'lr': 0.0004844798017219264, 'samples': 1423296, 'steps': 7412, 'loss/train': 1.2162452638149261} 01/27/2022 02:50:03 - INFO - codeparrot_training - Step 7413: {'lr': 0.00048447412583761543, 'samples': 1423488, 'steps': 7413, 'loss/train': 0.18805651366710663} 01/27/2022 02:50:06 - INFO - codeparrot_training - Step 7414: {'lr': 0.00048446844894889173, 'samples': 1423680, 'steps': 7414, 'loss/train': 1.0577926933765411} 01/27/2022 02:50:09 - INFO - codeparrot_training - Step 7415: {'lr': 0.00048446277105577973, 'samples': 1423872, 'steps': 7415, 'loss/train': 0.7479769438505173} 01/27/2022 02:50:12 - INFO - codeparrot_training - Step 7416: {'lr': 0.0004844570921583037, 'samples': 1424064, 'steps': 7416, 'loss/train': 0.9844434857368469} 01/27/2022 02:50:15 - INFO - codeparrot_training - Step 7417: {'lr': 0.00048445141225648785, 'samples': 1424256, 'steps': 7417, 'loss/train': 0.6658144444227219} 01/27/2022 02:50:18 - INFO - codeparrot_training - Step 7418: {'lr': 0.00048444573135035665, 'samples': 1424448, 'steps': 7418, 'loss/train': 0.06803523190319538} 01/27/2022 02:50:22 - INFO - codeparrot_training - Step 7419: {'lr': 0.00048444004943993434, 'samples': 1424640, 'steps': 7419, 'loss/train': 1.1219033896923065} 01/27/2022 02:50:25 - INFO - codeparrot_training - Step 7420: {'lr': 0.0004844343665252453, 'samples': 1424832, 'steps': 7420, 
'loss/train': 1.2660664021968842} 01/27/2022 02:50:29 - INFO - codeparrot_training - Step 7421: {'lr': 0.0004844286826063139, 'samples': 1425024, 'steps': 7421, 'loss/train': 0.7526140809059143} 01/27/2022 02:50:32 - INFO - codeparrot_training - Step 7422: {'lr': 0.0004844229976831645, 'samples': 1425216, 'steps': 7422, 'loss/train': 0.6709358096122742} 01/27/2022 02:50:36 - INFO - codeparrot_training - Step 7423: {'lr': 0.00048441731175582136, 'samples': 1425408, 'steps': 7423, 'loss/train': 1.0259068608283997} 01/27/2022 02:50:39 - INFO - codeparrot_training - Step 7424: {'lr': 0.0004844116248243089, 'samples': 1425600, 'steps': 7424, 'loss/train': 1.2379092872142792} 01/27/2022 02:50:42 - INFO - codeparrot_training - Step 7425: {'lr': 0.00048440593688865155, 'samples': 1425792, 'steps': 7425, 'loss/train': 1.1539261937141418} 01/27/2022 02:50:45 - INFO - codeparrot_training - Step 7426: {'lr': 0.0004844002479488735, 'samples': 1425984, 'steps': 7426, 'loss/train': 0.9375653564929962} 01/27/2022 02:50:48 - INFO - codeparrot_training - Step 7427: {'lr': 0.0004843945580049992, 'samples': 1426176, 'steps': 7427, 'loss/train': 0.6855254173278809} 01/27/2022 02:50:51 - INFO - codeparrot_training - Step 7428: {'lr': 0.0004843888670570531, 'samples': 1426368, 'steps': 7428, 'loss/train': 0.6278362423181534} 01/27/2022 02:50:54 - INFO - codeparrot_training - Step 7429: {'lr': 0.00048438317510505954, 'samples': 1426560, 'steps': 7429, 'loss/train': 0.7357306033372879} 01/27/2022 02:51:01 - INFO - codeparrot_training - Step 7430: {'lr': 0.0004843774821490429, 'samples': 1426752, 'steps': 7430, 'loss/train': 0.88595250248909} 01/27/2022 02:51:04 - INFO - codeparrot_training - Step 7431: {'lr': 0.0004843717881890275, 'samples': 1426944, 'steps': 7431, 'loss/train': 0.3778661638498306} 01/27/2022 02:51:07 - INFO - codeparrot_training - Step 7432: {'lr': 0.0004843660932250378, 'samples': 1427136, 'steps': 7432, 'loss/train': 0.5472650974988937} 01/27/2022 02:51:10 - INFO - codeparrot_training - Step 7433: {'lr': 0.0004843603972570981, 'samples': 1427328, 'steps': 7433, 'loss/train': 0.27512019127607346} 01/27/2022 02:51:13 - INFO - codeparrot_training - Step 7434: {'lr': 0.00048435470028523295, 'samples': 1427520, 'steps': 7434, 'loss/train': 0.41305263340473175} 01/27/2022 02:51:16 - INFO - codeparrot_training - Step 7435: {'lr': 0.00048434900230946666, 'samples': 1427712, 'steps': 7435, 'loss/train': 1.0949556827545166} 01/27/2022 02:51:19 - INFO - codeparrot_training - Step 7436: {'lr': 0.0004843433033298237, 'samples': 1427904, 'steps': 7436, 'loss/train': 0.8814671337604523} 01/27/2022 02:51:23 - INFO - codeparrot_training - Step 7437: {'lr': 0.00048433760334632835, 'samples': 1428096, 'steps': 7437, 'loss/train': 1.279754877090454} 01/27/2022 02:51:27 - INFO - codeparrot_training - Step 7438: {'lr': 0.0004843319023590052, 'samples': 1428288, 'steps': 7438, 'loss/train': 0.9151048958301544} 01/27/2022 02:51:30 - INFO - codeparrot_training - Step 7439: {'lr': 0.0004843262003678786, 'samples': 1428480, 'steps': 7439, 'loss/train': 0.9154526889324188} 01/27/2022 02:51:33 - INFO - codeparrot_training - Step 7440: {'lr': 0.0004843204973729729, 'samples': 1428672, 'steps': 7440, 'loss/train': 0.59882752597332} 01/27/2022 02:51:36 - INFO - codeparrot_training - Step 7441: {'lr': 0.0004843147933743126, 'samples': 1428864, 'steps': 7441, 'loss/train': 1.170080691576004} 01/27/2022 02:51:40 - INFO - codeparrot_training - Step 7442: {'lr': 0.0004843090883719222, 'samples': 1429056, 'steps': 7442, 
'loss/train': 0.5180162787437439} 01/27/2022 02:51:43 - INFO - codeparrot_training - Step 7443: {'lr': 0.00048430338236582596, 'samples': 1429248, 'steps': 7443, 'loss/train': 0.48835471272468567} 01/27/2022 02:51:46 - INFO - codeparrot_training - Step 7444: {'lr': 0.0004842976753560485, 'samples': 1429440, 'steps': 7444, 'loss/train': 0.6554153859615326} 01/27/2022 02:51:49 - INFO - codeparrot_training - Step 7445: {'lr': 0.00048429196734261413, 'samples': 1429632, 'steps': 7445, 'loss/train': 0.7093051224946976} 01/27/2022 02:51:52 - INFO - codeparrot_training - Step 7446: {'lr': 0.00048428625832554754, 'samples': 1429824, 'steps': 7446, 'loss/train': 0.6992504149675369} 01/27/2022 02:51:57 - INFO - codeparrot_training - Step 7447: {'lr': 0.0004842805483048728, 'samples': 1430016, 'steps': 7447, 'loss/train': 0.5967693328857422} 01/27/2022 02:52:00 - INFO - codeparrot_training - Step 7448: {'lr': 0.0004842748372806147, 'samples': 1430208, 'steps': 7448, 'loss/train': 0.6903793215751648} 01/27/2022 02:52:03 - INFO - codeparrot_training - Step 7449: {'lr': 0.0004842691252527976, 'samples': 1430400, 'steps': 7449, 'loss/train': 1.074194312095642} 01/27/2022 02:52:06 - INFO - codeparrot_training - Step 7450: {'lr': 0.00048426341222144586, 'samples': 1430592, 'steps': 7450, 'loss/train': 0.870233952999115} 01/27/2022 02:52:09 - INFO - codeparrot_training - Step 7451: {'lr': 0.00048425769818658416, 'samples': 1430784, 'steps': 7451, 'loss/train': 0.906788170337677} 01/27/2022 02:52:12 - INFO - codeparrot_training - Step 7452: {'lr': 0.0004842519831482368, 'samples': 1430976, 'steps': 7452, 'loss/train': 0.5956971645355225} 01/27/2022 02:52:15 - INFO - codeparrot_training - Step 7453: {'lr': 0.00048424626710642836, 'samples': 1431168, 'steps': 7453, 'loss/train': 0.9667910635471344} 01/27/2022 02:52:19 - INFO - codeparrot_training - Step 7454: {'lr': 0.0004842405500611833, 'samples': 1431360, 'steps': 7454, 'loss/train': 0.7370336204767227} 01/27/2022 02:52:22 - INFO - codeparrot_training - Step 7455: {'lr': 0.00048423483201252604, 'samples': 1431552, 'steps': 7455, 'loss/train': 1.2023777961730957} 01/27/2022 02:52:28 - INFO - codeparrot_training - Step 7456: {'lr': 0.0004842291129604812, 'samples': 1431744, 'steps': 7456, 'loss/train': 0.9297221302986145} 01/27/2022 02:52:31 - INFO - codeparrot_training - Step 7457: {'lr': 0.0004842233929050732, 'samples': 1431936, 'steps': 7457, 'loss/train': 0.8688090741634369} 01/27/2022 02:52:34 - INFO - codeparrot_training - Step 7458: {'lr': 0.00048421767184632657, 'samples': 1432128, 'steps': 7458, 'loss/train': 0.9955622255802155} 01/27/2022 02:52:37 - INFO - codeparrot_training - Step 7459: {'lr': 0.00048421194978426574, 'samples': 1432320, 'steps': 7459, 'loss/train': 0.8417668640613556} 01/27/2022 02:52:41 - INFO - codeparrot_training - Step 7460: {'lr': 0.00048420622671891533, 'samples': 1432512, 'steps': 7460, 'loss/train': 0.35163024812936783} 01/27/2022 02:52:44 - INFO - codeparrot_training - Step 7461: {'lr': 0.0004842005026502999, 'samples': 1432704, 'steps': 7461, 'loss/train': 0.8193419873714447} 01/27/2022 02:52:47 - INFO - codeparrot_training - Step 7462: {'lr': 0.00048419477757844376, 'samples': 1432896, 'steps': 7462, 'loss/train': 0.3199351206421852} 01/27/2022 02:52:50 - INFO - codeparrot_training - Step 7463: {'lr': 0.00048418905150337166, 'samples': 1433088, 'steps': 7463, 'loss/train': 0.10820966586470604} 01/27/2022 02:52:53 - INFO - codeparrot_training - Step 7464: {'lr': 0.00048418332442510794, 'samples': 1433280, 'steps': 7464, 
'loss/train': 0.20474068075418472} 01/27/2022 02:52:58 - INFO - codeparrot_training - Step 7465: {'lr': 0.00048417759634367726, 'samples': 1433472, 'steps': 7465, 'loss/train': 1.1125440895557404} 01/27/2022 02:53:01 - INFO - codeparrot_training - Step 7466: {'lr': 0.00048417186725910414, 'samples': 1433664, 'steps': 7466, 'loss/train': 0.5986321270465851} 01/27/2022 02:53:04 - INFO - codeparrot_training - Step 7467: {'lr': 0.000484166137171413, 'samples': 1433856, 'steps': 7467, 'loss/train': 0.8859290778636932} 01/27/2022 02:53:07 - INFO - codeparrot_training - Step 7468: {'lr': 0.0004841604060806286, 'samples': 1434048, 'steps': 7468, 'loss/train': 0.990475058555603} 01/27/2022 02:53:10 - INFO - codeparrot_training - Step 7469: {'lr': 0.00048415467398677534, 'samples': 1434240, 'steps': 7469, 'loss/train': 0.6857859492301941} 01/27/2022 02:53:13 - INFO - codeparrot_training - Step 7470: {'lr': 0.0004841489408898778, 'samples': 1434432, 'steps': 7470, 'loss/train': 1.1163650751113892} 01/27/2022 02:53:16 - INFO - codeparrot_training - Step 7471: {'lr': 0.0004841432067899605, 'samples': 1434624, 'steps': 7471, 'loss/train': 1.0688905119895935} 01/27/2022 02:53:20 - INFO - codeparrot_training - Step 7472: {'lr': 0.0004841374716870481, 'samples': 1434816, 'steps': 7472, 'loss/train': 1.0634286403656006} 01/27/2022 02:53:23 - INFO - codeparrot_training - Step 7473: {'lr': 0.0004841317355811651, 'samples': 1435008, 'steps': 7473, 'loss/train': 0.9148719012737274} 01/27/2022 02:53:29 - INFO - codeparrot_training - Step 7474: {'lr': 0.00048412599847233613, 'samples': 1435200, 'steps': 7474, 'loss/train': 1.1476677060127258} 01/27/2022 02:53:32 - INFO - codeparrot_training - Step 7475: {'lr': 0.0004841202603605857, 'samples': 1435392, 'steps': 7475, 'loss/train': 0.48646558821201324} 01/27/2022 02:53:35 - INFO - codeparrot_training - Step 7476: {'lr': 0.0004841145212459384, 'samples': 1435584, 'steps': 7476, 'loss/train': 0.7114835232496262} 01/27/2022 02:53:38 - INFO - codeparrot_training - Step 7477: {'lr': 0.0004841087811284188, 'samples': 1435776, 'steps': 7477, 'loss/train': 0.9942750334739685} 01/27/2022 02:53:41 - INFO - codeparrot_training - Step 7478: {'lr': 0.0004841030400080516, 'samples': 1435968, 'steps': 7478, 'loss/train': 0.9437802135944366} 01/27/2022 02:53:45 - INFO - codeparrot_training - Step 7479: {'lr': 0.00048409729788486127, 'samples': 1436160, 'steps': 7479, 'loss/train': 1.343105524778366} 01/27/2022 02:53:48 - INFO - codeparrot_training - Step 7480: {'lr': 0.00048409155475887244, 'samples': 1436352, 'steps': 7480, 'loss/train': 0.8881424367427826} 01/27/2022 02:53:51 - INFO - codeparrot_training - Step 7481: {'lr': 0.00048408581063010973, 'samples': 1436544, 'steps': 7481, 'loss/train': 1.0369686484336853} 01/27/2022 02:53:55 - INFO - codeparrot_training - Step 7482: {'lr': 0.00048408006549859777, 'samples': 1436736, 'steps': 7482, 'loss/train': 1.0929190814495087} 01/27/2022 02:53:58 - INFO - codeparrot_training - Step 7483: {'lr': 0.00048407431936436116, 'samples': 1436928, 'steps': 7483, 'loss/train': 0.8223288953304291} 01/27/2022 02:54:01 - INFO - codeparrot_training - Step 7484: {'lr': 0.0004840685722274244, 'samples': 1437120, 'steps': 7484, 'loss/train': 1.0391244292259216} 01/27/2022 02:54:05 - INFO - codeparrot_training - Step 7485: {'lr': 0.00048406282408781226, 'samples': 1437312, 'steps': 7485, 'loss/train': 1.095378041267395} 01/27/2022 02:54:08 - INFO - codeparrot_training - Step 7486: {'lr': 0.0004840570749455493, 'samples': 1437504, 'steps': 7486, 
'loss/train': 0.40191730856895447} 01/27/2022 02:54:11 - INFO - codeparrot_training - Step 7487: {'lr': 0.00048405132480066015, 'samples': 1437696, 'steps': 7487, 'loss/train': 0.5823196023702621} 01/27/2022 02:54:14 - INFO - codeparrot_training - Step 7488: {'lr': 0.00048404557365316946, 'samples': 1437888, 'steps': 7488, 'loss/train': 0.543383002281189} 01/27/2022 02:54:17 - INFO - codeparrot_training - Step 7489: {'lr': 0.00048403982150310184, 'samples': 1438080, 'steps': 7489, 'loss/train': 0.9102653861045837} 01/27/2022 02:54:20 - INFO - codeparrot_training - Step 7490: {'lr': 0.0004840340683504819, 'samples': 1438272, 'steps': 7490, 'loss/train': 1.5913728475570679} 01/27/2022 02:54:25 - INFO - codeparrot_training - Step 7491: {'lr': 0.0004840283141953343, 'samples': 1438464, 'steps': 7491, 'loss/train': 1.448518306016922} 01/27/2022 02:54:28 - INFO - codeparrot_training - Step 7492: {'lr': 0.0004840225590376839, 'samples': 1438656, 'steps': 7492, 'loss/train': 0.41179776191711426} 01/27/2022 02:54:31 - INFO - codeparrot_training - Step 7493: {'lr': 0.000484016802877555, 'samples': 1438848, 'steps': 7493, 'loss/train': 0.8821797966957092} 01/27/2022 02:54:34 - INFO - codeparrot_training - Step 7494: {'lr': 0.00048401104571497245, 'samples': 1439040, 'steps': 7494, 'loss/train': 1.2661755681037903} 01/27/2022 02:54:38 - INFO - codeparrot_training - Step 7495: {'lr': 0.00048400528754996086, 'samples': 1439232, 'steps': 7495, 'loss/train': 0.7622010111808777} 01/27/2022 02:54:41 - INFO - codeparrot_training - Step 7496: {'lr': 0.000483999528382545, 'samples': 1439424, 'steps': 7496, 'loss/train': 1.132755070924759} 01/27/2022 02:54:44 - INFO - codeparrot_training - Step 7497: {'lr': 0.00048399376821274943, 'samples': 1439616, 'steps': 7497, 'loss/train': 0.6819687634706497} 01/27/2022 02:54:47 - INFO - codeparrot_training - Step 7498: {'lr': 0.00048398800704059887, 'samples': 1439808, 'steps': 7498, 'loss/train': 1.1312377452850342} 01/27/2022 02:54:50 - INFO - codeparrot_training - Step 7499: {'lr': 0.000483982244866118, 'samples': 1440000, 'steps': 7499, 'loss/train': 1.0797154605388641} 01/27/2022 02:54:56 - INFO - codeparrot_training - Step 7500: {'lr': 0.00048397648168933144, 'samples': 1440192, 'steps': 7500, 'loss/train': 0.8118403851985931} 01/27/2022 02:54:59 - INFO - codeparrot_training - Step 7501: {'lr': 0.00048397071751026395, 'samples': 1440384, 'steps': 7501, 'loss/train': 0.7410158962011337} 01/27/2022 02:55:03 - INFO - codeparrot_training - Step 7502: {'lr': 0.00048396495232894024, 'samples': 1440576, 'steps': 7502, 'loss/train': 0.6245460212230682} 01/27/2022 02:55:06 - INFO - codeparrot_training - Step 7503: {'lr': 0.0004839591861453849, 'samples': 1440768, 'steps': 7503, 'loss/train': 0.5008704364299774} 01/27/2022 02:55:09 - INFO - codeparrot_training - Step 7504: {'lr': 0.00048395341895962277, 'samples': 1440960, 'steps': 7504, 'loss/train': 0.9457974135875702} 01/27/2022 02:55:12 - INFO - codeparrot_training - Step 7505: {'lr': 0.0004839476507716784, 'samples': 1441152, 'steps': 7505, 'loss/train': 1.0401442050933838} 01/27/2022 02:55:15 - INFO - codeparrot_training - Step 7506: {'lr': 0.0004839418815815766, 'samples': 1441344, 'steps': 7506, 'loss/train': 0.6243360936641693} 01/27/2022 02:55:18 - INFO - codeparrot_training - Step 7507: {'lr': 0.0004839361113893421, 'samples': 1441536, 'steps': 7507, 'loss/train': 0.9568361341953278} 01/27/2022 02:55:21 - INFO - codeparrot_training - Step 7508: {'lr': 0.0004839303401949996, 'samples': 1441728, 'steps': 7508, 
'loss/train': 0.9235082566738129} 01/27/2022 02:55:26 - INFO - codeparrot_training - Step 7509: {'lr': 0.00048392456799857374, 'samples': 1441920, 'steps': 7509, 'loss/train': 1.0478965938091278} 01/27/2022 02:55:29 - INFO - codeparrot_training - Step 7510: {'lr': 0.0004839187948000893, 'samples': 1442112, 'steps': 7510, 'loss/train': 0.9348175227642059} 01/27/2022 02:55:32 - INFO - codeparrot_training - Step 7511: {'lr': 0.0004839130205995711, 'samples': 1442304, 'steps': 7511, 'loss/train': 0.9690278470516205} 01/27/2022 02:55:35 - INFO - codeparrot_training - Step 7512: {'lr': 0.0004839072453970438, 'samples': 1442496, 'steps': 7512, 'loss/train': 0.8616491854190826} 01/27/2022 02:55:38 - INFO - codeparrot_training - Step 7513: {'lr': 0.00048390146919253206, 'samples': 1442688, 'steps': 7513, 'loss/train': 0.6582963466644287} 01/27/2022 02:55:42 - INFO - codeparrot_training - Step 7514: {'lr': 0.0004838956919860607, 'samples': 1442880, 'steps': 7514, 'loss/train': 0.8786086142063141} 01/27/2022 02:55:45 - INFO - codeparrot_training - Step 7515: {'lr': 0.0004838899137776545, 'samples': 1443072, 'steps': 7515, 'loss/train': 0.10561903193593025} 01/27/2022 02:55:48 - INFO - codeparrot_training - Step 7516: {'lr': 0.00048388413456733814, 'samples': 1443264, 'steps': 7516, 'loss/train': 0.8980364799499512} 01/27/2022 02:55:51 - INFO - codeparrot_training - Step 7517: {'lr': 0.0004838783543551365, 'samples': 1443456, 'steps': 7517, 'loss/train': 0.8293173909187317} 01/27/2022 02:55:55 - INFO - codeparrot_training - Step 7518: {'lr': 0.0004838725731410742, 'samples': 1443648, 'steps': 7518, 'loss/train': 0.6316135972738266} 01/27/2022 02:55:59 - INFO - codeparrot_training - Step 7519: {'lr': 0.00048386679092517605, 'samples': 1443840, 'steps': 7519, 'loss/train': 1.3251986503601074} 01/27/2022 02:56:02 - INFO - codeparrot_training - Step 7520: {'lr': 0.00048386100770746686, 'samples': 1444032, 'steps': 7520, 'loss/train': 1.1447831690311432} 01/27/2022 02:56:05 - INFO - codeparrot_training - Step 7521: {'lr': 0.00048385522348797134, 'samples': 1444224, 'steps': 7521, 'loss/train': 1.0610007047653198} 01/27/2022 02:56:08 - INFO - codeparrot_training - Step 7522: {'lr': 0.0004838494382667143, 'samples': 1444416, 'steps': 7522, 'loss/train': 0.6451025158166885} 01/27/2022 02:56:11 - INFO - codeparrot_training - Step 7523: {'lr': 0.0004838436520437205, 'samples': 1444608, 'steps': 7523, 'loss/train': 0.5896165072917938} 01/27/2022 02:56:14 - INFO - codeparrot_training - Step 7524: {'lr': 0.00048383786481901483, 'samples': 1444800, 'steps': 7524, 'loss/train': 0.5841744840145111} 01/27/2022 02:56:17 - INFO - codeparrot_training - Step 7525: {'lr': 0.00048383207659262196, 'samples': 1444992, 'steps': 7525, 'loss/train': 0.8955707252025604} 01/27/2022 02:56:22 - INFO - codeparrot_training - Step 7526: {'lr': 0.0004838262873645667, 'samples': 1445184, 'steps': 7526, 'loss/train': 1.0604113340377808} 01/27/2022 02:56:25 - INFO - codeparrot_training - Step 7527: {'lr': 0.00048382049713487383, 'samples': 1445376, 'steps': 7527, 'loss/train': 0.5787443518638611} 01/27/2022 02:56:28 - INFO - codeparrot_training - Step 7528: {'lr': 0.00048381470590356835, 'samples': 1445568, 'steps': 7528, 'loss/train': 0.8691869080066681} 01/27/2022 02:56:32 - INFO - codeparrot_training - Step 7529: {'lr': 0.00048380891367067483, 'samples': 1445760, 'steps': 7529, 'loss/train': 0.6652777343988419} 01/27/2022 02:56:35 - INFO - codeparrot_training - Step 7530: {'lr': 0.0004838031204362181, 'samples': 1445952, 'steps': 7530, 
'loss/train': 0.4944991171360016} 01/27/2022 02:56:38 - INFO - codeparrot_training - Step 7531: {'lr': 0.0004837973262002231, 'samples': 1446144, 'steps': 7531, 'loss/train': 0.7876331806182861} 01/27/2022 02:56:41 - INFO - codeparrot_training - Step 7532: {'lr': 0.0004837915309627146, 'samples': 1446336, 'steps': 7532, 'loss/train': 0.746796190738678} 01/27/2022 02:56:44 - INFO - codeparrot_training - Step 7533: {'lr': 0.00048378573472371744, 'samples': 1446528, 'steps': 7533, 'loss/train': 0.8474599421024323} 01/27/2022 02:56:47 - INFO - codeparrot_training - Step 7534: {'lr': 0.0004837799374832564, 'samples': 1446720, 'steps': 7534, 'loss/train': 0.8158031702041626} 01/27/2022 02:56:53 - INFO - codeparrot_training - Step 7535: {'lr': 0.0004837741392413563, 'samples': 1446912, 'steps': 7535, 'loss/train': 0.878383219242096} 01/27/2022 02:56:57 - INFO - codeparrot_training - Step 7536: {'lr': 0.000483768339998042, 'samples': 1447104, 'steps': 7536, 'loss/train': 1.0398503243923187} 01/27/2022 02:57:00 - INFO - codeparrot_training - Step 7537: {'lr': 0.0004837625397533385, 'samples': 1447296, 'steps': 7537, 'loss/train': 1.2911444306373596} 01/27/2022 02:57:03 - INFO - codeparrot_training - Step 7538: {'lr': 0.00048375673850727043, 'samples': 1447488, 'steps': 7538, 'loss/train': 0.9847868978977203} 01/27/2022 02:57:06 - INFO - codeparrot_training - Step 7539: {'lr': 0.00048375093625986274, 'samples': 1447680, 'steps': 7539, 'loss/train': 0.6459067761898041} 01/27/2022 02:57:09 - INFO - codeparrot_training - Step 7540: {'lr': 0.0004837451330111402, 'samples': 1447872, 'steps': 7540, 'loss/train': 0.5743617117404938} 01/27/2022 02:57:12 - INFO - codeparrot_training - Step 7541: {'lr': 0.0004837393287611278, 'samples': 1448064, 'steps': 7541, 'loss/train': 0.9296711683273315} 01/27/2022 02:57:15 - INFO - codeparrot_training - Step 7542: {'lr': 0.0004837335235098503, 'samples': 1448256, 'steps': 7542, 'loss/train': 1.1347551941871643} 01/27/2022 02:57:19 - INFO - codeparrot_training - Step 7543: {'lr': 0.0004837277172573326, 'samples': 1448448, 'steps': 7543, 'loss/train': 0.8187842667102814} 01/27/2022 02:57:23 - INFO - codeparrot_training - Step 7544: {'lr': 0.00048372191000359955, 'samples': 1448640, 'steps': 7544, 'loss/train': 0.9017757475376129} 01/27/2022 02:57:26 - INFO - codeparrot_training - Step 7545: {'lr': 0.00048371610174867614, 'samples': 1448832, 'steps': 7545, 'loss/train': 0.584307074546814} 01/27/2022 02:57:29 - INFO - codeparrot_training - Step 7546: {'lr': 0.00048371029249258716, 'samples': 1449024, 'steps': 7546, 'loss/train': 1.0518494546413422} 01/27/2022 02:57:33 - INFO - codeparrot_training - Step 7547: {'lr': 0.0004837044822353574, 'samples': 1449216, 'steps': 7547, 'loss/train': 1.0933268666267395} 01/27/2022 02:57:36 - INFO - codeparrot_training - Step 7548: {'lr': 0.0004836986709770119, 'samples': 1449408, 'steps': 7548, 'loss/train': 0.47284895181655884} 01/27/2022 02:57:39 - INFO - codeparrot_training - Step 7549: {'lr': 0.00048369285871757554, 'samples': 1449600, 'steps': 7549, 'loss/train': 0.43844349682331085} 01/27/2022 02:57:42 - INFO - codeparrot_training - Step 7550: {'lr': 0.0004836870454570731, 'samples': 1449792, 'steps': 7550, 'loss/train': 0.8797785043716431} 01/27/2022 02:57:45 - INFO - codeparrot_training - Step 7551: {'lr': 0.00048368123119552965, 'samples': 1449984, 'steps': 7551, 'loss/train': 1.0014592409133911} 01/27/2022 02:57:48 - INFO - codeparrot_training - Step 7552: {'lr': 0.00048367541593296996, 'samples': 1450176, 'steps': 7552, 
'loss/train': 0.6751915365457535} 01/27/2022 02:57:53 - INFO - codeparrot_training - Step 7553: {'lr': 0.00048366959966941893, 'samples': 1450368, 'steps': 7553, 'loss/train': 0.802943766117096} 01/27/2022 02:57:56 - INFO - codeparrot_training - Step 7554: {'lr': 0.0004836637824049016, 'samples': 1450560, 'steps': 7554, 'loss/train': 0.8707664608955383} 01/27/2022 02:57:59 - INFO - codeparrot_training - Step 7555: {'lr': 0.00048365796413944284, 'samples': 1450752, 'steps': 7555, 'loss/train': 0.8078220784664154} 01/27/2022 02:58:02 - INFO - codeparrot_training - Step 7556: {'lr': 0.00048365214487306753, 'samples': 1450944, 'steps': 7556, 'loss/train': 0.5097995549440384} 01/27/2022 02:58:05 - INFO - codeparrot_training - Step 7557: {'lr': 0.0004836463246058006, 'samples': 1451136, 'steps': 7557, 'loss/train': 0.08909215964376926} 01/27/2022 02:58:09 - INFO - codeparrot_training - Step 7558: {'lr': 0.0004836405033376671, 'samples': 1451328, 'steps': 7558, 'loss/train': 0.13681859895586967} 01/27/2022 02:58:12 - INFO - codeparrot_training - Step 7559: {'lr': 0.00048363468106869177, 'samples': 1451520, 'steps': 7559, 'loss/train': 0.8397833704948425} 01/27/2022 02:58:15 - INFO - codeparrot_training - Step 7560: {'lr': 0.00048362885779889967, 'samples': 1451712, 'steps': 7560, 'loss/train': 0.6269799023866653} 01/27/2022 02:58:18 - INFO - codeparrot_training - Step 7561: {'lr': 0.0004836230335283158, 'samples': 1451904, 'steps': 7561, 'loss/train': 0.6208176165819168} 01/27/2022 02:58:24 - INFO - codeparrot_training - Step 7562: {'lr': 0.00048361720825696494, 'samples': 1452096, 'steps': 7562, 'loss/train': 0.6894878000020981} 01/27/2022 02:58:27 - INFO - codeparrot_training - Step 7563: {'lr': 0.0004836113819848722, 'samples': 1452288, 'steps': 7563, 'loss/train': 0.8708741068840027} 01/27/2022 02:58:30 - INFO - codeparrot_training - Step 7564: {'lr': 0.0004836055547120625, 'samples': 1452480, 'steps': 7564, 'loss/train': 0.7360027134418488} 01/27/2022 02:58:33 - INFO - codeparrot_training - Step 7565: {'lr': 0.0004835997264385607, 'samples': 1452672, 'steps': 7565, 'loss/train': 0.638710230588913} 01/27/2022 02:58:37 - INFO - codeparrot_training - Step 7566: {'lr': 0.0004835938971643919, 'samples': 1452864, 'steps': 7566, 'loss/train': 1.1288261711597443} 01/27/2022 02:58:40 - INFO - codeparrot_training - Step 7567: {'lr': 0.000483588066889581, 'samples': 1453056, 'steps': 7567, 'loss/train': 0.9682736992835999} 01/27/2022 02:58:43 - INFO - codeparrot_training - Step 7568: {'lr': 0.00048358223561415306, 'samples': 1453248, 'steps': 7568, 'loss/train': 0.9999256432056427} 01/27/2022 02:58:46 - INFO - codeparrot_training - Step 7569: {'lr': 0.0004835764033381329, 'samples': 1453440, 'steps': 7569, 'loss/train': 0.5071978569030762} 01/27/2022 02:58:49 - INFO - codeparrot_training - Step 7570: {'lr': 0.00048357057006154566, 'samples': 1453632, 'steps': 7570, 'loss/train': 0.35201798379421234} 01/27/2022 02:58:54 - INFO - codeparrot_training - Step 7571: {'lr': 0.0004835647357844162, 'samples': 1453824, 'steps': 7571, 'loss/train': 1.0055887699127197} 01/27/2022 02:58:57 - INFO - codeparrot_training - Step 7572: {'lr': 0.00048355890050676966, 'samples': 1454016, 'steps': 7572, 'loss/train': 1.0664891302585602} 01/27/2022 02:59:00 - INFO - codeparrot_training - Step 7573: {'lr': 0.0004835530642286309, 'samples': 1454208, 'steps': 7573, 'loss/train': 0.9950547516345978} 01/27/2022 02:59:03 - INFO - codeparrot_training - Step 7574: {'lr': 0.000483547226950025, 'samples': 1454400, 'steps': 7574, 
'loss/train': 0.7567064166069031} 01/27/2022 02:59:06 - INFO - codeparrot_training - Step 7575: {'lr': 0.00048354138867097695, 'samples': 1454592, 'steps': 7575, 'loss/train': 0.792333573102951} 01/27/2022 02:59:10 - INFO - codeparrot_training - Step 7576: {'lr': 0.00048353554939151167, 'samples': 1454784, 'steps': 7576, 'loss/train': 0.9951333403587341} 01/27/2022 02:59:13 - INFO - codeparrot_training - Step 7577: {'lr': 0.00048352970911165434, 'samples': 1454976, 'steps': 7577, 'loss/train': 0.9186318218708038} 01/27/2022 02:59:16 - INFO - codeparrot_training - Step 7578: {'lr': 0.0004835238678314299, 'samples': 1455168, 'steps': 7578, 'loss/train': 1.0412828028202057} 01/27/2022 02:59:19 - INFO - codeparrot_training - Step 7579: {'lr': 0.00048351802555086335, 'samples': 1455360, 'steps': 7579, 'loss/train': 1.1981520652770996} 01/27/2022 02:59:25 - INFO - codeparrot_training - Step 7580: {'lr': 0.0004835121822699796, 'samples': 1455552, 'steps': 7580, 'loss/train': 0.3369762450456619} 01/27/2022 02:59:28 - INFO - codeparrot_training - Step 7581: {'lr': 0.00048350633798880397, 'samples': 1455744, 'steps': 7581, 'loss/train': 0.9579931497573853} 01/27/2022 02:59:32 - INFO - codeparrot_training - Step 7582: {'lr': 0.0004835004927073613, 'samples': 1455936, 'steps': 7582, 'loss/train': 1.4370448887348175} 01/27/2022 02:59:35 - INFO - codeparrot_training - Step 7583: {'lr': 0.0004834946464256766, 'samples': 1456128, 'steps': 7583, 'loss/train': 1.2060244381427765} 01/27/2022 02:59:38 - INFO - codeparrot_training - Step 7584: {'lr': 0.00048348879914377504, 'samples': 1456320, 'steps': 7584, 'loss/train': 1.0950269401073456} 01/27/2022 02:59:41 - INFO - codeparrot_training - Step 7585: {'lr': 0.0004834829508616816, 'samples': 1456512, 'steps': 7585, 'loss/train': 0.8842275738716125} 01/27/2022 02:59:44 - INFO - codeparrot_training - Step 7586: {'lr': 0.00048347710157942126, 'samples': 1456704, 'steps': 7586, 'loss/train': 0.9269717931747437} 01/27/2022 02:59:47 - INFO - codeparrot_training - Step 7587: {'lr': 0.00048347125129701924, 'samples': 1456896, 'steps': 7587, 'loss/train': 0.7780607342720032} 01/27/2022 02:59:52 - INFO - codeparrot_training - Step 7588: {'lr': 0.00048346540001450045, 'samples': 1457088, 'steps': 7588, 'loss/train': 0.9682931005954742} 01/27/2022 02:59:55 - INFO - codeparrot_training - Step 7589: {'lr': 0.0004834595477318901, 'samples': 1457280, 'steps': 7589, 'loss/train': 0.9096969366073608} 01/27/2022 02:59:58 - INFO - codeparrot_training - Step 7590: {'lr': 0.00048345369444921315, 'samples': 1457472, 'steps': 7590, 'loss/train': 0.7601725459098816} 01/27/2022 03:00:01 - INFO - codeparrot_training - Step 7591: {'lr': 0.00048344784016649467, 'samples': 1457664, 'steps': 7591, 'loss/train': 0.6545356214046478} 01/27/2022 03:00:04 - INFO - codeparrot_training - Step 7592: {'lr': 0.0004834419848837598, 'samples': 1457856, 'steps': 7592, 'loss/train': 0.7523170709609985} 01/27/2022 03:00:07 - INFO - codeparrot_training - Step 7593: {'lr': 0.0004834361286010336, 'samples': 1458048, 'steps': 7593, 'loss/train': 0.4017167240381241} 01/27/2022 03:00:10 - INFO - codeparrot_training - Step 7594: {'lr': 0.0004834302713183411, 'samples': 1458240, 'steps': 7594, 'loss/train': 0.9323156476020813} 01/27/2022 03:00:14 - INFO - codeparrot_training - Step 7595: {'lr': 0.0004834244130357075, 'samples': 1458432, 'steps': 7595, 'loss/train': 0.5965173840522766} 01/27/2022 03:00:17 - INFO - codeparrot_training - Step 7596: {'lr': 0.0004834185537531578, 'samples': 1458624, 'steps': 7596, 
'loss/train': 1.0190193057060242} 01/27/2022 03:00:21 - INFO - codeparrot_training - Step 7597: {'lr': 0.00048341269347071717, 'samples': 1458816, 'steps': 7597, 'loss/train': 0.951317310333252} 01/27/2022 03:00:24 - INFO - codeparrot_training - Step 7598: {'lr': 0.00048340683218841066, 'samples': 1459008, 'steps': 7598, 'loss/train': 0.7742800712585449} 01/27/2022 03:00:27 - INFO - codeparrot_training - Step 7599: {'lr': 0.00048340096990626336, 'samples': 1459200, 'steps': 7599, 'loss/train': 1.1957912743091583} 01/27/2022 03:00:30 - INFO - codeparrot_training - Step 7600: {'lr': 0.00048339510662430044, 'samples': 1459392, 'steps': 7600, 'loss/train': 0.9138658940792084} 01/27/2022 03:00:34 - INFO - codeparrot_training - Step 7601: {'lr': 0.000483389242342547, 'samples': 1459584, 'steps': 7601, 'loss/train': 1.2248471975326538} 01/27/2022 03:00:37 - INFO - codeparrot_training - Step 7602: {'lr': 0.00048338337706102817, 'samples': 1459776, 'steps': 7602, 'loss/train': 1.2084367275238037} 01/27/2022 03:00:40 - INFO - codeparrot_training - Step 7603: {'lr': 0.00048337751077976907, 'samples': 1459968, 'steps': 7603, 'loss/train': 1.157139390707016} 01/27/2022 03:00:43 - INFO - codeparrot_training - Step 7604: {'lr': 0.0004833716434987948, 'samples': 1460160, 'steps': 7604, 'loss/train': 1.0042530298233032} 01/27/2022 03:00:46 - INFO - codeparrot_training - Step 7605: {'lr': 0.0004833657752181305, 'samples': 1460352, 'steps': 7605, 'loss/train': 0.823475182056427} 01/27/2022 03:00:52 - INFO - codeparrot_training - Step 7606: {'lr': 0.00048335990593780133, 'samples': 1460544, 'steps': 7606, 'loss/train': 0.8764871656894684} 01/27/2022 03:00:56 - INFO - codeparrot_training - Step 7607: {'lr': 0.00048335403565783245, 'samples': 1460736, 'steps': 7607, 'loss/train': 0.7459153980016708} 01/27/2022 03:00:59 - INFO - codeparrot_training - Step 7608: {'lr': 0.0004833481643782489, 'samples': 1460928, 'steps': 7608, 'loss/train': 1.0033383071422577} 01/27/2022 03:01:02 - INFO - codeparrot_training - Step 7609: {'lr': 0.000483342292099076, 'samples': 1461120, 'steps': 7609, 'loss/train': 0.8624515235424042} 01/27/2022 03:01:05 - INFO - codeparrot_training - Step 7610: {'lr': 0.0004833364188203387, 'samples': 1461312, 'steps': 7610, 'loss/train': 0.7706023156642914} 01/27/2022 03:01:08 - INFO - codeparrot_training - Step 7611: {'lr': 0.0004833305445420624, 'samples': 1461504, 'steps': 7611, 'loss/train': 0.6867763549089432} 01/27/2022 03:01:11 - INFO - codeparrot_training - Step 7612: {'lr': 0.0004833246692642721, 'samples': 1461696, 'steps': 7612, 'loss/train': 1.298038512468338} 01/27/2022 03:01:14 - INFO - codeparrot_training - Step 7613: {'lr': 0.000483318792986993, 'samples': 1461888, 'steps': 7613, 'loss/train': 1.255040019750595} 01/27/2022 03:01:20 - INFO - codeparrot_training - Step 7614: {'lr': 0.00048331291571025026, 'samples': 1462080, 'steps': 7614, 'loss/train': 0.8770003616809845} 01/27/2022 03:01:23 - INFO - codeparrot_training - Step 7615: {'lr': 0.0004833070374340691, 'samples': 1462272, 'steps': 7615, 'loss/train': 1.0903558731079102} 01/27/2022 03:01:26 - INFO - codeparrot_training - Step 7616: {'lr': 0.00048330115815847465, 'samples': 1462464, 'steps': 7616, 'loss/train': 0.7348701059818268} 01/27/2022 03:01:29 - INFO - codeparrot_training - Step 7617: {'lr': 0.00048329527788349216, 'samples': 1462656, 'steps': 7617, 'loss/train': 1.6727998852729797} 01/27/2022 03:01:32 - INFO - codeparrot_training - Step 7618: {'lr': 0.0004832893966091467, 'samples': 1462848, 'steps': 7618, 
'loss/train': 1.8241446018218994} 01/27/2022 03:01:35 - INFO - codeparrot_training - Step 7619: {'lr': 0.00048328351433546364, 'samples': 1463040, 'steps': 7619, 'loss/train': 0.8015196025371552} 01/27/2022 03:01:38 - INFO - codeparrot_training - Step 7620: {'lr': 0.000483277631062468, 'samples': 1463232, 'steps': 7620, 'loss/train': 0.6890595406293869} 01/27/2022 03:01:42 - INFO - codeparrot_training - Step 7621: {'lr': 0.00048327174679018515, 'samples': 1463424, 'steps': 7621, 'loss/train': 1.0285073518753052} 01/27/2022 03:01:45 - INFO - codeparrot_training - Step 7622: {'lr': 0.00048326586151864015, 'samples': 1463616, 'steps': 7622, 'loss/train': 0.7207586467266083} 01/27/2022 03:01:48 - INFO - codeparrot_training - Step 7623: {'lr': 0.00048325997524785826, 'samples': 1463808, 'steps': 7623, 'loss/train': 0.635201632976532} 01/27/2022 03:01:52 - INFO - codeparrot_training - Step 7624: {'lr': 0.00048325408797786476, 'samples': 1464000, 'steps': 7624, 'loss/train': 0.5636766403913498} 01/27/2022 03:01:56 - INFO - codeparrot_training - Step 7625: {'lr': 0.00048324819970868473, 'samples': 1464192, 'steps': 7625, 'loss/train': 0.9715841710567474} 01/27/2022 03:01:59 - INFO - codeparrot_training - Step 7626: {'lr': 0.0004832423104403435, 'samples': 1464384, 'steps': 7626, 'loss/train': 0.97709521651268} 01/27/2022 03:02:02 - INFO - codeparrot_training - Step 7627: {'lr': 0.0004832364201728663, 'samples': 1464576, 'steps': 7627, 'loss/train': 0.571823000907898} 01/27/2022 03:02:05 - INFO - codeparrot_training - Step 7628: {'lr': 0.0004832305289062784, 'samples': 1464768, 'steps': 7628, 'loss/train': 0.8800734579563141} 01/27/2022 03:02:08 - INFO - codeparrot_training - Step 7629: {'lr': 0.0004832246366406049, 'samples': 1464960, 'steps': 7629, 'loss/train': 0.42760366201400757} 01/27/2022 03:02:11 - INFO - codeparrot_training - Step 7630: {'lr': 0.00048321874337587105, 'samples': 1465152, 'steps': 7630, 'loss/train': 0.6239087283611298} 01/27/2022 03:02:14 - INFO - codeparrot_training - Step 7631: {'lr': 0.0004832128491121023, 'samples': 1465344, 'steps': 7631, 'loss/train': 0.6425810158252716} 01/27/2022 03:02:21 - INFO - codeparrot_training - Step 7632: {'lr': 0.00048320695384932366, 'samples': 1465536, 'steps': 7632, 'loss/train': 0.504452258348465} 01/27/2022 03:02:24 - INFO - codeparrot_training - Step 7633: {'lr': 0.0004832010575875605, 'samples': 1465728, 'steps': 7633, 'loss/train': 1.200670748949051} 01/27/2022 03:02:28 - INFO - codeparrot_training - Step 7634: {'lr': 0.0004831951603268381, 'samples': 1465920, 'steps': 7634, 'loss/train': 0.6618849188089371} 01/27/2022 03:02:31 - INFO - codeparrot_training - Step 7635: {'lr': 0.0004831892620671816, 'samples': 1466112, 'steps': 7635, 'loss/train': 0.4754306226968765} 01/27/2022 03:02:34 - INFO - codeparrot_training - Step 7636: {'lr': 0.0004831833628086164, 'samples': 1466304, 'steps': 7636, 'loss/train': 0.8575435280799866} 01/27/2022 03:02:37 - INFO - codeparrot_training - Step 7637: {'lr': 0.0004831774625511677, 'samples': 1466496, 'steps': 7637, 'loss/train': 0.2975154519081116} 01/27/2022 03:02:40 - INFO - codeparrot_training - Step 7638: {'lr': 0.00048317156129486086, 'samples': 1466688, 'steps': 7638, 'loss/train': 0.6466656625270844} 01/27/2022 03:02:43 - INFO - codeparrot_training - Step 7639: {'lr': 0.000483165659039721, 'samples': 1466880, 'steps': 7639, 'loss/train': 0.7232532799243927} 01/27/2022 03:02:46 - INFO - codeparrot_training - Step 7640: {'lr': 0.0004831597557857735, 'samples': 1467072, 'steps': 7640, 
'loss/train': 0.7560909390449524} 01/27/2022 03:02:50 - INFO - codeparrot_training - Step 7641: {'lr': 0.0004831538515330437, 'samples': 1467264, 'steps': 7641, 'loss/train': 0.8547402620315552} 01/27/2022 03:02:54 - INFO - codeparrot_training - Step 7642: {'lr': 0.0004831479462815568, 'samples': 1467456, 'steps': 7642, 'loss/train': 0.6870401948690414} 01/27/2022 03:02:57 - INFO - codeparrot_training - Step 7643: {'lr': 0.00048314204003133815, 'samples': 1467648, 'steps': 7643, 'loss/train': 0.9064047038555145} 01/27/2022 03:03:00 - INFO - codeparrot_training - Step 7644: {'lr': 0.00048313613278241305, 'samples': 1467840, 'steps': 7644, 'loss/train': 1.146891564130783} 01/27/2022 03:03:04 - INFO - codeparrot_training - Step 7645: {'lr': 0.0004831302245348068, 'samples': 1468032, 'steps': 7645, 'loss/train': 0.6220729351043701} 01/27/2022 03:03:07 - INFO - codeparrot_training - Step 7646: {'lr': 0.0004831243152885446, 'samples': 1468224, 'steps': 7646, 'loss/train': 0.7748553156852722} 01/27/2022 03:03:10 - INFO - codeparrot_training - Step 7647: {'lr': 0.0004831184050436519, 'samples': 1468416, 'steps': 7647, 'loss/train': 0.809967041015625} 01/27/2022 03:03:13 - INFO - codeparrot_training - Step 7648: {'lr': 0.000483112493800154, 'samples': 1468608, 'steps': 7648, 'loss/train': 0.2919779643416405} 01/27/2022 03:03:16 - INFO - codeparrot_training - Step 7649: {'lr': 0.0004831065815580762, 'samples': 1468800, 'steps': 7649, 'loss/train': 1.2434191703796387} 01/27/2022 03:03:19 - INFO - codeparrot_training - Step 7650: {'lr': 0.0004831006683174438, 'samples': 1468992, 'steps': 7650, 'loss/train': 0.8198688626289368} 01/27/2022 03:03:24 - INFO - codeparrot_training - Step 7651: {'lr': 0.0004830947540782822, 'samples': 1469184, 'steps': 7651, 'loss/train': 0.9373391568660736} 01/27/2022 03:03:27 - INFO - codeparrot_training - Step 7652: {'lr': 0.0004830888388406166, 'samples': 1469376, 'steps': 7652, 'loss/train': 0.8364408016204834} 01/27/2022 03:03:30 - INFO - codeparrot_training - Step 7653: {'lr': 0.0004830829226044725, 'samples': 1469568, 'steps': 7653, 'loss/train': 1.0168322324752808} 01/27/2022 03:03:33 - INFO - codeparrot_training - Step 7654: {'lr': 0.0004830770053698752, 'samples': 1469760, 'steps': 7654, 'loss/train': 1.1086138486862183} 01/27/2022 03:03:37 - INFO - codeparrot_training - Step 7655: {'lr': 0.00048307108713684994, 'samples': 1469952, 'steps': 7655, 'loss/train': 0.614548847079277} 01/27/2022 03:03:40 - INFO - codeparrot_training - Step 7656: {'lr': 0.00048306516790542223, 'samples': 1470144, 'steps': 7656, 'loss/train': 0.9376108646392822} 01/27/2022 03:03:43 - INFO - codeparrot_training - Step 7657: {'lr': 0.00048305924767561725, 'samples': 1470336, 'steps': 7657, 'loss/train': 0.5010148733854294} 01/27/2022 03:03:46 - INFO - codeparrot_training - Step 7658: {'lr': 0.00048305332644746053, 'samples': 1470528, 'steps': 7658, 'loss/train': 0.9435427486896515} 01/27/2022 03:03:49 - INFO - codeparrot_training - Step 7659: {'lr': 0.0004830474042209774, 'samples': 1470720, 'steps': 7659, 'loss/train': 0.7878119051456451} 01/27/2022 03:03:55 - INFO - codeparrot_training - Step 7660: {'lr': 0.00048304148099619304, 'samples': 1470912, 'steps': 7660, 'loss/train': 0.8085560202598572} 01/27/2022 03:03:58 - INFO - codeparrot_training - Step 7661: {'lr': 0.0004830355567731331, 'samples': 1471104, 'steps': 7661, 'loss/train': 0.7345127910375595} 01/27/2022 03:04:01 - INFO - codeparrot_training - Step 7662: {'lr': 0.0004830296315518228, 'samples': 1471296, 'steps': 7662, 
'loss/train': 1.0668336153030396} 01/27/2022 03:04:05 - INFO - codeparrot_training - Step 7663: {'lr': 0.00048302370533228754, 'samples': 1471488, 'steps': 7663, 'loss/train': 0.6105150282382965} 01/27/2022 03:04:08 - INFO - codeparrot_training - Step 7664: {'lr': 0.00048301777811455274, 'samples': 1471680, 'steps': 7664, 'loss/train': 0.934611439704895} 01/27/2022 03:04:11 - INFO - codeparrot_training - Step 7665: {'lr': 0.0004830118498986438, 'samples': 1471872, 'steps': 7665, 'loss/train': 0.7985307276248932} 01/27/2022 03:04:14 - INFO - codeparrot_training - Step 7666: {'lr': 0.000483005920684586, 'samples': 1472064, 'steps': 7666, 'loss/train': 1.1436453759670258} 01/27/2022 03:04:17 - INFO - codeparrot_training - Step 7667: {'lr': 0.0004829999904724049, 'samples': 1472256, 'steps': 7667, 'loss/train': 0.7354395389556885} 01/27/2022 03:04:22 - INFO - codeparrot_training - Step 7668: {'lr': 0.0004829940592621258, 'samples': 1472448, 'steps': 7668, 'loss/train': 0.8391033411026001} 01/27/2022 03:04:25 - INFO - codeparrot_training - Step 7669: {'lr': 0.00048298812705377414, 'samples': 1472640, 'steps': 7669, 'loss/train': 1.229896992444992} 01/27/2022 03:04:28 - INFO - codeparrot_training - Step 7670: {'lr': 0.0004829821938473753, 'samples': 1472832, 'steps': 7670, 'loss/train': 0.7460888922214508} 01/27/2022 03:04:31 - INFO - codeparrot_training - Step 7671: {'lr': 0.0004829762596429548, 'samples': 1473024, 'steps': 7671, 'loss/train': 0.6615315824747086} 01/27/2022 03:04:34 - INFO - codeparrot_training - Step 7672: {'lr': 0.0004829703244405379, 'samples': 1473216, 'steps': 7672, 'loss/train': 0.9749057292938232} 01/27/2022 03:04:37 - INFO - codeparrot_training - Step 7673: {'lr': 0.0004829643882401501, 'samples': 1473408, 'steps': 7673, 'loss/train': 1.130369782447815} 01/27/2022 03:04:40 - INFO - codeparrot_training - Step 7674: {'lr': 0.0004829584510418169, 'samples': 1473600, 'steps': 7674, 'loss/train': 0.7473610639572144} 01/27/2022 03:04:44 - INFO - codeparrot_training - Step 7675: {'lr': 0.00048295251284556363, 'samples': 1473792, 'steps': 7675, 'loss/train': 0.8496226072311401} 01/27/2022 03:04:47 - INFO - codeparrot_training - Step 7676: {'lr': 0.0004829465736514157, 'samples': 1473984, 'steps': 7676, 'loss/train': 0.5137948393821716} 01/27/2022 03:04:51 - INFO - codeparrot_training - Step 7677: {'lr': 0.00048294063345939877, 'samples': 1474176, 'steps': 7677, 'loss/train': 1.066158413887024} 01/27/2022 03:04:55 - INFO - codeparrot_training - Step 7678: {'lr': 0.000482934692269538, 'samples': 1474368, 'steps': 7678, 'loss/train': 0.9597440958023071} 01/27/2022 03:04:58 - INFO - codeparrot_training - Step 7679: {'lr': 0.00048292875008185896, 'samples': 1474560, 'steps': 7679, 'loss/train': 1.3621509075164795} 01/27/2022 03:05:01 - INFO - codeparrot_training - Step 7680: {'lr': 0.0004829228068963872, 'samples': 1474752, 'steps': 7680, 'loss/train': 1.0827862322330475} 01/27/2022 03:05:04 - INFO - codeparrot_training - Step 7681: {'lr': 0.00048291686271314816, 'samples': 1474944, 'steps': 7681, 'loss/train': 1.0184938609600067} 01/27/2022 03:05:07 - INFO - codeparrot_training - Step 7682: {'lr': 0.0004829109175321671, 'samples': 1475136, 'steps': 7682, 'loss/train': 0.915107935667038} 01/27/2022 03:05:10 - INFO - codeparrot_training - Step 7683: {'lr': 0.00048290497135346965, 'samples': 1475328, 'steps': 7683, 'loss/train': 0.8316857814788818} 01/27/2022 03:05:13 - INFO - codeparrot_training - Step 7684: {'lr': 0.0004828990241770813, 'samples': 1475520, 'steps': 7684, 
'loss/train': 1.1984860002994537} 01/27/2022 03:05:17 - INFO - codeparrot_training - Step 7685: {'lr': 0.0004828930760030275, 'samples': 1475712, 'steps': 7685, 'loss/train': 0.8514697551727295} 01/27/2022 03:05:23 - INFO - codeparrot_training - Step 7686: {'lr': 0.0004828871268313337, 'samples': 1475904, 'steps': 7686, 'loss/train': 0.8870801031589508} 01/27/2022 03:05:26 - INFO - codeparrot_training - Step 7687: {'lr': 0.0004828811766620254, 'samples': 1476096, 'steps': 7687, 'loss/train': 0.8287467956542969} 01/27/2022 03:05:29 - INFO - codeparrot_training - Step 7688: {'lr': 0.00048287522549512806, 'samples': 1476288, 'steps': 7688, 'loss/train': 0.9563798010349274} 01/27/2022 03:05:33 - INFO - codeparrot_training - Step 7689: {'lr': 0.0004828692733306672, 'samples': 1476480, 'steps': 7689, 'loss/train': 0.7743388116359711} 01/27/2022 03:05:36 - INFO - codeparrot_training - Step 7690: {'lr': 0.0004828633201686684, 'samples': 1476672, 'steps': 7690, 'loss/train': 0.8869370520114899} 01/27/2022 03:05:39 - INFO - codeparrot_training - Step 7691: {'lr': 0.00048285736600915696, 'samples': 1476864, 'steps': 7691, 'loss/train': 0.8838708400726318} 01/27/2022 03:05:42 - INFO - codeparrot_training - Step 7692: {'lr': 0.00048285141085215857, 'samples': 1477056, 'steps': 7692, 'loss/train': 1.0491009056568146} 01/27/2022 03:05:45 - INFO - codeparrot_training - Step 7693: {'lr': 0.0004828454546976987, 'samples': 1477248, 'steps': 7693, 'loss/train': 0.22056761384010315} 01/27/2022 03:05:50 - INFO - codeparrot_training - Step 7694: {'lr': 0.00048283949754580283, 'samples': 1477440, 'steps': 7694, 'loss/train': 0.9778264760971069} 01/27/2022 03:05:53 - INFO - codeparrot_training - Step 7695: {'lr': 0.00048283353939649644, 'samples': 1477632, 'steps': 7695, 'loss/train': 0.591932013630867} 01/27/2022 03:05:56 - INFO - codeparrot_training - Step 7696: {'lr': 0.0004828275802498051, 'samples': 1477824, 'steps': 7696, 'loss/train': 0.7260535061359406} 01/27/2022 03:05:59 - INFO - codeparrot_training - Step 7697: {'lr': 0.0004828216201057544, 'samples': 1478016, 'steps': 7697, 'loss/train': 0.8873619139194489} 01/27/2022 03:06:02 - INFO - codeparrot_training - Step 7698: {'lr': 0.00048281565896436966, 'samples': 1478208, 'steps': 7698, 'loss/train': 0.671002596616745} 01/27/2022 03:06:05 - INFO - codeparrot_training - Step 7699: {'lr': 0.0004828096968256767, 'samples': 1478400, 'steps': 7699, 'loss/train': 0.7821603119373322} 01/27/2022 03:06:09 - INFO - codeparrot_training - Step 7700: {'lr': 0.00048280373368970086, 'samples': 1478592, 'steps': 7700, 'loss/train': 0.8332138359546661} 01/27/2022 03:06:12 - INFO - codeparrot_training - Step 7701: {'lr': 0.0004827977695564678, 'samples': 1478784, 'steps': 7701, 'loss/train': 0.9645448923110962} 01/27/2022 03:06:15 - INFO - codeparrot_training - Step 7702: {'lr': 0.000482791804426003, 'samples': 1478976, 'steps': 7702, 'loss/train': 0.648205429315567} 01/27/2022 03:06:19 - INFO - codeparrot_training - Step 7703: {'lr': 0.00048278583829833207, 'samples': 1479168, 'steps': 7703, 'loss/train': 0.9480890035629272} 01/27/2022 03:06:22 - INFO - codeparrot_training - Step 7704: {'lr': 0.00048277987117348043, 'samples': 1479360, 'steps': 7704, 'loss/train': 1.0716075003147125} 01/27/2022 03:06:26 - INFO - codeparrot_training - Step 7705: {'lr': 0.00048277390305147386, 'samples': 1479552, 'steps': 7705, 'loss/train': 0.7006838321685791} 01/27/2022 03:06:29 - INFO - codeparrot_training - Step 7706: {'lr': 0.0004827679339323377, 'samples': 1479744, 'steps': 7706, 
'loss/train': 1.2109043598175049} 01/27/2022 03:06:32 - INFO - codeparrot_training - Step 7707: {'lr': 0.0004827619638160977, 'samples': 1479936, 'steps': 7707, 'loss/train': 0.11992943286895752} 01/27/2022 03:06:35 - INFO - codeparrot_training - Step 7708: {'lr': 0.00048275599270277927, 'samples': 1480128, 'steps': 7708, 'loss/train': 0.6551229804754257} 01/27/2022 03:06:38 - INFO - codeparrot_training - Step 7709: {'lr': 0.00048275002059240815, 'samples': 1480320, 'steps': 7709, 'loss/train': 0.7948912382125854} 01/27/2022 03:06:42 - INFO - codeparrot_training - Step 7710: {'lr': 0.00048274404748500975, 'samples': 1480512, 'steps': 7710, 'loss/train': 0.7343800216913223} 01/27/2022 03:06:45 - INFO - codeparrot_training - Step 7711: {'lr': 0.0004827380733806099, 'samples': 1480704, 'steps': 7711, 'loss/train': 0.5505653321743011} 01/27/2022 03:06:51 - INFO - codeparrot_training - Step 7712: {'lr': 0.0004827320982792339, 'samples': 1480896, 'steps': 7712, 'loss/train': 0.7657411694526672} 01/27/2022 03:06:54 - INFO - codeparrot_training - Step 7713: {'lr': 0.0004827261221809076, 'samples': 1481088, 'steps': 7713, 'loss/train': 0.758414089679718} 01/27/2022 03:06:57 - INFO - codeparrot_training - Step 7714: {'lr': 0.00048272014508565645, 'samples': 1481280, 'steps': 7714, 'loss/train': 1.4043056666851044} 01/27/2022 03:07:00 - INFO - codeparrot_training - Step 7715: {'lr': 0.00048271416699350613, 'samples': 1481472, 'steps': 7715, 'loss/train': 0.9531634747982025} 01/27/2022 03:07:04 - INFO - codeparrot_training - Step 7716: {'lr': 0.0004827081879044821, 'samples': 1481664, 'steps': 7716, 'loss/train': 0.9181013703346252} 01/27/2022 03:07:07 - INFO - codeparrot_training - Step 7717: {'lr': 0.00048270220781861025, 'samples': 1481856, 'steps': 7717, 'loss/train': 0.8320972323417664} 01/27/2022 03:07:10 - INFO - codeparrot_training - Step 7718: {'lr': 0.000482696226735916, 'samples': 1482048, 'steps': 7718, 'loss/train': 1.0123603641986847} 01/27/2022 03:07:13 - INFO - codeparrot_training - Step 7719: {'lr': 0.00048269024465642487, 'samples': 1482240, 'steps': 7719, 'loss/train': 1.256813406944275} 01/27/2022 03:07:16 - INFO - codeparrot_training - Step 7720: {'lr': 0.00048268426158016274, 'samples': 1482432, 'steps': 7720, 'loss/train': 1.1100200414657593} 01/27/2022 03:07:21 - INFO - codeparrot_training - Step 7721: {'lr': 0.0004826782775071551, 'samples': 1482624, 'steps': 7721, 'loss/train': 0.996311366558075} 01/27/2022 03:07:24 - INFO - codeparrot_training - Step 7722: {'lr': 0.00048267229243742753, 'samples': 1482816, 'steps': 7722, 'loss/train': 0.7240147143602371} 01/27/2022 03:07:27 - INFO - codeparrot_training - Step 7723: {'lr': 0.00048266630637100585, 'samples': 1483008, 'steps': 7723, 'loss/train': 0.87593874335289} 01/27/2022 03:07:30 - INFO - codeparrot_training - Step 7724: {'lr': 0.00048266031930791555, 'samples': 1483200, 'steps': 7724, 'loss/train': 0.41549183428287506} 01/27/2022 03:07:33 - INFO - codeparrot_training - Step 7725: {'lr': 0.00048265433124818226, 'samples': 1483392, 'steps': 7725, 'loss/train': 0.7628550231456757} 01/27/2022 03:07:36 - INFO - codeparrot_training - Step 7726: {'lr': 0.00048264834219183175, 'samples': 1483584, 'steps': 7726, 'loss/train': 1.0311945676803589} 01/27/2022 03:07:40 - INFO - codeparrot_training - Step 7727: {'lr': 0.00048264235213888964, 'samples': 1483776, 'steps': 7727, 'loss/train': 0.7338914573192596} 01/27/2022 03:07:43 - INFO - codeparrot_training - Step 7728: {'lr': 0.00048263636108938153, 'samples': 1483968, 'steps': 7728, 
'loss/train': 1.2074309885501862} 01/27/2022 03:07:46 - INFO - codeparrot_training - Step 7729: {'lr': 0.0004826303690433331, 'samples': 1484160, 'steps': 7729, 'loss/train': 0.8173722624778748} 01/27/2022 03:07:52 - INFO - codeparrot_training - Step 7730: {'lr': 0.0004826243760007701, 'samples': 1484352, 'steps': 7730, 'loss/train': 0.9829074740409851} 01/27/2022 03:07:55 - INFO - codeparrot_training - Step 7731: {'lr': 0.00048261838196171804, 'samples': 1484544, 'steps': 7731, 'loss/train': 0.7662398815155029} 01/27/2022 03:07:58 - INFO - codeparrot_training - Step 7732: {'lr': 0.0004826123869262028, 'samples': 1484736, 'steps': 7732, 'loss/train': 0.9089795351028442} 01/27/2022 03:08:02 - INFO - codeparrot_training - Step 7733: {'lr': 0.0004826063908942499, 'samples': 1484928, 'steps': 7733, 'loss/train': 1.3091033399105072} 01/27/2022 03:08:05 - INFO - codeparrot_training - Step 7734: {'lr': 0.00048260039386588513, 'samples': 1485120, 'steps': 7734, 'loss/train': 0.7963171005249023} 01/27/2022 03:08:08 - INFO - codeparrot_training - Step 7735: {'lr': 0.00048259439584113405, 'samples': 1485312, 'steps': 7735, 'loss/train': 0.5316639840602875} 01/27/2022 03:08:11 - INFO - codeparrot_training - Step 7736: {'lr': 0.00048258839682002253, 'samples': 1485504, 'steps': 7736, 'loss/train': 0.8005248606204987} 01/27/2022 03:08:14 - INFO - codeparrot_training - Step 7737: {'lr': 0.0004825823968025761, 'samples': 1485696, 'steps': 7737, 'loss/train': 1.0126904547214508} 01/27/2022 03:08:17 - INFO - codeparrot_training - Step 7738: {'lr': 0.0004825763957888206, 'samples': 1485888, 'steps': 7738, 'loss/train': 1.0630644857883453} 01/27/2022 03:08:22 - INFO - codeparrot_training - Step 7739: {'lr': 0.00048257039377878165, 'samples': 1486080, 'steps': 7739, 'loss/train': 0.3952483981847763} 01/27/2022 03:08:25 - INFO - codeparrot_training - Step 7740: {'lr': 0.00048256439077248495, 'samples': 1486272, 'steps': 7740, 'loss/train': 1.0292683839797974} 01/27/2022 03:08:28 - INFO - codeparrot_training - Step 7741: {'lr': 0.00048255838676995624, 'samples': 1486464, 'steps': 7741, 'loss/train': 0.5721825510263443} 01/27/2022 03:08:31 - INFO - codeparrot_training - Step 7742: {'lr': 0.00048255238177122127, 'samples': 1486656, 'steps': 7742, 'loss/train': 0.7375155240297318} 01/27/2022 03:08:34 - INFO - codeparrot_training - Step 7743: {'lr': 0.0004825463757763058, 'samples': 1486848, 'steps': 7743, 'loss/train': 0.5251389741897583} 01/27/2022 03:08:37 - INFO - codeparrot_training - Step 7744: {'lr': 0.00048254036878523537, 'samples': 1487040, 'steps': 7744, 'loss/train': 1.2512182295322418} 01/27/2022 03:08:41 - INFO - codeparrot_training - Step 7745: {'lr': 0.00048253436079803594, 'samples': 1487232, 'steps': 7745, 'loss/train': 1.0196521282196045} 01/27/2022 03:08:44 - INFO - codeparrot_training - Step 7746: {'lr': 0.0004825283518147331, 'samples': 1487424, 'steps': 7746, 'loss/train': 0.8549992740154266} 01/27/2022 03:08:47 - INFO - codeparrot_training - Step 7747: {'lr': 0.00048252234183535265, 'samples': 1487616, 'steps': 7747, 'loss/train': 1.157898634672165} 01/27/2022 03:08:51 - INFO - codeparrot_training - Step 7748: {'lr': 0.0004825163308599203, 'samples': 1487808, 'steps': 7748, 'loss/train': 0.1619924046099186} 01/27/2022 03:08:55 - INFO - codeparrot_training - Step 7749: {'lr': 0.0004825103188884619, 'samples': 1488000, 'steps': 7749, 'loss/train': 1.0716613233089447} 01/27/2022 03:08:58 - INFO - codeparrot_training - Step 7750: {'lr': 0.000482504305921003, 'samples': 1488192, 'steps': 7750, 
'loss/train': 1.316778838634491} 01/27/2022 03:09:01 - INFO - codeparrot_training - Step 7751: {'lr': 0.00048249829195756954, 'samples': 1488384, 'steps': 7751, 'loss/train': 0.7664682269096375} 01/27/2022 03:09:04 - INFO - codeparrot_training - Step 7752: {'lr': 0.0004824922769981873, 'samples': 1488576, 'steps': 7752, 'loss/train': 1.001452624797821} 01/27/2022 03:09:07 - INFO - codeparrot_training - Step 7753: {'lr': 0.0004824862610428819, 'samples': 1488768, 'steps': 7753, 'loss/train': 0.9412713646888733} 01/27/2022 03:09:10 - INFO - codeparrot_training - Step 7754: {'lr': 0.0004824802440916792, 'samples': 1488960, 'steps': 7754, 'loss/train': 0.9770163595676422} 01/27/2022 03:09:14 - INFO - codeparrot_training - Step 7755: {'lr': 0.0004824742261446049, 'samples': 1489152, 'steps': 7755, 'loss/train': 1.5449214577674866} 01/27/2022 03:09:18 - INFO - codeparrot_training - Step 7756: {'lr': 0.0004824682072016849, 'samples': 1489344, 'steps': 7756, 'loss/train': 0.6794867366552353} 01/27/2022 03:09:21 - INFO - codeparrot_training - Step 7757: {'lr': 0.00048246218726294486, 'samples': 1489536, 'steps': 7757, 'loss/train': 1.015746831893921} 01/27/2022 03:09:24 - INFO - codeparrot_training - Step 7758: {'lr': 0.0004824561663284107, 'samples': 1489728, 'steps': 7758, 'loss/train': 0.8315973579883575} 01/27/2022 03:09:27 - INFO - codeparrot_training - Step 7759: {'lr': 0.0004824501443981081, 'samples': 1489920, 'steps': 7759, 'loss/train': 1.073832392692566} 01/27/2022 03:09:31 - INFO - codeparrot_training - Step 7760: {'lr': 0.00048244412147206283, 'samples': 1490112, 'steps': 7760, 'loss/train': 0.6737937927246094} 01/27/2022 03:09:34 - INFO - codeparrot_training - Step 7761: {'lr': 0.00048243809755030086, 'samples': 1490304, 'steps': 7761, 'loss/train': 1.1076355576515198} 01/27/2022 03:09:37 - INFO - codeparrot_training - Step 7762: {'lr': 0.00048243207263284785, 'samples': 1490496, 'steps': 7762, 'loss/train': 1.050683319568634} 01/27/2022 03:09:40 - INFO - codeparrot_training - Step 7763: {'lr': 0.0004824260467197296, 'samples': 1490688, 'steps': 7763, 'loss/train': 0.894028902053833} 01/27/2022 03:09:43 - INFO - codeparrot_training - Step 7764: {'lr': 0.000482420019810972, 'samples': 1490880, 'steps': 7764, 'loss/train': 0.4258173555135727} 01/27/2022 03:09:49 - INFO - codeparrot_training - Step 7765: {'lr': 0.00048241399190660086, 'samples': 1491072, 'steps': 7765, 'loss/train': 0.6742783337831497} 01/27/2022 03:09:52 - INFO - codeparrot_training - Step 7766: {'lr': 0.0004824079630066419, 'samples': 1491264, 'steps': 7766, 'loss/train': 1.0264931917190552} 01/27/2022 03:09:56 - INFO - codeparrot_training - Step 7767: {'lr': 0.0004824019331111211, 'samples': 1491456, 'steps': 7767, 'loss/train': 0.9918211698532104} 01/27/2022 03:09:59 - INFO - codeparrot_training - Step 7768: {'lr': 0.0004823959022200642, 'samples': 1491648, 'steps': 7768, 'loss/train': 0.8097457587718964} 01/27/2022 03:10:02 - INFO - codeparrot_training - Step 7769: {'lr': 0.00048238987033349706, 'samples': 1491840, 'steps': 7769, 'loss/train': 0.741515502333641} 01/27/2022 03:10:05 - INFO - codeparrot_training - Step 7770: {'lr': 0.0004823838374514455, 'samples': 1492032, 'steps': 7770, 'loss/train': 1.3029408752918243} 01/27/2022 03:10:08 - INFO - codeparrot_training - Step 7771: {'lr': 0.00048237780357393535, 'samples': 1492224, 'steps': 7771, 'loss/train': 1.2083006501197815} 01/27/2022 03:10:11 - INFO - codeparrot_training - Step 7772: {'lr': 0.00048237176870099256, 'samples': 1492416, 'steps': 7772, 
'loss/train': 0.7567630112171173} 01/27/2022 03:10:14 - INFO - codeparrot_training - Step 7773: {'lr': 0.0004823657328326427, 'samples': 1492608, 'steps': 7773, 'loss/train': 0.6820453852415085} 01/27/2022 03:10:19 - INFO - codeparrot_training - Step 7774: {'lr': 0.000482359695968912, 'samples': 1492800, 'steps': 7774, 'loss/train': 1.0721243619918823} 01/27/2022 03:10:22 - INFO - codeparrot_training - Step 7775: {'lr': 0.0004823536581098261, 'samples': 1492992, 'steps': 7775, 'loss/train': 0.8989821374416351} 01/27/2022 03:10:25 - INFO - codeparrot_training - Step 7776: {'lr': 0.00048234761925541094, 'samples': 1493184, 'steps': 7776, 'loss/train': 0.46217080950737} 01/27/2022 03:10:28 - INFO - codeparrot_training - Step 7777: {'lr': 0.0004823415794056923, 'samples': 1493376, 'steps': 7777, 'loss/train': 0.3767346739768982} 01/27/2022 03:10:31 - INFO - codeparrot_training - Step 7778: {'lr': 0.00048233553856069617, 'samples': 1493568, 'steps': 7778, 'loss/train': 0.4191160351037979} 01/27/2022 03:10:35 - INFO - codeparrot_training - Step 7779: {'lr': 0.00048232949672044834, 'samples': 1493760, 'steps': 7779, 'loss/train': 0.9704810678958893} 01/27/2022 03:10:38 - INFO - codeparrot_training - Step 7780: {'lr': 0.0004823234538849747, 'samples': 1493952, 'steps': 7780, 'loss/train': 2.473456621170044} 01/27/2022 03:10:41 - INFO - codeparrot_training - Step 7781: {'lr': 0.0004823174100543012, 'samples': 1494144, 'steps': 7781, 'loss/train': 0.837828665971756} 01/27/2022 03:10:44 - INFO - codeparrot_training - Step 7782: {'lr': 0.0004823113652284536, 'samples': 1494336, 'steps': 7782, 'loss/train': 0.6441293209791183} 01/27/2022 03:10:48 - INFO - codeparrot_training - Step 7783: {'lr': 0.00048230531940745793, 'samples': 1494528, 'steps': 7783, 'loss/train': 0.998610109090805} 01/27/2022 03:10:52 - INFO - codeparrot_training - Step 7784: {'lr': 0.0004822992725913401, 'samples': 1494720, 'steps': 7784, 'loss/train': 0.8325110077857971} 01/27/2022 03:10:55 - INFO - codeparrot_training - Step 7785: {'lr': 0.00048229322478012584, 'samples': 1494912, 'steps': 7785, 'loss/train': 0.6412865817546844} 01/27/2022 03:10:58 - INFO - codeparrot_training - Step 7786: {'lr': 0.0004822871759738412, 'samples': 1495104, 'steps': 7786, 'loss/train': 0.5545375496149063} 01/27/2022 03:11:01 - INFO - codeparrot_training - Step 7787: {'lr': 0.0004822811261725121, 'samples': 1495296, 'steps': 7787, 'loss/train': 0.7661893665790558} 01/27/2022 03:11:04 - INFO - codeparrot_training - Step 7788: {'lr': 0.0004822750753761644, 'samples': 1495488, 'steps': 7788, 'loss/train': 0.6968690603971481} 01/27/2022 03:11:07 - INFO - codeparrot_training - Step 7789: {'lr': 0.00048226902358482405, 'samples': 1495680, 'steps': 7789, 'loss/train': 0.9043812453746796} 01/27/2022 03:11:10 - INFO - codeparrot_training - Step 7790: {'lr': 0.0004822629707985169, 'samples': 1495872, 'steps': 7790, 'loss/train': 0.8079680800437927} 01/27/2022 03:11:17 - INFO - codeparrot_training - Step 7791: {'lr': 0.00048225691701726895, 'samples': 1496064, 'steps': 7791, 'loss/train': 0.3629949912428856} 01/27/2022 03:11:20 - INFO - codeparrot_training - Step 7792: {'lr': 0.00048225086224110614, 'samples': 1496256, 'steps': 7792, 'loss/train': 0.5295952409505844} 01/27/2022 03:11:23 - INFO - codeparrot_training - Step 7793: {'lr': 0.00048224480647005437, 'samples': 1496448, 'steps': 7793, 'loss/train': 0.8085954487323761} 01/27/2022 03:11:26 - INFO - codeparrot_training - Step 7794: {'lr': 0.0004822387497041396, 'samples': 1496640, 'steps': 7794, 
'loss/train': 0.7532142698764801} 01/27/2022 03:11:29 - INFO - codeparrot_training - Step 7795: {'lr': 0.00048223269194338776, 'samples': 1496832, 'steps': 7795, 'loss/train': 1.0826787650585175} 01/27/2022 03:11:32 - INFO - codeparrot_training - Step 7796: {'lr': 0.0004822266331878248, 'samples': 1497024, 'steps': 7796, 'loss/train': 0.9508843123912811} 01/27/2022 03:11:35 - INFO - codeparrot_training - Step 7797: {'lr': 0.0004822205734374767, 'samples': 1497216, 'steps': 7797, 'loss/train': 0.6545926183462143} 01/27/2022 03:11:39 - INFO - codeparrot_training - Step 7798: {'lr': 0.00048221451269236937, 'samples': 1497408, 'steps': 7798, 'loss/train': 0.5403478592634201} 01/27/2022 03:11:42 - INFO - codeparrot_training - Step 7799: {'lr': 0.0004822084509525289, 'samples': 1497600, 'steps': 7799, 'loss/train': 0.9718144834041595} 01/27/2022 03:11:47 - INFO - codeparrot_training - Step 7800: {'lr': 0.0004822023882179811, 'samples': 1497792, 'steps': 7800, 'loss/train': 0.6989728957414627} 01/27/2022 03:11:50 - INFO - codeparrot_training - Step 7801: {'lr': 0.00048219632448875195, 'samples': 1497984, 'steps': 7801, 'loss/train': 0.8253926932811737} 01/27/2022 03:11:53 - INFO - codeparrot_training - Step 7802: {'lr': 0.0004821902597648675, 'samples': 1498176, 'steps': 7802, 'loss/train': 0.9327609837055206} 01/27/2022 03:11:56 - INFO - codeparrot_training - Step 7803: {'lr': 0.0004821841940463538, 'samples': 1498368, 'steps': 7803, 'loss/train': 1.310807079076767} 01/27/2022 03:11:59 - INFO - codeparrot_training - Step 7804: {'lr': 0.0004821781273332366, 'samples': 1498560, 'steps': 7804, 'loss/train': 0.7950704991817474} 01/27/2022 03:12:02 - INFO - codeparrot_training - Step 7805: {'lr': 0.00048217205962554214, 'samples': 1498752, 'steps': 7805, 'loss/train': 0.5679644197225571} 01/27/2022 03:12:06 - INFO - codeparrot_training - Step 7806: {'lr': 0.0004821659909232963, 'samples': 1498944, 'steps': 7806, 'loss/train': 0.6343128383159637} 01/27/2022 03:12:09 - INFO - codeparrot_training - Step 7807: {'lr': 0.000482159921226525, 'samples': 1499136, 'steps': 7807, 'loss/train': 0.4012307971715927} 01/27/2022 03:12:12 - INFO - codeparrot_training - Step 7808: {'lr': 0.00048215385053525434, 'samples': 1499328, 'steps': 7808, 'loss/train': 0.949615091085434} 01/27/2022 03:12:18 - INFO - codeparrot_training - Step 7809: {'lr': 0.0004821477788495103, 'samples': 1499520, 'steps': 7809, 'loss/train': 0.4637819677591324} 01/27/2022 03:12:21 - INFO - codeparrot_training - Step 7810: {'lr': 0.0004821417061693189, 'samples': 1499712, 'steps': 7810, 'loss/train': 0.6999572664499283} 01/27/2022 03:12:24 - INFO - codeparrot_training - Step 7811: {'lr': 0.00048213563249470615, 'samples': 1499904, 'steps': 7811, 'loss/train': 0.8753483891487122} 01/27/2022 03:12:27 - INFO - codeparrot_training - Step 7812: {'lr': 0.00048212955782569805, 'samples': 1500096, 'steps': 7812, 'loss/train': 1.0009512305259705} 01/27/2022 03:12:31 - INFO - codeparrot_training - Step 7813: {'lr': 0.00048212348216232064, 'samples': 1500288, 'steps': 7813, 'loss/train': 0.6880630999803543} 01/27/2022 03:12:34 - INFO - codeparrot_training - Step 7814: {'lr': 0.0004821174055045999, 'samples': 1500480, 'steps': 7814, 'loss/train': 0.9645684957504272} 01/27/2022 03:12:37 - INFO - codeparrot_training - Step 7815: {'lr': 0.000482111327852562, 'samples': 1500672, 'steps': 7815, 'loss/train': 1.1220112144947052} 01/27/2022 03:12:40 - INFO - codeparrot_training - Step 7816: {'lr': 0.0004821052492062328, 'samples': 1500864, 'steps': 7816, 
'loss/train': 0.7077962458133698} 01/27/2022 03:12:43 - INFO - codeparrot_training - Step 7817: {'lr': 0.0004820991695656385, 'samples': 1501056, 'steps': 7817, 'loss/train': 0.6334980726242065} 01/27/2022 03:12:48 - INFO - codeparrot_training - Step 7818: {'lr': 0.00048209308893080495, 'samples': 1501248, 'steps': 7818, 'loss/train': 0.7098113000392914} 01/27/2022 03:12:51 - INFO - codeparrot_training - Step 7819: {'lr': 0.00048208700730175834, 'samples': 1501440, 'steps': 7819, 'loss/train': 0.9952216744422913} 01/27/2022 03:12:54 - INFO - codeparrot_training - Step 7820: {'lr': 0.0004820809246785247, 'samples': 1501632, 'steps': 7820, 'loss/train': 0.8116088211536407} 01/27/2022 03:12:57 - INFO - codeparrot_training - Step 7821: {'lr': 0.00048207484106113, 'samples': 1501824, 'steps': 7821, 'loss/train': 1.300797164440155} 01/27/2022 03:13:00 - INFO - codeparrot_training - Step 7822: {'lr': 0.0004820687564496005, 'samples': 1502016, 'steps': 7822, 'loss/train': 1.1578384637832642} 01/27/2022 03:13:04 - INFO - codeparrot_training - Step 7823: {'lr': 0.00048206267084396204, 'samples': 1502208, 'steps': 7823, 'loss/train': 0.4872313141822815} 01/27/2022 03:13:07 - INFO - codeparrot_training - Step 7824: {'lr': 0.0004820565842442408, 'samples': 1502400, 'steps': 7824, 'loss/train': 0.7453633099794388} 01/27/2022 03:13:10 - INFO - codeparrot_training - Step 7825: {'lr': 0.00048205049665046287, 'samples': 1502592, 'steps': 7825, 'loss/train': 0.7671848237514496} 01/27/2022 03:13:13 - INFO - codeparrot_training - Step 7826: {'lr': 0.0004820444080626543, 'samples': 1502784, 'steps': 7826, 'loss/train': 0.8093358278274536} 01/27/2022 03:13:18 - INFO - codeparrot_training - Step 7827: {'lr': 0.00048203831848084115, 'samples': 1502976, 'steps': 7827, 'loss/train': 0.5655496269464493} 01/27/2022 03:13:21 - INFO - codeparrot_training - Step 7828: {'lr': 0.0004820322279050495, 'samples': 1503168, 'steps': 7828, 'loss/train': 0.885521650314331} 01/27/2022 03:13:24 - INFO - codeparrot_training - Step 7829: {'lr': 0.00048202613633530555, 'samples': 1503360, 'steps': 7829, 'loss/train': 0.7813453674316406} 01/27/2022 03:13:27 - INFO - codeparrot_training - Step 7830: {'lr': 0.00048202004377163524, 'samples': 1503552, 'steps': 7830, 'loss/train': 0.8517396748065948} 01/27/2022 03:13:30 - INFO - codeparrot_training - Step 7831: {'lr': 0.00048201395021406476, 'samples': 1503744, 'steps': 7831, 'loss/train': 3.4453729391098022} 01/27/2022 03:13:33 - INFO - codeparrot_training - Step 7832: {'lr': 0.0004820078556626202, 'samples': 1503936, 'steps': 7832, 'loss/train': 1.1244295835494995} 01/27/2022 03:13:36 - INFO - codeparrot_training - Step 7833: {'lr': 0.0004820017601173276, 'samples': 1504128, 'steps': 7833, 'loss/train': 1.1735132932662964} 01/27/2022 03:13:39 - INFO - codeparrot_training - Step 7834: {'lr': 0.00048199566357821314, 'samples': 1504320, 'steps': 7834, 'loss/train': 0.9443995356559753} 01/27/2022 03:13:46 - INFO - codeparrot_training - Step 7835: {'lr': 0.00048198956604530297, 'samples': 1504512, 'steps': 7835, 'loss/train': 0.6277397274971008} 01/27/2022 03:13:49 - INFO - codeparrot_training - Step 7836: {'lr': 0.0004819834675186231, 'samples': 1504704, 'steps': 7836, 'loss/train': 0.3261656016111374} 01/27/2022 03:13:52 - INFO - codeparrot_training - Step 7837: {'lr': 0.0004819773679981998, 'samples': 1504896, 'steps': 7837, 'loss/train': 1.1070130169391632} 01/27/2022 03:13:55 - INFO - codeparrot_training - Step 7838: {'lr': 0.0004819712674840591, 'samples': 1505088, 'steps': 7838, 
'loss/train': 0.8864717781543732} 01/27/2022 03:13:58 - INFO - codeparrot_training - Step 7839: {'lr': 0.00048196516597622706, 'samples': 1505280, 'steps': 7839, 'loss/train': 0.6926346123218536} 01/27/2022 03:14:02 - INFO - codeparrot_training - Step 7840: {'lr': 0.00048195906347473, 'samples': 1505472, 'steps': 7840, 'loss/train': 0.8051336109638214} 01/27/2022 03:14:05 - INFO - codeparrot_training - Step 7841: {'lr': 0.00048195295997959393, 'samples': 1505664, 'steps': 7841, 'loss/train': 0.9829069375991821} 01/27/2022 03:14:08 - INFO - codeparrot_training - Step 7842: {'lr': 0.00048194685549084507, 'samples': 1505856, 'steps': 7842, 'loss/train': 1.2858451902866364} 01/27/2022 03:14:11 - INFO - codeparrot_training - Step 7843: {'lr': 0.00048194075000850944, 'samples': 1506048, 'steps': 7843, 'loss/train': 1.01383638381958} 01/27/2022 03:14:16 - INFO - codeparrot_training - Step 7844: {'lr': 0.0004819346435326134, 'samples': 1506240, 'steps': 7844, 'loss/train': 0.8293365240097046} 01/27/2022 03:14:19 - INFO - codeparrot_training - Step 7845: {'lr': 0.000481928536063183, 'samples': 1506432, 'steps': 7845, 'loss/train': 0.6689314842224121} 01/27/2022 03:14:22 - INFO - codeparrot_training - Step 7846: {'lr': 0.0004819224276002443, 'samples': 1506624, 'steps': 7846, 'loss/train': 0.8037448525428772} 01/27/2022 03:14:25 - INFO - codeparrot_training - Step 7847: {'lr': 0.0004819163181438236, 'samples': 1506816, 'steps': 7847, 'loss/train': 1.1823266744613647} 01/27/2022 03:14:28 - INFO - codeparrot_training - Step 7848: {'lr': 0.000481910207693947, 'samples': 1507008, 'steps': 7848, 'loss/train': 0.5089410245418549} 01/27/2022 03:14:32 - INFO - codeparrot_training - Step 7849: {'lr': 0.0004819040962506408, 'samples': 1507200, 'steps': 7849, 'loss/train': 0.3088833689689636} 01/27/2022 03:14:35 - INFO - codeparrot_training - Step 7850: {'lr': 0.000481897983813931, 'samples': 1507392, 'steps': 7850, 'loss/train': 0.8725158870220184} 01/27/2022 03:14:38 - INFO - codeparrot_training - Step 7851: {'lr': 0.00048189187038384396, 'samples': 1507584, 'steps': 7851, 'loss/train': 0.6697651147842407} 01/27/2022 03:14:41 - INFO - codeparrot_training - Step 7852: {'lr': 0.00048188575596040575, 'samples': 1507776, 'steps': 7852, 'loss/train': 0.5513506382703781} 01/27/2022 03:14:46 - INFO - codeparrot_training - Step 7853: {'lr': 0.00048187964054364254, 'samples': 1507968, 'steps': 7853, 'loss/train': 0.8499627113342285} 01/27/2022 03:14:49 - INFO - codeparrot_training - Step 7854: {'lr': 0.0004818735241335807, 'samples': 1508160, 'steps': 7854, 'loss/train': 0.8906369805335999} 01/27/2022 03:14:52 - INFO - codeparrot_training - Step 7855: {'lr': 0.00048186740673024614, 'samples': 1508352, 'steps': 7855, 'loss/train': 0.7993323504924774} 01/27/2022 03:14:55 - INFO - codeparrot_training - Step 7856: {'lr': 0.00048186128833366536, 'samples': 1508544, 'steps': 7856, 'loss/train': 0.4718369096517563} 01/27/2022 03:14:58 - INFO - codeparrot_training - Step 7857: {'lr': 0.0004818551689438644, 'samples': 1508736, 'steps': 7857, 'loss/train': 0.5390456020832062} 01/27/2022 03:15:01 - INFO - codeparrot_training - Step 7858: {'lr': 0.00048184904856086953, 'samples': 1508928, 'steps': 7858, 'loss/train': 0.9238568544387817} 01/27/2022 03:15:05 - INFO - codeparrot_training - Step 7859: {'lr': 0.0004818429271847069, 'samples': 1509120, 'steps': 7859, 'loss/train': 0.6613622903823853} 01/27/2022 03:15:08 - INFO - codeparrot_training - Step 7860: {'lr': 0.00048183680481540293, 'samples': 1509312, 'steps': 7860, 
'loss/train': 0.4445161521434784} 01/27/2022 03:15:11 - INFO - codeparrot_training - Step 7861: {'lr': 0.0004818306814529836, 'samples': 1509504, 'steps': 7861, 'loss/train': 1.1163285970687866} 01/27/2022 03:15:15 - INFO - codeparrot_training - Step 7862: {'lr': 0.00048182455709747525, 'samples': 1509696, 'steps': 7862, 'loss/train': 0.9170120358467102} 01/27/2022 03:15:18 - INFO - codeparrot_training - Step 7863: {'lr': 0.0004818184317489041, 'samples': 1509888, 'steps': 7863, 'loss/train': 1.384129822254181} 01/27/2022 03:15:22 - INFO - codeparrot_training - Step 7864: {'lr': 0.00048181230540729643, 'samples': 1510080, 'steps': 7864, 'loss/train': 0.8685874342918396} 01/27/2022 03:15:25 - INFO - codeparrot_training - Step 7865: {'lr': 0.00048180617807267844, 'samples': 1510272, 'steps': 7865, 'loss/train': 0.5924545526504517} 01/27/2022 03:15:28 - INFO - codeparrot_training - Step 7866: {'lr': 0.0004818000497450764, 'samples': 1510464, 'steps': 7866, 'loss/train': 0.7551144361495972} 01/27/2022 03:15:31 - INFO - codeparrot_training - Step 7867: {'lr': 0.00048179392042451655, 'samples': 1510656, 'steps': 7867, 'loss/train': 0.6873787343502045} 01/27/2022 03:15:34 - INFO - codeparrot_training - Step 7868: {'lr': 0.0004817877901110251, 'samples': 1510848, 'steps': 7868, 'loss/train': 0.6639356017112732} 01/27/2022 03:15:37 - INFO - codeparrot_training - Step 7869: {'lr': 0.00048178165880462845, 'samples': 1511040, 'steps': 7869, 'loss/train': 0.6043915897607803} 01/27/2022 03:15:40 - INFO - codeparrot_training - Step 7870: {'lr': 0.0004817755265053527, 'samples': 1511232, 'steps': 7870, 'loss/train': 1.2866798043251038} 01/27/2022 03:15:47 - INFO - codeparrot_training - Step 7871: {'lr': 0.0004817693932132242, 'samples': 1511424, 'steps': 7871, 'loss/train': 0.9838162958621979} 01/27/2022 03:15:50 - INFO - codeparrot_training - Step 7872: {'lr': 0.0004817632589282693, 'samples': 1511616, 'steps': 7872, 'loss/train': 0.49550190567970276} 01/27/2022 03:15:53 - INFO - codeparrot_training - Step 7873: {'lr': 0.00048175712365051407, 'samples': 1511808, 'steps': 7873, 'loss/train': 0.4096338003873825} 01/27/2022 03:15:56 - INFO - codeparrot_training - Step 7874: {'lr': 0.00048175098737998504, 'samples': 1512000, 'steps': 7874, 'loss/train': 0.8556778728961945} 01/27/2022 03:15:59 - INFO - codeparrot_training - Step 7875: {'lr': 0.0004817448501167082, 'samples': 1512192, 'steps': 7875, 'loss/train': 1.0702468156814575} 01/27/2022 03:16:02 - INFO - codeparrot_training - Step 7876: {'lr': 0.0004817387118607102, 'samples': 1512384, 'steps': 7876, 'loss/train': 0.6854512542486191} 01/27/2022 03:16:05 - INFO - codeparrot_training - Step 7877: {'lr': 0.00048173257261201695, 'samples': 1512576, 'steps': 7877, 'loss/train': 0.9687380790710449} 01/27/2022 03:16:08 - INFO - codeparrot_training - Step 7878: {'lr': 0.00048172643237065504, 'samples': 1512768, 'steps': 7878, 'loss/train': 0.6014144271612167} 01/27/2022 03:16:12 - INFO - codeparrot_training - Step 7879: {'lr': 0.00048172029113665075, 'samples': 1512960, 'steps': 7879, 'loss/train': 0.6629211902618408} 01/27/2022 03:16:16 - INFO - codeparrot_training - Step 7880: {'lr': 0.0004817141489100302, 'samples': 1513152, 'steps': 7880, 'loss/train': 1.0883292853832245} 01/27/2022 03:16:19 - INFO - codeparrot_training - Step 7881: {'lr': 0.00048170800569081985, 'samples': 1513344, 'steps': 7881, 'loss/train': 0.6409168839454651} 01/27/2022 03:16:22 - INFO - codeparrot_training - Step 7882: {'lr': 0.000481701861479046, 'samples': 1513536, 'steps': 7882, 
'loss/train': 0.8648069500923157} 01/27/2022 03:16:25 - INFO - codeparrot_training - Step 7883: {'lr': 0.000481695716274735, 'samples': 1513728, 'steps': 7883, 'loss/train': 0.7185922712087631} 01/27/2022 03:16:29 - INFO - codeparrot_training - Step 7884: {'lr': 0.000481689570077913, 'samples': 1513920, 'steps': 7884, 'loss/train': 0.8114057779312134} 01/27/2022 03:16:32 - INFO - codeparrot_training - Step 7885: {'lr': 0.00048168342288860646, 'samples': 1514112, 'steps': 7885, 'loss/train': 1.3363747894763947} 01/27/2022 03:16:35 - INFO - codeparrot_training - Step 7886: {'lr': 0.00048167727470684176, 'samples': 1514304, 'steps': 7886, 'loss/train': 0.5529212057590485} 01/27/2022 03:16:38 - INFO - codeparrot_training - Step 7887: {'lr': 0.0004816711255326452, 'samples': 1514496, 'steps': 7887, 'loss/train': 0.983967125415802} 01/27/2022 03:16:44 - INFO - codeparrot_training - Step 7888: {'lr': 0.00048166497536604306, 'samples': 1514688, 'steps': 7888, 'loss/train': 0.8866623044013977} 01/27/2022 03:16:47 - INFO - codeparrot_training - Step 7889: {'lr': 0.00048165882420706175, 'samples': 1514880, 'steps': 7889, 'loss/train': 0.6700116991996765} 01/27/2022 03:16:50 - INFO - codeparrot_training - Step 7890: {'lr': 0.0004816526720557276, 'samples': 1515072, 'steps': 7890, 'loss/train': 0.6005968004465103} 01/27/2022 03:16:54 - INFO - codeparrot_training - Step 7891: {'lr': 0.0004816465189120669, 'samples': 1515264, 'steps': 7891, 'loss/train': 0.5669369548559189} 01/27/2022 03:16:57 - INFO - codeparrot_training - Step 7892: {'lr': 0.00048164036477610616, 'samples': 1515456, 'steps': 7892, 'loss/train': 1.1160258650779724} 01/27/2022 03:17:00 - INFO - codeparrot_training - Step 7893: {'lr': 0.0004816342096478716, 'samples': 1515648, 'steps': 7893, 'loss/train': 1.1597065329551697} 01/27/2022 03:17:03 - INFO - codeparrot_training - Step 7894: {'lr': 0.00048162805352738966, 'samples': 1515840, 'steps': 7894, 'loss/train': 0.6567567139863968} 01/27/2022 03:17:06 - INFO - codeparrot_training - Step 7895: {'lr': 0.0004816218964146867, 'samples': 1516032, 'steps': 7895, 'loss/train': 0.7038163393735886} 01/27/2022 03:17:09 - INFO - codeparrot_training - Step 7896: {'lr': 0.000481615738309789, 'samples': 1516224, 'steps': 7896, 'loss/train': 0.7327421307563782} 01/27/2022 03:17:14 - INFO - codeparrot_training - Step 7897: {'lr': 0.00048160957921272306, 'samples': 1516416, 'steps': 7897, 'loss/train': 0.9189257025718689} 01/27/2022 03:17:17 - INFO - codeparrot_training - Step 7898: {'lr': 0.00048160341912351523, 'samples': 1516608, 'steps': 7898, 'loss/train': 0.9858128428459167} 01/27/2022 03:17:20 - INFO - codeparrot_training - Step 7899: {'lr': 0.00048159725804219195, 'samples': 1516800, 'steps': 7899, 'loss/train': 1.0720691978931427} 01/27/2022 03:17:23 - INFO - codeparrot_training - Step 7900: {'lr': 0.00048159109596877954, 'samples': 1516992, 'steps': 7900, 'loss/train': 0.9355022013187408} 01/27/2022 03:17:26 - INFO - codeparrot_training - Step 7901: {'lr': 0.00048158493290330443, 'samples': 1517184, 'steps': 7901, 'loss/train': 1.0648155212402344} 01/27/2022 03:17:29 - INFO - codeparrot_training - Step 7902: {'lr': 0.00048157876884579294, 'samples': 1517376, 'steps': 7902, 'loss/train': 1.2182016670703888} 01/27/2022 03:17:32 - INFO - codeparrot_training - Step 7903: {'lr': 0.00048157260379627154, 'samples': 1517568, 'steps': 7903, 'loss/train': 0.9894460737705231} 01/27/2022 03:17:36 - INFO - codeparrot_training - Step 7904: {'lr': 0.0004815664377547667, 'samples': 1517760, 'steps': 7904, 
'loss/train': 0.3643922880291939} 01/27/2022 03:17:39 - INFO - codeparrot_training - Step 7905: {'lr': 0.0004815602707213047, 'samples': 1517952, 'steps': 7905, 'loss/train': 0.7452187836170197} 01/27/2022 03:17:43 - INFO - codeparrot_training - Step 7906: {'lr': 0.00048155410269591203, 'samples': 1518144, 'steps': 7906, 'loss/train': 1.1102011799812317} 01/27/2022 03:17:46 - INFO - codeparrot_training - Step 7907: {'lr': 0.00048154793367861514, 'samples': 1518336, 'steps': 7907, 'loss/train': 0.8398913741111755} 01/27/2022 03:17:49 - INFO - codeparrot_training - Step 7908: {'lr': 0.00048154176366944045, 'samples': 1518528, 'steps': 7908, 'loss/train': 1.0450433492660522} 01/27/2022 03:17:53 - INFO - codeparrot_training - Step 7909: {'lr': 0.0004815355926684144, 'samples': 1518720, 'steps': 7909, 'loss/train': 1.0921398997306824} 01/27/2022 03:17:56 - INFO - codeparrot_training - Step 7910: {'lr': 0.0004815294206755633, 'samples': 1518912, 'steps': 7910, 'loss/train': 1.0902434885501862} 01/27/2022 03:17:59 - INFO - codeparrot_training - Step 7911: {'lr': 0.0004815232476909137, 'samples': 1519104, 'steps': 7911, 'loss/train': 0.8110354542732239} 01/27/2022 03:18:02 - INFO - codeparrot_training - Step 7912: {'lr': 0.00048151707371449213, 'samples': 1519296, 'steps': 7912, 'loss/train': 0.4519917368888855} 01/27/2022 03:18:05 - INFO - codeparrot_training - Step 7913: {'lr': 0.0004815108987463248, 'samples': 1519488, 'steps': 7913, 'loss/train': 1.0089521706104279} 01/27/2022 03:18:08 - INFO - codeparrot_training - Step 7914: {'lr': 0.00048150472278643834, 'samples': 1519680, 'steps': 7914, 'loss/train': 0.3352842181921005} 01/27/2022 03:18:15 - INFO - codeparrot_training - Step 7915: {'lr': 0.0004814985458348592, 'samples': 1519872, 'steps': 7915, 'loss/train': 0.7540816962718964} 01/27/2022 03:18:18 - INFO - codeparrot_training - Step 7916: {'lr': 0.00048149236789161374, 'samples': 1520064, 'steps': 7916, 'loss/train': 0.7084206640720367} 01/27/2022 03:18:21 - INFO - codeparrot_training - Step 7917: {'lr': 0.00048148618895672846, 'samples': 1520256, 'steps': 7917, 'loss/train': 0.6379477232694626} 01/27/2022 03:18:24 - INFO - codeparrot_training - Step 7918: {'lr': 0.0004814800090302299, 'samples': 1520448, 'steps': 7918, 'loss/train': 0.8044279217720032} 01/27/2022 03:18:27 - INFO - codeparrot_training - Step 7919: {'lr': 0.00048147382811214445, 'samples': 1520640, 'steps': 7919, 'loss/train': 0.14758781343698502} 01/27/2022 03:18:30 - INFO - codeparrot_training - Step 7920: {'lr': 0.0004814676462024987, 'samples': 1520832, 'steps': 7920, 'loss/train': 1.03317528963089} 01/27/2022 03:18:34 - INFO - codeparrot_training - Step 7921: {'lr': 0.000481461463301319, 'samples': 1521024, 'steps': 7921, 'loss/train': 0.6532317101955414} 01/27/2022 03:18:37 - INFO - codeparrot_training - Step 7922: {'lr': 0.00048145527940863186, 'samples': 1521216, 'steps': 7922, 'loss/train': 0.9272961616516113} 01/27/2022 03:18:41 - INFO - codeparrot_training - Step 7923: {'lr': 0.00048144909452446384, 'samples': 1521408, 'steps': 7923, 'loss/train': 0.7441699057817459} 01/27/2022 03:18:45 - INFO - codeparrot_training - Step 7924: {'lr': 0.00048144290864884145, 'samples': 1521600, 'steps': 7924, 'loss/train': 0.7707991898059845} 01/27/2022 03:18:48 - INFO - codeparrot_training - Step 7925: {'lr': 0.000481436721781791, 'samples': 1521792, 'steps': 7925, 'loss/train': 1.1023491024971008} 01/27/2022 03:18:51 - INFO - codeparrot_training - Step 7926: {'lr': 0.00048143053392333917, 'samples': 1521984, 'steps': 7926, 
'loss/train': 0.7214136868715286} 01/27/2022 03:18:54 - INFO - codeparrot_training - Step 7927: {'lr': 0.00048142434507351245, 'samples': 1522176, 'steps': 7927, 'loss/train': 0.5820470005273819} 01/27/2022 03:18:57 - INFO - codeparrot_training - Step 7928: {'lr': 0.00048141815523233735, 'samples': 1522368, 'steps': 7928, 'loss/train': 1.0285706520080566} 01/27/2022 03:19:00 - INFO - codeparrot_training - Step 7929: {'lr': 0.00048141196439984026, 'samples': 1522560, 'steps': 7929, 'loss/train': 1.252197504043579} 01/27/2022 03:19:03 - INFO - codeparrot_training - Step 7930: {'lr': 0.0004814057725760479, 'samples': 1522752, 'steps': 7930, 'loss/train': 0.9513642489910126} 01/27/2022 03:19:07 - INFO - codeparrot_training - Step 7931: {'lr': 0.0004813995797609866, 'samples': 1522944, 'steps': 7931, 'loss/train': 0.9921647608280182} 01/27/2022 03:19:12 - INFO - codeparrot_training - Step 7932: {'lr': 0.000481393385954683, 'samples': 1523136, 'steps': 7932, 'loss/train': 0.2687205746769905} 01/27/2022 03:19:15 - INFO - codeparrot_training - Step 7933: {'lr': 0.00048138719115716367, 'samples': 1523328, 'steps': 7933, 'loss/train': 1.1399496495723724} 01/27/2022 03:19:18 - INFO - codeparrot_training - Step 7934: {'lr': 0.00048138099536845503, 'samples': 1523520, 'steps': 7934, 'loss/train': 0.9981569945812225} 01/27/2022 03:19:21 - INFO - codeparrot_training - Step 7935: {'lr': 0.0004813747985885837, 'samples': 1523712, 'steps': 7935, 'loss/train': 1.2021254897117615} 01/27/2022 03:19:24 - INFO - codeparrot_training - Step 7936: {'lr': 0.00048136860081757617, 'samples': 1523904, 'steps': 7936, 'loss/train': 0.8581766188144684} 01/27/2022 03:19:27 - INFO - codeparrot_training - Step 7937: {'lr': 0.00048136240205545907, 'samples': 1524096, 'steps': 7937, 'loss/train': 0.7840797901153564} 01/27/2022 03:19:30 - INFO - codeparrot_training - Step 7938: {'lr': 0.0004813562023022588, 'samples': 1524288, 'steps': 7938, 'loss/train': 0.806641012430191} 01/27/2022 03:19:34 - INFO - codeparrot_training - Step 7939: {'lr': 0.00048135000155800217, 'samples': 1524480, 'steps': 7939, 'loss/train': 0.9326787292957306} 01/27/2022 03:19:37 - INFO - codeparrot_training - Step 7940: {'lr': 0.0004813437998227155, 'samples': 1524672, 'steps': 7940, 'loss/train': 0.21966589987277985} 01/27/2022 03:19:43 - INFO - codeparrot_training - Step 7941: {'lr': 0.00048133759709642556, 'samples': 1524864, 'steps': 7941, 'loss/train': 0.29686374217271805} 01/27/2022 03:19:46 - INFO - codeparrot_training - Step 7942: {'lr': 0.00048133139337915866, 'samples': 1525056, 'steps': 7942, 'loss/train': 0.6799529492855072} 01/27/2022 03:19:49 - INFO - codeparrot_training - Step 7943: {'lr': 0.00048132518867094167, 'samples': 1525248, 'steps': 7943, 'loss/train': 0.7721538841724396} 01/27/2022 03:19:52 - INFO - codeparrot_training - Step 7944: {'lr': 0.00048131898297180085, 'samples': 1525440, 'steps': 7944, 'loss/train': 0.041510313749313354} 01/27/2022 03:19:56 - INFO - codeparrot_training - Step 7945: {'lr': 0.0004813127762817631, 'samples': 1525632, 'steps': 7945, 'loss/train': 0.8517448604106903} 01/27/2022 03:19:59 - INFO - codeparrot_training - Step 7946: {'lr': 0.00048130656860085485, 'samples': 1525824, 'steps': 7946, 'loss/train': 0.8306281864643097} 01/27/2022 03:20:02 - INFO - codeparrot_training - Step 7947: {'lr': 0.0004813003599291027, 'samples': 1526016, 'steps': 7947, 'loss/train': 0.9948544800281525} 01/27/2022 03:20:05 - INFO - codeparrot_training - Step 7948: {'lr': 0.0004812941502665332, 'samples': 1526208, 'steps': 
7948, 'loss/train': 0.9011657238006592} 01/27/2022 03:20:08 - INFO - codeparrot_training - Step 7949: {'lr': 0.0004812879396131731, 'samples': 1526400, 'steps': 7949, 'loss/train': 1.135038435459137} 01/27/2022 03:20:13 - INFO - codeparrot_training - Step 7950: {'lr': 0.0004812817279690488, 'samples': 1526592, 'steps': 7950, 'loss/train': 0.6304885894060135} 01/27/2022 03:20:16 - INFO - codeparrot_training - Step 7951: {'lr': 0.00048127551533418714, 'samples': 1526784, 'steps': 7951, 'loss/train': 0.964875340461731} 01/27/2022 03:20:19 - INFO - codeparrot_training - Step 7952: {'lr': 0.0004812693017086145, 'samples': 1526976, 'steps': 7952, 'loss/train': 0.7528277635574341} 01/27/2022 03:20:22 - INFO - codeparrot_training - Step 7953: {'lr': 0.0004812630870923577, 'samples': 1527168, 'steps': 7953, 'loss/train': 0.5030246526002884} 01/27/2022 03:20:25 - INFO - codeparrot_training - Step 7954: {'lr': 0.00048125687148544316, 'samples': 1527360, 'steps': 7954, 'loss/train': 0.8233906924724579} 01/27/2022 03:20:28 - INFO - codeparrot_training - Step 7955: {'lr': 0.0004812506548878977, 'samples': 1527552, 'steps': 7955, 'loss/train': 0.8618898689746857} 01/27/2022 03:20:31 - INFO - codeparrot_training - Step 7956: {'lr': 0.0004812444372997479, 'samples': 1527744, 'steps': 7956, 'loss/train': 1.1566316485404968} 01/27/2022 03:20:35 - INFO - codeparrot_training - Step 7957: {'lr': 0.00048123821872102023, 'samples': 1527936, 'steps': 7957, 'loss/train': 1.349240005016327} 01/27/2022 03:20:38 - INFO - codeparrot_training - Step 7958: {'lr': 0.00048123199915174153, 'samples': 1528128, 'steps': 7958, 'loss/train': 1.295726716518402} 01/27/2022 03:20:42 - INFO - codeparrot_training - Step 7959: {'lr': 0.0004812257785919384, 'samples': 1528320, 'steps': 7959, 'loss/train': 1.4199441075325012} 01/27/2022 03:20:45 - INFO - codeparrot_training - Step 7960: {'lr': 0.00048121955704163744, 'samples': 1528512, 'steps': 7960, 'loss/train': 0.6226876527070999} 01/27/2022 03:20:48 - INFO - codeparrot_training - Step 7961: {'lr': 0.00048121333450086524, 'samples': 1528704, 'steps': 7961, 'loss/train': 1.167978286743164} 01/27/2022 03:20:52 - INFO - codeparrot_training - Step 7962: {'lr': 0.00048120711096964866, 'samples': 1528896, 'steps': 7962, 'loss/train': 0.8489412367343903} 01/27/2022 03:20:55 - INFO - codeparrot_training - Step 7963: {'lr': 0.0004812008864480142, 'samples': 1529088, 'steps': 7963, 'loss/train': 0.868572860956192} 01/27/2022 03:20:58 - INFO - codeparrot_training - Step 7964: {'lr': 0.0004811946609359885, 'samples': 1529280, 'steps': 7964, 'loss/train': 0.8245476186275482} 01/27/2022 03:21:01 - INFO - codeparrot_training - Step 7965: {'lr': 0.00048118843443359827, 'samples': 1529472, 'steps': 7965, 'loss/train': 0.8326760530471802} 01/27/2022 03:21:04 - INFO - codeparrot_training - Step 7966: {'lr': 0.00048118220694087023, 'samples': 1529664, 'steps': 7966, 'loss/train': 0.8128527402877808} 01/27/2022 03:21:07 - INFO - codeparrot_training - Step 7967: {'lr': 0.00048117597845783106, 'samples': 1529856, 'steps': 7967, 'loss/train': 1.1205047070980072} 01/27/2022 03:21:12 - INFO - codeparrot_training - Step 7968: {'lr': 0.0004811697489845074, 'samples': 1530048, 'steps': 7968, 'loss/train': 0.7544558644294739} 01/27/2022 03:21:15 - INFO - codeparrot_training - Step 7969: {'lr': 0.0004811635185209259, 'samples': 1530240, 'steps': 7969, 'loss/train': 0.6268008202314377} 01/27/2022 03:21:18 - INFO - codeparrot_training - Step 7970: {'lr': 0.0004811572870671133, 'samples': 1530432, 'steps': 7970, 
'loss/train': 0.07074415124952793} 01/27/2022 03:21:21 - INFO - codeparrot_training - Step 7971: {'lr': 0.0004811510546230963, 'samples': 1530624, 'steps': 7971, 'loss/train': 0.8258163928985596} 01/27/2022 03:21:24 - INFO - codeparrot_training - Step 7972: {'lr': 0.0004811448211889016, 'samples': 1530816, 'steps': 7972, 'loss/train': 0.910137802362442} 01/27/2022 03:21:27 - INFO - codeparrot_training - Step 7973: {'lr': 0.0004811385867645558, 'samples': 1531008, 'steps': 7973, 'loss/train': 1.037726640701294} 01/27/2022 03:21:31 - INFO - codeparrot_training - Step 7974: {'lr': 0.00048113235135008574, 'samples': 1531200, 'steps': 7974, 'loss/train': 0.24469352513551712} 01/27/2022 03:21:34 - INFO - codeparrot_training - Step 7975: {'lr': 0.0004811261149455181, 'samples': 1531392, 'steps': 7975, 'loss/train': 0.9502870738506317} 01/27/2022 03:21:37 - INFO - codeparrot_training - Step 7976: {'lr': 0.0004811198775508796, 'samples': 1531584, 'steps': 7976, 'loss/train': 0.7919387519359589} 01/27/2022 03:21:43 - INFO - codeparrot_training - Step 7977: {'lr': 0.0004811136391661969, 'samples': 1531776, 'steps': 7977, 'loss/train': 0.4590235501527786} 01/27/2022 03:21:46 - INFO - codeparrot_training - Step 7978: {'lr': 0.0004811073997914967, 'samples': 1531968, 'steps': 7978, 'loss/train': 0.34629546850919724} 01/27/2022 03:21:49 - INFO - codeparrot_training - Step 7979: {'lr': 0.00048110115942680585, 'samples': 1532160, 'steps': 7979, 'loss/train': 0.7414522916078568} 01/27/2022 03:21:53 - INFO - codeparrot_training - Step 7980: {'lr': 0.000481094918072151, 'samples': 1532352, 'steps': 7980, 'loss/train': 0.5606665313243866} 01/27/2022 03:21:56 - INFO - codeparrot_training - Step 7981: {'lr': 0.0004810886757275589, 'samples': 1532544, 'steps': 7981, 'loss/train': 0.11379532143473625} 01/27/2022 03:21:59 - INFO - codeparrot_training - Step 7982: {'lr': 0.0004810824323930563, 'samples': 1532736, 'steps': 7982, 'loss/train': 0.7716605365276337} 01/27/2022 03:22:02 - INFO - codeparrot_training - Step 7983: {'lr': 0.00048107618806866994, 'samples': 1532928, 'steps': 7983, 'loss/train': 0.6893970519304276} 01/27/2022 03:22:05 - INFO - codeparrot_training - Step 7984: {'lr': 0.0004810699427544265, 'samples': 1533120, 'steps': 7984, 'loss/train': 0.7687061727046967} 01/27/2022 03:22:10 - INFO - codeparrot_training - Step 7985: {'lr': 0.00048106369645035284, 'samples': 1533312, 'steps': 7985, 'loss/train': 1.1575816869735718} 01/27/2022 03:22:13 - INFO - codeparrot_training - Step 7986: {'lr': 0.0004810574491564757, 'samples': 1533504, 'steps': 7986, 'loss/train': 0.6350634545087814} 01/27/2022 03:22:16 - INFO - codeparrot_training - Step 7987: {'lr': 0.0004810512008728218, 'samples': 1533696, 'steps': 7987, 'loss/train': 0.4842468202114105} 01/27/2022 03:22:19 - INFO - codeparrot_training - Step 7988: {'lr': 0.00048104495159941794, 'samples': 1533888, 'steps': 7988, 'loss/train': 0.7759857773780823} 01/27/2022 03:22:22 - INFO - codeparrot_training - Step 7989: {'lr': 0.00048103870133629084, 'samples': 1534080, 'steps': 7989, 'loss/train': 0.09200944751501083} 01/27/2022 03:22:25 - INFO - codeparrot_training - Step 7990: {'lr': 0.00048103245008346735, 'samples': 1534272, 'steps': 7990, 'loss/train': 0.6041377633810043} 01/27/2022 03:22:29 - INFO - codeparrot_training - Step 7991: {'lr': 0.0004810261978409742, 'samples': 1534464, 'steps': 7991, 'loss/train': 0.8970527350902557} 01/27/2022 03:22:32 - INFO - codeparrot_training - Step 7992: {'lr': 0.00048101994460883815, 'samples': 1534656, 'steps': 7992, 
'loss/train': 1.095897763967514} 01/27/2022 03:22:35 - INFO - codeparrot_training - Step 7993: {'lr': 0.00048101369038708596, 'samples': 1534848, 'steps': 7993, 'loss/train': 1.222396194934845} 01/27/2022 03:22:41 - INFO - codeparrot_training - Step 7994: {'lr': 0.0004810074351757446, 'samples': 1535040, 'steps': 7994, 'loss/train': 0.5680525302886963} 01/27/2022 03:22:44 - INFO - codeparrot_training - Step 7995: {'lr': 0.00048100117897484064, 'samples': 1535232, 'steps': 7995, 'loss/train': 0.6094139367341995} 01/27/2022 03:22:47 - INFO - codeparrot_training - Step 7996: {'lr': 0.0004809949217844011, 'samples': 1535424, 'steps': 7996, 'loss/train': 1.2380552887916565} 01/27/2022 03:22:50 - INFO - codeparrot_training - Step 7997: {'lr': 0.00048098866360445254, 'samples': 1535616, 'steps': 7997, 'loss/train': 0.9796683490276337} 01/27/2022 03:22:54 - INFO - codeparrot_training - Step 7998: {'lr': 0.00048098240443502195, 'samples': 1535808, 'steps': 7998, 'loss/train': 0.4951961785554886} 01/27/2022 03:22:57 - INFO - codeparrot_training - Step 7999: {'lr': 0.000480976144276136, 'samples': 1536000, 'steps': 7999, 'loss/train': 0.8134261965751648} 01/27/2022 03:22:57 - INFO - codeparrot_training - Evaluating and saving model checkpoint 01/27/2022 03:23:14 - WARNING - huggingface_hub.repository - Several commits (4) will be pushed upstream. 01/27/2022 03:23:14 - WARNING - huggingface_hub.repository - The progress bars may be unreliable. 01/27/2022 03:24:22 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py 8aa5ec4..52f50af royal-monkey-12 -> royal-monkey-12 01/27/2022 03:24:26 - INFO - codeparrot_training - Step 8000: {'lr': 0.0004809698831278217, 'samples': 1536192, 'steps': 8000, 'loss/train': 1.043077826499939} 01/27/2022 03:24:29 - INFO - codeparrot_training - Step 8001: {'lr': 0.0004809636209901057, 'samples': 1536384, 'steps': 8001, 'loss/train': 0.8112408220767975} 01/27/2022 03:24:32 - INFO - codeparrot_training - Step 8002: {'lr': 0.00048095735786301495, 'samples': 1536576, 'steps': 8002, 'loss/train': 0.5213453024625778} 01/27/2022 03:24:37 - INFO - codeparrot_training - Step 8003: {'lr': 0.00048095109374657617, 'samples': 1536768, 'steps': 8003, 'loss/train': 0.46370740234851837} 01/27/2022 03:24:41 - INFO - codeparrot_training - Step 8004: {'lr': 0.00048094482864081625, 'samples': 1536960, 'steps': 8004, 'loss/train': 0.9975743293762207} 01/27/2022 03:24:44 - INFO - codeparrot_training - Step 8005: {'lr': 0.00048093856254576196, 'samples': 1537152, 'steps': 8005, 'loss/train': 0.8003569543361664} 01/27/2022 03:24:47 - INFO - codeparrot_training - Step 8006: {'lr': 0.0004809322954614403, 'samples': 1537344, 'steps': 8006, 'loss/train': 1.1291024386882782} 01/27/2022 03:24:50 - INFO - codeparrot_training - Step 8007: {'lr': 0.00048092602738787795, 'samples': 1537536, 'steps': 8007, 'loss/train': 0.6338742971420288} 01/27/2022 03:24:53 - INFO - codeparrot_training - Step 8008: {'lr': 0.00048091975832510183, 'samples': 1537728, 'steps': 8008, 'loss/train': 0.8993458449840546} 01/27/2022 03:24:57 - INFO - codeparrot_training - Step 8009: {'lr': 0.00048091348827313885, 'samples': 1537920, 'steps': 8009, 'loss/train': 0.23654162138700485} 01/27/2022 03:25:00 - INFO - codeparrot_training - Step 8010: {'lr': 0.0004809072172320157, 'samples': 1538112, 'steps': 8010, 'loss/train': 0.715812474489212} 01/27/2022 03:25:03 - INFO - codeparrot_training - Step 8011: {'lr': 0.0004809009452017594, 'samples': 1538304, 'steps': 8011, 'loss/train': 
0.7412209063768387} 01/27/2022 03:25:07 - INFO - codeparrot_training - Step 8012: {'lr': 0.00048089467218239687, 'samples': 1538496, 'steps': 8012, 'loss/train': 0.8899330794811249} 01/27/2022 03:25:11 - INFO - codeparrot_training - Step 8013: {'lr': 0.0004808883981739548, 'samples': 1538688, 'steps': 8013, 'loss/train': 0.8792992830276489} 01/27/2022 03:25:14 - INFO - codeparrot_training - Step 8014: {'lr': 0.00048088212317646016, 'samples': 1538880, 'steps': 8014, 'loss/train': 0.978124737739563} 01/27/2022 03:25:17 - INFO - codeparrot_training - Step 8015: {'lr': 0.00048087584718993975, 'samples': 1539072, 'steps': 8015, 'loss/train': 0.7189581245183945} 01/27/2022 03:25:20 - INFO - codeparrot_training - Step 8016: {'lr': 0.0004808695702144206, 'samples': 1539264, 'steps': 8016, 'loss/train': 0.7156959772109985} 01/27/2022 03:25:23 - INFO - codeparrot_training - Step 8017: {'lr': 0.0004808632922499295, 'samples': 1539456, 'steps': 8017, 'loss/train': 0.9469901919364929} 01/27/2022 03:25:26 - INFO - codeparrot_training - Step 8018: {'lr': 0.00048085701329649336, 'samples': 1539648, 'steps': 8018, 'loss/train': 1.0204984545707703} 01/27/2022 03:25:29 - INFO - codeparrot_training - Step 8019: {'lr': 0.0004808507333541391, 'samples': 1539840, 'steps': 8019, 'loss/train': 0.7888233661651611} 01/27/2022 03:25:32 - INFO - codeparrot_training - Step 8020: {'lr': 0.00048084445242289355, 'samples': 1540032, 'steps': 8020, 'loss/train': 0.7355433851480484} 01/27/2022 03:25:39 - INFO - codeparrot_training - Step 8021: {'lr': 0.0004808381705027837, 'samples': 1540224, 'steps': 8021, 'loss/train': 0.6752677112817764} 01/27/2022 03:25:42 - INFO - codeparrot_training - Step 8022: {'lr': 0.00048083188759383646, 'samples': 1540416, 'steps': 8022, 'loss/train': 0.938993901014328} 01/27/2022 03:25:45 - INFO - codeparrot_training - Step 8023: {'lr': 0.00048082560369607863, 'samples': 1540608, 'steps': 8023, 'loss/train': 0.7723834812641144} 01/27/2022 03:25:48 - INFO - codeparrot_training - Step 8024: {'lr': 0.0004808193188095372, 'samples': 1540800, 'steps': 8024, 'loss/train': 1.8589260578155518} 01/27/2022 03:25:51 - INFO - codeparrot_training - Step 8025: {'lr': 0.00048081303293423923, 'samples': 1540992, 'steps': 8025, 'loss/train': 0.8012058734893799} 01/27/2022 03:25:55 - INFO - codeparrot_training - Step 8026: {'lr': 0.0004808067460702115, 'samples': 1541184, 'steps': 8026, 'loss/train': 5.510090231895447} 01/27/2022 03:25:58 - INFO - codeparrot_training - Step 8027: {'lr': 0.00048080045821748086, 'samples': 1541376, 'steps': 8027, 'loss/train': 0.7399517744779587} 01/27/2022 03:26:01 - INFO - codeparrot_training - Step 8028: {'lr': 0.00048079416937607436, 'samples': 1541568, 'steps': 8028, 'loss/train': 1.079673707485199} 01/27/2022 03:26:04 - INFO - codeparrot_training - Step 8029: {'lr': 0.000480787879546019, 'samples': 1541760, 'steps': 8029, 'loss/train': 1.1063225269317627} 01/27/2022 03:26:08 - INFO - codeparrot_training - Step 8030: {'lr': 0.00048078158872734157, 'samples': 1541952, 'steps': 8030, 'loss/train': 0.8407124876976013} 01/27/2022 03:26:12 - INFO - codeparrot_training - Step 8031: {'lr': 0.0004807752969200691, 'samples': 1542144, 'steps': 8031, 'loss/train': 1.010835886001587} 01/27/2022 03:26:15 - INFO - codeparrot_training - Step 8032: {'lr': 0.0004807690041242286, 'samples': 1542336, 'steps': 8032, 'loss/train': 0.8546279668807983} 01/27/2022 03:26:18 - INFO - codeparrot_training - Step 8033: {'lr': 0.00048076271033984687, 'samples': 1542528, 'steps': 8033, 'loss/train': 
1.0893819630146027} 01/27/2022 03:26:21 - INFO - codeparrot_training - Step 8034: {'lr': 0.00048075641556695107, 'samples': 1542720, 'steps': 8034, 'loss/train': 0.8026865422725677} 01/27/2022 03:26:24 - INFO - codeparrot_training - Step 8035: {'lr': 0.000480750119805568, 'samples': 1542912, 'steps': 8035, 'loss/train': 1.0517053306102753} 01/27/2022 03:26:27 - INFO - codeparrot_training - Step 8036: {'lr': 0.0004807438230557247, 'samples': 1543104, 'steps': 8036, 'loss/train': 0.8666618764400482} 01/27/2022 03:26:30 - INFO - codeparrot_training - Step 8037: {'lr': 0.00048073752531744814, 'samples': 1543296, 'steps': 8037, 'loss/train': 0.7995831370353699} 01/27/2022 03:26:36 - INFO - codeparrot_training - Step 8038: {'lr': 0.0004807312265907653, 'samples': 1543488, 'steps': 8038, 'loss/train': 1.16632479429245} 01/27/2022 03:26:39 - INFO - codeparrot_training - Step 8039: {'lr': 0.0004807249268757031, 'samples': 1543680, 'steps': 8039, 'loss/train': 0.9676771759986877} 01/27/2022 03:26:42 - INFO - codeparrot_training - Step 8040: {'lr': 0.00048071862617228854, 'samples': 1543872, 'steps': 8040, 'loss/train': 0.5639373064041138} 01/27/2022 03:26:45 - INFO - codeparrot_training - Step 8041: {'lr': 0.0004807123244805488, 'samples': 1544064, 'steps': 8041, 'loss/train': 0.8421626687049866} 01/27/2022 03:26:48 - INFO - codeparrot_training - Step 8042: {'lr': 0.0004807060218005106, 'samples': 1544256, 'steps': 8042, 'loss/train': 1.401572048664093} 01/27/2022 03:26:52 - INFO - codeparrot_training - Step 8043: {'lr': 0.00048069971813220107, 'samples': 1544448, 'steps': 8043, 'loss/train': 0.6346340328454971} 01/27/2022 03:26:55 - INFO - codeparrot_training - Step 8044: {'lr': 0.0004806934134756472, 'samples': 1544640, 'steps': 8044, 'loss/train': 0.7563070356845856} 01/27/2022 03:26:58 - INFO - codeparrot_training - Step 8045: {'lr': 0.0004806871078308761, 'samples': 1544832, 'steps': 8045, 'loss/train': 1.6928755044937134} 01/27/2022 03:27:01 - INFO - codeparrot_training - Step 8046: {'lr': 0.0004806808011979146, 'samples': 1545024, 'steps': 8046, 'loss/train': 0.8124042749404907} 01/27/2022 03:27:05 - INFO - codeparrot_training - Step 8047: {'lr': 0.00048067449357678984, 'samples': 1545216, 'steps': 8047, 'loss/train': 0.9979931116104126} 01/27/2022 03:27:08 - INFO - codeparrot_training - Step 8048: {'lr': 0.0004806681849675287, 'samples': 1545408, 'steps': 8048, 'loss/train': 0.8186987042427063} 01/27/2022 03:27:12 - INFO - codeparrot_training - Step 8049: {'lr': 0.00048066187537015837, 'samples': 1545600, 'steps': 8049, 'loss/train': 0.9291662871837616} 01/27/2022 03:27:15 - INFO - codeparrot_training - Step 8050: {'lr': 0.00048065556478470584, 'samples': 1545792, 'steps': 8050, 'loss/train': 0.7209126502275467} 01/27/2022 03:27:18 - INFO - codeparrot_training - Step 8051: {'lr': 0.0004806492532111981, 'samples': 1545984, 'steps': 8051, 'loss/train': 0.5314584821462631} 01/27/2022 03:27:21 - INFO - codeparrot_training - Step 8052: {'lr': 0.00048064294064966215, 'samples': 1546176, 'steps': 8052, 'loss/train': 1.0493180751800537} 01/27/2022 03:27:24 - INFO - codeparrot_training - Step 8053: {'lr': 0.00048063662710012513, 'samples': 1546368, 'steps': 8053, 'loss/train': 0.9219168126583099} 01/27/2022 03:27:27 - INFO - codeparrot_training - Step 8054: {'lr': 0.000480630312562614, 'samples': 1546560, 'steps': 8054, 'loss/train': 0.7618756592273712} 01/27/2022 03:27:30 - INFO - codeparrot_training - Step 8055: {'lr': 0.0004806239970371558, 'samples': 1546752, 'steps': 8055, 'loss/train': 
0.9665871262550354} 01/27/2022 03:27:35 - INFO - codeparrot_training - Step 8056: {'lr': 0.0004806176805237777, 'samples': 1546944, 'steps': 8056, 'loss/train': 0.9051233232021332} 01/27/2022 03:27:38 - INFO - codeparrot_training - Step 8057: {'lr': 0.0004806113630225066, 'samples': 1547136, 'steps': 8057, 'loss/train': 0.36071909219026566} 01/27/2022 03:27:41 - INFO - codeparrot_training - Step 8058: {'lr': 0.0004806050445333697, 'samples': 1547328, 'steps': 8058, 'loss/train': 0.921308308839798} 01/27/2022 03:27:44 - INFO - codeparrot_training - Step 8059: {'lr': 0.00048059872505639415, 'samples': 1547520, 'steps': 8059, 'loss/train': 0.8906187415122986} 01/27/2022 03:27:47 - INFO - codeparrot_training - Step 8060: {'lr': 0.0004805924045916067, 'samples': 1547712, 'steps': 8060, 'loss/train': 1.037989318370819} 01/27/2022 03:27:51 - INFO - codeparrot_training - Step 8061: {'lr': 0.00048058608313903474, 'samples': 1547904, 'steps': 8061, 'loss/train': 0.5841266065835953} 01/27/2022 03:27:54 - INFO - codeparrot_training - Step 8062: {'lr': 0.0004805797606987051, 'samples': 1548096, 'steps': 8062, 'loss/train': 0.7110297828912735} 01/27/2022 03:27:57 - INFO - codeparrot_training - Step 8063: {'lr': 0.0004805734372706451, 'samples': 1548288, 'steps': 8063, 'loss/train': 1.0769393742084503} 01/27/2022 03:28:00 - INFO - codeparrot_training - Step 8064: {'lr': 0.0004805671128548816, 'samples': 1548480, 'steps': 8064, 'loss/train': 0.6896863281726837} 01/27/2022 03:28:04 - INFO - codeparrot_training - Step 8065: {'lr': 0.00048056078745144183, 'samples': 1548672, 'steps': 8065, 'loss/train': 0.631222665309906} 01/27/2022 03:28:08 - INFO - codeparrot_training - Step 8066: {'lr': 0.0004805544610603529, 'samples': 1548864, 'steps': 8066, 'loss/train': 1.1238567531108856} 01/27/2022 03:28:11 - INFO - codeparrot_training - Step 8067: {'lr': 0.00048054813368164184, 'samples': 1549056, 'steps': 8067, 'loss/train': 0.9818770587444305} 01/27/2022 03:28:14 - INFO - codeparrot_training - Step 8068: {'lr': 0.00048054180531533576, 'samples': 1549248, 'steps': 8068, 'loss/train': 0.8422971367835999} 01/27/2022 03:28:17 - INFO - codeparrot_training - Step 8069: {'lr': 0.00048053547596146185, 'samples': 1549440, 'steps': 8069, 'loss/train': 0.10013305395841599} 01/27/2022 03:28:20 - INFO - codeparrot_training - Step 8070: {'lr': 0.0004805291456200471, 'samples': 1549632, 'steps': 8070, 'loss/train': 0.22986473143100739} 01/27/2022 03:28:23 - INFO - codeparrot_training - Step 8071: {'lr': 0.0004805228142911188, 'samples': 1549824, 'steps': 8071, 'loss/train': 0.8984387218952179} 01/27/2022 03:28:26 - INFO - codeparrot_training - Step 8072: {'lr': 0.0004805164819747038, 'samples': 1550016, 'steps': 8072, 'loss/train': 0.781223863363266} 01/27/2022 03:28:30 - INFO - codeparrot_training - Step 8073: {'lr': 0.0004805101486708295, 'samples': 1550208, 'steps': 8073, 'loss/train': 1.1792592108249664} 01/27/2022 03:28:35 - INFO - codeparrot_training - Step 8074: {'lr': 0.0004805038143795229, 'samples': 1550400, 'steps': 8074, 'loss/train': 1.1394618451595306} 01/27/2022 03:28:38 - INFO - codeparrot_training - Step 8075: {'lr': 0.00048049747910081114, 'samples': 1550592, 'steps': 8075, 'loss/train': 0.16372893378138542} 01/27/2022 03:28:41 - INFO - codeparrot_training - Step 8076: {'lr': 0.0004804911428347214, 'samples': 1550784, 'steps': 8076, 'loss/train': 0.7571381628513336} 01/27/2022 03:28:44 - INFO - codeparrot_training - Step 8077: {'lr': 0.0004804848055812807, 'samples': 1550976, 'steps': 8077, 'loss/train': 
1.0640203356742859} 01/27/2022 03:28:47 - INFO - codeparrot_training - Step 8078: {'lr': 0.0004804784673405164, 'samples': 1551168, 'steps': 8078, 'loss/train': 1.1849548816680908} 01/27/2022 03:28:50 - INFO - codeparrot_training - Step 8079: {'lr': 0.00048047212811245545, 'samples': 1551360, 'steps': 8079, 'loss/train': 0.951287180185318} 01/27/2022 03:28:53 - INFO - codeparrot_training - Step 8080: {'lr': 0.00048046578789712516, 'samples': 1551552, 'steps': 8080, 'loss/train': 0.6085748076438904} 01/27/2022 03:28:57 - INFO - codeparrot_training - Step 8081: {'lr': 0.0004804594466945525, 'samples': 1551744, 'steps': 8081, 'loss/train': 0.8607314229011536} 01/27/2022 03:29:01 - INFO - codeparrot_training - Step 8082: {'lr': 0.00048045310450476486, 'samples': 1551936, 'steps': 8082, 'loss/train': 1.0743852853775024} 01/27/2022 03:29:04 - INFO - codeparrot_training - Step 8083: {'lr': 0.0004804467613277893, 'samples': 1552128, 'steps': 8083, 'loss/train': 1.2686274647712708} 01/27/2022 03:29:07 - INFO - codeparrot_training - Step 8084: {'lr': 0.00048044041716365296, 'samples': 1552320, 'steps': 8084, 'loss/train': 0.6751388311386108} 01/27/2022 03:29:11 - INFO - codeparrot_training - Step 8085: {'lr': 0.000480434072012383, 'samples': 1552512, 'steps': 8085, 'loss/train': 0.909879595041275} 01/27/2022 03:29:14 - INFO - codeparrot_training - Step 8086: {'lr': 0.0004804277258740067, 'samples': 1552704, 'steps': 8086, 'loss/train': 0.6879605054855347} 01/27/2022 03:29:17 - INFO - codeparrot_training - Step 8087: {'lr': 0.0004804213787485512, 'samples': 1552896, 'steps': 8087, 'loss/train': 0.90841144323349} 01/27/2022 03:29:20 - INFO - codeparrot_training - Step 8088: {'lr': 0.00048041503063604366, 'samples': 1553088, 'steps': 8088, 'loss/train': 0.778977245092392} 01/27/2022 03:29:23 - INFO - codeparrot_training - Step 8089: {'lr': 0.00048040868153651124, 'samples': 1553280, 'steps': 8089, 'loss/train': 0.7726026177406311} 01/27/2022 03:29:26 - INFO - codeparrot_training - Step 8090: {'lr': 0.00048040233144998123, 'samples': 1553472, 'steps': 8090, 'loss/train': 0.18543855473399162} 01/27/2022 03:29:31 - INFO - codeparrot_training - Step 8091: {'lr': 0.0004803959803764808, 'samples': 1553664, 'steps': 8091, 'loss/train': 1.0549319386482239} 01/27/2022 03:29:34 - INFO - codeparrot_training - Step 8092: {'lr': 0.0004803896283160372, 'samples': 1553856, 'steps': 8092, 'loss/train': 0.9652485251426697} 01/27/2022 03:29:37 - INFO - codeparrot_training - Step 8093: {'lr': 0.0004803832752686775, 'samples': 1554048, 'steps': 8093, 'loss/train': 0.6373940706253052} 01/27/2022 03:29:40 - INFO - codeparrot_training - Step 8094: {'lr': 0.00048037692123442904, 'samples': 1554240, 'steps': 8094, 'loss/train': 1.1623391211032867} 01/27/2022 03:29:43 - INFO - codeparrot_training - Step 8095: {'lr': 0.000480370566213319, 'samples': 1554432, 'steps': 8095, 'loss/train': 1.516098439693451} 01/27/2022 03:29:46 - INFO - codeparrot_training - Step 8096: {'lr': 0.00048036421020537464, 'samples': 1554624, 'steps': 8096, 'loss/train': 0.4236506223678589} 01/27/2022 03:29:50 - INFO - codeparrot_training - Step 8097: {'lr': 0.0004803578532106231, 'samples': 1554816, 'steps': 8097, 'loss/train': 0.9007028639316559} 01/27/2022 03:29:53 - INFO - codeparrot_training - Step 8098: {'lr': 0.00048035149522909174, 'samples': 1555008, 'steps': 8098, 'loss/train': 0.7591651082038879} 01/27/2022 03:29:56 - INFO - codeparrot_training - Step 8099: {'lr': 0.0004803451362608076, 'samples': 1555200, 'steps': 8099, 'loss/train': 
0.9195417165756226} 01/27/2022 03:30:01 - INFO - codeparrot_training - Step 8100: {'lr': 0.00048033877630579815, 'samples': 1555392, 'steps': 8100, 'loss/train': 0.824456512928009} 01/27/2022 03:30:04 - INFO - codeparrot_training - Step 8101: {'lr': 0.00048033241536409043, 'samples': 1555584, 'steps': 8101, 'loss/train': 0.9927333891391754} 01/27/2022 03:30:07 - INFO - codeparrot_training - Step 8102: {'lr': 0.0004803260534357119, 'samples': 1555776, 'steps': 8102, 'loss/train': 0.9983004927635193} 01/27/2022 03:30:10 - INFO - codeparrot_training - Step 8103: {'lr': 0.00048031969052068956, 'samples': 1555968, 'steps': 8103, 'loss/train': 0.9945923388004303} 01/27/2022 03:30:14 - INFO - codeparrot_training - Step 8104: {'lr': 0.00048031332661905093, 'samples': 1556160, 'steps': 8104, 'loss/train': 0.6686235666275024} 01/27/2022 03:30:17 - INFO - codeparrot_training - Step 8105: {'lr': 0.000480306961730823, 'samples': 1556352, 'steps': 8105, 'loss/train': 0.8327425718307495} 01/27/2022 03:30:20 - INFO - codeparrot_training - Step 8106: {'lr': 0.00048030059585603326, 'samples': 1556544, 'steps': 8106, 'loss/train': 1.0259461998939514} 01/27/2022 03:30:23 - INFO - codeparrot_training - Step 8107: {'lr': 0.0004802942289947089, 'samples': 1556736, 'steps': 8107, 'loss/train': 0.8306059241294861} 01/27/2022 03:30:26 - INFO - codeparrot_training - Step 8108: {'lr': 0.00048028786114687715, 'samples': 1556928, 'steps': 8108, 'loss/train': 1.169945865869522} 01/27/2022 03:30:31 - INFO - codeparrot_training - Step 8109: {'lr': 0.0004802814923125654, 'samples': 1557120, 'steps': 8109, 'loss/train': 0.9873835444450378} 01/27/2022 03:30:34 - INFO - codeparrot_training - Step 8110: {'lr': 0.00048027512249180083, 'samples': 1557312, 'steps': 8110, 'loss/train': 0.8116419017314911} 01/27/2022 03:30:37 - INFO - codeparrot_training - Step 8111: {'lr': 0.0004802687516846107, 'samples': 1557504, 'steps': 8111, 'loss/train': 0.4153676927089691} 01/27/2022 03:30:40 - INFO - codeparrot_training - Step 8112: {'lr': 0.0004802623798910224, 'samples': 1557696, 'steps': 8112, 'loss/train': 0.8919033408164978} 01/27/2022 03:30:43 - INFO - codeparrot_training - Step 8113: {'lr': 0.00048025600711106323, 'samples': 1557888, 'steps': 8113, 'loss/train': 0.5574623197317123} 01/27/2022 03:30:46 - INFO - codeparrot_training - Step 8114: {'lr': 0.00048024963334476035, 'samples': 1558080, 'steps': 8114, 'loss/train': 0.8037180304527283} 01/27/2022 03:30:49 - INFO - codeparrot_training - Step 8115: {'lr': 0.00048024325859214123, 'samples': 1558272, 'steps': 8115, 'loss/train': 0.6282425969839096} 01/27/2022 03:30:53 - INFO - codeparrot_training - Step 8116: {'lr': 0.00048023688285323305, 'samples': 1558464, 'steps': 8116, 'loss/train': 0.8071230947971344} 01/27/2022 03:30:56 - INFO - codeparrot_training - Step 8117: {'lr': 0.0004802305061280632, 'samples': 1558656, 'steps': 8117, 'loss/train': 1.0996383726596832} 01/27/2022 03:31:01 - INFO - codeparrot_training - Step 8118: {'lr': 0.0004802241284166589, 'samples': 1558848, 'steps': 8118, 'loss/train': 0.5692374408245087} 01/27/2022 03:31:04 - INFO - codeparrot_training - Step 8119: {'lr': 0.00048021774971904765, 'samples': 1559040, 'steps': 8119, 'loss/train': 1.2666015923023224} 01/27/2022 03:31:07 - INFO - codeparrot_training - Step 8120: {'lr': 0.0004802113700352566, 'samples': 1559232, 'steps': 8120, 'loss/train': 0.8829182982444763} 01/27/2022 03:31:11 - INFO - codeparrot_training - Step 8121: {'lr': 0.0004802049893653131, 'samples': 1559424, 'steps': 8121, 'loss/train': 
0.39424681663513184} 01/27/2022 03:31:14 - INFO - codeparrot_training - Step 8122: {'lr': 0.0004801986077092446, 'samples': 1559616, 'steps': 8122, 'loss/train': 0.40269899368286133} 01/27/2022 03:31:17 - INFO - codeparrot_training - Step 8123: {'lr': 0.0004801922250670783, 'samples': 1559808, 'steps': 8123, 'loss/train': 0.9211505949497223} 01/27/2022 03:31:20 - INFO - codeparrot_training - Step 8124: {'lr': 0.0004801858414388416, 'samples': 1560000, 'steps': 8124, 'loss/train': 0.8954902589321136} 01/27/2022 03:31:23 - INFO - codeparrot_training - Step 8125: {'lr': 0.0004801794568245619, 'samples': 1560192, 'steps': 8125, 'loss/train': 0.6637552678585052} 01/27/2022 03:31:28 - INFO - codeparrot_training - Step 8126: {'lr': 0.00048017307122426653, 'samples': 1560384, 'steps': 8126, 'loss/train': 0.791746973991394} 01/27/2022 03:31:31 - INFO - codeparrot_training - Step 8127: {'lr': 0.0004801666846379827, 'samples': 1560576, 'steps': 8127, 'loss/train': 1.0226072072982788} 01/27/2022 03:31:34 - INFO - codeparrot_training - Step 8128: {'lr': 0.00048016029706573793, 'samples': 1560768, 'steps': 8128, 'loss/train': 0.8508562445640564} 01/27/2022 03:31:37 - INFO - codeparrot_training - Step 8129: {'lr': 0.0004801539085075596, 'samples': 1560960, 'steps': 8129, 'loss/train': 0.6336996406316757} 01/27/2022 03:31:40 - INFO - codeparrot_training - Step 8130: {'lr': 0.0004801475189634749, 'samples': 1561152, 'steps': 8130, 'loss/train': 0.0727000292390585} 01/27/2022 03:31:43 - INFO - codeparrot_training - Step 8131: {'lr': 0.0004801411284335114, 'samples': 1561344, 'steps': 8131, 'loss/train': 0.5691192895174026} 01/27/2022 03:31:47 - INFO - codeparrot_training - Step 8132: {'lr': 0.0004801347369176963, 'samples': 1561536, 'steps': 8132, 'loss/train': 0.6783358007669449} 01/27/2022 03:31:50 - INFO - codeparrot_training - Step 8133: {'lr': 0.0004801283444160571, 'samples': 1561728, 'steps': 8133, 'loss/train': 0.9730697572231293} 01/27/2022 03:31:53 - INFO - codeparrot_training - Step 8134: {'lr': 0.0004801219509286212, 'samples': 1561920, 'steps': 8134, 'loss/train': 1.506675124168396} 01/27/2022 03:31:58 - INFO - codeparrot_training - Step 8135: {'lr': 0.00048011555645541585, 'samples': 1562112, 'steps': 8135, 'loss/train': 1.2385354042053223} 01/27/2022 03:32:01 - INFO - codeparrot_training - Step 8136: {'lr': 0.00048010916099646854, 'samples': 1562304, 'steps': 8136, 'loss/train': 0.38743315637111664} 01/27/2022 03:32:04 - INFO - codeparrot_training - Step 8137: {'lr': 0.0004801027645518067, 'samples': 1562496, 'steps': 8137, 'loss/train': 0.28414153307676315} 01/27/2022 03:32:08 - INFO - codeparrot_training - Step 8138: {'lr': 0.00048009636712145764, 'samples': 1562688, 'steps': 8138, 'loss/train': 0.9412962198257446} 01/27/2022 03:32:11 - INFO - codeparrot_training - Step 8139: {'lr': 0.00048008996870544887, 'samples': 1562880, 'steps': 8139, 'loss/train': 1.151311844587326} 01/27/2022 03:32:14 - INFO - codeparrot_training - Step 8140: {'lr': 0.0004800835693038076, 'samples': 1563072, 'steps': 8140, 'loss/train': 1.3115582764148712} 01/27/2022 03:32:17 - INFO - codeparrot_training - Step 8141: {'lr': 0.0004800771689165615, 'samples': 1563264, 'steps': 8141, 'loss/train': 1.599145531654358} 01/27/2022 03:32:20 - INFO - codeparrot_training - Step 8142: {'lr': 0.00048007076754373785, 'samples': 1563456, 'steps': 8142, 'loss/train': 1.6716653108596802} 01/27/2022 03:32:23 - INFO - codeparrot_training - Step 8143: {'lr': 0.00048006436518536403, 'samples': 1563648, 'steps': 8143, 'loss/train': 
0.8925158679485321} 01/27/2022 03:32:26 - INFO - codeparrot_training - Step 8144: {'lr': 0.0004800579618414676, 'samples': 1563840, 'steps': 8144, 'loss/train': 0.7823526263237} 01/27/2022 03:32:31 - INFO - codeparrot_training - Step 8145: {'lr': 0.00048005155751207584, 'samples': 1564032, 'steps': 8145, 'loss/train': 0.8344304859638214} 01/27/2022 03:32:35 - INFO - codeparrot_training - Step 8146: {'lr': 0.0004800451521972163, 'samples': 1564224, 'steps': 8146, 'loss/train': 1.1242567598819733} 01/27/2022 03:32:38 - INFO - codeparrot_training - Step 8147: {'lr': 0.0004800387458969164, 'samples': 1564416, 'steps': 8147, 'loss/train': 0.8553459048271179} 01/27/2022 03:32:41 - INFO - codeparrot_training - Step 8148: {'lr': 0.00048003233861120356, 'samples': 1564608, 'steps': 8148, 'loss/train': 0.7882335484027863} 01/27/2022 03:32:44 - INFO - codeparrot_training - Step 8149: {'lr': 0.00048002593034010516, 'samples': 1564800, 'steps': 8149, 'loss/train': 0.9116120338439941} 01/27/2022 03:32:47 - INFO - codeparrot_training - Step 8150: {'lr': 0.00048001952108364876, 'samples': 1564992, 'steps': 8150, 'loss/train': 0.9876896739006042} 01/27/2022 03:32:50 - INFO - codeparrot_training - Step 8151: {'lr': 0.00048001311084186173, 'samples': 1565184, 'steps': 8151, 'loss/train': 0.5258850753307343} 01/27/2022 03:32:54 - INFO - codeparrot_training - Step 8152: {'lr': 0.0004800066996147716, 'samples': 1565376, 'steps': 8152, 'loss/train': 0.9077394604682922} 01/27/2022 03:33:00 - INFO - codeparrot_training - Step 8153: {'lr': 0.0004800002874024058, 'samples': 1565568, 'steps': 8153, 'loss/train': 0.35936055332422256} 01/27/2022 03:33:04 - INFO - codeparrot_training - Step 8154: {'lr': 0.0004799938742047918, 'samples': 1565760, 'steps': 8154, 'loss/train': 0.9469466507434845} 01/27/2022 03:33:07 - INFO - codeparrot_training - Step 8155: {'lr': 0.0004799874600219571, 'samples': 1565952, 'steps': 8155, 'loss/train': 1.4795794486999512} 01/27/2022 03:33:10 - INFO - codeparrot_training - Step 8156: {'lr': 0.00047998104485392915, 'samples': 1566144, 'steps': 8156, 'loss/train': 0.5568716078996658} 01/27/2022 03:33:13 - INFO - codeparrot_training - Step 8157: {'lr': 0.0004799746287007354, 'samples': 1566336, 'steps': 8157, 'loss/train': 0.6688351482152939} 01/27/2022 03:33:16 - INFO - codeparrot_training - Step 8158: {'lr': 0.00047996821156240333, 'samples': 1566528, 'steps': 8158, 'loss/train': 0.49129338562488556} 01/27/2022 03:33:19 - INFO - codeparrot_training - Step 8159: {'lr': 0.0004799617934389605, 'samples': 1566720, 'steps': 8159, 'loss/train': 1.3405130803585052} 01/27/2022 03:33:22 - INFO - codeparrot_training - Step 8160: {'lr': 0.00047995537433043444, 'samples': 1566912, 'steps': 8160, 'loss/train': 0.6356116086244583} 01/27/2022 03:33:26 - INFO - codeparrot_training - Step 8161: {'lr': 0.00047994895423685246, 'samples': 1567104, 'steps': 8161, 'loss/train': 0.988165408372879} 01/27/2022 03:33:29 - INFO - codeparrot_training - Step 8162: {'lr': 0.0004799425331582423, 'samples': 1567296, 'steps': 8162, 'loss/train': 0.4140506833791733} 01/27/2022 03:33:33 - INFO - codeparrot_training - Step 8163: {'lr': 0.00047993611109463125, 'samples': 1567488, 'steps': 8163, 'loss/train': 1.0700446665287018} 01/27/2022 03:33:36 - INFO - codeparrot_training - Step 8164: {'lr': 0.00047992968804604693, 'samples': 1567680, 'steps': 8164, 'loss/train': 1.029764324426651} 01/27/2022 03:33:40 - INFO - codeparrot_training - Step 8165: {'lr': 0.00047992326401251686, 'samples': 1567872, 'steps': 8165, 'loss/train': 
0.6011467427015305} 01/27/2022 03:33:43 - INFO - codeparrot_training - Step 8166: {'lr': 0.0004799168389940685, 'samples': 1568064, 'steps': 8166, 'loss/train': 0.9301441311836243} 01/27/2022 03:33:46 - INFO - codeparrot_training - Step 8167: {'lr': 0.00047991041299072946, 'samples': 1568256, 'steps': 8167, 'loss/train': 0.8829062283039093} 01/27/2022 03:33:49 - INFO - codeparrot_training - Step 8168: {'lr': 0.00047990398600252713, 'samples': 1568448, 'steps': 8168, 'loss/train': 1.0850442945957184} 01/27/2022 03:33:52 - INFO - codeparrot_training - Step 8169: {'lr': 0.0004798975580294892, 'samples': 1568640, 'steps': 8169, 'loss/train': 0.9833283126354218} 01/27/2022 03:33:56 - INFO - codeparrot_training - Step 8170: {'lr': 0.0004798911290716431, 'samples': 1568832, 'steps': 8170, 'loss/train': 0.9223221838474274} 01/27/2022 03:34:00 - INFO - codeparrot_training - Step 8171: {'lr': 0.0004798846991290164, 'samples': 1569024, 'steps': 8171, 'loss/train': 0.7798307240009308} 01/27/2022 03:34:03 - INFO - codeparrot_training - Step 8172: {'lr': 0.0004798782682016367, 'samples': 1569216, 'steps': 8172, 'loss/train': 0.5039692372083664} 01/27/2022 03:34:06 - INFO - codeparrot_training - Step 8173: {'lr': 0.0004798718362895315, 'samples': 1569408, 'steps': 8173, 'loss/train': 0.8692110478878021} 01/27/2022 03:34:09 - INFO - codeparrot_training - Step 8174: {'lr': 0.0004798654033927283, 'samples': 1569600, 'steps': 8174, 'loss/train': 1.1173674166202545} 01/27/2022 03:34:13 - INFO - codeparrot_training - Step 8175: {'lr': 0.00047985896951125464, 'samples': 1569792, 'steps': 8175, 'loss/train': 0.7724393606185913} 01/27/2022 03:34:16 - INFO - codeparrot_training - Step 8176: {'lr': 0.00047985253464513823, 'samples': 1569984, 'steps': 8176, 'loss/train': 1.0883709490299225} 01/27/2022 03:34:19 - INFO - codeparrot_training - Step 8177: {'lr': 0.00047984609879440655, 'samples': 1570176, 'steps': 8177, 'loss/train': 0.7597270309925079} 01/27/2022 03:34:22 - INFO - codeparrot_training - Step 8178: {'lr': 0.0004798396619590871, 'samples': 1570368, 'steps': 8178, 'loss/train': 0.8613853454589844} 01/27/2022 03:34:25 - INFO - codeparrot_training - Step 8179: {'lr': 0.0004798332241392076, 'samples': 1570560, 'steps': 8179, 'loss/train': 0.9973427653312683} 01/27/2022 03:34:31 - INFO - codeparrot_training - Step 8180: {'lr': 0.0004798267853347955, 'samples': 1570752, 'steps': 8180, 'loss/train': 0.7196406126022339} 01/27/2022 03:34:35 - INFO - codeparrot_training - Step 8181: {'lr': 0.00047982034554587837, 'samples': 1570944, 'steps': 8181, 'loss/train': 0.6903743594884872} 01/27/2022 03:34:38 - INFO - codeparrot_training - Step 8182: {'lr': 0.000479813904772484, 'samples': 1571136, 'steps': 8182, 'loss/train': 0.18207333236932755} 01/27/2022 03:34:41 - INFO - codeparrot_training - Step 8183: {'lr': 0.0004798074630146397, 'samples': 1571328, 'steps': 8183, 'loss/train': 1.000275582075119} 01/27/2022 03:34:44 - INFO - codeparrot_training - Step 8184: {'lr': 0.0004798010202723733, 'samples': 1571520, 'steps': 8184, 'loss/train': 0.5944158285856247} 01/27/2022 03:34:47 - INFO - codeparrot_training - Step 8185: {'lr': 0.00047979457654571223, 'samples': 1571712, 'steps': 8185, 'loss/train': 0.9139080941677094} 01/27/2022 03:34:50 - INFO - codeparrot_training - Step 8186: {'lr': 0.0004797881318346842, 'samples': 1571904, 'steps': 8186, 'loss/train': 1.0535426437854767} 01/27/2022 03:34:54 - INFO - codeparrot_training - Step 8187: {'lr': 0.00047978168613931684, 'samples': 1572096, 'steps': 8187, 'loss/train': 
0.9659009277820587} 01/27/2022 03:34:57 - INFO - codeparrot_training - Step 8188: {'lr': 0.0004797752394596376, 'samples': 1572288, 'steps': 8188, 'loss/train': 1.0677530765533447} 01/27/2022 03:35:02 - INFO - codeparrot_training - Step 8189: {'lr': 0.0004797687917956742, 'samples': 1572480, 'steps': 8189, 'loss/train': 0.6157461851835251} 01/27/2022 03:35:05 - INFO - codeparrot_training - Step 8190: {'lr': 0.0004797623431474543, 'samples': 1572672, 'steps': 8190, 'loss/train': 0.7304432541131973} 01/27/2022 03:35:08 - INFO - codeparrot_training - Step 8191: {'lr': 0.0004797558935150055, 'samples': 1572864, 'steps': 8191, 'loss/train': 4.607888460159302} 01/27/2022 03:35:11 - INFO - codeparrot_training - Step 8192: {'lr': 0.0004797494428983553, 'samples': 1573056, 'steps': 8192, 'loss/train': 1.1640717387199402} 01/27/2022 03:35:14 - INFO - codeparrot_training - Step 8193: {'lr': 0.0004797429912975316, 'samples': 1573248, 'steps': 8193, 'loss/train': 0.9596302807331085} 01/27/2022 03:35:17 - INFO - codeparrot_training - Step 8194: {'lr': 0.00047973653871256173, 'samples': 1573440, 'steps': 8194, 'loss/train': 0.10150989890098572} 01/27/2022 03:35:20 - INFO - codeparrot_training - Step 8195: {'lr': 0.00047973008514347353, 'samples': 1573632, 'steps': 8195, 'loss/train': 1.1124384999275208} 01/27/2022 03:35:24 - INFO - codeparrot_training - Step 8196: {'lr': 0.00047972363059029465, 'samples': 1573824, 'steps': 8196, 'loss/train': 0.25407061725854874} 01/27/2022 03:35:27 - INFO - codeparrot_training - Step 8197: {'lr': 0.0004797171750530526, 'samples': 1574016, 'steps': 8197, 'loss/train': 1.2092508673667908} 01/27/2022 03:35:33 - INFO - codeparrot_training - Step 8198: {'lr': 0.00047971071853177515, 'samples': 1574208, 'steps': 8198, 'loss/train': 0.925349771976471} 01/27/2022 03:35:36 - INFO - codeparrot_training - Step 8199: {'lr': 0.0004797042610264899, 'samples': 1574400, 'steps': 8199, 'loss/train': 0.7024821639060974} 01/27/2022 03:35:39 - INFO - codeparrot_training - Step 8200: {'lr': 0.0004796978025372246, 'samples': 1574592, 'steps': 8200, 'loss/train': 0.5560142397880554} 01/27/2022 03:35:42 - INFO - codeparrot_training - Step 8201: {'lr': 0.0004796913430640068, 'samples': 1574784, 'steps': 8201, 'loss/train': 0.8333023488521576} 01/27/2022 03:35:46 - INFO - codeparrot_training - Step 8202: {'lr': 0.0004796848826068642, 'samples': 1574976, 'steps': 8202, 'loss/train': 1.2532170116901398} 01/27/2022 03:35:49 - INFO - codeparrot_training - Step 8203: {'lr': 0.00047967842116582453, 'samples': 1575168, 'steps': 8203, 'loss/train': 0.6150534600019455} 01/27/2022 03:35:52 - INFO - codeparrot_training - Step 8204: {'lr': 0.00047967195874091547, 'samples': 1575360, 'steps': 8204, 'loss/train': 1.0046133399009705} 01/27/2022 03:35:55 - INFO - codeparrot_training - Step 8205: {'lr': 0.00047966549533216466, 'samples': 1575552, 'steps': 8205, 'loss/train': 0.7340537756681442} 01/27/2022 03:35:58 - INFO - codeparrot_training - Step 8206: {'lr': 0.00047965903093959974, 'samples': 1575744, 'steps': 8206, 'loss/train': 0.9367720484733582} 01/27/2022 03:36:03 - INFO - codeparrot_training - Step 8207: {'lr': 0.0004796525655632484, 'samples': 1575936, 'steps': 8207, 'loss/train': 0.8307023048400879} 01/27/2022 03:36:06 - INFO - codeparrot_training - Step 8208: {'lr': 0.0004796460992031385, 'samples': 1576128, 'steps': 8208, 'loss/train': 0.6186268329620361} 01/27/2022 03:36:09 - INFO - codeparrot_training - Step 8209: {'lr': 0.0004796396318592976, 'samples': 1576320, 'steps': 8209, 'loss/train': 
1.0216898918151855} 01/27/2022 03:36:12 - INFO - codeparrot_training - Step 8210: {'lr': 0.00047963316353175344, 'samples': 1576512, 'steps': 8210, 'loss/train': 0.6277755349874496} 01/27/2022 03:36:15 - INFO - codeparrot_training - Step 8211: {'lr': 0.00047962669422053374, 'samples': 1576704, 'steps': 8211, 'loss/train': 0.9255631864070892} 01/27/2022 03:36:18 - INFO - codeparrot_training - Step 8212: {'lr': 0.0004796202239256662, 'samples': 1576896, 'steps': 8212, 'loss/train': 0.9030594527721405} 01/27/2022 03:36:21 - INFO - codeparrot_training - Step 8213: {'lr': 0.0004796137526471785, 'samples': 1577088, 'steps': 8213, 'loss/train': 0.785363495349884} 01/27/2022 03:36:24 - INFO - codeparrot_training - Step 8214: {'lr': 0.0004796072803850984, 'samples': 1577280, 'steps': 8214, 'loss/train': 0.8516064584255219} 01/27/2022 03:36:31 - INFO - codeparrot_training - Step 8215: {'lr': 0.00047960080713945364, 'samples': 1577472, 'steps': 8215, 'loss/train': 0.8660471141338348} 01/27/2022 03:36:34 - INFO - codeparrot_training - Step 8216: {'lr': 0.0004795943329102719, 'samples': 1577664, 'steps': 8216, 'loss/train': 1.0369035601615906} 01/27/2022 03:36:37 - INFO - codeparrot_training - Step 8217: {'lr': 0.00047958785769758094, 'samples': 1577856, 'steps': 8217, 'loss/train': 0.7699616253376007} 01/27/2022 03:36:40 - INFO - codeparrot_training - Step 8218: {'lr': 0.0004795813815014085, 'samples': 1578048, 'steps': 8218, 'loss/train': 0.42667776346206665} 01/27/2022 03:36:43 - INFO - codeparrot_training - Step 8219: {'lr': 0.0004795749043217824, 'samples': 1578240, 'steps': 8219, 'loss/train': 0.8993985950946808} 01/27/2022 03:36:46 - INFO - codeparrot_training - Step 8220: {'lr': 0.0004795684261587302, 'samples': 1578432, 'steps': 8220, 'loss/train': 0.8993988633155823} 01/27/2022 03:36:49 - INFO - codeparrot_training - Step 8221: {'lr': 0.00047956194701227983, 'samples': 1578624, 'steps': 8221, 'loss/train': 0.9264206886291504} 01/27/2022 03:36:53 - INFO - codeparrot_training - Step 8222: {'lr': 0.000479555466882459, 'samples': 1578816, 'steps': 8222, 'loss/train': 1.0164517164230347} 01/27/2022 03:36:56 - INFO - codeparrot_training - Step 8223: {'lr': 0.00047954898576929534, 'samples': 1579008, 'steps': 8223, 'loss/train': 0.9005726873874664} 01/27/2022 03:37:00 - INFO - codeparrot_training - Step 8224: {'lr': 0.0004795425036728168, 'samples': 1579200, 'steps': 8224, 'loss/train': 0.892943948507309} 01/27/2022 03:37:03 - INFO - codeparrot_training - Step 8225: {'lr': 0.000479536020593051, 'samples': 1579392, 'steps': 8225, 'loss/train': 0.818846583366394} 01/27/2022 03:37:06 - INFO - codeparrot_training - Step 8226: {'lr': 0.0004795295365300258, 'samples': 1579584, 'steps': 8226, 'loss/train': 0.8851862847805023} 01/27/2022 03:37:10 - INFO - codeparrot_training - Step 8227: {'lr': 0.00047952305148376895, 'samples': 1579776, 'steps': 8227, 'loss/train': 1.0030360221862793} 01/27/2022 03:37:13 - INFO - codeparrot_training - Step 8228: {'lr': 0.0004795165654543082, 'samples': 1579968, 'steps': 8228, 'loss/train': 1.0159101784229279} 01/27/2022 03:37:16 - INFO - codeparrot_training - Step 8229: {'lr': 0.0004795100784416714, 'samples': 1580160, 'steps': 8229, 'loss/train': 0.8693952262401581} 01/27/2022 03:37:19 - INFO - codeparrot_training - Step 8230: {'lr': 0.0004795035904458863, 'samples': 1580352, 'steps': 8230, 'loss/train': 0.8573724925518036} 01/27/2022 03:37:22 - INFO - codeparrot_training - Step 8231: {'lr': 0.00047949710146698066, 'samples': 1580544, 'steps': 8231, 'loss/train': 
0.8560326397418976} 01/27/2022 03:37:25 - INFO - codeparrot_training - Step 8232: {'lr': 0.0004794906115049824, 'samples': 1580736, 'steps': 8232, 'loss/train': 0.9670072495937347} 01/27/2022 03:37:30 - INFO - codeparrot_training - Step 8233: {'lr': 0.00047948412055991916, 'samples': 1580928, 'steps': 8233, 'loss/train': 0.5404883176088333} 01/27/2022 03:37:33 - INFO - codeparrot_training - Step 8234: {'lr': 0.0004794776286318188, 'samples': 1581120, 'steps': 8234, 'loss/train': 1.320103794336319} 01/27/2022 03:37:36 - INFO - codeparrot_training - Step 8235: {'lr': 0.0004794711357207092, 'samples': 1581312, 'steps': 8235, 'loss/train': 0.9764528274536133} 01/27/2022 03:37:40 - INFO - codeparrot_training - Step 8236: {'lr': 0.0004794646418266181, 'samples': 1581504, 'steps': 8236, 'loss/train': 0.20650311559438705} 01/27/2022 03:37:43 - INFO - codeparrot_training - Step 8237: {'lr': 0.0004794581469495733, 'samples': 1581696, 'steps': 8237, 'loss/train': 0.8921400010585785} 01/27/2022 03:37:46 - INFO - codeparrot_training - Step 8238: {'lr': 0.00047945165108960274, 'samples': 1581888, 'steps': 8238, 'loss/train': 0.329583004117012} 01/27/2022 03:37:49 - INFO - codeparrot_training - Step 8239: {'lr': 0.0004794451542467341, 'samples': 1582080, 'steps': 8239, 'loss/train': 0.8972846567630768} 01/27/2022 03:37:52 - INFO - codeparrot_training - Step 8240: {'lr': 0.00047943865642099525, 'samples': 1582272, 'steps': 8240, 'loss/train': 0.45674166083335876} 01/27/2022 03:37:55 - INFO - codeparrot_training - Step 8241: {'lr': 0.0004794321576124141, 'samples': 1582464, 'steps': 8241, 'loss/train': 1.4936584532260895} 01/27/2022 03:38:00 - INFO - codeparrot_training - Step 8242: {'lr': 0.0004794256578210184, 'samples': 1582656, 'steps': 8242, 'loss/train': 0.6006373465061188} 01/27/2022 03:38:03 - INFO - codeparrot_training - Step 8243: {'lr': 0.0004794191570468361, 'samples': 1582848, 'steps': 8243, 'loss/train': 0.9668251276016235} 01/27/2022 03:38:06 - INFO - codeparrot_training - Step 8244: {'lr': 0.00047941265528989496, 'samples': 1583040, 'steps': 8244, 'loss/train': 0.46529679000377655} 01/27/2022 03:38:09 - INFO - codeparrot_training - Step 8245: {'lr': 0.0004794061525502229, 'samples': 1583232, 'steps': 8245, 'loss/train': 0.8605149686336517} 01/27/2022 03:38:12 - INFO - codeparrot_training - Step 8246: {'lr': 0.00047939964882784766, 'samples': 1583424, 'steps': 8246, 'loss/train': 0.22751352936029434} 01/27/2022 03:38:16 - INFO - codeparrot_training - Step 8247: {'lr': 0.0004793931441227972, 'samples': 1583616, 'steps': 8247, 'loss/train': 0.9778260290622711} 01/27/2022 03:38:19 - INFO - codeparrot_training - Step 8248: {'lr': 0.00047938663843509927, 'samples': 1583808, 'steps': 8248, 'loss/train': 0.7968446910381317} 01/27/2022 03:38:22 - INFO - codeparrot_training - Step 8249: {'lr': 0.00047938013176478193, 'samples': 1584000, 'steps': 8249, 'loss/train': 0.9398213624954224} 01/27/2022 03:38:25 - INFO - codeparrot_training - Step 8250: {'lr': 0.0004793736241118728, 'samples': 1584192, 'steps': 8250, 'loss/train': 0.5659483820199966} 01/27/2022 03:38:30 - INFO - codeparrot_training - Step 8251: {'lr': 0.0004793671154764, 'samples': 1584384, 'steps': 8251, 'loss/train': 0.8115736842155457} 01/27/2022 03:38:33 - INFO - codeparrot_training - Step 8252: {'lr': 0.0004793606058583913, 'samples': 1584576, 'steps': 8252, 'loss/train': 0.772871196269989} 01/27/2022 03:38:36 - INFO - codeparrot_training - Step 8253: {'lr': 0.0004793540952578746, 'samples': 1584768, 'steps': 8253, 'loss/train': 
0.22413063794374466} 01/27/2022 03:38:39 - INFO - codeparrot_training - Step 8254: {'lr': 0.0004793475836748777, 'samples': 1584960, 'steps': 8254, 'loss/train': 1.1551238000392914} 01/27/2022 03:38:43 - INFO - codeparrot_training - Step 8255: {'lr': 0.0004793410711094287, 'samples': 1585152, 'steps': 8255, 'loss/train': 0.5939846187829971} 01/27/2022 03:38:46 - INFO - codeparrot_training - Step 8256: {'lr': 0.00047933455756155534, 'samples': 1585344, 'steps': 8256, 'loss/train': 0.8638161420822144} 01/27/2022 03:38:49 - INFO - codeparrot_training - Step 8257: {'lr': 0.00047932804303128557, 'samples': 1585536, 'steps': 8257, 'loss/train': 0.8661539554595947} 01/27/2022 03:38:52 - INFO - codeparrot_training - Step 8258: {'lr': 0.0004793215275186472, 'samples': 1585728, 'steps': 8258, 'loss/train': 0.8149116039276123} 01/27/2022 03:38:58 - INFO - codeparrot_training - Step 8259: {'lr': 0.0004793150110236684, 'samples': 1585920, 'steps': 8259, 'loss/train': 0.7935300171375275} 01/27/2022 03:39:01 - INFO - codeparrot_training - Step 8260: {'lr': 0.00047930849354637674, 'samples': 1586112, 'steps': 8260, 'loss/train': 0.8721358180046082} 01/27/2022 03:39:05 - INFO - codeparrot_training - Step 8261: {'lr': 0.00047930197508680027, 'samples': 1586304, 'steps': 8261, 'loss/train': 1.1694676280021667} 01/27/2022 03:39:08 - INFO - codeparrot_training - Step 8262: {'lr': 0.00047929545564496715, 'samples': 1586496, 'steps': 8262, 'loss/train': 1.1672689318656921} 01/27/2022 03:39:11 - INFO - codeparrot_training - Step 8263: {'lr': 0.0004792889352209049, 'samples': 1586688, 'steps': 8263, 'loss/train': 2.085369050502777} 01/27/2022 03:39:14 - INFO - codeparrot_training - Step 8264: {'lr': 0.00047928241381464177, 'samples': 1586880, 'steps': 8264, 'loss/train': 1.2402778565883636} 01/27/2022 03:39:17 - INFO - codeparrot_training - Step 8265: {'lr': 0.00047927589142620556, 'samples': 1587072, 'steps': 8265, 'loss/train': 0.3899199217557907} 01/27/2022 03:39:20 - INFO - codeparrot_training - Step 8266: {'lr': 0.0004792693680556243, 'samples': 1587264, 'steps': 8266, 'loss/train': 0.717880368232727} 01/27/2022 03:39:23 - INFO - codeparrot_training - Step 8267: {'lr': 0.0004792628437029258, 'samples': 1587456, 'steps': 8267, 'loss/train': 0.6334668695926666} 01/27/2022 03:39:28 - INFO - codeparrot_training - Step 8268: {'lr': 0.0004792563183681381, 'samples': 1587648, 'steps': 8268, 'loss/train': 0.5576398819684982} 01/27/2022 03:39:31 - INFO - codeparrot_training - Step 8269: {'lr': 0.0004792497920512891, 'samples': 1587840, 'steps': 8269, 'loss/train': 1.3377446830272675} 01/27/2022 03:39:34 - INFO - codeparrot_training - Step 8270: {'lr': 0.00047924326475240676, 'samples': 1588032, 'steps': 8270, 'loss/train': 0.6228816658258438} 01/27/2022 03:39:38 - INFO - codeparrot_training - Step 8271: {'lr': 0.00047923673647151915, 'samples': 1588224, 'steps': 8271, 'loss/train': 0.6226586848497391} 01/27/2022 03:39:41 - INFO - codeparrot_training - Step 8272: {'lr': 0.00047923020720865413, 'samples': 1588416, 'steps': 8272, 'loss/train': 0.5251597166061401} 01/27/2022 03:39:44 - INFO - codeparrot_training - Step 8273: {'lr': 0.0004792236769638396, 'samples': 1588608, 'steps': 8273, 'loss/train': 1.1802596747875214} 01/27/2022 03:39:47 - INFO - codeparrot_training - Step 8274: {'lr': 0.00047921714573710374, 'samples': 1588800, 'steps': 8274, 'loss/train': 1.3238410949707031} 01/27/2022 03:39:50 - INFO - codeparrot_training - Step 8275: {'lr': 0.0004792106135284744, 'samples': 1588992, 'steps': 8275, 'loss/train': 
0.3950500935316086} 01/27/2022 03:39:53 - INFO - codeparrot_training - Step 8276: {'lr': 0.00047920408033797954, 'samples': 1589184, 'steps': 8276, 'loss/train': 0.7666921019554138} 01/27/2022 03:40:00 - INFO - codeparrot_training - Step 8277: {'lr': 0.00047919754616564716, 'samples': 1589376, 'steps': 8277, 'loss/train': 1.0921344459056854} 01/27/2022 03:40:03 - INFO - codeparrot_training - Step 8278: {'lr': 0.0004791910110115053, 'samples': 1589568, 'steps': 8278, 'loss/train': 0.8089722096920013} 01/27/2022 03:40:06 - INFO - codeparrot_training - Step 8279: {'lr': 0.0004791844748755819, 'samples': 1589760, 'steps': 8279, 'loss/train': 1.0718136727809906} 01/27/2022 03:40:09 - INFO - codeparrot_training - Step 8280: {'lr': 0.00047917793775790503, 'samples': 1589952, 'steps': 8280, 'loss/train': 0.47173015773296356} 01/27/2022 03:40:12 - INFO - codeparrot_training - Step 8281: {'lr': 0.00047917139965850266, 'samples': 1590144, 'steps': 8281, 'loss/train': 0.9402825236320496} 01/27/2022 03:40:15 - INFO - codeparrot_training - Step 8282: {'lr': 0.0004791648605774027, 'samples': 1590336, 'steps': 8282, 'loss/train': 1.0754204392433167} 01/27/2022 03:40:18 - INFO - codeparrot_training - Step 8283: {'lr': 0.00047915832051463326, 'samples': 1590528, 'steps': 8283, 'loss/train': 0.2952006384730339} 01/27/2022 03:40:22 - INFO - codeparrot_training - Step 8284: {'lr': 0.0004791517794702224, 'samples': 1590720, 'steps': 8284, 'loss/train': 0.7577008903026581} 01/27/2022 03:40:25 - INFO - codeparrot_training - Step 8285: {'lr': 0.00047914523744419803, 'samples': 1590912, 'steps': 8285, 'loss/train': 0.9021711051464081} 01/27/2022 03:40:29 - INFO - codeparrot_training - Step 8286: {'lr': 0.00047913869443658825, 'samples': 1591104, 'steps': 8286, 'loss/train': 1.1187740564346313} 01/27/2022 03:40:32 - INFO - codeparrot_training - Step 8287: {'lr': 0.0004791321504474211, 'samples': 1591296, 'steps': 8287, 'loss/train': 0.7287866324186325} 01/27/2022 03:40:35 - INFO - codeparrot_training - Step 8288: {'lr': 0.00047912560547672453, 'samples': 1591488, 'steps': 8288, 'loss/train': 0.7381281852722168} 01/27/2022 03:40:39 - INFO - codeparrot_training - Step 8289: {'lr': 0.0004791190595245266, 'samples': 1591680, 'steps': 8289, 'loss/train': 0.8768289685249329} 01/27/2022 03:40:42 - INFO - codeparrot_training - Step 8290: {'lr': 0.0004791125125908554, 'samples': 1591872, 'steps': 8290, 'loss/train': 0.650798499584198} 01/27/2022 03:40:45 - INFO - codeparrot_training - Step 8291: {'lr': 0.000479105964675739, 'samples': 1592064, 'steps': 8291, 'loss/train': 1.3039604723453522} 01/27/2022 03:40:48 - INFO - codeparrot_training - Step 8292: {'lr': 0.0004790994157792053, 'samples': 1592256, 'steps': 8292, 'loss/train': 1.13941490650177} 01/27/2022 03:40:51 - INFO - codeparrot_training - Step 8293: {'lr': 0.0004790928659012825, 'samples': 1592448, 'steps': 8293, 'loss/train': 0.6127717047929764} 01/27/2022 03:40:54 - INFO - codeparrot_training - Step 8294: {'lr': 0.00047908631504199855, 'samples': 1592640, 'steps': 8294, 'loss/train': 1.2219853699207306} 01/27/2022 03:40:59 - INFO - codeparrot_training - Step 8295: {'lr': 0.00047907976320138163, 'samples': 1592832, 'steps': 8295, 'loss/train': 0.652410238981247} 01/27/2022 03:41:02 - INFO - codeparrot_training - Step 8296: {'lr': 0.00047907321037945973, 'samples': 1593024, 'steps': 8296, 'loss/train': 0.9628066420555115} 01/27/2022 03:41:05 - INFO - codeparrot_training - Step 8297: {'lr': 0.0004790666565762609, 'samples': 1593216, 'steps': 8297, 'loss/train': 
0.8076191246509552} 01/27/2022 03:41:08 - INFO - codeparrot_training - Step 8298: {'lr': 0.0004790601017918134, 'samples': 1593408, 'steps': 8298, 'loss/train': 0.5026627779006958} 01/27/2022 03:41:11 - INFO - codeparrot_training - Step 8299: {'lr': 0.00047905354602614504, 'samples': 1593600, 'steps': 8299, 'loss/train': 0.7377296537160873} 01/27/2022 03:41:14 - INFO - codeparrot_training - Step 8300: {'lr': 0.00047904698927928404, 'samples': 1593792, 'steps': 8300, 'loss/train': 0.7464188486337662} 01/27/2022 03:41:18 - INFO - codeparrot_training - Step 8301: {'lr': 0.0004790404315512584, 'samples': 1593984, 'steps': 8301, 'loss/train': 1.3728428184986115} 01/27/2022 03:41:21 - INFO - codeparrot_training - Step 8302: {'lr': 0.0004790338728420963, 'samples': 1594176, 'steps': 8302, 'loss/train': 1.3914977610111237} 01/27/2022 03:41:24 - INFO - codeparrot_training - Step 8303: {'lr': 0.0004790273131518259, 'samples': 1594368, 'steps': 8303, 'loss/train': 0.8526073694229126} 01/27/2022 03:41:30 - INFO - codeparrot_training - Step 8304: {'lr': 0.00047902075248047515, 'samples': 1594560, 'steps': 8304, 'loss/train': 0.86778724193573} 01/27/2022 03:41:33 - INFO - codeparrot_training - Step 8305: {'lr': 0.0004790141908280723, 'samples': 1594752, 'steps': 8305, 'loss/train': 0.6952909827232361} 01/27/2022 03:41:36 - INFO - codeparrot_training - Step 8306: {'lr': 0.00047900762819464527, 'samples': 1594944, 'steps': 8306, 'loss/train': 0.8915866613388062} 01/27/2022 03:41:39 - INFO - codeparrot_training - Step 8307: {'lr': 0.0004790010645802223, 'samples': 1595136, 'steps': 8307, 'loss/train': 0.9152886271476746} 01/27/2022 03:41:43 - INFO - codeparrot_training - Step 8308: {'lr': 0.0004789944999848316, 'samples': 1595328, 'steps': 8308, 'loss/train': 0.8986804783344269} 01/27/2022 03:41:46 - INFO - codeparrot_training - Step 8309: {'lr': 0.00047898793440850104, 'samples': 1595520, 'steps': 8309, 'loss/train': 0.844739556312561} 01/27/2022 03:41:49 - INFO - codeparrot_training - Step 8310: {'lr': 0.0004789813678512589, 'samples': 1595712, 'steps': 8310, 'loss/train': 0.5694406628608704} 01/27/2022 03:41:52 - INFO - codeparrot_training - Step 8311: {'lr': 0.0004789748003131333, 'samples': 1595904, 'steps': 8311, 'loss/train': 0.7898423373699188} 01/27/2022 03:41:56 - INFO - codeparrot_training - Step 8312: {'lr': 0.00047896823179415237, 'samples': 1596096, 'steps': 8312, 'loss/train': 1.313929706811905} 01/27/2022 03:42:00 - INFO - codeparrot_training - Step 8313: {'lr': 0.00047896166229434423, 'samples': 1596288, 'steps': 8313, 'loss/train': 0.969230979681015} 01/27/2022 03:42:03 - INFO - codeparrot_training - Step 8314: {'lr': 0.0004789550918137371, 'samples': 1596480, 'steps': 8314, 'loss/train': 0.6892304867506027} 01/27/2022 03:42:06 - INFO - codeparrot_training - Step 8315: {'lr': 0.000478948520352359, 'samples': 1596672, 'steps': 8315, 'loss/train': 0.6987734735012054} 01/27/2022 03:42:09 - INFO - codeparrot_training - Step 8316: {'lr': 0.00047894194791023813, 'samples': 1596864, 'steps': 8316, 'loss/train': 0.8345956206321716} 01/27/2022 03:42:12 - INFO - codeparrot_training - Step 8317: {'lr': 0.0004789353744874027, 'samples': 1597056, 'steps': 8317, 'loss/train': 1.1879315078258514} 01/27/2022 03:42:15 - INFO - codeparrot_training - Step 8318: {'lr': 0.0004789288000838808, 'samples': 1597248, 'steps': 8318, 'loss/train': 0.8255570232868195} 01/27/2022 03:42:18 - INFO - codeparrot_training - Step 8319: {'lr': 0.0004789222246997006, 'samples': 1597440, 'steps': 8319, 'loss/train': 
0.9231656491756439} 01/27/2022 03:42:22 - INFO - codeparrot_training - Step 8320: {'lr': 0.00047891564833489034, 'samples': 1597632, 'steps': 8320, 'loss/train': 0.8913863897323608} 01/27/2022 03:42:28 - INFO - codeparrot_training - Step 8321: {'lr': 0.000478909070989478, 'samples': 1597824, 'steps': 8321, 'loss/train': 1.2127690315246582} 01/27/2022 03:42:31 - INFO - codeparrot_training - Step 8322: {'lr': 0.00047890249266349194, 'samples': 1598016, 'steps': 8322, 'loss/train': 0.7229457199573517} 01/27/2022 03:42:34 - INFO - codeparrot_training - Step 8323: {'lr': 0.0004788959133569604, 'samples': 1598208, 'steps': 8323, 'loss/train': 1.0558876991271973} 01/27/2022 03:42:37 - INFO - codeparrot_training - Step 8324: {'lr': 0.00047888933306991136, 'samples': 1598400, 'steps': 8324, 'loss/train': 0.5665737390518188} 01/27/2022 03:42:40 - INFO - codeparrot_training - Step 8325: {'lr': 0.00047888275180237304, 'samples': 1598592, 'steps': 8325, 'loss/train': 0.5388177931308746} 01/27/2022 03:42:43 - INFO - codeparrot_training - Step 8326: {'lr': 0.00047887616955437373, 'samples': 1598784, 'steps': 8326, 'loss/train': 0.7502888739109039} 01/27/2022 03:42:47 - INFO - codeparrot_training - Step 8327: {'lr': 0.0004788695863259416, 'samples': 1598976, 'steps': 8327, 'loss/train': 1.0972971618175507} 01/27/2022 03:42:50 - INFO - codeparrot_training - Step 8328: {'lr': 0.0004788630021171049, 'samples': 1599168, 'steps': 8328, 'loss/train': 1.2812965214252472} 01/27/2022 03:42:53 - INFO - codeparrot_training - Step 8329: {'lr': 0.0004788564169278917, 'samples': 1599360, 'steps': 8329, 'loss/train': 0.8568586707115173} 01/27/2022 03:42:57 - INFO - codeparrot_training - Step 8330: {'lr': 0.00047884983075833023, 'samples': 1599552, 'steps': 8330, 'loss/train': 0.7863804996013641} 01/27/2022 03:43:00 - INFO - codeparrot_training - Step 8331: {'lr': 0.00047884324360844885, 'samples': 1599744, 'steps': 8331, 'loss/train': 0.9842349886894226} 01/27/2022 03:43:04 - INFO - codeparrot_training - Step 8332: {'lr': 0.0004788366554782756, 'samples': 1599936, 'steps': 8332, 'loss/train': 0.782034158706665} 01/27/2022 03:43:07 - INFO - codeparrot_training - Step 8333: {'lr': 0.00047883006636783887, 'samples': 1600128, 'steps': 8333, 'loss/train': 0.9287589490413666} 01/27/2022 03:43:10 - INFO - codeparrot_training - Step 8334: {'lr': 0.0004788234762771667, 'samples': 1600320, 'steps': 8334, 'loss/train': 0.9826672375202179} 01/27/2022 03:43:13 - INFO - codeparrot_training - Step 8335: {'lr': 0.0004788168852062875, 'samples': 1600512, 'steps': 8335, 'loss/train': 1.0714978873729706} 01/27/2022 03:43:16 - INFO - codeparrot_training - Step 8336: {'lr': 0.0004788102931552294, 'samples': 1600704, 'steps': 8336, 'loss/train': 0.971155196428299} 01/27/2022 03:43:19 - INFO - codeparrot_training - Step 8337: {'lr': 0.00047880370012402064, 'samples': 1600896, 'steps': 8337, 'loss/train': 1.1105793714523315} 01/27/2022 03:43:24 - INFO - codeparrot_training - Step 8338: {'lr': 0.0004787971061126895, 'samples': 1601088, 'steps': 8338, 'loss/train': 0.8748506605625153} 01/27/2022 03:43:27 - INFO - codeparrot_training - Step 8339: {'lr': 0.0004787905111212642, 'samples': 1601280, 'steps': 8339, 'loss/train': 2.0426734685897827} 01/27/2022 03:43:30 - INFO - codeparrot_training - Step 8340: {'lr': 0.00047878391514977306, 'samples': 1601472, 'steps': 8340, 'loss/train': 0.8369898498058319} 01/27/2022 03:43:33 - INFO - codeparrot_training - Step 8341: {'lr': 0.0004787773181982442, 'samples': 1601664, 'steps': 8341, 'loss/train': 
0.9350418448448181} 01/27/2022 03:43:36 - INFO - codeparrot_training - Step 8342: {'lr': 0.0004787707202667059, 'samples': 1601856, 'steps': 8342, 'loss/train': 1.2157697975635529} 01/27/2022 03:43:39 - INFO - codeparrot_training - Step 8343: {'lr': 0.00047876412135518655, 'samples': 1602048, 'steps': 8343, 'loss/train': 0.8751473128795624} 01/27/2022 03:43:43 - INFO - codeparrot_training - Step 8344: {'lr': 0.0004787575214637144, 'samples': 1602240, 'steps': 8344, 'loss/train': 0.6672609150409698} 01/27/2022 03:43:46 - INFO - codeparrot_training - Step 8345: {'lr': 0.00047875092059231756, 'samples': 1602432, 'steps': 8345, 'loss/train': 0.5919273197650909} 01/27/2022 03:43:49 - INFO - codeparrot_training - Step 8346: {'lr': 0.0004787443187410245, 'samples': 1602624, 'steps': 8346, 'loss/train': 0.36218545585870743} 01/27/2022 03:43:53 - INFO - codeparrot_training - Step 8347: {'lr': 0.00047873771590986337, 'samples': 1602816, 'steps': 8347, 'loss/train': 1.5048362016677856} 01/27/2022 03:43:56 - INFO - codeparrot_training - Step 8348: {'lr': 0.00047873111209886245, 'samples': 1603008, 'steps': 8348, 'loss/train': 0.9403418898582458} 01/27/2022 03:44:00 - INFO - codeparrot_training - Step 8349: {'lr': 0.00047872450730805015, 'samples': 1603200, 'steps': 8349, 'loss/train': 0.722893014550209} 01/27/2022 03:44:03 - INFO - codeparrot_training - Step 8350: {'lr': 0.00047871790153745464, 'samples': 1603392, 'steps': 8350, 'loss/train': 0.8823280334472656} 01/27/2022 03:44:06 - INFO - codeparrot_training - Step 8351: {'lr': 0.0004787112947871043, 'samples': 1603584, 'steps': 8351, 'loss/train': 0.39794155955314636} 01/27/2022 03:44:09 - INFO - codeparrot_training - Step 8352: {'lr': 0.0004787046870570274, 'samples': 1603776, 'steps': 8352, 'loss/train': 1.2312089502811432} 01/27/2022 03:44:12 - INFO - codeparrot_training - Step 8353: {'lr': 0.00047869807834725225, 'samples': 1603968, 'steps': 8353, 'loss/train': 0.5638737380504608} 01/27/2022 03:44:15 - INFO - codeparrot_training - Step 8354: {'lr': 0.0004786914686578071, 'samples': 1604160, 'steps': 8354, 'loss/train': 0.5624624043703079} 01/27/2022 03:44:18 - INFO - codeparrot_training - Step 8355: {'lr': 0.00047868485798872044, 'samples': 1604352, 'steps': 8355, 'loss/train': 0.9000591337680817} 01/27/2022 03:44:25 - INFO - codeparrot_training - Step 8356: {'lr': 0.00047867824634002034, 'samples': 1604544, 'steps': 8356, 'loss/train': 1.0340602397918701} 01/27/2022 03:44:28 - INFO - codeparrot_training - Step 8357: {'lr': 0.0004786716337117353, 'samples': 1604736, 'steps': 8357, 'loss/train': 1.4768663048744202} 01/27/2022 03:44:31 - INFO - codeparrot_training - Step 8358: {'lr': 0.00047866502010389356, 'samples': 1604928, 'steps': 8358, 'loss/train': 0.9729519188404083} 01/27/2022 03:44:34 - INFO - codeparrot_training - Step 8359: {'lr': 0.00047865840551652343, 'samples': 1605120, 'steps': 8359, 'loss/train': 1.3202636539936066} 01/27/2022 03:44:38 - INFO - codeparrot_training - Step 8360: {'lr': 0.0004786517899496534, 'samples': 1605312, 'steps': 8360, 'loss/train': 0.09698740392923355} 01/27/2022 03:44:41 - INFO - codeparrot_training - Step 8361: {'lr': 0.0004786451734033117, 'samples': 1605504, 'steps': 8361, 'loss/train': 0.9900422394275665} 01/27/2022 03:44:44 - INFO - codeparrot_training - Step 8362: {'lr': 0.00047863855587752666, 'samples': 1605696, 'steps': 8362, 'loss/train': 1.2045662999153137} 01/27/2022 03:44:47 - INFO - codeparrot_training - Step 8363: {'lr': 0.0004786319373723266, 'samples': 1605888, 'steps': 8363, 
'loss/train': 1.1230382323265076} 01/27/2022 03:44:50 - INFO - codeparrot_training - Step 8364: {'lr': 0.00047862531788774, 'samples': 1606080, 'steps': 8364, 'loss/train': 1.3873832523822784} 01/27/2022 03:44:55 - INFO - codeparrot_training - Step 8365: {'lr': 0.00047861869742379503, 'samples': 1606272, 'steps': 8365, 'loss/train': 1.0119730532169342} 01/27/2022 03:44:58 - INFO - codeparrot_training - Step 8366: {'lr': 0.0004786120759805203, 'samples': 1606464, 'steps': 8366, 'loss/train': 0.375931978225708} 01/27/2022 03:45:01 - INFO - codeparrot_training - Step 8367: {'lr': 0.0004786054535579439, 'samples': 1606656, 'steps': 8367, 'loss/train': 0.586456909775734} 01/27/2022 03:45:04 - INFO - codeparrot_training - Step 8368: {'lr': 0.0004785988301560944, 'samples': 1606848, 'steps': 8368, 'loss/train': 1.5943681597709656} 01/27/2022 03:45:07 - INFO - codeparrot_training - Step 8369: {'lr': 0.0004785922057750001, 'samples': 1607040, 'steps': 8369, 'loss/train': 0.6849008649587631} 01/27/2022 03:45:10 - INFO - codeparrot_training - Step 8370: {'lr': 0.00047858558041468925, 'samples': 1607232, 'steps': 8370, 'loss/train': 0.8980422914028168} 01/27/2022 03:45:13 - INFO - codeparrot_training - Step 8371: {'lr': 0.0004785789540751905, 'samples': 1607424, 'steps': 8371, 'loss/train': 0.13870440423488617} 01/27/2022 03:45:17 - INFO - codeparrot_training - Step 8372: {'lr': 0.00047857232675653207, 'samples': 1607616, 'steps': 8372, 'loss/train': 1.170400857925415} 01/27/2022 03:45:20 - INFO - codeparrot_training - Step 8373: {'lr': 0.0004785656984587423, 'samples': 1607808, 'steps': 8373, 'loss/train': 0.6032487004995346} 01/27/2022 03:45:24 - INFO - codeparrot_training - Step 8374: {'lr': 0.0004785590691818498, 'samples': 1608000, 'steps': 8374, 'loss/train': 0.9066148996353149} 01/27/2022 03:45:27 - INFO - codeparrot_training - Step 8375: {'lr': 0.0004785524389258827, 'samples': 1608192, 'steps': 8375, 'loss/train': 1.071667492389679} 01/27/2022 03:45:31 - INFO - codeparrot_training - Step 8376: {'lr': 0.0004785458076908695, 'samples': 1608384, 'steps': 8376, 'loss/train': 0.5801207721233368} 01/27/2022 03:45:34 - INFO - codeparrot_training - Step 8377: {'lr': 0.00047853917547683873, 'samples': 1608576, 'steps': 8377, 'loss/train': 0.8646868765354156} 01/27/2022 03:45:37 - INFO - codeparrot_training - Step 8378: {'lr': 0.00047853254228381864, 'samples': 1608768, 'steps': 8378, 'loss/train': 1.0172713994979858} 01/27/2022 03:45:40 - INFO - codeparrot_training - Step 8379: {'lr': 0.0004785259081118377, 'samples': 1608960, 'steps': 8379, 'loss/train': 0.6627587378025055} 01/27/2022 03:45:43 - INFO - codeparrot_training - Step 8380: {'lr': 0.0004785192729609244, 'samples': 1609152, 'steps': 8380, 'loss/train': 0.6252397298812866} 01/27/2022 03:45:46 - INFO - codeparrot_training - Step 8381: {'lr': 0.00047851263683110706, 'samples': 1609344, 'steps': 8381, 'loss/train': 0.731914758682251} 01/27/2022 03:45:49 - INFO - codeparrot_training - Step 8382: {'lr': 0.0004785059997224142, 'samples': 1609536, 'steps': 8382, 'loss/train': 1.1393146812915802} 01/27/2022 03:45:56 - INFO - codeparrot_training - Step 8383: {'lr': 0.0004784993616348741, 'samples': 1609728, 'steps': 8383, 'loss/train': 0.8327885270118713} 01/27/2022 03:45:59 - INFO - codeparrot_training - Step 8384: {'lr': 0.0004784927225685153, 'samples': 1609920, 'steps': 8384, 'loss/train': 0.9597087800502777} 01/27/2022 03:46:02 - INFO - codeparrot_training - Step 8385: {'lr': 0.0004784860825233662, 'samples': 1610112, 'steps': 8385, 
'loss/train': 0.46528494358062744} 01/27/2022 03:46:05 - INFO - codeparrot_training - Step 8386: {'lr': 0.00047847944149945545, 'samples': 1610304, 'steps': 8386, 'loss/train': 1.1644347310066223} 01/27/2022 03:46:08 - INFO - codeparrot_training - Step 8387: {'lr': 0.00047847279949681117, 'samples': 1610496, 'steps': 8387, 'loss/train': 0.8468520641326904} 01/27/2022 03:46:11 - INFO - codeparrot_training - Step 8388: {'lr': 0.000478466156515462, 'samples': 1610688, 'steps': 8388, 'loss/train': 0.38004711270332336} 01/27/2022 03:46:15 - INFO - codeparrot_training - Step 8389: {'lr': 0.0004784595125554364, 'samples': 1610880, 'steps': 8389, 'loss/train': 0.9726486504077911} 01/27/2022 03:46:18 - INFO - codeparrot_training - Step 8390: {'lr': 0.00047845286761676276, 'samples': 1611072, 'steps': 8390, 'loss/train': 0.14167093858122826} 01/27/2022 03:46:21 - INFO - codeparrot_training - Step 8391: {'lr': 0.00047844622169946954, 'samples': 1611264, 'steps': 8391, 'loss/train': 1.307495355606079} 01/27/2022 03:46:25 - INFO - codeparrot_training - Step 8392: {'lr': 0.0004784395748035853, 'samples': 1611456, 'steps': 8392, 'loss/train': 0.21832691878080368} 01/27/2022 03:46:28 - INFO - codeparrot_training - Step 8393: {'lr': 0.0004784329269291384, 'samples': 1611648, 'steps': 8393, 'loss/train': 1.077049344778061} 01/27/2022 03:46:32 - INFO - codeparrot_training - Step 8394: {'lr': 0.0004784262780761575, 'samples': 1611840, 'steps': 8394, 'loss/train': 0.5608563870191574} 01/27/2022 03:46:35 - INFO - codeparrot_training - Step 8395: {'lr': 0.00047841962824467086, 'samples': 1612032, 'steps': 8395, 'loss/train': 1.0879453718662262} 01/27/2022 03:46:38 - INFO - codeparrot_training - Step 8396: {'lr': 0.000478412977434707, 'samples': 1612224, 'steps': 8396, 'loss/train': 0.5307181477546692} 01/27/2022 03:46:41 - INFO - codeparrot_training - Step 8397: {'lr': 0.0004784063256462946, 'samples': 1612416, 'steps': 8397, 'loss/train': 0.904638022184372} 01/27/2022 03:46:44 - INFO - codeparrot_training - Step 8398: {'lr': 0.00047839967287946196, 'samples': 1612608, 'steps': 8398, 'loss/train': 1.2852924764156342} 01/27/2022 03:46:47 - INFO - codeparrot_training - Step 8399: {'lr': 0.00047839301913423773, 'samples': 1612800, 'steps': 8399, 'loss/train': 1.1948046684265137} 01/27/2022 03:46:51 - INFO - codeparrot_training - Step 8400: {'lr': 0.0004783863644106502, 'samples': 1612992, 'steps': 8400, 'loss/train': 1.0842995345592499} 01/27/2022 03:46:57 - INFO - codeparrot_training - Step 8401: {'lr': 0.0004783797087087281, 'samples': 1613184, 'steps': 8401, 'loss/train': 1.9413578510284424} 01/27/2022 03:47:00 - INFO - codeparrot_training - Step 8402: {'lr': 0.00047837305202849987, 'samples': 1613376, 'steps': 8402, 'loss/train': 1.778461217880249} 01/27/2022 03:47:04 - INFO - codeparrot_training - Step 8403: {'lr': 0.0004783663943699939, 'samples': 1613568, 'steps': 8403, 'loss/train': 0.8199394941329956} 01/27/2022 03:47:07 - INFO - codeparrot_training - Step 8404: {'lr': 0.00047835973573323885, 'samples': 1613760, 'steps': 8404, 'loss/train': 0.5617959648370743} 01/27/2022 03:47:10 - INFO - codeparrot_training - Step 8405: {'lr': 0.00047835307611826327, 'samples': 1613952, 'steps': 8405, 'loss/train': 0.37868034839630127} 01/27/2022 03:47:13 - INFO - codeparrot_training - Step 8406: {'lr': 0.0004783464155250955, 'samples': 1614144, 'steps': 8406, 'loss/train': 0.7296042591333389} 01/27/2022 03:47:16 - INFO - codeparrot_training - Step 8407: {'lr': 0.00047833975395376426, 'samples': 1614336, 'steps': 8407, 
'loss/train': 0.7792633473873138} 01/27/2022 03:47:19 - INFO - codeparrot_training - Step 8408: {'lr': 0.00047833309140429803, 'samples': 1614528, 'steps': 8408, 'loss/train': 1.3763131499290466} 01/27/2022 03:47:23 - INFO - codeparrot_training - Step 8409: {'lr': 0.00047832642787672537, 'samples': 1614720, 'steps': 8409, 'loss/train': 0.4880557805299759} 01/27/2022 03:47:27 - INFO - codeparrot_training - Step 8410: {'lr': 0.00047831976337107474, 'samples': 1614912, 'steps': 8410, 'loss/train': 0.7810333371162415} 01/27/2022 03:47:30 - INFO - codeparrot_training - Step 8411: {'lr': 0.00047831309788737476, 'samples': 1615104, 'steps': 8411, 'loss/train': 0.7370374202728271} 01/27/2022 03:47:33 - INFO - codeparrot_training - Step 8412: {'lr': 0.000478306431425654, 'samples': 1615296, 'steps': 8412, 'loss/train': 2.6429944038391113} 01/27/2022 03:47:36 - INFO - codeparrot_training - Step 8413: {'lr': 0.0004782997639859409, 'samples': 1615488, 'steps': 8413, 'loss/train': 0.9212112128734589} 01/27/2022 03:47:39 - INFO - codeparrot_training - Step 8414: {'lr': 0.00047829309556826415, 'samples': 1615680, 'steps': 8414, 'loss/train': 0.9058615565299988} 01/27/2022 03:47:43 - INFO - codeparrot_training - Step 8415: {'lr': 0.0004782864261726523, 'samples': 1615872, 'steps': 8415, 'loss/train': 0.34051041305065155} 01/27/2022 03:47:46 - INFO - codeparrot_training - Step 8416: {'lr': 0.0004782797557991339, 'samples': 1616064, 'steps': 8416, 'loss/train': 1.0534001290798187} 01/27/2022 03:47:49 - INFO - codeparrot_training - Step 8417: {'lr': 0.00047827308444773746, 'samples': 1616256, 'steps': 8417, 'loss/train': 1.5814055800437927} 01/27/2022 03:47:52 - INFO - codeparrot_training - Step 8418: {'lr': 0.00047826641211849165, 'samples': 1616448, 'steps': 8418, 'loss/train': 0.9292451441287994} 01/27/2022 03:47:56 - INFO - codeparrot_training - Step 8419: {'lr': 0.000478259738811425, 'samples': 1616640, 'steps': 8419, 'loss/train': 1.0770211815834045} 01/27/2022 03:48:00 - INFO - codeparrot_training - Step 8420: {'lr': 0.0004782530645265661, 'samples': 1616832, 'steps': 8420, 'loss/train': 0.8073031604290009} 01/27/2022 03:48:03 - INFO - codeparrot_training - Step 8421: {'lr': 0.00047824638926394355, 'samples': 1617024, 'steps': 8421, 'loss/train': 1.098343849182129} 01/27/2022 03:48:06 - INFO - codeparrot_training - Step 8422: {'lr': 0.0004782397130235859, 'samples': 1617216, 'steps': 8422, 'loss/train': 0.9093871414661407} 01/27/2022 03:48:09 - INFO - codeparrot_training - Step 8423: {'lr': 0.0004782330358055219, 'samples': 1617408, 'steps': 8423, 'loss/train': 0.9225034117698669} 01/27/2022 03:48:12 - INFO - codeparrot_training - Step 8424: {'lr': 0.00047822635760977995, 'samples': 1617600, 'steps': 8424, 'loss/train': 0.9957623183727264} 01/27/2022 03:48:15 - INFO - codeparrot_training - Step 8425: {'lr': 0.0004782196784363888, 'samples': 1617792, 'steps': 8425, 'loss/train': 0.7272093594074249} 01/27/2022 03:48:18 - INFO - codeparrot_training - Step 8426: {'lr': 0.000478212998285377, 'samples': 1617984, 'steps': 8426, 'loss/train': 0.9931374192237854} 01/27/2022 03:48:22 - INFO - codeparrot_training - Step 8427: {'lr': 0.0004782063171567732, 'samples': 1618176, 'steps': 8427, 'loss/train': 1.1780845820903778} 01/27/2022 03:48:28 - INFO - codeparrot_training - Step 8428: {'lr': 0.000478199635050606, 'samples': 1618368, 'steps': 8428, 'loss/train': 1.2105640769004822} 01/27/2022 03:48:31 - INFO - codeparrot_training - Step 8429: {'lr': 0.000478192951966904, 'samples': 1618560, 'steps': 8429, 
'loss/train': 0.7593769133090973} 01/27/2022 03:48:34 - INFO - codeparrot_training - Step 8430: {'lr': 0.00047818626790569586, 'samples': 1618752, 'steps': 8430, 'loss/train': 0.9851217269897461} 01/27/2022 03:48:37 - INFO - codeparrot_training - Step 8431: {'lr': 0.00047817958286701026, 'samples': 1618944, 'steps': 8431, 'loss/train': 0.570222482085228} 01/27/2022 03:48:41 - INFO - codeparrot_training - Step 8432: {'lr': 0.00047817289685087575, 'samples': 1619136, 'steps': 8432, 'loss/train': 0.9710841178894043} 01/27/2022 03:48:44 - INFO - codeparrot_training - Step 8433: {'lr': 0.00047816620985732095, 'samples': 1619328, 'steps': 8433, 'loss/train': 1.5335935950279236} 01/27/2022 03:48:47 - INFO - codeparrot_training - Step 8434: {'lr': 0.0004781595218863746, 'samples': 1619520, 'steps': 8434, 'loss/train': 0.45645757019519806} 01/27/2022 03:48:50 - INFO - codeparrot_training - Step 8435: {'lr': 0.00047815283293806533, 'samples': 1619712, 'steps': 8435, 'loss/train': 0.8373787701129913} 01/27/2022 03:48:53 - INFO - codeparrot_training - Step 8436: {'lr': 0.0004781461430124217, 'samples': 1619904, 'steps': 8436, 'loss/train': 0.5080785602331161} 01/27/2022 03:48:58 - INFO - codeparrot_training - Step 8437: {'lr': 0.0004781394521094725, 'samples': 1620096, 'steps': 8437, 'loss/train': 0.729441225528717} 01/27/2022 03:49:01 - INFO - codeparrot_training - Step 8438: {'lr': 0.00047813276022924634, 'samples': 1620288, 'steps': 8438, 'loss/train': 0.8877379596233368} 01/27/2022 03:49:04 - INFO - codeparrot_training - Step 8439: {'lr': 0.0004781260673717718, 'samples': 1620480, 'steps': 8439, 'loss/train': 1.653327226638794} 01/27/2022 03:49:07 - INFO - codeparrot_training - Step 8440: {'lr': 0.0004781193735370777, 'samples': 1620672, 'steps': 8440, 'loss/train': 1.8067374229431152} 01/27/2022 03:49:10 - INFO - codeparrot_training - Step 8441: {'lr': 0.0004781126787251926, 'samples': 1620864, 'steps': 8441, 'loss/train': 0.9566204845905304} 01/27/2022 03:49:13 - INFO - codeparrot_training - Step 8442: {'lr': 0.0004781059829361453, 'samples': 1621056, 'steps': 8442, 'loss/train': 0.9915493726730347} 01/27/2022 03:49:16 - INFO - codeparrot_training - Step 8443: {'lr': 0.00047809928616996425, 'samples': 1621248, 'steps': 8443, 'loss/train': 0.46413061022758484} 01/27/2022 03:49:20 - INFO - codeparrot_training - Step 8444: {'lr': 0.00047809258842667837, 'samples': 1621440, 'steps': 8444, 'loss/train': 1.1744134426116943} 01/27/2022 03:49:24 - INFO - codeparrot_training - Step 8445: {'lr': 0.00047808588970631627, 'samples': 1621632, 'steps': 8445, 'loss/train': 0.8304297924041748} 01/27/2022 03:49:27 - INFO - codeparrot_training - Step 8446: {'lr': 0.0004780791900089066, 'samples': 1621824, 'steps': 8446, 'loss/train': 0.5151237845420837} 01/27/2022 03:49:30 - INFO - codeparrot_training - Step 8447: {'lr': 0.0004780724893344782, 'samples': 1622016, 'steps': 8447, 'loss/train': 0.8700571060180664} 01/27/2022 03:49:33 - INFO - codeparrot_training - Step 8448: {'lr': 0.00047806578768305963, 'samples': 1622208, 'steps': 8448, 'loss/train': 0.9549625217914581} 01/27/2022 03:49:37 - INFO - codeparrot_training - Step 8449: {'lr': 0.00047805908505467963, 'samples': 1622400, 'steps': 8449, 'loss/train': 0.7091130763292313} 01/27/2022 03:49:40 - INFO - codeparrot_training - Step 8450: {'lr': 0.0004780523814493669, 'samples': 1622592, 'steps': 8450, 'loss/train': 0.28873685002326965} 01/27/2022 03:49:43 - INFO - codeparrot_training - Step 8451: {'lr': 0.0004780456768671503, 'samples': 1622784, 'steps': 8451, 
'loss/train': 0.6861974895000458} 01/27/2022 03:49:46 - INFO - codeparrot_training - Step 8452: {'lr': 0.0004780389713080583, 'samples': 1622976, 'steps': 8452, 'loss/train': 1.1236669421195984} 01/27/2022 03:49:49 - INFO - codeparrot_training - Step 8453: {'lr': 0.0004780322647721198, 'samples': 1623168, 'steps': 8453, 'loss/train': 0.8004076480865479} 01/27/2022 03:49:54 - INFO - codeparrot_training - Step 8454: {'lr': 0.00047802555725936347, 'samples': 1623360, 'steps': 8454, 'loss/train': 1.0548753440380096} 01/27/2022 03:49:57 - INFO - codeparrot_training - Step 8455: {'lr': 0.00047801884876981813, 'samples': 1623552, 'steps': 8455, 'loss/train': 0.7018505483865738} 01/27/2022 03:50:00 - INFO - codeparrot_training - Step 8456: {'lr': 0.0004780121393035124, 'samples': 1623744, 'steps': 8456, 'loss/train': 1.0981078147888184} 01/27/2022 03:50:03 - INFO - codeparrot_training - Step 8457: {'lr': 0.00047800542886047506, 'samples': 1623936, 'steps': 8457, 'loss/train': 0.888223260641098} 01/27/2022 03:50:06 - INFO - codeparrot_training - Step 8458: {'lr': 0.00047799871744073485, 'samples': 1624128, 'steps': 8458, 'loss/train': 0.9288034737110138} 01/27/2022 03:50:09 - INFO - codeparrot_training - Step 8459: {'lr': 0.00047799200504432054, 'samples': 1624320, 'steps': 8459, 'loss/train': 0.7758691012859344} 01/27/2022 03:50:12 - INFO - codeparrot_training - Step 8460: {'lr': 0.0004779852916712609, 'samples': 1624512, 'steps': 8460, 'loss/train': 0.2923335134983063} 01/27/2022 03:50:16 - INFO - codeparrot_training - Step 8461: {'lr': 0.0004779785773215847, 'samples': 1624704, 'steps': 8461, 'loss/train': 1.143264502286911} 01/27/2022 03:50:19 - INFO - codeparrot_training - Step 8462: {'lr': 0.00047797186199532055, 'samples': 1624896, 'steps': 8462, 'loss/train': 1.0012474358081818} 01/27/2022 03:50:25 - INFO - codeparrot_training - Step 8463: {'lr': 0.0004779651456924974, 'samples': 1625088, 'steps': 8463, 'loss/train': 1.3371235728263855} 01/27/2022 03:50:28 - INFO - codeparrot_training - Step 8464: {'lr': 0.00047795842841314394, 'samples': 1625280, 'steps': 8464, 'loss/train': 1.1858358085155487} 01/27/2022 03:50:31 - INFO - codeparrot_training - Step 8465: {'lr': 0.000477951710157289, 'samples': 1625472, 'steps': 8465, 'loss/train': 0.8945725858211517} 01/27/2022 03:50:34 - INFO - codeparrot_training - Step 8466: {'lr': 0.00047794499092496123, 'samples': 1625664, 'steps': 8466, 'loss/train': 0.7559671998023987} 01/27/2022 03:50:38 - INFO - codeparrot_training - Step 8467: {'lr': 0.00047793827071618955, 'samples': 1625856, 'steps': 8467, 'loss/train': 0.9844723641872406} 01/27/2022 03:50:41 - INFO - codeparrot_training - Step 8468: {'lr': 0.0004779315495310027, 'samples': 1626048, 'steps': 8468, 'loss/train': 0.9374142587184906} 01/27/2022 03:50:44 - INFO - codeparrot_training - Step 8469: {'lr': 0.00047792482736942955, 'samples': 1626240, 'steps': 8469, 'loss/train': 0.27886582911014557} 01/27/2022 03:50:47 - INFO - codeparrot_training - Step 8470: {'lr': 0.00047791810423149873, 'samples': 1626432, 'steps': 8470, 'loss/train': 1.183611810207367} 01/27/2022 03:50:52 - INFO - codeparrot_training - Step 8471: {'lr': 0.0004779113801172391, 'samples': 1626624, 'steps': 8471, 'loss/train': 1.0057280659675598} 01/27/2022 03:50:55 - INFO - codeparrot_training - Step 8472: {'lr': 0.0004779046550266795, 'samples': 1626816, 'steps': 8472, 'loss/train': 0.8952800631523132} 01/27/2022 03:50:58 - INFO - codeparrot_training - Step 8473: {'lr': 0.00047789792895984874, 'samples': 1627008, 'steps': 8473, 
'loss/train': 0.9344708025455475} 01/27/2022 03:51:01 - INFO - codeparrot_training - Step 8474: {'lr': 0.0004778912019167756, 'samples': 1627200, 'steps': 8474, 'loss/train': 0.6082143187522888} 01/27/2022 03:51:04 - INFO - codeparrot_training - Step 8475: {'lr': 0.00047788447389748894, 'samples': 1627392, 'steps': 8475, 'loss/train': 0.85005122423172} 01/27/2022 03:51:07 - INFO - codeparrot_training - Step 8476: {'lr': 0.0004778777449020176, 'samples': 1627584, 'steps': 8476, 'loss/train': 0.7673435211181641} 01/27/2022 03:51:10 - INFO - codeparrot_training - Step 8477: {'lr': 0.0004778710149303903, 'samples': 1627776, 'steps': 8477, 'loss/train': 0.5289504826068878} 01/27/2022 03:51:14 - INFO - codeparrot_training - Step 8478: {'lr': 0.00047786428398263595, 'samples': 1627968, 'steps': 8478, 'loss/train': 0.09789156541228294} 01/27/2022 03:51:17 - INFO - codeparrot_training - Step 8479: {'lr': 0.00047785755205878333, 'samples': 1628160, 'steps': 8479, 'loss/train': 0.7680899798870087} 01/27/2022 03:51:21 - INFO - codeparrot_training - Step 8480: {'lr': 0.0004778508191588613, 'samples': 1628352, 'steps': 8480, 'loss/train': 0.5003017634153366} 01/27/2022 03:51:24 - INFO - codeparrot_training - Step 8481: {'lr': 0.0004778440852828988, 'samples': 1628544, 'steps': 8481, 'loss/train': 0.5771652013063431} 01/27/2022 03:51:28 - INFO - codeparrot_training - Step 8482: {'lr': 0.00047783735043092446, 'samples': 1628736, 'steps': 8482, 'loss/train': 1.084263414144516} 01/27/2022 03:51:31 - INFO - codeparrot_training - Step 8483: {'lr': 0.0004778306146029674, 'samples': 1628928, 'steps': 8483, 'loss/train': 1.1433315575122833} 01/27/2022 03:51:34 - INFO - codeparrot_training - Step 8484: {'lr': 0.0004778238777990562, 'samples': 1629120, 'steps': 8484, 'loss/train': 0.8872532844543457} 01/27/2022 03:51:37 - INFO - codeparrot_training - Step 8485: {'lr': 0.00047781714001921997, 'samples': 1629312, 'steps': 8485, 'loss/train': 0.59450264275074} 01/27/2022 03:51:40 - INFO - codeparrot_training - Step 8486: {'lr': 0.00047781040126348734, 'samples': 1629504, 'steps': 8486, 'loss/train': 1.292077749967575} 01/27/2022 03:51:43 - INFO - codeparrot_training - Step 8487: {'lr': 0.0004778036615318874, 'samples': 1629696, 'steps': 8487, 'loss/train': 0.6063343584537506} 01/27/2022 03:51:46 - INFO - codeparrot_training - Step 8488: {'lr': 0.0004777969208244488, 'samples': 1629888, 'steps': 8488, 'loss/train': 1.108759582042694} 01/27/2022 03:51:53 - INFO - codeparrot_training - Step 8489: {'lr': 0.0004777901791412006, 'samples': 1630080, 'steps': 8489, 'loss/train': 0.7682544887065887} 01/27/2022 03:51:56 - INFO - codeparrot_training - Step 8490: {'lr': 0.00047778343648217155, 'samples': 1630272, 'steps': 8490, 'loss/train': 0.8466483950614929} 01/27/2022 03:51:59 - INFO - codeparrot_training - Step 8491: {'lr': 0.00047777669284739064, 'samples': 1630464, 'steps': 8491, 'loss/train': 0.47266218066215515} 01/27/2022 03:52:02 - INFO - codeparrot_training - Step 8492: {'lr': 0.0004777699482368867, 'samples': 1630656, 'steps': 8492, 'loss/train': 1.026277095079422} 01/27/2022 03:52:05 - INFO - codeparrot_training - Step 8493: {'lr': 0.0004777632026506886, 'samples': 1630848, 'steps': 8493, 'loss/train': 0.7266435027122498} 01/27/2022 03:52:08 - INFO - codeparrot_training - Step 8494: {'lr': 0.0004777564560888252, 'samples': 1631040, 'steps': 8494, 'loss/train': 0.5500293374061584} 01/27/2022 03:52:11 - INFO - codeparrot_training - Step 8495: {'lr': 0.0004777497085513256, 'samples': 1631232, 'steps': 8495, 
'loss/train': 0.5915521681308746} 01/27/2022 03:52:15 - INFO - codeparrot_training - Step 8496: {'lr': 0.0004777429600382185, 'samples': 1631424, 'steps': 8496, 'loss/train': 0.6433407515287399} 01/27/2022 03:52:18 - INFO - codeparrot_training - Step 8497: {'lr': 0.00047773621054953287, 'samples': 1631616, 'steps': 8497, 'loss/train': 0.7618509829044342} 01/27/2022 03:52:22 - INFO - codeparrot_training - Step 8498: {'lr': 0.0004777294600852976, 'samples': 1631808, 'steps': 8498, 'loss/train': 1.0357353687286377} 01/27/2022 03:52:26 - INFO - codeparrot_training - Step 8499: {'lr': 0.0004777227086455417, 'samples': 1632000, 'steps': 8499, 'loss/train': 0.6372696161270142} 01/27/2022 03:52:29 - INFO - codeparrot_training - Step 8500: {'lr': 0.000477715956230294, 'samples': 1632192, 'steps': 8500, 'loss/train': 0.7383484393358231} 01/27/2022 03:52:32 - INFO - codeparrot_training - Step 8501: {'lr': 0.0004777092028395834, 'samples': 1632384, 'steps': 8501, 'loss/train': 0.4001668095588684} 01/27/2022 03:52:35 - INFO - codeparrot_training - Step 8502: {'lr': 0.00047770244847343893, 'samples': 1632576, 'steps': 8502, 'loss/train': 0.6244195997714996} 01/27/2022 03:52:38 - INFO - codeparrot_training - Step 8503: {'lr': 0.0004776956931318895, 'samples': 1632768, 'steps': 8503, 'loss/train': 1.063113033771515} 01/27/2022 03:52:41 - INFO - codeparrot_training - Step 8504: {'lr': 0.00047768893681496397, 'samples': 1632960, 'steps': 8504, 'loss/train': 1.0380083620548248} 01/27/2022 03:52:44 - INFO - codeparrot_training - Step 8505: {'lr': 0.0004776821795226913, 'samples': 1633152, 'steps': 8505, 'loss/train': 1.5139174461364746} 01/27/2022 03:52:48 - INFO - codeparrot_training - Step 8506: {'lr': 0.0004776754212551006, 'samples': 1633344, 'steps': 8506, 'loss/train': 0.554034635424614} 01/27/2022 03:52:54 - INFO - codeparrot_training - Step 8507: {'lr': 0.0004776686620122206, 'samples': 1633536, 'steps': 8507, 'loss/train': 0.8833428025245667} 01/27/2022 03:52:57 - INFO - codeparrot_training - Step 8508: {'lr': 0.00047766190179408043, 'samples': 1633728, 'steps': 8508, 'loss/train': 1.2341668903827667} 01/27/2022 03:53:00 - INFO - codeparrot_training - Step 8509: {'lr': 0.00047765514060070887, 'samples': 1633920, 'steps': 8509, 'loss/train': 1.1286301016807556} 01/27/2022 03:53:03 - INFO - codeparrot_training - Step 8510: {'lr': 0.00047764837843213497, 'samples': 1634112, 'steps': 8510, 'loss/train': 1.5407940745353699} 01/27/2022 03:53:06 - INFO - codeparrot_training - Step 8511: {'lr': 0.0004776416152883878, 'samples': 1634304, 'steps': 8511, 'loss/train': 0.7872059047222137} 01/27/2022 03:53:10 - INFO - codeparrot_training - Step 8512: {'lr': 0.0004776348511694961, 'samples': 1634496, 'steps': 8512, 'loss/train': 1.2848167419433594} 01/27/2022 03:53:13 - INFO - codeparrot_training - Step 8513: {'lr': 0.0004776280860754891, 'samples': 1634688, 'steps': 8513, 'loss/train': 0.7434751242399216} 01/27/2022 03:53:16 - INFO - codeparrot_training - Step 8514: {'lr': 0.0004776213200063956, 'samples': 1634880, 'steps': 8514, 'loss/train': 1.2846665382385254} 01/27/2022 03:53:20 - INFO - codeparrot_training - Step 8515: {'lr': 0.00047761455296224464, 'samples': 1635072, 'steps': 8515, 'loss/train': 0.8873988389968872} 01/27/2022 03:53:23 - INFO - codeparrot_training - Step 8516: {'lr': 0.0004776077849430652, 'samples': 1635264, 'steps': 8516, 'loss/train': 0.8039158880710602} 01/27/2022 03:53:27 - INFO - codeparrot_training - Step 8517: {'lr': 0.00047760101594888633, 'samples': 1635456, 'steps': 8517, 
'loss/train': 0.6543142050504684} 01/27/2022 03:53:30 - INFO - codeparrot_training - Step 8518: {'lr': 0.000477594245979737, 'samples': 1635648, 'steps': 8518, 'loss/train': 0.8941981494426727} 01/27/2022 03:53:33 - INFO - codeparrot_training - Step 8519: {'lr': 0.0004775874750356461, 'samples': 1635840, 'steps': 8519, 'loss/train': 1.4109890162944794} 01/27/2022 03:53:36 - INFO - codeparrot_training - Step 8520: {'lr': 0.00047758070311664283, 'samples': 1636032, 'steps': 8520, 'loss/train': 1.0335785150527954} 01/27/2022 03:53:39 - INFO - codeparrot_training - Step 8521: {'lr': 0.000477573930222756, 'samples': 1636224, 'steps': 8521, 'loss/train': 0.7908400297164917} 01/27/2022 03:53:42 - INFO - codeparrot_training - Step 8522: {'lr': 0.0004775671563540147, 'samples': 1636416, 'steps': 8522, 'loss/train': 0.8499214947223663} 01/27/2022 03:53:45 - INFO - codeparrot_training - Step 8523: {'lr': 0.000477560381510448, 'samples': 1636608, 'steps': 8523, 'loss/train': 0.6914671361446381} 01/27/2022 03:53:50 - INFO - codeparrot_training - Step 8524: {'lr': 0.00047755360569208495, 'samples': 1636800, 'steps': 8524, 'loss/train': 0.6925649642944336} 01/27/2022 03:53:53 - INFO - codeparrot_training - Step 8525: {'lr': 0.00047754682889895444, 'samples': 1636992, 'steps': 8525, 'loss/train': 0.7557416260242462} 01/27/2022 03:53:56 - INFO - codeparrot_training - Step 8526: {'lr': 0.00047754005113108557, 'samples': 1637184, 'steps': 8526, 'loss/train': 0.35344306379556656} 01/27/2022 03:53:59 - INFO - codeparrot_training - Step 8527: {'lr': 0.0004775332723885074, 'samples': 1637376, 'steps': 8527, 'loss/train': 0.8307740092277527} 01/27/2022 03:54:02 - INFO - codeparrot_training - Step 8528: {'lr': 0.00047752649267124894, 'samples': 1637568, 'steps': 8528, 'loss/train': 0.7339210510253906} 01/27/2022 03:54:06 - INFO - codeparrot_training - Step 8529: {'lr': 0.0004775197119793392, 'samples': 1637760, 'steps': 8529, 'loss/train': 1.0814023911952972} 01/27/2022 03:54:09 - INFO - codeparrot_training - Step 8530: {'lr': 0.0004775129303128073, 'samples': 1637952, 'steps': 8530, 'loss/train': 1.1419530808925629} 01/27/2022 03:54:12 - INFO - codeparrot_training - Step 8531: {'lr': 0.0004775061476716822, 'samples': 1638144, 'steps': 8531, 'loss/train': 1.3854163885116577} 01/27/2022 03:54:15 - INFO - codeparrot_training - Step 8532: {'lr': 0.000477499364055993, 'samples': 1638336, 'steps': 8532, 'loss/train': 1.3056491911411285} 01/27/2022 03:54:20 - INFO - codeparrot_training - Step 8533: {'lr': 0.00047749257946576887, 'samples': 1638528, 'steps': 8533, 'loss/train': 1.226463407278061} 01/27/2022 03:54:23 - INFO - codeparrot_training - Step 8534: {'lr': 0.0004774857939010387, 'samples': 1638720, 'steps': 8534, 'loss/train': 0.6561640799045563} 01/27/2022 03:54:26 - INFO - codeparrot_training - Step 8535: {'lr': 0.0004774790073618316, 'samples': 1638912, 'steps': 8535, 'loss/train': 0.715465173125267} 01/27/2022 03:54:30 - INFO - codeparrot_training - Step 8536: {'lr': 0.00047747221984817666, 'samples': 1639104, 'steps': 8536, 'loss/train': 0.9470232725143433} 01/27/2022 03:54:33 - INFO - codeparrot_training - Step 8537: {'lr': 0.000477465431360103, 'samples': 1639296, 'steps': 8537, 'loss/train': 0.8689101934432983} 01/27/2022 03:54:36 - INFO - codeparrot_training - Step 8538: {'lr': 0.00047745864189763964, 'samples': 1639488, 'steps': 8538, 'loss/train': 0.7618594765663147} 01/27/2022 03:54:39 - INFO - codeparrot_training - Step 8539: {'lr': 0.0004774518514608157, 'samples': 1639680, 'steps': 8539, 
'loss/train': 0.9164738953113556} 01/27/2022 03:54:42 - INFO - codeparrot_training - Step 8540: {'lr': 0.00047744506004966024, 'samples': 1639872, 'steps': 8540, 'loss/train': 1.1147536039352417} 01/27/2022 03:54:45 - INFO - codeparrot_training - Step 8541: {'lr': 0.0004774382676642024, 'samples': 1640064, 'steps': 8541, 'loss/train': 0.4425605982542038} 01/27/2022 03:54:50 - INFO - codeparrot_training - Step 8542: {'lr': 0.0004774314743044712, 'samples': 1640256, 'steps': 8542, 'loss/train': 0.9595220983028412} 01/27/2022 03:54:53 - INFO - codeparrot_training - Step 8543: {'lr': 0.00047742467997049576, 'samples': 1640448, 'steps': 8543, 'loss/train': 0.944178968667984} 01/27/2022 03:54:56 - INFO - codeparrot_training - Step 8544: {'lr': 0.00047741788466230527, 'samples': 1640640, 'steps': 8544, 'loss/train': 0.22825871407985687} 01/27/2022 03:54:59 - INFO - codeparrot_training - Step 8545: {'lr': 0.00047741108837992877, 'samples': 1640832, 'steps': 8545, 'loss/train': 0.3753376454114914} 01/27/2022 03:55:02 - INFO - codeparrot_training - Step 8546: {'lr': 0.0004774042911233953, 'samples': 1641024, 'steps': 8546, 'loss/train': 0.9551796913146973} 01/27/2022 03:55:05 - INFO - codeparrot_training - Step 8547: {'lr': 0.0004773974928927342, 'samples': 1641216, 'steps': 8547, 'loss/train': 0.7626686990261078} 01/27/2022 03:55:09 - INFO - codeparrot_training - Step 8548: {'lr': 0.00047739069368797426, 'samples': 1641408, 'steps': 8548, 'loss/train': 0.4168728142976761} 01/27/2022 03:55:12 - INFO - codeparrot_training - Step 8549: {'lr': 0.0004773838935091449, 'samples': 1641600, 'steps': 8549, 'loss/train': 0.897036999464035} 01/27/2022 03:55:15 - INFO - codeparrot_training - Step 8550: {'lr': 0.00047737709235627515, 'samples': 1641792, 'steps': 8550, 'loss/train': 0.1403317116200924} 01/27/2022 03:55:19 - INFO - codeparrot_training - Step 8551: {'lr': 0.00047737029022939414, 'samples': 1641984, 'steps': 8551, 'loss/train': 1.5735599398612976} 01/27/2022 03:55:23 - INFO - codeparrot_training - Step 8552: {'lr': 0.00047736348712853094, 'samples': 1642176, 'steps': 8552, 'loss/train': 1.0782378315925598} 01/27/2022 03:55:26 - INFO - codeparrot_training - Step 8553: {'lr': 0.00047735668305371484, 'samples': 1642368, 'steps': 8553, 'loss/train': 1.1825980246067047} 01/27/2022 03:55:29 - INFO - codeparrot_training - Step 8554: {'lr': 0.0004773498780049749, 'samples': 1642560, 'steps': 8554, 'loss/train': 0.7246901839971542} 01/27/2022 03:55:32 - INFO - codeparrot_training - Step 8555: {'lr': 0.00047734307198234015, 'samples': 1642752, 'steps': 8555, 'loss/train': 1.4508769512176514} 01/27/2022 03:55:35 - INFO - codeparrot_training - Step 8556: {'lr': 0.00047733626498584, 'samples': 1642944, 'steps': 8556, 'loss/train': 0.8970068693161011} 01/27/2022 03:55:38 - INFO - codeparrot_training - Step 8557: {'lr': 0.0004773294570155035, 'samples': 1643136, 'steps': 8557, 'loss/train': 1.0179735124111176} 01/27/2022 03:55:41 - INFO - codeparrot_training - Step 8558: {'lr': 0.0004773226480713596, 'samples': 1643328, 'steps': 8558, 'loss/train': 0.3791733831167221} 01/27/2022 03:55:45 - INFO - codeparrot_training - Step 8559: {'lr': 0.00047731583815343784, 'samples': 1643520, 'steps': 8559, 'loss/train': 0.8426448404788971} 01/27/2022 03:55:49 - INFO - codeparrot_training - Step 8560: {'lr': 0.00047730902726176715, 'samples': 1643712, 'steps': 8560, 'loss/train': 1.1872669458389282} 01/27/2022 03:55:52 - INFO - codeparrot_training - Step 8561: {'lr': 0.00047730221539637677, 'samples': 1643904, 'steps': 8561, 
'loss/train': 0.9689788520336151} 01/27/2022 03:55:55 - INFO - codeparrot_training - Step 8562: {'lr': 0.00047729540255729585, 'samples': 1644096, 'steps': 8562, 'loss/train': 0.7163079679012299} 01/27/2022 03:55:58 - INFO - codeparrot_training - Step 8563: {'lr': 0.0004772885887445536, 'samples': 1644288, 'steps': 8563, 'loss/train': 0.9027139842510223} 01/27/2022 03:56:02 - INFO - codeparrot_training - Step 8564: {'lr': 0.0004772817739581793, 'samples': 1644480, 'steps': 8564, 'loss/train': 0.5195880979299545} 01/27/2022 03:56:05 - INFO - codeparrot_training - Step 8565: {'lr': 0.000477274958198202, 'samples': 1644672, 'steps': 8565, 'loss/train': 1.0025601089000702} 01/27/2022 03:56:08 - INFO - codeparrot_training - Step 8566: {'lr': 0.0004772681414646509, 'samples': 1644864, 'steps': 8566, 'loss/train': 1.2703159153461456} 01/27/2022 03:56:11 - INFO - codeparrot_training - Step 8567: {'lr': 0.00047726132375755525, 'samples': 1645056, 'steps': 8567, 'loss/train': 0.9122527241706848} 01/27/2022 03:56:16 - INFO - codeparrot_training - Step 8568: {'lr': 0.00047725450507694433, 'samples': 1645248, 'steps': 8568, 'loss/train': 0.2887229472398758} 01/27/2022 03:56:19 - INFO - codeparrot_training - Step 8569: {'lr': 0.00047724768542284726, 'samples': 1645440, 'steps': 8569, 'loss/train': 0.8458713591098785} 01/27/2022 03:56:22 - INFO - codeparrot_training - Step 8570: {'lr': 0.0004772408647952932, 'samples': 1645632, 'steps': 8570, 'loss/train': 1.104988932609558} 01/27/2022 03:56:26 - INFO - codeparrot_training - Step 8571: {'lr': 0.0004772340431943114, 'samples': 1645824, 'steps': 8571, 'loss/train': 0.9230436980724335} 01/27/2022 03:56:29 - INFO - codeparrot_training - Step 8572: {'lr': 0.0004772272206199312, 'samples': 1646016, 'steps': 8572, 'loss/train': 0.6572345048189163} 01/27/2022 03:56:32 - INFO - codeparrot_training - Step 8573: {'lr': 0.0004772203970721817, 'samples': 1646208, 'steps': 8573, 'loss/train': 1.1706214249134064} 01/27/2022 03:56:35 - INFO - codeparrot_training - Step 8574: {'lr': 0.0004772135725510922, 'samples': 1646400, 'steps': 8574, 'loss/train': 0.8582601249217987} 01/27/2022 03:56:38 - INFO - codeparrot_training - Step 8575: {'lr': 0.0004772067470566919, 'samples': 1646592, 'steps': 8575, 'loss/train': 0.8330688178539276} 01/27/2022 03:56:41 - INFO - codeparrot_training - Step 8576: {'lr': 0.00047719992058901006, 'samples': 1646784, 'steps': 8576, 'loss/train': 0.6976831555366516} 01/27/2022 03:56:46 - INFO - codeparrot_training - Step 8577: {'lr': 0.00047719309314807584, 'samples': 1646976, 'steps': 8577, 'loss/train': 0.49502845108509064} 01/27/2022 03:56:49 - INFO - codeparrot_training - Step 8578: {'lr': 0.0004771862647339186, 'samples': 1647168, 'steps': 8578, 'loss/train': 1.0079285502433777} 01/27/2022 03:56:52 - INFO - codeparrot_training - Step 8579: {'lr': 0.0004771794353465675, 'samples': 1647360, 'steps': 8579, 'loss/train': 0.9268145263195038} 01/27/2022 03:56:55 - INFO - codeparrot_training - Step 8580: {'lr': 0.00047717260498605186, 'samples': 1647552, 'steps': 8580, 'loss/train': 0.7511219680309296} 01/27/2022 03:56:58 - INFO - codeparrot_training - Step 8581: {'lr': 0.0004771657736524009, 'samples': 1647744, 'steps': 8581, 'loss/train': 0.8576302528381348} 01/27/2022 03:57:02 - INFO - codeparrot_training - Step 8582: {'lr': 0.00047715894134564395, 'samples': 1647936, 'steps': 8582, 'loss/train': 0.6736469864845276} 01/27/2022 03:57:05 - INFO - codeparrot_training - Step 8583: {'lr': 0.0004771521080658102, 'samples': 1648128, 'steps': 8583, 
'loss/train': 0.6676284223794937} 01/27/2022 03:57:08 - INFO - codeparrot_training - Step 8584: {'lr': 0.00047714527381292893, 'samples': 1648320, 'steps': 8584, 'loss/train': 0.8089910745620728} 01/27/2022 03:57:11 - INFO - codeparrot_training - Step 8585: {'lr': 0.00047713843858702943, 'samples': 1648512, 'steps': 8585, 'loss/train': 0.892036646604538} 01/27/2022 03:57:17 - INFO - codeparrot_training - Step 8586: {'lr': 0.000477131602388141, 'samples': 1648704, 'steps': 8586, 'loss/train': 2.015627861022949} 01/27/2022 03:57:20 - INFO - codeparrot_training - Step 8587: {'lr': 0.00047712476521629294, 'samples': 1648896, 'steps': 8587, 'loss/train': 0.8409552276134491} 01/27/2022 03:57:23 - INFO - codeparrot_training - Step 8588: {'lr': 0.0004771179270715145, 'samples': 1649088, 'steps': 8588, 'loss/train': 0.48380644619464874} 01/27/2022 03:57:27 - INFO - codeparrot_training - Step 8589: {'lr': 0.000477111087953835, 'samples': 1649280, 'steps': 8589, 'loss/train': 0.9778389930725098} 01/27/2022 03:57:30 - INFO - codeparrot_training - Step 8590: {'lr': 0.0004771042478632836, 'samples': 1649472, 'steps': 8590, 'loss/train': 1.706419050693512} 01/27/2022 03:57:33 - INFO - codeparrot_training - Step 8591: {'lr': 0.0004770974067998898, 'samples': 1649664, 'steps': 8591, 'loss/train': 1.5860912203788757} 01/27/2022 03:57:36 - INFO - codeparrot_training - Step 8592: {'lr': 0.0004770905647636828, 'samples': 1649856, 'steps': 8592, 'loss/train': 1.4162442684173584} 01/27/2022 03:57:39 - INFO - codeparrot_training - Step 8593: {'lr': 0.00047708372175469193, 'samples': 1650048, 'steps': 8593, 'loss/train': 1.0295012891292572} 01/27/2022 03:57:42 - INFO - codeparrot_training - Step 8594: {'lr': 0.0004770768777729465, 'samples': 1650240, 'steps': 8594, 'loss/train': 2.034437835216522} 01/27/2022 03:57:45 - INFO - codeparrot_training - Step 8595: {'lr': 0.0004770700328184758, 'samples': 1650432, 'steps': 8595, 'loss/train': 0.8038685917854309} 01/27/2022 03:57:50 - INFO - codeparrot_training - Step 8596: {'lr': 0.00047706318689130924, 'samples': 1650624, 'steps': 8596, 'loss/train': 1.0504457652568817} 01/27/2022 03:57:53 - INFO - codeparrot_training - Step 8597: {'lr': 0.0004770563399914761, 'samples': 1650816, 'steps': 8597, 'loss/train': 0.7897584736347198} 01/27/2022 03:57:56 - INFO - codeparrot_training - Step 8598: {'lr': 0.00047704949211900565, 'samples': 1651008, 'steps': 8598, 'loss/train': 0.8318645060062408} 01/27/2022 03:57:59 - INFO - codeparrot_training - Step 8599: {'lr': 0.0004770426432739273, 'samples': 1651200, 'steps': 8599, 'loss/train': 0.06548979319632053} 01/27/2022 03:58:02 - INFO - codeparrot_training - Step 8600: {'lr': 0.00047703579345627036, 'samples': 1651392, 'steps': 8600, 'loss/train': 0.6786059439182281} 01/27/2022 03:58:06 - INFO - codeparrot_training - Step 8601: {'lr': 0.00047702894266606413, 'samples': 1651584, 'steps': 8601, 'loss/train': 0.7456271946430206} 01/27/2022 03:58:09 - INFO - codeparrot_training - Step 8602: {'lr': 0.00047702209090333804, 'samples': 1651776, 'steps': 8602, 'loss/train': 0.5312301367521286} 01/27/2022 03:58:12 - INFO - codeparrot_training - Step 8603: {'lr': 0.0004770152381681214, 'samples': 1651968, 'steps': 8603, 'loss/train': 1.1467700600624084} 01/27/2022 03:58:15 - INFO - codeparrot_training - Step 8604: {'lr': 0.0004770083844604435, 'samples': 1652160, 'steps': 8604, 'loss/train': 1.2297537624835968} 01/27/2022 03:58:19 - INFO - codeparrot_training - Step 8605: {'lr': 0.00047700152978033387, 'samples': 1652352, 'steps': 8605, 
'loss/train': 1.0673402845859528} 01/27/2022 03:58:23 - INFO - codeparrot_training - Step 8606: {'lr': 0.0004769946741278217, 'samples': 1652544, 'steps': 8606, 'loss/train': 1.2956934571266174} 01/27/2022 03:58:26 - INFO - codeparrot_training - Step 8607: {'lr': 0.00047698781750293644, 'samples': 1652736, 'steps': 8607, 'loss/train': 0.730835035443306} 01/27/2022 03:58:29 - INFO - codeparrot_training - Step 8608: {'lr': 0.00047698095990570744, 'samples': 1652928, 'steps': 8608, 'loss/train': 0.7919160425662994} 01/27/2022 03:58:32 - INFO - codeparrot_training - Step 8609: {'lr': 0.00047697410133616414, 'samples': 1653120, 'steps': 8609, 'loss/train': 0.7303047627210617} 01/27/2022 03:58:35 - INFO - codeparrot_training - Step 8610: {'lr': 0.0004769672417943358, 'samples': 1653312, 'steps': 8610, 'loss/train': 1.0608184039592743} 01/27/2022 03:58:38 - INFO - codeparrot_training - Step 8611: {'lr': 0.00047696038128025185, 'samples': 1653504, 'steps': 8611, 'loss/train': 0.8185466229915619} 01/27/2022 03:58:41 - INFO - codeparrot_training - Step 8612: {'lr': 0.00047695351979394173, 'samples': 1653696, 'steps': 8612, 'loss/train': 0.9022884964942932} 01/27/2022 03:58:45 - INFO - codeparrot_training - Step 8613: {'lr': 0.00047694665733543485, 'samples': 1653888, 'steps': 8613, 'loss/train': 0.7259459048509598} 01/27/2022 03:58:50 - INFO - codeparrot_training - Step 8614: {'lr': 0.00047693979390476046, 'samples': 1654080, 'steps': 8614, 'loss/train': 0.6593299359083176} 01/27/2022 03:58:53 - INFO - codeparrot_training - Step 8615: {'lr': 0.00047693292950194813, 'samples': 1654272, 'steps': 8615, 'loss/train': 0.6612650603055954} 01/27/2022 03:58:56 - INFO - codeparrot_training - Step 8616: {'lr': 0.0004769260641270271, 'samples': 1654464, 'steps': 8616, 'loss/train': 1.0609714686870575} 01/27/2022 03:58:59 - INFO - codeparrot_training - Step 8617: {'lr': 0.0004769191977800269, 'samples': 1654656, 'steps': 8617, 'loss/train': 0.8289287388324738} 01/27/2022 03:59:02 - INFO - codeparrot_training - Step 8618: {'lr': 0.0004769123304609769, 'samples': 1654848, 'steps': 8618, 'loss/train': 3.0458085536956787} 01/27/2022 03:59:05 - INFO - codeparrot_training - Step 8619: {'lr': 0.0004769054621699066, 'samples': 1655040, 'steps': 8619, 'loss/train': 1.075998455286026} 01/27/2022 03:59:09 - INFO - codeparrot_training - Step 8620: {'lr': 0.0004768985929068453, 'samples': 1655232, 'steps': 8620, 'loss/train': 0.6863797456026077} 01/27/2022 03:59:12 - INFO - codeparrot_training - Step 8621: {'lr': 0.0004768917226718225, 'samples': 1655424, 'steps': 8621, 'loss/train': 0.915544331073761} 01/27/2022 03:59:15 - INFO - codeparrot_training - Step 8622: {'lr': 0.0004768848514648676, 'samples': 1655616, 'steps': 8622, 'loss/train': 0.7781392335891724} 01/27/2022 03:59:19 - INFO - codeparrot_training - Step 8623: {'lr': 0.0004768779792860101, 'samples': 1655808, 'steps': 8623, 'loss/train': 0.3963661640882492} 01/27/2022 03:59:23 - INFO - codeparrot_training - Step 8624: {'lr': 0.00047687110613527924, 'samples': 1656000, 'steps': 8624, 'loss/train': 0.8802343010902405} 01/27/2022 03:59:26 - INFO - codeparrot_training - Step 8625: {'lr': 0.0004768642320127047, 'samples': 1656192, 'steps': 8625, 'loss/train': 0.6750632375478745} 01/27/2022 03:59:29 - INFO - codeparrot_training - Step 8626: {'lr': 0.0004768573569183158, 'samples': 1656384, 'steps': 8626, 'loss/train': 0.7164928168058395} 01/27/2022 03:59:32 - INFO - codeparrot_training - Step 8627: {'lr': 0.000476850480852142, 'samples': 1656576, 'steps': 8627, 
'loss/train': 0.8202654719352722} 01/27/2022 03:59:35 - INFO - codeparrot_training - Step 8628: {'lr': 0.0004768436038142128, 'samples': 1656768, 'steps': 8628, 'loss/train': 0.46984854340553284} 01/27/2022 03:59:38 - INFO - codeparrot_training - Step 8629: {'lr': 0.00047683672580455764, 'samples': 1656960, 'steps': 8629, 'loss/train': 0.8274288475513458} 01/27/2022 03:59:41 - INFO - codeparrot_training - Step 8630: {'lr': 0.00047682984682320597, 'samples': 1657152, 'steps': 8630, 'loss/train': 1.338501513004303} 01/27/2022 03:59:45 - INFO - codeparrot_training - Step 8631: {'lr': 0.0004768229668701872, 'samples': 1657344, 'steps': 8631, 'loss/train': 0.623545378446579} 01/27/2022 03:59:50 - INFO - codeparrot_training - Step 8632: {'lr': 0.00047681608594553093, 'samples': 1657536, 'steps': 8632, 'loss/train': 1.0293566286563873} 01/27/2022 03:59:53 - INFO - codeparrot_training - Step 8633: {'lr': 0.00047680920404926655, 'samples': 1657728, 'steps': 8633, 'loss/train': 1.563552975654602} 01/27/2022 03:59:56 - INFO - codeparrot_training - Step 8634: {'lr': 0.0004768023211814236, 'samples': 1657920, 'steps': 8634, 'loss/train': 0.838102251291275} 01/27/2022 03:59:59 - INFO - codeparrot_training - Step 8635: {'lr': 0.0004767954373420315, 'samples': 1658112, 'steps': 8635, 'loss/train': 1.0932703614234924} 01/27/2022 04:00:02 - INFO - codeparrot_training - Step 8636: {'lr': 0.0004767885525311197, 'samples': 1658304, 'steps': 8636, 'loss/train': 0.9215690195560455} 01/27/2022 04:00:05 - INFO - codeparrot_training - Step 8637: {'lr': 0.00047678166674871783, 'samples': 1658496, 'steps': 8637, 'loss/train': 1.3764958083629608} 01/27/2022 04:00:08 - INFO - codeparrot_training - Step 8638: {'lr': 0.0004767747799948553, 'samples': 1658688, 'steps': 8638, 'loss/train': 0.7991353869438171} 01/27/2022 04:00:12 - INFO - codeparrot_training - Step 8639: {'lr': 0.0004767678922695616, 'samples': 1658880, 'steps': 8639, 'loss/train': 0.8709728121757507} 01/27/2022 04:00:16 - INFO - codeparrot_training - Step 8640: {'lr': 0.0004767610035728662, 'samples': 1659072, 'steps': 8640, 'loss/train': 1.356156975030899} 01/27/2022 04:00:19 - INFO - codeparrot_training - Step 8641: {'lr': 0.00047675411390479876, 'samples': 1659264, 'steps': 8641, 'loss/train': 0.32922300696372986} 01/27/2022 04:00:22 - INFO - codeparrot_training - Step 8642: {'lr': 0.0004767472232653887, 'samples': 1659456, 'steps': 8642, 'loss/train': 0.9626597464084625} 01/27/2022 04:00:25 - INFO - codeparrot_training - Step 8643: {'lr': 0.00047674033165466545, 'samples': 1659648, 'steps': 8643, 'loss/train': 0.9814795553684235} 01/27/2022 04:00:28 - INFO - codeparrot_training - Step 8644: {'lr': 0.0004767334390726588, 'samples': 1659840, 'steps': 8644, 'loss/train': 1.173525094985962} 01/27/2022 04:00:32 - INFO - codeparrot_training - Step 8645: {'lr': 0.00047672654551939785, 'samples': 1660032, 'steps': 8645, 'loss/train': 1.1105932295322418} 01/27/2022 04:00:35 - INFO - codeparrot_training - Step 8646: {'lr': 0.00047671965099491256, 'samples': 1660224, 'steps': 8646, 'loss/train': 1.099407970905304} 01/27/2022 04:00:38 - INFO - codeparrot_training - Step 8647: {'lr': 0.0004767127554992322, 'samples': 1660416, 'steps': 8647, 'loss/train': 0.9515188336372375} 01/27/2022 04:00:41 - INFO - codeparrot_training - Step 8648: {'lr': 0.0004767058590323864, 'samples': 1660608, 'steps': 8648, 'loss/train': 0.8182938694953918} 01/27/2022 04:00:46 - INFO - codeparrot_training - Step 8649: {'lr': 0.00047669896159440464, 'samples': 1660800, 'steps': 8649, 
'loss/train': 0.8561208844184875} 01/27/2022 04:00:49 - INFO - codeparrot_training - Step 8650: {'lr': 0.00047669206318531654, 'samples': 1660992, 'steps': 8650, 'loss/train': 0.38522882759571075} 01/27/2022 04:00:52 - INFO - codeparrot_training - Step 8651: {'lr': 0.00047668516380515165, 'samples': 1661184, 'steps': 8651, 'loss/train': 0.8544170558452606} 01/27/2022 04:00:55 - INFO - codeparrot_training - Step 8652: {'lr': 0.0004766782634539395, 'samples': 1661376, 'steps': 8652, 'loss/train': 1.0394555926322937} 01/27/2022 04:00:58 - INFO - codeparrot_training - Step 8653: {'lr': 0.00047667136213170957, 'samples': 1661568, 'steps': 8653, 'loss/train': 1.02501779794693} 01/27/2022 04:01:01 - INFO - codeparrot_training - Step 8654: {'lr': 0.00047666445983849163, 'samples': 1661760, 'steps': 8654, 'loss/train': 0.969385027885437} 01/27/2022 04:01:05 - INFO - codeparrot_training - Step 8655: {'lr': 0.000476657556574315, 'samples': 1661952, 'steps': 8655, 'loss/train': 0.8611529767513275} 01/27/2022 04:01:08 - INFO - codeparrot_training - Step 8656: {'lr': 0.00047665065233920946, 'samples': 1662144, 'steps': 8656, 'loss/train': 1.0744542181491852} 01/27/2022 04:01:11 - INFO - codeparrot_training - Step 8657: {'lr': 0.0004766437471332045, 'samples': 1662336, 'steps': 8657, 'loss/train': 0.8994928300380707} 01/27/2022 04:01:15 - INFO - codeparrot_training - Step 8658: {'lr': 0.0004766368409563296, 'samples': 1662528, 'steps': 8658, 'loss/train': 0.7472744733095169} 01/27/2022 04:01:18 - INFO - codeparrot_training - Step 8659: {'lr': 0.0004766299338086145, 'samples': 1662720, 'steps': 8659, 'loss/train': 0.8219413161277771} 01/27/2022 04:01:22 - INFO - codeparrot_training - Step 8660: {'lr': 0.0004766230256900887, 'samples': 1662912, 'steps': 8660, 'loss/train': 1.0307218730449677} 01/27/2022 04:01:25 - INFO - codeparrot_training - Step 8661: {'lr': 0.00047661611660078184, 'samples': 1663104, 'steps': 8661, 'loss/train': 1.1101622879505157} 01/27/2022 04:01:28 - INFO - codeparrot_training - Step 8662: {'lr': 0.0004766092065407235, 'samples': 1663296, 'steps': 8662, 'loss/train': 1.037444829940796} 01/27/2022 04:01:31 - INFO - codeparrot_training - Step 8663: {'lr': 0.0004766022955099433, 'samples': 1663488, 'steps': 8663, 'loss/train': 0.5198741108179092} 01/27/2022 04:01:34 - INFO - codeparrot_training - Step 8664: {'lr': 0.00047659538350847076, 'samples': 1663680, 'steps': 8664, 'loss/train': 0.9854938387870789} 01/27/2022 04:01:37 - INFO - codeparrot_training - Step 8665: {'lr': 0.00047658847053633555, 'samples': 1663872, 'steps': 8665, 'loss/train': 0.5437801480293274} 01/27/2022 04:01:43 - INFO - codeparrot_training - Step 8666: {'lr': 0.00047658155659356725, 'samples': 1664064, 'steps': 8666, 'loss/train': 0.6759067922830582} 01/27/2022 04:01:46 - INFO - codeparrot_training - Step 8667: {'lr': 0.0004765746416801956, 'samples': 1664256, 'steps': 8667, 'loss/train': 0.9304957687854767} 01/27/2022 04:01:49 - INFO - codeparrot_training - Step 8668: {'lr': 0.0004765677257962501, 'samples': 1664448, 'steps': 8668, 'loss/train': 1.128305733203888} 01/27/2022 04:01:52 - INFO - codeparrot_training - Step 8669: {'lr': 0.0004765608089417604, 'samples': 1664640, 'steps': 8669, 'loss/train': 1.4559221863746643} 01/27/2022 04:01:55 - INFO - codeparrot_training - Step 8670: {'lr': 0.0004765538911167562, 'samples': 1664832, 'steps': 8670, 'loss/train': 1.207157850265503} 01/27/2022 04:01:58 - INFO - codeparrot_training - Step 8671: {'lr': 0.00047654697232126696, 'samples': 1665024, 'steps': 8671, 
'loss/train': 0.8576090633869171} 01/27/2022 04:02:01 - INFO - codeparrot_training - Step 8672: {'lr': 0.00047654005255532247, 'samples': 1665216, 'steps': 8672, 'loss/train': 0.777294784784317} 01/27/2022 04:02:05 - INFO - codeparrot_training - Step 8673: {'lr': 0.0004765331318189523, 'samples': 1665408, 'steps': 8673, 'loss/train': 0.7937143743038177} 01/27/2022 04:02:08 - INFO - codeparrot_training - Step 8674: {'lr': 0.00047652621011218623, 'samples': 1665600, 'steps': 8674, 'loss/train': 0.5480559021234512} 01/27/2022 04:02:12 - INFO - codeparrot_training - Step 8675: {'lr': 0.0004765192874350537, 'samples': 1665792, 'steps': 8675, 'loss/train': 0.19493452459573746} 01/27/2022 04:02:15 - INFO - codeparrot_training - Step 8676: {'lr': 0.0004765123637875845, 'samples': 1665984, 'steps': 8676, 'loss/train': 0.6487187147140503} 01/27/2022 04:02:19 - INFO - codeparrot_training - Step 8677: {'lr': 0.00047650543916980827, 'samples': 1666176, 'steps': 8677, 'loss/train': 1.1005300283432007} 01/27/2022 04:02:22 - INFO - codeparrot_training - Step 8678: {'lr': 0.00047649851358175466, 'samples': 1666368, 'steps': 8678, 'loss/train': 0.8422171175479889} 01/27/2022 04:02:25 - INFO - codeparrot_training - Step 8679: {'lr': 0.0004764915870234533, 'samples': 1666560, 'steps': 8679, 'loss/train': 0.8668735921382904} 01/27/2022 04:02:28 - INFO - codeparrot_training - Step 8680: {'lr': 0.000476484659494934, 'samples': 1666752, 'steps': 8680, 'loss/train': 0.8713429570198059} 01/27/2022 04:02:31 - INFO - codeparrot_training - Step 8681: {'lr': 0.0004764777309962263, 'samples': 1666944, 'steps': 8681, 'loss/train': 0.5666868388652802} 01/27/2022 04:02:34 - INFO - codeparrot_training - Step 8682: {'lr': 0.0004764708015273599, 'samples': 1667136, 'steps': 8682, 'loss/train': 0.719230905175209} 01/27/2022 04:02:38 - INFO - codeparrot_training - Step 8683: {'lr': 0.0004764638710883644, 'samples': 1667328, 'steps': 8683, 'loss/train': 1.1133002042770386} 01/27/2022 04:02:41 - INFO - codeparrot_training - Step 8684: {'lr': 0.0004764569396792697, 'samples': 1667520, 'steps': 8684, 'loss/train': 0.4622996896505356} 01/27/2022 04:02:45 - INFO - codeparrot_training - Step 8685: {'lr': 0.00047645000730010535, 'samples': 1667712, 'steps': 8685, 'loss/train': 0.5372312664985657} 01/27/2022 04:02:49 - INFO - codeparrot_training - Step 8686: {'lr': 0.00047644307395090107, 'samples': 1667904, 'steps': 8686, 'loss/train': 0.8863745033740997} 01/27/2022 04:02:52 - INFO - codeparrot_training - Step 8687: {'lr': 0.0004764361396316866, 'samples': 1668096, 'steps': 8687, 'loss/train': 0.9699511528015137} 01/27/2022 04:02:55 - INFO - codeparrot_training - Step 8688: {'lr': 0.0004764292043424916, 'samples': 1668288, 'steps': 8688, 'loss/train': 0.9055407643318176} 01/27/2022 04:02:58 - INFO - codeparrot_training - Step 8689: {'lr': 0.0004764222680833458, 'samples': 1668480, 'steps': 8689, 'loss/train': 0.9034092128276825} 01/27/2022 04:03:01 - INFO - codeparrot_training - Step 8690: {'lr': 0.0004764153308542788, 'samples': 1668672, 'steps': 8690, 'loss/train': 1.9672751426696777} 01/27/2022 04:03:04 - INFO - codeparrot_training - Step 8691: {'lr': 0.0004764083926553205, 'samples': 1668864, 'steps': 8691, 'loss/train': 0.8582149744033813} 01/27/2022 04:03:07 - INFO - codeparrot_training - Step 8692: {'lr': 0.00047640145348650057, 'samples': 1669056, 'steps': 8692, 'loss/train': 0.7777600586414337} 01/27/2022 04:03:12 - INFO - codeparrot_training - Step 8693: {'lr': 0.0004763945133478486, 'samples': 1669248, 'steps': 8693, 
'loss/train': 0.8934420347213745} 01/27/2022 04:03:16 - INFO - codeparrot_training - Step 8694: {'lr': 0.0004763875722393945, 'samples': 1669440, 'steps': 8694, 'loss/train': 1.0536347329616547} 01/27/2022 04:03:19 - INFO - codeparrot_training - Step 8695: {'lr': 0.000476380630161168, 'samples': 1669632, 'steps': 8695, 'loss/train': 1.4006497263908386} 01/27/2022 04:03:22 - INFO - codeparrot_training - Step 8696: {'lr': 0.00047637368711319863, 'samples': 1669824, 'steps': 8696, 'loss/train': 0.5193295776844025} 01/27/2022 04:03:25 - INFO - codeparrot_training - Step 8697: {'lr': 0.00047636674309551626, 'samples': 1670016, 'steps': 8697, 'loss/train': 0.4179660826921463} 01/27/2022 04:03:28 - INFO - codeparrot_training - Step 8698: {'lr': 0.0004763597981081507, 'samples': 1670208, 'steps': 8698, 'loss/train': 0.7419927567243576} 01/27/2022 04:03:31 - INFO - codeparrot_training - Step 8699: {'lr': 0.00047635285215113165, 'samples': 1670400, 'steps': 8699, 'loss/train': 0.8912272453308105} 01/27/2022 04:03:34 - INFO - codeparrot_training - Step 8700: {'lr': 0.0004763459052244888, 'samples': 1670592, 'steps': 8700, 'loss/train': 0.4592955708503723} 01/27/2022 04:03:38 - INFO - codeparrot_training - Step 8701: {'lr': 0.0004763389573282521, 'samples': 1670784, 'steps': 8701, 'loss/train': 1.0707643032073975} 01/27/2022 04:03:42 - INFO - codeparrot_training - Step 8702: {'lr': 0.00047633200846245106, 'samples': 1670976, 'steps': 8702, 'loss/train': 0.3086552694439888} 01/27/2022 04:03:45 - INFO - codeparrot_training - Step 8703: {'lr': 0.0004763250586271156, 'samples': 1671168, 'steps': 8703, 'loss/train': 0.7031025588512421} 01/27/2022 04:03:48 - INFO - codeparrot_training - Step 8704: {'lr': 0.00047631810782227535, 'samples': 1671360, 'steps': 8704, 'loss/train': 1.0947875082492828} 01/27/2022 04:03:51 - INFO - codeparrot_training - Step 8705: {'lr': 0.00047631115604796035, 'samples': 1671552, 'steps': 8705, 'loss/train': 1.3773931860923767} 01/27/2022 04:03:55 - INFO - codeparrot_training - Step 8706: {'lr': 0.0004763042033042001, 'samples': 1671744, 'steps': 8706, 'loss/train': 0.3523501306772232} 01/27/2022 04:03:58 - INFO - codeparrot_training - Step 8707: {'lr': 0.0004762972495910246, 'samples': 1671936, 'steps': 8707, 'loss/train': 0.7454057335853577} 01/27/2022 04:04:01 - INFO - codeparrot_training - Step 8708: {'lr': 0.00047629029490846346, 'samples': 1672128, 'steps': 8708, 'loss/train': 0.9187206029891968} 01/27/2022 04:04:04 - INFO - codeparrot_training - Step 8709: {'lr': 0.0004762833392565466, 'samples': 1672320, 'steps': 8709, 'loss/train': 0.6797366738319397} 01/27/2022 04:04:07 - INFO - codeparrot_training - Step 8710: {'lr': 0.00047627638263530374, 'samples': 1672512, 'steps': 8710, 'loss/train': 0.872597873210907} 01/27/2022 04:04:12 - INFO - codeparrot_training - Step 8711: {'lr': 0.00047626942504476477, 'samples': 1672704, 'steps': 8711, 'loss/train': 1.2637430727481842} 01/27/2022 04:04:15 - INFO - codeparrot_training - Step 8712: {'lr': 0.00047626246648495936, 'samples': 1672896, 'steps': 8712, 'loss/train': 0.8688988387584686} 01/27/2022 04:04:19 - INFO - codeparrot_training - Step 8713: {'lr': 0.0004762555069559175, 'samples': 1673088, 'steps': 8713, 'loss/train': 0.5265332758426666} 01/27/2022 04:04:22 - INFO - codeparrot_training - Step 8714: {'lr': 0.00047624854645766875, 'samples': 1673280, 'steps': 8714, 'loss/train': 0.7521159052848816} 01/27/2022 04:04:25 - INFO - codeparrot_training - Step 8715: {'lr': 0.0004762415849902431, 'samples': 1673472, 'steps': 8715, 
'loss/train': 1.2563100457191467} 01/27/2022 04:04:28 - INFO - codeparrot_training - Step 8716: {'lr': 0.0004762346225536703, 'samples': 1673664, 'steps': 8716, 'loss/train': 0.6370216459035873} 01/27/2022 04:04:31 - INFO - codeparrot_training - Step 8717: {'lr': 0.0004762276591479804, 'samples': 1673856, 'steps': 8717, 'loss/train': 0.8574124574661255} 01/27/2022 04:04:34 - INFO - codeparrot_training - Step 8718: {'lr': 0.00047622069477320285, 'samples': 1674048, 'steps': 8718, 'loss/train': 0.8612976372241974} 01/27/2022 04:04:37 - INFO - codeparrot_training - Step 8719: {'lr': 0.0004762137294293678, 'samples': 1674240, 'steps': 8719, 'loss/train': 0.9407105147838593} 01/27/2022 04:04:42 - INFO - codeparrot_training - Step 8720: {'lr': 0.0004762067631165049, 'samples': 1674432, 'steps': 8720, 'loss/train': 1.4363349080085754} 01/27/2022 04:04:45 - INFO - codeparrot_training - Step 8721: {'lr': 0.0004761997958346441, 'samples': 1674624, 'steps': 8721, 'loss/train': 0.8879264295101166} 01/27/2022 04:04:48 - INFO - codeparrot_training - Step 8722: {'lr': 0.00047619282758381513, 'samples': 1674816, 'steps': 8722, 'loss/train': 0.7995741069316864} 01/27/2022 04:04:52 - INFO - codeparrot_training - Step 8723: {'lr': 0.0004761858583640479, 'samples': 1675008, 'steps': 8723, 'loss/train': 0.830101490020752} 01/27/2022 04:04:55 - INFO - codeparrot_training - Step 8724: {'lr': 0.00047617888817537234, 'samples': 1675200, 'steps': 8724, 'loss/train': 0.6424972414970398} 01/27/2022 04:04:58 - INFO - codeparrot_training - Step 8725: {'lr': 0.00047617191701781824, 'samples': 1675392, 'steps': 8725, 'loss/train': 0.7679971754550934} 01/27/2022 04:05:01 - INFO - codeparrot_training - Step 8726: {'lr': 0.0004761649448914155, 'samples': 1675584, 'steps': 8726, 'loss/train': 0.14054446667432785} 01/27/2022 04:05:04 - INFO - codeparrot_training - Step 8727: {'lr': 0.0004761579717961939, 'samples': 1675776, 'steps': 8727, 'loss/train': 0.5735173523426056} 01/27/2022 04:05:09 - INFO - codeparrot_training - Step 8728: {'lr': 0.0004761509977321834, 'samples': 1675968, 'steps': 8728, 'loss/train': 0.9598193764686584} 01/27/2022 04:05:12 - INFO - codeparrot_training - Step 8729: {'lr': 0.0004761440226994138, 'samples': 1676160, 'steps': 8729, 'loss/train': 1.1841634511947632} 01/27/2022 04:05:15 - INFO - codeparrot_training - Step 8730: {'lr': 0.000476137046697915, 'samples': 1676352, 'steps': 8730, 'loss/train': 0.21113506704568863} 01/27/2022 04:05:18 - INFO - codeparrot_training - Step 8731: {'lr': 0.0004761300697277169, 'samples': 1676544, 'steps': 8731, 'loss/train': 0.8588805198669434} 01/27/2022 04:05:21 - INFO - codeparrot_training - Step 8732: {'lr': 0.0004761230917888494, 'samples': 1676736, 'steps': 8732, 'loss/train': 0.6442038416862488} 01/27/2022 04:05:25 - INFO - codeparrot_training - Step 8733: {'lr': 0.00047611611288134236, 'samples': 1676928, 'steps': 8733, 'loss/train': 0.841901957988739} 01/27/2022 04:05:28 - INFO - codeparrot_training - Step 8734: {'lr': 0.00047610913300522576, 'samples': 1677120, 'steps': 8734, 'loss/train': 0.8307494223117828} 01/27/2022 04:05:31 - INFO - codeparrot_training - Step 8735: {'lr': 0.00047610215216052946, 'samples': 1677312, 'steps': 8735, 'loss/train': 0.6518794745206833} 01/27/2022 04:05:34 - INFO - codeparrot_training - Step 8736: {'lr': 0.0004760951703472832, 'samples': 1677504, 'steps': 8736, 'loss/train': 0.5817018449306488} 01/27/2022 04:05:38 - INFO - codeparrot_training - Step 8737: {'lr': 0.0004760881875655171, 'samples': 1677696, 'steps': 8737, 
'loss/train': 0.6586654186248779} 01/27/2022 04:05:42 - INFO - codeparrot_training - Step 8738: {'lr': 0.000476081203815261, 'samples': 1677888, 'steps': 8738, 'loss/train': 0.7684539556503296} 01/27/2022 04:05:45 - INFO - codeparrot_training - Step 8739: {'lr': 0.0004760742190965447, 'samples': 1678080, 'steps': 8739, 'loss/train': 0.11352469772100449} 01/27/2022 04:05:48 - INFO - codeparrot_training - Step 8740: {'lr': 0.0004760672334093984, 'samples': 1678272, 'steps': 8740, 'loss/train': 1.145266592502594} 01/27/2022 04:05:51 - INFO - codeparrot_training - Step 8741: {'lr': 0.0004760602467538517, 'samples': 1678464, 'steps': 8741, 'loss/train': 1.18969264626503} 01/27/2022 04:05:54 - INFO - codeparrot_training - Step 8742: {'lr': 0.0004760532591299348, 'samples': 1678656, 'steps': 8742, 'loss/train': 0.6347986310720444} 01/27/2022 04:05:57 - INFO - codeparrot_training - Step 8743: {'lr': 0.00047604627053767754, 'samples': 1678848, 'steps': 8743, 'loss/train': 1.1930729448795319} 01/27/2022 04:06:00 - INFO - codeparrot_training - Step 8744: {'lr': 0.0004760392809771098, 'samples': 1679040, 'steps': 8744, 'loss/train': 0.49827124178409576} 01/27/2022 04:06:04 - INFO - codeparrot_training - Step 8745: {'lr': 0.00047603229044826146, 'samples': 1679232, 'steps': 8745, 'loss/train': 0.8538298308849335} 01/27/2022 04:06:09 - INFO - codeparrot_training - Step 8746: {'lr': 0.00047602529895116264, 'samples': 1679424, 'steps': 8746, 'loss/train': 0.9386561214923859} 01/27/2022 04:06:12 - INFO - codeparrot_training - Step 8747: {'lr': 0.0004760183064858432, 'samples': 1679616, 'steps': 8747, 'loss/train': 0.7624708414077759} 01/27/2022 04:06:15 - INFO - codeparrot_training - Step 8748: {'lr': 0.0004760113130523331, 'samples': 1679808, 'steps': 8748, 'loss/train': 0.8180204629898071} 01/27/2022 04:06:18 - INFO - codeparrot_training - Step 8749: {'lr': 0.0004760043186506624, 'samples': 1680000, 'steps': 8749, 'loss/train': 0.8715585172176361} 01/27/2022 04:06:21 - INFO - codeparrot_training - Step 8750: {'lr': 0.0004759973232808609, 'samples': 1680192, 'steps': 8750, 'loss/train': 0.9888376593589783} 01/27/2022 04:06:24 - INFO - codeparrot_training - Step 8751: {'lr': 0.0004759903269429585, 'samples': 1680384, 'steps': 8751, 'loss/train': 0.375079482793808} 01/27/2022 04:06:28 - INFO - codeparrot_training - Step 8752: {'lr': 0.00047598332963698543, 'samples': 1680576, 'steps': 8752, 'loss/train': 0.6922146677970886} 01/27/2022 04:06:31 - INFO - codeparrot_training - Step 8753: {'lr': 0.00047597633136297154, 'samples': 1680768, 'steps': 8753, 'loss/train': 1.068029522895813} 01/27/2022 04:06:34 - INFO - codeparrot_training - Step 8754: {'lr': 0.0004759693321209467, 'samples': 1680960, 'steps': 8754, 'loss/train': 0.6912199705839157} 01/27/2022 04:06:38 - INFO - codeparrot_training - Step 8755: {'lr': 0.00047596233191094114, 'samples': 1681152, 'steps': 8755, 'loss/train': 0.46337342262268066} 01/27/2022 04:06:42 - INFO - codeparrot_training - Step 8756: {'lr': 0.0004759553307329846, 'samples': 1681344, 'steps': 8756, 'loss/train': 1.0610357522964478} 01/27/2022 04:06:45 - INFO - codeparrot_training - Step 8757: {'lr': 0.00047594832858710725, 'samples': 1681536, 'steps': 8757, 'loss/train': 0.7001245021820068} 01/27/2022 04:06:48 - INFO - codeparrot_training - Step 8758: {'lr': 0.0004759413254733389, 'samples': 1681728, 'steps': 8758, 'loss/train': 1.176485538482666} 01/27/2022 04:06:51 - INFO - codeparrot_training - Step 8759: {'lr': 0.0004759343213917097, 'samples': 1681920, 'steps': 8759, 
'loss/train': 0.29349636286497116} 01/27/2022 04:06:54 - INFO - codeparrot_training - Step 8760: {'lr': 0.0004759273163422496, 'samples': 1682112, 'steps': 8760, 'loss/train': 0.7319397479295731} 01/27/2022 04:06:57 - INFO - codeparrot_training - Step 8761: {'lr': 0.00047592031032498875, 'samples': 1682304, 'steps': 8761, 'loss/train': 0.6962304711341858} 01/27/2022 04:07:00 - INFO - codeparrot_training - Step 8762: {'lr': 0.00047591330333995684, 'samples': 1682496, 'steps': 8762, 'loss/train': 0.7357923835515976} 01/27/2022 04:07:03 - INFO - codeparrot_training - Step 8763: {'lr': 0.0004759062953871842, 'samples': 1682688, 'steps': 8763, 'loss/train': 1.1423397660255432} 01/27/2022 04:07:08 - INFO - codeparrot_training - Step 8764: {'lr': 0.0004758992864667007, 'samples': 1682880, 'steps': 8764, 'loss/train': 0.40539756417274475} 01/27/2022 04:07:11 - INFO - codeparrot_training - Step 8765: {'lr': 0.0004758922765785363, 'samples': 1683072, 'steps': 8765, 'loss/train': 0.9931491315364838} 01/27/2022 04:07:14 - INFO - codeparrot_training - Step 8766: {'lr': 0.00047588526572272117, 'samples': 1683264, 'steps': 8766, 'loss/train': 0.7557407319545746} 01/27/2022 04:07:17 - INFO - codeparrot_training - Step 8767: {'lr': 0.0004758782538992853, 'samples': 1683456, 'steps': 8767, 'loss/train': 1.0287723541259766} 01/27/2022 04:07:21 - INFO - codeparrot_training - Step 8768: {'lr': 0.00047587124110825874, 'samples': 1683648, 'steps': 8768, 'loss/train': 1.01328706741333} 01/27/2022 04:07:24 - INFO - codeparrot_training - Step 8769: {'lr': 0.0004758642273496714, 'samples': 1683840, 'steps': 8769, 'loss/train': 0.8972915410995483} 01/27/2022 04:07:27 - INFO - codeparrot_training - Step 8770: {'lr': 0.0004758572126235535, 'samples': 1684032, 'steps': 8770, 'loss/train': 0.6617833077907562} 01/27/2022 04:07:30 - INFO - codeparrot_training - Step 8771: {'lr': 0.0004758501969299351, 'samples': 1684224, 'steps': 8771, 'loss/train': 0.8771815001964569} 01/27/2022 04:07:33 - INFO - codeparrot_training - Step 8772: {'lr': 0.0004758431802688461, 'samples': 1684416, 'steps': 8772, 'loss/train': 0.8256228268146515} 01/27/2022 04:07:38 - INFO - codeparrot_training - Step 8773: {'lr': 0.00047583616264031657, 'samples': 1684608, 'steps': 8773, 'loss/train': 0.8623432517051697} 01/27/2022 04:07:41 - INFO - codeparrot_training - Step 8774: {'lr': 0.00047582914404437673, 'samples': 1684800, 'steps': 8774, 'loss/train': 0.8079828321933746} 01/27/2022 04:07:44 - INFO - codeparrot_training - Step 8775: {'lr': 0.00047582212448105647, 'samples': 1684992, 'steps': 8775, 'loss/train': 0.34557900577783585} 01/27/2022 04:07:47 - INFO - codeparrot_training - Step 8776: {'lr': 0.000475815103950386, 'samples': 1685184, 'steps': 8776, 'loss/train': 0.9886213839054108} 01/27/2022 04:07:51 - INFO - codeparrot_training - Step 8777: {'lr': 0.00047580808245239526, 'samples': 1685376, 'steps': 8777, 'loss/train': 0.571934849023819} 01/27/2022 04:07:54 - INFO - codeparrot_training - Step 8778: {'lr': 0.0004758010599871145, 'samples': 1685568, 'steps': 8778, 'loss/train': 0.8003230690956116} 01/27/2022 04:07:57 - INFO - codeparrot_training - Step 8779: {'lr': 0.0004757940365545736, 'samples': 1685760, 'steps': 8779, 'loss/train': 0.7831953763961792} 01/27/2022 04:08:00 - INFO - codeparrot_training - Step 8780: {'lr': 0.0004757870121548028, 'samples': 1685952, 'steps': 8780, 'loss/train': 1.1076604127883911} 01/27/2022 04:08:03 - INFO - codeparrot_training - Step 8781: {'lr': 0.00047577998678783207, 'samples': 1686144, 'steps': 8781, 
'loss/train': 0.607263520359993} 01/27/2022 04:08:08 - INFO - codeparrot_training - Step 8782: {'lr': 0.0004757729604536917, 'samples': 1686336, 'steps': 8782, 'loss/train': 0.3688431456685066} 01/27/2022 04:08:11 - INFO - codeparrot_training - Step 8783: {'lr': 0.0004757659331524115, 'samples': 1686528, 'steps': 8783, 'loss/train': 0.9373420178890228} 01/27/2022 04:08:14 - INFO - codeparrot_training - Step 8784: {'lr': 0.00047575890488402183, 'samples': 1686720, 'steps': 8784, 'loss/train': 0.866534560918808} 01/27/2022 04:08:17 - INFO - codeparrot_training - Step 8785: {'lr': 0.00047575187564855264, 'samples': 1686912, 'steps': 8785, 'loss/train': 1.2322485744953156} 01/27/2022 04:08:20 - INFO - codeparrot_training - Step 8786: {'lr': 0.00047574484544603415, 'samples': 1687104, 'steps': 8786, 'loss/train': 0.6057962626218796} 01/27/2022 04:08:23 - INFO - codeparrot_training - Step 8787: {'lr': 0.00047573781427649644, 'samples': 1687296, 'steps': 8787, 'loss/train': 1.2746661007404327} 01/27/2022 04:08:27 - INFO - codeparrot_training - Step 8788: {'lr': 0.00047573078213996954, 'samples': 1687488, 'steps': 8788, 'loss/train': 0.7298620641231537} 01/27/2022 04:08:30 - INFO - codeparrot_training - Step 8789: {'lr': 0.0004757237490364836, 'samples': 1687680, 'steps': 8789, 'loss/train': 0.5990496128797531} 01/27/2022 04:08:35 - INFO - codeparrot_training - Step 8790: {'lr': 0.00047571671496606893, 'samples': 1687872, 'steps': 8790, 'loss/train': 0.6304442882537842} 01/27/2022 04:08:38 - INFO - codeparrot_training - Step 8791: {'lr': 0.0004757096799287555, 'samples': 1688064, 'steps': 8791, 'loss/train': 1.1721863150596619} 01/27/2022 04:08:41 - INFO - codeparrot_training - Step 8792: {'lr': 0.0004757026439245735, 'samples': 1688256, 'steps': 8792, 'loss/train': 0.8922132253646851} 01/27/2022 04:08:44 - INFO - codeparrot_training - Step 8793: {'lr': 0.00047569560695355295, 'samples': 1688448, 'steps': 8793, 'loss/train': 5.523840665817261} 01/27/2022 04:08:47 - INFO - codeparrot_training - Step 8794: {'lr': 0.0004756885690157241, 'samples': 1688640, 'steps': 8794, 'loss/train': 1.2110011875629425} 01/27/2022 04:08:51 - INFO - codeparrot_training - Step 8795: {'lr': 0.00047568153011111715, 'samples': 1688832, 'steps': 8795, 'loss/train': 0.7985630929470062} 01/27/2022 04:08:54 - INFO - codeparrot_training - Step 8796: {'lr': 0.00047567449023976213, 'samples': 1689024, 'steps': 8796, 'loss/train': 0.796380490064621} 01/27/2022 04:08:57 - INFO - codeparrot_training - Step 8797: {'lr': 0.00047566744940168924, 'samples': 1689216, 'steps': 8797, 'loss/train': 0.7767529785633087} 01/27/2022 04:09:00 - INFO - codeparrot_training - Step 8798: {'lr': 0.0004756604075969287, 'samples': 1689408, 'steps': 8798, 'loss/train': 0.7283542603254318} 01/27/2022 04:09:04 - INFO - codeparrot_training - Step 8799: {'lr': 0.0004756533648255106, 'samples': 1689600, 'steps': 8799, 'loss/train': 3.5143407583236694} 01/27/2022 04:09:08 - INFO - codeparrot_training - Step 8800: {'lr': 0.0004756463210874652, 'samples': 1689792, 'steps': 8800, 'loss/train': 0.7934415936470032} 01/27/2022 04:09:11 - INFO - codeparrot_training - Step 8801: {'lr': 0.0004756392763828226, 'samples': 1689984, 'steps': 8801, 'loss/train': 0.9985527992248535} 01/27/2022 04:09:14 - INFO - codeparrot_training - Step 8802: {'lr': 0.0004756322307116129, 'samples': 1690176, 'steps': 8802, 'loss/train': 0.5835848897695541} 01/27/2022 04:09:17 - INFO - codeparrot_training - Step 8803: {'lr': 0.0004756251840738664, 'samples': 1690368, 'steps': 8803, 
'loss/train': 0.8887975215911865} 01/27/2022 04:09:20 - INFO - codeparrot_training - Step 8804: {'lr': 0.00047561813646961325, 'samples': 1690560, 'steps': 8804, 'loss/train': 0.9323580265045166} 01/27/2022 04:09:23 - INFO - codeparrot_training - Step 8805: {'lr': 0.00047561108789888367, 'samples': 1690752, 'steps': 8805, 'loss/train': 0.7101875245571136} 01/27/2022 04:09:26 - INFO - codeparrot_training - Step 8806: {'lr': 0.0004756040383617078, 'samples': 1690944, 'steps': 8806, 'loss/train': 0.6361683905124664} 01/27/2022 04:09:30 - INFO - codeparrot_training - Step 8807: {'lr': 0.00047559698785811595, 'samples': 1691136, 'steps': 8807, 'loss/train': 0.841228723526001} 01/27/2022 04:09:34 - INFO - codeparrot_training - Step 8808: {'lr': 0.0004755899363881382, 'samples': 1691328, 'steps': 8808, 'loss/train': 0.6701041907072067} 01/27/2022 04:09:37 - INFO - codeparrot_training - Step 8809: {'lr': 0.00047558288395180477, 'samples': 1691520, 'steps': 8809, 'loss/train': 0.7889197468757629} 01/27/2022 04:09:40 - INFO - codeparrot_training - Step 8810: {'lr': 0.0004755758305491459, 'samples': 1691712, 'steps': 8810, 'loss/train': 0.3033228814601898} 01/27/2022 04:09:43 - INFO - codeparrot_training - Step 8811: {'lr': 0.0004755687761801918, 'samples': 1691904, 'steps': 8811, 'loss/train': 0.8233927488327026} 01/27/2022 04:09:47 - INFO - codeparrot_training - Step 8812: {'lr': 0.00047556172084497274, 'samples': 1692096, 'steps': 8812, 'loss/train': 1.2554318010807037} 01/27/2022 04:09:50 - INFO - codeparrot_training - Step 8813: {'lr': 0.0004755546645435188, 'samples': 1692288, 'steps': 8813, 'loss/train': 0.8147866129875183} 01/27/2022 04:09:53 - INFO - codeparrot_training - Step 8814: {'lr': 0.0004755476072758604, 'samples': 1692480, 'steps': 8814, 'loss/train': 0.6268217861652374} 01/27/2022 04:09:56 - INFO - codeparrot_training - Step 8815: {'lr': 0.0004755405490420276, 'samples': 1692672, 'steps': 8815, 'loss/train': 1.248115450143814} 01/27/2022 04:09:59 - INFO - codeparrot_training - Step 8816: {'lr': 0.0004755334898420507, 'samples': 1692864, 'steps': 8816, 'loss/train': 0.9695945978164673} 01/27/2022 04:10:04 - INFO - codeparrot_training - Step 8817: {'lr': 0.00047552642967596, 'samples': 1693056, 'steps': 8817, 'loss/train': 0.9068289399147034} 01/27/2022 04:10:08 - INFO - codeparrot_training - Step 8818: {'lr': 0.00047551936854378564, 'samples': 1693248, 'steps': 8818, 'loss/train': 1.0695126950740814} 01/27/2022 04:10:11 - INFO - codeparrot_training - Step 8819: {'lr': 0.00047551230644555793, 'samples': 1693440, 'steps': 8819, 'loss/train': 1.0577378869056702} 01/27/2022 04:10:14 - INFO - codeparrot_training - Step 8820: {'lr': 0.00047550524338130706, 'samples': 1693632, 'steps': 8820, 'loss/train': 1.5236742496490479} 01/27/2022 04:10:17 - INFO - codeparrot_training - Step 8821: {'lr': 0.00047549817935106344, 'samples': 1693824, 'steps': 8821, 'loss/train': 1.0779156982898712} 01/27/2022 04:10:20 - INFO - codeparrot_training - Step 8822: {'lr': 0.00047549111435485716, 'samples': 1694016, 'steps': 8822, 'loss/train': 1.2874997556209564} 01/27/2022 04:10:23 - INFO - codeparrot_training - Step 8823: {'lr': 0.0004754840483927185, 'samples': 1694208, 'steps': 8823, 'loss/train': 0.7026313841342926} 01/27/2022 04:10:26 - INFO - codeparrot_training - Step 8824: {'lr': 0.0004754769814646779, 'samples': 1694400, 'steps': 8824, 'loss/train': 0.7971096038818359} 01/27/2022 04:10:30 - INFO - codeparrot_training - Step 8825: {'lr': 0.00047546991357076544, 'samples': 1694592, 'steps': 8825, 
'loss/train': 0.09582379460334778} 01/27/2022 04:10:34 - INFO - codeparrot_training - Step 8826: {'lr': 0.00047546284471101143, 'samples': 1694784, 'steps': 8826, 'loss/train': 0.983845978975296} 01/27/2022 04:10:37 - INFO - codeparrot_training - Step 8827: {'lr': 0.00047545577488544623, 'samples': 1694976, 'steps': 8827, 'loss/train': 0.5936714261770248} 01/27/2022 04:10:40 - INFO - codeparrot_training - Step 8828: {'lr': 0.0004754487040941001, 'samples': 1695168, 'steps': 8828, 'loss/train': 0.8077854216098785} 01/27/2022 04:10:43 - INFO - codeparrot_training - Step 8829: {'lr': 0.00047544163233700324, 'samples': 1695360, 'steps': 8829, 'loss/train': 0.9028644561767578} 01/27/2022 04:10:47 - INFO - codeparrot_training - Step 8830: {'lr': 0.00047543455961418605, 'samples': 1695552, 'steps': 8830, 'loss/train': 0.8716159164905548} 01/27/2022 04:10:50 - INFO - codeparrot_training - Step 8831: {'lr': 0.0004754274859256788, 'samples': 1695744, 'steps': 8831, 'loss/train': 0.05304509215056896} 01/27/2022 04:10:53 - INFO - codeparrot_training - Step 8832: {'lr': 0.0004754204112715118, 'samples': 1695936, 'steps': 8832, 'loss/train': 0.8216550350189209} 01/27/2022 04:10:56 - INFO - codeparrot_training - Step 8833: {'lr': 0.0004754133356517153, 'samples': 1696128, 'steps': 8833, 'loss/train': 0.23043816536664963} 01/27/2022 04:11:00 - INFO - codeparrot_training - Step 8834: {'lr': 0.0004754062590663196, 'samples': 1696320, 'steps': 8834, 'loss/train': 0.4823760688304901} 01/27/2022 04:11:04 - INFO - codeparrot_training - Step 8835: {'lr': 0.00047539918151535515, 'samples': 1696512, 'steps': 8835, 'loss/train': 0.8504853844642639} 01/27/2022 04:11:07 - INFO - codeparrot_training - Step 8836: {'lr': 0.00047539210299885217, 'samples': 1696704, 'steps': 8836, 'loss/train': 1.0764182209968567} 01/27/2022 04:11:10 - INFO - codeparrot_training - Step 8837: {'lr': 0.00047538502351684097, 'samples': 1696896, 'steps': 8837, 'loss/train': 0.163071870803833} 01/27/2022 04:11:13 - INFO - codeparrot_training - Step 8838: {'lr': 0.0004753779430693519, 'samples': 1697088, 'steps': 8838, 'loss/train': 0.7061524093151093} 01/27/2022 04:11:16 - INFO - codeparrot_training - Step 8839: {'lr': 0.0004753708616564153, 'samples': 1697280, 'steps': 8839, 'loss/train': 0.650465190410614} 01/27/2022 04:11:19 - INFO - codeparrot_training - Step 8840: {'lr': 0.00047536377927806143, 'samples': 1697472, 'steps': 8840, 'loss/train': 0.8667261600494385} 01/27/2022 04:11:22 - INFO - codeparrot_training - Step 8841: {'lr': 0.0004753566959343207, 'samples': 1697664, 'steps': 8841, 'loss/train': 0.5812327265739441} 01/27/2022 04:11:26 - INFO - codeparrot_training - Step 8842: {'lr': 0.0004753496116252235, 'samples': 1697856, 'steps': 8842, 'loss/train': 0.635846883058548} 01/27/2022 04:11:31 - INFO - codeparrot_training - Step 8843: {'lr': 0.0004753425263508001, 'samples': 1698048, 'steps': 8843, 'loss/train': 1.0184404850006104} 01/27/2022 04:11:34 - INFO - codeparrot_training - Step 8844: {'lr': 0.0004753354401110809, 'samples': 1698240, 'steps': 8844, 'loss/train': 0.8015635907649994} 01/27/2022 04:11:38 - INFO - codeparrot_training - Step 8845: {'lr': 0.00047532835290609623, 'samples': 1698432, 'steps': 8845, 'loss/train': 0.1714179553091526} 01/27/2022 04:11:41 - INFO - codeparrot_training - Step 8846: {'lr': 0.00047532126473587635, 'samples': 1698624, 'steps': 8846, 'loss/train': 1.5511096715927124} 01/27/2022 04:11:44 - INFO - codeparrot_training - Step 8847: {'lr': 0.0004753141756004518, 'samples': 1698816, 'steps': 8847, 
'loss/train': 0.7812064290046692} 01/27/2022 04:11:47 - INFO - codeparrot_training - Step 8848: {'lr': 0.00047530708549985287, 'samples': 1699008, 'steps': 8848, 'loss/train': 1.0071638524532318} 01/27/2022 04:11:50 - INFO - codeparrot_training - Step 8849: {'lr': 0.00047529999443410986, 'samples': 1699200, 'steps': 8849, 'loss/train': 1.6650472283363342} 01/27/2022 04:11:53 - INFO - codeparrot_training - Step 8850: {'lr': 0.0004752929024032533, 'samples': 1699392, 'steps': 8850, 'loss/train': 0.7899336218833923} 01/27/2022 04:11:56 - INFO - codeparrot_training - Step 8851: {'lr': 0.0004752858094073134, 'samples': 1699584, 'steps': 8851, 'loss/train': 0.6839771568775177} 01/27/2022 04:12:01 - INFO - codeparrot_training - Step 8852: {'lr': 0.0004752787154463207, 'samples': 1699776, 'steps': 8852, 'loss/train': 0.5981846451759338} 01/27/2022 04:12:04 - INFO - codeparrot_training - Step 8853: {'lr': 0.0004752716205203055, 'samples': 1699968, 'steps': 8853, 'loss/train': 0.26683077961206436} 01/27/2022 04:12:07 - INFO - codeparrot_training - Step 8854: {'lr': 0.0004752645246292982, 'samples': 1700160, 'steps': 8854, 'loss/train': 1.2917571365833282} 01/27/2022 04:12:10 - INFO - codeparrot_training - Step 8855: {'lr': 0.0004752574277733292, 'samples': 1700352, 'steps': 8855, 'loss/train': 1.3704933822154999} 01/27/2022 04:12:14 - INFO - codeparrot_training - Step 8856: {'lr': 0.0004752503299524289, 'samples': 1700544, 'steps': 8856, 'loss/train': 0.7280902862548828} 01/27/2022 04:12:17 - INFO - codeparrot_training - Step 8857: {'lr': 0.0004752432311666277, 'samples': 1700736, 'steps': 8857, 'loss/train': 0.5573621392250061} 01/27/2022 04:12:20 - INFO - codeparrot_training - Step 8858: {'lr': 0.0004752361314159561, 'samples': 1700928, 'steps': 8858, 'loss/train': 0.9975146949291229} 01/27/2022 04:12:23 - INFO - codeparrot_training - Step 8859: {'lr': 0.0004752290307004444, 'samples': 1701120, 'steps': 8859, 'loss/train': 0.9830653667449951} 01/27/2022 04:12:26 - INFO - codeparrot_training - Step 8860: {'lr': 0.000475221929020123, 'samples': 1701312, 'steps': 8860, 'loss/train': 0.9551282823085785} 01/27/2022 04:12:31 - INFO - codeparrot_training - Step 8861: {'lr': 0.00047521482637502246, 'samples': 1701504, 'steps': 8861, 'loss/train': 0.621444895863533} 01/27/2022 04:12:34 - INFO - codeparrot_training - Step 8862: {'lr': 0.00047520772276517297, 'samples': 1701696, 'steps': 8862, 'loss/train': 1.029265433549881} 01/27/2022 04:12:37 - INFO - codeparrot_training - Step 8863: {'lr': 0.0004752006181906052, 'samples': 1701888, 'steps': 8863, 'loss/train': 1.1435598134994507} 01/27/2022 04:12:40 - INFO - codeparrot_training - Step 8864: {'lr': 0.00047519351265134954, 'samples': 1702080, 'steps': 8864, 'loss/train': 0.7313255220651627} 01/27/2022 04:12:43 - INFO - codeparrot_training - Step 8865: {'lr': 0.0004751864061474364, 'samples': 1702272, 'steps': 8865, 'loss/train': 0.873029887676239} 01/27/2022 04:12:46 - INFO - codeparrot_training - Step 8866: {'lr': 0.000475179298678896, 'samples': 1702464, 'steps': 8866, 'loss/train': 0.6269553601741791} 01/27/2022 04:12:49 - INFO - codeparrot_training - Step 8867: {'lr': 0.0004751721902457592, 'samples': 1702656, 'steps': 8867, 'loss/train': 0.8876466751098633} 01/27/2022 04:12:53 - INFO - codeparrot_training - Step 8868: {'lr': 0.0004751650808480561, 'samples': 1702848, 'steps': 8868, 'loss/train': 1.3480846881866455} 01/27/2022 04:12:56 - INFO - codeparrot_training - Step 8869: {'lr': 0.00047515797048581734, 'samples': 1703040, 'steps': 8869, 
'loss/train': 0.6085661351680756} 01/27/2022 04:13:01 - INFO - codeparrot_training - Step 8870: {'lr': 0.00047515085915907334, 'samples': 1703232, 'steps': 8870, 'loss/train': 0.6623333841562271} 01/27/2022 04:13:04 - INFO - codeparrot_training - Step 8871: {'lr': 0.00047514374686785454, 'samples': 1703424, 'steps': 8871, 'loss/train': 1.0068029165267944} 01/27/2022 04:13:07 - INFO - codeparrot_training - Step 8872: {'lr': 0.00047513663361219144, 'samples': 1703616, 'steps': 8872, 'loss/train': 0.43117229640483856} 01/27/2022 04:13:10 - INFO - codeparrot_training - Step 8873: {'lr': 0.00047512951939211447, 'samples': 1703808, 'steps': 8873, 'loss/train': 0.7974176108837128} 01/27/2022 04:13:13 - INFO - codeparrot_training - Step 8874: {'lr': 0.0004751224042076542, 'samples': 1704000, 'steps': 8874, 'loss/train': 1.044609546661377} 01/27/2022 04:13:17 - INFO - codeparrot_training - Step 8875: {'lr': 0.0004751152880588409, 'samples': 1704192, 'steps': 8875, 'loss/train': 0.3257836773991585} 01/27/2022 04:13:20 - INFO - codeparrot_training - Step 8876: {'lr': 0.00047510817094570526, 'samples': 1704384, 'steps': 8876, 'loss/train': 0.7043850123882294} 01/27/2022 04:13:23 - INFO - codeparrot_training - Step 8877: {'lr': 0.0004751010528682777, 'samples': 1704576, 'steps': 8877, 'loss/train': 1.4922800660133362} 01/27/2022 04:13:27 - INFO - codeparrot_training - Step 8878: {'lr': 0.0004750939338265887, 'samples': 1704768, 'steps': 8878, 'loss/train': 0.657108798623085} 01/27/2022 04:13:30 - INFO - codeparrot_training - Step 8879: {'lr': 0.0004750868138206688, 'samples': 1704960, 'steps': 8879, 'loss/train': 0.8913991749286652} 01/27/2022 04:13:34 - INFO - codeparrot_training - Step 8880: {'lr': 0.0004750796928505484, 'samples': 1705152, 'steps': 8880, 'loss/train': 1.0711599290370941} 01/27/2022 04:13:37 - INFO - codeparrot_training - Step 8881: {'lr': 0.0004750725709162581, 'samples': 1705344, 'steps': 8881, 'loss/train': 0.18771381676197052} 01/27/2022 04:13:40 - INFO - codeparrot_training - Step 8882: {'lr': 0.00047506544801782834, 'samples': 1705536, 'steps': 8882, 'loss/train': 1.1394799053668976} 01/27/2022 04:13:43 - INFO - codeparrot_training - Step 8883: {'lr': 0.00047505832415528973, 'samples': 1705728, 'steps': 8883, 'loss/train': 0.6290287524461746} 01/27/2022 04:13:46 - INFO - codeparrot_training - Step 8884: {'lr': 0.0004750511993286727, 'samples': 1705920, 'steps': 8884, 'loss/train': 1.3272260427474976} 01/27/2022 04:13:49 - INFO - codeparrot_training - Step 8885: {'lr': 0.0004750440735380077, 'samples': 1706112, 'steps': 8885, 'loss/train': 0.6312117129564285} 01/27/2022 04:13:52 - INFO - codeparrot_training - Step 8886: {'lr': 0.00047503694678332543, 'samples': 1706304, 'steps': 8886, 'loss/train': 1.3584345281124115} 01/27/2022 04:13:57 - INFO - codeparrot_training - Step 8887: {'lr': 0.00047502981906465634, 'samples': 1706496, 'steps': 8887, 'loss/train': 0.3282416760921478} 01/27/2022 04:14:00 - INFO - codeparrot_training - Step 8888: {'lr': 0.000475022690382031, 'samples': 1706688, 'steps': 8888, 'loss/train': 0.6865123808383942} 01/27/2022 04:14:03 - INFO - codeparrot_training - Step 8889: {'lr': 0.0004750155607354799, 'samples': 1706880, 'steps': 8889, 'loss/train': 0.7360482215881348} 01/27/2022 04:14:06 - INFO - codeparrot_training - Step 8890: {'lr': 0.0004750084301250335, 'samples': 1707072, 'steps': 8890, 'loss/train': 0.8129855990409851} 01/27/2022 04:14:09 - INFO - codeparrot_training - Step 8891: {'lr': 0.0004750012985507225, 'samples': 1707264, 'steps': 8891, 
'loss/train': 1.1230358183383942} 01/27/2022 04:14:13 - INFO - codeparrot_training - Step 8892: {'lr': 0.0004749941660125774, 'samples': 1707456, 'steps': 8892, 'loss/train': 0.536829337477684} 01/27/2022 04:14:16 - INFO - codeparrot_training - Step 8893: {'lr': 0.0004749870325106287, 'samples': 1707648, 'steps': 8893, 'loss/train': 0.9235728085041046} 01/27/2022 04:14:19 - INFO - codeparrot_training - Step 8894: {'lr': 0.00047497989804490693, 'samples': 1707840, 'steps': 8894, 'loss/train': 0.8652359247207642} 01/27/2022 04:14:22 - INFO - codeparrot_training - Step 8895: {'lr': 0.0004749727626154428, 'samples': 1708032, 'steps': 8895, 'loss/train': 1.4117088317871094} 01/27/2022 04:14:27 - INFO - codeparrot_training - Step 8896: {'lr': 0.0004749656262222668, 'samples': 1708224, 'steps': 8896, 'loss/train': 0.3472478538751602} 01/27/2022 04:14:30 - INFO - codeparrot_training - Step 8897: {'lr': 0.0004749584888654095, 'samples': 1708416, 'steps': 8897, 'loss/train': 0.8790525197982788} 01/27/2022 04:14:33 - INFO - codeparrot_training - Step 8898: {'lr': 0.0004749513505449014, 'samples': 1708608, 'steps': 8898, 'loss/train': 0.6645831316709518} 01/27/2022 04:14:37 - INFO - codeparrot_training - Step 8899: {'lr': 0.00047494421126077313, 'samples': 1708800, 'steps': 8899, 'loss/train': 0.8659674525260925} 01/27/2022 04:14:40 - INFO - codeparrot_training - Step 8900: {'lr': 0.0004749370710130554, 'samples': 1708992, 'steps': 8900, 'loss/train': 0.717073604464531} 01/27/2022 04:14:43 - INFO - codeparrot_training - Step 8901: {'lr': 0.0004749299298017786, 'samples': 1709184, 'steps': 8901, 'loss/train': 1.0830568671226501} 01/27/2022 04:14:46 - INFO - codeparrot_training - Step 8902: {'lr': 0.00047492278762697337, 'samples': 1709376, 'steps': 8902, 'loss/train': 1.0187194347381592} 01/27/2022 04:14:49 - INFO - codeparrot_training - Step 8903: {'lr': 0.0004749156444886704, 'samples': 1709568, 'steps': 8903, 'loss/train': 1.0385811924934387} 01/27/2022 04:14:52 - INFO - codeparrot_training - Step 8904: {'lr': 0.0004749085003869003, 'samples': 1709760, 'steps': 8904, 'loss/train': 0.8249586224555969} 01/27/2022 04:14:57 - INFO - codeparrot_training - Step 8905: {'lr': 0.00047490135532169347, 'samples': 1709952, 'steps': 8905, 'loss/train': 1.3577050566673279} 01/27/2022 04:15:00 - INFO - codeparrot_training - Step 8906: {'lr': 0.0004748942092930807, 'samples': 1710144, 'steps': 8906, 'loss/train': 0.9853910207748413} 01/27/2022 04:15:03 - INFO - codeparrot_training - Step 8907: {'lr': 0.00047488706230109257, 'samples': 1710336, 'steps': 8907, 'loss/train': 0.4901323914527893} 01/27/2022 04:15:06 - INFO - codeparrot_training - Step 8908: {'lr': 0.00047487991434575963, 'samples': 1710528, 'steps': 8908, 'loss/train': 0.8499287366867065} 01/27/2022 04:15:10 - INFO - codeparrot_training - Step 8909: {'lr': 0.0004748727654271126, 'samples': 1710720, 'steps': 8909, 'loss/train': 0.6546403169631958} 01/27/2022 04:15:13 - INFO - codeparrot_training - Step 8910: {'lr': 0.000474865615545182, 'samples': 1710912, 'steps': 8910, 'loss/train': 0.6291689425706863} 01/27/2022 04:15:16 - INFO - codeparrot_training - Step 8911: {'lr': 0.0004748584646999985, 'samples': 1711104, 'steps': 8911, 'loss/train': 0.8038555383682251} 01/27/2022 04:15:19 - INFO - codeparrot_training - Step 8912: {'lr': 0.0004748513128915928, 'samples': 1711296, 'steps': 8912, 'loss/train': 0.5740706920623779} 01/27/2022 04:15:22 - INFO - codeparrot_training - Step 8913: {'lr': 0.0004748441601199954, 'samples': 1711488, 'steps': 8913, 
'loss/train': 0.5146073698997498} 01/27/2022 04:15:26 - INFO - codeparrot_training - Step 8914: {'lr': 0.0004748370063852371, 'samples': 1711680, 'steps': 8914, 'loss/train': 1.017938107252121} 01/27/2022 04:15:30 - INFO - codeparrot_training - Step 8915: {'lr': 0.0004748298516873484, 'samples': 1711872, 'steps': 8915, 'loss/train': 0.8168390393257141} 01/27/2022 04:15:33 - INFO - codeparrot_training - Step 8916: {'lr': 0.00047482269602636, 'samples': 1712064, 'steps': 8916, 'loss/train': 0.7283893525600433} 01/27/2022 04:15:36 - INFO - codeparrot_training - Step 8917: {'lr': 0.00047481553940230257, 'samples': 1712256, 'steps': 8917, 'loss/train': 0.637556791305542} 01/27/2022 04:15:39 - INFO - codeparrot_training - Step 8918: {'lr': 0.0004748083818152067, 'samples': 1712448, 'steps': 8918, 'loss/train': 0.46593135595321655} 01/27/2022 04:15:42 - INFO - codeparrot_training - Step 8919: {'lr': 0.00047480122326510325, 'samples': 1712640, 'steps': 8919, 'loss/train': 0.7900237441062927} 01/27/2022 04:15:45 - INFO - codeparrot_training - Step 8920: {'lr': 0.0004747940637520226, 'samples': 1712832, 'steps': 8920, 'loss/train': 0.9554906487464905} 01/27/2022 04:15:48 - INFO - codeparrot_training - Step 8921: {'lr': 0.0004747869032759956, 'samples': 1713024, 'steps': 8921, 'loss/train': 0.38433319330215454} 01/27/2022 04:15:52 - INFO - codeparrot_training - Step 8922: {'lr': 0.00047477974183705293, 'samples': 1713216, 'steps': 8922, 'loss/train': 0.5641205012798309} 01/27/2022 04:15:57 - INFO - codeparrot_training - Step 8923: {'lr': 0.0004747725794352252, 'samples': 1713408, 'steps': 8923, 'loss/train': 0.8634643256664276} 01/27/2022 04:16:00 - INFO - codeparrot_training - Step 8924: {'lr': 0.00047476541607054313, 'samples': 1713600, 'steps': 8924, 'loss/train': 0.6325281858444214} 01/27/2022 04:16:03 - INFO - codeparrot_training - Step 8925: {'lr': 0.0004747582517430373, 'samples': 1713792, 'steps': 8925, 'loss/train': 1.2084268033504486} 01/27/2022 04:16:06 - INFO - codeparrot_training - Step 8926: {'lr': 0.00047475108645273856, 'samples': 1713984, 'steps': 8926, 'loss/train': 0.8978492617607117} 01/27/2022 04:16:09 - INFO - codeparrot_training - Step 8927: {'lr': 0.00047474392019967754, 'samples': 1714176, 'steps': 8927, 'loss/train': 0.886025458574295} 01/27/2022 04:16:12 - INFO - codeparrot_training - Step 8928: {'lr': 0.0004747367529838849, 'samples': 1714368, 'steps': 8928, 'loss/train': 0.8021923005580902} 01/27/2022 04:16:15 - INFO - codeparrot_training - Step 8929: {'lr': 0.0004747295848053914, 'samples': 1714560, 'steps': 8929, 'loss/train': 0.4324270784854889} 01/27/2022 04:16:18 - INFO - codeparrot_training - Step 8930: {'lr': 0.0004747224156642277, 'samples': 1714752, 'steps': 8930, 'loss/train': 0.5716483891010284} 01/27/2022 04:16:22 - INFO - codeparrot_training - Step 8931: {'lr': 0.00047471524556042454, 'samples': 1714944, 'steps': 8931, 'loss/train': 0.09258739091455936} 01/27/2022 04:16:26 - INFO - codeparrot_training - Step 8932: {'lr': 0.00047470807449401264, 'samples': 1715136, 'steps': 8932, 'loss/train': 0.814824789762497} 01/27/2022 04:16:29 - INFO - codeparrot_training - Step 8933: {'lr': 0.0004747009024650227, 'samples': 1715328, 'steps': 8933, 'loss/train': 1.0435634851455688} 01/27/2022 04:16:32 - INFO - codeparrot_training - Step 8934: {'lr': 0.00047469372947348546, 'samples': 1715520, 'steps': 8934, 'loss/train': 0.7506498992443085} 01/27/2022 04:16:36 - INFO - codeparrot_training - Step 8935: {'lr': 0.0004746865555194315, 'samples': 1715712, 'steps': 8935, 
'loss/train': 1.0570406913757324} 01/27/2022 04:16:39 - INFO - codeparrot_training - Step 8936: {'lr': 0.00047467938060289185, 'samples': 1715904, 'steps': 8936, 'loss/train': 0.6992247551679611} 01/27/2022 04:16:42 - INFO - codeparrot_training - Step 8937: {'lr': 0.00047467220472389694, 'samples': 1716096, 'steps': 8937, 'loss/train': 0.46257269382476807} 01/27/2022 04:16:45 - INFO - codeparrot_training - Step 8938: {'lr': 0.0004746650278824777, 'samples': 1716288, 'steps': 8938, 'loss/train': 1.2013504207134247} 01/27/2022 04:16:48 - INFO - codeparrot_training - Step 8939: {'lr': 0.00047465785007866487, 'samples': 1716480, 'steps': 8939, 'loss/train': 0.6719390451908112} 01/27/2022 04:16:51 - INFO - codeparrot_training - Step 8940: {'lr': 0.00047465067131248907, 'samples': 1716672, 'steps': 8940, 'loss/train': 0.6768035888671875} 01/27/2022 04:16:56 - INFO - codeparrot_training - Step 8941: {'lr': 0.0004746434915839812, 'samples': 1716864, 'steps': 8941, 'loss/train': 0.5392390787601471} 01/27/2022 04:16:59 - INFO - codeparrot_training - Step 8942: {'lr': 0.00047463631089317195, 'samples': 1717056, 'steps': 8942, 'loss/train': 1.0396932363510132} 01/27/2022 04:17:02 - INFO - codeparrot_training - Step 8943: {'lr': 0.000474629129240092, 'samples': 1717248, 'steps': 8943, 'loss/train': 1.1123823523521423} 01/27/2022 04:17:05 - INFO - codeparrot_training - Step 8944: {'lr': 0.0004746219466247722, 'samples': 1717440, 'steps': 8944, 'loss/train': 0.4243011921644211} 01/27/2022 04:17:08 - INFO - codeparrot_training - Step 8945: {'lr': 0.0004746147630472434, 'samples': 1717632, 'steps': 8945, 'loss/train': 0.9845975339412689} 01/27/2022 04:17:11 - INFO - codeparrot_training - Step 8946: {'lr': 0.00047460757850753614, 'samples': 1717824, 'steps': 8946, 'loss/train': 0.9688665568828583} 01/27/2022 04:17:15 - INFO - codeparrot_training - Step 8947: {'lr': 0.00047460039300568143, 'samples': 1718016, 'steps': 8947, 'loss/train': 1.279451698064804} 01/27/2022 04:17:18 - INFO - codeparrot_training - Step 8948: {'lr': 0.0004745932065417099, 'samples': 1718208, 'steps': 8948, 'loss/train': 0.8683800101280212} 01/27/2022 04:17:21 - INFO - codeparrot_training - Step 8949: {'lr': 0.00047458601911565246, 'samples': 1718400, 'steps': 8949, 'loss/train': 0.8722105622291565} 01/27/2022 04:17:25 - INFO - codeparrot_training - Step 8950: {'lr': 0.0004745788307275398, 'samples': 1718592, 'steps': 8950, 'loss/train': 0.24020299315452576} 01/27/2022 04:17:28 - INFO - codeparrot_training - Step 8951: {'lr': 0.0004745716413774027, 'samples': 1718784, 'steps': 8951, 'loss/train': 0.4956173300743103} 01/27/2022 04:17:32 - INFO - codeparrot_training - Step 8952: {'lr': 0.000474564451065272, 'samples': 1718976, 'steps': 8952, 'loss/train': 0.08861454203724861} 01/27/2022 04:17:35 - INFO - codeparrot_training - Step 8953: {'lr': 0.00047455725979117855, 'samples': 1719168, 'steps': 8953, 'loss/train': 1.0472849607467651} 01/27/2022 04:17:38 - INFO - codeparrot_training - Step 8954: {'lr': 0.00047455006755515306, 'samples': 1719360, 'steps': 8954, 'loss/train': 1.1201128363609314} 01/27/2022 04:17:41 - INFO - codeparrot_training - Step 8955: {'lr': 0.00047454287435722643, 'samples': 1719552, 'steps': 8955, 'loss/train': 1.222910463809967} 01/27/2022 04:17:44 - INFO - codeparrot_training - Step 8956: {'lr': 0.00047453568019742936, 'samples': 1719744, 'steps': 8956, 'loss/train': 0.7241966128349304} 01/27/2022 04:17:47 - INFO - codeparrot_training - Step 8957: {'lr': 0.0004745284850757928, 'samples': 1719936, 'steps': 8957, 
'loss/train': 0.7363390177488327} 01/27/2022 04:17:50 - INFO - codeparrot_training - Step 8958: {'lr': 0.00047452128899234746, 'samples': 1720128, 'steps': 8958, 'loss/train': 1.7090059518814087} 01/27/2022 04:17:56 - INFO - codeparrot_training - Step 8959: {'lr': 0.0004745140919471243, 'samples': 1720320, 'steps': 8959, 'loss/train': 0.9207563102245331} 01/27/2022 04:17:59 - INFO - codeparrot_training - Step 8960: {'lr': 0.0004745068939401539, 'samples': 1720512, 'steps': 8960, 'loss/train': 0.8140423893928528} 01/27/2022 04:18:02 - INFO - codeparrot_training - Step 8961: {'lr': 0.0004744996949714674, 'samples': 1720704, 'steps': 8961, 'loss/train': 0.615035355091095} 01/27/2022 04:18:05 - INFO - codeparrot_training - Step 8962: {'lr': 0.0004744924950410954, 'samples': 1720896, 'steps': 8962, 'loss/train': 0.4859382212162018} 01/27/2022 04:18:08 - INFO - codeparrot_training - Step 8963: {'lr': 0.0004744852941490689, 'samples': 1721088, 'steps': 8963, 'loss/train': 0.49122731387615204} 01/27/2022 04:18:11 - INFO - codeparrot_training - Step 8964: {'lr': 0.0004744780922954186, 'samples': 1721280, 'steps': 8964, 'loss/train': 0.3574376776814461} 01/27/2022 04:18:15 - INFO - codeparrot_training - Step 8965: {'lr': 0.00047447088948017555, 'samples': 1721472, 'steps': 8965, 'loss/train': 0.37440336495637894} 01/27/2022 04:18:18 - INFO - codeparrot_training - Step 8966: {'lr': 0.0004744636857033704, 'samples': 1721664, 'steps': 8966, 'loss/train': 0.8426318764686584} 01/27/2022 04:18:22 - INFO - codeparrot_training - Step 8967: {'lr': 0.00047445648096503413, 'samples': 1721856, 'steps': 8967, 'loss/train': 1.4218335449695587} 01/27/2022 04:18:25 - INFO - codeparrot_training - Step 8968: {'lr': 0.00047444927526519757, 'samples': 1722048, 'steps': 8968, 'loss/train': 0.661797970533371} 01/27/2022 04:18:28 - INFO - codeparrot_training - Step 8969: {'lr': 0.00047444206860389155, 'samples': 1722240, 'steps': 8969, 'loss/train': 0.7647450864315033} 01/27/2022 04:18:32 - INFO - codeparrot_training - Step 8970: {'lr': 0.00047443486098114703, 'samples': 1722432, 'steps': 8970, 'loss/train': 0.9103866219520569} 01/27/2022 04:18:35 - INFO - codeparrot_training - Step 8971: {'lr': 0.0004744276523969948, 'samples': 1722624, 'steps': 8971, 'loss/train': 0.9659748673439026} 01/27/2022 04:18:38 - INFO - codeparrot_training - Step 8972: {'lr': 0.0004744204428514658, 'samples': 1722816, 'steps': 8972, 'loss/train': 1.2843463718891144} 01/27/2022 04:18:41 - INFO - codeparrot_training - Step 8973: {'lr': 0.0004744132323445908, 'samples': 1723008, 'steps': 8973, 'loss/train': 1.0386129319667816} 01/27/2022 04:18:44 - INFO - codeparrot_training - Step 8974: {'lr': 0.00047440602087640084, 'samples': 1723200, 'steps': 8974, 'loss/train': 0.9624074399471283} 01/27/2022 04:18:47 - INFO - codeparrot_training - Step 8975: {'lr': 0.0004743988084469267, 'samples': 1723392, 'steps': 8975, 'loss/train': 0.7900190055370331} 01/27/2022 04:18:53 - INFO - codeparrot_training - Step 8976: {'lr': 0.00047439159505619936, 'samples': 1723584, 'steps': 8976, 'loss/train': 1.4047608375549316} 01/27/2022 04:18:56 - INFO - codeparrot_training - Step 8977: {'lr': 0.0004743843807042497, 'samples': 1723776, 'steps': 8977, 'loss/train': 0.8962242901325226} 01/27/2022 04:19:00 - INFO - codeparrot_training - Step 8978: {'lr': 0.0004743771653911086, 'samples': 1723968, 'steps': 8978, 'loss/train': 0.7002300024032593} 01/27/2022 04:19:03 - INFO - codeparrot_training - Step 8979: {'lr': 0.00047436994911680694, 'samples': 1724160, 'steps': 8979, 
'loss/train': 1.4298570156097412} 01/27/2022 04:19:06 - INFO - codeparrot_training - Step 8980: {'lr': 0.0004743627318813757, 'samples': 1724352, 'steps': 8980, 'loss/train': 0.7291596829891205} 01/27/2022 04:19:09 - INFO - codeparrot_training - Step 8981: {'lr': 0.00047435551368484567, 'samples': 1724544, 'steps': 8981, 'loss/train': 0.877103179693222} 01/27/2022 04:19:12 - INFO - codeparrot_training - Step 8982: {'lr': 0.00047434829452724795, 'samples': 1724736, 'steps': 8982, 'loss/train': 0.8548544347286224} 01/27/2022 04:19:15 - INFO - codeparrot_training - Step 8983: {'lr': 0.00047434107440861336, 'samples': 1724928, 'steps': 8983, 'loss/train': 0.6389434486627579} 01/27/2022 04:19:18 - INFO - codeparrot_training - Step 8984: {'lr': 0.0004743338533289728, 'samples': 1725120, 'steps': 8984, 'loss/train': 0.9521706104278564} 01/27/2022 04:19:23 - INFO - codeparrot_training - Step 8985: {'lr': 0.00047432663128835727, 'samples': 1725312, 'steps': 8985, 'loss/train': 1.0057012438774109} 01/27/2022 04:19:26 - INFO - codeparrot_training - Step 8986: {'lr': 0.0004743194082867977, 'samples': 1725504, 'steps': 8986, 'loss/train': 1.193362534046173} 01/27/2022 04:19:29 - INFO - codeparrot_training - Step 8987: {'lr': 0.000474312184324325, 'samples': 1725696, 'steps': 8987, 'loss/train': 1.1812311708927155} 01/27/2022 04:19:32 - INFO - codeparrot_training - Step 8988: {'lr': 0.0004743049594009701, 'samples': 1725888, 'steps': 8988, 'loss/train': 0.9071344435214996} 01/27/2022 04:19:35 - INFO - codeparrot_training - Step 8989: {'lr': 0.0004742977335167641, 'samples': 1726080, 'steps': 8989, 'loss/train': 1.1123055517673492} 01/27/2022 04:19:39 - INFO - codeparrot_training - Step 8990: {'lr': 0.0004742905066717377, 'samples': 1726272, 'steps': 8990, 'loss/train': 0.8789660632610321} 01/27/2022 04:19:42 - INFO - codeparrot_training - Step 8991: {'lr': 0.00047428327886592204, 'samples': 1726464, 'steps': 8991, 'loss/train': 0.7093481719493866} 01/27/2022 04:19:45 - INFO - codeparrot_training - Step 8992: {'lr': 0.00047427605009934805, 'samples': 1726656, 'steps': 8992, 'loss/train': 0.8829330503940582} 01/27/2022 04:19:48 - INFO - codeparrot_training - Step 8993: {'lr': 0.00047426882037204663, 'samples': 1726848, 'steps': 8993, 'loss/train': 0.48947829008102417} 01/27/2022 04:19:52 - INFO - codeparrot_training - Step 8994: {'lr': 0.0004742615896840488, 'samples': 1727040, 'steps': 8994, 'loss/train': 1.0914503931999207} 01/27/2022 04:19:56 - INFO - codeparrot_training - Step 8995: {'lr': 0.00047425435803538554, 'samples': 1727232, 'steps': 8995, 'loss/train': 0.440748855471611} 01/27/2022 04:19:59 - INFO - codeparrot_training - Step 8996: {'lr': 0.0004742471254260878, 'samples': 1727424, 'steps': 8996, 'loss/train': 1.2413172125816345} 01/27/2022 04:20:02 - INFO - codeparrot_training - Step 8997: {'lr': 0.00047423989185618666, 'samples': 1727616, 'steps': 8997, 'loss/train': 0.9914840161800385} 01/27/2022 04:20:05 - INFO - codeparrot_training - Step 8998: {'lr': 0.00047423265732571295, 'samples': 1727808, 'steps': 8998, 'loss/train': 0.7766639292240143} 01/27/2022 04:20:08 - INFO - codeparrot_training - Step 8999: {'lr': 0.00047422542183469775, 'samples': 1728000, 'steps': 8999, 'loss/train': 1.1778424680233002} 01/27/2022 04:20:11 - INFO - codeparrot_training - Step 9000: {'lr': 0.0004742181853831721, 'samples': 1728192, 'steps': 9000, 'loss/train': 0.7771744430065155} 01/27/2022 04:20:14 - INFO - codeparrot_training - Step 9001: {'lr': 0.00047421094797116687, 'samples': 1728384, 'steps': 9001, 
'loss/train': 0.9076082110404968} 01/27/2022 04:20:18 - INFO - codeparrot_training - Step 9002: {'lr': 0.00047420370959871315, 'samples': 1728576, 'steps': 9002, 'loss/train': 1.079535573720932} 01/27/2022 04:20:23 - INFO - codeparrot_training - Step 9003: {'lr': 0.000474196470265842, 'samples': 1728768, 'steps': 9003, 'loss/train': 1.1187499165534973} 01/27/2022 04:20:26 - INFO - codeparrot_training - Step 9004: {'lr': 0.0004741892299725843, 'samples': 1728960, 'steps': 9004, 'loss/train': 0.7346632182598114} 01/27/2022 04:20:29 - INFO - codeparrot_training - Step 9005: {'lr': 0.0004741819887189711, 'samples': 1729152, 'steps': 9005, 'loss/train': 0.5041556060314178} 01/27/2022 04:20:32 - INFO - codeparrot_training - Step 9006: {'lr': 0.00047417474650503347, 'samples': 1729344, 'steps': 9006, 'loss/train': 0.223540261387825} 01/27/2022 04:20:35 - INFO - codeparrot_training - Step 9007: {'lr': 0.00047416750333080244, 'samples': 1729536, 'steps': 9007, 'loss/train': 1.2867793142795563} 01/27/2022 04:20:39 - INFO - codeparrot_training - Step 9008: {'lr': 0.000474160259196309, 'samples': 1729728, 'steps': 9008, 'loss/train': 0.6953534781932831} 01/27/2022 04:20:42 - INFO - codeparrot_training - Step 9009: {'lr': 0.00047415301410158416, 'samples': 1729920, 'steps': 9009, 'loss/train': 1.0166221261024475} 01/27/2022 04:20:45 - INFO - codeparrot_training - Step 9010: {'lr': 0.00047414576804665897, 'samples': 1730112, 'steps': 9010, 'loss/train': 0.9811387360095978} 01/27/2022 04:20:48 - INFO - codeparrot_training - Step 9011: {'lr': 0.0004741385210315645, 'samples': 1730304, 'steps': 9011, 'loss/train': 1.2129747569561005} 01/27/2022 04:20:52 - INFO - codeparrot_training - Step 9012: {'lr': 0.0004741312730563318, 'samples': 1730496, 'steps': 9012, 'loss/train': 0.7552070617675781} 01/27/2022 04:20:55 - INFO - codeparrot_training - Step 9013: {'lr': 0.00047412402412099185, 'samples': 1730688, 'steps': 9013, 'loss/train': 1.452377200126648} 01/27/2022 04:20:59 - INFO - codeparrot_training - Step 9014: {'lr': 0.00047411677422557586, 'samples': 1730880, 'steps': 9014, 'loss/train': 0.8021737933158875} 01/27/2022 04:21:02 - INFO - codeparrot_training - Step 9015: {'lr': 0.0004741095233701147, 'samples': 1731072, 'steps': 9015, 'loss/train': 0.2929062098264694} 01/27/2022 04:21:05 - INFO - codeparrot_training - Step 9016: {'lr': 0.00047410227155463946, 'samples': 1731264, 'steps': 9016, 'loss/train': 0.5480101257562637} 01/27/2022 04:21:08 - INFO - codeparrot_training - Step 9017: {'lr': 0.00047409501877918134, 'samples': 1731456, 'steps': 9017, 'loss/train': 0.4384970963001251} 01/27/2022 04:21:11 - INFO - codeparrot_training - Step 9018: {'lr': 0.00047408776504377127, 'samples': 1731648, 'steps': 9018, 'loss/train': 1.1049569249153137} 01/27/2022 04:21:14 - INFO - codeparrot_training - Step 9019: {'lr': 0.00047408051034844036, 'samples': 1731840, 'steps': 9019, 'loss/train': 0.9303486049175262} 01/27/2022 04:21:17 - INFO - codeparrot_training - Step 9020: {'lr': 0.00047407325469321973, 'samples': 1732032, 'steps': 9020, 'loss/train': 0.5622637420892715} 01/27/2022 04:21:23 - INFO - codeparrot_training - Step 9021: {'lr': 0.00047406599807814034, 'samples': 1732224, 'steps': 9021, 'loss/train': 1.061391681432724} 01/27/2022 04:21:26 - INFO - codeparrot_training - Step 9022: {'lr': 0.00047405874050323346, 'samples': 1732416, 'steps': 9022, 'loss/train': 0.6084168255329132} 01/27/2022 04:21:29 - INFO - codeparrot_training - Step 9023: {'lr': 0.00047405148196853005, 'samples': 1732608, 'steps': 9023, 
'loss/train': 0.18123172223567963} 01/27/2022 04:21:32 - INFO - codeparrot_training - Step 9024: {'lr': 0.0004740442224740612, 'samples': 1732800, 'steps': 9024, 'loss/train': 0.7082088142633438} 01/27/2022 04:21:35 - INFO - codeparrot_training - Step 9025: {'lr': 0.00047403696201985814, 'samples': 1732992, 'steps': 9025, 'loss/train': 0.8428613841533661} 01/27/2022 04:21:38 - INFO - codeparrot_training - Step 9026: {'lr': 0.0004740297006059517, 'samples': 1733184, 'steps': 9026, 'loss/train': 1.2086678445339203} 01/27/2022 04:21:41 - INFO - codeparrot_training - Step 9027: {'lr': 0.00047402243823237335, 'samples': 1733376, 'steps': 9027, 'loss/train': 1.0209216177463531} 01/27/2022 04:21:45 - INFO - codeparrot_training - Step 9028: {'lr': 0.0004740151748991539, 'samples': 1733568, 'steps': 9028, 'loss/train': 0.8615425229072571} 01/27/2022 04:21:49 - INFO - codeparrot_training - Step 9029: {'lr': 0.00047400791060632464, 'samples': 1733760, 'steps': 9029, 'loss/train': 1.423444926738739} 01/27/2022 04:21:52 - INFO - codeparrot_training - Step 9030: {'lr': 0.0004740006453539166, 'samples': 1733952, 'steps': 9030, 'loss/train': 0.8078824281692505} 01/27/2022 04:21:55 - INFO - codeparrot_training - Step 9031: {'lr': 0.0004739933791419609, 'samples': 1734144, 'steps': 9031, 'loss/train': 0.9350261092185974} 01/27/2022 04:21:58 - INFO - codeparrot_training - Step 9032: {'lr': 0.0004739861119704887, 'samples': 1734336, 'steps': 9032, 'loss/train': 1.1261023879051208} 01/27/2022 04:22:02 - INFO - codeparrot_training - Step 9033: {'lr': 0.00047397884383953114, 'samples': 1734528, 'steps': 9033, 'loss/train': 0.32824987918138504} 01/27/2022 04:22:05 - INFO - codeparrot_training - Step 9034: {'lr': 0.0004739715747491193, 'samples': 1734720, 'steps': 9034, 'loss/train': 0.8976612389087677} 01/27/2022 04:22:08 - INFO - codeparrot_training - Step 9035: {'lr': 0.00047396430469928436, 'samples': 1734912, 'steps': 9035, 'loss/train': 0.6154105961322784} 01/27/2022 04:22:11 - INFO - codeparrot_training - Step 9036: {'lr': 0.0004739570336900575, 'samples': 1735104, 'steps': 9036, 'loss/train': 0.4190179109573364} 01/27/2022 04:22:14 - INFO - codeparrot_training - Step 9037: {'lr': 0.00047394976172146974, 'samples': 1735296, 'steps': 9037, 'loss/train': 0.8598197400569916} 01/27/2022 04:22:18 - INFO - codeparrot_training - Step 9038: {'lr': 0.0004739424887935524, 'samples': 1735488, 'steps': 9038, 'loss/train': 0.8379837870597839} 01/27/2022 04:22:22 - INFO - codeparrot_training - Step 9039: {'lr': 0.0004739352149063365, 'samples': 1735680, 'steps': 9039, 'loss/train': 0.9373243153095245} 01/27/2022 04:22:25 - INFO - codeparrot_training - Step 9040: {'lr': 0.0004739279400598532, 'samples': 1735872, 'steps': 9040, 'loss/train': 0.9337469637393951} 01/27/2022 04:22:28 - INFO - codeparrot_training - Step 9041: {'lr': 0.0004739206642541338, 'samples': 1736064, 'steps': 9041, 'loss/train': 0.803492546081543} 01/27/2022 04:22:31 - INFO - codeparrot_training - Step 9042: {'lr': 0.0004739133874892093, 'samples': 1736256, 'steps': 9042, 'loss/train': 1.2261309027671814} 01/27/2022 04:22:34 - INFO - codeparrot_training - Step 9043: {'lr': 0.0004739061097651111, 'samples': 1736448, 'steps': 9043, 'loss/train': 0.5603633970022202} 01/27/2022 04:22:37 - INFO - codeparrot_training - Step 9044: {'lr': 0.00047389883108187004, 'samples': 1736640, 'steps': 9044, 'loss/train': 1.0923816561698914} 01/27/2022 04:22:40 - INFO - codeparrot_training - Step 9045: {'lr': 0.0004738915514395176, 'samples': 1736832, 'steps': 9045, 
'loss/train': 1.0028901100158691} 01/27/2022 04:22:44 - INFO - codeparrot_training - Step 9046: {'lr': 0.0004738842708380847, 'samples': 1737024, 'steps': 9046, 'loss/train': 1.1661715507507324} 01/27/2022 04:22:48 - INFO - codeparrot_training - Step 9047: {'lr': 0.0004738769892776028, 'samples': 1737216, 'steps': 9047, 'loss/train': 1.5090317130088806} 01/27/2022 04:22:51 - INFO - codeparrot_training - Step 9048: {'lr': 0.00047386970675810297, 'samples': 1737408, 'steps': 9048, 'loss/train': 0.7935076653957367} 01/27/2022 04:22:54 - INFO - codeparrot_training - Step 9049: {'lr': 0.00047386242327961635, 'samples': 1737600, 'steps': 9049, 'loss/train': 0.8976083993911743} 01/27/2022 04:22:57 - INFO - codeparrot_training - Step 9050: {'lr': 0.0004738551388421742, 'samples': 1737792, 'steps': 9050, 'loss/train': 1.0176627337932587} 01/27/2022 04:23:01 - INFO - codeparrot_training - Step 9051: {'lr': 0.00047384785344580784, 'samples': 1737984, 'steps': 9051, 'loss/train': 0.25870437920093536} 01/27/2022 04:23:04 - INFO - codeparrot_training - Step 9052: {'lr': 0.00047384056709054824, 'samples': 1738176, 'steps': 9052, 'loss/train': 0.993509978055954} 01/27/2022 04:23:07 - INFO - codeparrot_training - Step 9053: {'lr': 0.0004738332797764267, 'samples': 1738368, 'steps': 9053, 'loss/train': 0.7605777382850647} 01/27/2022 04:23:10 - INFO - codeparrot_training - Step 9054: {'lr': 0.0004738259915034745, 'samples': 1738560, 'steps': 9054, 'loss/train': 0.7978147566318512} 01/27/2022 04:23:13 - INFO - codeparrot_training - Step 9055: {'lr': 0.00047381870227172285, 'samples': 1738752, 'steps': 9055, 'loss/train': 0.38161759078502655} 01/27/2022 04:23:19 - INFO - codeparrot_training - Step 9056: {'lr': 0.0004738114120812029, 'samples': 1738944, 'steps': 9056, 'loss/train': 0.15026232227683067} 01/27/2022 04:23:22 - INFO - codeparrot_training - Step 9057: {'lr': 0.000473804120931946, 'samples': 1739136, 'steps': 9057, 'loss/train': 0.7829065918922424} 01/27/2022 04:23:25 - INFO - codeparrot_training - Step 9058: {'lr': 0.0004737968288239832, 'samples': 1739328, 'steps': 9058, 'loss/train': 0.7773432433605194} 01/27/2022 04:23:28 - INFO - codeparrot_training - Step 9059: {'lr': 0.00047378953575734594, 'samples': 1739520, 'steps': 9059, 'loss/train': 1.0029049515724182} 01/27/2022 04:23:31 - INFO - codeparrot_training - Step 9060: {'lr': 0.0004737822417320654, 'samples': 1739712, 'steps': 9060, 'loss/train': 1.0672341585159302} 01/27/2022 04:23:34 - INFO - codeparrot_training - Step 9061: {'lr': 0.00047377494674817275, 'samples': 1739904, 'steps': 9061, 'loss/train': 0.8591782450675964} 01/27/2022 04:23:37 - INFO - codeparrot_training - Step 9062: {'lr': 0.00047376765080569925, 'samples': 1740096, 'steps': 9062, 'loss/train': 1.0185173749923706} 01/27/2022 04:23:41 - INFO - codeparrot_training - Step 9063: {'lr': 0.0004737603539046762, 'samples': 1740288, 'steps': 9063, 'loss/train': 1.0873967707157135} 01/27/2022 04:23:45 - INFO - codeparrot_training - Step 9064: {'lr': 0.0004737530560451349, 'samples': 1740480, 'steps': 9064, 'loss/train': 0.9664613306522369} 01/27/2022 04:23:49 - INFO - codeparrot_training - Step 9065: {'lr': 0.00047374575722710656, 'samples': 1740672, 'steps': 9065, 'loss/train': 0.9318711161613464} 01/27/2022 04:23:52 - INFO - codeparrot_training - Step 9066: {'lr': 0.0004737384574506224, 'samples': 1740864, 'steps': 9066, 'loss/train': 0.6660089045763016} 01/27/2022 04:23:55 - INFO - codeparrot_training - Step 9067: {'lr': 0.0004737311567157137, 'samples': 1741056, 'steps': 9067, 
'loss/train': 1.0799754559993744} 01/27/2022 04:23:58 - INFO - codeparrot_training - Step 9068: {'lr': 0.00047372385502241176, 'samples': 1741248, 'steps': 9068, 'loss/train': 0.8376772105693817} 01/27/2022 04:24:01 - INFO - codeparrot_training - Step 9069: {'lr': 0.00047371655237074794, 'samples': 1741440, 'steps': 9069, 'loss/train': 1.4808258712291718} 01/27/2022 04:24:04 - INFO - codeparrot_training - Step 9070: {'lr': 0.0004737092487607534, 'samples': 1741632, 'steps': 9070, 'loss/train': 0.23958127945661545} 01/27/2022 04:24:07 - INFO - codeparrot_training - Step 9071: {'lr': 0.00047370194419245955, 'samples': 1741824, 'steps': 9071, 'loss/train': 1.0564523041248322} 01/27/2022 04:24:11 - INFO - codeparrot_training - Step 9072: {'lr': 0.00047369463866589755, 'samples': 1742016, 'steps': 9072, 'loss/train': 0.831735759973526} 01/27/2022 04:24:15 - INFO - codeparrot_training - Step 9073: {'lr': 0.00047368733218109874, 'samples': 1742208, 'steps': 9073, 'loss/train': 1.0290983319282532} 01/27/2022 04:24:18 - INFO - codeparrot_training - Step 9074: {'lr': 0.00047368002473809447, 'samples': 1742400, 'steps': 9074, 'loss/train': 1.1802278459072113} 01/27/2022 04:24:21 - INFO - codeparrot_training - Step 9075: {'lr': 0.0004736727163369159, 'samples': 1742592, 'steps': 9075, 'loss/train': 1.1788841485977173} 01/27/2022 04:24:24 - INFO - codeparrot_training - Step 9076: {'lr': 0.00047366540697759454, 'samples': 1742784, 'steps': 9076, 'loss/train': 0.7955542802810669} 01/27/2022 04:24:28 - INFO - codeparrot_training - Step 9077: {'lr': 0.00047365809666016155, 'samples': 1742976, 'steps': 9077, 'loss/train': 1.020652413368225} 01/27/2022 04:24:31 - INFO - codeparrot_training - Step 9078: {'lr': 0.00047365078538464826, 'samples': 1743168, 'steps': 9078, 'loss/train': 1.2617948055267334} 01/27/2022 04:24:34 - INFO - codeparrot_training - Step 9079: {'lr': 0.0004736434731510861, 'samples': 1743360, 'steps': 9079, 'loss/train': 0.8331636786460876} 01/27/2022 04:24:37 - INFO - codeparrot_training - Step 9080: {'lr': 0.00047363615995950624, 'samples': 1743552, 'steps': 9080, 'loss/train': 0.6512245684862137} 01/27/2022 04:24:40 - INFO - codeparrot_training - Step 9081: {'lr': 0.0004736288458099401, 'samples': 1743744, 'steps': 9081, 'loss/train': 0.7616068124771118} 01/27/2022 04:24:46 - INFO - codeparrot_training - Step 9082: {'lr': 0.0004736215307024191, 'samples': 1743936, 'steps': 9082, 'loss/train': 0.8962433338165283} 01/27/2022 04:24:49 - INFO - codeparrot_training - Step 9083: {'lr': 0.0004736142146369744, 'samples': 1744128, 'steps': 9083, 'loss/train': 0.8547718226909637} 01/27/2022 04:24:52 - INFO - codeparrot_training - Step 9084: {'lr': 0.0004736068976136374, 'samples': 1744320, 'steps': 9084, 'loss/train': 1.0440376996994019} 01/27/2022 04:24:55 - INFO - codeparrot_training - Step 9085: {'lr': 0.00047359957963243943, 'samples': 1744512, 'steps': 9085, 'loss/train': 1.0184416472911835} 01/27/2022 04:24:58 - INFO - codeparrot_training - Step 9086: {'lr': 0.0004735922606934119, 'samples': 1744704, 'steps': 9086, 'loss/train': 0.4799973964691162} 01/27/2022 04:25:01 - INFO - codeparrot_training - Step 9087: {'lr': 0.0004735849407965861, 'samples': 1744896, 'steps': 9087, 'loss/train': 0.6882337331771851} 01/27/2022 04:25:05 - INFO - codeparrot_training - Step 9088: {'lr': 0.00047357761994199345, 'samples': 1745088, 'steps': 9088, 'loss/train': 0.4670456349849701} 01/27/2022 04:25:08 - INFO - codeparrot_training - Step 9089: {'lr': 0.00047357029812966525, 'samples': 1745280, 'steps': 9089, 
'loss/train': 0.7452275902032852} 01/27/2022 04:25:11 - INFO - codeparrot_training - Step 9090: {'lr': 0.0004735629753596328, 'samples': 1745472, 'steps': 9090, 'loss/train': 0.6969839483499527} 01/27/2022 04:25:15 - INFO - codeparrot_training - Step 9091: {'lr': 0.00047355565163192763, 'samples': 1745664, 'steps': 9091, 'loss/train': 0.962771862745285} 01/27/2022 04:25:18 - INFO - codeparrot_training - Step 9092: {'lr': 0.00047354832694658104, 'samples': 1745856, 'steps': 9092, 'loss/train': 0.6898308545351028} 01/27/2022 04:25:22 - INFO - codeparrot_training - Step 9093: {'lr': 0.00047354100130362443, 'samples': 1746048, 'steps': 9093, 'loss/train': 0.9679726660251617} 01/27/2022 04:25:25 - INFO - codeparrot_training - Step 9094: {'lr': 0.00047353367470308913, 'samples': 1746240, 'steps': 9094, 'loss/train': 1.4024378657341003} 01/27/2022 04:25:28 - INFO - codeparrot_training - Step 9095: {'lr': 0.0004735263471450065, 'samples': 1746432, 'steps': 9095, 'loss/train': 1.096450924873352} 01/27/2022 04:25:31 - INFO - codeparrot_training - Step 9096: {'lr': 0.00047351901862940807, 'samples': 1746624, 'steps': 9096, 'loss/train': 1.1605715453624725} 01/27/2022 04:25:34 - INFO - codeparrot_training - Step 9097: {'lr': 0.000473511689156325, 'samples': 1746816, 'steps': 9097, 'loss/train': 1.0373104512691498} 01/27/2022 04:25:37 - INFO - codeparrot_training - Step 9098: {'lr': 0.0004735043587257889, 'samples': 1747008, 'steps': 9098, 'loss/train': 0.45917996764183044} 01/27/2022 04:25:40 - INFO - codeparrot_training - Step 9099: {'lr': 0.00047349702733783113, 'samples': 1747200, 'steps': 9099, 'loss/train': 0.655758798122406} 01/27/2022 04:25:45 - INFO - codeparrot_training - Step 9100: {'lr': 0.00047348969499248306, 'samples': 1747392, 'steps': 9100, 'loss/train': 0.09141189604997635} 01/27/2022 04:25:48 - INFO - codeparrot_training - Step 9101: {'lr': 0.0004734823616897761, 'samples': 1747584, 'steps': 9101, 'loss/train': 0.6592042744159698} 01/27/2022 04:25:52 - INFO - codeparrot_training - Step 9102: {'lr': 0.0004734750274297416, 'samples': 1747776, 'steps': 9102, 'loss/train': 0.38163480162620544} 01/27/2022 04:25:55 - INFO - codeparrot_training - Step 9103: {'lr': 0.0004734676922124111, 'samples': 1747968, 'steps': 9103, 'loss/train': 0.7463604658842087} 01/27/2022 04:25:58 - INFO - codeparrot_training - Step 9104: {'lr': 0.00047346035603781597, 'samples': 1748160, 'steps': 9104, 'loss/train': 1.2304226160049438} 01/27/2022 04:26:01 - INFO - codeparrot_training - Step 9105: {'lr': 0.0004734530189059876, 'samples': 1748352, 'steps': 9105, 'loss/train': 0.7659517228603363} 01/27/2022 04:26:04 - INFO - codeparrot_training - Step 9106: {'lr': 0.0004734456808169575, 'samples': 1748544, 'steps': 9106, 'loss/train': 0.7261328548192978} 01/27/2022 04:26:07 - INFO - codeparrot_training - Step 9107: {'lr': 0.00047343834177075695, 'samples': 1748736, 'steps': 9107, 'loss/train': 1.1652812361717224} 01/27/2022 04:26:10 - INFO - codeparrot_training - Step 9108: {'lr': 0.0004734310017674176, 'samples': 1748928, 'steps': 9108, 'loss/train': 0.4975394904613495} 01/27/2022 04:26:16 - INFO - codeparrot_training - Step 9109: {'lr': 0.00047342366080697077, 'samples': 1749120, 'steps': 9109, 'loss/train': 1.1190355718135834} 01/27/2022 04:26:19 - INFO - codeparrot_training - Step 9110: {'lr': 0.00047341631888944794, 'samples': 1749312, 'steps': 9110, 'loss/train': 0.7683405876159668} 01/27/2022 04:26:22 - INFO - codeparrot_training - Step 9111: {'lr': 0.0004734089760148805, 'samples': 1749504, 'steps': 9111, 
'loss/train': 1.1549864709377289} 01/27/2022 04:26:25 - INFO - codeparrot_training - Step 9112: {'lr': 0.0004734016321832999, 'samples': 1749696, 'steps': 9112, 'loss/train': 1.0747979879379272} 01/27/2022 04:26:28 - INFO - codeparrot_training - Step 9113: {'lr': 0.0004733942873947377, 'samples': 1749888, 'steps': 9113, 'loss/train': 0.7167875915765762} 01/27/2022 04:26:31 - INFO - codeparrot_training - Step 9114: {'lr': 0.00047338694164922535, 'samples': 1750080, 'steps': 9114, 'loss/train': 0.7368990182876587} 01/27/2022 04:26:34 - INFO - codeparrot_training - Step 9115: {'lr': 0.0004733795949467942, 'samples': 1750272, 'steps': 9115, 'loss/train': 1.1519437730312347} 01/27/2022 04:26:38 - INFO - codeparrot_training - Step 9116: {'lr': 0.0004733722472874759, 'samples': 1750464, 'steps': 9116, 'loss/train': 0.974005937576294} 01/27/2022 04:26:42 - INFO - codeparrot_training - Step 9117: {'lr': 0.0004733648986713017, 'samples': 1750656, 'steps': 9117, 'loss/train': 0.48666465282440186} 01/27/2022 04:26:45 - INFO - codeparrot_training - Step 9118: {'lr': 0.00047335754909830327, 'samples': 1750848, 'steps': 9118, 'loss/train': 1.0892152190208435} 01/27/2022 04:26:48 - INFO - codeparrot_training - Step 9119: {'lr': 0.00047335019856851204, 'samples': 1751040, 'steps': 9119, 'loss/train': 1.8057256937026978} 01/27/2022 04:26:51 - INFO - codeparrot_training - Step 9120: {'lr': 0.0004733428470819594, 'samples': 1751232, 'steps': 9120, 'loss/train': 0.3529437705874443} 01/27/2022 04:26:55 - INFO - codeparrot_training - Step 9121: {'lr': 0.000473335494638677, 'samples': 1751424, 'steps': 9121, 'loss/train': 0.32923246175050735} 01/27/2022 04:26:58 - INFO - codeparrot_training - Step 9122: {'lr': 0.00047332814123869616, 'samples': 1751616, 'steps': 9122, 'loss/train': 0.7888280153274536} 01/27/2022 04:27:01 - INFO - codeparrot_training - Step 9123: {'lr': 0.0004733207868820486, 'samples': 1751808, 'steps': 9123, 'loss/train': 0.6827069967985153} 01/27/2022 04:27:04 - INFO - codeparrot_training - Step 9124: {'lr': 0.0004733134315687656, 'samples': 1752000, 'steps': 9124, 'loss/train': 0.8519332408905029} 01/27/2022 04:27:07 - INFO - codeparrot_training - Step 9125: {'lr': 0.00047330607529887884, 'samples': 1752192, 'steps': 9125, 'loss/train': 0.9570151269435883} 01/27/2022 04:27:12 - INFO - codeparrot_training - Step 9126: {'lr': 0.00047329871807241976, 'samples': 1752384, 'steps': 9126, 'loss/train': 1.3590588569641113} 01/27/2022 04:27:15 - INFO - codeparrot_training - Step 9127: {'lr': 0.00047329135988941984, 'samples': 1752576, 'steps': 9127, 'loss/train': 1.34071746468544} 01/27/2022 04:27:19 - INFO - codeparrot_training - Step 9128: {'lr': 0.00047328400074991064, 'samples': 1752768, 'steps': 9128, 'loss/train': 1.0220050513744354} 01/27/2022 04:27:22 - INFO - codeparrot_training - Step 9129: {'lr': 0.00047327664065392375, 'samples': 1752960, 'steps': 9129, 'loss/train': 0.8347452878952026} 01/27/2022 04:27:25 - INFO - codeparrot_training - Step 9130: {'lr': 0.0004732692796014905, 'samples': 1753152, 'steps': 9130, 'loss/train': 0.9471652507781982} 01/27/2022 04:27:28 - INFO - codeparrot_training - Step 9131: {'lr': 0.00047326191759264265, 'samples': 1753344, 'steps': 9131, 'loss/train': 1.046014130115509} 01/27/2022 04:27:31 - INFO - codeparrot_training - Step 9132: {'lr': 0.00047325455462741164, 'samples': 1753536, 'steps': 9132, 'loss/train': 0.612546980381012} 01/27/2022 04:27:34 - INFO - codeparrot_training - Step 9133: {'lr': 0.00047324719070582894, 'samples': 1753728, 'steps': 9133, 
'loss/train': 0.9043631851673126} 01/27/2022 04:27:38 - INFO - codeparrot_training - Step 9134: {'lr': 0.00047323982582792625, 'samples': 1753920, 'steps': 9134, 'loss/train': 0.8701188862323761} 01/27/2022 04:27:42 - INFO - codeparrot_training - Step 9135: {'lr': 0.00047323245999373497, 'samples': 1754112, 'steps': 9135, 'loss/train': 0.688435971736908} 01/27/2022 04:27:45 - INFO - codeparrot_training - Step 9136: {'lr': 0.0004732250932032867, 'samples': 1754304, 'steps': 9136, 'loss/train': 0.3707653284072876} 01/27/2022 04:27:48 - INFO - codeparrot_training - Step 9137: {'lr': 0.0004732177254566131, 'samples': 1754496, 'steps': 9137, 'loss/train': 1.0542405545711517} 01/27/2022 04:27:52 - INFO - codeparrot_training - Step 9138: {'lr': 0.0004732103567537456, 'samples': 1754688, 'steps': 9138, 'loss/train': 0.8229526877403259} 01/27/2022 04:27:55 - INFO - codeparrot_training - Step 9139: {'lr': 0.00047320298709471574, 'samples': 1754880, 'steps': 9139, 'loss/train': 0.5534852296113968} 01/27/2022 04:27:58 - INFO - codeparrot_training - Step 9140: {'lr': 0.0004731956164795552, 'samples': 1755072, 'steps': 9140, 'loss/train': 0.7997081279754639} 01/27/2022 04:28:01 - INFO - codeparrot_training - Step 9141: {'lr': 0.0004731882449082956, 'samples': 1755264, 'steps': 9141, 'loss/train': 0.8904591500759125} 01/27/2022 04:28:04 - INFO - codeparrot_training - Step 9142: {'lr': 0.0004731808723809683, 'samples': 1755456, 'steps': 9142, 'loss/train': 0.8570089638233185} 01/27/2022 04:28:07 - INFO - codeparrot_training - Step 9143: {'lr': 0.0004731734988976051, 'samples': 1755648, 'steps': 9143, 'loss/train': 0.7939708828926086} 01/27/2022 04:28:12 - INFO - codeparrot_training - Step 9144: {'lr': 0.00047316612445823746, 'samples': 1755840, 'steps': 9144, 'loss/train': 0.7422450184822083} 01/27/2022 04:28:15 - INFO - codeparrot_training - Step 9145: {'lr': 0.000473158749062897, 'samples': 1756032, 'steps': 9145, 'loss/train': 1.3212454319000244} 01/27/2022 04:28:18 - INFO - codeparrot_training - Step 9146: {'lr': 0.00047315137271161537, 'samples': 1756224, 'steps': 9146, 'loss/train': 0.5029579102993011} 01/27/2022 04:28:21 - INFO - codeparrot_training - Step 9147: {'lr': 0.00047314399540442407, 'samples': 1756416, 'steps': 9147, 'loss/train': 0.8700232207775116} 01/27/2022 04:28:24 - INFO - codeparrot_training - Step 9148: {'lr': 0.00047313661714135476, 'samples': 1756608, 'steps': 9148, 'loss/train': 0.8211541771888733} 01/27/2022 04:28:27 - INFO - codeparrot_training - Step 9149: {'lr': 0.000473129237922439, 'samples': 1756800, 'steps': 9149, 'loss/train': 1.442174881696701} 01/27/2022 04:28:31 - INFO - codeparrot_training - Step 9150: {'lr': 0.0004731218577477085, 'samples': 1756992, 'steps': 9150, 'loss/train': 0.6448989808559418} 01/27/2022 04:28:34 - INFO - codeparrot_training - Step 9151: {'lr': 0.0004731144766171948, 'samples': 1757184, 'steps': 9151, 'loss/train': 1.0967772603034973} 01/27/2022 04:28:38 - INFO - codeparrot_training - Step 9152: {'lr': 0.0004731070945309295, 'samples': 1757376, 'steps': 9152, 'loss/train': 0.7488759756088257} 01/27/2022 04:28:41 - INFO - codeparrot_training - Step 9153: {'lr': 0.00047309971148894425, 'samples': 1757568, 'steps': 9153, 'loss/train': 0.4244470149278641} 01/27/2022 04:28:44 - INFO - codeparrot_training - Step 9154: {'lr': 0.00047309232749127074, 'samples': 1757760, 'steps': 9154, 'loss/train': 1.456723004579544} 01/27/2022 04:28:48 - INFO - codeparrot_training - Step 9155: {'lr': 0.0004730849425379404, 'samples': 1757952, 'steps': 9155, 
'loss/train': 0.5868671089410782} 01/27/2022 04:28:51 - INFO - codeparrot_training - Step 9156: {'lr': 0.0004730775566289851, 'samples': 1758144, 'steps': 9156, 'loss/train': 0.7734397351741791} 01/27/2022 04:28:54 - INFO - codeparrot_training - Step 9157: {'lr': 0.0004730701697644364, 'samples': 1758336, 'steps': 9157, 'loss/train': 0.7827455699443817} 01/27/2022 04:28:57 - INFO - codeparrot_training - Step 9158: {'lr': 0.00047306278194432597, 'samples': 1758528, 'steps': 9158, 'loss/train': 0.8388201892375946} 01/27/2022 04:29:00 - INFO - codeparrot_training - Step 9159: {'lr': 0.0004730553931686853, 'samples': 1758720, 'steps': 9159, 'loss/train': 1.3509061932563782} 01/27/2022 04:29:03 - INFO - codeparrot_training - Step 9160: {'lr': 0.00047304800343754615, 'samples': 1758912, 'steps': 9160, 'loss/train': 1.5265719294548035} 01/27/2022 04:29:08 - INFO - codeparrot_training - Step 9161: {'lr': 0.00047304061275094025, 'samples': 1759104, 'steps': 9161, 'loss/train': 1.432863861322403} 01/27/2022 04:29:11 - INFO - codeparrot_training - Step 9162: {'lr': 0.0004730332211088992, 'samples': 1759296, 'steps': 9162, 'loss/train': 0.87148717045784} 01/27/2022 04:29:14 - INFO - codeparrot_training - Step 9163: {'lr': 0.0004730258285114546, 'samples': 1759488, 'steps': 9163, 'loss/train': 0.9213757216930389} 01/27/2022 04:29:18 - INFO - codeparrot_training - Step 9164: {'lr': 0.0004730184349586382, 'samples': 1759680, 'steps': 9164, 'loss/train': 0.6196489781141281} 01/27/2022 04:29:21 - INFO - codeparrot_training - Step 9165: {'lr': 0.0004730110404504816, 'samples': 1759872, 'steps': 9165, 'loss/train': 0.24148236215114594} 01/27/2022 04:29:24 - INFO - codeparrot_training - Step 9166: {'lr': 0.00047300364498701654, 'samples': 1760064, 'steps': 9166, 'loss/train': 0.6422866433858871} 01/27/2022 04:29:27 - INFO - codeparrot_training - Step 9167: {'lr': 0.00047299624856827474, 'samples': 1760256, 'steps': 9167, 'loss/train': 0.7827271521091461} 01/27/2022 04:29:30 - INFO - codeparrot_training - Step 9168: {'lr': 0.0004729888511942877, 'samples': 1760448, 'steps': 9168, 'loss/train': 0.6663665771484375} 01/27/2022 04:29:33 - INFO - codeparrot_training - Step 9169: {'lr': 0.0004729814528650873, 'samples': 1760640, 'steps': 9169, 'loss/train': 0.8385560810565948} 01/27/2022 04:29:38 - INFO - codeparrot_training - Step 9170: {'lr': 0.00047297405358070517, 'samples': 1760832, 'steps': 9170, 'loss/train': 0.7371735870838165} 01/27/2022 04:29:41 - INFO - codeparrot_training - Step 9171: {'lr': 0.00047296665334117295, 'samples': 1761024, 'steps': 9171, 'loss/train': 1.0432482361793518} 01/27/2022 04:29:44 - INFO - codeparrot_training - Step 9172: {'lr': 0.0004729592521465224, 'samples': 1761216, 'steps': 9172, 'loss/train': 0.874191015958786} 01/27/2022 04:29:47 - INFO - codeparrot_training - Step 9173: {'lr': 0.00047295184999678524, 'samples': 1761408, 'steps': 9173, 'loss/train': 0.5683966130018234} 01/27/2022 04:29:50 - INFO - codeparrot_training - Step 9174: {'lr': 0.00047294444689199313, 'samples': 1761600, 'steps': 9174, 'loss/train': 0.4173583388328552} 01/27/2022 04:29:53 - INFO - codeparrot_training - Step 9175: {'lr': 0.0004729370428321778, 'samples': 1761792, 'steps': 9175, 'loss/train': 0.6037731617689133} 01/27/2022 04:29:57 - INFO - codeparrot_training - Step 9176: {'lr': 0.000472929637817371, 'samples': 1761984, 'steps': 9176, 'loss/train': 0.7367102354764938} 01/27/2022 04:30:00 - INFO - codeparrot_training - Step 9177: {'lr': 0.0004729222318476044, 'samples': 1762176, 'steps': 9177, 
'loss/train': 0.6641362309455872} 01/27/2022 04:30:03 - INFO - codeparrot_training - Step 9178: {'lr': 0.0004729148249229097, 'samples': 1762368, 'steps': 9178, 'loss/train': 0.6941618174314499} 01/27/2022 04:30:07 - INFO - codeparrot_training - Step 9179: {'lr': 0.0004729074170433187, 'samples': 1762560, 'steps': 9179, 'loss/train': 0.9273380041122437} 01/27/2022 04:30:10 - INFO - codeparrot_training - Step 9180: {'lr': 0.0004729000082088631, 'samples': 1762752, 'steps': 9180, 'loss/train': 0.9659673571586609} 01/27/2022 04:30:14 - INFO - codeparrot_training - Step 9181: {'lr': 0.0004728925984195748, 'samples': 1762944, 'steps': 9181, 'loss/train': 1.1032764315605164} 01/27/2022 04:30:17 - INFO - codeparrot_training - Step 9182: {'lr': 0.00047288518767548516, 'samples': 1763136, 'steps': 9182, 'loss/train': 0.1780160553753376} 01/27/2022 04:30:20 - INFO - codeparrot_training - Step 9183: {'lr': 0.0004728777759766263, 'samples': 1763328, 'steps': 9183, 'loss/train': 0.9568573236465454} 01/27/2022 04:30:23 - INFO - codeparrot_training - Step 9184: {'lr': 0.00047287036332302967, 'samples': 1763520, 'steps': 9184, 'loss/train': 1.1818119585514069} 01/27/2022 04:30:26 - INFO - codeparrot_training - Step 9185: {'lr': 0.0004728629497147273, 'samples': 1763712, 'steps': 9185, 'loss/train': 0.4890294671058655} 01/27/2022 04:30:29 - INFO - codeparrot_training - Step 9186: {'lr': 0.00047285553515175077, 'samples': 1763904, 'steps': 9186, 'loss/train': 0.7525205612182617} 01/27/2022 04:30:32 - INFO - codeparrot_training - Step 9187: {'lr': 0.0004728481196341319, 'samples': 1764096, 'steps': 9187, 'loss/train': 1.0754805207252502} 01/27/2022 04:30:38 - INFO - codeparrot_training - Step 9188: {'lr': 0.0004728407031619025, 'samples': 1764288, 'steps': 9188, 'loss/train': 0.6666642129421234} 01/27/2022 04:30:41 - INFO - codeparrot_training - Step 9189: {'lr': 0.0004728332857350942, 'samples': 1764480, 'steps': 9189, 'loss/train': 0.5471738576889038} 01/27/2022 04:30:44 - INFO - codeparrot_training - Step 9190: {'lr': 0.00047282586735373887, 'samples': 1764672, 'steps': 9190, 'loss/train': 0.611887514591217} 01/27/2022 04:30:47 - INFO - codeparrot_training - Step 9191: {'lr': 0.0004728184480178683, 'samples': 1764864, 'steps': 9191, 'loss/train': 0.5540771037340164} 01/27/2022 04:30:50 - INFO - codeparrot_training - Step 9192: {'lr': 0.00047281102772751425, 'samples': 1765056, 'steps': 9192, 'loss/train': 1.0586495697498322} 01/27/2022 04:30:53 - INFO - codeparrot_training - Step 9193: {'lr': 0.0004728036064827086, 'samples': 1765248, 'steps': 9193, 'loss/train': 0.7888849675655365} 01/27/2022 04:30:57 - INFO - codeparrot_training - Step 9194: {'lr': 0.00047279618428348294, 'samples': 1765440, 'steps': 9194, 'loss/train': 0.8351584374904633} 01/27/2022 04:31:00 - INFO - codeparrot_training - Step 9195: {'lr': 0.00047278876112986923, 'samples': 1765632, 'steps': 9195, 'loss/train': 0.6998818963766098} 01/27/2022 04:31:04 - INFO - codeparrot_training - Step 9196: {'lr': 0.0004727813370218992, 'samples': 1765824, 'steps': 9196, 'loss/train': 0.8863105773925781} 01/27/2022 04:31:07 - INFO - codeparrot_training - Step 9197: {'lr': 0.00047277391195960463, 'samples': 1766016, 'steps': 9197, 'loss/train': 0.7014480382204056} 01/27/2022 04:31:11 - INFO - codeparrot_training - Step 9198: {'lr': 0.00047276648594301733, 'samples': 1766208, 'steps': 9198, 'loss/train': 0.5559815168380737} 01/27/2022 04:31:14 - INFO - codeparrot_training - Step 9199: {'lr': 0.0004727590589721692, 'samples': 1766400, 'steps': 9199, 
'loss/train': 1.102350264787674} 01/27/2022 04:31:17 - INFO - codeparrot_training - Step 9200: {'lr': 0.00047275163104709196, 'samples': 1766592, 'steps': 9200, 'loss/train': 0.5564400851726532} 01/27/2022 04:31:20 - INFO - codeparrot_training - Step 9201: {'lr': 0.0004727442021678175, 'samples': 1766784, 'steps': 9201, 'loss/train': 0.9784869253635406} 01/27/2022 04:31:23 - INFO - codeparrot_training - Step 9202: {'lr': 0.0004727367723343776, 'samples': 1766976, 'steps': 9202, 'loss/train': 0.9842956960201263} 01/27/2022 04:31:26 - INFO - codeparrot_training - Step 9203: {'lr': 0.0004727293415468041, 'samples': 1767168, 'steps': 9203, 'loss/train': 0.6297835260629654} 01/27/2022 04:31:29 - INFO - codeparrot_training - Step 9204: {'lr': 0.00047272190980512875, 'samples': 1767360, 'steps': 9204, 'loss/train': 1.1503638625144958} 01/27/2022 04:31:35 - INFO - codeparrot_training - Step 9205: {'lr': 0.0004727144771093835, 'samples': 1767552, 'steps': 9205, 'loss/train': 0.2456865906715393} 01/27/2022 04:31:38 - INFO - codeparrot_training - Step 9206: {'lr': 0.00047270704345960023, 'samples': 1767744, 'steps': 9206, 'loss/train': 0.921604335308075} 01/27/2022 04:31:41 - INFO - codeparrot_training - Step 9207: {'lr': 0.00047269960885581064, 'samples': 1767936, 'steps': 9207, 'loss/train': 0.617575541138649} 01/27/2022 04:31:44 - INFO - codeparrot_training - Step 9208: {'lr': 0.00047269217329804663, 'samples': 1768128, 'steps': 9208, 'loss/train': 0.8200415968894958} 01/27/2022 04:31:47 - INFO - codeparrot_training - Step 9209: {'lr': 0.00047268473678634007, 'samples': 1768320, 'steps': 9209, 'loss/train': 0.6015293151140213} 01/27/2022 04:31:50 - INFO - codeparrot_training - Step 9210: {'lr': 0.00047267729932072284, 'samples': 1768512, 'steps': 9210, 'loss/train': 1.7162087559700012} 01/27/2022 04:31:53 - INFO - codeparrot_training - Step 9211: {'lr': 0.00047266986090122677, 'samples': 1768704, 'steps': 9211, 'loss/train': 2.0633533000946045} 01/27/2022 04:31:57 - INFO - codeparrot_training - Step 9212: {'lr': 0.0004726624215278836, 'samples': 1768896, 'steps': 9212, 'loss/train': 0.5499760508537292} 01/27/2022 04:32:00 - INFO - codeparrot_training - Step 9213: {'lr': 0.00047265498120072546, 'samples': 1769088, 'steps': 9213, 'loss/train': 0.09205403365194798} 01/27/2022 04:32:04 - INFO - codeparrot_training - Step 9214: {'lr': 0.00047264753991978404, 'samples': 1769280, 'steps': 9214, 'loss/train': 0.7239807397127151} 01/27/2022 04:32:07 - INFO - codeparrot_training - Step 9215: {'lr': 0.00047264009768509127, 'samples': 1769472, 'steps': 9215, 'loss/train': 0.8204635977745056} 01/27/2022 04:32:11 - INFO - codeparrot_training - Step 9216: {'lr': 0.000472632654496679, 'samples': 1769664, 'steps': 9216, 'loss/train': 0.8964456617832184} 01/27/2022 04:32:14 - INFO - codeparrot_training - Step 9217: {'lr': 0.00047262521035457914, 'samples': 1769856, 'steps': 9217, 'loss/train': 0.48954565823078156} 01/27/2022 04:32:17 - INFO - codeparrot_training - Step 9218: {'lr': 0.00047261776525882353, 'samples': 1770048, 'steps': 9218, 'loss/train': 0.9491358697414398} 01/27/2022 04:32:20 - INFO - codeparrot_training - Step 9219: {'lr': 0.00047261031920944413, 'samples': 1770240, 'steps': 9219, 'loss/train': 0.7029668837785721} 01/27/2022 04:32:23 - INFO - codeparrot_training - Step 9220: {'lr': 0.0004726028722064728, 'samples': 1770432, 'steps': 9220, 'loss/train': 0.755409836769104} 01/27/2022 04:32:26 - INFO - codeparrot_training - Step 9221: {'lr': 0.0004725954242499415, 'samples': 1770624, 'steps': 9221, 
'loss/train': 0.8705601990222931} 01/27/2022 04:32:29 - INFO - codeparrot_training - Step 9222: {'lr': 0.00047258797533988205, 'samples': 1770816, 'steps': 9222, 'loss/train': 0.2630550116300583} 01/27/2022 04:32:34 - INFO - codeparrot_training - Step 9223: {'lr': 0.00047258052547632636, 'samples': 1771008, 'steps': 9223, 'loss/train': 0.9777106940746307} 01/27/2022 04:32:37 - INFO - codeparrot_training - Step 9224: {'lr': 0.0004725730746593064, 'samples': 1771200, 'steps': 9224, 'loss/train': 0.8768384456634521} 01/27/2022 04:32:40 - INFO - codeparrot_training - Step 9225: {'lr': 0.0004725656228888541, 'samples': 1771392, 'steps': 9225, 'loss/train': 0.8392081260681152} 01/27/2022 04:32:43 - INFO - codeparrot_training - Step 9226: {'lr': 0.0004725581701650014, 'samples': 1771584, 'steps': 9226, 'loss/train': 0.5704875290393829} 01/27/2022 04:32:46 - INFO - codeparrot_training - Step 9227: {'lr': 0.00047255071648778004, 'samples': 1771776, 'steps': 9227, 'loss/train': 1.4816190898418427} 01/27/2022 04:32:49 - INFO - codeparrot_training - Step 9228: {'lr': 0.00047254326185722207, 'samples': 1771968, 'steps': 9228, 'loss/train': 0.6045921295881271} 01/27/2022 04:32:53 - INFO - codeparrot_training - Step 9229: {'lr': 0.00047253580627335944, 'samples': 1772160, 'steps': 9229, 'loss/train': 3.876941442489624} 01/27/2022 04:32:56 - INFO - codeparrot_training - Step 9230: {'lr': 0.00047252834973622414, 'samples': 1772352, 'steps': 9230, 'loss/train': 1.3622867166996002} 01/27/2022 04:32:59 - INFO - codeparrot_training - Step 9231: {'lr': 0.00047252089224584804, 'samples': 1772544, 'steps': 9231, 'loss/train': 0.6782225668430328} 01/27/2022 04:33:04 - INFO - codeparrot_training - Step 9232: {'lr': 0.0004725134338022631, 'samples': 1772736, 'steps': 9232, 'loss/train': 0.906622052192688} 01/27/2022 04:33:07 - INFO - codeparrot_training - Step 9233: {'lr': 0.00047250597440550124, 'samples': 1772928, 'steps': 9233, 'loss/train': 0.5540201514959335} 01/27/2022 04:33:10 - INFO - codeparrot_training - Step 9234: {'lr': 0.0004724985140555945, 'samples': 1773120, 'steps': 9234, 'loss/train': 0.620517298579216} 01/27/2022 04:33:13 - INFO - codeparrot_training - Step 9235: {'lr': 0.0004724910527525748, 'samples': 1773312, 'steps': 9235, 'loss/train': 1.0255195498466492} 01/27/2022 04:33:17 - INFO - codeparrot_training - Step 9236: {'lr': 0.0004724835904964739, 'samples': 1773504, 'steps': 9236, 'loss/train': 1.0253455638885498} 01/27/2022 04:33:20 - INFO - codeparrot_training - Step 9237: {'lr': 0.00047247612728732407, 'samples': 1773696, 'steps': 9237, 'loss/train': 0.9168468117713928} 01/27/2022 04:33:23 - INFO - codeparrot_training - Step 9238: {'lr': 0.0004724686631251572, 'samples': 1773888, 'steps': 9238, 'loss/train': 0.5453496873378754} 01/27/2022 04:33:26 - INFO - codeparrot_training - Step 9239: {'lr': 0.00047246119801000507, 'samples': 1774080, 'steps': 9239, 'loss/train': 0.9256144165992737} 01/27/2022 04:33:29 - INFO - codeparrot_training - Step 9240: {'lr': 0.00047245373194189995, 'samples': 1774272, 'steps': 9240, 'loss/train': 0.12802696600556374} 01/27/2022 04:33:34 - INFO - codeparrot_training - Step 9241: {'lr': 0.0004724462649208736, 'samples': 1774464, 'steps': 9241, 'loss/train': 0.756415843963623} 01/27/2022 04:33:37 - INFO - codeparrot_training - Step 9242: {'lr': 0.0004724387969469581, 'samples': 1774656, 'steps': 9242, 'loss/train': 0.8363883197307587} 01/27/2022 04:33:40 - INFO - codeparrot_training - Step 9243: {'lr': 0.00047243132802018544, 'samples': 1774848, 'steps': 9243, 
'loss/train': 0.6305385231971741} 01/27/2022 04:33:43 - INFO - codeparrot_training - Step 9244: {'lr': 0.00047242385814058764, 'samples': 1775040, 'steps': 9244, 'loss/train': 0.7655874788761139} 01/27/2022 04:33:46 - INFO - codeparrot_training - Step 9245: {'lr': 0.0004724163873081966, 'samples': 1775232, 'steps': 9245, 'loss/train': 0.4344387799501419} 01/27/2022 04:33:49 - INFO - codeparrot_training - Step 9246: {'lr': 0.00047240891552304443, 'samples': 1775424, 'steps': 9246, 'loss/train': 0.5917866826057434} 01/27/2022 04:33:52 - INFO - codeparrot_training - Step 9247: {'lr': 0.0004724014427851631, 'samples': 1775616, 'steps': 9247, 'loss/train': 0.7963441908359528} 01/27/2022 04:33:56 - INFO - codeparrot_training - Step 9248: {'lr': 0.0004723939690945845, 'samples': 1775808, 'steps': 9248, 'loss/train': 1.0888419449329376} 01/27/2022 04:33:59 - INFO - codeparrot_training - Step 9249: {'lr': 0.00047238649445134086, 'samples': 1776000, 'steps': 9249, 'loss/train': 0.820350855588913} 01/27/2022 04:34:04 - INFO - codeparrot_training - Step 9250: {'lr': 0.00047237901885546405, 'samples': 1776192, 'steps': 9250, 'loss/train': 0.3566935211420059} 01/27/2022 04:34:07 - INFO - codeparrot_training - Step 9251: {'lr': 0.00047237154230698607, 'samples': 1776384, 'steps': 9251, 'loss/train': 1.2312782406806946} 01/27/2022 04:34:10 - INFO - codeparrot_training - Step 9252: {'lr': 0.0004723640648059391, 'samples': 1776576, 'steps': 9252, 'loss/train': 0.40459419786930084} 01/27/2022 04:34:13 - INFO - codeparrot_training - Step 9253: {'lr': 0.0004723565863523551, 'samples': 1776768, 'steps': 9253, 'loss/train': 0.1741744726896286} 01/27/2022 04:34:16 - INFO - codeparrot_training - Step 9254: {'lr': 0.0004723491069462661, 'samples': 1776960, 'steps': 9254, 'loss/train': 0.4140294939279556} 01/27/2022 04:34:19 - INFO - codeparrot_training - Step 9255: {'lr': 0.00047234162658770407, 'samples': 1777152, 'steps': 9255, 'loss/train': 0.8783622980117798} 01/27/2022 04:34:23 - INFO - codeparrot_training - Step 9256: {'lr': 0.00047233414527670113, 'samples': 1777344, 'steps': 9256, 'loss/train': 1.1650423407554626} 01/27/2022 04:34:26 - INFO - codeparrot_training - Step 9257: {'lr': 0.0004723266630132893, 'samples': 1777536, 'steps': 9257, 'loss/train': 0.7795849442481995} 01/27/2022 04:34:30 - INFO - codeparrot_training - Step 9258: {'lr': 0.0004723191797975007, 'samples': 1777728, 'steps': 9258, 'loss/train': 0.6518879681825638} 01/27/2022 04:34:33 - INFO - codeparrot_training - Step 9259: {'lr': 0.00047231169562936726, 'samples': 1777920, 'steps': 9259, 'loss/train': 1.7123780250549316} 01/27/2022 04:34:36 - INFO - codeparrot_training - Step 9260: {'lr': 0.00047230421050892116, 'samples': 1778112, 'steps': 9260, 'loss/train': 0.755747526884079} 01/27/2022 04:34:40 - INFO - codeparrot_training - Step 9261: {'lr': 0.00047229672443619433, 'samples': 1778304, 'steps': 9261, 'loss/train': 0.6411335617303848} 01/27/2022 04:34:43 - INFO - codeparrot_training - Step 9262: {'lr': 0.00047228923741121897, 'samples': 1778496, 'steps': 9262, 'loss/train': 0.9756079316139221} 01/27/2022 04:34:46 - INFO - codeparrot_training - Step 9263: {'lr': 0.0004722817494340271, 'samples': 1778688, 'steps': 9263, 'loss/train': 0.8756522834300995} 01/27/2022 04:34:49 - INFO - codeparrot_training - Step 9264: {'lr': 0.00047227426050465085, 'samples': 1778880, 'steps': 9264, 'loss/train': 0.9102149605751038} 01/27/2022 04:34:52 - INFO - codeparrot_training - Step 9265: {'lr': 0.00047226677062312217, 'samples': 1779072, 'steps': 9265, 
'loss/train': 0.4477773606777191} 01/27/2022 04:34:55 - INFO - codeparrot_training - Step 9266: {'lr': 0.00047225927978947327, 'samples': 1779264, 'steps': 9266, 'loss/train': 0.847565084695816} 01/27/2022 04:35:00 - INFO - codeparrot_training - Step 9267: {'lr': 0.00047225178800373613, 'samples': 1779456, 'steps': 9267, 'loss/train': 1.1455726325511932} 01/27/2022 04:35:04 - INFO - codeparrot_training - Step 9268: {'lr': 0.00047224429526594296, 'samples': 1779648, 'steps': 9268, 'loss/train': 0.6957840621471405} 01/27/2022 04:35:07 - INFO - codeparrot_training - Step 9269: {'lr': 0.0004722368015761258, 'samples': 1779840, 'steps': 9269, 'loss/train': 1.323660135269165} 01/27/2022 04:35:10 - INFO - codeparrot_training - Step 9270: {'lr': 0.0004722293069343168, 'samples': 1780032, 'steps': 9270, 'loss/train': 1.944918930530548} 01/27/2022 04:35:13 - INFO - codeparrot_training - Step 9271: {'lr': 0.00047222181134054785, 'samples': 1780224, 'steps': 9271, 'loss/train': 0.42924754321575165} 01/27/2022 04:35:16 - INFO - codeparrot_training - Step 9272: {'lr': 0.0004722143147948513, 'samples': 1780416, 'steps': 9272, 'loss/train': 0.8443268537521362} 01/27/2022 04:35:19 - INFO - codeparrot_training - Step 9273: {'lr': 0.0004722068172972593, 'samples': 1780608, 'steps': 9273, 'loss/train': 1.1212426722049713} 01/27/2022 04:35:22 - INFO - codeparrot_training - Step 9274: {'lr': 0.00047219931884780376, 'samples': 1780800, 'steps': 9274, 'loss/train': 1.1039375066757202} 01/27/2022 04:35:26 - INFO - codeparrot_training - Step 9275: {'lr': 0.0004721918194465169, 'samples': 1780992, 'steps': 9275, 'loss/train': 1.0095669329166412} 01/27/2022 04:35:29 - INFO - codeparrot_training - Step 9276: {'lr': 0.00047218431909343083, 'samples': 1781184, 'steps': 9276, 'loss/train': 0.9529325366020203} 01/27/2022 04:35:33 - INFO - codeparrot_training - Step 9277: {'lr': 0.0004721768177885777, 'samples': 1781376, 'steps': 9277, 'loss/train': 0.9330330491065979} 01/27/2022 04:35:37 - INFO - codeparrot_training - Step 9278: {'lr': 0.00047216931553198963, 'samples': 1781568, 'steps': 9278, 'loss/train': 0.8912504017353058} 01/27/2022 04:35:40 - INFO - codeparrot_training - Step 9279: {'lr': 0.0004721618123236987, 'samples': 1781760, 'steps': 9279, 'loss/train': 1.0307781994342804} 01/27/2022 04:35:43 - INFO - codeparrot_training - Step 9280: {'lr': 0.0004721543081637372, 'samples': 1781952, 'steps': 9280, 'loss/train': 0.7400768548250198} 01/27/2022 04:35:46 - INFO - codeparrot_training - Step 9281: {'lr': 0.0004721468030521372, 'samples': 1782144, 'steps': 9281, 'loss/train': 0.9035755097866058} 01/27/2022 04:35:49 - INFO - codeparrot_training - Step 9282: {'lr': 0.0004721392969889308, 'samples': 1782336, 'steps': 9282, 'loss/train': 0.8525265455245972} 01/27/2022 04:35:52 - INFO - codeparrot_training - Step 9283: {'lr': 0.00047213178997415015, 'samples': 1782528, 'steps': 9283, 'loss/train': 0.6692837476730347} 01/27/2022 04:35:55 - INFO - codeparrot_training - Step 9284: {'lr': 0.00047212428200782744, 'samples': 1782720, 'steps': 9284, 'loss/train': 0.8061622381210327} 01/27/2022 04:36:00 - INFO - codeparrot_training - Step 9285: {'lr': 0.0004721167730899949, 'samples': 1782912, 'steps': 9285, 'loss/train': 1.0626409649848938} 01/27/2022 04:36:03 - INFO - codeparrot_training - Step 9286: {'lr': 0.0004721092632206846, 'samples': 1783104, 'steps': 9286, 'loss/train': 1.141641765832901} 01/27/2022 04:36:07 - INFO - codeparrot_training - Step 9287: {'lr': 0.00047210175239992876, 'samples': 1783296, 'steps': 9287, 
'loss/train': 0.7283767014741898} 01/27/2022 04:36:10 - INFO - codeparrot_training - Step 9288: {'lr': 0.0004720942406277595, 'samples': 1783488, 'steps': 9288, 'loss/train': 0.3877851516008377} 01/27/2022 04:36:13 - INFO - codeparrot_training - Step 9289: {'lr': 0.0004720867279042091, 'samples': 1783680, 'steps': 9289, 'loss/train': 1.3741280436515808} 01/27/2022 04:36:16 - INFO - codeparrot_training - Step 9290: {'lr': 0.00047207921422930967, 'samples': 1783872, 'steps': 9290, 'loss/train': 0.8113794922828674} 01/27/2022 04:36:19 - INFO - codeparrot_training - Step 9291: {'lr': 0.00047207169960309335, 'samples': 1784064, 'steps': 9291, 'loss/train': 0.57839535176754} 01/27/2022 04:36:22 - INFO - codeparrot_training - Step 9292: {'lr': 0.00047206418402559236, 'samples': 1784256, 'steps': 9292, 'loss/train': 0.5745449066162109} 01/27/2022 04:36:25 - INFO - codeparrot_training - Step 9293: {'lr': 0.000472056667496839, 'samples': 1784448, 'steps': 9293, 'loss/train': 0.8994670808315277} 01/27/2022 04:36:30 - INFO - codeparrot_training - Step 9294: {'lr': 0.0004720491500168654, 'samples': 1784640, 'steps': 9294, 'loss/train': 1.5170849561691284} 01/27/2022 04:36:33 - INFO - codeparrot_training - Step 9295: {'lr': 0.0004720416315857037, 'samples': 1784832, 'steps': 9295, 'loss/train': 0.7220356464385986} 01/27/2022 04:36:36 - INFO - codeparrot_training - Step 9296: {'lr': 0.00047203411220338615, 'samples': 1785024, 'steps': 9296, 'loss/train': 0.7900552153587341} 01/27/2022 04:36:39 - INFO - codeparrot_training - Step 9297: {'lr': 0.000472026591869945, 'samples': 1785216, 'steps': 9297, 'loss/train': 0.5315293818712234} 01/27/2022 04:36:42 - INFO - codeparrot_training - Step 9298: {'lr': 0.00047201907058541236, 'samples': 1785408, 'steps': 9298, 'loss/train': 0.9628465175628662} 01/27/2022 04:36:46 - INFO - codeparrot_training - Step 9299: {'lr': 0.0004720115483498206, 'samples': 1785600, 'steps': 9299, 'loss/train': 0.43994687497615814} 01/27/2022 04:36:49 - INFO - codeparrot_training - Step 9300: {'lr': 0.00047200402516320186, 'samples': 1785792, 'steps': 9300, 'loss/train': 0.6651338338851929} 01/27/2022 04:36:52 - INFO - codeparrot_training - Step 9301: {'lr': 0.00047199650102558834, 'samples': 1785984, 'steps': 9301, 'loss/train': 0.27905353903770447} 01/27/2022 04:36:56 - INFO - codeparrot_training - Step 9302: {'lr': 0.0004719889759370123, 'samples': 1786176, 'steps': 9302, 'loss/train': 0.730888769030571} 01/27/2022 04:36:59 - INFO - codeparrot_training - Step 9303: {'lr': 0.00047198144989750603, 'samples': 1786368, 'steps': 9303, 'loss/train': 1.1049546003341675} 01/27/2022 04:37:02 - INFO - codeparrot_training - Step 9304: {'lr': 0.00047197392290710164, 'samples': 1786560, 'steps': 9304, 'loss/train': 0.9428194463253021} 01/27/2022 04:37:06 - INFO - codeparrot_training - Step 9305: {'lr': 0.0004719663949658315, 'samples': 1786752, 'steps': 9305, 'loss/train': 0.9342278838157654} 01/27/2022 04:37:09 - INFO - codeparrot_training - Step 9306: {'lr': 0.00047195886607372773, 'samples': 1786944, 'steps': 9306, 'loss/train': 0.06917623803019524} 01/27/2022 04:37:12 - INFO - codeparrot_training - Step 9307: {'lr': 0.0004719513362308228, 'samples': 1787136, 'steps': 9307, 'loss/train': 0.5592367351055145} 01/27/2022 04:37:15 - INFO - codeparrot_training - Step 9308: {'lr': 0.0004719438054371487, 'samples': 1787328, 'steps': 9308, 'loss/train': 0.8913589417934418} 01/27/2022 04:37:18 - INFO - codeparrot_training - Step 9309: {'lr': 0.00047193627369273786, 'samples': 1787520, 'steps': 9309, 
'loss/train': 0.7610340714454651} 01/27/2022 04:37:21 - INFO - codeparrot_training - Step 9310: {'lr': 0.00047192874099762246, 'samples': 1787712, 'steps': 9310, 'loss/train': 0.5153814107179642} 01/27/2022 04:37:26 - INFO - codeparrot_training - Step 9311: {'lr': 0.00047192120735183485, 'samples': 1787904, 'steps': 9311, 'loss/train': 1.019675463438034} 01/27/2022 04:37:29 - INFO - codeparrot_training - Step 9312: {'lr': 0.0004719136727554072, 'samples': 1788096, 'steps': 9312, 'loss/train': 0.7877813279628754} 01/27/2022 04:37:32 - INFO - codeparrot_training - Step 9313: {'lr': 0.0004719061372083719, 'samples': 1788288, 'steps': 9313, 'loss/train': 0.788474053144455} 01/27/2022 04:37:36 - INFO - codeparrot_training - Step 9314: {'lr': 0.00047189860071076114, 'samples': 1788480, 'steps': 9314, 'loss/train': 1.020917683839798} 01/27/2022 04:37:39 - INFO - codeparrot_training - Step 9315: {'lr': 0.00047189106326260723, 'samples': 1788672, 'steps': 9315, 'loss/train': 1.6790285110473633} 01/27/2022 04:37:42 - INFO - codeparrot_training - Step 9316: {'lr': 0.0004718835248639425, 'samples': 1788864, 'steps': 9316, 'loss/train': 0.6297753006219864} 01/27/2022 04:37:45 - INFO - codeparrot_training - Step 9317: {'lr': 0.0004718759855147992, 'samples': 1789056, 'steps': 9317, 'loss/train': 0.9930145740509033} 01/27/2022 04:37:48 - INFO - codeparrot_training - Step 9318: {'lr': 0.00047186844521520955, 'samples': 1789248, 'steps': 9318, 'loss/train': 0.8952537775039673} 01/27/2022 04:37:51 - INFO - codeparrot_training - Step 9319: {'lr': 0.000471860903965206, 'samples': 1789440, 'steps': 9319, 'loss/train': 0.6520312875509262} 01/27/2022 04:37:58 - INFO - codeparrot_training - Step 9320: {'lr': 0.00047185336176482084, 'samples': 1789632, 'steps': 9320, 'loss/train': 0.8305051624774933} 01/27/2022 04:38:01 - INFO - codeparrot_training - Step 9321: {'lr': 0.0004718458186140863, 'samples': 1789824, 'steps': 9321, 'loss/train': 0.8516818284988403} 01/27/2022 04:38:04 - INFO - codeparrot_training - Step 9322: {'lr': 0.0004718382745130346, 'samples': 1790016, 'steps': 9322, 'loss/train': 1.0690981149673462} 01/27/2022 04:38:07 - INFO - codeparrot_training - Step 9323: {'lr': 0.0004718307294616983, 'samples': 1790208, 'steps': 9323, 'loss/train': 0.6708613783121109} 01/27/2022 04:38:11 - INFO - codeparrot_training - Step 9324: {'lr': 0.00047182318346010953, 'samples': 1790400, 'steps': 9324, 'loss/train': 1.5783737897872925} 01/27/2022 04:38:14 - INFO - codeparrot_training - Step 9325: {'lr': 0.0004718156365083007, 'samples': 1790592, 'steps': 9325, 'loss/train': 1.064488023519516} 01/27/2022 04:38:17 - INFO - codeparrot_training - Step 9326: {'lr': 0.0004718080886063041, 'samples': 1790784, 'steps': 9326, 'loss/train': 1.1327772438526154} 01/27/2022 04:38:20 - INFO - codeparrot_training - Step 9327: {'lr': 0.00047180053975415216, 'samples': 1790976, 'steps': 9327, 'loss/train': 0.7768111824989319} 01/27/2022 04:38:23 - INFO - codeparrot_training - Step 9328: {'lr': 0.00047179298995187705, 'samples': 1791168, 'steps': 9328, 'loss/train': 0.852427750825882} 01/27/2022 04:38:27 - INFO - codeparrot_training - Step 9329: {'lr': 0.00047178543919951124, 'samples': 1791360, 'steps': 9329, 'loss/train': 0.7541499137878418} 01/27/2022 04:38:31 - INFO - codeparrot_training - Step 9330: {'lr': 0.000471777887497087, 'samples': 1791552, 'steps': 9330, 'loss/train': 0.8823018372058868} 01/27/2022 04:38:34 - INFO - codeparrot_training - Step 9331: {'lr': 0.0004717703348446367, 'samples': 1791744, 'steps': 9331, 
'loss/train': 0.5074364840984344} 01/27/2022 04:38:37 - INFO - codeparrot_training - Step 9332: {'lr': 0.00047176278124219276, 'samples': 1791936, 'steps': 9332, 'loss/train': 0.3629629388451576} 01/27/2022 04:38:40 - INFO - codeparrot_training - Step 9333: {'lr': 0.0004717552266897874, 'samples': 1792128, 'steps': 9333, 'loss/train': 0.9841884076595306} 01/27/2022 04:38:43 - INFO - codeparrot_training - Step 9334: {'lr': 0.0004717476711874532, 'samples': 1792320, 'steps': 9334, 'loss/train': 0.9114029109477997} 01/27/2022 04:38:46 - INFO - codeparrot_training - Step 9335: {'lr': 0.00047174011473522225, 'samples': 1792512, 'steps': 9335, 'loss/train': 1.0649738609790802} 01/27/2022 04:38:49 - INFO - codeparrot_training - Step 9336: {'lr': 0.0004717325573331271, 'samples': 1792704, 'steps': 9336, 'loss/train': 0.6982036381959915} 01/27/2022 04:38:54 - INFO - codeparrot_training - Step 9337: {'lr': 0.00047172499898120014, 'samples': 1792896, 'steps': 9337, 'loss/train': 0.9462790489196777} 01/27/2022 04:38:57 - INFO - codeparrot_training - Step 9338: {'lr': 0.0004717174396794737, 'samples': 1793088, 'steps': 9338, 'loss/train': 0.8630395531654358} 01/27/2022 04:39:00 - INFO - codeparrot_training - Step 9339: {'lr': 0.00047170987942798004, 'samples': 1793280, 'steps': 9339, 'loss/train': 1.0319884121418} 01/27/2022 04:39:03 - INFO - codeparrot_training - Step 9340: {'lr': 0.0004717023182267518, 'samples': 1793472, 'steps': 9340, 'loss/train': 0.78631192445755} 01/27/2022 04:39:06 - INFO - codeparrot_training - Step 9341: {'lr': 0.00047169475607582113, 'samples': 1793664, 'steps': 9341, 'loss/train': 1.1103792786598206} 01/27/2022 04:39:10 - INFO - codeparrot_training - Step 9342: {'lr': 0.00047168719297522053, 'samples': 1793856, 'steps': 9342, 'loss/train': 0.7163410931825638} 01/27/2022 04:39:13 - INFO - codeparrot_training - Step 9343: {'lr': 0.0004716796289249824, 'samples': 1794048, 'steps': 9343, 'loss/train': 0.553520143032074} 01/27/2022 04:39:16 - INFO - codeparrot_training - Step 9344: {'lr': 0.0004716720639251392, 'samples': 1794240, 'steps': 9344, 'loss/train': 4.728183388710022} 01/27/2022 04:39:19 - INFO - codeparrot_training - Step 9345: {'lr': 0.00047166449797572316, 'samples': 1794432, 'steps': 9345, 'loss/train': 0.8656742870807648} 01/27/2022 04:39:25 - INFO - codeparrot_training - Step 9346: {'lr': 0.0004716569310767668, 'samples': 1794624, 'steps': 9346, 'loss/train': 0.6661222279071808} 01/27/2022 04:39:28 - INFO - codeparrot_training - Step 9347: {'lr': 0.00047164936322830256, 'samples': 1794816, 'steps': 9347, 'loss/train': 0.981398731470108} 01/27/2022 04:39:32 - INFO - codeparrot_training - Step 9348: {'lr': 0.0004716417944303628, 'samples': 1795008, 'steps': 9348, 'loss/train': 1.000332623720169} 01/27/2022 04:39:35 - INFO - codeparrot_training - Step 9349: {'lr': 0.00047163422468298003, 'samples': 1795200, 'steps': 9349, 'loss/train': 0.724285751581192} 01/27/2022 04:39:38 - INFO - codeparrot_training - Step 9350: {'lr': 0.00047162665398618666, 'samples': 1795392, 'steps': 9350, 'loss/train': 1.1749948561191559} 01/27/2022 04:39:41 - INFO - codeparrot_training - Step 9351: {'lr': 0.00047161908234001496, 'samples': 1795584, 'steps': 9351, 'loss/train': 1.1059048175811768} 01/27/2022 04:39:44 - INFO - codeparrot_training - Step 9352: {'lr': 0.0004716115097444975, 'samples': 1795776, 'steps': 9352, 'loss/train': 1.0643802881240845} 01/27/2022 04:39:47 - INFO - codeparrot_training - Step 9353: {'lr': 0.0004716039361996668, 'samples': 1795968, 'steps': 9353, 
'loss/train': 1.1324751377105713} 01/27/2022 04:39:51 - INFO - codeparrot_training - Step 9354: {'lr': 0.0004715963617055551, 'samples': 1796160, 'steps': 9354, 'loss/train': 0.9166973233222961} 01/27/2022 04:39:55 - INFO - codeparrot_training - Step 9355: {'lr': 0.00047158878626219505, 'samples': 1796352, 'steps': 9355, 'loss/train': 0.7288177907466888} 01/27/2022 04:39:58 - INFO - codeparrot_training - Step 9356: {'lr': 0.00047158120986961897, 'samples': 1796544, 'steps': 9356, 'loss/train': 1.0772902965545654} 01/27/2022 04:40:01 - INFO - codeparrot_training - Step 9357: {'lr': 0.0004715736325278593, 'samples': 1796736, 'steps': 9357, 'loss/train': 0.8662097454071045} 01/27/2022 04:40:05 - INFO - codeparrot_training - Step 9358: {'lr': 0.0004715660542369485, 'samples': 1796928, 'steps': 9358, 'loss/train': 0.7547895312309265} 01/27/2022 04:40:08 - INFO - codeparrot_training - Step 9359: {'lr': 0.0004715584749969192, 'samples': 1797120, 'steps': 9359, 'loss/train': 0.7932896018028259} 01/27/2022 04:40:11 - INFO - codeparrot_training - Step 9360: {'lr': 0.00047155089480780364, 'samples': 1797312, 'steps': 9360, 'loss/train': 0.6657851189374924} 01/27/2022 04:40:14 - INFO - codeparrot_training - Step 9361: {'lr': 0.0004715433136696345, 'samples': 1797504, 'steps': 9361, 'loss/train': 0.4871722161769867} 01/27/2022 04:40:17 - INFO - codeparrot_training - Step 9362: {'lr': 0.0004715357315824441, 'samples': 1797696, 'steps': 9362, 'loss/train': 0.5709322392940521} 01/27/2022 04:40:20 - INFO - codeparrot_training - Step 9363: {'lr': 0.00047152814854626494, 'samples': 1797888, 'steps': 9363, 'loss/train': 0.8243232071399689} 01/27/2022 04:40:25 - INFO - codeparrot_training - Step 9364: {'lr': 0.0004715205645611296, 'samples': 1798080, 'steps': 9364, 'loss/train': 1.3394184708595276} 01/27/2022 04:40:28 - INFO - codeparrot_training - Step 9365: {'lr': 0.00047151297962707054, 'samples': 1798272, 'steps': 9365, 'loss/train': 1.0047267079353333} 01/27/2022 04:40:31 - INFO - codeparrot_training - Step 9366: {'lr': 0.00047150539374412004, 'samples': 1798464, 'steps': 9366, 'loss/train': 0.8597422242164612} 01/27/2022 04:40:34 - INFO - codeparrot_training - Step 9367: {'lr': 0.0004714978069123109, 'samples': 1798656, 'steps': 9367, 'loss/train': 1.0611863136291504} 01/27/2022 04:40:37 - INFO - codeparrot_training - Step 9368: {'lr': 0.00047149021913167545, 'samples': 1798848, 'steps': 9368, 'loss/train': 1.0031478703022003} 01/27/2022 04:40:40 - INFO - codeparrot_training - Step 9369: {'lr': 0.00047148263040224626, 'samples': 1799040, 'steps': 9369, 'loss/train': 0.9719275832176208} 01/27/2022 04:40:44 - INFO - codeparrot_training - Step 9370: {'lr': 0.00047147504072405575, 'samples': 1799232, 'steps': 9370, 'loss/train': 1.068197876214981} 01/27/2022 04:40:47 - INFO - codeparrot_training - Step 9371: {'lr': 0.0004714674500971366, 'samples': 1799424, 'steps': 9371, 'loss/train': 1.0091654062271118} 01/27/2022 04:40:50 - INFO - codeparrot_training - Step 9372: {'lr': 0.00047145985852152115, 'samples': 1799616, 'steps': 9372, 'loss/train': 0.6097289621829987} 01/27/2022 04:40:56 - INFO - codeparrot_training - Step 9373: {'lr': 0.000471452265997242, 'samples': 1799808, 'steps': 9373, 'loss/train': 0.7352131605148315} 01/27/2022 04:40:59 - INFO - codeparrot_training - Step 9374: {'lr': 0.00047144467252433164, 'samples': 1800000, 'steps': 9374, 'loss/train': 1.0272563695907593} 01/27/2022 04:41:02 - INFO - codeparrot_training - Step 9375: {'lr': 0.00047143707810282266, 'samples': 1800192, 'steps': 9375, 
'loss/train': 0.6932017654180527} 01/27/2022 04:41:06 - INFO - codeparrot_training - Step 9376: {'lr': 0.0004714294827327475, 'samples': 1800384, 'steps': 9376, 'loss/train': 0.2818893939256668} 01/27/2022 04:41:09 - INFO - codeparrot_training - Step 9377: {'lr': 0.00047142188641413873, 'samples': 1800576, 'steps': 9377, 'loss/train': 0.7609494030475616} 01/27/2022 04:41:12 - INFO - codeparrot_training - Step 9378: {'lr': 0.000471414289147029, 'samples': 1800768, 'steps': 9378, 'loss/train': 0.5689816027879715} 01/27/2022 04:41:15 - INFO - codeparrot_training - Step 9379: {'lr': 0.00047140669093145073, 'samples': 1800960, 'steps': 9379, 'loss/train': 0.9180544316768646} 01/27/2022 04:41:18 - INFO - codeparrot_training - Step 9380: {'lr': 0.00047139909176743643, 'samples': 1801152, 'steps': 9380, 'loss/train': 0.44206103682518005} 01/27/2022 04:41:21 - INFO - codeparrot_training - Step 9381: {'lr': 0.0004713914916550188, 'samples': 1801344, 'steps': 9381, 'loss/train': 0.9794307947158813} 01/27/2022 04:41:26 - INFO - codeparrot_training - Step 9382: {'lr': 0.00047138389059423033, 'samples': 1801536, 'steps': 9382, 'loss/train': 5.107281804084778} 01/27/2022 04:41:30 - INFO - codeparrot_training - Step 9383: {'lr': 0.0004713762885851035, 'samples': 1801728, 'steps': 9383, 'loss/train': 3.028761148452759} 01/27/2022 04:41:33 - INFO - codeparrot_training - Step 9384: {'lr': 0.000471368685627671, 'samples': 1801920, 'steps': 9384, 'loss/train': 0.7845144867897034} 01/27/2022 04:41:36 - INFO - codeparrot_training - Step 9385: {'lr': 0.00047136108172196535, 'samples': 1802112, 'steps': 9385, 'loss/train': 1.1602222323417664} 01/27/2022 04:41:39 - INFO - codeparrot_training - Step 9386: {'lr': 0.00047135347686801907, 'samples': 1802304, 'steps': 9386, 'loss/train': 0.6112133413553238} 01/27/2022 04:41:42 - INFO - codeparrot_training - Step 9387: {'lr': 0.0004713458710658648, 'samples': 1802496, 'steps': 9387, 'loss/train': 1.2865109145641327} 01/27/2022 04:41:45 - INFO - codeparrot_training - Step 9388: {'lr': 0.0004713382643155351, 'samples': 1802688, 'steps': 9388, 'loss/train': 0.9765339195728302} 01/27/2022 04:41:48 - INFO - codeparrot_training - Step 9389: {'lr': 0.00047133065661706254, 'samples': 1802880, 'steps': 9389, 'loss/train': 0.6315789967775345} 01/27/2022 04:41:51 - INFO - codeparrot_training - Step 9390: {'lr': 0.00047132304797047975, 'samples': 1803072, 'steps': 9390, 'loss/train': 1.0928944945335388} 01/27/2022 04:41:58 - INFO - codeparrot_training - Step 9391: {'lr': 0.00047131543837581935, 'samples': 1803264, 'steps': 9391, 'loss/train': 1.2657219171524048} 01/27/2022 04:42:01 - INFO - codeparrot_training - Step 9392: {'lr': 0.0004713078278331138, 'samples': 1803456, 'steps': 9392, 'loss/train': 1.0948424935340881} 01/27/2022 04:42:04 - INFO - codeparrot_training - Step 9393: {'lr': 0.00047130021634239584, 'samples': 1803648, 'steps': 9393, 'loss/train': 0.845026820898056} 01/27/2022 04:42:07 - INFO - codeparrot_training - Step 9394: {'lr': 0.000471292603903698, 'samples': 1803840, 'steps': 9394, 'loss/train': 0.7686464488506317} 01/27/2022 04:42:10 - INFO - codeparrot_training - Step 9395: {'lr': 0.00047128499051705296, 'samples': 1804032, 'steps': 9395, 'loss/train': 0.9642145335674286} 01/27/2022 04:42:13 - INFO - codeparrot_training - Step 9396: {'lr': 0.00047127737618249323, 'samples': 1804224, 'steps': 9396, 'loss/train': 0.8665502071380615} 01/27/2022 04:42:16 - INFO - codeparrot_training - Step 9397: {'lr': 0.00047126976090005153, 'samples': 1804416, 'steps': 9397, 
'loss/train': 0.7523791193962097} 01/27/2022 04:42:20 - INFO - codeparrot_training - Step 9398: {'lr': 0.00047126214466976034, 'samples': 1804608, 'steps': 9398, 'loss/train': 0.9327680468559265} 01/27/2022 04:42:23 - INFO - codeparrot_training - Step 9399: {'lr': 0.0004712545274916525, 'samples': 1804800, 'steps': 9399, 'loss/train': 0.9255942106246948} 01/27/2022 04:42:27 - INFO - codeparrot_training - Step 9400: {'lr': 0.00047124690936576046, 'samples': 1804992, 'steps': 9400, 'loss/train': 0.7945279777050018} 01/27/2022 04:42:30 - INFO - codeparrot_training - Step 9401: {'lr': 0.000471239290292117, 'samples': 1805184, 'steps': 9401, 'loss/train': 0.6648336052894592} 01/27/2022 04:42:34 - INFO - codeparrot_training - Step 9402: {'lr': 0.00047123167027075455, 'samples': 1805376, 'steps': 9402, 'loss/train': 1.012950360774994} 01/27/2022 04:42:37 - INFO - codeparrot_training - Step 9403: {'lr': 0.0004712240493017059, 'samples': 1805568, 'steps': 9403, 'loss/train': 0.9329683184623718} 01/27/2022 04:42:40 - INFO - codeparrot_training - Step 9404: {'lr': 0.0004712164273850037, 'samples': 1805760, 'steps': 9404, 'loss/train': 0.7681244909763336} 01/27/2022 04:42:43 - INFO - codeparrot_training - Step 9405: {'lr': 0.0004712088045206806, 'samples': 1805952, 'steps': 9405, 'loss/train': 1.7930266857147217} 01/27/2022 04:42:46 - INFO - codeparrot_training - Step 9406: {'lr': 0.00047120118070876916, 'samples': 1806144, 'steps': 9406, 'loss/train': 0.4487888216972351} 01/27/2022 04:42:49 - INFO - codeparrot_training - Step 9407: {'lr': 0.0004711935559493021, 'samples': 1806336, 'steps': 9407, 'loss/train': 1.3751649856567383} 01/27/2022 04:42:53 - INFO - codeparrot_training - Step 9408: {'lr': 0.00047118593024231216, 'samples': 1806528, 'steps': 9408, 'loss/train': 1.019029676914215} 01/27/2022 04:42:57 - INFO - codeparrot_training - Step 9409: {'lr': 0.00047117830358783184, 'samples': 1806720, 'steps': 9409, 'loss/train': 0.6385756283998489} 01/27/2022 04:43:00 - INFO - codeparrot_training - Step 9410: {'lr': 0.0004711706759858939, 'samples': 1806912, 'steps': 9410, 'loss/train': 0.7962217926979065} 01/27/2022 04:43:03 - INFO - codeparrot_training - Step 9411: {'lr': 0.0004711630474365311, 'samples': 1807104, 'steps': 9411, 'loss/train': 1.3003291189670563} 01/27/2022 04:43:06 - INFO - codeparrot_training - Step 9412: {'lr': 0.000471155417939776, 'samples': 1807296, 'steps': 9412, 'loss/train': 0.8367109894752502} 01/27/2022 04:43:09 - INFO - codeparrot_training - Step 9413: {'lr': 0.00047114778749566123, 'samples': 1807488, 'steps': 9413, 'loss/train': 1.2914106845855713} 01/27/2022 04:43:12 - INFO - codeparrot_training - Step 9414: {'lr': 0.00047114015610421966, 'samples': 1807680, 'steps': 9414, 'loss/train': 0.6574753224849701} 01/27/2022 04:43:15 - INFO - codeparrot_training - Step 9415: {'lr': 0.00047113252376548387, 'samples': 1807872, 'steps': 9415, 'loss/train': 1.6462247371673584} 01/27/2022 04:43:19 - INFO - codeparrot_training - Step 9416: {'lr': 0.00047112489047948655, 'samples': 1808064, 'steps': 9416, 'loss/train': 0.6705807745456696} 01/27/2022 04:43:25 - INFO - codeparrot_training - Step 9417: {'lr': 0.0004711172562462604, 'samples': 1808256, 'steps': 9417, 'loss/train': 7.009225130081177} 01/27/2022 04:43:28 - INFO - codeparrot_training - Step 9418: {'lr': 0.0004711096210658381, 'samples': 1808448, 'steps': 9418, 'loss/train': 0.8210842609405518} 01/27/2022 04:43:31 - INFO - codeparrot_training - Step 9419: {'lr': 0.0004711019849382525, 'samples': 1808640, 'steps': 9419, 
'loss/train': 1.5158146619796753} 01/27/2022 04:43:34 - INFO - codeparrot_training - Step 9420: {'lr': 0.0004710943478635361, 'samples': 1808832, 'steps': 9420, 'loss/train': 0.9711583256721497} 01/27/2022 04:43:37 - INFO - codeparrot_training - Step 9421: {'lr': 0.00047108670984172176, 'samples': 1809024, 'steps': 9421, 'loss/train': 1.01883003115654} 01/27/2022 04:43:41 - INFO - codeparrot_training - Step 9422: {'lr': 0.00047107907087284216, 'samples': 1809216, 'steps': 9422, 'loss/train': 1.0916169583797455} 01/27/2022 04:43:44 - INFO - codeparrot_training - Step 9423: {'lr': 0.00047107143095693007, 'samples': 1809408, 'steps': 9423, 'loss/train': 0.9209024906158447} 01/27/2022 04:43:47 - INFO - codeparrot_training - Step 9424: {'lr': 0.0004710637900940181, 'samples': 1809600, 'steps': 9424, 'loss/train': 1.0330053269863129} 01/27/2022 04:43:50 - INFO - codeparrot_training - Step 9425: {'lr': 0.00047105614828413906, 'samples': 1809792, 'steps': 9425, 'loss/train': 1.0663224756717682} 01/27/2022 04:43:54 - INFO - codeparrot_training - Step 9426: {'lr': 0.0004710485055273257, 'samples': 1809984, 'steps': 9426, 'loss/train': 0.7957240641117096} 01/27/2022 04:43:58 - INFO - codeparrot_training - Step 9427: {'lr': 0.00047104086182361073, 'samples': 1810176, 'steps': 9427, 'loss/train': 1.991298258304596} 01/27/2022 04:44:01 - INFO - codeparrot_training - Step 9428: {'lr': 0.00047103321717302684, 'samples': 1810368, 'steps': 9428, 'loss/train': 1.068750947713852} 01/27/2022 04:44:04 - INFO - codeparrot_training - Step 9429: {'lr': 0.00047102557157560686, 'samples': 1810560, 'steps': 9429, 'loss/train': 0.5564437508583069} 01/27/2022 04:44:07 - INFO - codeparrot_training - Step 9430: {'lr': 0.00047101792503138353, 'samples': 1810752, 'steps': 9430, 'loss/train': 0.8574271202087402} 01/27/2022 04:44:10 - INFO - codeparrot_training - Step 9431: {'lr': 0.0004710102775403896, 'samples': 1810944, 'steps': 9431, 'loss/train': 1.1207642555236816} 01/27/2022 04:44:13 - INFO - codeparrot_training - Step 9432: {'lr': 0.00047100262910265787, 'samples': 1811136, 'steps': 9432, 'loss/train': 0.6371132880449295} 01/27/2022 04:44:16 - INFO - codeparrot_training - Step 9433: {'lr': 0.00047099497971822096, 'samples': 1811328, 'steps': 9433, 'loss/train': 1.3225749135017395} 01/27/2022 04:44:20 - INFO - codeparrot_training - Step 9434: {'lr': 0.00047098732938711174, 'samples': 1811520, 'steps': 9434, 'loss/train': 0.8840703070163727} 01/27/2022 04:44:24 - INFO - codeparrot_training - Step 9435: {'lr': 0.00047097967810936305, 'samples': 1811712, 'steps': 9435, 'loss/train': 1.1077390015125275} 01/27/2022 04:44:27 - INFO - codeparrot_training - Step 9436: {'lr': 0.00047097202588500747, 'samples': 1811904, 'steps': 9436, 'loss/train': 0.9024711549282074} 01/27/2022 04:44:30 - INFO - codeparrot_training - Step 9437: {'lr': 0.000470964372714078, 'samples': 1812096, 'steps': 9437, 'loss/train': 1.4691778421401978} 01/27/2022 04:44:33 - INFO - codeparrot_training - Step 9438: {'lr': 0.00047095671859660726, 'samples': 1812288, 'steps': 9438, 'loss/train': 0.42736999690532684} 01/27/2022 04:44:37 - INFO - codeparrot_training - Step 9439: {'lr': 0.0004709490635326281, 'samples': 1812480, 'steps': 9439, 'loss/train': 1.0597228109836578} 01/27/2022 04:44:40 - INFO - codeparrot_training - Step 9440: {'lr': 0.0004709414075221734, 'samples': 1812672, 'steps': 9440, 'loss/train': 0.7045184075832367} 01/27/2022 04:44:43 - INFO - codeparrot_training - Step 9441: {'lr': 0.00047093375056527577, 'samples': 1812864, 'steps': 9441, 
'loss/train': 0.9418537616729736} 01/27/2022 04:44:46 - INFO - codeparrot_training - Step 9442: {'lr': 0.0004709260926619682, 'samples': 1813056, 'steps': 9442, 'loss/train': 0.7466558218002319} 01/27/2022 04:44:49 - INFO - codeparrot_training - Step 9443: {'lr': 0.00047091843381228326, 'samples': 1813248, 'steps': 9443, 'loss/train': 1.0928925275802612} 01/27/2022 04:44:54 - INFO - codeparrot_training - Step 9444: {'lr': 0.000470910774016254, 'samples': 1813440, 'steps': 9444, 'loss/train': 0.532623365521431} 01/27/2022 04:44:57 - INFO - codeparrot_training - Step 9445: {'lr': 0.0004709031132739131, 'samples': 1813632, 'steps': 9445, 'loss/train': 0.841397613286972} 01/27/2022 04:45:00 - INFO - codeparrot_training - Step 9446: {'lr': 0.0004708954515852934, 'samples': 1813824, 'steps': 9446, 'loss/train': 1.3961626589298248} 01/27/2022 04:45:03 - INFO - codeparrot_training - Step 9447: {'lr': 0.00047088778895042774, 'samples': 1814016, 'steps': 9447, 'loss/train': 2.0935816168785095} 01/27/2022 04:45:07 - INFO - codeparrot_training - Step 9448: {'lr': 0.000470880125369349, 'samples': 1814208, 'steps': 9448, 'loss/train': 0.4226476103067398} 01/27/2022 04:45:10 - INFO - codeparrot_training - Step 9449: {'lr': 0.0004708724608420898, 'samples': 1814400, 'steps': 9449, 'loss/train': 0.27083753049373627} 01/27/2022 04:45:13 - INFO - codeparrot_training - Step 9450: {'lr': 0.0004708647953686832, 'samples': 1814592, 'steps': 9450, 'loss/train': 0.8250689506530762} 01/27/2022 04:45:16 - INFO - codeparrot_training - Step 9451: {'lr': 0.000470857128949162, 'samples': 1814784, 'steps': 9451, 'loss/train': 0.8374244570732117} 01/27/2022 04:45:19 - INFO - codeparrot_training - Step 9452: {'lr': 0.0004708494615835589, 'samples': 1814976, 'steps': 9452, 'loss/train': 0.8226969838142395} 01/27/2022 04:45:25 - INFO - codeparrot_training - Step 9453: {'lr': 0.0004708417932719068, 'samples': 1815168, 'steps': 9453, 'loss/train': 0.9535369277000427} 01/27/2022 04:45:28 - INFO - codeparrot_training - Step 9454: {'lr': 0.0004708341240142387, 'samples': 1815360, 'steps': 9454, 'loss/train': 1.0173006355762482} 01/27/2022 04:45:32 - INFO - codeparrot_training - Step 9455: {'lr': 0.0004708264538105873, 'samples': 1815552, 'steps': 9455, 'loss/train': 1.0622149407863617} 01/27/2022 04:45:35 - INFO - codeparrot_training - Step 9456: {'lr': 0.0004708187826609854, 'samples': 1815744, 'steps': 9456, 'loss/train': 0.9366731643676758} 01/27/2022 04:45:38 - INFO - codeparrot_training - Step 9457: {'lr': 0.0004708111105654661, 'samples': 1815936, 'steps': 9457, 'loss/train': 1.4406043589115143} 01/27/2022 04:45:41 - INFO - codeparrot_training - Step 9458: {'lr': 0.000470803437524062, 'samples': 1816128, 'steps': 9458, 'loss/train': 1.0022457540035248} 01/27/2022 04:45:44 - INFO - codeparrot_training - Step 9459: {'lr': 0.00047079576353680614, 'samples': 1816320, 'steps': 9459, 'loss/train': 0.8859618008136749} 01/27/2022 04:45:47 - INFO - codeparrot_training - Step 9460: {'lr': 0.0004707880886037314, 'samples': 1816512, 'steps': 9460, 'loss/train': 0.3549385517835617} 01/27/2022 04:45:50 - INFO - codeparrot_training - Step 9461: {'lr': 0.00047078041272487046, 'samples': 1816704, 'steps': 9461, 'loss/train': 1.0274795293807983} 01/27/2022 04:45:55 - INFO - codeparrot_training - Step 9462: {'lr': 0.00047077273590025637, 'samples': 1816896, 'steps': 9462, 'loss/train': 1.0101121366024017} 01/27/2022 04:45:58 - INFO - codeparrot_training - Step 9463: {'lr': 0.00047076505812992204, 'samples': 1817088, 'steps': 9463, 
'loss/train': 0.6344990283250809} 01/27/2022 04:46:01 - INFO - codeparrot_training - Step 9464: {'lr': 0.0004707573794139003, 'samples': 1817280, 'steps': 9464, 'loss/train': 0.9592462778091431} 01/27/2022 04:46:04 - INFO - codeparrot_training - Step 9465: {'lr': 0.00047074969975222406, 'samples': 1817472, 'steps': 9465, 'loss/train': 0.4593121111392975} 01/27/2022 04:46:07 - INFO - codeparrot_training - Step 9466: {'lr': 0.0004707420191449261, 'samples': 1817664, 'steps': 9466, 'loss/train': 0.7019084841012955} 01/27/2022 04:46:11 - INFO - codeparrot_training - Step 9467: {'lr': 0.0004707343375920395, 'samples': 1817856, 'steps': 9467, 'loss/train': 0.48382651805877686} 01/27/2022 04:46:14 - INFO - codeparrot_training - Step 9468: {'lr': 0.0004707266550935971, 'samples': 1818048, 'steps': 9468, 'loss/train': 0.7188005447387695} 01/27/2022 04:46:17 - INFO - codeparrot_training - Step 9469: {'lr': 0.00047071897164963175, 'samples': 1818240, 'steps': 9469, 'loss/train': 0.8606115281581879} 01/27/2022 04:46:23 - INFO - codeparrot_training - Step 9470: {'lr': 0.00047071128726017643, 'samples': 1818432, 'steps': 9470, 'loss/train': 1.1879061162471771} 01/27/2022 04:46:26 - INFO - codeparrot_training - Step 9471: {'lr': 0.0004707036019252641, 'samples': 1818624, 'steps': 9471, 'loss/train': 0.6717607229948044} 01/27/2022 04:46:29 - INFO - codeparrot_training - Step 9472: {'lr': 0.00047069591564492753, 'samples': 1818816, 'steps': 9472, 'loss/train': 0.37901821732521057} 01/27/2022 04:46:32 - INFO - codeparrot_training - Step 9473: {'lr': 0.00047068822841919976, 'samples': 1819008, 'steps': 9473, 'loss/train': 1.1477887630462646} 01/27/2022 04:46:36 - INFO - codeparrot_training - Step 9474: {'lr': 0.0004706805402481137, 'samples': 1819200, 'steps': 9474, 'loss/train': 0.977578192949295} 01/27/2022 04:46:39 - INFO - codeparrot_training - Step 9475: {'lr': 0.00047067285113170233, 'samples': 1819392, 'steps': 9475, 'loss/train': 1.282191663980484} 01/27/2022 04:46:42 - INFO - codeparrot_training - Step 9476: {'lr': 0.0004706651610699985, 'samples': 1819584, 'steps': 9476, 'loss/train': 0.8346131443977356} 01/27/2022 04:46:45 - INFO - codeparrot_training - Step 9477: {'lr': 0.0004706574700630352, 'samples': 1819776, 'steps': 9477, 'loss/train': 0.83189857006073} 01/27/2022 04:46:48 - INFO - codeparrot_training - Step 9478: {'lr': 0.0004706497781108453, 'samples': 1819968, 'steps': 9478, 'loss/train': 0.9841245710849762} 01/27/2022 04:46:53 - INFO - codeparrot_training - Step 9479: {'lr': 0.00047064208521346184, 'samples': 1820160, 'steps': 9479, 'loss/train': 0.4196517616510391} 01/27/2022 04:46:56 - INFO - codeparrot_training - Step 9480: {'lr': 0.0004706343913709178, 'samples': 1820352, 'steps': 9480, 'loss/train': 0.9059531986713409} 01/27/2022 04:46:59 - INFO - codeparrot_training - Step 9481: {'lr': 0.0004706266965832461, 'samples': 1820544, 'steps': 9481, 'loss/train': 0.48698459565639496} 01/27/2022 04:47:02 - INFO - codeparrot_training - Step 9482: {'lr': 0.0004706190008504796, 'samples': 1820736, 'steps': 9482, 'loss/train': 0.9834691286087036} 01/27/2022 04:47:05 - INFO - codeparrot_training - Step 9483: {'lr': 0.00047061130417265143, 'samples': 1820928, 'steps': 9483, 'loss/train': 0.6457389146089554} 01/27/2022 04:47:08 - INFO - codeparrot_training - Step 9484: {'lr': 0.0004706036065497944, 'samples': 1821120, 'steps': 9484, 'loss/train': 1.0713378489017487} 01/27/2022 04:47:11 - INFO - codeparrot_training - Step 9485: {'lr': 0.0004705959079819416, 'samples': 1821312, 'steps': 9485, 
'loss/train': 0.8020868003368378} 01/27/2022 04:47:15 - INFO - codeparrot_training - Step 9486: {'lr': 0.0004705882084691261, 'samples': 1821504, 'steps': 9486, 'loss/train': 0.9969196915626526} 01/27/2022 04:47:18 - INFO - codeparrot_training - Step 9487: {'lr': 0.00047058050801138064, 'samples': 1821696, 'steps': 9487, 'loss/train': 0.46651889383792877} 01/27/2022 04:47:22 - INFO - codeparrot_training - Step 9488: {'lr': 0.00047057280660873835, 'samples': 1821888, 'steps': 9488, 'loss/train': 0.7506610751152039} 01/27/2022 04:47:25 - INFO - codeparrot_training - Step 9489: {'lr': 0.0004705651042612322, 'samples': 1822080, 'steps': 9489, 'loss/train': 0.6323945671319962} 01/27/2022 04:47:28 - INFO - codeparrot_training - Step 9490: {'lr': 0.00047055740096889516, 'samples': 1822272, 'steps': 9490, 'loss/train': 0.9002503752708435} 01/27/2022 04:47:32 - INFO - codeparrot_training - Step 9491: {'lr': 0.0004705496967317603, 'samples': 1822464, 'steps': 9491, 'loss/train': 0.5571229755878448} 01/27/2022 04:47:35 - INFO - codeparrot_training - Step 9492: {'lr': 0.0004705419915498605, 'samples': 1822656, 'steps': 9492, 'loss/train': 0.6723111122846603} 01/27/2022 04:47:38 - INFO - codeparrot_training - Step 9493: {'lr': 0.0004705342854232288, 'samples': 1822848, 'steps': 9493, 'loss/train': 0.5974967926740646} 01/27/2022 04:47:41 - INFO - codeparrot_training - Step 9494: {'lr': 0.00047052657835189836, 'samples': 1823040, 'steps': 9494, 'loss/train': 0.7160481959581375} 01/27/2022 04:47:44 - INFO - codeparrot_training - Step 9495: {'lr': 0.00047051887033590205, 'samples': 1823232, 'steps': 9495, 'loss/train': 0.6080731451511383} 01/27/2022 04:47:47 - INFO - codeparrot_training - Step 9496: {'lr': 0.00047051116137527296, 'samples': 1823424, 'steps': 9496, 'loss/train': 1.0913041234016418} 01/27/2022 04:47:54 - INFO - codeparrot_training - Step 9497: {'lr': 0.000470503451470044, 'samples': 1823616, 'steps': 9497, 'loss/train': 1.0628184378147125} 01/27/2022 04:47:57 - INFO - codeparrot_training - Step 9498: {'lr': 0.00047049574062024837, 'samples': 1823808, 'steps': 9498, 'loss/train': 0.7879559397697449} 01/27/2022 04:48:00 - INFO - codeparrot_training - Step 9499: {'lr': 0.0004704880288259189, 'samples': 1824000, 'steps': 9499, 'loss/train': 1.2022849023342133} 01/27/2022 04:48:03 - INFO - codeparrot_training - Step 9500: {'lr': 0.00047048031608708875, 'samples': 1824192, 'steps': 9500, 'loss/train': 0.8644236624240875} 01/27/2022 04:48:06 - INFO - codeparrot_training - Step 9501: {'lr': 0.00047047260240379096, 'samples': 1824384, 'steps': 9501, 'loss/train': 0.31496165692806244} 01/27/2022 04:48:09 - INFO - codeparrot_training - Step 9502: {'lr': 0.00047046488777605853, 'samples': 1824576, 'steps': 9502, 'loss/train': 1.1743095517158508} 01/27/2022 04:48:12 - INFO - codeparrot_training - Step 9503: {'lr': 0.0004704571722039246, 'samples': 1824768, 'steps': 9503, 'loss/train': 0.6576171219348907} 01/27/2022 04:48:16 - INFO - codeparrot_training - Step 9504: {'lr': 0.00047044945568742205, 'samples': 1824960, 'steps': 9504, 'loss/train': 0.9184885919094086} 01/27/2022 04:48:19 - INFO - codeparrot_training - Step 9505: {'lr': 0.0004704417382265841, 'samples': 1825152, 'steps': 9505, 'loss/train': 0.7361412048339844} 01/27/2022 04:48:22 - INFO - codeparrot_training - Step 9506: {'lr': 0.0004704340198214437, 'samples': 1825344, 'steps': 9506, 'loss/train': 0.615703359246254} 01/27/2022 04:48:27 - INFO - codeparrot_training - Step 9507: {'lr': 0.00047042630047203394, 'samples': 1825536, 'steps': 9507, 
'loss/train': 0.9414246082305908} 01/27/2022 04:48:30 - INFO - codeparrot_training - Step 9508: {'lr': 0.0004704185801783879, 'samples': 1825728, 'steps': 9508, 'loss/train': 0.7898639738559723} 01/27/2022 04:48:33 - INFO - codeparrot_training - Step 9509: {'lr': 0.0004704108589405387, 'samples': 1825920, 'steps': 9509, 'loss/train': 1.0108325779438019} 01/27/2022 04:48:36 - INFO - codeparrot_training - Step 9510: {'lr': 0.0004704031367585193, 'samples': 1826112, 'steps': 9510, 'loss/train': 0.7602777779102325} 01/27/2022 04:48:39 - INFO - codeparrot_training - Step 9511: {'lr': 0.0004703954136323629, 'samples': 1826304, 'steps': 9511, 'loss/train': 0.4180622398853302} 01/27/2022 04:48:42 - INFO - codeparrot_training - Step 9512: {'lr': 0.0004703876895621025, 'samples': 1826496, 'steps': 9512, 'loss/train': 0.7528340220451355} 01/27/2022 04:48:45 - INFO - codeparrot_training - Step 9513: {'lr': 0.00047037996454777134, 'samples': 1826688, 'steps': 9513, 'loss/train': 0.9663788974285126} 01/27/2022 04:48:49 - INFO - codeparrot_training - Step 9514: {'lr': 0.00047037223858940224, 'samples': 1826880, 'steps': 9514, 'loss/train': 0.8450997769832611} 01/27/2022 04:48:55 - INFO - codeparrot_training - Step 9515: {'lr': 0.00047036451168702855, 'samples': 1827072, 'steps': 9515, 'loss/train': 0.5654204785823822} 01/27/2022 04:48:58 - INFO - codeparrot_training - Step 9516: {'lr': 0.0004703567838406832, 'samples': 1827264, 'steps': 9516, 'loss/train': 0.49653491377830505} 01/27/2022 04:49:01 - INFO - codeparrot_training - Step 9517: {'lr': 0.00047034905505039936, 'samples': 1827456, 'steps': 9517, 'loss/train': 0.5197928845882416} 01/27/2022 04:49:04 - INFO - codeparrot_training - Step 9518: {'lr': 0.0004703413253162102, 'samples': 1827648, 'steps': 9518, 'loss/train': 1.2045795321464539} 01/27/2022 04:49:07 - INFO - codeparrot_training - Step 9519: {'lr': 0.00047033359463814875, 'samples': 1827840, 'steps': 9519, 'loss/train': 0.7318852543830872} 01/27/2022 04:49:11 - INFO - codeparrot_training - Step 9520: {'lr': 0.00047032586301624804, 'samples': 1828032, 'steps': 9520, 'loss/train': 0.9802179336547852} 01/27/2022 04:49:14 - INFO - codeparrot_training - Step 9521: {'lr': 0.0004703181304505414, 'samples': 1828224, 'steps': 9521, 'loss/train': 0.43129321932792664} 01/27/2022 04:49:17 - INFO - codeparrot_training - Step 9522: {'lr': 0.0004703103969410618, 'samples': 1828416, 'steps': 9522, 'loss/train': 0.669066846370697} 01/27/2022 04:49:20 - INFO - codeparrot_training - Step 9523: {'lr': 0.0004703026624878425, 'samples': 1828608, 'steps': 9523, 'loss/train': 1.2528586685657501} 01/27/2022 04:49:25 - INFO - codeparrot_training - Step 9524: {'lr': 0.0004702949270909164, 'samples': 1828800, 'steps': 9524, 'loss/train': 0.9926972687244415} 01/27/2022 04:49:29 - INFO - codeparrot_training - Step 9525: {'lr': 0.0004702871907503169, 'samples': 1828992, 'steps': 9525, 'loss/train': 0.7655785381793976} 01/27/2022 04:49:32 - INFO - codeparrot_training - Step 9526: {'lr': 0.000470279453466077, 'samples': 1829184, 'steps': 9526, 'loss/train': 0.8630818426609039} 01/27/2022 04:49:35 - INFO - codeparrot_training - Step 9527: {'lr': 0.0004702717152382299, 'samples': 1829376, 'steps': 9527, 'loss/train': 0.42750442028045654} 01/27/2022 04:49:38 - INFO - codeparrot_training - Step 9528: {'lr': 0.0004702639760668086, 'samples': 1829568, 'steps': 9528, 'loss/train': 0.8590148091316223} 01/27/2022 04:49:41 - INFO - codeparrot_training - Step 9529: {'lr': 0.00047025623595184645, 'samples': 1829760, 'steps': 9529, 
'loss/train': 0.33374742418527603} 01/27/2022 04:49:44 - INFO - codeparrot_training - Step 9530: {'lr': 0.0004702484948933765, 'samples': 1829952, 'steps': 9530, 'loss/train': 1.7161847949028015} 01/27/2022 04:49:47 - INFO - codeparrot_training - Step 9531: {'lr': 0.000470240752891432, 'samples': 1830144, 'steps': 9531, 'loss/train': 1.6842230558395386} 01/27/2022 04:49:51 - INFO - codeparrot_training - Step 9532: {'lr': 0.000470233009946046, 'samples': 1830336, 'steps': 9532, 'loss/train': 0.8651960492134094} 01/27/2022 04:49:54 - INFO - codeparrot_training - Step 9533: {'lr': 0.0004702252660572517, 'samples': 1830528, 'steps': 9533, 'loss/train': 1.0812988579273224} 01/27/2022 04:49:58 - INFO - codeparrot_training - Step 9534: {'lr': 0.00047021752122508234, 'samples': 1830720, 'steps': 9534, 'loss/train': 0.2110266163945198} 01/27/2022 04:50:01 - INFO - codeparrot_training - Step 9535: {'lr': 0.000470209775449571, 'samples': 1830912, 'steps': 9535, 'loss/train': 0.4942733645439148} 01/27/2022 04:50:04 - INFO - codeparrot_training - Step 9536: {'lr': 0.00047020202873075093, 'samples': 1831104, 'steps': 9536, 'loss/train': 0.6741926372051239} 01/27/2022 04:50:08 - INFO - codeparrot_training - Step 9537: {'lr': 0.0004701942810686552, 'samples': 1831296, 'steps': 9537, 'loss/train': 0.8099030256271362} 01/27/2022 04:50:11 - INFO - codeparrot_training - Step 9538: {'lr': 0.00047018653246331724, 'samples': 1831488, 'steps': 9538, 'loss/train': 0.8043084740638733} 01/27/2022 04:50:14 - INFO - codeparrot_training - Step 9539: {'lr': 0.00047017878291477, 'samples': 1831680, 'steps': 9539, 'loss/train': 0.4862038940191269} 01/27/2022 04:50:17 - INFO - codeparrot_training - Step 9540: {'lr': 0.0004701710324230468, 'samples': 1831872, 'steps': 9540, 'loss/train': 0.8448970913887024} 01/27/2022 04:50:20 - INFO - codeparrot_training - Step 9541: {'lr': 0.00047016328098818086, 'samples': 1832064, 'steps': 9541, 'loss/train': 1.1573018431663513} 01/27/2022 04:50:25 - INFO - codeparrot_training - Step 9542: {'lr': 0.00047015552861020524, 'samples': 1832256, 'steps': 9542, 'loss/train': 1.1675638854503632} 01/27/2022 04:50:28 - INFO - codeparrot_training - Step 9543: {'lr': 0.00047014777528915327, 'samples': 1832448, 'steps': 9543, 'loss/train': 1.286134421825409} 01/27/2022 04:50:31 - INFO - codeparrot_training - Step 9544: {'lr': 0.0004701400210250581, 'samples': 1832640, 'steps': 9544, 'loss/train': 0.9543185234069824} 01/27/2022 04:50:34 - INFO - codeparrot_training - Step 9545: {'lr': 0.00047013226581795305, 'samples': 1832832, 'steps': 9545, 'loss/train': 0.9687520265579224} 01/27/2022 04:50:37 - INFO - codeparrot_training - Step 9546: {'lr': 0.00047012450966787126, 'samples': 1833024, 'steps': 9546, 'loss/train': 0.946584552526474} 01/27/2022 04:50:40 - INFO - codeparrot_training - Step 9547: {'lr': 0.000470116752574846, 'samples': 1833216, 'steps': 9547, 'loss/train': 0.9545236229896545} 01/27/2022 04:50:44 - INFO - codeparrot_training - Step 9548: {'lr': 0.0004701089945389104, 'samples': 1833408, 'steps': 9548, 'loss/train': 0.7034563422203064} 01/27/2022 04:50:47 - INFO - codeparrot_training - Step 9549: {'lr': 0.00047010123556009774, 'samples': 1833600, 'steps': 9549, 'loss/train': 0.8832165598869324} 01/27/2022 04:50:50 - INFO - codeparrot_training - Step 9550: {'lr': 0.0004700934756384413, 'samples': 1833792, 'steps': 9550, 'loss/train': 0.9271171689033508} 01/27/2022 04:50:56 - INFO - codeparrot_training - Step 9551: {'lr': 0.00047008571477397435, 'samples': 1833984, 'steps': 9551, 
'loss/train': 0.9834287166595459} 01/27/2022 04:50:59 - INFO - codeparrot_training - Step 9552: {'lr': 0.00047007795296673006, 'samples': 1834176, 'steps': 9552, 'loss/train': 0.6492666006088257} 01/27/2022 04:51:03 - INFO - codeparrot_training - Step 9553: {'lr': 0.00047007019021674167, 'samples': 1834368, 'steps': 9553, 'loss/train': 1.0473147332668304} 01/27/2022 04:51:06 - INFO - codeparrot_training - Step 9554: {'lr': 0.0004700624265240425, 'samples': 1834560, 'steps': 9554, 'loss/train': 0.5579513758420944} 01/27/2022 04:51:09 - INFO - codeparrot_training - Step 9555: {'lr': 0.00047005466188866575, 'samples': 1834752, 'steps': 9555, 'loss/train': 0.48673421144485474} 01/27/2022 04:51:12 - INFO - codeparrot_training - Step 9556: {'lr': 0.00047004689631064474, 'samples': 1834944, 'steps': 9556, 'loss/train': 1.0211460292339325} 01/27/2022 04:51:15 - INFO - codeparrot_training - Step 9557: {'lr': 0.00047003912979001267, 'samples': 1835136, 'steps': 9557, 'loss/train': 0.9309269785881042} 01/27/2022 04:51:18 - INFO - codeparrot_training - Step 9558: {'lr': 0.0004700313623268028, 'samples': 1835328, 'steps': 9558, 'loss/train': 0.9783517420291901} 01/27/2022 04:51:21 - INFO - codeparrot_training - Step 9559: {'lr': 0.00047002359392104854, 'samples': 1835520, 'steps': 9559, 'loss/train': 1.0025951564311981} 01/27/2022 04:51:26 - INFO - codeparrot_training - Step 9560: {'lr': 0.000470015824572783, 'samples': 1835712, 'steps': 9560, 'loss/train': 0.7326255440711975} 01/27/2022 04:51:29 - INFO - codeparrot_training - Step 9561: {'lr': 0.00047000805428203953, 'samples': 1835904, 'steps': 9561, 'loss/train': 0.7386915385723114} 01/27/2022 04:51:32 - INFO - codeparrot_training - Step 9562: {'lr': 0.00047000028304885143, 'samples': 1836096, 'steps': 9562, 'loss/train': 0.9031415283679962} 01/27/2022 04:51:35 - INFO - codeparrot_training - Step 9563: {'lr': 0.00046999251087325204, 'samples': 1836288, 'steps': 9563, 'loss/train': 0.9546038210391998} 01/27/2022 04:51:38 - INFO - codeparrot_training - Step 9564: {'lr': 0.0004699847377552745, 'samples': 1836480, 'steps': 9564, 'loss/train': 0.9025626182556152} 01/27/2022 04:51:42 - INFO - codeparrot_training - Step 9565: {'lr': 0.00046997696369495217, 'samples': 1836672, 'steps': 9565, 'loss/train': 0.8505388498306274} 01/27/2022 04:51:45 - INFO - codeparrot_training - Step 9566: {'lr': 0.00046996918869231843, 'samples': 1836864, 'steps': 9566, 'loss/train': 0.6693016290664673} 01/27/2022 04:51:48 - INFO - codeparrot_training - Step 9567: {'lr': 0.00046996141274740653, 'samples': 1837056, 'steps': 9567, 'loss/train': 1.2747509479522705} 01/27/2022 04:51:51 - INFO - codeparrot_training - Step 9568: {'lr': 0.00046995363586024977, 'samples': 1837248, 'steps': 9568, 'loss/train': 0.505514994263649} 01/27/2022 04:51:55 - INFO - codeparrot_training - Step 9569: {'lr': 0.0004699458580308815, 'samples': 1837440, 'steps': 9569, 'loss/train': 0.764052003622055} 01/27/2022 04:51:58 - INFO - codeparrot_training - Step 9570: {'lr': 0.00046993807925933503, 'samples': 1837632, 'steps': 9570, 'loss/train': 0.28257641941308975} 01/27/2022 04:52:02 - INFO - codeparrot_training - Step 9571: {'lr': 0.00046993029954564363, 'samples': 1837824, 'steps': 9571, 'loss/train': 0.977720707654953} 01/27/2022 04:52:05 - INFO - codeparrot_training - Step 9572: {'lr': 0.0004699225188898407, 'samples': 1838016, 'steps': 9572, 'loss/train': 0.7718082368373871} 01/27/2022 04:52:08 - INFO - codeparrot_training - Step 9573: {'lr': 0.0004699147372919595, 'samples': 1838208, 'steps': 9573, 
'loss/train': 1.2074586153030396} 01/27/2022 04:52:11 - INFO - codeparrot_training - Step 9574: {'lr': 0.00046990695475203337, 'samples': 1838400, 'steps': 9574, 'loss/train': 1.1332179307937622} 01/27/2022 04:52:14 - INFO - codeparrot_training - Step 9575: {'lr': 0.00046989917127009573, 'samples': 1838592, 'steps': 9575, 'loss/train': 0.9197939336299896} 01/27/2022 04:52:17 - INFO - codeparrot_training - Step 9576: {'lr': 0.0004698913868461798, 'samples': 1838784, 'steps': 9576, 'loss/train': 0.6737616062164307} 01/27/2022 04:52:20 - INFO - codeparrot_training - Step 9577: {'lr': 0.00046988360148031904, 'samples': 1838976, 'steps': 9577, 'loss/train': 0.8970410227775574} 01/27/2022 04:52:27 - INFO - codeparrot_training - Step 9578: {'lr': 0.0004698758151725468, 'samples': 1839168, 'steps': 9578, 'loss/train': 0.6536348015069962} 01/27/2022 04:52:30 - INFO - codeparrot_training - Step 9579: {'lr': 0.0004698680279228963, 'samples': 1839360, 'steps': 9579, 'loss/train': 0.9264151453971863} 01/27/2022 04:52:33 - INFO - codeparrot_training - Step 9580: {'lr': 0.000469860239731401, 'samples': 1839552, 'steps': 9580, 'loss/train': 0.9073115587234497} 01/27/2022 04:52:36 - INFO - codeparrot_training - Step 9581: {'lr': 0.00046985245059809436, 'samples': 1839744, 'steps': 9581, 'loss/train': 0.7449557930231094} 01/27/2022 04:52:39 - INFO - codeparrot_training - Step 9582: {'lr': 0.0004698446605230095, 'samples': 1839936, 'steps': 9582, 'loss/train': 0.8482527136802673} 01/27/2022 04:52:42 - INFO - codeparrot_training - Step 9583: {'lr': 0.00046983686950618, 'samples': 1840128, 'steps': 9583, 'loss/train': 0.9822859168052673} 01/27/2022 04:52:45 - INFO - codeparrot_training - Step 9584: {'lr': 0.00046982907754763905, 'samples': 1840320, 'steps': 9584, 'loss/train': 0.8362584114074707} 01/27/2022 04:52:48 - INFO - codeparrot_training - Step 9585: {'lr': 0.00046982128464742026, 'samples': 1840512, 'steps': 9585, 'loss/train': 0.9400649070739746} 01/27/2022 04:52:53 - INFO - codeparrot_training - Step 9586: {'lr': 0.0004698134908055568, 'samples': 1840704, 'steps': 9586, 'loss/train': 0.7237153798341751} 01/27/2022 04:52:56 - INFO - codeparrot_training - Step 9587: {'lr': 0.00046980569602208215, 'samples': 1840896, 'steps': 9587, 'loss/train': 0.6281555593013763} 01/27/2022 04:52:59 - INFO - codeparrot_training - Step 9588: {'lr': 0.00046979790029702973, 'samples': 1841088, 'steps': 9588, 'loss/train': 0.7714801132678986} 01/27/2022 04:53:02 - INFO - codeparrot_training - Step 9589: {'lr': 0.0004697901036304329, 'samples': 1841280, 'steps': 9589, 'loss/train': 0.7713728249073029} 01/27/2022 04:53:05 - INFO - codeparrot_training - Step 9590: {'lr': 0.00046978230602232507, 'samples': 1841472, 'steps': 9590, 'loss/train': 0.779688835144043} 01/27/2022 04:53:09 - INFO - codeparrot_training - Step 9591: {'lr': 0.00046977450747273956, 'samples': 1841664, 'steps': 9591, 'loss/train': 1.078298807144165} 01/27/2022 04:53:12 - INFO - codeparrot_training - Step 9592: {'lr': 0.00046976670798171, 'samples': 1841856, 'steps': 9592, 'loss/train': 1.0199958086013794} 01/27/2022 04:53:15 - INFO - codeparrot_training - Step 9593: {'lr': 0.00046975890754926943, 'samples': 1842048, 'steps': 9593, 'loss/train': 0.4892909377813339} 01/27/2022 04:53:18 - INFO - codeparrot_training - Step 9594: {'lr': 0.0004697511061754516, 'samples': 1842240, 'steps': 9594, 'loss/train': 1.0919077098369598} 01/27/2022 04:53:24 - INFO - codeparrot_training - Step 9595: {'lr': 0.00046974330386028985, 'samples': 1842432, 'steps': 9595, 
'loss/train': 1.250577449798584} 01/27/2022 04:53:27 - INFO - codeparrot_training - Step 9596: {'lr': 0.0004697355006038175, 'samples': 1842624, 'steps': 9596, 'loss/train': 1.0449124574661255} 01/27/2022 04:53:30 - INFO - codeparrot_training - Step 9597: {'lr': 0.00046972769640606804, 'samples': 1842816, 'steps': 9597, 'loss/train': 1.0784011781215668} 01/27/2022 04:53:34 - INFO - codeparrot_training - Step 9598: {'lr': 0.0004697198912670749, 'samples': 1843008, 'steps': 9598, 'loss/train': 0.7607461810112} 01/27/2022 04:53:37 - INFO - codeparrot_training - Step 9599: {'lr': 0.0004697120851868715, 'samples': 1843200, 'steps': 9599, 'loss/train': 0.8377433717250824} 01/27/2022 04:53:40 - INFO - codeparrot_training - Step 9600: {'lr': 0.00046970427816549133, 'samples': 1843392, 'steps': 9600, 'loss/train': 0.8782452642917633} 01/27/2022 04:53:43 - INFO - codeparrot_training - Step 9601: {'lr': 0.0004696964702029678, 'samples': 1843584, 'steps': 9601, 'loss/train': 0.6669680178165436} 01/27/2022 04:53:46 - INFO - codeparrot_training - Step 9602: {'lr': 0.00046968866129933436, 'samples': 1843776, 'steps': 9602, 'loss/train': 0.7379384189844131} 01/27/2022 04:53:49 - INFO - codeparrot_training - Step 9603: {'lr': 0.0004696808514546244, 'samples': 1843968, 'steps': 9603, 'loss/train': 0.8333199620246887} 01/27/2022 04:53:54 - INFO - codeparrot_training - Step 9604: {'lr': 0.0004696730406688715, 'samples': 1844160, 'steps': 9604, 'loss/train': 0.9631455838680267} 01/27/2022 04:53:57 - INFO - codeparrot_training - Step 9605: {'lr': 0.000469665228942109, 'samples': 1844352, 'steps': 9605, 'loss/train': 0.5170351266860962} 01/27/2022 04:54:00 - INFO - codeparrot_training - Step 9606: {'lr': 0.0004696574162743704, 'samples': 1844544, 'steps': 9606, 'loss/train': 1.1934536397457123} 01/27/2022 04:54:03 - INFO - codeparrot_training - Step 9607: {'lr': 0.00046964960266568926, 'samples': 1844736, 'steps': 9607, 'loss/train': 0.23262682557106018} 01/27/2022 04:54:06 - INFO - codeparrot_training - Step 9608: {'lr': 0.0004696417881160989, 'samples': 1844928, 'steps': 9608, 'loss/train': 0.4345772713422775} 01/27/2022 04:54:09 - INFO - codeparrot_training - Step 9609: {'lr': 0.0004696339726256328, 'samples': 1845120, 'steps': 9609, 'loss/train': 1.4041883647441864} 01/27/2022 04:54:12 - INFO - codeparrot_training - Step 9610: {'lr': 0.00046962615619432457, 'samples': 1845312, 'steps': 9610, 'loss/train': 0.1905362382531166} 01/27/2022 04:54:16 - INFO - codeparrot_training - Step 9611: {'lr': 0.0004696183388222077, 'samples': 1845504, 'steps': 9611, 'loss/train': 0.5062539875507355} 01/27/2022 04:54:19 - INFO - codeparrot_training - Step 9612: {'lr': 0.0004696105205093155, 'samples': 1845696, 'steps': 9612, 'loss/train': 1.0750406384468079} 01/27/2022 04:54:23 - INFO - codeparrot_training - Step 9613: {'lr': 0.0004696027012556816, 'samples': 1845888, 'steps': 9613, 'loss/train': 1.1737622916698456} 01/27/2022 04:54:26 - INFO - codeparrot_training - Step 9614: {'lr': 0.00046959488106133944, 'samples': 1846080, 'steps': 9614, 'loss/train': 0.864740252494812} 01/27/2022 04:54:29 - INFO - codeparrot_training - Step 9615: {'lr': 0.0004695870599263226, 'samples': 1846272, 'steps': 9615, 'loss/train': 0.4561614543199539} 01/27/2022 04:54:33 - INFO - codeparrot_training - Step 9616: {'lr': 0.0004695792378506645, 'samples': 1846464, 'steps': 9616, 'loss/train': 1.0364704728126526} 01/27/2022 04:54:36 - INFO - codeparrot_training - Step 9617: {'lr': 0.00046957141483439856, 'samples': 1846656, 'steps': 9617, 
'loss/train': 0.7825702428817749} 01/27/2022 04:54:39 - INFO - codeparrot_training - Step 9618: {'lr': 0.0004695635908775585, 'samples': 1846848, 'steps': 9618, 'loss/train': 1.4407812058925629} 01/27/2022 04:54:42 - INFO - codeparrot_training - Step 9619: {'lr': 0.0004695557659801778, 'samples': 1847040, 'steps': 9619, 'loss/train': 0.6985725313425064} 01/27/2022 04:54:45 - INFO - codeparrot_training - Step 9620: {'lr': 0.0004695479401422898, 'samples': 1847232, 'steps': 9620, 'loss/train': 0.7793406844139099} 01/27/2022 04:54:48 - INFO - codeparrot_training - Step 9621: {'lr': 0.0004695401133639282, 'samples': 1847424, 'steps': 9621, 'loss/train': 0.942846268415451} 01/27/2022 04:54:55 - INFO - codeparrot_training - Step 9622: {'lr': 0.0004695322856451264, 'samples': 1847616, 'steps': 9622, 'loss/train': 0.7847471237182617} 01/27/2022 04:54:58 - INFO - codeparrot_training - Step 9623: {'lr': 0.00046952445698591805, 'samples': 1847808, 'steps': 9623, 'loss/train': 0.16754933819174767} 01/27/2022 04:55:01 - INFO - codeparrot_training - Step 9624: {'lr': 0.0004695166273863367, 'samples': 1848000, 'steps': 9624, 'loss/train': 0.8473497033119202} 01/27/2022 04:55:04 - INFO - codeparrot_training - Step 9625: {'lr': 0.00046950879684641567, 'samples': 1848192, 'steps': 9625, 'loss/train': 0.9502731263637543} 01/27/2022 04:55:08 - INFO - codeparrot_training - Step 9626: {'lr': 0.00046950096536618876, 'samples': 1848384, 'steps': 9626, 'loss/train': 0.481338769197464} 01/27/2022 04:55:11 - INFO - codeparrot_training - Step 9627: {'lr': 0.0004694931329456894, 'samples': 1848576, 'steps': 9627, 'loss/train': 1.1206963956356049} 01/27/2022 04:55:14 - INFO - codeparrot_training - Step 9628: {'lr': 0.0004694852995849511, 'samples': 1848768, 'steps': 9628, 'loss/train': 0.862506777048111} 01/27/2022 04:55:17 - INFO - codeparrot_training - Step 9629: {'lr': 0.00046947746528400755, 'samples': 1848960, 'steps': 9629, 'loss/train': 0.7584489583969116} 01/27/2022 04:55:20 - INFO - codeparrot_training - Step 9630: {'lr': 0.00046946963004289223, 'samples': 1849152, 'steps': 9630, 'loss/train': 1.7473236322402954} 01/27/2022 04:55:25 - INFO - codeparrot_training - Step 9631: {'lr': 0.0004694617938616386, 'samples': 1849344, 'steps': 9631, 'loss/train': 0.9645160138607025} 01/27/2022 04:55:28 - INFO - codeparrot_training - Step 9632: {'lr': 0.00046945395674028047, 'samples': 1849536, 'steps': 9632, 'loss/train': 0.953042060136795} 01/27/2022 04:55:31 - INFO - codeparrot_training - Step 9633: {'lr': 0.0004694461186788512, 'samples': 1849728, 'steps': 9633, 'loss/train': 0.6617476344108582} 01/27/2022 04:55:34 - INFO - codeparrot_training - Step 9634: {'lr': 0.0004694382796773844, 'samples': 1849920, 'steps': 9634, 'loss/train': 0.6312430948019028} 01/27/2022 04:55:37 - INFO - codeparrot_training - Step 9635: {'lr': 0.0004694304397359137, 'samples': 1850112, 'steps': 9635, 'loss/train': 0.8373411297798157} 01/27/2022 04:55:40 - INFO - codeparrot_training - Step 9636: {'lr': 0.00046942259885447273, 'samples': 1850304, 'steps': 9636, 'loss/train': 1.1241203248500824} 01/27/2022 04:55:43 - INFO - codeparrot_training - Step 9637: {'lr': 0.000469414757033095, 'samples': 1850496, 'steps': 9637, 'loss/train': 0.8617646992206573} 01/27/2022 04:55:47 - INFO - codeparrot_training - Step 9638: {'lr': 0.00046940691427181414, 'samples': 1850688, 'steps': 9638, 'loss/train': 0.95586758852005} 01/27/2022 04:55:50 - INFO - codeparrot_training - Step 9639: {'lr': 0.00046939907057066374, 'samples': 1850880, 'steps': 9639, 
'loss/train': 0.8224671185016632} 01/27/2022 04:55:55 - INFO - codeparrot_training - Step 9640: {'lr': 0.0004693912259296773, 'samples': 1851072, 'steps': 9640, 'loss/train': 0.783404678106308} 01/27/2022 04:55:59 - INFO - codeparrot_training - Step 9641: {'lr': 0.0004693833803488886, 'samples': 1851264, 'steps': 9641, 'loss/train': 0.7442797869443893} 01/27/2022 04:56:02 - INFO - codeparrot_training - Step 9642: {'lr': 0.00046937553382833116, 'samples': 1851456, 'steps': 9642, 'loss/train': 0.8930297791957855} 01/27/2022 04:56:05 - INFO - codeparrot_training - Step 9643: {'lr': 0.00046936768636803857, 'samples': 1851648, 'steps': 9643, 'loss/train': 0.5361937880516052} 01/27/2022 04:56:08 - INFO - codeparrot_training - Step 9644: {'lr': 0.00046935983796804443, 'samples': 1851840, 'steps': 9644, 'loss/train': 0.49701592326164246} 01/27/2022 04:56:11 - INFO - codeparrot_training - Step 9645: {'lr': 0.00046935198862838246, 'samples': 1852032, 'steps': 9645, 'loss/train': 0.7709438502788544} 01/27/2022 04:56:14 - INFO - codeparrot_training - Step 9646: {'lr': 0.00046934413834908616, 'samples': 1852224, 'steps': 9646, 'loss/train': 0.8664379119873047} 01/27/2022 04:56:17 - INFO - codeparrot_training - Step 9647: {'lr': 0.0004693362871301893, 'samples': 1852416, 'steps': 9647, 'loss/train': 1.3794591128826141} 01/27/2022 04:56:22 - INFO - codeparrot_training - Step 9648: {'lr': 0.0004693284349717254, 'samples': 1852608, 'steps': 9648, 'loss/train': 0.7365268617868423} 01/27/2022 04:56:25 - INFO - codeparrot_training - Step 9649: {'lr': 0.00046932058187372803, 'samples': 1852800, 'steps': 9649, 'loss/train': 0.8698121309280396} 01/27/2022 04:56:29 - INFO - codeparrot_training - Step 9650: {'lr': 0.00046931272783623106, 'samples': 1852992, 'steps': 9650, 'loss/train': 0.43532533943653107} 01/27/2022 04:56:32 - INFO - codeparrot_training - Step 9651: {'lr': 0.00046930487285926797, 'samples': 1853184, 'steps': 9651, 'loss/train': 1.8507387042045593} 01/27/2022 04:56:35 - INFO - codeparrot_training - Step 9652: {'lr': 0.00046929701694287243, 'samples': 1853376, 'steps': 9652, 'loss/train': 0.7723917961120605} 01/27/2022 04:56:38 - INFO - codeparrot_training - Step 9653: {'lr': 0.0004692891600870781, 'samples': 1853568, 'steps': 9653, 'loss/train': 0.9673847258090973} 01/27/2022 04:56:41 - INFO - codeparrot_training - Step 9654: {'lr': 0.00046928130229191865, 'samples': 1853760, 'steps': 9654, 'loss/train': 0.7645681500434875} 01/27/2022 04:56:44 - INFO - codeparrot_training - Step 9655: {'lr': 0.00046927344355742774, 'samples': 1853952, 'steps': 9655, 'loss/train': 0.6050185114145279} 01/27/2022 04:56:47 - INFO - codeparrot_training - Step 9656: {'lr': 0.00046926558388363904, 'samples': 1854144, 'steps': 9656, 'loss/train': 1.2586068212985992} 01/27/2022 04:56:52 - INFO - codeparrot_training - Step 9657: {'lr': 0.00046925772327058616, 'samples': 1854336, 'steps': 9657, 'loss/train': 0.9281889796257019} 01/27/2022 04:56:56 - INFO - codeparrot_training - Step 9658: {'lr': 0.0004692498617183028, 'samples': 1854528, 'steps': 9658, 'loss/train': 0.3611656129360199} 01/27/2022 04:56:59 - INFO - codeparrot_training - Step 9659: {'lr': 0.0004692419992268227, 'samples': 1854720, 'steps': 9659, 'loss/train': 0.6100278496742249} 01/27/2022 04:57:02 - INFO - codeparrot_training - Step 9660: {'lr': 0.00046923413579617944, 'samples': 1854912, 'steps': 9660, 'loss/train': 0.8311553299427032} 01/27/2022 04:57:05 - INFO - codeparrot_training - Step 9661: {'lr': 0.00046922627142640685, 'samples': 1855104, 'steps': 
9661, 'loss/train': 1.0906105935573578} 01/27/2022 04:57:08 - INFO - codeparrot_training - Step 9662: {'lr': 0.00046921840611753845, 'samples': 1855296, 'steps': 9662, 'loss/train': 0.9096201360225677} 01/27/2022 04:57:11 - INFO - codeparrot_training - Step 9663: {'lr': 0.000469210539869608, 'samples': 1855488, 'steps': 9663, 'loss/train': 0.8708465695381165} 01/27/2022 04:57:14 - INFO - codeparrot_training - Step 9664: {'lr': 0.0004692026726826493, 'samples': 1855680, 'steps': 9664, 'loss/train': 0.3030914068222046} 01/27/2022 04:57:18 - INFO - codeparrot_training - Step 9665: {'lr': 0.0004691948045566958, 'samples': 1855872, 'steps': 9665, 'loss/train': 1.0367369055747986} 01/27/2022 04:57:22 - INFO - codeparrot_training - Step 9666: {'lr': 0.0004691869354917815, 'samples': 1856064, 'steps': 9666, 'loss/train': 1.3183167278766632} 01/27/2022 04:57:25 - INFO - codeparrot_training - Step 9667: {'lr': 0.0004691790654879399, 'samples': 1856256, 'steps': 9667, 'loss/train': 0.7835927903652191} 01/27/2022 04:57:29 - INFO - codeparrot_training - Step 9668: {'lr': 0.00046917119454520487, 'samples': 1856448, 'steps': 9668, 'loss/train': 0.9062977731227875} 01/27/2022 04:57:32 - INFO - codeparrot_training - Step 9669: {'lr': 0.0004691633226636099, 'samples': 1856640, 'steps': 9669, 'loss/train': 0.3903849273920059} 01/27/2022 04:57:35 - INFO - codeparrot_training - Step 9670: {'lr': 0.0004691554498431889, 'samples': 1856832, 'steps': 9670, 'loss/train': 0.6771022975444794} 01/27/2022 04:57:38 - INFO - codeparrot_training - Step 9671: {'lr': 0.00046914757608397555, 'samples': 1857024, 'steps': 9671, 'loss/train': 0.8675961792469025} 01/27/2022 04:57:41 - INFO - codeparrot_training - Step 9672: {'lr': 0.00046913970138600357, 'samples': 1857216, 'steps': 9672, 'loss/train': 0.9576593041419983} 01/27/2022 04:57:44 - INFO - codeparrot_training - Step 9673: {'lr': 0.0004691318257493067, 'samples': 1857408, 'steps': 9673, 'loss/train': 0.9095136523246765} 01/27/2022 04:57:47 - INFO - codeparrot_training - Step 9674: {'lr': 0.00046912394917391866, 'samples': 1857600, 'steps': 9674, 'loss/train': 0.7349126636981964} 01/27/2022 04:57:52 - INFO - codeparrot_training - Step 9675: {'lr': 0.00046911607165987324, 'samples': 1857792, 'steps': 9675, 'loss/train': 1.103965312242508} 01/27/2022 04:57:55 - INFO - codeparrot_training - Step 9676: {'lr': 0.0004691081932072041, 'samples': 1857984, 'steps': 9676, 'loss/train': 0.942993700504303} 01/27/2022 04:57:58 - INFO - codeparrot_training - Step 9677: {'lr': 0.0004691003138159451, 'samples': 1858176, 'steps': 9677, 'loss/train': 0.7688771188259125} 01/27/2022 04:58:01 - INFO - codeparrot_training - Step 9678: {'lr': 0.00046909243348612986, 'samples': 1858368, 'steps': 9678, 'loss/train': 0.9950248003005981} 01/27/2022 04:58:04 - INFO - codeparrot_training - Step 9679: {'lr': 0.0004690845522177922, 'samples': 1858560, 'steps': 9679, 'loss/train': 1.0676768124103546} 01/27/2022 04:58:07 - INFO - codeparrot_training - Step 9680: {'lr': 0.0004690766700109659, 'samples': 1858752, 'steps': 9680, 'loss/train': 0.9703943431377411} 01/27/2022 04:58:11 - INFO - codeparrot_training - Step 9681: {'lr': 0.0004690687868656847, 'samples': 1858944, 'steps': 9681, 'loss/train': 1.0815220177173615} 01/27/2022 04:58:14 - INFO - codeparrot_training - Step 9682: {'lr': 0.00046906090278198246, 'samples': 1859136, 'steps': 9682, 'loss/train': 0.7308285981416702} 01/27/2022 04:58:17 - INFO - codeparrot_training - Step 9683: {'lr': 0.00046905301775989277, 'samples': 1859328, 'steps': 9683, 
'loss/train': 1.0230867862701416} 01/27/2022 04:58:23 - INFO - codeparrot_training - Step 9684: {'lr': 0.0004690451317994495, 'samples': 1859520, 'steps': 9684, 'loss/train': 0.4857870787382126} 01/27/2022 04:58:26 - INFO - codeparrot_training - Step 9685: {'lr': 0.00046903724490068654, 'samples': 1859712, 'steps': 9685, 'loss/train': 0.7886024415493011} 01/27/2022 04:58:29 - INFO - codeparrot_training - Step 9686: {'lr': 0.00046902935706363754, 'samples': 1859904, 'steps': 9686, 'loss/train': 1.339219093322754} 01/27/2022 04:58:32 - INFO - codeparrot_training - Step 9687: {'lr': 0.0004690214682883363, 'samples': 1860096, 'steps': 9687, 'loss/train': 0.8132479190826416} 01/27/2022 04:58:36 - INFO - codeparrot_training - Step 9688: {'lr': 0.00046901357857481664, 'samples': 1860288, 'steps': 9688, 'loss/train': 0.27490460872650146} 01/27/2022 04:58:39 - INFO - codeparrot_training - Step 9689: {'lr': 0.0004690056879231124, 'samples': 1860480, 'steps': 9689, 'loss/train': 0.9287185370922089} 01/27/2022 04:58:42 - INFO - codeparrot_training - Step 9690: {'lr': 0.0004689977963332572, 'samples': 1860672, 'steps': 9690, 'loss/train': 0.824845165014267} 01/27/2022 04:58:45 - INFO - codeparrot_training - Step 9691: {'lr': 0.0004689899038052852, 'samples': 1860864, 'steps': 9691, 'loss/train': 0.8560210168361664} 01/27/2022 04:58:48 - INFO - codeparrot_training - Step 9692: {'lr': 0.0004689820103392298, 'samples': 1861056, 'steps': 9692, 'loss/train': 0.8389615416526794} 01/27/2022 04:58:53 - INFO - codeparrot_training - Step 9693: {'lr': 0.0004689741159351251, 'samples': 1861248, 'steps': 9693, 'loss/train': 0.9621855318546295} 01/27/2022 04:58:56 - INFO - codeparrot_training - Step 9694: {'lr': 0.00046896622059300477, 'samples': 1861440, 'steps': 9694, 'loss/train': 0.49800266325473785} 01/27/2022 04:58:59 - INFO - codeparrot_training - Step 9695: {'lr': 0.00046895832431290266, 'samples': 1861632, 'steps': 9695, 'loss/train': 1.3601329028606415} 01/27/2022 04:59:02 - INFO - codeparrot_training - Step 9696: {'lr': 0.0004689504270948527, 'samples': 1861824, 'steps': 9696, 'loss/train': 0.5132619291543961} 01/27/2022 04:59:05 - INFO - codeparrot_training - Step 9697: {'lr': 0.00046894252893888854, 'samples': 1862016, 'steps': 9697, 'loss/train': 0.7361322641372681} 01/27/2022 04:59:08 - INFO - codeparrot_training - Step 9698: {'lr': 0.0004689346298450442, 'samples': 1862208, 'steps': 9698, 'loss/train': 0.7880577743053436} 01/27/2022 04:59:12 - INFO - codeparrot_training - Step 9699: {'lr': 0.0004689267298133534, 'samples': 1862400, 'steps': 9699, 'loss/train': 1.2075456976890564} 01/27/2022 04:59:15 - INFO - codeparrot_training - Step 9700: {'lr': 0.00046891882884384997, 'samples': 1862592, 'steps': 9700, 'loss/train': 1.177333116531372} 01/27/2022 04:59:18 - INFO - codeparrot_training - Step 9701: {'lr': 0.00046891092693656777, 'samples': 1862784, 'steps': 9701, 'loss/train': 1.1070522665977478} 01/27/2022 04:59:23 - INFO - codeparrot_training - Step 9702: {'lr': 0.0004689030240915407, 'samples': 1862976, 'steps': 9702, 'loss/train': 1.0097446739673615} 01/27/2022 04:59:26 - INFO - codeparrot_training - Step 9703: {'lr': 0.0004688951203088026, 'samples': 1863168, 'steps': 9703, 'loss/train': 0.674508199095726} 01/27/2022 04:59:29 - INFO - codeparrot_training - Step 9704: {'lr': 0.00046888721558838734, 'samples': 1863360, 'steps': 9704, 'loss/train': 1.120725005865097} 01/27/2022 04:59:33 - INFO - codeparrot_training - Step 9705: {'lr': 0.0004688793099303287, 'samples': 1863552, 'steps': 9705, 
'loss/train': 1.1232729256153107} 01/27/2022 04:59:36 - INFO - codeparrot_training - Step 9706: {'lr': 0.0004688714033346606, 'samples': 1863744, 'steps': 9706, 'loss/train': 0.27762501686811447} 01/27/2022 04:59:39 - INFO - codeparrot_training - Step 9707: {'lr': 0.000468863495801417, 'samples': 1863936, 'steps': 9707, 'loss/train': 0.5401257276535034} 01/27/2022 04:59:42 - INFO - codeparrot_training - Step 9708: {'lr': 0.00046885558733063157, 'samples': 1864128, 'steps': 9708, 'loss/train': 1.0019286274909973} 01/27/2022 04:59:45 - INFO - codeparrot_training - Step 9709: {'lr': 0.00046884767792233827, 'samples': 1864320, 'steps': 9709, 'loss/train': 0.862686038017273} 01/27/2022 04:59:50 - INFO - codeparrot_training - Step 9710: {'lr': 0.00046883976757657107, 'samples': 1864512, 'steps': 9710, 'loss/train': 0.9864230453968048} 01/27/2022 04:59:53 - INFO - codeparrot_training - Step 9711: {'lr': 0.00046883185629336386, 'samples': 1864704, 'steps': 9711, 'loss/train': 1.091356784105301} 01/27/2022 04:59:56 - INFO - codeparrot_training - Step 9712: {'lr': 0.0004688239440727504, 'samples': 1864896, 'steps': 9712, 'loss/train': 0.7332210391759872} 01/27/2022 04:59:59 - INFO - codeparrot_training - Step 9713: {'lr': 0.00046881603091476466, 'samples': 1865088, 'steps': 9713, 'loss/train': 0.9995805323123932} 01/27/2022 05:00:02 - INFO - codeparrot_training - Step 9714: {'lr': 0.0004688081168194405, 'samples': 1865280, 'steps': 9714, 'loss/train': 1.0468727946281433} 01/27/2022 05:00:05 - INFO - codeparrot_training - Step 9715: {'lr': 0.0004688002017868119, 'samples': 1865472, 'steps': 9715, 'loss/train': 0.6202057600021362} 01/27/2022 05:00:09 - INFO - codeparrot_training - Step 9716: {'lr': 0.0004687922858169126, 'samples': 1865664, 'steps': 9716, 'loss/train': 0.5918238312005997} 01/27/2022 05:00:12 - INFO - codeparrot_training - Step 9717: {'lr': 0.0004687843689097767, 'samples': 1865856, 'steps': 9717, 'loss/train': 0.45743680000305176} 01/27/2022 05:00:15 - INFO - codeparrot_training - Step 9718: {'lr': 0.0004687764510654381, 'samples': 1866048, 'steps': 9718, 'loss/train': 0.8121854066848755} 01/27/2022 05:00:19 - INFO - codeparrot_training - Step 9719: {'lr': 0.0004687685322839306, 'samples': 1866240, 'steps': 9719, 'loss/train': 0.3564930707216263} 01/27/2022 05:00:23 - INFO - codeparrot_training - Step 9720: {'lr': 0.00046876061256528813, 'samples': 1866432, 'steps': 9720, 'loss/train': 1.0850717425346375} 01/27/2022 05:00:26 - INFO - codeparrot_training - Step 9721: {'lr': 0.00046875269190954465, 'samples': 1866624, 'steps': 9721, 'loss/train': 2.0903032422065735} 01/27/2022 05:00:29 - INFO - codeparrot_training - Step 9722: {'lr': 0.00046874477031673417, 'samples': 1866816, 'steps': 9722, 'loss/train': 1.205346018075943} 01/27/2022 05:00:32 - INFO - codeparrot_training - Step 9723: {'lr': 0.00046873684778689053, 'samples': 1867008, 'steps': 9723, 'loss/train': 0.9046847820281982} 01/27/2022 05:00:35 - INFO - codeparrot_training - Step 9724: {'lr': 0.00046872892432004765, 'samples': 1867200, 'steps': 9724, 'loss/train': 1.4076025485992432} 01/27/2022 05:00:38 - INFO - codeparrot_training - Step 9725: {'lr': 0.00046872099991623954, 'samples': 1867392, 'steps': 9725, 'loss/train': 0.8638469874858856} 01/27/2022 05:00:41 - INFO - codeparrot_training - Step 9726: {'lr': 0.0004687130745755002, 'samples': 1867584, 'steps': 9726, 'loss/train': 0.5771912187337875} 01/27/2022 05:00:44 - INFO - codeparrot_training - Step 9727: {'lr': 0.0004687051482978634, 'samples': 1867776, 'steps': 9727, 
'loss/train': 0.8699185252189636} 01/27/2022 05:00:49 - INFO - codeparrot_training - Step 9728: {'lr': 0.0004686972210833632, 'samples': 1867968, 'steps': 9728, 'loss/train': 0.8848888278007507} 01/27/2022 05:00:53 - INFO - codeparrot_training - Step 9729: {'lr': 0.00046868929293203355, 'samples': 1868160, 'steps': 9729, 'loss/train': 1.1175609827041626} 01/27/2022 05:00:56 - INFO - codeparrot_training - Step 9730: {'lr': 0.0004686813638439085, 'samples': 1868352, 'steps': 9730, 'loss/train': 0.7617199122905731} 01/27/2022 05:00:59 - INFO - codeparrot_training - Step 9731: {'lr': 0.00046867343381902185, 'samples': 1868544, 'steps': 9731, 'loss/train': 0.46799178421497345} 01/27/2022 05:01:02 - INFO - codeparrot_training - Step 9732: {'lr': 0.0004686655028574076, 'samples': 1868736, 'steps': 9732, 'loss/train': 0.6802250146865845} 01/27/2022 05:01:05 - INFO - codeparrot_training - Step 9733: {'lr': 0.0004686575709590998, 'samples': 1868928, 'steps': 9733, 'loss/train': 0.9493067264556885} 01/27/2022 05:01:08 - INFO - codeparrot_training - Step 9734: {'lr': 0.00046864963812413244, 'samples': 1869120, 'steps': 9734, 'loss/train': 0.6113565266132355} 01/27/2022 05:01:11 - INFO - codeparrot_training - Step 9735: {'lr': 0.00046864170435253946, 'samples': 1869312, 'steps': 9735, 'loss/train': 0.3508133813738823} 01/27/2022 05:01:15 - INFO - codeparrot_training - Step 9736: {'lr': 0.0004686337696443548, 'samples': 1869504, 'steps': 9736, 'loss/train': 1.3996162712574005} 01/27/2022 05:01:19 - INFO - codeparrot_training - Step 9737: {'lr': 0.0004686258339996125, 'samples': 1869696, 'steps': 9737, 'loss/train': 0.7507693469524384} 01/27/2022 05:01:22 - INFO - codeparrot_training - Step 9738: {'lr': 0.0004686178974183466, 'samples': 1869888, 'steps': 9738, 'loss/train': 0.458033949136734} 01/27/2022 05:01:25 - INFO - codeparrot_training - Step 9739: {'lr': 0.00046860995990059096, 'samples': 1870080, 'steps': 9739, 'loss/train': 1.1069153845310211} 01/27/2022 05:01:29 - INFO - codeparrot_training - Step 9740: {'lr': 0.00046860202144637976, 'samples': 1870272, 'steps': 9740, 'loss/train': 0.9361193776130676} 01/27/2022 05:01:32 - INFO - codeparrot_training - Step 9741: {'lr': 0.0004685940820557468, 'samples': 1870464, 'steps': 9741, 'loss/train': 1.399347335100174} 01/27/2022 05:01:35 - INFO - codeparrot_training - Step 9742: {'lr': 0.0004685861417287263, 'samples': 1870656, 'steps': 9742, 'loss/train': 0.835165947675705} 01/27/2022 05:01:38 - INFO - codeparrot_training - Step 9743: {'lr': 0.00046857820046535215, 'samples': 1870848, 'steps': 9743, 'loss/train': 0.5582652390003204} 01/27/2022 05:01:41 - INFO - codeparrot_training - Step 9744: {'lr': 0.0004685702582656584, 'samples': 1871040, 'steps': 9744, 'loss/train': 0.6507942527532578} 01/27/2022 05:01:44 - INFO - codeparrot_training - Step 9745: {'lr': 0.0004685623151296791, 'samples': 1871232, 'steps': 9745, 'loss/train': 1.0796229243278503} 01/27/2022 05:01:50 - INFO - codeparrot_training - Step 9746: {'lr': 0.0004685543710574482, 'samples': 1871424, 'steps': 9746, 'loss/train': 0.7812491655349731} 01/27/2022 05:01:53 - INFO - codeparrot_training - Step 9747: {'lr': 0.00046854642604899976, 'samples': 1871616, 'steps': 9747, 'loss/train': 0.759501188993454} 01/27/2022 05:01:56 - INFO - codeparrot_training - Step 9748: {'lr': 0.00046853848010436783, 'samples': 1871808, 'steps': 9748, 'loss/train': 0.5837737619876862} 01/27/2022 05:01:59 - INFO - codeparrot_training - Step 9749: {'lr': 0.00046853053322358653, 'samples': 1872000, 'steps': 9749, 
'loss/train': 0.9948899745941162} 01/27/2022 05:02:02 - INFO - codeparrot_training - Step 9750: {'lr': 0.00046852258540668973, 'samples': 1872192, 'steps': 9750, 'loss/train': 0.9707159399986267} 01/27/2022 05:02:05 - INFO - codeparrot_training - Step 9751: {'lr': 0.0004685146366537116, 'samples': 1872384, 'steps': 9751, 'loss/train': 0.8653213083744049} 01/27/2022 05:02:09 - INFO - codeparrot_training - Step 9752: {'lr': 0.00046850668696468614, 'samples': 1872576, 'steps': 9752, 'loss/train': 1.1897996664047241} 01/27/2022 05:02:12 - INFO - codeparrot_training - Step 9753: {'lr': 0.0004684987363396474, 'samples': 1872768, 'steps': 9753, 'loss/train': 0.7963666319847107} 01/27/2022 05:02:15 - INFO - codeparrot_training - Step 9754: {'lr': 0.0004684907847786295, 'samples': 1872960, 'steps': 9754, 'loss/train': 0.6902772635221481} 01/27/2022 05:02:19 - INFO - codeparrot_training - Step 9755: {'lr': 0.0004684828322816664, 'samples': 1873152, 'steps': 9755, 'loss/train': 0.6993052214384079} 01/27/2022 05:02:22 - INFO - codeparrot_training - Step 9756: {'lr': 0.00046847487884879227, 'samples': 1873344, 'steps': 9756, 'loss/train': 0.904557466506958} 01/27/2022 05:02:25 - INFO - codeparrot_training - Step 9757: {'lr': 0.0004684669244800411, 'samples': 1873536, 'steps': 9757, 'loss/train': 1.3558013141155243} 01/27/2022 05:02:29 - INFO - codeparrot_training - Step 9758: {'lr': 0.00046845896917544703, 'samples': 1873728, 'steps': 9758, 'loss/train': 1.0372111201286316} 01/27/2022 05:02:32 - INFO - codeparrot_training - Step 9759: {'lr': 0.00046845101293504403, 'samples': 1873920, 'steps': 9759, 'loss/train': 1.143460750579834} 01/27/2022 05:02:35 - INFO - codeparrot_training - Step 9760: {'lr': 0.00046844305575886636, 'samples': 1874112, 'steps': 9760, 'loss/train': 0.8943251073360443} 01/27/2022 05:02:38 - INFO - codeparrot_training - Step 9761: {'lr': 0.00046843509764694794, 'samples': 1874304, 'steps': 9761, 'loss/train': 0.729582890868187} 01/27/2022 05:02:41 - INFO - codeparrot_training - Step 9762: {'lr': 0.0004684271385993229, 'samples': 1874496, 'steps': 9762, 'loss/train': 0.9230979681015015} 01/27/2022 05:02:44 - INFO - codeparrot_training - Step 9763: {'lr': 0.0004684191786160254, 'samples': 1874688, 'steps': 9763, 'loss/train': 0.9189364314079285} 01/27/2022 05:02:49 - INFO - codeparrot_training - Step 9764: {'lr': 0.0004684112176970895, 'samples': 1874880, 'steps': 9764, 'loss/train': 0.7929114997386932} 01/27/2022 05:02:52 - INFO - codeparrot_training - Step 9765: {'lr': 0.0004684032558425493, 'samples': 1875072, 'steps': 9765, 'loss/train': 0.5465534627437592} 01/27/2022 05:02:55 - INFO - codeparrot_training - Step 9766: {'lr': 0.00046839529305243885, 'samples': 1875264, 'steps': 9766, 'loss/train': 0.33386436849832535} 01/27/2022 05:02:58 - INFO - codeparrot_training - Step 9767: {'lr': 0.00046838732932679236, 'samples': 1875456, 'steps': 9767, 'loss/train': 0.8522070050239563} 01/27/2022 05:03:01 - INFO - codeparrot_training - Step 9768: {'lr': 0.0004683793646656439, 'samples': 1875648, 'steps': 9768, 'loss/train': 0.7915922105312347} 01/27/2022 05:03:05 - INFO - codeparrot_training - Step 9769: {'lr': 0.00046837139906902753, 'samples': 1875840, 'steps': 9769, 'loss/train': 0.9176023006439209} 01/27/2022 05:03:08 - INFO - codeparrot_training - Step 9770: {'lr': 0.00046836343253697744, 'samples': 1876032, 'steps': 9770, 'loss/train': 0.8032577633857727} 01/27/2022 05:03:11 - INFO - codeparrot_training - Step 9771: {'lr': 0.0004683554650695278, 'samples': 1876224, 'steps': 9771, 
'loss/train': 0.6697886735200882} 01/27/2022 05:03:15 - INFO - codeparrot_training - Step 9772: {'lr': 0.0004683474966667127, 'samples': 1876416, 'steps': 9772, 'loss/train': 1.1282751560211182} 01/27/2022 05:03:18 - INFO - codeparrot_training - Step 9773: {'lr': 0.00046833952732856614, 'samples': 1876608, 'steps': 9773, 'loss/train': 1.0062804222106934} 01/27/2022 05:03:22 - INFO - codeparrot_training - Step 9774: {'lr': 0.00046833155705512246, 'samples': 1876800, 'steps': 9774, 'loss/train': 0.3500572443008423} 01/27/2022 05:03:25 - INFO - codeparrot_training - Step 9775: {'lr': 0.0004683235858464157, 'samples': 1876992, 'steps': 9775, 'loss/train': 0.8816007971763611} 01/27/2022 05:03:28 - INFO - codeparrot_training - Step 9776: {'lr': 0.0004683156137024801, 'samples': 1877184, 'steps': 9776, 'loss/train': 0.8883237540721893} 01/27/2022 05:03:31 - INFO - codeparrot_training - Step 9777: {'lr': 0.0004683076406233496, 'samples': 1877376, 'steps': 9777, 'loss/train': 0.6788193583488464} 01/27/2022 05:03:34 - INFO - codeparrot_training - Step 9778: {'lr': 0.0004682996666090585, 'samples': 1877568, 'steps': 9778, 'loss/train': 0.926941305398941} 01/27/2022 05:03:37 - INFO - codeparrot_training - Step 9779: {'lr': 0.00046829169165964104, 'samples': 1877760, 'steps': 9779, 'loss/train': 0.8425805568695068} 01/27/2022 05:03:40 - INFO - codeparrot_training - Step 9780: {'lr': 0.0004682837157751313, 'samples': 1877952, 'steps': 9780, 'loss/train': 0.7628106772899628} 01/27/2022 05:03:45 - INFO - codeparrot_training - Step 9781: {'lr': 0.00046827573895556334, 'samples': 1878144, 'steps': 9781, 'loss/train': 0.751440167427063} 01/27/2022 05:03:49 - INFO - codeparrot_training - Step 9782: {'lr': 0.00046826776120097147, 'samples': 1878336, 'steps': 9782, 'loss/train': 0.802772730588913} 01/27/2022 05:03:52 - INFO - codeparrot_training - Step 9783: {'lr': 0.0004682597825113898, 'samples': 1878528, 'steps': 9783, 'loss/train': 0.8535660803318024} 01/27/2022 05:03:55 - INFO - codeparrot_training - Step 9784: {'lr': 0.00046825180288685253, 'samples': 1878720, 'steps': 9784, 'loss/train': 0.9915100336074829} 01/27/2022 05:03:58 - INFO - codeparrot_training - Step 9785: {'lr': 0.00046824382232739386, 'samples': 1878912, 'steps': 9785, 'loss/train': 1.0889405608177185} 01/27/2022 05:04:01 - INFO - codeparrot_training - Step 9786: {'lr': 0.00046823584083304794, 'samples': 1879104, 'steps': 9786, 'loss/train': 0.6582307666540146} 01/27/2022 05:04:04 - INFO - codeparrot_training - Step 9787: {'lr': 0.00046822785840384897, 'samples': 1879296, 'steps': 9787, 'loss/train': 0.9089199900627136} 01/27/2022 05:04:08 - INFO - codeparrot_training - Step 9788: {'lr': 0.0004682198750398312, 'samples': 1879488, 'steps': 9788, 'loss/train': 0.41884008049964905} 01/27/2022 05:04:12 - INFO - codeparrot_training - Step 9789: {'lr': 0.0004682118907410287, 'samples': 1879680, 'steps': 9789, 'loss/train': 1.0589573979377747} 01/27/2022 05:04:15 - INFO - codeparrot_training - Step 9790: {'lr': 0.00046820390550747585, 'samples': 1879872, 'steps': 9790, 'loss/train': 1.09207883477211} 01/27/2022 05:04:18 - INFO - codeparrot_training - Step 9791: {'lr': 0.0004681959193392067, 'samples': 1880064, 'steps': 9791, 'loss/train': 0.8741379976272583} 01/27/2022 05:04:21 - INFO - codeparrot_training - Step 9792: {'lr': 0.00046818793223625543, 'samples': 1880256, 'steps': 9792, 'loss/train': 0.6702439785003662} 01/27/2022 05:04:24 - INFO - codeparrot_training - Step 9793: {'lr': 0.0004681799441986564, 'samples': 1880448, 'steps': 9793, 
'loss/train': 0.7839181423187256} 01/27/2022 05:04:28 - INFO - codeparrot_training - Step 9794: {'lr': 0.00046817195522644387, 'samples': 1880640, 'steps': 9794, 'loss/train': 1.1146999597549438} 01/27/2022 05:04:31 - INFO - codeparrot_training - Step 9795: {'lr': 0.00046816396531965186, 'samples': 1880832, 'steps': 9795, 'loss/train': 0.8828629553318024} 01/27/2022 05:04:34 - INFO - codeparrot_training - Step 9796: {'lr': 0.0004681559744783147, 'samples': 1881024, 'steps': 9796, 'loss/train': 1.1343439221382141} 01/27/2022 05:04:37 - INFO - codeparrot_training - Step 9797: {'lr': 0.00046814798270246663, 'samples': 1881216, 'steps': 9797, 'loss/train': 1.0044412314891815} 01/27/2022 05:04:42 - INFO - codeparrot_training - Step 9798: {'lr': 0.00046813998999214193, 'samples': 1881408, 'steps': 9798, 'loss/train': 1.018725425004959} 01/27/2022 05:04:45 - INFO - codeparrot_training - Step 9799: {'lr': 0.0004681319963473747, 'samples': 1881600, 'steps': 9799, 'loss/train': 0.8268677294254303} 01/27/2022 05:04:48 - INFO - codeparrot_training - Step 9800: {'lr': 0.0004681240017681993, 'samples': 1881792, 'steps': 9800, 'loss/train': 0.5660408735275269} 01/27/2022 05:04:51 - INFO - codeparrot_training - Step 9801: {'lr': 0.0004681160062546499, 'samples': 1881984, 'steps': 9801, 'loss/train': 0.49776823818683624} 01/27/2022 05:04:54 - INFO - codeparrot_training - Step 9802: {'lr': 0.00046810800980676083, 'samples': 1882176, 'steps': 9802, 'loss/train': 0.7055796235799789} 01/27/2022 05:04:57 - INFO - codeparrot_training - Step 9803: {'lr': 0.0004681000124245663, 'samples': 1882368, 'steps': 9803, 'loss/train': 0.6484775394201279} 01/27/2022 05:05:01 - INFO - codeparrot_training - Step 9804: {'lr': 0.0004680920141081005, 'samples': 1882560, 'steps': 9804, 'loss/train': 1.0305871367454529} 01/27/2022 05:05:04 - INFO - codeparrot_training - Step 9805: {'lr': 0.00046808401485739793, 'samples': 1882752, 'steps': 9805, 'loss/train': 0.9582599401473999} 01/27/2022 05:05:07 - INFO - codeparrot_training - Step 9806: {'lr': 0.00046807601467249255, 'samples': 1882944, 'steps': 9806, 'loss/train': 0.9442242980003357} 01/27/2022 05:05:12 - INFO - codeparrot_training - Step 9807: {'lr': 0.0004680680135534188, 'samples': 1883136, 'steps': 9807, 'loss/train': 0.8267237842082977} 01/27/2022 05:05:15 - INFO - codeparrot_training - Step 9808: {'lr': 0.00046806001150021095, 'samples': 1883328, 'steps': 9808, 'loss/train': 0.7352886199951172} 01/27/2022 05:05:18 - INFO - codeparrot_training - Step 9809: {'lr': 0.0004680520085129032, 'samples': 1883520, 'steps': 9809, 'loss/train': 0.7579556107521057} 01/27/2022 05:05:21 - INFO - codeparrot_training - Step 9810: {'lr': 0.00046804400459152994, 'samples': 1883712, 'steps': 9810, 'loss/train': 0.7620457112789154} 01/27/2022 05:05:24 - INFO - codeparrot_training - Step 9811: {'lr': 0.0004680359997361254, 'samples': 1883904, 'steps': 9811, 'loss/train': 1.2106908559799194} 01/27/2022 05:05:28 - INFO - codeparrot_training - Step 9812: {'lr': 0.0004680279939467238, 'samples': 1884096, 'steps': 9812, 'loss/train': 1.048517793416977} 01/27/2022 05:05:31 - INFO - codeparrot_training - Step 9813: {'lr': 0.0004680199872233596, 'samples': 1884288, 'steps': 9813, 'loss/train': 0.4626120775938034} 01/27/2022 05:05:34 - INFO - codeparrot_training - Step 9814: {'lr': 0.00046801197956606693, 'samples': 1884480, 'steps': 9814, 'loss/train': 0.7628340125083923} 01/27/2022 05:05:38 - INFO - codeparrot_training - Step 9815: {'lr': 0.00046800397097488024, 'samples': 1884672, 'steps': 9815, 
'loss/train': 0.9397338330745697} 01/27/2022 05:05:42 - INFO - codeparrot_training - Step 9816: {'lr': 0.0004679959614498337, 'samples': 1884864, 'steps': 9816, 'loss/train': 0.8927637934684753} 01/27/2022 05:05:45 - INFO - codeparrot_training - Step 9817: {'lr': 0.0004679879509909617, 'samples': 1885056, 'steps': 9817, 'loss/train': 1.144431710243225} 01/27/2022 05:05:48 - INFO - codeparrot_training - Step 9818: {'lr': 0.00046797993959829857, 'samples': 1885248, 'steps': 9818, 'loss/train': 0.8020174205303192} 01/27/2022 05:05:51 - INFO - codeparrot_training - Step 9819: {'lr': 0.00046797192727187855, 'samples': 1885440, 'steps': 9819, 'loss/train': 0.8190645575523376} 01/27/2022 05:05:54 - INFO - codeparrot_training - Step 9820: {'lr': 0.000467963914011736, 'samples': 1885632, 'steps': 9820, 'loss/train': 0.6399305909872055} 01/27/2022 05:05:57 - INFO - codeparrot_training - Step 9821: {'lr': 0.0004679558998179053, 'samples': 1885824, 'steps': 9821, 'loss/train': 0.0954340472817421} 01/27/2022 05:06:00 - INFO - codeparrot_training - Step 9822: {'lr': 0.0004679478846904207, 'samples': 1886016, 'steps': 9822, 'loss/train': 0.8017407059669495} 01/27/2022 05:06:04 - INFO - codeparrot_training - Step 9823: {'lr': 0.00046793986862931654, 'samples': 1886208, 'steps': 9823, 'loss/train': 0.9445806741714478} 01/27/2022 05:06:09 - INFO - codeparrot_training - Step 9824: {'lr': 0.0004679318516346273, 'samples': 1886400, 'steps': 9824, 'loss/train': 1.0956363379955292} 01/27/2022 05:06:12 - INFO - codeparrot_training - Step 9825: {'lr': 0.00046792383370638705, 'samples': 1886592, 'steps': 9825, 'loss/train': 1.2748546600341797} 01/27/2022 05:06:15 - INFO - codeparrot_training - Step 9826: {'lr': 0.0004679158148446304, 'samples': 1886784, 'steps': 9826, 'loss/train': 0.9395224750041962} 01/27/2022 05:06:18 - INFO - codeparrot_training - Step 9827: {'lr': 0.00046790779504939155, 'samples': 1886976, 'steps': 9827, 'loss/train': 1.1133824586868286} 01/27/2022 05:06:21 - INFO - codeparrot_training - Step 9828: {'lr': 0.00046789977432070497, 'samples': 1887168, 'steps': 9828, 'loss/train': 1.3263095319271088} 01/27/2022 05:06:24 - INFO - codeparrot_training - Step 9829: {'lr': 0.00046789175265860483, 'samples': 1887360, 'steps': 9829, 'loss/train': 0.5204928517341614} 01/27/2022 05:06:28 - INFO - codeparrot_training - Step 9830: {'lr': 0.00046788373006312567, 'samples': 1887552, 'steps': 9830, 'loss/train': 1.3566154539585114} 01/27/2022 05:06:31 - INFO - codeparrot_training - Step 9831: {'lr': 0.0004678757065343019, 'samples': 1887744, 'steps': 9831, 'loss/train': 0.6316483318805695} 01/27/2022 05:06:34 - INFO - codeparrot_training - Step 9832: {'lr': 0.0004678676820721677, 'samples': 1887936, 'steps': 9832, 'loss/train': 1.089049905538559} 01/27/2022 05:06:38 - INFO - codeparrot_training - Step 9833: {'lr': 0.00046785965667675745, 'samples': 1888128, 'steps': 9833, 'loss/train': 0.220702663064003} 01/27/2022 05:06:41 - INFO - codeparrot_training - Step 9834: {'lr': 0.00046785163034810567, 'samples': 1888320, 'steps': 9834, 'loss/train': 0.37753571569919586} 01/27/2022 05:06:45 - INFO - codeparrot_training - Step 9835: {'lr': 0.00046784360308624675, 'samples': 1888512, 'steps': 9835, 'loss/train': 0.9688604772090912} 01/27/2022 05:06:48 - INFO - codeparrot_training - Step 9836: {'lr': 0.0004678355748912149, 'samples': 1888704, 'steps': 9836, 'loss/train': 1.217737466096878} 01/27/2022 05:06:51 - INFO - codeparrot_training - Step 9837: {'lr': 0.0004678275457630447, 'samples': 1888896, 'steps': 9837, 
'loss/train': 1.2008644044399261} 01/27/2022 05:06:54 - INFO - codeparrot_training - Step 9838: {'lr': 0.0004678195157017704, 'samples': 1889088, 'steps': 9838, 'loss/train': 0.4310332238674164} 01/27/2022 05:06:57 - INFO - codeparrot_training - Step 9839: {'lr': 0.00046781148470742654, 'samples': 1889280, 'steps': 9839, 'loss/train': 0.8224710524082184} 01/27/2022 05:07:00 - INFO - codeparrot_training - Step 9840: {'lr': 0.0004678034527800474, 'samples': 1889472, 'steps': 9840, 'loss/train': 0.9111533761024475} 01/27/2022 05:07:03 - INFO - codeparrot_training - Step 9841: {'lr': 0.0004677954199196674, 'samples': 1889664, 'steps': 9841, 'loss/train': 0.5330579727888107} 01/27/2022 05:07:08 - INFO - codeparrot_training - Step 9842: {'lr': 0.00046778738612632097, 'samples': 1889856, 'steps': 9842, 'loss/train': 0.6244020313024521} 01/27/2022 05:07:11 - INFO - codeparrot_training - Step 9843: {'lr': 0.00046777935140004256, 'samples': 1890048, 'steps': 9843, 'loss/train': 0.8877481520175934} 01/27/2022 05:07:14 - INFO - codeparrot_training - Step 9844: {'lr': 0.00046777131574086663, 'samples': 1890240, 'steps': 9844, 'loss/train': 0.7373486012220383} 01/27/2022 05:07:17 - INFO - codeparrot_training - Step 9845: {'lr': 0.0004677632791488274, 'samples': 1890432, 'steps': 9845, 'loss/train': 0.9093828499317169} 01/27/2022 05:07:20 - INFO - codeparrot_training - Step 9846: {'lr': 0.00046775524162395954, 'samples': 1890624, 'steps': 9846, 'loss/train': 0.9989941120147705} 01/27/2022 05:07:24 - INFO - codeparrot_training - Step 9847: {'lr': 0.00046774720316629734, 'samples': 1890816, 'steps': 9847, 'loss/train': 1.1171596348285675} 01/27/2022 05:07:27 - INFO - codeparrot_training - Step 9848: {'lr': 0.00046773916377587524, 'samples': 1891008, 'steps': 9848, 'loss/train': 0.44067227840423584} 01/27/2022 05:07:30 - INFO - codeparrot_training - Step 9849: {'lr': 0.00046773112345272773, 'samples': 1891200, 'steps': 9849, 'loss/train': 0.7995927929878235} 01/27/2022 05:07:33 - INFO - codeparrot_training - Step 9850: {'lr': 0.0004677230821968892, 'samples': 1891392, 'steps': 9850, 'loss/train': 0.6087605059146881} 01/27/2022 05:07:38 - INFO - codeparrot_training - Step 9851: {'lr': 0.00046771504000839417, 'samples': 1891584, 'steps': 9851, 'loss/train': 0.7630469799041748} 01/27/2022 05:07:42 - INFO - codeparrot_training - Step 9852: {'lr': 0.0004677069968872769, 'samples': 1891776, 'steps': 9852, 'loss/train': 0.8244670629501343} 01/27/2022 05:07:45 - INFO - codeparrot_training - Step 9853: {'lr': 0.0004676989528335721, 'samples': 1891968, 'steps': 9853, 'loss/train': 0.9236890375614166} 01/27/2022 05:07:48 - INFO - codeparrot_training - Step 9854: {'lr': 0.0004676909078473142, 'samples': 1892160, 'steps': 9854, 'loss/train': 0.965815544128418} 01/27/2022 05:07:51 - INFO - codeparrot_training - Step 9855: {'lr': 0.00046768286192853736, 'samples': 1892352, 'steps': 9855, 'loss/train': 2.1495142579078674} 01/27/2022 05:07:54 - INFO - codeparrot_training - Step 9856: {'lr': 0.00046767481507727646, 'samples': 1892544, 'steps': 9856, 'loss/train': 0.6066224277019501} 01/27/2022 05:07:57 - INFO - codeparrot_training - Step 9857: {'lr': 0.00046766676729356564, 'samples': 1892736, 'steps': 9857, 'loss/train': 0.5795161128044128} 01/27/2022 05:08:00 - INFO - codeparrot_training - Step 9858: {'lr': 0.0004676587185774396, 'samples': 1892928, 'steps': 9858, 'loss/train': 0.9292364716529846} 01/27/2022 05:08:04 - INFO - codeparrot_training - Step 9859: {'lr': 0.00046765066892893266, 'samples': 1893120, 'steps': 9859, 
'loss/train': 1.031782329082489} 01/27/2022 05:08:08 - INFO - codeparrot_training - Step 9860: {'lr': 0.00046764261834807944, 'samples': 1893312, 'steps': 9860, 'loss/train': 0.46694768965244293} 01/27/2022 05:08:11 - INFO - codeparrot_training - Step 9861: {'lr': 0.0004676345668349142, 'samples': 1893504, 'steps': 9861, 'loss/train': 0.834747701883316} 01/27/2022 05:08:15 - INFO - codeparrot_training - Step 9862: {'lr': 0.0004676265143894717, 'samples': 1893696, 'steps': 9862, 'loss/train': 0.9974414706230164} 01/27/2022 05:08:18 - INFO - codeparrot_training - Step 9863: {'lr': 0.0004676184610117863, 'samples': 1893888, 'steps': 9863, 'loss/train': 1.3284332156181335} 01/27/2022 05:08:21 - INFO - codeparrot_training - Step 9864: {'lr': 0.0004676104067018925, 'samples': 1894080, 'steps': 9864, 'loss/train': 0.9928745627403259} 01/27/2022 05:08:24 - INFO - codeparrot_training - Step 9865: {'lr': 0.0004676023514598249, 'samples': 1894272, 'steps': 9865, 'loss/train': 0.6862851083278656} 01/27/2022 05:08:27 - INFO - codeparrot_training - Step 9866: {'lr': 0.0004675942952856178, 'samples': 1894464, 'steps': 9866, 'loss/train': 0.4278344660997391} 01/27/2022 05:08:30 - INFO - codeparrot_training - Step 9867: {'lr': 0.0004675862381793059, 'samples': 1894656, 'steps': 9867, 'loss/train': 0.33460336178541183} 01/27/2022 05:08:33 - INFO - codeparrot_training - Step 9868: {'lr': 0.0004675781801409236, 'samples': 1894848, 'steps': 9868, 'loss/train': 0.6396247744560242} 01/27/2022 05:08:38 - INFO - codeparrot_training - Step 9869: {'lr': 0.00046757012117050554, 'samples': 1895040, 'steps': 9869, 'loss/train': 1.0593091249465942} 01/27/2022 05:08:41 - INFO - codeparrot_training - Step 9870: {'lr': 0.00046756206126808607, 'samples': 1895232, 'steps': 9870, 'loss/train': 1.033915400505066} 01/27/2022 05:08:44 - INFO - codeparrot_training - Step 9871: {'lr': 0.0004675540004336999, 'samples': 1895424, 'steps': 9871, 'loss/train': 0.8194423913955688} 01/27/2022 05:08:47 - INFO - codeparrot_training - Step 9872: {'lr': 0.00046754593866738144, 'samples': 1895616, 'steps': 9872, 'loss/train': 0.6094836294651031} 01/27/2022 05:08:51 - INFO - codeparrot_training - Step 9873: {'lr': 0.0004675378759691652, 'samples': 1895808, 'steps': 9873, 'loss/train': 0.8015723526477814} 01/27/2022 05:08:54 - INFO - codeparrot_training - Step 9874: {'lr': 0.00046752981233908587, 'samples': 1896000, 'steps': 9874, 'loss/train': 1.184764176607132} 01/27/2022 05:08:57 - INFO - codeparrot_training - Step 9875: {'lr': 0.0004675217477771779, 'samples': 1896192, 'steps': 9875, 'loss/train': 0.7853537499904633} 01/27/2022 05:09:00 - INFO - codeparrot_training - Step 9876: {'lr': 0.0004675136822834758, 'samples': 1896384, 'steps': 9876, 'loss/train': 0.9146861135959625} 01/27/2022 05:09:03 - INFO - codeparrot_training - Step 9877: {'lr': 0.0004675056158580141, 'samples': 1896576, 'steps': 9877, 'loss/train': 0.9779843688011169} 01/27/2022 05:09:08 - INFO - codeparrot_training - Step 9878: {'lr': 0.0004674975485008275, 'samples': 1896768, 'steps': 9878, 'loss/train': 0.8672700226306915} 01/27/2022 05:09:11 - INFO - codeparrot_training - Step 9879: {'lr': 0.00046748948021195036, 'samples': 1896960, 'steps': 9879, 'loss/train': 1.1741250157356262} 01/27/2022 05:09:14 - INFO - codeparrot_training - Step 9880: {'lr': 0.0004674814109914174, 'samples': 1897152, 'steps': 9880, 'loss/train': 1.1119362115859985} 01/27/2022 05:09:17 - INFO - codeparrot_training - Step 9881: {'lr': 0.00046747334083926316, 'samples': 1897344, 'steps': 9881, 
'loss/train': 0.8449435830116272} 01/27/2022 05:09:20 - INFO - codeparrot_training - Step 9882: {'lr': 0.0004674652697555222, 'samples': 1897536, 'steps': 9882, 'loss/train': 0.34133362770080566} 01/27/2022 05:09:24 - INFO - codeparrot_training - Step 9883: {'lr': 0.000467457197740229, 'samples': 1897728, 'steps': 9883, 'loss/train': 1.3340635299682617} 01/27/2022 05:09:27 - INFO - codeparrot_training - Step 9884: {'lr': 0.00046744912479341826, 'samples': 1897920, 'steps': 9884, 'loss/train': 0.3410104885697365} 01/27/2022 05:09:30 - INFO - codeparrot_training - Step 9885: {'lr': 0.0004674410509151246, 'samples': 1898112, 'steps': 9885, 'loss/train': 0.7546420991420746} 01/27/2022 05:09:33 - INFO - codeparrot_training - Step 9886: {'lr': 0.0004674329761053824, 'samples': 1898304, 'steps': 9886, 'loss/train': 0.7748260796070099} 01/27/2022 05:09:38 - INFO - codeparrot_training - Step 9887: {'lr': 0.00046742490036422635, 'samples': 1898496, 'steps': 9887, 'loss/train': 0.7402768582105637} 01/27/2022 05:09:41 - INFO - codeparrot_training - Step 9888: {'lr': 0.00046741682369169115, 'samples': 1898688, 'steps': 9888, 'loss/train': 0.2072524130344391} 01/27/2022 05:09:45 - INFO - codeparrot_training - Step 9889: {'lr': 0.00046740874608781126, 'samples': 1898880, 'steps': 9889, 'loss/train': 1.5228275656700134} 01/27/2022 05:09:48 - INFO - codeparrot_training - Step 9890: {'lr': 0.0004674006675526214, 'samples': 1899072, 'steps': 9890, 'loss/train': 0.9619344770908356} 01/27/2022 05:09:51 - INFO - codeparrot_training - Step 9891: {'lr': 0.00046739258808615607, 'samples': 1899264, 'steps': 9891, 'loss/train': 0.6582333147525787} 01/27/2022 05:09:54 - INFO - codeparrot_training - Step 9892: {'lr': 0.00046738450768845, 'samples': 1899456, 'steps': 9892, 'loss/train': 0.4647851586341858} 01/27/2022 05:09:57 - INFO - codeparrot_training - Step 9893: {'lr': 0.0004673764263595376, 'samples': 1899648, 'steps': 9893, 'loss/train': 0.7767461836338043} 01/27/2022 05:10:00 - INFO - codeparrot_training - Step 9894: {'lr': 0.00046736834409945364, 'samples': 1899840, 'steps': 9894, 'loss/train': 0.7418145686388016} 01/27/2022 05:10:03 - INFO - codeparrot_training - Step 9895: {'lr': 0.0004673602609082328, 'samples': 1900032, 'steps': 9895, 'loss/train': 0.465616911649704} 01/27/2022 05:10:08 - INFO - codeparrot_training - Step 9896: {'lr': 0.00046735217678590957, 'samples': 1900224, 'steps': 9896, 'loss/train': 0.9274986684322357} 01/27/2022 05:10:11 - INFO - codeparrot_training - Step 9897: {'lr': 0.0004673440917325186, 'samples': 1900416, 'steps': 9897, 'loss/train': 0.7709591388702393} 01/27/2022 05:10:15 - INFO - codeparrot_training - Step 9898: {'lr': 0.00046733600574809465, 'samples': 1900608, 'steps': 9898, 'loss/train': 0.6106051504611969} 01/27/2022 05:10:18 - INFO - codeparrot_training - Step 9899: {'lr': 0.0004673279188326722, 'samples': 1900800, 'steps': 9899, 'loss/train': 0.6746719032526016} 01/27/2022 05:10:21 - INFO - codeparrot_training - Step 9900: {'lr': 0.00046731983098628597, 'samples': 1900992, 'steps': 9900, 'loss/train': 0.6639350205659866} 01/27/2022 05:10:24 - INFO - codeparrot_training - Step 9901: {'lr': 0.00046731174220897054, 'samples': 1901184, 'steps': 9901, 'loss/train': 0.821855753660202} 01/27/2022 05:10:27 - INFO - codeparrot_training - Step 9902: {'lr': 0.0004673036525007607, 'samples': 1901376, 'steps': 9902, 'loss/train': 0.8020432591438293} 01/27/2022 05:10:30 - INFO - codeparrot_training - Step 9903: {'lr': 0.0004672955618616909, 'samples': 1901568, 'steps': 9903, 
'loss/train': 1.0120823979377747} 01/27/2022 05:10:35 - INFO - codeparrot_training - Step 9904: {'lr': 0.00046728747029179594, 'samples': 1901760, 'steps': 9904, 'loss/train': 0.9971366822719574} 01/27/2022 05:10:39 - INFO - codeparrot_training - Step 9905: {'lr': 0.00046727937779111054, 'samples': 1901952, 'steps': 9905, 'loss/train': 0.7740155160427094} 01/27/2022 05:10:42 - INFO - codeparrot_training - Step 9906: {'lr': 0.0004672712843596693, 'samples': 1902144, 'steps': 9906, 'loss/train': 1.2957091927528381} 01/27/2022 05:10:45 - INFO - codeparrot_training - Step 9907: {'lr': 0.0004672631899975067, 'samples': 1902336, 'steps': 9907, 'loss/train': 1.085093915462494} 01/27/2022 05:10:48 - INFO - codeparrot_training - Step 9908: {'lr': 0.0004672550947046577, 'samples': 1902528, 'steps': 9908, 'loss/train': 1.0051438808441162} 01/27/2022 05:10:51 - INFO - codeparrot_training - Step 9909: {'lr': 0.0004672469984811568, 'samples': 1902720, 'steps': 9909, 'loss/train': 0.9407480657100677} 01/27/2022 05:10:54 - INFO - codeparrot_training - Step 9910: {'lr': 0.00046723890132703886, 'samples': 1902912, 'steps': 9910, 'loss/train': 0.7886335551738739} 01/27/2022 05:10:57 - INFO - codeparrot_training - Step 9911: {'lr': 0.0004672308032423384, 'samples': 1903104, 'steps': 9911, 'loss/train': 0.8059402406215668} 01/27/2022 05:11:01 - INFO - codeparrot_training - Step 9912: {'lr': 0.0004672227042270901, 'samples': 1903296, 'steps': 9912, 'loss/train': 0.3988105058670044} 01/27/2022 05:11:05 - INFO - codeparrot_training - Step 9913: {'lr': 0.00046721460428132873, 'samples': 1903488, 'steps': 9913, 'loss/train': 1.1815135180950165} 01/27/2022 05:11:08 - INFO - codeparrot_training - Step 9914: {'lr': 0.00046720650340508895, 'samples': 1903680, 'steps': 9914, 'loss/train': 1.445079892873764} 01/27/2022 05:11:11 - INFO - codeparrot_training - Step 9915: {'lr': 0.00046719840159840557, 'samples': 1903872, 'steps': 9915, 'loss/train': 0.6123143881559372} 01/27/2022 05:11:14 - INFO - codeparrot_training - Step 9916: {'lr': 0.00046719029886131317, 'samples': 1904064, 'steps': 9916, 'loss/train': 0.6682193577289581} 01/27/2022 05:11:18 - INFO - codeparrot_training - Step 9917: {'lr': 0.0004671821951938464, 'samples': 1904256, 'steps': 9917, 'loss/train': 0.9198486506938934} 01/27/2022 05:11:21 - INFO - codeparrot_training - Step 9918: {'lr': 0.0004671740905960401, 'samples': 1904448, 'steps': 9918, 'loss/train': 0.9266087114810944} 01/27/2022 05:11:24 - INFO - codeparrot_training - Step 9919: {'lr': 0.00046716598506792905, 'samples': 1904640, 'steps': 9919, 'loss/train': 0.6497261971235275} 01/27/2022 05:11:27 - INFO - codeparrot_training - Step 9920: {'lr': 0.00046715787860954785, 'samples': 1904832, 'steps': 9920, 'loss/train': 0.9209049940109253} 01/27/2022 05:11:30 - INFO - codeparrot_training - Step 9921: {'lr': 0.0004671497712209312, 'samples': 1905024, 'steps': 9921, 'loss/train': 0.5348566621541977} 01/27/2022 05:11:35 - INFO - codeparrot_training - Step 9922: {'lr': 0.0004671416629021139, 'samples': 1905216, 'steps': 9922, 'loss/train': 0.7124885469675064} 01/27/2022 05:11:38 - INFO - codeparrot_training - Step 9923: {'lr': 0.0004671335536531307, 'samples': 1905408, 'steps': 9923, 'loss/train': 0.8924842178821564} 01/27/2022 05:11:41 - INFO - codeparrot_training - Step 9924: {'lr': 0.00046712544347401623, 'samples': 1905600, 'steps': 9924, 'loss/train': 0.8466224670410156} 01/27/2022 05:11:44 - INFO - codeparrot_training - Step 9925: {'lr': 0.0004671173323648054, 'samples': 1905792, 'steps': 9925, 
'loss/train': 0.71087846159935} 01/27/2022 05:11:47 - INFO - codeparrot_training - Step 9926: {'lr': 0.00046710922032553283, 'samples': 1905984, 'steps': 9926, 'loss/train': 1.0649833381175995} 01/27/2022 05:11:51 - INFO - codeparrot_training - Step 9927: {'lr': 0.00046710110735623326, 'samples': 1906176, 'steps': 9927, 'loss/train': 0.9730616211891174} 01/27/2022 05:11:54 - INFO - codeparrot_training - Step 9928: {'lr': 0.00046709299345694156, 'samples': 1906368, 'steps': 9928, 'loss/train': 0.7995707988739014} 01/27/2022 05:11:57 - INFO - codeparrot_training - Step 9929: {'lr': 0.00046708487862769235, 'samples': 1906560, 'steps': 9929, 'loss/train': 0.7286640554666519} 01/27/2022 05:12:00 - INFO - codeparrot_training - Step 9930: {'lr': 0.0004670767628685204, 'samples': 1906752, 'steps': 9930, 'loss/train': 1.0004598498344421} 01/27/2022 05:12:06 - INFO - codeparrot_training - Step 9931: {'lr': 0.00046706864617946064, 'samples': 1906944, 'steps': 9931, 'loss/train': 0.8247159719467163} 01/27/2022 05:12:09 - INFO - codeparrot_training - Step 9932: {'lr': 0.0004670605285605477, 'samples': 1907136, 'steps': 9932, 'loss/train': 0.9913876354694366} 01/27/2022 05:12:12 - INFO - codeparrot_training - Step 9933: {'lr': 0.0004670524100118163, 'samples': 1907328, 'steps': 9933, 'loss/train': 2.548993706703186} 01/27/2022 05:12:15 - INFO - codeparrot_training - Step 9934: {'lr': 0.00046704429053330137, 'samples': 1907520, 'steps': 9934, 'loss/train': 1.1002099514007568} 01/27/2022 05:12:18 - INFO - codeparrot_training - Step 9935: {'lr': 0.00046703617012503764, 'samples': 1907712, 'steps': 9935, 'loss/train': 1.15547713637352} 01/27/2022 05:12:21 - INFO - codeparrot_training - Step 9936: {'lr': 0.00046702804878705987, 'samples': 1907904, 'steps': 9936, 'loss/train': 1.0456304848194122} 01/27/2022 05:12:25 - INFO - codeparrot_training - Step 9937: {'lr': 0.00046701992651940275, 'samples': 1908096, 'steps': 9937, 'loss/train': 0.5583078861236572} 01/27/2022 05:12:28 - INFO - codeparrot_training - Step 9938: {'lr': 0.00046701180332210125, 'samples': 1908288, 'steps': 9938, 'loss/train': 0.8989236652851105} 01/27/2022 05:12:31 - INFO - codeparrot_training - Step 9939: {'lr': 0.0004670036791951901, 'samples': 1908480, 'steps': 9939, 'loss/train': 0.5415217727422714} 01/27/2022 05:12:36 - INFO - codeparrot_training - Step 9940: {'lr': 0.0004669955541387041, 'samples': 1908672, 'steps': 9940, 'loss/train': 1.2291463315486908} 01/27/2022 05:12:39 - INFO - codeparrot_training - Step 9941: {'lr': 0.000466987428152678, 'samples': 1908864, 'steps': 9941, 'loss/train': 1.1611782610416412} 01/27/2022 05:12:42 - INFO - codeparrot_training - Step 9942: {'lr': 0.00046697930123714673, 'samples': 1909056, 'steps': 9942, 'loss/train': 0.9872477352619171} 01/27/2022 05:12:45 - INFO - codeparrot_training - Step 9943: {'lr': 0.000466971173392145, 'samples': 1909248, 'steps': 9943, 'loss/train': 0.593122199177742} 01/27/2022 05:12:48 - INFO - codeparrot_training - Step 9944: {'lr': 0.0004669630446177077, 'samples': 1909440, 'steps': 9944, 'loss/train': 1.3090420067310333} 01/27/2022 05:12:51 - INFO - codeparrot_training - Step 9945: {'lr': 0.00046695491491386955, 'samples': 1909632, 'steps': 9945, 'loss/train': 1.072484314441681} 01/27/2022 05:12:54 - INFO - codeparrot_training - Step 9946: {'lr': 0.0004669467842806654, 'samples': 1909824, 'steps': 9946, 'loss/train': 1.2787722945213318} 01/27/2022 05:12:58 - INFO - codeparrot_training - Step 9947: {'lr': 0.00046693865271813016, 'samples': 1910016, 'steps': 9947, 
'loss/train': 0.6553361266851425} 01/27/2022 05:13:01 - INFO - codeparrot_training - Step 9948: {'lr': 0.0004669305202262987, 'samples': 1910208, 'steps': 9948, 'loss/train': 5.157761335372925} 01/27/2022 05:13:05 - INFO - codeparrot_training - Step 9949: {'lr': 0.00046692238680520564, 'samples': 1910400, 'steps': 9949, 'loss/train': 0.7871299982070923} 01/27/2022 05:13:08 - INFO - codeparrot_training - Step 9950: {'lr': 0.00046691425245488607, 'samples': 1910592, 'steps': 9950, 'loss/train': 0.8346870839595795} 01/27/2022 05:13:12 - INFO - codeparrot_training - Step 9951: {'lr': 0.0004669061171753746, 'samples': 1910784, 'steps': 9951, 'loss/train': 0.6847431063652039} 01/27/2022 05:13:15 - INFO - codeparrot_training - Step 9952: {'lr': 0.0004668979809667063, 'samples': 1910976, 'steps': 9952, 'loss/train': 0.3461862802505493} 01/27/2022 05:13:18 - INFO - codeparrot_training - Step 9953: {'lr': 0.0004668898438289159, 'samples': 1911168, 'steps': 9953, 'loss/train': 0.14535212516784668} 01/27/2022 05:13:21 - INFO - codeparrot_training - Step 9954: {'lr': 0.00046688170576203827, 'samples': 1911360, 'steps': 9954, 'loss/train': 1.0517131090164185} 01/27/2022 05:13:24 - INFO - codeparrot_training - Step 9955: {'lr': 0.00046687356676610825, 'samples': 1911552, 'steps': 9955, 'loss/train': 1.032987892627716} 01/27/2022 05:13:27 - INFO - codeparrot_training - Step 9956: {'lr': 0.00046686542684116073, 'samples': 1911744, 'steps': 9956, 'loss/train': 0.6784675866365433} 01/27/2022 05:13:30 - INFO - codeparrot_training - Step 9957: {'lr': 0.00046685728598723063, 'samples': 1911936, 'steps': 9957, 'loss/train': 0.911881685256958} 01/27/2022 05:13:35 - INFO - codeparrot_training - Step 9958: {'lr': 0.00046684914420435275, 'samples': 1912128, 'steps': 9958, 'loss/train': 0.8291423320770264} 01/27/2022 05:13:39 - INFO - codeparrot_training - Step 9959: {'lr': 0.00046684100149256205, 'samples': 1912320, 'steps': 9959, 'loss/train': 0.8286493420600891} 01/27/2022 05:13:42 - INFO - codeparrot_training - Step 9960: {'lr': 0.0004668328578518933, 'samples': 1912512, 'steps': 9960, 'loss/train': 0.94197016954422} 01/27/2022 05:13:45 - INFO - codeparrot_training - Step 9961: {'lr': 0.0004668247132823814, 'samples': 1912704, 'steps': 9961, 'loss/train': 0.45117759704589844} 01/27/2022 05:13:48 - INFO - codeparrot_training - Step 9962: {'lr': 0.00046681656778406136, 'samples': 1912896, 'steps': 9962, 'loss/train': 1.0880067050457} 01/27/2022 05:13:51 - INFO - codeparrot_training - Step 9963: {'lr': 0.000466808421356968, 'samples': 1913088, 'steps': 9963, 'loss/train': 0.7521657049655914} 01/27/2022 05:13:54 - INFO - codeparrot_training - Step 9964: {'lr': 0.00046680027400113614, 'samples': 1913280, 'steps': 9964, 'loss/train': 1.2890223562717438} 01/27/2022 05:13:57 - INFO - codeparrot_training - Step 9965: {'lr': 0.0004667921257166008, 'samples': 1913472, 'steps': 9965, 'loss/train': 0.8879135549068451} 01/27/2022 05:14:01 - INFO - codeparrot_training - Step 9966: {'lr': 0.00046678397650339677, 'samples': 1913664, 'steps': 9966, 'loss/train': 1.0588273108005524} 01/27/2022 05:14:05 - INFO - codeparrot_training - Step 9967: {'lr': 0.00046677582636155904, 'samples': 1913856, 'steps': 9967, 'loss/train': 1.1427773237228394} 01/27/2022 05:14:08 - INFO - codeparrot_training - Step 9968: {'lr': 0.00046676767529112254, 'samples': 1914048, 'steps': 9968, 'loss/train': 0.8637476563453674} 01/27/2022 05:14:11 - INFO - codeparrot_training - Step 9969: {'lr': 0.0004667595232921221, 'samples': 1914240, 'steps': 9969, 
'loss/train': 0.6287658959627151} 01/27/2022 05:14:14 - INFO - codeparrot_training - Step 9970: {'lr': 0.00046675137036459273, 'samples': 1914432, 'steps': 9970, 'loss/train': 0.6748470515012741} 01/27/2022 05:14:18 - INFO - codeparrot_training - Step 9971: {'lr': 0.0004667432165085693, 'samples': 1914624, 'steps': 9971, 'loss/train': 0.981174498796463} 01/27/2022 05:14:21 - INFO - codeparrot_training - Step 9972: {'lr': 0.00046673506172408675, 'samples': 1914816, 'steps': 9972, 'loss/train': 0.7567615807056427} 01/27/2022 05:14:24 - INFO - codeparrot_training - Step 9973: {'lr': 0.0004667269060111801, 'samples': 1915008, 'steps': 9973, 'loss/train': 0.7656403183937073} 01/27/2022 05:14:27 - INFO - codeparrot_training - Step 9974: {'lr': 0.0004667187493698841, 'samples': 1915200, 'steps': 9974, 'loss/train': 0.6469078660011292} 01/27/2022 05:14:32 - INFO - codeparrot_training - Step 9975: {'lr': 0.00046671059180023377, 'samples': 1915392, 'steps': 9975, 'loss/train': 1.1684363186359406} 01/27/2022 05:14:35 - INFO - codeparrot_training - Step 9976: {'lr': 0.0004667024333022642, 'samples': 1915584, 'steps': 9976, 'loss/train': 1.1471396684646606} 01/27/2022 05:14:38 - INFO - codeparrot_training - Step 9977: {'lr': 0.00046669427387601017, 'samples': 1915776, 'steps': 9977, 'loss/train': 0.7447823882102966} 01/27/2022 05:14:41 - INFO - codeparrot_training - Step 9978: {'lr': 0.0004666861135215066, 'samples': 1915968, 'steps': 9978, 'loss/train': 0.6785638332366943} 01/27/2022 05:14:45 - INFO - codeparrot_training - Step 9979: {'lr': 0.0004666779522387886, 'samples': 1916160, 'steps': 9979, 'loss/train': 1.4486593008041382} 01/27/2022 05:14:48 - INFO - codeparrot_training - Step 9980: {'lr': 0.000466669790027891, 'samples': 1916352, 'steps': 9980, 'loss/train': 0.8156457245349884} 01/27/2022 05:14:51 - INFO - codeparrot_training - Step 9981: {'lr': 0.00046666162688884893, 'samples': 1916544, 'steps': 9981, 'loss/train': 1.058842420578003} 01/27/2022 05:14:54 - INFO - codeparrot_training - Step 9982: {'lr': 0.0004666534628216972, 'samples': 1916736, 'steps': 9982, 'loss/train': 0.5804478675127029} 01/27/2022 05:14:57 - INFO - codeparrot_training - Step 9983: {'lr': 0.0004666452978264708, 'samples': 1916928, 'steps': 9983, 'loss/train': 0.8935233056545258} 01/27/2022 05:15:01 - INFO - codeparrot_training - Step 9984: {'lr': 0.0004666371319032047, 'samples': 1917120, 'steps': 9984, 'loss/train': 0.9927999973297119} 01/27/2022 05:15:05 - INFO - codeparrot_training - Step 9985: {'lr': 0.00046662896505193395, 'samples': 1917312, 'steps': 9985, 'loss/train': 1.0545498132705688} 01/27/2022 05:15:08 - INFO - codeparrot_training - Step 9986: {'lr': 0.00046662079727269356, 'samples': 1917504, 'steps': 9986, 'loss/train': 0.3117845132946968} 01/27/2022 05:15:11 - INFO - codeparrot_training - Step 9987: {'lr': 0.0004666126285655184, 'samples': 1917696, 'steps': 9987, 'loss/train': 1.0073846876621246} 01/27/2022 05:15:14 - INFO - codeparrot_training - Step 9988: {'lr': 0.0004666044589304436, 'samples': 1917888, 'steps': 9988, 'loss/train': 0.830070823431015} 01/27/2022 05:15:17 - INFO - codeparrot_training - Step 9989: {'lr': 0.000466596288367504, 'samples': 1918080, 'steps': 9989, 'loss/train': 0.8373678624629974} 01/27/2022 05:15:20 - INFO - codeparrot_training - Step 9990: {'lr': 0.0004665881168767346, 'samples': 1918272, 'steps': 9990, 'loss/train': 0.6192463785409927} 01/27/2022 05:15:24 - INFO - codeparrot_training - Step 9991: {'lr': 0.00046657994445817064, 'samples': 1918464, 'steps': 9991, 
'loss/train': 1.2562193870544434} 01/27/2022 05:15:27 - INFO - codeparrot_training - Step 9992: {'lr': 0.0004665717711118469, 'samples': 1918656, 'steps': 9992, 'loss/train': 0.9806605875492096} 01/27/2022 05:15:31 - INFO - codeparrot_training - Step 9993: {'lr': 0.00046656359683779845, 'samples': 1918848, 'steps': 9993, 'loss/train': 0.8713115751743317} 01/27/2022 05:15:34 - INFO - codeparrot_training - Step 9994: {'lr': 0.00046655542163606033, 'samples': 1919040, 'steps': 9994, 'loss/train': 0.9243660271167755} 01/27/2022 05:15:38 - INFO - codeparrot_training - Step 9995: {'lr': 0.0004665472455066675, 'samples': 1919232, 'steps': 9995, 'loss/train': 0.8440124988555908} 01/27/2022 05:15:41 - INFO - codeparrot_training - Step 9996: {'lr': 0.0004665390684496551, 'samples': 1919424, 'steps': 9996, 'loss/train': 0.9463157057762146} 01/27/2022 05:15:44 - INFO - codeparrot_training - Step 9997: {'lr': 0.0004665308904650581, 'samples': 1919616, 'steps': 9997, 'loss/train': 0.61367167532444} 01/27/2022 05:15:47 - INFO - codeparrot_training - Step 9998: {'lr': 0.00046652271155291146, 'samples': 1919808, 'steps': 9998, 'loss/train': 0.7927663922309875} 01/27/2022 05:15:50 - INFO - codeparrot_training - Step 9999: {'lr': 0.0004665145317132503, 'samples': 1920000, 'steps': 9999, 'loss/train': 0.9165996015071869} 01/27/2022 05:15:50 - INFO - codeparrot_training - Evaluating and saving model checkpoint 01/27/2022 05:16:08 - WARNING - huggingface_hub.repository - Several commits (5) will be pushed upstream. 01/27/2022 05:16:08 - WARNING - huggingface_hub.repository - The progress bars may be unreliable. 01/27/2022 05:16:45 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py 52f50af..71111c3 royal-monkey-12 -> royal-monkey-12 01/27/2022 05:16:50 - INFO - codeparrot_training - Step 10000: {'lr': 0.00046650635094610973, 'samples': 1920192, 'steps': 10000, 'loss/train': 0.6005915701389313} 01/27/2022 05:16:53 - INFO - codeparrot_training - Step 10001: {'lr': 0.00046649816925152456, 'samples': 1920384, 'steps': 10001, 'loss/train': 1.3139053881168365} 01/27/2022 05:16:57 - INFO - codeparrot_training - Step 10002: {'lr': 0.00046648998662953003, 'samples': 1920576, 'steps': 10002, 'loss/train': 1.1544798910617828} 01/27/2022 05:17:01 - INFO - codeparrot_training - Step 10003: {'lr': 0.00046648180308016116, 'samples': 1920768, 'steps': 10003, 'loss/train': 1.0219331681728363} 01/27/2022 05:17:04 - INFO - codeparrot_training - Step 10004: {'lr': 0.00046647361860345293, 'samples': 1920960, 'steps': 10004, 'loss/train': 1.0954038798809052} 01/27/2022 05:17:07 - INFO - codeparrot_training - Step 10005: {'lr': 0.00046646543319944057, 'samples': 1921152, 'steps': 10005, 'loss/train': 0.58241406083107} 01/27/2022 05:17:10 - INFO - codeparrot_training - Step 10006: {'lr': 0.00046645724686815893, 'samples': 1921344, 'steps': 10006, 'loss/train': 0.9176115095615387} 01/27/2022 05:17:13 - INFO - codeparrot_training - Step 10007: {'lr': 0.00046644905960964325, 'samples': 1921536, 'steps': 10007, 'loss/train': 0.7149100452661514} 01/27/2022 05:17:16 - INFO - codeparrot_training - Step 10008: {'lr': 0.00046644087142392845, 'samples': 1921728, 'steps': 10008, 'loss/train': 1.0030787587165833} 01/27/2022 05:17:19 - INFO - codeparrot_training - Step 10009: {'lr': 0.00046643268231104975, 'samples': 1921920, 'steps': 10009, 'loss/train': 1.0973497331142426} 01/27/2022 05:17:23 - INFO - codeparrot_training - Step 10010: {'lr': 0.00046642449227104213, 'samples': 1922112, 'steps': 10010, 
'loss/train': 0.9569967091083527} 01/27/2022 05:17:28 - INFO - codeparrot_training - Step 10011: {'lr': 0.00046641630130394066, 'samples': 1922304, 'steps': 10011, 'loss/train': 0.9721659421920776} 01/27/2022 05:17:31 - INFO - codeparrot_training - Step 10012: {'lr': 0.0004664081094097805, 'samples': 1922496, 'steps': 10012, 'loss/train': 0.7321601808071136} 01/27/2022 05:17:34 - INFO - codeparrot_training - Step 10013: {'lr': 0.00046639991658859684, 'samples': 1922688, 'steps': 10013, 'loss/train': 0.6792892813682556} 01/27/2022 05:17:37 - INFO - codeparrot_training - Step 10014: {'lr': 0.00046639172284042453, 'samples': 1922880, 'steps': 10014, 'loss/train': 0.6830310076475143} 01/27/2022 05:17:40 - INFO - codeparrot_training - Step 10015: {'lr': 0.00046638352816529883, 'samples': 1923072, 'steps': 10015, 'loss/train': 1.7280149459838867} 01/27/2022 05:17:43 - INFO - codeparrot_training - Step 10016: {'lr': 0.00046637533256325476, 'samples': 1923264, 'steps': 10016, 'loss/train': 1.306578129529953} 01/27/2022 05:17:47 - INFO - codeparrot_training - Step 10017: {'lr': 0.0004663671360343275, 'samples': 1923456, 'steps': 10017, 'loss/train': 0.5482209026813507} 01/27/2022 05:17:50 - INFO - codeparrot_training - Step 10018: {'lr': 0.00046635893857855217, 'samples': 1923648, 'steps': 10018, 'loss/train': 1.0194438993930817} 01/27/2022 05:17:53 - INFO - codeparrot_training - Step 10019: {'lr': 0.0004663507401959638, 'samples': 1923840, 'steps': 10019, 'loss/train': 0.30946578830480576} 01/27/2022 05:17:57 - INFO - codeparrot_training - Step 10020: {'lr': 0.00046634254088659757, 'samples': 1924032, 'steps': 10020, 'loss/train': 0.5171806365251541} 01/27/2022 05:18:01 - INFO - codeparrot_training - Step 10021: {'lr': 0.00046633434065048855, 'samples': 1924224, 'steps': 10021, 'loss/train': 1.037062257528305} 01/27/2022 05:18:04 - INFO - codeparrot_training - Step 10022: {'lr': 0.000466326139487672, 'samples': 1924416, 'steps': 10022, 'loss/train': 1.013926774263382} 01/27/2022 05:18:07 - INFO - codeparrot_training - Step 10023: {'lr': 0.0004663179373981829, 'samples': 1924608, 'steps': 10023, 'loss/train': 0.9440710544586182} 01/27/2022 05:18:10 - INFO - codeparrot_training - Step 10024: {'lr': 0.0004663097343820565, 'samples': 1924800, 'steps': 10024, 'loss/train': 0.9659089744091034} 01/27/2022 05:18:13 - INFO - codeparrot_training - Step 10025: {'lr': 0.00046630153043932784, 'samples': 1924992, 'steps': 10025, 'loss/train': 1.1719491183757782} 01/27/2022 05:18:16 - INFO - codeparrot_training - Step 10026: {'lr': 0.00046629332557003215, 'samples': 1925184, 'steps': 10026, 'loss/train': 0.8239561915397644} 01/27/2022 05:18:19 - INFO - codeparrot_training - Step 10027: {'lr': 0.00046628511977420443, 'samples': 1925376, 'steps': 10027, 'loss/train': 0.4438592791557312} 01/27/2022 05:18:23 - INFO - codeparrot_training - Step 10028: {'lr': 0.00046627691305188004, 'samples': 1925568, 'steps': 10028, 'loss/train': 0.5880804508924484} 01/27/2022 05:18:27 - INFO - codeparrot_training - Step 10029: {'lr': 0.00046626870540309394, 'samples': 1925760, 'steps': 10029, 'loss/train': 1.1258325576782227} 01/27/2022 05:18:30 - INFO - codeparrot_training - Step 10030: {'lr': 0.00046626049682788143, 'samples': 1925952, 'steps': 10030, 'loss/train': 3.5104161500930786} 01/27/2022 05:18:34 - INFO - codeparrot_training - Step 10031: {'lr': 0.00046625228732627763, 'samples': 1926144, 'steps': 10031, 'loss/train': 0.742792010307312} 01/27/2022 05:18:37 - INFO - codeparrot_training - Step 10032: {'lr': 
0.00046624407689831773, 'samples': 1926336, 'steps': 10032, 'loss/train': 1.020949512720108} 01/27/2022 05:18:40 - INFO - codeparrot_training - Step 10033: {'lr': 0.0004662358655440368, 'samples': 1926528, 'steps': 10033, 'loss/train': 1.4861469268798828} 01/27/2022 05:18:43 - INFO - codeparrot_training - Step 10034: {'lr': 0.0004662276532634701, 'samples': 1926720, 'steps': 10034, 'loss/train': 1.6191641092300415} 01/27/2022 05:18:46 - INFO - codeparrot_training - Step 10035: {'lr': 0.0004662194400566528, 'samples': 1926912, 'steps': 10035, 'loss/train': 0.9579936861991882} 01/27/2022 05:18:49 - INFO - codeparrot_training - Step 10036: {'lr': 0.0004662112259236201, 'samples': 1927104, 'steps': 10036, 'loss/train': 0.9603204131126404} 01/27/2022 05:18:52 - INFO - codeparrot_training - Step 10037: {'lr': 0.00046620301086440713, 'samples': 1927296, 'steps': 10037, 'loss/train': 0.9983008503913879} 01/27/2022 05:18:57 - INFO - codeparrot_training - Step 10038: {'lr': 0.00046619479487904915, 'samples': 1927488, 'steps': 10038, 'loss/train': 0.8358047604560852} 01/27/2022 05:19:01 - INFO - codeparrot_training - Step 10039: {'lr': 0.0004661865779675813, 'samples': 1927680, 'steps': 10039, 'loss/train': 0.7745360434055328} 01/27/2022 05:19:04 - INFO - codeparrot_training - Step 10040: {'lr': 0.0004661783601300388, 'samples': 1927872, 'steps': 10040, 'loss/train': 0.8982997834682465} 01/27/2022 05:19:07 - INFO - codeparrot_training - Step 10041: {'lr': 0.00046617014136645686, 'samples': 1928064, 'steps': 10041, 'loss/train': 0.6650847494602203} 01/27/2022 05:19:10 - INFO - codeparrot_training - Step 10042: {'lr': 0.00046616192167687066, 'samples': 1928256, 'steps': 10042, 'loss/train': 0.7121569812297821} 01/27/2022 05:19:13 - INFO - codeparrot_training - Step 10043: {'lr': 0.00046615370106131536, 'samples': 1928448, 'steps': 10043, 'loss/train': 0.8040236234664917} 01/27/2022 05:19:16 - INFO - codeparrot_training - Step 10044: {'lr': 0.00046614547951982636, 'samples': 1928640, 'steps': 10044, 'loss/train': 0.6524379998445511} 01/27/2022 05:19:19 - INFO - codeparrot_training - Step 10045: {'lr': 0.00046613725705243873, 'samples': 1928832, 'steps': 10045, 'loss/train': 0.9010388553142548} 01/27/2022 05:19:23 - INFO - codeparrot_training - Step 10046: {'lr': 0.0004661290336591877, 'samples': 1929024, 'steps': 10046, 'loss/train': 0.9410867393016815} 01/27/2022 05:19:27 - INFO - codeparrot_training - Step 10047: {'lr': 0.0004661208093401085, 'samples': 1929216, 'steps': 10047, 'loss/train': 1.1382524371147156} 01/27/2022 05:19:30 - INFO - codeparrot_training - Step 10048: {'lr': 0.0004661125840952364, 'samples': 1929408, 'steps': 10048, 'loss/train': 1.4960942566394806} 01/27/2022 05:19:33 - INFO - codeparrot_training - Step 10049: {'lr': 0.0004661043579246066, 'samples': 1929600, 'steps': 10049, 'loss/train': 0.6184068918228149} 01/27/2022 05:19:37 - INFO - codeparrot_training - Step 10050: {'lr': 0.00046609613082825436, 'samples': 1929792, 'steps': 10050, 'loss/train': 0.8521901965141296} 01/27/2022 05:19:40 - INFO - codeparrot_training - Step 10051: {'lr': 0.00046608790280621494, 'samples': 1929984, 'steps': 10051, 'loss/train': 1.0061456859111786} 01/27/2022 05:19:43 - INFO - codeparrot_training - Step 10052: {'lr': 0.0004660796738585235, 'samples': 1930176, 'steps': 10052, 'loss/train': 0.8312442004680634} 01/27/2022 05:19:46 - INFO - codeparrot_training - Step 10053: {'lr': 0.0004660714439852154, 'samples': 1930368, 'steps': 10053, 'loss/train': 0.4958902448415756} 01/27/2022 05:19:49 - INFO - 
codeparrot_training - Step 10054: {'lr': 0.0004660632131863258, 'samples': 1930560, 'steps': 10054, 'loss/train': 0.939630925655365} 01/27/2022 05:19:54 - INFO - codeparrot_training - Step 10055: {'lr': 0.0004660549814618901, 'samples': 1930752, 'steps': 10055, 'loss/train': 0.7676550149917603} 01/27/2022 05:19:57 - INFO - codeparrot_training - Step 10056: {'lr': 0.00046604674881194335, 'samples': 1930944, 'steps': 10056, 'loss/train': 0.5162013173103333} 01/27/2022 05:20:00 - INFO - codeparrot_training - Step 10057: {'lr': 0.000466038515236521, 'samples': 1931136, 'steps': 10057, 'loss/train': 0.7474970519542694} 01/27/2022 05:20:04 - INFO - codeparrot_training - Step 10058: {'lr': 0.0004660302807356582, 'samples': 1931328, 'steps': 10058, 'loss/train': 1.2691766023635864} 01/27/2022 05:20:07 - INFO - codeparrot_training - Step 10059: {'lr': 0.0004660220453093903, 'samples': 1931520, 'steps': 10059, 'loss/train': 0.23723217844963074} 01/27/2022 05:20:10 - INFO - codeparrot_training - Step 10060: {'lr': 0.0004660138089577526, 'samples': 1931712, 'steps': 10060, 'loss/train': 1.1411873996257782} 01/27/2022 05:20:13 - INFO - codeparrot_training - Step 10061: {'lr': 0.00046600557168078026, 'samples': 1931904, 'steps': 10061, 'loss/train': 0.5550977289676666} 01/27/2022 05:20:16 - INFO - codeparrot_training - Step 10062: {'lr': 0.0004659973334785087, 'samples': 1932096, 'steps': 10062, 'loss/train': 0.6990444213151932} 01/27/2022 05:20:19 - INFO - codeparrot_training - Step 10063: {'lr': 0.00046598909435097315, 'samples': 1932288, 'steps': 10063, 'loss/train': 0.6241828948259354} 01/27/2022 05:20:24 - INFO - codeparrot_training - Step 10064: {'lr': 0.0004659808542982088, 'samples': 1932480, 'steps': 10064, 'loss/train': 1.1722317337989807} 01/27/2022 05:20:27 - INFO - codeparrot_training - Step 10065: {'lr': 0.0004659726133202512, 'samples': 1932672, 'steps': 10065, 'loss/train': 0.8662413954734802} 01/27/2022 05:20:30 - INFO - codeparrot_training - Step 10066: {'lr': 0.0004659643714171354, 'samples': 1932864, 'steps': 10066, 'loss/train': 0.949228048324585} 01/27/2022 05:20:33 - INFO - codeparrot_training - Step 10067: {'lr': 0.00046595612858889686, 'samples': 1933056, 'steps': 10067, 'loss/train': 0.8883390426635742} 01/27/2022 05:20:36 - INFO - codeparrot_training - Step 10068: {'lr': 0.00046594788483557084, 'samples': 1933248, 'steps': 10068, 'loss/train': 0.8156486749649048} 01/27/2022 05:20:39 - INFO - codeparrot_training - Step 10069: {'lr': 0.00046593964015719257, 'samples': 1933440, 'steps': 10069, 'loss/train': 0.6750431209802628} 01/27/2022 05:20:43 - INFO - codeparrot_training - Step 10070: {'lr': 0.0004659313945537975, 'samples': 1933632, 'steps': 10070, 'loss/train': 0.9120054244995117} 01/27/2022 05:20:46 - INFO - codeparrot_training - Step 10071: {'lr': 0.00046592314802542095, 'samples': 1933824, 'steps': 10071, 'loss/train': 0.9926100075244904} 01/27/2022 05:20:49 - INFO - codeparrot_training - Step 10072: {'lr': 0.0004659149005720982, 'samples': 1934016, 'steps': 10072, 'loss/train': 1.1689247488975525} 01/27/2022 05:20:54 - INFO - codeparrot_training - Step 10073: {'lr': 0.00046590665219386454, 'samples': 1934208, 'steps': 10073, 'loss/train': 0.8250739574432373} 01/27/2022 05:20:57 - INFO - codeparrot_training - Step 10074: {'lr': 0.0004658984028907553, 'samples': 1934400, 'steps': 10074, 'loss/train': 0.6275497376918793} 01/27/2022 05:21:00 - INFO - codeparrot_training - Step 10075: {'lr': 0.0004658901526628059, 'samples': 1934592, 'steps': 10075, 'loss/train': 
0.8667729198932648} 01/27/2022 05:21:03 - INFO - codeparrot_training - Step 10076: {'lr': 0.00046588190151005163, 'samples': 1934784, 'steps': 10076, 'loss/train': 0.7754137516021729} 01/27/2022 05:21:06 - INFO - codeparrot_training - Step 10077: {'lr': 0.00046587364943252783, 'samples': 1934976, 'steps': 10077, 'loss/train': 1.485621839761734} 01/27/2022 05:21:09 - INFO - codeparrot_training - Step 10078: {'lr': 0.00046586539643026994, 'samples': 1935168, 'steps': 10078, 'loss/train': 1.5708485841751099} 01/27/2022 05:21:13 - INFO - codeparrot_training - Step 10079: {'lr': 0.0004658571425033131, 'samples': 1935360, 'steps': 10079, 'loss/train': 0.37208517640829086} 01/27/2022 05:21:16 - INFO - codeparrot_training - Step 10080: {'lr': 0.0004658488876516929, 'samples': 1935552, 'steps': 10080, 'loss/train': 0.543111115694046} 01/27/2022 05:21:19 - INFO - codeparrot_training - Step 10081: {'lr': 0.0004658406318754446, 'samples': 1935744, 'steps': 10081, 'loss/train': 0.7215395718812943} 01/27/2022 05:21:25 - INFO - codeparrot_training - Step 10082: {'lr': 0.0004658323751746036, 'samples': 1935936, 'steps': 10082, 'loss/train': 0.820753276348114} 01/27/2022 05:21:28 - INFO - codeparrot_training - Step 10083: {'lr': 0.00046582411754920517, 'samples': 1936128, 'steps': 10083, 'loss/train': 0.8272719383239746} 01/27/2022 05:21:31 - INFO - codeparrot_training - Step 10084: {'lr': 0.0004658158589992848, 'samples': 1936320, 'steps': 10084, 'loss/train': 0.6392053216695786} 01/27/2022 05:21:34 - INFO - codeparrot_training - Step 10085: {'lr': 0.00046580759952487776, 'samples': 1936512, 'steps': 10085, 'loss/train': 2.8909133076667786} 01/27/2022 05:21:37 - INFO - codeparrot_training - Step 10086: {'lr': 0.00046579933912601956, 'samples': 1936704, 'steps': 10086, 'loss/train': 0.7056777477264404} 01/27/2022 05:21:40 - INFO - codeparrot_training - Step 10087: {'lr': 0.00046579107780274543, 'samples': 1936896, 'steps': 10087, 'loss/train': 0.9488657712936401} 01/27/2022 05:21:44 - INFO - codeparrot_training - Step 10088: {'lr': 0.00046578281555509094, 'samples': 1937088, 'steps': 10088, 'loss/train': 0.642118826508522} 01/27/2022 05:21:47 - INFO - codeparrot_training - Step 10089: {'lr': 0.0004657745523830914, 'samples': 1937280, 'steps': 10089, 'loss/train': 2.249519169330597} 01/27/2022 05:21:50 - INFO - codeparrot_training - Step 10090: {'lr': 0.0004657662882867821, 'samples': 1937472, 'steps': 10090, 'loss/train': 1.1383342444896698} 01/27/2022 05:21:53 - INFO - codeparrot_training - Step 10091: {'lr': 0.0004657580232661985, 'samples': 1937664, 'steps': 10091, 'loss/train': 0.6141999363899231} 01/27/2022 05:21:57 - INFO - codeparrot_training - Step 10092: {'lr': 0.00046574975732137613, 'samples': 1937856, 'steps': 10092, 'loss/train': 1.1729754209518433} 01/27/2022 05:22:01 - INFO - codeparrot_training - Step 10093: {'lr': 0.0004657414904523504, 'samples': 1938048, 'steps': 10093, 'loss/train': 1.0357111394405365} 01/27/2022 05:22:04 - INFO - codeparrot_training - Step 10094: {'lr': 0.0004657332226591565, 'samples': 1938240, 'steps': 10094, 'loss/train': 1.0131386518478394} 01/27/2022 05:22:07 - INFO - codeparrot_training - Step 10095: {'lr': 0.00046572495394183, 'samples': 1938432, 'steps': 10095, 'loss/train': 0.6240531653165817} 01/27/2022 05:22:10 - INFO - codeparrot_training - Step 10096: {'lr': 0.00046571668430040624, 'samples': 1938624, 'steps': 10096, 'loss/train': 0.46570149064064026} 01/27/2022 05:22:13 - INFO - codeparrot_training - Step 10097: {'lr': 0.0004657084137349208, 'samples': 
1938816, 'steps': 10097, 'loss/train': 1.5380219221115112} 01/27/2022 05:22:16 - INFO - codeparrot_training - Step 10098: {'lr': 0.0004657001422454089, 'samples': 1939008, 'steps': 10098, 'loss/train': 0.43187785148620605} 01/27/2022 05:22:19 - INFO - codeparrot_training - Step 10099: {'lr': 0.0004656918698319062, 'samples': 1939200, 'steps': 10099, 'loss/train': 1.1010900735855103} 01/27/2022 05:22:23 - INFO - codeparrot_training - Step 10100: {'lr': 0.00046568359649444796, 'samples': 1939392, 'steps': 10100, 'loss/train': 1.0717078149318695} 01/27/2022 05:22:27 - INFO - codeparrot_training - Step 10101: {'lr': 0.0004656753222330697, 'samples': 1939584, 'steps': 10101, 'loss/train': 1.1960492134094238} 01/27/2022 05:22:30 - INFO - codeparrot_training - Step 10102: {'lr': 0.0004656670470478068, 'samples': 1939776, 'steps': 10102, 'loss/train': 0.8107607066631317} 01/27/2022 05:22:33 - INFO - codeparrot_training - Step 10103: {'lr': 0.0004656587709386948, 'samples': 1939968, 'steps': 10103, 'loss/train': 0.9459178447723389} 01/27/2022 05:22:36 - INFO - codeparrot_training - Step 10104: {'lr': 0.00046565049390576906, 'samples': 1940160, 'steps': 10104, 'loss/train': 0.7564416825771332} 01/27/2022 05:22:40 - INFO - codeparrot_training - Step 10105: {'lr': 0.0004656422159490652, 'samples': 1940352, 'steps': 10105, 'loss/train': 0.5315505713224411} 01/27/2022 05:22:43 - INFO - codeparrot_training - Step 10106: {'lr': 0.00046563393706861847, 'samples': 1940544, 'steps': 10106, 'loss/train': 0.24555686116218567} 01/27/2022 05:22:46 - INFO - codeparrot_training - Step 10107: {'lr': 0.00046562565726446437, 'samples': 1940736, 'steps': 10107, 'loss/train': 2.2314752340316772} 01/27/2022 05:22:49 - INFO - codeparrot_training - Step 10108: {'lr': 0.0004656173765366385, 'samples': 1940928, 'steps': 10108, 'loss/train': 0.8039705157279968} 01/27/2022 05:22:52 - INFO - codeparrot_training - Step 10109: {'lr': 0.00046560909488517623, 'samples': 1941120, 'steps': 10109, 'loss/train': 0.7014846950769424} 01/27/2022 05:22:57 - INFO - codeparrot_training - Step 10110: {'lr': 0.0004656008123101131, 'samples': 1941312, 'steps': 10110, 'loss/train': 1.0767475962638855} 01/27/2022 05:23:00 - INFO - codeparrot_training - Step 10111: {'lr': 0.0004655925288114845, 'samples': 1941504, 'steps': 10111, 'loss/train': 1.4036001563072205} 01/27/2022 05:23:03 - INFO - codeparrot_training - Step 10112: {'lr': 0.000465584244389326, 'samples': 1941696, 'steps': 10112, 'loss/train': 0.7590197324752808} 01/27/2022 05:23:06 - INFO - codeparrot_training - Step 10113: {'lr': 0.000465575959043673, 'samples': 1941888, 'steps': 10113, 'loss/train': 1.1214948892593384} 01/27/2022 05:23:09 - INFO - codeparrot_training - Step 10114: {'lr': 0.0004655676727745611, 'samples': 1942080, 'steps': 10114, 'loss/train': 0.7798351943492889} 01/27/2022 05:23:12 - INFO - codeparrot_training - Step 10115: {'lr': 0.0004655593855820257, 'samples': 1942272, 'steps': 10115, 'loss/train': 0.7367156893014908} 01/27/2022 05:23:16 - INFO - codeparrot_training - Step 10116: {'lr': 0.00046555109746610244, 'samples': 1942464, 'steps': 10116, 'loss/train': 0.8567930459976196} 01/27/2022 05:23:19 - INFO - codeparrot_training - Step 10117: {'lr': 0.0004655428084268266, 'samples': 1942656, 'steps': 10117, 'loss/train': 2.2256911396980286} 01/27/2022 05:23:22 - INFO - codeparrot_training - Step 10118: {'lr': 0.00046553451846423387, 'samples': 1942848, 'steps': 10118, 'loss/train': 0.5767085552215576} 01/27/2022 05:23:27 - INFO - codeparrot_training - Step 10119: 
{'lr': 0.0004655262275783597, 'samples': 1943040, 'steps': 10119, 'loss/train': 0.9733096361160278} 01/27/2022 05:23:30 - INFO - codeparrot_training - Step 10120: {'lr': 0.00046551793576923964, 'samples': 1943232, 'steps': 10120, 'loss/train': 1.2957590818405151} 01/27/2022 05:23:33 - INFO - codeparrot_training - Step 10121: {'lr': 0.0004655096430369091, 'samples': 1943424, 'steps': 10121, 'loss/train': 0.3128966689109802} 01/27/2022 05:23:36 - INFO - codeparrot_training - Step 10122: {'lr': 0.00046550134938140375, 'samples': 1943616, 'steps': 10122, 'loss/train': 0.6644976586103439} 01/27/2022 05:23:40 - INFO - codeparrot_training - Step 10123: {'lr': 0.00046549305480275894, 'samples': 1943808, 'steps': 10123, 'loss/train': 0.9163658916950226} 01/27/2022 05:23:43 - INFO - codeparrot_training - Step 10124: {'lr': 0.0004654847593010104, 'samples': 1944000, 'steps': 10124, 'loss/train': 0.9010556638240814} 01/27/2022 05:23:46 - INFO - codeparrot_training - Step 10125: {'lr': 0.00046547646287619363, 'samples': 1944192, 'steps': 10125, 'loss/train': 0.8808541595935822} 01/27/2022 05:23:49 - INFO - codeparrot_training - Step 10126: {'lr': 0.00046546816552834404, 'samples': 1944384, 'steps': 10126, 'loss/train': 1.1384673714637756} 01/27/2022 05:23:53 - INFO - codeparrot_training - Step 10127: {'lr': 0.00046545986725749725, 'samples': 1944576, 'steps': 10127, 'loss/train': 0.9273562431335449} 01/27/2022 05:23:57 - INFO - codeparrot_training - Step 10128: {'lr': 0.0004654515680636888, 'samples': 1944768, 'steps': 10128, 'loss/train': 0.9514276385307312} 01/27/2022 05:24:00 - INFO - codeparrot_training - Step 10129: {'lr': 0.00046544326794695424, 'samples': 1944960, 'steps': 10129, 'loss/train': 0.6858924329280853} 01/27/2022 05:24:03 - INFO - codeparrot_training - Step 10130: {'lr': 0.00046543496690732914, 'samples': 1945152, 'steps': 10130, 'loss/train': 0.7141848653554916} 01/27/2022 05:24:06 - INFO - codeparrot_training - Step 10131: {'lr': 0.0004654266649448491, 'samples': 1945344, 'steps': 10131, 'loss/train': 1.4058441817760468} 01/27/2022 05:24:09 - INFO - codeparrot_training - Step 10132: {'lr': 0.00046541836205954955, 'samples': 1945536, 'steps': 10132, 'loss/train': 0.6335228383541107} 01/27/2022 05:24:12 - INFO - codeparrot_training - Step 10133: {'lr': 0.0004654100582514662, 'samples': 1945728, 'steps': 10133, 'loss/train': 1.3932573795318604} 01/27/2022 05:24:16 - INFO - codeparrot_training - Step 10134: {'lr': 0.0004654017535206345, 'samples': 1945920, 'steps': 10134, 'loss/train': 0.6176371425390244} 01/27/2022 05:24:19 - INFO - codeparrot_training - Step 10135: {'lr': 0.00046539344786709013, 'samples': 1946112, 'steps': 10135, 'loss/train': 1.230492889881134} 01/27/2022 05:24:24 - INFO - codeparrot_training - Step 10136: {'lr': 0.0004653851412908686, 'samples': 1946304, 'steps': 10136, 'loss/train': 0.7010306864976883} 01/27/2022 05:24:27 - INFO - codeparrot_training - Step 10137: {'lr': 0.0004653768337920056, 'samples': 1946496, 'steps': 10137, 'loss/train': 0.9096249639987946} 01/27/2022 05:24:30 - INFO - codeparrot_training - Step 10138: {'lr': 0.00046536852537053654, 'samples': 1946688, 'steps': 10138, 'loss/train': 0.8495095074176788} 01/27/2022 05:24:33 - INFO - codeparrot_training - Step 10139: {'lr': 0.00046536021602649715, 'samples': 1946880, 'steps': 10139, 'loss/train': 0.7778137028217316} 01/27/2022 05:24:37 - INFO - codeparrot_training - Step 10140: {'lr': 0.0004653519057599229, 'samples': 1947072, 'steps': 10140, 'loss/train': 0.27843426167964935} 01/27/2022 05:24:40 
- INFO - codeparrot_training - Step 10141: {'lr': 0.0004653435945708496, 'samples': 1947264, 'steps': 10141, 'loss/train': 0.7884618043899536} 01/27/2022 05:24:43 - INFO - codeparrot_training - Step 10142: {'lr': 0.00046533528245931266, 'samples': 1947456, 'steps': 10142, 'loss/train': 0.38102254271507263} 01/27/2022 05:24:46 - INFO - codeparrot_training - Step 10143: {'lr': 0.0004653269694253477, 'samples': 1947648, 'steps': 10143, 'loss/train': 0.979806661605835} 01/27/2022 05:24:49 - INFO - codeparrot_training - Step 10144: {'lr': 0.00046531865546899044, 'samples': 1947840, 'steps': 10144, 'loss/train': 0.8660438060760498} 01/27/2022 05:24:54 - INFO - codeparrot_training - Step 10145: {'lr': 0.00046531034059027644, 'samples': 1948032, 'steps': 10145, 'loss/train': 3.819414496421814} 01/27/2022 05:24:57 - INFO - codeparrot_training - Step 10146: {'lr': 0.0004653020247892412, 'samples': 1948224, 'steps': 10146, 'loss/train': 0.5981952399015427} 01/27/2022 05:25:00 - INFO - codeparrot_training - Step 10147: {'lr': 0.0004652937080659206, 'samples': 1948416, 'steps': 10147, 'loss/train': 0.8464160263538361} 01/27/2022 05:25:03 - INFO - codeparrot_training - Step 10148: {'lr': 0.00046528539042035, 'samples': 1948608, 'steps': 10148, 'loss/train': 0.7645143270492554} 01/27/2022 05:25:06 - INFO - codeparrot_training - Step 10149: {'lr': 0.0004652770718525652, 'samples': 1948800, 'steps': 10149, 'loss/train': 0.519204631447792} 01/27/2022 05:25:09 - INFO - codeparrot_training - Step 10150: {'lr': 0.0004652687523626018, 'samples': 1948992, 'steps': 10150, 'loss/train': 1.222640186548233} 01/27/2022 05:25:12 - INFO - codeparrot_training - Step 10151: {'lr': 0.0004652604319504954, 'samples': 1949184, 'steps': 10151, 'loss/train': 0.46195225417613983} 01/27/2022 05:25:15 - INFO - codeparrot_training - Step 10152: {'lr': 0.00046525211061628163, 'samples': 1949376, 'steps': 10152, 'loss/train': 0.8601406216621399} 01/27/2022 05:25:19 - INFO - codeparrot_training - Step 10153: {'lr': 0.0004652437883599962, 'samples': 1949568, 'steps': 10153, 'loss/train': 1.0684794187545776} 01/27/2022 05:25:23 - INFO - codeparrot_training - Step 10154: {'lr': 0.0004652354651816747, 'samples': 1949760, 'steps': 10154, 'loss/train': 1.1448250114917755} 01/27/2022 05:25:26 - INFO - codeparrot_training - Step 10155: {'lr': 0.0004652271410813529, 'samples': 1949952, 'steps': 10155, 'loss/train': 0.48094792664051056} 01/27/2022 05:25:29 - INFO - codeparrot_training - Step 10156: {'lr': 0.0004652188160590663, 'samples': 1950144, 'steps': 10156, 'loss/train': 1.1338700652122498} 01/27/2022 05:25:32 - INFO - codeparrot_training - Step 10157: {'lr': 0.00046521049011485064, 'samples': 1950336, 'steps': 10157, 'loss/train': 1.1238001585006714} 01/27/2022 05:25:36 - INFO - codeparrot_training - Step 10158: {'lr': 0.0004652021632487415, 'samples': 1950528, 'steps': 10158, 'loss/train': 0.8075042366981506} 01/27/2022 05:25:39 - INFO - codeparrot_training - Step 10159: {'lr': 0.00046519383546077476, 'samples': 1950720, 'steps': 10159, 'loss/train': 0.96889328956604} 01/27/2022 05:25:42 - INFO - codeparrot_training - Step 10160: {'lr': 0.0004651855067509859, 'samples': 1950912, 'steps': 10160, 'loss/train': 0.7378159314393997} 01/27/2022 05:25:45 - INFO - codeparrot_training - Step 10161: {'lr': 0.00046517717711941066, 'samples': 1951104, 'steps': 10161, 'loss/train': 1.0935104191303253} 01/27/2022 05:25:48 - INFO - codeparrot_training - Step 10162: {'lr': 0.0004651688465660847, 'samples': 1951296, 'steps': 10162, 'loss/train': 
0.9209607839584351} 01/27/2022 05:25:53 - INFO - codeparrot_training - Step 10163: {'lr': 0.00046516051509104376, 'samples': 1951488, 'steps': 10163, 'loss/train': 0.7501765787601471} 01/27/2022 05:25:57 - INFO - codeparrot_training - Step 10164: {'lr': 0.0004651521826943235, 'samples': 1951680, 'steps': 10164, 'loss/train': 1.2656697928905487} 01/27/2022 05:26:00 - INFO - codeparrot_training - Step 10165: {'lr': 0.00046514384937595965, 'samples': 1951872, 'steps': 10165, 'loss/train': 0.747044026851654} 01/27/2022 05:26:03 - INFO - codeparrot_training - Step 10166: {'lr': 0.0004651355151359878, 'samples': 1952064, 'steps': 10166, 'loss/train': 0.6533554494380951} 01/27/2022 05:26:06 - INFO - codeparrot_training - Step 10167: {'lr': 0.0004651271799744437, 'samples': 1952256, 'steps': 10167, 'loss/train': 0.7760400474071503} 01/27/2022 05:26:09 - INFO - codeparrot_training - Step 10168: {'lr': 0.0004651188438913631, 'samples': 1952448, 'steps': 10168, 'loss/train': 0.9976007044315338} 01/27/2022 05:26:12 - INFO - codeparrot_training - Step 10169: {'lr': 0.0004651105068867817, 'samples': 1952640, 'steps': 10169, 'loss/train': 0.8832924664020538} 01/27/2022 05:26:15 - INFO - codeparrot_training - Step 10170: {'lr': 0.00046510216896073517, 'samples': 1952832, 'steps': 10170, 'loss/train': 0.7614469528198242} 01/27/2022 05:26:20 - INFO - codeparrot_training - Step 10171: {'lr': 0.00046509383011325925, 'samples': 1953024, 'steps': 10171, 'loss/train': 0.9647558033466339} 01/27/2022 05:26:23 - INFO - codeparrot_training - Step 10172: {'lr': 0.0004650854903443896, 'samples': 1953216, 'steps': 10172, 'loss/train': 0.8637141287326813} 01/27/2022 05:26:26 - INFO - codeparrot_training - Step 10173: {'lr': 0.0004650771496541621, 'samples': 1953408, 'steps': 10173, 'loss/train': 0.7107570022344589} 01/27/2022 05:26:29 - INFO - codeparrot_training - Step 10174: {'lr': 0.0004650688080426123, 'samples': 1953600, 'steps': 10174, 'loss/train': 1.1635105311870575} 01/27/2022 05:26:33 - INFO - codeparrot_training - Step 10175: {'lr': 0.0004650604655097761, 'samples': 1953792, 'steps': 10175, 'loss/train': 1.04913130402565} 01/27/2022 05:26:36 - INFO - codeparrot_training - Step 10176: {'lr': 0.00046505212205568916, 'samples': 1953984, 'steps': 10176, 'loss/train': 1.003859281539917} 01/27/2022 05:26:39 - INFO - codeparrot_training - Step 10177: {'lr': 0.0004650437776803872, 'samples': 1954176, 'steps': 10177, 'loss/train': 0.9981171190738678} 01/27/2022 05:26:42 - INFO - codeparrot_training - Step 10178: {'lr': 0.00046503543238390595, 'samples': 1954368, 'steps': 10178, 'loss/train': 0.8172044456005096} 01/27/2022 05:26:45 - INFO - codeparrot_training - Step 10179: {'lr': 0.0004650270861662812, 'samples': 1954560, 'steps': 10179, 'loss/train': 0.8071482181549072} 01/27/2022 05:26:50 - INFO - codeparrot_training - Step 10180: {'lr': 0.00046501873902754867, 'samples': 1954752, 'steps': 10180, 'loss/train': 0.74101822078228} 01/27/2022 05:26:53 - INFO - codeparrot_training - Step 10181: {'lr': 0.00046501039096774415, 'samples': 1954944, 'steps': 10181, 'loss/train': 0.7054651826620102} 01/27/2022 05:26:56 - INFO - codeparrot_training - Step 10182: {'lr': 0.00046500204198690343, 'samples': 1955136, 'steps': 10182, 'loss/train': 1.111262708902359} 01/27/2022 05:27:00 - INFO - codeparrot_training - Step 10183: {'lr': 0.0004649936920850622, 'samples': 1955328, 'steps': 10183, 'loss/train': 1.0530760288238525} 01/27/2022 05:27:03 - INFO - codeparrot_training - Step 10184: {'lr': 0.00046498534126225625, 'samples': 
1955520, 'steps': 10184, 'loss/train': 0.737921878695488} 01/27/2022 05:27:06 - INFO - codeparrot_training - Step 10185: {'lr': 0.0004649769895185214, 'samples': 1955712, 'steps': 10185, 'loss/train': 0.5650784075260162} 01/27/2022 05:27:09 - INFO - codeparrot_training - Step 10186: {'lr': 0.00046496863685389336, 'samples': 1955904, 'steps': 10186, 'loss/train': 0.536064013838768} 01/27/2022 05:27:12 - INFO - codeparrot_training - Step 10187: {'lr': 0.00046496028326840796, 'samples': 1956096, 'steps': 10187, 'loss/train': 0.46957218647003174} 01/27/2022 05:27:15 - INFO - codeparrot_training - Step 10188: {'lr': 0.000464951928762101, 'samples': 1956288, 'steps': 10188, 'loss/train': 0.8252644836902618} 01/27/2022 05:27:20 - INFO - codeparrot_training - Step 10189: {'lr': 0.00046494357333500816, 'samples': 1956480, 'steps': 10189, 'loss/train': 1.2444356381893158} 01/27/2022 05:27:23 - INFO - codeparrot_training - Step 10190: {'lr': 0.00046493521698716536, 'samples': 1956672, 'steps': 10190, 'loss/train': 0.31812579184770584} 01/27/2022 05:27:26 - INFO - codeparrot_training - Step 10191: {'lr': 0.00046492685971860826, 'samples': 1956864, 'steps': 10191, 'loss/train': 0.7141887992620468} 01/27/2022 05:27:29 - INFO - codeparrot_training - Step 10192: {'lr': 0.00046491850152937276, 'samples': 1957056, 'steps': 10192, 'loss/train': 1.1650924980640411} 01/27/2022 05:27:32 - INFO - codeparrot_training - Step 10193: {'lr': 0.0004649101424194947, 'samples': 1957248, 'steps': 10193, 'loss/train': 1.0816326141357422} 01/27/2022 05:27:36 - INFO - codeparrot_training - Step 10194: {'lr': 0.0004649017823890098, 'samples': 1957440, 'steps': 10194, 'loss/train': 0.5030400305986404} 01/27/2022 05:27:39 - INFO - codeparrot_training - Step 10195: {'lr': 0.0004648934214379539, 'samples': 1957632, 'steps': 10195, 'loss/train': 0.9798301756381989} 01/27/2022 05:27:42 - INFO - codeparrot_training - Step 10196: {'lr': 0.00046488505956636286, 'samples': 1957824, 'steps': 10196, 'loss/train': 0.6955835670232773} 01/27/2022 05:27:45 - INFO - codeparrot_training - Step 10197: {'lr': 0.00046487669677427237, 'samples': 1958016, 'steps': 10197, 'loss/train': 0.9336158037185669} 01/27/2022 05:27:49 - INFO - codeparrot_training - Step 10198: {'lr': 0.0004648683330617184, 'samples': 1958208, 'steps': 10198, 'loss/train': 1.0852987468242645} 01/27/2022 05:27:53 - INFO - codeparrot_training - Step 10199: {'lr': 0.00046485996842873676, 'samples': 1958400, 'steps': 10199, 'loss/train': 0.7012423127889633} 01/27/2022 05:27:56 - INFO - codeparrot_training - Step 10200: {'lr': 0.0004648516028753632, 'samples': 1958592, 'steps': 10200, 'loss/train': 0.7108409553766251} 01/27/2022 05:27:59 - INFO - codeparrot_training - Step 10201: {'lr': 0.00046484323640163356, 'samples': 1958784, 'steps': 10201, 'loss/train': 0.8016427159309387} 01/27/2022 05:28:02 - INFO - codeparrot_training - Step 10202: {'lr': 0.00046483486900758374, 'samples': 1958976, 'steps': 10202, 'loss/train': 5.537448763847351} 01/27/2022 05:28:05 - INFO - codeparrot_training - Step 10203: {'lr': 0.0004648265006932496, 'samples': 1959168, 'steps': 10203, 'loss/train': 0.8417496085166931} 01/27/2022 05:28:08 - INFO - codeparrot_training - Step 10204: {'lr': 0.0004648181314586669, 'samples': 1959360, 'steps': 10204, 'loss/train': 0.6689517349004745} 01/27/2022 05:28:11 - INFO - codeparrot_training - Step 10205: {'lr': 0.00046480976130387156, 'samples': 1959552, 'steps': 10205, 'loss/train': 0.7112913429737091} 01/27/2022 05:28:15 - INFO - codeparrot_training - Step 10206: 
{'lr': 0.0004648013902288994, 'samples': 1959744, 'steps': 10206, 'loss/train': 0.701105386018753} 01/27/2022 05:28:19 - INFO - codeparrot_training - Step 10207: {'lr': 0.0004647930182337863, 'samples': 1959936, 'steps': 10207, 'loss/train': 0.8497755825519562} 01/27/2022 05:28:22 - INFO - codeparrot_training - Step 10208: {'lr': 0.0004647846453185681, 'samples': 1960128, 'steps': 10208, 'loss/train': 1.05536949634552} 01/27/2022 05:28:25 - INFO - codeparrot_training - Step 10209: {'lr': 0.0004647762714832807, 'samples': 1960320, 'steps': 10209, 'loss/train': 0.6644413322210312} 01/27/2022 05:28:28 - INFO - codeparrot_training - Step 10210: {'lr': 0.00046476789672795994, 'samples': 1960512, 'steps': 10210, 'loss/train': 0.4906451404094696} 01/27/2022 05:28:32 - INFO - codeparrot_training - Step 10211: {'lr': 0.00046475952105264176, 'samples': 1960704, 'steps': 10211, 'loss/train': 0.8878102004528046} 01/27/2022 05:28:35 - INFO - codeparrot_training - Step 10212: {'lr': 0.0004647511444573619, 'samples': 1960896, 'steps': 10212, 'loss/train': 1.0672608017921448} 01/27/2022 05:28:38 - INFO - codeparrot_training - Step 10213: {'lr': 0.00046474276694215635, 'samples': 1961088, 'steps': 10213, 'loss/train': 1.0066204369068146} 01/27/2022 05:28:41 - INFO - codeparrot_training - Step 10214: {'lr': 0.000464734388507061, 'samples': 1961280, 'steps': 10214, 'loss/train': 1.2299825549125671} 01/27/2022 05:28:44 - INFO - codeparrot_training - Step 10215: {'lr': 0.00046472600915211174, 'samples': 1961472, 'steps': 10215, 'loss/train': 1.0480345487594604} 01/27/2022 05:28:49 - INFO - codeparrot_training - Step 10216: {'lr': 0.00046471762887734437, 'samples': 1961664, 'steps': 10216, 'loss/train': 0.6253791600465775} 01/27/2022 05:28:52 - INFO - codeparrot_training - Step 10217: {'lr': 0.0004647092476827949, 'samples': 1961856, 'steps': 10217, 'loss/train': 0.7249222844839096} 01/27/2022 05:28:56 - INFO - codeparrot_training - Step 10218: {'lr': 0.0004647008655684992, 'samples': 1962048, 'steps': 10218, 'loss/train': 1.256228506565094} 01/27/2022 05:28:59 - INFO - codeparrot_training - Step 10219: {'lr': 0.00046469248253449316, 'samples': 1962240, 'steps': 10219, 'loss/train': 0.8429140448570251} 01/27/2022 05:29:02 - INFO - codeparrot_training - Step 10220: {'lr': 0.0004646840985808126, 'samples': 1962432, 'steps': 10220, 'loss/train': 0.9217612445354462} 01/27/2022 05:29:05 - INFO - codeparrot_training - Step 10221: {'lr': 0.00046467571370749366, 'samples': 1962624, 'steps': 10221, 'loss/train': 0.5578846782445908} 01/27/2022 05:29:08 - INFO - codeparrot_training - Step 10222: {'lr': 0.0004646673279145721, 'samples': 1962816, 'steps': 10222, 'loss/train': 0.9230392277240753} 01/27/2022 05:29:11 - INFO - codeparrot_training - Step 10223: {'lr': 0.00046465894120208384, 'samples': 1963008, 'steps': 10223, 'loss/train': 0.9935635328292847} 01/27/2022 05:29:16 - INFO - codeparrot_training - Step 10224: {'lr': 0.00046465055357006494, 'samples': 1963200, 'steps': 10224, 'loss/train': 0.6933543831110001} 01/27/2022 05:29:19 - INFO - codeparrot_training - Step 10225: {'lr': 0.00046464216501855104, 'samples': 1963392, 'steps': 10225, 'loss/train': 0.8003103733062744} 01/27/2022 05:29:22 - INFO - codeparrot_training - Step 10226: {'lr': 0.0004646337755475784, 'samples': 1963584, 'steps': 10226, 'loss/train': 1.2471224963665009} 01/27/2022 05:29:25 - INFO - codeparrot_training - Step 10227: {'lr': 0.00046462538515718276, 'samples': 1963776, 'steps': 10227, 'loss/train': 0.4650270938873291} 01/27/2022 05:29:28 - 
INFO - codeparrot_training - Step 10228: {'lr': 0.0004646169938474002, 'samples': 1963968, 'steps': 10228, 'loss/train': 0.9857453405857086} 01/27/2022 05:29:31 - INFO - codeparrot_training - Step 10229: {'lr': 0.0004646086016182666, 'samples': 1964160, 'steps': 10229, 'loss/train': 0.5698279291391373} 01/27/2022 05:29:35 - INFO - codeparrot_training - Step 10230: {'lr': 0.00046460020846981776, 'samples': 1964352, 'steps': 10230, 'loss/train': 0.7677686512470245} 01/27/2022 05:29:38 - INFO - codeparrot_training - Step 10231: {'lr': 0.00046459181440208986, 'samples': 1964544, 'steps': 10231, 'loss/train': 1.196465939283371} 01/27/2022 05:29:41 - INFO - codeparrot_training - Step 10232: {'lr': 0.0004645834194151187, 'samples': 1964736, 'steps': 10232, 'loss/train': 0.850615382194519} 01/27/2022 05:29:45 - INFO - codeparrot_training - Step 10233: {'lr': 0.00046457502350894046, 'samples': 1964928, 'steps': 10233, 'loss/train': 1.0765008330345154} 01/27/2022 05:29:48 - INFO - codeparrot_training - Step 10234: {'lr': 0.0004645666266835908, 'samples': 1965120, 'steps': 10234, 'loss/train': 1.103518545627594} 01/27/2022 05:29:52 - INFO - codeparrot_training - Step 10235: {'lr': 0.0004645582289391059, 'samples': 1965312, 'steps': 10235, 'loss/train': 0.8795328140258789} 01/27/2022 05:29:55 - INFO - codeparrot_training - Step 10236: {'lr': 0.00046454983027552165, 'samples': 1965504, 'steps': 10236, 'loss/train': 1.41930091381073} 01/27/2022 05:29:58 - INFO - codeparrot_training - Step 10237: {'lr': 0.0004645414306928741, 'samples': 1965696, 'steps': 10237, 'loss/train': 1.0026497840881348} 01/27/2022 05:30:01 - INFO - codeparrot_training - Step 10238: {'lr': 0.0004645330301911992, 'samples': 1965888, 'steps': 10238, 'loss/train': 0.8790025413036346} 01/27/2022 05:30:04 - INFO - codeparrot_training - Step 10239: {'lr': 0.0004645246287705329, 'samples': 1966080, 'steps': 10239, 'loss/train': 0.6789700537919998} 01/27/2022 05:30:07 - INFO - codeparrot_training - Step 10240: {'lr': 0.0004645162264309112, 'samples': 1966272, 'steps': 10240, 'loss/train': 0.986088752746582} 01/27/2022 05:30:10 - INFO - codeparrot_training - Step 10241: {'lr': 0.0004645078231723701, 'samples': 1966464, 'steps': 10241, 'loss/train': 0.11770906671881676} 01/27/2022 05:30:16 - INFO - codeparrot_training - Step 10242: {'lr': 0.0004644994189949455, 'samples': 1966656, 'steps': 10242, 'loss/train': 0.769197553396225} 01/27/2022 05:30:19 - INFO - codeparrot_training - Step 10243: {'lr': 0.00046449101389867364, 'samples': 1966848, 'steps': 10243, 'loss/train': 0.619423046708107} 01/27/2022 05:30:22 - INFO - codeparrot_training - Step 10244: {'lr': 0.0004644826078835903, 'samples': 1967040, 'steps': 10244, 'loss/train': 0.610198438167572} 01/27/2022 05:30:25 - INFO - codeparrot_training - Step 10245: {'lr': 0.00046447420094973167, 'samples': 1967232, 'steps': 10245, 'loss/train': 0.40858232975006104} 01/27/2022 05:30:28 - INFO - codeparrot_training - Step 10246: {'lr': 0.0004644657930971336, 'samples': 1967424, 'steps': 10246, 'loss/train': 0.9629703462123871} 01/27/2022 05:30:31 - INFO - codeparrot_training - Step 10247: {'lr': 0.00046445738432583216, 'samples': 1967616, 'steps': 10247, 'loss/train': 0.6225197464227676} 01/27/2022 05:30:34 - INFO - codeparrot_training - Step 10248: {'lr': 0.00046444897463586345, 'samples': 1967808, 'steps': 10248, 'loss/train': 0.8981349170207977} 01/27/2022 05:30:38 - INFO - codeparrot_training - Step 10249: {'lr': 0.00046444056402726336, 'samples': 1968000, 'steps': 10249, 'loss/train': 
1.4335297644138336} 01/27/2022 05:30:42 - INFO - codeparrot_training - Step 10250: {'lr': 0.00046443215250006805, 'samples': 1968192, 'steps': 10250, 'loss/train': 0.43282994627952576} 01/27/2022 05:30:45 - INFO - codeparrot_training - Step 10251: {'lr': 0.00046442374005431345, 'samples': 1968384, 'steps': 10251, 'loss/train': 0.873375803232193} 01/27/2022 05:30:48 - INFO - codeparrot_training - Step 10252: {'lr': 0.0004644153266900356, 'samples': 1968576, 'steps': 10252, 'loss/train': 0.6999341994524002} 01/27/2022 05:30:51 - INFO - codeparrot_training - Step 10253: {'lr': 0.0004644069124072706, 'samples': 1968768, 'steps': 10253, 'loss/train': 0.43791116774082184} 01/27/2022 05:30:55 - INFO - codeparrot_training - Step 10254: {'lr': 0.0004643984972060545, 'samples': 1968960, 'steps': 10254, 'loss/train': 0.11722991243004799} 01/27/2022 05:30:58 - INFO - codeparrot_training - Step 10255: {'lr': 0.00046439008108642335, 'samples': 1969152, 'steps': 10255, 'loss/train': 0.6377756148576736} 01/27/2022 05:31:01 - INFO - codeparrot_training - Step 10256: {'lr': 0.0004643816640484131, 'samples': 1969344, 'steps': 10256, 'loss/train': 0.8321471214294434} 01/27/2022 05:31:04 - INFO - codeparrot_training - Step 10257: {'lr': 0.0004643732460920599, 'samples': 1969536, 'steps': 10257, 'loss/train': 0.6923001408576965} 01/27/2022 05:31:07 - INFO - codeparrot_training - Step 10258: {'lr': 0.00046436482721739976, 'samples': 1969728, 'steps': 10258, 'loss/train': 0.9697608053684235} 01/27/2022 05:31:12 - INFO - codeparrot_training - Step 10259: {'lr': 0.00046435640742446875, 'samples': 1969920, 'steps': 10259, 'loss/train': 0.8931134641170502} 01/27/2022 05:31:16 - INFO - codeparrot_training - Step 10260: {'lr': 0.000464347986713303, 'samples': 1970112, 'steps': 10260, 'loss/train': 0.8315699100494385} 01/27/2022 05:31:19 - INFO - codeparrot_training - Step 10261: {'lr': 0.00046433956508393855, 'samples': 1970304, 'steps': 10261, 'loss/train': 0.6798934489488602} 01/27/2022 05:31:22 - INFO - codeparrot_training - Step 10262: {'lr': 0.0004643311425364114, 'samples': 1970496, 'steps': 10262, 'loss/train': 0.33776530623435974} 01/27/2022 05:31:25 - INFO - codeparrot_training - Step 10263: {'lr': 0.0004643227190707577, 'samples': 1970688, 'steps': 10263, 'loss/train': 0.7050584703683853} 01/27/2022 05:31:28 - INFO - codeparrot_training - Step 10264: {'lr': 0.00046431429468701363, 'samples': 1970880, 'steps': 10264, 'loss/train': 0.8959516882896423} 01/27/2022 05:31:31 - INFO - codeparrot_training - Step 10265: {'lr': 0.0004643058693852151, 'samples': 1971072, 'steps': 10265, 'loss/train': 1.0726135969161987} 01/27/2022 05:31:35 - INFO - codeparrot_training - Step 10266: {'lr': 0.0004642974431653983, 'samples': 1971264, 'steps': 10266, 'loss/train': 0.16245540976524353} 01/27/2022 05:31:38 - INFO - codeparrot_training - Step 10267: {'lr': 0.00046428901602759933, 'samples': 1971456, 'steps': 10267, 'loss/train': 0.6942025423049927} 01/27/2022 05:31:41 - INFO - codeparrot_training - Step 10268: {'lr': 0.00046428058797185417, 'samples': 1971648, 'steps': 10268, 'loss/train': 0.3298070803284645} 01/27/2022 05:31:45 - INFO - codeparrot_training - Step 10269: {'lr': 0.0004642721589981991, 'samples': 1971840, 'steps': 10269, 'loss/train': 1.1191563606262207} 01/27/2022 05:31:49 - INFO - codeparrot_training - Step 10270: {'lr': 0.00046426372910667003, 'samples': 1972032, 'steps': 10270, 'loss/train': 0.8592516481876373} 01/27/2022 05:31:52 - INFO - codeparrot_training - Step 10271: {'lr': 0.00046425529829730326, 
'samples': 1972224, 'steps': 10271, 'loss/train': 0.807557612657547} 01/27/2022 05:31:55 - INFO - codeparrot_training - Step 10272: {'lr': 0.0004642468665701348, 'samples': 1972416, 'steps': 10272, 'loss/train': 0.8570375740528107} 01/27/2022 05:31:58 - INFO - codeparrot_training - Step 10273: {'lr': 0.0004642384339252008, 'samples': 1972608, 'steps': 10273, 'loss/train': 0.8212518990039825} 01/27/2022 05:32:01 - INFO - codeparrot_training - Step 10274: {'lr': 0.0004642300003625374, 'samples': 1972800, 'steps': 10274, 'loss/train': 0.3957020491361618} 01/27/2022 05:32:04 - INFO - codeparrot_training - Step 10275: {'lr': 0.0004642215658821807, 'samples': 1972992, 'steps': 10275, 'loss/train': 0.5907256901264191} 01/27/2022 05:32:07 - INFO - codeparrot_training - Step 10276: {'lr': 0.0004642131304841668, 'samples': 1973184, 'steps': 10276, 'loss/train': 0.8404759168624878} 01/27/2022 05:32:12 - INFO - codeparrot_training - Step 10277: {'lr': 0.00046420469416853197, 'samples': 1973376, 'steps': 10277, 'loss/train': 1.0217708945274353} 01/27/2022 05:32:15 - INFO - codeparrot_training - Step 10278: {'lr': 0.0004641962569353121, 'samples': 1973568, 'steps': 10278, 'loss/train': 0.9991310834884644} 01/27/2022 05:32:18 - INFO - codeparrot_training - Step 10279: {'lr': 0.0004641878187845436, 'samples': 1973760, 'steps': 10279, 'loss/train': 0.712072491645813} 01/27/2022 05:32:22 - INFO - codeparrot_training - Step 10280: {'lr': 0.00046417937971626245, 'samples': 1973952, 'steps': 10280, 'loss/train': 1.0537553429603577} 01/27/2022 05:32:25 - INFO - codeparrot_training - Step 10281: {'lr': 0.00046417093973050486, 'samples': 1974144, 'steps': 10281, 'loss/train': 0.9078789353370667} 01/27/2022 05:32:28 - INFO - codeparrot_training - Step 10282: {'lr': 0.0004641624988273069, 'samples': 1974336, 'steps': 10282, 'loss/train': 0.18632546067237854} 01/27/2022 05:32:31 - INFO - codeparrot_training - Step 10283: {'lr': 0.0004641540570067049, 'samples': 1974528, 'steps': 10283, 'loss/train': 0.918301910161972} 01/27/2022 05:32:34 - INFO - codeparrot_training - Step 10284: {'lr': 0.0004641456142687348, 'samples': 1974720, 'steps': 10284, 'loss/train': 0.7846343815326691} 01/27/2022 05:32:37 - INFO - codeparrot_training - Step 10285: {'lr': 0.000464137170613433, 'samples': 1974912, 'steps': 10285, 'loss/train': 0.7259860038757324} 01/27/2022 05:32:43 - INFO - codeparrot_training - Step 10286: {'lr': 0.00046412872604083554, 'samples': 1975104, 'steps': 10286, 'loss/train': 0.9291252493858337} 01/27/2022 05:32:46 - INFO - codeparrot_training - Step 10287: {'lr': 0.00046412028055097855, 'samples': 1975296, 'steps': 10287, 'loss/train': 0.7824455201625824} 01/27/2022 05:32:49 - INFO - codeparrot_training - Step 10288: {'lr': 0.00046411183414389834, 'samples': 1975488, 'steps': 10288, 'loss/train': 0.5023370236158371} 01/27/2022 05:32:52 - INFO - codeparrot_training - Step 10289: {'lr': 0.000464103386819631, 'samples': 1975680, 'steps': 10289, 'loss/train': 0.7886306941509247} 01/27/2022 05:32:55 - INFO - codeparrot_training - Step 10290: {'lr': 0.00046409493857821273, 'samples': 1975872, 'steps': 10290, 'loss/train': 1.1001510322093964} 01/27/2022 05:32:58 - INFO - codeparrot_training - Step 10291: {'lr': 0.00046408648941967975, 'samples': 1976064, 'steps': 10291, 'loss/train': 0.5536664575338364} 01/27/2022 05:33:01 - INFO - codeparrot_training - Step 10292: {'lr': 0.0004640780393440682, 'samples': 1976256, 'steps': 10292, 'loss/train': 0.9839866161346436} 01/27/2022 05:33:05 - INFO - codeparrot_training - Step 
10293: {'lr': 0.0004640695883514143, 'samples': 1976448, 'steps': 10293, 'loss/train': 0.8775339424610138} 01/27/2022 05:33:08 - INFO - codeparrot_training - Step 10294: {'lr': 0.0004640611364417543, 'samples': 1976640, 'steps': 10294, 'loss/train': 0.5696676224470139} 01/27/2022 05:33:12 - INFO - codeparrot_training - Step 10295: {'lr': 0.0004640526836151243, 'samples': 1976832, 'steps': 10295, 'loss/train': 0.7273924648761749} 01/27/2022 05:33:15 - INFO - codeparrot_training - Step 10296: {'lr': 0.0004640442298715606, 'samples': 1977024, 'steps': 10296, 'loss/train': 0.9374109506607056} 01/27/2022 05:33:18 - INFO - codeparrot_training - Step 10297: {'lr': 0.0004640357752110994, 'samples': 1977216, 'steps': 10297, 'loss/train': 1.1565410792827606} 01/27/2022 05:33:22 - INFO - codeparrot_training - Step 10298: {'lr': 0.00046402731963377685, 'samples': 1977408, 'steps': 10298, 'loss/train': 1.1974213421344757} 01/27/2022 05:33:25 - INFO - codeparrot_training - Step 10299: {'lr': 0.0004640188631396293, 'samples': 1977600, 'steps': 10299, 'loss/train': 0.2918899431824684} 01/27/2022 05:33:28 - INFO - codeparrot_training - Step 10300: {'lr': 0.0004640104057286929, 'samples': 1977792, 'steps': 10300, 'loss/train': 1.1392532587051392} 01/27/2022 05:33:31 - INFO - codeparrot_training - Step 10301: {'lr': 0.0004640019474010038, 'samples': 1977984, 'steps': 10301, 'loss/train': 0.4170621782541275} 01/27/2022 05:33:34 - INFO - codeparrot_training - Step 10302: {'lr': 0.00046399348815659837, 'samples': 1978176, 'steps': 10302, 'loss/train': 0.5519894957542419} 01/27/2022 05:33:37 - INFO - codeparrot_training - Step 10303: {'lr': 0.0004639850279955128, 'samples': 1978368, 'steps': 10303, 'loss/train': 0.7369118481874466} 01/27/2022 05:33:42 - INFO - codeparrot_training - Step 10304: {'lr': 0.0004639765669177833, 'samples': 1978560, 'steps': 10304, 'loss/train': 0.7122964113950729} 01/27/2022 05:33:45 - INFO - codeparrot_training - Step 10305: {'lr': 0.0004639681049234461, 'samples': 1978752, 'steps': 10305, 'loss/train': 0.49556104838848114} 01/27/2022 05:33:48 - INFO - codeparrot_training - Step 10306: {'lr': 0.0004639596420125375, 'samples': 1978944, 'steps': 10306, 'loss/train': 1.147794485092163} 01/27/2022 05:33:51 - INFO - codeparrot_training - Step 10307: {'lr': 0.0004639511781850937, 'samples': 1979136, 'steps': 10307, 'loss/train': 1.0985969603061676} 01/27/2022 05:33:54 - INFO - codeparrot_training - Step 10308: {'lr': 0.000463942713441151, 'samples': 1979328, 'steps': 10308, 'loss/train': 0.4425423592329025} 01/27/2022 05:33:58 - INFO - codeparrot_training - Step 10309: {'lr': 0.00046393424778074573, 'samples': 1979520, 'steps': 10309, 'loss/train': 0.3524588271975517} 01/27/2022 05:34:01 - INFO - codeparrot_training - Step 10310: {'lr': 0.000463925781203914, 'samples': 1979712, 'steps': 10310, 'loss/train': 1.0207692682743073} 01/27/2022 05:34:04 - INFO - codeparrot_training - Step 10311: {'lr': 0.00046391731371069224, 'samples': 1979904, 'steps': 10311, 'loss/train': 0.6658549904823303} 01/27/2022 05:34:08 - INFO - codeparrot_training - Step 10312: {'lr': 0.00046390884530111656, 'samples': 1980096, 'steps': 10312, 'loss/train': 0.939742773771286} 01/27/2022 05:34:12 - INFO - codeparrot_training - Step 10313: {'lr': 0.0004639003759752233, 'samples': 1980288, 'steps': 10313, 'loss/train': 0.5748582780361176} 01/27/2022 05:34:15 - INFO - codeparrot_training - Step 10314: {'lr': 0.00046389190573304875, 'samples': 1980480, 'steps': 10314, 'loss/train': 1.3151122033596039} 01/27/2022 05:34:18 - 
INFO - codeparrot_training - Step 10315: {'lr': 0.0004638834345746292, 'samples': 1980672, 'steps': 10315, 'loss/train': 0.5325622111558914} 01/27/2022 05:34:21 - INFO - codeparrot_training - Step 10316: {'lr': 0.00046387496250000095, 'samples': 1980864, 'steps': 10316, 'loss/train': 0.8546470999717712} 01/27/2022 05:34:24 - INFO - codeparrot_training - Step 10317: {'lr': 0.00046386648950920027, 'samples': 1981056, 'steps': 10317, 'loss/train': 0.4839324653148651} 01/27/2022 05:34:27 - INFO - codeparrot_training - Step 10318: {'lr': 0.0004638580156022635, 'samples': 1981248, 'steps': 10318, 'loss/train': 1.1478789746761322} 01/27/2022 05:34:30 - INFO - codeparrot_training - Step 10319: {'lr': 0.0004638495407792268, 'samples': 1981440, 'steps': 10319, 'loss/train': 0.9306082427501678} 01/27/2022 05:34:34 - INFO - codeparrot_training - Step 10320: {'lr': 0.0004638410650401267, 'samples': 1981632, 'steps': 10320, 'loss/train': 0.7653739750385284} 01/27/2022 05:34:39 - INFO - codeparrot_training - Step 10321: {'lr': 0.0004638325883849993, 'samples': 1981824, 'steps': 10321, 'loss/train': 0.5963940471410751} 01/27/2022 05:34:42 - INFO - codeparrot_training - Step 10322: {'lr': 0.00046382411081388096, 'samples': 1982016, 'steps': 10322, 'loss/train': 1.0611310601234436} 01/27/2022 05:34:45 - INFO - codeparrot_training - Step 10323: {'lr': 0.0004638156323268081, 'samples': 1982208, 'steps': 10323, 'loss/train': 0.4193708002567291} 01/27/2022 05:34:48 - INFO - codeparrot_training - Step 10324: {'lr': 0.00046380715292381695, 'samples': 1982400, 'steps': 10324, 'loss/train': 0.877145916223526} 01/27/2022 05:34:51 - INFO - codeparrot_training - Step 10325: {'lr': 0.0004637986726049438, 'samples': 1982592, 'steps': 10325, 'loss/train': 0.8592255413532257} 01/27/2022 05:34:54 - INFO - codeparrot_training - Step 10326: {'lr': 0.00046379019137022506, 'samples': 1982784, 'steps': 10326, 'loss/train': 0.8453416228294373} 01/27/2022 05:34:57 - INFO - codeparrot_training - Step 10327: {'lr': 0.000463781709219697, 'samples': 1982976, 'steps': 10327, 'loss/train': 1.0708614885807037} 01/27/2022 05:35:01 - INFO - codeparrot_training - Step 10328: {'lr': 0.000463773226153396, 'samples': 1983168, 'steps': 10328, 'loss/train': 0.7512524127960205} 01/27/2022 05:35:04 - INFO - codeparrot_training - Step 10329: {'lr': 0.0004637647421713584, 'samples': 1983360, 'steps': 10329, 'loss/train': 0.7957866489887238} 01/27/2022 05:35:08 - INFO - codeparrot_training - Step 10330: {'lr': 0.0004637562572736205, 'samples': 1983552, 'steps': 10330, 'loss/train': 1.4350114166736603} 01/27/2022 05:35:12 - INFO - codeparrot_training - Step 10331: {'lr': 0.00046374777146021865, 'samples': 1983744, 'steps': 10331, 'loss/train': 0.6287669241428375} 01/27/2022 05:35:15 - INFO - codeparrot_training - Step 10332: {'lr': 0.00046373928473118927, 'samples': 1983936, 'steps': 10332, 'loss/train': 0.6771432012319565} 01/27/2022 05:35:18 - INFO - codeparrot_training - Step 10333: {'lr': 0.0004637307970865686, 'samples': 1984128, 'steps': 10333, 'loss/train': 0.9395806789398193} 01/27/2022 05:35:21 - INFO - codeparrot_training - Step 10334: {'lr': 0.00046372230852639314, 'samples': 1984320, 'steps': 10334, 'loss/train': 0.8596571087837219} 01/27/2022 05:35:24 - INFO - codeparrot_training - Step 10335: {'lr': 0.0004637138190506991, 'samples': 1984512, 'steps': 10335, 'loss/train': 0.7273314446210861} 01/27/2022 05:35:27 - INFO - codeparrot_training - Step 10336: {'lr': 0.00046370532865952296, 'samples': 1984704, 'steps': 10336, 'loss/train': 
0.920493096113205} 01/27/2022 05:35:30 - INFO - codeparrot_training - Step 10337: {'lr': 0.0004636968373529011, 'samples': 1984896, 'steps': 10337, 'loss/train': 0.8187345564365387} 01/27/2022 05:35:34 - INFO - codeparrot_training - Step 10338: {'lr': 0.00046368834513086976, 'samples': 1985088, 'steps': 10338, 'loss/train': 0.9151084721088409} 01/27/2022 05:35:38 - INFO - codeparrot_training - Step 10339: {'lr': 0.00046367985199346546, 'samples': 1985280, 'steps': 10339, 'loss/train': 0.8649381995201111} 01/27/2022 05:35:41 - INFO - codeparrot_training - Step 10340: {'lr': 0.00046367135794072445, 'samples': 1985472, 'steps': 10340, 'loss/train': 0.9349227547645569} 01/27/2022 05:35:44 - INFO - codeparrot_training - Step 10341: {'lr': 0.00046366286297268327, 'samples': 1985664, 'steps': 10341, 'loss/train': 1.061290293931961} 01/27/2022 05:35:47 - INFO - codeparrot_training - Step 10342: {'lr': 0.0004636543670893782, 'samples': 1985856, 'steps': 10342, 'loss/train': 0.7865298986434937} 01/27/2022 05:35:50 - INFO - codeparrot_training - Step 10343: {'lr': 0.0004636458702908457, 'samples': 1986048, 'steps': 10343, 'loss/train': 1.0173075199127197} 01/27/2022 05:35:54 - INFO - codeparrot_training - Step 10344: {'lr': 0.0004636373725771221, 'samples': 1986240, 'steps': 10344, 'loss/train': 0.947484165430069} 01/27/2022 05:35:57 - INFO - codeparrot_training - Step 10345: {'lr': 0.0004636288739482438, 'samples': 1986432, 'steps': 10345, 'loss/train': 1.3241394460201263} 01/27/2022 05:36:00 - INFO - codeparrot_training - Step 10346: {'lr': 0.0004636203744042473, 'samples': 1986624, 'steps': 10346, 'loss/train': 0.8888459801673889} 01/27/2022 05:36:05 - INFO - codeparrot_training - Step 10347: {'lr': 0.0004636118739451689, 'samples': 1986816, 'steps': 10347, 'loss/train': 0.801074892282486} 01/27/2022 05:36:08 - INFO - codeparrot_training - Step 10348: {'lr': 0.0004636033725710451, 'samples': 1987008, 'steps': 10348, 'loss/train': 0.6295650601387024} 01/27/2022 05:36:11 - INFO - codeparrot_training - Step 10349: {'lr': 0.00046359487028191224, 'samples': 1987200, 'steps': 10349, 'loss/train': 0.950567364692688} 01/27/2022 05:36:15 - INFO - codeparrot_training - Step 10350: {'lr': 0.0004635863670778068, 'samples': 1987392, 'steps': 10350, 'loss/train': 0.9847062528133392} 01/27/2022 05:36:18 - INFO - codeparrot_training - Step 10351: {'lr': 0.00046357786295876517, 'samples': 1987584, 'steps': 10351, 'loss/train': 1.4526788592338562} 01/27/2022 05:36:21 - INFO - codeparrot_training - Step 10352: {'lr': 0.0004635693579248238, 'samples': 1987776, 'steps': 10352, 'loss/train': 0.753848522901535} 01/27/2022 05:36:24 - INFO - codeparrot_training - Step 10353: {'lr': 0.0004635608519760191, 'samples': 1987968, 'steps': 10353, 'loss/train': 0.8226287662982941} 01/27/2022 05:36:27 - INFO - codeparrot_training - Step 10354: {'lr': 0.00046355234511238756, 'samples': 1988160, 'steps': 10354, 'loss/train': 1.1452437937259674} 01/27/2022 05:36:30 - INFO - codeparrot_training - Step 10355: {'lr': 0.00046354383733396553, 'samples': 1988352, 'steps': 10355, 'loss/train': 0.8486416339874268} 01/27/2022 05:36:35 - INFO - codeparrot_training - Step 10356: {'lr': 0.0004635353286407896, 'samples': 1988544, 'steps': 10356, 'loss/train': 0.9824750125408173} 01/27/2022 05:36:38 - INFO - codeparrot_training - Step 10357: {'lr': 0.00046352681903289605, 'samples': 1988736, 'steps': 10357, 'loss/train': 0.7928096652030945} 01/27/2022 05:36:41 - INFO - codeparrot_training - Step 10358: {'lr': 0.00046351830851032146, 'samples': 
1988928, 'steps': 10358, 'loss/train': 1.0047537982463837} 01/27/2022 05:36:44 - INFO - codeparrot_training - Step 10359: {'lr': 0.00046350979707310226, 'samples': 1989120, 'steps': 10359, 'loss/train': 1.063296228647232} 01/27/2022 05:36:47 - INFO - codeparrot_training - Step 10360: {'lr': 0.00046350128472127483, 'samples': 1989312, 'steps': 10360, 'loss/train': 0.9886266589164734} 01/27/2022 05:36:50 - INFO - codeparrot_training - Step 10361: {'lr': 0.00046349277145487565, 'samples': 1989504, 'steps': 10361, 'loss/train': 0.8710369169712067} 01/27/2022 05:36:54 - INFO - codeparrot_training - Step 10362: {'lr': 0.00046348425727394126, 'samples': 1989696, 'steps': 10362, 'loss/train': 0.7154182344675064} 01/27/2022 05:36:57 - INFO - codeparrot_training - Step 10363: {'lr': 0.0004634757421785082, 'samples': 1989888, 'steps': 10363, 'loss/train': 0.7667844593524933} 01/27/2022 05:37:00 - INFO - codeparrot_training - Step 10364: {'lr': 0.0004634672261686127, 'samples': 1990080, 'steps': 10364, 'loss/train': 1.299640953540802} 01/27/2022 05:37:05 - INFO - codeparrot_training - Step 10365: {'lr': 0.0004634587092442915, 'samples': 1990272, 'steps': 10365, 'loss/train': 1.1899334192276} 01/27/2022 05:37:08 - INFO - codeparrot_training - Step 10366: {'lr': 0.00046345019140558085, 'samples': 1990464, 'steps': 10366, 'loss/train': 0.9485263824462891} 01/27/2022 05:37:11 - INFO - codeparrot_training - Step 10367: {'lr': 0.0004634416726525175, 'samples': 1990656, 'steps': 10367, 'loss/train': 0.5868599116802216} 01/27/2022 05:37:14 - INFO - codeparrot_training - Step 10368: {'lr': 0.00046343315298513765, 'samples': 1990848, 'steps': 10368, 'loss/train': 0.700992152094841} 01/27/2022 05:37:18 - INFO - codeparrot_training - Step 10369: {'lr': 0.0004634246324034781, 'samples': 1991040, 'steps': 10369, 'loss/train': 1.8401498794555664} 01/27/2022 05:37:21 - INFO - codeparrot_training - Step 10370: {'lr': 0.0004634161109075751, 'samples': 1991232, 'steps': 10370, 'loss/train': 0.6610772609710693} 01/27/2022 05:37:24 - INFO - codeparrot_training - Step 10371: {'lr': 0.0004634075884974652, 'samples': 1991424, 'steps': 10371, 'loss/train': 0.9491287171840668} 01/27/2022 05:37:27 - INFO - codeparrot_training - Step 10372: {'lr': 0.00046339906517318507, 'samples': 1991616, 'steps': 10372, 'loss/train': 0.5661735981702805} 01/27/2022 05:37:30 - INFO - codeparrot_training - Step 10373: {'lr': 0.0004633905409347711, 'samples': 1991808, 'steps': 10373, 'loss/train': 0.6652521640062332} 01/27/2022 05:37:35 - INFO - codeparrot_training - Step 10374: {'lr': 0.00046338201578225975, 'samples': 1992000, 'steps': 10374, 'loss/train': 0.6810886859893799} 01/27/2022 05:37:38 - INFO - codeparrot_training - Step 10375: {'lr': 0.0004633734897156876, 'samples': 1992192, 'steps': 10375, 'loss/train': 0.27432239055633545} 01/27/2022 05:37:41 - INFO - codeparrot_training - Step 10376: {'lr': 0.0004633649627350912, 'samples': 1992384, 'steps': 10376, 'loss/train': 0.10280144959688187} 01/27/2022 05:37:44 - INFO - codeparrot_training - Step 10377: {'lr': 0.000463356434840507, 'samples': 1992576, 'steps': 10377, 'loss/train': 0.5997254848480225} 01/27/2022 05:37:47 - INFO - codeparrot_training - Step 10378: {'lr': 0.0004633479060319717, 'samples': 1992768, 'steps': 10378, 'loss/train': 1.2921984493732452} 01/27/2022 05:37:50 - INFO - codeparrot_training - Step 10379: {'lr': 0.00046333937630952163, 'samples': 1992960, 'steps': 10379, 'loss/train': 0.8694352805614471} 01/27/2022 05:37:54 - INFO - codeparrot_training - Step 10380: 
{'lr': 0.00046333084567319344, 'samples': 1993152, 'steps': 10380, 'loss/train': 0.70761439204216} 01/27/2022 05:37:57 - INFO - codeparrot_training - Step 10381: {'lr': 0.0004633223141230236, 'samples': 1993344, 'steps': 10381, 'loss/train': 0.7165381014347076} 01/27/2022 05:38:00 - INFO - codeparrot_training - Step 10382: {'lr': 0.0004633137816590488, 'samples': 1993536, 'steps': 10382, 'loss/train': 0.2745114415884018} 01/27/2022 05:38:04 - INFO - codeparrot_training - Step 10383: {'lr': 0.00046330524828130536, 'samples': 1993728, 'steps': 10383, 'loss/train': 0.4870065897703171} 01/27/2022 05:38:08 - INFO - codeparrot_training - Step 10384: {'lr': 0.00046329671398983007, 'samples': 1993920, 'steps': 10384, 'loss/train': 0.7911028861999512} 01/27/2022 05:38:11 - INFO - codeparrot_training - Step 10385: {'lr': 0.0004632881787846594, 'samples': 1994112, 'steps': 10385, 'loss/train': 1.2852230966091156} 01/27/2022 05:38:14 - INFO - codeparrot_training - Step 10386: {'lr': 0.0004632796426658298, 'samples': 1994304, 'steps': 10386, 'loss/train': 1.174020141363144} 01/27/2022 05:38:17 - INFO - codeparrot_training - Step 10387: {'lr': 0.00046327110563337804, 'samples': 1994496, 'steps': 10387, 'loss/train': 1.4214322865009308} 01/27/2022 05:38:20 - INFO - codeparrot_training - Step 10388: {'lr': 0.00046326256768734053, 'samples': 1994688, 'steps': 10388, 'loss/train': 0.20297440141439438} 01/27/2022 05:38:23 - INFO - codeparrot_training - Step 10389: {'lr': 0.0004632540288277539, 'samples': 1994880, 'steps': 10389, 'loss/train': 0.7797329127788544} 01/27/2022 05:38:27 - INFO - codeparrot_training - Step 10390: {'lr': 0.0004632454890546547, 'samples': 1995072, 'steps': 10390, 'loss/train': 0.8702205419540405} 01/27/2022 05:38:32 - INFO - codeparrot_training - Step 10391: {'lr': 0.0004632369483680796, 'samples': 1995264, 'steps': 10391, 'loss/train': 0.8050839900970459} 01/27/2022 05:38:35 - INFO - codeparrot_training - Step 10392: {'lr': 0.0004632284067680651, 'samples': 1995456, 'steps': 10392, 'loss/train': 1.2335028648376465} 01/27/2022 05:38:38 - INFO - codeparrot_training - Step 10393: {'lr': 0.0004632198642546478, 'samples': 1995648, 'steps': 10393, 'loss/train': 1.9053706526756287} 01/27/2022 05:38:41 - INFO - codeparrot_training - Step 10394: {'lr': 0.0004632113208278643, 'samples': 1995840, 'steps': 10394, 'loss/train': 0.9294961988925934} 01/27/2022 05:38:44 - INFO - codeparrot_training - Step 10395: {'lr': 0.00046320277648775123, 'samples': 1996032, 'steps': 10395, 'loss/train': 0.8313639163970947} 01/27/2022 05:38:47 - INFO - codeparrot_training - Step 10396: {'lr': 0.0004631942312343452, 'samples': 1996224, 'steps': 10396, 'loss/train': 0.582364484667778} 01/27/2022 05:38:51 - INFO - codeparrot_training - Step 10397: {'lr': 0.00046318568506768267, 'samples': 1996416, 'steps': 10397, 'loss/train': 0.849009096622467} 01/27/2022 05:38:54 - INFO - codeparrot_training - Step 10398: {'lr': 0.0004631771379878005, 'samples': 1996608, 'steps': 10398, 'loss/train': 0.9312305152416229} 01/27/2022 05:38:57 - INFO - codeparrot_training - Step 10399: {'lr': 0.00046316858999473506, 'samples': 1996800, 'steps': 10399, 'loss/train': 1.0119254887104034} 01/27/2022 05:39:02 - INFO - codeparrot_training - Step 10400: {'lr': 0.00046316004108852305, 'samples': 1996992, 'steps': 10400, 'loss/train': 1.0448890328407288} 01/27/2022 05:39:05 - INFO - codeparrot_training - Step 10401: {'lr': 0.0004631514912692012, 'samples': 1997184, 'steps': 10401, 'loss/train': 0.9794467985630035} 01/27/2022 05:39:08 - INFO 
- codeparrot_training - Step 10402: {'lr': 0.00046314294053680593, 'samples': 1997376, 'steps': 10402, 'loss/train': 0.987624853849411} 01/27/2022 05:39:11 - INFO - codeparrot_training - Step 10403: {'lr': 0.0004631343888913741, 'samples': 1997568, 'steps': 10403, 'loss/train': 1.77701336145401} 01/27/2022 05:39:14 - INFO - codeparrot_training - Step 10404: {'lr': 0.00046312583633294213, 'samples': 1997760, 'steps': 10404, 'loss/train': 0.40650317072868347} 01/27/2022 05:39:17 - INFO - codeparrot_training - Step 10405: {'lr': 0.0004631172828615469, 'samples': 1997952, 'steps': 10405, 'loss/train': 0.9832825362682343} 01/27/2022 05:39:20 - INFO - codeparrot_training - Step 10406: {'lr': 0.0004631087284772247, 'samples': 1998144, 'steps': 10406, 'loss/train': 0.5891781449317932} 01/27/2022 05:39:24 - INFO - codeparrot_training - Step 10407: {'lr': 0.0004631001731800125, 'samples': 1998336, 'steps': 10407, 'loss/train': 0.9768303036689758} 01/27/2022 05:39:27 - INFO - codeparrot_training - Step 10408: {'lr': 0.0004630916169699468, 'samples': 1998528, 'steps': 10408, 'loss/train': 0.7505472600460052} 01/27/2022 05:39:31 - INFO - codeparrot_training - Step 10409: {'lr': 0.00046308305984706435, 'samples': 1998720, 'steps': 10409, 'loss/train': 0.9515818655490875} 01/27/2022 05:39:34 - INFO - codeparrot_training - Step 10410: {'lr': 0.00046307450181140163, 'samples': 1998912, 'steps': 10410, 'loss/train': 0.8568049371242523} 01/27/2022 05:39:37 - INFO - codeparrot_training - Step 10411: {'lr': 0.00046306594286299544, 'samples': 1999104, 'steps': 10411, 'loss/train': 1.0030975341796875} 01/27/2022 05:39:41 - INFO - codeparrot_training - Step 10412: {'lr': 0.0004630573830018824, 'samples': 1999296, 'steps': 10412, 'loss/train': 1.376331388950348} 01/27/2022 05:39:44 - INFO - codeparrot_training - Step 10413: {'lr': 0.00046304882222809917, 'samples': 1999488, 'steps': 10413, 'loss/train': 0.35422539710998535} 01/27/2022 05:39:47 - INFO - codeparrot_training - Step 10414: {'lr': 0.0004630402605416825, 'samples': 1999680, 'steps': 10414, 'loss/train': 1.052196353673935} 01/27/2022 05:39:50 - INFO - codeparrot_training - Step 10415: {'lr': 0.0004630316979426689, 'samples': 1999872, 'steps': 10415, 'loss/train': 0.7634359896183014} 01/27/2022 05:39:53 - INFO - codeparrot_training - Step 10416: {'lr': 0.00046302313443109523, 'samples': 2000064, 'steps': 10416, 'loss/train': 0.4086475074291229} 01/27/2022 05:39:56 - INFO - codeparrot_training - Step 10417: {'lr': 0.00046301457000699807, 'samples': 2000256, 'steps': 10417, 'loss/train': 0.969290167093277} 01/27/2022 05:40:01 - INFO - codeparrot_training - Step 10418: {'lr': 0.0004630060046704141, 'samples': 2000448, 'steps': 10418, 'loss/train': 0.8963711857795715} 01/27/2022 05:40:04 - INFO - codeparrot_training - Step 10419: {'lr': 0.0004629974384213801, 'samples': 2000640, 'steps': 10419, 'loss/train': 0.8324830234050751} 01/27/2022 05:40:07 - INFO - codeparrot_training - Step 10420: {'lr': 0.0004629888712599327, 'samples': 2000832, 'steps': 10420, 'loss/train': 0.9164494872093201} 01/27/2022 05:40:10 - INFO - codeparrot_training - Step 10421: {'lr': 0.0004629803031861086, 'samples': 2001024, 'steps': 10421, 'loss/train': 0.8003860116004944} 01/27/2022 05:40:13 - INFO - codeparrot_training - Step 10422: {'lr': 0.0004629717341999445, 'samples': 2001216, 'steps': 10422, 'loss/train': 0.7977730929851532} 01/27/2022 05:40:16 - INFO - codeparrot_training - Step 10423: {'lr': 0.0004629631643014771, 'samples': 2001408, 'steps': 10423, 'loss/train': 
0.7907688617706299} 01/27/2022 05:40:19 - INFO - codeparrot_training - Step 10424: {'lr': 0.00046295459349074316, 'samples': 2001600, 'steps': 10424, 'loss/train': 0.587860107421875} 01/27/2022 05:40:23 - INFO - codeparrot_training - Step 10425: {'lr': 0.00046294602176777936, 'samples': 2001792, 'steps': 10425, 'loss/train': 1.0100463330745697} 01/27/2022 05:40:28 - INFO - codeparrot_training - Step 10426: {'lr': 0.0004629374491326224, 'samples': 2001984, 'steps': 10426, 'loss/train': 0.4980085641145706} 01/27/2022 05:40:31 - INFO - codeparrot_training - Step 10427: {'lr': 0.00046292887558530905, 'samples': 2002176, 'steps': 10427, 'loss/train': 0.17456481233239174} 01/27/2022 05:40:34 - INFO - codeparrot_training - Step 10428: {'lr': 0.000462920301125876, 'samples': 2002368, 'steps': 10428, 'loss/train': 0.6495936959981918} 01/27/2022 05:40:37 - INFO - codeparrot_training - Step 10429: {'lr': 0.0004629117257543599, 'samples': 2002560, 'steps': 10429, 'loss/train': 1.1607847809791565} 01/27/2022 05:40:40 - INFO - codeparrot_training - Step 10430: {'lr': 0.0004629031494707977, 'samples': 2002752, 'steps': 10430, 'loss/train': 0.6862828731536865} 01/27/2022 05:40:43 - INFO - codeparrot_training - Step 10431: {'lr': 0.00046289457227522595, 'samples': 2002944, 'steps': 10431, 'loss/train': 1.6267669200897217} 01/27/2022 05:40:47 - INFO - codeparrot_training - Step 10432: {'lr': 0.0004628859941676815, 'samples': 2003136, 'steps': 10432, 'loss/train': 0.6777840256690979} 01/27/2022 05:40:50 - INFO - codeparrot_training - Step 10433: {'lr': 0.000462877415148201, 'samples': 2003328, 'steps': 10433, 'loss/train': 0.2240433767437935} 01/27/2022 05:40:53 - INFO - codeparrot_training - Step 10434: {'lr': 0.0004628688352168213, 'samples': 2003520, 'steps': 10434, 'loss/train': 0.7480701506137848} 01/27/2022 05:40:57 - INFO - codeparrot_training - Step 10435: {'lr': 0.00046286025437357905, 'samples': 2003712, 'steps': 10435, 'loss/train': 0.8519812524318695} 01/27/2022 05:41:00 - INFO - codeparrot_training - Step 10436: {'lr': 0.00046285167261851114, 'samples': 2003904, 'steps': 10436, 'loss/train': 0.8207605183124542} 01/27/2022 05:41:04 - INFO - codeparrot_training - Step 10437: {'lr': 0.00046284308995165414, 'samples': 2004096, 'steps': 10437, 'loss/train': 0.6037151366472244} 01/27/2022 05:41:07 - INFO - codeparrot_training - Step 10438: {'lr': 0.00046283450637304497, 'samples': 2004288, 'steps': 10438, 'loss/train': 0.6831705272197723} 01/27/2022 05:41:10 - INFO - codeparrot_training - Step 10439: {'lr': 0.0004628259218827204, 'samples': 2004480, 'steps': 10439, 'loss/train': 0.590895339846611} 01/27/2022 05:41:13 - INFO - codeparrot_training - Step 10440: {'lr': 0.0004628173364807171, 'samples': 2004672, 'steps': 10440, 'loss/train': 0.6979294270277023} 01/27/2022 05:41:16 - INFO - codeparrot_training - Step 10441: {'lr': 0.00046280875016707195, 'samples': 2004864, 'steps': 10441, 'loss/train': 0.5223518908023834} 01/27/2022 05:41:19 - INFO - codeparrot_training - Step 10442: {'lr': 0.0004628001629418217, 'samples': 2005056, 'steps': 10442, 'loss/train': 0.694562092423439} 01/27/2022 05:41:22 - INFO - codeparrot_training - Step 10443: {'lr': 0.0004627915748050031, 'samples': 2005248, 'steps': 10443, 'loss/train': 0.2260562852025032} 01/27/2022 05:41:27 - INFO - codeparrot_training - Step 10444: {'lr': 0.000462782985756653, 'samples': 2005440, 'steps': 10444, 'loss/train': 1.3950996100902557} 01/27/2022 05:41:31 - INFO - codeparrot_training - Step 10445: {'lr': 0.0004627743957968081, 'samples': 
2005632, 'steps': 10445, 'loss/train': 0.8989537060260773} 01/27/2022 05:41:34 - INFO - codeparrot_training - Step 10446: {'lr': 0.00046276580492550523, 'samples': 2005824, 'steps': 10446, 'loss/train': 1.4362156391143799} 01/27/2022 05:41:37 - INFO - codeparrot_training - Step 10447: {'lr': 0.0004627572131427813, 'samples': 2006016, 'steps': 10447, 'loss/train': 0.6585431545972824} 01/27/2022 05:41:40 - INFO - codeparrot_training - Step 10448: {'lr': 0.000462748620448673, 'samples': 2006208, 'steps': 10448, 'loss/train': 0.544958308339119} 01/27/2022 05:41:43 - INFO - codeparrot_training - Step 10449: {'lr': 0.00046274002684321716, 'samples': 2006400, 'steps': 10449, 'loss/train': 0.8563658595085144} 01/27/2022 05:41:46 - INFO - codeparrot_training - Step 10450: {'lr': 0.00046273143232645054, 'samples': 2006592, 'steps': 10450, 'loss/train': 1.0718255639076233} 01/27/2022 05:41:49 - INFO - codeparrot_training - Step 10451: {'lr': 0.0004627228368984101, 'samples': 2006784, 'steps': 10451, 'loss/train': 0.605218380689621} 01/27/2022 05:41:53 - INFO - codeparrot_training - Step 10452: {'lr': 0.0004627142405591325, 'samples': 2006976, 'steps': 10452, 'loss/train': 0.9676437377929688} 01/27/2022 05:41:57 - INFO - codeparrot_training - Step 10453: {'lr': 0.00046270564330865466, 'samples': 2007168, 'steps': 10453, 'loss/train': 0.8858714997768402} 01/27/2022 05:42:00 - INFO - codeparrot_training - Step 10454: {'lr': 0.0004626970451470134, 'samples': 2007360, 'steps': 10454, 'loss/train': 1.0256116390228271} 01/27/2022 05:42:04 - INFO - codeparrot_training - Step 10455: {'lr': 0.0004626884460742455, 'samples': 2007552, 'steps': 10455, 'loss/train': 0.5464193522930145} 01/27/2022 05:42:07 - INFO - codeparrot_training - Step 10456: {'lr': 0.00046267984609038793, 'samples': 2007744, 'steps': 10456, 'loss/train': 0.9941117763519287} 01/27/2022 05:42:10 - INFO - codeparrot_training - Step 10457: {'lr': 0.0004626712451954773, 'samples': 2007936, 'steps': 10457, 'loss/train': 0.6788288354873657} 01/27/2022 05:42:13 - INFO - codeparrot_training - Step 10458: {'lr': 0.0004626626433895507, 'samples': 2008128, 'steps': 10458, 'loss/train': 0.9521565735340118} 01/27/2022 05:42:16 - INFO - codeparrot_training - Step 10459: {'lr': 0.00046265404067264484, 'samples': 2008320, 'steps': 10459, 'loss/train': 1.0038671493530273} 01/27/2022 05:42:19 - INFO - codeparrot_training - Step 10460: {'lr': 0.00046264543704479654, 'samples': 2008512, 'steps': 10460, 'loss/train': 0.7376825362443924} 01/27/2022 05:42:24 - INFO - codeparrot_training - Step 10461: {'lr': 0.0004626368325060428, 'samples': 2008704, 'steps': 10461, 'loss/train': 0.8303482532501221} 01/27/2022 05:42:27 - INFO - codeparrot_training - Step 10462: {'lr': 0.00046262822705642025, 'samples': 2008896, 'steps': 10462, 'loss/train': 1.0586170256137848} 01/27/2022 05:42:30 - INFO - codeparrot_training - Step 10463: {'lr': 0.00046261962069596603, 'samples': 2009088, 'steps': 10463, 'loss/train': 0.8380784690380096} 01/27/2022 05:42:33 - INFO - codeparrot_training - Step 10464: {'lr': 0.0004626110134247168, 'samples': 2009280, 'steps': 10464, 'loss/train': 0.6285709887742996} 01/27/2022 05:42:36 - INFO - codeparrot_training - Step 10465: {'lr': 0.0004626024052427095, 'samples': 2009472, 'steps': 10465, 'loss/train': 0.5317466408014297} 01/27/2022 05:42:39 - INFO - codeparrot_training - Step 10466: {'lr': 0.00046259379614998103, 'samples': 2009664, 'steps': 10466, 'loss/train': 0.37331655621528625} 01/27/2022 05:42:43 - INFO - codeparrot_training - Step 10467: 
{'lr': 0.00046258518614656827, 'samples': 2009856, 'steps': 10467, 'loss/train': 0.637438952922821} 01/27/2022 05:42:46 - INFO - codeparrot_training - Step 10468: {'lr': 0.0004625765752325081, 'samples': 2010048, 'steps': 10468, 'loss/train': 0.7769638001918793} 01/27/2022 05:42:49 - INFO - codeparrot_training - Step 10469: {'lr': 0.0004625679634078372, 'samples': 2010240, 'steps': 10469, 'loss/train': 1.347658395767212} 01/27/2022 05:42:54 - INFO - codeparrot_training - Step 10470: {'lr': 0.0004625593506725928, 'samples': 2010432, 'steps': 10470, 'loss/train': 0.9341509938240051} 01/27/2022 05:42:57 - INFO - codeparrot_training - Step 10471: {'lr': 0.0004625507370268116, 'samples': 2010624, 'steps': 10471, 'loss/train': 0.6064991354942322} 01/27/2022 05:43:00 - INFO - codeparrot_training - Step 10472: {'lr': 0.00046254212247053055, 'samples': 2010816, 'steps': 10472, 'loss/train': 0.836152195930481} 01/27/2022 05:43:03 - INFO - codeparrot_training - Step 10473: {'lr': 0.00046253350700378655, 'samples': 2011008, 'steps': 10473, 'loss/train': 0.8529670536518097} 01/27/2022 05:43:07 - INFO - codeparrot_training - Step 10474: {'lr': 0.0004625248906266165, 'samples': 2011200, 'steps': 10474, 'loss/train': 0.3696860074996948} 01/27/2022 05:43:10 - INFO - codeparrot_training - Step 10475: {'lr': 0.00046251627333905723, 'samples': 2011392, 'steps': 10475, 'loss/train': 1.1670288741588593} 01/27/2022 05:43:13 - INFO - codeparrot_training - Step 10476: {'lr': 0.0004625076551411458, 'samples': 2011584, 'steps': 10476, 'loss/train': 0.4819381982088089} 01/27/2022 05:43:16 - INFO - codeparrot_training - Step 10477: {'lr': 0.000462499036032919, 'samples': 2011776, 'steps': 10477, 'loss/train': 0.6404583603143692} 01/27/2022 05:43:19 - INFO - codeparrot_training - Step 10478: {'lr': 0.0004624904160144138, 'samples': 2011968, 'steps': 10478, 'loss/train': 0.6218980997800827} 01/27/2022 05:43:24 - INFO - codeparrot_training - Step 10479: {'lr': 0.00046248179508566716, 'samples': 2012160, 'steps': 10479, 'loss/train': 0.6789125204086304} 01/27/2022 05:43:27 - INFO - codeparrot_training - Step 10480: {'lr': 0.000462473173246716, 'samples': 2012352, 'steps': 10480, 'loss/train': 0.7391423732042313} 01/27/2022 05:43:30 - INFO - codeparrot_training - Step 10481: {'lr': 0.00046246455049759716, 'samples': 2012544, 'steps': 10481, 'loss/train': 0.5748594850301743} 01/27/2022 05:43:33 - INFO - codeparrot_training - Step 10482: {'lr': 0.00046245592683834773, 'samples': 2012736, 'steps': 10482, 'loss/train': 0.7642199099063873} 01/27/2022 05:43:36 - INFO - codeparrot_training - Step 10483: {'lr': 0.00046244730226900453, 'samples': 2012928, 'steps': 10483, 'loss/train': 1.2576575875282288} 01/27/2022 05:43:39 - INFO - codeparrot_training - Step 10484: {'lr': 0.00046243867678960463, 'samples': 2013120, 'steps': 10484, 'loss/train': 0.632804274559021} 01/27/2022 05:43:42 - INFO - codeparrot_training - Step 10485: {'lr': 0.00046243005040018484, 'samples': 2013312, 'steps': 10485, 'loss/train': 1.3811073303222656} 01/27/2022 05:43:46 - INFO - codeparrot_training - Step 10486: {'lr': 0.0004624214231007821, 'samples': 2013504, 'steps': 10486, 'loss/train': 0.3489898592233658} 01/27/2022 05:43:49 - INFO - codeparrot_training - Step 10487: {'lr': 0.0004624127948914335, 'samples': 2013696, 'steps': 10487, 'loss/train': 0.6721904128789902} 01/27/2022 05:43:53 - INFO - codeparrot_training - Step 10488: {'lr': 0.0004624041657721759, 'samples': 2013888, 'steps': 10488, 'loss/train': 0.563529297709465} 01/27/2022 05:43:56 - INFO - 
codeparrot_training - Step 10489: {'lr': 0.0004623955357430464, 'samples': 2014080, 'steps': 10489, 'loss/train': 0.6508192867040634} 01/27/2022 05:44:00 - INFO - codeparrot_training - Step 10490: {'lr': 0.0004623869048040817, 'samples': 2014272, 'steps': 10490, 'loss/train': 0.6135933101177216} 01/27/2022 05:44:03 - INFO - codeparrot_training - Step 10491: {'lr': 0.0004623782729553191, 'samples': 2014464, 'steps': 10491, 'loss/train': 0.27589011937379837} 01/27/2022 05:44:06 - INFO - codeparrot_training - Step 10492: {'lr': 0.00046236964019679533, 'samples': 2014656, 'steps': 10492, 'loss/train': 0.5448070764541626} 01/27/2022 05:44:09 - INFO - codeparrot_training - Step 10493: {'lr': 0.0004623610065285475, 'samples': 2014848, 'steps': 10493, 'loss/train': 0.5951773524284363} 01/27/2022 05:44:12 - INFO - codeparrot_training - Step 10494: {'lr': 0.00046235237195061253, 'samples': 2015040, 'steps': 10494, 'loss/train': 0.5930521041154861} 01/27/2022 05:44:15 - INFO - codeparrot_training - Step 10495: {'lr': 0.00046234373646302743, 'samples': 2015232, 'steps': 10495, 'loss/train': 0.8861451745033264} 01/27/2022 05:44:21 - INFO - codeparrot_training - Step 10496: {'lr': 0.00046233510006582913, 'samples': 2015424, 'steps': 10496, 'loss/train': 0.794025331735611} 01/27/2022 05:44:24 - INFO - codeparrot_training - Step 10497: {'lr': 0.00046232646275905475, 'samples': 2015616, 'steps': 10497, 'loss/train': 0.487652912735939} 01/27/2022 05:44:28 - INFO - codeparrot_training - Step 10498: {'lr': 0.00046231782454274117, 'samples': 2015808, 'steps': 10498, 'loss/train': 0.9116227626800537} 01/27/2022 05:44:31 - INFO - codeparrot_training - Step 10499: {'lr': 0.00046230918541692557, 'samples': 2016000, 'steps': 10499, 'loss/train': 0.6879710555076599} 01/27/2022 05:44:34 - INFO - codeparrot_training - Step 10500: {'lr': 0.00046230054538164475, 'samples': 2016192, 'steps': 10500, 'loss/train': 0.9462619721889496} 01/27/2022 05:44:37 - INFO - codeparrot_training - Step 10501: {'lr': 0.0004622919044369358, 'samples': 2016384, 'steps': 10501, 'loss/train': 0.9069041311740875} 01/27/2022 05:44:40 - INFO - codeparrot_training - Step 10502: {'lr': 0.00046228326258283576, 'samples': 2016576, 'steps': 10502, 'loss/train': 0.9732508063316345} 01/27/2022 05:44:43 - INFO - codeparrot_training - Step 10503: {'lr': 0.0004622746198193816, 'samples': 2016768, 'steps': 10503, 'loss/train': 0.21864507347345352} 01/27/2022 05:44:46 - INFO - codeparrot_training - Step 10504: {'lr': 0.00046226597614661044, 'samples': 2016960, 'steps': 10504, 'loss/train': 1.5183407664299011} 01/27/2022 05:44:51 - INFO - codeparrot_training - Step 10505: {'lr': 0.00046225733156455916, 'samples': 2017152, 'steps': 10505, 'loss/train': 0.5709404200315475} 01/27/2022 05:44:54 - INFO - codeparrot_training - Step 10506: {'lr': 0.00046224868607326494, 'samples': 2017344, 'steps': 10506, 'loss/train': 1.198510229587555} 01/27/2022 05:44:57 - INFO - codeparrot_training - Step 10507: {'lr': 0.00046224003967276474, 'samples': 2017536, 'steps': 10507, 'loss/train': 1.112825095653534} 01/27/2022 05:45:00 - INFO - codeparrot_training - Step 10508: {'lr': 0.00046223139236309553, 'samples': 2017728, 'steps': 10508, 'loss/train': 0.9081092476844788} 01/27/2022 05:45:04 - INFO - codeparrot_training - Step 10509: {'lr': 0.0004622227441442945, 'samples': 2017920, 'steps': 10509, 'loss/train': 0.7958706021308899} 01/27/2022 05:45:07 - INFO - codeparrot_training - Step 10510: {'lr': 0.00046221409501639863, 'samples': 2018112, 'steps': 10510, 'loss/train': 
0.6990787982940674} 01/27/2022 05:45:10 - INFO - codeparrot_training - Step 10511: {'lr': 0.0004622054449794449, 'samples': 2018304, 'steps': 10511, 'loss/train': 1.0623705089092255} 01/27/2022 05:45:13 - INFO - codeparrot_training - Step 10512: {'lr': 0.0004621967940334705, 'samples': 2018496, 'steps': 10512, 'loss/train': 0.777215301990509} 01/27/2022 05:45:16 - INFO - codeparrot_training - Step 10513: {'lr': 0.00046218814217851233, 'samples': 2018688, 'steps': 10513, 'loss/train': 0.7274245172739029} 01/27/2022 05:45:21 - INFO - codeparrot_training - Step 10514: {'lr': 0.0004621794894146076, 'samples': 2018880, 'steps': 10514, 'loss/train': 0.4925408363342285} 01/27/2022 05:45:24 - INFO - codeparrot_training - Step 10515: {'lr': 0.0004621708357417933, 'samples': 2019072, 'steps': 10515, 'loss/train': 0.6854949742555618} 01/27/2022 05:45:28 - INFO - codeparrot_training - Step 10516: {'lr': 0.00046216218116010646, 'samples': 2019264, 'steps': 10516, 'loss/train': 0.3390468433499336} 01/27/2022 05:45:31 - INFO - codeparrot_training - Step 10517: {'lr': 0.00046215352566958423, 'samples': 2019456, 'steps': 10517, 'loss/train': 0.6664779335260391} 01/27/2022 05:45:34 - INFO - codeparrot_training - Step 10518: {'lr': 0.00046214486927026373, 'samples': 2019648, 'steps': 10518, 'loss/train': 0.8562524020671844} 01/27/2022 05:45:37 - INFO - codeparrot_training - Step 10519: {'lr': 0.0004621362119621819, 'samples': 2019840, 'steps': 10519, 'loss/train': 0.7251572012901306} 01/27/2022 05:45:40 - INFO - codeparrot_training - Step 10520: {'lr': 0.00046212755374537594, 'samples': 2020032, 'steps': 10520, 'loss/train': 0.8706865310668945} 01/27/2022 05:45:43 - INFO - codeparrot_training - Step 10521: {'lr': 0.00046211889461988286, 'samples': 2020224, 'steps': 10521, 'loss/train': 0.9295139014720917} 01/27/2022 05:45:46 - INFO - codeparrot_training - Step 10522: {'lr': 0.0004621102345857399, 'samples': 2020416, 'steps': 10522, 'loss/train': 0.6639433801174164} 01/27/2022 05:45:51 - INFO - codeparrot_training - Step 10523: {'lr': 0.0004621015736429839, 'samples': 2020608, 'steps': 10523, 'loss/train': 0.8532122075557709} 01/27/2022 05:45:54 - INFO - codeparrot_training - Step 10524: {'lr': 0.00046209291179165216, 'samples': 2020800, 'steps': 10524, 'loss/train': 0.6802052557468414} 01/27/2022 05:45:57 - INFO - codeparrot_training - Step 10525: {'lr': 0.0004620842490317817, 'samples': 2020992, 'steps': 10525, 'loss/train': 0.743750274181366} 01/27/2022 05:46:00 - INFO - codeparrot_training - Step 10526: {'lr': 0.0004620755853634097, 'samples': 2021184, 'steps': 10526, 'loss/train': 0.8616834282875061} 01/27/2022 05:46:03 - INFO - codeparrot_training - Step 10527: {'lr': 0.00046206692078657325, 'samples': 2021376, 'steps': 10527, 'loss/train': 0.4559721350669861} 01/27/2022 05:46:07 - INFO - codeparrot_training - Step 10528: {'lr': 0.0004620582553013094, 'samples': 2021568, 'steps': 10528, 'loss/train': 0.2627672106027603} 01/27/2022 05:46:10 - INFO - codeparrot_training - Step 10529: {'lr': 0.00046204958890765536, 'samples': 2021760, 'steps': 10529, 'loss/train': 0.9540771245956421} 01/27/2022 05:46:13 - INFO - codeparrot_training - Step 10530: {'lr': 0.0004620409216056483, 'samples': 2021952, 'steps': 10530, 'loss/train': 1.0767497420310974} 01/27/2022 05:46:16 - INFO - codeparrot_training - Step 10531: {'lr': 0.00046203225339532515, 'samples': 2022144, 'steps': 10531, 'loss/train': 0.6132410913705826} 01/27/2022 05:46:21 - INFO - codeparrot_training - Step 10532: {'lr': 0.00046202358427672313, 
'samples': 2022336, 'steps': 10532, 'loss/train': 0.9473001658916473} 01/27/2022 05:46:24 - INFO - codeparrot_training - Step 10533: {'lr': 0.0004620149142498795, 'samples': 2022528, 'steps': 10533, 'loss/train': 0.7600686550140381} 01/27/2022 05:46:27 - INFO - codeparrot_training - Step 10534: {'lr': 0.0004620062433148312, 'samples': 2022720, 'steps': 10534, 'loss/train': 0.8985318839550018} 01/27/2022 05:46:30 - INFO - codeparrot_training - Step 10535: {'lr': 0.00046199757147161554, 'samples': 2022912, 'steps': 10535, 'loss/train': 0.9309282302856445} 01/27/2022 05:46:33 - INFO - codeparrot_training - Step 10536: {'lr': 0.00046198889872026963, 'samples': 2023104, 'steps': 10536, 'loss/train': 0.8556081354618073} 01/27/2022 05:46:36 - INFO - codeparrot_training - Step 10537: {'lr': 0.0004619802250608305, 'samples': 2023296, 'steps': 10537, 'loss/train': 0.7566341757774353} 01/27/2022 05:46:40 - INFO - codeparrot_training - Step 10538: {'lr': 0.0004619715504933354, 'samples': 2023488, 'steps': 10538, 'loss/train': 0.5463695526123047} 01/27/2022 05:46:43 - INFO - codeparrot_training - Step 10539: {'lr': 0.00046196287501782155, 'samples': 2023680, 'steps': 10539, 'loss/train': 0.38279615342617035} 01/27/2022 05:46:46 - INFO - codeparrot_training - Step 10540: {'lr': 0.00046195419863432604, 'samples': 2023872, 'steps': 10540, 'loss/train': 0.12660200893878937} 01/27/2022 05:46:51 - INFO - codeparrot_training - Step 10541: {'lr': 0.000461945521342886, 'samples': 2024064, 'steps': 10541, 'loss/train': 1.087006151676178} 01/27/2022 05:46:54 - INFO - codeparrot_training - Step 10542: {'lr': 0.0004619368431435387, 'samples': 2024256, 'steps': 10542, 'loss/train': 1.0231222808361053} 01/27/2022 05:46:57 - INFO - codeparrot_training - Step 10543: {'lr': 0.0004619281640363212, 'samples': 2024448, 'steps': 10543, 'loss/train': 0.7131575793027878} 01/27/2022 05:47:00 - INFO - codeparrot_training - Step 10544: {'lr': 0.0004619194840212708, 'samples': 2024640, 'steps': 10544, 'loss/train': 1.102747231721878} 01/27/2022 05:47:04 - INFO - codeparrot_training - Step 10545: {'lr': 0.00046191080309842457, 'samples': 2024832, 'steps': 10545, 'loss/train': 0.743821308016777} 01/27/2022 05:47:07 - INFO - codeparrot_training - Step 10546: {'lr': 0.0004619021212678198, 'samples': 2025024, 'steps': 10546, 'loss/train': 0.6614109724760056} 01/27/2022 05:47:10 - INFO - codeparrot_training - Step 10547: {'lr': 0.0004618934385294936, 'samples': 2025216, 'steps': 10547, 'loss/train': 0.502522811293602} 01/27/2022 05:47:13 - INFO - codeparrot_training - Step 10548: {'lr': 0.0004618847548834833, 'samples': 2025408, 'steps': 10548, 'loss/train': 0.4747445583343506} 01/27/2022 05:47:16 - INFO - codeparrot_training - Step 10549: {'lr': 0.0004618760703298258, 'samples': 2025600, 'steps': 10549, 'loss/train': 0.7754090130329132} 01/27/2022 05:47:21 - INFO - codeparrot_training - Step 10550: {'lr': 0.0004618673848685586, 'samples': 2025792, 'steps': 10550, 'loss/train': 0.9031569957733154} 01/27/2022 05:47:24 - INFO - codeparrot_training - Step 10551: {'lr': 0.00046185869849971884, 'samples': 2025984, 'steps': 10551, 'loss/train': 1.128605604171753} 01/27/2022 05:47:27 - INFO - codeparrot_training - Step 10552: {'lr': 0.0004618500112233436, 'samples': 2026176, 'steps': 10552, 'loss/train': 0.0969260036945343} 01/27/2022 05:47:31 - INFO - codeparrot_training - Step 10553: {'lr': 0.0004618413230394702, 'samples': 2026368, 'steps': 10553, 'loss/train': 0.6772567480802536} 01/27/2022 05:47:34 - INFO - codeparrot_training - Step 
10554: {'lr': 0.0004618326339481359, 'samples': 2026560, 'steps': 10554, 'loss/train': 0.8609679937362671} 01/27/2022 05:47:37 - INFO - codeparrot_training - Step 10555: {'lr': 0.00046182394394937774, 'samples': 2026752, 'steps': 10555, 'loss/train': 0.9399316906929016} 01/27/2022 05:47:40 - INFO - codeparrot_training - Step 10556: {'lr': 0.00046181525304323325, 'samples': 2026944, 'steps': 10556, 'loss/train': 1.2843858003616333} 01/27/2022 05:47:43 - INFO - codeparrot_training - Step 10557: {'lr': 0.0004618065612297393, 'samples': 2027136, 'steps': 10557, 'loss/train': 0.9859524965286255} 01/27/2022 05:47:48 - INFO - codeparrot_training - Step 10558: {'lr': 0.00046179786850893335, 'samples': 2027328, 'steps': 10558, 'loss/train': 0.9172419905662537} 01/27/2022 05:47:52 - INFO - codeparrot_training - Step 10559: {'lr': 0.0004617891748808526, 'samples': 2027520, 'steps': 10559, 'loss/train': 0.7784877419471741} 01/27/2022 05:47:55 - INFO - codeparrot_training - Step 10560: {'lr': 0.0004617804803455343, 'samples': 2027712, 'steps': 10560, 'loss/train': 1.2977064549922943} 01/27/2022 05:47:58 - INFO - codeparrot_training - Step 10561: {'lr': 0.0004617717849030156, 'samples': 2027904, 'steps': 10561, 'loss/train': 0.6653361171483994} 01/27/2022 05:48:01 - INFO - codeparrot_training - Step 10562: {'lr': 0.00046176308855333395, 'samples': 2028096, 'steps': 10562, 'loss/train': 0.8830970227718353} 01/27/2022 05:48:04 - INFO - codeparrot_training - Step 10563: {'lr': 0.00046175439129652636, 'samples': 2028288, 'steps': 10563, 'loss/train': 0.797212690114975} 01/27/2022 05:48:07 - INFO - codeparrot_training - Step 10564: {'lr': 0.0004617456931326302, 'samples': 2028480, 'steps': 10564, 'loss/train': 0.9670681357383728} 01/27/2022 05:48:10 - INFO - codeparrot_training - Step 10565: {'lr': 0.00046173699406168277, 'samples': 2028672, 'steps': 10565, 'loss/train': 0.7575710713863373} 01/27/2022 05:48:14 - INFO - codeparrot_training - Step 10566: {'lr': 0.00046172829408372125, 'samples': 2028864, 'steps': 10566, 'loss/train': 0.46649426221847534} 01/27/2022 05:48:18 - INFO - codeparrot_training - Step 10567: {'lr': 0.000461719593198783, 'samples': 2029056, 'steps': 10567, 'loss/train': 1.071031004190445} 01/27/2022 05:48:21 - INFO - codeparrot_training - Step 10568: {'lr': 0.0004617108914069052, 'samples': 2029248, 'steps': 10568, 'loss/train': 1.0975737869739532} 01/27/2022 05:48:24 - INFO - codeparrot_training - Step 10569: {'lr': 0.00046170218870812517, 'samples': 2029440, 'steps': 10569, 'loss/train': 0.8193448483943939} 01/27/2022 05:48:27 - INFO - codeparrot_training - Step 10570: {'lr': 0.0004616934851024802, 'samples': 2029632, 'steps': 10570, 'loss/train': 1.1343052983283997} 01/27/2022 05:48:30 - INFO - codeparrot_training - Step 10571: {'lr': 0.00046168478059000753, 'samples': 2029824, 'steps': 10571, 'loss/train': 0.47565047442913055} 01/27/2022 05:48:34 - INFO - codeparrot_training - Step 10572: {'lr': 0.0004616760751707445, 'samples': 2030016, 'steps': 10572, 'loss/train': 0.863483190536499} 01/27/2022 05:48:37 - INFO - codeparrot_training - Step 10573: {'lr': 0.0004616673688447284, 'samples': 2030208, 'steps': 10573, 'loss/train': 0.8383707404136658} 01/27/2022 05:48:40 - INFO - codeparrot_training - Step 10574: {'lr': 0.0004616586616119964, 'samples': 2030400, 'steps': 10574, 'loss/train': 0.37832437455654144} 01/27/2022 05:48:43 - INFO - codeparrot_training - Step 10575: {'lr': 0.0004616499534725861, 'samples': 2030592, 'steps': 10575, 'loss/train': 0.7686295509338379} 01/27/2022 
05:48:47 - INFO - codeparrot_training - Step 10576: {'lr': 0.00046164124442653445, 'samples': 2030784, 'steps': 10576, 'loss/train': 0.5364928990602493} 01/27/2022 05:48:51 - INFO - codeparrot_training - Step 10577: {'lr': 0.00046163253447387896, 'samples': 2030976, 'steps': 10577, 'loss/train': 0.5851802676916122} 01/27/2022 05:48:54 - INFO - codeparrot_training - Step 10578: {'lr': 0.0004616238236146569, 'samples': 2031168, 'steps': 10578, 'loss/train': 0.6838072389364243} 01/27/2022 05:48:57 - INFO - codeparrot_training - Step 10579: {'lr': 0.0004616151118489056, 'samples': 2031360, 'steps': 10579, 'loss/train': 1.082158774137497} 01/27/2022 05:49:00 - INFO - codeparrot_training - Step 10580: {'lr': 0.0004616063991766623, 'samples': 2031552, 'steps': 10580, 'loss/train': 0.9381127953529358} 01/27/2022 05:49:03 - INFO - codeparrot_training - Step 10581: {'lr': 0.00046159768559796437, 'samples': 2031744, 'steps': 10581, 'loss/train': 1.2914222180843353} 01/27/2022 05:49:06 - INFO - codeparrot_training - Step 10582: {'lr': 0.0004615889711128492, 'samples': 2031936, 'steps': 10582, 'loss/train': 1.157869666814804} 01/27/2022 05:49:09 - INFO - codeparrot_training - Step 10583: {'lr': 0.00046158025572135404, 'samples': 2032128, 'steps': 10583, 'loss/train': 1.0795326232910156} 01/27/2022 05:49:13 - INFO - codeparrot_training - Step 10584: {'lr': 0.00046157153942351625, 'samples': 2032320, 'steps': 10584, 'loss/train': 0.8686473369598389} 01/27/2022 05:49:17 - INFO - codeparrot_training - Step 10585: {'lr': 0.0004615628222193732, 'samples': 2032512, 'steps': 10585, 'loss/train': 0.5889396518468857} 01/27/2022 05:49:20 - INFO - codeparrot_training - Step 10586: {'lr': 0.00046155410410896215, 'samples': 2032704, 'steps': 10586, 'loss/train': 0.7241277247667313} 01/27/2022 05:49:23 - INFO - codeparrot_training - Step 10587: {'lr': 0.00046154538509232044, 'samples': 2032896, 'steps': 10587, 'loss/train': 0.917204350233078} 01/27/2022 05:49:26 - INFO - codeparrot_training - Step 10588: {'lr': 0.00046153666516948554, 'samples': 2033088, 'steps': 10588, 'loss/train': 0.6382250636816025} 01/27/2022 05:49:30 - INFO - codeparrot_training - Step 10589: {'lr': 0.0004615279443404948, 'samples': 2033280, 'steps': 10589, 'loss/train': 1.0344119668006897} 01/27/2022 05:49:33 - INFO - codeparrot_training - Step 10590: {'lr': 0.0004615192226053855, 'samples': 2033472, 'steps': 10590, 'loss/train': 0.8708963692188263} 01/27/2022 05:49:36 - INFO - codeparrot_training - Step 10591: {'lr': 0.0004615104999641949, 'samples': 2033664, 'steps': 10591, 'loss/train': 0.6184127032756805} 01/27/2022 05:49:39 - INFO - codeparrot_training - Step 10592: {'lr': 0.0004615017764169606, 'samples': 2033856, 'steps': 10592, 'loss/train': 1.035076528787613} 01/27/2022 05:49:42 - INFO - codeparrot_training - Step 10593: {'lr': 0.0004614930519637198, 'samples': 2034048, 'steps': 10593, 'loss/train': 0.8885170519351959} 01/27/2022 05:49:47 - INFO - codeparrot_training - Step 10594: {'lr': 0.0004614843266045099, 'samples': 2034240, 'steps': 10594, 'loss/train': 0.7721175849437714} 01/27/2022 05:49:51 - INFO - codeparrot_training - Step 10595: {'lr': 0.0004614756003393683, 'samples': 2034432, 'steps': 10595, 'loss/train': 0.9018478989601135} 01/27/2022 05:49:54 - INFO - codeparrot_training - Step 10596: {'lr': 0.00046146687316833235, 'samples': 2034624, 'steps': 10596, 'loss/train': 0.9349842667579651} 01/27/2022 05:49:57 - INFO - codeparrot_training - Step 10597: {'lr': 0.00046145814509143955, 'samples': 2034816, 'steps': 10597, 
'loss/train': 0.8563564717769623} 01/27/2022 05:50:00 - INFO - codeparrot_training - Step 10598: {'lr': 0.0004614494161087271, 'samples': 2035008, 'steps': 10598, 'loss/train': 0.9976811707019806} 01/27/2022 05:50:03 - INFO - codeparrot_training - Step 10599: {'lr': 0.00046144068622023263, 'samples': 2035200, 'steps': 10599, 'loss/train': 1.008530080318451} 01/27/2022 05:50:06 - INFO - codeparrot_training - Step 10600: {'lr': 0.00046143195542599336, 'samples': 2035392, 'steps': 10600, 'loss/train': 1.3114607334136963} 01/27/2022 05:50:09 - INFO - codeparrot_training - Step 10601: {'lr': 0.00046142322372604667, 'samples': 2035584, 'steps': 10601, 'loss/train': 0.7332651615142822} 01/27/2022 05:50:14 - INFO - codeparrot_training - Step 10602: {'lr': 0.00046141449112043, 'samples': 2035776, 'steps': 10602, 'loss/train': 1.2492263317108154} 01/27/2022 05:50:17 - INFO - codeparrot_training - Step 10603: {'lr': 0.0004614057576091809, 'samples': 2035968, 'steps': 10603, 'loss/train': 0.2646523788571358} 01/27/2022 05:50:20 - INFO - codeparrot_training - Step 10604: {'lr': 0.00046139702319233656, 'samples': 2036160, 'steps': 10604, 'loss/train': 0.5429588109254837} 01/27/2022 05:50:23 - INFO - codeparrot_training - Step 10605: {'lr': 0.00046138828786993456, 'samples': 2036352, 'steps': 10605, 'loss/train': 0.630739152431488} 01/27/2022 05:50:26 - INFO - codeparrot_training - Step 10606: {'lr': 0.0004613795516420122, 'samples': 2036544, 'steps': 10606, 'loss/train': 0.6174032539129257} 01/27/2022 05:50:29 - INFO - codeparrot_training - Step 10607: {'lr': 0.000461370814508607, 'samples': 2036736, 'steps': 10607, 'loss/train': 0.90947225689888} 01/27/2022 05:50:33 - INFO - codeparrot_training - Step 10608: {'lr': 0.00046136207646975635, 'samples': 2036928, 'steps': 10608, 'loss/train': 0.5104492753744125} 01/27/2022 05:50:36 - INFO - codeparrot_training - Step 10609: {'lr': 0.0004613533375254977, 'samples': 2037120, 'steps': 10609, 'loss/train': 0.7578671872615814} 01/27/2022 05:50:39 - INFO - codeparrot_training - Step 10610: {'lr': 0.00046134459767586847, 'samples': 2037312, 'steps': 10610, 'loss/train': 1.1935733556747437} 01/27/2022 05:50:43 - INFO - codeparrot_training - Step 10611: {'lr': 0.00046133585692090603, 'samples': 2037504, 'steps': 10611, 'loss/train': 0.8871932029724121} 01/27/2022 05:50:47 - INFO - codeparrot_training - Step 10612: {'lr': 0.0004613271152606479, 'samples': 2037696, 'steps': 10612, 'loss/train': 0.8473878800868988} 01/27/2022 05:50:50 - INFO - codeparrot_training - Step 10613: {'lr': 0.00046131837269513154, 'samples': 2037888, 'steps': 10613, 'loss/train': 0.7530445754528046} 01/27/2022 05:50:53 - INFO - codeparrot_training - Step 10614: {'lr': 0.00046130962922439435, 'samples': 2038080, 'steps': 10614, 'loss/train': 0.9057621359825134} 01/27/2022 05:50:56 - INFO - codeparrot_training - Step 10615: {'lr': 0.00046130088484847383, 'samples': 2038272, 'steps': 10615, 'loss/train': 0.7807964086532593} 01/27/2022 05:50:59 - INFO - codeparrot_training - Step 10616: {'lr': 0.0004612921395674074, 'samples': 2038464, 'steps': 10616, 'loss/train': 0.6958180814981461} 01/27/2022 05:51:02 - INFO - codeparrot_training - Step 10617: {'lr': 0.00046128339338123253, 'samples': 2038656, 'steps': 10617, 'loss/train': 0.7864518463611603} 01/27/2022 05:51:05 - INFO - codeparrot_training - Step 10618: {'lr': 0.0004612746462899867, 'samples': 2038848, 'steps': 10618, 'loss/train': 1.345144361257553} 01/27/2022 05:51:09 - INFO - codeparrot_training - Step 10619: {'lr': 0.00046126589829370736, 
'samples': 2039040, 'steps': 10619, 'loss/train': 0.7050953954458237} 01/27/2022 05:51:14 - INFO - codeparrot_training - Step 10620: {'lr': 0.00046125714939243204, 'samples': 2039232, 'steps': 10620, 'loss/train': 1.168353796005249} 01/27/2022 05:51:17 - INFO - codeparrot_training - Step 10621: {'lr': 0.00046124839958619815, 'samples': 2039424, 'steps': 10621, 'loss/train': 1.1658776700496674} 01/27/2022 05:51:20 - INFO - codeparrot_training - Step 10622: {'lr': 0.0004612396488750432, 'samples': 2039616, 'steps': 10622, 'loss/train': 0.6693430244922638} 01/27/2022 05:51:23 - INFO - codeparrot_training - Step 10623: {'lr': 0.00046123089725900464, 'samples': 2039808, 'steps': 10623, 'loss/train': 0.7694835662841797} 01/27/2022 05:51:26 - INFO - codeparrot_training - Step 10624: {'lr': 0.00046122214473812005, 'samples': 2040000, 'steps': 10624, 'loss/train': 0.6737596392631531} 01/27/2022 05:51:30 - INFO - codeparrot_training - Step 10625: {'lr': 0.0004612133913124268, 'samples': 2040192, 'steps': 10625, 'loss/train': 0.7581698298454285} 01/27/2022 05:51:33 - INFO - codeparrot_training - Step 10626: {'lr': 0.00046120463698196245, 'samples': 2040384, 'steps': 10626, 'loss/train': 0.7297220528125763} 01/27/2022 05:51:36 - INFO - codeparrot_training - Step 10627: {'lr': 0.00046119588174676454, 'samples': 2040576, 'steps': 10627, 'loss/train': 0.08308760449290276} 01/27/2022 05:51:40 - INFO - codeparrot_training - Step 10628: {'lr': 0.0004611871256068705, 'samples': 2040768, 'steps': 10628, 'loss/train': 0.9646637141704559} 01/27/2022 05:51:44 - INFO - codeparrot_training - Step 10629: {'lr': 0.0004611783685623179, 'samples': 2040960, 'steps': 10629, 'loss/train': 0.8940888941287994} 01/27/2022 05:51:47 - INFO - codeparrot_training - Step 10630: {'lr': 0.00046116961061314424, 'samples': 2041152, 'steps': 10630, 'loss/train': 0.082685686647892} 01/27/2022 05:51:50 - INFO - codeparrot_training - Step 10631: {'lr': 0.00046116085175938694, 'samples': 2041344, 'steps': 10631, 'loss/train': 0.9032265543937683} 01/27/2022 05:51:53 - INFO - codeparrot_training - Step 10632: {'lr': 0.00046115209200108366, 'samples': 2041536, 'steps': 10632, 'loss/train': 0.5763976871967316} 01/27/2022 05:51:56 - INFO - codeparrot_training - Step 10633: {'lr': 0.00046114333133827194, 'samples': 2041728, 'steps': 10633, 'loss/train': 0.8046042323112488} 01/27/2022 05:51:59 - INFO - codeparrot_training - Step 10634: {'lr': 0.0004611345697709891, 'samples': 2041920, 'steps': 10634, 'loss/train': 0.6037546992301941} 01/27/2022 05:52:02 - INFO - codeparrot_training - Step 10635: {'lr': 0.0004611258072992729, 'samples': 2042112, 'steps': 10635, 'loss/train': 0.6915976256132126} 01/27/2022 05:52:05 - INFO - codeparrot_training - Step 10636: {'lr': 0.0004611170439231607, 'samples': 2042304, 'steps': 10636, 'loss/train': 0.6504362672567368} 01/27/2022 05:52:10 - INFO - codeparrot_training - Step 10637: {'lr': 0.0004611082796426902, 'samples': 2042496, 'steps': 10637, 'loss/train': 0.4998708665370941} 01/27/2022 05:52:13 - INFO - codeparrot_training - Step 10638: {'lr': 0.00046109951445789883, 'samples': 2042688, 'steps': 10638, 'loss/train': 0.35212862491607666} 01/27/2022 05:52:16 - INFO - codeparrot_training - Step 10639: {'lr': 0.00046109074836882415, 'samples': 2042880, 'steps': 10639, 'loss/train': 0.8054638803005219} 01/27/2022 05:52:19 - INFO - codeparrot_training - Step 10640: {'lr': 0.00046108198137550377, 'samples': 2043072, 'steps': 10640, 'loss/train': 0.7115688621997833} 01/27/2022 05:52:22 - INFO - codeparrot_training 
- Step 10641: {'lr': 0.0004610732134779752, 'samples': 2043264, 'steps': 10641, 'loss/train': 0.7362036108970642} 01/27/2022 05:52:26 - INFO - codeparrot_training - Step 10642: {'lr': 0.000461064444676276, 'samples': 2043456, 'steps': 10642, 'loss/train': 0.654128685593605} 01/27/2022 05:52:29 - INFO - codeparrot_training - Step 10643: {'lr': 0.0004610556749704438, 'samples': 2043648, 'steps': 10643, 'loss/train': 0.5551665723323822} 01/27/2022 05:52:32 - INFO - codeparrot_training - Step 10644: {'lr': 0.000461046904360516, 'samples': 2043840, 'steps': 10644, 'loss/train': 1.4145657420158386} 01/27/2022 05:52:35 - INFO - codeparrot_training - Step 10645: {'lr': 0.0004610381328465303, 'samples': 2044032, 'steps': 10645, 'loss/train': 1.0122643411159515} 01/27/2022 05:52:40 - INFO - codeparrot_training - Step 10646: {'lr': 0.0004610293604285243, 'samples': 2044224, 'steps': 10646, 'loss/train': 0.8009702861309052} 01/27/2022 05:52:43 - INFO - codeparrot_training - Step 10647: {'lr': 0.0004610205871065355, 'samples': 2044416, 'steps': 10647, 'loss/train': 0.28613005578517914} 01/27/2022 05:52:46 - INFO - codeparrot_training - Step 10648: {'lr': 0.0004610118128806016, 'samples': 2044608, 'steps': 10648, 'loss/train': 1.1824906468391418} 01/27/2022 05:52:50 - INFO - codeparrot_training - Step 10649: {'lr': 0.0004610030377507599, 'samples': 2044800, 'steps': 10649, 'loss/train': 0.9573903679847717} 01/27/2022 05:52:53 - INFO - codeparrot_training - Step 10650: {'lr': 0.0004609942617170483, 'samples': 2044992, 'steps': 10650, 'loss/train': 0.5342010408639908} 01/27/2022 05:52:56 - INFO - codeparrot_training - Step 10651: {'lr': 0.0004609854847795043, 'samples': 2045184, 'steps': 10651, 'loss/train': 0.9186785817146301} 01/27/2022 05:52:59 - INFO - codeparrot_training - Step 10652: {'lr': 0.0004609767069381655, 'samples': 2045376, 'steps': 10652, 'loss/train': 0.6348438262939453} 01/27/2022 05:53:02 - INFO - codeparrot_training - Step 10653: {'lr': 0.00046096792819306945, 'samples': 2045568, 'steps': 10653, 'loss/train': 1.1069338917732239} 01/27/2022 05:53:05 - INFO - codeparrot_training - Step 10654: {'lr': 0.00046095914854425376, 'samples': 2045760, 'steps': 10654, 'loss/train': 0.8395264148712158} 01/27/2022 05:53:10 - INFO - codeparrot_training - Step 10655: {'lr': 0.00046095036799175606, 'samples': 2045952, 'steps': 10655, 'loss/train': 0.41266778111457825} 01/27/2022 05:53:13 - INFO - codeparrot_training - Step 10656: {'lr': 0.000460941586535614, 'samples': 2046144, 'steps': 10656, 'loss/train': 0.9879962503910065} 01/27/2022 05:53:16 - INFO - codeparrot_training - Step 10657: {'lr': 0.00046093280417586517, 'samples': 2046336, 'steps': 10657, 'loss/train': 0.6761751919984818} 01/27/2022 05:53:19 - INFO - codeparrot_training - Step 10658: {'lr': 0.0004609240209125472, 'samples': 2046528, 'steps': 10658, 'loss/train': 0.5837187319993973} 01/27/2022 05:53:23 - INFO - codeparrot_training - Step 10659: {'lr': 0.00046091523674569765, 'samples': 2046720, 'steps': 10659, 'loss/train': 0.6800457537174225} 01/27/2022 05:53:26 - INFO - codeparrot_training - Step 10660: {'lr': 0.00046090645167535415, 'samples': 2046912, 'steps': 10660, 'loss/train': 0.5133776664733887} 01/27/2022 05:53:29 - INFO - codeparrot_training - Step 10661: {'lr': 0.00046089766570155447, 'samples': 2047104, 'steps': 10661, 'loss/train': 0.8697632253170013} 01/27/2022 05:53:32 - INFO - codeparrot_training - Step 10662: {'lr': 0.0004608888788243362, 'samples': 2047296, 'steps': 10662, 'loss/train': 0.7911972105503082} 01/27/2022 
05:53:35 - INFO - codeparrot_training - Step 10663: {'lr': 0.00046088009104373683, 'samples': 2047488, 'steps': 10663, 'loss/train': 0.8936160206794739} 01/27/2022 05:53:40 - INFO - codeparrot_training - Step 10664: {'lr': 0.0004608713023597941, 'samples': 2047680, 'steps': 10664, 'loss/train': 0.714919924736023} 01/27/2022 05:53:43 - INFO - codeparrot_training - Step 10665: {'lr': 0.0004608625127725458, 'samples': 2047872, 'steps': 10665, 'loss/train': 0.8083405494689941} 01/27/2022 05:53:47 - INFO - codeparrot_training - Step 10666: {'lr': 0.0004608537222820294, 'samples': 2048064, 'steps': 10666, 'loss/train': 0.8565883040428162} 01/27/2022 05:53:50 - INFO - codeparrot_training - Step 10667: {'lr': 0.0004608449308882826, 'samples': 2048256, 'steps': 10667, 'loss/train': 5.482840061187744} 01/27/2022 05:53:53 - INFO - codeparrot_training - Step 10668: {'lr': 0.000460836138591343, 'samples': 2048448, 'steps': 10668, 'loss/train': 1.052570253610611} 01/27/2022 05:53:56 - INFO - codeparrot_training - Step 10669: {'lr': 0.0004608273453912484, 'samples': 2048640, 'steps': 10669, 'loss/train': 0.9199326038360596} 01/27/2022 05:53:59 - INFO - codeparrot_training - Step 10670: {'lr': 0.0004608185512880364, 'samples': 2048832, 'steps': 10670, 'loss/train': 0.7217386811971664} 01/27/2022 05:54:02 - INFO - codeparrot_training - Step 10671: {'lr': 0.0004608097562817446, 'samples': 2049024, 'steps': 10671, 'loss/train': 1.1540459990501404} 01/27/2022 05:54:06 - INFO - codeparrot_training - Step 10672: {'lr': 0.0004608009603724108, 'samples': 2049216, 'steps': 10672, 'loss/train': 0.9073629677295685} 01/27/2022 05:54:10 - INFO - codeparrot_training - Step 10673: {'lr': 0.0004607921635600726, 'samples': 2049408, 'steps': 10673, 'loss/train': 0.6392549872398376} 01/27/2022 05:54:13 - INFO - codeparrot_training - Step 10674: {'lr': 0.00046078336584476777, 'samples': 2049600, 'steps': 10674, 'loss/train': 1.131260722875595} 01/27/2022 05:54:16 - INFO - codeparrot_training - Step 10675: {'lr': 0.00046077456722653387, 'samples': 2049792, 'steps': 10675, 'loss/train': 0.8186698257923126} 01/27/2022 05:54:19 - INFO - codeparrot_training - Step 10676: {'lr': 0.00046076576770540865, 'samples': 2049984, 'steps': 10676, 'loss/train': 0.9513207077980042} 01/27/2022 05:54:23 - INFO - codeparrot_training - Step 10677: {'lr': 0.00046075696728142986, 'samples': 2050176, 'steps': 10677, 'loss/train': 1.325293242931366} 01/27/2022 05:54:26 - INFO - codeparrot_training - Step 10678: {'lr': 0.0004607481659546351, 'samples': 2050368, 'steps': 10678, 'loss/train': 0.8606624901294708} 01/27/2022 05:54:29 - INFO - codeparrot_training - Step 10679: {'lr': 0.0004607393637250621, 'samples': 2050560, 'steps': 10679, 'loss/train': 0.5083109736442566} 01/27/2022 05:54:32 - INFO - codeparrot_training - Step 10680: {'lr': 0.00046073056059274867, 'samples': 2050752, 'steps': 10680, 'loss/train': 0.5202161371707916} 01/27/2022 05:54:35 - INFO - codeparrot_training - Step 10681: {'lr': 0.0004607217565577323, 'samples': 2050944, 'steps': 10681, 'loss/train': 0.7052146643400192} 01/27/2022 05:54:40 - INFO - codeparrot_training - Step 10682: {'lr': 0.0004607129516200509, 'samples': 2051136, 'steps': 10682, 'loss/train': 0.5410535484552383} 01/27/2022 05:54:43 - INFO - codeparrot_training - Step 10683: {'lr': 0.00046070414577974216, 'samples': 2051328, 'steps': 10683, 'loss/train': 1.0965481996536255} 01/27/2022 05:54:46 - INFO - codeparrot_training - Step 10684: {'lr': 0.00046069533903684374, 'samples': 2051520, 'steps': 10684, 
'loss/train': 1.1501432061195374} 01/27/2022 05:54:49 - INFO - codeparrot_training - Step 10685: {'lr': 0.00046068653139139337, 'samples': 2051712, 'steps': 10685, 'loss/train': 0.9122324287891388} 01/27/2022 05:54:52 - INFO - codeparrot_training - Step 10686: {'lr': 0.0004606777228434288, 'samples': 2051904, 'steps': 10686, 'loss/train': 0.6843375116586685} 01/27/2022 05:54:56 - INFO - codeparrot_training - Step 10687: {'lr': 0.00046066891339298783, 'samples': 2052096, 'steps': 10687, 'loss/train': 0.868652880191803} 01/27/2022 05:54:59 - INFO - codeparrot_training - Step 10688: {'lr': 0.0004606601030401081, 'samples': 2052288, 'steps': 10688, 'loss/train': 0.9045096337795258} 01/27/2022 05:55:02 - INFO - codeparrot_training - Step 10689: {'lr': 0.00046065129178482733, 'samples': 2052480, 'steps': 10689, 'loss/train': 0.7700233161449432} 01/27/2022 05:55:05 - INFO - codeparrot_training - Step 10690: {'lr': 0.0004606424796271834, 'samples': 2052672, 'steps': 10690, 'loss/train': 1.142713487148285} 01/27/2022 05:55:10 - INFO - codeparrot_training - Step 10691: {'lr': 0.0004606336665672139, 'samples': 2052864, 'steps': 10691, 'loss/train': 1.5508900880813599} 01/27/2022 05:55:13 - INFO - codeparrot_training - Step 10692: {'lr': 0.00046062485260495666, 'samples': 2053056, 'steps': 10692, 'loss/train': 0.16018079593777657} 01/27/2022 05:55:16 - INFO - codeparrot_training - Step 10693: {'lr': 0.00046061603774044945, 'samples': 2053248, 'steps': 10693, 'loss/train': 1.499931961297989} 01/27/2022 05:55:20 - INFO - codeparrot_training - Step 10694: {'lr': 0.00046060722197373, 'samples': 2053440, 'steps': 10694, 'loss/train': 0.8711905181407928} 01/27/2022 05:55:23 - INFO - codeparrot_training - Step 10695: {'lr': 0.0004605984053048361, 'samples': 2053632, 'steps': 10695, 'loss/train': 0.507192850112915} 01/27/2022 05:55:26 - INFO - codeparrot_training - Step 10696: {'lr': 0.0004605895877338055, 'samples': 2053824, 'steps': 10696, 'loss/train': 0.7407217025756836} 01/27/2022 05:55:29 - INFO - codeparrot_training - Step 10697: {'lr': 0.000460580769260676, 'samples': 2054016, 'steps': 10697, 'loss/train': 0.8090106546878815} 01/27/2022 05:55:32 - INFO - codeparrot_training - Step 10698: {'lr': 0.0004605719498854853, 'samples': 2054208, 'steps': 10698, 'loss/train': 0.7330667227506638} 01/27/2022 05:55:37 - INFO - codeparrot_training - Step 10699: {'lr': 0.0004605631296082713, 'samples': 2054400, 'steps': 10699, 'loss/train': 0.9017016291618347} 01/27/2022 05:55:40 - INFO - codeparrot_training - Step 10700: {'lr': 0.0004605543084290716, 'samples': 2054592, 'steps': 10700, 'loss/train': 0.7068610936403275} 01/27/2022 05:55:43 - INFO - codeparrot_training - Step 10701: {'lr': 0.00046054548634792426, 'samples': 2054784, 'steps': 10701, 'loss/train': 0.8024073243141174} 01/27/2022 05:55:46 - INFO - codeparrot_training - Step 10702: {'lr': 0.0004605366633648668, 'samples': 2054976, 'steps': 10702, 'loss/train': 0.6600955724716187} 01/27/2022 05:55:49 - INFO - codeparrot_training - Step 10703: {'lr': 0.00046052783947993713, 'samples': 2055168, 'steps': 10703, 'loss/train': 1.4924385845661163} 01/27/2022 05:55:52 - INFO - codeparrot_training - Step 10704: {'lr': 0.0004605190146931731, 'samples': 2055360, 'steps': 10704, 'loss/train': 0.884470134973526} 01/27/2022 05:55:56 - INFO - codeparrot_training - Step 10705: {'lr': 0.0004605101890046124, 'samples': 2055552, 'steps': 10705, 'loss/train': 0.8863871991634369} 01/27/2022 05:55:59 - INFO - codeparrot_training - Step 10706: {'lr': 0.00046050136241429295, 
'samples': 2055744, 'steps': 10706, 'loss/train': 1.3119295835494995} 01/27/2022 05:56:02 - INFO - codeparrot_training - Step 10707: {'lr': 0.0004604925349222525, 'samples': 2055936, 'steps': 10707, 'loss/train': 0.6411631554365158} 01/27/2022 05:56:06 - INFO - codeparrot_training - Step 10708: {'lr': 0.00046048370652852885, 'samples': 2056128, 'steps': 10708, 'loss/train': 1.069927990436554} 01/27/2022 05:56:09 - INFO - codeparrot_training - Step 10709: {'lr': 0.00046047487723315986, 'samples': 2056320, 'steps': 10709, 'loss/train': 0.8933937549591064} 01/27/2022 05:56:13 - INFO - codeparrot_training - Step 10710: {'lr': 0.0004604660470361832, 'samples': 2056512, 'steps': 10710, 'loss/train': 0.6328044086694717} 01/27/2022 05:56:16 - INFO - codeparrot_training - Step 10711: {'lr': 0.000460457215937637, 'samples': 2056704, 'steps': 10711, 'loss/train': 1.0860864222049713} 01/27/2022 05:56:19 - INFO - codeparrot_training - Step 10712: {'lr': 0.00046044838393755885, 'samples': 2056896, 'steps': 10712, 'loss/train': 0.6960941255092621} 01/27/2022 05:56:22 - INFO - codeparrot_training - Step 10713: {'lr': 0.0004604395510359867, 'samples': 2057088, 'steps': 10713, 'loss/train': 0.772866278886795} 01/27/2022 05:56:25 - INFO - codeparrot_training - Step 10714: {'lr': 0.0004604307172329582, 'samples': 2057280, 'steps': 10714, 'loss/train': 0.9163029491901398} 01/27/2022 05:56:28 - INFO - codeparrot_training - Step 10715: {'lr': 0.0004604218825285114, 'samples': 2057472, 'steps': 10715, 'loss/train': 0.9637348651885986} 01/27/2022 05:56:32 - INFO - codeparrot_training - Step 10716: {'lr': 0.00046041304692268407, 'samples': 2057664, 'steps': 10716, 'loss/train': 0.8240399658679962} 01/27/2022 05:56:36 - INFO - codeparrot_training - Step 10717: {'lr': 0.00046040421041551404, 'samples': 2057856, 'steps': 10717, 'loss/train': 0.5853287279605865} 01/27/2022 05:56:39 - INFO - codeparrot_training - Step 10718: {'lr': 0.00046039537300703926, 'samples': 2058048, 'steps': 10718, 'loss/train': 0.9736569821834564} 01/27/2022 05:56:42 - INFO - codeparrot_training - Step 10719: {'lr': 0.00046038653469729747, 'samples': 2058240, 'steps': 10719, 'loss/train': 1.0255856215953827} 01/27/2022 05:56:46 - INFO - codeparrot_training - Step 10720: {'lr': 0.00046037769548632656, 'samples': 2058432, 'steps': 10720, 'loss/train': 0.8955161869525909} 01/27/2022 05:56:49 - INFO - codeparrot_training - Step 10721: {'lr': 0.0004603688553741644, 'samples': 2058624, 'steps': 10721, 'loss/train': 0.5824822336435318} 01/27/2022 05:56:52 - INFO - codeparrot_training - Step 10722: {'lr': 0.0004603600143608488, 'samples': 2058816, 'steps': 10722, 'loss/train': 1.097270429134369} 01/27/2022 05:56:55 - INFO - codeparrot_training - Step 10723: {'lr': 0.00046035117244641783, 'samples': 2059008, 'steps': 10723, 'loss/train': 0.5822937190532684} 01/27/2022 05:56:58 - INFO - codeparrot_training - Step 10724: {'lr': 0.0004603423296309092, 'samples': 2059200, 'steps': 10724, 'loss/train': 1.998494267463684} 01/27/2022 05:57:01 - INFO - codeparrot_training - Step 10725: {'lr': 0.0004603334859143608, 'samples': 2059392, 'steps': 10725, 'loss/train': 1.1598709523677826} 01/27/2022 05:57:06 - INFO - codeparrot_training - Step 10726: {'lr': 0.0004603246412968105, 'samples': 2059584, 'steps': 10726, 'loss/train': 1.1943225860595703} 01/27/2022 05:57:09 - INFO - codeparrot_training - Step 10727: {'lr': 0.00046031579577829616, 'samples': 2059776, 'steps': 10727, 'loss/train': 1.0147444009780884} 01/27/2022 05:57:13 - INFO - codeparrot_training - Step 
10728: {'lr': 0.00046030694935885586, 'samples': 2059968, 'steps': 10728, 'loss/train': 0.705818384885788} 01/27/2022 05:57:16 - INFO - codeparrot_training - Step 10729: {'lr': 0.00046029810203852736, 'samples': 2060160, 'steps': 10729, 'loss/train': 0.7642630040645599} 01/27/2022 05:57:19 - INFO - codeparrot_training - Step 10730: {'lr': 0.00046028925381734855, 'samples': 2060352, 'steps': 10730, 'loss/train': 0.7900058627128601} 01/27/2022 05:57:22 - INFO - codeparrot_training - Step 10731: {'lr': 0.00046028040469535734, 'samples': 2060544, 'steps': 10731, 'loss/train': 1.1840274631977081} 01/27/2022 05:57:25 - INFO - codeparrot_training - Step 10732: {'lr': 0.00046027155467259166, 'samples': 2060736, 'steps': 10732, 'loss/train': 0.48257996141910553} 01/27/2022 05:57:28 - INFO - codeparrot_training - Step 10733: {'lr': 0.00046026270374908935, 'samples': 2060928, 'steps': 10733, 'loss/train': 0.761489599943161} 01/27/2022 05:57:33 - INFO - codeparrot_training - Step 10734: {'lr': 0.0004602538519248884, 'samples': 2061120, 'steps': 10734, 'loss/train': 0.9412389099597931} 01/27/2022 05:57:36 - INFO - codeparrot_training - Step 10735: {'lr': 0.00046024499920002676, 'samples': 2061312, 'steps': 10735, 'loss/train': 0.8029177486896515} 01/27/2022 05:57:39 - INFO - codeparrot_training - Step 10736: {'lr': 0.0004602361455745423, 'samples': 2061504, 'steps': 10736, 'loss/train': 1.2476645708084106} 01/27/2022 05:57:42 - INFO - codeparrot_training - Step 10737: {'lr': 0.00046022729104847293, 'samples': 2061696, 'steps': 10737, 'loss/train': 0.48903264105319977} 01/27/2022 05:57:45 - INFO - codeparrot_training - Step 10738: {'lr': 0.0004602184356218566, 'samples': 2061888, 'steps': 10738, 'loss/train': 1.0080349445343018} 01/27/2022 05:57:49 - INFO - codeparrot_training - Step 10739: {'lr': 0.0004602095792947312, 'samples': 2062080, 'steps': 10739, 'loss/train': 1.0728323757648468} 01/27/2022 05:57:52 - INFO - codeparrot_training - Step 10740: {'lr': 0.00046020072206713484, 'samples': 2062272, 'steps': 10740, 'loss/train': 0.8267936110496521} 01/27/2022 05:57:55 - INFO - codeparrot_training - Step 10741: {'lr': 0.0004601918639391052, 'samples': 2062464, 'steps': 10741, 'loss/train': 0.569891631603241} 01/27/2022 05:57:58 - INFO - codeparrot_training - Step 10742: {'lr': 0.0004601830049106804, 'samples': 2062656, 'steps': 10742, 'loss/train': 0.6048392951488495} 01/27/2022 05:58:03 - INFO - codeparrot_training - Step 10743: {'lr': 0.0004601741449818984, 'samples': 2062848, 'steps': 10743, 'loss/train': 1.02145653963089} 01/27/2022 05:58:06 - INFO - codeparrot_training - Step 10744: {'lr': 0.000460165284152797, 'samples': 2063040, 'steps': 10744, 'loss/train': 0.7966554164886475} 01/27/2022 05:58:09 - INFO - codeparrot_training - Step 10745: {'lr': 0.0004601564224234143, 'samples': 2063232, 'steps': 10745, 'loss/train': 0.6441530138254166} 01/27/2022 05:58:12 - INFO - codeparrot_training - Step 10746: {'lr': 0.00046014755979378825, 'samples': 2063424, 'steps': 10746, 'loss/train': 1.0921249687671661} 01/27/2022 05:58:16 - INFO - codeparrot_training - Step 10747: {'lr': 0.0004601386962639568, 'samples': 2063616, 'steps': 10747, 'loss/train': 1.1830469369888306} 01/27/2022 05:58:19 - INFO - codeparrot_training - Step 10748: {'lr': 0.0004601298318339578, 'samples': 2063808, 'steps': 10748, 'loss/train': 0.6258515864610672} 01/27/2022 05:58:22 - INFO - codeparrot_training - Step 10749: {'lr': 0.0004601209665038294, 'samples': 2064000, 'steps': 10749, 'loss/train': 0.7062706500291824} 01/27/2022 
05:58:25 - INFO - codeparrot_training - Step 10750: {'lr': 0.0004601121002736095, 'samples': 2064192, 'steps': 10750, 'loss/train': 0.6903091371059418} 01/27/2022 05:58:28 - INFO - codeparrot_training - Step 10751: {'lr': 0.0004601032331433361, 'samples': 2064384, 'steps': 10751, 'loss/train': 1.1682786047458649} 01/27/2022 05:58:33 - INFO - codeparrot_training - Step 10752: {'lr': 0.00046009436511304714, 'samples': 2064576, 'steps': 10752, 'loss/train': 0.9357738196849823} 01/27/2022 05:58:36 - INFO - codeparrot_training - Step 10753: {'lr': 0.0004600854961827806, 'samples': 2064768, 'steps': 10753, 'loss/train': 1.41667440533638} 01/27/2022 05:58:39 - INFO - codeparrot_training - Step 10754: {'lr': 0.00046007662635257453, 'samples': 2064960, 'steps': 10754, 'loss/train': 0.6961110234260559} 01/27/2022 05:58:42 - INFO - codeparrot_training - Step 10755: {'lr': 0.0004600677556224669, 'samples': 2065152, 'steps': 10755, 'loss/train': 0.5891768932342529} 01/27/2022 05:58:45 - INFO - codeparrot_training - Step 10756: {'lr': 0.00046005888399249575, 'samples': 2065344, 'steps': 10756, 'loss/train': 0.6119339168071747} 01/27/2022 05:58:48 - INFO - codeparrot_training - Step 10757: {'lr': 0.000460050011462699, 'samples': 2065536, 'steps': 10757, 'loss/train': 0.9921836256980896} 01/27/2022 05:58:51 - INFO - codeparrot_training - Step 10758: {'lr': 0.0004600411380331146, 'samples': 2065728, 'steps': 10758, 'loss/train': 0.6537316292524338} 01/27/2022 05:58:55 - INFO - codeparrot_training - Step 10759: {'lr': 0.0004600322637037808, 'samples': 2065920, 'steps': 10759, 'loss/train': 0.9253493249416351} 01/27/2022 05:58:58 - INFO - codeparrot_training - Step 10760: {'lr': 0.00046002338847473545, 'samples': 2066112, 'steps': 10760, 'loss/train': 1.0774503350257874} 01/27/2022 05:59:02 - INFO - codeparrot_training - Step 10761: {'lr': 0.00046001451234601665, 'samples': 2066304, 'steps': 10761, 'loss/train': 0.45011723041534424} 01/27/2022 05:59:05 - INFO - codeparrot_training - Step 10762: {'lr': 0.0004600056353176623, 'samples': 2066496, 'steps': 10762, 'loss/train': 0.6570072323083878} 01/27/2022 05:59:08 - INFO - codeparrot_training - Step 10763: {'lr': 0.00045999675738971047, 'samples': 2066688, 'steps': 10763, 'loss/train': 1.0317431688308716} 01/27/2022 05:59:12 - INFO - codeparrot_training - Step 10764: {'lr': 0.00045998787856219925, 'samples': 2066880, 'steps': 10764, 'loss/train': 0.6882973909378052} 01/27/2022 05:59:15 - INFO - codeparrot_training - Step 10765: {'lr': 0.0004599789988351666, 'samples': 2067072, 'steps': 10765, 'loss/train': 0.8664650917053223} 01/27/2022 05:59:18 - INFO - codeparrot_training - Step 10766: {'lr': 0.0004599701182086506, 'samples': 2067264, 'steps': 10766, 'loss/train': 0.9048789739608765} 01/27/2022 05:59:21 - INFO - codeparrot_training - Step 10767: {'lr': 0.0004599612366826893, 'samples': 2067456, 'steps': 10767, 'loss/train': 0.4958610087633133} 01/27/2022 05:59:24 - INFO - codeparrot_training - Step 10768: {'lr': 0.00045995235425732076, 'samples': 2067648, 'steps': 10768, 'loss/train': 1.0758809745311737} 01/27/2022 05:59:27 - INFO - codeparrot_training - Step 10769: {'lr': 0.00045994347093258295, 'samples': 2067840, 'steps': 10769, 'loss/train': 0.6967832744121552} 01/27/2022 05:59:33 - INFO - codeparrot_training - Step 10770: {'lr': 0.00045993458670851397, 'samples': 2068032, 'steps': 10770, 'loss/train': 0.9411801695823669} 01/27/2022 05:59:36 - INFO - codeparrot_training - Step 10771: {'lr': 0.0004599257015851519, 'samples': 2068224, 'steps': 10771, 
'loss/train': 0.807549923658371} 01/27/2022 05:59:39 - INFO - codeparrot_training - Step 10772: {'lr': 0.0004599168155625348, 'samples': 2068416, 'steps': 10772, 'loss/train': 0.1447775401175022} 01/27/2022 05:59:42 - INFO - codeparrot_training - Step 10773: {'lr': 0.00045990792864070075, 'samples': 2068608, 'steps': 10773, 'loss/train': 0.841722697019577} 01/27/2022 05:59:45 - INFO - codeparrot_training - Step 10774: {'lr': 0.0004598990408196878, 'samples': 2068800, 'steps': 10774, 'loss/train': 1.0521583557128906} 01/27/2022 05:59:48 - INFO - codeparrot_training - Step 10775: {'lr': 0.00045989015209953394, 'samples': 2068992, 'steps': 10775, 'loss/train': 0.6992580145597458} 01/27/2022 05:59:52 - INFO - codeparrot_training - Step 10776: {'lr': 0.00045988126248027735, 'samples': 2069184, 'steps': 10776, 'loss/train': 0.7515650689601898} 01/27/2022 05:59:55 - INFO - codeparrot_training - Step 10777: {'lr': 0.00045987237196195603, 'samples': 2069376, 'steps': 10777, 'loss/train': 0.9590854346752167} 01/27/2022 05:59:58 - INFO - codeparrot_training - Step 10778: {'lr': 0.00045986348054460815, 'samples': 2069568, 'steps': 10778, 'loss/train': 1.3121949434280396} 01/27/2022 06:00:02 - INFO - codeparrot_training - Step 10779: {'lr': 0.00045985458822827175, 'samples': 2069760, 'steps': 10779, 'loss/train': 1.2909200191497803} 01/27/2022 06:00:06 - INFO - codeparrot_training - Step 10780: {'lr': 0.0004598456950129849, 'samples': 2069952, 'steps': 10780, 'loss/train': 1.2143585085868835} 01/27/2022 06:00:09 - INFO - codeparrot_training - Step 10781: {'lr': 0.00045983680089878575, 'samples': 2070144, 'steps': 10781, 'loss/train': 1.1794177293777466} 01/27/2022 06:00:12 - INFO - codeparrot_training - Step 10782: {'lr': 0.0004598279058857124, 'samples': 2070336, 'steps': 10782, 'loss/train': 0.7809513509273529} 01/27/2022 06:00:15 - INFO - codeparrot_training - Step 10783: {'lr': 0.00045981900997380296, 'samples': 2070528, 'steps': 10783, 'loss/train': 0.43233610689640045} 01/27/2022 06:00:18 - INFO - codeparrot_training - Step 10784: {'lr': 0.0004598101131630954, 'samples': 2070720, 'steps': 10784, 'loss/train': 0.755113273859024} 01/27/2022 06:00:21 - INFO - codeparrot_training - Step 10785: {'lr': 0.00045980121545362805, 'samples': 2070912, 'steps': 10785, 'loss/train': 0.8299075663089752} 01/27/2022 06:00:24 - INFO - codeparrot_training - Step 10786: {'lr': 0.0004597923168454389, 'samples': 2071104, 'steps': 10786, 'loss/train': 0.5867969691753387} 01/27/2022 06:00:29 - INFO - codeparrot_training - Step 10787: {'lr': 0.000459783417338566, 'samples': 2071296, 'steps': 10787, 'loss/train': 0.5799829512834549} 01/27/2022 06:00:33 - INFO - codeparrot_training - Step 10788: {'lr': 0.0004597745169330476, 'samples': 2071488, 'steps': 10788, 'loss/train': 0.9070489704608917} 01/27/2022 06:00:36 - INFO - codeparrot_training - Step 10789: {'lr': 0.0004597656156289217, 'samples': 2071680, 'steps': 10789, 'loss/train': 0.6700001209974289} 01/27/2022 06:00:39 - INFO - codeparrot_training - Step 10790: {'lr': 0.0004597567134262266, 'samples': 2071872, 'steps': 10790, 'loss/train': 0.7332644015550613} 01/27/2022 06:00:42 - INFO - codeparrot_training - Step 10791: {'lr': 0.00045974781032500034, 'samples': 2072064, 'steps': 10791, 'loss/train': 1.1593101024627686} 01/27/2022 06:00:45 - INFO - codeparrot_training - Step 10792: {'lr': 0.00045973890632528106, 'samples': 2072256, 'steps': 10792, 'loss/train': 1.001510202884674} 01/27/2022 06:00:48 - INFO - codeparrot_training - Step 10793: {'lr': 
0.00045973000142710696, 'samples': 2072448, 'steps': 10793, 'loss/train': 0.35209283977746964} 01/27/2022 06:00:51 - INFO - codeparrot_training - Step 10794: {'lr': 0.000459721095630516, 'samples': 2072640, 'steps': 10794, 'loss/train': 1.006472110748291} 01/27/2022 06:00:55 - INFO - codeparrot_training - Step 10795: {'lr': 0.00045971218893554655, 'samples': 2072832, 'steps': 10795, 'loss/train': 0.8093495965003967} 01/27/2022 06:00:59 - INFO - codeparrot_training - Step 10796: {'lr': 0.0004597032813422367, 'samples': 2073024, 'steps': 10796, 'loss/train': 0.845027357339859} 01/27/2022 06:01:02 - INFO - codeparrot_training - Step 10797: {'lr': 0.00045969437285062453, 'samples': 2073216, 'steps': 10797, 'loss/train': 0.8556160032749176} 01/27/2022 06:01:05 - INFO - codeparrot_training - Step 10798: {'lr': 0.00045968546346074823, 'samples': 2073408, 'steps': 10798, 'loss/train': 0.7423011660575867} 01/27/2022 06:01:08 - INFO - codeparrot_training - Step 10799: {'lr': 0.000459676553172646, 'samples': 2073600, 'steps': 10799, 'loss/train': 1.0001908242702484} 01/27/2022 06:01:11 - INFO - codeparrot_training - Step 10800: {'lr': 0.00045966764198635603, 'samples': 2073792, 'steps': 10800, 'loss/train': 0.7183511853218079} 01/27/2022 06:01:15 - INFO - codeparrot_training - Step 10801: {'lr': 0.0004596587299019164, 'samples': 2073984, 'steps': 10801, 'loss/train': 1.1938781440258026} 01/27/2022 06:01:18 - INFO - codeparrot_training - Step 10802: {'lr': 0.0004596498169193654, 'samples': 2074176, 'steps': 10802, 'loss/train': 1.2497554421424866} 01/27/2022 06:01:21 - INFO - codeparrot_training - Step 10803: {'lr': 0.00045964090303874115, 'samples': 2074368, 'steps': 10803, 'loss/train': 0.7728368639945984} 01/27/2022 06:01:24 - INFO - codeparrot_training - Step 10804: {'lr': 0.0004596319882600818, 'samples': 2074560, 'steps': 10804, 'loss/train': 0.9104586839675903} 01/27/2022 06:01:28 - INFO - codeparrot_training - Step 10805: {'lr': 0.00045962307258342564, 'samples': 2074752, 'steps': 10805, 'loss/train': 0.7976725101470947} 01/27/2022 06:01:32 - INFO - codeparrot_training - Step 10806: {'lr': 0.00045961415600881075, 'samples': 2074944, 'steps': 10806, 'loss/train': 0.6977080553770065} 01/27/2022 06:01:35 - INFO - codeparrot_training - Step 10807: {'lr': 0.0004596052385362754, 'samples': 2075136, 'steps': 10807, 'loss/train': 0.5916188657283783} 01/27/2022 06:01:38 - INFO - codeparrot_training - Step 10808: {'lr': 0.00045959632016585774, 'samples': 2075328, 'steps': 10808, 'loss/train': 0.9338304698467255} 01/27/2022 06:01:41 - INFO - codeparrot_training - Step 10809: {'lr': 0.00045958740089759606, 'samples': 2075520, 'steps': 10809, 'loss/train': 0.09018008597195148} 01/27/2022 06:01:44 - INFO - codeparrot_training - Step 10810: {'lr': 0.0004595784807315284, 'samples': 2075712, 'steps': 10810, 'loss/train': 1.4057759642601013} 01/27/2022 06:01:47 - INFO - codeparrot_training - Step 10811: {'lr': 0.0004595695596676932, 'samples': 2075904, 'steps': 10811, 'loss/train': 0.7329603284597397} 01/27/2022 06:01:50 - INFO - codeparrot_training - Step 10812: {'lr': 0.00045956063770612843, 'samples': 2076096, 'steps': 10812, 'loss/train': 0.5742074400186539} 01/27/2022 06:01:54 - INFO - codeparrot_training - Step 10813: {'lr': 0.00045955171484687255, 'samples': 2076288, 'steps': 10813, 'loss/train': 0.700109526515007} 01/27/2022 06:01:58 - INFO - codeparrot_training - Step 10814: {'lr': 0.0004595427910899636, 'samples': 2076480, 'steps': 10814, 'loss/train': 1.1797821521759033} 01/27/2022 06:02:01 - INFO - 
codeparrot_training - Step 10815: {'lr': 0.00045953386643543987, 'samples': 2076672, 'steps': 10815, 'loss/train': 1.2192304730415344} 01/27/2022 06:02:05 - INFO - codeparrot_training - Step 10816: {'lr': 0.0004595249408833396, 'samples': 2076864, 'steps': 10816, 'loss/train': 0.9022214412689209} 01/27/2022 06:02:08 - INFO - codeparrot_training - Step 10817: {'lr': 0.00045951601443370107, 'samples': 2077056, 'steps': 10817, 'loss/train': 1.0962296426296234} 01/27/2022 06:02:11 - INFO - codeparrot_training - Step 10818: {'lr': 0.00045950708708656236, 'samples': 2077248, 'steps': 10818, 'loss/train': 0.8810968995094299} 01/27/2022 06:02:14 - INFO - codeparrot_training - Step 10819: {'lr': 0.0004594981588419619, 'samples': 2077440, 'steps': 10819, 'loss/train': 0.8610187768936157} 01/27/2022 06:02:17 - INFO - codeparrot_training - Step 10820: {'lr': 0.00045948922969993777, 'samples': 2077632, 'steps': 10820, 'loss/train': 0.4397711008787155} 01/27/2022 06:02:20 - INFO - codeparrot_training - Step 10821: {'lr': 0.00045948029966052834, 'samples': 2077824, 'steps': 10821, 'loss/train': 0.8505102396011353} 01/27/2022 06:02:25 - INFO - codeparrot_training - Step 10822: {'lr': 0.0004594713687237718, 'samples': 2078016, 'steps': 10822, 'loss/train': 0.524662435054779} 01/27/2022 06:02:29 - INFO - codeparrot_training - Step 10823: {'lr': 0.00045946243688970643, 'samples': 2078208, 'steps': 10823, 'loss/train': 0.5712525844573975} 01/27/2022 06:02:32 - INFO - codeparrot_training - Step 10824: {'lr': 0.00045945350415837056, 'samples': 2078400, 'steps': 10824, 'loss/train': 0.7112665772438049} 01/27/2022 06:02:35 - INFO - codeparrot_training - Step 10825: {'lr': 0.00045944457052980237, 'samples': 2078592, 'steps': 10825, 'loss/train': 0.9182926118373871} 01/27/2022 06:02:38 - INFO - codeparrot_training - Step 10826: {'lr': 0.0004594356360040401, 'samples': 2078784, 'steps': 10826, 'loss/train': 0.7045855075120926} 01/27/2022 06:02:41 - INFO - codeparrot_training - Step 10827: {'lr': 0.0004594267005811221, 'samples': 2078976, 'steps': 10827, 'loss/train': 1.424558401107788} 01/27/2022 06:02:44 - INFO - codeparrot_training - Step 10828: {'lr': 0.0004594177642610866, 'samples': 2079168, 'steps': 10828, 'loss/train': 0.4610580503940582} 01/27/2022 06:02:47 - INFO - codeparrot_training - Step 10829: {'lr': 0.0004594088270439719, 'samples': 2079360, 'steps': 10829, 'loss/train': 0.5581796318292618} 01/27/2022 06:02:51 - INFO - codeparrot_training - Step 10830: {'lr': 0.00045939988892981624, 'samples': 2079552, 'steps': 10830, 'loss/train': 0.6597130000591278} 01/27/2022 06:02:55 - INFO - codeparrot_training - Step 10831: {'lr': 0.00045939094991865806, 'samples': 2079744, 'steps': 10831, 'loss/train': 0.5198228806257248} 01/27/2022 06:02:58 - INFO - codeparrot_training - Step 10832: {'lr': 0.00045938201001053546, 'samples': 2079936, 'steps': 10832, 'loss/train': 0.8235695064067841} 01/27/2022 06:03:01 - INFO - codeparrot_training - Step 10833: {'lr': 0.00045937306920548684, 'samples': 2080128, 'steps': 10833, 'loss/train': 0.8637816309928894} 01/27/2022 06:03:05 - INFO - codeparrot_training - Step 10834: {'lr': 0.0004593641275035504, 'samples': 2080320, 'steps': 10834, 'loss/train': 0.8312424123287201} 01/27/2022 06:03:08 - INFO - codeparrot_training - Step 10835: {'lr': 0.00045935518490476456, 'samples': 2080512, 'steps': 10835, 'loss/train': 0.8954105973243713} 01/27/2022 06:03:11 - INFO - codeparrot_training - Step 10836: {'lr': 0.00045934624140916763, 'samples': 2080704, 'steps': 10836, 'loss/train': 
1.0590575337409973} 01/27/2022 06:03:14 - INFO - codeparrot_training - Step 10837: {'lr': 0.0004593372970167978, 'samples': 2080896, 'steps': 10837, 'loss/train': 0.6499078273773193} 01/27/2022 06:03:17 - INFO - codeparrot_training - Step 10838: {'lr': 0.0004593283517276936, 'samples': 2081088, 'steps': 10838, 'loss/train': 0.9235454499721527} 01/27/2022 06:03:20 - INFO - codeparrot_training - Step 10839: {'lr': 0.0004593194055418931, 'samples': 2081280, 'steps': 10839, 'loss/train': 1.086887151002884} 01/27/2022 06:03:24 - INFO - codeparrot_training - Step 10840: {'lr': 0.00045931045845943474, 'samples': 2081472, 'steps': 10840, 'loss/train': 0.599461555480957} 01/27/2022 06:03:28 - INFO - codeparrot_training - Step 10841: {'lr': 0.00045930151048035684, 'samples': 2081664, 'steps': 10841, 'loss/train': 1.0271725058555603} 01/27/2022 06:03:31 - INFO - codeparrot_training - Step 10842: {'lr': 0.0004592925616046978, 'samples': 2081856, 'steps': 10842, 'loss/train': 0.7454720288515091} 01/27/2022 06:03:34 - INFO - codeparrot_training - Step 10843: {'lr': 0.0004592836118324958, 'samples': 2082048, 'steps': 10843, 'loss/train': 0.971908450126648} 01/27/2022 06:03:37 - INFO - codeparrot_training - Step 10844: {'lr': 0.0004592746611637893, 'samples': 2082240, 'steps': 10844, 'loss/train': 0.663274884223938} 01/27/2022 06:03:40 - INFO - codeparrot_training - Step 10845: {'lr': 0.00045926570959861656, 'samples': 2082432, 'steps': 10845, 'loss/train': 0.7349517792463303} 01/27/2022 06:03:43 - INFO - codeparrot_training - Step 10846: {'lr': 0.000459256757137016, 'samples': 2082624, 'steps': 10846, 'loss/train': 0.6281118392944336} 01/27/2022 06:03:46 - INFO - codeparrot_training - Step 10847: {'lr': 0.00045924780377902595, 'samples': 2082816, 'steps': 10847, 'loss/train': 0.8085889220237732} 01/27/2022 06:03:50 - INFO - codeparrot_training - Step 10848: {'lr': 0.00045923884952468475, 'samples': 2083008, 'steps': 10848, 'loss/train': 0.623017743229866} 01/27/2022 06:03:55 - INFO - codeparrot_training - Step 10849: {'lr': 0.00045922989437403074, 'samples': 2083200, 'steps': 10849, 'loss/train': 0.5296316295862198} 01/27/2022 06:03:58 - INFO - codeparrot_training - Step 10850: {'lr': 0.0004592209383271023, 'samples': 2083392, 'steps': 10850, 'loss/train': 1.2828709781169891} 01/27/2022 06:04:01 - INFO - codeparrot_training - Step 10851: {'lr': 0.0004592119813839378, 'samples': 2083584, 'steps': 10851, 'loss/train': 0.7014201432466507} 01/27/2022 06:04:04 - INFO - codeparrot_training - Step 10852: {'lr': 0.0004592030235445757, 'samples': 2083776, 'steps': 10852, 'loss/train': 1.0553626120090485} 01/27/2022 06:04:07 - INFO - codeparrot_training - Step 10853: {'lr': 0.00045919406480905413, 'samples': 2083968, 'steps': 10853, 'loss/train': 0.8475538194179535} 01/27/2022 06:04:10 - INFO - codeparrot_training - Step 10854: {'lr': 0.0004591851051774117, 'samples': 2084160, 'steps': 10854, 'loss/train': 0.17692333459854126} 01/27/2022 06:04:13 - INFO - codeparrot_training - Step 10855: {'lr': 0.00045917614464968665, 'samples': 2084352, 'steps': 10855, 'loss/train': 0.8340095579624176} 01/27/2022 06:04:17 - INFO - codeparrot_training - Step 10856: {'lr': 0.0004591671832259174, 'samples': 2084544, 'steps': 10856, 'loss/train': 0.6747082024812698} 01/27/2022 06:04:21 - INFO - codeparrot_training - Step 10857: {'lr': 0.00045915822090614243, 'samples': 2084736, 'steps': 10857, 'loss/train': 0.6418906152248383} 01/27/2022 06:04:24 - INFO - codeparrot_training - Step 10858: {'lr': 0.00045914925769040006, 'samples': 
2084928, 'steps': 10858, 'loss/train': 1.4693017601966858} 01/27/2022 06:04:27 - INFO - codeparrot_training - Step 10859: {'lr': 0.0004591402935787287, 'samples': 2085120, 'steps': 10859, 'loss/train': 1.3552963435649872} 01/27/2022 06:04:30 - INFO - codeparrot_training - Step 10860: {'lr': 0.00045913132857116663, 'samples': 2085312, 'steps': 10860, 'loss/train': 0.8319592773914337} 01/27/2022 06:04:34 - INFO - codeparrot_training - Step 10861: {'lr': 0.00045912236266775245, 'samples': 2085504, 'steps': 10861, 'loss/train': 0.8069823682308197} 01/27/2022 06:04:37 - INFO - codeparrot_training - Step 10862: {'lr': 0.0004591133958685244, 'samples': 2085696, 'steps': 10862, 'loss/train': 0.7662497162818909} 01/27/2022 06:04:40 - INFO - codeparrot_training - Step 10863: {'lr': 0.00045910442817352095, 'samples': 2085888, 'steps': 10863, 'loss/train': 0.802098959684372} 01/27/2022 06:04:43 - INFO - codeparrot_training - Step 10864: {'lr': 0.0004590954595827806, 'samples': 2086080, 'steps': 10864, 'loss/train': 0.695303812623024} 01/27/2022 06:04:46 - INFO - codeparrot_training - Step 10865: {'lr': 0.00045908649009634165, 'samples': 2086272, 'steps': 10865, 'loss/train': 0.9819301664829254} 01/27/2022 06:04:51 - INFO - codeparrot_training - Step 10866: {'lr': 0.0004590775197142426, 'samples': 2086464, 'steps': 10866, 'loss/train': 0.8034507930278778} 01/27/2022 06:04:55 - INFO - codeparrot_training - Step 10867: {'lr': 0.0004590685484365218, 'samples': 2086656, 'steps': 10867, 'loss/train': 0.9333161115646362} 01/27/2022 06:04:58 - INFO - codeparrot_training - Step 10868: {'lr': 0.00045905957626321775, 'samples': 2086848, 'steps': 10868, 'loss/train': 1.0692626237869263} 01/27/2022 06:05:01 - INFO - codeparrot_training - Step 10869: {'lr': 0.0004590506031943689, 'samples': 2087040, 'steps': 10869, 'loss/train': 0.8494892120361328} 01/27/2022 06:05:04 - INFO - codeparrot_training - Step 10870: {'lr': 0.00045904162923001356, 'samples': 2087232, 'steps': 10870, 'loss/train': 0.6158648282289505} 01/27/2022 06:05:07 - INFO - codeparrot_training - Step 10871: {'lr': 0.00045903265437019036, 'samples': 2087424, 'steps': 10871, 'loss/train': 0.9898312389850616} 01/27/2022 06:05:10 - INFO - codeparrot_training - Step 10872: {'lr': 0.00045902367861493754, 'samples': 2087616, 'steps': 10872, 'loss/train': 1.7110515832901} 01/27/2022 06:05:13 - INFO - codeparrot_training - Step 10873: {'lr': 0.00045901470196429376, 'samples': 2087808, 'steps': 10873, 'loss/train': 0.4383297711610794} 01/27/2022 06:05:16 - INFO - codeparrot_training - Step 10874: {'lr': 0.0004590057244182972, 'samples': 2088000, 'steps': 10874, 'loss/train': 0.5216405242681503} 01/27/2022 06:05:21 - INFO - codeparrot_training - Step 10875: {'lr': 0.0004589967459769867, 'samples': 2088192, 'steps': 10875, 'loss/train': 0.7703507244586945} 01/27/2022 06:05:24 - INFO - codeparrot_training - Step 10876: {'lr': 0.00045898776664040036, 'samples': 2088384, 'steps': 10876, 'loss/train': 0.760128378868103} 01/27/2022 06:05:27 - INFO - codeparrot_training - Step 10877: {'lr': 0.00045897878640857684, 'samples': 2088576, 'steps': 10877, 'loss/train': 1.1145767569541931} 01/27/2022 06:05:30 - INFO - codeparrot_training - Step 10878: {'lr': 0.00045896980528155454, 'samples': 2088768, 'steps': 10878, 'loss/train': 1.3342823088169098} 01/27/2022 06:05:33 - INFO - codeparrot_training - Step 10879: {'lr': 0.0004589608232593719, 'samples': 2088960, 'steps': 10879, 'loss/train': 1.344458520412445} 01/27/2022 06:05:37 - INFO - codeparrot_training - Step 10880: 
{'lr': 0.0004589518403420676, 'samples': 2089152, 'steps': 10880, 'loss/train': 0.5141909569501877} 01/27/2022 06:05:40 - INFO - codeparrot_training - Step 10881: {'lr': 0.0004589428565296798, 'samples': 2089344, 'steps': 10881, 'loss/train': 0.7640118598937988} 01/27/2022 06:05:43 - INFO - codeparrot_training - Step 10882: {'lr': 0.0004589338718222473, 'samples': 2089536, 'steps': 10882, 'loss/train': 0.9023914039134979} 01/27/2022 06:05:46 - INFO - codeparrot_training - Step 10883: {'lr': 0.0004589248862198083, 'samples': 2089728, 'steps': 10883, 'loss/train': 0.804531991481781} 01/27/2022 06:05:51 - INFO - codeparrot_training - Step 10884: {'lr': 0.0004589158997224015, 'samples': 2089920, 'steps': 10884, 'loss/train': 0.3630651757121086} 01/27/2022 06:05:54 - INFO - codeparrot_training - Step 10885: {'lr': 0.0004589069123300653, 'samples': 2090112, 'steps': 10885, 'loss/train': 0.4232098460197449} 01/27/2022 06:05:57 - INFO - codeparrot_training - Step 10886: {'lr': 0.0004588979240428383, 'samples': 2090304, 'steps': 10886, 'loss/train': 1.2544749677181244} 01/27/2022 06:06:00 - INFO - codeparrot_training - Step 10887: {'lr': 0.00045888893486075875, 'samples': 2090496, 'steps': 10887, 'loss/train': 1.1511460840702057} 01/27/2022 06:06:03 - INFO - codeparrot_training - Step 10888: {'lr': 0.0004588799447838655, 'samples': 2090688, 'steps': 10888, 'loss/train': 1.2796645760536194} 01/27/2022 06:06:07 - INFO - codeparrot_training - Step 10889: {'lr': 0.0004588709538121968, 'samples': 2090880, 'steps': 10889, 'loss/train': 0.7359296679496765} 01/27/2022 06:06:10 - INFO - codeparrot_training - Step 10890: {'lr': 0.00045886196194579133, 'samples': 2091072, 'steps': 10890, 'loss/train': 1.1976370811462402} 01/27/2022 06:06:13 - INFO - codeparrot_training - Step 10891: {'lr': 0.00045885296918468746, 'samples': 2091264, 'steps': 10891, 'loss/train': 0.494836762547493} 01/27/2022 06:06:16 - INFO - codeparrot_training - Step 10892: {'lr': 0.0004588439755289238, 'samples': 2091456, 'steps': 10892, 'loss/train': 0.7361876517534256} 01/27/2022 06:06:21 - INFO - codeparrot_training - Step 10893: {'lr': 0.00045883498097853894, 'samples': 2091648, 'steps': 10893, 'loss/train': 0.9749897718429565} 01/27/2022 06:06:24 - INFO - codeparrot_training - Step 10894: {'lr': 0.00045882598553357125, 'samples': 2091840, 'steps': 10894, 'loss/train': 1.2350395917892456} 01/27/2022 06:06:28 - INFO - codeparrot_training - Step 10895: {'lr': 0.00045881698919405937, 'samples': 2092032, 'steps': 10895, 'loss/train': 0.6496738940477371} 01/27/2022 06:06:31 - INFO - codeparrot_training - Step 10896: {'lr': 0.00045880799196004187, 'samples': 2092224, 'steps': 10896, 'loss/train': 0.5703037977218628} 01/27/2022 06:06:34 - INFO - codeparrot_training - Step 10897: {'lr': 0.00045879899383155715, 'samples': 2092416, 'steps': 10897, 'loss/train': 0.9107340574264526} 01/27/2022 06:06:37 - INFO - codeparrot_training - Step 10898: {'lr': 0.00045878999480864386, 'samples': 2092608, 'steps': 10898, 'loss/train': 0.2706909254193306} 01/27/2022 06:06:40 - INFO - codeparrot_training - Step 10899: {'lr': 0.0004587809948913406, 'samples': 2092800, 'steps': 10899, 'loss/train': 0.8156654834747314} 01/27/2022 06:06:43 - INFO - codeparrot_training - Step 10900: {'lr': 0.00045877199407968577, 'samples': 2092992, 'steps': 10900, 'loss/train': 1.0627516508102417} 01/27/2022 06:06:46 - INFO - codeparrot_training - Step 10901: {'lr': 0.00045876299237371807, 'samples': 2093184, 'steps': 10901, 'loss/train': 1.0122444927692413} 01/27/2022 06:06:51 - 
INFO - codeparrot_training - Step 10902: {'lr': 0.00045875398977347596, 'samples': 2093376, 'steps': 10902, 'loss/train': 0.9115457832813263} 01/27/2022 06:06:54 - INFO - codeparrot_training - Step 10903: {'lr': 0.00045874498627899806, 'samples': 2093568, 'steps': 10903, 'loss/train': 0.7638806104660034} 01/27/2022 06:06:57 - INFO - codeparrot_training - Step 10904: {'lr': 0.00045873598189032295, 'samples': 2093760, 'steps': 10904, 'loss/train': 0.6004537045955658} 01/27/2022 06:07:00 - INFO - codeparrot_training - Step 10905: {'lr': 0.0004587269766074891, 'samples': 2093952, 'steps': 10905, 'loss/train': 0.9382863342761993} 01/27/2022 06:07:03 - INFO - codeparrot_training - Step 10906: {'lr': 0.0004587179704305353, 'samples': 2094144, 'steps': 10906, 'loss/train': 1.0811404287815094} 01/27/2022 06:07:07 - INFO - codeparrot_training - Step 10907: {'lr': 0.00045870896335949987, 'samples': 2094336, 'steps': 10907, 'loss/train': 0.7186820358037949} 01/27/2022 06:07:10 - INFO - codeparrot_training - Step 10908: {'lr': 0.00045869995539442153, 'samples': 2094528, 'steps': 10908, 'loss/train': 0.9318822920322418} 01/27/2022 06:07:13 - INFO - codeparrot_training - Step 10909: {'lr': 0.0004586909465353388, 'samples': 2094720, 'steps': 10909, 'loss/train': 0.9239306151866913} 01/27/2022 06:07:16 - INFO - codeparrot_training - Step 10910: {'lr': 0.0004586819367822904, 'samples': 2094912, 'steps': 10910, 'loss/train': 0.8490882217884064} 01/27/2022 06:07:20 - INFO - codeparrot_training - Step 10911: {'lr': 0.00045867292613531484, 'samples': 2095104, 'steps': 10911, 'loss/train': 0.4134870618581772} 01/27/2022 06:07:24 - INFO - codeparrot_training - Step 10912: {'lr': 0.0004586639145944508, 'samples': 2095296, 'steps': 10912, 'loss/train': 0.8445818424224854} 01/27/2022 06:07:27 - INFO - codeparrot_training - Step 10913: {'lr': 0.0004586549021597367, 'samples': 2095488, 'steps': 10913, 'loss/train': 1.087411344051361} 01/27/2022 06:07:30 - INFO - codeparrot_training - Step 10914: {'lr': 0.00045864588883121125, 'samples': 2095680, 'steps': 10914, 'loss/train': 0.8352998793125153} 01/27/2022 06:07:33 - INFO - codeparrot_training - Step 10915: {'lr': 0.00045863687460891313, 'samples': 2095872, 'steps': 10915, 'loss/train': 0.5174917280673981} 01/27/2022 06:07:36 - INFO - codeparrot_training - Step 10916: {'lr': 0.0004586278594928808, 'samples': 2096064, 'steps': 10916, 'loss/train': 0.7706626653671265} 01/27/2022 06:07:39 - INFO - codeparrot_training - Step 10917: {'lr': 0.0004586188434831531, 'samples': 2096256, 'steps': 10917, 'loss/train': 0.856208324432373} 01/27/2022 06:07:42 - INFO - codeparrot_training - Step 10918: {'lr': 0.00045860982657976835, 'samples': 2096448, 'steps': 10918, 'loss/train': 0.7730249762535095} 01/27/2022 06:07:47 - INFO - codeparrot_training - Step 10919: {'lr': 0.00045860080878276546, 'samples': 2096640, 'steps': 10919, 'loss/train': 1.3399277329444885} 01/27/2022 06:07:50 - INFO - codeparrot_training - Step 10920: {'lr': 0.0004585917900921829, 'samples': 2096832, 'steps': 10920, 'loss/train': 0.7051072418689728} 01/27/2022 06:07:53 - INFO - codeparrot_training - Step 10921: {'lr': 0.0004585827705080594, 'samples': 2097024, 'steps': 10921, 'loss/train': 0.9666065275669098} 01/27/2022 06:07:56 - INFO - codeparrot_training - Step 10922: {'lr': 0.0004585737500304335, 'samples': 2097216, 'steps': 10922, 'loss/train': 0.635846883058548} 01/27/2022 06:07:59 - INFO - codeparrot_training - Step 10923: {'lr': 0.0004585647286593439, 'samples': 2097408, 'steps': 10923, 'loss/train': 
0.9097277820110321} 01/27/2022 06:08:02 - INFO - codeparrot_training - Step 10924: {'lr': 0.0004585557063948292, 'samples': 2097600, 'steps': 10924, 'loss/train': 0.8773861527442932} 01/27/2022 06:08:06 - INFO - codeparrot_training - Step 10925: {'lr': 0.00045854668323692813, 'samples': 2097792, 'steps': 10925, 'loss/train': 0.8507674634456635} 01/27/2022 06:08:09 - INFO - codeparrot_training - Step 10926: {'lr': 0.00045853765918567926, 'samples': 2097984, 'steps': 10926, 'loss/train': 0.5444082766771317} 01/27/2022 06:08:12 - INFO - codeparrot_training - Step 10927: {'lr': 0.00045852863424112125, 'samples': 2098176, 'steps': 10927, 'loss/train': 1.0700038969516754} 01/27/2022 06:08:17 - INFO - codeparrot_training - Step 10928: {'lr': 0.0004585196084032928, 'samples': 2098368, 'steps': 10928, 'loss/train': 1.166175127029419} 01/27/2022 06:08:20 - INFO - codeparrot_training - Step 10929: {'lr': 0.0004585105816722326, 'samples': 2098560, 'steps': 10929, 'loss/train': 0.8955007195472717} 01/27/2022 06:08:23 - INFO - codeparrot_training - Step 10930: {'lr': 0.0004585015540479792, 'samples': 2098752, 'steps': 10930, 'loss/train': 0.4644475132226944} 01/27/2022 06:08:26 - INFO - codeparrot_training - Step 10931: {'lr': 0.00045849252553057144, 'samples': 2098944, 'steps': 10931, 'loss/train': 0.6177027672529221} 01/27/2022 06:08:29 - INFO - codeparrot_training - Step 10932: {'lr': 0.00045848349612004786, 'samples': 2099136, 'steps': 10932, 'loss/train': 0.717121347784996} 01/27/2022 06:08:33 - INFO - codeparrot_training - Step 10933: {'lr': 0.0004584744658164472, 'samples': 2099328, 'steps': 10933, 'loss/train': 0.5708117634057999} 01/27/2022 06:08:36 - INFO - codeparrot_training - Step 10934: {'lr': 0.00045846543461980805, 'samples': 2099520, 'steps': 10934, 'loss/train': 0.965813398361206} 01/27/2022 06:08:39 - INFO - codeparrot_training - Step 10935: {'lr': 0.0004584564025301693, 'samples': 2099712, 'steps': 10935, 'loss/train': 0.9265420138835907} 01/27/2022 06:08:42 - INFO - codeparrot_training - Step 10936: {'lr': 0.00045844736954756937, 'samples': 2099904, 'steps': 10936, 'loss/train': 1.2174130082130432} 01/27/2022 06:08:47 - INFO - codeparrot_training - Step 10937: {'lr': 0.0004584383356720472, 'samples': 2100096, 'steps': 10937, 'loss/train': 1.0840289890766144} 01/27/2022 06:08:50 - INFO - codeparrot_training - Step 10938: {'lr': 0.0004584293009036414, 'samples': 2100288, 'steps': 10938, 'loss/train': 0.09184769913554192} 01/27/2022 06:08:53 - INFO - codeparrot_training - Step 10939: {'lr': 0.0004584202652423906, 'samples': 2100480, 'steps': 10939, 'loss/train': 0.6824610382318497} 01/27/2022 06:08:56 - INFO - codeparrot_training - Step 10940: {'lr': 0.0004584112286883336, 'samples': 2100672, 'steps': 10940, 'loss/train': 0.9464736878871918} 01/27/2022 06:08:59 - INFO - codeparrot_training - Step 10941: {'lr': 0.00045840219124150907, 'samples': 2100864, 'steps': 10941, 'loss/train': 0.9221742153167725} 01/27/2022 06:09:02 - INFO - codeparrot_training - Step 10942: {'lr': 0.0004583931529019557, 'samples': 2101056, 'steps': 10942, 'loss/train': 0.26224688440561295} 01/27/2022 06:09:06 - INFO - codeparrot_training - Step 10943: {'lr': 0.00045838411366971225, 'samples': 2101248, 'steps': 10943, 'loss/train': 0.7491223365068436} 01/27/2022 06:09:09 - INFO - codeparrot_training - Step 10944: {'lr': 0.00045837507354481744, 'samples': 2101440, 'steps': 10944, 'loss/train': 0.6880353391170502} 01/27/2022 06:09:12 - INFO - codeparrot_training - Step 10945: {'lr': 0.00045836603252731004, 
'samples': 2101632, 'steps': 10945, 'loss/train': 0.8709147870540619} 01/27/2022 06:09:17 - INFO - codeparrot_training - Step 10946: {'lr': 0.0004583569906172286, 'samples': 2101824, 'steps': 10946, 'loss/train': 0.3437366411089897} 01/27/2022 06:09:20 - INFO - codeparrot_training - Step 10947: {'lr': 0.000458347947814612, 'samples': 2102016, 'steps': 10947, 'loss/train': 0.42021721601486206} 01/27/2022 06:09:23 - INFO - codeparrot_training - Step 10948: {'lr': 0.00045833890411949897, 'samples': 2102208, 'steps': 10948, 'loss/train': 0.7929025590419769} 01/27/2022 06:09:27 - INFO - codeparrot_training - Step 10949: {'lr': 0.0004583298595319283, 'samples': 2102400, 'steps': 10949, 'loss/train': 0.32587046921253204} 01/27/2022 06:09:30 - INFO - codeparrot_training - Step 10950: {'lr': 0.0004583208140519386, 'samples': 2102592, 'steps': 10950, 'loss/train': 0.5663302838802338} 01/27/2022 06:09:33 - INFO - codeparrot_training - Step 10951: {'lr': 0.00045831176767956866, 'samples': 2102784, 'steps': 10951, 'loss/train': 0.44731856882572174} 01/27/2022 06:09:36 - INFO - codeparrot_training - Step 10952: {'lr': 0.0004583027204148573, 'samples': 2102976, 'steps': 10952, 'loss/train': 1.3880697190761566} 01/27/2022 06:09:39 - INFO - codeparrot_training - Step 10953: {'lr': 0.00045829367225784317, 'samples': 2103168, 'steps': 10953, 'loss/train': 0.82862788438797} 01/27/2022 06:09:44 - INFO - codeparrot_training - Step 10954: {'lr': 0.0004582846232085651, 'samples': 2103360, 'steps': 10954, 'loss/train': 0.959164559841156} 01/27/2022 06:09:47 - INFO - codeparrot_training - Step 10955: {'lr': 0.0004582755732670619, 'samples': 2103552, 'steps': 10955, 'loss/train': 0.4429623931646347} 01/27/2022 06:09:50 - INFO - codeparrot_training - Step 10956: {'lr': 0.00045826652243337226, 'samples': 2103744, 'steps': 10956, 'loss/train': 1.2320116460323334} 01/27/2022 06:09:53 - INFO - codeparrot_training - Step 10957: {'lr': 0.0004582574707075349, 'samples': 2103936, 'steps': 10957, 'loss/train': 0.807620108127594} 01/27/2022 06:09:56 - INFO - codeparrot_training - Step 10958: {'lr': 0.00045824841808958874, 'samples': 2104128, 'steps': 10958, 'loss/train': 0.3290173038840294} 01/27/2022 06:09:59 - INFO - codeparrot_training - Step 10959: {'lr': 0.0004582393645795725, 'samples': 2104320, 'steps': 10959, 'loss/train': 0.8775820434093475} 01/27/2022 06:10:02 - INFO - codeparrot_training - Step 10960: {'lr': 0.00045823031017752484, 'samples': 2104512, 'steps': 10960, 'loss/train': 1.1985101401805878} 01/27/2022 06:10:06 - INFO - codeparrot_training - Step 10961: {'lr': 0.00045822125488348474, 'samples': 2104704, 'steps': 10961, 'loss/train': 0.9106201529502869} 01/27/2022 06:10:09 - INFO - codeparrot_training - Step 10962: {'lr': 0.00045821219869749086, 'samples': 2104896, 'steps': 10962, 'loss/train': 0.6341353207826614} 01/27/2022 06:10:13 - INFO - codeparrot_training - Step 10963: {'lr': 0.00045820314161958207, 'samples': 2105088, 'steps': 10963, 'loss/train': 0.4910050928592682} 01/27/2022 06:10:16 - INFO - codeparrot_training - Step 10964: {'lr': 0.00045819408364979714, 'samples': 2105280, 'steps': 10964, 'loss/train': 0.9129577875137329} 01/27/2022 06:10:20 - INFO - codeparrot_training - Step 10965: {'lr': 0.0004581850247881749, 'samples': 2105472, 'steps': 10965, 'loss/train': 0.5541746020317078} 01/27/2022 06:10:23 - INFO - codeparrot_training - Step 10966: {'lr': 0.000458175965034754, 'samples': 2105664, 'steps': 10966, 'loss/train': 0.7555620968341827} 01/27/2022 06:10:26 - INFO - codeparrot_training - 
Step 10967: {'lr': 0.0004581669043895734, 'samples': 2105856, 'steps': 10967, 'loss/train': 0.08096488751471043} 01/27/2022 06:10:29 - INFO - codeparrot_training - Step 10968: {'lr': 0.000458157842852672, 'samples': 2106048, 'steps': 10968, 'loss/train': 0.948307603597641} 01/27/2022 06:10:32 - INFO - codeparrot_training - Step 10969: {'lr': 0.0004581487804240884, 'samples': 2106240, 'steps': 10969, 'loss/train': 0.9691860973834991} 01/27/2022 06:10:35 - INFO - codeparrot_training - Step 10970: {'lr': 0.00045813971710386147, 'samples': 2106432, 'steps': 10970, 'loss/train': 0.6659877151250839} 01/27/2022 06:10:38 - INFO - codeparrot_training - Step 10971: {'lr': 0.0004581306528920302, 'samples': 2106624, 'steps': 10971, 'loss/train': 0.8763467073440552} 01/27/2022 06:10:43 - INFO - codeparrot_training - Step 10972: {'lr': 0.0004581215877886332, 'samples': 2106816, 'steps': 10972, 'loss/train': 0.5779235064983368} 01/27/2022 06:10:47 - INFO - codeparrot_training - Step 10973: {'lr': 0.0004581125217937095, 'samples': 2107008, 'steps': 10973, 'loss/train': 0.8721256256103516} 01/27/2022 06:10:50 - INFO - codeparrot_training - Step 10974: {'lr': 0.00045810345490729777, 'samples': 2107200, 'steps': 10974, 'loss/train': 0.6561562120914459} 01/27/2022 06:10:53 - INFO - codeparrot_training - Step 10975: {'lr': 0.00045809438712943694, 'samples': 2107392, 'steps': 10975, 'loss/train': 0.8563551306724548} 01/27/2022 06:10:56 - INFO - codeparrot_training - Step 10976: {'lr': 0.0004580853184601659, 'samples': 2107584, 'steps': 10976, 'loss/train': 0.6397163718938828} 01/27/2022 06:10:59 - INFO - codeparrot_training - Step 10977: {'lr': 0.00045807624889952336, 'samples': 2107776, 'steps': 10977, 'loss/train': 0.4974237084388733} 01/27/2022 06:11:02 - INFO - codeparrot_training - Step 10978: {'lr': 0.0004580671784475482, 'samples': 2107968, 'steps': 10978, 'loss/train': 0.5357603430747986} 01/27/2022 06:11:05 - INFO - codeparrot_training - Step 10979: {'lr': 0.0004580581071042794, 'samples': 2108160, 'steps': 10979, 'loss/train': 0.77088662981987} 01/27/2022 06:11:09 - INFO - codeparrot_training - Step 10980: {'lr': 0.00045804903486975566, 'samples': 2108352, 'steps': 10980, 'loss/train': 0.5412486791610718} 01/27/2022 06:11:13 - INFO - codeparrot_training - Step 10981: {'lr': 0.00045803996174401595, 'samples': 2108544, 'steps': 10981, 'loss/train': 0.7287245839834213} 01/27/2022 06:11:16 - INFO - codeparrot_training - Step 10982: {'lr': 0.00045803088772709914, 'samples': 2108736, 'steps': 10982, 'loss/train': 0.945572018623352} 01/27/2022 06:11:20 - INFO - codeparrot_training - Step 10983: {'lr': 0.00045802181281904403, 'samples': 2108928, 'steps': 10983, 'loss/train': 1.308431714773178} 01/27/2022 06:11:23 - INFO - codeparrot_training - Step 10984: {'lr': 0.00045801273701988955, 'samples': 2109120, 'steps': 10984, 'loss/train': 0.423999086022377} 01/27/2022 06:11:26 - INFO - codeparrot_training - Step 10985: {'lr': 0.0004580036603296746, 'samples': 2109312, 'steps': 10985, 'loss/train': 0.8232463002204895} 01/27/2022 06:11:29 - INFO - codeparrot_training - Step 10986: {'lr': 0.00045799458274843786, 'samples': 2109504, 'steps': 10986, 'loss/train': 1.3526667058467865} 01/27/2022 06:11:32 - INFO - codeparrot_training - Step 10987: {'lr': 0.0004579855042762185, 'samples': 2109696, 'steps': 10987, 'loss/train': 0.309976190328598} 01/27/2022 06:11:35 - INFO - codeparrot_training - Step 10988: {'lr': 0.00045797642491305523, 'samples': 2109888, 'steps': 10988, 'loss/train': 1.5418084859848022} 01/27/2022 
06:11:38 - INFO - codeparrot_training - Step 10989: {'lr': 0.00045796734465898705, 'samples': 2110080, 'steps': 10989, 'loss/train': 1.1615184545516968} 01/27/2022 06:11:43 - INFO - codeparrot_training - Step 10990: {'lr': 0.00045795826351405276, 'samples': 2110272, 'steps': 10990, 'loss/train': 0.8969757556915283} 01/27/2022 06:11:46 - INFO - codeparrot_training - Step 10991: {'lr': 0.00045794918147829135, 'samples': 2110464, 'steps': 10991, 'loss/train': 0.98114213347435} 01/27/2022 06:11:49 - INFO - codeparrot_training - Step 10992: {'lr': 0.00045794009855174163, 'samples': 2110656, 'steps': 10992, 'loss/train': 0.9817461669445038} 01/27/2022 06:11:52 - INFO - codeparrot_training - Step 10993: {'lr': 0.0004579310147344425, 'samples': 2110848, 'steps': 10993, 'loss/train': 0.969245195388794} 01/27/2022 06:11:55 - INFO - codeparrot_training - Step 10994: {'lr': 0.000457921930026433, 'samples': 2111040, 'steps': 10994, 'loss/train': 0.8910101652145386} 01/27/2022 06:11:58 - INFO - codeparrot_training - Step 10995: {'lr': 0.00045791284442775205, 'samples': 2111232, 'steps': 10995, 'loss/train': 0.7725716829299927} 01/27/2022 06:12:02 - INFO - codeparrot_training - Step 10996: {'lr': 0.0004579037579384384, 'samples': 2111424, 'steps': 10996, 'loss/train': 1.0149723887443542} 01/27/2022 06:12:05 - INFO - codeparrot_training - Step 10997: {'lr': 0.00045789467055853104, 'samples': 2111616, 'steps': 10997, 'loss/train': 0.769051730632782} 01/27/2022 06:12:10 - INFO - codeparrot_training - Step 10998: {'lr': 0.000457885582288069, 'samples': 2111808, 'steps': 10998, 'loss/train': 0.6601852029561996} 01/27/2022 06:12:13 - INFO - codeparrot_training - Step 10999: {'lr': 0.0004578764931270911, 'samples': 2112000, 'steps': 10999, 'loss/train': 0.9785732924938202} 01/27/2022 06:12:16 - INFO - codeparrot_training - Step 11000: {'lr': 0.00045786740307563633, 'samples': 2112192, 'steps': 11000, 'loss/train': 0.704647421836853} 01/27/2022 06:12:19 - INFO - codeparrot_training - Step 11001: {'lr': 0.0004578583121337436, 'samples': 2112384, 'steps': 11001, 'loss/train': 0.9863539338111877} 01/27/2022 06:12:23 - INFO - codeparrot_training - Step 11002: {'lr': 0.0004578492203014518, 'samples': 2112576, 'steps': 11002, 'loss/train': 1.0164225697517395} 01/27/2022 06:12:26 - INFO - codeparrot_training - Step 11003: {'lr': 0.00045784012757880006, 'samples': 2112768, 'steps': 11003, 'loss/train': 0.7576990127563477} 01/27/2022 06:12:29 - INFO - codeparrot_training - Step 11004: {'lr': 0.00045783103396582713, 'samples': 2112960, 'steps': 11004, 'loss/train': 1.0711896121501923} 01/27/2022 06:12:32 - INFO - codeparrot_training - Step 11005: {'lr': 0.0004578219394625721, 'samples': 2113152, 'steps': 11005, 'loss/train': 0.4522624611854553} 01/27/2022 06:12:35 - INFO - codeparrot_training - Step 11006: {'lr': 0.0004578128440690738, 'samples': 2113344, 'steps': 11006, 'loss/train': 0.38902468979358673} 01/27/2022 06:12:40 - INFO - codeparrot_training - Step 11007: {'lr': 0.00045780374778537134, 'samples': 2113536, 'steps': 11007, 'loss/train': 0.3098374307155609} 01/27/2022 06:12:43 - INFO - codeparrot_training - Step 11008: {'lr': 0.00045779465061150356, 'samples': 2113728, 'steps': 11008, 'loss/train': 0.7555175721645355} 01/27/2022 06:12:46 - INFO - codeparrot_training - Step 11009: {'lr': 0.0004577855525475095, 'samples': 2113920, 'steps': 11009, 'loss/train': 0.4525063633918762} 01/27/2022 06:12:49 - INFO - codeparrot_training - Step 11010: {'lr': 0.0004577764535934281, 'samples': 2114112, 'steps': 11010, 
'loss/train': 0.9255960881710052} 01/27/2022 06:12:52 - INFO - codeparrot_training - Step 11011: {'lr': 0.00045776735374929834, 'samples': 2114304, 'steps': 11011, 'loss/train': 0.5963398665189743} 01/27/2022 06:12:55 - INFO - codeparrot_training - Step 11012: {'lr': 0.00045775825301515923, 'samples': 2114496, 'steps': 11012, 'loss/train': 0.5027071684598923} 01/27/2022 06:12:58 - INFO - codeparrot_training - Step 11013: {'lr': 0.00045774915139104973, 'samples': 2114688, 'steps': 11013, 'loss/train': 1.3625040650367737} 01/27/2022 06:13:02 - INFO - codeparrot_training - Step 11014: {'lr': 0.0004577400488770088, 'samples': 2114880, 'steps': 11014, 'loss/train': 0.7791140377521515} 01/27/2022 06:13:05 - INFO - codeparrot_training - Step 11015: {'lr': 0.0004577309454730755, 'samples': 2115072, 'steps': 11015, 'loss/train': 1.0729951858520508} 01/27/2022 06:13:09 - INFO - codeparrot_training - Step 11016: {'lr': 0.00045772184117928884, 'samples': 2115264, 'steps': 11016, 'loss/train': 0.9647801220417023} 01/27/2022 06:13:13 - INFO - codeparrot_training - Step 11017: {'lr': 0.00045771273599568767, 'samples': 2115456, 'steps': 11017, 'loss/train': 0.8677554130554199} 01/27/2022 06:13:16 - INFO - codeparrot_training - Step 11018: {'lr': 0.0004577036299223112, 'samples': 2115648, 'steps': 11018, 'loss/train': 0.9215275347232819} 01/27/2022 06:13:19 - INFO - codeparrot_training - Step 11019: {'lr': 0.0004576945229591982, 'samples': 2115840, 'steps': 11019, 'loss/train': 0.68598572909832} 01/27/2022 06:13:22 - INFO - codeparrot_training - Step 11020: {'lr': 0.0004576854151063879, 'samples': 2116032, 'steps': 11020, 'loss/train': 0.7574905157089233} 01/27/2022 06:13:25 - INFO - codeparrot_training - Step 11021: {'lr': 0.0004576763063639192, 'samples': 2116224, 'steps': 11021, 'loss/train': 1.13506418466568} 01/27/2022 06:13:28 - INFO - codeparrot_training - Step 11022: {'lr': 0.0004576671967318312, 'samples': 2116416, 'steps': 11022, 'loss/train': 0.8490128517150879} 01/27/2022 06:13:31 - INFO - codeparrot_training - Step 11023: {'lr': 0.0004576580862101628, 'samples': 2116608, 'steps': 11023, 'loss/train': 0.24848446995019913} 01/27/2022 06:13:35 - INFO - codeparrot_training - Step 11024: {'lr': 0.00045764897479895315, 'samples': 2116800, 'steps': 11024, 'loss/train': 0.8985768556594849} 01/27/2022 06:13:40 - INFO - codeparrot_training - Step 11025: {'lr': 0.00045763986249824126, 'samples': 2116992, 'steps': 11025, 'loss/train': 0.8483946919441223} 01/27/2022 06:13:43 - INFO - codeparrot_training - Step 11026: {'lr': 0.00045763074930806606, 'samples': 2117184, 'steps': 11026, 'loss/train': 1.186758577823639} 01/27/2022 06:13:46 - INFO - codeparrot_training - Step 11027: {'lr': 0.0004576216352284667, 'samples': 2117376, 'steps': 11027, 'loss/train': 1.0283474028110504} 01/27/2022 06:13:49 - INFO - codeparrot_training - Step 11028: {'lr': 0.0004576125202594822, 'samples': 2117568, 'steps': 11028, 'loss/train': 1.0622069835662842} 01/27/2022 06:13:52 - INFO - codeparrot_training - Step 11029: {'lr': 0.0004576034044011515, 'samples': 2117760, 'steps': 11029, 'loss/train': 0.6700707525014877} 01/27/2022 06:13:55 - INFO - codeparrot_training - Step 11030: {'lr': 0.00045759428765351377, 'samples': 2117952, 'steps': 11030, 'loss/train': 0.8039137423038483} 01/27/2022 06:13:59 - INFO - codeparrot_training - Step 11031: {'lr': 0.0004575851700166081, 'samples': 2118144, 'steps': 11031, 'loss/train': 0.718399778008461} 01/27/2022 06:14:02 - INFO - codeparrot_training - Step 11032: {'lr': 0.0004575760514904734, 
'samples': 2118336, 'steps': 11032, 'loss/train': 0.6554349660873413} 01/27/2022 06:14:06 - INFO - codeparrot_training - Step 11033: {'lr': 0.0004575669320751489, 'samples': 2118528, 'steps': 11033, 'loss/train': 0.9458209276199341} 01/27/2022 06:14:09 - INFO - codeparrot_training - Step 11034: {'lr': 0.00045755781177067345, 'samples': 2118720, 'steps': 11034, 'loss/train': 0.6815347373485565} 01/27/2022 06:14:12 - INFO - codeparrot_training - Step 11035: {'lr': 0.00045754869057708635, 'samples': 2118912, 'steps': 11035, 'loss/train': 0.9658701717853546} 01/27/2022 06:14:16 - INFO - codeparrot_training - Step 11036: {'lr': 0.00045753956849442647, 'samples': 2119104, 'steps': 11036, 'loss/train': 0.7581861913204193} 01/27/2022 06:14:19 - INFO - codeparrot_training - Step 11037: {'lr': 0.00045753044552273306, 'samples': 2119296, 'steps': 11037, 'loss/train': 1.0953694581985474} 01/27/2022 06:14:22 - INFO - codeparrot_training - Step 11038: {'lr': 0.0004575213216620451, 'samples': 2119488, 'steps': 11038, 'loss/train': 0.8577906489372253} 01/27/2022 06:14:25 - INFO - codeparrot_training - Step 11039: {'lr': 0.0004575121969124016, 'samples': 2119680, 'steps': 11039, 'loss/train': 0.890349805355072} 01/27/2022 06:14:28 - INFO - codeparrot_training - Step 11040: {'lr': 0.00045750307127384186, 'samples': 2119872, 'steps': 11040, 'loss/train': 0.8501847088336945} 01/27/2022 06:14:31 - INFO - codeparrot_training - Step 11041: {'lr': 0.0004574939447464048, 'samples': 2120064, 'steps': 11041, 'loss/train': 1.1038071513175964} 01/27/2022 06:14:36 - INFO - codeparrot_training - Step 11042: {'lr': 0.0004574848173301296, 'samples': 2120256, 'steps': 11042, 'loss/train': 0.08340455777943134} 01/27/2022 06:14:39 - INFO - codeparrot_training - Step 11043: {'lr': 0.0004574756890250553, 'samples': 2120448, 'steps': 11043, 'loss/train': 1.1344411075115204} 01/27/2022 06:14:42 - INFO - codeparrot_training - Step 11044: {'lr': 0.00045746655983122105, 'samples': 2120640, 'steps': 11044, 'loss/train': 0.08640438504517078} 01/27/2022 06:14:45 - INFO - codeparrot_training - Step 11045: {'lr': 0.0004574574297486659, 'samples': 2120832, 'steps': 11045, 'loss/train': 0.8632813990116119} 01/27/2022 06:14:49 - INFO - codeparrot_training - Step 11046: {'lr': 0.00045744829877742907, 'samples': 2121024, 'steps': 11046, 'loss/train': 0.8674517869949341} 01/27/2022 06:14:52 - INFO - codeparrot_training - Step 11047: {'lr': 0.0004574391669175495, 'samples': 2121216, 'steps': 11047, 'loss/train': 0.7812692821025848} 01/27/2022 06:14:55 - INFO - codeparrot_training - Step 11048: {'lr': 0.0004574300341690665, 'samples': 2121408, 'steps': 11048, 'loss/train': 0.8340987861156464} 01/27/2022 06:14:58 - INFO - codeparrot_training - Step 11049: {'lr': 0.000457420900532019, 'samples': 2121600, 'steps': 11049, 'loss/train': 1.2001797258853912} 01/27/2022 06:15:01 - INFO - codeparrot_training - Step 11050: {'lr': 0.0004574117660064463, 'samples': 2121792, 'steps': 11050, 'loss/train': 0.7347110956907272} 01/27/2022 06:15:06 - INFO - codeparrot_training - Step 11051: {'lr': 0.0004574026305923875, 'samples': 2121984, 'steps': 11051, 'loss/train': 0.8758536279201508} 01/27/2022 06:15:09 - INFO - codeparrot_training - Step 11052: {'lr': 0.0004573934942898816, 'samples': 2122176, 'steps': 11052, 'loss/train': 1.1258998811244965} 01/27/2022 06:15:13 - INFO - codeparrot_training - Step 11053: {'lr': 0.0004573843570989679, 'samples': 2122368, 'steps': 11053, 'loss/train': 0.6889818012714386} 01/27/2022 06:15:16 - INFO - codeparrot_training - 
Step 11054: {'lr': 0.00045737521901968535, 'samples': 2122560, 'steps': 11054, 'loss/train': 0.7817571759223938} 01/27/2022 06:15:19 - INFO - codeparrot_training - Step 11055: {'lr': 0.00045736608005207327, 'samples': 2122752, 'steps': 11055, 'loss/train': 0.8549469709396362} 01/27/2022 06:15:22 - INFO - codeparrot_training - Step 11056: {'lr': 0.0004573569401961708, 'samples': 2122944, 'steps': 11056, 'loss/train': 0.8951388001441956} 01/27/2022 06:15:25 - INFO - codeparrot_training - Step 11057: {'lr': 0.000457347799452017, 'samples': 2123136, 'steps': 11057, 'loss/train': 0.5874244272708893} 01/27/2022 06:15:28 - INFO - codeparrot_training - Step 11058: {'lr': 0.000457338657819651, 'samples': 2123328, 'steps': 11058, 'loss/train': 0.710011750459671} 01/27/2022 06:15:31 - INFO - codeparrot_training - Step 11059: {'lr': 0.00045732951529911216, 'samples': 2123520, 'steps': 11059, 'loss/train': 0.8029939234256744} 01/27/2022 06:15:36 - INFO - codeparrot_training - Step 11060: {'lr': 0.0004573203718904394, 'samples': 2123712, 'steps': 11060, 'loss/train': 1.1819824576377869} 01/27/2022 06:15:39 - INFO - codeparrot_training - Step 11061: {'lr': 0.00045731122759367206, 'samples': 2123904, 'steps': 11061, 'loss/train': 0.873755693435669} 01/27/2022 06:15:42 - INFO - codeparrot_training - Step 11062: {'lr': 0.00045730208240884926, 'samples': 2124096, 'steps': 11062, 'loss/train': 0.9978561401367188} 01/27/2022 06:15:45 - INFO - codeparrot_training - Step 11063: {'lr': 0.0004572929363360101, 'samples': 2124288, 'steps': 11063, 'loss/train': 1.0322090685367584} 01/27/2022 06:15:48 - INFO - codeparrot_training - Step 11064: {'lr': 0.0004572837893751939, 'samples': 2124480, 'steps': 11064, 'loss/train': 1.0058347284793854} 01/27/2022 06:15:52 - INFO - codeparrot_training - Step 11065: {'lr': 0.0004572746415264397, 'samples': 2124672, 'steps': 11065, 'loss/train': 0.3111008182168007} 01/27/2022 06:15:55 - INFO - codeparrot_training - Step 11066: {'lr': 0.0004572654927897868, 'samples': 2124864, 'steps': 11066, 'loss/train': 1.1642791628837585} 01/27/2022 06:15:58 - INFO - codeparrot_training - Step 11067: {'lr': 0.0004572563431652743, 'samples': 2125056, 'steps': 11067, 'loss/train': 0.7390456795692444} 01/27/2022 06:16:02 - INFO - codeparrot_training - Step 11068: {'lr': 0.00045724719265294143, 'samples': 2125248, 'steps': 11068, 'loss/train': 0.6125679910182953} 01/27/2022 06:16:06 - INFO - codeparrot_training - Step 11069: {'lr': 0.00045723804125282744, 'samples': 2125440, 'steps': 11069, 'loss/train': 0.9725605845451355} 01/27/2022 06:16:09 - INFO - codeparrot_training - Step 11070: {'lr': 0.0004572288889649715, 'samples': 2125632, 'steps': 11070, 'loss/train': 0.7957821786403656} 01/27/2022 06:16:12 - INFO - codeparrot_training - Step 11071: {'lr': 0.00045721973578941277, 'samples': 2125824, 'steps': 11071, 'loss/train': 1.691579282283783} 01/27/2022 06:16:15 - INFO - codeparrot_training - Step 11072: {'lr': 0.00045721058172619043, 'samples': 2126016, 'steps': 11072, 'loss/train': 0.4949203133583069} 01/27/2022 06:16:18 - INFO - codeparrot_training - Step 11073: {'lr': 0.00045720142677534387, 'samples': 2126208, 'steps': 11073, 'loss/train': 1.1075099408626556} 01/27/2022 06:16:21 - INFO - codeparrot_training - Step 11074: {'lr': 0.00045719227093691216, 'samples': 2126400, 'steps': 11074, 'loss/train': 0.701694130897522} 01/27/2022 06:16:24 - INFO - codeparrot_training - Step 11075: {'lr': 0.0004571831142109345, 'samples': 2126592, 'steps': 11075, 'loss/train': 0.11410742998123169} 01/27/2022 
06:16:28 - INFO - codeparrot_training - Step 11076: {'lr': 0.0004571739565974502, 'samples': 2126784, 'steps': 11076, 'loss/train': 0.761514276266098} 01/27/2022 06:16:33 - INFO - codeparrot_training - Step 11077: {'lr': 0.0004571647980964985, 'samples': 2126976, 'steps': 11077, 'loss/train': 0.8618589341640472} 01/27/2022 06:16:36 - INFO - codeparrot_training - Step 11078: {'lr': 0.0004571556387081185, 'samples': 2127168, 'steps': 11078, 'loss/train': 1.3423528969287872} 01/27/2022 06:16:39 - INFO - codeparrot_training - Step 11079: {'lr': 0.0004571464784323496, 'samples': 2127360, 'steps': 11079, 'loss/train': 1.4499079585075378} 01/27/2022 06:16:42 - INFO - codeparrot_training - Step 11080: {'lr': 0.0004571373172692309, 'samples': 2127552, 'steps': 11080, 'loss/train': 1.101884812116623} 01/27/2022 06:16:45 - INFO - codeparrot_training - Step 11081: {'lr': 0.0004571281552188018, 'samples': 2127744, 'steps': 11081, 'loss/train': 0.723779171705246} 01/27/2022 06:16:48 - INFO - codeparrot_training - Step 11082: {'lr': 0.0004571189922811013, 'samples': 2127936, 'steps': 11082, 'loss/train': 1.0835031867027283} 01/27/2022 06:16:51 - INFO - codeparrot_training - Step 11083: {'lr': 0.00045710982845616893, 'samples': 2128128, 'steps': 11083, 'loss/train': 0.7873824834823608} 01/27/2022 06:16:54 - INFO - codeparrot_training - Step 11084: {'lr': 0.0004571006637440438, 'samples': 2128320, 'steps': 11084, 'loss/train': 0.2115795761346817} 01/27/2022 06:16:58 - INFO - codeparrot_training - Step 11085: {'lr': 0.00045709149814476515, 'samples': 2128512, 'steps': 11085, 'loss/train': 1.0229496359825134} 01/27/2022 06:17:02 - INFO - codeparrot_training - Step 11086: {'lr': 0.0004570823316583723, 'samples': 2128704, 'steps': 11086, 'loss/train': 1.1983555555343628} 01/27/2022 06:17:05 - INFO - codeparrot_training - Step 11087: {'lr': 0.00045707316428490453, 'samples': 2128896, 'steps': 11087, 'loss/train': 0.9578194320201874} 01/27/2022 06:17:09 - INFO - codeparrot_training - Step 11088: {'lr': 0.0004570639960244011, 'samples': 2129088, 'steps': 11088, 'loss/train': 1.2318685054779053} 01/27/2022 06:17:12 - INFO - codeparrot_training - Step 11089: {'lr': 0.00045705482687690113, 'samples': 2129280, 'steps': 11089, 'loss/train': 0.4582032859325409} 01/27/2022 06:17:15 - INFO - codeparrot_training - Step 11090: {'lr': 0.00045704565684244415, 'samples': 2129472, 'steps': 11090, 'loss/train': 0.9761954247951508} 01/27/2022 06:17:18 - INFO - codeparrot_training - Step 11091: {'lr': 0.0004570364859210693, 'samples': 2129664, 'steps': 11091, 'loss/train': 0.8474090695381165} 01/27/2022 06:17:21 - INFO - codeparrot_training - Step 11092: {'lr': 0.0004570273141128158, 'samples': 2129856, 'steps': 11092, 'loss/train': 0.6954598277807236} 01/27/2022 06:17:24 - INFO - codeparrot_training - Step 11093: {'lr': 0.00045701814141772313, 'samples': 2130048, 'steps': 11093, 'loss/train': 0.7487005591392517} 01/27/2022 06:17:27 - INFO - codeparrot_training - Step 11094: {'lr': 0.0004570089678358305, 'samples': 2130240, 'steps': 11094, 'loss/train': 0.6938502341508865} 01/27/2022 06:17:31 - INFO - codeparrot_training - Step 11095: {'lr': 0.000456999793367177, 'samples': 2130432, 'steps': 11095, 'loss/train': 0.7556215524673462} 01/27/2022 06:17:36 - INFO - codeparrot_training - Step 11096: {'lr': 0.0004569906180118023, 'samples': 2130624, 'steps': 11096, 'loss/train': 0.4063669592142105} 01/27/2022 06:17:39 - INFO - codeparrot_training - Step 11097: {'lr': 0.0004569814417697454, 'samples': 2130816, 'steps': 11097, 
'loss/train': 1.2298841178417206} 01/27/2022 06:17:42 - INFO - codeparrot_training - Step 11098: {'lr': 0.0004569722646410458, 'samples': 2131008, 'steps': 11098, 'loss/train': 0.7459423094987869} 01/27/2022 06:17:45 - INFO - codeparrot_training - Step 11099: {'lr': 0.0004569630866257428, 'samples': 2131200, 'steps': 11099, 'loss/train': 1.0592408180236816} 01/27/2022 06:17:48 - INFO - codeparrot_training - Step 11100: {'lr': 0.00045695390772387557, 'samples': 2131392, 'steps': 11100, 'loss/train': 0.2347136288881302} 01/27/2022 06:17:51 - INFO - codeparrot_training - Step 11101: {'lr': 0.00045694472793548346, 'samples': 2131584, 'steps': 11101, 'loss/train': 0.870717465877533} 01/27/2022 06:17:55 - INFO - codeparrot_training - Step 11102: {'lr': 0.0004569355472606059, 'samples': 2131776, 'steps': 11102, 'loss/train': 1.2002482116222382} 01/27/2022 06:17:58 - INFO - codeparrot_training - Step 11103: {'lr': 0.0004569263656992822, 'samples': 2131968, 'steps': 11103, 'loss/train': 1.1494206190109253} 01/27/2022 06:18:02 - INFO - codeparrot_training - Step 11104: {'lr': 0.0004569171832515516, 'samples': 2132160, 'steps': 11104, 'loss/train': 1.1311349272727966} 01/27/2022 06:18:05 - INFO - codeparrot_training - Step 11105: {'lr': 0.0004569079999174536, 'samples': 2132352, 'steps': 11105, 'loss/train': 1.106175720691681} 01/27/2022 06:18:08 - INFO - codeparrot_training - Step 11106: {'lr': 0.0004568988156970273, 'samples': 2132544, 'steps': 11106, 'loss/train': 0.764539361000061} 01/27/2022 06:18:12 - INFO - codeparrot_training - Step 11107: {'lr': 0.00045688963059031226, 'samples': 2132736, 'steps': 11107, 'loss/train': 0.24651335924863815} 01/27/2022 06:18:15 - INFO - codeparrot_training - Step 11108: {'lr': 0.00045688044459734766, 'samples': 2132928, 'steps': 11108, 'loss/train': 0.586223691701889} 01/27/2022 06:18:18 - INFO - codeparrot_training - Step 11109: {'lr': 0.00045687125771817294, 'samples': 2133120, 'steps': 11109, 'loss/train': 0.4541427791118622} 01/27/2022 06:18:21 - INFO - codeparrot_training - Step 11110: {'lr': 0.00045686206995282754, 'samples': 2133312, 'steps': 11110, 'loss/train': 0.8137856125831604} 01/27/2022 06:18:24 - INFO - codeparrot_training - Step 11111: {'lr': 0.00045685288130135063, 'samples': 2133504, 'steps': 11111, 'loss/train': 0.9425930678844452} 01/27/2022 06:18:27 - INFO - codeparrot_training - Step 11112: {'lr': 0.00045684369176378164, 'samples': 2133696, 'steps': 11112, 'loss/train': 1.238666296005249} 01/27/2022 06:18:32 - INFO - codeparrot_training - Step 11113: {'lr': 0.00045683450134016, 'samples': 2133888, 'steps': 11113, 'loss/train': 0.9303277730941772} 01/27/2022 06:18:35 - INFO - codeparrot_training - Step 11114: {'lr': 0.0004568253100305251, 'samples': 2134080, 'steps': 11114, 'loss/train': 1.3082170486450195} 01/27/2022 06:18:38 - INFO - codeparrot_training - Step 11115: {'lr': 0.0004568161178349161, 'samples': 2134272, 'steps': 11115, 'loss/train': 0.7417206466197968} 01/27/2022 06:18:41 - INFO - codeparrot_training - Step 11116: {'lr': 0.0004568069247533726, 'samples': 2134464, 'steps': 11116, 'loss/train': 0.716804176568985} 01/27/2022 06:18:44 - INFO - codeparrot_training - Step 11117: {'lr': 0.0004567977307859339, 'samples': 2134656, 'steps': 11117, 'loss/train': 0.6819820404052734} 01/27/2022 06:18:47 - INFO - codeparrot_training - Step 11118: {'lr': 0.0004567885359326394, 'samples': 2134848, 'steps': 11118, 'loss/train': 0.8180955648422241} 01/27/2022 06:18:51 - INFO - codeparrot_training - Step 11119: {'lr': 0.00045677934019352844, 
'samples': 2135040, 'steps': 11119, 'loss/train': 0.7134448885917664} 01/27/2022 06:18:54 - INFO - codeparrot_training - Step 11120: {'lr': 0.00045677014356864043, 'samples': 2135232, 'steps': 11120, 'loss/train': 0.6107445806264877} 01/27/2022 06:18:57 - INFO - codeparrot_training - Step 11121: {'lr': 0.00045676094605801487, 'samples': 2135424, 'steps': 11121, 'loss/train': 0.09052365459501743} 01/27/2022 06:19:02 - INFO - codeparrot_training - Step 11122: {'lr': 0.00045675174766169105, 'samples': 2135616, 'steps': 11122, 'loss/train': 1.232734501361847} 01/27/2022 06:19:05 - INFO - codeparrot_training - Step 11123: {'lr': 0.0004567425483797083, 'samples': 2135808, 'steps': 11123, 'loss/train': 0.9075548350811005} 01/27/2022 06:19:08 - INFO - codeparrot_training - Step 11124: {'lr': 0.0004567333482121062, 'samples': 2136000, 'steps': 11124, 'loss/train': 1.664235770702362} 01/27/2022 06:19:11 - INFO - codeparrot_training - Step 11125: {'lr': 0.0004567241471589241, 'samples': 2136192, 'steps': 11125, 'loss/train': 1.3732185065746307} 01/27/2022 06:19:14 - INFO - codeparrot_training - Step 11126: {'lr': 0.0004567149452202013, 'samples': 2136384, 'steps': 11126, 'loss/train': 1.0628626942634583} 01/27/2022 06:19:18 - INFO - codeparrot_training - Step 11127: {'lr': 0.0004567057423959774, 'samples': 2136576, 'steps': 11127, 'loss/train': 0.8859248757362366} 01/27/2022 06:19:21 - INFO - codeparrot_training - Step 11128: {'lr': 0.0004566965386862917, 'samples': 2136768, 'steps': 11128, 'loss/train': 0.9238291382789612} 01/27/2022 06:19:24 - INFO - codeparrot_training - Step 11129: {'lr': 0.0004566873340911837, 'samples': 2136960, 'steps': 11129, 'loss/train': 0.4873410612344742} 01/27/2022 06:19:27 - INFO - codeparrot_training - Step 11130: {'lr': 0.00045667812861069275, 'samples': 2137152, 'steps': 11130, 'loss/train': 1.642317295074463} 01/27/2022 06:19:32 - INFO - codeparrot_training - Step 11131: {'lr': 0.00045666892224485836, 'samples': 2137344, 'steps': 11131, 'loss/train': 0.6833125948905945} 01/27/2022 06:19:35 - INFO - codeparrot_training - Step 11132: {'lr': 0.0004566597149937199, 'samples': 2137536, 'steps': 11132, 'loss/train': 1.1772635579109192} 01/27/2022 06:19:38 - INFO - codeparrot_training - Step 11133: {'lr': 0.0004566505068573168, 'samples': 2137728, 'steps': 11133, 'loss/train': 0.7051241844892502} 01/27/2022 06:19:42 - INFO - codeparrot_training - Step 11134: {'lr': 0.00045664129783568866, 'samples': 2137920, 'steps': 11134, 'loss/train': 0.8526126444339752} 01/27/2022 06:19:45 - INFO - codeparrot_training - Step 11135: {'lr': 0.00045663208792887474, 'samples': 2138112, 'steps': 11135, 'loss/train': 0.080328069627285} 01/27/2022 06:19:48 - INFO - codeparrot_training - Step 11136: {'lr': 0.0004566228771369146, 'samples': 2138304, 'steps': 11136, 'loss/train': 0.9282421767711639} 01/27/2022 06:19:51 - INFO - codeparrot_training - Step 11137: {'lr': 0.00045661366545984763, 'samples': 2138496, 'steps': 11137, 'loss/train': 0.9827672839164734} 01/27/2022 06:19:54 - INFO - codeparrot_training - Step 11138: {'lr': 0.00045660445289771336, 'samples': 2138688, 'steps': 11138, 'loss/train': 0.9826788604259491} 01/27/2022 06:19:57 - INFO - codeparrot_training - Step 11139: {'lr': 0.00045659523945055114, 'samples': 2138880, 'steps': 11139, 'loss/train': 0.9530911445617676} 01/27/2022 06:20:02 - INFO - codeparrot_training - Step 11140: {'lr': 0.0004565860251184006, 'samples': 2139072, 'steps': 11140, 'loss/train': 0.8702484369277954} 01/27/2022 06:20:05 - INFO - codeparrot_training - 
Step 11141: {'lr': 0.0004565768099013011, 'samples': 2139264, 'steps': 11141, 'loss/train': 0.7261873483657837} 01/27/2022 06:20:08 - INFO - codeparrot_training - Step 11142: {'lr': 0.00045656759379929213, 'samples': 2139456, 'steps': 11142, 'loss/train': 0.8096451759338379} 01/27/2022 06:20:11 - INFO - codeparrot_training - Step 11143: {'lr': 0.0004565583768124132, 'samples': 2139648, 'steps': 11143, 'loss/train': 1.1340749859809875} 01/27/2022 06:20:14 - INFO - codeparrot_training - Step 11144: {'lr': 0.0004565491589407038, 'samples': 2139840, 'steps': 11144, 'loss/train': 0.5163401216268539} 01/27/2022 06:20:17 - INFO - codeparrot_training - Step 11145: {'lr': 0.0004565399401842034, 'samples': 2140032, 'steps': 11145, 'loss/train': 0.6556723415851593} 01/27/2022 06:20:21 - INFO - codeparrot_training - Step 11146: {'lr': 0.0004565307205429514, 'samples': 2140224, 'steps': 11146, 'loss/train': 1.4975597262382507} 01/27/2022 06:20:24 - INFO - codeparrot_training - Step 11147: {'lr': 0.00045652150001698744, 'samples': 2140416, 'steps': 11147, 'loss/train': 0.6448207944631577} 01/27/2022 06:20:27 - INFO - codeparrot_training - Step 11148: {'lr': 0.00045651227860635094, 'samples': 2140608, 'steps': 11148, 'loss/train': 0.9122401177883148} 01/27/2022 06:20:32 - INFO - codeparrot_training - Step 11149: {'lr': 0.00045650305631108137, 'samples': 2140800, 'steps': 11149, 'loss/train': 1.1489852964878082} 01/27/2022 06:20:35 - INFO - codeparrot_training - Step 11150: {'lr': 0.0004564938331312183, 'samples': 2140992, 'steps': 11150, 'loss/train': 0.7297298312187195} 01/27/2022 06:20:38 - INFO - codeparrot_training - Step 11151: {'lr': 0.00045648460906680123, 'samples': 2141184, 'steps': 11151, 'loss/train': 0.7823494076728821} 01/27/2022 06:20:41 - INFO - codeparrot_training - Step 11152: {'lr': 0.00045647538411786964, 'samples': 2141376, 'steps': 11152, 'loss/train': 0.6964137107133865} 01/27/2022 06:20:44 - INFO - codeparrot_training - Step 11153: {'lr': 0.00045646615828446316, 'samples': 2141568, 'steps': 11153, 'loss/train': 0.8677049875259399} 01/27/2022 06:20:47 - INFO - codeparrot_training - Step 11154: {'lr': 0.00045645693156662104, 'samples': 2141760, 'steps': 11154, 'loss/train': 1.005569189786911} 01/27/2022 06:20:50 - INFO - codeparrot_training - Step 11155: {'lr': 0.0004564477039643831, 'samples': 2141952, 'steps': 11155, 'loss/train': 0.9227924644947052} 01/27/2022 06:20:54 - INFO - codeparrot_training - Step 11156: {'lr': 0.0004564384754777888, 'samples': 2142144, 'steps': 11156, 'loss/train': 0.6145620793104172} 01/27/2022 06:20:59 - INFO - codeparrot_training - Step 11157: {'lr': 0.0004564292461068775, 'samples': 2142336, 'steps': 11157, 'loss/train': 0.7804997563362122} 01/27/2022 06:21:02 - INFO - codeparrot_training - Step 11158: {'lr': 0.00045642001585168896, 'samples': 2142528, 'steps': 11158, 'loss/train': 0.7062917947769165} 01/27/2022 06:21:05 - INFO - codeparrot_training - Step 11159: {'lr': 0.0004564107847122626, 'samples': 2142720, 'steps': 11159, 'loss/train': 0.8283485770225525} 01/27/2022 06:21:08 - INFO - codeparrot_training - Step 11160: {'lr': 0.0004564015526886379, 'samples': 2142912, 'steps': 11160, 'loss/train': 0.6047827005386353} 01/27/2022 06:21:11 - INFO - codeparrot_training - Step 11161: {'lr': 0.0004563923197808546, 'samples': 2143104, 'steps': 11161, 'loss/train': 0.8946964144706726} 01/27/2022 06:21:14 - INFO - codeparrot_training - Step 11162: {'lr': 0.00045638308598895205, 'samples': 2143296, 'steps': 11162, 'loss/train': 1.1006185412406921} 01/27/2022 
06:21:18 - INFO - codeparrot_training - Step 11163: {'lr': 0.0004563738513129699, 'samples': 2143488, 'steps': 11163, 'loss/train': 0.9733647108078003} 01/27/2022 06:21:21 - INFO - codeparrot_training - Step 11164: {'lr': 0.0004563646157529477, 'samples': 2143680, 'steps': 11164, 'loss/train': 1.5655433535575867} 01/27/2022 06:21:24 - INFO - codeparrot_training - Step 11165: {'lr': 0.0004563553793089251, 'samples': 2143872, 'steps': 11165, 'loss/train': 0.8258061110973358} 01/27/2022 06:21:28 - INFO - codeparrot_training - Step 11166: {'lr': 0.00045634614198094154, 'samples': 2144064, 'steps': 11166, 'loss/train': 0.6038179993629456} 01/27/2022 06:21:32 - INFO - codeparrot_training - Step 11167: {'lr': 0.0004563369037690366, 'samples': 2144256, 'steps': 11167, 'loss/train': 0.35408125072717667} 01/27/2022 06:21:35 - INFO - codeparrot_training - Step 11168: {'lr': 0.00045632766467324995, 'samples': 2144448, 'steps': 11168, 'loss/train': 0.9659488499164581} 01/27/2022 06:21:38 - INFO - codeparrot_training - Step 11169: {'lr': 0.00045631842469362103, 'samples': 2144640, 'steps': 11169, 'loss/train': 1.130654364824295} 01/27/2022 06:21:41 - INFO - codeparrot_training - Step 11170: {'lr': 0.00045630918383018947, 'samples': 2144832, 'steps': 11170, 'loss/train': 1.0753424763679504} 01/27/2022 06:21:44 - INFO - codeparrot_training - Step 11171: {'lr': 0.00045629994208299496, 'samples': 2145024, 'steps': 11171, 'loss/train': 0.6933581382036209} 01/27/2022 06:21:47 - INFO - codeparrot_training - Step 11172: {'lr': 0.0004562906994520769, 'samples': 2145216, 'steps': 11172, 'loss/train': 0.9676609039306641} 01/27/2022 06:21:50 - INFO - codeparrot_training - Step 11173: {'lr': 0.0004562814559374751, 'samples': 2145408, 'steps': 11173, 'loss/train': 0.8396938741207123} 01/27/2022 06:21:53 - INFO - codeparrot_training - Step 11174: {'lr': 0.000456272211539229, 'samples': 2145600, 'steps': 11174, 'loss/train': 0.946832925081253} 01/27/2022 06:21:59 - INFO - codeparrot_training - Step 11175: {'lr': 0.00045626296625737823, 'samples': 2145792, 'steps': 11175, 'loss/train': 0.7582629024982452} 01/27/2022 06:22:02 - INFO - codeparrot_training - Step 11176: {'lr': 0.0004562537200919624, 'samples': 2145984, 'steps': 11176, 'loss/train': 0.9386052489280701} 01/27/2022 06:22:05 - INFO - codeparrot_training - Step 11177: {'lr': 0.00045624447304302117, 'samples': 2146176, 'steps': 11177, 'loss/train': 0.8572585880756378} 01/27/2022 06:22:08 - INFO - codeparrot_training - Step 11178: {'lr': 0.00045623522511059405, 'samples': 2146368, 'steps': 11178, 'loss/train': 0.7802062332630157} 01/27/2022 06:22:11 - INFO - codeparrot_training - Step 11179: {'lr': 0.00045622597629472073, 'samples': 2146560, 'steps': 11179, 'loss/train': 0.6476582139730453} 01/27/2022 06:22:14 - INFO - codeparrot_training - Step 11180: {'lr': 0.0004562167265954409, 'samples': 2146752, 'steps': 11180, 'loss/train': 0.8514670729637146} 01/27/2022 06:22:17 - INFO - codeparrot_training - Step 11181: {'lr': 0.000456207476012794, 'samples': 2146944, 'steps': 11181, 'loss/train': 1.0910285711288452} 01/27/2022 06:22:21 - INFO - codeparrot_training - Step 11182: {'lr': 0.0004561982245468198, 'samples': 2147136, 'steps': 11182, 'loss/train': 0.5400148630142212} 01/27/2022 06:22:24 - INFO - codeparrot_training - Step 11183: {'lr': 0.0004561889721975578, 'samples': 2147328, 'steps': 11183, 'loss/train': 1.2559833526611328} 01/27/2022 06:22:29 - INFO - codeparrot_training - Step 11184: {'lr': 0.0004561797189650478, 'samples': 2147520, 'steps': 11184, 
'loss/train': 1.7698287963867188} 01/27/2022 06:22:32 - INFO - codeparrot_training - Step 11185: {'lr': 0.0004561704648493293, 'samples': 2147712, 'steps': 11185, 'loss/train': 0.07156413048505783} 01/27/2022 06:22:35 - INFO - codeparrot_training - Step 11186: {'lr': 0.00045616120985044205, 'samples': 2147904, 'steps': 11186, 'loss/train': 1.3294879496097565} 01/27/2022 06:22:38 - INFO - codeparrot_training - Step 11187: {'lr': 0.0004561519539684256, 'samples': 2148096, 'steps': 11187, 'loss/train': 0.8335686922073364} 01/27/2022 06:22:41 - INFO - codeparrot_training - Step 11188: {'lr': 0.00045614269720331964, 'samples': 2148288, 'steps': 11188, 'loss/train': 0.7514711022377014} 01/27/2022 06:22:45 - INFO - codeparrot_training - Step 11189: {'lr': 0.00045613343955516386, 'samples': 2148480, 'steps': 11189, 'loss/train': 0.8336548805236816} 01/27/2022 06:22:48 - INFO - codeparrot_training - Step 11190: {'lr': 0.00045612418102399785, 'samples': 2148672, 'steps': 11190, 'loss/train': 0.9807206690311432} 01/27/2022 06:22:51 - INFO - codeparrot_training - Step 11191: {'lr': 0.00045611492160986127, 'samples': 2148864, 'steps': 11191, 'loss/train': 0.3518221825361252} 01/27/2022 06:22:54 - INFO - codeparrot_training - Step 11192: {'lr': 0.00045610566131279386, 'samples': 2149056, 'steps': 11192, 'loss/train': 0.9095037281513214} 01/27/2022 06:22:59 - INFO - codeparrot_training - Step 11193: {'lr': 0.00045609640013283525, 'samples': 2149248, 'steps': 11193, 'loss/train': 1.147481381893158} 01/27/2022 06:23:02 - INFO - codeparrot_training - Step 11194: {'lr': 0.00045608713807002507, 'samples': 2149440, 'steps': 11194, 'loss/train': 0.8194521367549896} 01/27/2022 06:23:05 - INFO - codeparrot_training - Step 11195: {'lr': 0.000456077875124403, 'samples': 2149632, 'steps': 11195, 'loss/train': 1.1392643451690674} 01/27/2022 06:23:08 - INFO - codeparrot_training - Step 11196: {'lr': 0.00045606861129600883, 'samples': 2149824, 'steps': 11196, 'loss/train': 1.0888820886611938} 01/27/2022 06:23:11 - INFO - codeparrot_training - Step 11197: {'lr': 0.00045605934658488214, 'samples': 2150016, 'steps': 11197, 'loss/train': 1.4088122248649597} 01/27/2022 06:23:15 - INFO - codeparrot_training - Step 11198: {'lr': 0.0004560500809910626, 'samples': 2150208, 'steps': 11198, 'loss/train': 0.604651540517807} 01/27/2022 06:23:18 - INFO - codeparrot_training - Step 11199: {'lr': 0.00045604081451459, 'samples': 2150400, 'steps': 11199, 'loss/train': 0.3971620202064514} 01/27/2022 06:23:21 - INFO - codeparrot_training - Step 11200: {'lr': 0.0004560315471555039, 'samples': 2150592, 'steps': 11200, 'loss/train': 1.0726843178272247} 01/27/2022 06:23:24 - INFO - codeparrot_training - Step 11201: {'lr': 0.00045602227891384416, 'samples': 2150784, 'steps': 11201, 'loss/train': 1.037003517150879} 01/27/2022 06:23:28 - INFO - codeparrot_training - Step 11202: {'lr': 0.00045601300978965033, 'samples': 2150976, 'steps': 11202, 'loss/train': 0.8800152540206909} 01/27/2022 06:23:32 - INFO - codeparrot_training - Step 11203: {'lr': 0.00045600373978296223, 'samples': 2151168, 'steps': 11203, 'loss/train': 0.7912302911281586} 01/27/2022 06:23:35 - INFO - codeparrot_training - Step 11204: {'lr': 0.0004559944688938195, 'samples': 2151360, 'steps': 11204, 'loss/train': 1.402311623096466} 01/27/2022 06:23:38 - INFO - codeparrot_training - Step 11205: {'lr': 0.0004559851971222618, 'samples': 2151552, 'steps': 11205, 'loss/train': 0.7795590162277222} 01/27/2022 06:23:41 - INFO - codeparrot_training - Step 11206: {'lr': 
0.00045597592446832905, 'samples': 2151744, 'steps': 11206, 'loss/train': 0.7053898125886917} 01/27/2022 06:23:44 - INFO - codeparrot_training - Step 11207: {'lr': 0.0004559666509320608, 'samples': 2151936, 'steps': 11207, 'loss/train': 0.6723612695932388} 01/27/2022 06:23:47 - INFO - codeparrot_training - Step 11208: {'lr': 0.0004559573765134968, 'samples': 2152128, 'steps': 11208, 'loss/train': 0.5972758233547211} 01/27/2022 06:23:51 - INFO - codeparrot_training - Step 11209: {'lr': 0.0004559481012126768, 'samples': 2152320, 'steps': 11209, 'loss/train': 0.4749313294887543} 01/27/2022 06:23:54 - INFO - codeparrot_training - Step 11210: {'lr': 0.00045593882502964055, 'samples': 2152512, 'steps': 11210, 'loss/train': 0.9619843661785126} 01/27/2022 06:23:59 - INFO - codeparrot_training - Step 11211: {'lr': 0.00045592954796442784, 'samples': 2152704, 'steps': 11211, 'loss/train': 0.37403976917266846} 01/27/2022 06:24:02 - INFO - codeparrot_training - Step 11212: {'lr': 0.0004559202700170782, 'samples': 2152896, 'steps': 11212, 'loss/train': 0.5655209273099899} 01/27/2022 06:24:05 - INFO - codeparrot_training - Step 11213: {'lr': 0.00045591099118763156, 'samples': 2153088, 'steps': 11213, 'loss/train': 0.8738000392913818} 01/27/2022 06:24:08 - INFO - codeparrot_training - Step 11214: {'lr': 0.0004559017114761276, 'samples': 2153280, 'steps': 11214, 'loss/train': 0.832734614610672} 01/27/2022 06:24:11 - INFO - codeparrot_training - Step 11215: {'lr': 0.00045589243088260613, 'samples': 2153472, 'steps': 11215, 'loss/train': 0.7977747023105621} 01/27/2022 06:24:15 - INFO - codeparrot_training - Step 11216: {'lr': 0.00045588314940710683, 'samples': 2153664, 'steps': 11216, 'loss/train': 0.7056207060813904} 01/27/2022 06:24:18 - INFO - codeparrot_training - Step 11217: {'lr': 0.00045587386704966956, 'samples': 2153856, 'steps': 11217, 'loss/train': 1.14191535115242} 01/27/2022 06:24:21 - INFO - codeparrot_training - Step 11218: {'lr': 0.00045586458381033395, 'samples': 2154048, 'steps': 11218, 'loss/train': 1.3062002956867218} 01/27/2022 06:24:25 - INFO - codeparrot_training - Step 11219: {'lr': 0.00045585529968913984, 'samples': 2154240, 'steps': 11219, 'loss/train': 0.8736057579517365} 01/27/2022 06:24:28 - INFO - codeparrot_training - Step 11220: {'lr': 0.00045584601468612703, 'samples': 2154432, 'steps': 11220, 'loss/train': 0.5104178488254547} 01/27/2022 06:24:32 - INFO - codeparrot_training - Step 11221: {'lr': 0.0004558367288013352, 'samples': 2154624, 'steps': 11221, 'loss/train': 1.1755611598491669} 01/27/2022 06:24:35 - INFO - codeparrot_training - Step 11222: {'lr': 0.00045582744203480417, 'samples': 2154816, 'steps': 11222, 'loss/train': 1.013990968465805} 01/27/2022 06:24:38 - INFO - codeparrot_training - Step 11223: {'lr': 0.0004558181543865738, 'samples': 2155008, 'steps': 11223, 'loss/train': 0.42064033448696136} 01/27/2022 06:24:41 - INFO - codeparrot_training - Step 11224: {'lr': 0.0004558088658566838, 'samples': 2155200, 'steps': 11224, 'loss/train': 0.941361665725708} 01/27/2022 06:24:44 - INFO - codeparrot_training - Step 11225: {'lr': 0.000455799576445174, 'samples': 2155392, 'steps': 11225, 'loss/train': 0.4119721055030823} 01/27/2022 06:24:47 - INFO - codeparrot_training - Step 11226: {'lr': 0.00045579028615208404, 'samples': 2155584, 'steps': 11226, 'loss/train': 0.6317532062530518} 01/27/2022 06:24:50 - INFO - codeparrot_training - Step 11227: {'lr': 0.000455780994977454, 'samples': 2155776, 'steps': 11227, 'loss/train': 0.8576793372631073} 01/27/2022 06:24:55 - INFO - 
codeparrot_training - Step 11228: {'lr': 0.0004557717029213234, 'samples': 2155968, 'steps': 11228, 'loss/train': 0.9589234292507172} 01/27/2022 06:24:58 - INFO - codeparrot_training - Step 11229: {'lr': 0.00045576240998373226, 'samples': 2156160, 'steps': 11229, 'loss/train': 1.2211984992027283} 01/27/2022 06:25:01 - INFO - codeparrot_training - Step 11230: {'lr': 0.00045575311616472024, 'samples': 2156352, 'steps': 11230, 'loss/train': 0.8172100782394409} 01/27/2022 06:25:04 - INFO - codeparrot_training - Step 11231: {'lr': 0.0004557438214643272, 'samples': 2156544, 'steps': 11231, 'loss/train': 1.2034554183483124} 01/27/2022 06:25:07 - INFO - codeparrot_training - Step 11232: {'lr': 0.00045573452588259296, 'samples': 2156736, 'steps': 11232, 'loss/train': 0.5258142203092575} 01/27/2022 06:25:11 - INFO - codeparrot_training - Step 11233: {'lr': 0.0004557252294195573, 'samples': 2156928, 'steps': 11233, 'loss/train': 0.4064871221780777} 01/27/2022 06:25:14 - INFO - codeparrot_training - Step 11234: {'lr': 0.00045571593207526016, 'samples': 2157120, 'steps': 11234, 'loss/train': 0.8287241756916046} 01/27/2022 06:25:17 - INFO - codeparrot_training - Step 11235: {'lr': 0.00045570663384974125, 'samples': 2157312, 'steps': 11235, 'loss/train': 0.9030649065971375} 01/27/2022 06:25:20 - INFO - codeparrot_training - Step 11236: {'lr': 0.00045569733474304044, 'samples': 2157504, 'steps': 11236, 'loss/train': 0.699557438492775} 01/27/2022 06:25:25 - INFO - codeparrot_training - Step 11237: {'lr': 0.0004556880347551976, 'samples': 2157696, 'steps': 11237, 'loss/train': 0.92086261510849} 01/27/2022 06:25:28 - INFO - codeparrot_training - Step 11238: {'lr': 0.0004556787338862525, 'samples': 2157888, 'steps': 11238, 'loss/train': 0.8769729137420654} 01/27/2022 06:25:31 - INFO - codeparrot_training - Step 11239: {'lr': 0.000455669432136245, 'samples': 2158080, 'steps': 11239, 'loss/train': 0.5857781767845154} 01/27/2022 06:25:34 - INFO - codeparrot_training - Step 11240: {'lr': 0.00045566012950521497, 'samples': 2158272, 'steps': 11240, 'loss/train': 1.0583751797676086} 01/27/2022 06:25:38 - INFO - codeparrot_training - Step 11241: {'lr': 0.0004556508259932022, 'samples': 2158464, 'steps': 11241, 'loss/train': 0.6524210125207901} 01/27/2022 06:25:41 - INFO - codeparrot_training - Step 11242: {'lr': 0.0004556415216002467, 'samples': 2158656, 'steps': 11242, 'loss/train': 0.8946846127510071} 01/27/2022 06:25:44 - INFO - codeparrot_training - Step 11243: {'lr': 0.0004556322163263882, 'samples': 2158848, 'steps': 11243, 'loss/train': 0.44696715474128723} 01/27/2022 06:25:47 - INFO - codeparrot_training - Step 11244: {'lr': 0.00045562291017166653, 'samples': 2159040, 'steps': 11244, 'loss/train': 0.7280422747135162} 01/27/2022 06:25:51 - INFO - codeparrot_training - Step 11245: {'lr': 0.0004556136031361216, 'samples': 2159232, 'steps': 11245, 'loss/train': 0.39311470091342926} 01/27/2022 06:25:55 - INFO - codeparrot_training - Step 11246: {'lr': 0.0004556042952197933, 'samples': 2159424, 'steps': 11246, 'loss/train': 0.6336377710103989} 01/27/2022 06:25:58 - INFO - codeparrot_training - Step 11247: {'lr': 0.00045559498642272153, 'samples': 2159616, 'steps': 11247, 'loss/train': 1.0082440674304962} 01/27/2022 06:26:01 - INFO - codeparrot_training - Step 11248: {'lr': 0.0004555856767449461, 'samples': 2159808, 'steps': 11248, 'loss/train': 1.020416021347046} 01/27/2022 06:26:04 - INFO - codeparrot_training - Step 11249: {'lr': 0.00045557636618650686, 'samples': 2160000, 'steps': 11249, 'loss/train': 
0.511223092675209} 01/27/2022 06:26:07 - INFO - codeparrot_training - Step 11250: {'lr': 0.00045556705474744376, 'samples': 2160192, 'steps': 11250, 'loss/train': 0.6537382453680038} 01/27/2022 06:26:10 - INFO - codeparrot_training - Step 11251: {'lr': 0.00045555774242779675, 'samples': 2160384, 'steps': 11251, 'loss/train': 0.440191850066185} 01/27/2022 06:26:14 - INFO - codeparrot_training - Step 11252: {'lr': 0.0004555484292276055, 'samples': 2160576, 'steps': 11252, 'loss/train': 1.2131151258945465} 01/27/2022 06:26:17 - INFO - codeparrot_training - Step 11253: {'lr': 0.0004555391151469102, 'samples': 2160768, 'steps': 11253, 'loss/train': 0.8538441359996796} 01/27/2022 06:26:22 - INFO - codeparrot_training - Step 11254: {'lr': 0.00045552980018575054, 'samples': 2160960, 'steps': 11254, 'loss/train': 0.5105471312999725} 01/27/2022 06:26:25 - INFO - codeparrot_training - Step 11255: {'lr': 0.0004555204843441665, 'samples': 2161152, 'steps': 11255, 'loss/train': 1.0484688878059387} 01/27/2022 06:26:28 - INFO - codeparrot_training - Step 11256: {'lr': 0.0004555111676221979, 'samples': 2161344, 'steps': 11256, 'loss/train': 0.8583239614963531} 01/27/2022 06:26:31 - INFO - codeparrot_training - Step 11257: {'lr': 0.00045550185001988475, 'samples': 2161536, 'steps': 11257, 'loss/train': 0.6829981952905655} 01/27/2022 06:26:34 - INFO - codeparrot_training - Step 11258: {'lr': 0.00045549253153726694, 'samples': 2161728, 'steps': 11258, 'loss/train': 0.19750135391950607} 01/27/2022 06:26:38 - INFO - codeparrot_training - Step 11259: {'lr': 0.00045548321217438436, 'samples': 2161920, 'steps': 11259, 'loss/train': 0.24459922313690186} 01/27/2022 06:26:41 - INFO - codeparrot_training - Step 11260: {'lr': 0.00045547389193127696, 'samples': 2162112, 'steps': 11260, 'loss/train': 0.9104658365249634} 01/27/2022 06:26:44 - INFO - codeparrot_training - Step 11261: {'lr': 0.00045546457080798463, 'samples': 2162304, 'steps': 11261, 'loss/train': 0.880683571100235} 01/27/2022 06:26:47 - INFO - codeparrot_training - Step 11262: {'lr': 0.00045545524880454734, 'samples': 2162496, 'steps': 11262, 'loss/train': 0.9488608539104462} 01/27/2022 06:26:52 - INFO - codeparrot_training - Step 11263: {'lr': 0.000455445925921005, 'samples': 2162688, 'steps': 11263, 'loss/train': 0.4546412229537964} 01/27/2022 06:26:55 - INFO - codeparrot_training - Step 11264: {'lr': 0.00045543660215739755, 'samples': 2162880, 'steps': 11264, 'loss/train': 0.6206670105457306} 01/27/2022 06:26:58 - INFO - codeparrot_training - Step 11265: {'lr': 0.00045542727751376495, 'samples': 2163072, 'steps': 11265, 'loss/train': 0.9005638360977173} 01/27/2022 06:27:01 - INFO - codeparrot_training - Step 11266: {'lr': 0.00045541795199014715, 'samples': 2163264, 'steps': 11266, 'loss/train': 0.17370974645018578} 01/27/2022 06:27:04 - INFO - codeparrot_training - Step 11267: {'lr': 0.00045540862558658403, 'samples': 2163456, 'steps': 11267, 'loss/train': 1.1430361568927765} 01/27/2022 06:27:07 - INFO - codeparrot_training - Step 11268: {'lr': 0.00045539929830311555, 'samples': 2163648, 'steps': 11268, 'loss/train': 1.720956802368164} 01/27/2022 06:27:11 - INFO - codeparrot_training - Step 11269: {'lr': 0.00045538997013978166, 'samples': 2163840, 'steps': 11269, 'loss/train': 0.9236764311790466} 01/27/2022 06:27:14 - INFO - codeparrot_training - Step 11270: {'lr': 0.0004553806410966225, 'samples': 2164032, 'steps': 11270, 'loss/train': 0.9115880727767944} 01/27/2022 06:27:17 - INFO - codeparrot_training - Step 11271: {'lr': 0.0004553713111736778, 
'samples': 2164224, 'steps': 11271, 'loss/train': 1.0217442512512207} 01/27/2022 06:27:21 - INFO - codeparrot_training - Step 11272: {'lr': 0.0004553619803709876, 'samples': 2164416, 'steps': 11272, 'loss/train': 0.6699433028697968} 01/27/2022 06:27:24 - INFO - codeparrot_training - Step 11273: {'lr': 0.00045535264868859195, 'samples': 2164608, 'steps': 11273, 'loss/train': 1.1189665496349335} 01/27/2022 06:27:27 - INFO - codeparrot_training - Step 11274: {'lr': 0.0004553433161265307, 'samples': 2164800, 'steps': 11274, 'loss/train': 0.9548225998878479} 01/27/2022 06:27:31 - INFO - codeparrot_training - Step 11275: {'lr': 0.00045533398268484396, 'samples': 2164992, 'steps': 11275, 'loss/train': 0.30648449063301086} 01/27/2022 06:27:34 - INFO - codeparrot_training - Step 11276: {'lr': 0.00045532464836357155, 'samples': 2165184, 'steps': 11276, 'loss/train': 0.8547605574131012} 01/27/2022 06:27:37 - INFO - codeparrot_training - Step 11277: {'lr': 0.0004553153131627536, 'samples': 2165376, 'steps': 11277, 'loss/train': 1.3160883486270905} 01/27/2022 06:27:40 - INFO - codeparrot_training - Step 11278: {'lr': 0.00045530597708243, 'samples': 2165568, 'steps': 11278, 'loss/train': 0.6889692842960358} 01/27/2022 06:27:43 - INFO - codeparrot_training - Step 11279: {'lr': 0.0004552966401226408, 'samples': 2165760, 'steps': 11279, 'loss/train': 0.17093012854456902} 01/27/2022 06:27:46 - INFO - codeparrot_training - Step 11280: {'lr': 0.000455287302283426, 'samples': 2165952, 'steps': 11280, 'loss/train': 0.5803962349891663} 01/27/2022 06:27:51 - INFO - codeparrot_training - Step 11281: {'lr': 0.00045527796356482566, 'samples': 2166144, 'steps': 11281, 'loss/train': 0.5458649396896362} 01/27/2022 06:27:55 - INFO - codeparrot_training - Step 11282: {'lr': 0.00045526862396687957, 'samples': 2166336, 'steps': 11282, 'loss/train': 0.8699771761894226} 01/27/2022 06:27:58 - INFO - codeparrot_training - Step 11283: {'lr': 0.000455259283489628, 'samples': 2166528, 'steps': 11283, 'loss/train': 0.7315486371517181} 01/27/2022 06:28:01 - INFO - codeparrot_training - Step 11284: {'lr': 0.0004552499421331107, 'samples': 2166720, 'steps': 11284, 'loss/train': 0.614323228597641} 01/27/2022 06:28:04 - INFO - codeparrot_training - Step 11285: {'lr': 0.0004552405998973679, 'samples': 2166912, 'steps': 11285, 'loss/train': 0.814791351556778} 01/27/2022 06:28:07 - INFO - codeparrot_training - Step 11286: {'lr': 0.0004552312567824395, 'samples': 2167104, 'steps': 11286, 'loss/train': 0.6031284034252167} 01/27/2022 06:28:10 - INFO - codeparrot_training - Step 11287: {'lr': 0.00045522191278836563, 'samples': 2167296, 'steps': 11287, 'loss/train': 0.7457756102085114} 01/27/2022 06:28:13 - INFO - codeparrot_training - Step 11288: {'lr': 0.00045521256791518616, 'samples': 2167488, 'steps': 11288, 'loss/train': 0.9792254269123077} 01/27/2022 06:28:18 - INFO - codeparrot_training - Step 11289: {'lr': 0.0004552032221629413, 'samples': 2167680, 'steps': 11289, 'loss/train': 0.6317219585180283} 01/27/2022 06:28:21 - INFO - codeparrot_training - Step 11290: {'lr': 0.000455193875531671, 'samples': 2167872, 'steps': 11290, 'loss/train': 0.3681740239262581} 01/27/2022 06:28:24 - INFO - codeparrot_training - Step 11291: {'lr': 0.00045518452802141524, 'samples': 2168064, 'steps': 11291, 'loss/train': 0.6539545208215714} 01/27/2022 06:28:27 - INFO - codeparrot_training - Step 11292: {'lr': 0.0004551751796322141, 'samples': 2168256, 'steps': 11292, 'loss/train': 0.9576071798801422} 01/27/2022 06:28:31 - INFO - codeparrot_training - Step 
11293: {'lr': 0.00045516583036410777, 'samples': 2168448, 'steps': 11293, 'loss/train': 0.7882395386695862} 01/27/2022 06:28:34 - INFO - codeparrot_training - Step 11294: {'lr': 0.00045515648021713604, 'samples': 2168640, 'steps': 11294, 'loss/train': 0.8867442011833191} 01/27/2022 06:28:37 - INFO - codeparrot_training - Step 11295: {'lr': 0.0004551471291913391, 'samples': 2168832, 'steps': 11295, 'loss/train': 0.6378899663686752} 01/27/2022 06:28:40 - INFO - codeparrot_training - Step 11296: {'lr': 0.00045513777728675703, 'samples': 2169024, 'steps': 11296, 'loss/train': 0.7208981662988663} 01/27/2022 06:28:43 - INFO - codeparrot_training - Step 11297: {'lr': 0.0004551284245034298, 'samples': 2169216, 'steps': 11297, 'loss/train': 0.861709713935852} 01/27/2022 06:28:47 - INFO - codeparrot_training - Step 11298: {'lr': 0.00045511907084139767, 'samples': 2169408, 'steps': 11298, 'loss/train': 0.4585038274526596} 01/27/2022 06:28:51 - INFO - codeparrot_training - Step 11299: {'lr': 0.0004551097163007005, 'samples': 2169600, 'steps': 11299, 'loss/train': 0.426534041762352} 01/27/2022 06:28:54 - INFO - codeparrot_training - Step 11300: {'lr': 0.0004551003608813784, 'samples': 2169792, 'steps': 11300, 'loss/train': 0.558644637465477} 01/27/2022 06:28:57 - INFO - codeparrot_training - Step 11301: {'lr': 0.00045509100458347154, 'samples': 2169984, 'steps': 11301, 'loss/train': 0.4314793199300766} 01/27/2022 06:29:00 - INFO - codeparrot_training - Step 11302: {'lr': 0.0004550816474070199, 'samples': 2170176, 'steps': 11302, 'loss/train': 1.0090279877185822} 01/27/2022 06:29:03 - INFO - codeparrot_training - Step 11303: {'lr': 0.0004550722893520636, 'samples': 2170368, 'steps': 11303, 'loss/train': 0.7478728741407394} 01/27/2022 06:29:06 - INFO - codeparrot_training - Step 11304: {'lr': 0.0004550629304186428, 'samples': 2170560, 'steps': 11304, 'loss/train': 0.8923974931240082} 01/27/2022 06:29:10 - INFO - codeparrot_training - Step 11305: {'lr': 0.0004550535706067974, 'samples': 2170752, 'steps': 11305, 'loss/train': 0.6973164081573486} 01/27/2022 06:29:13 - INFO - codeparrot_training - Step 11306: {'lr': 0.0004550442099165677, 'samples': 2170944, 'steps': 11306, 'loss/train': 0.8664865493774414} 01/27/2022 06:29:17 - INFO - codeparrot_training - Step 11307: {'lr': 0.0004550348483479937, 'samples': 2171136, 'steps': 11307, 'loss/train': 1.155278205871582} 01/27/2022 06:29:21 - INFO - codeparrot_training - Step 11308: {'lr': 0.00045502548590111553, 'samples': 2171328, 'steps': 11308, 'loss/train': 0.7163897305727005} 01/27/2022 06:29:24 - INFO - codeparrot_training - Step 11309: {'lr': 0.0004550161225759732, 'samples': 2171520, 'steps': 11309, 'loss/train': 0.5305931568145752} 01/27/2022 06:29:27 - INFO - codeparrot_training - Step 11310: {'lr': 0.000455006758372607, 'samples': 2171712, 'steps': 11310, 'loss/train': 1.1236997544765472} 01/27/2022 06:29:30 - INFO - codeparrot_training - Step 11311: {'lr': 0.00045499739329105696, 'samples': 2171904, 'steps': 11311, 'loss/train': 0.8817172050476074} 01/27/2022 06:29:33 - INFO - codeparrot_training - Step 11312: {'lr': 0.00045498802733136306, 'samples': 2172096, 'steps': 11312, 'loss/train': 0.6621185839176178} 01/27/2022 06:29:36 - INFO - codeparrot_training - Step 11313: {'lr': 0.00045497866049356564, 'samples': 2172288, 'steps': 11313, 'loss/train': 0.8506363034248352} 01/27/2022 06:29:39 - INFO - codeparrot_training - Step 11314: {'lr': 0.0004549692927777047, 'samples': 2172480, 'steps': 11314, 'loss/train': 0.20135092735290527} 01/27/2022 06:29:43 
- INFO - codeparrot_training - Step 11315: {'lr': 0.00045495992418382035, 'samples': 2172672, 'steps': 11315, 'loss/train': 1.6321218609809875} 01/27/2022 06:29:48 - INFO - codeparrot_training - Step 11316: {'lr': 0.0004549505547119529, 'samples': 2172864, 'steps': 11316, 'loss/train': 0.7794636189937592} 01/27/2022 06:29:51 - INFO - codeparrot_training - Step 11317: {'lr': 0.00045494118436214225, 'samples': 2173056, 'steps': 11317, 'loss/train': 0.649726152420044} 01/27/2022 06:29:54 - INFO - codeparrot_training - Step 11318: {'lr': 0.00045493181313442866, 'samples': 2173248, 'steps': 11318, 'loss/train': 1.1057960093021393} 01/27/2022 06:29:57 - INFO - codeparrot_training - Step 11319: {'lr': 0.00045492244102885224, 'samples': 2173440, 'steps': 11319, 'loss/train': 0.7885055243968964} 01/27/2022 06:30:00 - INFO - codeparrot_training - Step 11320: {'lr': 0.00045491306804545316, 'samples': 2173632, 'steps': 11320, 'loss/train': 0.7923150658607483} 01/27/2022 06:30:03 - INFO - codeparrot_training - Step 11321: {'lr': 0.0004549036941842716, 'samples': 2173824, 'steps': 11321, 'loss/train': 0.7137769013643265} 01/27/2022 06:30:07 - INFO - codeparrot_training - Step 11322: {'lr': 0.0004548943194453476, 'samples': 2174016, 'steps': 11322, 'loss/train': 0.9408278167247772} 01/27/2022 06:30:10 - INFO - codeparrot_training - Step 11323: {'lr': 0.0004548849438287214, 'samples': 2174208, 'steps': 11323, 'loss/train': 0.7105090320110321} 01/27/2022 06:30:13 - INFO - codeparrot_training - Step 11324: {'lr': 0.00045487556733443327, 'samples': 2174400, 'steps': 11324, 'loss/train': 1.1661058366298676} 01/27/2022 06:30:17 - INFO - codeparrot_training - Step 11325: {'lr': 0.00045486618996252315, 'samples': 2174592, 'steps': 11325, 'loss/train': 0.9051360189914703} 01/27/2022 06:30:20 - INFO - codeparrot_training - Step 11326: {'lr': 0.0004548568117130314, 'samples': 2174784, 'steps': 11326, 'loss/train': 0.9612902998924255} 01/27/2022 06:30:24 - INFO - codeparrot_training - Step 11327: {'lr': 0.00045484743258599803, 'samples': 2174976, 'steps': 11327, 'loss/train': 0.8901205658912659} 01/27/2022 06:30:27 - INFO - codeparrot_training - Step 11328: {'lr': 0.0004548380525814634, 'samples': 2175168, 'steps': 11328, 'loss/train': 0.7406803965568542} 01/27/2022 06:30:30 - INFO - codeparrot_training - Step 11329: {'lr': 0.0004548286716994676, 'samples': 2175360, 'steps': 11329, 'loss/train': 0.8854418098926544} 01/27/2022 06:30:33 - INFO - codeparrot_training - Step 11330: {'lr': 0.0004548192899400507, 'samples': 2175552, 'steps': 11330, 'loss/train': 0.9694742560386658} 01/27/2022 06:30:36 - INFO - codeparrot_training - Step 11331: {'lr': 0.0004548099073032531, 'samples': 2175744, 'steps': 11331, 'loss/train': 0.9714675843715668} 01/27/2022 06:30:39 - INFO - codeparrot_training - Step 11332: {'lr': 0.00045480052378911483, 'samples': 2175936, 'steps': 11332, 'loss/train': 1.684446930885315} 01/27/2022 06:30:44 - INFO - codeparrot_training - Step 11333: {'lr': 0.0004547911393976762, 'samples': 2176128, 'steps': 11333, 'loss/train': 0.10896428301930428} 01/27/2022 06:30:47 - INFO - codeparrot_training - Step 11334: {'lr': 0.00045478175412897733, 'samples': 2176320, 'steps': 11334, 'loss/train': 0.9582734405994415} 01/27/2022 06:30:50 - INFO - codeparrot_training - Step 11335: {'lr': 0.00045477236798305846, 'samples': 2176512, 'steps': 11335, 'loss/train': 0.711568146944046} 01/27/2022 06:30:53 - INFO - codeparrot_training - Step 11336: {'lr': 0.00045476298095995985, 'samples': 2176704, 'steps': 11336, 
'loss/train': 0.19008585065603256} 01/27/2022 06:30:57 - INFO - codeparrot_training - Step 11337: {'lr': 0.0004547535930597215, 'samples': 2176896, 'steps': 11337, 'loss/train': 0.7834577858448029} 01/27/2022 06:31:00 - INFO - codeparrot_training - Step 11338: {'lr': 0.0004547442042823839, 'samples': 2177088, 'steps': 11338, 'loss/train': 0.5743413269519806} 01/27/2022 06:31:03 - INFO - codeparrot_training - Step 11339: {'lr': 0.0004547348146279871, 'samples': 2177280, 'steps': 11339, 'loss/train': 0.37456483393907547} 01/27/2022 06:31:06 - INFO - codeparrot_training - Step 11340: {'lr': 0.00045472542409657135, 'samples': 2177472, 'steps': 11340, 'loss/train': 0.8627846539020538} 01/27/2022 06:31:09 - INFO - codeparrot_training - Step 11341: {'lr': 0.00045471603268817696, 'samples': 2177664, 'steps': 11341, 'loss/train': 0.7315313369035721} 01/27/2022 06:31:14 - INFO - codeparrot_training - Step 11342: {'lr': 0.000454706640402844, 'samples': 2177856, 'steps': 11342, 'loss/train': 0.7449817210435867} 01/27/2022 06:31:18 - INFO - codeparrot_training - Step 11343: {'lr': 0.00045469724724061286, 'samples': 2178048, 'steps': 11343, 'loss/train': 0.8561884760856628} 01/27/2022 06:31:21 - INFO - codeparrot_training - Step 11344: {'lr': 0.0004546878532015236, 'samples': 2178240, 'steps': 11344, 'loss/train': 0.587617501616478} 01/27/2022 06:31:24 - INFO - codeparrot_training - Step 11345: {'lr': 0.00045467845828561673, 'samples': 2178432, 'steps': 11345, 'loss/train': 0.9270500242710114} 01/27/2022 06:31:27 - INFO - codeparrot_training - Step 11346: {'lr': 0.0004546690624929322, 'samples': 2178624, 'steps': 11346, 'loss/train': 0.6815306693315506} 01/27/2022 06:31:30 - INFO - codeparrot_training - Step 11347: {'lr': 0.0004546596658235105, 'samples': 2178816, 'steps': 11347, 'loss/train': 0.5874369889497757} 01/27/2022 06:31:33 - INFO - codeparrot_training - Step 11348: {'lr': 0.00045465026827739175, 'samples': 2179008, 'steps': 11348, 'loss/train': 0.7367548048496246} 01/27/2022 06:31:36 - INFO - codeparrot_training - Step 11349: {'lr': 0.00045464086985461615, 'samples': 2179200, 'steps': 11349, 'loss/train': 1.051552265882492} 01/27/2022 06:31:40 - INFO - codeparrot_training - Step 11350: {'lr': 0.0004546314705552241, 'samples': 2179392, 'steps': 11350, 'loss/train': 0.9695321023464203} 01/27/2022 06:31:44 - INFO - codeparrot_training - Step 11351: {'lr': 0.00045462207037925593, 'samples': 2179584, 'steps': 11351, 'loss/train': 1.0653811991214752} 01/27/2022 06:31:47 - INFO - codeparrot_training - Step 11352: {'lr': 0.0004546126693267516, 'samples': 2179776, 'steps': 11352, 'loss/train': 0.8755263090133667} 01/27/2022 06:31:50 - INFO - codeparrot_training - Step 11353: {'lr': 0.0004546032673977517, 'samples': 2179968, 'steps': 11353, 'loss/train': 0.7588215172290802} 01/27/2022 06:31:53 - INFO - codeparrot_training - Step 11354: {'lr': 0.0004545938645922963, 'samples': 2180160, 'steps': 11354, 'loss/train': 0.5451037287712097} 01/27/2022 06:31:57 - INFO - codeparrot_training - Step 11355: {'lr': 0.0004545844609104258, 'samples': 2180352, 'steps': 11355, 'loss/train': 1.5248059630393982} 01/27/2022 06:32:00 - INFO - codeparrot_training - Step 11356: {'lr': 0.0004545750563521804, 'samples': 2180544, 'steps': 11356, 'loss/train': 0.9317502379417419} 01/27/2022 06:32:03 - INFO - codeparrot_training - Step 11357: {'lr': 0.0004545656509176004, 'samples': 2180736, 'steps': 11357, 'loss/train': 0.825527161359787} 01/27/2022 06:32:06 - INFO - codeparrot_training - Step 11358: {'lr': 0.0004545562446067261, 
'samples': 2180928, 'steps': 11358, 'loss/train': 0.4121427834033966} 01/27/2022 06:32:09 - INFO - codeparrot_training - Step 11359: {'lr': 0.00045454683741959787, 'samples': 2181120, 'steps': 11359, 'loss/train': 0.9729240238666534} 01/27/2022 06:32:14 - INFO - codeparrot_training - Step 11360: {'lr': 0.0004545374293562559, 'samples': 2181312, 'steps': 11360, 'loss/train': 0.1768917739391327} 01/27/2022 06:32:17 - INFO - codeparrot_training - Step 11361: {'lr': 0.00045452802041674045, 'samples': 2181504, 'steps': 11361, 'loss/train': 0.7829243838787079} 01/27/2022 06:32:20 - INFO - codeparrot_training - Step 11362: {'lr': 0.000454518610601092, 'samples': 2181696, 'steps': 11362, 'loss/train': 0.8807999789714813} 01/27/2022 06:32:24 - INFO - codeparrot_training - Step 11363: {'lr': 0.0004545091999093508, 'samples': 2181888, 'steps': 11363, 'loss/train': 0.6127082258462906} 01/27/2022 06:32:27 - INFO - codeparrot_training - Step 11364: {'lr': 0.00045449978834155705, 'samples': 2182080, 'steps': 11364, 'loss/train': 0.908951461315155} 01/27/2022 06:32:30 - INFO - codeparrot_training - Step 11365: {'lr': 0.00045449037589775123, 'samples': 2182272, 'steps': 11365, 'loss/train': 0.8721783757209778} 01/27/2022 06:32:33 - INFO - codeparrot_training - Step 11366: {'lr': 0.00045448096257797344, 'samples': 2182464, 'steps': 11366, 'loss/train': 0.6910669952630997} 01/27/2022 06:32:36 - INFO - codeparrot_training - Step 11367: {'lr': 0.0004544715483822642, 'samples': 2182656, 'steps': 11367, 'loss/train': 0.6440505534410477} 01/27/2022 06:32:39 - INFO - codeparrot_training - Step 11368: {'lr': 0.00045446213331066376, 'samples': 2182848, 'steps': 11368, 'loss/train': 0.5736405998468399} 01/27/2022 06:32:44 - INFO - codeparrot_training - Step 11369: {'lr': 0.0004544527173632125, 'samples': 2183040, 'steps': 11369, 'loss/train': 1.193167358636856} 01/27/2022 06:32:47 - INFO - codeparrot_training - Step 11370: {'lr': 0.00045444330053995074, 'samples': 2183232, 'steps': 11370, 'loss/train': 2.9040979743003845} 01/27/2022 06:32:50 - INFO - codeparrot_training - Step 11371: {'lr': 0.00045443388284091877, 'samples': 2183424, 'steps': 11371, 'loss/train': 0.8585743010044098} 01/27/2022 06:32:53 - INFO - codeparrot_training - Step 11372: {'lr': 0.0004544244642661569, 'samples': 2183616, 'steps': 11372, 'loss/train': 0.8611966073513031} 01/27/2022 06:32:56 - INFO - codeparrot_training - Step 11373: {'lr': 0.0004544150448157056, 'samples': 2183808, 'steps': 11373, 'loss/train': 0.4407329857349396} 01/27/2022 06:32:59 - INFO - codeparrot_training - Step 11374: {'lr': 0.0004544056244896052, 'samples': 2184000, 'steps': 11374, 'loss/train': 0.600284680724144} 01/27/2022 06:33:03 - INFO - codeparrot_training - Step 11375: {'lr': 0.00045439620328789593, 'samples': 2184192, 'steps': 11375, 'loss/train': 1.0647085905075073} 01/27/2022 06:33:06 - INFO - codeparrot_training - Step 11376: {'lr': 0.00045438678121061826, 'samples': 2184384, 'steps': 11376, 'loss/train': 0.9176764190196991} 01/27/2022 06:33:09 - INFO - codeparrot_training - Step 11377: {'lr': 0.0004543773582578125, 'samples': 2184576, 'steps': 11377, 'loss/train': 0.8923023641109467} 01/27/2022 06:33:13 - INFO - codeparrot_training - Step 11378: {'lr': 0.00045436793442951907, 'samples': 2184768, 'steps': 11378, 'loss/train': 0.8024831414222717} 01/27/2022 06:33:16 - INFO - codeparrot_training - Step 11379: {'lr': 0.0004543585097257783, 'samples': 2184960, 'steps': 11379, 'loss/train': 1.075046181678772} 01/27/2022 06:33:20 - INFO - codeparrot_training - Step 
11380: {'lr': 0.0004543490841466306, 'samples': 2185152, 'steps': 11380, 'loss/train': 0.6679933369159698} 01/27/2022 06:33:23 - INFO - codeparrot_training - Step 11381: {'lr': 0.00045433965769211616, 'samples': 2185344, 'steps': 11381, 'loss/train': 1.023998647928238} 01/27/2022 06:33:26 - INFO - codeparrot_training - Step 11382: {'lr': 0.00045433023036227566, 'samples': 2185536, 'steps': 11382, 'loss/train': 0.9275394380092621} 01/27/2022 06:33:29 - INFO - codeparrot_training - Step 11383: {'lr': 0.00045432080215714927, 'samples': 2185728, 'steps': 11383, 'loss/train': 0.7458218336105347} 01/27/2022 06:33:32 - INFO - codeparrot_training - Step 11384: {'lr': 0.00045431137307677753, 'samples': 2185920, 'steps': 11384, 'loss/train': 0.6994647681713104} 01/27/2022 06:33:35 - INFO - codeparrot_training - Step 11385: {'lr': 0.00045430194312120066, 'samples': 2186112, 'steps': 11385, 'loss/train': 0.9049176871776581} 01/27/2022 06:33:38 - INFO - codeparrot_training - Step 11386: {'lr': 0.0004542925122904591, 'samples': 2186304, 'steps': 11386, 'loss/train': 0.6130196750164032} 01/27/2022 06:33:44 - INFO - codeparrot_training - Step 11387: {'lr': 0.00045428308058459335, 'samples': 2186496, 'steps': 11387, 'loss/train': 0.5385434031486511} 01/27/2022 06:33:47 - INFO - codeparrot_training - Step 11388: {'lr': 0.00045427364800364374, 'samples': 2186688, 'steps': 11388, 'loss/train': 0.5150376856327057} 01/27/2022 06:33:50 - INFO - codeparrot_training - Step 11389: {'lr': 0.00045426421454765065, 'samples': 2186880, 'steps': 11389, 'loss/train': 1.0670167207717896} 01/27/2022 06:33:53 - INFO - codeparrot_training - Step 11390: {'lr': 0.0004542547802166546, 'samples': 2187072, 'steps': 11390, 'loss/train': 0.8309533596038818} 01/27/2022 06:33:56 - INFO - codeparrot_training - Step 11391: {'lr': 0.00045424534501069594, 'samples': 2187264, 'steps': 11391, 'loss/train': 0.9968915283679962} 01/27/2022 06:34:00 - INFO - codeparrot_training - Step 11392: {'lr': 0.00045423590892981503, 'samples': 2187456, 'steps': 11392, 'loss/train': 1.2210871875286102} 01/27/2022 06:34:03 - INFO - codeparrot_training - Step 11393: {'lr': 0.0004542264719740523, 'samples': 2187648, 'steps': 11393, 'loss/train': 1.0396680235862732} 01/27/2022 06:34:06 - INFO - codeparrot_training - Step 11394: {'lr': 0.0004542170341434483, 'samples': 2187840, 'steps': 11394, 'loss/train': 0.8878920078277588} 01/27/2022 06:34:10 - INFO - codeparrot_training - Step 11395: {'lr': 0.00045420759543804326, 'samples': 2188032, 'steps': 11395, 'loss/train': 1.1991231143474579} 01/27/2022 06:34:14 - INFO - codeparrot_training - Step 11396: {'lr': 0.0004541981558578778, 'samples': 2188224, 'steps': 11396, 'loss/train': 0.6730743795633316} 01/27/2022 06:34:17 - INFO - codeparrot_training - Step 11397: {'lr': 0.0004541887154029922, 'samples': 2188416, 'steps': 11397, 'loss/train': 0.7654941380023956} 01/27/2022 06:34:20 - INFO - codeparrot_training - Step 11398: {'lr': 0.0004541792740734271, 'samples': 2188608, 'steps': 11398, 'loss/train': 0.8965388238430023} 01/27/2022 06:34:23 - INFO - codeparrot_training - Step 11399: {'lr': 0.0004541698318692228, 'samples': 2188800, 'steps': 11399, 'loss/train': 0.8454189598560333} 01/27/2022 06:34:26 - INFO - codeparrot_training - Step 11400: {'lr': 0.0004541603887904198, 'samples': 2188992, 'steps': 11400, 'loss/train': 0.9804148077964783} 01/27/2022 06:34:29 - INFO - codeparrot_training - Step 11401: {'lr': 0.0004541509448370584, 'samples': 2189184, 'steps': 11401, 'loss/train': 1.4698217511177063} 01/27/2022 
06:34:32 - INFO - codeparrot_training - Step 11402: {'lr': 0.00045414150000917927, 'samples': 2189376, 'steps': 11402, 'loss/train': 1.354372501373291} 01/27/2022 06:34:36 - INFO - codeparrot_training - Step 11403: {'lr': 0.0004541320543068227, 'samples': 2189568, 'steps': 11403, 'loss/train': 0.9502271711826324} 01/27/2022 06:34:40 - INFO - codeparrot_training - Step 11404: {'lr': 0.00045412260773002933, 'samples': 2189760, 'steps': 11404, 'loss/train': 0.5523876249790192} 01/27/2022 06:34:43 - INFO - codeparrot_training - Step 11405: {'lr': 0.0004541131602788395, 'samples': 2189952, 'steps': 11405, 'loss/train': 0.8401238322257996} 01/27/2022 06:34:46 - INFO - codeparrot_training - Step 11406: {'lr': 0.00045410371195329365, 'samples': 2190144, 'steps': 11406, 'loss/train': 0.8138517737388611} 01/27/2022 06:34:49 - INFO - codeparrot_training - Step 11407: {'lr': 0.00045409426275343234, 'samples': 2190336, 'steps': 11407, 'loss/train': 0.9140254855155945} 01/27/2022 06:34:53 - INFO - codeparrot_training - Step 11408: {'lr': 0.00045408481267929604, 'samples': 2190528, 'steps': 11408, 'loss/train': 0.9349231123924255} 01/27/2022 06:34:56 - INFO - codeparrot_training - Step 11409: {'lr': 0.0004540753617309251, 'samples': 2190720, 'steps': 11409, 'loss/train': 1.0572145879268646} 01/27/2022 06:34:59 - INFO - codeparrot_training - Step 11410: {'lr': 0.0004540659099083602, 'samples': 2190912, 'steps': 11410, 'loss/train': 0.7494983822107315} 01/27/2022 06:35:02 - INFO - codeparrot_training - Step 11411: {'lr': 0.0004540564572116418, 'samples': 2191104, 'steps': 11411, 'loss/train': 0.43590183556079865} 01/27/2022 06:35:05 - INFO - codeparrot_training - Step 11412: {'lr': 0.0004540470036408102, 'samples': 2191296, 'steps': 11412, 'loss/train': 0.7117263078689575} 01/27/2022 06:35:10 - INFO - codeparrot_training - Step 11413: {'lr': 0.0004540375491959061, 'samples': 2191488, 'steps': 11413, 'loss/train': 0.5417716652154922} 01/27/2022 06:35:13 - INFO - codeparrot_training - Step 11414: {'lr': 0.00045402809387697, 'samples': 2191680, 'steps': 11414, 'loss/train': 1.397729068994522} 01/27/2022 06:35:16 - INFO - codeparrot_training - Step 11415: {'lr': 0.00045401863768404217, 'samples': 2191872, 'steps': 11415, 'loss/train': 0.9116975963115692} 01/27/2022 06:35:19 - INFO - codeparrot_training - Step 11416: {'lr': 0.0004540091806171634, 'samples': 2192064, 'steps': 11416, 'loss/train': 0.7283731698989868} 01/27/2022 06:35:22 - INFO - codeparrot_training - Step 11417: {'lr': 0.000453999722676374, 'samples': 2192256, 'steps': 11417, 'loss/train': 0.5397384166717529} 01/27/2022 06:35:25 - INFO - codeparrot_training - Step 11418: {'lr': 0.0004539902638617146, 'samples': 2192448, 'steps': 11418, 'loss/train': 0.35225602984428406} 01/27/2022 06:35:29 - INFO - codeparrot_training - Step 11419: {'lr': 0.0004539808041732257, 'samples': 2192640, 'steps': 11419, 'loss/train': 0.8014498651027679} 01/27/2022 06:35:32 - INFO - codeparrot_training - Step 11420: {'lr': 0.0004539713436109478, 'samples': 2192832, 'steps': 11420, 'loss/train': 0.7189098000526428} 01/27/2022 06:35:35 - INFO - codeparrot_training - Step 11421: {'lr': 0.00045396188217492145, 'samples': 2193024, 'steps': 11421, 'loss/train': 1.0192568600177765} 01/27/2022 06:35:40 - INFO - codeparrot_training - Step 11422: {'lr': 0.00045395241986518714, 'samples': 2193216, 'steps': 11422, 'loss/train': 0.5014984309673309} 01/27/2022 06:35:43 - INFO - codeparrot_training - Step 11423: {'lr': 0.0004539429566817854, 'samples': 2193408, 'steps': 11423, 
'loss/train': 0.7798128426074982} 01/27/2022 06:35:46 - INFO - codeparrot_training - Step 11424: {'lr': 0.00045393349262475686, 'samples': 2193600, 'steps': 11424, 'loss/train': 1.0087879300117493} 01/27/2022 06:35:50 - INFO - codeparrot_training - Step 11425: {'lr': 0.000453924027694142, 'samples': 2193792, 'steps': 11425, 'loss/train': 2.2891963720321655} 01/27/2022 06:35:53 - INFO - codeparrot_training - Step 11426: {'lr': 0.00045391456188998124, 'samples': 2193984, 'steps': 11426, 'loss/train': 0.3393818289041519} 01/27/2022 06:35:56 - INFO - codeparrot_training - Step 11427: {'lr': 0.00045390509521231535, 'samples': 2194176, 'steps': 11427, 'loss/train': 0.4644929766654968} 01/27/2022 06:35:59 - INFO - codeparrot_training - Step 11428: {'lr': 0.00045389562766118475, 'samples': 2194368, 'steps': 11428, 'loss/train': 0.8222583532333374} 01/27/2022 06:36:02 - INFO - codeparrot_training - Step 11429: {'lr': 0.00045388615923663004, 'samples': 2194560, 'steps': 11429, 'loss/train': 1.1956492066383362} 01/27/2022 06:36:07 - INFO - codeparrot_training - Step 11430: {'lr': 0.0004538766899386917, 'samples': 2194752, 'steps': 11430, 'loss/train': 0.8378959894180298} 01/27/2022 06:36:10 - INFO - codeparrot_training - Step 11431: {'lr': 0.00045386721976741043, 'samples': 2194944, 'steps': 11431, 'loss/train': 1.424436628818512} 01/27/2022 06:36:13 - INFO - codeparrot_training - Step 11432: {'lr': 0.0004538577487228267, 'samples': 2195136, 'steps': 11432, 'loss/train': 0.9608902037143707} 01/27/2022 06:36:16 - INFO - codeparrot_training - Step 11433: {'lr': 0.0004538482768049811, 'samples': 2195328, 'steps': 11433, 'loss/train': 0.739740639925003} 01/27/2022 06:36:19 - INFO - codeparrot_training - Step 11434: {'lr': 0.00045383880401391423, 'samples': 2195520, 'steps': 11434, 'loss/train': 1.143098384141922} 01/27/2022 06:36:22 - INFO - codeparrot_training - Step 11435: {'lr': 0.00045382933034966667, 'samples': 2195712, 'steps': 11435, 'loss/train': 0.8036879897117615} 01/27/2022 06:36:26 - INFO - codeparrot_training - Step 11436: {'lr': 0.0004538198558122789, 'samples': 2195904, 'steps': 11436, 'loss/train': 0.5114059299230576} 01/27/2022 06:36:29 - INFO - codeparrot_training - Step 11437: {'lr': 0.0004538103804017917, 'samples': 2196096, 'steps': 11437, 'loss/train': 0.6278243511915207} 01/27/2022 06:36:32 - INFO - codeparrot_training - Step 11438: {'lr': 0.00045380090411824547, 'samples': 2196288, 'steps': 11438, 'loss/train': 0.6384107172489166} 01/27/2022 06:36:37 - INFO - codeparrot_training - Step 11439: {'lr': 0.0004537914269616809, 'samples': 2196480, 'steps': 11439, 'loss/train': 0.5731350481510162} 01/27/2022 06:36:40 - INFO - codeparrot_training - Step 11440: {'lr': 0.00045378194893213854, 'samples': 2196672, 'steps': 11440, 'loss/train': 0.9266542196273804} 01/27/2022 06:36:44 - INFO - codeparrot_training - Step 11441: {'lr': 0.00045377247002965904, 'samples': 2196864, 'steps': 11441, 'loss/train': 0.3400718495249748} 01/27/2022 06:36:47 - INFO - codeparrot_training - Step 11442: {'lr': 0.000453762990254283, 'samples': 2197056, 'steps': 11442, 'loss/train': 0.908601701259613} 01/27/2022 06:36:50 - INFO - codeparrot_training - Step 11443: {'lr': 0.000453753509606051, 'samples': 2197248, 'steps': 11443, 'loss/train': 1.1627945601940155} 01/27/2022 06:36:53 - INFO - codeparrot_training - Step 11444: {'lr': 0.0004537440280850037, 'samples': 2197440, 'steps': 11444, 'loss/train': 0.7822781503200531} 01/27/2022 06:36:56 - INFO - codeparrot_training - Step 11445: {'lr': 0.00045373454569118166, 
'samples': 2197632, 'steps': 11445, 'loss/train': 0.1779763475060463} 01/27/2022 06:36:59 - INFO - codeparrot_training - Step 11446: {'lr': 0.0004537250624246255, 'samples': 2197824, 'steps': 11446, 'loss/train': 0.8004272282123566} 01/27/2022 06:37:02 - INFO - codeparrot_training - Step 11447: {'lr': 0.00045371557828537585, 'samples': 2198016, 'steps': 11447, 'loss/train': 0.9370399117469788} 01/27/2022 06:37:07 - INFO - codeparrot_training - Step 11448: {'lr': 0.0004537060932734733, 'samples': 2198208, 'steps': 11448, 'loss/train': 1.3640542924404144} 01/27/2022 06:37:10 - INFO - codeparrot_training - Step 11449: {'lr': 0.0004536966073889587, 'samples': 2198400, 'steps': 11449, 'loss/train': 0.6002101600170135} 01/27/2022 06:37:13 - INFO - codeparrot_training - Step 11450: {'lr': 0.00045368712063187237, 'samples': 2198592, 'steps': 11450, 'loss/train': 0.9178058803081512} 01/27/2022 06:37:17 - INFO - codeparrot_training - Step 11451: {'lr': 0.0004536776330022552, 'samples': 2198784, 'steps': 11451, 'loss/train': 1.1440359950065613} 01/27/2022 06:37:20 - INFO - codeparrot_training - Step 11452: {'lr': 0.0004536681445001476, 'samples': 2198976, 'steps': 11452, 'loss/train': 0.4949653297662735} 01/27/2022 06:37:23 - INFO - codeparrot_training - Step 11453: {'lr': 0.0004536586551255904, 'samples': 2199168, 'steps': 11453, 'loss/train': 1.0318913161754608} 01/27/2022 06:37:26 - INFO - codeparrot_training - Step 11454: {'lr': 0.0004536491648786242, 'samples': 2199360, 'steps': 11454, 'loss/train': 1.0128153562545776} 01/27/2022 06:37:29 - INFO - codeparrot_training - Step 11455: {'lr': 0.0004536396737592896, 'samples': 2199552, 'steps': 11455, 'loss/train': 0.6797840148210526} 01/27/2022 06:37:32 - INFO - codeparrot_training - Step 11456: {'lr': 0.0004536301817676274, 'samples': 2199744, 'steps': 11456, 'loss/train': 0.7101286500692368} 01/27/2022 06:37:37 - INFO - codeparrot_training - Step 11457: {'lr': 0.00045362068890367804, 'samples': 2199936, 'steps': 11457, 'loss/train': 0.8031945526599884} 01/27/2022 06:37:40 - INFO - codeparrot_training - Step 11458: {'lr': 0.0004536111951674824, 'samples': 2200128, 'steps': 11458, 'loss/train': 0.6240772604942322} 01/27/2022 06:37:43 - INFO - codeparrot_training - Step 11459: {'lr': 0.000453601700559081, 'samples': 2200320, 'steps': 11459, 'loss/train': 0.5723814368247986} 01/27/2022 06:37:46 - INFO - codeparrot_training - Step 11460: {'lr': 0.00045359220507851456, 'samples': 2200512, 'steps': 11460, 'loss/train': 1.6904123425483704} 01/27/2022 06:37:49 - INFO - codeparrot_training - Step 11461: {'lr': 0.0004535827087258238, 'samples': 2200704, 'steps': 11461, 'loss/train': 0.8626556396484375} 01/27/2022 06:37:52 - INFO - codeparrot_training - Step 11462: {'lr': 0.00045357321150104934, 'samples': 2200896, 'steps': 11462, 'loss/train': 1.122241884469986} 01/27/2022 06:37:56 - INFO - codeparrot_training - Step 11463: {'lr': 0.0004535637134042319, 'samples': 2201088, 'steps': 11463, 'loss/train': 0.9362962245941162} 01/27/2022 06:37:59 - INFO - codeparrot_training - Step 11464: {'lr': 0.00045355421443541214, 'samples': 2201280, 'steps': 11464, 'loss/train': 1.0156200528144836} 01/27/2022 06:38:02 - INFO - codeparrot_training - Step 11465: {'lr': 0.00045354471459463076, 'samples': 2201472, 'steps': 11465, 'loss/train': 1.2609115540981293} 01/27/2022 06:38:07 - INFO - codeparrot_training - Step 11466: {'lr': 0.0004535352138819284, 'samples': 2201664, 'steps': 11466, 'loss/train': 0.9106815755367279} 01/27/2022 06:38:10 - INFO - codeparrot_training - Step 
11467: {'lr': 0.0004535257122973459, 'samples': 2201856, 'steps': 11467, 'loss/train': 1.0085870325565338} 01/27/2022 06:38:13 - INFO - codeparrot_training - Step 11468: {'lr': 0.0004535162098409238, 'samples': 2202048, 'steps': 11468, 'loss/train': 1.192331224679947} 01/27/2022 06:38:17 - INFO - codeparrot_training - Step 11469: {'lr': 0.000453506706512703, 'samples': 2202240, 'steps': 11469, 'loss/train': 1.999705195426941} 01/27/2022 06:38:20 - INFO - codeparrot_training - Step 11470: {'lr': 0.00045349720231272395, 'samples': 2202432, 'steps': 11470, 'loss/train': 0.7169463336467743} 01/27/2022 06:38:23 - INFO - codeparrot_training - Step 11471: {'lr': 0.0004534876972410276, 'samples': 2202624, 'steps': 11471, 'loss/train': 1.1027265787124634} 01/27/2022 06:38:26 - INFO - codeparrot_training - Step 11472: {'lr': 0.0004534781912976545, 'samples': 2202816, 'steps': 11472, 'loss/train': 0.9608303010463715} 01/27/2022 06:38:29 - INFO - codeparrot_training - Step 11473: {'lr': 0.00045346868448264553, 'samples': 2203008, 'steps': 11473, 'loss/train': 0.5129726082086563} 01/27/2022 06:38:32 - INFO - codeparrot_training - Step 11474: {'lr': 0.00045345917679604126, 'samples': 2203200, 'steps': 11474, 'loss/train': 0.7580524384975433} 01/27/2022 06:38:37 - INFO - codeparrot_training - Step 11475: {'lr': 0.0004534496682378825, 'samples': 2203392, 'steps': 11475, 'loss/train': 0.06549962237477303} 01/27/2022 06:38:40 - INFO - codeparrot_training - Step 11476: {'lr': 0.00045344015880821, 'samples': 2203584, 'steps': 11476, 'loss/train': 1.0683825016021729} 01/27/2022 06:38:43 - INFO - codeparrot_training - Step 11477: {'lr': 0.0004534306485070644, 'samples': 2203776, 'steps': 11477, 'loss/train': 0.9582383036613464} 01/27/2022 06:38:46 - INFO - codeparrot_training - Step 11478: {'lr': 0.0004534211373344864, 'samples': 2203968, 'steps': 11478, 'loss/train': 0.2581598460674286} 01/27/2022 06:38:49 - INFO - codeparrot_training - Step 11479: {'lr': 0.00045341162529051704, 'samples': 2204160, 'steps': 11479, 'loss/train': 0.9248396158218384} 01/27/2022 06:38:53 - INFO - codeparrot_training - Step 11480: {'lr': 0.0004534021123751968, 'samples': 2204352, 'steps': 11480, 'loss/train': 0.5699222087860107} 01/27/2022 06:38:56 - INFO - codeparrot_training - Step 11481: {'lr': 0.0004533925985885664, 'samples': 2204544, 'steps': 11481, 'loss/train': 0.708722323179245} 01/27/2022 06:38:59 - INFO - codeparrot_training - Step 11482: {'lr': 0.00045338308393066685, 'samples': 2204736, 'steps': 11482, 'loss/train': 0.569604679942131} 01/27/2022 06:39:02 - INFO - codeparrot_training - Step 11483: {'lr': 0.00045337356840153864, 'samples': 2204928, 'steps': 11483, 'loss/train': 0.7031693458557129} 01/27/2022 06:39:07 - INFO - codeparrot_training - Step 11484: {'lr': 0.00045336405200122266, 'samples': 2205120, 'steps': 11484, 'loss/train': 1.1933955252170563} 01/27/2022 06:39:10 - INFO - codeparrot_training - Step 11485: {'lr': 0.0004533545347297597, 'samples': 2205312, 'steps': 11485, 'loss/train': 1.0159386098384857} 01/27/2022 06:39:14 - INFO - codeparrot_training - Step 11486: {'lr': 0.0004533450165871904, 'samples': 2205504, 'steps': 11486, 'loss/train': 1.0342833995819092} 01/27/2022 06:39:17 - INFO - codeparrot_training - Step 11487: {'lr': 0.00045333549757355573, 'samples': 2205696, 'steps': 11487, 'loss/train': 0.6280984282493591} 01/27/2022 06:39:20 - INFO - codeparrot_training - Step 11488: {'lr': 0.0004533259776888963, 'samples': 2205888, 'steps': 11488, 'loss/train': 0.8527387976646423} 01/27/2022 06:39:23 - 
INFO - codeparrot_training - Step 11489: {'lr': 0.00045331645693325295, 'samples': 2206080, 'steps': 11489, 'loss/train': 1.0946163833141327} 01/27/2022 06:39:26 - INFO - codeparrot_training - Step 11490: {'lr': 0.0004533069353066664, 'samples': 2206272, 'steps': 11490, 'loss/train': 1.3616521060466766} 01/27/2022 06:39:29 - INFO - codeparrot_training - Step 11491: {'lr': 0.0004532974128091776, 'samples': 2206464, 'steps': 11491, 'loss/train': 1.1966636180877686} 01/27/2022 06:39:34 - INFO - codeparrot_training - Step 11492: {'lr': 0.00045328788944082717, 'samples': 2206656, 'steps': 11492, 'loss/train': 0.9138822555541992} 01/27/2022 06:39:37 - INFO - codeparrot_training - Step 11493: {'lr': 0.000453278365201656, 'samples': 2206848, 'steps': 11493, 'loss/train': 1.0335793197154999} 01/27/2022 06:39:40 - INFO - codeparrot_training - Step 11494: {'lr': 0.00045326884009170486, 'samples': 2207040, 'steps': 11494, 'loss/train': 0.6006902754306793} 01/27/2022 06:39:43 - INFO - codeparrot_training - Step 11495: {'lr': 0.0004532593141110145, 'samples': 2207232, 'steps': 11495, 'loss/train': 0.8531838655471802} 01/27/2022 06:39:46 - INFO - codeparrot_training - Step 11496: {'lr': 0.00045324978725962584, 'samples': 2207424, 'steps': 11496, 'loss/train': 1.1672607958316803} 01/27/2022 06:39:49 - INFO - codeparrot_training - Step 11497: {'lr': 0.0004532402595375796, 'samples': 2207616, 'steps': 11497, 'loss/train': 0.9625104367733002} 01/27/2022 06:39:52 - INFO - codeparrot_training - Step 11498: {'lr': 0.0004532307309449167, 'samples': 2207808, 'steps': 11498, 'loss/train': 1.0233373045921326} 01/27/2022 06:39:56 - INFO - codeparrot_training - Step 11499: {'lr': 0.00045322120148167777, 'samples': 2208000, 'steps': 11499, 'loss/train': 0.3175632432103157} 01/27/2022 06:39:59 - INFO - codeparrot_training - Step 11500: {'lr': 0.0004532116711479038, 'samples': 2208192, 'steps': 11500, 'loss/train': 1.0011613368988037} 01/27/2022 06:40:03 - INFO - codeparrot_training - Step 11501: {'lr': 0.00045320213994363555, 'samples': 2208384, 'steps': 11501, 'loss/train': 0.9698614776134491} 01/27/2022 06:40:06 - INFO - codeparrot_training - Step 11502: {'lr': 0.00045319260786891394, 'samples': 2208576, 'steps': 11502, 'loss/train': 0.5987488925457001} 01/27/2022 06:40:10 - INFO - codeparrot_training - Step 11503: {'lr': 0.0004531830749237796, 'samples': 2208768, 'steps': 11503, 'loss/train': 0.7506066262722015} 01/27/2022 06:40:13 - INFO - codeparrot_training - Step 11504: {'lr': 0.00045317354110827344, 'samples': 2208960, 'steps': 11504, 'loss/train': 0.6163538843393326} 01/27/2022 06:40:16 - INFO - codeparrot_training - Step 11505: {'lr': 0.0004531640064224365, 'samples': 2209152, 'steps': 11505, 'loss/train': 0.6267715394496918} 01/27/2022 06:40:19 - INFO - codeparrot_training - Step 11506: {'lr': 0.00045315447086630937, 'samples': 2209344, 'steps': 11506, 'loss/train': 0.7373014390468597} 01/27/2022 06:40:22 - INFO - codeparrot_training - Step 11507: {'lr': 0.000453144934439933, 'samples': 2209536, 'steps': 11507, 'loss/train': 0.658080518245697} 01/27/2022 06:40:25 - INFO - codeparrot_training - Step 11508: {'lr': 0.0004531353971433483, 'samples': 2209728, 'steps': 11508, 'loss/train': 0.8170672953128815} 01/27/2022 06:40:28 - INFO - codeparrot_training - Step 11509: {'lr': 0.000453125858976596, 'samples': 2209920, 'steps': 11509, 'loss/train': 0.6527111828327179} 01/27/2022 06:40:33 - INFO - codeparrot_training - Step 11510: {'lr': 0.000453116319939717, 'samples': 2210112, 'steps': 11510, 'loss/train': 
0.8411539793014526} 01/27/2022 06:40:36 - INFO - codeparrot_training - Step 11511: {'lr': 0.0004531067800327523, 'samples': 2210304, 'steps': 11511, 'loss/train': 1.6081321835517883} 01/27/2022 06:40:39 - INFO - codeparrot_training - Step 11512: {'lr': 0.0004530972392557425, 'samples': 2210496, 'steps': 11512, 'loss/train': 0.5396456569433212} 01/27/2022 06:40:42 - INFO - codeparrot_training - Step 11513: {'lr': 0.0004530876976087288, 'samples': 2210688, 'steps': 11513, 'loss/train': 1.1236506700515747} 01/27/2022 06:40:45 - INFO - codeparrot_training - Step 11514: {'lr': 0.00045307815509175177, 'samples': 2210880, 'steps': 11514, 'loss/train': 0.7360989153385162} 01/27/2022 06:40:49 - INFO - codeparrot_training - Step 11515: {'lr': 0.00045306861170485235, 'samples': 2211072, 'steps': 11515, 'loss/train': 1.0350320935249329} 01/27/2022 06:40:52 - INFO - codeparrot_training - Step 11516: {'lr': 0.00045305906744807156, 'samples': 2211264, 'steps': 11516, 'loss/train': 0.8009562492370605} 01/27/2022 06:40:55 - INFO - codeparrot_training - Step 11517: {'lr': 0.0004530495223214502, 'samples': 2211456, 'steps': 11517, 'loss/train': 0.6608315706253052} 01/27/2022 06:40:58 - INFO - codeparrot_training - Step 11518: {'lr': 0.00045303997632502915, 'samples': 2211648, 'steps': 11518, 'loss/train': 1.2446990311145782} 01/27/2022 06:41:03 - INFO - codeparrot_training - Step 11519: {'lr': 0.00045303042945884933, 'samples': 2211840, 'steps': 11519, 'loss/train': 0.8030176162719727} 01/27/2022 06:41:07 - INFO - codeparrot_training - Step 11520: {'lr': 0.0004530208817229516, 'samples': 2212032, 'steps': 11520, 'loss/train': 0.9598545134067535} 01/27/2022 06:41:10 - INFO - codeparrot_training - Step 11521: {'lr': 0.00045301133311737685, 'samples': 2212224, 'steps': 11521, 'loss/train': 0.645999401807785} 01/27/2022 06:41:13 - INFO - codeparrot_training - Step 11522: {'lr': 0.00045300178364216605, 'samples': 2212416, 'steps': 11522, 'loss/train': 0.6166174113750458} 01/27/2022 06:41:16 - INFO - codeparrot_training - Step 11523: {'lr': 0.00045299223329736004, 'samples': 2212608, 'steps': 11523, 'loss/train': 0.8580339252948761} 01/27/2022 06:41:19 - INFO - codeparrot_training - Step 11524: {'lr': 0.00045298268208299983, 'samples': 2212800, 'steps': 11524, 'loss/train': 0.7064031511545181} 01/27/2022 06:41:22 - INFO - codeparrot_training - Step 11525: {'lr': 0.0004529731299991262, 'samples': 2212992, 'steps': 11525, 'loss/train': 0.9971950650215149} 01/27/2022 06:41:25 - INFO - codeparrot_training - Step 11526: {'lr': 0.00045296357704578016, 'samples': 2213184, 'steps': 11526, 'loss/train': 1.0753525793552399} 01/27/2022 06:41:30 - INFO - codeparrot_training - Step 11527: {'lr': 0.0004529540232230026, 'samples': 2213376, 'steps': 11527, 'loss/train': 0.6140541136264801} 01/27/2022 06:41:33 - INFO - codeparrot_training - Step 11528: {'lr': 0.00045294446853083446, 'samples': 2213568, 'steps': 11528, 'loss/train': 1.154639482498169} 01/27/2022 06:41:36 - INFO - codeparrot_training - Step 11529: {'lr': 0.0004529349129693166, 'samples': 2213760, 'steps': 11529, 'loss/train': 0.9292485415935516} 01/27/2022 06:41:39 - INFO - codeparrot_training - Step 11530: {'lr': 0.0004529253565384901, 'samples': 2213952, 'steps': 11530, 'loss/train': 1.2256974577903748} 01/27/2022 06:41:42 - INFO - codeparrot_training - Step 11531: {'lr': 0.00045291579923839576, 'samples': 2214144, 'steps': 11531, 'loss/train': 1.1545886099338531} 01/27/2022 06:41:45 - INFO - codeparrot_training - Step 11532: {'lr': 0.0004529062410690745, 
'samples': 2214336, 'steps': 11532, 'loss/train': 0.7180290520191193} 01/27/2022 06:41:49 - INFO - codeparrot_training - Step 11533: {'lr': 0.00045289668203056743, 'samples': 2214528, 'steps': 11533, 'loss/train': 0.803458034992218} 01/27/2022 06:41:52 - INFO - codeparrot_training - Step 11534: {'lr': 0.00045288712212291537, 'samples': 2214720, 'steps': 11534, 'loss/train': 0.7936736047267914} 01/27/2022 06:41:55 - INFO - codeparrot_training - Step 11535: {'lr': 0.0004528775613461593, 'samples': 2214912, 'steps': 11535, 'loss/train': 0.5076494067907333} 01/27/2022 06:41:59 - INFO - codeparrot_training - Step 11536: {'lr': 0.0004528679997003403, 'samples': 2215104, 'steps': 11536, 'loss/train': 0.8206999897956848} 01/27/2022 06:42:03 - INFO - codeparrot_training - Step 11537: {'lr': 0.000452858437185499, 'samples': 2215296, 'steps': 11537, 'loss/train': 0.8972380757331848} 01/27/2022 06:42:06 - INFO - codeparrot_training - Step 11538: {'lr': 0.00045284887380167674, 'samples': 2215488, 'steps': 11538, 'loss/train': 0.7665487825870514} 01/27/2022 06:42:09 - INFO - codeparrot_training - Step 11539: {'lr': 0.0004528393095489142, 'samples': 2215680, 'steps': 11539, 'loss/train': 0.96059650182724} 01/27/2022 06:42:12 - INFO - codeparrot_training - Step 11540: {'lr': 0.0004528297444272525, 'samples': 2215872, 'steps': 11540, 'loss/train': 0.8950610160827637} 01/27/2022 06:42:15 - INFO - codeparrot_training - Step 11541: {'lr': 0.0004528201784367326, 'samples': 2216064, 'steps': 11541, 'loss/train': 0.567240759730339} 01/27/2022 06:42:18 - INFO - codeparrot_training - Step 11542: {'lr': 0.00045281061157739544, 'samples': 2216256, 'steps': 11542, 'loss/train': 0.587341234087944} 01/27/2022 06:42:21 - INFO - codeparrot_training - Step 11543: {'lr': 0.000452801043849282, 'samples': 2216448, 'steps': 11543, 'loss/train': 0.9143921434879303} 01/27/2022 06:42:25 - INFO - codeparrot_training - Step 11544: {'lr': 0.00045279147525243335, 'samples': 2216640, 'steps': 11544, 'loss/train': 1.0305374264717102} 01/27/2022 06:42:30 - INFO - codeparrot_training - Step 11545: {'lr': 0.0004527819057868904, 'samples': 2216832, 'steps': 11545, 'loss/train': 0.6625239551067352} 01/27/2022 06:42:33 - INFO - codeparrot_training - Step 11546: {'lr': 0.00045277233545269415, 'samples': 2217024, 'steps': 11546, 'loss/train': 0.4786856174468994} 01/27/2022 06:42:36 - INFO - codeparrot_training - Step 11547: {'lr': 0.00045276276424988554, 'samples': 2217216, 'steps': 11547, 'loss/train': 1.000669151544571} 01/27/2022 06:42:39 - INFO - codeparrot_training - Step 11548: {'lr': 0.0004527531921785057, 'samples': 2217408, 'steps': 11548, 'loss/train': 1.0173732340335846} 01/27/2022 06:42:42 - INFO - codeparrot_training - Step 11549: {'lr': 0.00045274361923859554, 'samples': 2217600, 'steps': 11549, 'loss/train': 0.2245052084326744} 01/27/2022 06:42:45 - INFO - codeparrot_training - Step 11550: {'lr': 0.0004527340454301961, 'samples': 2217792, 'steps': 11550, 'loss/train': 1.081914871931076} 01/27/2022 06:42:49 - INFO - codeparrot_training - Step 11551: {'lr': 0.0004527244707533483, 'samples': 2217984, 'steps': 11551, 'loss/train': 1.2453157603740692} 01/27/2022 06:42:52 - INFO - codeparrot_training - Step 11552: {'lr': 0.00045271489520809337, 'samples': 2218176, 'steps': 11552, 'loss/train': 0.8348513245582581} 01/27/2022 06:42:55 - INFO - codeparrot_training - Step 11553: {'lr': 0.0004527053187944722, 'samples': 2218368, 'steps': 11553, 'loss/train': 0.6230732649564743} 01/27/2022 06:42:59 - INFO - codeparrot_training - Step 
11554: {'lr': 0.00045269574151252567, 'samples': 2218560, 'steps': 11554, 'loss/train': 0.6504286229610443} 01/27/2022 06:43:03 - INFO - codeparrot_training - Step 11555: {'lr': 0.00045268616336229504, 'samples': 2218752, 'steps': 11555, 'loss/train': 0.44005103409290314} 01/27/2022 06:43:06 - INFO - codeparrot_training - Step 11556: {'lr': 0.0004526765843438213, 'samples': 2218944, 'steps': 11556, 'loss/train': 0.9773183763027191} 01/27/2022 06:43:09 - INFO - codeparrot_training - Step 11557: {'lr': 0.0004526670044571454, 'samples': 2219136, 'steps': 11557, 'loss/train': 1.2604984045028687} 01/27/2022 06:43:12 - INFO - codeparrot_training - Step 11558: {'lr': 0.00045265742370230835, 'samples': 2219328, 'steps': 11558, 'loss/train': 1.3381958305835724} 01/27/2022 06:43:15 - INFO - codeparrot_training - Step 11559: {'lr': 0.00045264784207935127, 'samples': 2219520, 'steps': 11559, 'loss/train': 1.0848155915737152} 01/27/2022 06:43:18 - INFO - codeparrot_training - Step 11560: {'lr': 0.0004526382595883152, 'samples': 2219712, 'steps': 11560, 'loss/train': 0.8420762121677399} 01/27/2022 06:43:21 - INFO - codeparrot_training - Step 11561: {'lr': 0.0004526286762292411, 'samples': 2219904, 'steps': 11561, 'loss/train': 0.867650181055069} 01/27/2022 06:43:25 - INFO - codeparrot_training - Step 11562: {'lr': 0.00045261909200217023, 'samples': 2220096, 'steps': 11562, 'loss/train': 1.0030976235866547} 01/27/2022 06:43:30 - INFO - codeparrot_training - Step 11563: {'lr': 0.0004526095069071434, 'samples': 2220288, 'steps': 11563, 'loss/train': 1.0747171640396118} 01/27/2022 06:43:33 - INFO - codeparrot_training - Step 11564: {'lr': 0.0004525999209442018, 'samples': 2220480, 'steps': 11564, 'loss/train': 0.6480318903923035} 01/27/2022 06:43:36 - INFO - codeparrot_training - Step 11565: {'lr': 0.0004525903341133865, 'samples': 2220672, 'steps': 11565, 'loss/train': 0.9619384109973907} 01/27/2022 06:43:39 - INFO - codeparrot_training - Step 11566: {'lr': 0.0004525807464147385, 'samples': 2220864, 'steps': 11566, 'loss/train': 1.1204120814800262} 01/27/2022 06:43:42 - INFO - codeparrot_training - Step 11567: {'lr': 0.00045257115784829897, 'samples': 2221056, 'steps': 11567, 'loss/train': 1.133689284324646} 01/27/2022 06:43:45 - INFO - codeparrot_training - Step 11568: {'lr': 0.00045256156841410884, 'samples': 2221248, 'steps': 11568, 'loss/train': 0.46293117105960846} 01/27/2022 06:43:49 - INFO - codeparrot_training - Step 11569: {'lr': 0.0004525519781122093, 'samples': 2221440, 'steps': 11569, 'loss/train': 0.7499113082885742} 01/27/2022 06:43:52 - INFO - codeparrot_training - Step 11570: {'lr': 0.00045254238694264145, 'samples': 2221632, 'steps': 11570, 'loss/train': 0.7765835523605347} 01/27/2022 06:43:55 - INFO - codeparrot_training - Step 11571: {'lr': 0.00045253279490544627, 'samples': 2221824, 'steps': 11571, 'loss/train': 0.697838231921196} 01/27/2022 06:43:59 - INFO - codeparrot_training - Step 11572: {'lr': 0.0004525232020006649, 'samples': 2222016, 'steps': 11572, 'loss/train': 0.9850761294364929} 01/27/2022 06:44:02 - INFO - codeparrot_training - Step 11573: {'lr': 0.00045251360822833855, 'samples': 2222208, 'steps': 11573, 'loss/train': 0.562737375497818} 01/27/2022 06:44:06 - INFO - codeparrot_training - Step 11574: {'lr': 0.00045250401358850814, 'samples': 2222400, 'steps': 11574, 'loss/train': 0.5420144498348236} 01/27/2022 06:44:09 - INFO - codeparrot_training - Step 11575: {'lr': 0.00045249441808121484, 'samples': 2222592, 'steps': 11575, 'loss/train': 0.3634616509079933} 01/27/2022 
06:44:12 - INFO - codeparrot_training - Step 11576: {'lr': 0.0004524848217064997, 'samples': 2222784, 'steps': 11576, 'loss/train': 0.35281457751989365} 01/27/2022 06:44:15 - INFO - codeparrot_training - Step 11577: {'lr': 0.0004524752244644039, 'samples': 2222976, 'steps': 11577, 'loss/train': 0.7471393346786499} 01/27/2022 06:44:18 - INFO - codeparrot_training - Step 11578: {'lr': 0.0004524656263549686, 'samples': 2223168, 'steps': 11578, 'loss/train': 0.7435552328824997} 01/27/2022 06:44:21 - INFO - codeparrot_training - Step 11579: {'lr': 0.0004524560273782348, 'samples': 2223360, 'steps': 11579, 'loss/train': 0.9223426580429077} 01/27/2022 06:44:24 - INFO - codeparrot_training - Step 11580: {'lr': 0.00045244642753424364, 'samples': 2223552, 'steps': 11580, 'loss/train': 0.6533783376216888} 01/27/2022 06:44:29 - INFO - codeparrot_training - Step 11581: {'lr': 0.0004524368268230363, 'samples': 2223744, 'steps': 11581, 'loss/train': 0.8324856162071228} 01/27/2022 06:44:32 - INFO - codeparrot_training - Step 11582: {'lr': 0.00045242722524465386, 'samples': 2223936, 'steps': 11582, 'loss/train': 0.7371080964803696} 01/27/2022 06:44:35 - INFO - codeparrot_training - Step 11583: {'lr': 0.00045241762279913745, 'samples': 2224128, 'steps': 11583, 'loss/train': 0.6088697612285614} 01/27/2022 06:44:38 - INFO - codeparrot_training - Step 11584: {'lr': 0.0004524080194865283, 'samples': 2224320, 'steps': 11584, 'loss/train': 0.8454663455486298} 01/27/2022 06:44:41 - INFO - codeparrot_training - Step 11585: {'lr': 0.00045239841530686736, 'samples': 2224512, 'steps': 11585, 'loss/train': 1.0174543261528015} 01/27/2022 06:44:45 - INFO - codeparrot_training - Step 11586: {'lr': 0.000452388810260196, 'samples': 2224704, 'steps': 11586, 'loss/train': 0.7122825533151627} 01/27/2022 06:44:48 - INFO - codeparrot_training - Step 11587: {'lr': 0.0004523792043465551, 'samples': 2224896, 'steps': 11587, 'loss/train': 0.8296945095062256} 01/27/2022 06:44:51 - INFO - codeparrot_training - Step 11588: {'lr': 0.00045236959756598605, 'samples': 2225088, 'steps': 11588, 'loss/train': 0.8292447924613953} 01/27/2022 06:44:56 - INFO - codeparrot_training - Step 11589: {'lr': 0.0004523599899185299, 'samples': 2225280, 'steps': 11589, 'loss/train': 1.092229038476944} 01/27/2022 06:45:00 - INFO - codeparrot_training - Step 11590: {'lr': 0.0004523503814042277, 'samples': 2225472, 'steps': 11590, 'loss/train': 0.9228117763996124} 01/27/2022 06:45:03 - INFO - codeparrot_training - Step 11591: {'lr': 0.00045234077202312086, 'samples': 2225664, 'steps': 11591, 'loss/train': 0.9257749915122986} 01/27/2022 06:45:06 - INFO - codeparrot_training - Step 11592: {'lr': 0.00045233116177525036, 'samples': 2225856, 'steps': 11592, 'loss/train': 1.084083080291748} 01/27/2022 06:45:09 - INFO - codeparrot_training - Step 11593: {'lr': 0.00045232155066065737, 'samples': 2226048, 'steps': 11593, 'loss/train': 0.9684381186962128} 01/27/2022 06:45:12 - INFO - codeparrot_training - Step 11594: {'lr': 0.00045231193867938314, 'samples': 2226240, 'steps': 11594, 'loss/train': 0.9986497163772583} 01/27/2022 06:45:15 - INFO - codeparrot_training - Step 11595: {'lr': 0.0004523023258314688, 'samples': 2226432, 'steps': 11595, 'loss/train': 0.8646848201751709} 01/27/2022 06:45:18 - INFO - codeparrot_training - Step 11596: {'lr': 0.00045229271211695554, 'samples': 2226624, 'steps': 11596, 'loss/train': 0.21237869560718536} 01/27/2022 06:45:22 - INFO - codeparrot_training - Step 11597: {'lr': 0.00045228309753588447, 'samples': 2226816, 'steps': 11597, 
'loss/train': 0.19402959197759628} 01/27/2022 06:45:26 - INFO - codeparrot_training - Step 11598: {'lr': 0.0004522734820882969, 'samples': 2227008, 'steps': 11598, 'loss/train': 0.5256506949663162} 01/27/2022 06:45:29 - INFO - codeparrot_training - Step 11599: {'lr': 0.00045226386577423394, 'samples': 2227200, 'steps': 11599, 'loss/train': 0.4201370179653168} 01/27/2022 06:45:32 - INFO - codeparrot_training - Step 11600: {'lr': 0.0004522542485937369, 'samples': 2227392, 'steps': 11600, 'loss/train': 2.0068910121917725} 01/27/2022 06:45:35 - INFO - codeparrot_training - Step 11601: {'lr': 0.0004522446305468468, 'samples': 2227584, 'steps': 11601, 'loss/train': 0.9315994083881378} 01/27/2022 06:45:39 - INFO - codeparrot_training - Step 11602: {'lr': 0.00045223501163360494, 'samples': 2227776, 'steps': 11602, 'loss/train': 1.0398103594779968} 01/27/2022 06:45:42 - INFO - codeparrot_training - Step 11603: {'lr': 0.0004522253918540524, 'samples': 2227968, 'steps': 11603, 'loss/train': 0.33701491355895996} 01/27/2022 06:45:45 - INFO - codeparrot_training - Step 11604: {'lr': 0.00045221577120823064, 'samples': 2228160, 'steps': 11604, 'loss/train': 0.9511698782444} 01/27/2022 06:45:48 - INFO - codeparrot_training - Step 11605: {'lr': 0.00045220614969618066, 'samples': 2228352, 'steps': 11605, 'loss/train': 0.9692080914974213} 01/27/2022 06:45:51 - INFO - codeparrot_training - Step 11606: {'lr': 0.0004521965273179438, 'samples': 2228544, 'steps': 11606, 'loss/train': 0.7573259174823761} 01/27/2022 06:45:55 - INFO - codeparrot_training - Step 11607: {'lr': 0.00045218690407356117, 'samples': 2228736, 'steps': 11607, 'loss/train': 0.7648589909076691} 01/27/2022 06:45:59 - INFO - codeparrot_training - Step 11608: {'lr': 0.00045217727996307405, 'samples': 2228928, 'steps': 11608, 'loss/train': 1.2033653855323792} 01/27/2022 06:46:02 - INFO - codeparrot_training - Step 11609: {'lr': 0.0004521676549865237, 'samples': 2229120, 'steps': 11609, 'loss/train': 0.3614354655146599} 01/27/2022 06:46:05 - INFO - codeparrot_training - Step 11610: {'lr': 0.0004521580291439513, 'samples': 2229312, 'steps': 11610, 'loss/train': 0.9941319823265076} 01/27/2022 06:46:08 - INFO - codeparrot_training - Step 11611: {'lr': 0.00045214840243539803, 'samples': 2229504, 'steps': 11611, 'loss/train': 0.5513012856245041} 01/27/2022 06:46:11 - INFO - codeparrot_training - Step 11612: {'lr': 0.00045213877486090524, 'samples': 2229696, 'steps': 11612, 'loss/train': 1.3595267236232758} 01/27/2022 06:46:14 - INFO - codeparrot_training - Step 11613: {'lr': 0.0004521291464205141, 'samples': 2229888, 'steps': 11613, 'loss/train': 1.1000244319438934} 01/27/2022 06:46:17 - INFO - codeparrot_training - Step 11614: {'lr': 0.0004521195171142659, 'samples': 2230080, 'steps': 11614, 'loss/train': 0.6784709841012955} 01/27/2022 06:46:21 - INFO - codeparrot_training - Step 11615: {'lr': 0.0004521098869422019, 'samples': 2230272, 'steps': 11615, 'loss/train': 0.7418342381715775} 01/27/2022 06:46:25 - INFO - codeparrot_training - Step 11616: {'lr': 0.00045210025590436333, 'samples': 2230464, 'steps': 11616, 'loss/train': 0.49397604167461395} 01/27/2022 06:46:28 - INFO - codeparrot_training - Step 11617: {'lr': 0.00045209062400079135, 'samples': 2230656, 'steps': 11617, 'loss/train': 0.5719863921403885} 01/27/2022 06:46:31 - INFO - codeparrot_training - Step 11618: {'lr': 0.00045208099123152735, 'samples': 2230848, 'steps': 11618, 'loss/train': 0.3843739181756973} 01/27/2022 06:46:34 - INFO - codeparrot_training - Step 11619: {'lr': 
0.00045207135759661255, 'samples': 2231040, 'steps': 11619, 'loss/train': 0.5754173845052719} 01/27/2022 06:46:38 - INFO - codeparrot_training - Step 11620: {'lr': 0.0004520617230960883, 'samples': 2231232, 'steps': 11620, 'loss/train': 0.6298675686120987} 01/27/2022 06:46:41 - INFO - codeparrot_training - Step 11621: {'lr': 0.0004520520877299957, 'samples': 2231424, 'steps': 11621, 'loss/train': 0.5724048167467117} 01/27/2022 06:46:44 - INFO - codeparrot_training - Step 11622: {'lr': 0.00045204245149837606, 'samples': 2231616, 'steps': 11622, 'loss/train': 1.0385315716266632} 01/27/2022 06:46:47 - INFO - codeparrot_training - Step 11623: {'lr': 0.00045203281440127087, 'samples': 2231808, 'steps': 11623, 'loss/train': 0.47346358001232147} 01/27/2022 06:46:50 - INFO - codeparrot_training - Step 11624: {'lr': 0.00045202317643872113, 'samples': 2232000, 'steps': 11624, 'loss/train': 0.7625638246536255} 01/27/2022 06:46:56 - INFO - codeparrot_training - Step 11625: {'lr': 0.0004520135376107683, 'samples': 2232192, 'steps': 11625, 'loss/train': 0.8922566771507263} 01/27/2022 06:46:59 - INFO - codeparrot_training - Step 11626: {'lr': 0.00045200389791745364, 'samples': 2232384, 'steps': 11626, 'loss/train': 0.9779039025306702} 01/27/2022 06:47:02 - INFO - codeparrot_training - Step 11627: {'lr': 0.0004519942573588184, 'samples': 2232576, 'steps': 11627, 'loss/train': 0.6746272891759872} 01/27/2022 06:47:05 - INFO - codeparrot_training - Step 11628: {'lr': 0.00045198461593490394, 'samples': 2232768, 'steps': 11628, 'loss/train': 1.1872599720954895} 01/27/2022 06:47:08 - INFO - codeparrot_training - Step 11629: {'lr': 0.0004519749736457515, 'samples': 2232960, 'steps': 11629, 'loss/train': 0.4356771558523178} 01/27/2022 06:47:11 - INFO - codeparrot_training - Step 11630: {'lr': 0.00045196533049140234, 'samples': 2233152, 'steps': 11630, 'loss/train': 1.0100825428962708} 01/27/2022 06:47:14 - INFO - codeparrot_training - Step 11631: {'lr': 0.0004519556864718979, 'samples': 2233344, 'steps': 11631, 'loss/train': 0.8484320640563965} 01/27/2022 06:47:18 - INFO - codeparrot_training - Step 11632: {'lr': 0.00045194604158727936, 'samples': 2233536, 'steps': 11632, 'loss/train': 1.2696498334407806} 01/27/2022 06:47:22 - INFO - codeparrot_training - Step 11633: {'lr': 0.0004519363958375882, 'samples': 2233728, 'steps': 11633, 'loss/train': 0.29233087599277496} 01/27/2022 06:47:25 - INFO - codeparrot_training - Step 11634: {'lr': 0.00045192674922286556, 'samples': 2233920, 'steps': 11634, 'loss/train': 1.1091161370277405} 01/27/2022 06:47:28 - INFO - codeparrot_training - Step 11635: {'lr': 0.00045191710174315294, 'samples': 2234112, 'steps': 11635, 'loss/train': 0.40861286222934723} 01/27/2022 06:47:31 - INFO - codeparrot_training - Step 11636: {'lr': 0.0004519074533984915, 'samples': 2234304, 'steps': 11636, 'loss/train': 0.9793251156806946} 01/27/2022 06:47:35 - INFO - codeparrot_training - Step 11637: {'lr': 0.0004518978041889227, 'samples': 2234496, 'steps': 11637, 'loss/train': 0.8622389137744904} 01/27/2022 06:47:38 - INFO - codeparrot_training - Step 11638: {'lr': 0.00045188815411448767, 'samples': 2234688, 'steps': 11638, 'loss/train': 0.5533164292573929} 01/27/2022 06:47:41 - INFO - codeparrot_training - Step 11639: {'lr': 0.00045187850317522806, 'samples': 2234880, 'steps': 11639, 'loss/train': 0.49635668098926544} 01/27/2022 06:47:44 - INFO - codeparrot_training - Step 11640: {'lr': 0.00045186885137118494, 'samples': 2235072, 'steps': 11640, 'loss/train': 0.7798639833927155} 01/27/2022 06:47:47 - 
INFO - codeparrot_training - Step 11641: {'lr': 0.0004518591987023999, 'samples': 2235264, 'steps': 11641, 'loss/train': 0.6208588778972626} 01/27/2022 06:47:52 - INFO - codeparrot_training - Step 11642: {'lr': 0.000451849545168914, 'samples': 2235456, 'steps': 11642, 'loss/train': 0.690403550863266} 01/27/2022 06:47:55 - INFO - codeparrot_training - Step 11643: {'lr': 0.00045183989077076883, 'samples': 2235648, 'steps': 11643, 'loss/train': 0.8565342128276825} 01/27/2022 06:47:58 - INFO - codeparrot_training - Step 11644: {'lr': 0.00045183023550800564, 'samples': 2235840, 'steps': 11644, 'loss/train': 1.2095072865486145} 01/27/2022 06:48:01 - INFO - codeparrot_training - Step 11645: {'lr': 0.0004518205793806658, 'samples': 2236032, 'steps': 11645, 'loss/train': 0.7535840570926666} 01/27/2022 06:48:05 - INFO - codeparrot_training - Step 11646: {'lr': 0.0004518109223887907, 'samples': 2236224, 'steps': 11646, 'loss/train': 0.9382065832614899} 01/27/2022 06:48:08 - INFO - codeparrot_training - Step 11647: {'lr': 0.0004518012645324217, 'samples': 2236416, 'steps': 11647, 'loss/train': 0.7792284786701202} 01/27/2022 06:48:11 - INFO - codeparrot_training - Step 11648: {'lr': 0.00045179160581160005, 'samples': 2236608, 'steps': 11648, 'loss/train': 0.6775699406862259} 01/27/2022 06:48:14 - INFO - codeparrot_training - Step 11649: {'lr': 0.0004517819462263674, 'samples': 2236800, 'steps': 11649, 'loss/train': 0.5474046170711517} 01/27/2022 06:48:17 - INFO - codeparrot_training - Step 11650: {'lr': 0.0004517722857767649, 'samples': 2236992, 'steps': 11650, 'loss/train': 0.8844499289989471} 01/27/2022 06:48:22 - INFO - codeparrot_training - Step 11651: {'lr': 0.0004517626244628339, 'samples': 2237184, 'steps': 11651, 'loss/train': 1.2493087649345398} 01/27/2022 06:48:25 - INFO - codeparrot_training - Step 11652: {'lr': 0.000451752962284616, 'samples': 2237376, 'steps': 11652, 'loss/train': 0.655258297920227} 01/27/2022 06:48:28 - INFO - codeparrot_training - Step 11653: {'lr': 0.0004517432992421524, 'samples': 2237568, 'steps': 11653, 'loss/train': 0.7848450243473053} 01/27/2022 06:48:31 - INFO - codeparrot_training - Step 11654: {'lr': 0.00045173363533548464, 'samples': 2237760, 'steps': 11654, 'loss/train': 0.6268559396266937} 01/27/2022 06:48:34 - INFO - codeparrot_training - Step 11655: {'lr': 0.00045172397056465405, 'samples': 2237952, 'steps': 11655, 'loss/train': 0.8496752679347992} 01/27/2022 06:48:37 - INFO - codeparrot_training - Step 11656: {'lr': 0.000451714304929702, 'samples': 2238144, 'steps': 11656, 'loss/train': 0.7586089074611664} 01/27/2022 06:48:41 - INFO - codeparrot_training - Step 11657: {'lr': 0.0004517046384306699, 'samples': 2238336, 'steps': 11657, 'loss/train': 1.1502533555030823} 01/27/2022 06:48:44 - INFO - codeparrot_training - Step 11658: {'lr': 0.00045169497106759915, 'samples': 2238528, 'steps': 11658, 'loss/train': 0.8546790182590485} 01/27/2022 06:48:48 - INFO - codeparrot_training - Step 11659: {'lr': 0.0004516853028405312, 'samples': 2238720, 'steps': 11659, 'loss/train': 0.5768780261278152} 01/27/2022 06:48:51 - INFO - codeparrot_training - Step 11660: {'lr': 0.0004516756337495075, 'samples': 2238912, 'steps': 11660, 'loss/train': 0.9889976978302002} 01/27/2022 06:48:54 - INFO - codeparrot_training - Step 11661: {'lr': 0.00045166596379456935, 'samples': 2239104, 'steps': 11661, 'loss/train': 0.8276777565479279} 01/27/2022 06:48:58 - INFO - codeparrot_training - Step 11662: {'lr': 0.0004516562929757584, 'samples': 2239296, 'steps': 11662, 'loss/train': 
0.8331027030944824} 01/27/2022 06:49:01 - INFO - codeparrot_training - Step 11663: {'lr': 0.0004516466212931158, 'samples': 2239488, 'steps': 11663, 'loss/train': 0.7303040027618408} 01/27/2022 06:49:04 - INFO - codeparrot_training - Step 11664: {'lr': 0.00045163694874668316, 'samples': 2239680, 'steps': 11664, 'loss/train': 0.9364366829395294} 01/27/2022 06:49:07 - INFO - codeparrot_training - Step 11665: {'lr': 0.0004516272753365018, 'samples': 2239872, 'steps': 11665, 'loss/train': 0.7353095859289169} 01/27/2022 06:49:10 - INFO - codeparrot_training - Step 11666: {'lr': 0.0004516176010626132, 'samples': 2240064, 'steps': 11666, 'loss/train': 1.083126425743103} 01/27/2022 06:49:13 - INFO - codeparrot_training - Step 11667: {'lr': 0.00045160792592505893, 'samples': 2240256, 'steps': 11667, 'loss/train': 1.3330162167549133} 01/27/2022 06:49:20 - INFO - codeparrot_training - Step 11668: {'lr': 0.0004515982499238802, 'samples': 2240448, 'steps': 11668, 'loss/train': 0.7186975926160812} 01/27/2022 06:49:23 - INFO - codeparrot_training - Step 11669: {'lr': 0.0004515885730591187, 'samples': 2240640, 'steps': 11669, 'loss/train': 0.43566741049289703} 01/27/2022 06:49:26 - INFO - codeparrot_training - Step 11670: {'lr': 0.0004515788953308156, 'samples': 2240832, 'steps': 11670, 'loss/train': 0.6776440590620041} 01/27/2022 06:49:29 - INFO - codeparrot_training - Step 11671: {'lr': 0.00045156921673901267, 'samples': 2241024, 'steps': 11671, 'loss/train': 0.602923795580864} 01/27/2022 06:49:32 - INFO - codeparrot_training - Step 11672: {'lr': 0.0004515595372837512, 'samples': 2241216, 'steps': 11672, 'loss/train': 1.0155137479305267} 01/27/2022 06:49:35 - INFO - codeparrot_training - Step 11673: {'lr': 0.00045154985696507267, 'samples': 2241408, 'steps': 11673, 'loss/train': 0.9739509522914886} 01/27/2022 06:49:38 - INFO - codeparrot_training - Step 11674: {'lr': 0.0004515401757830185, 'samples': 2241600, 'steps': 11674, 'loss/train': 1.2373728454113007} 01/27/2022 06:49:42 - INFO - codeparrot_training - Step 11675: {'lr': 0.0004515304937376302, 'samples': 2241792, 'steps': 11675, 'loss/train': 0.05005623959004879} 01/27/2022 06:49:45 - INFO - codeparrot_training - Step 11676: {'lr': 0.00045152081082894935, 'samples': 2241984, 'steps': 11676, 'loss/train': 0.9597529470920563} 01/27/2022 06:49:49 - INFO - codeparrot_training - Step 11677: {'lr': 0.00045151112705701723, 'samples': 2242176, 'steps': 11677, 'loss/train': 0.7039603739976883} 01/27/2022 06:49:52 - INFO - codeparrot_training - Step 11678: {'lr': 0.00045150144242187554, 'samples': 2242368, 'steps': 11678, 'loss/train': 1.0168679058551788} 01/27/2022 06:49:55 - INFO - codeparrot_training - Step 11679: {'lr': 0.0004514917569235656, 'samples': 2242560, 'steps': 11679, 'loss/train': 0.8458783328533173} 01/27/2022 06:49:59 - INFO - codeparrot_training - Step 11680: {'lr': 0.00045148207056212896, 'samples': 2242752, 'steps': 11680, 'loss/train': 0.4155832529067993} 01/27/2022 06:50:02 - INFO - codeparrot_training - Step 11681: {'lr': 0.0004514723833376071, 'samples': 2242944, 'steps': 11681, 'loss/train': 1.0828678607940674} 01/27/2022 06:50:05 - INFO - codeparrot_training - Step 11682: {'lr': 0.00045146269525004153, 'samples': 2243136, 'steps': 11682, 'loss/train': 0.6497815400362015} 01/27/2022 06:50:08 - INFO - codeparrot_training - Step 11683: {'lr': 0.00045145300629947374, 'samples': 2243328, 'steps': 11683, 'loss/train': 0.423800066113472} 01/27/2022 06:50:11 - INFO - codeparrot_training - Step 11684: {'lr': 0.0004514433164859453, 
'samples': 2243520, 'steps': 11684, 'loss/train': 1.1314956843852997} 01/27/2022 06:50:14 - INFO - codeparrot_training - Step 11685: {'lr': 0.00045143362580949754, 'samples': 2243712, 'steps': 11685, 'loss/train': 1.245765209197998} 01/27/2022 06:50:19 - INFO - codeparrot_training - Step 11686: {'lr': 0.00045142393427017214, 'samples': 2243904, 'steps': 11686, 'loss/train': 0.6914822459220886} 01/27/2022 06:50:22 - INFO - codeparrot_training - Step 11687: {'lr': 0.0004514142418680106, 'samples': 2244096, 'steps': 11687, 'loss/train': 0.5385032594203949} 01/27/2022 06:50:25 - INFO - codeparrot_training - Step 11688: {'lr': 0.00045140454860305435, 'samples': 2244288, 'steps': 11688, 'loss/train': 0.7369271814823151} 01/27/2022 06:50:28 - INFO - codeparrot_training - Step 11689: {'lr': 0.000451394854475345, 'samples': 2244480, 'steps': 11689, 'loss/train': 0.6910302937030792} 01/27/2022 06:50:31 - INFO - codeparrot_training - Step 11690: {'lr': 0.0004513851594849241, 'samples': 2244672, 'steps': 11690, 'loss/train': 1.1074188351631165} 01/27/2022 06:50:34 - INFO - codeparrot_training - Step 11691: {'lr': 0.000451375463631833, 'samples': 2244864, 'steps': 11691, 'loss/train': 0.9362645745277405} 01/27/2022 06:50:38 - INFO - codeparrot_training - Step 11692: {'lr': 0.0004513657669161134, 'samples': 2245056, 'steps': 11692, 'loss/train': 0.7872235178947449} 01/27/2022 06:50:41 - INFO - codeparrot_training - Step 11693: {'lr': 0.0004513560693378068, 'samples': 2245248, 'steps': 11693, 'loss/train': 0.8643371164798737} 01/27/2022 06:50:44 - INFO - codeparrot_training - Step 11694: {'lr': 0.00045134637089695484, 'samples': 2245440, 'steps': 11694, 'loss/train': 0.9502831399440765} 01/27/2022 06:50:49 - INFO - codeparrot_training - Step 11695: {'lr': 0.0004513366715935988, 'samples': 2245632, 'steps': 11695, 'loss/train': 0.6810372322797775} 01/27/2022 06:50:52 - INFO - codeparrot_training - Step 11696: {'lr': 0.00045132697142778044, 'samples': 2245824, 'steps': 11696, 'loss/train': 0.9986663460731506} 01/27/2022 06:50:55 - INFO - codeparrot_training - Step 11697: {'lr': 0.00045131727039954137, 'samples': 2246016, 'steps': 11697, 'loss/train': 0.3830302208662033} 01/27/2022 06:50:58 - INFO - codeparrot_training - Step 11698: {'lr': 0.00045130756850892296, 'samples': 2246208, 'steps': 11698, 'loss/train': 0.9649479389190674} 01/27/2022 06:51:02 - INFO - codeparrot_training - Step 11699: {'lr': 0.00045129786575596683, 'samples': 2246400, 'steps': 11699, 'loss/train': 0.9685792922973633} 01/27/2022 06:51:05 - INFO - codeparrot_training - Step 11700: {'lr': 0.00045128816214071453, 'samples': 2246592, 'steps': 11700, 'loss/train': 1.229659080505371} 01/27/2022 06:51:08 - INFO - codeparrot_training - Step 11701: {'lr': 0.00045127845766320773, 'samples': 2246784, 'steps': 11701, 'loss/train': 0.9481448829174042} 01/27/2022 06:51:11 - INFO - codeparrot_training - Step 11702: {'lr': 0.0004512687523234879, 'samples': 2246976, 'steps': 11702, 'loss/train': 1.056679755449295} 01/27/2022 06:51:14 - INFO - codeparrot_training - Step 11703: {'lr': 0.0004512590461215967, 'samples': 2247168, 'steps': 11703, 'loss/train': 5.715895771980286} 01/27/2022 06:51:19 - INFO - codeparrot_training - Step 11704: {'lr': 0.0004512493390575756, 'samples': 2247360, 'steps': 11704, 'loss/train': 0.9296891391277313} 01/27/2022 06:51:22 - INFO - codeparrot_training - Step 11705: {'lr': 0.0004512396311314662, 'samples': 2247552, 'steps': 11705, 'loss/train': 0.9984403252601624} 01/27/2022 06:51:25 - INFO - codeparrot_training - Step 
11706: {'lr': 0.00045122992234331017, 'samples': 2247744, 'steps': 11706, 'loss/train': 0.4757862836122513} 01/27/2022 06:51:28 - INFO - codeparrot_training - Step 11707: {'lr': 0.00045122021269314907, 'samples': 2247936, 'steps': 11707, 'loss/train': 1.258249193429947} 01/27/2022 06:51:31 - INFO - codeparrot_training - Step 11708: {'lr': 0.0004512105021810244, 'samples': 2248128, 'steps': 11708, 'loss/train': 0.6900665760040283} 01/27/2022 06:51:34 - INFO - codeparrot_training - Step 11709: {'lr': 0.0004512007908069779, 'samples': 2248320, 'steps': 11709, 'loss/train': 1.4712740778923035} 01/27/2022 06:51:38 - INFO - codeparrot_training - Step 11710: {'lr': 0.0004511910785710511, 'samples': 2248512, 'steps': 11710, 'loss/train': 1.066391497850418} 01/27/2022 06:51:41 - INFO - codeparrot_training - Step 11711: {'lr': 0.0004511813654732856, 'samples': 2248704, 'steps': 11711, 'loss/train': 0.5834784507751465} 01/27/2022 06:51:45 - INFO - codeparrot_training - Step 11712: {'lr': 0.00045117165151372296, 'samples': 2248896, 'steps': 11712, 'loss/train': 1.2887399196624756} 01/27/2022 06:51:49 - INFO - codeparrot_training - Step 11713: {'lr': 0.0004511619366924049, 'samples': 2249088, 'steps': 11713, 'loss/train': 0.9184854626655579} 01/27/2022 06:51:52 - INFO - codeparrot_training - Step 11714: {'lr': 0.00045115222100937293, 'samples': 2249280, 'steps': 11714, 'loss/train': 0.9169316589832306} 01/27/2022 06:51:55 - INFO - codeparrot_training - Step 11715: {'lr': 0.00045114250446466874, 'samples': 2249472, 'steps': 11715, 'loss/train': 0.7804275155067444} 01/27/2022 06:51:58 - INFO - codeparrot_training - Step 11716: {'lr': 0.00045113278705833396, 'samples': 2249664, 'steps': 11716, 'loss/train': 0.4944729655981064} 01/27/2022 06:52:01 - INFO - codeparrot_training - Step 11717: {'lr': 0.00045112306879041016, 'samples': 2249856, 'steps': 11717, 'loss/train': 0.791756808757782} 01/27/2022 06:52:05 - INFO - codeparrot_training - Step 11718: {'lr': 0.000451113349660939, 'samples': 2250048, 'steps': 11718, 'loss/train': 0.5194413363933563} 01/27/2022 06:52:08 - INFO - codeparrot_training - Step 11719: {'lr': 0.0004511036296699621, 'samples': 2250240, 'steps': 11719, 'loss/train': 0.9255731999874115} 01/27/2022 06:52:11 - INFO - codeparrot_training - Step 11720: {'lr': 0.0004510939088175211, 'samples': 2250432, 'steps': 11720, 'loss/train': 0.7836832702159882} 01/27/2022 06:52:16 - INFO - codeparrot_training - Step 11721: {'lr': 0.00045108418710365774, 'samples': 2250624, 'steps': 11721, 'loss/train': 0.7580241858959198} 01/27/2022 06:52:19 - INFO - codeparrot_training - Step 11722: {'lr': 0.0004510744645284135, 'samples': 2250816, 'steps': 11722, 'loss/train': 0.13855573162436485} 01/27/2022 06:52:22 - INFO - codeparrot_training - Step 11723: {'lr': 0.00045106474109183004, 'samples': 2251008, 'steps': 11723, 'loss/train': 0.1103186309337616} 01/27/2022 06:52:25 - INFO - codeparrot_training - Step 11724: {'lr': 0.00045105501679394916, 'samples': 2251200, 'steps': 11724, 'loss/train': 0.6525294631719589} 01/27/2022 06:52:29 - INFO - codeparrot_training - Step 11725: {'lr': 0.00045104529163481245, 'samples': 2251392, 'steps': 11725, 'loss/train': 0.8278773128986359} 01/27/2022 06:52:32 - INFO - codeparrot_training - Step 11726: {'lr': 0.0004510355656144615, 'samples': 2251584, 'steps': 11726, 'loss/train': 0.9845456779003143} 01/27/2022 06:52:35 - INFO - codeparrot_training - Step 11727: {'lr': 0.000451025838732938, 'samples': 2251776, 'steps': 11727, 'loss/train': 1.1557731628417969} 01/27/2022 
06:52:38 - INFO - codeparrot_training - Step 11728: {'lr': 0.0004510161109902837, 'samples': 2251968, 'steps': 11728, 'loss/train': 0.8615412712097168} 01/27/2022 06:52:41 - INFO - codeparrot_training - Step 11729: {'lr': 0.00045100638238654013, 'samples': 2252160, 'steps': 11729, 'loss/train': 0.6739615648984909} 01/27/2022 06:52:45 - INFO - codeparrot_training - Step 11730: {'lr': 0.00045099665292174917, 'samples': 2252352, 'steps': 11730, 'loss/train': 0.8846946358680725} 01/27/2022 06:52:49 - INFO - codeparrot_training - Step 11731: {'lr': 0.00045098692259595233, 'samples': 2252544, 'steps': 11731, 'loss/train': 0.489016592502594} 01/27/2022 06:52:52 - INFO - codeparrot_training - Step 11732: {'lr': 0.00045097719140919126, 'samples': 2252736, 'steps': 11732, 'loss/train': 1.370825082063675} 01/27/2022 06:52:55 - INFO - codeparrot_training - Step 11733: {'lr': 0.00045096745936150774, 'samples': 2252928, 'steps': 11733, 'loss/train': 0.9118716716766357} 01/27/2022 06:52:58 - INFO - codeparrot_training - Step 11734: {'lr': 0.00045095772645294347, 'samples': 2253120, 'steps': 11734, 'loss/train': 0.5445540100336075} 01/27/2022 06:53:01 - INFO - codeparrot_training - Step 11735: {'lr': 0.00045094799268354007, 'samples': 2253312, 'steps': 11735, 'loss/train': 0.7915652990341187} 01/27/2022 06:53:04 - INFO - codeparrot_training - Step 11736: {'lr': 0.00045093825805333934, 'samples': 2253504, 'steps': 11736, 'loss/train': 0.8908776640892029} 01/27/2022 06:53:07 - INFO - codeparrot_training - Step 11737: {'lr': 0.0004509285225623829, 'samples': 2253696, 'steps': 11737, 'loss/train': 0.4621904790401459} 01/27/2022 06:53:11 - INFO - codeparrot_training - Step 11738: {'lr': 0.0004509187862107125, 'samples': 2253888, 'steps': 11738, 'loss/train': 1.281900554895401} 01/27/2022 06:53:16 - INFO - codeparrot_training - Step 11739: {'lr': 0.0004509090489983697, 'samples': 2254080, 'steps': 11739, 'loss/train': 0.9407254457473755} 01/27/2022 06:53:19 - INFO - codeparrot_training - Step 11740: {'lr': 0.0004508993109253964, 'samples': 2254272, 'steps': 11740, 'loss/train': 0.38610997796058655} 01/27/2022 06:53:22 - INFO - codeparrot_training - Step 11741: {'lr': 0.00045088957199183427, 'samples': 2254464, 'steps': 11741, 'loss/train': 0.544067770242691} 01/27/2022 06:53:25 - INFO - codeparrot_training - Step 11742: {'lr': 0.000450879832197725, 'samples': 2254656, 'steps': 11742, 'loss/train': 0.6750472784042358} 01/27/2022 06:53:28 - INFO - codeparrot_training - Step 11743: {'lr': 0.0004508700915431103, 'samples': 2254848, 'steps': 11743, 'loss/train': 1.0666701793670654} 01/27/2022 06:53:32 - INFO - codeparrot_training - Step 11744: {'lr': 0.0004508603500280319, 'samples': 2255040, 'steps': 11744, 'loss/train': 1.103096455335617} 01/27/2022 06:53:35 - INFO - codeparrot_training - Step 11745: {'lr': 0.00045085060765253157, 'samples': 2255232, 'steps': 11745, 'loss/train': 0.7646693587303162} 01/27/2022 06:53:38 - INFO - codeparrot_training - Step 11746: {'lr': 0.00045084086441665093, 'samples': 2255424, 'steps': 11746, 'loss/train': 0.26519693434238434} 01/27/2022 06:53:41 - INFO - codeparrot_training - Step 11747: {'lr': 0.00045083112032043196, 'samples': 2255616, 'steps': 11747, 'loss/train': 0.6488456726074219} 01/27/2022 06:53:45 - INFO - codeparrot_training - Step 11748: {'lr': 0.0004508213753639161, 'samples': 2255808, 'steps': 11748, 'loss/train': 0.2871037870645523} 01/27/2022 06:53:49 - INFO - codeparrot_training - Step 11749: {'lr': 0.0004508116295471453, 'samples': 2256000, 'steps': 11749, 
'loss/train': 0.9603637754917145} 01/27/2022 06:53:52 - INFO - codeparrot_training - Step 11750: {'lr': 0.0004508018828701612, 'samples': 2256192, 'steps': 11750, 'loss/train': 1.229174941778183} 01/27/2022 06:53:55 - INFO - codeparrot_training - Step 11751: {'lr': 0.0004507921353330057, 'samples': 2256384, 'steps': 11751, 'loss/train': 0.8428590595722198} 01/27/2022 06:53:58 - INFO - codeparrot_training - Step 11752: {'lr': 0.0004507823869357204, 'samples': 2256576, 'steps': 11752, 'loss/train': 0.6553726494312286} 01/27/2022 06:54:01 - INFO - codeparrot_training - Step 11753: {'lr': 0.00045077263767834703, 'samples': 2256768, 'steps': 11753, 'loss/train': 0.822269082069397} 01/27/2022 06:54:04 - INFO - codeparrot_training - Step 11754: {'lr': 0.00045076288756092754, 'samples': 2256960, 'steps': 11754, 'loss/train': 1.0323087573051453} 01/27/2022 06:54:07 - INFO - codeparrot_training - Step 11755: {'lr': 0.0004507531365835035, 'samples': 2257152, 'steps': 11755, 'loss/train': 0.4989110380411148} 01/27/2022 06:54:11 - INFO - codeparrot_training - Step 11756: {'lr': 0.00045074338474611683, 'samples': 2257344, 'steps': 11756, 'loss/train': 0.7242601811885834} 01/27/2022 06:54:15 - INFO - codeparrot_training - Step 11757: {'lr': 0.00045073363204880916, 'samples': 2257536, 'steps': 11757, 'loss/train': 1.7987072467803955} 01/27/2022 06:54:18 - INFO - codeparrot_training - Step 11758: {'lr': 0.0004507238784916224, 'samples': 2257728, 'steps': 11758, 'loss/train': 0.48470310866832733} 01/27/2022 06:54:21 - INFO - codeparrot_training - Step 11759: {'lr': 0.0004507141240745983, 'samples': 2257920, 'steps': 11759, 'loss/train': 0.5034952014684677} 01/27/2022 06:54:24 - INFO - codeparrot_training - Step 11760: {'lr': 0.0004507043687977787, 'samples': 2258112, 'steps': 11760, 'loss/train': 0.597736045718193} 01/27/2022 06:54:27 - INFO - codeparrot_training - Step 11761: {'lr': 0.00045069461266120515, 'samples': 2258304, 'steps': 11761, 'loss/train': 0.8264180123806} 01/27/2022 06:54:31 - INFO - codeparrot_training - Step 11762: {'lr': 0.0004506848556649197, 'samples': 2258496, 'steps': 11762, 'loss/train': 1.1435705423355103} 01/27/2022 06:54:34 - INFO - codeparrot_training - Step 11763: {'lr': 0.0004506750978089641, 'samples': 2258688, 'steps': 11763, 'loss/train': 0.8355095386505127} 01/27/2022 06:54:37 - INFO - codeparrot_training - Step 11764: {'lr': 0.00045066533909338005, 'samples': 2258880, 'steps': 11764, 'loss/train': 1.1215185821056366} 01/27/2022 06:54:40 - INFO - codeparrot_training - Step 11765: {'lr': 0.00045065557951820935, 'samples': 2259072, 'steps': 11765, 'loss/train': 1.142713487148285} 01/27/2022 06:54:45 - INFO - codeparrot_training - Step 11766: {'lr': 0.0004506458190834939, 'samples': 2259264, 'steps': 11766, 'loss/train': 1.0804125666618347} 01/27/2022 06:54:48 - INFO - codeparrot_training - Step 11767: {'lr': 0.0004506360577892755, 'samples': 2259456, 'steps': 11767, 'loss/train': 1.1277242302894592} 01/27/2022 06:54:52 - INFO - codeparrot_training - Step 11768: {'lr': 0.00045062629563559595, 'samples': 2259648, 'steps': 11768, 'loss/train': 0.37424182891845703} 01/27/2022 06:54:55 - INFO - codeparrot_training - Step 11769: {'lr': 0.00045061653262249703, 'samples': 2259840, 'steps': 11769, 'loss/train': 1.305417001247406} 01/27/2022 06:54:58 - INFO - codeparrot_training - Step 11770: {'lr': 0.0004506067687500206, 'samples': 2260032, 'steps': 11770, 'loss/train': 0.5126402378082275} 01/27/2022 06:55:01 - INFO - codeparrot_training - Step 11771: {'lr': 0.00045059700401820846, 
'samples': 2260224, 'steps': 11771, 'loss/train': 0.7612732350826263} 01/27/2022 06:55:04 - INFO - codeparrot_training - Step 11772: {'lr': 0.00045058723842710246, 'samples': 2260416, 'steps': 11772, 'loss/train': 1.0112020075321198} 01/27/2022 06:55:07 - INFO - codeparrot_training - Step 11773: {'lr': 0.0004505774719767444, 'samples': 2260608, 'steps': 11773, 'loss/train': 1.045888066291809} 01/27/2022 06:55:12 - INFO - codeparrot_training - Step 11774: {'lr': 0.0004505677046671761, 'samples': 2260800, 'steps': 11774, 'loss/train': 0.9537261128425598} 01/27/2022 06:55:15 - INFO - codeparrot_training - Step 11775: {'lr': 0.0004505579364984396, 'samples': 2260992, 'steps': 11775, 'loss/train': 0.8533498048782349} 01/27/2022 06:55:18 - INFO - codeparrot_training - Step 11776: {'lr': 0.0004505481674705764, 'samples': 2261184, 'steps': 11776, 'loss/train': 0.3399965912103653} 01/27/2022 06:55:21 - INFO - codeparrot_training - Step 11777: {'lr': 0.0004505383975836286, 'samples': 2261376, 'steps': 11777, 'loss/train': 0.4410693347454071} 01/27/2022 06:55:24 - INFO - codeparrot_training - Step 11778: {'lr': 0.00045052862683763806, 'samples': 2261568, 'steps': 11778, 'loss/train': 1.235921323299408} 01/27/2022 06:55:28 - INFO - codeparrot_training - Step 11779: {'lr': 0.0004505188552326465, 'samples': 2261760, 'steps': 11779, 'loss/train': 0.9309656918048859} 01/27/2022 06:55:31 - INFO - codeparrot_training - Step 11780: {'lr': 0.00045050908276869585, 'samples': 2261952, 'steps': 11780, 'loss/train': 0.7307640910148621} 01/27/2022 06:55:34 - INFO - codeparrot_training - Step 11781: {'lr': 0.00045049930944582783, 'samples': 2262144, 'steps': 11781, 'loss/train': 0.6887211799621582} 01/27/2022 06:55:37 - INFO - codeparrot_training - Step 11782: {'lr': 0.0004504895352640846, 'samples': 2262336, 'steps': 11782, 'loss/train': 0.9484911561012268} 01/27/2022 06:55:42 - INFO - codeparrot_training - Step 11783: {'lr': 0.0004504797602235078, 'samples': 2262528, 'steps': 11783, 'loss/train': 1.1720474660396576} 01/27/2022 06:55:45 - INFO - codeparrot_training - Step 11784: {'lr': 0.0004504699843241394, 'samples': 2262720, 'steps': 11784, 'loss/train': 0.9658480882644653} 01/27/2022 06:55:48 - INFO - codeparrot_training - Step 11785: {'lr': 0.0004504602075660212, 'samples': 2262912, 'steps': 11785, 'loss/train': 0.8000377714633942} 01/27/2022 06:55:51 - INFO - codeparrot_training - Step 11786: {'lr': 0.00045045042994919514, 'samples': 2263104, 'steps': 11786, 'loss/train': 1.3192013204097748} 01/27/2022 06:55:54 - INFO - codeparrot_training - Step 11787: {'lr': 0.00045044065147370303, 'samples': 2263296, 'steps': 11787, 'loss/train': 0.46976011991500854} 01/27/2022 06:55:58 - INFO - codeparrot_training - Step 11788: {'lr': 0.0004504308721395869, 'samples': 2263488, 'steps': 11788, 'loss/train': 0.6962299793958664} 01/27/2022 06:56:01 - INFO - codeparrot_training - Step 11789: {'lr': 0.0004504210919468886, 'samples': 2263680, 'steps': 11789, 'loss/train': 1.0651618838310242} 01/27/2022 06:56:04 - INFO - codeparrot_training - Step 11790: {'lr': 0.0004504113108956499, 'samples': 2263872, 'steps': 11790, 'loss/train': 1.380445808172226} 01/27/2022 06:56:07 - INFO - codeparrot_training - Step 11791: {'lr': 0.0004504015289859128, 'samples': 2264064, 'steps': 11791, 'loss/train': 0.2680828794836998} 01/27/2022 06:56:12 - INFO - codeparrot_training - Step 11792: {'lr': 0.00045039174621771915, 'samples': 2264256, 'steps': 11792, 'loss/train': 0.7983532547950745} 01/27/2022 06:56:15 - INFO - codeparrot_training - Step 
11793: {'lr': 0.0004503819625911109, 'samples': 2264448, 'steps': 11793, 'loss/train': 0.5371891111135483} 01/27/2022 06:56:18 - INFO - codeparrot_training - Step 11794: {'lr': 0.00045037217810613004, 'samples': 2264640, 'steps': 11794, 'loss/train': 0.5550664365291595} 01/27/2022 06:56:22 - INFO - codeparrot_training - Step 11795: {'lr': 0.0004503623927628183, 'samples': 2264832, 'steps': 11795, 'loss/train': 0.8846822082996368} 01/27/2022 06:56:25 - INFO - codeparrot_training - Step 11796: {'lr': 0.0004503526065612177, 'samples': 2265024, 'steps': 11796, 'loss/train': 0.8573060631752014} 01/27/2022 06:56:28 - INFO - codeparrot_training - Step 11797: {'lr': 0.0004503428195013702, 'samples': 2265216, 'steps': 11797, 'loss/train': 0.8832274675369263} 01/27/2022 06:56:31 - INFO - codeparrot_training - Step 11798: {'lr': 0.00045033303158331764, 'samples': 2265408, 'steps': 11798, 'loss/train': 0.8735852837562561} 01/27/2022 06:56:34 - INFO - codeparrot_training - Step 11799: {'lr': 0.00045032324280710204, 'samples': 2265600, 'steps': 11799, 'loss/train': 0.8258028924465179} 01/27/2022 06:56:37 - INFO - codeparrot_training - Step 11800: {'lr': 0.0004503134531727652, 'samples': 2265792, 'steps': 11800, 'loss/train': 0.10773666948080063} 01/27/2022 06:56:42 - INFO - codeparrot_training - Step 11801: {'lr': 0.00045030366268034917, 'samples': 2265984, 'steps': 11801, 'loss/train': 0.8284577429294586} 01/27/2022 06:56:45 - INFO - codeparrot_training - Step 11802: {'lr': 0.00045029387132989587, 'samples': 2266176, 'steps': 11802, 'loss/train': 0.7322304546833038} 01/27/2022 06:56:48 - INFO - codeparrot_training - Step 11803: {'lr': 0.0004502840791214472, 'samples': 2266368, 'steps': 11803, 'loss/train': 0.654955118894577} 01/27/2022 06:56:51 - INFO - codeparrot_training - Step 11804: {'lr': 0.00045027428605504507, 'samples': 2266560, 'steps': 11804, 'loss/train': 0.41453665494918823} 01/27/2022 06:56:54 - INFO - codeparrot_training - Step 11805: {'lr': 0.00045026449213073154, 'samples': 2266752, 'steps': 11805, 'loss/train': 1.0612497925758362} 01/27/2022 06:56:57 - INFO - codeparrot_training - Step 11806: {'lr': 0.00045025469734854856, 'samples': 2266944, 'steps': 11806, 'loss/train': 0.4421335458755493} 01/27/2022 06:57:01 - INFO - codeparrot_training - Step 11807: {'lr': 0.00045024490170853806, 'samples': 2267136, 'steps': 11807, 'loss/train': 0.633960172533989} 01/27/2022 06:57:04 - INFO - codeparrot_training - Step 11808: {'lr': 0.000450235105210742, 'samples': 2267328, 'steps': 11808, 'loss/train': 0.07163443230092525} 01/27/2022 06:57:07 - INFO - codeparrot_training - Step 11809: {'lr': 0.0004502253078552022, 'samples': 2267520, 'steps': 11809, 'loss/train': 1.1696884632110596} 01/27/2022 06:57:11 - INFO - codeparrot_training - Step 11810: {'lr': 0.00045021550964196086, 'samples': 2267712, 'steps': 11810, 'loss/train': 2.0812965631484985} 01/27/2022 06:57:14 - INFO - codeparrot_training - Step 11811: {'lr': 0.0004502057105710598, 'samples': 2267904, 'steps': 11811, 'loss/train': 1.1615363359451294} 01/27/2022 06:57:18 - INFO - codeparrot_training - Step 11812: {'lr': 0.00045019591064254105, 'samples': 2268096, 'steps': 11812, 'loss/train': 0.9045401215553284} 01/27/2022 06:57:21 - INFO - codeparrot_training - Step 11813: {'lr': 0.00045018610985644663, 'samples': 2268288, 'steps': 11813, 'loss/train': 1.0229947865009308} 01/27/2022 06:57:24 - INFO - codeparrot_training - Step 11814: {'lr': 0.00045017630821281854, 'samples': 2268480, 'steps': 11814, 'loss/train': 0.7696622908115387} 01/27/2022 
06:57:27 - INFO - codeparrot_training - Step 11815: {'lr': 0.0004501665057116986, 'samples': 2268672, 'steps': 11815, 'loss/train': 0.8553557395935059} 01/27/2022 06:57:30 - INFO - codeparrot_training - Step 11816: {'lr': 0.00045015670235312895, 'samples': 2268864, 'steps': 11816, 'loss/train': 0.8299788236618042} 01/27/2022 06:57:33 - INFO - codeparrot_training - Step 11817: {'lr': 0.00045014689813715147, 'samples': 2269056, 'steps': 11817, 'loss/train': 0.4999477118253708} 01/27/2022 06:57:36 - INFO - codeparrot_training - Step 11818: {'lr': 0.00045013709306380837, 'samples': 2269248, 'steps': 11818, 'loss/train': 0.7826340794563293} 01/27/2022 06:57:42 - INFO - codeparrot_training - Step 11819: {'lr': 0.00045012728713314146, 'samples': 2269440, 'steps': 11819, 'loss/train': 0.6123727262020111} 01/27/2022 06:57:45 - INFO - codeparrot_training - Step 11820: {'lr': 0.00045011748034519275, 'samples': 2269632, 'steps': 11820, 'loss/train': 0.9382451176643372} 01/27/2022 06:57:48 - INFO - codeparrot_training - Step 11821: {'lr': 0.00045010767270000436, 'samples': 2269824, 'steps': 11821, 'loss/train': 0.7267088145017624} 01/27/2022 06:57:51 - INFO - codeparrot_training - Step 11822: {'lr': 0.00045009786419761825, 'samples': 2270016, 'steps': 11822, 'loss/train': 0.8177593946456909} 01/27/2022 06:57:54 - INFO - codeparrot_training - Step 11823: {'lr': 0.00045008805483807637, 'samples': 2270208, 'steps': 11823, 'loss/train': 0.9088791310787201} 01/27/2022 06:57:57 - INFO - codeparrot_training - Step 11824: {'lr': 0.0004500782446214208, 'samples': 2270400, 'steps': 11824, 'loss/train': 1.2076953649520874} 01/27/2022 06:58:00 - INFO - codeparrot_training - Step 11825: {'lr': 0.00045006843354769354, 'samples': 2270592, 'steps': 11825, 'loss/train': 0.8256931900978088} 01/27/2022 06:58:04 - INFO - codeparrot_training - Step 11826: {'lr': 0.0004500586216169367, 'samples': 2270784, 'steps': 11826, 'loss/train': 1.081102430820465} 01/27/2022 06:58:07 - INFO - codeparrot_training - Step 11827: {'lr': 0.0004500488088291923, 'samples': 2270976, 'steps': 11827, 'loss/train': 0.9109267294406891} 01/27/2022 06:58:11 - INFO - codeparrot_training - Step 11828: {'lr': 0.0004500389951845022, 'samples': 2271168, 'steps': 11828, 'loss/train': 0.6073612421751022} 01/27/2022 06:58:15 - INFO - codeparrot_training - Step 11829: {'lr': 0.00045002918068290864, 'samples': 2271360, 'steps': 11829, 'loss/train': 1.0698454678058624} 01/27/2022 06:58:18 - INFO - codeparrot_training - Step 11830: {'lr': 0.00045001936532445354, 'samples': 2271552, 'steps': 11830, 'loss/train': 0.5306936502456665} 01/27/2022 06:58:21 - INFO - codeparrot_training - Step 11831: {'lr': 0.000450009549109179, 'samples': 2271744, 'steps': 11831, 'loss/train': 0.9065660834312439} 01/27/2022 06:58:24 - INFO - codeparrot_training - Step 11832: {'lr': 0.0004499997320371271, 'samples': 2271936, 'steps': 11832, 'loss/train': 0.6652970463037491} 01/27/2022 06:58:27 - INFO - codeparrot_training - Step 11833: {'lr': 0.0004499899141083399, 'samples': 2272128, 'steps': 11833, 'loss/train': 1.124927133321762} 01/27/2022 06:58:30 - INFO - codeparrot_training - Step 11834: {'lr': 0.0004499800953228593, 'samples': 2272320, 'steps': 11834, 'loss/train': 0.2964809685945511} 01/27/2022 06:58:34 - INFO - codeparrot_training - Step 11835: {'lr': 0.00044997027568072754, 'samples': 2272512, 'steps': 11835, 'loss/train': 1.4446935653686523} 01/27/2022 06:58:37 - INFO - codeparrot_training - Step 11836: {'lr': 0.00044996045518198657, 'samples': 2272704, 'steps': 11836, 
'loss/train': 2.0118489861488342} 01/27/2022 06:58:41 - INFO - codeparrot_training - Step 11837: {'lr': 0.00044995063382667855, 'samples': 2272896, 'steps': 11837, 'loss/train': 0.8113957643508911} 01/27/2022 06:58:44 - INFO - codeparrot_training - Step 11838: {'lr': 0.0004499408116148455, 'samples': 2273088, 'steps': 11838, 'loss/train': 0.5394685417413712} 01/27/2022 06:58:47 - INFO - codeparrot_training - Step 11839: {'lr': 0.00044993098854652954, 'samples': 2273280, 'steps': 11839, 'loss/train': 0.3895614445209503} 01/27/2022 06:58:50 - INFO - codeparrot_training - Step 11840: {'lr': 0.0004499211646217727, 'samples': 2273472, 'steps': 11840, 'loss/train': 0.5074153393507004} 01/27/2022 06:58:54 - INFO - codeparrot_training - Step 11841: {'lr': 0.000449911339840617, 'samples': 2273664, 'steps': 11841, 'loss/train': 0.2292054444551468} 01/27/2022 06:58:57 - INFO - codeparrot_training - Step 11842: {'lr': 0.00044990151420310463, 'samples': 2273856, 'steps': 11842, 'loss/train': 1.1033568978309631} 01/27/2022 06:59:00 - INFO - codeparrot_training - Step 11843: {'lr': 0.0004498916877092776, 'samples': 2274048, 'steps': 11843, 'loss/train': 0.8583648204803467} 01/27/2022 06:59:03 - INFO - codeparrot_training - Step 11844: {'lr': 0.00044988186035917817, 'samples': 2274240, 'steps': 11844, 'loss/train': 0.7613267004489899} 01/27/2022 06:59:06 - INFO - codeparrot_training - Step 11845: {'lr': 0.00044987203215284823, 'samples': 2274432, 'steps': 11845, 'loss/train': 0.8249414563179016} 01/27/2022 06:59:11 - INFO - codeparrot_training - Step 11846: {'lr': 0.00044986220309033, 'samples': 2274624, 'steps': 11846, 'loss/train': 0.1657089963555336} 01/27/2022 06:59:14 - INFO - codeparrot_training - Step 11847: {'lr': 0.00044985237317166554, 'samples': 2274816, 'steps': 11847, 'loss/train': 0.5035967230796814} 01/27/2022 06:59:18 - INFO - codeparrot_training - Step 11848: {'lr': 0.00044984254239689703, 'samples': 2275008, 'steps': 11848, 'loss/train': 0.7533794045448303} 01/27/2022 06:59:21 - INFO - codeparrot_training - Step 11849: {'lr': 0.00044983271076606644, 'samples': 2275200, 'steps': 11849, 'loss/train': 0.6295280903577805} 01/27/2022 06:59:24 - INFO - codeparrot_training - Step 11850: {'lr': 0.000449822878279216, 'samples': 2275392, 'steps': 11850, 'loss/train': 0.40290749073028564} 01/27/2022 06:59:27 - INFO - codeparrot_training - Step 11851: {'lr': 0.00044981304493638786, 'samples': 2275584, 'steps': 11851, 'loss/train': 0.9637055397033691} 01/27/2022 06:59:30 - INFO - codeparrot_training - Step 11852: {'lr': 0.00044980321073762405, 'samples': 2275776, 'steps': 11852, 'loss/train': 0.6801163852214813} 01/27/2022 06:59:33 - INFO - codeparrot_training - Step 11853: {'lr': 0.0004497933756829667, 'samples': 2275968, 'steps': 11853, 'loss/train': 0.8991596102714539} 01/27/2022 06:59:38 - INFO - codeparrot_training - Step 11854: {'lr': 0.000449783539772458, 'samples': 2276160, 'steps': 11854, 'loss/train': 1.0178713202476501} 01/27/2022 06:59:41 - INFO - codeparrot_training - Step 11855: {'lr': 0.00044977370300614, 'samples': 2276352, 'steps': 11855, 'loss/train': 0.6393000930547714} 01/27/2022 06:59:44 - INFO - codeparrot_training - Step 11856: {'lr': 0.00044976386538405494, 'samples': 2276544, 'steps': 11856, 'loss/train': 0.7199775874614716} 01/27/2022 06:59:47 - INFO - codeparrot_training - Step 11857: {'lr': 0.0004497540269062449, 'samples': 2276736, 'steps': 11857, 'loss/train': 1.104445606470108} 01/27/2022 06:59:50 - INFO - codeparrot_training - Step 11858: {'lr': 0.00044974418757275206, 
'samples': 2276928, 'steps': 11858, 'loss/train': 0.8321705460548401} 01/27/2022 06:59:53 - INFO - codeparrot_training - Step 11859: {'lr': 0.00044973434738361853, 'samples': 2277120, 'steps': 11859, 'loss/train': 0.5258215069770813} 01/27/2022 06:59:57 - INFO - codeparrot_training - Step 11860: {'lr': 0.0004497245063388865, 'samples': 2277312, 'steps': 11860, 'loss/train': 0.7226619869470596} 01/27/2022 07:00:00 - INFO - codeparrot_training - Step 11861: {'lr': 0.0004497146644385981, 'samples': 2277504, 'steps': 11861, 'loss/train': 0.9473461210727692} 01/27/2022 07:00:03 - INFO - codeparrot_training - Step 11862: {'lr': 0.00044970482168279547, 'samples': 2277696, 'steps': 11862, 'loss/train': 0.8358094096183777} 01/27/2022 07:00:07 - INFO - codeparrot_training - Step 11863: {'lr': 0.0004496949780715208, 'samples': 2277888, 'steps': 11863, 'loss/train': 0.8408685922622681} 01/27/2022 07:00:11 - INFO - codeparrot_training - Step 11864: {'lr': 0.00044968513360481624, 'samples': 2278080, 'steps': 11864, 'loss/train': 0.9244476556777954} 01/27/2022 07:00:14 - INFO - codeparrot_training - Step 11865: {'lr': 0.000449675288282724, 'samples': 2278272, 'steps': 11865, 'loss/train': 0.8353349268436432} 01/27/2022 07:00:17 - INFO - codeparrot_training - Step 11866: {'lr': 0.0004496654421052862, 'samples': 2278464, 'steps': 11866, 'loss/train': 0.43523307144641876} 01/27/2022 07:00:20 - INFO - codeparrot_training - Step 11867: {'lr': 0.00044965559507254504, 'samples': 2278656, 'steps': 11867, 'loss/train': 0.7962246537208557} 01/27/2022 07:00:23 - INFO - codeparrot_training - Step 11868: {'lr': 0.0004496457471845428, 'samples': 2278848, 'steps': 11868, 'loss/train': 0.8732105791568756} 01/27/2022 07:00:26 - INFO - codeparrot_training - Step 11869: {'lr': 0.0004496358984413215, 'samples': 2279040, 'steps': 11869, 'loss/train': 0.944403201341629} 01/27/2022 07:00:29 - INFO - codeparrot_training - Step 11870: {'lr': 0.0004496260488429234, 'samples': 2279232, 'steps': 11870, 'loss/train': 0.34660110622644424} 01/27/2022 07:00:33 - INFO - codeparrot_training - Step 11871: {'lr': 0.0004496161983893907, 'samples': 2279424, 'steps': 11871, 'loss/train': 1.365119218826294} 01/27/2022 07:00:38 - INFO - codeparrot_training - Step 11872: {'lr': 0.0004496063470807656, 'samples': 2279616, 'steps': 11872, 'loss/train': 0.8453704118728638} 01/27/2022 07:00:41 - INFO - codeparrot_training - Step 11873: {'lr': 0.0004495964949170903, 'samples': 2279808, 'steps': 11873, 'loss/train': 0.7928828001022339} 01/27/2022 07:00:44 - INFO - codeparrot_training - Step 11874: {'lr': 0.000449586641898407, 'samples': 2280000, 'steps': 11874, 'loss/train': 0.8988221883773804} 01/27/2022 07:00:47 - INFO - codeparrot_training - Step 11875: {'lr': 0.0004495767880247579, 'samples': 2280192, 'steps': 11875, 'loss/train': 0.5216467827558517} 01/27/2022 07:00:50 - INFO - codeparrot_training - Step 11876: {'lr': 0.0004495669332961852, 'samples': 2280384, 'steps': 11876, 'loss/train': 0.6527556627988815} 01/27/2022 07:00:53 - INFO - codeparrot_training - Step 11877: {'lr': 0.0004495570777127311, 'samples': 2280576, 'steps': 11877, 'loss/train': 0.9995581805706024} 01/27/2022 07:00:57 - INFO - codeparrot_training - Step 11878: {'lr': 0.00044954722127443786, 'samples': 2280768, 'steps': 11878, 'loss/train': 1.0246752798557281} 01/27/2022 07:01:00 - INFO - codeparrot_training - Step 11879: {'lr': 0.0004495373639813477, 'samples': 2280960, 'steps': 11879, 'loss/train': 0.6198970824480057} 01/27/2022 07:01:03 - INFO - codeparrot_training - Step 
11880: {'lr': 0.00044952750583350287, 'samples': 2281152, 'steps': 11880, 'loss/train': 0.5531079769134521} 01/27/2022 07:01:08 - INFO - codeparrot_training - Step 11881: {'lr': 0.00044951764683094555, 'samples': 2281344, 'steps': 11881, 'loss/train': 0.6608201712369919} 01/27/2022 07:01:11 - INFO - codeparrot_training - Step 11882: {'lr': 0.000449507786973718, 'samples': 2281536, 'steps': 11882, 'loss/train': 0.8211762607097626} 01/27/2022 07:01:14 - INFO - codeparrot_training - Step 11883: {'lr': 0.0004494979262618624, 'samples': 2281728, 'steps': 11883, 'loss/train': 0.5103790909051895} 01/27/2022 07:01:17 - INFO - codeparrot_training - Step 11884: {'lr': 0.00044948806469542095, 'samples': 2281920, 'steps': 11884, 'loss/train': 0.9439103901386261} 01/27/2022 07:01:20 - INFO - codeparrot_training - Step 11885: {'lr': 0.0004494782022744361, 'samples': 2282112, 'steps': 11885, 'loss/train': 0.9934240579605103} 01/27/2022 07:01:23 - INFO - codeparrot_training - Step 11886: {'lr': 0.0004494683389989499, 'samples': 2282304, 'steps': 11886, 'loss/train': 1.0143978595733643} 01/27/2022 07:01:26 - INFO - codeparrot_training - Step 11887: {'lr': 0.0004494584748690047, 'samples': 2282496, 'steps': 11887, 'loss/train': 0.8956961631774902} 01/27/2022 07:01:30 - INFO - codeparrot_training - Step 11888: {'lr': 0.00044944860988464276, 'samples': 2282688, 'steps': 11888, 'loss/train': 1.19089937210083} 01/27/2022 07:01:34 - INFO - codeparrot_training - Step 11889: {'lr': 0.0004494387440459063, 'samples': 2282880, 'steps': 11889, 'loss/train': 0.8122552335262299} 01/27/2022 07:01:37 - INFO - codeparrot_training - Step 11890: {'lr': 0.00044942887735283755, 'samples': 2283072, 'steps': 11890, 'loss/train': 1.1248342394828796} 01/27/2022 07:01:40 - INFO - codeparrot_training - Step 11891: {'lr': 0.00044941900980547886, 'samples': 2283264, 'steps': 11891, 'loss/train': 0.9312116503715515} 01/27/2022 07:01:44 - INFO - codeparrot_training - Step 11892: {'lr': 0.00044940914140387245, 'samples': 2283456, 'steps': 11892, 'loss/train': 0.9496877789497375} 01/27/2022 07:01:47 - INFO - codeparrot_training - Step 11893: {'lr': 0.00044939927214806055, 'samples': 2283648, 'steps': 11893, 'loss/train': 0.8787130415439606} 01/27/2022 07:01:50 - INFO - codeparrot_training - Step 11894: {'lr': 0.0004493894020380855, 'samples': 2283840, 'steps': 11894, 'loss/train': 1.157031387090683} 01/27/2022 07:01:53 - INFO - codeparrot_training - Step 11895: {'lr': 0.0004493795310739896, 'samples': 2284032, 'steps': 11895, 'loss/train': 0.834974616765976} 01/27/2022 07:01:56 - INFO - codeparrot_training - Step 11896: {'lr': 0.00044936965925581506, 'samples': 2284224, 'steps': 11896, 'loss/train': 1.012470155954361} 01/27/2022 07:01:59 - INFO - codeparrot_training - Step 11897: {'lr': 0.0004493597865836042, 'samples': 2284416, 'steps': 11897, 'loss/train': 0.871737152338028} 01/27/2022 07:02:05 - INFO - codeparrot_training - Step 11898: {'lr': 0.00044934991305739936, 'samples': 2284608, 'steps': 11898, 'loss/train': 0.8010927736759186} 01/27/2022 07:02:08 - INFO - codeparrot_training - Step 11899: {'lr': 0.00044934003867724284, 'samples': 2284800, 'steps': 11899, 'loss/train': 1.0979735255241394} 01/27/2022 07:02:11 - INFO - codeparrot_training - Step 11900: {'lr': 0.0004493301634431768, 'samples': 2284992, 'steps': 11900, 'loss/train': 1.4916420578956604} 01/27/2022 07:02:14 - INFO - codeparrot_training - Step 11901: {'lr': 0.00044932028735524367, 'samples': 2285184, 'steps': 11901, 'loss/train': 0.9984169900417328} 01/27/2022 07:02:17 
- INFO - codeparrot_training - Step 11902: {'lr': 0.0004493104104134857, 'samples': 2285376, 'steps': 11902, 'loss/train': 0.4306148439645767} 01/27/2022 07:02:20 - INFO - codeparrot_training - Step 11903: {'lr': 0.0004493005326179452, 'samples': 2285568, 'steps': 11903, 'loss/train': 0.9185740649700165} 01/27/2022 07:02:23 - INFO - codeparrot_training - Step 11904: {'lr': 0.00044929065396866457, 'samples': 2285760, 'steps': 11904, 'loss/train': 0.953387975692749} 01/27/2022 07:02:26 - INFO - codeparrot_training - Step 11905: {'lr': 0.00044928077446568606, 'samples': 2285952, 'steps': 11905, 'loss/train': 0.8288863599300385} 01/27/2022 07:02:30 - INFO - codeparrot_training - Step 11906: {'lr': 0.000449270894109052, 'samples': 2286144, 'steps': 11906, 'loss/train': 0.9412760138511658} 01/27/2022 07:02:34 - INFO - codeparrot_training - Step 11907: {'lr': 0.0004492610128988046, 'samples': 2286336, 'steps': 11907, 'loss/train': 0.6788930296897888} 01/27/2022 07:02:37 - INFO - codeparrot_training - Step 11908: {'lr': 0.00044925113083498636, 'samples': 2286528, 'steps': 11908, 'loss/train': 1.0056670010089874} 01/27/2022 07:02:40 - INFO - codeparrot_training - Step 11909: {'lr': 0.00044924124791763956, 'samples': 2286720, 'steps': 11909, 'loss/train': 0.8603222072124481} 01/27/2022 07:02:44 - INFO - codeparrot_training - Step 11910: {'lr': 0.0004492313641468065, 'samples': 2286912, 'steps': 11910, 'loss/train': 0.36114994436502457} 01/27/2022 07:02:47 - INFO - codeparrot_training - Step 11911: {'lr': 0.00044922147952252957, 'samples': 2287104, 'steps': 11911, 'loss/train': 0.803687185049057} 01/27/2022 07:02:50 - INFO - codeparrot_training - Step 11912: {'lr': 0.000449211594044851, 'samples': 2287296, 'steps': 11912, 'loss/train': 0.3557978421449661} 01/27/2022 07:02:53 - INFO - codeparrot_training - Step 11913: {'lr': 0.0004492017077138133, 'samples': 2287488, 'steps': 11913, 'loss/train': 1.0972043573856354} 01/27/2022 07:02:56 - INFO - codeparrot_training - Step 11914: {'lr': 0.00044919182052945866, 'samples': 2287680, 'steps': 11914, 'loss/train': 0.6565480828285217} 01/27/2022 07:02:59 - INFO - codeparrot_training - Step 11915: {'lr': 0.00044918193249182957, 'samples': 2287872, 'steps': 11915, 'loss/train': 0.6532482951879501} 01/27/2022 07:03:04 - INFO - codeparrot_training - Step 11916: {'lr': 0.0004491720436009683, 'samples': 2288064, 'steps': 11916, 'loss/train': 0.8691043853759766} 01/27/2022 07:03:07 - INFO - codeparrot_training - Step 11917: {'lr': 0.0004491621538569173, 'samples': 2288256, 'steps': 11917, 'loss/train': 0.8077718317508698} 01/27/2022 07:03:10 - INFO - codeparrot_training - Step 11918: {'lr': 0.0004491522632597188, 'samples': 2288448, 'steps': 11918, 'loss/train': 0.712167352437973} 01/27/2022 07:03:13 - INFO - codeparrot_training - Step 11919: {'lr': 0.0004491423718094153, 'samples': 2288640, 'steps': 11919, 'loss/train': 0.9468896985054016} 01/27/2022 07:03:16 - INFO - codeparrot_training - Step 11920: {'lr': 0.00044913247950604905, 'samples': 2288832, 'steps': 11920, 'loss/train': 1.2057137489318848} 01/27/2022 07:03:19 - INFO - codeparrot_training - Step 11921: {'lr': 0.0004491225863496625, 'samples': 2289024, 'steps': 11921, 'loss/train': 0.6066458970308304} 01/27/2022 07:03:22 - INFO - codeparrot_training - Step 11922: {'lr': 0.0004491126923402981, 'samples': 2289216, 'steps': 11922, 'loss/train': 0.6857669949531555} 01/27/2022 07:03:26 - INFO - codeparrot_training - Step 11923: {'lr': 0.0004491027974779981, 'samples': 2289408, 'steps': 11923, 'loss/train': 
0.9569257199764252} 01/27/2022 07:03:31 - INFO - codeparrot_training - Step 11924: {'lr': 0.00044909290176280495, 'samples': 2289600, 'steps': 11924, 'loss/train': 0.9001846611499786} 01/27/2022 07:03:35 - INFO - codeparrot_training - Step 11925: {'lr': 0.000449083005194761, 'samples': 2289792, 'steps': 11925, 'loss/train': 0.9720467627048492} 01/27/2022 07:03:38 - INFO - codeparrot_training - Step 11926: {'lr': 0.0004490731077739087, 'samples': 2289984, 'steps': 11926, 'loss/train': 0.7845772504806519} 01/27/2022 07:03:41 - INFO - codeparrot_training - Step 11927: {'lr': 0.0004490632095002904, 'samples': 2290176, 'steps': 11927, 'loss/train': 1.0471782088279724} 01/27/2022 07:03:44 - INFO - codeparrot_training - Step 11928: {'lr': 0.00044905331037394853, 'samples': 2290368, 'steps': 11928, 'loss/train': 0.6286101937294006} 01/27/2022 07:03:47 - INFO - codeparrot_training - Step 11929: {'lr': 0.00044904341039492544, 'samples': 2290560, 'steps': 11929, 'loss/train': 0.8820819854736328} 01/27/2022 07:03:50 - INFO - codeparrot_training - Step 11930: {'lr': 0.00044903350956326365, 'samples': 2290752, 'steps': 11930, 'loss/train': 0.27428001165390015} 01/27/2022 07:03:54 - INFO - codeparrot_training - Step 11931: {'lr': 0.0004490236078790055, 'samples': 2290944, 'steps': 11931, 'loss/train': 0.8976892232894897} 01/27/2022 07:03:57 - INFO - codeparrot_training - Step 11932: {'lr': 0.0004490137053421934, 'samples': 2291136, 'steps': 11932, 'loss/train': 1.6782934069633484} 01/27/2022 07:04:00 - INFO - codeparrot_training - Step 11933: {'lr': 0.00044900380195286974, 'samples': 2291328, 'steps': 11933, 'loss/train': 1.668755829334259} 01/27/2022 07:04:04 - INFO - codeparrot_training - Step 11934: {'lr': 0.00044899389771107704, 'samples': 2291520, 'steps': 11934, 'loss/train': 0.932902067899704} 01/27/2022 07:04:08 - INFO - codeparrot_training - Step 11935: {'lr': 0.00044898399261685765, 'samples': 2291712, 'steps': 11935, 'loss/train': 0.8100921213626862} 01/27/2022 07:04:11 - INFO - codeparrot_training - Step 11936: {'lr': 0.00044897408667025397, 'samples': 2291904, 'steps': 11936, 'loss/train': 0.987919807434082} 01/27/2022 07:04:14 - INFO - codeparrot_training - Step 11937: {'lr': 0.00044896417987130854, 'samples': 2292096, 'steps': 11937, 'loss/train': 0.92616006731987} 01/27/2022 07:04:17 - INFO - codeparrot_training - Step 11938: {'lr': 0.0004489542722200637, 'samples': 2292288, 'steps': 11938, 'loss/train': 1.017664521932602} 01/27/2022 07:04:20 - INFO - codeparrot_training - Step 11939: {'lr': 0.000448944363716562, 'samples': 2292480, 'steps': 11939, 'loss/train': 1.1069433689117432} 01/27/2022 07:04:23 - INFO - codeparrot_training - Step 11940: {'lr': 0.0004489344543608458, 'samples': 2292672, 'steps': 11940, 'loss/train': 0.811680793762207} 01/27/2022 07:04:26 - INFO - codeparrot_training - Step 11941: {'lr': 0.00044892454415295746, 'samples': 2292864, 'steps': 11941, 'loss/train': 0.9256044924259186} 01/27/2022 07:04:32 - INFO - codeparrot_training - Step 11942: {'lr': 0.0004489146330929397, 'samples': 2293056, 'steps': 11942, 'loss/train': 0.6617585867643356} 01/27/2022 07:04:35 - INFO - codeparrot_training - Step 11943: {'lr': 0.0004489047211808347, 'samples': 2293248, 'steps': 11943, 'loss/train': 1.1716060638427734} 01/27/2022 07:04:38 - INFO - codeparrot_training - Step 11944: {'lr': 0.0004488948084166851, 'samples': 2293440, 'steps': 11944, 'loss/train': 0.6362883299589157} 01/27/2022 07:04:41 - INFO - codeparrot_training - Step 11945: {'lr': 0.00044888489480053324, 'samples': 
2293632, 'steps': 11945, 'loss/train': 0.7641447186470032} 01/27/2022 07:04:44 - INFO - codeparrot_training - Step 11946: {'lr': 0.00044887498033242167, 'samples': 2293824, 'steps': 11946, 'loss/train': 0.8098630607128143} 01/27/2022 07:04:47 - INFO - codeparrot_training - Step 11947: {'lr': 0.0004488650650123929, 'samples': 2294016, 'steps': 11947, 'loss/train': 0.8611891865730286} 01/27/2022 07:04:51 - INFO - codeparrot_training - Step 11948: {'lr': 0.00044885514884048926, 'samples': 2294208, 'steps': 11948, 'loss/train': 0.886652022600174} 01/27/2022 07:04:54 - INFO - codeparrot_training - Step 11949: {'lr': 0.0004488452318167533, 'samples': 2294400, 'steps': 11949, 'loss/train': 0.560135543346405} 01/27/2022 07:04:57 - INFO - codeparrot_training - Step 11950: {'lr': 0.00044883531394122753, 'samples': 2294592, 'steps': 11950, 'loss/train': 1.0697079598903656} 01/27/2022 07:05:01 - INFO - codeparrot_training - Step 11951: {'lr': 0.00044882539521395436, 'samples': 2294784, 'steps': 11951, 'loss/train': 0.9730653762817383} 01/27/2022 07:05:04 - INFO - codeparrot_training - Step 11952: {'lr': 0.0004488154756349764, 'samples': 2294976, 'steps': 11952, 'loss/train': 1.1779370605945587} 01/27/2022 07:05:07 - INFO - codeparrot_training - Step 11953: {'lr': 0.0004488055552043361, 'samples': 2295168, 'steps': 11953, 'loss/train': 1.0591991543769836} 01/27/2022 07:05:11 - INFO - codeparrot_training - Step 11954: {'lr': 0.0004487956339220759, 'samples': 2295360, 'steps': 11954, 'loss/train': 0.8123069107532501} 01/27/2022 07:05:14 - INFO - codeparrot_training - Step 11955: {'lr': 0.00044878571178823826, 'samples': 2295552, 'steps': 11955, 'loss/train': 0.6790103763341904} 01/27/2022 07:05:17 - INFO - codeparrot_training - Step 11956: {'lr': 0.00044877578880286585, 'samples': 2295744, 'steps': 11956, 'loss/train': 0.8237028121948242} 01/27/2022 07:05:20 - INFO - codeparrot_training - Step 11957: {'lr': 0.000448765864966001, 'samples': 2295936, 'steps': 11957, 'loss/train': 0.7659370601177216} 01/27/2022 07:05:23 - INFO - codeparrot_training - Step 11958: {'lr': 0.00044875594027768634, 'samples': 2296128, 'steps': 11958, 'loss/train': 0.9892949759960175} 01/27/2022 07:05:26 - INFO - codeparrot_training - Step 11959: {'lr': 0.00044874601473796435, 'samples': 2296320, 'steps': 11959, 'loss/train': 0.6640435606241226} 01/27/2022 07:05:31 - INFO - codeparrot_training - Step 11960: {'lr': 0.00044873608834687754, 'samples': 2296512, 'steps': 11960, 'loss/train': 0.7499203383922577} 01/27/2022 07:05:34 - INFO - codeparrot_training - Step 11961: {'lr': 0.0004487261611044684, 'samples': 2296704, 'steps': 11961, 'loss/train': 0.6047747433185577} 01/27/2022 07:05:37 - INFO - codeparrot_training - Step 11962: {'lr': 0.0004487162330107795, 'samples': 2296896, 'steps': 11962, 'loss/train': 0.5938701331615448} 01/27/2022 07:05:40 - INFO - codeparrot_training - Step 11963: {'lr': 0.0004487063040658534, 'samples': 2297088, 'steps': 11963, 'loss/train': 1.033753126859665} 01/27/2022 07:05:43 - INFO - codeparrot_training - Step 11964: {'lr': 0.00044869637426973256, 'samples': 2297280, 'steps': 11964, 'loss/train': 0.571290448307991} 01/27/2022 07:05:47 - INFO - codeparrot_training - Step 11965: {'lr': 0.0004486864436224595, 'samples': 2297472, 'steps': 11965, 'loss/train': 0.7165465950965881} 01/27/2022 07:05:50 - INFO - codeparrot_training - Step 11966: {'lr': 0.0004486765121240769, 'samples': 2297664, 'steps': 11966, 'loss/train': 0.8890007436275482} 01/27/2022 07:05:53 - INFO - codeparrot_training - Step 11967: 
{'lr': 0.0004486665797746271, 'samples': 2297856, 'steps': 11967, 'loss/train': 1.2903350293636322} 01/27/2022 07:05:57 - INFO - codeparrot_training - Step 11968: {'lr': 0.00044865664657415286, 'samples': 2298048, 'steps': 11968, 'loss/train': 0.905661553144455} 01/27/2022 07:06:01 - INFO - codeparrot_training - Step 11969: {'lr': 0.00044864671252269663, 'samples': 2298240, 'steps': 11969, 'loss/train': 0.8356333673000336} 01/27/2022 07:06:04 - INFO - codeparrot_training - Step 11970: {'lr': 0.00044863677762030087, 'samples': 2298432, 'steps': 11970, 'loss/train': 0.8572281897068024} 01/27/2022 07:06:07 - INFO - codeparrot_training - Step 11971: {'lr': 0.0004486268418670083, 'samples': 2298624, 'steps': 11971, 'loss/train': 0.8617141842842102} 01/27/2022 07:06:10 - INFO - codeparrot_training - Step 11972: {'lr': 0.00044861690526286135, 'samples': 2298816, 'steps': 11972, 'loss/train': 0.7001402825117111} 01/27/2022 07:06:13 - INFO - codeparrot_training - Step 11973: {'lr': 0.00044860696780790266, 'samples': 2299008, 'steps': 11973, 'loss/train': 1.2406060695648193} 01/27/2022 07:06:16 - INFO - codeparrot_training - Step 11974: {'lr': 0.00044859702950217486, 'samples': 2299200, 'steps': 11974, 'loss/train': 0.9957866370677948} 01/27/2022 07:06:20 - INFO - codeparrot_training - Step 11975: {'lr': 0.00044858709034572035, 'samples': 2299392, 'steps': 11975, 'loss/train': 0.7329312264919281} 01/27/2022 07:06:23 - INFO - codeparrot_training - Step 11976: {'lr': 0.00044857715033858183, 'samples': 2299584, 'steps': 11976, 'loss/train': 0.5295913517475128} 01/27/2022 07:06:28 - INFO - codeparrot_training - Step 11977: {'lr': 0.0004485672094808019, 'samples': 2299776, 'steps': 11977, 'loss/train': 1.8374354839324951} 01/27/2022 07:06:31 - INFO - codeparrot_training - Step 11978: {'lr': 0.0004485572677724231, 'samples': 2299968, 'steps': 11978, 'loss/train': 0.3843527287244797} 01/27/2022 07:06:34 - INFO - codeparrot_training - Step 11979: {'lr': 0.00044854732521348796, 'samples': 2300160, 'steps': 11979, 'loss/train': 1.039853274822235} 01/27/2022 07:06:38 - INFO - codeparrot_training - Step 11980: {'lr': 0.0004485373818040391, 'samples': 2300352, 'steps': 11980, 'loss/train': 0.6803296655416489} 01/27/2022 07:06:41 - INFO - codeparrot_training - Step 11981: {'lr': 0.00044852743754411915, 'samples': 2300544, 'steps': 11981, 'loss/train': 0.8208551108837128} 01/27/2022 07:06:44 - INFO - codeparrot_training - Step 11982: {'lr': 0.00044851749243377085, 'samples': 2300736, 'steps': 11982, 'loss/train': 0.45822563767433167} 01/27/2022 07:06:47 - INFO - codeparrot_training - Step 11983: {'lr': 0.0004485075464730365, 'samples': 2300928, 'steps': 11983, 'loss/train': 0.5442589223384857} 01/27/2022 07:06:50 - INFO - codeparrot_training - Step 11984: {'lr': 0.0004484975996619589, 'samples': 2301120, 'steps': 11984, 'loss/train': 0.09770005568861961} 01/27/2022 07:06:53 - INFO - codeparrot_training - Step 11985: {'lr': 0.0004484876520005805, 'samples': 2301312, 'steps': 11985, 'loss/train': 1.4829617142677307} 01/27/2022 07:06:58 - INFO - codeparrot_training - Step 11986: {'lr': 0.0004484777034889441, 'samples': 2301504, 'steps': 11986, 'loss/train': 0.9553914964199066} 01/27/2022 07:07:01 - INFO - codeparrot_training - Step 11987: {'lr': 0.0004484677541270923, 'samples': 2301696, 'steps': 11987, 'loss/train': 1.535578429698944} 01/27/2022 07:07:04 - INFO - codeparrot_training - Step 11988: {'lr': 0.00044845780391506763, 'samples': 2301888, 'steps': 11988, 'loss/train': 0.6897485554218292} 01/27/2022 07:07:07 
- INFO - codeparrot_training - Step 11989: {'lr': 0.0004484478528529128, 'samples': 2302080, 'steps': 11989, 'loss/train': 0.33852115273475647} 01/27/2022 07:07:10 - INFO - codeparrot_training - Step 11990: {'lr': 0.00044843790094067026, 'samples': 2302272, 'steps': 11990, 'loss/train': 0.7817637920379639} 01/27/2022 07:07:13 - INFO - codeparrot_training - Step 11991: {'lr': 0.00044842794817838286, 'samples': 2302464, 'steps': 11991, 'loss/train': 0.8196988999843597} 01/27/2022 07:07:17 - INFO - codeparrot_training - Step 11992: {'lr': 0.0004484179945660931, 'samples': 2302656, 'steps': 11992, 'loss/train': 1.1488071084022522} 01/27/2022 07:07:20 - INFO - codeparrot_training - Step 11993: {'lr': 0.00044840804010384366, 'samples': 2302848, 'steps': 11993, 'loss/train': 0.9100580513477325} 01/27/2022 07:07:23 - INFO - codeparrot_training - Step 11994: {'lr': 0.00044839808479167723, 'samples': 2303040, 'steps': 11994, 'loss/train': 0.987813413143158} 01/27/2022 07:07:28 - INFO - codeparrot_training - Step 11995: {'lr': 0.00044838812862963627, 'samples': 2303232, 'steps': 11995, 'loss/train': 0.482761412858963} 01/27/2022 07:07:31 - INFO - codeparrot_training - Step 11996: {'lr': 0.00044837817161776366, 'samples': 2303424, 'steps': 11996, 'loss/train': 0.6940030753612518} 01/27/2022 07:07:34 - INFO - codeparrot_training - Step 11997: {'lr': 0.00044836821375610194, 'samples': 2303616, 'steps': 11997, 'loss/train': 0.796558678150177} 01/27/2022 07:07:37 - INFO - codeparrot_training - Step 11998: {'lr': 0.0004483582550446938, 'samples': 2303808, 'steps': 11998, 'loss/train': 1.1905471086502075} 01/27/2022 07:07:40 - INFO - codeparrot_training - Step 11999: {'lr': 0.0004483482954835819, 'samples': 2304000, 'steps': 11999, 'loss/train': 1.5027480125427246} 01/27/2022 07:07:40 - INFO - codeparrot_training - Evaluating and saving model checkpoint 01/27/2022 07:07:58 - WARNING - huggingface_hub.repository - Several commits (6) will be pushed upstream. 01/27/2022 07:07:58 - WARNING - huggingface_hub.repository - The progress bars may be unreliable. 
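At step 11999 the run pauses to evaluate and save a model checkpoint, and huggingface_hub reports that six queued commits will be pushed to the royal-monkey-12 branch of ncoop57/codeparrot-neo-125M-py; the 71111c3..2f6074d push confirmation follows below, after which per-step metrics resume at step 12000. Commits typically queue up like this when checkpoints are committed faster than earlier, non-blocking pushes complete. The sketch below is a minimal, hypothetical illustration of such a save-and-push step, not the actual training script: the local directory name, the tiny stand-in model, and the use of blocking=False are assumptions, and actually pushing requires write access to the repo (huggingface-cli login).

    # Illustrative sketch only; not the codeparrot_training script.
    from huggingface_hub import Repository
    from transformers import GPT2Config, GPT2LMHeadModel

    repo_id = "ncoop57/codeparrot-neo-125M-py"   # repo seen in the push URL in this log
    save_dir = "codeparrot-neo-125M-py"          # assumed local clone directory
    repo = Repository(save_dir, clone_from=repo_id, revision="royal-monkey-12")

    # Tiny randomly initialised stand-in model, just so the sketch runs end to end.
    model = GPT2LMHeadModel(GPT2Config(n_layer=2, n_head=2, n_embd=64))

    def save_and_push(step: int) -> None:
        # Write the current weights into the local clone of the Hub repo ...
        model.save_pretrained(save_dir)
        # ... then commit and push. With blocking=False the git push runs in the
        # background, so several step checkpoints can pile up as pending commits;
        # that is when huggingface_hub warns that multiple commits will be pushed
        # upstream and that its progress bars may be unreliable.
        repo.push_to_hub(commit_message=f"step {step}", blocking=False)

    save_and_push(12000)

Note that the samples counter keeps advancing by 192 per optimizer step across the checkpoint (2304192 - 2304000 between steps 11999 and 12000), so the interruption affects only wall-clock time, not the sample count.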
01/27/2022 07:09:13 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py 71111c3..2f6074d royal-monkey-12 -> royal-monkey-12 01/27/2022 07:09:17 - INFO - codeparrot_training - Step 12000: {'lr': 0.0004483383350728088, 'samples': 2304192, 'steps': 12000, 'loss/train': 0.9131032526493073} 01/27/2022 07:09:20 - INFO - codeparrot_training - Step 12001: {'lr': 0.00044832837381241733, 'samples': 2304384, 'steps': 12001, 'loss/train': 1.1720669567584991} 01/27/2022 07:09:23 - INFO - codeparrot_training - Step 12002: {'lr': 0.00044831841170245003, 'samples': 2304576, 'steps': 12002, 'loss/train': 0.7890955209732056} 01/27/2022 07:09:26 - INFO - codeparrot_training - Step 12003: {'lr': 0.0004483084487429496, 'samples': 2304768, 'steps': 12003, 'loss/train': 0.9476674497127533} 01/27/2022 07:09:31 - INFO - codeparrot_training - Step 12004: {'lr': 0.00044829848493395884, 'samples': 2304960, 'steps': 12004, 'loss/train': 0.766146719455719} 01/27/2022 07:09:35 - INFO - codeparrot_training - Step 12005: {'lr': 0.00044828852027552023, 'samples': 2305152, 'steps': 12005, 'loss/train': 1.236779361963272} 01/27/2022 07:09:38 - INFO - codeparrot_training - Step 12006: {'lr': 0.00044827855476767665, 'samples': 2305344, 'steps': 12006, 'loss/train': 1.8873873353004456} 01/27/2022 07:09:41 - INFO - codeparrot_training - Step 12007: {'lr': 0.00044826858841047067, 'samples': 2305536, 'steps': 12007, 'loss/train': 0.7603171169757843} 01/27/2022 07:09:44 - INFO - codeparrot_training - Step 12008: {'lr': 0.00044825862120394504, 'samples': 2305728, 'steps': 12008, 'loss/train': 0.7256976664066315} 01/27/2022 07:09:47 - INFO - codeparrot_training - Step 12009: {'lr': 0.00044824865314814245, 'samples': 2305920, 'steps': 12009, 'loss/train': 1.093521773815155} 01/27/2022 07:09:50 - INFO - codeparrot_training - Step 12010: {'lr': 0.00044823868424310553, 'samples': 2306112, 'steps': 12010, 'loss/train': 0.7251077592372894} 01/27/2022 07:09:54 - INFO - codeparrot_training - Step 12011: {'lr': 0.00044822871448887703, 'samples': 2306304, 'steps': 12011, 'loss/train': 1.0844770073890686} 01/27/2022 07:09:57 - INFO - codeparrot_training - Step 12012: {'lr': 0.0004482187438854997, 'samples': 2306496, 'steps': 12012, 'loss/train': 0.8427806496620178} 01/27/2022 07:10:01 - INFO - codeparrot_training - Step 12013: {'lr': 0.00044820877243301617, 'samples': 2306688, 'steps': 12013, 'loss/train': 1.9570780992507935} 01/27/2022 07:10:04 - INFO - codeparrot_training - Step 12014: {'lr': 0.00044819880013146924, 'samples': 2306880, 'steps': 12014, 'loss/train': 1.098417341709137} 01/27/2022 07:10:08 - INFO - codeparrot_training - Step 12015: {'lr': 0.0004481888269809016, 'samples': 2307072, 'steps': 12015, 'loss/train': 0.9361422657966614} 01/27/2022 07:10:11 - INFO - codeparrot_training - Step 12016: {'lr': 0.0004481788529813559, 'samples': 2307264, 'steps': 12016, 'loss/train': 1.254083901643753} 01/27/2022 07:10:14 - INFO - codeparrot_training - Step 12017: {'lr': 0.00044816887813287494, 'samples': 2307456, 'steps': 12017, 'loss/train': 0.6621815711259842} 01/27/2022 07:10:17 - INFO - codeparrot_training - Step 12018: {'lr': 0.0004481589024355014, 'samples': 2307648, 'steps': 12018, 'loss/train': 1.0199036300182343} 01/27/2022 07:10:20 - INFO - codeparrot_training - Step 12019: {'lr': 0.00044814892588927816, 'samples': 2307840, 'steps': 12019, 'loss/train': 1.1281749308109283} 01/27/2022 07:10:23 - INFO - codeparrot_training - Step 12020: {'lr': 0.00044813894849424777, 'samples': 2308032, 
'steps': 12020, 'loss/train': 0.8308287262916565} 01/27/2022 07:10:26 - INFO - codeparrot_training - Step 12021: {'lr': 0.00044812897025045295, 'samples': 2308224, 'steps': 12021, 'loss/train': 1.247148871421814} 01/27/2022 07:10:31 - INFO - codeparrot_training - Step 12022: {'lr': 0.00044811899115793666, 'samples': 2308416, 'steps': 12022, 'loss/train': 0.9897028505802155} 01/27/2022 07:10:35 - INFO - codeparrot_training - Step 12023: {'lr': 0.0004481090112167415, 'samples': 2308608, 'steps': 12023, 'loss/train': 1.0170314311981201} 01/27/2022 07:10:38 - INFO - codeparrot_training - Step 12024: {'lr': 0.0004480990304269102, 'samples': 2308800, 'steps': 12024, 'loss/train': 0.8874135911464691} 01/27/2022 07:10:41 - INFO - codeparrot_training - Step 12025: {'lr': 0.00044808904878848555, 'samples': 2308992, 'steps': 12025, 'loss/train': 0.7563833892345428} 01/27/2022 07:10:44 - INFO - codeparrot_training - Step 12026: {'lr': 0.00044807906630151033, 'samples': 2309184, 'steps': 12026, 'loss/train': 0.7349322438240051} 01/27/2022 07:10:47 - INFO - codeparrot_training - Step 12027: {'lr': 0.00044806908296602733, 'samples': 2309376, 'steps': 12027, 'loss/train': 0.6373585313558578} 01/27/2022 07:10:50 - INFO - codeparrot_training - Step 12028: {'lr': 0.0004480590987820793, 'samples': 2309568, 'steps': 12028, 'loss/train': 1.10153129696846} 01/27/2022 07:10:54 - INFO - codeparrot_training - Step 12029: {'lr': 0.00044804911374970893, 'samples': 2309760, 'steps': 12029, 'loss/train': 0.6417626738548279} 01/27/2022 07:10:57 - INFO - codeparrot_training - Step 12030: {'lr': 0.000448039127868959, 'samples': 2309952, 'steps': 12030, 'loss/train': 0.8682069182395935} 01/27/2022 07:11:01 - INFO - codeparrot_training - Step 12031: {'lr': 0.0004480291411398724, 'samples': 2310144, 'steps': 12031, 'loss/train': 1.036831945180893} 01/27/2022 07:11:04 - INFO - codeparrot_training - Step 12032: {'lr': 0.0004480191535624918, 'samples': 2310336, 'steps': 12032, 'loss/train': 1.1318652033805847} 01/27/2022 07:11:07 - INFO - codeparrot_training - Step 12033: {'lr': 0.00044800916513686, 'samples': 2310528, 'steps': 12033, 'loss/train': 0.827936053276062} 01/27/2022 07:11:11 - INFO - codeparrot_training - Step 12034: {'lr': 0.00044799917586301987, 'samples': 2310720, 'steps': 12034, 'loss/train': 0.7545828223228455} 01/27/2022 07:11:14 - INFO - codeparrot_training - Step 12035: {'lr': 0.00044798918574101413, 'samples': 2310912, 'steps': 12035, 'loss/train': 0.8072331547737122} 01/27/2022 07:11:17 - INFO - codeparrot_training - Step 12036: {'lr': 0.00044797919477088555, 'samples': 2311104, 'steps': 12036, 'loss/train': 0.8278308212757111} 01/27/2022 07:11:20 - INFO - codeparrot_training - Step 12037: {'lr': 0.00044796920295267696, 'samples': 2311296, 'steps': 12037, 'loss/train': 0.35753631591796875} 01/27/2022 07:11:23 - INFO - codeparrot_training - Step 12038: {'lr': 0.0004479592102864313, 'samples': 2311488, 'steps': 12038, 'loss/train': 0.6621298938989639} 01/27/2022 07:11:26 - INFO - codeparrot_training - Step 12039: {'lr': 0.0004479492167721911, 'samples': 2311680, 'steps': 12039, 'loss/train': 0.23284989595413208} 01/27/2022 07:11:31 - INFO - codeparrot_training - Step 12040: {'lr': 0.0004479392224099993, 'samples': 2311872, 'steps': 12040, 'loss/train': 0.4830282926559448} 01/27/2022 07:11:34 - INFO - codeparrot_training - Step 12041: {'lr': 0.00044792922719989883, 'samples': 2312064, 'steps': 12041, 'loss/train': 0.8209190368652344} 01/27/2022 07:11:37 - INFO - codeparrot_training - Step 12042: {'lr': 
0.00044791923114193233, 'samples': 2312256, 'steps': 12042, 'loss/train': 0.7713969647884369} 01/27/2022 07:11:40 - INFO - codeparrot_training - Step 12043: {'lr': 0.0004479092342361427, 'samples': 2312448, 'steps': 12043, 'loss/train': 1.2253699600696564} 01/27/2022 07:11:43 - INFO - codeparrot_training - Step 12044: {'lr': 0.0004478992364825728, 'samples': 2312640, 'steps': 12044, 'loss/train': 0.9678555428981781} 01/27/2022 07:11:46 - INFO - codeparrot_training - Step 12045: {'lr': 0.00044788923788126534, 'samples': 2312832, 'steps': 12045, 'loss/train': 0.8889021277427673} 01/27/2022 07:11:50 - INFO - codeparrot_training - Step 12046: {'lr': 0.00044787923843226323, 'samples': 2313024, 'steps': 12046, 'loss/train': 0.6924737244844437} 01/27/2022 07:11:53 - INFO - codeparrot_training - Step 12047: {'lr': 0.0004478692381356093, 'samples': 2313216, 'steps': 12047, 'loss/train': 0.6797019392251968} 01/27/2022 07:11:58 - INFO - codeparrot_training - Step 12048: {'lr': 0.00044785923699134646, 'samples': 2313408, 'steps': 12048, 'loss/train': 0.5929422676563263} 01/27/2022 07:12:01 - INFO - codeparrot_training - Step 12049: {'lr': 0.0004478492349995174, 'samples': 2313600, 'steps': 12049, 'loss/train': 0.632967934012413} 01/27/2022 07:12:04 - INFO - codeparrot_training - Step 12050: {'lr': 0.00044783923216016507, 'samples': 2313792, 'steps': 12050, 'loss/train': 0.7316986173391342} 01/27/2022 07:12:07 - INFO - codeparrot_training - Step 12051: {'lr': 0.0004478292284733323, 'samples': 2313984, 'steps': 12051, 'loss/train': 0.7261872589588165} 01/27/2022 07:12:11 - INFO - codeparrot_training - Step 12052: {'lr': 0.00044781922393906186, 'samples': 2314176, 'steps': 12052, 'loss/train': 0.3991461396217346} 01/27/2022 07:12:14 - INFO - codeparrot_training - Step 12053: {'lr': 0.00044780921855739676, 'samples': 2314368, 'steps': 12053, 'loss/train': 0.7652874290943146} 01/27/2022 07:12:17 - INFO - codeparrot_training - Step 12054: {'lr': 0.00044779921232837973, 'samples': 2314560, 'steps': 12054, 'loss/train': 1.043486773967743} 01/27/2022 07:12:20 - INFO - codeparrot_training - Step 12055: {'lr': 0.0004477892052520537, 'samples': 2314752, 'steps': 12055, 'loss/train': 0.5633971989154816} 01/27/2022 07:12:23 - INFO - codeparrot_training - Step 12056: {'lr': 0.0004477791973284616, 'samples': 2314944, 'steps': 12056, 'loss/train': 0.9831972420215607} 01/27/2022 07:12:28 - INFO - codeparrot_training - Step 12057: {'lr': 0.00044776918855764616, 'samples': 2315136, 'steps': 12057, 'loss/train': 0.8836492896080017} 01/27/2022 07:12:31 - INFO - codeparrot_training - Step 12058: {'lr': 0.00044775917893965025, 'samples': 2315328, 'steps': 12058, 'loss/train': 0.9546863436698914} 01/27/2022 07:12:34 - INFO - codeparrot_training - Step 12059: {'lr': 0.00044774916847451683, 'samples': 2315520, 'steps': 12059, 'loss/train': 0.6233745217323303} 01/27/2022 07:12:37 - INFO - codeparrot_training - Step 12060: {'lr': 0.0004477391571622889, 'samples': 2315712, 'steps': 12060, 'loss/train': 0.8889648020267487} 01/27/2022 07:12:40 - INFO - codeparrot_training - Step 12061: {'lr': 0.00044772914500300907, 'samples': 2315904, 'steps': 12061, 'loss/train': 0.9367986917495728} 01/27/2022 07:12:44 - INFO - codeparrot_training - Step 12062: {'lr': 0.0004477191319967204, 'samples': 2316096, 'steps': 12062, 'loss/train': 0.707605853676796} 01/27/2022 07:12:47 - INFO - codeparrot_training - Step 12063: {'lr': 0.0004477091181434658, 'samples': 2316288, 'steps': 12063, 'loss/train': 0.8378485143184662} 01/27/2022 07:12:50 - INFO - 
codeparrot_training - Step 12064: {'lr': 0.00044769910344328803, 'samples': 2316480, 'steps': 12064, 'loss/train': 0.7061160653829575} 01/27/2022 07:12:53 - INFO - codeparrot_training - Step 12065: {'lr': 0.00044768908789623015, 'samples': 2316672, 'steps': 12065, 'loss/train': 0.7869120240211487} 01/27/2022 07:12:58 - INFO - codeparrot_training - Step 12066: {'lr': 0.00044767907150233496, 'samples': 2316864, 'steps': 12066, 'loss/train': 0.894908219575882} 01/27/2022 07:13:01 - INFO - codeparrot_training - Step 12067: {'lr': 0.0004476690542616454, 'samples': 2317056, 'steps': 12067, 'loss/train': 0.689705953001976} 01/27/2022 07:13:04 - INFO - codeparrot_training - Step 12068: {'lr': 0.00044765903617420436, 'samples': 2317248, 'steps': 12068, 'loss/train': 0.9977291822433472} 01/27/2022 07:13:07 - INFO - codeparrot_training - Step 12069: {'lr': 0.0004476490172400548, 'samples': 2317440, 'steps': 12069, 'loss/train': 0.9105515778064728} 01/27/2022 07:13:10 - INFO - codeparrot_training - Step 12070: {'lr': 0.00044763899745923965, 'samples': 2317632, 'steps': 12070, 'loss/train': 1.135503888130188} 01/27/2022 07:13:13 - INFO - codeparrot_training - Step 12071: {'lr': 0.0004476289768318017, 'samples': 2317824, 'steps': 12071, 'loss/train': 1.820925772190094} 01/27/2022 07:13:16 - INFO - codeparrot_training - Step 12072: {'lr': 0.00044761895535778404, 'samples': 2318016, 'steps': 12072, 'loss/train': 1.1138343214988708} 01/27/2022 07:13:20 - INFO - codeparrot_training - Step 12073: {'lr': 0.0004476089330372295, 'samples': 2318208, 'steps': 12073, 'loss/train': 0.625363290309906} 01/27/2022 07:13:23 - INFO - codeparrot_training - Step 12074: {'lr': 0.00044759890987018105, 'samples': 2318400, 'steps': 12074, 'loss/train': 0.8135371506214142} 01/27/2022 07:13:27 - INFO - codeparrot_training - Step 12075: {'lr': 0.0004475888858566816, 'samples': 2318592, 'steps': 12075, 'loss/train': 0.9672026038169861} 01/27/2022 07:13:30 - INFO - codeparrot_training - Step 12076: {'lr': 0.00044757886099677416, 'samples': 2318784, 'steps': 12076, 'loss/train': 0.877702921628952} 01/27/2022 07:13:33 - INFO - codeparrot_training - Step 12077: {'lr': 0.0004475688352905015, 'samples': 2318976, 'steps': 12077, 'loss/train': 0.8449716567993164} 01/27/2022 07:13:36 - INFO - codeparrot_training - Step 12078: {'lr': 0.00044755880873790675, 'samples': 2319168, 'steps': 12078, 'loss/train': 0.507723480463028} 01/27/2022 07:13:40 - INFO - codeparrot_training - Step 12079: {'lr': 0.00044754878133903284, 'samples': 2319360, 'steps': 12079, 'loss/train': 1.173777312040329} 01/27/2022 07:13:43 - INFO - codeparrot_training - Step 12080: {'lr': 0.0004475387530939226, 'samples': 2319552, 'steps': 12080, 'loss/train': 0.919619232416153} 01/27/2022 07:13:46 - INFO - codeparrot_training - Step 12081: {'lr': 0.00044752872400261913, 'samples': 2319744, 'steps': 12081, 'loss/train': 1.2546881139278412} 01/27/2022 07:13:49 - INFO - codeparrot_training - Step 12082: {'lr': 0.0004475186940651653, 'samples': 2319936, 'steps': 12082, 'loss/train': 1.1179669797420502} 01/27/2022 07:13:52 - INFO - codeparrot_training - Step 12083: {'lr': 0.0004475086632816041, 'samples': 2320128, 'steps': 12083, 'loss/train': 1.2567194402217865} 01/27/2022 07:13:57 - INFO - codeparrot_training - Step 12084: {'lr': 0.00044749863165197845, 'samples': 2320320, 'steps': 12084, 'loss/train': 1.2243638634681702} 01/27/2022 07:14:00 - INFO - codeparrot_training - Step 12085: {'lr': 0.00044748859917633144, 'samples': 2320512, 'steps': 12085, 'loss/train': 
1.0253104269504547} 01/27/2022 07:14:04 - INFO - codeparrot_training - Step 12086: {'lr': 0.00044747856585470604, 'samples': 2320704, 'steps': 12086, 'loss/train': 0.3093618080019951} 01/27/2022 07:14:07 - INFO - codeparrot_training - Step 12087: {'lr': 0.00044746853168714507, 'samples': 2320896, 'steps': 12087, 'loss/train': 1.3104119002819061} 01/27/2022 07:14:10 - INFO - codeparrot_training - Step 12088: {'lr': 0.0004474584966736917, 'samples': 2321088, 'steps': 12088, 'loss/train': 1.0491786897182465} 01/27/2022 07:14:13 - INFO - codeparrot_training - Step 12089: {'lr': 0.00044744846081438874, 'samples': 2321280, 'steps': 12089, 'loss/train': 0.898136705160141} 01/27/2022 07:14:16 - INFO - codeparrot_training - Step 12090: {'lr': 0.0004474384241092793, 'samples': 2321472, 'steps': 12090, 'loss/train': 0.6292217820882797} 01/27/2022 07:14:19 - INFO - codeparrot_training - Step 12091: {'lr': 0.00044742838655840636, 'samples': 2321664, 'steps': 12091, 'loss/train': 0.9579508602619171} 01/27/2022 07:14:22 - INFO - codeparrot_training - Step 12092: {'lr': 0.0004474183481618129, 'samples': 2321856, 'steps': 12092, 'loss/train': 1.309549480676651} 01/27/2022 07:14:27 - INFO - codeparrot_training - Step 12093: {'lr': 0.00044740830891954196, 'samples': 2322048, 'steps': 12093, 'loss/train': 0.9369538128376007} 01/27/2022 07:14:30 - INFO - codeparrot_training - Step 12094: {'lr': 0.0004473982688316365, 'samples': 2322240, 'steps': 12094, 'loss/train': 0.7763133645057678} 01/27/2022 07:14:33 - INFO - codeparrot_training - Step 12095: {'lr': 0.0004473882278981395, 'samples': 2322432, 'steps': 12095, 'loss/train': 0.8703371286392212} 01/27/2022 07:14:36 - INFO - codeparrot_training - Step 12096: {'lr': 0.000447378186119094, 'samples': 2322624, 'steps': 12096, 'loss/train': 1.0695066154003143} 01/27/2022 07:14:40 - INFO - codeparrot_training - Step 12097: {'lr': 0.00044736814349454303, 'samples': 2322816, 'steps': 12097, 'loss/train': 0.7807779014110565} 01/27/2022 07:14:43 - INFO - codeparrot_training - Step 12098: {'lr': 0.0004473581000245296, 'samples': 2323008, 'steps': 12098, 'loss/train': 1.2774908244609833} 01/27/2022 07:14:46 - INFO - codeparrot_training - Step 12099: {'lr': 0.00044734805570909676, 'samples': 2323200, 'steps': 12099, 'loss/train': 0.8869657516479492} 01/27/2022 07:14:49 - INFO - codeparrot_training - Step 12100: {'lr': 0.0004473380105482875, 'samples': 2323392, 'steps': 12100, 'loss/train': 0.2185254916548729} 01/27/2022 07:14:52 - INFO - codeparrot_training - Step 12101: {'lr': 0.0004473279645421449, 'samples': 2323584, 'steps': 12101, 'loss/train': 1.0314715504646301} 01/27/2022 07:14:57 - INFO - codeparrot_training - Step 12102: {'lr': 0.00044731791769071197, 'samples': 2323776, 'steps': 12102, 'loss/train': 1.0292823314666748} 01/27/2022 07:15:00 - INFO - codeparrot_training - Step 12103: {'lr': 0.00044730786999403166, 'samples': 2323968, 'steps': 12103, 'loss/train': 0.9537997841835022} 01/27/2022 07:15:03 - INFO - codeparrot_training - Step 12104: {'lr': 0.00044729782145214717, 'samples': 2324160, 'steps': 12104, 'loss/train': 0.3873461186885834} 01/27/2022 07:15:06 - INFO - codeparrot_training - Step 12105: {'lr': 0.0004472877720651014, 'samples': 2324352, 'steps': 12105, 'loss/train': 0.7772423923015594} 01/27/2022 07:15:09 - INFO - codeparrot_training - Step 12106: {'lr': 0.0004472777218329375, 'samples': 2324544, 'steps': 12106, 'loss/train': 0.372709259390831} 01/27/2022 07:15:12 - INFO - codeparrot_training - Step 12107: {'lr': 0.00044726767075569843, 'samples': 
2324736, 'steps': 12107, 'loss/train': 0.05656872130930424} 01/27/2022 07:15:16 - INFO - codeparrot_training - Step 12108: {'lr': 0.0004472576188334273, 'samples': 2324928, 'steps': 12108, 'loss/train': 1.0120544135570526} 01/27/2022 07:15:19 - INFO - codeparrot_training - Step 12109: {'lr': 0.00044724756606616726, 'samples': 2325120, 'steps': 12109, 'loss/train': 0.8142538368701935} 01/27/2022 07:15:22 - INFO - codeparrot_training - Step 12110: {'lr': 0.00044723751245396117, 'samples': 2325312, 'steps': 12110, 'loss/train': 0.8462562561035156} 01/27/2022 07:15:27 - INFO - codeparrot_training - Step 12111: {'lr': 0.00044722745799685227, 'samples': 2325504, 'steps': 12111, 'loss/train': 1.192044049501419} 01/27/2022 07:15:30 - INFO - codeparrot_training - Step 12112: {'lr': 0.00044721740269488354, 'samples': 2325696, 'steps': 12112, 'loss/train': 1.175875961780548} 01/27/2022 07:15:33 - INFO - codeparrot_training - Step 12113: {'lr': 0.0004472073465480981, 'samples': 2325888, 'steps': 12113, 'loss/train': 0.8381420373916626} 01/27/2022 07:15:37 - INFO - codeparrot_training - Step 12114: {'lr': 0.000447197289556539, 'samples': 2326080, 'steps': 12114, 'loss/train': 1.0438521802425385} 01/27/2022 07:15:40 - INFO - codeparrot_training - Step 12115: {'lr': 0.0004471872317202493, 'samples': 2326272, 'steps': 12115, 'loss/train': 0.49265094101428986} 01/27/2022 07:15:43 - INFO - codeparrot_training - Step 12116: {'lr': 0.0004471771730392722, 'samples': 2326464, 'steps': 12116, 'loss/train': 1.2140796482563019} 01/27/2022 07:15:46 - INFO - codeparrot_training - Step 12117: {'lr': 0.00044716711351365057, 'samples': 2326656, 'steps': 12117, 'loss/train': 0.8684354424476624} 01/27/2022 07:15:49 - INFO - codeparrot_training - Step 12118: {'lr': 0.00044715705314342776, 'samples': 2326848, 'steps': 12118, 'loss/train': 0.7329856753349304} 01/27/2022 07:15:52 - INFO - codeparrot_training - Step 12119: {'lr': 0.0004471469919286467, 'samples': 2327040, 'steps': 12119, 'loss/train': 1.3291119933128357} 01/27/2022 07:15:57 - INFO - codeparrot_training - Step 12120: {'lr': 0.0004471369298693505, 'samples': 2327232, 'steps': 12120, 'loss/train': 0.8180459439754486} 01/27/2022 07:16:00 - INFO - codeparrot_training - Step 12121: {'lr': 0.0004471268669655822, 'samples': 2327424, 'steps': 12121, 'loss/train': 0.7554691135883331} 01/27/2022 07:16:03 - INFO - codeparrot_training - Step 12122: {'lr': 0.0004471168032173852, 'samples': 2327616, 'steps': 12122, 'loss/train': 0.3980603814125061} 01/27/2022 07:16:06 - INFO - codeparrot_training - Step 12123: {'lr': 0.0004471067386248023, 'samples': 2327808, 'steps': 12123, 'loss/train': 1.3624224364757538} 01/27/2022 07:16:09 - INFO - codeparrot_training - Step 12124: {'lr': 0.0004470966731878767, 'samples': 2328000, 'steps': 12124, 'loss/train': 1.0386950969696045} 01/27/2022 07:16:13 - INFO - codeparrot_training - Step 12125: {'lr': 0.0004470866069066516, 'samples': 2328192, 'steps': 12125, 'loss/train': 0.7187041640281677} 01/27/2022 07:16:16 - INFO - codeparrot_training - Step 12126: {'lr': 0.00044707653978117004, 'samples': 2328384, 'steps': 12126, 'loss/train': 0.8292063474655151} 01/27/2022 07:16:19 - INFO - codeparrot_training - Step 12127: {'lr': 0.00044706647181147507, 'samples': 2328576, 'steps': 12127, 'loss/train': 1.0797443389892578} 01/27/2022 07:16:22 - INFO - codeparrot_training - Step 12128: {'lr': 0.00044705640299761004, 'samples': 2328768, 'steps': 12128, 'loss/train': 0.8729239404201508} 01/27/2022 07:16:27 - INFO - codeparrot_training - Step 12129: 
{'lr': 0.0004470463333396179, 'samples': 2328960, 'steps': 12129, 'loss/train': 0.6192236691713333} 01/27/2022 07:16:30 - INFO - codeparrot_training - Step 12130: {'lr': 0.0004470362628375418, 'samples': 2329152, 'steps': 12130, 'loss/train': 0.7966689169406891} 01/27/2022 07:16:33 - INFO - codeparrot_training - Step 12131: {'lr': 0.000447026191491425, 'samples': 2329344, 'steps': 12131, 'loss/train': 1.4475993812084198} 01/27/2022 07:16:37 - INFO - codeparrot_training - Step 12132: {'lr': 0.0004470161193013105, 'samples': 2329536, 'steps': 12132, 'loss/train': 0.9049491584300995} 01/27/2022 07:16:40 - INFO - codeparrot_training - Step 12133: {'lr': 0.0004470060462672415, 'samples': 2329728, 'steps': 12133, 'loss/train': 0.8381018936634064} 01/27/2022 07:16:43 - INFO - codeparrot_training - Step 12134: {'lr': 0.0004469959723892612, 'samples': 2329920, 'steps': 12134, 'loss/train': 0.8697022497653961} 01/27/2022 07:16:46 - INFO - codeparrot_training - Step 12135: {'lr': 0.0004469858976674126, 'samples': 2330112, 'steps': 12135, 'loss/train': 0.6129904836416245} 01/27/2022 07:16:49 - INFO - codeparrot_training - Step 12136: {'lr': 0.000446975822101739, 'samples': 2330304, 'steps': 12136, 'loss/train': 1.1163817942142487} 01/27/2022 07:16:52 - INFO - codeparrot_training - Step 12137: {'lr': 0.00044696574569228365, 'samples': 2330496, 'steps': 12137, 'loss/train': 0.8623010516166687} 01/27/2022 07:16:57 - INFO - codeparrot_training - Step 12138: {'lr': 0.00044695566843908947, 'samples': 2330688, 'steps': 12138, 'loss/train': 0.8916383385658264} 01/27/2022 07:17:00 - INFO - codeparrot_training - Step 12139: {'lr': 0.0004469455903421998, 'samples': 2330880, 'steps': 12139, 'loss/train': 0.5753321349620819} 01/27/2022 07:17:03 - INFO - codeparrot_training - Step 12140: {'lr': 0.0004469355114016577, 'samples': 2331072, 'steps': 12140, 'loss/train': 0.5466098338365555} 01/27/2022 07:17:06 - INFO - codeparrot_training - Step 12141: {'lr': 0.0004469254316175065, 'samples': 2331264, 'steps': 12141, 'loss/train': 1.2085741460323334} 01/27/2022 07:17:09 - INFO - codeparrot_training - Step 12142: {'lr': 0.0004469153509897892, 'samples': 2331456, 'steps': 12142, 'loss/train': 0.7110555768013} 01/27/2022 07:17:12 - INFO - codeparrot_training - Step 12143: {'lr': 0.00044690526951854907, 'samples': 2331648, 'steps': 12143, 'loss/train': 0.7945336997509003} 01/27/2022 07:17:16 - INFO - codeparrot_training - Step 12144: {'lr': 0.0004468951872038293, 'samples': 2331840, 'steps': 12144, 'loss/train': 0.8071233630180359} 01/27/2022 07:17:19 - INFO - codeparrot_training - Step 12145: {'lr': 0.00044688510404567307, 'samples': 2332032, 'steps': 12145, 'loss/train': 0.9313871562480927} 01/27/2022 07:17:23 - INFO - codeparrot_training - Step 12146: {'lr': 0.0004468750200441236, 'samples': 2332224, 'steps': 12146, 'loss/train': 0.7404634952545166} 01/27/2022 07:17:26 - INFO - codeparrot_training - Step 12147: {'lr': 0.00044686493519922405, 'samples': 2332416, 'steps': 12147, 'loss/train': 1.4853056073188782} 01/27/2022 07:17:29 - INFO - codeparrot_training - Step 12148: {'lr': 0.00044685484951101763, 'samples': 2332608, 'steps': 12148, 'loss/train': 0.6128359436988831} 01/27/2022 07:17:33 - INFO - codeparrot_training - Step 12149: {'lr': 0.0004468447629795475, 'samples': 2332800, 'steps': 12149, 'loss/train': 1.1494457423686981} 01/27/2022 07:17:36 - INFO - codeparrot_training - Step 12150: {'lr': 0.00044683467560485696, 'samples': 2332992, 'steps': 12150, 'loss/train': 0.7545457184314728} 01/27/2022 07:17:39 - INFO - 
codeparrot_training - Step 12151: {'lr': 0.00044682458738698916, 'samples': 2333184, 'steps': 12151, 'loss/train': 1.0413522720336914} 01/27/2022 07:17:42 - INFO - codeparrot_training - Step 12152: {'lr': 0.0004468144983259873, 'samples': 2333376, 'steps': 12152, 'loss/train': 0.786812961101532} 01/27/2022 07:17:45 - INFO - codeparrot_training - Step 12153: {'lr': 0.00044680440842189464, 'samples': 2333568, 'steps': 12153, 'loss/train': 0.8756998479366302} 01/27/2022 07:17:48 - INFO - codeparrot_training - Step 12154: {'lr': 0.0004467943176747544, 'samples': 2333760, 'steps': 12154, 'loss/train': 0.88331338763237} 01/27/2022 07:17:53 - INFO - codeparrot_training - Step 12155: {'lr': 0.0004467842260846098, 'samples': 2333952, 'steps': 12155, 'loss/train': 1.1141711175441742} 01/27/2022 07:17:57 - INFO - codeparrot_training - Step 12156: {'lr': 0.00044677413365150397, 'samples': 2334144, 'steps': 12156, 'loss/train': 1.0314733386039734} 01/27/2022 07:18:00 - INFO - codeparrot_training - Step 12157: {'lr': 0.00044676404037548035, 'samples': 2334336, 'steps': 12157, 'loss/train': 1.5219733715057373} 01/27/2022 07:18:03 - INFO - codeparrot_training - Step 12158: {'lr': 0.0004467539462565821, 'samples': 2334528, 'steps': 12158, 'loss/train': 0.5064755827188492} 01/27/2022 07:18:06 - INFO - codeparrot_training - Step 12159: {'lr': 0.0004467438512948523, 'samples': 2334720, 'steps': 12159, 'loss/train': 1.1947512924671173} 01/27/2022 07:18:09 - INFO - codeparrot_training - Step 12160: {'lr': 0.00044673375549033435, 'samples': 2334912, 'steps': 12160, 'loss/train': 0.9128781259059906} 01/27/2022 07:18:12 - INFO - codeparrot_training - Step 12161: {'lr': 0.0004467236588430714, 'samples': 2335104, 'steps': 12161, 'loss/train': 0.809264212846756} 01/27/2022 07:18:15 - INFO - codeparrot_training - Step 12162: {'lr': 0.00044671356135310685, 'samples': 2335296, 'steps': 12162, 'loss/train': 1.010197252035141} 01/27/2022 07:18:19 - INFO - codeparrot_training - Step 12163: {'lr': 0.0004467034630204839, 'samples': 2335488, 'steps': 12163, 'loss/train': 0.8852156102657318} 01/27/2022 07:18:23 - INFO - codeparrot_training - Step 12164: {'lr': 0.0004466933638452457, 'samples': 2335680, 'steps': 12164, 'loss/train': 0.19550246000289917} 01/27/2022 07:18:26 - INFO - codeparrot_training - Step 12165: {'lr': 0.0004466832638274356, 'samples': 2335872, 'steps': 12165, 'loss/train': 0.7252181321382523} 01/27/2022 07:18:29 - INFO - codeparrot_training - Step 12166: {'lr': 0.0004466731629670969, 'samples': 2336064, 'steps': 12166, 'loss/train': 0.959660679101944} 01/27/2022 07:18:33 - INFO - codeparrot_training - Step 12167: {'lr': 0.00044666306126427276, 'samples': 2336256, 'steps': 12167, 'loss/train': 0.870833694934845} 01/27/2022 07:18:36 - INFO - codeparrot_training - Step 12168: {'lr': 0.00044665295871900655, 'samples': 2336448, 'steps': 12168, 'loss/train': 0.40345050394535065} 01/27/2022 07:18:39 - INFO - codeparrot_training - Step 12169: {'lr': 0.0004466428553313415, 'samples': 2336640, 'steps': 12169, 'loss/train': 0.8193452954292297} 01/27/2022 07:18:42 - INFO - codeparrot_training - Step 12170: {'lr': 0.0004466327511013208, 'samples': 2336832, 'steps': 12170, 'loss/train': 1.130470722913742} 01/27/2022 07:18:45 - INFO - codeparrot_training - Step 12171: {'lr': 0.00044662264602898794, 'samples': 2337024, 'steps': 12171, 'loss/train': 0.3917778432369232} 01/27/2022 07:18:48 - INFO - codeparrot_training - Step 12172: {'lr': 0.00044661254011438614, 'samples': 2337216, 'steps': 12172, 'loss/train': 
0.9046306014060974} 01/27/2022 07:18:53 - INFO - codeparrot_training - Step 12173: {'lr': 0.00044660243335755854, 'samples': 2337408, 'steps': 12173, 'loss/train': 0.4966825693845749} 01/27/2022 07:18:56 - INFO - codeparrot_training - Step 12174: {'lr': 0.00044659232575854866, 'samples': 2337600, 'steps': 12174, 'loss/train': 0.9395710229873657} 01/27/2022 07:18:59 - INFO - codeparrot_training - Step 12175: {'lr': 0.00044658221731739954, 'samples': 2337792, 'steps': 12175, 'loss/train': 0.9343090653419495} 01/27/2022 07:19:02 - INFO - codeparrot_training - Step 12176: {'lr': 0.0004465721080341547, 'samples': 2337984, 'steps': 12176, 'loss/train': 0.9292960166931152} 01/27/2022 07:19:05 - INFO - codeparrot_training - Step 12177: {'lr': 0.00044656199790885743, 'samples': 2338176, 'steps': 12177, 'loss/train': 0.5663747638463974} 01/27/2022 07:19:08 - INFO - codeparrot_training - Step 12178: {'lr': 0.0004465518869415509, 'samples': 2338368, 'steps': 12178, 'loss/train': 0.7830042243003845} 01/27/2022 07:19:12 - INFO - codeparrot_training - Step 12179: {'lr': 0.0004465417751322785, 'samples': 2338560, 'steps': 12179, 'loss/train': 0.7380959987640381} 01/27/2022 07:19:15 - INFO - codeparrot_training - Step 12180: {'lr': 0.00044653166248108357, 'samples': 2338752, 'steps': 12180, 'loss/train': 1.6130987405776978} 01/27/2022 07:19:19 - INFO - codeparrot_training - Step 12181: {'lr': 0.00044652154898800937, 'samples': 2338944, 'steps': 12181, 'loss/train': 0.738875538110733} 01/27/2022 07:19:22 - INFO - codeparrot_training - Step 12182: {'lr': 0.0004465114346530993, 'samples': 2339136, 'steps': 12182, 'loss/train': 0.8674822747707367} 01/27/2022 07:19:25 - INFO - codeparrot_training - Step 12183: {'lr': 0.0004465013194763966, 'samples': 2339328, 'steps': 12183, 'loss/train': 0.6507478058338165} 01/27/2022 07:19:29 - INFO - codeparrot_training - Step 12184: {'lr': 0.0004464912034579447, 'samples': 2339520, 'steps': 12184, 'loss/train': 0.555087223649025} 01/27/2022 07:19:32 - INFO - codeparrot_training - Step 12185: {'lr': 0.00044648108659778687, 'samples': 2339712, 'steps': 12185, 'loss/train': 0.9097442328929901} 01/27/2022 07:19:35 - INFO - codeparrot_training - Step 12186: {'lr': 0.0004464709688959664, 'samples': 2339904, 'steps': 12186, 'loss/train': 0.966466873884201} 01/27/2022 07:19:38 - INFO - codeparrot_training - Step 12187: {'lr': 0.0004464608503525267, 'samples': 2340096, 'steps': 12187, 'loss/train': 0.9187323153018951} 01/27/2022 07:19:41 - INFO - codeparrot_training - Step 12188: {'lr': 0.0004464507309675111, 'samples': 2340288, 'steps': 12188, 'loss/train': 0.8589552640914917} 01/27/2022 07:19:44 - INFO - codeparrot_training - Step 12189: {'lr': 0.000446440610740963, 'samples': 2340480, 'steps': 12189, 'loss/train': 0.7072043269872665} 01/27/2022 07:19:49 - INFO - codeparrot_training - Step 12190: {'lr': 0.0004464304896729257, 'samples': 2340672, 'steps': 12190, 'loss/train': 1.2750205993652344} 01/27/2022 07:19:53 - INFO - codeparrot_training - Step 12191: {'lr': 0.0004464203677634424, 'samples': 2340864, 'steps': 12191, 'loss/train': 0.986696720123291} 01/27/2022 07:19:56 - INFO - codeparrot_training - Step 12192: {'lr': 0.0004464102450125568, 'samples': 2341056, 'steps': 12192, 'loss/train': 0.33626212924718857} 01/27/2022 07:19:59 - INFO - codeparrot_training - Step 12193: {'lr': 0.00044640012142031196, 'samples': 2341248, 'steps': 12193, 'loss/train': 0.8235611915588379} 01/27/2022 07:20:02 - INFO - codeparrot_training - Step 12194: {'lr': 0.0004463899969867514, 'samples': 
2341440, 'steps': 12194, 'loss/train': 1.0508825182914734} 01/27/2022 07:20:05 - INFO - codeparrot_training - Step 12195: {'lr': 0.0004463798717119185, 'samples': 2341632, 'steps': 12195, 'loss/train': 0.8824029564857483} 01/27/2022 07:20:08 - INFO - codeparrot_training - Step 12196: {'lr': 0.00044636974559585655, 'samples': 2341824, 'steps': 12196, 'loss/train': 0.576947495341301} 01/27/2022 07:20:11 - INFO - codeparrot_training - Step 12197: {'lr': 0.00044635961863860894, 'samples': 2342016, 'steps': 12197, 'loss/train': 0.9124509394168854} 01/27/2022 07:20:14 - INFO - codeparrot_training - Step 12198: {'lr': 0.00044634949084021913, 'samples': 2342208, 'steps': 12198, 'loss/train': 0.9137145280838013} 01/27/2022 07:20:19 - INFO - codeparrot_training - Step 12199: {'lr': 0.0004463393622007305, 'samples': 2342400, 'steps': 12199, 'loss/train': 1.0226755142211914} 01/27/2022 07:20:22 - INFO - codeparrot_training - Step 12200: {'lr': 0.0004463292327201862, 'samples': 2342592, 'steps': 12200, 'loss/train': 0.8194570541381836} 01/27/2022 07:20:25 - INFO - codeparrot_training - Step 12201: {'lr': 0.0004463191023986299, 'samples': 2342784, 'steps': 12201, 'loss/train': 0.9454659819602966} 01/27/2022 07:20:28 - INFO - codeparrot_training - Step 12202: {'lr': 0.00044630897123610497, 'samples': 2342976, 'steps': 12202, 'loss/train': 1.0038117170333862} 01/27/2022 07:20:31 - INFO - codeparrot_training - Step 12203: {'lr': 0.0004462988392326547, 'samples': 2343168, 'steps': 12203, 'loss/train': 1.3172310590744019} 01/27/2022 07:20:35 - INFO - codeparrot_training - Step 12204: {'lr': 0.00044628870638832254, 'samples': 2343360, 'steps': 12204, 'loss/train': 0.7216599136590958} 01/27/2022 07:20:38 - INFO - codeparrot_training - Step 12205: {'lr': 0.00044627857270315187, 'samples': 2343552, 'steps': 12205, 'loss/train': 0.37066923826932907} 01/27/2022 07:20:41 - INFO - codeparrot_training - Step 12206: {'lr': 0.00044626843817718615, 'samples': 2343744, 'steps': 12206, 'loss/train': 1.281747579574585} 01/27/2022 07:20:44 - INFO - codeparrot_training - Step 12207: {'lr': 0.00044625830281046875, 'samples': 2343936, 'steps': 12207, 'loss/train': 0.6657535135746002} 01/27/2022 07:20:49 - INFO - codeparrot_training - Step 12208: {'lr': 0.0004462481666030431, 'samples': 2344128, 'steps': 12208, 'loss/train': 0.6654796600341797} 01/27/2022 07:20:52 - INFO - codeparrot_training - Step 12209: {'lr': 0.0004462380295549526, 'samples': 2344320, 'steps': 12209, 'loss/train': 0.961599200963974} 01/27/2022 07:20:56 - INFO - codeparrot_training - Step 12210: {'lr': 0.0004462278916662407, 'samples': 2344512, 'steps': 12210, 'loss/train': 1.1796616315841675} 01/27/2022 07:20:59 - INFO - codeparrot_training - Step 12211: {'lr': 0.00044621775293695085, 'samples': 2344704, 'steps': 12211, 'loss/train': 0.46137991547584534} 01/27/2022 07:21:02 - INFO - codeparrot_training - Step 12212: {'lr': 0.00044620761336712646, 'samples': 2344896, 'steps': 12212, 'loss/train': 0.42424432933330536} 01/27/2022 07:21:05 - INFO - codeparrot_training - Step 12213: {'lr': 0.0004461974729568109, 'samples': 2345088, 'steps': 12213, 'loss/train': 0.7092628329992294} 01/27/2022 07:21:08 - INFO - codeparrot_training - Step 12214: {'lr': 0.0004461873317060477, 'samples': 2345280, 'steps': 12214, 'loss/train': 0.8930952250957489} 01/27/2022 07:21:11 - INFO - codeparrot_training - Step 12215: {'lr': 0.00044617718961488024, 'samples': 2345472, 'steps': 12215, 'loss/train': 0.612250417470932} 01/27/2022 07:21:14 - INFO - codeparrot_training - Step 12216: 
{'lr': 0.000446167046683352, 'samples': 2345664, 'steps': 12216, 'loss/train': 0.18437924981117249} 01/27/2022 07:21:19 - INFO - codeparrot_training - Step 12217: {'lr': 0.0004461569029115065, 'samples': 2345856, 'steps': 12217, 'loss/train': 1.2182866930961609} 01/27/2022 07:21:22 - INFO - codeparrot_training - Step 12218: {'lr': 0.000446146758299387, 'samples': 2346048, 'steps': 12218, 'loss/train': 0.432705357670784} 01/27/2022 07:21:25 - INFO - codeparrot_training - Step 12219: {'lr': 0.0004461366128470371, 'samples': 2346240, 'steps': 12219, 'loss/train': 0.777524471282959} 01/27/2022 07:21:28 - INFO - codeparrot_training - Step 12220: {'lr': 0.0004461264665545003, 'samples': 2346432, 'steps': 12220, 'loss/train': 0.9995087385177612} 01/27/2022 07:21:31 - INFO - codeparrot_training - Step 12221: {'lr': 0.00044611631942182, 'samples': 2346624, 'steps': 12221, 'loss/train': 0.6547381728887558} 01/27/2022 07:21:35 - INFO - codeparrot_training - Step 12222: {'lr': 0.0004461061714490395, 'samples': 2346816, 'steps': 12222, 'loss/train': 0.5131990760564804} 01/27/2022 07:21:38 - INFO - codeparrot_training - Step 12223: {'lr': 0.0004460960226362026, 'samples': 2347008, 'steps': 12223, 'loss/train': 0.5961633324623108} 01/27/2022 07:21:41 - INFO - codeparrot_training - Step 12224: {'lr': 0.0004460858729833525, 'samples': 2347200, 'steps': 12224, 'loss/train': 1.0264840722084045} 01/27/2022 07:21:45 - INFO - codeparrot_training - Step 12225: {'lr': 0.00044607572249053283, 'samples': 2347392, 'steps': 12225, 'loss/train': 0.750999391078949} 01/27/2022 07:21:49 - INFO - codeparrot_training - Step 12226: {'lr': 0.0004460655711577871, 'samples': 2347584, 'steps': 12226, 'loss/train': 0.6822444945573807} 01/27/2022 07:21:52 - INFO - codeparrot_training - Step 12227: {'lr': 0.00044605541898515863, 'samples': 2347776, 'steps': 12227, 'loss/train': 0.3330850750207901} 01/27/2022 07:21:55 - INFO - codeparrot_training - Step 12228: {'lr': 0.00044604526597269103, 'samples': 2347968, 'steps': 12228, 'loss/train': 0.37638477981090546} 01/27/2022 07:21:58 - INFO - codeparrot_training - Step 12229: {'lr': 0.0004460351121204277, 'samples': 2348160, 'steps': 12229, 'loss/train': 0.7153971344232559} 01/27/2022 07:22:01 - INFO - codeparrot_training - Step 12230: {'lr': 0.00044602495742841226, 'samples': 2348352, 'steps': 12230, 'loss/train': 1.1074588894844055} 01/27/2022 07:22:04 - INFO - codeparrot_training - Step 12231: {'lr': 0.00044601480189668816, 'samples': 2348544, 'steps': 12231, 'loss/train': 0.6604180634021759} 01/27/2022 07:22:07 - INFO - codeparrot_training - Step 12232: {'lr': 0.00044600464552529886, 'samples': 2348736, 'steps': 12232, 'loss/train': 1.1379228830337524} 01/27/2022 07:22:11 - INFO - codeparrot_training - Step 12233: {'lr': 0.0004459944883142879, 'samples': 2348928, 'steps': 12233, 'loss/train': 0.4476655572652817} 01/27/2022 07:22:16 - INFO - codeparrot_training - Step 12234: {'lr': 0.0004459843302636988, 'samples': 2349120, 'steps': 12234, 'loss/train': 1.1159464716911316} 01/27/2022 07:22:19 - INFO - codeparrot_training - Step 12235: {'lr': 0.000445974171373575, 'samples': 2349312, 'steps': 12235, 'loss/train': 1.0519877672195435} 01/27/2022 07:22:22 - INFO - codeparrot_training - Step 12236: {'lr': 0.0004459640116439602, 'samples': 2349504, 'steps': 12236, 'loss/train': 0.9409769475460052} 01/27/2022 07:22:25 - INFO - codeparrot_training - Step 12237: {'lr': 0.0004459538510748977, 'samples': 2349696, 'steps': 12237, 'loss/train': 0.8491616249084473} 01/27/2022 07:22:28 - INFO - 
codeparrot_training - Step 12238: {'lr': 0.0004459436896664312, 'samples': 2349888, 'steps': 12238, 'loss/train': 1.0243788063526154} 01/27/2022 07:22:31 - INFO - codeparrot_training - Step 12239: {'lr': 0.00044593352741860404, 'samples': 2350080, 'steps': 12239, 'loss/train': 0.9154432117938995} 01/27/2022 07:22:35 - INFO - codeparrot_training - Step 12240: {'lr': 0.00044592336433145995, 'samples': 2350272, 'steps': 12240, 'loss/train': 1.1262227296829224} 01/27/2022 07:22:38 - INFO - codeparrot_training - Step 12241: {'lr': 0.00044591320040504237, 'samples': 2350464, 'steps': 12241, 'loss/train': 0.8121738731861115} 01/27/2022 07:22:41 - INFO - codeparrot_training - Step 12242: {'lr': 0.00044590303563939485, 'samples': 2350656, 'steps': 12242, 'loss/train': 0.7390766143798828} 01/27/2022 07:22:45 - INFO - codeparrot_training - Step 12243: {'lr': 0.0004458928700345609, 'samples': 2350848, 'steps': 12243, 'loss/train': 1.7318324446678162} 01/27/2022 07:22:49 - INFO - codeparrot_training - Step 12244: {'lr': 0.00044588270359058416, 'samples': 2351040, 'steps': 12244, 'loss/train': 0.9543865621089935} 01/27/2022 07:22:52 - INFO - codeparrot_training - Step 12245: {'lr': 0.000445872536307508, 'samples': 2351232, 'steps': 12245, 'loss/train': 0.7179437577724457} 01/27/2022 07:22:55 - INFO - codeparrot_training - Step 12246: {'lr': 0.0004458623681853762, 'samples': 2351424, 'steps': 12246, 'loss/train': 0.746115043759346} 01/27/2022 07:22:58 - INFO - codeparrot_training - Step 12247: {'lr': 0.0004458521992242322, 'samples': 2351616, 'steps': 12247, 'loss/train': 0.7091953754425049} 01/27/2022 07:23:01 - INFO - codeparrot_training - Step 12248: {'lr': 0.00044584202942411956, 'samples': 2351808, 'steps': 12248, 'loss/train': 0.45215070247650146} 01/27/2022 07:23:04 - INFO - codeparrot_training - Step 12249: {'lr': 0.00044583185878508183, 'samples': 2352000, 'steps': 12249, 'loss/train': 0.6831008344888687} 01/27/2022 07:23:07 - INFO - codeparrot_training - Step 12250: {'lr': 0.0004458216873071626, 'samples': 2352192, 'steps': 12250, 'loss/train': 0.7945852875709534} 01/27/2022 07:23:11 - INFO - codeparrot_training - Step 12251: {'lr': 0.00044581151499040547, 'samples': 2352384, 'steps': 12251, 'loss/train': 0.9868383407592773} 01/27/2022 07:23:16 - INFO - codeparrot_training - Step 12252: {'lr': 0.000445801341834854, 'samples': 2352576, 'steps': 12252, 'loss/train': 0.2821195274591446} 01/27/2022 07:23:19 - INFO - codeparrot_training - Step 12253: {'lr': 0.0004457911678405517, 'samples': 2352768, 'steps': 12253, 'loss/train': 0.48157185316085815} 01/27/2022 07:23:22 - INFO - codeparrot_training - Step 12254: {'lr': 0.0004457809930075422, 'samples': 2352960, 'steps': 12254, 'loss/train': 0.6686439961194992} 01/27/2022 07:23:25 - INFO - codeparrot_training - Step 12255: {'lr': 0.0004457708173358691, 'samples': 2353152, 'steps': 12255, 'loss/train': 0.33968614786863327} 01/27/2022 07:23:28 - INFO - codeparrot_training - Step 12256: {'lr': 0.00044576064082557605, 'samples': 2353344, 'steps': 12256, 'loss/train': 0.8103116154670715} 01/27/2022 07:23:31 - INFO - codeparrot_training - Step 12257: {'lr': 0.0004457504634767066, 'samples': 2353536, 'steps': 12257, 'loss/train': 0.6009193807840347} 01/27/2022 07:23:34 - INFO - codeparrot_training - Step 12258: {'lr': 0.0004457402852893042, 'samples': 2353728, 'steps': 12258, 'loss/train': 0.6895751953125} 01/27/2022 07:23:38 - INFO - codeparrot_training - Step 12259: {'lr': 0.0004457301062634126, 'samples': 2353920, 'steps': 12259, 'loss/train': 
0.43402621150016785} 01/27/2022 07:23:41 - INFO - codeparrot_training - Step 12260: {'lr': 0.0004457199263990754, 'samples': 2354112, 'steps': 12260, 'loss/train': 0.8044395446777344} 01/27/2022 07:23:45 - INFO - codeparrot_training - Step 12261: {'lr': 0.0004457097456963362, 'samples': 2354304, 'steps': 12261, 'loss/train': 0.9303604960441589} 01/27/2022 07:23:49 - INFO - codeparrot_training - Step 12262: {'lr': 0.0004456995641552386, 'samples': 2354496, 'steps': 12262, 'loss/train': 1.258493810892105} 01/27/2022 07:23:52 - INFO - codeparrot_training - Step 12263: {'lr': 0.0004456893817758262, 'samples': 2354688, 'steps': 12263, 'loss/train': 0.6735131442546844} 01/27/2022 07:23:55 - INFO - codeparrot_training - Step 12264: {'lr': 0.00044567919855814257, 'samples': 2354880, 'steps': 12264, 'loss/train': 1.0348646342754364} 01/27/2022 07:23:58 - INFO - codeparrot_training - Step 12265: {'lr': 0.0004456690145022314, 'samples': 2355072, 'steps': 12265, 'loss/train': 0.7712135016918182} 01/27/2022 07:24:01 - INFO - codeparrot_training - Step 12266: {'lr': 0.0004456588296081364, 'samples': 2355264, 'steps': 12266, 'loss/train': 1.054212212562561} 01/27/2022 07:24:04 - INFO - codeparrot_training - Step 12267: {'lr': 0.000445648643875901, 'samples': 2355456, 'steps': 12267, 'loss/train': 1.6617761850357056} 01/27/2022 07:24:08 - INFO - codeparrot_training - Step 12268: {'lr': 0.000445638457305569, 'samples': 2355648, 'steps': 12268, 'loss/train': 0.8247804343700409} 01/27/2022 07:24:11 - INFO - codeparrot_training - Step 12269: {'lr': 0.00044562826989718397, 'samples': 2355840, 'steps': 12269, 'loss/train': 1.2301734387874603} 01/27/2022 07:24:15 - INFO - codeparrot_training - Step 12270: {'lr': 0.00044561808165078954, 'samples': 2356032, 'steps': 12270, 'loss/train': 0.8731019496917725} 01/27/2022 07:24:19 - INFO - codeparrot_training - Step 12271: {'lr': 0.0004456078925664293, 'samples': 2356224, 'steps': 12271, 'loss/train': 1.0002182722091675} 01/27/2022 07:24:22 - INFO - codeparrot_training - Step 12272: {'lr': 0.000445597702644147, 'samples': 2356416, 'steps': 12272, 'loss/train': 0.6764575839042664} 01/27/2022 07:24:25 - INFO - codeparrot_training - Step 12273: {'lr': 0.0004455875118839863, 'samples': 2356608, 'steps': 12273, 'loss/train': 0.8105164468288422} 01/27/2022 07:24:28 - INFO - codeparrot_training - Step 12274: {'lr': 0.00044557732028599077, 'samples': 2356800, 'steps': 12274, 'loss/train': 0.7983677387237549} 01/27/2022 07:24:31 - INFO - codeparrot_training - Step 12275: {'lr': 0.0004455671278502041, 'samples': 2356992, 'steps': 12275, 'loss/train': 0.6631886512041092} 01/27/2022 07:24:34 - INFO - codeparrot_training - Step 12276: {'lr': 0.00044555693457667, 'samples': 2357184, 'steps': 12276, 'loss/train': 0.787754237651825} 01/27/2022 07:24:37 - INFO - codeparrot_training - Step 12277: {'lr': 0.000445546740465432, 'samples': 2357376, 'steps': 12277, 'loss/train': 0.5789844989776611} 01/27/2022 07:24:41 - INFO - codeparrot_training - Step 12278: {'lr': 0.00044553654551653387, 'samples': 2357568, 'steps': 12278, 'loss/train': 0.6162083297967911} 01/27/2022 07:24:45 - INFO - codeparrot_training - Step 12279: {'lr': 0.0004455263497300194, 'samples': 2357760, 'steps': 12279, 'loss/train': 0.9969406127929688} 01/27/2022 07:24:48 - INFO - codeparrot_training - Step 12280: {'lr': 0.000445516153105932, 'samples': 2357952, 'steps': 12280, 'loss/train': 0.8415431678295135} 01/27/2022 07:24:51 - INFO - codeparrot_training - Step 12281: {'lr': 0.0004455059556443155, 'samples': 2358144, 
'steps': 12281, 'loss/train': 0.7796682715415955} 01/27/2022 07:24:55 - INFO - codeparrot_training - Step 12282: {'lr': 0.0004454957573452136, 'samples': 2358336, 'steps': 12282, 'loss/train': 0.5590267181396484} 01/27/2022 07:24:58 - INFO - codeparrot_training - Step 12283: {'lr': 0.0004454855582086699, 'samples': 2358528, 'steps': 12283, 'loss/train': 0.7744691669940948} 01/27/2022 07:25:01 - INFO - codeparrot_training - Step 12284: {'lr': 0.0004454753582347282, 'samples': 2358720, 'steps': 12284, 'loss/train': 0.9759856760501862} 01/27/2022 07:25:04 - INFO - codeparrot_training - Step 12285: {'lr': 0.00044546515742343207, 'samples': 2358912, 'steps': 12285, 'loss/train': 1.2458005249500275} 01/27/2022 07:25:07 - INFO - codeparrot_training - Step 12286: {'lr': 0.00044545495577482535, 'samples': 2359104, 'steps': 12286, 'loss/train': 0.835276186466217} 01/27/2022 07:25:12 - INFO - codeparrot_training - Step 12287: {'lr': 0.00044544475328895164, 'samples': 2359296, 'steps': 12287, 'loss/train': 1.3594876527786255} 01/27/2022 07:25:15 - INFO - codeparrot_training - Step 12288: {'lr': 0.00044543454996585463, 'samples': 2359488, 'steps': 12288, 'loss/train': 0.5855644047260284} 01/27/2022 07:25:18 - INFO - codeparrot_training - Step 12289: {'lr': 0.0004454243458055781, 'samples': 2359680, 'steps': 12289, 'loss/train': 0.9623049795627594} 01/27/2022 07:25:22 - INFO - codeparrot_training - Step 12290: {'lr': 0.00044541414080816573, 'samples': 2359872, 'steps': 12290, 'loss/train': 0.9766892194747925} 01/27/2022 07:25:25 - INFO - codeparrot_training - Step 12291: {'lr': 0.00044540393497366124, 'samples': 2360064, 'steps': 12291, 'loss/train': 1.1587992310523987} 01/27/2022 07:25:28 - INFO - codeparrot_training - Step 12292: {'lr': 0.00044539372830210833, 'samples': 2360256, 'steps': 12292, 'loss/train': 0.6058519631624222} 01/27/2022 07:25:31 - INFO - codeparrot_training - Step 12293: {'lr': 0.0004453835207935507, 'samples': 2360448, 'steps': 12293, 'loss/train': 0.7806226909160614} 01/27/2022 07:25:34 - INFO - codeparrot_training - Step 12294: {'lr': 0.0004453733124480321, 'samples': 2360640, 'steps': 12294, 'loss/train': 0.6116857677698135} 01/27/2022 07:25:37 - INFO - codeparrot_training - Step 12295: {'lr': 0.0004453631032655964, 'samples': 2360832, 'steps': 12295, 'loss/train': 0.8175734281539917} 01/27/2022 07:25:42 - INFO - codeparrot_training - Step 12296: {'lr': 0.00044535289324628704, 'samples': 2361024, 'steps': 12296, 'loss/train': 1.1028328835964203} 01/27/2022 07:25:45 - INFO - codeparrot_training - Step 12297: {'lr': 0.00044534268239014796, 'samples': 2361216, 'steps': 12297, 'loss/train': 0.5616368651390076} 01/27/2022 07:25:48 - INFO - codeparrot_training - Step 12298: {'lr': 0.00044533247069722295, 'samples': 2361408, 'steps': 12298, 'loss/train': 1.0281177163124084} 01/27/2022 07:25:51 - INFO - codeparrot_training - Step 12299: {'lr': 0.0004453222581675556, 'samples': 2361600, 'steps': 12299, 'loss/train': 0.7002411782741547} 01/27/2022 07:25:54 - INFO - codeparrot_training - Step 12300: {'lr': 0.0004453120448011897, 'samples': 2361792, 'steps': 12300, 'loss/train': 0.5164918899536133} 01/27/2022 07:25:57 - INFO - codeparrot_training - Step 12301: {'lr': 0.00044530183059816896, 'samples': 2361984, 'steps': 12301, 'loss/train': 1.1397797763347626} 01/27/2022 07:26:01 - INFO - codeparrot_training - Step 12302: {'lr': 0.00044529161555853725, 'samples': 2362176, 'steps': 12302, 'loss/train': 0.835210382938385} 01/27/2022 07:26:04 - INFO - codeparrot_training - Step 12303: {'lr': 
0.0004452813996823383, 'samples': 2362368, 'steps': 12303, 'loss/train': 1.0152828097343445} 01/27/2022 07:26:07 - INFO - codeparrot_training - Step 12304: {'lr': 0.00044527118296961576, 'samples': 2362560, 'steps': 12304, 'loss/train': 0.6070653051137924} 01/27/2022 07:26:11 - INFO - codeparrot_training - Step 12305: {'lr': 0.0004452609654204136, 'samples': 2362752, 'steps': 12305, 'loss/train': 1.1918846368789673} 01/27/2022 07:26:14 - INFO - codeparrot_training - Step 12306: {'lr': 0.0004452507470347754, 'samples': 2362944, 'steps': 12306, 'loss/train': 0.9290452301502228} 01/27/2022 07:26:18 - INFO - codeparrot_training - Step 12307: {'lr': 0.00044524052781274497, 'samples': 2363136, 'steps': 12307, 'loss/train': 1.3686561584472656} 01/27/2022 07:26:21 - INFO - codeparrot_training - Step 12308: {'lr': 0.00044523030775436617, 'samples': 2363328, 'steps': 12308, 'loss/train': 1.0512775182724} 01/27/2022 07:26:24 - INFO - codeparrot_training - Step 12309: {'lr': 0.0004452200868596827, 'samples': 2363520, 'steps': 12309, 'loss/train': 0.7248244285583496} 01/27/2022 07:26:27 - INFO - codeparrot_training - Step 12310: {'lr': 0.0004452098651287384, 'samples': 2363712, 'steps': 12310, 'loss/train': 0.5002974718809128} 01/27/2022 07:26:30 - INFO - codeparrot_training - Step 12311: {'lr': 0.000445199642561577, 'samples': 2363904, 'steps': 12311, 'loss/train': 1.033643513917923} 01/27/2022 07:26:33 - INFO - codeparrot_training - Step 12312: {'lr': 0.0004451894191582423, 'samples': 2364096, 'steps': 12312, 'loss/train': 1.2650711238384247} 01/27/2022 07:26:37 - INFO - codeparrot_training - Step 12313: {'lr': 0.0004451791949187781, 'samples': 2364288, 'steps': 12313, 'loss/train': 0.7842510044574738} 01/27/2022 07:26:42 - INFO - codeparrot_training - Step 12314: {'lr': 0.0004451689698432282, 'samples': 2364480, 'steps': 12314, 'loss/train': 0.38925428688526154} 01/27/2022 07:26:45 - INFO - codeparrot_training - Step 12315: {'lr': 0.0004451587439316365, 'samples': 2364672, 'steps': 12315, 'loss/train': 0.7425267845392227} 01/27/2022 07:26:48 - INFO - codeparrot_training - Step 12316: {'lr': 0.0004451485171840466, 'samples': 2364864, 'steps': 12316, 'loss/train': 1.2654037177562714} 01/27/2022 07:26:51 - INFO - codeparrot_training - Step 12317: {'lr': 0.0004451382896005024, 'samples': 2365056, 'steps': 12317, 'loss/train': 0.9027789831161499} 01/27/2022 07:26:54 - INFO - codeparrot_training - Step 12318: {'lr': 0.00044512806118104784, 'samples': 2365248, 'steps': 12318, 'loss/train': 0.9821445643901825} 01/27/2022 07:26:58 - INFO - codeparrot_training - Step 12319: {'lr': 0.0004451178319257265, 'samples': 2365440, 'steps': 12319, 'loss/train': 1.599617600440979} 01/27/2022 07:27:01 - INFO - codeparrot_training - Step 12320: {'lr': 0.0004451076018345824, 'samples': 2365632, 'steps': 12320, 'loss/train': 0.9511178433895111} 01/27/2022 07:27:04 - INFO - codeparrot_training - Step 12321: {'lr': 0.00044509737090765933, 'samples': 2365824, 'steps': 12321, 'loss/train': 1.128823846578598} 01/27/2022 07:27:07 - INFO - codeparrot_training - Step 12322: {'lr': 0.00044508713914500107, 'samples': 2366016, 'steps': 12322, 'loss/train': 1.264392614364624} 01/27/2022 07:27:12 - INFO - codeparrot_training - Step 12323: {'lr': 0.0004450769065466514, 'samples': 2366208, 'steps': 12323, 'loss/train': 0.5748104453086853} 01/27/2022 07:27:15 - INFO - codeparrot_training - Step 12324: {'lr': 0.0004450666731126542, 'samples': 2366400, 'steps': 12324, 'loss/train': 1.041947364807129} 01/27/2022 07:27:18 - INFO - 
codeparrot_training - Step 12325: {'lr': 0.0004450564388430533, 'samples': 2366592, 'steps': 12325, 'loss/train': 1.0020514726638794} 01/27/2022 07:27:21 - INFO - codeparrot_training - Step 12326: {'lr': 0.0004450462037378926, 'samples': 2366784, 'steps': 12326, 'loss/train': 0.8537925481796265} 01/27/2022 07:27:24 - INFO - codeparrot_training - Step 12327: {'lr': 0.0004450359677972159, 'samples': 2366976, 'steps': 12327, 'loss/train': 0.7171632349491119} 01/27/2022 07:27:27 - INFO - codeparrot_training - Step 12328: {'lr': 0.000445025731021067, 'samples': 2367168, 'steps': 12328, 'loss/train': 0.7999124228954315} 01/27/2022 07:27:30 - INFO - codeparrot_training - Step 12329: {'lr': 0.0004450154934094898, 'samples': 2367360, 'steps': 12329, 'loss/train': 0.781664103269577} 01/27/2022 07:27:34 - INFO - codeparrot_training - Step 12330: {'lr': 0.0004450052549625282, 'samples': 2367552, 'steps': 12330, 'loss/train': 0.0690444353967905} 01/27/2022 07:27:37 - INFO - codeparrot_training - Step 12331: {'lr': 0.000444995015680226, 'samples': 2367744, 'steps': 12331, 'loss/train': 0.7889828681945801} 01/27/2022 07:27:43 - INFO - codeparrot_training - Step 12332: {'lr': 0.0004449847755626271, 'samples': 2367936, 'steps': 12332, 'loss/train': 0.45147259533405304} 01/27/2022 07:27:46 - INFO - codeparrot_training - Step 12333: {'lr': 0.00044497453460977523, 'samples': 2368128, 'steps': 12333, 'loss/train': 0.6977919638156891} 01/27/2022 07:27:49 - INFO - codeparrot_training - Step 12334: {'lr': 0.0004449642928217144, 'samples': 2368320, 'steps': 12334, 'loss/train': 2.5467989444732666} 01/27/2022 07:27:52 - INFO - codeparrot_training - Step 12335: {'lr': 0.0004449540501984885, 'samples': 2368512, 'steps': 12335, 'loss/train': 1.9324116110801697} 01/27/2022 07:27:55 - INFO - codeparrot_training - Step 12336: {'lr': 0.0004449438067401413, 'samples': 2368704, 'steps': 12336, 'loss/train': 0.8519292175769806} 01/27/2022 07:27:58 - INFO - codeparrot_training - Step 12337: {'lr': 0.0004449335624467168, 'samples': 2368896, 'steps': 12337, 'loss/train': 0.9183501899242401} 01/27/2022 07:28:02 - INFO - codeparrot_training - Step 12338: {'lr': 0.00044492331731825875, 'samples': 2369088, 'steps': 12338, 'loss/train': 1.013237625360489} 01/27/2022 07:28:05 - INFO - codeparrot_training - Step 12339: {'lr': 0.0004449130713548111, 'samples': 2369280, 'steps': 12339, 'loss/train': 1.0673104226589203} 01/27/2022 07:28:08 - INFO - codeparrot_training - Step 12340: {'lr': 0.00044490282455641783, 'samples': 2369472, 'steps': 12340, 'loss/train': 0.19796673953533173} 01/27/2022 07:28:11 - INFO - codeparrot_training - Step 12341: {'lr': 0.0004448925769231227, 'samples': 2369664, 'steps': 12341, 'loss/train': 0.8060234785079956} 01/27/2022 07:28:15 - INFO - codeparrot_training - Step 12342: {'lr': 0.0004448823284549696, 'samples': 2369856, 'steps': 12342, 'loss/train': 1.1924013197422028} 01/27/2022 07:28:19 - INFO - codeparrot_training - Step 12343: {'lr': 0.00044487207915200257, 'samples': 2370048, 'steps': 12343, 'loss/train': 1.6866701245307922} 01/27/2022 07:28:22 - INFO - codeparrot_training - Step 12344: {'lr': 0.0004448618290142654, 'samples': 2370240, 'steps': 12344, 'loss/train': 0.5487280189990997} 01/27/2022 07:28:25 - INFO - codeparrot_training - Step 12345: {'lr': 0.000444851578041802, 'samples': 2370432, 'steps': 12345, 'loss/train': 0.940241664648056} 01/27/2022 07:28:28 - INFO - codeparrot_training - Step 12346: {'lr': 0.00044484132623465633, 'samples': 2370624, 'steps': 12346, 'loss/train': 
0.8456894159317017} 01/27/2022 07:28:31 - INFO - codeparrot_training - Step 12347: {'lr': 0.0004448310735928723, 'samples': 2370816, 'steps': 12347, 'loss/train': 0.945944756269455} 01/27/2022 07:28:34 - INFO - codeparrot_training - Step 12348: {'lr': 0.0004448208201164938, 'samples': 2371008, 'steps': 12348, 'loss/train': 1.2069083154201508} 01/27/2022 07:28:37 - INFO - codeparrot_training - Step 12349: {'lr': 0.0004448105658055648, 'samples': 2371200, 'steps': 12349, 'loss/train': 0.6001661717891693} 01/27/2022 07:28:41 - INFO - codeparrot_training - Step 12350: {'lr': 0.00044480031066012916, 'samples': 2371392, 'steps': 12350, 'loss/train': 0.5709603577852249} 01/27/2022 07:28:45 - INFO - codeparrot_training - Step 12351: {'lr': 0.00044479005468023086, 'samples': 2371584, 'steps': 12351, 'loss/train': 1.195422112941742} 01/27/2022 07:28:48 - INFO - codeparrot_training - Step 12352: {'lr': 0.0004447797978659138, 'samples': 2371776, 'steps': 12352, 'loss/train': 0.873058408498764} 01/27/2022 07:28:51 - INFO - codeparrot_training - Step 12353: {'lr': 0.000444769540217222, 'samples': 2371968, 'steps': 12353, 'loss/train': 1.0905442535877228} 01/27/2022 07:28:54 - INFO - codeparrot_training - Step 12354: {'lr': 0.0004447592817341993, 'samples': 2372160, 'steps': 12354, 'loss/train': 2.012032449245453} 01/27/2022 07:28:58 - INFO - codeparrot_training - Step 12355: {'lr': 0.0004447490224168896, 'samples': 2372352, 'steps': 12355, 'loss/train': 0.8059608936309814} 01/27/2022 07:29:01 - INFO - codeparrot_training - Step 12356: {'lr': 0.00044473876226533703, 'samples': 2372544, 'steps': 12356, 'loss/train': 1.0710345804691315} 01/27/2022 07:29:04 - INFO - codeparrot_training - Step 12357: {'lr': 0.0004447285012795854, 'samples': 2372736, 'steps': 12357, 'loss/train': 0.8523290455341339} 01/27/2022 07:29:07 - INFO - codeparrot_training - Step 12358: {'lr': 0.0004447182394596788, 'samples': 2372928, 'steps': 12358, 'loss/train': 0.6809972673654556} 01/27/2022 07:29:10 - INFO - codeparrot_training - Step 12359: {'lr': 0.000444707976805661, 'samples': 2373120, 'steps': 12359, 'loss/train': 1.3268761932849884} 01/27/2022 07:29:16 - INFO - codeparrot_training - Step 12360: {'lr': 0.00044469771331757604, 'samples': 2373312, 'steps': 12360, 'loss/train': 0.6268832087516785} 01/27/2022 07:29:19 - INFO - codeparrot_training - Step 12361: {'lr': 0.00044468744899546785, 'samples': 2373504, 'steps': 12361, 'loss/train': 0.10002204403281212} 01/27/2022 07:29:22 - INFO - codeparrot_training - Step 12362: {'lr': 0.0004446771838393806, 'samples': 2373696, 'steps': 12362, 'loss/train': 0.6692797690629959} 01/27/2022 07:29:25 - INFO - codeparrot_training - Step 12363: {'lr': 0.00044466691784935796, 'samples': 2373888, 'steps': 12363, 'loss/train': 0.9872771501541138} 01/27/2022 07:29:28 - INFO - codeparrot_training - Step 12364: {'lr': 0.00044465665102544415, 'samples': 2374080, 'steps': 12364, 'loss/train': 1.016461730003357} 01/27/2022 07:29:31 - INFO - codeparrot_training - Step 12365: {'lr': 0.000444646383367683, 'samples': 2374272, 'steps': 12365, 'loss/train': 0.6056216508150101} 01/27/2022 07:29:34 - INFO - codeparrot_training - Step 12366: {'lr': 0.00044463611487611864, 'samples': 2374464, 'steps': 12366, 'loss/train': 0.9582634270191193} 01/27/2022 07:29:38 - INFO - codeparrot_training - Step 12367: {'lr': 0.0004446258455507949, 'samples': 2374656, 'steps': 12367, 'loss/train': 0.8547457158565521} 01/27/2022 07:29:42 - INFO - codeparrot_training - Step 12368: {'lr': 0.00044461557539175587, 'samples': 
2374848, 'steps': 12368, 'loss/train': 1.062954694032669} 01/27/2022 07:29:45 - INFO - codeparrot_training - Step 12369: {'lr': 0.0004446053043990455, 'samples': 2375040, 'steps': 12369, 'loss/train': 1.2686162889003754} 01/27/2022 07:29:48 - INFO - codeparrot_training - Step 12370: {'lr': 0.00044459503257270776, 'samples': 2375232, 'steps': 12370, 'loss/train': 0.8415127694606781} 01/27/2022 07:29:51 - INFO - codeparrot_training - Step 12371: {'lr': 0.0004445847599127868, 'samples': 2375424, 'steps': 12371, 'loss/train': 1.027587890625} 01/27/2022 07:29:55 - INFO - codeparrot_training - Step 12372: {'lr': 0.0004445744864193264, 'samples': 2375616, 'steps': 12372, 'loss/train': 0.49051840603351593} 01/27/2022 07:29:58 - INFO - codeparrot_training - Step 12373: {'lr': 0.00044456421209237073, 'samples': 2375808, 'steps': 12373, 'loss/train': 0.9016005098819733} 01/27/2022 07:30:01 - INFO - codeparrot_training - Step 12374: {'lr': 0.00044455393693196375, 'samples': 2376000, 'steps': 12374, 'loss/train': 0.6100080907344818} 01/27/2022 07:30:04 - INFO - codeparrot_training - Step 12375: {'lr': 0.00044454366093814947, 'samples': 2376192, 'steps': 12375, 'loss/train': 0.8140582144260406} 01/27/2022 07:30:07 - INFO - codeparrot_training - Step 12376: {'lr': 0.0004445333841109719, 'samples': 2376384, 'steps': 12376, 'loss/train': 0.9350046515464783} 01/27/2022 07:30:12 - INFO - codeparrot_training - Step 12377: {'lr': 0.0004445231064504751, 'samples': 2376576, 'steps': 12377, 'loss/train': 1.2245730757713318} 01/27/2022 07:30:15 - INFO - codeparrot_training - Step 12378: {'lr': 0.00044451282795670313, 'samples': 2376768, 'steps': 12378, 'loss/train': 0.5201955288648605} 01/27/2022 07:30:18 - INFO - codeparrot_training - Step 12379: {'lr': 0.0004445025486297, 'samples': 2376960, 'steps': 12379, 'loss/train': 0.6124875247478485} 01/27/2022 07:30:21 - INFO - codeparrot_training - Step 12380: {'lr': 0.00044449226846950964, 'samples': 2377152, 'steps': 12380, 'loss/train': 1.1664993166923523} 01/27/2022 07:30:24 - INFO - codeparrot_training - Step 12381: {'lr': 0.0004444819874761762, 'samples': 2377344, 'steps': 12381, 'loss/train': 0.736018180847168} 01/27/2022 07:30:27 - INFO - codeparrot_training - Step 12382: {'lr': 0.0004444717056497436, 'samples': 2377536, 'steps': 12382, 'loss/train': 0.8422733545303345} 01/27/2022 07:30:30 - INFO - codeparrot_training - Step 12383: {'lr': 0.00044446142299025605, 'samples': 2377728, 'steps': 12383, 'loss/train': 0.3371285945177078} 01/27/2022 07:30:34 - INFO - codeparrot_training - Step 12384: {'lr': 0.0004444511394977575, 'samples': 2377920, 'steps': 12384, 'loss/train': 0.9132356643676758} 01/27/2022 07:30:37 - INFO - codeparrot_training - Step 12385: {'lr': 0.0004444408551722919, 'samples': 2378112, 'steps': 12385, 'loss/train': 0.08670958690345287} 01/27/2022 07:30:42 - INFO - codeparrot_training - Step 12386: {'lr': 0.00044443057001390354, 'samples': 2378304, 'steps': 12386, 'loss/train': 0.7106990665197372} 01/27/2022 07:30:45 - INFO - codeparrot_training - Step 12387: {'lr': 0.00044442028402263636, 'samples': 2378496, 'steps': 12387, 'loss/train': 0.6951268762350082} 01/27/2022 07:30:48 - INFO - codeparrot_training - Step 12388: {'lr': 0.00044440999719853435, 'samples': 2378688, 'steps': 12388, 'loss/train': 1.0806593298912048} 01/27/2022 07:30:51 - INFO - codeparrot_training - Step 12389: {'lr': 0.0004443997095416417, 'samples': 2378880, 'steps': 12389, 'loss/train': 0.5816140025854111} 01/27/2022 07:30:55 - INFO - codeparrot_training - Step 12390: {'lr': 
0.0004443894210520024, 'samples': 2379072, 'steps': 12390, 'loss/train': 0.9166259765625} 01/27/2022 07:30:58 - INFO - codeparrot_training - Step 12391: {'lr': 0.0004443791317296606, 'samples': 2379264, 'steps': 12391, 'loss/train': 0.9026581943035126} 01/27/2022 07:31:01 - INFO - codeparrot_training - Step 12392: {'lr': 0.0004443688415746602, 'samples': 2379456, 'steps': 12392, 'loss/train': 0.6659624129533768} 01/27/2022 07:31:04 - INFO - codeparrot_training - Step 12393: {'lr': 0.0004443585505870456, 'samples': 2379648, 'steps': 12393, 'loss/train': 1.094845175743103} 01/27/2022 07:31:07 - INFO - codeparrot_training - Step 12394: {'lr': 0.0004443482587668605, 'samples': 2379840, 'steps': 12394, 'loss/train': 1.063211739063263} 01/27/2022 07:31:12 - INFO - codeparrot_training - Step 12395: {'lr': 0.00044433796611414924, 'samples': 2380032, 'steps': 12395, 'loss/train': 1.0453848838806152} 01/27/2022 07:31:15 - INFO - codeparrot_training - Step 12396: {'lr': 0.0004443276726289558, 'samples': 2380224, 'steps': 12396, 'loss/train': 0.8220523595809937} 01/27/2022 07:31:18 - INFO - codeparrot_training - Step 12397: {'lr': 0.00044431737831132433, 'samples': 2380416, 'steps': 12397, 'loss/train': 1.0559658408164978} 01/27/2022 07:31:21 - INFO - codeparrot_training - Step 12398: {'lr': 0.000444307083161299, 'samples': 2380608, 'steps': 12398, 'loss/train': 0.7279383838176727} 01/27/2022 07:31:25 - INFO - codeparrot_training - Step 12399: {'lr': 0.00044429678717892366, 'samples': 2380800, 'steps': 12399, 'loss/train': 0.9641941487789154} 01/27/2022 07:31:28 - INFO - codeparrot_training - Step 12400: {'lr': 0.0004442864903642427, 'samples': 2380992, 'steps': 12400, 'loss/train': 1.695567548274994} 01/27/2022 07:31:31 - INFO - codeparrot_training - Step 12401: {'lr': 0.00044427619271730014, 'samples': 2381184, 'steps': 12401, 'loss/train': 0.7431675642728806} 01/27/2022 07:31:34 - INFO - codeparrot_training - Step 12402: {'lr': 0.00044426589423814003, 'samples': 2381376, 'steps': 12402, 'loss/train': 0.7756941318511963} 01/27/2022 07:31:37 - INFO - codeparrot_training - Step 12403: {'lr': 0.00044425559492680645, 'samples': 2381568, 'steps': 12403, 'loss/train': 0.444881871342659} 01/27/2022 07:31:42 - INFO - codeparrot_training - Step 12404: {'lr': 0.00044424529478334364, 'samples': 2381760, 'steps': 12404, 'loss/train': 0.5607298314571381} 01/27/2022 07:31:45 - INFO - codeparrot_training - Step 12405: {'lr': 0.00044423499380779566, 'samples': 2381952, 'steps': 12405, 'loss/train': 1.510445237159729} 01/27/2022 07:31:48 - INFO - codeparrot_training - Step 12406: {'lr': 0.00044422469200020666, 'samples': 2382144, 'steps': 12406, 'loss/train': 1.1612336039543152} 01/27/2022 07:31:51 - INFO - codeparrot_training - Step 12407: {'lr': 0.0004442143893606207, 'samples': 2382336, 'steps': 12407, 'loss/train': 0.11380390450358391} 01/27/2022 07:31:54 - INFO - codeparrot_training - Step 12408: {'lr': 0.000444204085889082, 'samples': 2382528, 'steps': 12408, 'loss/train': 1.4436814785003662} 01/27/2022 07:31:57 - INFO - codeparrot_training - Step 12409: {'lr': 0.00044419378158563465, 'samples': 2382720, 'steps': 12409, 'loss/train': 0.7173594385385513} 01/27/2022 07:32:00 - INFO - codeparrot_training - Step 12410: {'lr': 0.0004441834764503228, 'samples': 2382912, 'steps': 12410, 'loss/train': 1.2192183136940002} 01/27/2022 07:32:04 - INFO - codeparrot_training - Step 12411: {'lr': 0.0004441731704831906, 'samples': 2383104, 'steps': 12411, 'loss/train': 0.954216867685318} 01/27/2022 07:32:07 - INFO - 
codeparrot_training - Step 12412: {'lr': 0.0004441628636842822, 'samples': 2383296, 'steps': 12412, 'loss/train': 1.024294674396515} 01/27/2022 07:32:12 - INFO - codeparrot_training - Step 12413: {'lr': 0.0004441525560536418, 'samples': 2383488, 'steps': 12413, 'loss/train': 0.6544697731733322} 01/27/2022 07:32:15 - INFO - codeparrot_training - Step 12414: {'lr': 0.0004441422475913134, 'samples': 2383680, 'steps': 12414, 'loss/train': 0.7599959671497345} 01/27/2022 07:32:18 - INFO - codeparrot_training - Step 12415: {'lr': 0.0004441319382973413, 'samples': 2383872, 'steps': 12415, 'loss/train': 1.0678304135799408} 01/27/2022 07:32:21 - INFO - codeparrot_training - Step 12416: {'lr': 0.00044412162817176966, 'samples': 2384064, 'steps': 12416, 'loss/train': 1.333870768547058} 01/27/2022 07:32:25 - INFO - codeparrot_training - Step 12417: {'lr': 0.0004441113172146426, 'samples': 2384256, 'steps': 12417, 'loss/train': 0.6247766464948654} 01/27/2022 07:32:28 - INFO - codeparrot_training - Step 12418: {'lr': 0.00044410100542600423, 'samples': 2384448, 'steps': 12418, 'loss/train': 0.5904186218976974} 01/27/2022 07:32:31 - INFO - codeparrot_training - Step 12419: {'lr': 0.00044409069280589887, 'samples': 2384640, 'steps': 12419, 'loss/train': 0.6527435928583145} 01/27/2022 07:32:34 - INFO - codeparrot_training - Step 12420: {'lr': 0.0004440803793543705, 'samples': 2384832, 'steps': 12420, 'loss/train': 0.9701429307460785} 01/27/2022 07:32:37 - INFO - codeparrot_training - Step 12421: {'lr': 0.00044407006507146354, 'samples': 2385024, 'steps': 12421, 'loss/train': 1.0125064551830292} 01/27/2022 07:32:42 - INFO - codeparrot_training - Step 12422: {'lr': 0.000444059749957222, 'samples': 2385216, 'steps': 12422, 'loss/train': 0.9002794325351715} 01/27/2022 07:32:45 - INFO - codeparrot_training - Step 12423: {'lr': 0.00044404943401169005, 'samples': 2385408, 'steps': 12423, 'loss/train': 0.5338903069496155} 01/27/2022 07:32:48 - INFO - codeparrot_training - Step 12424: {'lr': 0.00044403911723491196, 'samples': 2385600, 'steps': 12424, 'loss/train': 1.5753529071807861} 01/27/2022 07:32:51 - INFO - codeparrot_training - Step 12425: {'lr': 0.000444028799626932, 'samples': 2385792, 'steps': 12425, 'loss/train': 0.3902967721223831} 01/27/2022 07:32:54 - INFO - codeparrot_training - Step 12426: {'lr': 0.0004440184811877942, 'samples': 2385984, 'steps': 12426, 'loss/train': 1.0309752523899078} 01/27/2022 07:32:57 - INFO - codeparrot_training - Step 12427: {'lr': 0.0004440081619175428, 'samples': 2386176, 'steps': 12427, 'loss/train': 1.1509135365486145} 01/27/2022 07:33:01 - INFO - codeparrot_training - Step 12428: {'lr': 0.00044399784181622216, 'samples': 2386368, 'steps': 12428, 'loss/train': 1.0662547051906586} 01/27/2022 07:33:04 - INFO - codeparrot_training - Step 12429: {'lr': 0.0004439875208838763, 'samples': 2386560, 'steps': 12429, 'loss/train': 2.6140365600585938} 01/27/2022 07:33:07 - INFO - codeparrot_training - Step 12430: {'lr': 0.00044397719912054944, 'samples': 2386752, 'steps': 12430, 'loss/train': 1.1106720864772797} 01/27/2022 07:33:12 - INFO - codeparrot_training - Step 12431: {'lr': 0.00044396687652628586, 'samples': 2386944, 'steps': 12431, 'loss/train': 0.8434398472309113} 01/27/2022 07:33:15 - INFO - codeparrot_training - Step 12432: {'lr': 0.00044395655310112985, 'samples': 2387136, 'steps': 12432, 'loss/train': 0.3263658508658409} 01/27/2022 07:33:18 - INFO - codeparrot_training - Step 12433: {'lr': 0.00044394622884512554, 'samples': 2387328, 'steps': 12433, 'loss/train': 
1.2269450426101685} 01/27/2022 07:33:21 - INFO - codeparrot_training - Step 12434: {'lr': 0.00044393590375831716, 'samples': 2387520, 'steps': 12434, 'loss/train': 0.862239271402359} 01/27/2022 07:33:24 - INFO - codeparrot_training - Step 12435: {'lr': 0.00044392557784074895, 'samples': 2387712, 'steps': 12435, 'loss/train': 0.7780023515224457} 01/27/2022 07:33:27 - INFO - codeparrot_training - Step 12436: {'lr': 0.0004439152510924651, 'samples': 2387904, 'steps': 12436, 'loss/train': 0.8014856278896332} 01/27/2022 07:33:31 - INFO - codeparrot_training - Step 12437: {'lr': 0.0004439049235135099, 'samples': 2388096, 'steps': 12437, 'loss/train': 1.2680665254592896} 01/27/2022 07:33:34 - INFO - codeparrot_training - Step 12438: {'lr': 0.0004438945951039276, 'samples': 2388288, 'steps': 12438, 'loss/train': 1.0576953291893005} 01/27/2022 07:33:37 - INFO - codeparrot_training - Step 12439: {'lr': 0.0004438842658637624, 'samples': 2388480, 'steps': 12439, 'loss/train': 1.501615583896637} 01/27/2022 07:33:42 - INFO - codeparrot_training - Step 12440: {'lr': 0.0004438739357930586, 'samples': 2388672, 'steps': 12440, 'loss/train': 1.2297468781471252} 01/27/2022 07:33:45 - INFO - codeparrot_training - Step 12441: {'lr': 0.00044386360489186047, 'samples': 2388864, 'steps': 12441, 'loss/train': 1.038871943950653} 01/27/2022 07:33:48 - INFO - codeparrot_training - Step 12442: {'lr': 0.00044385327316021214, 'samples': 2389056, 'steps': 12442, 'loss/train': 1.1155617535114288} 01/27/2022 07:33:51 - INFO - codeparrot_training - Step 12443: {'lr': 0.000443842940598158, 'samples': 2389248, 'steps': 12443, 'loss/train': 1.3377664983272552} 01/27/2022 07:33:55 - INFO - codeparrot_training - Step 12444: {'lr': 0.00044383260720574214, 'samples': 2389440, 'steps': 12444, 'loss/train': 0.7532484233379364} 01/27/2022 07:33:58 - INFO - codeparrot_training - Step 12445: {'lr': 0.00044382227298300905, 'samples': 2389632, 'steps': 12445, 'loss/train': 0.9037575423717499} 01/27/2022 07:34:01 - INFO - codeparrot_training - Step 12446: {'lr': 0.0004438119379300028, 'samples': 2389824, 'steps': 12446, 'loss/train': 0.5099595487117767} 01/27/2022 07:34:04 - INFO - codeparrot_training - Step 12447: {'lr': 0.00044380160204676787, 'samples': 2390016, 'steps': 12447, 'loss/train': 1.0150373876094818} 01/27/2022 07:34:07 - INFO - codeparrot_training - Step 12448: {'lr': 0.00044379126533334836, 'samples': 2390208, 'steps': 12448, 'loss/train': 0.8924556076526642} 01/27/2022 07:34:12 - INFO - codeparrot_training - Step 12449: {'lr': 0.00044378092778978864, 'samples': 2390400, 'steps': 12449, 'loss/train': 0.9718220829963684} 01/27/2022 07:34:15 - INFO - codeparrot_training - Step 12450: {'lr': 0.00044377058941613283, 'samples': 2390592, 'steps': 12450, 'loss/train': 0.44411560893058777} 01/27/2022 07:34:18 - INFO - codeparrot_training - Step 12451: {'lr': 0.0004437602502124255, 'samples': 2390784, 'steps': 12451, 'loss/train': 0.4296950697898865} 01/27/2022 07:34:21 - INFO - codeparrot_training - Step 12452: {'lr': 0.0004437499101787107, 'samples': 2390976, 'steps': 12452, 'loss/train': 0.9977073669433594} 01/27/2022 07:34:24 - INFO - codeparrot_training - Step 12453: {'lr': 0.0004437395693150328, 'samples': 2391168, 'steps': 12453, 'loss/train': 0.8179852366447449} 01/27/2022 07:34:27 - INFO - codeparrot_training - Step 12454: {'lr': 0.0004437292276214361, 'samples': 2391360, 'steps': 12454, 'loss/train': 1.265015423297882} 01/27/2022 07:34:30 - INFO - codeparrot_training - Step 12455: {'lr': 0.000443718885097965, 'samples': 
2391552, 'steps': 12455, 'loss/train': 1.5397444367408752} 01/27/2022 07:34:34 - INFO - codeparrot_training - Step 12456: {'lr': 0.0004437085417446636, 'samples': 2391744, 'steps': 12456, 'loss/train': 0.9626648426055908} 01/27/2022 07:34:37 - INFO - codeparrot_training - Step 12457: {'lr': 0.0004436981975615764, 'samples': 2391936, 'steps': 12457, 'loss/train': 0.5638210326433182} 01/27/2022 07:34:41 - INFO - codeparrot_training - Step 12458: {'lr': 0.00044368785254874754, 'samples': 2392128, 'steps': 12458, 'loss/train': 1.0855716168880463} 01/27/2022 07:34:44 - INFO - codeparrot_training - Step 12459: {'lr': 0.00044367750670622143, 'samples': 2392320, 'steps': 12459, 'loss/train': 1.0321305692195892} 01/27/2022 07:34:47 - INFO - codeparrot_training - Step 12460: {'lr': 0.0004436671600340424, 'samples': 2392512, 'steps': 12460, 'loss/train': 0.9916349351406097} 01/27/2022 07:34:51 - INFO - codeparrot_training - Step 12461: {'lr': 0.00044365681253225476, 'samples': 2392704, 'steps': 12461, 'loss/train': 1.35299813747406} 01/27/2022 07:34:54 - INFO - codeparrot_training - Step 12462: {'lr': 0.0004436464642009029, 'samples': 2392896, 'steps': 12462, 'loss/train': 0.4297240823507309} 01/27/2022 07:34:57 - INFO - codeparrot_training - Step 12463: {'lr': 0.00044363611504003096, 'samples': 2393088, 'steps': 12463, 'loss/train': 0.775292694568634} 01/27/2022 07:35:00 - INFO - codeparrot_training - Step 12464: {'lr': 0.00044362576504968344, 'samples': 2393280, 'steps': 12464, 'loss/train': 0.9030198454856873} 01/27/2022 07:35:03 - INFO - codeparrot_training - Step 12465: {'lr': 0.0004436154142299046, 'samples': 2393472, 'steps': 12465, 'loss/train': 0.8298511505126953} 01/27/2022 07:35:08 - INFO - codeparrot_training - Step 12466: {'lr': 0.00044360506258073884, 'samples': 2393664, 'steps': 12466, 'loss/train': 0.39185643196105957} 01/27/2022 07:35:11 - INFO - codeparrot_training - Step 12467: {'lr': 0.0004435947101022305, 'samples': 2393856, 'steps': 12467, 'loss/train': 1.1567240953445435} 01/27/2022 07:35:15 - INFO - codeparrot_training - Step 12468: {'lr': 0.0004435843567944239, 'samples': 2394048, 'steps': 12468, 'loss/train': 0.8837926983833313} 01/27/2022 07:35:18 - INFO - codeparrot_training - Step 12469: {'lr': 0.0004435740026573633, 'samples': 2394240, 'steps': 12469, 'loss/train': 0.912889301776886} 01/27/2022 07:35:21 - INFO - codeparrot_training - Step 12470: {'lr': 0.0004435636476910932, 'samples': 2394432, 'steps': 12470, 'loss/train': 0.9090836942195892} 01/27/2022 07:35:24 - INFO - codeparrot_training - Step 12471: {'lr': 0.00044355329189565783, 'samples': 2394624, 'steps': 12471, 'loss/train': 0.9928692877292633} 01/27/2022 07:35:27 - INFO - codeparrot_training - Step 12472: {'lr': 0.00044354293527110167, 'samples': 2394816, 'steps': 12472, 'loss/train': 0.9104254245758057} 01/27/2022 07:35:30 - INFO - codeparrot_training - Step 12473: {'lr': 0.000443532577817469, 'samples': 2395008, 'steps': 12473, 'loss/train': 1.3905088305473328} 01/27/2022 07:35:33 - INFO - codeparrot_training - Step 12474: {'lr': 0.0004435222195348043, 'samples': 2395200, 'steps': 12474, 'loss/train': 1.2388838231563568} 01/27/2022 07:35:38 - INFO - codeparrot_training - Step 12475: {'lr': 0.00044351186042315184, 'samples': 2395392, 'steps': 12475, 'loss/train': 0.655980572104454} 01/27/2022 07:35:41 - INFO - codeparrot_training - Step 12476: {'lr': 0.000443501500482556, 'samples': 2395584, 'steps': 12476, 'loss/train': 0.8840948045253754} 01/27/2022 07:35:44 - INFO - codeparrot_training - Step 12477: {'lr': 
0.0004434911397130612, 'samples': 2395776, 'steps': 12477, 'loss/train': 0.8115145862102509} 01/27/2022 07:35:47 - INFO - codeparrot_training - Step 12478: {'lr': 0.0004434807781147117, 'samples': 2395968, 'steps': 12478, 'loss/train': 0.8599046766757965} 01/27/2022 07:35:51 - INFO - codeparrot_training - Step 12479: {'lr': 0.0004434704156875521, 'samples': 2396160, 'steps': 12479, 'loss/train': 0.9583016932010651} 01/27/2022 07:35:54 - INFO - codeparrot_training - Step 12480: {'lr': 0.00044346005243162654, 'samples': 2396352, 'steps': 12480, 'loss/train': 0.7805295288562775} 01/27/2022 07:35:57 - INFO - codeparrot_training - Step 12481: {'lr': 0.0004434496883469796, 'samples': 2396544, 'steps': 12481, 'loss/train': 1.0185272991657257} 01/27/2022 07:36:00 - INFO - codeparrot_training - Step 12482: {'lr': 0.0004434393234336557, 'samples': 2396736, 'steps': 12482, 'loss/train': 0.4861017018556595} 01/27/2022 07:36:03 - INFO - codeparrot_training - Step 12483: {'lr': 0.0004434289576916991, 'samples': 2396928, 'steps': 12483, 'loss/train': 0.7073965966701508} 01/27/2022 07:36:08 - INFO - codeparrot_training - Step 12484: {'lr': 0.00044341859112115425, 'samples': 2397120, 'steps': 12484, 'loss/train': 1.0544685423374176} 01/27/2022 07:36:11 - INFO - codeparrot_training - Step 12485: {'lr': 0.00044340822372206557, 'samples': 2397312, 'steps': 12485, 'loss/train': 0.6175372749567032} 01/27/2022 07:36:14 - INFO - codeparrot_training - Step 12486: {'lr': 0.00044339785549447756, 'samples': 2397504, 'steps': 12486, 'loss/train': 0.741400882601738} 01/27/2022 07:36:18 - INFO - codeparrot_training - Step 12487: {'lr': 0.00044338748643843446, 'samples': 2397696, 'steps': 12487, 'loss/train': 0.8509025573730469} 01/27/2022 07:36:21 - INFO - codeparrot_training - Step 12488: {'lr': 0.00044337711655398083, 'samples': 2397888, 'steps': 12488, 'loss/train': 0.7258532345294952} 01/27/2022 07:36:24 - INFO - codeparrot_training - Step 12489: {'lr': 0.00044336674584116096, 'samples': 2398080, 'steps': 12489, 'loss/train': 0.24835453927516937} 01/27/2022 07:36:27 - INFO - codeparrot_training - Step 12490: {'lr': 0.0004433563743000195, 'samples': 2398272, 'steps': 12490, 'loss/train': 0.6689233928918839} 01/27/2022 07:36:30 - INFO - codeparrot_training - Step 12491: {'lr': 0.0004433460019306006, 'samples': 2398464, 'steps': 12491, 'loss/train': 1.0399279296398163} 01/27/2022 07:36:33 - INFO - codeparrot_training - Step 12492: {'lr': 0.00044333562873294884, 'samples': 2398656, 'steps': 12492, 'loss/train': 0.558726578950882} 01/27/2022 07:36:38 - INFO - codeparrot_training - Step 12493: {'lr': 0.00044332525470710865, 'samples': 2398848, 'steps': 12493, 'loss/train': 0.9740355312824249} 01/27/2022 07:36:41 - INFO - codeparrot_training - Step 12494: {'lr': 0.0004433148798531245, 'samples': 2399040, 'steps': 12494, 'loss/train': 0.9545582234859467} 01/27/2022 07:36:44 - INFO - codeparrot_training - Step 12495: {'lr': 0.0004433045041710407, 'samples': 2399232, 'steps': 12495, 'loss/train': 1.7243070602416992} 01/27/2022 07:36:47 - INFO - codeparrot_training - Step 12496: {'lr': 0.0004432941276609018, 'samples': 2399424, 'steps': 12496, 'loss/train': 1.2118935585021973} 01/27/2022 07:36:51 - INFO - codeparrot_training - Step 12497: {'lr': 0.00044328375032275227, 'samples': 2399616, 'steps': 12497, 'loss/train': 1.4904474020004272} 01/27/2022 07:36:54 - INFO - codeparrot_training - Step 12498: {'lr': 0.00044327337215663656, 'samples': 2399808, 'steps': 12498, 'loss/train': 0.4974481612443924} 01/27/2022 07:36:57 - INFO - 
codeparrot_training - Step 12499: {'lr': 0.000443262993162599, 'samples': 2400000, 'steps': 12499, 'loss/train': 0.7902533411979675} 01/27/2022 07:37:00 - INFO - codeparrot_training - Step 12500: {'lr': 0.0004432526133406842, 'samples': 2400192, 'steps': 12500, 'loss/train': 0.8632298111915588} 01/27/2022 07:37:03 - INFO - codeparrot_training - Step 12501: {'lr': 0.00044324223269093666, 'samples': 2400384, 'steps': 12501, 'loss/train': 0.9635652601718903} 01/27/2022 07:37:08 - INFO - codeparrot_training - Step 12502: {'lr': 0.00044323185121340064, 'samples': 2400576, 'steps': 12502, 'loss/train': 0.8362767398357391} 01/27/2022 07:37:11 - INFO - codeparrot_training - Step 12503: {'lr': 0.00044322146890812076, 'samples': 2400768, 'steps': 12503, 'loss/train': 0.22531946003437042} 01/27/2022 07:37:14 - INFO - codeparrot_training - Step 12504: {'lr': 0.0004432110857751415, 'samples': 2400960, 'steps': 12504, 'loss/train': 1.0453023612499237} 01/27/2022 07:37:17 - INFO - codeparrot_training - Step 12505: {'lr': 0.0004432007018145072, 'samples': 2401152, 'steps': 12505, 'loss/train': 1.2611549198627472} 01/27/2022 07:37:20 - INFO - codeparrot_training - Step 12506: {'lr': 0.00044319031702626255, 'samples': 2401344, 'steps': 12506, 'loss/train': 1.3866438567638397} 01/27/2022 07:37:23 - INFO - codeparrot_training - Step 12507: {'lr': 0.0004431799314104519, 'samples': 2401536, 'steps': 12507, 'loss/train': 0.9005045592784882} 01/27/2022 07:37:26 - INFO - codeparrot_training - Step 12508: {'lr': 0.0004431695449671197, 'samples': 2401728, 'steps': 12508, 'loss/train': 0.7981516420841217} 01/27/2022 07:37:30 - INFO - codeparrot_training - Step 12509: {'lr': 0.00044315915769631054, 'samples': 2401920, 'steps': 12509, 'loss/train': 0.7691733241081238} 01/27/2022 07:37:33 - INFO - codeparrot_training - Step 12510: {'lr': 0.0004431487695980689, 'samples': 2402112, 'steps': 12510, 'loss/train': 0.5438365191221237} 01/27/2022 07:37:37 - INFO - codeparrot_training - Step 12511: {'lr': 0.0004431383806724393, 'samples': 2402304, 'steps': 12511, 'loss/train': 0.7225814759731293} 01/27/2022 07:37:40 - INFO - codeparrot_training - Step 12512: {'lr': 0.0004431279909194661, 'samples': 2402496, 'steps': 12512, 'loss/train': 0.9981140792369843} 01/27/2022 07:37:43 - INFO - codeparrot_training - Step 12513: {'lr': 0.000443117600339194, 'samples': 2402688, 'steps': 12513, 'loss/train': 1.0383068919181824} 01/27/2022 07:37:47 - INFO - codeparrot_training - Step 12514: {'lr': 0.0004431072089316674, 'samples': 2402880, 'steps': 12514, 'loss/train': 1.2340986728668213} 01/27/2022 07:37:50 - INFO - codeparrot_training - Step 12515: {'lr': 0.0004430968166969308, 'samples': 2403072, 'steps': 12515, 'loss/train': 0.8661737143993378} 01/27/2022 07:37:53 - INFO - codeparrot_training - Step 12516: {'lr': 0.00044308642363502884, 'samples': 2403264, 'steps': 12516, 'loss/train': 0.06408857926726341} 01/27/2022 07:37:56 - INFO - codeparrot_training - Step 12517: {'lr': 0.00044307602974600594, 'samples': 2403456, 'steps': 12517, 'loss/train': 1.7174611687660217} 01/27/2022 07:37:59 - INFO - codeparrot_training - Step 12518: {'lr': 0.00044306563502990656, 'samples': 2403648, 'steps': 12518, 'loss/train': 0.545172706246376} 01/27/2022 07:38:04 - INFO - codeparrot_training - Step 12519: {'lr': 0.0004430552394867753, 'samples': 2403840, 'steps': 12519, 'loss/train': 0.8402443528175354} 01/27/2022 07:38:07 - INFO - codeparrot_training - Step 12520: {'lr': 0.0004430448431166567, 'samples': 2404032, 'steps': 12520, 'loss/train': 
1.041546642780304} 01/27/2022 07:38:11 - INFO - codeparrot_training - Step 12521: {'lr': 0.00044303444591959533, 'samples': 2404224, 'steps': 12521, 'loss/train': 0.9818131327629089} 01/27/2022 07:38:14 - INFO - codeparrot_training - Step 12522: {'lr': 0.00044302404789563573, 'samples': 2404416, 'steps': 12522, 'loss/train': 0.8963176310062408} 01/27/2022 07:38:17 - INFO - codeparrot_training - Step 12523: {'lr': 0.0004430136490448223, 'samples': 2404608, 'steps': 12523, 'loss/train': 0.8508876264095306} 01/27/2022 07:38:20 - INFO - codeparrot_training - Step 12524: {'lr': 0.0004430032493671998, 'samples': 2404800, 'steps': 12524, 'loss/train': 0.7917302548885345} 01/27/2022 07:38:23 - INFO - codeparrot_training - Step 12525: {'lr': 0.0004429928488628126, 'samples': 2404992, 'steps': 12525, 'loss/train': 0.6589880883693695} 01/27/2022 07:38:26 - INFO - codeparrot_training - Step 12526: {'lr': 0.00044298244753170535, 'samples': 2405184, 'steps': 12526, 'loss/train': 0.7225103974342346} 01/27/2022 07:38:29 - INFO - codeparrot_training - Step 12527: {'lr': 0.00044297204537392253, 'samples': 2405376, 'steps': 12527, 'loss/train': 0.728220596909523} 01/27/2022 07:38:34 - INFO - codeparrot_training - Step 12528: {'lr': 0.00044296164238950874, 'samples': 2405568, 'steps': 12528, 'loss/train': 0.5186905413866043} 01/27/2022 07:38:37 - INFO - codeparrot_training - Step 12529: {'lr': 0.0004429512385785086, 'samples': 2405760, 'steps': 12529, 'loss/train': 0.5659583061933517} 01/27/2022 07:38:40 - INFO - codeparrot_training - Step 12530: {'lr': 0.0004429408339409666, 'samples': 2405952, 'steps': 12530, 'loss/train': 0.8759289085865021} 01/27/2022 07:38:43 - INFO - codeparrot_training - Step 12531: {'lr': 0.00044293042847692735, 'samples': 2406144, 'steps': 12531, 'loss/train': 1.2204853892326355} 01/27/2022 07:38:47 - INFO - codeparrot_training - Step 12532: {'lr': 0.00044292002218643533, 'samples': 2406336, 'steps': 12532, 'loss/train': 0.526084765791893} 01/27/2022 07:38:50 - INFO - codeparrot_training - Step 12533: {'lr': 0.00044290961506953525, 'samples': 2406528, 'steps': 12533, 'loss/train': 0.8788238167762756} 01/27/2022 07:38:53 - INFO - codeparrot_training - Step 12534: {'lr': 0.0004428992071262716, 'samples': 2406720, 'steps': 12534, 'loss/train': 0.6469307094812393} 01/27/2022 07:38:56 - INFO - codeparrot_training - Step 12535: {'lr': 0.00044288879835668903, 'samples': 2406912, 'steps': 12535, 'loss/train': 1.188188910484314} 01/27/2022 07:38:59 - INFO - codeparrot_training - Step 12536: {'lr': 0.0004428783887608321, 'samples': 2407104, 'steps': 12536, 'loss/train': 0.8437519669532776} 01/27/2022 07:39:04 - INFO - codeparrot_training - Step 12537: {'lr': 0.0004428679783387454, 'samples': 2407296, 'steps': 12537, 'loss/train': 0.49862225353717804} 01/27/2022 07:39:07 - INFO - codeparrot_training - Step 12538: {'lr': 0.00044285756709047354, 'samples': 2407488, 'steps': 12538, 'loss/train': 1.0308943390846252} 01/27/2022 07:39:11 - INFO - codeparrot_training - Step 12539: {'lr': 0.0004428471550160611, 'samples': 2407680, 'steps': 12539, 'loss/train': 0.9544934928417206} 01/27/2022 07:39:14 - INFO - codeparrot_training - Step 12540: {'lr': 0.00044283674211555266, 'samples': 2407872, 'steps': 12540, 'loss/train': 1.1545932590961456} 01/27/2022 07:39:17 - INFO - codeparrot_training - Step 12541: {'lr': 0.0004428263283889928, 'samples': 2408064, 'steps': 12541, 'loss/train': 1.2245941758155823} 01/27/2022 07:39:20 - INFO - codeparrot_training - Step 12542: {'lr': 0.0004428159138364263, 'samples': 
2408256, 'steps': 12542, 'loss/train': 0.5828361511230469} 01/27/2022 07:39:23 - INFO - codeparrot_training - Step 12543: {'lr': 0.0004428054984578975, 'samples': 2408448, 'steps': 12543, 'loss/train': 0.7380574643611908} 01/27/2022 07:39:26 - INFO - codeparrot_training - Step 12544: {'lr': 0.0004427950822534513, 'samples': 2408640, 'steps': 12544, 'loss/train': 0.8466839790344238} 01/27/2022 07:39:29 - INFO - codeparrot_training - Step 12545: {'lr': 0.0004427846652231321, 'samples': 2408832, 'steps': 12545, 'loss/train': 0.7033702433109283} 01/27/2022 07:39:34 - INFO - codeparrot_training - Step 12546: {'lr': 0.0004427742473669847, 'samples': 2409024, 'steps': 12546, 'loss/train': 0.9582515358924866} 01/27/2022 07:39:37 - INFO - codeparrot_training - Step 12547: {'lr': 0.00044276382868505356, 'samples': 2409216, 'steps': 12547, 'loss/train': 1.3124121129512787} 01/27/2022 07:39:40 - INFO - codeparrot_training - Step 12548: {'lr': 0.0004427534091773834, 'samples': 2409408, 'steps': 12548, 'loss/train': 0.6996733993291855} 01/27/2022 07:39:43 - INFO - codeparrot_training - Step 12549: {'lr': 0.00044274298884401886, 'samples': 2409600, 'steps': 12549, 'loss/train': 1.505942165851593} 01/27/2022 07:39:46 - INFO - codeparrot_training - Step 12550: {'lr': 0.0004427325676850045, 'samples': 2409792, 'steps': 12550, 'loss/train': 0.45432235300540924} 01/27/2022 07:39:49 - INFO - codeparrot_training - Step 12551: {'lr': 0.00044272214570038513, 'samples': 2409984, 'steps': 12551, 'loss/train': 0.6372009068727493} 01/27/2022 07:39:53 - INFO - codeparrot_training - Step 12552: {'lr': 0.00044271172289020525, 'samples': 2410176, 'steps': 12552, 'loss/train': 1.3596647679805756} 01/27/2022 07:39:56 - INFO - codeparrot_training - Step 12553: {'lr': 0.00044270129925450945, 'samples': 2410368, 'steps': 12553, 'loss/train': 0.7305526882410049} 01/27/2022 07:39:59 - INFO - codeparrot_training - Step 12554: {'lr': 0.00044269087479334256, 'samples': 2410560, 'steps': 12554, 'loss/train': 0.16546816751360893} 01/27/2022 07:40:04 - INFO - codeparrot_training - Step 12555: {'lr': 0.00044268044950674913, 'samples': 2410752, 'steps': 12555, 'loss/train': 0.7452675998210907} 01/27/2022 07:40:07 - INFO - codeparrot_training - Step 12556: {'lr': 0.0004426700233947738, 'samples': 2410944, 'steps': 12556, 'loss/train': 1.3373144567012787} 01/27/2022 07:40:10 - INFO - codeparrot_training - Step 12557: {'lr': 0.00044265959645746136, 'samples': 2411136, 'steps': 12557, 'loss/train': 0.6864316016435623} 01/27/2022 07:40:13 - INFO - codeparrot_training - Step 12558: {'lr': 0.0004426491686948563, 'samples': 2411328, 'steps': 12558, 'loss/train': 0.9317808151245117} 01/27/2022 07:40:17 - INFO - codeparrot_training - Step 12559: {'lr': 0.00044263874010700343, 'samples': 2411520, 'steps': 12559, 'loss/train': 0.8177908658981323} 01/27/2022 07:40:20 - INFO - codeparrot_training - Step 12560: {'lr': 0.0004426283106939473, 'samples': 2411712, 'steps': 12560, 'loss/train': 0.9612716138362885} 01/27/2022 07:40:23 - INFO - codeparrot_training - Step 12561: {'lr': 0.0004426178804557327, 'samples': 2411904, 'steps': 12561, 'loss/train': 1.8254259824752808} 01/27/2022 07:40:26 - INFO - codeparrot_training - Step 12562: {'lr': 0.0004426074493924043, 'samples': 2412096, 'steps': 12562, 'loss/train': 1.710928201675415} 01/27/2022 07:40:29 - INFO - codeparrot_training - Step 12563: {'lr': 0.00044259701750400674, 'samples': 2412288, 'steps': 12563, 'loss/train': 0.6937476843595505} 01/27/2022 07:40:34 - INFO - codeparrot_training - Step 12564: 
{'lr': 0.00044258658479058463, 'samples': 2412480, 'steps': 12564, 'loss/train': 1.2631155252456665} 01/27/2022 07:40:37 - INFO - codeparrot_training - Step 12565: {'lr': 0.00044257615125218273, 'samples': 2412672, 'steps': 12565, 'loss/train': 1.1406463980674744} 01/27/2022 07:40:41 - INFO - codeparrot_training - Step 12566: {'lr': 0.00044256571688884583, 'samples': 2412864, 'steps': 12566, 'loss/train': 1.1012141704559326} 01/27/2022 07:40:44 - INFO - codeparrot_training - Step 12567: {'lr': 0.00044255528170061853, 'samples': 2413056, 'steps': 12567, 'loss/train': 1.474882185459137} 01/27/2022 07:40:47 - INFO - codeparrot_training - Step 12568: {'lr': 0.00044254484568754556, 'samples': 2413248, 'steps': 12568, 'loss/train': 0.8233294486999512} 01/27/2022 07:40:50 - INFO - codeparrot_training - Step 12569: {'lr': 0.0004425344088496716, 'samples': 2413440, 'steps': 12569, 'loss/train': 0.28578729182481766} 01/27/2022 07:40:53 - INFO - codeparrot_training - Step 12570: {'lr': 0.00044252397118704133, 'samples': 2413632, 'steps': 12570, 'loss/train': 1.2679709494113922} 01/27/2022 07:40:56 - INFO - codeparrot_training - Step 12571: {'lr': 0.0004425135326996995, 'samples': 2413824, 'steps': 12571, 'loss/train': 0.5840864181518555} 01/27/2022 07:41:00 - INFO - codeparrot_training - Step 12572: {'lr': 0.0004425030933876909, 'samples': 2414016, 'steps': 12572, 'loss/train': 1.412510633468628} 01/27/2022 07:41:04 - INFO - codeparrot_training - Step 12573: {'lr': 0.00044249265325106013, 'samples': 2414208, 'steps': 12573, 'loss/train': 0.8452616035938263} 01/27/2022 07:41:07 - INFO - codeparrot_training - Step 12574: {'lr': 0.000442482212289852, 'samples': 2414400, 'steps': 12574, 'loss/train': 0.8451715707778931} 01/27/2022 07:41:10 - INFO - codeparrot_training - Step 12575: {'lr': 0.00044247177050411114, 'samples': 2414592, 'steps': 12575, 'loss/train': 0.752539873123169} 01/27/2022 07:41:13 - INFO - codeparrot_training - Step 12576: {'lr': 0.00044246132789388235, 'samples': 2414784, 'steps': 12576, 'loss/train': 0.9545845091342926} 01/27/2022 07:41:17 - INFO - codeparrot_training - Step 12577: {'lr': 0.00044245088445921035, 'samples': 2414976, 'steps': 12577, 'loss/train': 0.9668420255184174} 01/27/2022 07:41:20 - INFO - codeparrot_training - Step 12578: {'lr': 0.00044244044020013985, 'samples': 2415168, 'steps': 12578, 'loss/train': 0.8707015514373779} 01/27/2022 07:41:23 - INFO - codeparrot_training - Step 12579: {'lr': 0.0004424299951167156, 'samples': 2415360, 'steps': 12579, 'loss/train': 0.6694605499505997} 01/27/2022 07:41:26 - INFO - codeparrot_training - Step 12580: {'lr': 0.0004424195492089824, 'samples': 2415552, 'steps': 12580, 'loss/train': 0.7546118795871735} 01/27/2022 07:41:29 - INFO - codeparrot_training - Step 12581: {'lr': 0.0004424091024769849, 'samples': 2415744, 'steps': 12581, 'loss/train': 1.283128023147583} 01/27/2022 07:41:34 - INFO - codeparrot_training - Step 12582: {'lr': 0.00044239865492076794, 'samples': 2415936, 'steps': 12582, 'loss/train': 0.5083162933588028} 01/27/2022 07:41:37 - INFO - codeparrot_training - Step 12583: {'lr': 0.0004423882065403762, 'samples': 2416128, 'steps': 12583, 'loss/train': 0.46616385877132416} 01/27/2022 07:41:40 - INFO - codeparrot_training - Step 12584: {'lr': 0.0004423777573358545, 'samples': 2416320, 'steps': 12584, 'loss/train': 0.6737842708826065} 01/27/2022 07:41:43 - INFO - codeparrot_training - Step 12585: {'lr': 0.0004423673073072476, 'samples': 2416512, 'steps': 12585, 'loss/train': 1.0505485832691193} 01/27/2022 07:41:46 - 
INFO - codeparrot_training - Step 12586: {'lr': 0.0004423568564546002, 'samples': 2416704, 'steps': 12586, 'loss/train': 0.7692604959011078} 01/27/2022 07:41:49 - INFO - codeparrot_training - Step 12587: {'lr': 0.00044234640477795707, 'samples': 2416896, 'steps': 12587, 'loss/train': 0.909434974193573} 01/27/2022 07:41:52 - INFO - codeparrot_training - Step 12588: {'lr': 0.0004423359522773631, 'samples': 2417088, 'steps': 12588, 'loss/train': 1.070112258195877} 01/27/2022 07:41:56 - INFO - codeparrot_training - Step 12589: {'lr': 0.00044232549895286294, 'samples': 2417280, 'steps': 12589, 'loss/train': 0.057606177404522896} 01/27/2022 07:41:59 - INFO - codeparrot_training - Step 12590: {'lr': 0.00044231504480450145, 'samples': 2417472, 'steps': 12590, 'loss/train': 0.7340866774320602} 01/27/2022 07:42:04 - INFO - codeparrot_training - Step 12591: {'lr': 0.0004423045898323233, 'samples': 2417664, 'steps': 12591, 'loss/train': 0.5379713326692581} 01/27/2022 07:42:07 - INFO - codeparrot_training - Step 12592: {'lr': 0.0004422941340363734, 'samples': 2417856, 'steps': 12592, 'loss/train': 0.8873337507247925} 01/27/2022 07:42:10 - INFO - codeparrot_training - Step 12593: {'lr': 0.0004422836774166965, 'samples': 2418048, 'steps': 12593, 'loss/train': 0.5515860915184021} 01/27/2022 07:42:14 - INFO - codeparrot_training - Step 12594: {'lr': 0.00044227321997333737, 'samples': 2418240, 'steps': 12594, 'loss/train': 0.8114677369594574} 01/27/2022 07:42:17 - INFO - codeparrot_training - Step 12595: {'lr': 0.0004422627617063408, 'samples': 2418432, 'steps': 12595, 'loss/train': 1.4745289385318756} 01/27/2022 07:42:20 - INFO - codeparrot_training - Step 12596: {'lr': 0.00044225230261575165, 'samples': 2418624, 'steps': 12596, 'loss/train': 1.126346915960312} 01/27/2022 07:42:23 - INFO - codeparrot_training - Step 12597: {'lr': 0.00044224184270161466, 'samples': 2418816, 'steps': 12597, 'loss/train': 1.7520655989646912} 01/27/2022 07:42:26 - INFO - codeparrot_training - Step 12598: {'lr': 0.0004422313819639747, 'samples': 2419008, 'steps': 12598, 'loss/train': 0.6486103981733322} 01/27/2022 07:42:31 - INFO - codeparrot_training - Step 12599: {'lr': 0.0004422209204028765, 'samples': 2419200, 'steps': 12599, 'loss/train': 1.1890787780284882} 01/27/2022 07:42:34 - INFO - codeparrot_training - Step 12600: {'lr': 0.0004422104580183649, 'samples': 2419392, 'steps': 12600, 'loss/train': 0.24845324456691742} 01/27/2022 07:42:37 - INFO - codeparrot_training - Step 12601: {'lr': 0.0004421999948104848, 'samples': 2419584, 'steps': 12601, 'loss/train': 0.9491150379180908} 01/27/2022 07:42:40 - INFO - codeparrot_training - Step 12602: {'lr': 0.00044218953077928083, 'samples': 2419776, 'steps': 12602, 'loss/train': 0.8789937794208527} 01/27/2022 07:42:43 - INFO - codeparrot_training - Step 12603: {'lr': 0.000442179065924798, 'samples': 2419968, 'steps': 12603, 'loss/train': 0.6022536903619766} 01/27/2022 07:42:46 - INFO - codeparrot_training - Step 12604: {'lr': 0.0004421686002470811, 'samples': 2420160, 'steps': 12604, 'loss/train': 0.5077358186244965} 01/27/2022 07:42:49 - INFO - codeparrot_training - Step 12605: {'lr': 0.0004421581337461749, 'samples': 2420352, 'steps': 12605, 'loss/train': 0.4602848142385483} 01/27/2022 07:42:53 - INFO - codeparrot_training - Step 12606: {'lr': 0.00044214766642212435, 'samples': 2420544, 'steps': 12606, 'loss/train': 0.9579845666885376} 01/27/2022 07:42:56 - INFO - codeparrot_training - Step 12607: {'lr': 0.00044213719827497413, 'samples': 2420736, 'steps': 12607, 'loss/train': 
1.0263141095638275} 01/27/2022 07:43:00 - INFO - codeparrot_training - Step 12608: {'lr': 0.0004421267293047692, 'samples': 2420928, 'steps': 12608, 'loss/train': 0.970055490732193} 01/27/2022 07:43:03 - INFO - codeparrot_training - Step 12609: {'lr': 0.00044211625951155433, 'samples': 2421120, 'steps': 12609, 'loss/train': 1.169284164905548} 01/27/2022 07:43:06 - INFO - codeparrot_training - Step 12610: {'lr': 0.00044210578889537446, 'samples': 2421312, 'steps': 12610, 'loss/train': 0.501809298992157} 01/27/2022 07:43:10 - INFO - codeparrot_training - Step 12611: {'lr': 0.0004420953174562743, 'samples': 2421504, 'steps': 12611, 'loss/train': 0.9173799455165863} 01/27/2022 07:43:13 - INFO - codeparrot_training - Step 12612: {'lr': 0.0004420848451942989, 'samples': 2421696, 'steps': 12612, 'loss/train': 0.6489778161048889} 01/27/2022 07:43:16 - INFO - codeparrot_training - Step 12613: {'lr': 0.000442074372109493, 'samples': 2421888, 'steps': 12613, 'loss/train': 0.544271394610405} 01/27/2022 07:43:19 - INFO - codeparrot_training - Step 12614: {'lr': 0.0004420638982019014, 'samples': 2422080, 'steps': 12614, 'loss/train': 0.8752546012401581} 01/27/2022 07:43:22 - INFO - codeparrot_training - Step 12615: {'lr': 0.0004420534234715691, 'samples': 2422272, 'steps': 12615, 'loss/train': 0.5994856059551239} 01/27/2022 07:43:25 - INFO - codeparrot_training - Step 12616: {'lr': 0.00044204294791854094, 'samples': 2422464, 'steps': 12616, 'loss/train': 0.6093144714832306} 01/27/2022 07:43:30 - INFO - codeparrot_training - Step 12617: {'lr': 0.00044203247154286175, 'samples': 2422656, 'steps': 12617, 'loss/train': 0.7104417979717255} 01/27/2022 07:43:33 - INFO - codeparrot_training - Step 12618: {'lr': 0.0004420219943445765, 'samples': 2422848, 'steps': 12618, 'loss/train': 1.1442264318466187} 01/27/2022 07:43:36 - INFO - codeparrot_training - Step 12619: {'lr': 0.0004420115163237299, 'samples': 2423040, 'steps': 12619, 'loss/train': 1.6287805438041687} 01/27/2022 07:43:39 - INFO - codeparrot_training - Step 12620: {'lr': 0.000442001037480367, 'samples': 2423232, 'steps': 12620, 'loss/train': 0.9843862652778625} 01/27/2022 07:43:42 - INFO - codeparrot_training - Step 12621: {'lr': 0.0004419905578145326, 'samples': 2423424, 'steps': 12621, 'loss/train': 1.0919638574123383} 01/27/2022 07:43:45 - INFO - codeparrot_training - Step 12622: {'lr': 0.00044198007732627155, 'samples': 2423616, 'steps': 12622, 'loss/train': 0.8891672194004059} 01/27/2022 07:43:49 - INFO - codeparrot_training - Step 12623: {'lr': 0.00044196959601562884, 'samples': 2423808, 'steps': 12623, 'loss/train': 0.6182270050048828} 01/27/2022 07:43:52 - INFO - codeparrot_training - Step 12624: {'lr': 0.0004419591138826494, 'samples': 2424000, 'steps': 12624, 'loss/train': 0.8508460521697998} 01/27/2022 07:43:55 - INFO - codeparrot_training - Step 12625: {'lr': 0.000441948630927378, 'samples': 2424192, 'steps': 12625, 'loss/train': 0.8657464385032654} 01/27/2022 07:44:00 - INFO - codeparrot_training - Step 12626: {'lr': 0.0004419381471498597, 'samples': 2424384, 'steps': 12626, 'loss/train': 0.8589036762714386} 01/27/2022 07:44:03 - INFO - codeparrot_training - Step 12627: {'lr': 0.00044192766255013926, 'samples': 2424576, 'steps': 12627, 'loss/train': 0.06747337616980076} 01/27/2022 07:44:06 - INFO - codeparrot_training - Step 12628: {'lr': 0.0004419171771282616, 'samples': 2424768, 'steps': 12628, 'loss/train': 0.7387343645095825} 01/27/2022 07:44:09 - INFO - codeparrot_training - Step 12629: {'lr': 0.0004419066908842718, 'samples': 
2424960, 'steps': 12629, 'loss/train': 1.3500736355781555} 01/27/2022 07:44:13 - INFO - codeparrot_training - Step 12630: {'lr': 0.0004418962038182146, 'samples': 2425152, 'steps': 12630, 'loss/train': 0.36471667885780334} 01/27/2022 07:44:16 - INFO - codeparrot_training - Step 12631: {'lr': 0.00044188571593013504, 'samples': 2425344, 'steps': 12631, 'loss/train': 0.9093074798583984} 01/27/2022 07:44:19 - INFO - codeparrot_training - Step 12632: {'lr': 0.000441875227220078, 'samples': 2425536, 'steps': 12632, 'loss/train': 0.5777555108070374} 01/27/2022 07:44:22 - INFO - codeparrot_training - Step 12633: {'lr': 0.00044186473768808844, 'samples': 2425728, 'steps': 12633, 'loss/train': 0.7357423156499863} 01/27/2022 07:44:25 - INFO - codeparrot_training - Step 12634: {'lr': 0.0004418542473342112, 'samples': 2425920, 'steps': 12634, 'loss/train': 0.9607804119586945} 01/27/2022 07:44:30 - INFO - codeparrot_training - Step 12635: {'lr': 0.0004418437561584914, 'samples': 2426112, 'steps': 12635, 'loss/train': 0.700939267873764} 01/27/2022 07:44:33 - INFO - codeparrot_training - Step 12636: {'lr': 0.00044183326416097373, 'samples': 2426304, 'steps': 12636, 'loss/train': 1.0371690094470978} 01/27/2022 07:44:36 - INFO - codeparrot_training - Step 12637: {'lr': 0.0004418227713417033, 'samples': 2426496, 'steps': 12637, 'loss/train': 0.6863858252763748} 01/27/2022 07:44:39 - INFO - codeparrot_training - Step 12638: {'lr': 0.0004418122777007251, 'samples': 2426688, 'steps': 12638, 'loss/train': 1.8096261620521545} 01/27/2022 07:44:42 - INFO - codeparrot_training - Step 12639: {'lr': 0.00044180178323808395, 'samples': 2426880, 'steps': 12639, 'loss/train': 1.1318177282810211} 01/27/2022 07:44:46 - INFO - codeparrot_training - Step 12640: {'lr': 0.00044179128795382493, 'samples': 2427072, 'steps': 12640, 'loss/train': 0.10186750441789627} 01/27/2022 07:44:49 - INFO - codeparrot_training - Step 12641: {'lr': 0.00044178079184799284, 'samples': 2427264, 'steps': 12641, 'loss/train': 0.7943970859050751} 01/27/2022 07:44:52 - INFO - codeparrot_training - Step 12642: {'lr': 0.0004417702949206328, 'samples': 2427456, 'steps': 12642, 'loss/train': 1.37494495511055} 01/27/2022 07:44:55 - INFO - codeparrot_training - Step 12643: {'lr': 0.0004417597971717897, 'samples': 2427648, 'steps': 12643, 'loss/train': 0.7122875601053238} 01/27/2022 07:45:00 - INFO - codeparrot_training - Step 12644: {'lr': 0.0004417492986015085, 'samples': 2427840, 'steps': 12644, 'loss/train': 0.5775775909423828} 01/27/2022 07:45:03 - INFO - codeparrot_training - Step 12645: {'lr': 0.00044173879920983417, 'samples': 2428032, 'steps': 12645, 'loss/train': 0.9660619497299194} 01/27/2022 07:45:06 - INFO - codeparrot_training - Step 12646: {'lr': 0.00044172829899681175, 'samples': 2428224, 'steps': 12646, 'loss/train': 0.7642166912555695} 01/27/2022 07:45:09 - INFO - codeparrot_training - Step 12647: {'lr': 0.00044171779796248623, 'samples': 2428416, 'steps': 12647, 'loss/train': 0.7730759382247925} 01/27/2022 07:45:13 - INFO - codeparrot_training - Step 12648: {'lr': 0.0004417072961069024, 'samples': 2428608, 'steps': 12648, 'loss/train': 0.8799847662448883} 01/27/2022 07:45:16 - INFO - codeparrot_training - Step 12649: {'lr': 0.0004416967934301055, 'samples': 2428800, 'steps': 12649, 'loss/train': 0.7922570407390594} 01/27/2022 07:45:19 - INFO - codeparrot_training - Step 12650: {'lr': 0.00044168628993214036, 'samples': 2428992, 'steps': 12650, 'loss/train': 0.7971552014350891} 01/27/2022 07:45:22 - INFO - codeparrot_training - Step 12651: 
{'lr': 0.0004416757856130521, 'samples': 2429184, 'steps': 12651, 'loss/train': 0.5784088522195816} 01/27/2022 07:45:25 - INFO - codeparrot_training - Step 12652: {'lr': 0.0004416652804728855, 'samples': 2429376, 'steps': 12652, 'loss/train': 1.0094662606716156} 01/27/2022 07:45:29 - INFO - codeparrot_training - Step 12653: {'lr': 0.0004416547745116858, 'samples': 2429568, 'steps': 12653, 'loss/train': 1.038343459367752} 01/27/2022 07:45:33 - INFO - codeparrot_training - Step 12654: {'lr': 0.00044164426772949785, 'samples': 2429760, 'steps': 12654, 'loss/train': 0.7566057443618774} 01/27/2022 07:45:36 - INFO - codeparrot_training - Step 12655: {'lr': 0.0004416337601263667, 'samples': 2429952, 'steps': 12655, 'loss/train': 0.5600170344114304} 01/27/2022 07:45:39 - INFO - codeparrot_training - Step 12656: {'lr': 0.00044162325170233745, 'samples': 2430144, 'steps': 12656, 'loss/train': 0.6710685789585114} 01/27/2022 07:45:42 - INFO - codeparrot_training - Step 12657: {'lr': 0.00044161274245745497, 'samples': 2430336, 'steps': 12657, 'loss/train': 0.7369587868452072} 01/27/2022 07:45:45 - INFO - codeparrot_training - Step 12658: {'lr': 0.00044160223239176445, 'samples': 2430528, 'steps': 12658, 'loss/train': 0.6246489733457565} 01/27/2022 07:45:48 - INFO - codeparrot_training - Step 12659: {'lr': 0.0004415917215053107, 'samples': 2430720, 'steps': 12659, 'loss/train': 0.5256569534540176} 01/27/2022 07:45:51 - INFO - codeparrot_training - Step 12660: {'lr': 0.00044158120979813885, 'samples': 2430912, 'steps': 12660, 'loss/train': 0.8550284206867218} 01/27/2022 07:45:56 - INFO - codeparrot_training - Step 12661: {'lr': 0.000441570697270294, 'samples': 2431104, 'steps': 12661, 'loss/train': 0.9621084630489349} 01/27/2022 07:45:59 - INFO - codeparrot_training - Step 12662: {'lr': 0.00044156018392182105, 'samples': 2431296, 'steps': 12662, 'loss/train': 0.9038648307323456} 01/27/2022 07:46:02 - INFO - codeparrot_training - Step 12663: {'lr': 0.00044154966975276514, 'samples': 2431488, 'steps': 12663, 'loss/train': 1.3891939222812653} 01/27/2022 07:46:05 - INFO - codeparrot_training - Step 12664: {'lr': 0.00044153915476317126, 'samples': 2431680, 'steps': 12664, 'loss/train': 0.950604110956192} 01/27/2022 07:46:08 - INFO - codeparrot_training - Step 12665: {'lr': 0.00044152863895308446, 'samples': 2431872, 'steps': 12665, 'loss/train': 1.0947866141796112} 01/27/2022 07:46:11 - INFO - codeparrot_training - Step 12666: {'lr': 0.0004415181223225497, 'samples': 2432064, 'steps': 12666, 'loss/train': 1.1076753437519073} 01/27/2022 07:46:15 - INFO - codeparrot_training - Step 12667: {'lr': 0.0004415076048716122, 'samples': 2432256, 'steps': 12667, 'loss/train': 1.092253714799881} 01/27/2022 07:46:18 - INFO - codeparrot_training - Step 12668: {'lr': 0.00044149708660031704, 'samples': 2432448, 'steps': 12668, 'loss/train': 0.6091621667146683} 01/27/2022 07:46:21 - INFO - codeparrot_training - Step 12669: {'lr': 0.000441486567508709, 'samples': 2432640, 'steps': 12669, 'loss/train': 0.7604460418224335} 01/27/2022 07:46:26 - INFO - codeparrot_training - Step 12670: {'lr': 0.0004414760475968334, 'samples': 2432832, 'steps': 12670, 'loss/train': 0.7030576765537262} 01/27/2022 07:46:29 - INFO - codeparrot_training - Step 12671: {'lr': 0.0004414655268647352, 'samples': 2433024, 'steps': 12671, 'loss/train': 0.8451159596443176} 01/27/2022 07:46:32 - INFO - codeparrot_training - Step 12672: {'lr': 0.0004414550053124594, 'samples': 2433216, 'steps': 12672, 'loss/train': 0.8622380197048187} 01/27/2022 07:46:35 - INFO 
- codeparrot_training - Step 12673: {'lr': 0.0004414444829400512, 'samples': 2433408, 'steps': 12673, 'loss/train': 0.7058532536029816} 01/27/2022 07:46:39 - INFO - codeparrot_training - Step 12674: {'lr': 0.00044143395974755565, 'samples': 2433600, 'steps': 12674, 'loss/train': 1.0719459056854248} 01/27/2022 07:46:42 - INFO - codeparrot_training - Step 12675: {'lr': 0.00044142343573501787, 'samples': 2433792, 'steps': 12675, 'loss/train': 0.6332068294286728} 01/27/2022 07:46:45 - INFO - codeparrot_training - Step 12676: {'lr': 0.0004414129109024827, 'samples': 2433984, 'steps': 12676, 'loss/train': 0.8913850486278534} 01/27/2022 07:46:48 - INFO - codeparrot_training - Step 12677: {'lr': 0.00044140238524999556, 'samples': 2434176, 'steps': 12677, 'loss/train': 0.9226489663124084} 01/27/2022 07:46:51 - INFO - codeparrot_training - Step 12678: {'lr': 0.0004413918587776013, 'samples': 2434368, 'steps': 12678, 'loss/train': 0.7249162048101425} 01/27/2022 07:46:56 - INFO - codeparrot_training - Step 12679: {'lr': 0.0004413813314853451, 'samples': 2434560, 'steps': 12679, 'loss/train': 0.8294258415699005} 01/27/2022 07:46:59 - INFO - codeparrot_training - Step 12680: {'lr': 0.00044137080337327205, 'samples': 2434752, 'steps': 12680, 'loss/train': 0.8382528126239777} 01/27/2022 07:47:02 - INFO - codeparrot_training - Step 12681: {'lr': 0.00044136027444142723, 'samples': 2434944, 'steps': 12681, 'loss/train': 1.0568104684352875} 01/27/2022 07:47:05 - INFO - codeparrot_training - Step 12682: {'lr': 0.0004413497446898558, 'samples': 2435136, 'steps': 12682, 'loss/train': 1.0725422501564026} 01/27/2022 07:47:08 - INFO - codeparrot_training - Step 12683: {'lr': 0.0004413392141186028, 'samples': 2435328, 'steps': 12683, 'loss/train': 0.7117479890584946} 01/27/2022 07:47:11 - INFO - codeparrot_training - Step 12684: {'lr': 0.00044132868272771334, 'samples': 2435520, 'steps': 12684, 'loss/train': 0.9138977229595184} 01/27/2022 07:47:14 - INFO - codeparrot_training - Step 12685: {'lr': 0.0004413181505172326, 'samples': 2435712, 'steps': 12685, 'loss/train': 0.7549859583377838} 01/27/2022 07:47:18 - INFO - codeparrot_training - Step 12686: {'lr': 0.0004413076174872056, 'samples': 2435904, 'steps': 12686, 'loss/train': 1.0360060930252075} 01/27/2022 07:47:23 - INFO - codeparrot_training - Step 12687: {'lr': 0.0004412970836376776, 'samples': 2436096, 'steps': 12687, 'loss/train': 1.031388759613037} 01/27/2022 07:47:26 - INFO - codeparrot_training - Step 12688: {'lr': 0.00044128654896869357, 'samples': 2436288, 'steps': 12688, 'loss/train': 0.8150423169136047} 01/27/2022 07:47:29 - INFO - codeparrot_training - Step 12689: {'lr': 0.00044127601348029874, 'samples': 2436480, 'steps': 12689, 'loss/train': 0.34635036438703537} 01/27/2022 07:47:32 - INFO - codeparrot_training - Step 12690: {'lr': 0.0004412654771725382, 'samples': 2436672, 'steps': 12690, 'loss/train': 4.36512815952301} 01/27/2022 07:47:35 - INFO - codeparrot_training - Step 12691: {'lr': 0.00044125494004545703, 'samples': 2436864, 'steps': 12691, 'loss/train': 1.0089164972305298} 01/27/2022 07:47:38 - INFO - codeparrot_training - Step 12692: {'lr': 0.0004412444020991004, 'samples': 2437056, 'steps': 12692, 'loss/train': 4.903389573097229} 01/27/2022 07:47:42 - INFO - codeparrot_training - Step 12693: {'lr': 0.00044123386333351364, 'samples': 2437248, 'steps': 12693, 'loss/train': 0.7473496645689011} 01/27/2022 07:47:45 - INFO - codeparrot_training - Step 12694: {'lr': 0.00044122332374874166, 'samples': 2437440, 'steps': 12694, 'loss/train': 
0.8296410441398621} 01/27/2022 07:47:48 - INFO - codeparrot_training - Step 12695: {'lr': 0.0004412127833448296, 'samples': 2437632, 'steps': 12695, 'loss/train': 0.3833889216184616} 01/27/2022 07:47:52 - INFO - codeparrot_training - Step 12696: {'lr': 0.00044120224212182283, 'samples': 2437824, 'steps': 12696, 'loss/train': 0.7268106490373611} 01/27/2022 07:47:55 - INFO - codeparrot_training - Step 12697: {'lr': 0.0004411917000797663, 'samples': 2438016, 'steps': 12697, 'loss/train': 0.8982835114002228} 01/27/2022 07:47:58 - INFO - codeparrot_training - Step 12698: {'lr': 0.0004411811572187052, 'samples': 2438208, 'steps': 12698, 'loss/train': 0.45354214310646057} 01/27/2022 07:48:02 - INFO - codeparrot_training - Step 12699: {'lr': 0.0004411706135386847, 'samples': 2438400, 'steps': 12699, 'loss/train': 1.1328021883964539} 01/27/2022 07:48:05 - INFO - codeparrot_training - Step 12700: {'lr': 0.0004411600690397501, 'samples': 2438592, 'steps': 12700, 'loss/train': 0.4641272574663162} 01/27/2022 07:48:08 - INFO - codeparrot_training - Step 12701: {'lr': 0.0004411495237219464, 'samples': 2438784, 'steps': 12701, 'loss/train': 0.8660268187522888} 01/27/2022 07:48:11 - INFO - codeparrot_training - Step 12702: {'lr': 0.00044113897758531884, 'samples': 2438976, 'steps': 12702, 'loss/train': 1.0650245547294617} 01/27/2022 07:48:14 - INFO - codeparrot_training - Step 12703: {'lr': 0.00044112843062991264, 'samples': 2439168, 'steps': 12703, 'loss/train': 0.9380347430706024} 01/27/2022 07:48:17 - INFO - codeparrot_training - Step 12704: {'lr': 0.0004411178828557729, 'samples': 2439360, 'steps': 12704, 'loss/train': 1.1201068460941315} 01/27/2022 07:48:22 - INFO - codeparrot_training - Step 12705: {'lr': 0.00044110733426294484, 'samples': 2439552, 'steps': 12705, 'loss/train': 0.8931186497211456} 01/27/2022 07:48:25 - INFO - codeparrot_training - Step 12706: {'lr': 0.00044109678485147367, 'samples': 2439744, 'steps': 12706, 'loss/train': 1.6196562051773071} 01/27/2022 07:48:28 - INFO - codeparrot_training - Step 12707: {'lr': 0.00044108623462140454, 'samples': 2439936, 'steps': 12707, 'loss/train': 1.0323582887649536} 01/27/2022 07:48:31 - INFO - codeparrot_training - Step 12708: {'lr': 0.0004410756835727826, 'samples': 2440128, 'steps': 12708, 'loss/train': 0.6801545172929764} 01/27/2022 07:48:34 - INFO - codeparrot_training - Step 12709: {'lr': 0.0004410651317056532, 'samples': 2440320, 'steps': 12709, 'loss/train': 0.6477064043283463} 01/27/2022 07:48:38 - INFO - codeparrot_training - Step 12710: {'lr': 0.0004410545790200614, 'samples': 2440512, 'steps': 12710, 'loss/train': 0.6467557400465012} 01/27/2022 07:48:41 - INFO - codeparrot_training - Step 12711: {'lr': 0.00044104402551605246, 'samples': 2440704, 'steps': 12711, 'loss/train': 0.6344307214021683} 01/27/2022 07:48:44 - INFO - codeparrot_training - Step 12712: {'lr': 0.00044103347119367155, 'samples': 2440896, 'steps': 12712, 'loss/train': 0.8003877103328705} 01/27/2022 07:48:47 - INFO - codeparrot_training - Step 12713: {'lr': 0.0004410229160529639, 'samples': 2441088, 'steps': 12713, 'loss/train': 0.8182559609413147} 01/27/2022 07:48:52 - INFO - codeparrot_training - Step 12714: {'lr': 0.0004410123600939747, 'samples': 2441280, 'steps': 12714, 'loss/train': 0.6500777453184128} 01/27/2022 07:48:55 - INFO - codeparrot_training - Step 12715: {'lr': 0.00044100180331674933, 'samples': 2441472, 'steps': 12715, 'loss/train': 2.1650830507278442} 01/27/2022 07:48:58 - INFO - codeparrot_training - Step 12716: {'lr': 0.00044099124572133283, 
'samples': 2441664, 'steps': 12716, 'loss/train': 0.693444013595581} 01/27/2022 07:49:01 - INFO - codeparrot_training - Step 12717: {'lr': 0.0004409806873077704, 'samples': 2441856, 'steps': 12717, 'loss/train': 1.0041046142578125} 01/27/2022 07:49:04 - INFO - codeparrot_training - Step 12718: {'lr': 0.0004409701280761075, 'samples': 2442048, 'steps': 12718, 'loss/train': 0.8923108577728271} 01/27/2022 07:49:07 - INFO - codeparrot_training - Step 12719: {'lr': 0.0004409595680263891, 'samples': 2442240, 'steps': 12719, 'loss/train': 0.568995863199234} 01/27/2022 07:49:10 - INFO - codeparrot_training - Step 12720: {'lr': 0.0004409490071586606, 'samples': 2442432, 'steps': 12720, 'loss/train': 0.5744617134332657} 01/27/2022 07:49:14 - INFO - codeparrot_training - Step 12721: {'lr': 0.00044093844547296715, 'samples': 2442624, 'steps': 12721, 'loss/train': 1.490464210510254} 01/27/2022 07:49:17 - INFO - codeparrot_training - Step 12722: {'lr': 0.000440927882969354, 'samples': 2442816, 'steps': 12722, 'loss/train': 0.8223584890365601} 01/27/2022 07:49:22 - INFO - codeparrot_training - Step 12723: {'lr': 0.0004409173196478665, 'samples': 2443008, 'steps': 12723, 'loss/train': 0.9133482277393341} 01/27/2022 07:49:25 - INFO - codeparrot_training - Step 12724: {'lr': 0.00044090675550854973, 'samples': 2443200, 'steps': 12724, 'loss/train': 1.0392537117004395} 01/27/2022 07:49:28 - INFO - codeparrot_training - Step 12725: {'lr': 0.00044089619055144916, 'samples': 2443392, 'steps': 12725, 'loss/train': 0.541268527507782} 01/27/2022 07:49:31 - INFO - codeparrot_training - Step 12726: {'lr': 0.0004408856247766098, 'samples': 2443584, 'steps': 12726, 'loss/train': 0.9287963211536407} 01/27/2022 07:49:34 - INFO - codeparrot_training - Step 12727: {'lr': 0.00044087505818407715, 'samples': 2443776, 'steps': 12727, 'loss/train': 0.6339396089315414} 01/27/2022 07:49:38 - INFO - codeparrot_training - Step 12728: {'lr': 0.00044086449077389636, 'samples': 2443968, 'steps': 12728, 'loss/train': 1.1202631294727325} 01/27/2022 07:49:41 - INFO - codeparrot_training - Step 12729: {'lr': 0.0004408539225461126, 'samples': 2444160, 'steps': 12729, 'loss/train': 0.7658480107784271} 01/27/2022 07:49:44 - INFO - codeparrot_training - Step 12730: {'lr': 0.0004408433535007713, 'samples': 2444352, 'steps': 12730, 'loss/train': 0.747945562005043} 01/27/2022 07:49:47 - INFO - codeparrot_training - Step 12731: {'lr': 0.0004408327836379177, 'samples': 2444544, 'steps': 12731, 'loss/train': 0.2901424840092659} 01/27/2022 07:49:52 - INFO - codeparrot_training - Step 12732: {'lr': 0.0004408222129575969, 'samples': 2444736, 'steps': 12732, 'loss/train': 0.8265334367752075} 01/27/2022 07:49:55 - INFO - codeparrot_training - Step 12733: {'lr': 0.0004408116414598545, 'samples': 2444928, 'steps': 12733, 'loss/train': 0.9537586569786072} 01/27/2022 07:49:58 - INFO - codeparrot_training - Step 12734: {'lr': 0.0004408010691447356, 'samples': 2445120, 'steps': 12734, 'loss/train': 0.9588800668716431} 01/27/2022 07:50:01 - INFO - codeparrot_training - Step 12735: {'lr': 0.00044079049601228543, 'samples': 2445312, 'steps': 12735, 'loss/train': 0.513706773519516} 01/27/2022 07:50:04 - INFO - codeparrot_training - Step 12736: {'lr': 0.00044077992206254934, 'samples': 2445504, 'steps': 12736, 'loss/train': 1.0192183256149292} 01/27/2022 07:50:07 - INFO - codeparrot_training - Step 12737: {'lr': 0.0004407693472955727, 'samples': 2445696, 'steps': 12737, 'loss/train': 0.7106431871652603} 01/27/2022 07:50:10 - INFO - codeparrot_training - Step 
12738: {'lr': 0.00044075877171140075, 'samples': 2445888, 'steps': 12738, 'loss/train': 0.5739456564188004} 01/27/2022 07:50:14 - INFO - codeparrot_training - Step 12739: {'lr': 0.00044074819531007885, 'samples': 2446080, 'steps': 12739, 'loss/train': 0.39987869560718536} 01/27/2022 07:50:17 - INFO - codeparrot_training - Step 12740: {'lr': 0.0004407376180916522, 'samples': 2446272, 'steps': 12740, 'loss/train': 0.5442996025085449} 01/27/2022 07:50:22 - INFO - codeparrot_training - Step 12741: {'lr': 0.00044072704005616614, 'samples': 2446464, 'steps': 12741, 'loss/train': 0.7884292602539062} 01/27/2022 07:50:25 - INFO - codeparrot_training - Step 12742: {'lr': 0.00044071646120366604, 'samples': 2446656, 'steps': 12742, 'loss/train': 1.0329207479953766} 01/27/2022 07:50:28 - INFO - codeparrot_training - Step 12743: {'lr': 0.00044070588153419715, 'samples': 2446848, 'steps': 12743, 'loss/train': 1.047818899154663} 01/27/2022 07:50:31 - INFO - codeparrot_training - Step 12744: {'lr': 0.00044069530104780486, 'samples': 2447040, 'steps': 12744, 'loss/train': 0.7268070727586746} 01/27/2022 07:50:34 - INFO - codeparrot_training - Step 12745: {'lr': 0.00044068471974453437, 'samples': 2447232, 'steps': 12745, 'loss/train': 1.3027430176734924} 01/27/2022 07:50:38 - INFO - codeparrot_training - Step 12746: {'lr': 0.0004406741376244312, 'samples': 2447424, 'steps': 12746, 'loss/train': 0.9888102114200592} 01/27/2022 07:50:41 - INFO - codeparrot_training - Step 12747: {'lr': 0.00044066355468754047, 'samples': 2447616, 'steps': 12747, 'loss/train': 1.9610477685928345} 01/27/2022 07:50:44 - INFO - codeparrot_training - Step 12748: {'lr': 0.00044065297093390764, 'samples': 2447808, 'steps': 12748, 'loss/train': 1.8013279438018799} 01/27/2022 07:50:47 - INFO - codeparrot_training - Step 12749: {'lr': 0.0004406423863635781, 'samples': 2448000, 'steps': 12749, 'loss/train': 0.8269000947475433} 01/27/2022 07:50:50 - INFO - codeparrot_training - Step 12750: {'lr': 0.00044063180097659704, 'samples': 2448192, 'steps': 12750, 'loss/train': 1.156616896390915} 01/27/2022 07:50:56 - INFO - codeparrot_training - Step 12751: {'lr': 0.00044062121477300985, 'samples': 2448384, 'steps': 12751, 'loss/train': 0.4612003415822983} 01/27/2022 07:50:59 - INFO - codeparrot_training - Step 12752: {'lr': 0.000440610627752862, 'samples': 2448576, 'steps': 12752, 'loss/train': 1.7202565670013428} 01/27/2022 07:51:02 - INFO - codeparrot_training - Step 12753: {'lr': 0.0004406000399161987, 'samples': 2448768, 'steps': 12753, 'loss/train': 1.2126311659812927} 01/27/2022 07:51:05 - INFO - codeparrot_training - Step 12754: {'lr': 0.00044058945126306535, 'samples': 2448960, 'steps': 12754, 'loss/train': 1.1712864339351654} 01/27/2022 07:51:08 - INFO - codeparrot_training - Step 12755: {'lr': 0.0004405788617935073, 'samples': 2449152, 'steps': 12755, 'loss/train': 0.7911227345466614} 01/27/2022 07:51:11 - INFO - codeparrot_training - Step 12756: {'lr': 0.0004405682715075699, 'samples': 2449344, 'steps': 12756, 'loss/train': 1.248079240322113} 01/27/2022 07:51:14 - INFO - codeparrot_training - Step 12757: {'lr': 0.0004405576804052985, 'samples': 2449536, 'steps': 12757, 'loss/train': 0.16618801653385162} 01/27/2022 07:51:18 - INFO - codeparrot_training - Step 12758: {'lr': 0.0004405470884867386, 'samples': 2449728, 'steps': 12758, 'loss/train': 0.8555069267749786} 01/27/2022 07:51:21 - INFO - codeparrot_training - Step 12759: {'lr': 0.00044053649575193543, 'samples': 2449920, 'steps': 12759, 'loss/train': 0.8532268702983856} 01/27/2022 
07:51:25 - INFO - codeparrot_training - Step 12760: {'lr': 0.00044052590220093445, 'samples': 2450112, 'steps': 12760, 'loss/train': 0.9817930161952972} 01/27/2022 07:51:28 - INFO - codeparrot_training - Step 12761: {'lr': 0.00044051530783378103, 'samples': 2450304, 'steps': 12761, 'loss/train': 0.9269329011440277} 01/27/2022 07:51:31 - INFO - codeparrot_training - Step 12762: {'lr': 0.0004405047126505204, 'samples': 2450496, 'steps': 12762, 'loss/train': 0.47947409749031067} 01/27/2022 07:51:35 - INFO - codeparrot_training - Step 12763: {'lr': 0.0004404941166511982, 'samples': 2450688, 'steps': 12763, 'loss/train': 0.8345717489719391} 01/27/2022 07:51:38 - INFO - codeparrot_training - Step 12764: {'lr': 0.00044048351983585966, 'samples': 2450880, 'steps': 12764, 'loss/train': 0.9322222173213959} 01/27/2022 07:51:41 - INFO - codeparrot_training - Step 12765: {'lr': 0.00044047292220455016, 'samples': 2451072, 'steps': 12765, 'loss/train': 0.9348075985908508} 01/27/2022 07:51:44 - INFO - codeparrot_training - Step 12766: {'lr': 0.0004404623237573152, 'samples': 2451264, 'steps': 12766, 'loss/train': 1.0971903204917908} 01/27/2022 07:51:47 - INFO - codeparrot_training - Step 12767: {'lr': 0.00044045172449420005, 'samples': 2451456, 'steps': 12767, 'loss/train': 0.34853053092956543} 01/27/2022 07:51:50 - INFO - codeparrot_training - Step 12768: {'lr': 0.00044044112441525026, 'samples': 2451648, 'steps': 12768, 'loss/train': 0.74464550614357} 01/27/2022 07:51:55 - INFO - codeparrot_training - Step 12769: {'lr': 0.0004404305235205112, 'samples': 2451840, 'steps': 12769, 'loss/train': 1.232123225927353} 01/27/2022 07:51:59 - INFO - codeparrot_training - Step 12770: {'lr': 0.0004404199218100281, 'samples': 2452032, 'steps': 12770, 'loss/train': 0.8542844653129578} 01/27/2022 07:52:02 - INFO - codeparrot_training - Step 12771: {'lr': 0.00044040931928384665, 'samples': 2452224, 'steps': 12771, 'loss/train': 1.432610660791397} 01/27/2022 07:52:05 - INFO - codeparrot_training - Step 12772: {'lr': 0.0004403987159420121, 'samples': 2452416, 'steps': 12772, 'loss/train': 1.0959198474884033} 01/27/2022 07:52:08 - INFO - codeparrot_training - Step 12773: {'lr': 0.0004403881117845699, 'samples': 2452608, 'steps': 12773, 'loss/train': 1.0691321790218353} 01/27/2022 07:52:11 - INFO - codeparrot_training - Step 12774: {'lr': 0.00044037750681156547, 'samples': 2452800, 'steps': 12774, 'loss/train': 0.3072492554783821} 01/27/2022 07:52:14 - INFO - codeparrot_training - Step 12775: {'lr': 0.0004403669010230443, 'samples': 2452992, 'steps': 12775, 'loss/train': 0.5703501999378204} 01/27/2022 07:52:17 - INFO - codeparrot_training - Step 12776: {'lr': 0.00044035629441905173, 'samples': 2453184, 'steps': 12776, 'loss/train': 0.765423059463501} 01/27/2022 07:52:22 - INFO - codeparrot_training - Step 12777: {'lr': 0.0004403456869996333, 'samples': 2453376, 'steps': 12777, 'loss/train': 0.15093760192394257} 01/27/2022 07:52:25 - INFO - codeparrot_training - Step 12778: {'lr': 0.0004403350787648343, 'samples': 2453568, 'steps': 12778, 'loss/train': 0.6913269460201263} 01/27/2022 07:52:28 - INFO - codeparrot_training - Step 12779: {'lr': 0.0004403244697147003, 'samples': 2453760, 'steps': 12779, 'loss/train': 0.7043568938970566} 01/27/2022 07:52:31 - INFO - codeparrot_training - Step 12780: {'lr': 0.00044031385984927675, 'samples': 2453952, 'steps': 12780, 'loss/train': 1.2234747111797333} 01/27/2022 07:52:34 - INFO - codeparrot_training - Step 12781: {'lr': 0.000440303249168609, 'samples': 2454144, 'steps': 12781, 
'loss/train': 0.9593128859996796} 01/27/2022 07:52:37 - INFO - codeparrot_training - Step 12782: {'lr': 0.0004402926376727425, 'samples': 2454336, 'steps': 12782, 'loss/train': 0.9505273103713989} 01/27/2022 07:52:41 - INFO - codeparrot_training - Step 12783: {'lr': 0.0004402820253617229, 'samples': 2454528, 'steps': 12783, 'loss/train': 1.4888754487037659} 01/27/2022 07:52:44 - INFO - codeparrot_training - Step 12784: {'lr': 0.0004402714122355955, 'samples': 2454720, 'steps': 12784, 'loss/train': 0.8949586451053619} 01/27/2022 07:52:47 - INFO - codeparrot_training - Step 12785: {'lr': 0.00044026079829440567, 'samples': 2454912, 'steps': 12785, 'loss/train': 1.2907841205596924} 01/27/2022 07:52:51 - INFO - codeparrot_training - Step 12786: {'lr': 0.0004402501835381991, 'samples': 2455104, 'steps': 12786, 'loss/train': 1.2825971245765686} 01/27/2022 07:52:54 - INFO - codeparrot_training - Step 12787: {'lr': 0.00044023956796702116, 'samples': 2455296, 'steps': 12787, 'loss/train': 1.1763282716274261} 01/27/2022 07:52:58 - INFO - codeparrot_training - Step 12788: {'lr': 0.0004402289515809172, 'samples': 2455488, 'steps': 12788, 'loss/train': 0.5482691824436188} 01/27/2022 07:53:01 - INFO - codeparrot_training - Step 12789: {'lr': 0.00044021833437993296, 'samples': 2455680, 'steps': 12789, 'loss/train': 0.6134434640407562} 01/27/2022 07:53:04 - INFO - codeparrot_training - Step 12790: {'lr': 0.0004402077163641137, 'samples': 2455872, 'steps': 12790, 'loss/train': 0.8692174851894379} 01/27/2022 07:53:07 - INFO - codeparrot_training - Step 12791: {'lr': 0.000440197097533505, 'samples': 2456064, 'steps': 12791, 'loss/train': 0.5052824467420578} 01/27/2022 07:53:10 - INFO - codeparrot_training - Step 12792: {'lr': 0.00044018647788815235, 'samples': 2456256, 'steps': 12792, 'loss/train': 1.2591506838798523} 01/27/2022 07:53:13 - INFO - codeparrot_training - Step 12793: {'lr': 0.00044017585742810124, 'samples': 2456448, 'steps': 12793, 'loss/train': 1.0895857214927673} 01/27/2022 07:53:16 - INFO - codeparrot_training - Step 12794: {'lr': 0.0004401652361533971, 'samples': 2456640, 'steps': 12794, 'loss/train': 0.8629805445671082} 01/27/2022 07:53:22 - INFO - codeparrot_training - Step 12795: {'lr': 0.00044015461406408544, 'samples': 2456832, 'steps': 12795, 'loss/train': 0.8660799264907837} 01/27/2022 07:53:25 - INFO - codeparrot_training - Step 12796: {'lr': 0.00044014399116021184, 'samples': 2457024, 'steps': 12796, 'loss/train': 0.6915447413921356} 01/27/2022 07:53:28 - INFO - codeparrot_training - Step 12797: {'lr': 0.00044013336744182176, 'samples': 2457216, 'steps': 12797, 'loss/train': 0.7593064606189728} 01/27/2022 07:53:31 - INFO - codeparrot_training - Step 12798: {'lr': 0.0004401227429089607, 'samples': 2457408, 'steps': 12798, 'loss/train': 1.1437808275222778} 01/27/2022 07:53:34 - INFO - codeparrot_training - Step 12799: {'lr': 0.00044011211756167425, 'samples': 2457600, 'steps': 12799, 'loss/train': 1.6389650702476501} 01/27/2022 07:53:38 - INFO - codeparrot_training - Step 12800: {'lr': 0.0004401014914000078, 'samples': 2457792, 'steps': 12800, 'loss/train': 0.44413724541664124} 01/27/2022 07:53:41 - INFO - codeparrot_training - Step 12801: {'lr': 0.00044009086442400684, 'samples': 2457984, 'steps': 12801, 'loss/train': 1.2622810900211334} 01/27/2022 07:53:44 - INFO - codeparrot_training - Step 12802: {'lr': 0.0004400802366337171, 'samples': 2458176, 'steps': 12802, 'loss/train': 0.6383539885282516} 01/27/2022 07:53:47 - INFO - codeparrot_training - Step 12803: {'lr': 
0.00044006960802918393, 'samples': 2458368, 'steps': 12803, 'loss/train': 1.0793473720550537} 01/27/2022 07:53:51 - INFO - codeparrot_training - Step 12804: {'lr': 0.0004400589786104529, 'samples': 2458560, 'steps': 12804, 'loss/train': 0.8669790923595428} 01/27/2022 07:53:55 - INFO - codeparrot_training - Step 12805: {'lr': 0.0004400483483775696, 'samples': 2458752, 'steps': 12805, 'loss/train': 0.9533829689025879} 01/27/2022 07:53:58 - INFO - codeparrot_training - Step 12806: {'lr': 0.00044003771733057943, 'samples': 2458944, 'steps': 12806, 'loss/train': 1.0194848477840424} 01/27/2022 07:54:01 - INFO - codeparrot_training - Step 12807: {'lr': 0.0004400270854695281, 'samples': 2459136, 'steps': 12807, 'loss/train': 0.8118232190608978} 01/27/2022 07:54:04 - INFO - codeparrot_training - Step 12808: {'lr': 0.0004400164527944611, 'samples': 2459328, 'steps': 12808, 'loss/train': 0.6209758222103119} 01/27/2022 07:54:07 - INFO - codeparrot_training - Step 12809: {'lr': 0.0004400058193054239, 'samples': 2459520, 'steps': 12809, 'loss/train': 0.6365658044815063} 01/27/2022 07:54:10 - INFO - codeparrot_training - Step 12810: {'lr': 0.0004399951850024621, 'samples': 2459712, 'steps': 12810, 'loss/train': 1.413355439901352} 01/27/2022 07:54:13 - INFO - codeparrot_training - Step 12811: {'lr': 0.0004399845498856213, 'samples': 2459904, 'steps': 12811, 'loss/train': 0.5993366092443466} 01/27/2022 07:54:17 - INFO - codeparrot_training - Step 12812: {'lr': 0.000439973913954947, 'samples': 2460096, 'steps': 12812, 'loss/train': 0.7118782550096512} 01/27/2022 07:54:21 - INFO - codeparrot_training - Step 12813: {'lr': 0.0004399632772104848, 'samples': 2460288, 'steps': 12813, 'loss/train': 0.76853808760643} 01/27/2022 07:54:24 - INFO - codeparrot_training - Step 12814: {'lr': 0.00043995263965228016, 'samples': 2460480, 'steps': 12814, 'loss/train': 0.719673290848732} 01/27/2022 07:54:27 - INFO - codeparrot_training - Step 12815: {'lr': 0.00043994200128037877, 'samples': 2460672, 'steps': 12815, 'loss/train': 0.6863805055618286} 01/27/2022 07:54:30 - INFO - codeparrot_training - Step 12816: {'lr': 0.0004399313620948262, 'samples': 2460864, 'steps': 12816, 'loss/train': 5.56387460231781} 01/27/2022 07:54:34 - INFO - codeparrot_training - Step 12817: {'lr': 0.00043992072209566793, 'samples': 2461056, 'steps': 12817, 'loss/train': 0.9817400872707367} 01/27/2022 07:54:37 - INFO - codeparrot_training - Step 12818: {'lr': 0.0004399100812829496, 'samples': 2461248, 'steps': 12818, 'loss/train': 0.4171940088272095} 01/27/2022 07:54:40 - INFO - codeparrot_training - Step 12819: {'lr': 0.00043989943965671685, 'samples': 2461440, 'steps': 12819, 'loss/train': 0.9224005937576294} 01/27/2022 07:54:43 - INFO - codeparrot_training - Step 12820: {'lr': 0.00043988879721701515, 'samples': 2461632, 'steps': 12820, 'loss/train': 1.0154407918453217} 01/27/2022 07:54:46 - INFO - codeparrot_training - Step 12821: {'lr': 0.0004398781539638901, 'samples': 2461824, 'steps': 12821, 'loss/train': 0.8828924596309662} 01/27/2022 07:54:51 - INFO - codeparrot_training - Step 12822: {'lr': 0.00043986750989738737, 'samples': 2462016, 'steps': 12822, 'loss/train': 0.5140145570039749} 01/27/2022 07:54:54 - INFO - codeparrot_training - Step 12823: {'lr': 0.0004398568650175525, 'samples': 2462208, 'steps': 12823, 'loss/train': 0.952353447675705} 01/27/2022 07:54:57 - INFO - codeparrot_training - Step 12824: {'lr': 0.00043984621932443115, 'samples': 2462400, 'steps': 12824, 'loss/train': 0.7543449103832245} 01/27/2022 07:55:00 - INFO - 
codeparrot_training - Step 12825: {'lr': 0.0004398355728180689, 'samples': 2462592, 'steps': 12825, 'loss/train': 0.7329846024513245} 01/27/2022 07:55:03 - INFO - codeparrot_training - Step 12826: {'lr': 0.0004398249254985113, 'samples': 2462784, 'steps': 12826, 'loss/train': 0.8654396831989288} 01/27/2022 07:55:06 - INFO - codeparrot_training - Step 12827: {'lr': 0.00043981427736580395, 'samples': 2462976, 'steps': 12827, 'loss/train': 1.2827099561691284} 01/27/2022 07:55:10 - INFO - codeparrot_training - Step 12828: {'lr': 0.00043980362841999253, 'samples': 2463168, 'steps': 12828, 'loss/train': 0.8824068009853363} 01/27/2022 07:55:13 - INFO - codeparrot_training - Step 12829: {'lr': 0.0004397929786611227, 'samples': 2463360, 'steps': 12829, 'loss/train': 0.575408086180687} 01/27/2022 07:55:19 - INFO - codeparrot_training - Step 12830: {'lr': 0.00043978232808923996, 'samples': 2463552, 'steps': 12830, 'loss/train': 0.8329132497310638} 01/27/2022 07:55:23 - INFO - codeparrot_training - Step 12831: {'lr': 0.00043977167670439, 'samples': 2463744, 'steps': 12831, 'loss/train': 0.7625610530376434} 01/27/2022 07:55:26 - INFO - codeparrot_training - Step 12832: {'lr': 0.0004397610245066184, 'samples': 2463936, 'steps': 12832, 'loss/train': 0.8219465017318726} 01/27/2022 07:55:29 - INFO - codeparrot_training - Step 12833: {'lr': 0.00043975037149597085, 'samples': 2464128, 'steps': 12833, 'loss/train': 0.37692488729953766} 01/27/2022 07:55:32 - INFO - codeparrot_training - Step 12834: {'lr': 0.00043973971767249297, 'samples': 2464320, 'steps': 12834, 'loss/train': 0.9098158478736877} 01/27/2022 07:55:35 - INFO - codeparrot_training - Step 12835: {'lr': 0.0004397290630362304, 'samples': 2464512, 'steps': 12835, 'loss/train': 0.8475042879581451} 01/27/2022 07:55:38 - INFO - codeparrot_training - Step 12836: {'lr': 0.0004397184075872288, 'samples': 2464704, 'steps': 12836, 'loss/train': 0.9899342358112335} 01/27/2022 07:55:42 - INFO - codeparrot_training - Step 12837: {'lr': 0.00043970775132553375, 'samples': 2464896, 'steps': 12837, 'loss/train': 1.0326098799705505} 01/27/2022 07:55:45 - INFO - codeparrot_training - Step 12838: {'lr': 0.00043969709425119085, 'samples': 2465088, 'steps': 12838, 'loss/train': 1.0483671426773071} 01/27/2022 07:55:49 - INFO - codeparrot_training - Step 12839: {'lr': 0.000439686436364246, 'samples': 2465280, 'steps': 12839, 'loss/train': 0.837760716676712} 01/27/2022 07:55:52 - INFO - codeparrot_training - Step 12840: {'lr': 0.00043967577766474455, 'samples': 2465472, 'steps': 12840, 'loss/train': 0.9787591695785522} 01/27/2022 07:55:55 - INFO - codeparrot_training - Step 12841: {'lr': 0.00043966511815273233, 'samples': 2465664, 'steps': 12841, 'loss/train': 1.5712165832519531} 01/27/2022 07:55:59 - INFO - codeparrot_training - Step 12842: {'lr': 0.00043965445782825495, 'samples': 2465856, 'steps': 12842, 'loss/train': 1.2777431309223175} 01/27/2022 07:56:02 - INFO - codeparrot_training - Step 12843: {'lr': 0.00043964379669135815, 'samples': 2466048, 'steps': 12843, 'loss/train': 0.545013964176178} 01/27/2022 07:56:05 - INFO - codeparrot_training - Step 12844: {'lr': 0.00043963313474208753, 'samples': 2466240, 'steps': 12844, 'loss/train': 0.7362724989652634} 01/27/2022 07:56:08 - INFO - codeparrot_training - Step 12845: {'lr': 0.0004396224719804888, 'samples': 2466432, 'steps': 12845, 'loss/train': 1.0646477043628693} 01/27/2022 07:56:11 - INFO - codeparrot_training - Step 12846: {'lr': 0.0004396118084066075, 'samples': 2466624, 'steps': 12846, 'loss/train': 
0.43582865595817566} 01/27/2022 07:56:14 - INFO - codeparrot_training - Step 12847: {'lr': 0.00043960114402048957, 'samples': 2466816, 'steps': 12847, 'loss/train': 0.9550872445106506} 01/27/2022 07:56:21 - INFO - codeparrot_training - Step 12848: {'lr': 0.0004395904788221805, 'samples': 2467008, 'steps': 12848, 'loss/train': 1.0264510810375214} 01/27/2022 07:56:24 - INFO - codeparrot_training - Step 12849: {'lr': 0.00043957981281172597, 'samples': 2467200, 'steps': 12849, 'loss/train': 0.9688723683357239} 01/27/2022 07:56:27 - INFO - codeparrot_training - Step 12850: {'lr': 0.00043956914598917177, 'samples': 2467392, 'steps': 12850, 'loss/train': 0.9877005815505981} 01/27/2022 07:56:30 - INFO - codeparrot_training - Step 12851: {'lr': 0.00043955847835456353, 'samples': 2467584, 'steps': 12851, 'loss/train': 0.7149689644575119} 01/27/2022 07:56:33 - INFO - codeparrot_training - Step 12852: {'lr': 0.00043954780990794695, 'samples': 2467776, 'steps': 12852, 'loss/train': 0.6059526354074478} 01/27/2022 07:56:36 - INFO - codeparrot_training - Step 12853: {'lr': 0.0004395371406493677, 'samples': 2467968, 'steps': 12853, 'loss/train': 1.2772019505500793} 01/27/2022 07:56:39 - INFO - codeparrot_training - Step 12854: {'lr': 0.0004395264705788716, 'samples': 2468160, 'steps': 12854, 'loss/train': 1.1121181547641754} 01/27/2022 07:56:43 - INFO - codeparrot_training - Step 12855: {'lr': 0.00043951579969650424, 'samples': 2468352, 'steps': 12855, 'loss/train': 0.7687987089157104} 01/27/2022 07:56:46 - INFO - codeparrot_training - Step 12856: {'lr': 0.00043950512800231136, 'samples': 2468544, 'steps': 12856, 'loss/train': 0.6196648925542831} 01/27/2022 07:56:50 - INFO - codeparrot_training - Step 12857: {'lr': 0.0004394944554963387, 'samples': 2468736, 'steps': 12857, 'loss/train': 0.9155126810073853} 01/27/2022 07:56:54 - INFO - codeparrot_training - Step 12858: {'lr': 0.000439483782178632, 'samples': 2468928, 'steps': 12858, 'loss/train': 0.900991827249527} 01/27/2022 07:56:57 - INFO - codeparrot_training - Step 12859: {'lr': 0.0004394731080492369, 'samples': 2469120, 'steps': 12859, 'loss/train': 0.9401300847530365} 01/27/2022 07:57:00 - INFO - codeparrot_training - Step 12860: {'lr': 0.0004394624331081992, 'samples': 2469312, 'steps': 12860, 'loss/train': 0.7567673027515411} 01/27/2022 07:57:03 - INFO - codeparrot_training - Step 12861: {'lr': 0.00043945175735556454, 'samples': 2469504, 'steps': 12861, 'loss/train': 0.8954168558120728} 01/27/2022 07:57:06 - INFO - codeparrot_training - Step 12862: {'lr': 0.0004394410807913788, 'samples': 2469696, 'steps': 12862, 'loss/train': 0.771345466375351} 01/27/2022 07:57:09 - INFO - codeparrot_training - Step 12863: {'lr': 0.0004394304034156875, 'samples': 2469888, 'steps': 12863, 'loss/train': 0.8790397346019745} 01/27/2022 07:57:12 - INFO - codeparrot_training - Step 12864: {'lr': 0.00043941972522853665, 'samples': 2470080, 'steps': 12864, 'loss/train': 0.9239480495452881} 01/27/2022 07:57:19 - INFO - codeparrot_training - Step 12865: {'lr': 0.00043940904622997176, 'samples': 2470272, 'steps': 12865, 'loss/train': 0.7167223691940308} 01/27/2022 07:57:22 - INFO - codeparrot_training - Step 12866: {'lr': 0.00043939836642003865, 'samples': 2470464, 'steps': 12866, 'loss/train': 0.6133832931518555} 01/27/2022 07:57:25 - INFO - codeparrot_training - Step 12867: {'lr': 0.0004393876857987831, 'samples': 2470656, 'steps': 12867, 'loss/train': 0.934740275144577} 01/27/2022 07:57:28 - INFO - codeparrot_training - Step 12868: {'lr': 0.0004393770043662508, 'samples': 
2470848, 'steps': 12868, 'loss/train': 0.9289538562297821} 01/27/2022 07:57:31 - INFO - codeparrot_training - Step 12869: {'lr': 0.0004393663221224876, 'samples': 2471040, 'steps': 12869, 'loss/train': 0.7103485018014908} 01/27/2022 07:57:34 - INFO - codeparrot_training - Step 12870: {'lr': 0.00043935563906753923, 'samples': 2471232, 'steps': 12870, 'loss/train': 0.922269880771637} 01/27/2022 07:57:38 - INFO - codeparrot_training - Step 12871: {'lr': 0.0004393449552014514, 'samples': 2471424, 'steps': 12871, 'loss/train': 0.9400824308395386} 01/27/2022 07:57:41 - INFO - codeparrot_training - Step 12872: {'lr': 0.00043933427052426986, 'samples': 2471616, 'steps': 12872, 'loss/train': 1.473177284002304} 01/27/2022 07:57:44 - INFO - codeparrot_training - Step 12873: {'lr': 0.00043932358503604054, 'samples': 2471808, 'steps': 12873, 'loss/train': 0.7742366194725037} 01/27/2022 07:57:48 - INFO - codeparrot_training - Step 12874: {'lr': 0.000439312898736809, 'samples': 2472000, 'steps': 12874, 'loss/train': 0.9969524145126343} 01/27/2022 07:57:51 - INFO - codeparrot_training - Step 12875: {'lr': 0.00043930221162662115, 'samples': 2472192, 'steps': 12875, 'loss/train': 0.873320996761322} 01/27/2022 07:57:55 - INFO - codeparrot_training - Step 12876: {'lr': 0.0004392915237055227, 'samples': 2472384, 'steps': 12876, 'loss/train': 0.7630592286586761} 01/27/2022 07:57:58 - INFO - codeparrot_training - Step 12877: {'lr': 0.00043928083497355954, 'samples': 2472576, 'steps': 12877, 'loss/train': 1.2785763144493103} 01/27/2022 07:58:01 - INFO - codeparrot_training - Step 12878: {'lr': 0.0004392701454307773, 'samples': 2472768, 'steps': 12878, 'loss/train': 0.12591353058815002} 01/27/2022 07:58:04 - INFO - codeparrot_training - Step 12879: {'lr': 0.00043925945507722195, 'samples': 2472960, 'steps': 12879, 'loss/train': 1.3353896141052246} 01/27/2022 07:58:07 - INFO - codeparrot_training - Step 12880: {'lr': 0.0004392487639129391, 'samples': 2473152, 'steps': 12880, 'loss/train': 0.7577363848686218} 01/27/2022 07:58:10 - INFO - codeparrot_training - Step 12881: {'lr': 0.0004392380719379747, 'samples': 2473344, 'steps': 12881, 'loss/train': 0.9548872411251068} 01/27/2022 07:58:13 - INFO - codeparrot_training - Step 12882: {'lr': 0.0004392273791523744, 'samples': 2473536, 'steps': 12882, 'loss/train': 0.7637374699115753} 01/27/2022 07:58:18 - INFO - codeparrot_training - Step 12883: {'lr': 0.0004392166855561842, 'samples': 2473728, 'steps': 12883, 'loss/train': 0.4037669599056244} 01/27/2022 07:58:21 - INFO - codeparrot_training - Step 12884: {'lr': 0.0004392059911494498, 'samples': 2473920, 'steps': 12884, 'loss/train': 0.736060380935669} 01/27/2022 07:58:24 - INFO - codeparrot_training - Step 12885: {'lr': 0.00043919529593221696, 'samples': 2474112, 'steps': 12885, 'loss/train': 0.4308176636695862} 01/27/2022 07:58:28 - INFO - codeparrot_training - Step 12886: {'lr': 0.00043918459990453156, 'samples': 2474304, 'steps': 12886, 'loss/train': 0.8025737106800079} 01/27/2022 07:58:31 - INFO - codeparrot_training - Step 12887: {'lr': 0.00043917390306643945, 'samples': 2474496, 'steps': 12887, 'loss/train': 0.8422924876213074} 01/27/2022 07:58:34 - INFO - codeparrot_training - Step 12888: {'lr': 0.0004391632054179864, 'samples': 2474688, 'steps': 12888, 'loss/train': 0.9047326147556305} 01/27/2022 07:58:37 - INFO - codeparrot_training - Step 12889: {'lr': 0.00043915250695921815, 'samples': 2474880, 'steps': 12889, 'loss/train': 0.8124623000621796} 01/27/2022 07:58:40 - INFO - codeparrot_training - Step 12890: 
{'lr': 0.00043914180769018073, 'samples': 2475072, 'steps': 12890, 'loss/train': 0.5016563236713409} 01/27/2022 07:58:46 - INFO - codeparrot_training - Step 12891: {'lr': 0.0004391311076109198, 'samples': 2475264, 'steps': 12891, 'loss/train': 0.9376819431781769} 01/27/2022 07:58:49 - INFO - codeparrot_training - Step 12892: {'lr': 0.00043912040672148135, 'samples': 2475456, 'steps': 12892, 'loss/train': 0.7747671604156494} 01/27/2022 07:58:53 - INFO - codeparrot_training - Step 12893: {'lr': 0.00043910970502191105, 'samples': 2475648, 'steps': 12893, 'loss/train': 0.625015452504158} 01/27/2022 07:58:56 - INFO - codeparrot_training - Step 12894: {'lr': 0.00043909900251225476, 'samples': 2475840, 'steps': 12894, 'loss/train': 0.9807164669036865} 01/27/2022 07:58:59 - INFO - codeparrot_training - Step 12895: {'lr': 0.00043908829919255855, 'samples': 2476032, 'steps': 12895, 'loss/train': 0.557600811123848} 01/27/2022 07:59:02 - INFO - codeparrot_training - Step 12896: {'lr': 0.00043907759506286797, 'samples': 2476224, 'steps': 12896, 'loss/train': 1.0605399906635284} 01/27/2022 07:59:05 - INFO - codeparrot_training - Step 12897: {'lr': 0.0004390668901232291, 'samples': 2476416, 'steps': 12897, 'loss/train': 1.2628551721572876} 01/27/2022 07:59:08 - INFO - codeparrot_training - Step 12898: {'lr': 0.00043905618437368766, 'samples': 2476608, 'steps': 12898, 'loss/train': 0.796659529209137} 01/27/2022 07:59:11 - INFO - codeparrot_training - Step 12899: {'lr': 0.0004390454778142896, 'samples': 2476800, 'steps': 12899, 'loss/train': 1.1556319892406464} 01/27/2022 07:59:16 - INFO - codeparrot_training - Step 12900: {'lr': 0.00043903477044508066, 'samples': 2476992, 'steps': 12900, 'loss/train': 1.1617566347122192} 01/27/2022 07:59:19 - INFO - codeparrot_training - Step 12901: {'lr': 0.0004390240622661069, 'samples': 2477184, 'steps': 12901, 'loss/train': 0.8270131945610046} 01/27/2022 07:59:22 - INFO - codeparrot_training - Step 12902: {'lr': 0.000439013353277414, 'samples': 2477376, 'steps': 12902, 'loss/train': 1.1491713523864746} 01/27/2022 07:59:25 - INFO - codeparrot_training - Step 12903: {'lr': 0.00043900264347904796, 'samples': 2477568, 'steps': 12903, 'loss/train': 0.7662493586540222} 01/27/2022 07:59:29 - INFO - codeparrot_training - Step 12904: {'lr': 0.00043899193287105456, 'samples': 2477760, 'steps': 12904, 'loss/train': 0.4385281205177307} 01/27/2022 07:59:32 - INFO - codeparrot_training - Step 12905: {'lr': 0.0004389812214534798, 'samples': 2477952, 'steps': 12905, 'loss/train': 0.663462907075882} 01/27/2022 07:59:35 - INFO - codeparrot_training - Step 12906: {'lr': 0.00043897050922636947, 'samples': 2478144, 'steps': 12906, 'loss/train': 0.47289419174194336} 01/27/2022 07:59:38 - INFO - codeparrot_training - Step 12907: {'lr': 0.00043895979618976944, 'samples': 2478336, 'steps': 12907, 'loss/train': 0.6567566245794296} 01/27/2022 07:59:41 - INFO - codeparrot_training - Step 12908: {'lr': 0.00043894908234372564, 'samples': 2478528, 'steps': 12908, 'loss/train': 0.38387905061244965} 01/27/2022 07:59:45 - INFO - codeparrot_training - Step 12909: {'lr': 0.00043893836768828405, 'samples': 2478720, 'steps': 12909, 'loss/train': 1.1053133010864258} 01/27/2022 07:59:49 - INFO - codeparrot_training - Step 12910: {'lr': 0.0004389276522234904, 'samples': 2478912, 'steps': 12910, 'loss/train': 1.101422756910324} 01/27/2022 07:59:52 - INFO - codeparrot_training - Step 12911: {'lr': 0.00043891693594939077, 'samples': 2479104, 'steps': 12911, 'loss/train': 0.7174773216247559} 01/27/2022 07:59:55 
- INFO - codeparrot_training - Step 12912: {'lr': 0.0004389062188660309, 'samples': 2479296, 'steps': 12912, 'loss/train': 1.4451311230659485} 01/27/2022 07:59:58 - INFO - codeparrot_training - Step 12913: {'lr': 0.00043889550097345675, 'samples': 2479488, 'steps': 12913, 'loss/train': 0.4571616053581238} 01/27/2022 08:00:01 - INFO - codeparrot_training - Step 12914: {'lr': 0.0004388847822717144, 'samples': 2479680, 'steps': 12914, 'loss/train': 0.5964820235967636} 01/27/2022 08:00:04 - INFO - codeparrot_training - Step 12915: {'lr': 0.0004388740627608495, 'samples': 2479872, 'steps': 12915, 'loss/train': 0.8185421526432037} 01/27/2022 08:00:07 - INFO - codeparrot_training - Step 12916: {'lr': 0.0004388633424409081, 'samples': 2480064, 'steps': 12916, 'loss/train': 0.9026091992855072} 01/27/2022 08:00:11 - INFO - codeparrot_training - Step 12917: {'lr': 0.0004388526213119361, 'samples': 2480256, 'steps': 12917, 'loss/train': 0.8098459839820862} 01/27/2022 08:00:15 - INFO - codeparrot_training - Step 12918: {'lr': 0.00043884189937397946, 'samples': 2480448, 'steps': 12918, 'loss/train': 0.7406044900417328} 01/27/2022 08:00:18 - INFO - codeparrot_training - Step 12919: {'lr': 0.00043883117662708404, 'samples': 2480640, 'steps': 12919, 'loss/train': 0.5403788834810257} 01/27/2022 08:00:21 - INFO - codeparrot_training - Step 12920: {'lr': 0.0004388204530712959, 'samples': 2480832, 'steps': 12920, 'loss/train': 1.009154498577118} 01/27/2022 08:00:25 - INFO - codeparrot_training - Step 12921: {'lr': 0.00043880972870666084, 'samples': 2481024, 'steps': 12921, 'loss/train': 0.7362146079540253} 01/27/2022 08:00:28 - INFO - codeparrot_training - Step 12922: {'lr': 0.0004387990035332249, 'samples': 2481216, 'steps': 12922, 'loss/train': 0.9145942032337189} 01/27/2022 08:00:31 - INFO - codeparrot_training - Step 12923: {'lr': 0.00043878827755103404, 'samples': 2481408, 'steps': 12923, 'loss/train': 0.8371401429176331} 01/27/2022 08:00:34 - INFO - codeparrot_training - Step 12924: {'lr': 0.00043877755076013406, 'samples': 2481600, 'steps': 12924, 'loss/train': 0.7749732434749603} 01/27/2022 08:00:37 - INFO - codeparrot_training - Step 12925: {'lr': 0.00043876682316057095, 'samples': 2481792, 'steps': 12925, 'loss/train': 0.9326722919940948} 01/27/2022 08:00:40 - INFO - codeparrot_training - Step 12926: {'lr': 0.0004387560947523908, 'samples': 2481984, 'steps': 12926, 'loss/train': 1.006930947303772} 01/27/2022 08:00:47 - INFO - codeparrot_training - Step 12927: {'lr': 0.0004387453655356394, 'samples': 2482176, 'steps': 12927, 'loss/train': 0.6037773191928864} 01/27/2022 08:00:50 - INFO - codeparrot_training - Step 12928: {'lr': 0.00043873463551036284, 'samples': 2482368, 'steps': 12928, 'loss/train': 1.1379228830337524} 01/27/2022 08:00:53 - INFO - codeparrot_training - Step 12929: {'lr': 0.000438723904676607, 'samples': 2482560, 'steps': 12929, 'loss/train': 0.6402118653059006} 01/27/2022 08:00:56 - INFO - codeparrot_training - Step 12930: {'lr': 0.0004387131730344179, 'samples': 2482752, 'steps': 12930, 'loss/train': 0.5758780092000961} 01/27/2022 08:00:59 - INFO - codeparrot_training - Step 12931: {'lr': 0.00043870244058384145, 'samples': 2482944, 'steps': 12931, 'loss/train': 0.5180577635765076} 01/27/2022 08:01:03 - INFO - codeparrot_training - Step 12932: {'lr': 0.0004386917073249237, 'samples': 2483136, 'steps': 12932, 'loss/train': 0.7147423177957535} 01/27/2022 08:01:06 - INFO - codeparrot_training - Step 12933: {'lr': 0.00043868097325771064, 'samples': 2483328, 'steps': 12933, 'loss/train': 
0.4976092278957367} 01/27/2022 08:01:09 - INFO - codeparrot_training - Step 12934: {'lr': 0.0004386702383822482, 'samples': 2483520, 'steps': 12934, 'loss/train': 0.3906601667404175} 01/27/2022 08:01:12 - INFO - codeparrot_training - Step 12935: {'lr': 0.00043865950269858224, 'samples': 2483712, 'steps': 12935, 'loss/train': 0.6380659639835358} 01/27/2022 08:01:17 - INFO - codeparrot_training - Step 12936: {'lr': 0.000438648766206759, 'samples': 2483904, 'steps': 12936, 'loss/train': 0.8618307709693909} 01/27/2022 08:01:20 - INFO - codeparrot_training - Step 12937: {'lr': 0.0004386380289068243, 'samples': 2484096, 'steps': 12937, 'loss/train': 0.10331983119249344} 01/27/2022 08:01:23 - INFO - codeparrot_training - Step 12938: {'lr': 0.0004386272907988242, 'samples': 2484288, 'steps': 12938, 'loss/train': 0.6885059773921967} 01/27/2022 08:01:26 - INFO - codeparrot_training - Step 12939: {'lr': 0.0004386165518828047, 'samples': 2484480, 'steps': 12939, 'loss/train': 0.5429956018924713} 01/27/2022 08:01:29 - INFO - codeparrot_training - Step 12940: {'lr': 0.0004386058121588117, 'samples': 2484672, 'steps': 12940, 'loss/train': 0.7980629503726959} 01/27/2022 08:01:32 - INFO - codeparrot_training - Step 12941: {'lr': 0.0004385950716268914, 'samples': 2484864, 'steps': 12941, 'loss/train': 0.5989046394824982} 01/27/2022 08:01:36 - INFO - codeparrot_training - Step 12942: {'lr': 0.0004385843302870896, 'samples': 2485056, 'steps': 12942, 'loss/train': 1.477044939994812} 01/27/2022 08:01:39 - INFO - codeparrot_training - Step 12943: {'lr': 0.0004385735881394525, 'samples': 2485248, 'steps': 12943, 'loss/train': 0.8627152740955353} 01/27/2022 08:01:43 - INFO - codeparrot_training - Step 12944: {'lr': 0.00043856284518402594, 'samples': 2485440, 'steps': 12944, 'loss/train': 0.8811114728450775} 01/27/2022 08:01:46 - INFO - codeparrot_training - Step 12945: {'lr': 0.00043855210142085613, 'samples': 2485632, 'steps': 12945, 'loss/train': 0.8928857445716858} 01/27/2022 08:01:49 - INFO - codeparrot_training - Step 12946: {'lr': 0.00043854135684998893, 'samples': 2485824, 'steps': 12946, 'loss/train': 0.7336706668138504} 01/27/2022 08:01:53 - INFO - codeparrot_training - Step 12947: {'lr': 0.0004385306114714704, 'samples': 2486016, 'steps': 12947, 'loss/train': 0.8331993520259857} 01/27/2022 08:01:56 - INFO - codeparrot_training - Step 12948: {'lr': 0.0004385198652853466, 'samples': 2486208, 'steps': 12948, 'loss/train': 0.8189904391765594} 01/27/2022 08:01:59 - INFO - codeparrot_training - Step 12949: {'lr': 0.00043850911829166364, 'samples': 2486400, 'steps': 12949, 'loss/train': 0.6201857328414917} 01/27/2022 08:02:02 - INFO - codeparrot_training - Step 12950: {'lr': 0.00043849837049046735, 'samples': 2486592, 'steps': 12950, 'loss/train': 0.8073696792125702} 01/27/2022 08:02:05 - INFO - codeparrot_training - Step 12951: {'lr': 0.000438487621881804, 'samples': 2486784, 'steps': 12951, 'loss/train': 1.2384983897209167} 01/27/2022 08:02:08 - INFO - codeparrot_training - Step 12952: {'lr': 0.00043847687246571955, 'samples': 2486976, 'steps': 12952, 'loss/train': 0.9999567568302155} 01/27/2022 08:02:14 - INFO - codeparrot_training - Step 12953: {'lr': 0.0004384661222422599, 'samples': 2487168, 'steps': 12953, 'loss/train': 0.9459817707538605} 01/27/2022 08:02:17 - INFO - codeparrot_training - Step 12954: {'lr': 0.00043845537121147126, 'samples': 2487360, 'steps': 12954, 'loss/train': 0.6708146184682846} 01/27/2022 08:02:20 - INFO - codeparrot_training - Step 12955: {'lr': 0.00043844461937339976, 'samples': 
2487552, 'steps': 12955, 'loss/train': 0.707727313041687} 01/27/2022 08:02:24 - INFO - codeparrot_training - Step 12956: {'lr': 0.00043843386672809127, 'samples': 2487744, 'steps': 12956, 'loss/train': 1.1350246667861938} 01/27/2022 08:02:27 - INFO - codeparrot_training - Step 12957: {'lr': 0.00043842311327559194, 'samples': 2487936, 'steps': 12957, 'loss/train': 0.6175381690263748} 01/27/2022 08:02:30 - INFO - codeparrot_training - Step 12958: {'lr': 0.0004384123590159478, 'samples': 2488128, 'steps': 12958, 'loss/train': 0.6743222773075104} 01/27/2022 08:02:33 - INFO - codeparrot_training - Step 12959: {'lr': 0.000438401603949205, 'samples': 2488320, 'steps': 12959, 'loss/train': 0.8720486462116241} 01/27/2022 08:02:36 - INFO - codeparrot_training - Step 12960: {'lr': 0.0004383908480754095, 'samples': 2488512, 'steps': 12960, 'loss/train': 0.9281177222728729} 01/27/2022 08:02:39 - INFO - codeparrot_training - Step 12961: {'lr': 0.0004383800913946074, 'samples': 2488704, 'steps': 12961, 'loss/train': 0.902748316526413} 01/27/2022 08:02:44 - INFO - codeparrot_training - Step 12962: {'lr': 0.00043836933390684486, 'samples': 2488896, 'steps': 12962, 'loss/train': 0.6661141812801361} 01/27/2022 08:02:47 - INFO - codeparrot_training - Step 12963: {'lr': 0.0004383585756121679, 'samples': 2489088, 'steps': 12963, 'loss/train': 0.9056498408317566} 01/27/2022 08:02:50 - INFO - codeparrot_training - Step 12964: {'lr': 0.00043834781651062263, 'samples': 2489280, 'steps': 12964, 'loss/train': 0.9855388104915619} 01/27/2022 08:02:53 - INFO - codeparrot_training - Step 12965: {'lr': 0.00043833705660225507, 'samples': 2489472, 'steps': 12965, 'loss/train': 0.6731363385915756} 01/27/2022 08:02:57 - INFO - codeparrot_training - Step 12966: {'lr': 0.0004383262958871114, 'samples': 2489664, 'steps': 12966, 'loss/train': 0.9893546104431152} 01/27/2022 08:03:00 - INFO - codeparrot_training - Step 12967: {'lr': 0.0004383155343652377, 'samples': 2489856, 'steps': 12967, 'loss/train': 0.1898704245686531} 01/27/2022 08:03:03 - INFO - codeparrot_training - Step 12968: {'lr': 0.00043830477203668, 'samples': 2490048, 'steps': 12968, 'loss/train': 0.9152245223522186} 01/27/2022 08:03:06 - INFO - codeparrot_training - Step 12969: {'lr': 0.00043829400890148446, 'samples': 2490240, 'steps': 12969, 'loss/train': 0.9615536034107208} 01/27/2022 08:03:09 - INFO - codeparrot_training - Step 12970: {'lr': 0.0004382832449596972, 'samples': 2490432, 'steps': 12970, 'loss/train': 0.8831782042980194} 01/27/2022 08:03:15 - INFO - codeparrot_training - Step 12971: {'lr': 0.0004382724802113643, 'samples': 2490624, 'steps': 12971, 'loss/train': 1.165023922920227} 01/27/2022 08:03:18 - INFO - codeparrot_training - Step 12972: {'lr': 0.0004382617146565319, 'samples': 2490816, 'steps': 12972, 'loss/train': 0.6336932480335236} 01/27/2022 08:03:21 - INFO - codeparrot_training - Step 12973: {'lr': 0.00043825094829524604, 'samples': 2491008, 'steps': 12973, 'loss/train': 0.8627437055110931} 01/27/2022 08:03:24 - INFO - codeparrot_training - Step 12974: {'lr': 0.0004382401811275529, 'samples': 2491200, 'steps': 12974, 'loss/train': 0.7830715477466583} 01/27/2022 08:03:27 - INFO - codeparrot_training - Step 12975: {'lr': 0.0004382294131534986, 'samples': 2491392, 'steps': 12975, 'loss/train': 1.013732761144638} 01/27/2022 08:03:30 - INFO - codeparrot_training - Step 12976: {'lr': 0.00043821864437312933, 'samples': 2491584, 'steps': 12976, 'loss/train': 0.7190096229314804} 01/27/2022 08:03:33 - INFO - codeparrot_training - Step 12977: {'lr': 
0.00043820787478649105, 'samples': 2491776, 'steps': 12977, 'loss/train': 0.707111120223999} 01/27/2022 08:03:37 - INFO - codeparrot_training - Step 12978: {'lr': 0.00043819710439363, 'samples': 2491968, 'steps': 12978, 'loss/train': 1.0847667753696442} 01/27/2022 08:03:41 - INFO - codeparrot_training - Step 12979: {'lr': 0.00043818633319459244, 'samples': 2492160, 'steps': 12979, 'loss/train': 0.96273073554039} 01/27/2022 08:03:44 - INFO - codeparrot_training - Step 12980: {'lr': 0.00043817556118942426, 'samples': 2492352, 'steps': 12980, 'loss/train': 1.2904804944992065} 01/27/2022 08:03:47 - INFO - codeparrot_training - Step 12981: {'lr': 0.00043816478837817183, 'samples': 2492544, 'steps': 12981, 'loss/train': 0.8611558377742767} 01/27/2022 08:03:50 - INFO - codeparrot_training - Step 12982: {'lr': 0.0004381540147608811, 'samples': 2492736, 'steps': 12982, 'loss/train': 0.7917608320713043} 01/27/2022 08:03:54 - INFO - codeparrot_training - Step 12983: {'lr': 0.00043814324033759834, 'samples': 2492928, 'steps': 12983, 'loss/train': 0.5720646232366562} 01/27/2022 08:03:57 - INFO - codeparrot_training - Step 12984: {'lr': 0.0004381324651083697, 'samples': 2493120, 'steps': 12984, 'loss/train': 0.7902843654155731} 01/27/2022 08:04:00 - INFO - codeparrot_training - Step 12985: {'lr': 0.00043812168907324137, 'samples': 2493312, 'steps': 12985, 'loss/train': 0.5776707082986832} 01/27/2022 08:04:03 - INFO - codeparrot_training - Step 12986: {'lr': 0.0004381109122322594, 'samples': 2493504, 'steps': 12986, 'loss/train': 0.6497831046581268} 01/27/2022 08:04:06 - INFO - codeparrot_training - Step 12987: {'lr': 0.00043810013458547007, 'samples': 2493696, 'steps': 12987, 'loss/train': 0.8652882277965546} 01/27/2022 08:04:11 - INFO - codeparrot_training - Step 12988: {'lr': 0.00043808935613291934, 'samples': 2493888, 'steps': 12988, 'loss/train': 1.066385418176651} 01/27/2022 08:04:14 - INFO - codeparrot_training - Step 12989: {'lr': 0.0004380785768746537, 'samples': 2494080, 'steps': 12989, 'loss/train': 0.967716783285141} 01/27/2022 08:04:17 - INFO - codeparrot_training - Step 12990: {'lr': 0.00043806779681071907, 'samples': 2494272, 'steps': 12990, 'loss/train': 0.8338604271411896} 01/27/2022 08:04:20 - INFO - codeparrot_training - Step 12991: {'lr': 0.00043805701594116175, 'samples': 2494464, 'steps': 12991, 'loss/train': 0.6123136729001999} 01/27/2022 08:04:23 - INFO - codeparrot_training - Step 12992: {'lr': 0.00043804623426602784, 'samples': 2494656, 'steps': 12992, 'loss/train': 0.6424781978130341} 01/27/2022 08:04:26 - INFO - codeparrot_training - Step 12993: {'lr': 0.00043803545178536365, 'samples': 2494848, 'steps': 12993, 'loss/train': 0.6143860816955566} 01/27/2022 08:04:29 - INFO - codeparrot_training - Step 12994: {'lr': 0.00043802466849921526, 'samples': 2495040, 'steps': 12994, 'loss/train': 1.1141848862171173} 01/27/2022 08:04:32 - INFO - codeparrot_training - Step 12995: {'lr': 0.0004380138844076289, 'samples': 2495232, 'steps': 12995, 'loss/train': 0.5791980028152466} 01/27/2022 08:04:36 - INFO - codeparrot_training - Step 12996: {'lr': 0.00043800309951065076, 'samples': 2495424, 'steps': 12996, 'loss/train': 0.9693458676338196} 01/27/2022 08:04:42 - INFO - codeparrot_training - Step 12997: {'lr': 0.000437992313808327, 'samples': 2495616, 'steps': 12997, 'loss/train': 1.3436661958694458} 01/27/2022 08:04:45 - INFO - codeparrot_training - Step 12998: {'lr': 0.0004379815273007039, 'samples': 2495808, 'steps': 12998, 'loss/train': 1.2116703987121582} 01/27/2022 08:04:48 - INFO - 
codeparrot_training - Step 12999: {'lr': 0.0004379707399878276, 'samples': 2496000, 'steps': 12999, 'loss/train': 0.6189105212688446} 01/27/2022 08:04:51 - INFO - codeparrot_training - Step 13000: {'lr': 0.00043795995186974435, 'samples': 2496192, 'steps': 13000, 'loss/train': 1.041029155254364} 01/27/2022 08:04:54 - INFO - codeparrot_training - Step 13001: {'lr': 0.0004379491629465004, 'samples': 2496384, 'steps': 13001, 'loss/train': 0.7736731767654419} 01/27/2022 08:04:57 - INFO - codeparrot_training - Step 13002: {'lr': 0.00043793837321814185, 'samples': 2496576, 'steps': 13002, 'loss/train': 0.9191117584705353} 01/27/2022 08:05:01 - INFO - codeparrot_training - Step 13003: {'lr': 0.000437927582684715, 'samples': 2496768, 'steps': 13003, 'loss/train': 1.3811999559402466} 01/27/2022 08:05:04 - INFO - codeparrot_training - Step 13004: {'lr': 0.0004379167913462661, 'samples': 2496960, 'steps': 13004, 'loss/train': 0.8100082576274872} 01/27/2022 08:05:08 - INFO - codeparrot_training - Step 13005: {'lr': 0.0004379059992028412, 'samples': 2497152, 'steps': 13005, 'loss/train': 0.8519087433815002} 01/27/2022 08:05:11 - INFO - codeparrot_training - Step 13006: {'lr': 0.00043789520625448685, 'samples': 2497344, 'steps': 13006, 'loss/train': 0.5044215470552444} 01/27/2022 08:05:15 - INFO - codeparrot_training - Step 13007: {'lr': 0.000437884412501249, 'samples': 2497536, 'steps': 13007, 'loss/train': 0.8832314908504486} 01/27/2022 08:05:18 - INFO - codeparrot_training - Step 13008: {'lr': 0.00043787361794317403, 'samples': 2497728, 'steps': 13008, 'loss/train': 1.059787541627884} 01/27/2022 08:05:21 - INFO - codeparrot_training - Step 13009: {'lr': 0.0004378628225803081, 'samples': 2497920, 'steps': 13009, 'loss/train': 1.0079661905765533} 01/27/2022 08:05:24 - INFO - codeparrot_training - Step 13010: {'lr': 0.0004378520264126975, 'samples': 2498112, 'steps': 13010, 'loss/train': 1.1780110001564026} 01/27/2022 08:05:27 - INFO - codeparrot_training - Step 13011: {'lr': 0.0004378412294403885, 'samples': 2498304, 'steps': 13011, 'loss/train': 0.8535940647125244} 01/27/2022 08:05:30 - INFO - codeparrot_training - Step 13012: {'lr': 0.0004378304316634273, 'samples': 2498496, 'steps': 13012, 'loss/train': 1.4269927740097046} 01/27/2022 08:05:33 - INFO - codeparrot_training - Step 13013: {'lr': 0.0004378196330818602, 'samples': 2498688, 'steps': 13013, 'loss/train': 0.7129750996828079} 01/27/2022 08:05:40 - INFO - codeparrot_training - Step 13014: {'lr': 0.00043780883369573336, 'samples': 2498880, 'steps': 13014, 'loss/train': 1.0470465123653412} 01/27/2022 08:05:43 - INFO - codeparrot_training - Step 13015: {'lr': 0.00043779803350509316, 'samples': 2499072, 'steps': 13015, 'loss/train': 0.8800782859325409} 01/27/2022 08:05:46 - INFO - codeparrot_training - Step 13016: {'lr': 0.0004377872325099858, 'samples': 2499264, 'steps': 13016, 'loss/train': 1.1770986020565033} 01/27/2022 08:05:49 - INFO - codeparrot_training - Step 13017: {'lr': 0.0004377764307104576, 'samples': 2499456, 'steps': 13017, 'loss/train': 0.7831641733646393} 01/27/2022 08:05:52 - INFO - codeparrot_training - Step 13018: {'lr': 0.00043776562810655473, 'samples': 2499648, 'steps': 13018, 'loss/train': 0.8101353049278259} 01/27/2022 08:05:55 - INFO - codeparrot_training - Step 13019: {'lr': 0.0004377548246983236, 'samples': 2499840, 'steps': 13019, 'loss/train': 0.5207889229059219} 01/27/2022 08:05:58 - INFO - codeparrot_training - Step 13020: {'lr': 0.0004377440204858104, 'samples': 2500032, 'steps': 13020, 'loss/train': 
0.7489674836397171} 01/27/2022 08:06:02 - INFO - codeparrot_training - Step 13021: {'lr': 0.0004377332154690614, 'samples': 2500224, 'steps': 13021, 'loss/train': 0.7367685288190842} 01/27/2022 08:06:05 - INFO - codeparrot_training - Step 13022: {'lr': 0.0004377224096481229, 'samples': 2500416, 'steps': 13022, 'loss/train': 0.8640190064907074} 01/27/2022 08:06:09 - INFO - codeparrot_training - Step 13023: {'lr': 0.0004377116030230413, 'samples': 2500608, 'steps': 13023, 'loss/train': 1.4251420497894287} 01/27/2022 08:06:12 - INFO - codeparrot_training - Step 13024: {'lr': 0.0004377007955938628, 'samples': 2500800, 'steps': 13024, 'loss/train': 0.646938756108284} 01/27/2022 08:06:15 - INFO - codeparrot_training - Step 13025: {'lr': 0.0004376899873606336, 'samples': 2500992, 'steps': 13025, 'loss/train': 0.6920108199119568} 01/27/2022 08:06:19 - INFO - codeparrot_training - Step 13026: {'lr': 0.0004376791783234001, 'samples': 2501184, 'steps': 13026, 'loss/train': 1.14643594622612} 01/27/2022 08:06:22 - INFO - codeparrot_training - Step 13027: {'lr': 0.0004376683684822086, 'samples': 2501376, 'steps': 13027, 'loss/train': 0.7073472887277603} 01/27/2022 08:06:25 - INFO - codeparrot_training - Step 13028: {'lr': 0.0004376575578371055, 'samples': 2501568, 'steps': 13028, 'loss/train': 1.180305540561676} 01/27/2022 08:06:28 - INFO - codeparrot_training - Step 13029: {'lr': 0.0004376467463881369, 'samples': 2501760, 'steps': 13029, 'loss/train': 0.6893018335103989} 01/27/2022 08:06:31 - INFO - codeparrot_training - Step 13030: {'lr': 0.0004376359341353492, 'samples': 2501952, 'steps': 13030, 'loss/train': 0.5391160100698471} 01/27/2022 08:06:36 - INFO - codeparrot_training - Step 13031: {'lr': 0.00043762512107878884, 'samples': 2502144, 'steps': 13031, 'loss/train': 0.659487247467041} 01/27/2022 08:06:39 - INFO - codeparrot_training - Step 13032: {'lr': 0.00043761430721850206, 'samples': 2502336, 'steps': 13032, 'loss/train': 0.5766575485467911} 01/27/2022 08:06:42 - INFO - codeparrot_training - Step 13033: {'lr': 0.0004376034925545351, 'samples': 2502528, 'steps': 13033, 'loss/train': 0.5968007594347} 01/27/2022 08:06:45 - INFO - codeparrot_training - Step 13034: {'lr': 0.0004375926770869343, 'samples': 2502720, 'steps': 13034, 'loss/train': 0.3061910346150398} 01/27/2022 08:06:48 - INFO - codeparrot_training - Step 13035: {'lr': 0.00043758186081574614, 'samples': 2502912, 'steps': 13035, 'loss/train': 0.1835305541753769} 01/27/2022 08:06:51 - INFO - codeparrot_training - Step 13036: {'lr': 0.00043757104374101677, 'samples': 2503104, 'steps': 13036, 'loss/train': 0.9948248863220215} 01/27/2022 08:06:54 - INFO - codeparrot_training - Step 13037: {'lr': 0.00043756022586279264, 'samples': 2503296, 'steps': 13037, 'loss/train': 0.5226071029901505} 01/27/2022 08:06:58 - INFO - codeparrot_training - Step 13038: {'lr': 0.00043754940718112, 'samples': 2503488, 'steps': 13038, 'loss/train': 0.8331736028194427} 01/27/2022 08:07:01 - INFO - codeparrot_training - Step 13039: {'lr': 0.0004375385876960454, 'samples': 2503680, 'steps': 13039, 'loss/train': 1.1802391111850739} 01/27/2022 08:07:05 - INFO - codeparrot_training - Step 13040: {'lr': 0.0004375277674076149, 'samples': 2503872, 'steps': 13040, 'loss/train': 0.743566632270813} 01/27/2022 08:07:08 - INFO - codeparrot_training - Step 13041: {'lr': 0.00043751694631587504, 'samples': 2504064, 'steps': 13041, 'loss/train': 0.8893028497695923} 01/27/2022 08:07:11 - INFO - codeparrot_training - Step 13042: {'lr': 0.00043750612442087215, 'samples': 2504256, 
'steps': 13042, 'loss/train': 0.9563214182853699} 01/27/2022 08:07:15 - INFO - codeparrot_training - Step 13043: {'lr': 0.0004374953017226525, 'samples': 2504448, 'steps': 13043, 'loss/train': 0.8621782958507538} 01/27/2022 08:07:18 - INFO - codeparrot_training - Step 13044: {'lr': 0.0004374844782212626, 'samples': 2504640, 'steps': 13044, 'loss/train': 0.7840605676174164} 01/27/2022 08:07:21 - INFO - codeparrot_training - Step 13045: {'lr': 0.0004374736539167487, 'samples': 2504832, 'steps': 13045, 'loss/train': 0.46972623467445374} 01/27/2022 08:07:24 - INFO - codeparrot_training - Step 13046: {'lr': 0.0004374628288091571, 'samples': 2505024, 'steps': 13046, 'loss/train': 1.1833998262882233} 01/27/2022 08:07:27 - INFO - codeparrot_training - Step 13047: {'lr': 0.0004374520028985344, 'samples': 2505216, 'steps': 13047, 'loss/train': 1.271619200706482} 01/27/2022 08:07:30 - INFO - codeparrot_training - Step 13048: {'lr': 0.0004374411761849268, 'samples': 2505408, 'steps': 13048, 'loss/train': 0.500396803021431} 01/27/2022 08:07:36 - INFO - codeparrot_training - Step 13049: {'lr': 0.0004374303486683807, 'samples': 2505600, 'steps': 13049, 'loss/train': 0.6055331826210022} 01/27/2022 08:07:40 - INFO - codeparrot_training - Step 13050: {'lr': 0.0004374195203489425, 'samples': 2505792, 'steps': 13050, 'loss/train': 1.1112707555294037} 01/27/2022 08:07:43 - INFO - codeparrot_training - Step 13051: {'lr': 0.0004374086912266586, 'samples': 2505984, 'steps': 13051, 'loss/train': 0.5009181797504425} 01/27/2022 08:07:46 - INFO - codeparrot_training - Step 13052: {'lr': 0.0004373978613015753, 'samples': 2506176, 'steps': 13052, 'loss/train': 0.5136839300394058} 01/27/2022 08:07:49 - INFO - codeparrot_training - Step 13053: {'lr': 0.0004373870305737392, 'samples': 2506368, 'steps': 13053, 'loss/train': 1.435265064239502} 01/27/2022 08:07:52 - INFO - codeparrot_training - Step 13054: {'lr': 0.00043737619904319654, 'samples': 2506560, 'steps': 13054, 'loss/train': 1.0599994361400604} 01/27/2022 08:07:55 - INFO - codeparrot_training - Step 13055: {'lr': 0.0004373653667099937, 'samples': 2506752, 'steps': 13055, 'loss/train': 0.5352413803339005} 01/27/2022 08:07:59 - INFO - codeparrot_training - Step 13056: {'lr': 0.00043735453357417707, 'samples': 2506944, 'steps': 13056, 'loss/train': 0.9071041345596313} 01/27/2022 08:08:02 - INFO - codeparrot_training - Step 13057: {'lr': 0.00043734369963579323, 'samples': 2507136, 'steps': 13057, 'loss/train': 0.7199379354715347} 01/27/2022 08:08:06 - INFO - codeparrot_training - Step 13058: {'lr': 0.0004373328648948884, 'samples': 2507328, 'steps': 13058, 'loss/train': 0.7863609194755554} 01/27/2022 08:08:09 - INFO - codeparrot_training - Step 13059: {'lr': 0.0004373220293515091, 'samples': 2507520, 'steps': 13059, 'loss/train': 1.0631801784038544} 01/27/2022 08:08:13 - INFO - codeparrot_training - Step 13060: {'lr': 0.00043731119300570166, 'samples': 2507712, 'steps': 13060, 'loss/train': 1.10155588388443} 01/27/2022 08:08:16 - INFO - codeparrot_training - Step 13061: {'lr': 0.0004373003558575126, 'samples': 2507904, 'steps': 13061, 'loss/train': 0.4436287432909012} 01/27/2022 08:08:19 - INFO - codeparrot_training - Step 13062: {'lr': 0.00043728951790698823, 'samples': 2508096, 'steps': 13062, 'loss/train': 0.7746743559837341} 01/27/2022 08:08:22 - INFO - codeparrot_training - Step 13063: {'lr': 0.00043727867915417505, 'samples': 2508288, 'steps': 13063, 'loss/train': 0.6070175617933273} 01/27/2022 08:08:25 - INFO - codeparrot_training - Step 13064: {'lr': 
0.00043726783959911953, 'samples': 2508480, 'steps': 13064, 'loss/train': 0.28962821513414383} 01/27/2022 08:08:28 - INFO - codeparrot_training - Step 13065: {'lr': 0.00043725699924186803, 'samples': 2508672, 'steps': 13065, 'loss/train': 0.7535389065742493} 01/27/2022 08:08:31 - INFO - codeparrot_training - Step 13066: {'lr': 0.00043724615808246695, 'samples': 2508864, 'steps': 13066, 'loss/train': 0.24876265972852707} 01/27/2022 08:08:36 - INFO - codeparrot_training - Step 13067: {'lr': 0.0004372353161209628, 'samples': 2509056, 'steps': 13067, 'loss/train': 0.9226526319980621} 01/27/2022 08:08:39 - INFO - codeparrot_training - Step 13068: {'lr': 0.000437224473357402, 'samples': 2509248, 'steps': 13068, 'loss/train': 0.8030663430690765} 01/27/2022 08:08:42 - INFO - codeparrot_training - Step 13069: {'lr': 0.0004372136297918311, 'samples': 2509440, 'steps': 13069, 'loss/train': 1.1988338828086853} 01/27/2022 08:08:45 - INFO - codeparrot_training - Step 13070: {'lr': 0.0004372027854242964, 'samples': 2509632, 'steps': 13070, 'loss/train': 0.47956833243370056} 01/27/2022 08:08:49 - INFO - codeparrot_training - Step 13071: {'lr': 0.0004371919402548444, 'samples': 2509824, 'steps': 13071, 'loss/train': 0.7495414316654205} 01/27/2022 08:08:52 - INFO - codeparrot_training - Step 13072: {'lr': 0.00043718109428352156, 'samples': 2510016, 'steps': 13072, 'loss/train': 0.827290803194046} 01/27/2022 08:08:55 - INFO - codeparrot_training - Step 13073: {'lr': 0.00043717024751037436, 'samples': 2510208, 'steps': 13073, 'loss/train': 1.0878642797470093} 01/27/2022 08:08:58 - INFO - codeparrot_training - Step 13074: {'lr': 0.0004371593999354493, 'samples': 2510400, 'steps': 13074, 'loss/train': 1.1676753759384155} 01/27/2022 08:09:04 - INFO - codeparrot_training - Step 13075: {'lr': 0.0004371485515587927, 'samples': 2510592, 'steps': 13075, 'loss/train': 0.721020519733429} 01/27/2022 08:09:08 - INFO - codeparrot_training - Step 13076: {'lr': 0.0004371377023804512, 'samples': 2510784, 'steps': 13076, 'loss/train': 0.690473198890686} 01/27/2022 08:09:11 - INFO - codeparrot_training - Step 13077: {'lr': 0.00043712685240047125, 'samples': 2510976, 'steps': 13077, 'loss/train': 0.7586862444877625} 01/27/2022 08:09:14 - INFO - codeparrot_training - Step 13078: {'lr': 0.00043711600161889917, 'samples': 2511168, 'steps': 13078, 'loss/train': 0.8838630616664886} 01/27/2022 08:09:17 - INFO - codeparrot_training - Step 13079: {'lr': 0.0004371051500357816, 'samples': 2511360, 'steps': 13079, 'loss/train': 0.6475089490413666} 01/27/2022 08:09:20 - INFO - codeparrot_training - Step 13080: {'lr': 0.000437094297651165, 'samples': 2511552, 'steps': 13080, 'loss/train': 1.0780413150787354} 01/27/2022 08:09:23 - INFO - codeparrot_training - Step 13081: {'lr': 0.00043708344446509586, 'samples': 2511744, 'steps': 13081, 'loss/train': 0.5766284912824631} 01/27/2022 08:09:26 - INFO - codeparrot_training - Step 13082: {'lr': 0.0004370725904776206, 'samples': 2511936, 'steps': 13082, 'loss/train': 0.6851208508014679} 01/27/2022 08:09:30 - INFO - codeparrot_training - Step 13083: {'lr': 0.0004370617356887858, 'samples': 2512128, 'steps': 13083, 'loss/train': 0.8958613872528076} 01/27/2022 08:09:34 - INFO - codeparrot_training - Step 13084: {'lr': 0.00043705088009863793, 'samples': 2512320, 'steps': 13084, 'loss/train': 0.9907849431037903} 01/27/2022 08:09:38 - INFO - codeparrot_training - Step 13085: {'lr': 0.0004370400237072234, 'samples': 2512512, 'steps': 13085, 'loss/train': 1.1343848705291748} 01/27/2022 08:09:41 - INFO - 
codeparrot_training - Step 13086: {'lr': 0.0004370291665145889, 'samples': 2512704, 'steps': 13086, 'loss/train': 0.13143698126077652} 01/27/2022 08:09:44 - INFO - codeparrot_training - Step 13087: {'lr': 0.00043701830852078076, 'samples': 2512896, 'steps': 13087, 'loss/train': 1.301228553056717} 01/27/2022 08:09:47 - INFO - codeparrot_training - Step 13088: {'lr': 0.0004370074497258456, 'samples': 2513088, 'steps': 13088, 'loss/train': 0.8768996000289917} 01/27/2022 08:09:50 - INFO - codeparrot_training - Step 13089: {'lr': 0.00043699659012983, 'samples': 2513280, 'steps': 13089, 'loss/train': 5.59298300743103} 01/27/2022 08:09:53 - INFO - codeparrot_training - Step 13090: {'lr': 0.00043698572973278026, 'samples': 2513472, 'steps': 13090, 'loss/train': 1.2638144195079803} 01/27/2022 08:09:56 - INFO - codeparrot_training - Step 13091: {'lr': 0.0004369748685347431, 'samples': 2513664, 'steps': 13091, 'loss/train': 0.8104203343391418} 01/27/2022 08:10:00 - INFO - codeparrot_training - Step 13092: {'lr': 0.00043696400653576496, 'samples': 2513856, 'steps': 13092, 'loss/train': 0.6689144521951675} 01/27/2022 08:10:03 - INFO - codeparrot_training - Step 13093: {'lr': 0.00043695314373589234, 'samples': 2514048, 'steps': 13093, 'loss/train': 0.8801732361316681} 01/27/2022 08:10:09 - INFO - codeparrot_training - Step 13094: {'lr': 0.00043694228013517185, 'samples': 2514240, 'steps': 13094, 'loss/train': 0.845230221748352} 01/27/2022 08:10:12 - INFO - codeparrot_training - Step 13095: {'lr': 0.00043693141573365003, 'samples': 2514432, 'steps': 13095, 'loss/train': 0.5178143978118896} 01/27/2022 08:10:15 - INFO - codeparrot_training - Step 13096: {'lr': 0.0004369205505313733, 'samples': 2514624, 'steps': 13096, 'loss/train': 1.116996020078659} 01/27/2022 08:10:18 - INFO - codeparrot_training - Step 13097: {'lr': 0.0004369096845283883, 'samples': 2514816, 'steps': 13097, 'loss/train': 0.6319782435894012} 01/27/2022 08:10:22 - INFO - codeparrot_training - Step 13098: {'lr': 0.0004368988177247416, 'samples': 2515008, 'steps': 13098, 'loss/train': 0.9573234021663666} 01/27/2022 08:10:25 - INFO - codeparrot_training - Step 13099: {'lr': 0.00043688795012047975, 'samples': 2515200, 'steps': 13099, 'loss/train': 0.5660451650619507} 01/27/2022 08:10:28 - INFO - codeparrot_training - Step 13100: {'lr': 0.00043687708171564923, 'samples': 2515392, 'steps': 13100, 'loss/train': 0.6347164660692215} 01/27/2022 08:10:31 - INFO - codeparrot_training - Step 13101: {'lr': 0.0004368662125102966, 'samples': 2515584, 'steps': 13101, 'loss/train': 1.0630605518817902} 01/27/2022 08:10:35 - INFO - codeparrot_training - Step 13102: {'lr': 0.00043685534250446846, 'samples': 2515776, 'steps': 13102, 'loss/train': 0.8292021453380585} 01/27/2022 08:10:38 - INFO - codeparrot_training - Step 13103: {'lr': 0.0004368444716982114, 'samples': 2515968, 'steps': 13103, 'loss/train': 0.7478952705860138} 01/27/2022 08:10:42 - INFO - codeparrot_training - Step 13104: {'lr': 0.0004368336000915719, 'samples': 2516160, 'steps': 13104, 'loss/train': 0.69517882168293} 01/27/2022 08:10:45 - INFO - codeparrot_training - Step 13105: {'lr': 0.0004368227276845966, 'samples': 2516352, 'steps': 13105, 'loss/train': 0.8624415993690491} 01/27/2022 08:10:48 - INFO - codeparrot_training - Step 13106: {'lr': 0.0004368118544773321, 'samples': 2516544, 'steps': 13106, 'loss/train': 1.1630515158176422} 01/27/2022 08:10:51 - INFO - codeparrot_training - Step 13107: {'lr': 0.00043680098046982495, 'samples': 2516736, 'steps': 13107, 'loss/train': 
0.6916687041521072} 01/27/2022 08:10:54 - INFO - codeparrot_training - Step 13108: {'lr': 0.00043679010566212163, 'samples': 2516928, 'steps': 13108, 'loss/train': 0.9741834104061127} 01/27/2022 08:10:57 - INFO - codeparrot_training - Step 13109: {'lr': 0.0004367792300542689, 'samples': 2517120, 'steps': 13109, 'loss/train': 1.0249549448490143} 01/27/2022 08:11:00 - INFO - codeparrot_training - Step 13110: {'lr': 0.00043676835364631316, 'samples': 2517312, 'steps': 13110, 'loss/train': 0.9811237156391144} 01/27/2022 08:11:05 - INFO - codeparrot_training - Step 13111: {'lr': 0.00043675747643830116, 'samples': 2517504, 'steps': 13111, 'loss/train': 1.0267564952373505} 01/27/2022 08:11:08 - INFO - codeparrot_training - Step 13112: {'lr': 0.0004367465984302794, 'samples': 2517696, 'steps': 13112, 'loss/train': 0.7103345543146133} 01/27/2022 08:11:11 - INFO - codeparrot_training - Step 13113: {'lr': 0.0004367357196222946, 'samples': 2517888, 'steps': 13113, 'loss/train': 1.2617327570915222} 01/27/2022 08:11:14 - INFO - codeparrot_training - Step 13114: {'lr': 0.00043672484001439316, 'samples': 2518080, 'steps': 13114, 'loss/train': 0.8322985768318176} 01/27/2022 08:11:17 - INFO - codeparrot_training - Step 13115: {'lr': 0.00043671395960662184, 'samples': 2518272, 'steps': 13115, 'loss/train': 0.7610429227352142} 01/27/2022 08:11:21 - INFO - codeparrot_training - Step 13116: {'lr': 0.0004367030783990272, 'samples': 2518464, 'steps': 13116, 'loss/train': 1.0308139622211456} 01/27/2022 08:11:24 - INFO - codeparrot_training - Step 13117: {'lr': 0.0004366921963916559, 'samples': 2518656, 'steps': 13117, 'loss/train': 0.4366143196821213} 01/27/2022 08:11:27 - INFO - codeparrot_training - Step 13118: {'lr': 0.0004366813135845545, 'samples': 2518848, 'steps': 13118, 'loss/train': 1.1230903565883636} 01/27/2022 08:11:30 - INFO - codeparrot_training - Step 13119: {'lr': 0.00043667042997776965, 'samples': 2519040, 'steps': 13119, 'loss/train': 0.6005142778158188} 01/27/2022 08:11:36 - INFO - codeparrot_training - Step 13120: {'lr': 0.00043665954557134786, 'samples': 2519232, 'steps': 13120, 'loss/train': 1.0536729097366333} 01/27/2022 08:11:39 - INFO - codeparrot_training - Step 13121: {'lr': 0.0004366486603653359, 'samples': 2519424, 'steps': 13121, 'loss/train': 0.9161282479763031} 01/27/2022 08:11:42 - INFO - codeparrot_training - Step 13122: {'lr': 0.00043663777435978037, 'samples': 2519616, 'steps': 13122, 'loss/train': 1.3927080631256104} 01/27/2022 08:11:46 - INFO - codeparrot_training - Step 13123: {'lr': 0.0004366268875547278, 'samples': 2519808, 'steps': 13123, 'loss/train': 0.7592837512493134} 01/27/2022 08:11:49 - INFO - codeparrot_training - Step 13124: {'lr': 0.000436615999950225, 'samples': 2520000, 'steps': 13124, 'loss/train': 1.1552155315876007} 01/27/2022 08:11:52 - INFO - codeparrot_training - Step 13125: {'lr': 0.0004366051115463184, 'samples': 2520192, 'steps': 13125, 'loss/train': 0.039342619478702545} 01/27/2022 08:11:55 - INFO - codeparrot_training - Step 13126: {'lr': 0.0004365942223430549, 'samples': 2520384, 'steps': 13126, 'loss/train': 0.962476372718811} 01/27/2022 08:11:58 - INFO - codeparrot_training - Step 13127: {'lr': 0.0004365833323404809, 'samples': 2520576, 'steps': 13127, 'loss/train': 0.7138117700815201} 01/27/2022 08:12:03 - INFO - codeparrot_training - Step 13128: {'lr': 0.0004365724415386432, 'samples': 2520768, 'steps': 13128, 'loss/train': 1.158832848072052} 01/27/2022 08:12:06 - INFO - codeparrot_training - Step 13129: {'lr': 0.0004365615499375884, 'samples': 
2520960, 'steps': 13129, 'loss/train': 0.8752760589122772} 01/27/2022 08:12:09 - INFO - codeparrot_training - Step 13130: {'lr': 0.0004365506575373631, 'samples': 2521152, 'steps': 13130, 'loss/train': 1.0418362319469452} 01/27/2022 08:12:12 - INFO - codeparrot_training - Step 13131: {'lr': 0.0004365397643380141, 'samples': 2521344, 'steps': 13131, 'loss/train': 0.6172489821910858} 01/27/2022 08:12:16 - INFO - codeparrot_training - Step 13132: {'lr': 0.000436528870339588, 'samples': 2521536, 'steps': 13132, 'loss/train': 1.161657303571701} 01/27/2022 08:12:19 - INFO - codeparrot_training - Step 13133: {'lr': 0.0004365179755421314, 'samples': 2521728, 'steps': 13133, 'loss/train': 0.653795599937439} 01/27/2022 08:12:22 - INFO - codeparrot_training - Step 13134: {'lr': 0.00043650707994569095, 'samples': 2521920, 'steps': 13134, 'loss/train': 0.13881925866007805} 01/27/2022 08:12:25 - INFO - codeparrot_training - Step 13135: {'lr': 0.0004364961835503135, 'samples': 2522112, 'steps': 13135, 'loss/train': 0.6892961114645004} 01/27/2022 08:12:28 - INFO - codeparrot_training - Step 13136: {'lr': 0.00043648528635604556, 'samples': 2522304, 'steps': 13136, 'loss/train': 1.5575852394104004} 01/27/2022 08:12:32 - INFO - codeparrot_training - Step 13137: {'lr': 0.00043647438836293383, 'samples': 2522496, 'steps': 13137, 'loss/train': 0.4135882258415222} 01/27/2022 08:12:36 - INFO - codeparrot_training - Step 13138: {'lr': 0.0004364634895710251, 'samples': 2522688, 'steps': 13138, 'loss/train': 0.9690369665622711} 01/27/2022 08:12:39 - INFO - codeparrot_training - Step 13139: {'lr': 0.000436452589980366, 'samples': 2522880, 'steps': 13139, 'loss/train': 0.7620083391666412} 01/27/2022 08:12:42 - INFO - codeparrot_training - Step 13140: {'lr': 0.00043644168959100315, 'samples': 2523072, 'steps': 13140, 'loss/train': 0.831114649772644} 01/27/2022 08:12:45 - INFO - codeparrot_training - Step 13141: {'lr': 0.0004364307884029834, 'samples': 2523264, 'steps': 13141, 'loss/train': 1.0162357091903687} 01/27/2022 08:12:48 - INFO - codeparrot_training - Step 13142: {'lr': 0.0004364198864163533, 'samples': 2523456, 'steps': 13142, 'loss/train': 0.8128378987312317} 01/27/2022 08:12:51 - INFO - codeparrot_training - Step 13143: {'lr': 0.00043640898363115954, 'samples': 2523648, 'steps': 13143, 'loss/train': 1.3131626844406128} 01/27/2022 08:12:54 - INFO - codeparrot_training - Step 13144: {'lr': 0.000436398080047449, 'samples': 2523840, 'steps': 13144, 'loss/train': 0.7544841170310974} 01/27/2022 08:12:58 - INFO - codeparrot_training - Step 13145: {'lr': 0.0004363871756652682, 'samples': 2524032, 'steps': 13145, 'loss/train': 0.7104261517524719} 01/27/2022 08:13:02 - INFO - codeparrot_training - Step 13146: {'lr': 0.00043637627048466395, 'samples': 2524224, 'steps': 13146, 'loss/train': 0.33333273231983185} 01/27/2022 08:13:05 - INFO - codeparrot_training - Step 13147: {'lr': 0.00043636536450568293, 'samples': 2524416, 'steps': 13147, 'loss/train': 0.8223664462566376} 01/27/2022 08:13:09 - INFO - codeparrot_training - Step 13148: {'lr': 0.0004363544577283718, 'samples': 2524608, 'steps': 13148, 'loss/train': 1.3743236660957336} 01/27/2022 08:13:12 - INFO - codeparrot_training - Step 13149: {'lr': 0.00043634355015277745, 'samples': 2524800, 'steps': 13149, 'loss/train': 0.6686883866786957} 01/27/2022 08:13:15 - INFO - codeparrot_training - Step 13150: {'lr': 0.0004363326417789465, 'samples': 2524992, 'steps': 13150, 'loss/train': 0.8094083368778229} 01/27/2022 08:13:18 - INFO - codeparrot_training - Step 13151: 
{'lr': 0.0004363217326069256, 'samples': 2525184, 'steps': 13151, 'loss/train': 0.23812678456306458} 01/27/2022 08:13:21 - INFO - codeparrot_training - Step 13152: {'lr': 0.0004363108226367616, 'samples': 2525376, 'steps': 13152, 'loss/train': 1.0161721408367157} 01/27/2022 08:13:24 - INFO - codeparrot_training - Step 13153: {'lr': 0.0004362999118685012, 'samples': 2525568, 'steps': 13153, 'loss/train': 0.7227863967418671} 01/27/2022 08:13:27 - INFO - codeparrot_training - Step 13154: {'lr': 0.0004362890003021911, 'samples': 2525760, 'steps': 13154, 'loss/train': 0.5449011772871017} 01/27/2022 08:13:34 - INFO - codeparrot_training - Step 13155: {'lr': 0.00043627808793787813, 'samples': 2525952, 'steps': 13155, 'loss/train': 1.1992354989051819} 01/27/2022 08:13:37 - INFO - codeparrot_training - Step 13156: {'lr': 0.00043626717477560897, 'samples': 2526144, 'steps': 13156, 'loss/train': 2.007376492023468} 01/27/2022 08:13:40 - INFO - codeparrot_training - Step 13157: {'lr': 0.00043625626081543033, 'samples': 2526336, 'steps': 13157, 'loss/train': 0.9547470510005951} 01/27/2022 08:13:43 - INFO - codeparrot_training - Step 13158: {'lr': 0.0004362453460573891, 'samples': 2526528, 'steps': 13158, 'loss/train': 0.6933795511722565} 01/27/2022 08:13:47 - INFO - codeparrot_training - Step 13159: {'lr': 0.0004362344305015319, 'samples': 2526720, 'steps': 13159, 'loss/train': 1.1459473371505737} 01/27/2022 08:13:50 - INFO - codeparrot_training - Step 13160: {'lr': 0.0004362235141479055, 'samples': 2526912, 'steps': 13160, 'loss/train': 0.8555326759815216} 01/27/2022 08:13:53 - INFO - codeparrot_training - Step 13161: {'lr': 0.00043621259699655674, 'samples': 2527104, 'steps': 13161, 'loss/train': 0.7145726680755615} 01/27/2022 08:13:56 - INFO - codeparrot_training - Step 13162: {'lr': 0.0004362016790475324, 'samples': 2527296, 'steps': 13162, 'loss/train': 0.8600977063179016} 01/27/2022 08:14:01 - INFO - codeparrot_training - Step 13163: {'lr': 0.0004361907603008791, 'samples': 2527488, 'steps': 13163, 'loss/train': 0.9565053284168243} 01/27/2022 08:14:04 - INFO - codeparrot_training - Step 13164: {'lr': 0.00043617984075664375, 'samples': 2527680, 'steps': 13164, 'loss/train': 0.4404553323984146} 01/27/2022 08:14:07 - INFO - codeparrot_training - Step 13165: {'lr': 0.000436168920414873, 'samples': 2527872, 'steps': 13165, 'loss/train': 0.8077199757099152} 01/27/2022 08:14:10 - INFO - codeparrot_training - Step 13166: {'lr': 0.0004361579992756138, 'samples': 2528064, 'steps': 13166, 'loss/train': 0.9305075705051422} 01/27/2022 08:14:13 - INFO - codeparrot_training - Step 13167: {'lr': 0.00043614707733891285, 'samples': 2528256, 'steps': 13167, 'loss/train': 0.357427641749382} 01/27/2022 08:14:16 - INFO - codeparrot_training - Step 13168: {'lr': 0.00043613615460481685, 'samples': 2528448, 'steps': 13168, 'loss/train': 0.5764957219362259} 01/27/2022 08:14:19 - INFO - codeparrot_training - Step 13169: {'lr': 0.0004361252310733728, 'samples': 2528640, 'steps': 13169, 'loss/train': 1.042936384677887} 01/27/2022 08:14:23 - INFO - codeparrot_training - Step 13170: {'lr': 0.0004361143067446273, 'samples': 2528832, 'steps': 13170, 'loss/train': 1.2332830131053925} 01/27/2022 08:14:26 - INFO - codeparrot_training - Step 13171: {'lr': 0.00043610338161862713, 'samples': 2529024, 'steps': 13171, 'loss/train': 0.6070282459259033} 01/27/2022 08:14:32 - INFO - codeparrot_training - Step 13172: {'lr': 0.00043609245569541924, 'samples': 2529216, 'steps': 13172, 'loss/train': 0.11969861760735512} 01/27/2022 08:14:35 - 
INFO - codeparrot_training - Step 13173: {'lr': 0.0004360815289750503, 'samples': 2529408, 'steps': 13173, 'loss/train': 0.4167364686727524} 01/27/2022 08:14:38 - INFO - codeparrot_training - Step 13174: {'lr': 0.0004360706014575672, 'samples': 2529600, 'steps': 13174, 'loss/train': 0.7121220678091049} 01/27/2022 08:14:41 - INFO - codeparrot_training - Step 13175: {'lr': 0.00043605967314301673, 'samples': 2529792, 'steps': 13175, 'loss/train': 0.6216753870248795} 01/27/2022 08:14:44 - INFO - codeparrot_training - Step 13176: {'lr': 0.0004360487440314458, 'samples': 2529984, 'steps': 13176, 'loss/train': 1.8583036065101624} 01/27/2022 08:14:48 - INFO - codeparrot_training - Step 13177: {'lr': 0.000436037814122901, 'samples': 2530176, 'steps': 13177, 'loss/train': 0.715517520904541} 01/27/2022 08:14:51 - INFO - codeparrot_training - Step 13178: {'lr': 0.0004360268834174294, 'samples': 2530368, 'steps': 13178, 'loss/train': 0.8906543254852295} 01/27/2022 08:14:54 - INFO - codeparrot_training - Step 13179: {'lr': 0.00043601595191507757, 'samples': 2530560, 'steps': 13179, 'loss/train': 0.9178552329540253} 01/27/2022 08:14:57 - INFO - codeparrot_training - Step 13180: {'lr': 0.0004360050196158925, 'samples': 2530752, 'steps': 13180, 'loss/train': 0.8118998408317566} 01/27/2022 08:15:01 - INFO - codeparrot_training - Step 13181: {'lr': 0.000435994086519921, 'samples': 2530944, 'steps': 13181, 'loss/train': 1.3139971196651459} 01/27/2022 08:15:05 - INFO - codeparrot_training - Step 13182: {'lr': 0.00043598315262720995, 'samples': 2531136, 'steps': 13182, 'loss/train': 0.538182333111763} 01/27/2022 08:15:08 - INFO - codeparrot_training - Step 13183: {'lr': 0.00043597221793780606, 'samples': 2531328, 'steps': 13183, 'loss/train': 0.8925068378448486} 01/27/2022 08:15:11 - INFO - codeparrot_training - Step 13184: {'lr': 0.0004359612824517563, 'samples': 2531520, 'steps': 13184, 'loss/train': 0.6759561896324158} 01/27/2022 08:15:14 - INFO - codeparrot_training - Step 13185: {'lr': 0.0004359503461691074, 'samples': 2531712, 'steps': 13185, 'loss/train': 1.139397293329239} 01/27/2022 08:15:17 - INFO - codeparrot_training - Step 13186: {'lr': 0.00043593940908990625, 'samples': 2531904, 'steps': 13186, 'loss/train': 0.6479115933179855} 01/27/2022 08:15:20 - INFO - codeparrot_training - Step 13187: {'lr': 0.00043592847121419974, 'samples': 2532096, 'steps': 13187, 'loss/train': 0.7574523389339447} 01/27/2022 08:15:23 - INFO - codeparrot_training - Step 13188: {'lr': 0.00043591753254203474, 'samples': 2532288, 'steps': 13188, 'loss/train': 0.49663978815078735} 01/27/2022 08:15:27 - INFO - codeparrot_training - Step 13189: {'lr': 0.00043590659307345803, 'samples': 2532480, 'steps': 13189, 'loss/train': 1.1326944530010223} 01/27/2022 08:15:31 - INFO - codeparrot_training - Step 13190: {'lr': 0.0004358956528085165, 'samples': 2532672, 'steps': 13190, 'loss/train': 0.8008592426776886} 01/27/2022 08:15:34 - INFO - codeparrot_training - Step 13191: {'lr': 0.0004358847117472571, 'samples': 2532864, 'steps': 13191, 'loss/train': 0.6332093775272369} 01/27/2022 08:15:37 - INFO - codeparrot_training - Step 13192: {'lr': 0.00043587376988972655, 'samples': 2533056, 'steps': 13192, 'loss/train': 1.0954343676567078} 01/27/2022 08:15:40 - INFO - codeparrot_training - Step 13193: {'lr': 0.0004358628272359718, 'samples': 2533248, 'steps': 13193, 'loss/train': 0.514320507645607} 01/27/2022 08:15:44 - INFO - codeparrot_training - Step 13194: {'lr': 0.0004358518837860397, 'samples': 2533440, 'steps': 13194, 'loss/train': 
0.6250339597463608} 01/27/2022 08:15:47 - INFO - codeparrot_training - Step 13195: {'lr': 0.0004358409395399772, 'samples': 2533632, 'steps': 13195, 'loss/train': 1.4439226984977722} 01/27/2022 08:15:50 - INFO - codeparrot_training - Step 13196: {'lr': 0.00043582999449783103, 'samples': 2533824, 'steps': 13196, 'loss/train': 0.6774487048387527} 01/27/2022 08:15:53 - INFO - codeparrot_training - Step 13197: {'lr': 0.00043581904865964825, 'samples': 2534016, 'steps': 13197, 'loss/train': 0.6922009438276291} 01/27/2022 08:15:59 - INFO - codeparrot_training - Step 13198: {'lr': 0.0004358081020254756, 'samples': 2534208, 'steps': 13198, 'loss/train': 0.9192717969417572} 01/27/2022 08:16:02 - INFO - codeparrot_training - Step 13199: {'lr': 0.0004357971545953601, 'samples': 2534400, 'steps': 13199, 'loss/train': 0.593514695763588} 01/27/2022 08:16:06 - INFO - codeparrot_training - Step 13200: {'lr': 0.00043578620636934855, 'samples': 2534592, 'steps': 13200, 'loss/train': 0.9800488650798798} 01/27/2022 08:16:09 - INFO - codeparrot_training - Step 13201: {'lr': 0.0004357752573474879, 'samples': 2534784, 'steps': 13201, 'loss/train': 0.6754514425992966} 01/27/2022 08:16:12 - INFO - codeparrot_training - Step 13202: {'lr': 0.0004357643075298251, 'samples': 2534976, 'steps': 13202, 'loss/train': 1.06019926071167} 01/27/2022 08:16:15 - INFO - codeparrot_training - Step 13203: {'lr': 0.00043575335691640695, 'samples': 2535168, 'steps': 13203, 'loss/train': 1.073545753955841} 01/27/2022 08:16:18 - INFO - codeparrot_training - Step 13204: {'lr': 0.0004357424055072804, 'samples': 2535360, 'steps': 13204, 'loss/train': 0.703296884894371} 01/27/2022 08:16:21 - INFO - codeparrot_training - Step 13205: {'lr': 0.0004357314533024923, 'samples': 2535552, 'steps': 13205, 'loss/train': 0.9752467274665833} 01/27/2022 08:16:24 - INFO - codeparrot_training - Step 13206: {'lr': 0.0004357205003020897, 'samples': 2535744, 'steps': 13206, 'loss/train': 0.7282177358865738} 01/27/2022 08:16:29 - INFO - codeparrot_training - Step 13207: {'lr': 0.00043570954650611944, 'samples': 2535936, 'steps': 13207, 'loss/train': 1.2048228085041046} 01/27/2022 08:16:32 - INFO - codeparrot_training - Step 13208: {'lr': 0.00043569859191462847, 'samples': 2536128, 'steps': 13208, 'loss/train': 0.9139333963394165} 01/27/2022 08:16:35 - INFO - codeparrot_training - Step 13209: {'lr': 0.0004356876365276636, 'samples': 2536320, 'steps': 13209, 'loss/train': 0.6919466257095337} 01/27/2022 08:16:38 - INFO - codeparrot_training - Step 13210: {'lr': 0.00043567668034527195, 'samples': 2536512, 'steps': 13210, 'loss/train': 0.8840990960597992} 01/27/2022 08:16:41 - INFO - codeparrot_training - Step 13211: {'lr': 0.0004356657233675004, 'samples': 2536704, 'steps': 13211, 'loss/train': 0.5813469886779785} 01/27/2022 08:16:45 - INFO - codeparrot_training - Step 13212: {'lr': 0.00043565476559439577, 'samples': 2536896, 'steps': 13212, 'loss/train': 1.2836532890796661} 01/27/2022 08:16:48 - INFO - codeparrot_training - Step 13213: {'lr': 0.0004356438070260051, 'samples': 2537088, 'steps': 13213, 'loss/train': 0.450760155916214} 01/27/2022 08:16:51 - INFO - codeparrot_training - Step 13214: {'lr': 0.00043563284766237533, 'samples': 2537280, 'steps': 13214, 'loss/train': 0.8931409120559692} 01/27/2022 08:16:54 - INFO - codeparrot_training - Step 13215: {'lr': 0.00043562188750355336, 'samples': 2537472, 'steps': 13215, 'loss/train': 0.6023834198713303} 01/27/2022 08:16:58 - INFO - codeparrot_training - Step 13216: {'lr': 0.0004356109265495861, 'samples': 
2537664, 'steps': 13216, 'loss/train': 0.9176414608955383} 01/27/2022 08:17:01 - INFO - codeparrot_training - Step 13217: {'lr': 0.00043559996480052067, 'samples': 2537856, 'steps': 13217, 'loss/train': 1.4059405624866486} 01/27/2022 08:17:05 - INFO - codeparrot_training - Step 13218: {'lr': 0.0004355890022564039, 'samples': 2538048, 'steps': 13218, 'loss/train': 1.0536007583141327} 01/27/2022 08:17:08 - INFO - codeparrot_training - Step 13219: {'lr': 0.00043557803891728275, 'samples': 2538240, 'steps': 13219, 'loss/train': 0.6757484525442123} 01/27/2022 08:17:11 - INFO - codeparrot_training - Step 13220: {'lr': 0.00043556707478320425, 'samples': 2538432, 'steps': 13220, 'loss/train': 0.8588005900382996} 01/27/2022 08:17:14 - INFO - codeparrot_training - Step 13221: {'lr': 0.00043555610985421527, 'samples': 2538624, 'steps': 13221, 'loss/train': 0.9661165773868561} 01/27/2022 08:17:17 - INFO - codeparrot_training - Step 13222: {'lr': 0.0004355451441303629, 'samples': 2538816, 'steps': 13222, 'loss/train': 0.5977008193731308} 01/27/2022 08:17:20 - INFO - codeparrot_training - Step 13223: {'lr': 0.000435534177611694, 'samples': 2539008, 'steps': 13223, 'loss/train': 0.46990616619586945} 01/27/2022 08:17:23 - INFO - codeparrot_training - Step 13224: {'lr': 0.0004355232102982556, 'samples': 2539200, 'steps': 13224, 'loss/train': 1.307166874408722} 01/27/2022 08:17:30 - INFO - codeparrot_training - Step 13225: {'lr': 0.00043551224219009473, 'samples': 2539392, 'steps': 13225, 'loss/train': 1.0883513689041138} 01/27/2022 08:17:33 - INFO - codeparrot_training - Step 13226: {'lr': 0.0004355012732872583, 'samples': 2539584, 'steps': 13226, 'loss/train': 0.186858169734478} 01/27/2022 08:17:36 - INFO - codeparrot_training - Step 13227: {'lr': 0.00043549030358979324, 'samples': 2539776, 'steps': 13227, 'loss/train': 1.1337982714176178} 01/27/2022 08:17:39 - INFO - codeparrot_training - Step 13228: {'lr': 0.0004354793330977467, 'samples': 2539968, 'steps': 13228, 'loss/train': 0.46522141993045807} 01/27/2022 08:17:42 - INFO - codeparrot_training - Step 13229: {'lr': 0.00043546836181116555, 'samples': 2540160, 'steps': 13229, 'loss/train': 0.5234140902757645} 01/27/2022 08:17:46 - INFO - codeparrot_training - Step 13230: {'lr': 0.0004354573897300969, 'samples': 2540352, 'steps': 13230, 'loss/train': 1.8626055121421814} 01/27/2022 08:17:49 - INFO - codeparrot_training - Step 13231: {'lr': 0.0004354464168545876, 'samples': 2540544, 'steps': 13231, 'loss/train': 1.0625673830509186} 01/27/2022 08:17:52 - INFO - codeparrot_training - Step 13232: {'lr': 0.0004354354431846848, 'samples': 2540736, 'steps': 13232, 'loss/train': 0.815781980752945} 01/27/2022 08:17:55 - INFO - codeparrot_training - Step 13233: {'lr': 0.0004354244687204354, 'samples': 2540928, 'steps': 13233, 'loss/train': 1.156973272562027} 01/27/2022 08:17:59 - INFO - codeparrot_training - Step 13234: {'lr': 0.00043541349346188653, 'samples': 2541120, 'steps': 13234, 'loss/train': 0.8217245936393738} 01/27/2022 08:18:03 - INFO - codeparrot_training - Step 13235: {'lr': 0.000435402517409085, 'samples': 2541312, 'steps': 13235, 'loss/train': 0.4356546252965927} 01/27/2022 08:18:06 - INFO - codeparrot_training - Step 13236: {'lr': 0.0004353915405620781, 'samples': 2541504, 'steps': 13236, 'loss/train': 0.570270761847496} 01/27/2022 08:18:09 - INFO - codeparrot_training - Step 13237: {'lr': 0.0004353805629209126, 'samples': 2541696, 'steps': 13237, 'loss/train': 1.267923653125763} 01/27/2022 08:18:12 - INFO - codeparrot_training - Step 13238: {'lr': 
0.0004353695844856357, 'samples': 2541888, 'steps': 13238, 'loss/train': 0.8335921168327332} 01/27/2022 08:18:15 - INFO - codeparrot_training - Step 13239: {'lr': 0.00043535860525629436, 'samples': 2542080, 'steps': 13239, 'loss/train': 0.1257845163345337} 01/27/2022 08:18:18 - INFO - codeparrot_training - Step 13240: {'lr': 0.00043534762523293557, 'samples': 2542272, 'steps': 13240, 'loss/train': 1.1492722928524017} 01/27/2022 08:18:22 - INFO - codeparrot_training - Step 13241: {'lr': 0.00043533664441560636, 'samples': 2542464, 'steps': 13241, 'loss/train': 0.6195503175258636} 01/27/2022 08:18:25 - INFO - codeparrot_training - Step 13242: {'lr': 0.0004353256628043539, 'samples': 2542656, 'steps': 13242, 'loss/train': 0.9847628474235535} 01/27/2022 08:18:31 - INFO - codeparrot_training - Step 13243: {'lr': 0.00043531468039922515, 'samples': 2542848, 'steps': 13243, 'loss/train': 0.8083600401878357} 01/27/2022 08:18:34 - INFO - codeparrot_training - Step 13244: {'lr': 0.0004353036972002671, 'samples': 2543040, 'steps': 13244, 'loss/train': 0.7512311339378357} 01/27/2022 08:18:37 - INFO - codeparrot_training - Step 13245: {'lr': 0.0004352927132075269, 'samples': 2543232, 'steps': 13245, 'loss/train': 1.21025750041008} 01/27/2022 08:18:40 - INFO - codeparrot_training - Step 13246: {'lr': 0.00043528172842105154, 'samples': 2543424, 'steps': 13246, 'loss/train': 0.7016460299491882} 01/27/2022 08:18:44 - INFO - codeparrot_training - Step 13247: {'lr': 0.00043527074284088806, 'samples': 2543616, 'steps': 13247, 'loss/train': 1.297046184539795} 01/27/2022 08:18:47 - INFO - codeparrot_training - Step 13248: {'lr': 0.0004352597564670836, 'samples': 2543808, 'steps': 13248, 'loss/train': 0.6605357676744461} 01/27/2022 08:18:50 - INFO - codeparrot_training - Step 13249: {'lr': 0.00043524876929968516, 'samples': 2544000, 'steps': 13249, 'loss/train': 0.9138393402099609} 01/27/2022 08:18:53 - INFO - codeparrot_training - Step 13250: {'lr': 0.0004352377813387398, 'samples': 2544192, 'steps': 13250, 'loss/train': 0.7397779226303101} 01/27/2022 08:18:58 - INFO - codeparrot_training - Step 13251: {'lr': 0.0004352267925842946, 'samples': 2544384, 'steps': 13251, 'loss/train': 2.150784909725189} 01/27/2022 08:19:01 - INFO - codeparrot_training - Step 13252: {'lr': 0.00043521580303639663, 'samples': 2544576, 'steps': 13252, 'loss/train': 1.158785194158554} 01/27/2022 08:19:04 - INFO - codeparrot_training - Step 13253: {'lr': 0.000435204812695093, 'samples': 2544768, 'steps': 13253, 'loss/train': 0.23959214240312576} 01/27/2022 08:19:07 - INFO - codeparrot_training - Step 13254: {'lr': 0.00043519382156043075, 'samples': 2544960, 'steps': 13254, 'loss/train': 0.8673397600650787} 01/27/2022 08:19:10 - INFO - codeparrot_training - Step 13255: {'lr': 0.0004351828296324569, 'samples': 2545152, 'steps': 13255, 'loss/train': 0.8057017028331757} 01/27/2022 08:19:13 - INFO - codeparrot_training - Step 13256: {'lr': 0.00043517183691121875, 'samples': 2545344, 'steps': 13256, 'loss/train': 1.1133762001991272} 01/27/2022 08:19:16 - INFO - codeparrot_training - Step 13257: {'lr': 0.00043516084339676316, 'samples': 2545536, 'steps': 13257, 'loss/train': 0.901478111743927} 01/27/2022 08:19:20 - INFO - codeparrot_training - Step 13258: {'lr': 0.00043514984908913734, 'samples': 2545728, 'steps': 13258, 'loss/train': 1.0865050256252289} 01/27/2022 08:19:23 - INFO - codeparrot_training - Step 13259: {'lr': 0.0004351388539883883, 'samples': 2545920, 'steps': 13259, 'loss/train': 0.7467178255319595} 01/27/2022 08:19:27 - INFO - 
codeparrot_training - Step 13260: {'lr': 0.00043512785809456323, 'samples': 2546112, 'steps': 13260, 'loss/train': 0.753191739320755} 01/27/2022 08:19:31 - INFO - codeparrot_training - Step 13261: {'lr': 0.00043511686140770925, 'samples': 2546304, 'steps': 13261, 'loss/train': 1.053628832101822} 01/27/2022 08:19:34 - INFO - codeparrot_training - Step 13262: {'lr': 0.0004351058639278734, 'samples': 2546496, 'steps': 13262, 'loss/train': 0.8630383908748627} 01/27/2022 08:19:37 - INFO - codeparrot_training - Step 13263: {'lr': 0.0004350948656551028, 'samples': 2546688, 'steps': 13263, 'loss/train': 0.9679794609546661} 01/27/2022 08:19:40 - INFO - codeparrot_training - Step 13264: {'lr': 0.0004350838665894445, 'samples': 2546880, 'steps': 13264, 'loss/train': 1.2638835310935974} 01/27/2022 08:19:43 - INFO - codeparrot_training - Step 13265: {'lr': 0.0004350728667309458, 'samples': 2547072, 'steps': 13265, 'loss/train': 0.7995885014533997} 01/27/2022 08:19:46 - INFO - codeparrot_training - Step 13266: {'lr': 0.0004350618660796536, 'samples': 2547264, 'steps': 13266, 'loss/train': 0.43097519874572754} 01/27/2022 08:19:49 - INFO - codeparrot_training - Step 13267: {'lr': 0.0004350508646356152, 'samples': 2547456, 'steps': 13267, 'loss/train': 0.9549370408058167} 01/27/2022 08:19:53 - INFO - codeparrot_training - Step 13268: {'lr': 0.00043503986239887765, 'samples': 2547648, 'steps': 13268, 'loss/train': 0.43865302205085754} 01/27/2022 08:19:58 - INFO - codeparrot_training - Step 13269: {'lr': 0.0004350288593694881, 'samples': 2547840, 'steps': 13269, 'loss/train': 0.9087775647640228} 01/27/2022 08:20:01 - INFO - codeparrot_training - Step 13270: {'lr': 0.00043501785554749363, 'samples': 2548032, 'steps': 13270, 'loss/train': 1.1922151744365692} 01/27/2022 08:20:04 - INFO - codeparrot_training - Step 13271: {'lr': 0.00043500685093294145, 'samples': 2548224, 'steps': 13271, 'loss/train': 0.8249112367630005} 01/27/2022 08:20:07 - INFO - codeparrot_training - Step 13272: {'lr': 0.0004349958455258786, 'samples': 2548416, 'steps': 13272, 'loss/train': 0.5961361974477768} 01/27/2022 08:20:10 - INFO - codeparrot_training - Step 13273: {'lr': 0.00043498483932635237, 'samples': 2548608, 'steps': 13273, 'loss/train': 0.7493188083171844} 01/27/2022 08:20:13 - INFO - codeparrot_training - Step 13274: {'lr': 0.0004349738323344098, 'samples': 2548800, 'steps': 13274, 'loss/train': 0.08533383719623089} 01/27/2022 08:20:16 - INFO - codeparrot_training - Step 13275: {'lr': 0.00043496282455009807, 'samples': 2548992, 'steps': 13275, 'loss/train': 0.9994264841079712} 01/27/2022 08:20:20 - INFO - codeparrot_training - Step 13276: {'lr': 0.00043495181597346435, 'samples': 2549184, 'steps': 13276, 'loss/train': 0.8305179476737976} 01/27/2022 08:20:23 - INFO - codeparrot_training - Step 13277: {'lr': 0.0004349408066045557, 'samples': 2549376, 'steps': 13277, 'loss/train': 0.9971922039985657} 01/27/2022 08:20:28 - INFO - codeparrot_training - Step 13278: {'lr': 0.00043492979644341943, 'samples': 2549568, 'steps': 13278, 'loss/train': 1.3471448421478271} 01/27/2022 08:20:31 - INFO - codeparrot_training - Step 13279: {'lr': 0.0004349187854901026, 'samples': 2549760, 'steps': 13279, 'loss/train': 0.8862424492835999} 01/27/2022 08:20:35 - INFO - codeparrot_training - Step 13280: {'lr': 0.00043490777374465244, 'samples': 2549952, 'steps': 13280, 'loss/train': 0.49953603744506836} 01/27/2022 08:20:38 - INFO - codeparrot_training - Step 13281: {'lr': 0.0004348967612071161, 'samples': 2550144, 'steps': 13281, 'loss/train': 
1.129291981458664} 01/27/2022 08:20:41 - INFO - codeparrot_training - Step 13282: {'lr': 0.0004348857478775407, 'samples': 2550336, 'steps': 13282, 'loss/train': 0.887112557888031} 01/27/2022 08:20:44 - INFO - codeparrot_training - Step 13283: {'lr': 0.00043487473375597354, 'samples': 2550528, 'steps': 13283, 'loss/train': 2.3453872203826904} 01/27/2022 08:20:47 - INFO - codeparrot_training - Step 13284: {'lr': 0.00043486371884246164, 'samples': 2550720, 'steps': 13284, 'loss/train': 1.0177717208862305} 01/27/2022 08:20:50 - INFO - codeparrot_training - Step 13285: {'lr': 0.0004348527031370523, 'samples': 2550912, 'steps': 13285, 'loss/train': 0.8239574432373047} 01/27/2022 08:20:53 - INFO - codeparrot_training - Step 13286: {'lr': 0.00043484168663979265, 'samples': 2551104, 'steps': 13286, 'loss/train': 1.0885798931121826} 01/27/2022 08:20:58 - INFO - codeparrot_training - Step 13287: {'lr': 0.00043483066935073, 'samples': 2551296, 'steps': 13287, 'loss/train': 0.8876204788684845} 01/27/2022 08:21:01 - INFO - codeparrot_training - Step 13288: {'lr': 0.0004348196512699114, 'samples': 2551488, 'steps': 13288, 'loss/train': 1.7882881164550781} 01/27/2022 08:21:04 - INFO - codeparrot_training - Step 13289: {'lr': 0.00043480863239738404, 'samples': 2551680, 'steps': 13289, 'loss/train': 0.5932269841432571} 01/27/2022 08:21:07 - INFO - codeparrot_training - Step 13290: {'lr': 0.0004347976127331953, 'samples': 2551872, 'steps': 13290, 'loss/train': 0.7271009534597397} 01/27/2022 08:21:11 - INFO - codeparrot_training - Step 13291: {'lr': 0.00043478659227739216, 'samples': 2552064, 'steps': 13291, 'loss/train': 0.5455514341592789} 01/27/2022 08:21:14 - INFO - codeparrot_training - Step 13292: {'lr': 0.00043477557103002197, 'samples': 2552256, 'steps': 13292, 'loss/train': 1.0721368789672852} 01/27/2022 08:21:17 - INFO - codeparrot_training - Step 13293: {'lr': 0.00043476454899113193, 'samples': 2552448, 'steps': 13293, 'loss/train': 0.9436391294002533} 01/27/2022 08:21:20 - INFO - codeparrot_training - Step 13294: {'lr': 0.00043475352616076927, 'samples': 2552640, 'steps': 13294, 'loss/train': 0.826788604259491} 01/27/2022 08:21:26 - INFO - codeparrot_training - Step 13295: {'lr': 0.0004347425025389811, 'samples': 2552832, 'steps': 13295, 'loss/train': 0.9050884544849396} 01/27/2022 08:21:29 - INFO - codeparrot_training - Step 13296: {'lr': 0.0004347314781258147, 'samples': 2553024, 'steps': 13296, 'loss/train': 0.6079207509756088} 01/27/2022 08:21:32 - INFO - codeparrot_training - Step 13297: {'lr': 0.00043472045292131735, 'samples': 2553216, 'steps': 13297, 'loss/train': 0.8054307997226715} 01/27/2022 08:21:36 - INFO - codeparrot_training - Step 13298: {'lr': 0.0004347094269255362, 'samples': 2553408, 'steps': 13298, 'loss/train': 0.7941085696220398} 01/27/2022 08:21:39 - INFO - codeparrot_training - Step 13299: {'lr': 0.0004346984001385186, 'samples': 2553600, 'steps': 13299, 'loss/train': 1.0414783358573914} 01/27/2022 08:21:42 - INFO - codeparrot_training - Step 13300: {'lr': 0.00043468737256031155, 'samples': 2553792, 'steps': 13300, 'loss/train': 0.24969615787267685} 01/27/2022 08:21:45 - INFO - codeparrot_training - Step 13301: {'lr': 0.00043467634419096257, 'samples': 2553984, 'steps': 13301, 'loss/train': 0.8837157189846039} 01/27/2022 08:21:48 - INFO - codeparrot_training - Step 13302: {'lr': 0.00043466531503051875, 'samples': 2554176, 'steps': 13302, 'loss/train': 0.90262171626091} 01/27/2022 08:21:51 - INFO - codeparrot_training - Step 13303: {'lr': 0.0004346542850790273, 'samples': 
2554368, 'steps': 13303, 'loss/train': 0.6830783039331436} 01/27/2022 08:21:54 - INFO - codeparrot_training - Step 13304: {'lr': 0.00043464325433653563, 'samples': 2554560, 'steps': 13304, 'loss/train': 0.2969953268766403} 01/27/2022 08:21:59 - INFO - codeparrot_training - Step 13305: {'lr': 0.00043463222280309076, 'samples': 2554752, 'steps': 13305, 'loss/train': 0.5742794573307037} 01/27/2022 08:22:02 - INFO - codeparrot_training - Step 13306: {'lr': 0.00043462119047874015, 'samples': 2554944, 'steps': 13306, 'loss/train': 0.8306904137134552} 01/27/2022 08:22:06 - INFO - codeparrot_training - Step 13307: {'lr': 0.000434610157363531, 'samples': 2555136, 'steps': 13307, 'loss/train': 1.2596251666545868} 01/27/2022 08:22:09 - INFO - codeparrot_training - Step 13308: {'lr': 0.0004345991234575105, 'samples': 2555328, 'steps': 13308, 'loss/train': 0.746537446975708} 01/27/2022 08:22:12 - INFO - codeparrot_training - Step 13309: {'lr': 0.00043458808876072595, 'samples': 2555520, 'steps': 13309, 'loss/train': 0.9107934236526489} 01/27/2022 08:22:15 - INFO - codeparrot_training - Step 13310: {'lr': 0.0004345770532732247, 'samples': 2555712, 'steps': 13310, 'loss/train': 0.44285546243190765} 01/27/2022 08:22:18 - INFO - codeparrot_training - Step 13311: {'lr': 0.00043456601699505407, 'samples': 2555904, 'steps': 13311, 'loss/train': 0.13885710015892982} 01/27/2022 08:22:21 - INFO - codeparrot_training - Step 13312: {'lr': 0.00043455497992626104, 'samples': 2556096, 'steps': 13312, 'loss/train': 0.9814010560512543} 01/27/2022 08:22:26 - INFO - codeparrot_training - Step 13313: {'lr': 0.0004345439420668932, 'samples': 2556288, 'steps': 13313, 'loss/train': 1.0480073690414429} 01/27/2022 08:22:29 - INFO - codeparrot_training - Step 13314: {'lr': 0.0004345329034169977, 'samples': 2556480, 'steps': 13314, 'loss/train': 0.5219889432191849} 01/27/2022 08:22:32 - INFO - codeparrot_training - Step 13315: {'lr': 0.00043452186397662174, 'samples': 2556672, 'steps': 13315, 'loss/train': 0.5491813123226166} 01/27/2022 08:22:35 - INFO - codeparrot_training - Step 13316: {'lr': 0.0004345108237458128, 'samples': 2556864, 'steps': 13316, 'loss/train': 0.927384227514267} 01/27/2022 08:22:38 - INFO - codeparrot_training - Step 13317: {'lr': 0.00043449978272461806, 'samples': 2557056, 'steps': 13317, 'loss/train': 0.5081390887498856} 01/27/2022 08:22:41 - INFO - codeparrot_training - Step 13318: {'lr': 0.0004344887409130848, 'samples': 2557248, 'steps': 13318, 'loss/train': 0.9289917647838593} 01/27/2022 08:22:45 - INFO - codeparrot_training - Step 13319: {'lr': 0.0004344776983112604, 'samples': 2557440, 'steps': 13319, 'loss/train': 1.3546821177005768} 01/27/2022 08:22:48 - INFO - codeparrot_training - Step 13320: {'lr': 0.0004344666549191921, 'samples': 2557632, 'steps': 13320, 'loss/train': 0.691285103559494} 01/27/2022 08:22:51 - INFO - codeparrot_training - Step 13321: {'lr': 0.0004344556107369272, 'samples': 2557824, 'steps': 13321, 'loss/train': 0.9168404638767242} 01/27/2022 08:22:57 - INFO - codeparrot_training - Step 13322: {'lr': 0.00043444456576451307, 'samples': 2558016, 'steps': 13322, 'loss/train': 0.5893308073282242} 01/27/2022 08:23:00 - INFO - codeparrot_training - Step 13323: {'lr': 0.000434433520001997, 'samples': 2558208, 'steps': 13323, 'loss/train': 0.4387550354003906} 01/27/2022 08:23:03 - INFO - codeparrot_training - Step 13324: {'lr': 0.0004344224734494263, 'samples': 2558400, 'steps': 13324, 'loss/train': 0.9116765856742859} 01/27/2022 08:23:07 - INFO - codeparrot_training - Step 13325: 
{'lr': 0.00043441142610684826, 'samples': 2558592, 'steps': 13325, 'loss/train': 1.2221163511276245} 01/27/2022 08:23:10 - INFO - codeparrot_training - Step 13326: {'lr': 0.0004344003779743102, 'samples': 2558784, 'steps': 13326, 'loss/train': 0.6675280630588531} 01/27/2022 08:23:13 - INFO - codeparrot_training - Step 13327: {'lr': 0.0004343893290518595, 'samples': 2558976, 'steps': 13327, 'loss/train': 0.12220119684934616} 01/27/2022 08:23:16 - INFO - codeparrot_training - Step 13328: {'lr': 0.0004343782793395435, 'samples': 2559168, 'steps': 13328, 'loss/train': 0.8046923875808716} 01/27/2022 08:23:19 - INFO - codeparrot_training - Step 13329: {'lr': 0.00043436722883740943, 'samples': 2559360, 'steps': 13329, 'loss/train': 0.16136103495955467} 01/27/2022 08:23:22 - INFO - codeparrot_training - Step 13330: {'lr': 0.0004343561775455047, 'samples': 2559552, 'steps': 13330, 'loss/train': 0.7560321986675262} 01/27/2022 08:23:27 - INFO - codeparrot_training - Step 13331: {'lr': 0.00043434512546387674, 'samples': 2559744, 'steps': 13331, 'loss/train': 0.5529328733682632} 01/27/2022 08:23:30 - INFO - codeparrot_training - Step 13332: {'lr': 0.0004343340725925727, 'samples': 2559936, 'steps': 13332, 'loss/train': 1.1866036355495453} 01/27/2022 08:23:33 - INFO - codeparrot_training - Step 13333: {'lr': 0.0004343230189316401, 'samples': 2560128, 'steps': 13333, 'loss/train': 0.951781153678894} 01/27/2022 08:23:36 - INFO - codeparrot_training - Step 13334: {'lr': 0.00043431196448112615, 'samples': 2560320, 'steps': 13334, 'loss/train': 0.9056089818477631} 01/27/2022 08:23:39 - INFO - codeparrot_training - Step 13335: {'lr': 0.0004343009092410783, 'samples': 2560512, 'steps': 13335, 'loss/train': 0.8772481977939606} 01/27/2022 08:23:43 - INFO - codeparrot_training - Step 13336: {'lr': 0.0004342898532115439, 'samples': 2560704, 'steps': 13336, 'loss/train': 0.7741691172122955} 01/27/2022 08:23:46 - INFO - codeparrot_training - Step 13337: {'lr': 0.00043427879639257024, 'samples': 2560896, 'steps': 13337, 'loss/train': 0.9122647941112518} 01/27/2022 08:23:49 - INFO - codeparrot_training - Step 13338: {'lr': 0.0004342677387842048, 'samples': 2561088, 'steps': 13338, 'loss/train': 1.0105890333652496} 01/27/2022 08:23:52 - INFO - codeparrot_training - Step 13339: {'lr': 0.0004342566803864948, 'samples': 2561280, 'steps': 13339, 'loss/train': 0.5800352543592453} 01/27/2022 08:23:56 - INFO - codeparrot_training - Step 13340: {'lr': 0.0004342456211994877, 'samples': 2561472, 'steps': 13340, 'loss/train': 1.1233566105365753} 01/27/2022 08:24:00 - INFO - codeparrot_training - Step 13341: {'lr': 0.0004342345612232309, 'samples': 2561664, 'steps': 13341, 'loss/train': 0.8626880943775177} 01/27/2022 08:24:03 - INFO - codeparrot_training - Step 13342: {'lr': 0.0004342235004577717, 'samples': 2561856, 'steps': 13342, 'loss/train': 0.9071507155895233} 01/27/2022 08:24:06 - INFO - codeparrot_training - Step 13343: {'lr': 0.00043421243890315753, 'samples': 2562048, 'steps': 13343, 'loss/train': 0.9515069425106049} 01/27/2022 08:24:09 - INFO - codeparrot_training - Step 13344: {'lr': 0.0004342013765594358, 'samples': 2562240, 'steps': 13344, 'loss/train': 0.9307571947574615} 01/27/2022 08:24:12 - INFO - codeparrot_training - Step 13345: {'lr': 0.0004341903134266538, 'samples': 2562432, 'steps': 13345, 'loss/train': 0.7901261150836945} 01/27/2022 08:24:15 - INFO - codeparrot_training - Step 13346: {'lr': 0.0004341792495048591, 'samples': 2562624, 'steps': 13346, 'loss/train': 0.8682673573493958} 01/27/2022 08:24:18 - 
INFO - codeparrot_training - Step 13347: {'lr': 0.00043416818479409894, 'samples': 2562816, 'steps': 13347, 'loss/train': 1.078797698020935} 01/27/2022 08:24:22 - INFO - codeparrot_training - Step 13348: {'lr': 0.0004341571192944207, 'samples': 2563008, 'steps': 13348, 'loss/train': 0.9804964363574982} 01/27/2022 08:24:26 - INFO - codeparrot_training - Step 13349: {'lr': 0.00043414605300587183, 'samples': 2563200, 'steps': 13349, 'loss/train': 0.411090686917305} 01/27/2022 08:24:30 - INFO - codeparrot_training - Step 13350: {'lr': 0.0004341349859284998, 'samples': 2563392, 'steps': 13350, 'loss/train': 0.7063817828893661} 01/27/2022 08:24:33 - INFO - codeparrot_training - Step 13351: {'lr': 0.0004341239180623519, 'samples': 2563584, 'steps': 13351, 'loss/train': 0.5311427861452103} 01/27/2022 08:24:36 - INFO - codeparrot_training - Step 13352: {'lr': 0.0004341128494074756, 'samples': 2563776, 'steps': 13352, 'loss/train': 2.47433602809906} 01/27/2022 08:24:39 - INFO - codeparrot_training - Step 13353: {'lr': 0.00043410177996391837, 'samples': 2563968, 'steps': 13353, 'loss/train': 0.9253462851047516} 01/27/2022 08:24:42 - INFO - codeparrot_training - Step 13354: {'lr': 0.00043409070973172753, 'samples': 2564160, 'steps': 13354, 'loss/train': 1.012000948190689} 01/27/2022 08:24:45 - INFO - codeparrot_training - Step 13355: {'lr': 0.0004340796387109506, 'samples': 2564352, 'steps': 13355, 'loss/train': 0.22713768482208252} 01/27/2022 08:24:48 - INFO - codeparrot_training - Step 13356: {'lr': 0.00043406856690163487, 'samples': 2564544, 'steps': 13356, 'loss/train': 0.7253155410289764} 01/27/2022 08:24:55 - INFO - codeparrot_training - Step 13357: {'lr': 0.0004340574943038279, 'samples': 2564736, 'steps': 13357, 'loss/train': 0.937874972820282} 01/27/2022 08:24:58 - INFO - codeparrot_training - Step 13358: {'lr': 0.00043404642091757705, 'samples': 2564928, 'steps': 13358, 'loss/train': 0.6299993544816971} 01/27/2022 08:25:01 - INFO - codeparrot_training - Step 13359: {'lr': 0.0004340353467429299, 'samples': 2565120, 'steps': 13359, 'loss/train': 1.454814612865448} 01/27/2022 08:25:04 - INFO - codeparrot_training - Step 13360: {'lr': 0.00043402427177993366, 'samples': 2565312, 'steps': 13360, 'loss/train': 0.772456705570221} 01/27/2022 08:25:07 - INFO - codeparrot_training - Step 13361: {'lr': 0.00043401319602863584, 'samples': 2565504, 'steps': 13361, 'loss/train': 0.9259617626667023} 01/27/2022 08:25:11 - INFO - codeparrot_training - Step 13362: {'lr': 0.0004340021194890839, 'samples': 2565696, 'steps': 13362, 'loss/train': 0.7496244460344315} 01/27/2022 08:25:14 - INFO - codeparrot_training - Step 13363: {'lr': 0.0004339910421613253, 'samples': 2565888, 'steps': 13363, 'loss/train': 1.1277878880500793} 01/27/2022 08:25:17 - INFO - codeparrot_training - Step 13364: {'lr': 0.0004339799640454076, 'samples': 2566080, 'steps': 13364, 'loss/train': 0.964038223028183} 01/27/2022 08:25:20 - INFO - codeparrot_training - Step 13365: {'lr': 0.0004339688851413781, 'samples': 2566272, 'steps': 13365, 'loss/train': 1.219385951757431} 01/27/2022 08:25:25 - INFO - codeparrot_training - Step 13366: {'lr': 0.0004339578054492843, 'samples': 2566464, 'steps': 13366, 'loss/train': 0.48900890350341797} 01/27/2022 08:25:28 - INFO - codeparrot_training - Step 13367: {'lr': 0.0004339467249691737, 'samples': 2566656, 'steps': 13367, 'loss/train': 1.2177509665489197} 01/27/2022 08:25:31 - INFO - codeparrot_training - Step 13368: {'lr': 0.0004339356437010937, 'samples': 2566848, 'steps': 13368, 'loss/train': 
0.9269033074378967} 01/27/2022 08:25:34 - INFO - codeparrot_training - Step 13369: {'lr': 0.00043392456164509185, 'samples': 2567040, 'steps': 13369, 'loss/train': 0.8524897992610931} 01/27/2022 08:25:37 - INFO - codeparrot_training - Step 13370: {'lr': 0.00043391347880121554, 'samples': 2567232, 'steps': 13370, 'loss/train': 0.35085754841566086} 01/27/2022 08:25:40 - INFO - codeparrot_training - Step 13371: {'lr': 0.00043390239516951235, 'samples': 2567424, 'steps': 13371, 'loss/train': 1.1150116324424744} 01/27/2022 08:25:43 - INFO - codeparrot_training - Step 13372: {'lr': 0.0004338913107500297, 'samples': 2567616, 'steps': 13372, 'loss/train': 1.0758156180381775} 01/27/2022 08:25:47 - INFO - codeparrot_training - Step 13373: {'lr': 0.00043388022554281504, 'samples': 2567808, 'steps': 13373, 'loss/train': 0.3820732533931732} 01/27/2022 08:25:50 - INFO - codeparrot_training - Step 13374: {'lr': 0.00043386913954791584, 'samples': 2568000, 'steps': 13374, 'loss/train': 0.8679307401180267} 01/27/2022 08:25:54 - INFO - codeparrot_training - Step 13375: {'lr': 0.0004338580527653797, 'samples': 2568192, 'steps': 13375, 'loss/train': 1.0145237445831299} 01/27/2022 08:25:57 - INFO - codeparrot_training - Step 13376: {'lr': 0.000433846965195254, 'samples': 2568384, 'steps': 13376, 'loss/train': 0.31817270815372467} 01/27/2022 08:26:01 - INFO - codeparrot_training - Step 13377: {'lr': 0.0004338358768375863, 'samples': 2568576, 'steps': 13377, 'loss/train': 0.47401660680770874} 01/27/2022 08:26:04 - INFO - codeparrot_training - Step 13378: {'lr': 0.000433824787692424, 'samples': 2568768, 'steps': 13378, 'loss/train': 0.6073286980390549} 01/27/2022 08:26:07 - INFO - codeparrot_training - Step 13379: {'lr': 0.0004338136977598148, 'samples': 2568960, 'steps': 13379, 'loss/train': 0.9305051565170288} 01/27/2022 08:26:10 - INFO - codeparrot_training - Step 13380: {'lr': 0.000433802607039806, 'samples': 2569152, 'steps': 13380, 'loss/train': 1.9165086150169373} 01/27/2022 08:26:13 - INFO - codeparrot_training - Step 13381: {'lr': 0.00043379151553244523, 'samples': 2569344, 'steps': 13381, 'loss/train': 0.5903118252754211} 01/27/2022 08:26:16 - INFO - codeparrot_training - Step 13382: {'lr': 0.00043378042323778, 'samples': 2569536, 'steps': 13382, 'loss/train': 0.7232031226158142} 01/27/2022 08:26:22 - INFO - codeparrot_training - Step 13383: {'lr': 0.00043376933015585776, 'samples': 2569728, 'steps': 13383, 'loss/train': 0.07329780422151089} 01/27/2022 08:26:26 - INFO - codeparrot_training - Step 13384: {'lr': 0.000433758236286726, 'samples': 2569920, 'steps': 13384, 'loss/train': 0.6634549051523209} 01/27/2022 08:26:29 - INFO - codeparrot_training - Step 13385: {'lr': 0.0004337471416304324, 'samples': 2570112, 'steps': 13385, 'loss/train': 0.9936537444591522} 01/27/2022 08:26:32 - INFO - codeparrot_training - Step 13386: {'lr': 0.00043373604618702436, 'samples': 2570304, 'steps': 13386, 'loss/train': 0.6775400340557098} 01/27/2022 08:26:35 - INFO - codeparrot_training - Step 13387: {'lr': 0.00043372494995654943, 'samples': 2570496, 'steps': 13387, 'loss/train': 1.1460575759410858} 01/27/2022 08:26:38 - INFO - codeparrot_training - Step 13388: {'lr': 0.00043371385293905517, 'samples': 2570688, 'steps': 13388, 'loss/train': 0.4966094344854355} 01/27/2022 08:26:41 - INFO - codeparrot_training - Step 13389: {'lr': 0.0004337027551345891, 'samples': 2570880, 'steps': 13389, 'loss/train': 0.7207109034061432} 01/27/2022 08:26:44 - INFO - codeparrot_training - Step 13390: {'lr': 0.0004336916565431987, 'samples': 
2571072, 'steps': 13390, 'loss/train': 0.6481411457061768} 01/27/2022 08:26:48 - INFO - codeparrot_training - Step 13391: {'lr': 0.0004336805571649316, 'samples': 2571264, 'steps': 13391, 'loss/train': 1.1779546737670898} 01/27/2022 08:26:52 - INFO - codeparrot_training - Step 13392: {'lr': 0.0004336694569998354, 'samples': 2571456, 'steps': 13392, 'loss/train': 1.2135407030582428} 01/27/2022 08:26:55 - INFO - codeparrot_training - Step 13393: {'lr': 0.00043365835604795746, 'samples': 2571648, 'steps': 13393, 'loss/train': 0.8368164896965027} 01/27/2022 08:26:58 - INFO - codeparrot_training - Step 13394: {'lr': 0.0004336472543093455, 'samples': 2571840, 'steps': 13394, 'loss/train': 0.8134753704071045} 01/27/2022 08:27:01 - INFO - codeparrot_training - Step 13395: {'lr': 0.000433636151784047, 'samples': 2572032, 'steps': 13395, 'loss/train': 0.5447030961513519} 01/27/2022 08:27:04 - INFO - codeparrot_training - Step 13396: {'lr': 0.00043362504847210956, 'samples': 2572224, 'steps': 13396, 'loss/train': 0.6776342689990997} 01/27/2022 08:27:08 - INFO - codeparrot_training - Step 13397: {'lr': 0.0004336139443735807, 'samples': 2572416, 'steps': 13397, 'loss/train': 1.0390665829181671} 01/27/2022 08:27:11 - INFO - codeparrot_training - Step 13398: {'lr': 0.000433602839488508, 'samples': 2572608, 'steps': 13398, 'loss/train': 0.4664747267961502} 01/27/2022 08:27:14 - INFO - codeparrot_training - Step 13399: {'lr': 0.00043359173381693906, 'samples': 2572800, 'steps': 13399, 'loss/train': 0.7915416955947876} 01/27/2022 08:27:17 - INFO - codeparrot_training - Step 13400: {'lr': 0.0004335806273589214, 'samples': 2572992, 'steps': 13400, 'loss/train': 0.8218657672405243} 01/27/2022 08:27:23 - INFO - codeparrot_training - Step 13401: {'lr': 0.00043356952011450265, 'samples': 2573184, 'steps': 13401, 'loss/train': 5.585807919502258} 01/27/2022 08:27:26 - INFO - codeparrot_training - Step 13402: {'lr': 0.0004335584120837304, 'samples': 2573376, 'steps': 13402, 'loss/train': 0.7237119823694229} 01/27/2022 08:27:30 - INFO - codeparrot_training - Step 13403: {'lr': 0.0004335473032666521, 'samples': 2573568, 'steps': 13403, 'loss/train': 1.1930688321590424} 01/27/2022 08:27:33 - INFO - codeparrot_training - Step 13404: {'lr': 0.00043353619366331546, 'samples': 2573760, 'steps': 13404, 'loss/train': 0.44799479842185974} 01/27/2022 08:27:36 - INFO - codeparrot_training - Step 13405: {'lr': 0.0004335250832737681, 'samples': 2573952, 'steps': 13405, 'loss/train': 1.4083768129348755} 01/27/2022 08:27:39 - INFO - codeparrot_training - Step 13406: {'lr': 0.00043351397209805755, 'samples': 2574144, 'steps': 13406, 'loss/train': 0.856287807226181} 01/27/2022 08:27:42 - INFO - codeparrot_training - Step 13407: {'lr': 0.0004335028601362314, 'samples': 2574336, 'steps': 13407, 'loss/train': 1.222307413816452} 01/27/2022 08:27:45 - INFO - codeparrot_training - Step 13408: {'lr': 0.0004334917473883373, 'samples': 2574528, 'steps': 13408, 'loss/train': 1.0261097252368927} 01/27/2022 08:27:48 - INFO - codeparrot_training - Step 13409: {'lr': 0.0004334806338544227, 'samples': 2574720, 'steps': 13409, 'loss/train': 0.8607193529605865} 01/27/2022 08:27:53 - INFO - codeparrot_training - Step 13410: {'lr': 0.0004334695195345355, 'samples': 2574912, 'steps': 13410, 'loss/train': 0.9731580018997192} 01/27/2022 08:27:56 - INFO - codeparrot_training - Step 13411: {'lr': 0.000433458404428723, 'samples': 2575104, 'steps': 13411, 'loss/train': 0.9278806149959564} 01/27/2022 08:28:00 - INFO - codeparrot_training - Step 13412: {'lr': 
0.00043344728853703297, 'samples': 2575296, 'steps': 13412, 'loss/train': 0.8826992511749268} 01/27/2022 08:28:03 - INFO - codeparrot_training - Step 13413: {'lr': 0.00043343617185951305, 'samples': 2575488, 'steps': 13413, 'loss/train': 0.3994361311197281} 01/27/2022 08:28:06 - INFO - codeparrot_training - Step 13414: {'lr': 0.0004334250543962108, 'samples': 2575680, 'steps': 13414, 'loss/train': 0.644516721367836} 01/27/2022 08:28:09 - INFO - codeparrot_training - Step 13415: {'lr': 0.00043341393614717384, 'samples': 2575872, 'steps': 13415, 'loss/train': 1.4293345212936401} 01/27/2022 08:28:12 - INFO - codeparrot_training - Step 13416: {'lr': 0.0004334028171124499, 'samples': 2576064, 'steps': 13416, 'loss/train': 0.6550723314285278} 01/27/2022 08:28:15 - INFO - codeparrot_training - Step 13417: {'lr': 0.0004333916972920864, 'samples': 2576256, 'steps': 13417, 'loss/train': 0.3883902579545975} 01/27/2022 08:28:18 - INFO - codeparrot_training - Step 13418: {'lr': 0.00043338057668613117, 'samples': 2576448, 'steps': 13418, 'loss/train': 0.09846251830458641} 01/27/2022 08:28:23 - INFO - codeparrot_training - Step 13419: {'lr': 0.00043336945529463177, 'samples': 2576640, 'steps': 13419, 'loss/train': 1.1462642848491669} 01/27/2022 08:28:26 - INFO - codeparrot_training - Step 13420: {'lr': 0.00043335833311763597, 'samples': 2576832, 'steps': 13420, 'loss/train': 0.573901891708374} 01/27/2022 08:28:29 - INFO - codeparrot_training - Step 13421: {'lr': 0.00043334721015519115, 'samples': 2577024, 'steps': 13421, 'loss/train': 0.6164305955171585} 01/27/2022 08:28:32 - INFO - codeparrot_training - Step 13422: {'lr': 0.00043333608640734513, 'samples': 2577216, 'steps': 13422, 'loss/train': 0.5170375406742096} 01/27/2022 08:28:35 - INFO - codeparrot_training - Step 13423: {'lr': 0.0004333249618741455, 'samples': 2577408, 'steps': 13423, 'loss/train': 0.9282579123973846} 01/27/2022 08:28:38 - INFO - codeparrot_training - Step 13424: {'lr': 0.00043331383655564003, 'samples': 2577600, 'steps': 13424, 'loss/train': 1.5456726551055908} 01/27/2022 08:28:42 - INFO - codeparrot_training - Step 13425: {'lr': 0.0004333027104518762, 'samples': 2577792, 'steps': 13425, 'loss/train': 0.8879064917564392} 01/27/2022 08:28:45 - INFO - codeparrot_training - Step 13426: {'lr': 0.00043329158356290187, 'samples': 2577984, 'steps': 13426, 'loss/train': 0.8455747961997986} 01/27/2022 08:28:48 - INFO - codeparrot_training - Step 13427: {'lr': 0.00043328045588876454, 'samples': 2578176, 'steps': 13427, 'loss/train': 0.9001378118991852} 01/27/2022 08:28:54 - INFO - codeparrot_training - Step 13428: {'lr': 0.0004332693274295119, 'samples': 2578368, 'steps': 13428, 'loss/train': 0.7359036952257156} 01/27/2022 08:28:57 - INFO - codeparrot_training - Step 13429: {'lr': 0.0004332581981851917, 'samples': 2578560, 'steps': 13429, 'loss/train': 0.7872476577758789} 01/27/2022 08:29:00 - INFO - codeparrot_training - Step 13430: {'lr': 0.00043324706815585156, 'samples': 2578752, 'steps': 13430, 'loss/train': 0.9116925895214081} 01/27/2022 08:29:03 - INFO - codeparrot_training - Step 13431: {'lr': 0.00043323593734153915, 'samples': 2578944, 'steps': 13431, 'loss/train': 1.170777440071106} 01/27/2022 08:29:06 - INFO - codeparrot_training - Step 13432: {'lr': 0.00043322480574230215, 'samples': 2579136, 'steps': 13432, 'loss/train': 0.12852003425359726} 01/27/2022 08:29:09 - INFO - codeparrot_training - Step 13433: {'lr': 0.00043321367335818833, 'samples': 2579328, 'steps': 13433, 'loss/train': 0.758859246969223} 01/27/2022 08:29:12 - 
INFO - codeparrot_training - Step 13434: {'lr': 0.0004332025401892453, 'samples': 2579520, 'steps': 13434, 'loss/train': 1.016704648733139} 01/27/2022 08:29:15 - INFO - codeparrot_training - Step 13435: {'lr': 0.00043319140623552073, 'samples': 2579712, 'steps': 13435, 'loss/train': 1.0406054556369781} 01/27/2022 08:29:20 - INFO - codeparrot_training - Step 13436: {'lr': 0.0004331802714970624, 'samples': 2579904, 'steps': 13436, 'loss/train': 0.9353068470954895} 01/27/2022 08:29:23 - INFO - codeparrot_training - Step 13437: {'lr': 0.00043316913597391785, 'samples': 2580096, 'steps': 13437, 'loss/train': 0.7450995147228241} 01/27/2022 08:29:26 - INFO - codeparrot_training - Step 13438: {'lr': 0.00043315799966613496, 'samples': 2580288, 'steps': 13438, 'loss/train': 0.7029251307249069} 01/27/2022 08:29:29 - INFO - codeparrot_training - Step 13439: {'lr': 0.00043314686257376136, 'samples': 2580480, 'steps': 13439, 'loss/train': 0.6691693067550659} 01/27/2022 08:29:32 - INFO - codeparrot_training - Step 13440: {'lr': 0.0004331357246968447, 'samples': 2580672, 'steps': 13440, 'loss/train': 0.8281074464321136} 01/27/2022 08:29:36 - INFO - codeparrot_training - Step 13441: {'lr': 0.0004331245860354328, 'samples': 2580864, 'steps': 13441, 'loss/train': 0.6418368369340897} 01/27/2022 08:29:39 - INFO - codeparrot_training - Step 13442: {'lr': 0.0004331134465895733, 'samples': 2581056, 'steps': 13442, 'loss/train': 0.949256032705307} 01/27/2022 08:29:42 - INFO - codeparrot_training - Step 13443: {'lr': 0.00043310230635931394, 'samples': 2581248, 'steps': 13443, 'loss/train': 1.2187424898147583} 01/27/2022 08:29:45 - INFO - codeparrot_training - Step 13444: {'lr': 0.0004330911653447024, 'samples': 2581440, 'steps': 13444, 'loss/train': 0.7907719910144806} 01/27/2022 08:29:49 - INFO - codeparrot_training - Step 13445: {'lr': 0.0004330800235457866, 'samples': 2581632, 'steps': 13445, 'loss/train': 0.29711712151765823} 01/27/2022 08:29:53 - INFO - codeparrot_training - Step 13446: {'lr': 0.00043306888096261394, 'samples': 2581824, 'steps': 13446, 'loss/train': 0.7768667042255402} 01/27/2022 08:29:56 - INFO - codeparrot_training - Step 13447: {'lr': 0.0004330577375952324, 'samples': 2582016, 'steps': 13447, 'loss/train': 1.5061240196228027} 01/27/2022 08:29:59 - INFO - codeparrot_training - Step 13448: {'lr': 0.0004330465934436896, 'samples': 2582208, 'steps': 13448, 'loss/train': 0.6087931841611862} 01/27/2022 08:30:02 - INFO - codeparrot_training - Step 13449: {'lr': 0.0004330354485080334, 'samples': 2582400, 'steps': 13449, 'loss/train': 0.785700112581253} 01/27/2022 08:30:05 - INFO - codeparrot_training - Step 13450: {'lr': 0.0004330243027883114, 'samples': 2582592, 'steps': 13450, 'loss/train': 0.12857864052057266} 01/27/2022 08:30:08 - INFO - codeparrot_training - Step 13451: {'lr': 0.0004330131562845714, 'samples': 2582784, 'steps': 13451, 'loss/train': 1.2239906787872314} 01/27/2022 08:30:11 - INFO - codeparrot_training - Step 13452: {'lr': 0.00043300200899686113, 'samples': 2582976, 'steps': 13452, 'loss/train': 0.9462788701057434} 01/27/2022 08:30:15 - INFO - codeparrot_training - Step 13453: {'lr': 0.0004329908609252284, 'samples': 2583168, 'steps': 13453, 'loss/train': 0.9373716115951538} 01/27/2022 08:30:19 - INFO - codeparrot_training - Step 13454: {'lr': 0.00043297971206972095, 'samples': 2583360, 'steps': 13454, 'loss/train': 0.5596383064985275} 01/27/2022 08:30:22 - INFO - codeparrot_training - Step 13455: {'lr': 0.0004329685624303865, 'samples': 2583552, 'steps': 13455, 'loss/train': 
0.9799975454807281} 01/27/2022 08:30:25 - INFO - codeparrot_training - Step 13456: {'lr': 0.0004329574120072728, 'samples': 2583744, 'steps': 13456, 'loss/train': 0.6820051968097687} 01/27/2022 08:30:28 - INFO - codeparrot_training - Step 13457: {'lr': 0.00043294626080042767, 'samples': 2583936, 'steps': 13457, 'loss/train': 0.5057934522628784} 01/27/2022 08:30:32 - INFO - codeparrot_training - Step 13458: {'lr': 0.0004329351088098988, 'samples': 2584128, 'steps': 13458, 'loss/train': 0.5966451913118362} 01/27/2022 08:30:35 - INFO - codeparrot_training - Step 13459: {'lr': 0.0004329239560357341, 'samples': 2584320, 'steps': 13459, 'loss/train': 1.0476052165031433} 01/27/2022 08:30:38 - INFO - codeparrot_training - Step 13460: {'lr': 0.0004329128024779812, 'samples': 2584512, 'steps': 13460, 'loss/train': 0.6944953501224518} 01/27/2022 08:30:41 - INFO - codeparrot_training - Step 13461: {'lr': 0.00043290164813668795, 'samples': 2584704, 'steps': 13461, 'loss/train': 0.7565641701221466} 01/27/2022 08:30:47 - INFO - codeparrot_training - Step 13462: {'lr': 0.0004328904930119021, 'samples': 2584896, 'steps': 13462, 'loss/train': 1.146892637014389} 01/27/2022 08:30:50 - INFO - codeparrot_training - Step 13463: {'lr': 0.0004328793371036714, 'samples': 2585088, 'steps': 13463, 'loss/train': 0.42770250141620636} 01/27/2022 08:30:53 - INFO - codeparrot_training - Step 13464: {'lr': 0.0004328681804120438, 'samples': 2585280, 'steps': 13464, 'loss/train': 0.8118953704833984} 01/27/2022 08:30:57 - INFO - codeparrot_training - Step 13465: {'lr': 0.000432857022937067, 'samples': 2585472, 'steps': 13465, 'loss/train': 0.7309150546789169} 01/27/2022 08:31:00 - INFO - codeparrot_training - Step 13466: {'lr': 0.00043284586467878865, 'samples': 2585664, 'steps': 13466, 'loss/train': 0.5495769828557968} 01/27/2022 08:31:03 - INFO - codeparrot_training - Step 13467: {'lr': 0.0004328347056372568, 'samples': 2585856, 'steps': 13467, 'loss/train': 1.0297512710094452} 01/27/2022 08:31:06 - INFO - codeparrot_training - Step 13468: {'lr': 0.00043282354581251903, 'samples': 2586048, 'steps': 13468, 'loss/train': 0.7831433415412903} 01/27/2022 08:31:09 - INFO - codeparrot_training - Step 13469: {'lr': 0.0004328123852046233, 'samples': 2586240, 'steps': 13469, 'loss/train': 0.19134429842233658} 01/27/2022 08:31:12 - INFO - codeparrot_training - Step 13470: {'lr': 0.0004328012238136173, 'samples': 2586432, 'steps': 13470, 'loss/train': 0.6158613860607147} 01/27/2022 08:31:17 - INFO - codeparrot_training - Step 13471: {'lr': 0.000432790061639549, 'samples': 2586624, 'steps': 13471, 'loss/train': 0.7720659077167511} 01/27/2022 08:31:20 - INFO - codeparrot_training - Step 13472: {'lr': 0.00043277889868246605, 'samples': 2586816, 'steps': 13472, 'loss/train': 0.4875126779079437} 01/27/2022 08:31:23 - INFO - codeparrot_training - Step 13473: {'lr': 0.0004327677349424164, 'samples': 2587008, 'steps': 13473, 'loss/train': 0.9092941582202911} 01/27/2022 08:31:26 - INFO - codeparrot_training - Step 13474: {'lr': 0.0004327565704194477, 'samples': 2587200, 'steps': 13474, 'loss/train': 0.6722055673599243} 01/27/2022 08:31:29 - INFO - codeparrot_training - Step 13475: {'lr': 0.0004327454051136079, 'samples': 2587392, 'steps': 13475, 'loss/train': 0.8843047320842743} 01/27/2022 08:31:33 - INFO - codeparrot_training - Step 13476: {'lr': 0.0004327342390249449, 'samples': 2587584, 'steps': 13476, 'loss/train': 1.1586735248565674} 01/27/2022 08:31:36 - INFO - codeparrot_training - Step 13477: {'lr': 0.00043272307215350635, 'samples': 
2587776, 'steps': 13477, 'loss/train': 0.8907387256622314} 01/27/2022 08:31:39 - INFO - codeparrot_training - Step 13478: {'lr': 0.0004327119044993403, 'samples': 2587968, 'steps': 13478, 'loss/train': 1.1365923285484314} 01/27/2022 08:31:42 - INFO - codeparrot_training - Step 13479: {'lr': 0.0004327007360624944, 'samples': 2588160, 'steps': 13479, 'loss/train': 0.9677816033363342} 01/27/2022 08:31:48 - INFO - codeparrot_training - Step 13480: {'lr': 0.0004326895668430165, 'samples': 2588352, 'steps': 13480, 'loss/train': 0.4107922464609146} 01/27/2022 08:31:52 - INFO - codeparrot_training - Step 13481: {'lr': 0.0004326783968409546, 'samples': 2588544, 'steps': 13481, 'loss/train': 0.5684218257665634} 01/27/2022 08:31:55 - INFO - codeparrot_training - Step 13482: {'lr': 0.00043266722605635644, 'samples': 2588736, 'steps': 13482, 'loss/train': 0.508557915687561} 01/27/2022 08:31:58 - INFO - codeparrot_training - Step 13483: {'lr': 0.0004326560544892699, 'samples': 2588928, 'steps': 13483, 'loss/train': 0.9507810473442078} 01/27/2022 08:32:01 - INFO - codeparrot_training - Step 13484: {'lr': 0.00043264488213974275, 'samples': 2589120, 'steps': 13484, 'loss/train': 0.6734292358160019} 01/27/2022 08:32:04 - INFO - codeparrot_training - Step 13485: {'lr': 0.00043263370900782297, 'samples': 2589312, 'steps': 13485, 'loss/train': 0.926142543554306} 01/27/2022 08:32:07 - INFO - codeparrot_training - Step 13486: {'lr': 0.0004326225350935583, 'samples': 2589504, 'steps': 13486, 'loss/train': 1.0571189224720001} 01/27/2022 08:32:10 - INFO - codeparrot_training - Step 13487: {'lr': 0.00043261136039699676, 'samples': 2589696, 'steps': 13487, 'loss/train': 0.26910504698753357} 01/27/2022 08:32:15 - INFO - codeparrot_training - Step 13488: {'lr': 0.0004326001849181862, 'samples': 2589888, 'steps': 13488, 'loss/train': 0.5475893765687943} 01/27/2022 08:32:18 - INFO - codeparrot_training - Step 13489: {'lr': 0.0004325890086571743, 'samples': 2590080, 'steps': 13489, 'loss/train': 1.283080905675888} 01/27/2022 08:32:21 - INFO - codeparrot_training - Step 13490: {'lr': 0.00043257783161400917, 'samples': 2590272, 'steps': 13490, 'loss/train': 0.6089065968990326} 01/27/2022 08:32:24 - INFO - codeparrot_training - Step 13491: {'lr': 0.0004325666537887385, 'samples': 2590464, 'steps': 13491, 'loss/train': 0.9097738265991211} 01/27/2022 08:32:28 - INFO - codeparrot_training - Step 13492: {'lr': 0.00043255547518141033, 'samples': 2590656, 'steps': 13492, 'loss/train': 0.7477465867996216} 01/27/2022 08:32:31 - INFO - codeparrot_training - Step 13493: {'lr': 0.0004325442957920724, 'samples': 2590848, 'steps': 13493, 'loss/train': 0.7989383339881897} 01/27/2022 08:32:34 - INFO - codeparrot_training - Step 13494: {'lr': 0.0004325331156207727, 'samples': 2591040, 'steps': 13494, 'loss/train': 0.30632150173187256} 01/27/2022 08:32:37 - INFO - codeparrot_training - Step 13495: {'lr': 0.00043252193466755906, 'samples': 2591232, 'steps': 13495, 'loss/train': 0.5252599865198135} 01/27/2022 08:32:40 - INFO - codeparrot_training - Step 13496: {'lr': 0.0004325107529324795, 'samples': 2591424, 'steps': 13496, 'loss/train': 1.0130844712257385} 01/27/2022 08:32:45 - INFO - codeparrot_training - Step 13497: {'lr': 0.0004324995704155817, 'samples': 2591616, 'steps': 13497, 'loss/train': 0.6758613735437393} 01/27/2022 08:32:48 - INFO - codeparrot_training - Step 13498: {'lr': 0.0004324883871169138, 'samples': 2591808, 'steps': 13498, 'loss/train': 1.0701014399528503} 01/27/2022 08:32:51 - INFO - codeparrot_training - Step 13499: 
{'lr': 0.00043247720303652353, 'samples': 2592000, 'steps': 13499, 'loss/train': 1.017765372991562} 01/27/2022 08:32:54 - INFO - codeparrot_training - Step 13500: {'lr': 0.0004324660181744589, 'samples': 2592192, 'steps': 13500, 'loss/train': 0.795381098985672} 01/27/2022 08:32:57 - INFO - codeparrot_training - Step 13501: {'lr': 0.00043245483253076777, 'samples': 2592384, 'steps': 13501, 'loss/train': 0.7741815447807312} 01/27/2022 08:33:00 - INFO - codeparrot_training - Step 13502: {'lr': 0.0004324436461054981, 'samples': 2592576, 'steps': 13502, 'loss/train': 0.8646622896194458} 01/27/2022 08:33:04 - INFO - codeparrot_training - Step 13503: {'lr': 0.00043243245889869775, 'samples': 2592768, 'steps': 13503, 'loss/train': 0.5053550451993942} 01/27/2022 08:33:07 - INFO - codeparrot_training - Step 13504: {'lr': 0.0004324212709104147, 'samples': 2592960, 'steps': 13504, 'loss/train': 0.8471148312091827} 01/27/2022 08:33:10 - INFO - codeparrot_training - Step 13505: {'lr': 0.0004324100821406969, 'samples': 2593152, 'steps': 13505, 'loss/train': 0.8303325176239014} 01/27/2022 08:33:16 - INFO - codeparrot_training - Step 13506: {'lr': 0.00043239889258959215, 'samples': 2593344, 'steps': 13506, 'loss/train': 0.6242540180683136} 01/27/2022 08:33:19 - INFO - codeparrot_training - Step 13507: {'lr': 0.00043238770225714854, 'samples': 2593536, 'steps': 13507, 'loss/train': 0.24137096107006073} 01/27/2022 08:33:22 - INFO - codeparrot_training - Step 13508: {'lr': 0.00043237651114341383, 'samples': 2593728, 'steps': 13508, 'loss/train': 0.990554541349411} 01/27/2022 08:33:26 - INFO - codeparrot_training - Step 13509: {'lr': 0.0004323653192484361, 'samples': 2593920, 'steps': 13509, 'loss/train': 0.694486990571022} 01/27/2022 08:33:29 - INFO - codeparrot_training - Step 13510: {'lr': 0.0004323541265722633, 'samples': 2594112, 'steps': 13510, 'loss/train': 0.48472993075847626} 01/27/2022 08:33:32 - INFO - codeparrot_training - Step 13511: {'lr': 0.0004323429331149432, 'samples': 2594304, 'steps': 13511, 'loss/train': 0.642354816198349} 01/27/2022 08:33:35 - INFO - codeparrot_training - Step 13512: {'lr': 0.000432331738876524, 'samples': 2594496, 'steps': 13512, 'loss/train': 0.9107494354248047} 01/27/2022 08:33:38 - INFO - codeparrot_training - Step 13513: {'lr': 0.00043232054385705345, 'samples': 2594688, 'steps': 13513, 'loss/train': 1.0573618412017822} 01/27/2022 08:33:43 - INFO - codeparrot_training - Step 13514: {'lr': 0.0004323093480565796, 'samples': 2594880, 'steps': 13514, 'loss/train': 0.943970650434494} 01/27/2022 08:33:46 - INFO - codeparrot_training - Step 13515: {'lr': 0.0004322981514751504, 'samples': 2595072, 'steps': 13515, 'loss/train': 0.716996356844902} 01/27/2022 08:33:49 - INFO - codeparrot_training - Step 13516: {'lr': 0.0004322869541128138, 'samples': 2595264, 'steps': 13516, 'loss/train': 0.6031931340694427} 01/27/2022 08:33:52 - INFO - codeparrot_training - Step 13517: {'lr': 0.00043227575596961783, 'samples': 2595456, 'steps': 13517, 'loss/train': 0.7711884677410126} 01/27/2022 08:33:55 - INFO - codeparrot_training - Step 13518: {'lr': 0.00043226455704561034, 'samples': 2595648, 'steps': 13518, 'loss/train': 1.0717544853687286} 01/27/2022 08:33:58 - INFO - codeparrot_training - Step 13519: {'lr': 0.0004322533573408394, 'samples': 2595840, 'steps': 13519, 'loss/train': 1.0214852392673492} 01/27/2022 08:34:02 - INFO - codeparrot_training - Step 13520: {'lr': 0.00043224215685535287, 'samples': 2596032, 'steps': 13520, 'loss/train': 1.401941031217575} 01/27/2022 08:34:05 - INFO - 
codeparrot_training - Step 13521: {'lr': 0.0004322309555891989, 'samples': 2596224, 'steps': 13521, 'loss/train': 1.109206885099411} 01/27/2022 08:34:08 - INFO - codeparrot_training - Step 13522: {'lr': 0.00043221975354242536, 'samples': 2596416, 'steps': 13522, 'loss/train': 1.523305892944336} 01/27/2022 08:34:12 - INFO - codeparrot_training - Step 13523: {'lr': 0.0004322085507150802, 'samples': 2596608, 'steps': 13523, 'loss/train': 0.5938914567232132} 01/27/2022 08:34:16 - INFO - codeparrot_training - Step 13524: {'lr': 0.00043219734710721146, 'samples': 2596800, 'steps': 13524, 'loss/train': 0.9343593120574951} 01/27/2022 08:34:19 - INFO - codeparrot_training - Step 13525: {'lr': 0.00043218614271886725, 'samples': 2596992, 'steps': 13525, 'loss/train': 0.38998858630657196} 01/27/2022 08:34:22 - INFO - codeparrot_training - Step 13526: {'lr': 0.0004321749375500954, 'samples': 2597184, 'steps': 13526, 'loss/train': 0.844549834728241} 01/27/2022 08:34:25 - INFO - codeparrot_training - Step 13527: {'lr': 0.0004321637316009439, 'samples': 2597376, 'steps': 13527, 'loss/train': 0.7180292755365372} 01/27/2022 08:34:28 - INFO - codeparrot_training - Step 13528: {'lr': 0.00043215252487146096, 'samples': 2597568, 'steps': 13528, 'loss/train': 0.5511549711227417} 01/27/2022 08:34:31 - INFO - codeparrot_training - Step 13529: {'lr': 0.0004321413173616943, 'samples': 2597760, 'steps': 13529, 'loss/train': 1.9064460396766663} 01/27/2022 08:34:34 - INFO - codeparrot_training - Step 13530: {'lr': 0.00043213010907169213, 'samples': 2597952, 'steps': 13530, 'loss/train': 0.8428634405136108} 01/27/2022 08:34:38 - INFO - codeparrot_training - Step 13531: {'lr': 0.00043211890000150247, 'samples': 2598144, 'steps': 13531, 'loss/train': 0.6574760377407074} 01/27/2022 08:34:44 - INFO - codeparrot_training - Step 13532: {'lr': 0.0004321076901511731, 'samples': 2598336, 'steps': 13532, 'loss/train': 0.6602063477039337} 01/27/2022 08:34:47 - INFO - codeparrot_training - Step 13533: {'lr': 0.00043209647952075235, 'samples': 2598528, 'steps': 13533, 'loss/train': 0.572923332452774} 01/27/2022 08:34:50 - INFO - codeparrot_training - Step 13534: {'lr': 0.00043208526811028806, 'samples': 2598720, 'steps': 13534, 'loss/train': 1.164073884487152} 01/27/2022 08:34:53 - INFO - codeparrot_training - Step 13535: {'lr': 0.00043207405591982835, 'samples': 2598912, 'steps': 13535, 'loss/train': 0.9639991521835327} 01/27/2022 08:34:56 - INFO - codeparrot_training - Step 13536: {'lr': 0.0004320628429494212, 'samples': 2599104, 'steps': 13536, 'loss/train': 1.12765771150589} 01/27/2022 08:35:00 - INFO - codeparrot_training - Step 13537: {'lr': 0.00043205162919911455, 'samples': 2599296, 'steps': 13537, 'loss/train': 0.9231483042240143} 01/27/2022 08:35:03 - INFO - codeparrot_training - Step 13538: {'lr': 0.0004320404146689566, 'samples': 2599488, 'steps': 13538, 'loss/train': 0.6915690153837204} 01/27/2022 08:35:06 - INFO - codeparrot_training - Step 13539: {'lr': 0.0004320291993589953, 'samples': 2599680, 'steps': 13539, 'loss/train': 0.9825755953788757} 01/27/2022 08:35:10 - INFO - codeparrot_training - Step 13540: {'lr': 0.0004320179832692787, 'samples': 2599872, 'steps': 13540, 'loss/train': 0.9751217365264893} 01/27/2022 08:35:13 - INFO - codeparrot_training - Step 13541: {'lr': 0.0004320067663998549, 'samples': 2600064, 'steps': 13541, 'loss/train': 0.8213211894035339} 01/27/2022 08:35:17 - INFO - codeparrot_training - Step 13542: {'lr': 0.00043199554875077183, 'samples': 2600256, 'steps': 13542, 'loss/train': 
1.0520459711551666} 01/27/2022 08:35:20 - INFO - codeparrot_training - Step 13543: {'lr': 0.00043198433032207774, 'samples': 2600448, 'steps': 13543, 'loss/train': 0.9644903540611267} 01/27/2022 08:35:23 - INFO - codeparrot_training - Step 13544: {'lr': 0.00043197311111382045, 'samples': 2600640, 'steps': 13544, 'loss/train': 0.5400976538658142} 01/27/2022 08:35:26 - INFO - codeparrot_training - Step 13545: {'lr': 0.0004319618911260482, 'samples': 2600832, 'steps': 13545, 'loss/train': 0.8100479543209076} 01/27/2022 08:35:29 - INFO - codeparrot_training - Step 13546: {'lr': 0.0004319506703588089, 'samples': 2601024, 'steps': 13546, 'loss/train': 0.30661491304636} 01/27/2022 08:35:32 - INFO - codeparrot_training - Step 13547: {'lr': 0.00043193944881215075, 'samples': 2601216, 'steps': 13547, 'loss/train': 0.9412554502487183} 01/27/2022 08:35:35 - INFO - codeparrot_training - Step 13548: {'lr': 0.00043192822648612184, 'samples': 2601408, 'steps': 13548, 'loss/train': 0.7683148384094238} 01/27/2022 08:35:40 - INFO - codeparrot_training - Step 13549: {'lr': 0.0004319170033807701, 'samples': 2601600, 'steps': 13549, 'loss/train': 0.9384405612945557} 01/27/2022 08:35:43 - INFO - codeparrot_training - Step 13550: {'lr': 0.00043190577949614375, 'samples': 2601792, 'steps': 13550, 'loss/train': 0.5854089707136154} 01/27/2022 08:35:46 - INFO - codeparrot_training - Step 13551: {'lr': 0.00043189455483229073, 'samples': 2601984, 'steps': 13551, 'loss/train': 0.6277766972780228} 01/27/2022 08:35:50 - INFO - codeparrot_training - Step 13552: {'lr': 0.00043188332938925923, 'samples': 2602176, 'steps': 13552, 'loss/train': 0.6708999574184418} 01/27/2022 08:35:53 - INFO - codeparrot_training - Step 13553: {'lr': 0.0004318721031670973, 'samples': 2602368, 'steps': 13553, 'loss/train': 1.1860899031162262} 01/27/2022 08:35:56 - INFO - codeparrot_training - Step 13554: {'lr': 0.00043186087616585303, 'samples': 2602560, 'steps': 13554, 'loss/train': 0.7676045894622803} 01/27/2022 08:35:59 - INFO - codeparrot_training - Step 13555: {'lr': 0.0004318496483855745, 'samples': 2602752, 'steps': 13555, 'loss/train': 1.0109403133392334} 01/27/2022 08:36:02 - INFO - codeparrot_training - Step 13556: {'lr': 0.0004318384198263099, 'samples': 2602944, 'steps': 13556, 'loss/train': 0.6718660891056061} 01/27/2022 08:36:05 - INFO - codeparrot_training - Step 13557: {'lr': 0.00043182719048810714, 'samples': 2603136, 'steps': 13557, 'loss/train': 0.5625051409006119} 01/27/2022 08:36:11 - INFO - codeparrot_training - Step 13558: {'lr': 0.00043181596037101443, 'samples': 2603328, 'steps': 13558, 'loss/train': 0.7491225600242615} 01/27/2022 08:36:15 - INFO - codeparrot_training - Step 13559: {'lr': 0.00043180472947508, 'samples': 2603520, 'steps': 13559, 'loss/train': 0.943743109703064} 01/27/2022 08:36:18 - INFO - codeparrot_training - Step 13560: {'lr': 0.0004317934978003517, 'samples': 2603712, 'steps': 13560, 'loss/train': 0.6964326649904251} 01/27/2022 08:36:21 - INFO - codeparrot_training - Step 13561: {'lr': 0.0004317822653468778, 'samples': 2603904, 'steps': 13561, 'loss/train': 1.4406892955303192} 01/27/2022 08:36:24 - INFO - codeparrot_training - Step 13562: {'lr': 0.00043177103211470647, 'samples': 2604096, 'steps': 13562, 'loss/train': 0.3923904150724411} 01/27/2022 08:36:27 - INFO - codeparrot_training - Step 13563: {'lr': 0.00043175979810388575, 'samples': 2604288, 'steps': 13563, 'loss/train': 0.7258643209934235} 01/27/2022 08:36:30 - INFO - codeparrot_training - Step 13564: {'lr': 0.0004317485633144638, 'samples': 
2604480, 'steps': 13564, 'loss/train': 0.6189334094524384} 01/27/2022 08:36:33 - INFO - codeparrot_training - Step 13565: {'lr': 0.0004317373277464886, 'samples': 2604672, 'steps': 13565, 'loss/train': 1.1325304806232452} 01/27/2022 08:36:37 - INFO - codeparrot_training - Step 13566: {'lr': 0.0004317260914000085, 'samples': 2604864, 'steps': 13566, 'loss/train': 0.8104606568813324} 01/27/2022 08:36:41 - INFO - codeparrot_training - Step 13567: {'lr': 0.00043171485427507145, 'samples': 2605056, 'steps': 13567, 'loss/train': 0.5119721442461014} 01/27/2022 08:36:44 - INFO - codeparrot_training - Step 13568: {'lr': 0.0004317036163717257, 'samples': 2605248, 'steps': 13568, 'loss/train': 0.9095383286476135} 01/27/2022 08:36:47 - INFO - codeparrot_training - Step 13569: {'lr': 0.00043169237769001936, 'samples': 2605440, 'steps': 13569, 'loss/train': 0.9481506049633026} 01/27/2022 08:36:50 - INFO - codeparrot_training - Step 13570: {'lr': 0.0004316811382300006, 'samples': 2605632, 'steps': 13570, 'loss/train': 0.5764535218477249} 01/27/2022 08:36:53 - INFO - codeparrot_training - Step 13571: {'lr': 0.0004316698979917175, 'samples': 2605824, 'steps': 13571, 'loss/train': 0.053715839982032776} 01/27/2022 08:36:57 - INFO - codeparrot_training - Step 13572: {'lr': 0.0004316586569752182, 'samples': 2606016, 'steps': 13572, 'loss/train': 0.6649827808141708} 01/27/2022 08:37:00 - INFO - codeparrot_training - Step 13573: {'lr': 0.00043164741518055097, 'samples': 2606208, 'steps': 13573, 'loss/train': 1.1978473663330078} 01/27/2022 08:37:03 - INFO - codeparrot_training - Step 13574: {'lr': 0.0004316361726077639, 'samples': 2606400, 'steps': 13574, 'loss/train': 0.6004363149404526} 01/27/2022 08:37:06 - INFO - codeparrot_training - Step 13575: {'lr': 0.0004316249292569051, 'samples': 2606592, 'steps': 13575, 'loss/train': 1.0946078896522522} 01/27/2022 08:37:10 - INFO - codeparrot_training - Step 13576: {'lr': 0.0004316136851280228, 'samples': 2606784, 'steps': 13576, 'loss/train': 0.8083348274230957} 01/27/2022 08:37:14 - INFO - codeparrot_training - Step 13577: {'lr': 0.00043160244022116514, 'samples': 2606976, 'steps': 13577, 'loss/train': 0.7766627669334412} 01/27/2022 08:37:17 - INFO - codeparrot_training - Step 13578: {'lr': 0.0004315911945363802, 'samples': 2607168, 'steps': 13578, 'loss/train': 0.9682188034057617} 01/27/2022 08:37:20 - INFO - codeparrot_training - Step 13579: {'lr': 0.00043157994807371634, 'samples': 2607360, 'steps': 13579, 'loss/train': 0.7972125113010406} 01/27/2022 08:37:23 - INFO - codeparrot_training - Step 13580: {'lr': 0.00043156870083322166, 'samples': 2607552, 'steps': 13580, 'loss/train': 0.7625767886638641} 01/27/2022 08:37:26 - INFO - codeparrot_training - Step 13581: {'lr': 0.0004315574528149443, 'samples': 2607744, 'steps': 13581, 'loss/train': 0.81752809882164} 01/27/2022 08:37:29 - INFO - codeparrot_training - Step 13582: {'lr': 0.00043154620401893244, 'samples': 2607936, 'steps': 13582, 'loss/train': 0.9674726128578186} 01/27/2022 08:37:32 - INFO - codeparrot_training - Step 13583: {'lr': 0.0004315349544452343, 'samples': 2608128, 'steps': 13583, 'loss/train': 1.1019817292690277} 01/27/2022 08:37:39 - INFO - codeparrot_training - Step 13584: {'lr': 0.00043152370409389794, 'samples': 2608320, 'steps': 13584, 'loss/train': 0.9095583558082581} 01/27/2022 08:37:42 - INFO - codeparrot_training - Step 13585: {'lr': 0.00043151245296497184, 'samples': 2608512, 'steps': 13585, 'loss/train': 0.5154134184122086} 01/27/2022 08:37:45 - INFO - codeparrot_training - Step 13586: 
{'lr': 0.000431501201058504, 'samples': 2608704, 'steps': 13586, 'loss/train': 0.8320457339286804} 01/27/2022 08:37:48 - INFO - codeparrot_training - Step 13587: {'lr': 0.0004314899483745426, 'samples': 2608896, 'steps': 13587, 'loss/train': 0.9569768607616425} 01/27/2022 08:37:51 - INFO - codeparrot_training - Step 13588: {'lr': 0.0004314786949131359, 'samples': 2609088, 'steps': 13588, 'loss/train': 1.2647480964660645} 01/27/2022 08:37:54 - INFO - codeparrot_training - Step 13589: {'lr': 0.0004314674406743321, 'samples': 2609280, 'steps': 13589, 'loss/train': 0.8876041173934937} 01/27/2022 08:37:58 - INFO - codeparrot_training - Step 13590: {'lr': 0.00043145618565817946, 'samples': 2609472, 'steps': 13590, 'loss/train': 0.8153751790523529} 01/27/2022 08:38:01 - INFO - codeparrot_training - Step 13591: {'lr': 0.00043144492986472603, 'samples': 2609664, 'steps': 13591, 'loss/train': 1.4626378118991852} 01/27/2022 08:38:04 - INFO - codeparrot_training - Step 13592: {'lr': 0.0004314336732940202, 'samples': 2609856, 'steps': 13592, 'loss/train': 0.7965735197067261} 01/27/2022 08:38:08 - INFO - codeparrot_training - Step 13593: {'lr': 0.0004314224159461102, 'samples': 2610048, 'steps': 13593, 'loss/train': 0.44390156865119934} 01/27/2022 08:38:11 - INFO - codeparrot_training - Step 13594: {'lr': 0.0004314111578210441, 'samples': 2610240, 'steps': 13594, 'loss/train': 0.26754283905029297} 01/27/2022 08:38:15 - INFO - codeparrot_training - Step 13595: {'lr': 0.0004313998989188702, 'samples': 2610432, 'steps': 13595, 'loss/train': 0.2368030920624733} 01/27/2022 08:38:18 - INFO - codeparrot_training - Step 13596: {'lr': 0.00043138863923963664, 'samples': 2610624, 'steps': 13596, 'loss/train': 1.0218679904937744} 01/27/2022 08:38:21 - INFO - codeparrot_training - Step 13597: {'lr': 0.0004313773787833919, 'samples': 2610816, 'steps': 13597, 'loss/train': 0.5488185882568359} 01/27/2022 08:38:24 - INFO - codeparrot_training - Step 13598: {'lr': 0.0004313661175501841, 'samples': 2611008, 'steps': 13598, 'loss/train': 0.9888699352741241} 01/27/2022 08:38:27 - INFO - codeparrot_training - Step 13599: {'lr': 0.00043135485554006127, 'samples': 2611200, 'steps': 13599, 'loss/train': 0.7074080407619476} 01/27/2022 08:38:30 - INFO - codeparrot_training - Step 13600: {'lr': 0.0004313435927530719, 'samples': 2611392, 'steps': 13600, 'loss/train': 0.6890252530574799} 01/27/2022 08:38:34 - INFO - codeparrot_training - Step 13601: {'lr': 0.00043133232918926426, 'samples': 2611584, 'steps': 13601, 'loss/train': 0.9411030113697052} 01/27/2022 08:38:38 - INFO - codeparrot_training - Step 13602: {'lr': 0.0004313210648486864, 'samples': 2611776, 'steps': 13602, 'loss/train': 0.6970986574888229} 01/27/2022 08:38:41 - INFO - codeparrot_training - Step 13603: {'lr': 0.00043130979973138664, 'samples': 2611968, 'steps': 13603, 'loss/train': 1.065318077802658} 01/27/2022 08:38:44 - INFO - codeparrot_training - Step 13604: {'lr': 0.00043129853383741334, 'samples': 2612160, 'steps': 13604, 'loss/train': 0.7322731912136078} 01/27/2022 08:38:47 - INFO - codeparrot_training - Step 13605: {'lr': 0.00043128726716681464, 'samples': 2612352, 'steps': 13605, 'loss/train': 0.9187818467617035} 01/27/2022 08:38:50 - INFO - codeparrot_training - Step 13606: {'lr': 0.0004312759997196389, 'samples': 2612544, 'steps': 13606, 'loss/train': 0.7440441995859146} 01/27/2022 08:38:54 - INFO - codeparrot_training - Step 13607: {'lr': 0.00043126473149593424, 'samples': 2612736, 'steps': 13607, 'loss/train': 0.1458912156522274} 01/27/2022 08:38:57 - 
INFO - codeparrot_training - Step 13608: {'lr': 0.00043125346249574915, 'samples': 2612928, 'steps': 13608, 'loss/train': 0.8384076654911041} 01/27/2022 08:39:00 - INFO - codeparrot_training - Step 13609: {'lr': 0.0004312421927191318, 'samples': 2613120, 'steps': 13609, 'loss/train': 1.0179645717144012} 01/27/2022 08:39:06 - INFO - codeparrot_training - Step 13610: {'lr': 0.00043123092216613035, 'samples': 2613312, 'steps': 13610, 'loss/train': 0.7726767361164093} 01/27/2022 08:39:09 - INFO - codeparrot_training - Step 13611: {'lr': 0.0004312196508367932, 'samples': 2613504, 'steps': 13611, 'loss/train': 0.4559251517057419} 01/27/2022 08:39:12 - INFO - codeparrot_training - Step 13612: {'lr': 0.0004312083787311686, 'samples': 2613696, 'steps': 13612, 'loss/train': 0.5928859412670135} 01/27/2022 08:39:16 - INFO - codeparrot_training - Step 13613: {'lr': 0.0004311971058493049, 'samples': 2613888, 'steps': 13613, 'loss/train': 1.0873785316944122} 01/27/2022 08:39:19 - INFO - codeparrot_training - Step 13614: {'lr': 0.0004311858321912503, 'samples': 2614080, 'steps': 13614, 'loss/train': 0.6279845237731934} 01/27/2022 08:39:22 - INFO - codeparrot_training - Step 13615: {'lr': 0.0004311745577570531, 'samples': 2614272, 'steps': 13615, 'loss/train': 0.9091774821281433} 01/27/2022 08:39:25 - INFO - codeparrot_training - Step 13616: {'lr': 0.0004311632825467617, 'samples': 2614464, 'steps': 13616, 'loss/train': 1.1203753352165222} 01/27/2022 08:39:28 - INFO - codeparrot_training - Step 13617: {'lr': 0.00043115200656042426, 'samples': 2614656, 'steps': 13617, 'loss/train': 0.9217291474342346} 01/27/2022 08:39:31 - INFO - codeparrot_training - Step 13618: {'lr': 0.00043114072979808914, 'samples': 2614848, 'steps': 13618, 'loss/train': 1.1061775982379913} 01/27/2022 08:39:36 - INFO - codeparrot_training - Step 13619: {'lr': 0.00043112945225980473, 'samples': 2615040, 'steps': 13619, 'loss/train': 0.8560596406459808} 01/27/2022 08:39:39 - INFO - codeparrot_training - Step 13620: {'lr': 0.00043111817394561917, 'samples': 2615232, 'steps': 13620, 'loss/train': 0.7799742221832275} 01/27/2022 08:39:42 - INFO - codeparrot_training - Step 13621: {'lr': 0.0004311068948555809, 'samples': 2615424, 'steps': 13621, 'loss/train': 0.6916389763355255} 01/27/2022 08:39:45 - INFO - codeparrot_training - Step 13622: {'lr': 0.0004310956149897382, 'samples': 2615616, 'steps': 13622, 'loss/train': 0.4809325039386749} 01/27/2022 08:39:48 - INFO - codeparrot_training - Step 13623: {'lr': 0.00043108433434813943, 'samples': 2615808, 'steps': 13623, 'loss/train': 0.840624064207077} 01/27/2022 08:39:52 - INFO - codeparrot_training - Step 13624: {'lr': 0.00043107305293083276, 'samples': 2616000, 'steps': 13624, 'loss/train': 1.0112192630767822} 01/27/2022 08:39:55 - INFO - codeparrot_training - Step 13625: {'lr': 0.0004310617707378668, 'samples': 2616192, 'steps': 13625, 'loss/train': 0.9761004745960236} 01/27/2022 08:39:58 - INFO - codeparrot_training - Step 13626: {'lr': 0.0004310504877692896, 'samples': 2616384, 'steps': 13626, 'loss/train': 0.045811593532562256} 01/27/2022 08:40:01 - INFO - codeparrot_training - Step 13627: {'lr': 0.00043103920402514956, 'samples': 2616576, 'steps': 13627, 'loss/train': 0.8795292377471924} 01/27/2022 08:40:07 - INFO - codeparrot_training - Step 13628: {'lr': 0.00043102791950549513, 'samples': 2616768, 'steps': 13628, 'loss/train': 0.6761597692966461} 01/27/2022 08:40:10 - INFO - codeparrot_training - Step 13629: {'lr': 0.00043101663421037453, 'samples': 2616960, 'steps': 13629, 
'loss/train': 0.3941738158464432} 01/27/2022 08:40:13 - INFO - codeparrot_training - Step 13630: {'lr': 0.00043100534813983617, 'samples': 2617152, 'steps': 13630, 'loss/train': 0.90389683842659} 01/27/2022 08:40:17 - INFO - codeparrot_training - Step 13631: {'lr': 0.00043099406129392835, 'samples': 2617344, 'steps': 13631, 'loss/train': 0.6192349344491959} 01/27/2022 08:40:20 - INFO - codeparrot_training - Step 13632: {'lr': 0.00043098277367269953, 'samples': 2617536, 'steps': 13632, 'loss/train': 0.659832090139389} 01/27/2022 08:40:23 - INFO - codeparrot_training - Step 13633: {'lr': 0.0004309714852761979, 'samples': 2617728, 'steps': 13633, 'loss/train': 0.9503207802772522} 01/27/2022 08:40:26 - INFO - codeparrot_training - Step 13634: {'lr': 0.0004309601961044719, 'samples': 2617920, 'steps': 13634, 'loss/train': 0.7560108304023743} 01/27/2022 08:40:29 - INFO - codeparrot_training - Step 13635: {'lr': 0.0004309489061575699, 'samples': 2618112, 'steps': 13635, 'loss/train': 1.0991493165493011} 01/27/2022 08:40:34 - INFO - codeparrot_training - Step 13636: {'lr': 0.0004309376154355402, 'samples': 2618304, 'steps': 13636, 'loss/train': 0.7379133403301239} 01/27/2022 08:40:37 - INFO - codeparrot_training - Step 13637: {'lr': 0.00043092632393843124, 'samples': 2618496, 'steps': 13637, 'loss/train': 1.209851235151291} 01/27/2022 08:40:40 - INFO - codeparrot_training - Step 13638: {'lr': 0.00043091503166629136, 'samples': 2618688, 'steps': 13638, 'loss/train': 0.7236490398645401} 01/27/2022 08:40:43 - INFO - codeparrot_training - Step 13639: {'lr': 0.000430903738619169, 'samples': 2618880, 'steps': 13639, 'loss/train': 0.6103259772062302} 01/27/2022 08:40:46 - INFO - codeparrot_training - Step 13640: {'lr': 0.00043089244479711233, 'samples': 2619072, 'steps': 13640, 'loss/train': 1.0525491535663605} 01/27/2022 08:40:49 - INFO - codeparrot_training - Step 13641: {'lr': 0.00043088115020016994, 'samples': 2619264, 'steps': 13641, 'loss/train': 1.025584727525711} 01/27/2022 08:40:52 - INFO - codeparrot_training - Step 13642: {'lr': 0.00043086985482839016, 'samples': 2619456, 'steps': 13642, 'loss/train': 1.3386784493923187} 01/27/2022 08:40:56 - INFO - codeparrot_training - Step 13643: {'lr': 0.00043085855868182135, 'samples': 2619648, 'steps': 13643, 'loss/train': 0.9842920303344727} 01/27/2022 08:40:59 - INFO - codeparrot_training - Step 13644: {'lr': 0.0004308472617605118, 'samples': 2619840, 'steps': 13644, 'loss/train': 0.7670854032039642} 01/27/2022 08:41:03 - INFO - codeparrot_training - Step 13645: {'lr': 0.00043083596406451015, 'samples': 2620032, 'steps': 13645, 'loss/train': 0.8446833193302155} 01/27/2022 08:41:06 - INFO - codeparrot_training - Step 13646: {'lr': 0.0004308246655938646, 'samples': 2620224, 'steps': 13646, 'loss/train': 0.8092890679836273} 01/27/2022 08:41:09 - INFO - codeparrot_training - Step 13647: {'lr': 0.0004308133663486236, 'samples': 2620416, 'steps': 13647, 'loss/train': 1.0130429863929749} 01/27/2022 08:41:13 - INFO - codeparrot_training - Step 13648: {'lr': 0.00043080206632883553, 'samples': 2620608, 'steps': 13648, 'loss/train': 0.8406269252300262} 01/27/2022 08:41:16 - INFO - codeparrot_training - Step 13649: {'lr': 0.0004307907655345488, 'samples': 2620800, 'steps': 13649, 'loss/train': 0.728263333439827} 01/27/2022 08:41:19 - INFO - codeparrot_training - Step 13650: {'lr': 0.0004307794639658119, 'samples': 2620992, 'steps': 13650, 'loss/train': 0.8303167819976807} 01/27/2022 08:41:22 - INFO - codeparrot_training - Step 13651: {'lr': 0.0004307681616226732, 
'samples': 2621184, 'steps': 13651, 'loss/train': 1.0220189988613129} 01/27/2022 08:41:25 - INFO - codeparrot_training - Step 13652: {'lr': 0.000430756858505181, 'samples': 2621376, 'steps': 13652, 'loss/train': 0.9073814749717712} 01/27/2022 08:41:28 - INFO - codeparrot_training - Step 13653: {'lr': 0.0004307455546133838, 'samples': 2621568, 'steps': 13653, 'loss/train': 1.040587842464447} 01/27/2022 08:41:33 - INFO - codeparrot_training - Step 13654: {'lr': 0.00043073424994733014, 'samples': 2621760, 'steps': 13654, 'loss/train': 0.6281575709581375} 01/27/2022 08:41:36 - INFO - codeparrot_training - Step 13655: {'lr': 0.0004307229445070683, 'samples': 2621952, 'steps': 13655, 'loss/train': 0.6472644656896591} 01/27/2022 08:41:39 - INFO - codeparrot_training - Step 13656: {'lr': 0.0004307116382926468, 'samples': 2622144, 'steps': 13656, 'loss/train': 0.7333104461431503} 01/27/2022 08:41:43 - INFO - codeparrot_training - Step 13657: {'lr': 0.0004307003313041139, 'samples': 2622336, 'steps': 13657, 'loss/train': 0.9233343601226807} 01/27/2022 08:41:46 - INFO - codeparrot_training - Step 13658: {'lr': 0.0004306890235415183, 'samples': 2622528, 'steps': 13658, 'loss/train': 1.0151333212852478} 01/27/2022 08:41:49 - INFO - codeparrot_training - Step 13659: {'lr': 0.0004306777150049082, 'samples': 2622720, 'steps': 13659, 'loss/train': 0.3983006179332733} 01/27/2022 08:41:52 - INFO - codeparrot_training - Step 13660: {'lr': 0.0004306664056943322, 'samples': 2622912, 'steps': 13660, 'loss/train': 0.18939609825611115} 01/27/2022 08:41:55 - INFO - codeparrot_training - Step 13661: {'lr': 0.0004306550956098386, 'samples': 2623104, 'steps': 13661, 'loss/train': 1.0143523514270782} 01/27/2022 08:41:58 - INFO - codeparrot_training - Step 13662: {'lr': 0.000430643784751476, 'samples': 2623296, 'steps': 13662, 'loss/train': 0.12114725634455681} 01/27/2022 08:42:04 - INFO - codeparrot_training - Step 13663: {'lr': 0.0004306324731192929, 'samples': 2623488, 'steps': 13663, 'loss/train': 0.7814697325229645} 01/27/2022 08:42:08 - INFO - codeparrot_training - Step 13664: {'lr': 0.00043062116071333745, 'samples': 2623680, 'steps': 13664, 'loss/train': 0.6616772264242172} 01/27/2022 08:42:11 - INFO - codeparrot_training - Step 13665: {'lr': 0.0004306098475336584, 'samples': 2623872, 'steps': 13665, 'loss/train': 0.3165493905544281} 01/27/2022 08:42:14 - INFO - codeparrot_training - Step 13666: {'lr': 0.0004305985335803041, 'samples': 2624064, 'steps': 13666, 'loss/train': 0.6328834444284439} 01/27/2022 08:42:17 - INFO - codeparrot_training - Step 13667: {'lr': 0.000430587218853323, 'samples': 2624256, 'steps': 13667, 'loss/train': 0.9487209320068359} 01/27/2022 08:42:20 - INFO - codeparrot_training - Step 13668: {'lr': 0.0004305759033527636, 'samples': 2624448, 'steps': 13668, 'loss/train': 1.2537489831447601} 01/27/2022 08:42:23 - INFO - codeparrot_training - Step 13669: {'lr': 0.0004305645870786744, 'samples': 2624640, 'steps': 13669, 'loss/train': 0.6988199204206467} 01/27/2022 08:42:26 - INFO - codeparrot_training - Step 13670: {'lr': 0.00043055327003110384, 'samples': 2624832, 'steps': 13670, 'loss/train': 0.7216034531593323} 01/27/2022 08:42:31 - INFO - codeparrot_training - Step 13671: {'lr': 0.00043054195221010037, 'samples': 2625024, 'steps': 13671, 'loss/train': 1.0158875584602356} 01/27/2022 08:42:34 - INFO - codeparrot_training - Step 13672: {'lr': 0.00043053063361571256, 'samples': 2625216, 'steps': 13672, 'loss/train': 2.3944864869117737} 01/27/2022 08:42:37 - INFO - codeparrot_training - Step 
13673: {'lr': 0.0004305193142479888, 'samples': 2625408, 'steps': 13673, 'loss/train': 0.9171792268753052} 01/27/2022 08:42:40 - INFO - codeparrot_training - Step 13674: {'lr': 0.0004305079941069776, 'samples': 2625600, 'steps': 13674, 'loss/train': 1.2032590806484222} 01/27/2022 08:42:43 - INFO - codeparrot_training - Step 13675: {'lr': 0.0004304966731927276, 'samples': 2625792, 'steps': 13675, 'loss/train': 0.8147778511047363} 01/27/2022 08:42:47 - INFO - codeparrot_training - Step 13676: {'lr': 0.000430485351505287, 'samples': 2625984, 'steps': 13676, 'loss/train': 0.7493250668048859} 01/27/2022 08:42:50 - INFO - codeparrot_training - Step 13677: {'lr': 0.00043047402904470455, 'samples': 2626176, 'steps': 13677, 'loss/train': 0.8099036514759064} 01/27/2022 08:42:53 - INFO - codeparrot_training - Step 13678: {'lr': 0.00043046270581102865, 'samples': 2626368, 'steps': 13678, 'loss/train': 1.003079742193222} 01/27/2022 08:42:56 - INFO - codeparrot_training - Step 13679: {'lr': 0.00043045138180430783, 'samples': 2626560, 'steps': 13679, 'loss/train': 0.9449927508831024} 01/27/2022 08:43:00 - INFO - codeparrot_training - Step 13680: {'lr': 0.00043044005702459054, 'samples': 2626752, 'steps': 13680, 'loss/train': 0.9407326877117157} 01/27/2022 08:43:04 - INFO - codeparrot_training - Step 13681: {'lr': 0.0004304287314719254, 'samples': 2626944, 'steps': 13681, 'loss/train': 0.6174779981374741} 01/27/2022 08:43:07 - INFO - codeparrot_training - Step 13682: {'lr': 0.00043041740514636085, 'samples': 2627136, 'steps': 13682, 'loss/train': 0.42296965420246124} 01/27/2022 08:43:10 - INFO - codeparrot_training - Step 13683: {'lr': 0.0004304060780479454, 'samples': 2627328, 'steps': 13683, 'loss/train': 1.400260180234909} 01/27/2022 08:43:13 - INFO - codeparrot_training - Step 13684: {'lr': 0.0004303947501767276, 'samples': 2627520, 'steps': 13684, 'loss/train': 0.8732599318027496} 01/27/2022 08:43:16 - INFO - codeparrot_training - Step 13685: {'lr': 0.0004303834215327561, 'samples': 2627712, 'steps': 13685, 'loss/train': 0.9360763728618622} 01/27/2022 08:43:19 - INFO - codeparrot_training - Step 13686: {'lr': 0.00043037209211607913, 'samples': 2627904, 'steps': 13686, 'loss/train': 0.849505752325058} 01/27/2022 08:43:22 - INFO - codeparrot_training - Step 13687: {'lr': 0.00043036076192674546, 'samples': 2628096, 'steps': 13687, 'loss/train': 0.9725703299045563} 01/27/2022 08:43:26 - INFO - codeparrot_training - Step 13688: {'lr': 0.00043034943096480357, 'samples': 2628288, 'steps': 13688, 'loss/train': 0.7977933883666992} 01/27/2022 08:43:32 - INFO - codeparrot_training - Step 13689: {'lr': 0.000430338099230302, 'samples': 2628480, 'steps': 13689, 'loss/train': 0.5153320133686066} 01/27/2022 08:43:35 - INFO - codeparrot_training - Step 13690: {'lr': 0.00043032676672328916, 'samples': 2628672, 'steps': 13690, 'loss/train': 1.1488316059112549} 01/27/2022 08:43:38 - INFO - codeparrot_training - Step 13691: {'lr': 0.00043031543344381384, 'samples': 2628864, 'steps': 13691, 'loss/train': 0.695096343755722} 01/27/2022 08:43:41 - INFO - codeparrot_training - Step 13692: {'lr': 0.0004303040993919244, 'samples': 2629056, 'steps': 13692, 'loss/train': 0.8408720791339874} 01/27/2022 08:43:44 - INFO - codeparrot_training - Step 13693: {'lr': 0.00043029276456766946, 'samples': 2629248, 'steps': 13693, 'loss/train': 0.7659375965595245} 01/27/2022 08:43:47 - INFO - codeparrot_training - Step 13694: {'lr': 0.00043028142897109754, 'samples': 2629440, 'steps': 13694, 'loss/train': 0.6051311194896698} 01/27/2022 
08:43:51 - INFO - codeparrot_training - Step 13695: {'lr': 0.0004302700926022573, 'samples': 2629632, 'steps': 13695, 'loss/train': 1.1930330693721771} 01/27/2022 08:43:54 - INFO - codeparrot_training - Step 13696: {'lr': 0.0004302587554611972, 'samples': 2629824, 'steps': 13696, 'loss/train': 1.01035675406456} 01/27/2022 08:43:58 - INFO - codeparrot_training - Step 13697: {'lr': 0.0004302474175479658, 'samples': 2630016, 'steps': 13697, 'loss/train': 1.3575858771800995} 01/27/2022 08:44:01 - INFO - codeparrot_training - Step 13698: {'lr': 0.0004302360788626117, 'samples': 2630208, 'steps': 13698, 'loss/train': 0.1742706298828125} 01/27/2022 08:44:05 - INFO - codeparrot_training - Step 13699: {'lr': 0.00043022473940518345, 'samples': 2630400, 'steps': 13699, 'loss/train': 0.6722508519887924} 01/27/2022 08:44:08 - INFO - codeparrot_training - Step 13700: {'lr': 0.0004302133991757297, 'samples': 2630592, 'steps': 13700, 'loss/train': 0.0365661196410656} 01/27/2022 08:44:11 - INFO - codeparrot_training - Step 13701: {'lr': 0.00043020205817429895, 'samples': 2630784, 'steps': 13701, 'loss/train': 0.7070308327674866} 01/27/2022 08:44:14 - INFO - codeparrot_training - Step 13702: {'lr': 0.0004301907164009398, 'samples': 2630976, 'steps': 13702, 'loss/train': 0.530577689409256} 01/27/2022 08:44:17 - INFO - codeparrot_training - Step 13703: {'lr': 0.00043017937385570083, 'samples': 2631168, 'steps': 13703, 'loss/train': 1.1992181539535522} 01/27/2022 08:44:20 - INFO - codeparrot_training - Step 13704: {'lr': 0.00043016803053863063, 'samples': 2631360, 'steps': 13704, 'loss/train': 1.258898913860321} 01/27/2022 08:44:23 - INFO - codeparrot_training - Step 13705: {'lr': 0.00043015668644977783, 'samples': 2631552, 'steps': 13705, 'loss/train': 0.4673382639884949} 01/27/2022 08:44:27 - INFO - codeparrot_training - Step 13706: {'lr': 0.000430145341589191, 'samples': 2631744, 'steps': 13706, 'loss/train': 3.7575713396072388} 01/27/2022 08:44:33 - INFO - codeparrot_training - Step 13707: {'lr': 0.0004301339959569187, 'samples': 2631936, 'steps': 13707, 'loss/train': 0.8783011436462402} 01/27/2022 08:44:36 - INFO - codeparrot_training - Step 13708: {'lr': 0.00043012264955300954, 'samples': 2632128, 'steps': 13708, 'loss/train': 1.035287082195282} 01/27/2022 08:44:39 - INFO - codeparrot_training - Step 13709: {'lr': 0.0004301113023775122, 'samples': 2632320, 'steps': 13709, 'loss/train': 0.9434501230716705} 01/27/2022 08:44:42 - INFO - codeparrot_training - Step 13710: {'lr': 0.00043009995443047517, 'samples': 2632512, 'steps': 13710, 'loss/train': 0.6914860010147095} 01/27/2022 08:44:45 - INFO - codeparrot_training - Step 13711: {'lr': 0.0004300886057119472, 'samples': 2632704, 'steps': 13711, 'loss/train': 1.2325156331062317} 01/27/2022 08:44:49 - INFO - codeparrot_training - Step 13712: {'lr': 0.00043007725622197675, 'samples': 2632896, 'steps': 13712, 'loss/train': 1.0424696803092957} 01/27/2022 08:44:52 - INFO - codeparrot_training - Step 13713: {'lr': 0.00043006590596061256, 'samples': 2633088, 'steps': 13713, 'loss/train': 0.9217566847801208} 01/27/2022 08:44:55 - INFO - codeparrot_training - Step 13714: {'lr': 0.0004300545549279032, 'samples': 2633280, 'steps': 13714, 'loss/train': 0.736187070608139} 01/27/2022 08:44:59 - INFO - codeparrot_training - Step 13715: {'lr': 0.0004300432031238973, 'samples': 2633472, 'steps': 13715, 'loss/train': 0.844123899936676} 01/27/2022 08:45:03 - INFO - codeparrot_training - Step 13716: {'lr': 0.00043003185054864344, 'samples': 2633664, 'steps': 13716, 
'loss/train': 0.8115066289901733} 01/27/2022 08:45:06 - INFO - codeparrot_training - Step 13717: {'lr': 0.0004300204972021903, 'samples': 2633856, 'steps': 13717, 'loss/train': 0.5479164719581604} 01/27/2022 08:45:09 - INFO - codeparrot_training - Step 13718: {'lr': 0.00043000914308458663, 'samples': 2634048, 'steps': 13718, 'loss/train': 0.4403366893529892} 01/27/2022 08:45:12 - INFO - codeparrot_training - Step 13719: {'lr': 0.0004299977881958808, 'samples': 2634240, 'steps': 13719, 'loss/train': 0.7290661185979843} 01/27/2022 08:45:15 - INFO - codeparrot_training - Step 13720: {'lr': 0.0004299864325361217, 'samples': 2634432, 'steps': 13720, 'loss/train': 7.388068914413452} 01/27/2022 08:45:18 - INFO - codeparrot_training - Step 13721: {'lr': 0.00042997507610535783, 'samples': 2634624, 'steps': 13721, 'loss/train': 1.2046352326869965} 01/27/2022 08:45:22 - INFO - codeparrot_training - Step 13722: {'lr': 0.00042996371890363796, 'samples': 2634816, 'steps': 13722, 'loss/train': 1.2520920038223267} 01/27/2022 08:45:25 - INFO - codeparrot_training - Step 13723: {'lr': 0.00042995236093101055, 'samples': 2635008, 'steps': 13723, 'loss/train': 0.5443447083234787} 01/27/2022 08:45:29 - INFO - codeparrot_training - Step 13724: {'lr': 0.0004299410021875244, 'samples': 2635200, 'steps': 13724, 'loss/train': 1.172979712486267} 01/27/2022 08:45:32 - INFO - codeparrot_training - Step 13725: {'lr': 0.00042992964267322823, 'samples': 2635392, 'steps': 13725, 'loss/train': 0.7650468349456787} 01/27/2022 08:45:36 - INFO - codeparrot_training - Step 13726: {'lr': 0.00042991828238817046, 'samples': 2635584, 'steps': 13726, 'loss/train': 1.1375907361507416} 01/27/2022 08:45:39 - INFO - codeparrot_training - Step 13727: {'lr': 0.0004299069213324, 'samples': 2635776, 'steps': 13727, 'loss/train': 0.7873046100139618} 01/27/2022 08:45:42 - INFO - codeparrot_training - Step 13728: {'lr': 0.0004298955595059654, 'samples': 2635968, 'steps': 13728, 'loss/train': 0.9711568057537079} 01/27/2022 08:45:45 - INFO - codeparrot_training - Step 13729: {'lr': 0.00042988419690891534, 'samples': 2636160, 'steps': 13729, 'loss/train': 0.973406195640564} 01/27/2022 08:45:48 - INFO - codeparrot_training - Step 13730: {'lr': 0.00042987283354129846, 'samples': 2636352, 'steps': 13730, 'loss/train': 0.9209392368793488} 01/27/2022 08:45:51 - INFO - codeparrot_training - Step 13731: {'lr': 0.0004298614694031635, 'samples': 2636544, 'steps': 13731, 'loss/train': 0.2091830000281334} 01/27/2022 08:45:54 - INFO - codeparrot_training - Step 13732: {'lr': 0.0004298501044945591, 'samples': 2636736, 'steps': 13732, 'loss/train': 1.1626395285129547} 01/27/2022 08:46:01 - INFO - codeparrot_training - Step 13733: {'lr': 0.000429838738815534, 'samples': 2636928, 'steps': 13733, 'loss/train': 0.6647145599126816} 01/27/2022 08:46:04 - INFO - codeparrot_training - Step 13734: {'lr': 0.00042982737236613687, 'samples': 2637120, 'steps': 13734, 'loss/train': 1.100582867860794} 01/27/2022 08:46:07 - INFO - codeparrot_training - Step 13735: {'lr': 0.00042981600514641635, 'samples': 2637312, 'steps': 13735, 'loss/train': 1.1496722996234894} 01/27/2022 08:46:10 - INFO - codeparrot_training - Step 13736: {'lr': 0.00042980463715642115, 'samples': 2637504, 'steps': 13736, 'loss/train': 0.8169615268707275} 01/27/2022 08:46:13 - INFO - codeparrot_training - Step 13737: {'lr': 0.0004297932683962, 'samples': 2637696, 'steps': 13737, 'loss/train': 0.6607073396444321} 01/27/2022 08:46:16 - INFO - codeparrot_training - Step 13738: {'lr': 0.00042978189886580157, 
'samples': 2637888, 'steps': 13738, 'loss/train': 0.5690440088510513} 01/27/2022 08:46:19 - INFO - codeparrot_training - Step 13739: {'lr': 0.00042977052856527456, 'samples': 2638080, 'steps': 13739, 'loss/train': 0.8255841135978699} 01/27/2022 08:46:23 - INFO - codeparrot_training - Step 13740: {'lr': 0.00042975915749466763, 'samples': 2638272, 'steps': 13740, 'loss/train': 0.7070157676935196} 01/27/2022 08:46:26 - INFO - codeparrot_training - Step 13741: {'lr': 0.0004297477856540296, 'samples': 2638464, 'steps': 13741, 'loss/train': 0.7231067419052124} 01/27/2022 08:46:30 - INFO - codeparrot_training - Step 13742: {'lr': 0.00042973641304340916, 'samples': 2638656, 'steps': 13742, 'loss/train': 0.6442225277423859} 01/27/2022 08:46:34 - INFO - codeparrot_training - Step 13743: {'lr': 0.00042972503966285503, 'samples': 2638848, 'steps': 13743, 'loss/train': 1.2494927644729614} 01/27/2022 08:46:37 - INFO - codeparrot_training - Step 13744: {'lr': 0.00042971366551241587, 'samples': 2639040, 'steps': 13744, 'loss/train': 0.9850124716758728} 01/27/2022 08:46:40 - INFO - codeparrot_training - Step 13745: {'lr': 0.00042970229059214037, 'samples': 2639232, 'steps': 13745, 'loss/train': 1.1060741543769836} 01/27/2022 08:46:43 - INFO - codeparrot_training - Step 13746: {'lr': 0.0004296909149020774, 'samples': 2639424, 'steps': 13746, 'loss/train': 0.8883127570152283} 01/27/2022 08:46:46 - INFO - codeparrot_training - Step 13747: {'lr': 0.0004296795384422756, 'samples': 2639616, 'steps': 13747, 'loss/train': 0.05088728293776512} 01/27/2022 08:46:49 - INFO - codeparrot_training - Step 13748: {'lr': 0.00042966816121278365, 'samples': 2639808, 'steps': 13748, 'loss/train': 0.3110775724053383} 01/27/2022 08:46:53 - INFO - codeparrot_training - Step 13749: {'lr': 0.00042965678321365045, 'samples': 2640000, 'steps': 13749, 'loss/train': 0.7291873097419739} 01/27/2022 08:46:57 - INFO - codeparrot_training - Step 13750: {'lr': 0.00042964540444492453, 'samples': 2640192, 'steps': 13750, 'loss/train': 0.9973601996898651} 01/27/2022 08:47:00 - INFO - codeparrot_training - Step 13751: {'lr': 0.00042963402490665484, 'samples': 2640384, 'steps': 13751, 'loss/train': 0.5646481364965439} 01/27/2022 08:47:03 - INFO - codeparrot_training - Step 13752: {'lr': 0.0004296226445988899, 'samples': 2640576, 'steps': 13752, 'loss/train': 0.7653636932373047} 01/27/2022 08:47:07 - INFO - codeparrot_training - Step 13753: {'lr': 0.0004296112635216787, 'samples': 2640768, 'steps': 13753, 'loss/train': 0.742449626326561} 01/27/2022 08:47:10 - INFO - codeparrot_training - Step 13754: {'lr': 0.00042959988167506983, 'samples': 2640960, 'steps': 13754, 'loss/train': 0.8267060816287994} 01/27/2022 08:47:13 - INFO - codeparrot_training - Step 13755: {'lr': 0.00042958849905911213, 'samples': 2641152, 'steps': 13755, 'loss/train': 0.6817313879728317} 01/27/2022 08:47:16 - INFO - codeparrot_training - Step 13756: {'lr': 0.0004295771156738543, 'samples': 2641344, 'steps': 13756, 'loss/train': 0.7217875868082047} 01/27/2022 08:47:19 - INFO - codeparrot_training - Step 13757: {'lr': 0.00042956573151934507, 'samples': 2641536, 'steps': 13757, 'loss/train': 0.5798236727714539} 01/27/2022 08:47:22 - INFO - codeparrot_training - Step 13758: {'lr': 0.00042955434659563334, 'samples': 2641728, 'steps': 13758, 'loss/train': 1.109450250864029} 01/27/2022 08:47:29 - INFO - codeparrot_training - Step 13759: {'lr': 0.00042954296090276777, 'samples': 2641920, 'steps': 13759, 'loss/train': 0.2950577661395073} 01/27/2022 08:47:32 - INFO - 
codeparrot_training - Step 13760: {'lr': 0.0004295315744407972, 'samples': 2642112, 'steps': 13760, 'loss/train': 2.4474021792411804} 01/27/2022 08:47:35 - INFO - codeparrot_training - Step 13761: {'lr': 0.0004295201872097704, 'samples': 2642304, 'steps': 13761, 'loss/train': 0.23143268376588821} 01/27/2022 08:47:38 - INFO - codeparrot_training - Step 13762: {'lr': 0.0004295087992097361, 'samples': 2642496, 'steps': 13762, 'loss/train': 1.8708803057670593} 01/27/2022 08:47:41 - INFO - codeparrot_training - Step 13763: {'lr': 0.00042949741044074306, 'samples': 2642688, 'steps': 13763, 'loss/train': 0.7728751301765442} 01/27/2022 08:47:45 - INFO - codeparrot_training - Step 13764: {'lr': 0.00042948602090284014, 'samples': 2642880, 'steps': 13764, 'loss/train': 0.7879186570644379} 01/27/2022 08:47:48 - INFO - codeparrot_training - Step 13765: {'lr': 0.00042947463059607606, 'samples': 2643072, 'steps': 13765, 'loss/train': 1.0293666422367096} 01/27/2022 08:47:51 - INFO - codeparrot_training - Step 13766: {'lr': 0.0004294632395204997, 'samples': 2643264, 'steps': 13766, 'loss/train': 0.4175293296575546} 01/27/2022 08:47:55 - INFO - codeparrot_training - Step 13767: {'lr': 0.0004294518476761598, 'samples': 2643456, 'steps': 13767, 'loss/train': 1.6713727712631226} 01/27/2022 08:47:59 - INFO - codeparrot_training - Step 13768: {'lr': 0.00042944045506310515, 'samples': 2643648, 'steps': 13768, 'loss/train': 0.8303065896034241} 01/27/2022 08:48:02 - INFO - codeparrot_training - Step 13769: {'lr': 0.0004294290616813846, 'samples': 2643840, 'steps': 13769, 'loss/train': 0.7996055781841278} 01/27/2022 08:48:05 - INFO - codeparrot_training - Step 13770: {'lr': 0.00042941766753104696, 'samples': 2644032, 'steps': 13770, 'loss/train': 0.36451535671949387} 01/27/2022 08:48:08 - INFO - codeparrot_training - Step 13771: {'lr': 0.00042940627261214094, 'samples': 2644224, 'steps': 13771, 'loss/train': 0.773578405380249} 01/27/2022 08:48:11 - INFO - codeparrot_training - Step 13772: {'lr': 0.00042939487692471534, 'samples': 2644416, 'steps': 13772, 'loss/train': 0.8256031572818756} 01/27/2022 08:48:14 - INFO - codeparrot_training - Step 13773: {'lr': 0.0004293834804688192, 'samples': 2644608, 'steps': 13773, 'loss/train': 0.3411117419600487} 01/27/2022 08:48:18 - INFO - codeparrot_training - Step 13774: {'lr': 0.00042937208324450116, 'samples': 2644800, 'steps': 13774, 'loss/train': 0.8648852705955505} 01/27/2022 08:48:21 - INFO - codeparrot_training - Step 13775: {'lr': 0.00042936068525181004, 'samples': 2644992, 'steps': 13775, 'loss/train': 0.9745249450206757} 01/27/2022 08:48:25 - INFO - codeparrot_training - Step 13776: {'lr': 0.00042934928649079467, 'samples': 2645184, 'steps': 13776, 'loss/train': 5.83121645450592} 01/27/2022 08:48:28 - INFO - codeparrot_training - Step 13777: {'lr': 0.0004293378869615039, 'samples': 2645376, 'steps': 13777, 'loss/train': 0.5414970070123672} 01/27/2022 08:48:31 - INFO - codeparrot_training - Step 13778: {'lr': 0.00042932648666398667, 'samples': 2645568, 'steps': 13778, 'loss/train': 0.7360385209321976} 01/27/2022 08:48:35 - INFO - codeparrot_training - Step 13779: {'lr': 0.0004293150855982916, 'samples': 2645760, 'steps': 13779, 'loss/train': 0.6590493321418762} 01/27/2022 08:48:38 - INFO - codeparrot_training - Step 13780: {'lr': 0.0004293036837644677, 'samples': 2645952, 'steps': 13780, 'loss/train': 0.6423948258161545} 01/27/2022 08:48:41 - INFO - codeparrot_training - Step 13781: {'lr': 0.0004292922811625637, 'samples': 2646144, 'steps': 13781, 'loss/train': 
0.9980275332927704} 01/27/2022 08:48:44 - INFO - codeparrot_training - Step 13782: {'lr': 0.0004292808777926286, 'samples': 2646336, 'steps': 13782, 'loss/train': 0.44052667915821075} 01/27/2022 08:48:47 - INFO - codeparrot_training - Step 13783: {'lr': 0.0004292694736547111, 'samples': 2646528, 'steps': 13783, 'loss/train': 4.077642202377319} 01/27/2022 08:48:50 - INFO - codeparrot_training - Step 13784: {'lr': 0.0004292580687488601, 'samples': 2646720, 'steps': 13784, 'loss/train': 0.6713641583919525} 01/27/2022 08:48:57 - INFO - codeparrot_training - Step 13785: {'lr': 0.00042924666307512437, 'samples': 2646912, 'steps': 13785, 'loss/train': 0.9630264937877655} 01/27/2022 08:49:00 - INFO - codeparrot_training - Step 13786: {'lr': 0.000429235256633553, 'samples': 2647104, 'steps': 13786, 'loss/train': 0.9904425144195557} 01/27/2022 08:49:03 - INFO - codeparrot_training - Step 13787: {'lr': 0.0004292238494241946, 'samples': 2647296, 'steps': 13787, 'loss/train': 0.8541065454483032} 01/27/2022 08:49:06 - INFO - codeparrot_training - Step 13788: {'lr': 0.00042921244144709817, 'samples': 2647488, 'steps': 13788, 'loss/train': 0.9368584156036377} 01/27/2022 08:49:09 - INFO - codeparrot_training - Step 13789: {'lr': 0.0004292010327023125, 'samples': 2647680, 'steps': 13789, 'loss/train': 0.31066538393497467} 01/27/2022 08:49:12 - INFO - codeparrot_training - Step 13790: {'lr': 0.00042918962318988664, 'samples': 2647872, 'steps': 13790, 'loss/train': 0.4682288020849228} 01/27/2022 08:49:16 - INFO - codeparrot_training - Step 13791: {'lr': 0.00042917821290986926, 'samples': 2648064, 'steps': 13791, 'loss/train': 0.5728520303964615} 01/27/2022 08:49:19 - INFO - codeparrot_training - Step 13792: {'lr': 0.0004291668018623093, 'samples': 2648256, 'steps': 13792, 'loss/train': 0.06330270320177078} 01/27/2022 08:49:22 - INFO - codeparrot_training - Step 13793: {'lr': 0.00042915539004725564, 'samples': 2648448, 'steps': 13793, 'loss/train': 0.5884652137756348} 01/27/2022 08:49:26 - INFO - codeparrot_training - Step 13794: {'lr': 0.0004291439774647572, 'samples': 2648640, 'steps': 13794, 'loss/train': 0.8946874737739563} 01/27/2022 08:49:29 - INFO - codeparrot_training - Step 13795: {'lr': 0.00042913256411486277, 'samples': 2648832, 'steps': 13795, 'loss/train': 0.3703639358282089} 01/27/2022 08:49:33 - INFO - codeparrot_training - Step 13796: {'lr': 0.0004291211499976214, 'samples': 2649024, 'steps': 13796, 'loss/train': 1.1988703608512878} 01/27/2022 08:49:36 - INFO - codeparrot_training - Step 13797: {'lr': 0.00042910973511308195, 'samples': 2649216, 'steps': 13797, 'loss/train': 0.48152241110801697} 01/27/2022 08:49:39 - INFO - codeparrot_training - Step 13798: {'lr': 0.0004290983194612932, 'samples': 2649408, 'steps': 13798, 'loss/train': 0.860960841178894} 01/27/2022 08:49:42 - INFO - codeparrot_training - Step 13799: {'lr': 0.00042908690304230415, 'samples': 2649600, 'steps': 13799, 'loss/train': 0.6629406362771988} 01/27/2022 08:49:45 - INFO - codeparrot_training - Step 13800: {'lr': 0.00042907548585616363, 'samples': 2649792, 'steps': 13800, 'loss/train': 0.3879850208759308} 01/27/2022 08:49:48 - INFO - codeparrot_training - Step 13801: {'lr': 0.00042906406790292053, 'samples': 2649984, 'steps': 13801, 'loss/train': 0.5028567016124725} 01/27/2022 08:49:51 - INFO - codeparrot_training - Step 13802: {'lr': 0.00042905264918262386, 'samples': 2650176, 'steps': 13802, 'loss/train': 0.7579675912857056} 01/27/2022 08:49:56 - INFO - codeparrot_training - Step 13803: {'lr': 0.00042904122969532256, 
'samples': 2650368, 'steps': 13803, 'loss/train': 0.8530365228652954} 01/27/2022 08:49:59 - INFO - codeparrot_training - Step 13804: {'lr': 0.0004290298094410655, 'samples': 2650560, 'steps': 13804, 'loss/train': 0.8643683195114136} 01/27/2022 08:50:02 - INFO - codeparrot_training - Step 13805: {'lr': 0.0004290183884199015, 'samples': 2650752, 'steps': 13805, 'loss/train': 0.5593741536140442} 01/27/2022 08:50:05 - INFO - codeparrot_training - Step 13806: {'lr': 0.00042900696663187963, 'samples': 2650944, 'steps': 13806, 'loss/train': 0.7731158137321472} 01/27/2022 08:50:08 - INFO - codeparrot_training - Step 13807: {'lr': 0.00042899554407704876, 'samples': 2651136, 'steps': 13807, 'loss/train': 0.6957103461027145} 01/27/2022 08:50:11 - INFO - codeparrot_training - Step 13808: {'lr': 0.0004289841207554578, 'samples': 2651328, 'steps': 13808, 'loss/train': 1.0878617763519287} 01/27/2022 08:50:15 - INFO - codeparrot_training - Step 13809: {'lr': 0.0004289726966671557, 'samples': 2651520, 'steps': 13809, 'loss/train': 0.69654580950737} 01/27/2022 08:50:18 - INFO - codeparrot_training - Step 13810: {'lr': 0.00042896127181219135, 'samples': 2651712, 'steps': 13810, 'loss/train': 1.0232361853122711} 01/27/2022 08:50:24 - INFO - codeparrot_training - Step 13811: {'lr': 0.0004289498461906138, 'samples': 2651904, 'steps': 13811, 'loss/train': 1.0864394009113312} 01/27/2022 08:50:27 - INFO - codeparrot_training - Step 13812: {'lr': 0.00042893841980247194, 'samples': 2652096, 'steps': 13812, 'loss/train': 1.121010571718216} 01/27/2022 08:50:30 - INFO - codeparrot_training - Step 13813: {'lr': 0.00042892699264781463, 'samples': 2652288, 'steps': 13813, 'loss/train': 0.31643568724393845} 01/27/2022 08:50:34 - INFO - codeparrot_training - Step 13814: {'lr': 0.000428915564726691, 'samples': 2652480, 'steps': 13814, 'loss/train': 0.6111512035131454} 01/27/2022 08:50:37 - INFO - codeparrot_training - Step 13815: {'lr': 0.0004289041360391499, 'samples': 2652672, 'steps': 13815, 'loss/train': 0.9340247511863708} 01/27/2022 08:50:40 - INFO - codeparrot_training - Step 13816: {'lr': 0.0004288927065852402, 'samples': 2652864, 'steps': 13816, 'loss/train': 0.5420972406864166} 01/27/2022 08:50:43 - INFO - codeparrot_training - Step 13817: {'lr': 0.000428881276365011, 'samples': 2653056, 'steps': 13817, 'loss/train': 0.7327509373426437} 01/27/2022 08:50:46 - INFO - codeparrot_training - Step 13818: {'lr': 0.00042886984537851124, 'samples': 2653248, 'steps': 13818, 'loss/train': 0.5124057680368423} 01/27/2022 08:50:49 - INFO - codeparrot_training - Step 13819: {'lr': 0.0004288584136257898, 'samples': 2653440, 'steps': 13819, 'loss/train': 0.8401549458503723} 01/27/2022 08:50:54 - INFO - codeparrot_training - Step 13820: {'lr': 0.00042884698110689574, 'samples': 2653632, 'steps': 13820, 'loss/train': 1.0480345487594604} 01/27/2022 08:50:57 - INFO - codeparrot_training - Step 13821: {'lr': 0.000428835547821878, 'samples': 2653824, 'steps': 13821, 'loss/train': 0.8279214799404144} 01/27/2022 08:51:00 - INFO - codeparrot_training - Step 13822: {'lr': 0.00042882411377078556, 'samples': 2654016, 'steps': 13822, 'loss/train': 0.4220629781484604} 01/27/2022 08:51:03 - INFO - codeparrot_training - Step 13823: {'lr': 0.00042881267895366736, 'samples': 2654208, 'steps': 13823, 'loss/train': 0.8820464015007019} 01/27/2022 08:51:06 - INFO - codeparrot_training - Step 13824: {'lr': 0.00042880124337057253, 'samples': 2654400, 'steps': 13824, 'loss/train': 0.5324597507715225} 01/27/2022 08:51:09 - INFO - codeparrot_training - Step 
13825: {'lr': 0.00042878980702154985, 'samples': 2654592, 'steps': 13825, 'loss/train': 0.714147537946701} 01/27/2022 08:51:12 - INFO - codeparrot_training - Step 13826: {'lr': 0.00042877836990664844, 'samples': 2654784, 'steps': 13826, 'loss/train': 0.6763297319412231} 01/27/2022 08:51:16 - INFO - codeparrot_training - Step 13827: {'lr': 0.00042876693202591724, 'samples': 2654976, 'steps': 13827, 'loss/train': 1.0727865099906921} 01/27/2022 08:51:19 - INFO - codeparrot_training - Step 13828: {'lr': 0.0004287554933794053, 'samples': 2655168, 'steps': 13828, 'loss/train': 0.8671480715274811} 01/27/2022 08:51:23 - INFO - codeparrot_training - Step 13829: {'lr': 0.0004287440539671616, 'samples': 2655360, 'steps': 13829, 'loss/train': 1.3309948146343231} 01/27/2022 08:51:26 - INFO - codeparrot_training - Step 13830: {'lr': 0.0004287326137892351, 'samples': 2655552, 'steps': 13830, 'loss/train': 1.1222662925720215} 01/27/2022 08:51:29 - INFO - codeparrot_training - Step 13831: {'lr': 0.00042872117284567486, 'samples': 2655744, 'steps': 13831, 'loss/train': 0.039214820601046085} 01/27/2022 08:51:33 - INFO - codeparrot_training - Step 13832: {'lr': 0.0004287097311365299, 'samples': 2655936, 'steps': 13832, 'loss/train': 0.578194722533226} 01/27/2022 08:51:36 - INFO - codeparrot_training - Step 13833: {'lr': 0.0004286982886618491, 'samples': 2656128, 'steps': 13833, 'loss/train': 0.9005359411239624} 01/27/2022 08:51:39 - INFO - codeparrot_training - Step 13834: {'lr': 0.0004286868454216816, 'samples': 2656320, 'steps': 13834, 'loss/train': 0.9943450391292572} 01/27/2022 08:51:42 - INFO - codeparrot_training - Step 13835: {'lr': 0.00042867540141607643, 'samples': 2656512, 'steps': 13835, 'loss/train': 0.8960950076580048} 01/27/2022 08:51:45 - INFO - codeparrot_training - Step 13836: {'lr': 0.0004286639566450826, 'samples': 2656704, 'steps': 13836, 'loss/train': 1.161164939403534} 01/27/2022 08:51:48 - INFO - codeparrot_training - Step 13837: {'lr': 0.00042865251110874903, 'samples': 2656896, 'steps': 13837, 'loss/train': 0.5689613521099091} 01/27/2022 08:51:55 - INFO - codeparrot_training - Step 13838: {'lr': 0.00042864106480712495, 'samples': 2657088, 'steps': 13838, 'loss/train': 1.0898176431655884} 01/27/2022 08:51:58 - INFO - codeparrot_training - Step 13839: {'lr': 0.00042862961774025915, 'samples': 2657280, 'steps': 13839, 'loss/train': 0.2461552619934082} 01/27/2022 08:52:01 - INFO - codeparrot_training - Step 13840: {'lr': 0.00042861816990820087, 'samples': 2657472, 'steps': 13840, 'loss/train': 0.8166646957397461} 01/27/2022 08:52:04 - INFO - codeparrot_training - Step 13841: {'lr': 0.00042860672131099904, 'samples': 2657664, 'steps': 13841, 'loss/train': 0.8298573195934296} 01/27/2022 08:52:07 - INFO - codeparrot_training - Step 13842: {'lr': 0.00042859527194870275, 'samples': 2657856, 'steps': 13842, 'loss/train': 0.6752789318561554} 01/27/2022 08:52:10 - INFO - codeparrot_training - Step 13843: {'lr': 0.000428583821821361, 'samples': 2658048, 'steps': 13843, 'loss/train': 1.1755167245864868} 01/27/2022 08:52:13 - INFO - codeparrot_training - Step 13844: {'lr': 0.00042857237092902285, 'samples': 2658240, 'steps': 13844, 'loss/train': 0.8188618719577789} 01/27/2022 08:52:17 - INFO - codeparrot_training - Step 13845: {'lr': 0.0004285609192717374, 'samples': 2658432, 'steps': 13845, 'loss/train': 0.717331275343895} 01/27/2022 08:52:21 - INFO - codeparrot_training - Step 13846: {'lr': 0.00042854946684955366, 'samples': 2658624, 'steps': 13846, 'loss/train': 0.4616059362888336} 01/27/2022 
08:52:24 - INFO - codeparrot_training - Step 13847: {'lr': 0.00042853801366252067, 'samples': 2658816, 'steps': 13847, 'loss/train': 1.6429853439331055} 01/27/2022 08:52:27 - INFO - codeparrot_training - Step 13848: {'lr': 0.00042852655971068756, 'samples': 2659008, 'steps': 13848, 'loss/train': 0.6538319885730743} 01/27/2022 08:52:31 - INFO - codeparrot_training - Step 13849: {'lr': 0.0004285151049941033, 'samples': 2659200, 'steps': 13849, 'loss/train': 0.808701753616333} 01/27/2022 08:52:34 - INFO - codeparrot_training - Step 13850: {'lr': 0.00042850364951281707, 'samples': 2659392, 'steps': 13850, 'loss/train': 0.818149745464325} 01/27/2022 08:52:37 - INFO - codeparrot_training - Step 13851: {'lr': 0.00042849219326687786, 'samples': 2659584, 'steps': 13851, 'loss/train': 0.5012286007404327} 01/27/2022 08:52:40 - INFO - codeparrot_training - Step 13852: {'lr': 0.0004284807362563348, 'samples': 2659776, 'steps': 13852, 'loss/train': 0.8741093873977661} 01/27/2022 08:52:43 - INFO - codeparrot_training - Step 13853: {'lr': 0.00042846927848123694, 'samples': 2659968, 'steps': 13853, 'loss/train': 0.7404936254024506} 01/27/2022 08:52:46 - INFO - codeparrot_training - Step 13854: {'lr': 0.00042845781994163334, 'samples': 2660160, 'steps': 13854, 'loss/train': 0.7124796509742737} 01/27/2022 08:52:52 - INFO - codeparrot_training - Step 13855: {'lr': 0.00042844636063757316, 'samples': 2660352, 'steps': 13855, 'loss/train': 0.9858466386795044} 01/27/2022 08:52:56 - INFO - codeparrot_training - Step 13856: {'lr': 0.00042843490056910534, 'samples': 2660544, 'steps': 13856, 'loss/train': 0.8036041259765625} 01/27/2022 08:52:59 - INFO - codeparrot_training - Step 13857: {'lr': 0.0004284234397362791, 'samples': 2660736, 'steps': 13857, 'loss/train': 0.7242976427078247} 01/27/2022 08:53:02 - INFO - codeparrot_training - Step 13858: {'lr': 0.0004284119781391436, 'samples': 2660928, 'steps': 13858, 'loss/train': 0.7975390255451202} 01/27/2022 08:53:05 - INFO - codeparrot_training - Step 13859: {'lr': 0.00042840051577774766, 'samples': 2661120, 'steps': 13859, 'loss/train': 0.4221113473176956} 01/27/2022 08:53:08 - INFO - codeparrot_training - Step 13860: {'lr': 0.00042838905265214067, 'samples': 2661312, 'steps': 13860, 'loss/train': 0.9155139327049255} 01/27/2022 08:53:11 - INFO - codeparrot_training - Step 13861: {'lr': 0.0004283775887623716, 'samples': 2661504, 'steps': 13861, 'loss/train': 0.8927333056926727} 01/27/2022 08:53:14 - INFO - codeparrot_training - Step 13862: {'lr': 0.0004283661241084896, 'samples': 2661696, 'steps': 13862, 'loss/train': 0.48361600935459137} 01/27/2022 08:53:18 - INFO - codeparrot_training - Step 13863: {'lr': 0.0004283546586905437, 'samples': 2661888, 'steps': 13863, 'loss/train': 1.1004652976989746} 01/27/2022 08:53:22 - INFO - codeparrot_training - Step 13864: {'lr': 0.00042834319250858316, 'samples': 2662080, 'steps': 13864, 'loss/train': 1.4364127814769745} 01/27/2022 08:53:25 - INFO - codeparrot_training - Step 13865: {'lr': 0.000428331725562657, 'samples': 2662272, 'steps': 13865, 'loss/train': 0.38227397203445435} 01/27/2022 08:53:29 - INFO - codeparrot_training - Step 13866: {'lr': 0.0004283202578528143, 'samples': 2662464, 'steps': 13866, 'loss/train': 0.49467936158180237} 01/27/2022 08:53:32 - INFO - codeparrot_training - Step 13867: {'lr': 0.00042830878937910426, 'samples': 2662656, 'steps': 13867, 'loss/train': 0.5873212516307831} 01/27/2022 08:53:35 - INFO - codeparrot_training - Step 13868: {'lr': 0.000428297320141576, 'samples': 2662848, 'steps': 13868, 
'loss/train': 0.6370091289281845} 01/27/2022 08:53:38 - INFO - codeparrot_training - Step 13869: {'lr': 0.00042828585014027863, 'samples': 2663040, 'steps': 13869, 'loss/train': 1.2852340936660767} 01/27/2022 08:53:41 - INFO - codeparrot_training - Step 13870: {'lr': 0.0004282743793752613, 'samples': 2663232, 'steps': 13870, 'loss/train': 0.9949750900268555} 01/27/2022 08:53:44 - INFO - codeparrot_training - Step 13871: {'lr': 0.0004282629078465732, 'samples': 2663424, 'steps': 13871, 'loss/train': 0.46803680062294006} 01/27/2022 08:53:49 - INFO - codeparrot_training - Step 13872: {'lr': 0.0004282514355542633, 'samples': 2663616, 'steps': 13872, 'loss/train': 0.8489353358745575} 01/27/2022 08:53:52 - INFO - codeparrot_training - Step 13873: {'lr': 0.0004282399624983808, 'samples': 2663808, 'steps': 13873, 'loss/train': 0.8740226626396179} 01/27/2022 08:53:55 - INFO - codeparrot_training - Step 13874: {'lr': 0.000428228488678975, 'samples': 2664000, 'steps': 13874, 'loss/train': 1.0036135017871857} 01/27/2022 08:53:58 - INFO - codeparrot_training - Step 13875: {'lr': 0.000428217014096095, 'samples': 2664192, 'steps': 13875, 'loss/train': 0.7883225083351135} 01/27/2022 08:54:01 - INFO - codeparrot_training - Step 13876: {'lr': 0.00042820553874978987, 'samples': 2664384, 'steps': 13876, 'loss/train': 1.0838094055652618} 01/27/2022 08:54:04 - INFO - codeparrot_training - Step 13877: {'lr': 0.0004281940626401087, 'samples': 2664576, 'steps': 13877, 'loss/train': 0.7687283456325531} 01/27/2022 08:54:08 - INFO - codeparrot_training - Step 13878: {'lr': 0.0004281825857671008, 'samples': 2664768, 'steps': 13878, 'loss/train': 0.9160965979099274} 01/27/2022 08:54:11 - INFO - codeparrot_training - Step 13879: {'lr': 0.00042817110813081526, 'samples': 2664960, 'steps': 13879, 'loss/train': 0.4914821684360504} 01/27/2022 08:54:14 - INFO - codeparrot_training - Step 13880: {'lr': 0.00042815962973130134, 'samples': 2665152, 'steps': 13880, 'loss/train': 0.8076615035533905} 01/27/2022 08:54:18 - INFO - codeparrot_training - Step 13881: {'lr': 0.00042814815056860814, 'samples': 2665344, 'steps': 13881, 'loss/train': 0.6485987305641174} 01/27/2022 08:54:22 - INFO - codeparrot_training - Step 13882: {'lr': 0.0004281366706427848, 'samples': 2665536, 'steps': 13882, 'loss/train': 1.0227655470371246} 01/27/2022 08:54:25 - INFO - codeparrot_training - Step 13883: {'lr': 0.0004281251899538805, 'samples': 2665728, 'steps': 13883, 'loss/train': 0.8490572869777679} 01/27/2022 08:54:28 - INFO - codeparrot_training - Step 13884: {'lr': 0.0004281137085019445, 'samples': 2665920, 'steps': 13884, 'loss/train': 0.8877147138118744} 01/27/2022 08:54:31 - INFO - codeparrot_training - Step 13885: {'lr': 0.0004281022262870259, 'samples': 2666112, 'steps': 13885, 'loss/train': 0.9432249069213867} 01/27/2022 08:54:34 - INFO - codeparrot_training - Step 13886: {'lr': 0.00042809074330917387, 'samples': 2666304, 'steps': 13886, 'loss/train': 0.04048174247145653} 01/27/2022 08:54:37 - INFO - codeparrot_training - Step 13887: {'lr': 0.00042807925956843775, 'samples': 2666496, 'steps': 13887, 'loss/train': 0.5654646456241608} 01/27/2022 08:54:40 - INFO - codeparrot_training - Step 13888: {'lr': 0.0004280677750648665, 'samples': 2666688, 'steps': 13888, 'loss/train': 0.7011259943246841} 01/27/2022 08:54:44 - INFO - codeparrot_training - Step 13889: {'lr': 0.0004280562897985095, 'samples': 2666880, 'steps': 13889, 'loss/train': 0.9667715728282928} 01/27/2022 08:54:50 - INFO - codeparrot_training - Step 13890: {'lr': 
0.00042804480376941597, 'samples': 2667072, 'steps': 13890, 'loss/train': 0.8408430218696594} 01/27/2022 08:54:53 - INFO - codeparrot_training - Step 13891: {'lr': 0.0004280333169776349, 'samples': 2667264, 'steps': 13891, 'loss/train': 0.7275713682174683} 01/27/2022 08:54:56 - INFO - codeparrot_training - Step 13892: {'lr': 0.00042802182942321576, 'samples': 2667456, 'steps': 13892, 'loss/train': 1.077554315328598} 01/27/2022 08:54:59 - INFO - codeparrot_training - Step 13893: {'lr': 0.00042801034110620756, 'samples': 2667648, 'steps': 13893, 'loss/train': 0.7017288208007812} 01/27/2022 08:55:02 - INFO - codeparrot_training - Step 13894: {'lr': 0.00042799885202665964, 'samples': 2667840, 'steps': 13894, 'loss/train': 0.597329244017601} 01/27/2022 08:55:06 - INFO - codeparrot_training - Step 13895: {'lr': 0.0004279873621846211, 'samples': 2668032, 'steps': 13895, 'loss/train': 0.6904964447021484} 01/27/2022 08:55:09 - INFO - codeparrot_training - Step 13896: {'lr': 0.0004279758715801412, 'samples': 2668224, 'steps': 13896, 'loss/train': 0.9054792523384094} 01/27/2022 08:55:12 - INFO - codeparrot_training - Step 13897: {'lr': 0.0004279643802132692, 'samples': 2668416, 'steps': 13897, 'loss/train': 0.6931419521570206} 01/27/2022 08:55:16 - INFO - codeparrot_training - Step 13898: {'lr': 0.0004279528880840544, 'samples': 2668608, 'steps': 13898, 'loss/train': 1.1543718874454498} 01/27/2022 08:55:19 - INFO - codeparrot_training - Step 13899: {'lr': 0.00042794139519254583, 'samples': 2668800, 'steps': 13899, 'loss/train': 0.32537345588207245} 01/27/2022 08:55:23 - INFO - codeparrot_training - Step 13900: {'lr': 0.00042792990153879285, 'samples': 2668992, 'steps': 13900, 'loss/train': 0.8413664102554321} 01/27/2022 08:55:26 - INFO - codeparrot_training - Step 13901: {'lr': 0.00042791840712284466, 'samples': 2669184, 'steps': 13901, 'loss/train': 0.4568094313144684} 01/27/2022 08:55:29 - INFO - codeparrot_training - Step 13902: {'lr': 0.0004279069119447505, 'samples': 2669376, 'steps': 13902, 'loss/train': 0.8735588192939758} 01/27/2022 08:55:32 - INFO - codeparrot_training - Step 13903: {'lr': 0.0004278954160045597, 'samples': 2669568, 'steps': 13903, 'loss/train': 0.832696795463562} 01/27/2022 08:55:35 - INFO - codeparrot_training - Step 13904: {'lr': 0.0004278839193023214, 'samples': 2669760, 'steps': 13904, 'loss/train': 1.1717459857463837} 01/27/2022 08:55:38 - INFO - codeparrot_training - Step 13905: {'lr': 0.00042787242183808485, 'samples': 2669952, 'steps': 13905, 'loss/train': 0.5241596102714539} 01/27/2022 08:55:41 - INFO - codeparrot_training - Step 13906: {'lr': 0.00042786092361189927, 'samples': 2670144, 'steps': 13906, 'loss/train': 0.4039171636104584} 01/27/2022 08:55:46 - INFO - codeparrot_training - Step 13907: {'lr': 0.00042784942462381403, 'samples': 2670336, 'steps': 13907, 'loss/train': 0.649848684668541} 01/27/2022 08:55:49 - INFO - codeparrot_training - Step 13908: {'lr': 0.0004278379248738783, 'samples': 2670528, 'steps': 13908, 'loss/train': 0.9678409695625305} 01/27/2022 08:55:52 - INFO - codeparrot_training - Step 13909: {'lr': 0.00042782642436214137, 'samples': 2670720, 'steps': 13909, 'loss/train': 0.7262443006038666} 01/27/2022 08:55:55 - INFO - codeparrot_training - Step 13910: {'lr': 0.00042781492308865255, 'samples': 2670912, 'steps': 13910, 'loss/train': 0.9751592874526978} 01/27/2022 08:55:58 - INFO - codeparrot_training - Step 13911: {'lr': 0.000427803421053461, 'samples': 2671104, 'steps': 13911, 'loss/train': 1.1924347579479218} 01/27/2022 08:56:02 - INFO - 
codeparrot_training - Step 13912: {'lr': 0.0004277919182566161, 'samples': 2671296, 'steps': 13912, 'loss/train': 0.8961992561817169} 01/27/2022 08:56:05 - INFO - codeparrot_training - Step 13913: {'lr': 0.0004277804146981671, 'samples': 2671488, 'steps': 13913, 'loss/train': 0.8534969687461853} 01/27/2022 08:56:08 - INFO - codeparrot_training - Step 13914: {'lr': 0.00042776891037816324, 'samples': 2671680, 'steps': 13914, 'loss/train': 0.6549821645021439} 01/27/2022 08:56:11 - INFO - codeparrot_training - Step 13915: {'lr': 0.00042775740529665373, 'samples': 2671872, 'steps': 13915, 'loss/train': 0.7787545323371887} 01/27/2022 08:56:17 - INFO - codeparrot_training - Step 13916: {'lr': 0.000427745899453688, 'samples': 2672064, 'steps': 13916, 'loss/train': 0.7493182271718979} 01/27/2022 08:56:20 - INFO - codeparrot_training - Step 13917: {'lr': 0.0004277343928493153, 'samples': 2672256, 'steps': 13917, 'loss/train': 0.8740135431289673} 01/27/2022 08:56:23 - INFO - codeparrot_training - Step 13918: {'lr': 0.0004277228854835849, 'samples': 2672448, 'steps': 13918, 'loss/train': 0.8958773016929626} 01/27/2022 08:56:27 - INFO - codeparrot_training - Step 13919: {'lr': 0.0004277113773565461, 'samples': 2672640, 'steps': 13919, 'loss/train': 0.9072807133197784} 01/27/2022 08:56:30 - INFO - codeparrot_training - Step 13920: {'lr': 0.00042769986846824813, 'samples': 2672832, 'steps': 13920, 'loss/train': 0.9310323894023895} 01/27/2022 08:56:33 - INFO - codeparrot_training - Step 13921: {'lr': 0.00042768835881874036, 'samples': 2673024, 'steps': 13921, 'loss/train': 0.6931277811527252} 01/27/2022 08:56:36 - INFO - codeparrot_training - Step 13922: {'lr': 0.00042767684840807214, 'samples': 2673216, 'steps': 13922, 'loss/train': 0.7064490616321564} 01/27/2022 08:56:39 - INFO - codeparrot_training - Step 13923: {'lr': 0.00042766533723629264, 'samples': 2673408, 'steps': 13923, 'loss/train': 0.6936585009098053} 01/27/2022 08:56:42 - INFO - codeparrot_training - Step 13924: {'lr': 0.0004276538253034513, 'samples': 2673600, 'steps': 13924, 'loss/train': 0.8504623174667358} 01/27/2022 08:56:47 - INFO - codeparrot_training - Step 13925: {'lr': 0.0004276423126095974, 'samples': 2673792, 'steps': 13925, 'loss/train': 0.31395716965198517} 01/27/2022 08:56:50 - INFO - codeparrot_training - Step 13926: {'lr': 0.0004276307991547802, 'samples': 2673984, 'steps': 13926, 'loss/train': 0.3562784269452095} 01/27/2022 08:56:53 - INFO - codeparrot_training - Step 13927: {'lr': 0.0004276192849390491, 'samples': 2674176, 'steps': 13927, 'loss/train': 1.4254516661167145} 01/27/2022 08:56:56 - INFO - codeparrot_training - Step 13928: {'lr': 0.0004276077699624534, 'samples': 2674368, 'steps': 13928, 'loss/train': 1.2819559872150421} 01/27/2022 08:57:00 - INFO - codeparrot_training - Step 13929: {'lr': 0.00042759625422504236, 'samples': 2674560, 'steps': 13929, 'loss/train': 0.6975346505641937} 01/27/2022 08:57:03 - INFO - codeparrot_training - Step 13930: {'lr': 0.00042758473772686533, 'samples': 2674752, 'steps': 13930, 'loss/train': 1.0209948420524597} 01/27/2022 08:57:06 - INFO - codeparrot_training - Step 13931: {'lr': 0.0004275732204679718, 'samples': 2674944, 'steps': 13931, 'loss/train': 1.0426197946071625} 01/27/2022 08:57:09 - INFO - codeparrot_training - Step 13932: {'lr': 0.0004275617024484109, 'samples': 2675136, 'steps': 13932, 'loss/train': 0.8361509442329407} 01/27/2022 08:57:15 - INFO - codeparrot_training - Step 13933: {'lr': 0.000427550183668232, 'samples': 2675328, 'steps': 13933, 'loss/train': 
0.6400686353445053} 01/27/2022 08:57:18 - INFO - codeparrot_training - Step 13934: {'lr': 0.00042753866412748455, 'samples': 2675520, 'steps': 13934, 'loss/train': 0.9716850221157074} 01/27/2022 08:57:21 - INFO - codeparrot_training - Step 13935: {'lr': 0.00042752714382621784, 'samples': 2675712, 'steps': 13935, 'loss/train': 1.1679034531116486} 01/27/2022 08:57:25 - INFO - codeparrot_training - Step 13936: {'lr': 0.0004275156227644812, 'samples': 2675904, 'steps': 13936, 'loss/train': 0.3517627716064453} 01/27/2022 08:57:28 - INFO - codeparrot_training - Step 13937: {'lr': 0.00042750410094232394, 'samples': 2676096, 'steps': 13937, 'loss/train': 0.9691512286663055} 01/27/2022 08:57:31 - INFO - codeparrot_training - Step 13938: {'lr': 0.0004274925783597956, 'samples': 2676288, 'steps': 13938, 'loss/train': 1.2963848412036896} 01/27/2022 08:57:34 - INFO - codeparrot_training - Step 13939: {'lr': 0.0004274810550169453, 'samples': 2676480, 'steps': 13939, 'loss/train': 0.5473775714635849} 01/27/2022 08:57:37 - INFO - codeparrot_training - Step 13940: {'lr': 0.00042746953091382254, 'samples': 2676672, 'steps': 13940, 'loss/train': 0.29045578837394714} 01/27/2022 08:57:40 - INFO - codeparrot_training - Step 13941: {'lr': 0.00042745800605047677, 'samples': 2676864, 'steps': 13941, 'loss/train': 0.6981845498085022} 01/27/2022 08:57:45 - INFO - codeparrot_training - Step 13942: {'lr': 0.00042744648042695717, 'samples': 2677056, 'steps': 13942, 'loss/train': 1.152517944574356} 01/27/2022 08:57:48 - INFO - codeparrot_training - Step 13943: {'lr': 0.0004274349540433132, 'samples': 2677248, 'steps': 13943, 'loss/train': 0.5567484050989151} 01/27/2022 08:57:51 - INFO - codeparrot_training - Step 13944: {'lr': 0.00042742342689959425, 'samples': 2677440, 'steps': 13944, 'loss/train': 0.7086983174085617} 01/27/2022 08:57:54 - INFO - codeparrot_training - Step 13945: {'lr': 0.00042741189899584965, 'samples': 2677632, 'steps': 13945, 'loss/train': 0.45922520756721497} 01/27/2022 08:57:57 - INFO - codeparrot_training - Step 13946: {'lr': 0.00042740037033212877, 'samples': 2677824, 'steps': 13946, 'loss/train': 1.199466347694397} 01/27/2022 08:58:00 - INFO - codeparrot_training - Step 13947: {'lr': 0.0004273888409084811, 'samples': 2678016, 'steps': 13947, 'loss/train': 0.515448734164238} 01/27/2022 08:58:04 - INFO - codeparrot_training - Step 13948: {'lr': 0.0004273773107249559, 'samples': 2678208, 'steps': 13948, 'loss/train': 0.697272464632988} 01/27/2022 08:58:07 - INFO - codeparrot_training - Step 13949: {'lr': 0.0004273657797816027, 'samples': 2678400, 'steps': 13949, 'loss/train': 1.0823032557964325} 01/27/2022 08:58:10 - INFO - codeparrot_training - Step 13950: {'lr': 0.0004273542480784708, 'samples': 2678592, 'steps': 13950, 'loss/train': 0.8236296772956848} 01/27/2022 08:58:14 - INFO - codeparrot_training - Step 13951: {'lr': 0.00042734271561560956, 'samples': 2678784, 'steps': 13951, 'loss/train': 0.5687158405780792} 01/27/2022 08:58:17 - INFO - codeparrot_training - Step 13952: {'lr': 0.00042733118239306845, 'samples': 2678976, 'steps': 13952, 'loss/train': 0.670230969786644} 01/27/2022 08:58:21 - INFO - codeparrot_training - Step 13953: {'lr': 0.0004273196484108969, 'samples': 2679168, 'steps': 13953, 'loss/train': 0.6126995533704758} 01/27/2022 08:58:24 - INFO - codeparrot_training - Step 13954: {'lr': 0.00042730811366914435, 'samples': 2679360, 'steps': 13954, 'loss/train': 0.9251695275306702} 01/27/2022 08:58:27 - INFO - codeparrot_training - Step 13955: {'lr': 0.0004272965781678601, 
'samples': 2679552, 'steps': 13955, 'loss/train': 0.628987267613411} 01/27/2022 08:58:30 - INFO - codeparrot_training - Step 13956: {'lr': 0.0004272850419070935, 'samples': 2679744, 'steps': 13956, 'loss/train': 0.8789744675159454} 01/27/2022 08:58:33 - INFO - codeparrot_training - Step 13957: {'lr': 0.00042727350488689416, 'samples': 2679936, 'steps': 13957, 'loss/train': 0.6272430717945099} 01/27/2022 08:58:36 - INFO - codeparrot_training - Step 13958: {'lr': 0.00042726196710731135, 'samples': 2680128, 'steps': 13958, 'loss/train': 0.6984213441610336} 01/27/2022 08:58:39 - INFO - codeparrot_training - Step 13959: {'lr': 0.0004272504285683947, 'samples': 2680320, 'steps': 13959, 'loss/train': 0.7277782559394836} 01/27/2022 08:58:44 - INFO - codeparrot_training - Step 13960: {'lr': 0.0004272388892701934, 'samples': 2680512, 'steps': 13960, 'loss/train': 0.11569785699248314} 01/27/2022 08:58:47 - INFO - codeparrot_training - Step 13961: {'lr': 0.000427227349212757, 'samples': 2680704, 'steps': 13961, 'loss/train': 1.066531777381897} 01/27/2022 08:58:50 - INFO - codeparrot_training - Step 13962: {'lr': 0.0004272158083961348, 'samples': 2680896, 'steps': 13962, 'loss/train': 0.9582881927490234} 01/27/2022 08:58:53 - INFO - codeparrot_training - Step 13963: {'lr': 0.0004272042668203765, 'samples': 2681088, 'steps': 13963, 'loss/train': 0.7895938754081726} 01/27/2022 08:58:57 - INFO - codeparrot_training - Step 13964: {'lr': 0.00042719272448553137, 'samples': 2681280, 'steps': 13964, 'loss/train': 0.5809776037931442} 01/27/2022 08:59:00 - INFO - codeparrot_training - Step 13965: {'lr': 0.00042718118139164883, 'samples': 2681472, 'steps': 13965, 'loss/train': 0.4739217013120651} 01/27/2022 08:59:03 - INFO - codeparrot_training - Step 13966: {'lr': 0.00042716963753877836, 'samples': 2681664, 'steps': 13966, 'loss/train': 0.5053001046180725} 01/27/2022 08:59:06 - INFO - codeparrot_training - Step 13967: {'lr': 0.0004271580929269695, 'samples': 2681856, 'steps': 13967, 'loss/train': 1.3718132972717285} 01/27/2022 08:59:09 - INFO - codeparrot_training - Step 13968: {'lr': 0.0004271465475562716, 'samples': 2682048, 'steps': 13968, 'loss/train': 0.48775041103363037} 01/27/2022 08:59:16 - INFO - codeparrot_training - Step 13969: {'lr': 0.00042713500142673404, 'samples': 2682240, 'steps': 13969, 'loss/train': 1.0898828208446503} 01/27/2022 08:59:19 - INFO - codeparrot_training - Step 13970: {'lr': 0.00042712345453840644, 'samples': 2682432, 'steps': 13970, 'loss/train': 0.724991574883461} 01/27/2022 08:59:22 - INFO - codeparrot_training - Step 13971: {'lr': 0.00042711190689133827, 'samples': 2682624, 'steps': 13971, 'loss/train': 0.6022749692201614} 01/27/2022 08:59:25 - INFO - codeparrot_training - Step 13972: {'lr': 0.0004271003584855788, 'samples': 2682816, 'steps': 13972, 'loss/train': 0.5815793126821518} 01/27/2022 08:59:28 - INFO - codeparrot_training - Step 13973: {'lr': 0.0004270888093211778, 'samples': 2683008, 'steps': 13973, 'loss/train': 0.7302842438220978} 01/27/2022 08:59:31 - INFO - codeparrot_training - Step 13974: {'lr': 0.0004270772593981844, 'samples': 2683200, 'steps': 13974, 'loss/train': 0.931761234998703} 01/27/2022 08:59:34 - INFO - codeparrot_training - Step 13975: {'lr': 0.0004270657087166484, 'samples': 2683392, 'steps': 13975, 'loss/train': 0.8530378639698029} 01/27/2022 08:59:38 - INFO - codeparrot_training - Step 13976: {'lr': 0.000427054157276619, 'samples': 2683584, 'steps': 13976, 'loss/train': 0.5259234756231308} 01/27/2022 08:59:41 - INFO - codeparrot_training - Step 
13977: {'lr': 0.0004270426050781458, 'samples': 2683776, 'steps': 13977, 'loss/train': 0.28553999215364456} 01/27/2022 08:59:45 - INFO - codeparrot_training - Step 13978: {'lr': 0.00042703105212127846, 'samples': 2683968, 'steps': 13978, 'loss/train': 1.001140147447586} 01/27/2022 08:59:48 - INFO - codeparrot_training - Step 13979: {'lr': 0.0004270194984060662, 'samples': 2684160, 'steps': 13979, 'loss/train': 0.30439797788858414} 01/27/2022 08:59:52 - INFO - codeparrot_training - Step 13980: {'lr': 0.0004270079439325586, 'samples': 2684352, 'steps': 13980, 'loss/train': 0.9787314534187317} 01/27/2022 08:59:55 - INFO - codeparrot_training - Step 13981: {'lr': 0.0004269963887008053, 'samples': 2684544, 'steps': 13981, 'loss/train': 0.9373588263988495} 01/27/2022 08:59:58 - INFO - codeparrot_training - Step 13982: {'lr': 0.00042698483271085555, 'samples': 2684736, 'steps': 13982, 'loss/train': 0.5027903169393539} 01/27/2022 09:00:01 - INFO - codeparrot_training - Step 13983: {'lr': 0.0004269732759627589, 'samples': 2684928, 'steps': 13983, 'loss/train': 1.0804051458835602} 01/27/2022 09:00:04 - INFO - codeparrot_training - Step 13984: {'lr': 0.0004269617184565651, 'samples': 2685120, 'steps': 13984, 'loss/train': 1.0423319041728973} 01/27/2022 09:00:07 - INFO - codeparrot_training - Step 13985: {'lr': 0.00042695016019232343, 'samples': 2685312, 'steps': 13985, 'loss/train': 0.896950364112854} 01/27/2022 09:00:12 - INFO - codeparrot_training - Step 13986: {'lr': 0.0004269386011700834, 'samples': 2685504, 'steps': 13986, 'loss/train': 1.1700842678546906} 01/27/2022 09:00:15 - INFO - codeparrot_training - Step 13987: {'lr': 0.00042692704138989467, 'samples': 2685696, 'steps': 13987, 'loss/train': 1.6333833932876587} 01/27/2022 09:00:18 - INFO - codeparrot_training - Step 13988: {'lr': 0.00042691548085180666, 'samples': 2685888, 'steps': 13988, 'loss/train': 1.2234432399272919} 01/27/2022 09:00:21 - INFO - codeparrot_training - Step 13989: {'lr': 0.00042690391955586886, 'samples': 2686080, 'steps': 13989, 'loss/train': 0.8658387064933777} 01/27/2022 09:00:24 - INFO - codeparrot_training - Step 13990: {'lr': 0.00042689235750213093, 'samples': 2686272, 'steps': 13990, 'loss/train': 1.0662318170070648} 01/27/2022 09:00:28 - INFO - codeparrot_training - Step 13991: {'lr': 0.0004268807946906422, 'samples': 2686464, 'steps': 13991, 'loss/train': 0.8437352478504181} 01/27/2022 09:00:31 - INFO - codeparrot_training - Step 13992: {'lr': 0.0004268692311214524, 'samples': 2686656, 'steps': 13992, 'loss/train': 0.8796559274196625} 01/27/2022 09:00:34 - INFO - codeparrot_training - Step 13993: {'lr': 0.00042685766679461095, 'samples': 2686848, 'steps': 13993, 'loss/train': 0.6275034248828888} 01/27/2022 09:00:37 - INFO - codeparrot_training - Step 13994: {'lr': 0.0004268461017101674, 'samples': 2687040, 'steps': 13994, 'loss/train': 1.105334848165512} 01/27/2022 09:00:43 - INFO - codeparrot_training - Step 13995: {'lr': 0.00042683453586817136, 'samples': 2687232, 'steps': 13995, 'loss/train': 1.0676705539226532} 01/27/2022 09:00:47 - INFO - codeparrot_training - Step 13996: {'lr': 0.00042682296926867226, 'samples': 2687424, 'steps': 13996, 'loss/train': 1.1254613399505615} 01/27/2022 09:00:50 - INFO - codeparrot_training - Step 13997: {'lr': 0.0004268114019117197, 'samples': 2687616, 'steps': 13997, 'loss/train': 1.0033139884471893} 01/27/2022 09:00:53 - INFO - codeparrot_training - Step 13998: {'lr': 0.00042679983379736324, 'samples': 2687808, 'steps': 13998, 'loss/train': 1.0843375325202942} 01/27/2022 
09:00:56 - INFO - codeparrot_training - Step 13999: {'lr': 0.0004267882649256525, 'samples': 2688000, 'steps': 13999, 'loss/train': 1.0735023021697998} 01/27/2022 09:00:56 - INFO - codeparrot_training - Evaluating and saving model checkpoint 01/27/2022 09:01:14 - WARNING - huggingface_hub.repository - Several commits (7) will be pushed upstream. 01/27/2022 09:01:14 - WARNING - huggingface_hub.repository - The progress bars may be unreliable. 01/27/2022 09:01:48 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py 2f6074d..283559f royal-monkey-12 -> royal-monkey-12 01/27/2022 09:01:53 - INFO - codeparrot_training - Step 14000: {'lr': 0.00042677669529663686, 'samples': 2688192, 'steps': 14000, 'loss/train': 0.9348163604736328} 01/27/2022 09:01:56 - INFO - codeparrot_training - Step 14001: {'lr': 0.0004267651249103661, 'samples': 2688384, 'steps': 14001, 'loss/train': 0.03093577455729246} 01/27/2022 09:01:59 - INFO - codeparrot_training - Step 14002: {'lr': 0.00042675355376688964, 'samples': 2688576, 'steps': 14002, 'loss/train': 0.6535166501998901} 01/27/2022 09:02:02 - INFO - codeparrot_training - Step 14003: {'lr': 0.000426741981866257, 'samples': 2688768, 'steps': 14003, 'loss/train': 1.2150499820709229} 01/27/2022 09:02:07 - INFO - codeparrot_training - Step 14004: {'lr': 0.00042673040920851793, 'samples': 2688960, 'steps': 14004, 'loss/train': 0.5038377642631531} 01/27/2022 09:02:10 - INFO - codeparrot_training - Step 14005: {'lr': 0.00042671883579372186, 'samples': 2689152, 'steps': 14005, 'loss/train': 0.7759323120117188} 01/27/2022 09:02:13 - INFO - codeparrot_training - Step 14006: {'lr': 0.00042670726162191843, 'samples': 2689344, 'steps': 14006, 'loss/train': 0.9634372293949127} 01/27/2022 09:02:16 - INFO - codeparrot_training - Step 14007: {'lr': 0.0004266956866931572, 'samples': 2689536, 'steps': 14007, 'loss/train': 0.7067593485116959} 01/27/2022 09:02:19 - INFO - codeparrot_training - Step 14008: {'lr': 0.0004266841110074878, 'samples': 2689728, 'steps': 14008, 'loss/train': 0.24828940629959106} 01/27/2022 09:02:22 - INFO - codeparrot_training - Step 14009: {'lr': 0.0004266725345649597, 'samples': 2689920, 'steps': 14009, 'loss/train': 0.22894486784934998} 01/27/2022 09:02:26 - INFO - codeparrot_training - Step 14010: {'lr': 0.0004266609573656226, 'samples': 2690112, 'steps': 14010, 'loss/train': 1.2506467401981354} 01/27/2022 09:02:29 - INFO - codeparrot_training - Step 14011: {'lr': 0.000426649379409526, 'samples': 2690304, 'steps': 14011, 'loss/train': 1.0491205751895905} 01/27/2022 09:02:32 - INFO - codeparrot_training - Step 14012: {'lr': 0.00042663780069671965, 'samples': 2690496, 'steps': 14012, 'loss/train': 1.26598459482193} 01/27/2022 09:02:36 - INFO - codeparrot_training - Step 14013: {'lr': 0.000426626221227253, 'samples': 2690688, 'steps': 14013, 'loss/train': 0.3706120401620865} 01/27/2022 09:02:39 - INFO - codeparrot_training - Step 14014: {'lr': 0.00042661464100117566, 'samples': 2690880, 'steps': 14014, 'loss/train': 0.7020441144704819} 01/27/2022 09:02:43 - INFO - codeparrot_training - Step 14015: {'lr': 0.00042660306001853735, 'samples': 2691072, 'steps': 14015, 'loss/train': 0.5079659074544907} 01/27/2022 09:02:46 - INFO - codeparrot_training - Step 14016: {'lr': 0.0004265914782793875, 'samples': 2691264, 'steps': 14016, 'loss/train': 1.1829104125499725} 01/27/2022 09:02:49 - INFO - codeparrot_training - Step 14017: {'lr': 0.000426579895783776, 'samples': 2691456, 'steps': 14017, 'loss/train': 0.7406394481658936} 
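[Editor's note] Around step 13999/14000 the log shows the periodic "Evaluating and saving model checkpoint" event, followed by huggingface_hub.repository pushing the queued commits of branch royal-monkey-12 to https://huggingface.co/ncoop57/codeparrot-neo-125M-py. The sketch below illustrates how such an evaluate-save-push hook can be wired up with huggingface_hub's Repository class; it is not the actual codeparrot_training script. The repo URL and branch name come from the log, while save_checkpoint_steps, output paths, and the evaluate() callback are hypothetical stand-ins.

```python
# Minimal sketch (assumptions noted inline) of a periodic checkpoint-and-push
# hook like the one the log records at step ~14000. Not the real training script.
from huggingface_hub import Repository

repo = Repository(
    local_dir="codeparrot-neo-125M-py",           # assumed local clone directory
    clone_from="ncoop57/codeparrot-neo-125M-py",  # target repo seen in the push line above
    revision="royal-monkey-12",                   # experiment branch from the push line
)

save_checkpoint_steps = 1000  # assumed interval; the actual value is not shown in the log


def maybe_checkpoint(step, model, tokenizer, evaluate):
    """Every `save_checkpoint_steps` steps: evaluate, save weights, push to the Hub."""
    if step % save_checkpoint_steps != 0:
        return
    eval_loss = evaluate()                         # corresponds to "Evaluating and saving model checkpoint"
    model.save_pretrained(repo.local_dir)          # write model weights into the local clone
    tokenizer.save_pretrained(repo.local_dir)
    # push_to_hub() commits and pushes; if earlier pushes were skipped or queued,
    # several commits may go upstream at once, which matches the
    # "Several commits (7) will be pushed upstream" warning in the log.
    repo.push_to_hub(commit_message=f"step {step}: eval loss {eval_loss:.4f}")
```

With a hook like this called once per optimizer step, training continues immediately after the push, which is consistent with the log resuming at Step 14000 roughly a minute after the checkpoint message.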
01/27/2022 09:02:52 - INFO - codeparrot_training - Step 14018: {'lr': 0.0004265683125317521, 'samples': 2691648, 'steps': 14018, 'loss/train': 0.7921828329563141} 01/27/2022 09:02:55 - INFO - codeparrot_training - Step 14019: {'lr': 0.0004265567285233658, 'samples': 2691840, 'steps': 14019, 'loss/train': 0.6530756950378418} 01/27/2022 09:02:58 - INFO - codeparrot_training - Step 14020: {'lr': 0.0004265451437586664, 'samples': 2692032, 'steps': 14020, 'loss/train': 0.7967126369476318} 01/27/2022 09:03:01 - INFO - codeparrot_training - Step 14021: {'lr': 0.0004265335582377038, 'samples': 2692224, 'steps': 14021, 'loss/train': 0.9161058962345123} 01/27/2022 09:03:09 - INFO - codeparrot_training - Step 14022: {'lr': 0.0004265219719605273, 'samples': 2692416, 'steps': 14022, 'loss/train': 0.8824150264263153} 01/27/2022 09:03:12 - INFO - codeparrot_training - Step 14023: {'lr': 0.0004265103849271869, 'samples': 2692608, 'steps': 14023, 'loss/train': 0.7025805115699768} 01/27/2022 09:03:15 - INFO - codeparrot_training - Step 14024: {'lr': 0.000426498797137732, 'samples': 2692800, 'steps': 14024, 'loss/train': 0.745609849691391} 01/27/2022 09:03:18 - INFO - codeparrot_training - Step 14025: {'lr': 0.0004264872085922122, 'samples': 2692992, 'steps': 14025, 'loss/train': 0.7472414821386337} 01/27/2022 09:03:21 - INFO - codeparrot_training - Step 14026: {'lr': 0.0004264756192906774, 'samples': 2693184, 'steps': 14026, 'loss/train': 0.8370361626148224} 01/27/2022 09:03:24 - INFO - codeparrot_training - Step 14027: {'lr': 0.000426464029233177, 'samples': 2693376, 'steps': 14027, 'loss/train': 0.4303290992975235} 01/27/2022 09:03:28 - INFO - codeparrot_training - Step 14028: {'lr': 0.0004264524384197608, 'samples': 2693568, 'steps': 14028, 'loss/train': 0.9047146439552307} 01/27/2022 09:03:31 - INFO - codeparrot_training - Step 14029: {'lr': 0.0004264408468504783, 'samples': 2693760, 'steps': 14029, 'loss/train': 0.8816965520381927} 01/27/2022 09:03:35 - INFO - codeparrot_training - Step 14030: {'lr': 0.00042642925452537927, 'samples': 2693952, 'steps': 14030, 'loss/train': 1.0178795456886292} 01/27/2022 09:03:38 - INFO - codeparrot_training - Step 14031: {'lr': 0.0004264176614445133, 'samples': 2694144, 'steps': 14031, 'loss/train': 0.6776069104671478} 01/27/2022 09:03:41 - INFO - codeparrot_training - Step 14032: {'lr': 0.0004264060676079302, 'samples': 2694336, 'steps': 14032, 'loss/train': 0.3680715411901474} 01/27/2022 09:03:45 - INFO - codeparrot_training - Step 14033: {'lr': 0.00042639447301567944, 'samples': 2694528, 'steps': 14033, 'loss/train': 0.47217366099357605} 01/27/2022 09:03:48 - INFO - codeparrot_training - Step 14034: {'lr': 0.0004263828776678108, 'samples': 2694720, 'steps': 14034, 'loss/train': 1.163200557231903} 01/27/2022 09:03:51 - INFO - codeparrot_training - Step 14035: {'lr': 0.00042637128156437385, 'samples': 2694912, 'steps': 14035, 'loss/train': 1.0121871829032898} 01/27/2022 09:03:54 - INFO - codeparrot_training - Step 14036: {'lr': 0.0004263596847054184, 'samples': 2695104, 'steps': 14036, 'loss/train': 0.6059588491916656} 01/27/2022 09:03:57 - INFO - codeparrot_training - Step 14037: {'lr': 0.00042634808709099403, 'samples': 2695296, 'steps': 14037, 'loss/train': 0.9278047978878021} 01/27/2022 09:04:00 - INFO - codeparrot_training - Step 14038: {'lr': 0.0004263364887211505, 'samples': 2695488, 'steps': 14038, 'loss/train': 1.0545045733451843} 01/27/2022 09:04:05 - INFO - codeparrot_training - Step 14039: {'lr': 0.0004263248895959374, 'samples': 2695680, 'steps': 14039, 
'loss/train': 0.9491678774356842} 01/27/2022 09:04:08 - INFO - codeparrot_training - Step 14040: {'lr': 0.0004263132897154044, 'samples': 2695872, 'steps': 14040, 'loss/train': 0.6365851163864136} 01/27/2022 09:04:11 - INFO - codeparrot_training - Step 14041: {'lr': 0.0004263016890796014, 'samples': 2696064, 'steps': 14041, 'loss/train': 0.9167093932628632} 01/27/2022 09:04:14 - INFO - codeparrot_training - Step 14042: {'lr': 0.0004262900876885778, 'samples': 2696256, 'steps': 14042, 'loss/train': 0.5883894860744476} 01/27/2022 09:04:17 - INFO - codeparrot_training - Step 14043: {'lr': 0.0004262784855423836, 'samples': 2696448, 'steps': 14043, 'loss/train': 0.7767824828624725} 01/27/2022 09:04:20 - INFO - codeparrot_training - Step 14044: {'lr': 0.00042626688264106816, 'samples': 2696640, 'steps': 14044, 'loss/train': 1.11994868516922} 01/27/2022 09:04:24 - INFO - codeparrot_training - Step 14045: {'lr': 0.00042625527898468155, 'samples': 2696832, 'steps': 14045, 'loss/train': 0.4896458834409714} 01/27/2022 09:04:27 - INFO - codeparrot_training - Step 14046: {'lr': 0.0004262436745732732, 'samples': 2697024, 'steps': 14046, 'loss/train': 1.068556934595108} 01/27/2022 09:04:30 - INFO - codeparrot_training - Step 14047: {'lr': 0.00042623206940689285, 'samples': 2697216, 'steps': 14047, 'loss/train': 0.8149688243865967} 01/27/2022 09:04:36 - INFO - codeparrot_training - Step 14048: {'lr': 0.00042622046348559034, 'samples': 2697408, 'steps': 14048, 'loss/train': 0.7361974865198135} 01/27/2022 09:04:39 - INFO - codeparrot_training - Step 14049: {'lr': 0.0004262088568094153, 'samples': 2697600, 'steps': 14049, 'loss/train': 1.253288984298706} 01/27/2022 09:04:42 - INFO - codeparrot_training - Step 14050: {'lr': 0.0004261972493784175, 'samples': 2697792, 'steps': 14050, 'loss/train': 1.1130180358886719} 01/27/2022 09:04:46 - INFO - codeparrot_training - Step 14051: {'lr': 0.0004261856411926467, 'samples': 2697984, 'steps': 14051, 'loss/train': 0.9638851583003998} 01/27/2022 09:04:49 - INFO - codeparrot_training - Step 14052: {'lr': 0.0004261740322521525, 'samples': 2698176, 'steps': 14052, 'loss/train': 0.5950068533420563} 01/27/2022 09:04:52 - INFO - codeparrot_training - Step 14053: {'lr': 0.00042616242255698463, 'samples': 2698368, 'steps': 14053, 'loss/train': 0.8018048107624054} 01/27/2022 09:04:55 - INFO - codeparrot_training - Step 14054: {'lr': 0.0004261508121071929, 'samples': 2698560, 'steps': 14054, 'loss/train': 0.7273528575897217} 01/27/2022 09:04:58 - INFO - codeparrot_training - Step 14055: {'lr': 0.00042613920090282706, 'samples': 2698752, 'steps': 14055, 'loss/train': 0.9404013454914093} 01/27/2022 09:05:01 - INFO - codeparrot_training - Step 14056: {'lr': 0.0004261275889439368, 'samples': 2698944, 'steps': 14056, 'loss/train': 0.32792188972234726} 01/27/2022 09:05:06 - INFO - codeparrot_training - Step 14057: {'lr': 0.0004261159762305719, 'samples': 2699136, 'steps': 14057, 'loss/train': 0.8232925236225128} 01/27/2022 09:05:09 - INFO - codeparrot_training - Step 14058: {'lr': 0.00042610436276278196, 'samples': 2699328, 'steps': 14058, 'loss/train': 0.8409935832023621} 01/27/2022 09:05:12 - INFO - codeparrot_training - Step 14059: {'lr': 0.00042609274854061695, 'samples': 2699520, 'steps': 14059, 'loss/train': 1.1656158864498138} 01/27/2022 09:05:15 - INFO - codeparrot_training - Step 14060: {'lr': 0.0004260811335641266, 'samples': 2699712, 'steps': 14060, 'loss/train': 0.491950660943985} 01/27/2022 09:05:18 - INFO - codeparrot_training - Step 14061: {'lr': 0.00042606951783336045, 
'samples': 2699904, 'steps': 14061, 'loss/train': 0.05803181603550911} 01/27/2022 09:05:22 - INFO - codeparrot_training - Step 14062: {'lr': 0.0004260579013483684, 'samples': 2700096, 'steps': 14062, 'loss/train': 0.9512220919132233} 01/27/2022 09:05:25 - INFO - codeparrot_training - Step 14063: {'lr': 0.0004260462841092003, 'samples': 2700288, 'steps': 14063, 'loss/train': 1.027407556772232} 01/27/2022 09:05:28 - INFO - codeparrot_training - Step 14064: {'lr': 0.00042603466611590575, 'samples': 2700480, 'steps': 14064, 'loss/train': 0.7963347136974335} 01/27/2022 09:05:31 - INFO - codeparrot_training - Step 14065: {'lr': 0.00042602304736853464, 'samples': 2700672, 'steps': 14065, 'loss/train': 0.7517253756523132} 01/27/2022 09:05:36 - INFO - codeparrot_training - Step 14066: {'lr': 0.00042601142786713664, 'samples': 2700864, 'steps': 14066, 'loss/train': 1.1680482029914856} 01/27/2022 09:05:39 - INFO - codeparrot_training - Step 14067: {'lr': 0.0004259998076117616, 'samples': 2701056, 'steps': 14067, 'loss/train': 1.5107131004333496} 01/27/2022 09:05:42 - INFO - codeparrot_training - Step 14068: {'lr': 0.00042598818660245926, 'samples': 2701248, 'steps': 14068, 'loss/train': 1.087300032377243} 01/27/2022 09:05:45 - INFO - codeparrot_training - Step 14069: {'lr': 0.00042597656483927936, 'samples': 2701440, 'steps': 14069, 'loss/train': 0.4929453134536743} 01/27/2022 09:05:48 - INFO - codeparrot_training - Step 14070: {'lr': 0.0004259649423222718, 'samples': 2701632, 'steps': 14070, 'loss/train': 0.45052555203437805} 01/27/2022 09:05:52 - INFO - codeparrot_training - Step 14071: {'lr': 0.0004259533190514863, 'samples': 2701824, 'steps': 14071, 'loss/train': 1.63324373960495} 01/27/2022 09:05:55 - INFO - codeparrot_training - Step 14072: {'lr': 0.00042594169502697265, 'samples': 2702016, 'steps': 14072, 'loss/train': 0.3566892296075821} 01/27/2022 09:05:58 - INFO - codeparrot_training - Step 14073: {'lr': 0.0004259300702487806, 'samples': 2702208, 'steps': 14073, 'loss/train': 0.7061193734407425} 01/27/2022 09:06:01 - INFO - codeparrot_training - Step 14074: {'lr': 0.00042591844471696005, 'samples': 2702400, 'steps': 14074, 'loss/train': 0.7390531897544861} 01/27/2022 09:06:07 - INFO - codeparrot_training - Step 14075: {'lr': 0.00042590681843156073, 'samples': 2702592, 'steps': 14075, 'loss/train': 0.8177030682563782} 01/27/2022 09:06:11 - INFO - codeparrot_training - Step 14076: {'lr': 0.00042589519139263246, 'samples': 2702784, 'steps': 14076, 'loss/train': 1.420966923236847} 01/27/2022 09:06:14 - INFO - codeparrot_training - Step 14077: {'lr': 0.0004258835636002251, 'samples': 2702976, 'steps': 14077, 'loss/train': 0.6224155426025391} 01/27/2022 09:06:17 - INFO - codeparrot_training - Step 14078: {'lr': 0.0004258719350543883, 'samples': 2703168, 'steps': 14078, 'loss/train': 0.9374057650566101} 01/27/2022 09:06:20 - INFO - codeparrot_training - Step 14079: {'lr': 0.00042586030575517196, 'samples': 2703360, 'steps': 14079, 'loss/train': 0.5243774056434631} 01/27/2022 09:06:23 - INFO - codeparrot_training - Step 14080: {'lr': 0.00042584867570262595, 'samples': 2703552, 'steps': 14080, 'loss/train': 1.0744171142578125} 01/27/2022 09:06:26 - INFO - codeparrot_training - Step 14081: {'lr': 0.00042583704489680007, 'samples': 2703744, 'steps': 14081, 'loss/train': 0.7812465727329254} 01/27/2022 09:06:29 - INFO - codeparrot_training - Step 14082: {'lr': 0.00042582541333774414, 'samples': 2703936, 'steps': 14082, 'loss/train': 0.7918497025966644} 01/27/2022 09:06:33 - INFO - codeparrot_training - 
Step 14083: {'lr': 0.0004258137810255079, 'samples': 2704128, 'steps': 14083, 'loss/train': 0.757999062538147} 01/27/2022 09:06:37 - INFO - codeparrot_training - Step 14084: {'lr': 0.0004258021479601414, 'samples': 2704320, 'steps': 14084, 'loss/train': 0.809131532907486} 01/27/2022 09:06:41 - INFO - codeparrot_training - Step 14085: {'lr': 0.00042579051414169417, 'samples': 2704512, 'steps': 14085, 'loss/train': 0.7772054672241211} 01/27/2022 09:06:44 - INFO - codeparrot_training - Step 14086: {'lr': 0.0004257788795702162, 'samples': 2704704, 'steps': 14086, 'loss/train': 0.2788681089878082} 01/27/2022 09:06:47 - INFO - codeparrot_training - Step 14087: {'lr': 0.0004257672442457574, 'samples': 2704896, 'steps': 14087, 'loss/train': 0.27799395471811295} 01/27/2022 09:06:50 - INFO - codeparrot_training - Step 14088: {'lr': 0.00042575560816836755, 'samples': 2705088, 'steps': 14088, 'loss/train': 0.875478208065033} 01/27/2022 09:06:53 - INFO - codeparrot_training - Step 14089: {'lr': 0.00042574397133809646, 'samples': 2705280, 'steps': 14089, 'loss/train': 0.924744039773941} 01/27/2022 09:06:56 - INFO - codeparrot_training - Step 14090: {'lr': 0.000425732333754994, 'samples': 2705472, 'steps': 14090, 'loss/train': 0.07110870257019997} 01/27/2022 09:07:00 - INFO - codeparrot_training - Step 14091: {'lr': 0.00042572069541911, 'samples': 2705664, 'steps': 14091, 'loss/train': 0.8246610760688782} 01/27/2022 09:07:03 - INFO - codeparrot_training - Step 14092: {'lr': 0.0004257090563304943, 'samples': 2705856, 'steps': 14092, 'loss/train': 1.0051999390125275} 01/27/2022 09:07:09 - INFO - codeparrot_training - Step 14093: {'lr': 0.0004256974164891969, 'samples': 2706048, 'steps': 14093, 'loss/train': 0.6540942192077637} 01/27/2022 09:07:12 - INFO - codeparrot_training - Step 14094: {'lr': 0.00042568577589526744, 'samples': 2706240, 'steps': 14094, 'loss/train': 1.091096431016922} 01/27/2022 09:07:15 - INFO - codeparrot_training - Step 14095: {'lr': 0.00042567413454875605, 'samples': 2706432, 'steps': 14095, 'loss/train': 0.908374696969986} 01/27/2022 09:07:19 - INFO - codeparrot_training - Step 14096: {'lr': 0.00042566249244971235, 'samples': 2706624, 'steps': 14096, 'loss/train': 0.5112421810626984} 01/27/2022 09:07:22 - INFO - codeparrot_training - Step 14097: {'lr': 0.0004256508495981863, 'samples': 2706816, 'steps': 14097, 'loss/train': 0.5233426541090012} 01/27/2022 09:07:25 - INFO - codeparrot_training - Step 14098: {'lr': 0.00042563920599422776, 'samples': 2707008, 'steps': 14098, 'loss/train': 1.1279726922512054} 01/27/2022 09:07:28 - INFO - codeparrot_training - Step 14099: {'lr': 0.00042562756163788673, 'samples': 2707200, 'steps': 14099, 'loss/train': 1.3508569300174713} 01/27/2022 09:07:31 - INFO - codeparrot_training - Step 14100: {'lr': 0.00042561591652921294, 'samples': 2707392, 'steps': 14100, 'loss/train': 0.8024035692214966} 01/27/2022 09:07:34 - INFO - codeparrot_training - Step 14101: {'lr': 0.00042560427066825636, 'samples': 2707584, 'steps': 14101, 'loss/train': 0.6024859249591827} 01/27/2022 09:07:39 - INFO - codeparrot_training - Step 14102: {'lr': 0.0004255926240550668, 'samples': 2707776, 'steps': 14102, 'loss/train': 1.1285130679607391} 01/27/2022 09:07:42 - INFO - codeparrot_training - Step 14103: {'lr': 0.0004255809766896942, 'samples': 2707968, 'steps': 14103, 'loss/train': 0.7987686395645142} 01/27/2022 09:07:45 - INFO - codeparrot_training - Step 14104: {'lr': 0.00042556932857218855, 'samples': 2708160, 'steps': 14104, 'loss/train': 1.0526391863822937} 01/27/2022 
09:07:48 - INFO - codeparrot_training - Step 14105: {'lr': 0.0004255576797025995, 'samples': 2708352, 'steps': 14105, 'loss/train': 1.005784124135971} 01/27/2022 09:07:52 - INFO - codeparrot_training - Step 14106: {'lr': 0.0004255460300809772, 'samples': 2708544, 'steps': 14106, 'loss/train': 0.7830086052417755} 01/27/2022 09:07:55 - INFO - codeparrot_training - Step 14107: {'lr': 0.00042553437970737143, 'samples': 2708736, 'steps': 14107, 'loss/train': 0.6049012541770935} 01/27/2022 09:07:58 - INFO - codeparrot_training - Step 14108: {'lr': 0.00042552272858183203, 'samples': 2708928, 'steps': 14108, 'loss/train': 0.6687837839126587} 01/27/2022 09:08:01 - INFO - codeparrot_training - Step 14109: {'lr': 0.0004255110767044091, 'samples': 2709120, 'steps': 14109, 'loss/train': 0.5699407160282135} 01/27/2022 09:08:04 - INFO - codeparrot_training - Step 14110: {'lr': 0.0004254994240751524, 'samples': 2709312, 'steps': 14110, 'loss/train': 0.8779260814189911} 01/27/2022 09:08:09 - INFO - codeparrot_training - Step 14111: {'lr': 0.00042548777069411194, 'samples': 2709504, 'steps': 14111, 'loss/train': 0.8589128851890564} 01/27/2022 09:08:12 - INFO - codeparrot_training - Step 14112: {'lr': 0.0004254761165613375, 'samples': 2709696, 'steps': 14112, 'loss/train': 1.1824904680252075} 01/27/2022 09:08:15 - INFO - codeparrot_training - Step 14113: {'lr': 0.00042546446167687914, 'samples': 2709888, 'steps': 14113, 'loss/train': 0.7250030189752579} 01/27/2022 09:08:18 - INFO - codeparrot_training - Step 14114: {'lr': 0.00042545280604078673, 'samples': 2710080, 'steps': 14114, 'loss/train': 1.024240404367447} 01/27/2022 09:08:21 - INFO - codeparrot_training - Step 14115: {'lr': 0.0004254411496531103, 'samples': 2710272, 'steps': 14115, 'loss/train': 0.7177295833826065} 01/27/2022 09:08:24 - INFO - codeparrot_training - Step 14116: {'lr': 0.0004254294925138996, 'samples': 2710464, 'steps': 14116, 'loss/train': 0.44890156388282776} 01/27/2022 09:08:27 - INFO - codeparrot_training - Step 14117: {'lr': 0.00042541783462320473, 'samples': 2710656, 'steps': 14117, 'loss/train': 0.8932174444198608} 01/27/2022 09:08:31 - INFO - codeparrot_training - Step 14118: {'lr': 0.00042540617598107544, 'samples': 2710848, 'steps': 14118, 'loss/train': 0.7107881605625153} 01/27/2022 09:08:37 - INFO - codeparrot_training - Step 14119: {'lr': 0.00042539451658756195, 'samples': 2711040, 'steps': 14119, 'loss/train': 1.136687457561493} 01/27/2022 09:08:40 - INFO - codeparrot_training - Step 14120: {'lr': 0.000425382856442714, 'samples': 2711232, 'steps': 14120, 'loss/train': 1.086502343416214} 01/27/2022 09:08:43 - INFO - codeparrot_training - Step 14121: {'lr': 0.0004253711955465815, 'samples': 2711424, 'steps': 14121, 'loss/train': 0.8080064356327057} 01/27/2022 09:08:46 - INFO - codeparrot_training - Step 14122: {'lr': 0.00042535953389921454, 'samples': 2711616, 'steps': 14122, 'loss/train': 0.9183163046836853} 01/27/2022 09:08:49 - INFO - codeparrot_training - Step 14123: {'lr': 0.000425347871500663, 'samples': 2711808, 'steps': 14123, 'loss/train': 0.9480222165584564} 01/27/2022 09:08:53 - INFO - codeparrot_training - Step 14124: {'lr': 0.0004253362083509769, 'samples': 2712000, 'steps': 14124, 'loss/train': 0.9792280197143555} 01/27/2022 09:08:56 - INFO - codeparrot_training - Step 14125: {'lr': 0.0004253245444502061, 'samples': 2712192, 'steps': 14125, 'loss/train': 0.6016221642494202} 01/27/2022 09:08:59 - INFO - codeparrot_training - Step 14126: {'lr': 0.00042531287979840065, 'samples': 2712384, 'steps': 14126, 
'loss/train': 0.8418521583080292} 01/27/2022 09:09:02 - INFO - codeparrot_training - Step 14127: {'lr': 0.0004253012143956105, 'samples': 2712576, 'steps': 14127, 'loss/train': 0.24277731031179428} 01/27/2022 09:09:07 - INFO - codeparrot_training - Step 14128: {'lr': 0.0004252895482418856, 'samples': 2712768, 'steps': 14128, 'loss/train': 0.8921669125556946} 01/27/2022 09:09:10 - INFO - codeparrot_training - Step 14129: {'lr': 0.00042527788133727595, 'samples': 2712960, 'steps': 14129, 'loss/train': 0.9275230765342712} 01/27/2022 09:09:13 - INFO - codeparrot_training - Step 14130: {'lr': 0.0004252662136818315, 'samples': 2713152, 'steps': 14130, 'loss/train': 0.7922074198722839} 01/27/2022 09:09:16 - INFO - codeparrot_training - Step 14131: {'lr': 0.00042525454527560225, 'samples': 2713344, 'steps': 14131, 'loss/train': 1.1529269814491272} 01/27/2022 09:09:19 - INFO - codeparrot_training - Step 14132: {'lr': 0.0004252428761186382, 'samples': 2713536, 'steps': 14132, 'loss/train': 0.5792557150125504} 01/27/2022 09:09:22 - INFO - codeparrot_training - Step 14133: {'lr': 0.00042523120621098924, 'samples': 2713728, 'steps': 14133, 'loss/train': 1.1692822873592377} 01/27/2022 09:09:25 - INFO - codeparrot_training - Step 14134: {'lr': 0.0004252195355527055, 'samples': 2713920, 'steps': 14134, 'loss/train': 0.6653560996055603} 01/27/2022 09:09:29 - INFO - codeparrot_training - Step 14135: {'lr': 0.0004252078641438369, 'samples': 2714112, 'steps': 14135, 'loss/train': 1.0650108754634857} 01/27/2022 09:09:32 - INFO - codeparrot_training - Step 14136: {'lr': 0.00042519619198443337, 'samples': 2714304, 'steps': 14136, 'loss/train': 0.9998654723167419} 01/27/2022 09:09:37 - INFO - codeparrot_training - Step 14137: {'lr': 0.0004251845190745451, 'samples': 2714496, 'steps': 14137, 'loss/train': 0.8024831414222717} 01/27/2022 09:09:40 - INFO - codeparrot_training - Step 14138: {'lr': 0.00042517284541422195, 'samples': 2714688, 'steps': 14138, 'loss/train': 0.9910060465335846} 01/27/2022 09:09:43 - INFO - codeparrot_training - Step 14139: {'lr': 0.00042516117100351394, 'samples': 2714880, 'steps': 14139, 'loss/train': 0.7594958245754242} 01/27/2022 09:09:46 - INFO - codeparrot_training - Step 14140: {'lr': 0.0004251494958424711, 'samples': 2715072, 'steps': 14140, 'loss/train': 0.7912977933883667} 01/27/2022 09:09:49 - INFO - codeparrot_training - Step 14141: {'lr': 0.0004251378199311434, 'samples': 2715264, 'steps': 14141, 'loss/train': 0.750194638967514} 01/27/2022 09:09:52 - INFO - codeparrot_training - Step 14142: {'lr': 0.0004251261432695809, 'samples': 2715456, 'steps': 14142, 'loss/train': 0.5675356239080429} 01/27/2022 09:09:55 - INFO - codeparrot_training - Step 14143: {'lr': 0.00042511446585783363, 'samples': 2715648, 'steps': 14143, 'loss/train': 1.613798439502716} 01/27/2022 09:09:59 - INFO - codeparrot_training - Step 14144: {'lr': 0.0004251027876959516, 'samples': 2715840, 'steps': 14144, 'loss/train': 0.8720637559890747} 01/27/2022 09:10:02 - INFO - codeparrot_training - Step 14145: {'lr': 0.0004250911087839848, 'samples': 2716032, 'steps': 14145, 'loss/train': 1.0000647604465485} 01/27/2022 09:10:06 - INFO - codeparrot_training - Step 14146: {'lr': 0.0004250794291219833, 'samples': 2716224, 'steps': 14146, 'loss/train': 1.1733674705028534} 01/27/2022 09:10:09 - INFO - codeparrot_training - Step 14147: {'lr': 0.00042506774870999716, 'samples': 2716416, 'steps': 14147, 'loss/train': 0.2114538475871086} 01/27/2022 09:10:12 - INFO - codeparrot_training - Step 14148: {'lr': 
0.00042505606754807634, 'samples': 2716608, 'steps': 14148, 'loss/train': 0.7159367501735687} 01/27/2022 09:10:15 - INFO - codeparrot_training - Step 14149: {'lr': 0.00042504438563627093, 'samples': 2716800, 'steps': 14149, 'loss/train': 1.0990912020206451} 01/27/2022 09:10:19 - INFO - codeparrot_training - Step 14150: {'lr': 0.0004250327029746309, 'samples': 2716992, 'steps': 14150, 'loss/train': 1.0736496448516846} 01/27/2022 09:10:22 - INFO - codeparrot_training - Step 14151: {'lr': 0.0004250210195632064, 'samples': 2717184, 'steps': 14151, 'loss/train': 1.2296795547008514} 01/27/2022 09:10:25 - INFO - codeparrot_training - Step 14152: {'lr': 0.00042500933540204745, 'samples': 2717376, 'steps': 14152, 'loss/train': 0.9675138294696808} 01/27/2022 09:10:28 - INFO - codeparrot_training - Step 14153: {'lr': 0.00042499765049120396, 'samples': 2717568, 'steps': 14153, 'loss/train': 0.4593135863542557} 01/27/2022 09:10:34 - INFO - codeparrot_training - Step 14154: {'lr': 0.0004249859648307263, 'samples': 2717760, 'steps': 14154, 'loss/train': 1.5326351523399353} 01/27/2022 09:10:37 - INFO - codeparrot_training - Step 14155: {'lr': 0.0004249742784206642, 'samples': 2717952, 'steps': 14155, 'loss/train': 0.7025694251060486} 01/27/2022 09:10:41 - INFO - codeparrot_training - Step 14156: {'lr': 0.00042496259126106786, 'samples': 2718144, 'steps': 14156, 'loss/train': 1.1694866716861725} 01/27/2022 09:10:44 - INFO - codeparrot_training - Step 14157: {'lr': 0.00042495090335198735, 'samples': 2718336, 'steps': 14157, 'loss/train': 0.7951121628284454} 01/27/2022 09:10:47 - INFO - codeparrot_training - Step 14158: {'lr': 0.0004249392146934726, 'samples': 2718528, 'steps': 14158, 'loss/train': 0.5545029938220978} 01/27/2022 09:10:50 - INFO - codeparrot_training - Step 14159: {'lr': 0.000424927525285574, 'samples': 2718720, 'steps': 14159, 'loss/train': 0.3984987437725067} 01/27/2022 09:10:53 - INFO - codeparrot_training - Step 14160: {'lr': 0.00042491583512834137, 'samples': 2718912, 'steps': 14160, 'loss/train': 0.5410241782665253} 01/27/2022 09:10:56 - INFO - codeparrot_training - Step 14161: {'lr': 0.00042490414422182484, 'samples': 2719104, 'steps': 14161, 'loss/train': 0.7094369530677795} 01/27/2022 09:11:00 - INFO - codeparrot_training - Step 14162: {'lr': 0.00042489245256607447, 'samples': 2719296, 'steps': 14162, 'loss/train': 1.264877736568451} 01/27/2022 09:11:04 - INFO - codeparrot_training - Step 14163: {'lr': 0.0004248807601611404, 'samples': 2719488, 'steps': 14163, 'loss/train': 1.3792417645454407} 01/27/2022 09:11:07 - INFO - codeparrot_training - Step 14164: {'lr': 0.0004248690670070726, 'samples': 2719680, 'steps': 14164, 'loss/train': 0.8296686708927155} 01/27/2022 09:11:10 - INFO - codeparrot_training - Step 14165: {'lr': 0.00042485737310392135, 'samples': 2719872, 'steps': 14165, 'loss/train': 0.8631308376789093} 01/27/2022 09:11:14 - INFO - codeparrot_training - Step 14166: {'lr': 0.0004248456784517366, 'samples': 2720064, 'steps': 14166, 'loss/train': 0.5005262643098831} 01/27/2022 09:11:17 - INFO - codeparrot_training - Step 14167: {'lr': 0.00042483398305056847, 'samples': 2720256, 'steps': 14167, 'loss/train': 0.13920693844556808} 01/27/2022 09:11:20 - INFO - codeparrot_training - Step 14168: {'lr': 0.0004248222869004671, 'samples': 2720448, 'steps': 14168, 'loss/train': 0.9926440715789795} 01/27/2022 09:11:23 - INFO - codeparrot_training - Step 14169: {'lr': 0.00042481059000148253, 'samples': 2720640, 'steps': 14169, 'loss/train': 0.7302407473325729} 01/27/2022 09:11:26 - INFO 
- codeparrot_training - Step 14170: {'lr': 0.00042479889235366486, 'samples': 2720832, 'steps': 14170, 'loss/train': 0.7028046548366547} 01/27/2022 09:11:29 - INFO - codeparrot_training - Step 14171: {'lr': 0.0004247871939570643, 'samples': 2721024, 'steps': 14171, 'loss/train': 0.674618661403656} 01/27/2022 09:11:34 - INFO - codeparrot_training - Step 14172: {'lr': 0.00042477549481173093, 'samples': 2721216, 'steps': 14172, 'loss/train': 1.0695421993732452} 01/27/2022 09:11:37 - INFO - codeparrot_training - Step 14173: {'lr': 0.00042476379491771475, 'samples': 2721408, 'steps': 14173, 'loss/train': 0.9074954688549042} 01/27/2022 09:11:40 - INFO - codeparrot_training - Step 14174: {'lr': 0.00042475209427506614, 'samples': 2721600, 'steps': 14174, 'loss/train': 0.5739330053329468} 01/27/2022 09:11:43 - INFO - codeparrot_training - Step 14175: {'lr': 0.00042474039288383484, 'samples': 2721792, 'steps': 14175, 'loss/train': 0.584278866648674} 01/27/2022 09:11:46 - INFO - codeparrot_training - Step 14176: {'lr': 0.0004247286907440713, 'samples': 2721984, 'steps': 14176, 'loss/train': 0.5511540323495865} 01/27/2022 09:11:49 - INFO - codeparrot_training - Step 14177: {'lr': 0.00042471698785582546, 'samples': 2722176, 'steps': 14177, 'loss/train': 0.36209555715322495} 01/27/2022 09:11:53 - INFO - codeparrot_training - Step 14178: {'lr': 0.00042470528421914767, 'samples': 2722368, 'steps': 14178, 'loss/train': 1.1904435753822327} 01/27/2022 09:11:56 - INFO - codeparrot_training - Step 14179: {'lr': 0.0004246935798340877, 'samples': 2722560, 'steps': 14179, 'loss/train': 0.6325263977050781} 01/27/2022 09:12:02 - INFO - codeparrot_training - Step 14180: {'lr': 0.0004246818747006961, 'samples': 2722752, 'steps': 14180, 'loss/train': 0.6864745616912842} 01/27/2022 09:12:05 - INFO - codeparrot_training - Step 14181: {'lr': 0.0004246701688190227, 'samples': 2722944, 'steps': 14181, 'loss/train': 0.8472280204296112} 01/27/2022 09:12:08 - INFO - codeparrot_training - Step 14182: {'lr': 0.0004246584621891179, 'samples': 2723136, 'steps': 14182, 'loss/train': 0.043760388158261776} 01/27/2022 09:12:11 - INFO - codeparrot_training - Step 14183: {'lr': 0.00042464675481103154, 'samples': 2723328, 'steps': 14183, 'loss/train': 0.901787281036377} 01/27/2022 09:12:15 - INFO - codeparrot_training - Step 14184: {'lr': 0.00042463504668481403, 'samples': 2723520, 'steps': 14184, 'loss/train': 1.1302764415740967} 01/27/2022 09:12:18 - INFO - codeparrot_training - Step 14185: {'lr': 0.00042462333781051535, 'samples': 2723712, 'steps': 14185, 'loss/train': 0.8666879832744598} 01/27/2022 09:12:21 - INFO - codeparrot_training - Step 14186: {'lr': 0.00042461162818818585, 'samples': 2723904, 'steps': 14186, 'loss/train': 0.9073822796344757} 01/27/2022 09:12:24 - INFO - codeparrot_training - Step 14187: {'lr': 0.0004245999178178755, 'samples': 2724096, 'steps': 14187, 'loss/train': 0.9063083231449127} 01/27/2022 09:12:27 - INFO - codeparrot_training - Step 14188: {'lr': 0.0004245882066996346, 'samples': 2724288, 'steps': 14188, 'loss/train': 0.935711145401001} 01/27/2022 09:12:32 - INFO - codeparrot_training - Step 14189: {'lr': 0.0004245764948335132, 'samples': 2724480, 'steps': 14189, 'loss/train': 0.1890510767698288} 01/27/2022 09:12:35 - INFO - codeparrot_training - Step 14190: {'lr': 0.0004245647822195616, 'samples': 2724672, 'steps': 14190, 'loss/train': 0.6397561132907867} 01/27/2022 09:12:38 - INFO - codeparrot_training - Step 14191: {'lr': 0.00042455306885782985, 'samples': 2724864, 'steps': 14191, 'loss/train': 
0.782521516084671} 01/27/2022 09:12:41 - INFO - codeparrot_training - Step 14192: {'lr': 0.00042454135474836817, 'samples': 2725056, 'steps': 14192, 'loss/train': 1.0616942346096039} 01/27/2022 09:12:44 - INFO - codeparrot_training - Step 14193: {'lr': 0.00042452963989122685, 'samples': 2725248, 'steps': 14193, 'loss/train': 0.6567959189414978} 01/27/2022 09:12:47 - INFO - codeparrot_training - Step 14194: {'lr': 0.00042451792428645587, 'samples': 2725440, 'steps': 14194, 'loss/train': 0.9930361211299896} 01/27/2022 09:12:50 - INFO - codeparrot_training - Step 14195: {'lr': 0.0004245062079341055, 'samples': 2725632, 'steps': 14195, 'loss/train': 0.9698990285396576} 01/27/2022 09:12:54 - INFO - codeparrot_training - Step 14196: {'lr': 0.000424494490834226, 'samples': 2725824, 'steps': 14196, 'loss/train': 0.8933344781398773} 01/27/2022 09:12:57 - INFO - codeparrot_training - Step 14197: {'lr': 0.0004244827729868675, 'samples': 2726016, 'steps': 14197, 'loss/train': 0.6686280816793442} 01/27/2022 09:13:03 - INFO - codeparrot_training - Step 14198: {'lr': 0.00042447105439208024, 'samples': 2726208, 'steps': 14198, 'loss/train': 1.0991587936878204} 01/27/2022 09:13:06 - INFO - codeparrot_training - Step 14199: {'lr': 0.0004244593350499143, 'samples': 2726400, 'steps': 14199, 'loss/train': 0.5852805376052856} 01/27/2022 09:13:09 - INFO - codeparrot_training - Step 14200: {'lr': 0.00042444761496042004, 'samples': 2726592, 'steps': 14200, 'loss/train': 0.8918935060501099} 01/27/2022 09:13:12 - INFO - codeparrot_training - Step 14201: {'lr': 0.0004244358941236476, 'samples': 2726784, 'steps': 14201, 'loss/train': 0.8540935814380646} 01/27/2022 09:13:15 - INFO - codeparrot_training - Step 14202: {'lr': 0.00042442417253964713, 'samples': 2726976, 'steps': 14202, 'loss/train': 0.823919266462326} 01/27/2022 09:13:19 - INFO - codeparrot_training - Step 14203: {'lr': 0.00042441245020846885, 'samples': 2727168, 'steps': 14203, 'loss/train': 0.7311174273490906} 01/27/2022 09:13:22 - INFO - codeparrot_training - Step 14204: {'lr': 0.00042440072713016317, 'samples': 2727360, 'steps': 14204, 'loss/train': 0.901512086391449} 01/27/2022 09:13:25 - INFO - codeparrot_training - Step 14205: {'lr': 0.00042438900330478, 'samples': 2727552, 'steps': 14205, 'loss/train': 1.0390595197677612} 01/27/2022 09:13:28 - INFO - codeparrot_training - Step 14206: {'lr': 0.00042437727873236974, 'samples': 2727744, 'steps': 14206, 'loss/train': 0.6871996968984604} 01/27/2022 09:13:32 - INFO - codeparrot_training - Step 14207: {'lr': 0.00042436555341298266, 'samples': 2727936, 'steps': 14207, 'loss/train': 0.5297298431396484} 01/27/2022 09:13:36 - INFO - codeparrot_training - Step 14208: {'lr': 0.0004243538273466689, 'samples': 2728128, 'steps': 14208, 'loss/train': 0.7079792618751526} 01/27/2022 09:13:39 - INFO - codeparrot_training - Step 14209: {'lr': 0.00042434210053347865, 'samples': 2728320, 'steps': 14209, 'loss/train': 0.5485041439533234} 01/27/2022 09:13:42 - INFO - codeparrot_training - Step 14210: {'lr': 0.0004243303729734622, 'samples': 2728512, 'steps': 14210, 'loss/train': 0.3788662701845169} 01/27/2022 09:13:45 - INFO - codeparrot_training - Step 14211: {'lr': 0.0004243186446666699, 'samples': 2728704, 'steps': 14211, 'loss/train': 0.6294958144426346} 01/27/2022 09:13:48 - INFO - codeparrot_training - Step 14212: {'lr': 0.00042430691561315176, 'samples': 2728896, 'steps': 14212, 'loss/train': 0.659247949719429} 01/27/2022 09:13:51 - INFO - codeparrot_training - Step 14213: {'lr': 0.0004242951858129582, 'samples': 
2729088, 'steps': 14213, 'loss/train': 0.53548564016819} 01/27/2022 09:13:55 - INFO - codeparrot_training - Step 14214: {'lr': 0.0004242834552661394, 'samples': 2729280, 'steps': 14214, 'loss/train': 1.1008204221725464} 01/27/2022 09:13:58 - INFO - codeparrot_training - Step 14215: {'lr': 0.0004242717239727456, 'samples': 2729472, 'steps': 14215, 'loss/train': 0.7000266909599304} 01/27/2022 09:14:02 - INFO - codeparrot_training - Step 14216: {'lr': 0.00042425999193282713, 'samples': 2729664, 'steps': 14216, 'loss/train': 0.9997349381446838} 01/27/2022 09:14:05 - INFO - codeparrot_training - Step 14217: {'lr': 0.0004242482591464342, 'samples': 2729856, 'steps': 14217, 'loss/train': 1.5237582921981812} 01/27/2022 09:14:09 - INFO - codeparrot_training - Step 14218: {'lr': 0.0004242365256136169, 'samples': 2730048, 'steps': 14218, 'loss/train': 1.1358030438423157} 01/27/2022 09:14:12 - INFO - codeparrot_training - Step 14219: {'lr': 0.00042422479133442573, 'samples': 2730240, 'steps': 14219, 'loss/train': 0.7082807868719101} 01/27/2022 09:14:15 - INFO - codeparrot_training - Step 14220: {'lr': 0.00042421305630891093, 'samples': 2730432, 'steps': 14220, 'loss/train': 0.5639189332723618} 01/27/2022 09:14:18 - INFO - codeparrot_training - Step 14221: {'lr': 0.0004242013205371227, 'samples': 2730624, 'steps': 14221, 'loss/train': 1.2490195333957672} 01/27/2022 09:14:21 - INFO - codeparrot_training - Step 14222: {'lr': 0.00042418958401911134, 'samples': 2730816, 'steps': 14222, 'loss/train': 0.9368443787097931} 01/27/2022 09:14:24 - INFO - codeparrot_training - Step 14223: {'lr': 0.000424177846754927, 'samples': 2731008, 'steps': 14223, 'loss/train': 0.3169071525335312} 01/27/2022 09:14:29 - INFO - codeparrot_training - Step 14224: {'lr': 0.0004241661087446202, 'samples': 2731200, 'steps': 14224, 'loss/train': 1.0905480980873108} 01/27/2022 09:14:32 - INFO - codeparrot_training - Step 14225: {'lr': 0.00042415436998824105, 'samples': 2731392, 'steps': 14225, 'loss/train': 0.7522726356983185} 01/27/2022 09:14:35 - INFO - codeparrot_training - Step 14226: {'lr': 0.0004241426304858399, 'samples': 2731584, 'steps': 14226, 'loss/train': 0.9421751797199249} 01/27/2022 09:14:38 - INFO - codeparrot_training - Step 14227: {'lr': 0.00042413089023746696, 'samples': 2731776, 'steps': 14227, 'loss/train': 0.2881450653076172} 01/27/2022 09:14:41 - INFO - codeparrot_training - Step 14228: {'lr': 0.00042411914924317265, 'samples': 2731968, 'steps': 14228, 'loss/train': 1.2160905003547668} 01/27/2022 09:14:45 - INFO - codeparrot_training - Step 14229: {'lr': 0.00042410740750300715, 'samples': 2732160, 'steps': 14229, 'loss/train': 1.1507703959941864} 01/27/2022 09:14:48 - INFO - codeparrot_training - Step 14230: {'lr': 0.0004240956650170208, 'samples': 2732352, 'steps': 14230, 'loss/train': 1.1108221113681793} 01/27/2022 09:14:51 - INFO - codeparrot_training - Step 14231: {'lr': 0.00042408392178526396, 'samples': 2732544, 'steps': 14231, 'loss/train': 0.5989001244306564} 01/27/2022 09:14:54 - INFO - codeparrot_training - Step 14232: {'lr': 0.0004240721778077868, 'samples': 2732736, 'steps': 14232, 'loss/train': 0.5962806791067123} 01/27/2022 09:15:00 - INFO - codeparrot_training - Step 14233: {'lr': 0.0004240604330846397, 'samples': 2732928, 'steps': 14233, 'loss/train': 0.8390455842018127} 01/27/2022 09:15:04 - INFO - codeparrot_training - Step 14234: {'lr': 0.000424048687615873, 'samples': 2733120, 'steps': 14234, 'loss/train': 1.0985259711742401} 01/27/2022 09:15:07 - INFO - codeparrot_training - Step 14235: 
{'lr': 0.00042403694140153705, 'samples': 2733312, 'steps': 14235, 'loss/train': 1.0540865063667297} 01/27/2022 09:15:10 - INFO - codeparrot_training - Step 14236: {'lr': 0.00042402519444168207, 'samples': 2733504, 'steps': 14236, 'loss/train': 0.9751394391059875} 01/27/2022 09:15:13 - INFO - codeparrot_training - Step 14237: {'lr': 0.00042401344673635846, 'samples': 2733696, 'steps': 14237, 'loss/train': 1.0729214251041412} 01/27/2022 09:15:16 - INFO - codeparrot_training - Step 14238: {'lr': 0.00042400169828561636, 'samples': 2733888, 'steps': 14238, 'loss/train': 0.7427747994661331} 01/27/2022 09:15:19 - INFO - codeparrot_training - Step 14239: {'lr': 0.0004239899490895063, 'samples': 2734080, 'steps': 14239, 'loss/train': 0.7567748129367828} 01/27/2022 09:15:23 - INFO - codeparrot_training - Step 14240: {'lr': 0.00042397819914807855, 'samples': 2734272, 'steps': 14240, 'loss/train': 0.9293297231197357} 01/27/2022 09:15:26 - INFO - codeparrot_training - Step 14241: {'lr': 0.00042396644846138355, 'samples': 2734464, 'steps': 14241, 'loss/train': 0.9282006025314331} 01/27/2022 09:15:30 - INFO - codeparrot_training - Step 14242: {'lr': 0.00042395469702947135, 'samples': 2734656, 'steps': 14242, 'loss/train': 0.26918093115091324} 01/27/2022 09:15:33 - INFO - codeparrot_training - Step 14243: {'lr': 0.0004239429448523925, 'samples': 2734848, 'steps': 14243, 'loss/train': 0.544265404343605} 01/27/2022 09:15:37 - INFO - codeparrot_training - Step 14244: {'lr': 0.00042393119193019743, 'samples': 2735040, 'steps': 14244, 'loss/train': 0.8511837422847748} 01/27/2022 09:15:40 - INFO - codeparrot_training - Step 14245: {'lr': 0.00042391943826293623, 'samples': 2735232, 'steps': 14245, 'loss/train': 0.6432247906923294} 01/27/2022 09:15:43 - INFO - codeparrot_training - Step 14246: {'lr': 0.0004239076838506595, 'samples': 2735424, 'steps': 14246, 'loss/train': 0.7615760564804077} 01/27/2022 09:15:46 - INFO - codeparrot_training - Step 14247: {'lr': 0.0004238959286934174, 'samples': 2735616, 'steps': 14247, 'loss/train': 0.9416446387767792} 01/27/2022 09:15:49 - INFO - codeparrot_training - Step 14248: {'lr': 0.0004238841727912603, 'samples': 2735808, 'steps': 14248, 'loss/train': 0.9987920522689819} 01/27/2022 09:15:52 - INFO - codeparrot_training - Step 14249: {'lr': 0.00042387241614423875, 'samples': 2736000, 'steps': 14249, 'loss/train': 0.8325115442276001} 01/27/2022 09:15:55 - INFO - codeparrot_training - Step 14250: {'lr': 0.0004238606587524029, 'samples': 2736192, 'steps': 14250, 'loss/train': 0.937267005443573} 01/27/2022 09:16:00 - INFO - codeparrot_training - Step 14251: {'lr': 0.0004238489006158033, 'samples': 2736384, 'steps': 14251, 'loss/train': 1.268635869026184} 01/27/2022 09:16:03 - INFO - codeparrot_training - Step 14252: {'lr': 0.00042383714173449007, 'samples': 2736576, 'steps': 14252, 'loss/train': 1.4034784734249115} 01/27/2022 09:16:06 - INFO - codeparrot_training - Step 14253: {'lr': 0.0004238253821085138, 'samples': 2736768, 'steps': 14253, 'loss/train': 1.039599359035492} 01/27/2022 09:16:09 - INFO - codeparrot_training - Step 14254: {'lr': 0.00042381362173792475, 'samples': 2736960, 'steps': 14254, 'loss/train': 1.2722280621528625} 01/27/2022 09:16:13 - INFO - codeparrot_training - Step 14255: {'lr': 0.00042380186062277337, 'samples': 2737152, 'steps': 14255, 'loss/train': 0.7872245907783508} 01/27/2022 09:16:16 - INFO - codeparrot_training - Step 14256: {'lr': 0.00042379009876311, 'samples': 2737344, 'steps': 14256, 'loss/train': 0.624341681599617} 01/27/2022 09:16:19 - 
INFO - codeparrot_training - Step 14257: {'lr': 0.00042377833615898496, 'samples': 2737536, 'steps': 14257, 'loss/train': 0.6198979765176773} 01/27/2022 09:16:22 - INFO - codeparrot_training - Step 14258: {'lr': 0.0004237665728104488, 'samples': 2737728, 'steps': 14258, 'loss/train': 0.6413855999708176} 01/27/2022 09:16:25 - INFO - codeparrot_training - Step 14259: {'lr': 0.0004237548087175518, 'samples': 2737920, 'steps': 14259, 'loss/train': 0.36263226717710495} 01/27/2022 09:16:31 - INFO - codeparrot_training - Step 14260: {'lr': 0.00042374304388034437, 'samples': 2738112, 'steps': 14260, 'loss/train': 0.9732244312763214} 01/27/2022 09:16:34 - INFO - codeparrot_training - Step 14261: {'lr': 0.00042373127829887694, 'samples': 2738304, 'steps': 14261, 'loss/train': 0.7516420483589172} 01/27/2022 09:16:38 - INFO - codeparrot_training - Step 14262: {'lr': 0.0004237195119731998, 'samples': 2738496, 'steps': 14262, 'loss/train': 0.5315220057964325} 01/27/2022 09:16:41 - INFO - codeparrot_training - Step 14263: {'lr': 0.0004237077449033635, 'samples': 2738688, 'steps': 14263, 'loss/train': 0.7832159399986267} 01/27/2022 09:16:44 - INFO - codeparrot_training - Step 14264: {'lr': 0.0004236959770894183, 'samples': 2738880, 'steps': 14264, 'loss/train': 0.9474566280841827} 01/27/2022 09:16:47 - INFO - codeparrot_training - Step 14265: {'lr': 0.0004236842085314148, 'samples': 2739072, 'steps': 14265, 'loss/train': 0.586104691028595} 01/27/2022 09:16:50 - INFO - codeparrot_training - Step 14266: {'lr': 0.0004236724392294032, 'samples': 2739264, 'steps': 14266, 'loss/train': 1.1891035437583923} 01/27/2022 09:16:53 - INFO - codeparrot_training - Step 14267: {'lr': 0.0004236606691834341, 'samples': 2739456, 'steps': 14267, 'loss/train': 0.666012167930603} 01/27/2022 09:16:58 - INFO - codeparrot_training - Step 14268: {'lr': 0.0004236488983935578, 'samples': 2739648, 'steps': 14268, 'loss/train': 0.8797098398208618} 01/27/2022 09:17:01 - INFO - codeparrot_training - Step 14269: {'lr': 0.0004236371268598248, 'samples': 2739840, 'steps': 14269, 'loss/train': 0.6376208961009979} 01/27/2022 09:17:04 - INFO - codeparrot_training - Step 14270: {'lr': 0.0004236253545822855, 'samples': 2740032, 'steps': 14270, 'loss/train': 0.5103703290224075} 01/27/2022 09:17:07 - INFO - codeparrot_training - Step 14271: {'lr': 0.00042361358156099016, 'samples': 2740224, 'steps': 14271, 'loss/train': 0.8884707391262054} 01/27/2022 09:17:10 - INFO - codeparrot_training - Step 14272: {'lr': 0.0004236018077959895, 'samples': 2740416, 'steps': 14272, 'loss/train': 0.20819831639528275} 01/27/2022 09:17:14 - INFO - codeparrot_training - Step 14273: {'lr': 0.00042359003328733383, 'samples': 2740608, 'steps': 14273, 'loss/train': 0.9746081829071045} 01/27/2022 09:17:17 - INFO - codeparrot_training - Step 14274: {'lr': 0.0004235782580350734, 'samples': 2740800, 'steps': 14274, 'loss/train': 0.5404078960418701} 01/27/2022 09:17:20 - INFO - codeparrot_training - Step 14275: {'lr': 0.0004235664820392591, 'samples': 2740992, 'steps': 14275, 'loss/train': 0.6333228796720505} 01/27/2022 09:17:23 - INFO - codeparrot_training - Step 14276: {'lr': 0.0004235547052999409, 'samples': 2741184, 'steps': 14276, 'loss/train': 1.0300936102867126} 01/27/2022 09:17:29 - INFO - codeparrot_training - Step 14277: {'lr': 0.0004235429278171695, 'samples': 2741376, 'steps': 14277, 'loss/train': 1.0848921239376068} 01/27/2022 09:17:32 - INFO - codeparrot_training - Step 14278: {'lr': 0.00042353114959099535, 'samples': 2741568, 'steps': 14278, 'loss/train': 
0.7284392416477203} 01/27/2022 09:17:36 - INFO - codeparrot_training - Step 14279: {'lr': 0.0004235193706214688, 'samples': 2741760, 'steps': 14279, 'loss/train': 0.3238080069422722} 01/27/2022 09:17:39 - INFO - codeparrot_training - Step 14280: {'lr': 0.00042350759090864043, 'samples': 2741952, 'steps': 14280, 'loss/train': 0.9572981894016266} 01/27/2022 09:17:42 - INFO - codeparrot_training - Step 14281: {'lr': 0.00042349581045256055, 'samples': 2742144, 'steps': 14281, 'loss/train': 0.5750007033348083} 01/27/2022 09:17:45 - INFO - codeparrot_training - Step 14282: {'lr': 0.00042348402925327977, 'samples': 2742336, 'steps': 14282, 'loss/train': 0.8269672393798828} 01/27/2022 09:17:48 - INFO - codeparrot_training - Step 14283: {'lr': 0.00042347224731084854, 'samples': 2742528, 'steps': 14283, 'loss/train': 0.5129463225603104} 01/27/2022 09:17:51 - INFO - codeparrot_training - Step 14284: {'lr': 0.0004234604646253172, 'samples': 2742720, 'steps': 14284, 'loss/train': 1.0970392227172852} 01/27/2022 09:17:54 - INFO - codeparrot_training - Step 14285: {'lr': 0.0004234486811967364, 'samples': 2742912, 'steps': 14285, 'loss/train': 2.0517890453338623} 01/27/2022 09:17:59 - INFO - codeparrot_training - Step 14286: {'lr': 0.00042343689702515643, 'samples': 2743104, 'steps': 14286, 'loss/train': 0.9682847857475281} 01/27/2022 09:18:02 - INFO - codeparrot_training - Step 14287: {'lr': 0.0004234251121106279, 'samples': 2743296, 'steps': 14287, 'loss/train': 1.1051902770996094} 01/27/2022 09:18:05 - INFO - codeparrot_training - Step 14288: {'lr': 0.00042341332645320126, 'samples': 2743488, 'steps': 14288, 'loss/train': 0.5120789408683777} 01/27/2022 09:18:08 - INFO - codeparrot_training - Step 14289: {'lr': 0.000423401540052927, 'samples': 2743680, 'steps': 14289, 'loss/train': 0.5988226979970932} 01/27/2022 09:18:12 - INFO - codeparrot_training - Step 14290: {'lr': 0.0004233897529098556, 'samples': 2743872, 'steps': 14290, 'loss/train': 0.9767035245895386} 01/27/2022 09:18:15 - INFO - codeparrot_training - Step 14291: {'lr': 0.0004233779650240376, 'samples': 2744064, 'steps': 14291, 'loss/train': 1.0035127401351929} 01/27/2022 09:18:18 - INFO - codeparrot_training - Step 14292: {'lr': 0.00042336617639552335, 'samples': 2744256, 'steps': 14292, 'loss/train': 0.7534035444259644} 01/27/2022 09:18:21 - INFO - codeparrot_training - Step 14293: {'lr': 0.00042335438702436354, 'samples': 2744448, 'steps': 14293, 'loss/train': 0.8752596974372864} 01/27/2022 09:18:24 - INFO - codeparrot_training - Step 14294: {'lr': 0.0004233425969106085, 'samples': 2744640, 'steps': 14294, 'loss/train': 1.3263380527496338} 01/27/2022 09:18:29 - INFO - codeparrot_training - Step 14295: {'lr': 0.00042333080605430883, 'samples': 2744832, 'steps': 14295, 'loss/train': 0.7577779591083527} 01/27/2022 09:18:32 - INFO - codeparrot_training - Step 14296: {'lr': 0.00042331901445551514, 'samples': 2745024, 'steps': 14296, 'loss/train': 0.7451578080654144} 01/27/2022 09:18:35 - INFO - codeparrot_training - Step 14297: {'lr': 0.00042330722211427775, 'samples': 2745216, 'steps': 14297, 'loss/train': 0.9192924499511719} 01/27/2022 09:18:38 - INFO - codeparrot_training - Step 14298: {'lr': 0.00042329542903064724, 'samples': 2745408, 'steps': 14298, 'loss/train': 1.1104785203933716} 01/27/2022 09:18:41 - INFO - codeparrot_training - Step 14299: {'lr': 0.00042328363520467417, 'samples': 2745600, 'steps': 14299, 'loss/train': 0.9583881497383118} 01/27/2022 09:18:44 - INFO - codeparrot_training - Step 14300: {'lr': 0.000423271840636409, 
'samples': 2745792, 'steps': 14300, 'loss/train': 0.513668954372406} 01/27/2022 09:18:47 - INFO - codeparrot_training - Step 14301: {'lr': 0.0004232600453259023, 'samples': 2745984, 'steps': 14301, 'loss/train': 1.0602406561374664} 01/27/2022 09:18:51 - INFO - codeparrot_training - Step 14302: {'lr': 0.0004232482492732046, 'samples': 2746176, 'steps': 14302, 'loss/train': 0.9703687727451324} 01/27/2022 09:18:54 - INFO - codeparrot_training - Step 14303: {'lr': 0.00042323645247836636, 'samples': 2746368, 'steps': 14303, 'loss/train': 0.4998127520084381} 01/27/2022 09:19:00 - INFO - codeparrot_training - Step 14304: {'lr': 0.00042322465494143814, 'samples': 2746560, 'steps': 14304, 'loss/train': 0.5433225631713867} 01/27/2022 09:19:03 - INFO - codeparrot_training - Step 14305: {'lr': 0.00042321285666247063, 'samples': 2746752, 'steps': 14305, 'loss/train': 0.933937668800354} 01/27/2022 09:19:06 - INFO - codeparrot_training - Step 14306: {'lr': 0.0004232010576415141, 'samples': 2746944, 'steps': 14306, 'loss/train': 0.7979668378829956} 01/27/2022 09:19:09 - INFO - codeparrot_training - Step 14307: {'lr': 0.00042318925787861937, 'samples': 2747136, 'steps': 14307, 'loss/train': 0.5648932009935379} 01/27/2022 09:19:12 - INFO - codeparrot_training - Step 14308: {'lr': 0.0004231774573738367, 'samples': 2747328, 'steps': 14308, 'loss/train': 0.5788000077009201} 01/27/2022 09:19:16 - INFO - codeparrot_training - Step 14309: {'lr': 0.000423165656127217, 'samples': 2747520, 'steps': 14309, 'loss/train': 0.37852974236011505} 01/27/2022 09:19:19 - INFO - codeparrot_training - Step 14310: {'lr': 0.00042315385413881047, 'samples': 2747712, 'steps': 14310, 'loss/train': 1.3541596233844757} 01/27/2022 09:19:22 - INFO - codeparrot_training - Step 14311: {'lr': 0.00042314205140866785, 'samples': 2747904, 'steps': 14311, 'loss/train': 0.9080301225185394} 01/27/2022 09:19:27 - INFO - codeparrot_training - Step 14312: {'lr': 0.00042313024793683965, 'samples': 2748096, 'steps': 14312, 'loss/train': 0.6868321895599365} 01/27/2022 09:19:30 - INFO - codeparrot_training - Step 14313: {'lr': 0.0004231184437233765, 'samples': 2748288, 'steps': 14313, 'loss/train': 0.7418138980865479} 01/27/2022 09:19:33 - INFO - codeparrot_training - Step 14314: {'lr': 0.0004231066387683288, 'samples': 2748480, 'steps': 14314, 'loss/train': 0.8356214761734009} 01/27/2022 09:19:36 - INFO - codeparrot_training - Step 14315: {'lr': 0.0004230948330717472, 'samples': 2748672, 'steps': 14315, 'loss/train': 0.618102103471756} 01/27/2022 09:19:40 - INFO - codeparrot_training - Step 14316: {'lr': 0.0004230830266336825, 'samples': 2748864, 'steps': 14316, 'loss/train': 0.586384579539299} 01/27/2022 09:19:43 - INFO - codeparrot_training - Step 14317: {'lr': 0.00042307121945418493, 'samples': 2749056, 'steps': 14317, 'loss/train': 0.4852640926837921} 01/27/2022 09:19:46 - INFO - codeparrot_training - Step 14318: {'lr': 0.00042305941153330525, 'samples': 2749248, 'steps': 14318, 'loss/train': 1.0242800116539001} 01/27/2022 09:19:49 - INFO - codeparrot_training - Step 14319: {'lr': 0.00042304760287109394, 'samples': 2749440, 'steps': 14319, 'loss/train': 0.8425469398498535} 01/27/2022 09:19:52 - INFO - codeparrot_training - Step 14320: {'lr': 0.0004230357934676017, 'samples': 2749632, 'steps': 14320, 'loss/train': 0.4784736782312393} 01/27/2022 09:19:55 - INFO - codeparrot_training - Step 14321: {'lr': 0.00042302398332287903, 'samples': 2749824, 'steps': 14321, 'loss/train': 0.251325823366642} 01/27/2022 09:20:00 - INFO - codeparrot_training - Step 
14322: {'lr': 0.00042301217243697665, 'samples': 2750016, 'steps': 14322, 'loss/train': 0.8205897510051727} 01/27/2022 09:20:03 - INFO - codeparrot_training - Step 14323: {'lr': 0.00042300036080994495, 'samples': 2750208, 'steps': 14323, 'loss/train': 0.6214554458856583} 01/27/2022 09:20:06 - INFO - codeparrot_training - Step 14324: {'lr': 0.00042298854844183476, 'samples': 2750400, 'steps': 14324, 'loss/train': 0.7097420990467072} 01/27/2022 09:20:09 - INFO - codeparrot_training - Step 14325: {'lr': 0.0004229767353326964, 'samples': 2750592, 'steps': 14325, 'loss/train': 0.3682222366333008} 01/27/2022 09:20:12 - INFO - codeparrot_training - Step 14326: {'lr': 0.0004229649214825808, 'samples': 2750784, 'steps': 14326, 'loss/train': 0.588453009724617} 01/27/2022 09:20:16 - INFO - codeparrot_training - Step 14327: {'lr': 0.0004229531068915383, 'samples': 2750976, 'steps': 14327, 'loss/train': 1.2690320312976837} 01/27/2022 09:20:19 - INFO - codeparrot_training - Step 14328: {'lr': 0.0004229412915596196, 'samples': 2751168, 'steps': 14328, 'loss/train': 1.0698515474796295} 01/27/2022 09:20:22 - INFO - codeparrot_training - Step 14329: {'lr': 0.0004229294754868754, 'samples': 2751360, 'steps': 14329, 'loss/train': 0.6015733927488327} 01/27/2022 09:20:25 - INFO - codeparrot_training - Step 14330: {'lr': 0.0004229176586733562, 'samples': 2751552, 'steps': 14330, 'loss/train': 0.6864897608757019} 01/27/2022 09:20:29 - INFO - codeparrot_training - Step 14331: {'lr': 0.0004229058411191126, 'samples': 2751744, 'steps': 14331, 'loss/train': 0.7860829532146454} 01/27/2022 09:20:32 - INFO - codeparrot_training - Step 14332: {'lr': 0.0004228940228241953, 'samples': 2751936, 'steps': 14332, 'loss/train': 0.8003464937210083} 01/27/2022 09:20:36 - INFO - codeparrot_training - Step 14333: {'lr': 0.0004228822037886549, 'samples': 2752128, 'steps': 14333, 'loss/train': 0.5524534732103348} 01/27/2022 09:20:39 - INFO - codeparrot_training - Step 14334: {'lr': 0.00042287038401254214, 'samples': 2752320, 'steps': 14334, 'loss/train': 0.94110506772995} 01/27/2022 09:20:42 - INFO - codeparrot_training - Step 14335: {'lr': 0.00042285856349590746, 'samples': 2752512, 'steps': 14335, 'loss/train': 1.186696618795395} 01/27/2022 09:20:45 - INFO - codeparrot_training - Step 14336: {'lr': 0.0004228467422388016, 'samples': 2752704, 'steps': 14336, 'loss/train': 0.5758137702941895} 01/27/2022 09:20:48 - INFO - codeparrot_training - Step 14337: {'lr': 0.00042283492024127524, 'samples': 2752896, 'steps': 14337, 'loss/train': 1.2521802484989166} 01/27/2022 09:20:51 - INFO - codeparrot_training - Step 14338: {'lr': 0.00042282309750337887, 'samples': 2753088, 'steps': 14338, 'loss/train': 0.534479945898056} 01/27/2022 09:20:57 - INFO - codeparrot_training - Step 14339: {'lr': 0.0004228112740251632, 'samples': 2753280, 'steps': 14339, 'loss/train': 0.7077648192644119} 01/27/2022 09:21:01 - INFO - codeparrot_training - Step 14340: {'lr': 0.00042279944980667906, 'samples': 2753472, 'steps': 14340, 'loss/train': 0.7233779579401016} 01/27/2022 09:21:04 - INFO - codeparrot_training - Step 14341: {'lr': 0.00042278762484797684, 'samples': 2753664, 'steps': 14341, 'loss/train': 0.43581072986125946} 01/27/2022 09:21:07 - INFO - codeparrot_training - Step 14342: {'lr': 0.0004227757991491073, 'samples': 2753856, 'steps': 14342, 'loss/train': 1.0082566738128662} 01/27/2022 09:21:10 - INFO - codeparrot_training - Step 14343: {'lr': 0.0004227639727101211, 'samples': 2754048, 'steps': 14343, 'loss/train': 1.0097726583480835} 01/27/2022 09:21:13 
- INFO - codeparrot_training - Step 14344: {'lr': 0.0004227521455310689, 'samples': 2754240, 'steps': 14344, 'loss/train': 0.38958849012851715} 01/27/2022 09:21:16 - INFO - codeparrot_training - Step 14345: {'lr': 0.0004227403176120014, 'samples': 2754432, 'steps': 14345, 'loss/train': 1.1212505400180817} 01/27/2022 09:21:19 - INFO - codeparrot_training - Step 14346: {'lr': 0.00042272848895296924, 'samples': 2754624, 'steps': 14346, 'loss/train': 0.6210549473762512} 01/27/2022 09:21:23 - INFO - codeparrot_training - Step 14347: {'lr': 0.000422716659554023, 'samples': 2754816, 'steps': 14347, 'loss/train': 1.1838494539260864} 01/27/2022 09:21:27 - INFO - codeparrot_training - Step 14348: {'lr': 0.00042270482941521347, 'samples': 2755008, 'steps': 14348, 'loss/train': 0.5793857127428055} 01/27/2022 09:21:30 - INFO - codeparrot_training - Step 14349: {'lr': 0.0004226929985365913, 'samples': 2755200, 'steps': 14349, 'loss/train': 1.1945917010307312} 01/27/2022 09:21:33 - INFO - codeparrot_training - Step 14350: {'lr': 0.00042268116691820723, 'samples': 2755392, 'steps': 14350, 'loss/train': 0.22134210169315338} 01/27/2022 09:21:36 - INFO - codeparrot_training - Step 14351: {'lr': 0.00042266933456011174, 'samples': 2755584, 'steps': 14351, 'loss/train': 0.5421672910451889} 01/27/2022 09:21:40 - INFO - codeparrot_training - Step 14352: {'lr': 0.0004226575014623557, 'samples': 2755776, 'steps': 14352, 'loss/train': 0.7268961668014526} 01/27/2022 09:21:43 - INFO - codeparrot_training - Step 14353: {'lr': 0.0004226456676249898, 'samples': 2755968, 'steps': 14353, 'loss/train': 1.1724359393119812} 01/27/2022 09:21:46 - INFO - codeparrot_training - Step 14354: {'lr': 0.0004226338330480646, 'samples': 2756160, 'steps': 14354, 'loss/train': 0.5814786404371262} 01/27/2022 09:21:49 - INFO - codeparrot_training - Step 14355: {'lr': 0.00042262199773163096, 'samples': 2756352, 'steps': 14355, 'loss/train': 1.291445016860962} 01/27/2022 09:21:52 - INFO - codeparrot_training - Step 14356: {'lr': 0.00042261016167573944, 'samples': 2756544, 'steps': 14356, 'loss/train': 0.8525928854942322} 01/27/2022 09:21:58 - INFO - codeparrot_training - Step 14357: {'lr': 0.0004225983248804408, 'samples': 2756736, 'steps': 14357, 'loss/train': 0.9238451421260834} 01/27/2022 09:22:01 - INFO - codeparrot_training - Step 14358: {'lr': 0.0004225864873457858, 'samples': 2756928, 'steps': 14358, 'loss/train': 0.5528281331062317} 01/27/2022 09:22:04 - INFO - codeparrot_training - Step 14359: {'lr': 0.0004225746490718251, 'samples': 2757120, 'steps': 14359, 'loss/train': 0.8927678167819977} 01/27/2022 09:22:08 - INFO - codeparrot_training - Step 14360: {'lr': 0.0004225628100586093, 'samples': 2757312, 'steps': 14360, 'loss/train': 0.8085581660270691} 01/27/2022 09:22:11 - INFO - codeparrot_training - Step 14361: {'lr': 0.0004225509703061893, 'samples': 2757504, 'steps': 14361, 'loss/train': 0.7228201925754547} 01/27/2022 09:22:14 - INFO - codeparrot_training - Step 14362: {'lr': 0.0004225391298146157, 'samples': 2757696, 'steps': 14362, 'loss/train': 0.44779010117053986} 01/27/2022 09:22:17 - INFO - codeparrot_training - Step 14363: {'lr': 0.0004225272885839392, 'samples': 2757888, 'steps': 14363, 'loss/train': 0.8764284253120422} 01/27/2022 09:22:20 - INFO - codeparrot_training - Step 14364: {'lr': 0.0004225154466142107, 'samples': 2758080, 'steps': 14364, 'loss/train': 1.0763943493366241} 01/27/2022 09:22:24 - INFO - codeparrot_training - Step 14365: {'lr': 0.0004225036039054807, 'samples': 2758272, 'steps': 14365, 'loss/train': 
0.45451095700263977} 01/27/2022 09:22:28 - INFO - codeparrot_training - Step 14366: {'lr': 0.00042249176045780013, 'samples': 2758464, 'steps': 14366, 'loss/train': 2.0474377870559692} 01/27/2022 09:22:31 - INFO - codeparrot_training - Step 14367: {'lr': 0.0004224799162712195, 'samples': 2758656, 'steps': 14367, 'loss/train': 0.9160047769546509} 01/27/2022 09:22:34 - INFO - codeparrot_training - Step 14368: {'lr': 0.0004224680713457898, 'samples': 2758848, 'steps': 14368, 'loss/train': 0.6571607440710068} 01/27/2022 09:22:37 - INFO - codeparrot_training - Step 14369: {'lr': 0.00042245622568156164, 'samples': 2759040, 'steps': 14369, 'loss/train': 0.8708453178405762} 01/27/2022 09:22:40 - INFO - codeparrot_training - Step 14370: {'lr': 0.0004224443792785857, 'samples': 2759232, 'steps': 14370, 'loss/train': 0.5567598938941956} 01/27/2022 09:22:43 - INFO - codeparrot_training - Step 14371: {'lr': 0.0004224325321369128, 'samples': 2759424, 'steps': 14371, 'loss/train': 1.2571393847465515} 01/27/2022 09:22:47 - INFO - codeparrot_training - Step 14372: {'lr': 0.0004224206842565937, 'samples': 2759616, 'steps': 14372, 'loss/train': 0.6731864959001541} 01/27/2022 09:22:50 - INFO - codeparrot_training - Step 14373: {'lr': 0.00042240883563767916, 'samples': 2759808, 'steps': 14373, 'loss/train': 0.8399308919906616} 01/27/2022 09:22:54 - INFO - codeparrot_training - Step 14374: {'lr': 0.00042239698628021994, 'samples': 2760000, 'steps': 14374, 'loss/train': 0.9541171789169312} 01/27/2022 09:22:57 - INFO - codeparrot_training - Step 14375: {'lr': 0.0004223851361842668, 'samples': 2760192, 'steps': 14375, 'loss/train': 1.4306694567203522} 01/27/2022 09:23:00 - INFO - codeparrot_training - Step 14376: {'lr': 0.00042237328534987034, 'samples': 2760384, 'steps': 14376, 'loss/train': 1.4691174924373627} 01/27/2022 09:23:04 - INFO - codeparrot_training - Step 14377: {'lr': 0.0004223614337770816, 'samples': 2760576, 'steps': 14377, 'loss/train': 0.7572878301143646} 01/27/2022 09:23:07 - INFO - codeparrot_training - Step 14378: {'lr': 0.0004223495814659511, 'samples': 2760768, 'steps': 14378, 'loss/train': 0.8255065977573395} 01/27/2022 09:23:10 - INFO - codeparrot_training - Step 14379: {'lr': 0.00042233772841652974, 'samples': 2760960, 'steps': 14379, 'loss/train': 0.836522787809372} 01/27/2022 09:23:13 - INFO - codeparrot_training - Step 14380: {'lr': 0.00042232587462886833, 'samples': 2761152, 'steps': 14380, 'loss/train': 0.7791426479816437} 01/27/2022 09:23:16 - INFO - codeparrot_training - Step 14381: {'lr': 0.0004223140201030176, 'samples': 2761344, 'steps': 14381, 'loss/train': 0.5605128407478333} 01/27/2022 09:23:19 - INFO - codeparrot_training - Step 14382: {'lr': 0.0004223021648390283, 'samples': 2761536, 'steps': 14382, 'loss/train': 0.8333341777324677} 01/27/2022 09:23:26 - INFO - codeparrot_training - Step 14383: {'lr': 0.0004222903088369512, 'samples': 2761728, 'steps': 14383, 'loss/train': 0.14003561809659004} 01/27/2022 09:23:29 - INFO - codeparrot_training - Step 14384: {'lr': 0.0004222784520968371, 'samples': 2761920, 'steps': 14384, 'loss/train': 0.7671217918395996} 01/27/2022 09:23:32 - INFO - codeparrot_training - Step 14385: {'lr': 0.000422266594618737, 'samples': 2762112, 'steps': 14385, 'loss/train': 0.7966287732124329} 01/27/2022 09:23:35 - INFO - codeparrot_training - Step 14386: {'lr': 0.0004222547364027013, 'samples': 2762304, 'steps': 14386, 'loss/train': 1.267805814743042} 01/27/2022 09:23:39 - INFO - codeparrot_training - Step 14387: {'lr': 0.0004222428774487811, 'samples': 
2762496, 'steps': 14387, 'loss/train': 1.0170113146305084} 01/27/2022 09:23:42 - INFO - codeparrot_training - Step 14388: {'lr': 0.00042223101775702704, 'samples': 2762688, 'steps': 14388, 'loss/train': 0.8901536464691162} 01/27/2022 09:23:45 - INFO - codeparrot_training - Step 14389: {'lr': 0.00042221915732749006, 'samples': 2762880, 'steps': 14389, 'loss/train': 0.7059946954250336} 01/27/2022 09:23:48 - INFO - codeparrot_training - Step 14390: {'lr': 0.0004222072961602209, 'samples': 2763072, 'steps': 14390, 'loss/train': 1.028389424085617} 01/27/2022 09:23:51 - INFO - codeparrot_training - Step 14391: {'lr': 0.0004221954342552703, 'samples': 2763264, 'steps': 14391, 'loss/train': 0.895744800567627} 01/27/2022 09:23:54 - INFO - codeparrot_training - Step 14392: {'lr': 0.00042218357161268917, 'samples': 2763456, 'steps': 14392, 'loss/train': 0.9488701522350311} 01/27/2022 09:23:59 - INFO - codeparrot_training - Step 14393: {'lr': 0.0004221717082325283, 'samples': 2763648, 'steps': 14393, 'loss/train': 0.6655298620462418} 01/27/2022 09:24:02 - INFO - codeparrot_training - Step 14394: {'lr': 0.00042215984411483854, 'samples': 2763840, 'steps': 14394, 'loss/train': 0.7588776648044586} 01/27/2022 09:24:05 - INFO - codeparrot_training - Step 14395: {'lr': 0.00042214797925967064, 'samples': 2764032, 'steps': 14395, 'loss/train': 1.2940278053283691} 01/27/2022 09:24:08 - INFO - codeparrot_training - Step 14396: {'lr': 0.00042213611366707547, 'samples': 2764224, 'steps': 14396, 'loss/train': 0.8323528468608856} 01/27/2022 09:24:12 - INFO - codeparrot_training - Step 14397: {'lr': 0.0004221242473371038, 'samples': 2764416, 'steps': 14397, 'loss/train': 1.0071116387844086} 01/27/2022 09:24:15 - INFO - codeparrot_training - Step 14398: {'lr': 0.00042211238026980657, 'samples': 2764608, 'steps': 14398, 'loss/train': 0.9704278707504272} 01/27/2022 09:24:18 - INFO - codeparrot_training - Step 14399: {'lr': 0.0004221005124652345, 'samples': 2764800, 'steps': 14399, 'loss/train': 1.021740585565567} 01/27/2022 09:24:21 - INFO - codeparrot_training - Step 14400: {'lr': 0.0004220886439234385, 'samples': 2764992, 'steps': 14400, 'loss/train': 1.0850888192653656} 01/27/2022 09:24:24 - INFO - codeparrot_training - Step 14401: {'lr': 0.0004220767746444694, 'samples': 2765184, 'steps': 14401, 'loss/train': 0.7459539324045181} 01/27/2022 09:24:30 - INFO - codeparrot_training - Step 14402: {'lr': 0.0004220649046283781, 'samples': 2765376, 'steps': 14402, 'loss/train': 0.9197062253952026} 01/27/2022 09:24:33 - INFO - codeparrot_training - Step 14403: {'lr': 0.00042205303387521533, 'samples': 2765568, 'steps': 14403, 'loss/train': 1.0628364980220795} 01/27/2022 09:24:37 - INFO - codeparrot_training - Step 14404: {'lr': 0.00042204116238503197, 'samples': 2765760, 'steps': 14404, 'loss/train': 0.5122514516115189} 01/27/2022 09:24:40 - INFO - codeparrot_training - Step 14405: {'lr': 0.00042202929015787893, 'samples': 2765952, 'steps': 14405, 'loss/train': 0.04109922889620066} 01/27/2022 09:24:43 - INFO - codeparrot_training - Step 14406: {'lr': 0.000422017417193807, 'samples': 2766144, 'steps': 14406, 'loss/train': 0.4987386167049408} 01/27/2022 09:24:46 - INFO - codeparrot_training - Step 14407: {'lr': 0.0004220055434928671, 'samples': 2766336, 'steps': 14407, 'loss/train': 1.1031387448310852} 01/27/2022 09:24:49 - INFO - codeparrot_training - Step 14408: {'lr': 0.0004219936690551101, 'samples': 2766528, 'steps': 14408, 'loss/train': 1.2550359964370728} 01/27/2022 09:24:53 - INFO - codeparrot_training - Step 14409: 
{'lr': 0.0004219817938805869, 'samples': 2766720, 'steps': 14409, 'loss/train': 0.839638352394104} 01/27/2022 09:24:57 - INFO - codeparrot_training - Step 14410: {'lr': 0.0004219699179693481, 'samples': 2766912, 'steps': 14410, 'loss/train': 0.9101841151714325} 01/27/2022 09:25:00 - INFO - codeparrot_training - Step 14411: {'lr': 0.000421958041321445, 'samples': 2767104, 'steps': 14411, 'loss/train': 0.4013863205909729} 01/27/2022 09:25:03 - INFO - codeparrot_training - Step 14412: {'lr': 0.0004219461639369281, 'samples': 2767296, 'steps': 14412, 'loss/train': 0.5333157330751419} 01/27/2022 09:25:06 - INFO - codeparrot_training - Step 14413: {'lr': 0.0004219342858158485, 'samples': 2767488, 'steps': 14413, 'loss/train': 0.8734095096588135} 01/27/2022 09:25:10 - INFO - codeparrot_training - Step 14414: {'lr': 0.000421922406958257, 'samples': 2767680, 'steps': 14414, 'loss/train': 0.595910981297493} 01/27/2022 09:25:13 - INFO - codeparrot_training - Step 14415: {'lr': 0.00042191052736420445, 'samples': 2767872, 'steps': 14415, 'loss/train': 0.9594951868057251} 01/27/2022 09:25:16 - INFO - codeparrot_training - Step 14416: {'lr': 0.0004218986470337419, 'samples': 2768064, 'steps': 14416, 'loss/train': 1.3609235286712646} 01/27/2022 09:25:19 - INFO - codeparrot_training - Step 14417: {'lr': 0.00042188676596692, 'samples': 2768256, 'steps': 14417, 'loss/train': 1.1135479509830475} 01/27/2022 09:25:22 - INFO - codeparrot_training - Step 14418: {'lr': 0.0004218748841637899, 'samples': 2768448, 'steps': 14418, 'loss/train': 0.7475330829620361} 01/27/2022 09:25:27 - INFO - codeparrot_training - Step 14419: {'lr': 0.0004218630016244023, 'samples': 2768640, 'steps': 14419, 'loss/train': 1.1087610125541687} 01/27/2022 09:25:30 - INFO - codeparrot_training - Step 14420: {'lr': 0.0004218511183488082, 'samples': 2768832, 'steps': 14420, 'loss/train': 0.6970864981412888} 01/27/2022 09:25:33 - INFO - codeparrot_training - Step 14421: {'lr': 0.0004218392343370584, 'samples': 2769024, 'steps': 14421, 'loss/train': 1.0141584277153015} 01/27/2022 09:25:36 - INFO - codeparrot_training - Step 14422: {'lr': 0.000421827349589204, 'samples': 2769216, 'steps': 14422, 'loss/train': 0.8268519937992096} 01/27/2022 09:25:39 - INFO - codeparrot_training - Step 14423: {'lr': 0.0004218154641052957, 'samples': 2769408, 'steps': 14423, 'loss/train': 0.7507629096508026} 01/27/2022 09:25:42 - INFO - codeparrot_training - Step 14424: {'lr': 0.0004218035778853846, 'samples': 2769600, 'steps': 14424, 'loss/train': 0.5817403346300125} 01/27/2022 09:25:46 - INFO - codeparrot_training - Step 14425: {'lr': 0.0004217916909295215, 'samples': 2769792, 'steps': 14425, 'loss/train': 0.8972252011299133} 01/27/2022 09:25:49 - INFO - codeparrot_training - Step 14426: {'lr': 0.00042177980323775734, 'samples': 2769984, 'steps': 14426, 'loss/train': 0.8555029034614563} 01/27/2022 09:25:52 - INFO - codeparrot_training - Step 14427: {'lr': 0.00042176791481014303, 'samples': 2770176, 'steps': 14427, 'loss/train': 0.913575142621994} 01/27/2022 09:25:56 - INFO - codeparrot_training - Step 14428: {'lr': 0.0004217560256467295, 'samples': 2770368, 'steps': 14428, 'loss/train': 0.6797819137573242} 01/27/2022 09:25:59 - INFO - codeparrot_training - Step 14429: {'lr': 0.00042174413574756775, 'samples': 2770560, 'steps': 14429, 'loss/train': 1.1368997991085052} 01/27/2022 09:26:03 - INFO - codeparrot_training - Step 14430: {'lr': 0.0004217322451127086, 'samples': 2770752, 'steps': 14430, 'loss/train': 0.9026723206043243} 01/27/2022 09:26:06 - INFO - 
codeparrot_training - Step 14431: {'lr': 0.00042172035374220306, 'samples': 2770944, 'steps': 14431, 'loss/train': 1.1765846908092499} 01/27/2022 09:26:09 - INFO - codeparrot_training - Step 14432: {'lr': 0.0004217084616361021, 'samples': 2771136, 'steps': 14432, 'loss/train': 0.6705069243907928} 01/27/2022 09:26:12 - INFO - codeparrot_training - Step 14433: {'lr': 0.00042169656879445657, 'samples': 2771328, 'steps': 14433, 'loss/train': 0.6782628893852234} 01/27/2022 09:26:15 - INFO - codeparrot_training - Step 14434: {'lr': 0.00042168467521731747, 'samples': 2771520, 'steps': 14434, 'loss/train': 0.42917583882808685} 01/27/2022 09:26:18 - INFO - codeparrot_training - Step 14435: {'lr': 0.00042167278090473573, 'samples': 2771712, 'steps': 14435, 'loss/train': 0.47453588247299194} 01/27/2022 09:26:24 - INFO - codeparrot_training - Step 14436: {'lr': 0.0004216608858567623, 'samples': 2771904, 'steps': 14436, 'loss/train': 1.1609399020671844} 01/27/2022 09:26:28 - INFO - codeparrot_training - Step 14437: {'lr': 0.00042164899007344814, 'samples': 2772096, 'steps': 14437, 'loss/train': 4.625206232070923} 01/27/2022 09:26:31 - INFO - codeparrot_training - Step 14438: {'lr': 0.00042163709355484425, 'samples': 2772288, 'steps': 14438, 'loss/train': 0.03907719813287258} 01/27/2022 09:26:34 - INFO - codeparrot_training - Step 14439: {'lr': 0.0004216251963010015, 'samples': 2772480, 'steps': 14439, 'loss/train': 0.898187130689621} 01/27/2022 09:26:37 - INFO - codeparrot_training - Step 14440: {'lr': 0.0004216132983119709, 'samples': 2772672, 'steps': 14440, 'loss/train': 1.130095660686493} 01/27/2022 09:26:40 - INFO - codeparrot_training - Step 14441: {'lr': 0.00042160139958780346, 'samples': 2772864, 'steps': 14441, 'loss/train': 0.3290546089410782} 01/27/2022 09:26:43 - INFO - codeparrot_training - Step 14442: {'lr': 0.0004215895001285501, 'samples': 2773056, 'steps': 14442, 'loss/train': 0.5484882295131683} 01/27/2022 09:26:47 - INFO - codeparrot_training - Step 14443: {'lr': 0.0004215775999342618, 'samples': 2773248, 'steps': 14443, 'loss/train': 1.0173275470733643} 01/27/2022 09:26:50 - INFO - codeparrot_training - Step 14444: {'lr': 0.0004215656990049896, 'samples': 2773440, 'steps': 14444, 'loss/train': 0.93079474568367} 01/27/2022 09:26:54 - INFO - codeparrot_training - Step 14445: {'lr': 0.0004215537973407844, 'samples': 2773632, 'steps': 14445, 'loss/train': 0.810340404510498} 01/27/2022 09:26:57 - INFO - codeparrot_training - Step 14446: {'lr': 0.0004215418949416972, 'samples': 2773824, 'steps': 14446, 'loss/train': 0.8437301516532898} 01/27/2022 09:27:00 - INFO - codeparrot_training - Step 14447: {'lr': 0.00042152999180777894, 'samples': 2774016, 'steps': 14447, 'loss/train': 0.6909996271133423} 01/27/2022 09:27:04 - INFO - codeparrot_training - Step 14448: {'lr': 0.0004215180879390807, 'samples': 2774208, 'steps': 14448, 'loss/train': 0.7857309579849243} 01/27/2022 09:27:07 - INFO - codeparrot_training - Step 14449: {'lr': 0.0004215061833356535, 'samples': 2774400, 'steps': 14449, 'loss/train': 0.6656115800142288} 01/27/2022 09:27:10 - INFO - codeparrot_training - Step 14450: {'lr': 0.00042149427799754817, 'samples': 2774592, 'steps': 14450, 'loss/train': 0.7264866828918457} 01/27/2022 09:27:13 - INFO - codeparrot_training - Step 14451: {'lr': 0.00042148237192481586, 'samples': 2774784, 'steps': 14451, 'loss/train': 1.5580299496650696} 01/27/2022 09:27:16 - INFO - codeparrot_training - Step 14452: {'lr': 0.0004214704651175075, 'samples': 2774976, 'steps': 14452, 'loss/train': 
0.9654604196548462} 01/27/2022 09:27:19 - INFO - codeparrot_training - Step 14453: {'lr': 0.0004214585575756742, 'samples': 2775168, 'steps': 14453, 'loss/train': 2.9237101078033447} 01/27/2022 09:27:24 - INFO - codeparrot_training - Step 14454: {'lr': 0.0004214466492993668, 'samples': 2775360, 'steps': 14454, 'loss/train': 1.1588982939720154} 01/27/2022 09:27:27 - INFO - codeparrot_training - Step 14455: {'lr': 0.00042143474028863637, 'samples': 2775552, 'steps': 14455, 'loss/train': 0.8973933756351471} 01/27/2022 09:27:30 - INFO - codeparrot_training - Step 14456: {'lr': 0.000421422830543534, 'samples': 2775744, 'steps': 14456, 'loss/train': 1.8617772459983826} 01/27/2022 09:27:34 - INFO - codeparrot_training - Step 14457: {'lr': 0.0004214109200641106, 'samples': 2775936, 'steps': 14457, 'loss/train': 0.6924179792404175} 01/27/2022 09:27:37 - INFO - codeparrot_training - Step 14458: {'lr': 0.00042139900885041734, 'samples': 2776128, 'steps': 14458, 'loss/train': 1.3103719353675842} 01/27/2022 09:27:40 - INFO - codeparrot_training - Step 14459: {'lr': 0.00042138709690250507, 'samples': 2776320, 'steps': 14459, 'loss/train': 0.9275031387805939} 01/27/2022 09:27:43 - INFO - codeparrot_training - Step 14460: {'lr': 0.0004213751842204249, 'samples': 2776512, 'steps': 14460, 'loss/train': 1.241137146949768} 01/27/2022 09:27:46 - INFO - codeparrot_training - Step 14461: {'lr': 0.00042136327080422785, 'samples': 2776704, 'steps': 14461, 'loss/train': 0.41795937716960907} 01/27/2022 09:27:49 - INFO - codeparrot_training - Step 14462: {'lr': 0.0004213513566539651, 'samples': 2776896, 'steps': 14462, 'loss/train': 0.7933521866798401} 01/27/2022 09:27:55 - INFO - codeparrot_training - Step 14463: {'lr': 0.0004213394417696874, 'samples': 2777088, 'steps': 14463, 'loss/train': 0.9667510092258453} 01/27/2022 09:27:59 - INFO - codeparrot_training - Step 14464: {'lr': 0.00042132752615144597, 'samples': 2777280, 'steps': 14464, 'loss/train': 0.8516510725021362} 01/27/2022 09:28:02 - INFO - codeparrot_training - Step 14465: {'lr': 0.00042131560979929186, 'samples': 2777472, 'steps': 14465, 'loss/train': 1.408642441034317} 01/27/2022 09:28:05 - INFO - codeparrot_training - Step 14466: {'lr': 0.00042130369271327605, 'samples': 2777664, 'steps': 14466, 'loss/train': 0.9582488536834717} 01/27/2022 09:28:08 - INFO - codeparrot_training - Step 14467: {'lr': 0.0004212917748934496, 'samples': 2777856, 'steps': 14467, 'loss/train': 0.5621844381093979} 01/27/2022 09:28:11 - INFO - codeparrot_training - Step 14468: {'lr': 0.00042127985633986365, 'samples': 2778048, 'steps': 14468, 'loss/train': 0.8498403131961823} 01/27/2022 09:28:14 - INFO - codeparrot_training - Step 14469: {'lr': 0.00042126793705256913, 'samples': 2778240, 'steps': 14469, 'loss/train': 1.3029142320156097} 01/27/2022 09:28:17 - INFO - codeparrot_training - Step 14470: {'lr': 0.00042125601703161706, 'samples': 2778432, 'steps': 14470, 'loss/train': 0.8963984549045563} 01/27/2022 09:28:21 - INFO - codeparrot_training - Step 14471: {'lr': 0.00042124409627705873, 'samples': 2778624, 'steps': 14471, 'loss/train': 1.1731768548488617} 01/27/2022 09:28:25 - INFO - codeparrot_training - Step 14472: {'lr': 0.00042123217478894504, 'samples': 2778816, 'steps': 14472, 'loss/train': 0.9095066785812378} 01/27/2022 09:28:28 - INFO - codeparrot_training - Step 14473: {'lr': 0.0004212202525673271, 'samples': 2779008, 'steps': 14473, 'loss/train': 0.5193129926919937} 01/27/2022 09:28:31 - INFO - codeparrot_training - Step 14474: {'lr': 0.00042120832961225585, 
'samples': 2779200, 'steps': 14474, 'loss/train': 0.9703179001808167} 01/27/2022 09:28:35 - INFO - codeparrot_training - Step 14475: {'lr': 0.00042119640592378263, 'samples': 2779392, 'steps': 14475, 'loss/train': 0.0939881019294262} 01/27/2022 09:28:38 - INFO - codeparrot_training - Step 14476: {'lr': 0.00042118448150195827, 'samples': 2779584, 'steps': 14476, 'loss/train': 0.627482995390892} 01/27/2022 09:28:41 - INFO - codeparrot_training - Step 14477: {'lr': 0.000421172556346834, 'samples': 2779776, 'steps': 14477, 'loss/train': 0.5072409063577652} 01/27/2022 09:28:44 - INFO - codeparrot_training - Step 14478: {'lr': 0.00042116063045846073, 'samples': 2779968, 'steps': 14478, 'loss/train': 0.9221839606761932} 01/27/2022 09:28:47 - INFO - codeparrot_training - Step 14479: {'lr': 0.00042114870383688985, 'samples': 2780160, 'steps': 14479, 'loss/train': 0.7872251272201538} 01/27/2022 09:28:50 - INFO - codeparrot_training - Step 14480: {'lr': 0.0004211367764821722, 'samples': 2780352, 'steps': 14480, 'loss/train': 1.4597254693508148} 01/27/2022 09:28:55 - INFO - codeparrot_training - Step 14481: {'lr': 0.00042112484839435893, 'samples': 2780544, 'steps': 14481, 'loss/train': 0.7884701192378998} 01/27/2022 09:28:58 - INFO - codeparrot_training - Step 14482: {'lr': 0.00042111291957350113, 'samples': 2780736, 'steps': 14482, 'loss/train': 0.9117384552955627} 01/27/2022 09:29:01 - INFO - codeparrot_training - Step 14483: {'lr': 0.00042110099001964996, 'samples': 2780928, 'steps': 14483, 'loss/train': 0.7409770041704178} 01/27/2022 09:29:04 - INFO - codeparrot_training - Step 14484: {'lr': 0.0004210890597328564, 'samples': 2781120, 'steps': 14484, 'loss/train': 1.0366149544715881} 01/27/2022 09:29:07 - INFO - codeparrot_training - Step 14485: {'lr': 0.0004210771287131717, 'samples': 2781312, 'steps': 14485, 'loss/train': 1.0737177729606628} 01/27/2022 09:29:10 - INFO - codeparrot_training - Step 14486: {'lr': 0.00042106519696064694, 'samples': 2781504, 'steps': 14486, 'loss/train': 0.9778135120868683} 01/27/2022 09:29:14 - INFO - codeparrot_training - Step 14487: {'lr': 0.0004210532644753331, 'samples': 2781696, 'steps': 14487, 'loss/train': 0.5547505617141724} 01/27/2022 09:29:17 - INFO - codeparrot_training - Step 14488: {'lr': 0.00042104133125728146, 'samples': 2781888, 'steps': 14488, 'loss/train': 0.9681587219238281} 01/27/2022 09:29:23 - INFO - codeparrot_training - Step 14489: {'lr': 0.00042102939730654304, 'samples': 2782080, 'steps': 14489, 'loss/train': 0.978799045085907} 01/27/2022 09:29:26 - INFO - codeparrot_training - Step 14490: {'lr': 0.000421017462623169, 'samples': 2782272, 'steps': 14490, 'loss/train': 0.4087405353784561} 01/27/2022 09:29:29 - INFO - codeparrot_training - Step 14491: {'lr': 0.0004210055272072104, 'samples': 2782464, 'steps': 14491, 'loss/train': 0.8553954362869263} 01/27/2022 09:29:32 - INFO - codeparrot_training - Step 14492: {'lr': 0.00042099359105871856, 'samples': 2782656, 'steps': 14492, 'loss/train': 0.7905256748199463} 01/27/2022 09:29:36 - INFO - codeparrot_training - Step 14493: {'lr': 0.0004209816541777444, 'samples': 2782848, 'steps': 14493, 'loss/train': 1.2241169214248657} 01/27/2022 09:29:39 - INFO - codeparrot_training - Step 14494: {'lr': 0.0004209697165643391, 'samples': 2783040, 'steps': 14494, 'loss/train': 1.6836492419242859} 01/27/2022 09:29:42 - INFO - codeparrot_training - Step 14495: {'lr': 0.0004209577782185538, 'samples': 2783232, 'steps': 14495, 'loss/train': 1.0210855901241302} 01/27/2022 09:29:45 - INFO - codeparrot_training - 
Step 14496: {'lr': 0.0004209458391404397, 'samples': 2783424, 'steps': 14496, 'loss/train': 0.6866628974676132} 01/27/2022 09:29:48 - INFO - codeparrot_training - Step 14497: {'lr': 0.0004209338993300479, 'samples': 2783616, 'steps': 14497, 'loss/train': 0.98428013920784} 01/27/2022 09:29:51 - INFO - codeparrot_training - Step 14498: {'lr': 0.00042092195878742954, 'samples': 2783808, 'steps': 14498, 'loss/train': 0.6694585382938385} 01/27/2022 09:29:56 - INFO - codeparrot_training - Step 14499: {'lr': 0.0004209100175126358, 'samples': 2784000, 'steps': 14499, 'loss/train': 0.6646627485752106} 01/27/2022 09:29:59 - INFO - codeparrot_training - Step 14500: {'lr': 0.0004208980755057178, 'samples': 2784192, 'steps': 14500, 'loss/train': 0.7417833209037781} 01/27/2022 09:30:02 - INFO - codeparrot_training - Step 14501: {'lr': 0.0004208861327667268, 'samples': 2784384, 'steps': 14501, 'loss/train': 1.2180006802082062} 01/27/2022 09:30:05 - INFO - codeparrot_training - Step 14502: {'lr': 0.00042087418929571377, 'samples': 2784576, 'steps': 14502, 'loss/train': 1.1603479385375977} 01/27/2022 09:30:08 - INFO - codeparrot_training - Step 14503: {'lr': 0.00042086224509272995, 'samples': 2784768, 'steps': 14503, 'loss/train': 1.1282957196235657} 01/27/2022 09:30:11 - INFO - codeparrot_training - Step 14504: {'lr': 0.0004208503001578266, 'samples': 2784960, 'steps': 14504, 'loss/train': 0.7658489942550659} 01/27/2022 09:30:15 - INFO - codeparrot_training - Step 14505: {'lr': 0.00042083835449105477, 'samples': 2785152, 'steps': 14505, 'loss/train': 0.9246900379657745} 01/27/2022 09:30:18 - INFO - codeparrot_training - Step 14506: {'lr': 0.00042082640809246576, 'samples': 2785344, 'steps': 14506, 'loss/train': 1.0561244487762451} 01/27/2022 09:30:22 - INFO - codeparrot_training - Step 14507: {'lr': 0.0004208144609621106, 'samples': 2785536, 'steps': 14507, 'loss/train': 0.3948938548564911} 01/27/2022 09:30:25 - INFO - codeparrot_training - Step 14508: {'lr': 0.0004208025131000405, 'samples': 2785728, 'steps': 14508, 'loss/train': 0.9509541392326355} 01/27/2022 09:30:29 - INFO - codeparrot_training - Step 14509: {'lr': 0.0004207905645063067, 'samples': 2785920, 'steps': 14509, 'loss/train': 0.7590352892875671} 01/27/2022 09:30:32 - INFO - codeparrot_training - Step 14510: {'lr': 0.00042077861518096033, 'samples': 2786112, 'steps': 14510, 'loss/train': 0.2022121176123619} 01/27/2022 09:30:35 - INFO - codeparrot_training - Step 14511: {'lr': 0.0004207666651240526, 'samples': 2786304, 'steps': 14511, 'loss/train': 0.4118529260158539} 01/27/2022 09:30:38 - INFO - codeparrot_training - Step 14512: {'lr': 0.0004207547143356347, 'samples': 2786496, 'steps': 14512, 'loss/train': 0.6770116835832596} 01/27/2022 09:30:41 - INFO - codeparrot_training - Step 14513: {'lr': 0.00042074276281575787, 'samples': 2786688, 'steps': 14513, 'loss/train': 1.024745911359787} 01/27/2022 09:30:44 - INFO - codeparrot_training - Step 14514: {'lr': 0.00042073081056447325, 'samples': 2786880, 'steps': 14514, 'loss/train': 0.9472421407699585} 01/27/2022 09:30:48 - INFO - codeparrot_training - Step 14515: {'lr': 0.00042071885758183204, 'samples': 2787072, 'steps': 14515, 'loss/train': 0.4195275753736496} 01/27/2022 09:30:54 - INFO - codeparrot_training - Step 14516: {'lr': 0.00042070690386788545, 'samples': 2787264, 'steps': 14516, 'loss/train': 0.6508060991764069} 01/27/2022 09:30:57 - INFO - codeparrot_training - Step 14517: {'lr': 0.0004206949494226847, 'samples': 2787456, 'steps': 14517, 'loss/train': 0.6054255813360214} 01/27/2022 
09:31:00 - INFO - codeparrot_training - Step 14518: {'lr': 0.000420682994246281, 'samples': 2787648, 'steps': 14518, 'loss/train': 1.2019346058368683} 01/27/2022 09:31:04 - INFO - codeparrot_training - Step 14519: {'lr': 0.00042067103833872554, 'samples': 2787840, 'steps': 14519, 'loss/train': 0.7660499811172485} 01/27/2022 09:31:07 - INFO - codeparrot_training - Step 14520: {'lr': 0.0004206590817000695, 'samples': 2788032, 'steps': 14520, 'loss/train': 0.906202107667923} 01/27/2022 09:31:10 - INFO - codeparrot_training - Step 14521: {'lr': 0.0004206471243303642, 'samples': 2788224, 'steps': 14521, 'loss/train': 1.327378660440445} 01/27/2022 09:31:13 - INFO - codeparrot_training - Step 14522: {'lr': 0.0004206351662296608, 'samples': 2788416, 'steps': 14522, 'loss/train': 1.3098507821559906} 01/27/2022 09:31:16 - INFO - codeparrot_training - Step 14523: {'lr': 0.0004206232073980105, 'samples': 2788608, 'steps': 14523, 'loss/train': 0.9097177684307098} 01/27/2022 09:31:19 - INFO - codeparrot_training - Step 14524: {'lr': 0.00042061124783546454, 'samples': 2788800, 'steps': 14524, 'loss/train': 0.5270503163337708} 01/27/2022 09:31:23 - INFO - codeparrot_training - Step 14525: {'lr': 0.0004205992875420742, 'samples': 2788992, 'steps': 14525, 'loss/train': 1.0339031517505646} 01/27/2022 09:31:27 - INFO - codeparrot_training - Step 14526: {'lr': 0.0004205873265178907, 'samples': 2789184, 'steps': 14526, 'loss/train': 0.939132571220398} 01/27/2022 09:31:30 - INFO - codeparrot_training - Step 14527: {'lr': 0.0004205753647629653, 'samples': 2789376, 'steps': 14527, 'loss/train': 0.28479308634996414} 01/27/2022 09:31:33 - INFO - codeparrot_training - Step 14528: {'lr': 0.0004205634022773491, 'samples': 2789568, 'steps': 14528, 'loss/train': 0.9555148780345917} 01/27/2022 09:31:36 - INFO - codeparrot_training - Step 14529: {'lr': 0.0004205514390610935, 'samples': 2789760, 'steps': 14529, 'loss/train': 0.4125848561525345} 01/27/2022 09:31:40 - INFO - codeparrot_training - Step 14530: {'lr': 0.00042053947511424975, 'samples': 2789952, 'steps': 14530, 'loss/train': 0.9416978359222412} 01/27/2022 09:31:43 - INFO - codeparrot_training - Step 14531: {'lr': 0.00042052751043686895, 'samples': 2790144, 'steps': 14531, 'loss/train': 1.3614908158779144} 01/27/2022 09:31:46 - INFO - codeparrot_training - Step 14532: {'lr': 0.00042051554502900245, 'samples': 2790336, 'steps': 14532, 'loss/train': 1.0034666955471039} 01/27/2022 09:31:49 - INFO - codeparrot_training - Step 14533: {'lr': 0.0004205035788907015, 'samples': 2790528, 'steps': 14533, 'loss/train': 0.5220656991004944} 01/27/2022 09:31:52 - INFO - codeparrot_training - Step 14534: {'lr': 0.0004204916120220174, 'samples': 2790720, 'steps': 14534, 'loss/train': 0.19978545606136322} 01/27/2022 09:31:56 - INFO - codeparrot_training - Step 14535: {'lr': 0.00042047964442300137, 'samples': 2790912, 'steps': 14535, 'loss/train': 1.1870808899402618} 01/27/2022 09:32:00 - INFO - codeparrot_training - Step 14536: {'lr': 0.0004204676760937046, 'samples': 2791104, 'steps': 14536, 'loss/train': 0.5705276727676392} 01/27/2022 09:32:03 - INFO - codeparrot_training - Step 14537: {'lr': 0.00042045570703417857, 'samples': 2791296, 'steps': 14537, 'loss/train': 0.8454370200634003} 01/27/2022 09:32:06 - INFO - codeparrot_training - Step 14538: {'lr': 0.00042044373724447434, 'samples': 2791488, 'steps': 14538, 'loss/train': 1.0435466766357422} 01/27/2022 09:32:09 - INFO - codeparrot_training - Step 14539: {'lr': 0.0004204317667246432, 'samples': 2791680, 'steps': 14539, 
'loss/train': 0.621541902422905} 01/27/2022 09:32:12 - INFO - codeparrot_training - Step 14540: {'lr': 0.00042041979547473665, 'samples': 2791872, 'steps': 14540, 'loss/train': 0.9140615165233612} 01/27/2022 09:32:15 - INFO - codeparrot_training - Step 14541: {'lr': 0.0004204078234948057, 'samples': 2792064, 'steps': 14541, 'loss/train': 1.0425889492034912} 01/27/2022 09:32:18 - INFO - codeparrot_training - Step 14542: {'lr': 0.00042039585078490173, 'samples': 2792256, 'steps': 14542, 'loss/train': 0.8605214059352875} 01/27/2022 09:32:22 - INFO - codeparrot_training - Step 14543: {'lr': 0.000420383877345076, 'samples': 2792448, 'steps': 14543, 'loss/train': 0.9905081391334534} 01/27/2022 09:32:28 - INFO - codeparrot_training - Step 14544: {'lr': 0.00042037190317538, 'samples': 2792640, 'steps': 14544, 'loss/train': 0.6540424525737762} 01/27/2022 09:32:31 - INFO - codeparrot_training - Step 14545: {'lr': 0.00042035992827586474, 'samples': 2792832, 'steps': 14545, 'loss/train': 1.0332371592521667} 01/27/2022 09:32:34 - INFO - codeparrot_training - Step 14546: {'lr': 0.00042034795264658163, 'samples': 2793024, 'steps': 14546, 'loss/train': 1.0012752413749695} 01/27/2022 09:32:37 - INFO - codeparrot_training - Step 14547: {'lr': 0.00042033597628758206, 'samples': 2793216, 'steps': 14547, 'loss/train': 0.7677215337753296} 01/27/2022 09:32:40 - INFO - codeparrot_training - Step 14548: {'lr': 0.00042032399919891724, 'samples': 2793408, 'steps': 14548, 'loss/train': 0.5619672685861588} 01/27/2022 09:32:44 - INFO - codeparrot_training - Step 14549: {'lr': 0.0004203120213806385, 'samples': 2793600, 'steps': 14549, 'loss/train': 0.9811786115169525} 01/27/2022 09:32:47 - INFO - codeparrot_training - Step 14550: {'lr': 0.0004203000428327971, 'samples': 2793792, 'steps': 14550, 'loss/train': 0.8689026832580566} 01/27/2022 09:32:50 - INFO - codeparrot_training - Step 14551: {'lr': 0.00042028806355544443, 'samples': 2793984, 'steps': 14551, 'loss/train': 1.085404872894287} 01/27/2022 09:32:54 - INFO - codeparrot_training - Step 14552: {'lr': 0.0004202760835486317, 'samples': 2794176, 'steps': 14552, 'loss/train': 0.9343622624874115} 01/27/2022 09:32:58 - INFO - codeparrot_training - Step 14553: {'lr': 0.00042026410281241033, 'samples': 2794368, 'steps': 14553, 'loss/train': 0.2766171544790268} 01/27/2022 09:33:01 - INFO - codeparrot_training - Step 14554: {'lr': 0.00042025212134683165, 'samples': 2794560, 'steps': 14554, 'loss/train': 1.3742137849330902} 01/27/2022 09:33:04 - INFO - codeparrot_training - Step 14555: {'lr': 0.0004202401391519469, 'samples': 2794752, 'steps': 14555, 'loss/train': 1.0953112542629242} 01/27/2022 09:33:07 - INFO - codeparrot_training - Step 14556: {'lr': 0.0004202281562278075, 'samples': 2794944, 'steps': 14556, 'loss/train': 0.50103360414505} 01/27/2022 09:33:10 - INFO - codeparrot_training - Step 14557: {'lr': 0.0004202161725744647, 'samples': 2795136, 'steps': 14557, 'loss/train': 1.1619705855846405} 01/27/2022 09:33:13 - INFO - codeparrot_training - Step 14558: {'lr': 0.0004202041881919699, 'samples': 2795328, 'steps': 14558, 'loss/train': 0.49301183223724365} 01/27/2022 09:33:16 - INFO - codeparrot_training - Step 14559: {'lr': 0.0004201922030803743, 'samples': 2795520, 'steps': 14559, 'loss/train': 0.48405472934246063} 01/27/2022 09:33:20 - INFO - codeparrot_training - Step 14560: {'lr': 0.0004201802172397295, 'samples': 2795712, 'steps': 14560, 'loss/train': 0.6241238862276077} 01/27/2022 09:33:26 - INFO - codeparrot_training - Step 14561: {'lr': 0.0004201682306700866, 
'samples': 2795904, 'steps': 14561, 'loss/train': 0.9935800731182098} 01/27/2022 09:33:29 - INFO - codeparrot_training - Step 14562: {'lr': 0.00042015624337149703, 'samples': 2796096, 'steps': 14562, 'loss/train': 1.5448240041732788} 01/27/2022 09:33:32 - INFO - codeparrot_training - Step 14563: {'lr': 0.0004201442553440121, 'samples': 2796288, 'steps': 14563, 'loss/train': 0.949792742729187} 01/27/2022 09:33:35 - INFO - codeparrot_training - Step 14564: {'lr': 0.00042013226658768333, 'samples': 2796480, 'steps': 14564, 'loss/train': 0.8796691596508026} 01/27/2022 09:33:38 - INFO - codeparrot_training - Step 14565: {'lr': 0.0004201202771025618, 'samples': 2796672, 'steps': 14565, 'loss/train': 1.0586282908916473} 01/27/2022 09:33:41 - INFO - codeparrot_training - Step 14566: {'lr': 0.0004201082868886992, 'samples': 2796864, 'steps': 14566, 'loss/train': 0.703694298863411} 01/27/2022 09:33:44 - INFO - codeparrot_training - Step 14567: {'lr': 0.00042009629594614656, 'samples': 2797056, 'steps': 14567, 'loss/train': 0.49180278182029724} 01/27/2022 09:33:48 - INFO - codeparrot_training - Step 14568: {'lr': 0.0004200843042749555, 'samples': 2797248, 'steps': 14568, 'loss/train': 0.923474907875061} 01/27/2022 09:33:51 - INFO - codeparrot_training - Step 14569: {'lr': 0.0004200723118751772, 'samples': 2797440, 'steps': 14569, 'loss/train': 0.8950442969799042} 01/27/2022 09:33:55 - INFO - codeparrot_training - Step 14570: {'lr': 0.00042006031874686315, 'samples': 2797632, 'steps': 14570, 'loss/train': 0.7863157689571381} 01/27/2022 09:33:58 - INFO - codeparrot_training - Step 14571: {'lr': 0.00042004832489006474, 'samples': 2797824, 'steps': 14571, 'loss/train': 0.07356058806180954} 01/27/2022 09:34:01 - INFO - codeparrot_training - Step 14572: {'lr': 0.0004200363303048332, 'samples': 2798016, 'steps': 14572, 'loss/train': 0.9502094686031342} 01/27/2022 09:34:04 - INFO - codeparrot_training - Step 14573: {'lr': 0.00042002433499122016, 'samples': 2798208, 'steps': 14573, 'loss/train': 0.5144761651754379} 01/27/2022 09:34:08 - INFO - codeparrot_training - Step 14574: {'lr': 0.00042001233894927684, 'samples': 2798400, 'steps': 14574, 'loss/train': 0.557689368724823} 01/27/2022 09:34:11 - INFO - codeparrot_training - Step 14575: {'lr': 0.0004200003421790546, 'samples': 2798592, 'steps': 14575, 'loss/train': 0.7439205944538116} 01/27/2022 09:34:14 - INFO - codeparrot_training - Step 14576: {'lr': 0.0004199883446806048, 'samples': 2798784, 'steps': 14576, 'loss/train': 1.2817760109901428} 01/27/2022 09:34:17 - INFO - codeparrot_training - Step 14577: {'lr': 0.00041997634645397897, 'samples': 2798976, 'steps': 14577, 'loss/train': 1.0155308246612549} 01/27/2022 09:34:20 - INFO - codeparrot_training - Step 14578: {'lr': 0.0004199643474992285, 'samples': 2799168, 'steps': 14578, 'loss/train': 0.6829057037830353} 01/27/2022 09:34:25 - INFO - codeparrot_training - Step 14579: {'lr': 0.00041995234781640466, 'samples': 2799360, 'steps': 14579, 'loss/train': 0.533542737364769} 01/27/2022 09:34:28 - INFO - codeparrot_training - Step 14580: {'lr': 0.00041994034740555896, 'samples': 2799552, 'steps': 14580, 'loss/train': 0.8295055031776428} 01/27/2022 09:34:32 - INFO - codeparrot_training - Step 14581: {'lr': 0.00041992834626674273, 'samples': 2799744, 'steps': 14581, 'loss/train': 1.362406611442566} 01/27/2022 09:34:35 - INFO - codeparrot_training - Step 14582: {'lr': 0.0004199163444000075, 'samples': 2799936, 'steps': 14582, 'loss/train': 0.8247813284397125} 01/27/2022 09:34:38 - INFO - codeparrot_training - 
Step 14583: {'lr': 0.00041990434180540453, 'samples': 2800128, 'steps': 14583, 'loss/train': 1.0878243148326874} 01/27/2022 09:34:41 - INFO - codeparrot_training - Step 14584: {'lr': 0.00041989233848298534, 'samples': 2800320, 'steps': 14584, 'loss/train': 0.8054029047489166} 01/27/2022 09:34:44 - INFO - codeparrot_training - Step 14585: {'lr': 0.00041988033443280136, 'samples': 2800512, 'steps': 14585, 'loss/train': 0.7279120534658432} 01/27/2022 09:34:47 - INFO - codeparrot_training - Step 14586: {'lr': 0.00041986832965490396, 'samples': 2800704, 'steps': 14586, 'loss/train': 0.6369442641735077} 01/27/2022 09:34:53 - INFO - codeparrot_training - Step 14587: {'lr': 0.0004198563241493445, 'samples': 2800896, 'steps': 14587, 'loss/train': 0.9598861634731293} 01/27/2022 09:34:57 - INFO - codeparrot_training - Step 14588: {'lr': 0.00041984431791617456, 'samples': 2801088, 'steps': 14588, 'loss/train': 0.9072262644767761} 01/27/2022 09:35:00 - INFO - codeparrot_training - Step 14589: {'lr': 0.00041983231095544545, 'samples': 2801280, 'steps': 14589, 'loss/train': 0.8308858573436737} 01/27/2022 09:35:03 - INFO - codeparrot_training - Step 14590: {'lr': 0.00041982030326720866, 'samples': 2801472, 'steps': 14590, 'loss/train': 0.773174375295639} 01/27/2022 09:35:06 - INFO - codeparrot_training - Step 14591: {'lr': 0.00041980829485151563, 'samples': 2801664, 'steps': 14591, 'loss/train': 0.7474144399166107} 01/27/2022 09:35:09 - INFO - codeparrot_training - Step 14592: {'lr': 0.00041979628570841776, 'samples': 2801856, 'steps': 14592, 'loss/train': 1.0304082334041595} 01/27/2022 09:35:12 - INFO - codeparrot_training - Step 14593: {'lr': 0.00041978427583796654, 'samples': 2802048, 'steps': 14593, 'loss/train': 0.8465198278427124} 01/27/2022 09:35:15 - INFO - codeparrot_training - Step 14594: {'lr': 0.00041977226524021337, 'samples': 2802240, 'steps': 14594, 'loss/train': 0.9279935359954834} 01/27/2022 09:35:19 - INFO - codeparrot_training - Step 14595: {'lr': 0.0004197602539152098, 'samples': 2802432, 'steps': 14595, 'loss/train': 1.1681474447250366} 01/27/2022 09:35:23 - INFO - codeparrot_training - Step 14596: {'lr': 0.00041974824186300706, 'samples': 2802624, 'steps': 14596, 'loss/train': 0.606344148516655} 01/27/2022 09:35:26 - INFO - codeparrot_training - Step 14597: {'lr': 0.0004197362290836569, 'samples': 2802816, 'steps': 14597, 'loss/train': 0.962935209274292} 01/27/2022 09:35:29 - INFO - codeparrot_training - Step 14598: {'lr': 0.00041972421557721055, 'samples': 2803008, 'steps': 14598, 'loss/train': 0.7736509144306183} 01/27/2022 09:35:32 - INFO - codeparrot_training - Step 14599: {'lr': 0.00041971220134371957, 'samples': 2803200, 'steps': 14599, 'loss/train': 0.9043280482292175} 01/27/2022 09:35:36 - INFO - codeparrot_training - Step 14600: {'lr': 0.00041970018638323546, 'samples': 2803392, 'steps': 14600, 'loss/train': 0.7137068063020706} 01/27/2022 09:35:39 - INFO - codeparrot_training - Step 14601: {'lr': 0.0004196881706958096, 'samples': 2803584, 'steps': 14601, 'loss/train': 0.5279713422060013} 01/27/2022 09:35:42 - INFO - codeparrot_training - Step 14602: {'lr': 0.00041967615428149346, 'samples': 2803776, 'steps': 14602, 'loss/train': 0.9779272377490997} 01/27/2022 09:35:45 - INFO - codeparrot_training - Step 14603: {'lr': 0.0004196641371403386, 'samples': 2803968, 'steps': 14603, 'loss/train': 0.9970919787883759} 01/27/2022 09:35:48 - INFO - codeparrot_training - Step 14604: {'lr': 0.00041965211927239644, 'samples': 2804160, 'steps': 14604, 'loss/train': 0.6000882536172867} 
01/27/2022 09:35:53 - INFO - codeparrot_training - Step 14605: {'lr': 0.0004196401006777185, 'samples': 2804352, 'steps': 14605, 'loss/train': 0.9335696697235107} 01/27/2022 09:35:56 - INFO - codeparrot_training - Step 14606: {'lr': 0.00041962808135635624, 'samples': 2804544, 'steps': 14606, 'loss/train': 0.36433880031108856} 01/27/2022 09:35:59 - INFO - codeparrot_training - Step 14607: {'lr': 0.00041961606130836105, 'samples': 2804736, 'steps': 14607, 'loss/train': 0.8199672996997833} 01/27/2022 09:36:02 - INFO - codeparrot_training - Step 14608: {'lr': 0.0004196040405337845, 'samples': 2804928, 'steps': 14608, 'loss/train': 0.5985968112945557} 01/27/2022 09:36:05 - INFO - codeparrot_training - Step 14609: {'lr': 0.0004195920190326782, 'samples': 2805120, 'steps': 14609, 'loss/train': 0.2482696697115898} 01/27/2022 09:36:08 - INFO - codeparrot_training - Step 14610: {'lr': 0.0004195799968050935, 'samples': 2805312, 'steps': 14610, 'loss/train': 0.7018756717443466} 01/27/2022 09:36:12 - INFO - codeparrot_training - Step 14611: {'lr': 0.000419567973851082, 'samples': 2805504, 'steps': 14611, 'loss/train': 1.4747478067874908} 01/27/2022 09:36:15 - INFO - codeparrot_training - Step 14612: {'lr': 0.0004195559501706951, 'samples': 2805696, 'steps': 14612, 'loss/train': 0.706634983420372} 01/27/2022 09:36:18 - INFO - codeparrot_training - Step 14613: {'lr': 0.00041954392576398433, 'samples': 2805888, 'steps': 14613, 'loss/train': 0.9508496224880219} 01/27/2022 09:36:22 - INFO - codeparrot_training - Step 14614: {'lr': 0.0004195319006310012, 'samples': 2806080, 'steps': 14614, 'loss/train': 1.0042341649532318} 01/27/2022 09:36:26 - INFO - codeparrot_training - Step 14615: {'lr': 0.0004195198747717973, 'samples': 2806272, 'steps': 14615, 'loss/train': 0.6521348208189011} 01/27/2022 09:36:29 - INFO - codeparrot_training - Step 14616: {'lr': 0.00041950784818642404, 'samples': 2806464, 'steps': 14616, 'loss/train': 0.5851780772209167} 01/27/2022 09:36:32 - INFO - codeparrot_training - Step 14617: {'lr': 0.000419495820874933, 'samples': 2806656, 'steps': 14617, 'loss/train': 0.26271526515483856} 01/27/2022 09:36:35 - INFO - codeparrot_training - Step 14618: {'lr': 0.0004194837928373757, 'samples': 2806848, 'steps': 14618, 'loss/train': 0.5679017454385757} 01/27/2022 09:36:38 - INFO - codeparrot_training - Step 14619: {'lr': 0.0004194717640738036, 'samples': 2807040, 'steps': 14619, 'loss/train': 1.4071779549121857} 01/27/2022 09:36:41 - INFO - codeparrot_training - Step 14620: {'lr': 0.0004194597345842683, 'samples': 2807232, 'steps': 14620, 'loss/train': 1.0447184443473816} 01/27/2022 09:36:44 - INFO - codeparrot_training - Step 14621: {'lr': 0.00041944770436882134, 'samples': 2807424, 'steps': 14621, 'loss/train': 0.8549111187458038} 01/27/2022 09:36:51 - INFO - codeparrot_training - Step 14622: {'lr': 0.00041943567342751423, 'samples': 2807616, 'steps': 14622, 'loss/train': 0.8387998044490814} 01/27/2022 09:36:54 - INFO - codeparrot_training - Step 14623: {'lr': 0.0004194236417603985, 'samples': 2807808, 'steps': 14623, 'loss/train': 0.675030380487442} 01/27/2022 09:36:57 - INFO - codeparrot_training - Step 14624: {'lr': 0.0004194116093675256, 'samples': 2808000, 'steps': 14624, 'loss/train': 0.9604562222957611} 01/27/2022 09:37:00 - INFO - codeparrot_training - Step 14625: {'lr': 0.0004193995762489472, 'samples': 2808192, 'steps': 14625, 'loss/train': 1.0138428211212158} 01/27/2022 09:37:03 - INFO - codeparrot_training - Step 14626: {'lr': 0.0004193875424047148, 'samples': 2808384, 'steps': 
14626, 'loss/train': 0.4301336109638214} 01/27/2022 09:37:06 - INFO - codeparrot_training - Step 14627: {'lr': 0.00041937550783488, 'samples': 2808576, 'steps': 14627, 'loss/train': 0.9238438010215759} 01/27/2022 09:37:10 - INFO - codeparrot_training - Step 14628: {'lr': 0.00041936347253949426, 'samples': 2808768, 'steps': 14628, 'loss/train': 0.47743476927280426} 01/27/2022 09:37:13 - INFO - codeparrot_training - Step 14629: {'lr': 0.00041935143651860917, 'samples': 2808960, 'steps': 14629, 'loss/train': 1.0805966556072235} 01/27/2022 09:37:16 - INFO - codeparrot_training - Step 14630: {'lr': 0.0004193393997722764, 'samples': 2809152, 'steps': 14630, 'loss/train': 0.28373701125383377} 01/27/2022 09:37:20 - INFO - codeparrot_training - Step 14631: {'lr': 0.00041932736230054725, 'samples': 2809344, 'steps': 14631, 'loss/train': 0.9410839676856995} 01/27/2022 09:37:24 - INFO - codeparrot_training - Step 14632: {'lr': 0.0004193153241034736, 'samples': 2809536, 'steps': 14632, 'loss/train': 1.0807791352272034} 01/27/2022 09:37:27 - INFO - codeparrot_training - Step 14633: {'lr': 0.00041930328518110675, 'samples': 2809728, 'steps': 14633, 'loss/train': 0.9147649705410004} 01/27/2022 09:37:30 - INFO - codeparrot_training - Step 14634: {'lr': 0.0004192912455334985, 'samples': 2809920, 'steps': 14634, 'loss/train': 1.1978933215141296} 01/27/2022 09:37:33 - INFO - codeparrot_training - Step 14635: {'lr': 0.0004192792051607002, 'samples': 2810112, 'steps': 14635, 'loss/train': 1.463774174451828} 01/27/2022 09:37:36 - INFO - codeparrot_training - Step 14636: {'lr': 0.00041926716406276367, 'samples': 2810304, 'steps': 14636, 'loss/train': 0.7552303969860077} 01/27/2022 09:37:39 - INFO - codeparrot_training - Step 14637: {'lr': 0.0004192551222397402, 'samples': 2810496, 'steps': 14637, 'loss/train': 0.7853117287158966} 01/27/2022 09:37:42 - INFO - codeparrot_training - Step 14638: {'lr': 0.0004192430796916816, 'samples': 2810688, 'steps': 14638, 'loss/train': 0.8517679274082184} 01/27/2022 09:37:46 - INFO - codeparrot_training - Step 14639: {'lr': 0.0004192310364186394, 'samples': 2810880, 'steps': 14639, 'loss/train': 1.0425962805747986} 01/27/2022 09:37:50 - INFO - codeparrot_training - Step 14640: {'lr': 0.0004192189924206652, 'samples': 2811072, 'steps': 14640, 'loss/train': 1.449643850326538} 01/27/2022 09:37:53 - INFO - codeparrot_training - Step 14641: {'lr': 0.0004192069476978105, 'samples': 2811264, 'steps': 14641, 'loss/train': 0.6241938918828964} 01/27/2022 09:37:57 - INFO - codeparrot_training - Step 14642: {'lr': 0.000419194902250127, 'samples': 2811456, 'steps': 14642, 'loss/train': 1.6495562195777893} 01/27/2022 09:38:00 - INFO - codeparrot_training - Step 14643: {'lr': 0.0004191828560776663, 'samples': 2811648, 'steps': 14643, 'loss/train': 0.6785459071397781} 01/27/2022 09:38:03 - INFO - codeparrot_training - Step 14644: {'lr': 0.00041917080918047996, 'samples': 2811840, 'steps': 14644, 'loss/train': 0.6782529205083847} 01/27/2022 09:38:06 - INFO - codeparrot_training - Step 14645: {'lr': 0.00041915876155861954, 'samples': 2812032, 'steps': 14645, 'loss/train': 0.8069485723972321} 01/27/2022 09:38:09 - INFO - codeparrot_training - Step 14646: {'lr': 0.0004191467132121367, 'samples': 2812224, 'steps': 14646, 'loss/train': 0.10885109379887581} 01/27/2022 09:38:12 - INFO - codeparrot_training - Step 14647: {'lr': 0.00041913466414108315, 'samples': 2812416, 'steps': 14647, 'loss/train': 0.8022248446941376} 01/27/2022 09:38:15 - INFO - codeparrot_training - Step 14648: {'lr': 
0.0004191226143455103, 'samples': 2812608, 'steps': 14648, 'loss/train': 1.6213234663009644} 01/27/2022 09:38:22 - INFO - codeparrot_training - Step 14649: {'lr': 0.00041911056382546997, 'samples': 2812800, 'steps': 14649, 'loss/train': 0.9419182240962982} 01/27/2022 09:38:25 - INFO - codeparrot_training - Step 14650: {'lr': 0.00041909851258101357, 'samples': 2812992, 'steps': 14650, 'loss/train': 1.0607759356498718} 01/27/2022 09:38:28 - INFO - codeparrot_training - Step 14651: {'lr': 0.0004190864606121929, 'samples': 2813184, 'steps': 14651, 'loss/train': 0.5615985989570618} 01/27/2022 09:38:31 - INFO - codeparrot_training - Step 14652: {'lr': 0.0004190744079190595, 'samples': 2813376, 'steps': 14652, 'loss/train': 0.9481981694698334} 01/27/2022 09:38:34 - INFO - codeparrot_training - Step 14653: {'lr': 0.0004190623545016651, 'samples': 2813568, 'steps': 14653, 'loss/train': 1.1620320081710815} 01/27/2022 09:38:37 - INFO - codeparrot_training - Step 14654: {'lr': 0.00041905030036006106, 'samples': 2813760, 'steps': 14654, 'loss/train': 0.40437206625938416} 01/27/2022 09:38:40 - INFO - codeparrot_training - Step 14655: {'lr': 0.00041903824549429936, 'samples': 2813952, 'steps': 14655, 'loss/train': 0.7149298042058945} 01/27/2022 09:38:44 - INFO - codeparrot_training - Step 14656: {'lr': 0.00041902618990443156, 'samples': 2814144, 'steps': 14656, 'loss/train': 0.4370279610157013} 01/27/2022 09:38:47 - INFO - codeparrot_training - Step 14657: {'lr': 0.0004190141335905091, 'samples': 2814336, 'steps': 14657, 'loss/train': 0.6924601346254349} 01/27/2022 09:38:51 - INFO - codeparrot_training - Step 14658: {'lr': 0.0004190020765525838, 'samples': 2814528, 'steps': 14658, 'loss/train': 1.0857647359371185} 01/27/2022 09:38:55 - INFO - codeparrot_training - Step 14659: {'lr': 0.0004189900187907073, 'samples': 2814720, 'steps': 14659, 'loss/train': 0.9051251113414764} 01/27/2022 09:38:58 - INFO - codeparrot_training - Step 14660: {'lr': 0.0004189779603049312, 'samples': 2814912, 'steps': 14660, 'loss/train': 1.084419697523117} 01/27/2022 09:39:01 - INFO - codeparrot_training - Step 14661: {'lr': 0.00041896590109530713, 'samples': 2815104, 'steps': 14661, 'loss/train': 0.546230748295784} 01/27/2022 09:39:04 - INFO - codeparrot_training - Step 14662: {'lr': 0.00041895384116188685, 'samples': 2815296, 'steps': 14662, 'loss/train': 0.5894006788730621} 01/27/2022 09:39:07 - INFO - codeparrot_training - Step 14663: {'lr': 0.000418941780504722, 'samples': 2815488, 'steps': 14663, 'loss/train': 0.6621887236833572} 01/27/2022 09:39:10 - INFO - codeparrot_training - Step 14664: {'lr': 0.00041892971912386415, 'samples': 2815680, 'steps': 14664, 'loss/train': 0.6637994349002838} 01/27/2022 09:39:13 - INFO - codeparrot_training - Step 14665: {'lr': 0.000418917657019365, 'samples': 2815872, 'steps': 14665, 'loss/train': 0.872938334941864} 01/27/2022 09:39:20 - INFO - codeparrot_training - Step 14666: {'lr': 0.0004189055941912763, 'samples': 2816064, 'steps': 14666, 'loss/train': 0.8275067210197449} 01/27/2022 09:39:23 - INFO - codeparrot_training - Step 14667: {'lr': 0.0004188935306396496, 'samples': 2816256, 'steps': 14667, 'loss/train': 0.8122271597385406} 01/27/2022 09:39:26 - INFO - codeparrot_training - Step 14668: {'lr': 0.00041888146636453674, 'samples': 2816448, 'steps': 14668, 'loss/train': 0.7215463221073151} 01/27/2022 09:39:29 - INFO - codeparrot_training - Step 14669: {'lr': 0.0004188694013659892, 'samples': 2816640, 'steps': 14669, 'loss/train': 1.0571164190769196} 01/27/2022 09:39:32 - INFO - 
codeparrot_training - Step 14670: {'lr': 0.0004188573356440588, 'samples': 2816832, 'steps': 14670, 'loss/train': 0.8774251341819763} 01/27/2022 09:39:36 - INFO - codeparrot_training - Step 14671: {'lr': 0.0004188452691987973, 'samples': 2817024, 'steps': 14671, 'loss/train': 0.6868015676736832} 01/27/2022 09:39:39 - INFO - codeparrot_training - Step 14672: {'lr': 0.0004188332020302561, 'samples': 2817216, 'steps': 14672, 'loss/train': 0.8803143203258514} 01/27/2022 09:39:42 - INFO - codeparrot_training - Step 14673: {'lr': 0.0004188211341384872, 'samples': 2817408, 'steps': 14673, 'loss/train': 0.8729958236217499} 01/27/2022 09:39:45 - INFO - codeparrot_training - Step 14674: {'lr': 0.0004188090655235421, 'samples': 2817600, 'steps': 14674, 'loss/train': 0.9524068236351013} 01/27/2022 09:39:50 - INFO - codeparrot_training - Step 14675: {'lr': 0.00041879699618547263, 'samples': 2817792, 'steps': 14675, 'loss/train': 0.8621561229228973} 01/27/2022 09:39:53 - INFO - codeparrot_training - Step 14676: {'lr': 0.0004187849261243304, 'samples': 2817984, 'steps': 14676, 'loss/train': 0.5677065700292587} 01/27/2022 09:39:56 - INFO - codeparrot_training - Step 14677: {'lr': 0.0004187728553401671, 'samples': 2818176, 'steps': 14677, 'loss/train': 1.2535160779953003} 01/27/2022 09:39:59 - INFO - codeparrot_training - Step 14678: {'lr': 0.0004187607838330345, 'samples': 2818368, 'steps': 14678, 'loss/train': 0.8035756945610046} 01/27/2022 09:40:02 - INFO - codeparrot_training - Step 14679: {'lr': 0.0004187487116029843, 'samples': 2818560, 'steps': 14679, 'loss/train': 0.6796009540557861} 01/27/2022 09:40:05 - INFO - codeparrot_training - Step 14680: {'lr': 0.0004187366386500683, 'samples': 2818752, 'steps': 14680, 'loss/train': 0.6616907268762589} 01/27/2022 09:40:08 - INFO - codeparrot_training - Step 14681: {'lr': 0.00041872456497433797, 'samples': 2818944, 'steps': 14681, 'loss/train': 0.29991238564252853} 01/27/2022 09:40:11 - INFO - codeparrot_training - Step 14682: {'lr': 0.00041871249057584526, 'samples': 2819136, 'steps': 14682, 'loss/train': 1.0577578246593475} 01/27/2022 09:40:15 - INFO - codeparrot_training - Step 14683: {'lr': 0.00041870041545464176, 'samples': 2819328, 'steps': 14683, 'loss/train': 0.7127761691808701} 01/27/2022 09:40:19 - INFO - codeparrot_training - Step 14684: {'lr': 0.00041868833961077935, 'samples': 2819520, 'steps': 14684, 'loss/train': 0.6910829544067383} 01/27/2022 09:40:22 - INFO - codeparrot_training - Step 14685: {'lr': 0.0004186762630443096, 'samples': 2819712, 'steps': 14685, 'loss/train': 1.0491018891334534} 01/27/2022 09:40:25 - INFO - codeparrot_training - Step 14686: {'lr': 0.0004186641857552842, 'samples': 2819904, 'steps': 14686, 'loss/train': 0.99989452958107} 01/27/2022 09:40:28 - INFO - codeparrot_training - Step 14687: {'lr': 0.0004186521077437551, 'samples': 2820096, 'steps': 14687, 'loss/train': 0.678710550069809} 01/27/2022 09:40:32 - INFO - codeparrot_training - Step 14688: {'lr': 0.00041864002900977393, 'samples': 2820288, 'steps': 14688, 'loss/train': 1.188464641571045} 01/27/2022 09:40:35 - INFO - codeparrot_training - Step 14689: {'lr': 0.0004186279495533923, 'samples': 2820480, 'steps': 14689, 'loss/train': 0.7125510424375534} 01/27/2022 09:40:38 - INFO - codeparrot_training - Step 14690: {'lr': 0.0004186158693746622, 'samples': 2820672, 'steps': 14690, 'loss/train': 1.3235315680503845} 01/27/2022 09:40:41 - INFO - codeparrot_training - Step 14691: {'lr': 0.0004186037884736352, 'samples': 2820864, 'steps': 14691, 'loss/train': 
1.0423507690429688} 01/27/2022 09:40:44 - INFO - codeparrot_training - Step 14692: {'lr': 0.0004185917068503632, 'samples': 2821056, 'steps': 14692, 'loss/train': 0.8035167753696442} 01/27/2022 09:40:50 - INFO - codeparrot_training - Step 14693: {'lr': 0.00041857962450489786, 'samples': 2821248, 'steps': 14693, 'loss/train': 0.8102477788925171} 01/27/2022 09:40:54 - INFO - codeparrot_training - Step 14694: {'lr': 0.0004185675414372908, 'samples': 2821440, 'steps': 14694, 'loss/train': 1.450908601284027} 01/27/2022 09:40:57 - INFO - codeparrot_training - Step 14695: {'lr': 0.000418555457647594, 'samples': 2821632, 'steps': 14695, 'loss/train': 0.8337876498699188} 01/27/2022 09:41:00 - INFO - codeparrot_training - Step 14696: {'lr': 0.00041854337313585913, 'samples': 2821824, 'steps': 14696, 'loss/train': 0.9468078017234802} 01/27/2022 09:41:03 - INFO - codeparrot_training - Step 14697: {'lr': 0.00041853128790213804, 'samples': 2822016, 'steps': 14697, 'loss/train': 0.07157945819199085} 01/27/2022 09:41:06 - INFO - codeparrot_training - Step 14698: {'lr': 0.0004185192019464823, 'samples': 2822208, 'steps': 14698, 'loss/train': 0.5361178815364838} 01/27/2022 09:41:09 - INFO - codeparrot_training - Step 14699: {'lr': 0.0004185071152689439, 'samples': 2822400, 'steps': 14699, 'loss/train': 1.1995435953140259} 01/27/2022 09:41:12 - INFO - codeparrot_training - Step 14700: {'lr': 0.0004184950278695745, 'samples': 2822592, 'steps': 14700, 'loss/train': 0.9708455801010132} 01/27/2022 09:41:16 - INFO - codeparrot_training - Step 14701: {'lr': 0.0004184829397484259, 'samples': 2822784, 'steps': 14701, 'loss/train': 1.0767499208450317} 01/27/2022 09:41:20 - INFO - codeparrot_training - Step 14702: {'lr': 0.00041847085090554985, 'samples': 2822976, 'steps': 14702, 'loss/train': 0.7786416113376617} 01/27/2022 09:41:23 - INFO - codeparrot_training - Step 14703: {'lr': 0.00041845876134099825, 'samples': 2823168, 'steps': 14703, 'loss/train': 0.6775613576173782} 01/27/2022 09:41:26 - INFO - codeparrot_training - Step 14704: {'lr': 0.0004184466710548227, 'samples': 2823360, 'steps': 14704, 'loss/train': 1.198413759469986} 01/27/2022 09:41:29 - INFO - codeparrot_training - Step 14705: {'lr': 0.0004184345800470752, 'samples': 2823552, 'steps': 14705, 'loss/train': 0.8965454399585724} 01/27/2022 09:41:33 - INFO - codeparrot_training - Step 14706: {'lr': 0.00041842248831780736, 'samples': 2823744, 'steps': 14706, 'loss/train': 0.8890028893947601} 01/27/2022 09:41:36 - INFO - codeparrot_training - Step 14707: {'lr': 0.0004184103958670712, 'samples': 2823936, 'steps': 14707, 'loss/train': 0.8735033869743347} 01/27/2022 09:41:39 - INFO - codeparrot_training - Step 14708: {'lr': 0.00041839830269491823, 'samples': 2824128, 'steps': 14708, 'loss/train': 0.6840658932924271} 01/27/2022 09:41:42 - INFO - codeparrot_training - Step 14709: {'lr': 0.00041838620880140046, 'samples': 2824320, 'steps': 14709, 'loss/train': 0.36487387865781784} 01/27/2022 09:41:45 - INFO - codeparrot_training - Step 14710: {'lr': 0.00041837411418656965, 'samples': 2824512, 'steps': 14710, 'loss/train': 0.9477590024471283} 01/27/2022 09:41:50 - INFO - codeparrot_training - Step 14711: {'lr': 0.0004183620188504776, 'samples': 2824704, 'steps': 14711, 'loss/train': 0.8802561163902283} 01/27/2022 09:41:53 - INFO - codeparrot_training - Step 14712: {'lr': 0.0004183499227931761, 'samples': 2824896, 'steps': 14712, 'loss/train': 1.2066048681735992} 01/27/2022 09:41:56 - INFO - codeparrot_training - Step 14713: {'lr': 0.00041833782601471704, 
'samples': 2825088, 'steps': 14713, 'loss/train': 0.5958120971918106} 01/27/2022 09:41:59 - INFO - codeparrot_training - Step 14714: {'lr': 0.0004183257285151521, 'samples': 2825280, 'steps': 14714, 'loss/train': 0.8856074810028076} 01/27/2022 09:42:02 - INFO - codeparrot_training - Step 14715: {'lr': 0.00041831363029453327, 'samples': 2825472, 'steps': 14715, 'loss/train': 0.7699288129806519} 01/27/2022 09:42:06 - INFO - codeparrot_training - Step 14716: {'lr': 0.0004183015313529123, 'samples': 2825664, 'steps': 14716, 'loss/train': 0.6050240099430084} 01/27/2022 09:42:09 - INFO - codeparrot_training - Step 14717: {'lr': 0.00041828943169034094, 'samples': 2825856, 'steps': 14717, 'loss/train': 1.1114689707756042} 01/27/2022 09:42:12 - INFO - codeparrot_training - Step 14718: {'lr': 0.0004182773313068711, 'samples': 2826048, 'steps': 14718, 'loss/train': 0.6214543282985687} 01/27/2022 09:42:16 - INFO - codeparrot_training - Step 14719: {'lr': 0.00041826523020255463, 'samples': 2826240, 'steps': 14719, 'loss/train': 0.7709759473800659} 01/27/2022 09:42:19 - INFO - codeparrot_training - Step 14720: {'lr': 0.00041825312837744333, 'samples': 2826432, 'steps': 14720, 'loss/train': 0.4294821470975876} 01/27/2022 09:42:22 - INFO - codeparrot_training - Step 14721: {'lr': 0.00041824102583158906, 'samples': 2826624, 'steps': 14721, 'loss/train': 0.45274047553539276} 01/27/2022 09:42:26 - INFO - codeparrot_training - Step 14722: {'lr': 0.0004182289225650437, 'samples': 2826816, 'steps': 14722, 'loss/train': 0.6684993803501129} 01/27/2022 09:42:29 - INFO - codeparrot_training - Step 14723: {'lr': 0.00041821681857785904, 'samples': 2827008, 'steps': 14723, 'loss/train': 0.5814145803451538} 01/27/2022 09:42:32 - INFO - codeparrot_training - Step 14724: {'lr': 0.0004182047138700869, 'samples': 2827200, 'steps': 14724, 'loss/train': 1.171672135591507} 01/27/2022 09:42:35 - INFO - codeparrot_training - Step 14725: {'lr': 0.0004181926084417792, 'samples': 2827392, 'steps': 14725, 'loss/train': 1.204192578792572} 01/27/2022 09:42:38 - INFO - codeparrot_training - Step 14726: {'lr': 0.0004181805022929878, 'samples': 2827584, 'steps': 14726, 'loss/train': 0.13331691920757294} 01/27/2022 09:42:41 - INFO - codeparrot_training - Step 14727: {'lr': 0.0004181683954237645, 'samples': 2827776, 'steps': 14727, 'loss/train': 0.9631253778934479} 01/27/2022 09:42:47 - INFO - codeparrot_training - Step 14728: {'lr': 0.00041815628783416117, 'samples': 2827968, 'steps': 14728, 'loss/train': 0.7458924651145935} 01/27/2022 09:42:51 - INFO - codeparrot_training - Step 14729: {'lr': 0.00041814417952422975, 'samples': 2828160, 'steps': 14729, 'loss/train': 1.2032437026500702} 01/27/2022 09:42:54 - INFO - codeparrot_training - Step 14730: {'lr': 0.000418132070494022, 'samples': 2828352, 'steps': 14730, 'loss/train': 0.7057810574769974} 01/27/2022 09:42:57 - INFO - codeparrot_training - Step 14731: {'lr': 0.00041811996074358993, 'samples': 2828544, 'steps': 14731, 'loss/train': 0.8446931540966034} 01/27/2022 09:43:00 - INFO - codeparrot_training - Step 14732: {'lr': 0.00041810785027298524, 'samples': 2828736, 'steps': 14732, 'loss/train': 1.0381837785243988} 01/27/2022 09:43:03 - INFO - codeparrot_training - Step 14733: {'lr': 0.00041809573908225997, 'samples': 2828928, 'steps': 14733, 'loss/train': 0.8627919852733612} 01/27/2022 09:43:06 - INFO - codeparrot_training - Step 14734: {'lr': 0.00041808362717146594, 'samples': 2829120, 'steps': 14734, 'loss/train': 0.7342803329229355} 01/27/2022 09:43:09 - INFO - codeparrot_training - 
Step 14735: {'lr': 0.00041807151454065493, 'samples': 2829312, 'steps': 14735, 'loss/train': 0.8722310364246368} 01/27/2022 09:43:13 - INFO - codeparrot_training - Step 14736: {'lr': 0.00041805940118987904, 'samples': 2829504, 'steps': 14736, 'loss/train': 0.9414417743682861} 01/27/2022 09:43:17 - INFO - codeparrot_training - Step 14737: {'lr': 0.0004180472871191899, 'samples': 2829696, 'steps': 14737, 'loss/train': 0.8041000664234161} 01/27/2022 09:43:20 - INFO - codeparrot_training - Step 14738: {'lr': 0.0004180351723286396, 'samples': 2829888, 'steps': 14738, 'loss/train': 1.4101426899433136} 01/27/2022 09:43:23 - INFO - codeparrot_training - Step 14739: {'lr': 0.00041802305681828007, 'samples': 2830080, 'steps': 14739, 'loss/train': 1.277050495147705} 01/27/2022 09:43:26 - INFO - codeparrot_training - Step 14740: {'lr': 0.00041801094058816304, 'samples': 2830272, 'steps': 14740, 'loss/train': 0.9346383512020111} 01/27/2022 09:43:30 - INFO - codeparrot_training - Step 14741: {'lr': 0.0004179988236383405, 'samples': 2830464, 'steps': 14741, 'loss/train': 0.6566660106182098} 01/27/2022 09:43:33 - INFO - codeparrot_training - Step 14742: {'lr': 0.00041798670596886433, 'samples': 2830656, 'steps': 14742, 'loss/train': 0.6905596107244492} 01/27/2022 09:43:36 - INFO - codeparrot_training - Step 14743: {'lr': 0.00041797458757978647, 'samples': 2830848, 'steps': 14743, 'loss/train': 0.6328418701887131} 01/27/2022 09:43:39 - INFO - codeparrot_training - Step 14744: {'lr': 0.0004179624684711588, 'samples': 2831040, 'steps': 14744, 'loss/train': 0.5018843114376068} 01/27/2022 09:43:42 - INFO - codeparrot_training - Step 14745: {'lr': 0.0004179503486430333, 'samples': 2831232, 'steps': 14745, 'loss/train': 1.1646584272384644} 01/27/2022 09:43:48 - INFO - codeparrot_training - Step 14746: {'lr': 0.00041793822809546176, 'samples': 2831424, 'steps': 14746, 'loss/train': 0.34616658836603165} 01/27/2022 09:43:52 - INFO - codeparrot_training - Step 14747: {'lr': 0.0004179261068284963, 'samples': 2831616, 'steps': 14747, 'loss/train': 1.0727170407772064} 01/27/2022 09:43:55 - INFO - codeparrot_training - Step 14748: {'lr': 0.00041791398484218855, 'samples': 2831808, 'steps': 14748, 'loss/train': 0.3609984442591667} 01/27/2022 09:43:58 - INFO - codeparrot_training - Step 14749: {'lr': 0.0004179018621365908, 'samples': 2832000, 'steps': 14749, 'loss/train': 1.0386211574077606} 01/27/2022 09:44:01 - INFO - codeparrot_training - Step 14750: {'lr': 0.00041788973871175465, 'samples': 2832192, 'steps': 14750, 'loss/train': 0.7308953404426575} 01/27/2022 09:44:04 - INFO - codeparrot_training - Step 14751: {'lr': 0.00041787761456773214, 'samples': 2832384, 'steps': 14751, 'loss/train': 1.2049443125724792} 01/27/2022 09:44:07 - INFO - codeparrot_training - Step 14752: {'lr': 0.00041786548970457535, 'samples': 2832576, 'steps': 14752, 'loss/train': 0.7305739223957062} 01/27/2022 09:44:11 - INFO - codeparrot_training - Step 14753: {'lr': 0.000417853364122336, 'samples': 2832768, 'steps': 14753, 'loss/train': 0.5155317038297653} 01/27/2022 09:44:15 - INFO - codeparrot_training - Step 14754: {'lr': 0.0004178412378210662, 'samples': 2832960, 'steps': 14754, 'loss/train': 0.7736903429031372} 01/27/2022 09:44:18 - INFO - codeparrot_training - Step 14755: {'lr': 0.0004178291108008179, 'samples': 2833152, 'steps': 14755, 'loss/train': 0.820461630821228} 01/27/2022 09:44:21 - INFO - codeparrot_training - Step 14756: {'lr': 0.00041781698306164283, 'samples': 2833344, 'steps': 14756, 'loss/train': 0.7758226096630096} 
01/27/2022 09:44:24 - INFO - codeparrot_training - Step 14757: {'lr': 0.0004178048546035932, 'samples': 2833536, 'steps': 14757, 'loss/train': 0.9858404695987701} 01/27/2022 09:44:28 - INFO - codeparrot_training - Step 14758: {'lr': 0.00041779272542672086, 'samples': 2833728, 'steps': 14758, 'loss/train': 0.7178333848714828} 01/27/2022 09:44:31 - INFO - codeparrot_training - Step 14759: {'lr': 0.00041778059553107766, 'samples': 2833920, 'steps': 14759, 'loss/train': 0.5144877433776855} 01/27/2022 09:44:34 - INFO - codeparrot_training - Step 14760: {'lr': 0.00041776846491671575, 'samples': 2834112, 'steps': 14760, 'loss/train': 1.1205784678459167} 01/27/2022 09:44:37 - INFO - codeparrot_training - Step 14761: {'lr': 0.000417756333583687, 'samples': 2834304, 'steps': 14761, 'loss/train': 1.210334837436676} 01/27/2022 09:44:40 - INFO - codeparrot_training - Step 14762: {'lr': 0.0004177442015320434, 'samples': 2834496, 'steps': 14762, 'loss/train': 0.7166055142879486} 01/27/2022 09:44:45 - INFO - codeparrot_training - Step 14763: {'lr': 0.0004177320687618369, 'samples': 2834688, 'steps': 14763, 'loss/train': 0.7639804780483246} 01/27/2022 09:44:48 - INFO - codeparrot_training - Step 14764: {'lr': 0.0004177199352731194, 'samples': 2834880, 'steps': 14764, 'loss/train': 0.9898688793182373} 01/27/2022 09:44:51 - INFO - codeparrot_training - Step 14765: {'lr': 0.0004177078010659431, 'samples': 2835072, 'steps': 14765, 'loss/train': 0.8888974785804749} 01/27/2022 09:44:54 - INFO - codeparrot_training - Step 14766: {'lr': 0.0004176956661403597, 'samples': 2835264, 'steps': 14766, 'loss/train': 0.9572798609733582} 01/27/2022 09:44:57 - INFO - codeparrot_training - Step 14767: {'lr': 0.0004176835304964214, 'samples': 2835456, 'steps': 14767, 'loss/train': 0.7814667820930481} 01/27/2022 09:45:00 - INFO - codeparrot_training - Step 14768: {'lr': 0.00041767139413418, 'samples': 2835648, 'steps': 14768, 'loss/train': 0.9931512773036957} 01/27/2022 09:45:03 - INFO - codeparrot_training - Step 14769: {'lr': 0.00041765925705368766, 'samples': 2835840, 'steps': 14769, 'loss/train': 1.0892200469970703} 01/27/2022 09:45:07 - INFO - codeparrot_training - Step 14770: {'lr': 0.00041764711925499633, 'samples': 2836032, 'steps': 14770, 'loss/train': 0.9476015567779541} 01/27/2022 09:45:10 - INFO - codeparrot_training - Step 14771: {'lr': 0.0004176349807381579, 'samples': 2836224, 'steps': 14771, 'loss/train': 0.743904635310173} 01/27/2022 09:45:16 - INFO - codeparrot_training - Step 14772: {'lr': 0.0004176228415032245, 'samples': 2836416, 'steps': 14772, 'loss/train': 1.103442907333374} 01/27/2022 09:45:19 - INFO - codeparrot_training - Step 14773: {'lr': 0.000417610701550248, 'samples': 2836608, 'steps': 14773, 'loss/train': 0.9646082818508148} 01/27/2022 09:45:22 - INFO - codeparrot_training - Step 14774: {'lr': 0.0004175985608792806, 'samples': 2836800, 'steps': 14774, 'loss/train': 2.1541024446487427} 01/27/2022 09:45:25 - INFO - codeparrot_training - Step 14775: {'lr': 0.00041758641949037414, 'samples': 2836992, 'steps': 14775, 'loss/train': 0.9861268401145935} 01/27/2022 09:45:29 - INFO - codeparrot_training - Step 14776: {'lr': 0.00041757427738358066, 'samples': 2837184, 'steps': 14776, 'loss/train': 0.5722086131572723} 01/27/2022 09:45:32 - INFO - codeparrot_training - Step 14777: {'lr': 0.00041756213455895215, 'samples': 2837376, 'steps': 14777, 'loss/train': 0.7353585809469223} 01/27/2022 09:45:35 - INFO - codeparrot_training - Step 14778: {'lr': 0.00041754999101654066, 'samples': 2837568, 'steps': 14778, 
'loss/train': 0.8062432408332825} 01/27/2022 09:45:38 - INFO - codeparrot_training - Step 14779: {'lr': 0.0004175378467563983, 'samples': 2837760, 'steps': 14779, 'loss/train': 1.0281025171279907} 01/27/2022 09:45:41 - INFO - codeparrot_training - Step 14780: {'lr': 0.00041752570177857695, 'samples': 2837952, 'steps': 14780, 'loss/train': 0.6528675109148026} 01/27/2022 09:45:46 - INFO - codeparrot_training - Step 14781: {'lr': 0.0004175135560831287, 'samples': 2838144, 'steps': 14781, 'loss/train': 0.7588910758495331} 01/27/2022 09:45:49 - INFO - codeparrot_training - Step 14782: {'lr': 0.00041750140967010554, 'samples': 2838336, 'steps': 14782, 'loss/train': 0.8113317489624023} 01/27/2022 09:45:52 - INFO - codeparrot_training - Step 14783: {'lr': 0.00041748926253955954, 'samples': 2838528, 'steps': 14783, 'loss/train': 0.5977474004030228} 01/27/2022 09:45:55 - INFO - codeparrot_training - Step 14784: {'lr': 0.0004174771146915427, 'samples': 2838720, 'steps': 14784, 'loss/train': 0.8546854555606842} 01/27/2022 09:45:58 - INFO - codeparrot_training - Step 14785: {'lr': 0.00041746496612610705, 'samples': 2838912, 'steps': 14785, 'loss/train': 0.9476097822189331} 01/27/2022 09:46:02 - INFO - codeparrot_training - Step 14786: {'lr': 0.00041745281684330476, 'samples': 2839104, 'steps': 14786, 'loss/train': 0.6566299349069595} 01/27/2022 09:46:05 - INFO - codeparrot_training - Step 14787: {'lr': 0.0004174406668431877, 'samples': 2839296, 'steps': 14787, 'loss/train': 0.8156943619251251} 01/27/2022 09:46:08 - INFO - codeparrot_training - Step 14788: {'lr': 0.000417428516125808, 'samples': 2839488, 'steps': 14788, 'loss/train': 0.9465360045433044} 01/27/2022 09:46:14 - INFO - codeparrot_training - Step 14789: {'lr': 0.0004174163646912178, 'samples': 2839680, 'steps': 14789, 'loss/train': 0.5141174644231796} 01/27/2022 09:46:18 - INFO - codeparrot_training - Step 14790: {'lr': 0.0004174042125394689, 'samples': 2839872, 'steps': 14790, 'loss/train': 0.5665283203125} 01/27/2022 09:46:21 - INFO - codeparrot_training - Step 14791: {'lr': 0.00041739205967061366, 'samples': 2840064, 'steps': 14791, 'loss/train': 0.8038610816001892} 01/27/2022 09:46:24 - INFO - codeparrot_training - Step 14792: {'lr': 0.0004173799060847039, 'samples': 2840256, 'steps': 14792, 'loss/train': 0.536593034863472} 01/27/2022 09:46:27 - INFO - codeparrot_training - Step 14793: {'lr': 0.00041736775178179174, 'samples': 2840448, 'steps': 14793, 'loss/train': 1.1547671556472778} 01/27/2022 09:46:30 - INFO - codeparrot_training - Step 14794: {'lr': 0.0004173555967619294, 'samples': 2840640, 'steps': 14794, 'loss/train': 0.767947643995285} 01/27/2022 09:46:33 - INFO - codeparrot_training - Step 14795: {'lr': 0.00041734344102516873, 'samples': 2840832, 'steps': 14795, 'loss/train': 0.9453931152820587} 01/27/2022 09:46:37 - INFO - codeparrot_training - Step 14796: {'lr': 0.0004173312845715619, 'samples': 2841024, 'steps': 14796, 'loss/train': 0.7022225707769394} 01/27/2022 09:46:40 - INFO - codeparrot_training - Step 14797: {'lr': 0.000417319127401161, 'samples': 2841216, 'steps': 14797, 'loss/train': 0.8349330425262451} 01/27/2022 09:46:44 - INFO - codeparrot_training - Step 14798: {'lr': 0.00041730696951401816, 'samples': 2841408, 'steps': 14798, 'loss/train': 0.9120772182941437} 01/27/2022 09:46:47 - INFO - codeparrot_training - Step 14799: {'lr': 0.00041729481091018527, 'samples': 2841600, 'steps': 14799, 'loss/train': 0.5320842862129211} 01/27/2022 09:46:51 - INFO - codeparrot_training - Step 14800: {'lr': 0.0004172826515897146, 
'samples': 2841792, 'steps': 14800, 'loss/train': 0.9921875596046448} 01/27/2022 09:46:54 - INFO - codeparrot_training - Step 14801: {'lr': 0.0004172704915526581, 'samples': 2841984, 'steps': 14801, 'loss/train': 0.7252427190542221} 01/27/2022 09:46:57 - INFO - codeparrot_training - Step 14802: {'lr': 0.000417258330799068, 'samples': 2842176, 'steps': 14802, 'loss/train': 0.9280117750167847} 01/27/2022 09:47:00 - INFO - codeparrot_training - Step 14803: {'lr': 0.00041724616932899627, 'samples': 2842368, 'steps': 14803, 'loss/train': 0.9856557548046112} 01/27/2022 09:47:03 - INFO - codeparrot_training - Step 14804: {'lr': 0.0004172340071424951, 'samples': 2842560, 'steps': 14804, 'loss/train': 1.3009657859802246} 01/27/2022 09:47:06 - INFO - codeparrot_training - Step 14805: {'lr': 0.0004172218442396165, 'samples': 2842752, 'steps': 14805, 'loss/train': 0.722277045249939} 01/27/2022 09:47:11 - INFO - codeparrot_training - Step 14806: {'lr': 0.00041720968062041266, 'samples': 2842944, 'steps': 14806, 'loss/train': 0.9851036667823792} 01/27/2022 09:47:14 - INFO - codeparrot_training - Step 14807: {'lr': 0.0004171975162849356, 'samples': 2843136, 'steps': 14807, 'loss/train': 0.8300107419490814} 01/27/2022 09:47:17 - INFO - codeparrot_training - Step 14808: {'lr': 0.0004171853512332375, 'samples': 2843328, 'steps': 14808, 'loss/train': 1.1342875063419342} 01/27/2022 09:47:20 - INFO - codeparrot_training - Step 14809: {'lr': 0.00041717318546537045, 'samples': 2843520, 'steps': 14809, 'loss/train': 1.038821965456009} 01/27/2022 09:47:23 - INFO - codeparrot_training - Step 14810: {'lr': 0.0004171610189813866, 'samples': 2843712, 'steps': 14810, 'loss/train': 0.6825499534606934} 01/27/2022 09:47:27 - INFO - codeparrot_training - Step 14811: {'lr': 0.000417148851781338, 'samples': 2843904, 'steps': 14811, 'loss/train': 0.9285024404525757} 01/27/2022 09:47:30 - INFO - codeparrot_training - Step 14812: {'lr': 0.0004171366838652767, 'samples': 2844096, 'steps': 14812, 'loss/train': 0.9844288229942322} 01/27/2022 09:47:33 - INFO - codeparrot_training - Step 14813: {'lr': 0.000417124515233255, 'samples': 2844288, 'steps': 14813, 'loss/train': 1.006439745426178} 01/27/2022 09:47:36 - INFO - codeparrot_training - Step 14814: {'lr': 0.00041711234588532497, 'samples': 2844480, 'steps': 14814, 'loss/train': 0.7405475825071335} 01/27/2022 09:47:39 - INFO - codeparrot_training - Step 14815: {'lr': 0.0004171001758215387, 'samples': 2844672, 'steps': 14815, 'loss/train': 0.462926521897316} 01/27/2022 09:47:44 - INFO - codeparrot_training - Step 14816: {'lr': 0.0004170880050419483, 'samples': 2844864, 'steps': 14816, 'loss/train': 1.0326532423496246} 01/27/2022 09:47:47 - INFO - codeparrot_training - Step 14817: {'lr': 0.00041707583354660597, 'samples': 2845056, 'steps': 14817, 'loss/train': 0.8180705308914185} 01/27/2022 09:47:50 - INFO - codeparrot_training - Step 14818: {'lr': 0.0004170636613355638, 'samples': 2845248, 'steps': 14818, 'loss/train': 0.6342102885246277} 01/27/2022 09:47:53 - INFO - codeparrot_training - Step 14819: {'lr': 0.000417051488408874, 'samples': 2845440, 'steps': 14819, 'loss/train': 0.9276755154132843} 01/27/2022 09:47:56 - INFO - codeparrot_training - Step 14820: {'lr': 0.00041703931476658857, 'samples': 2845632, 'steps': 14820, 'loss/train': 0.7134802043437958} 01/27/2022 09:48:00 - INFO - codeparrot_training - Step 14821: {'lr': 0.0004170271404087598, 'samples': 2845824, 'steps': 14821, 'loss/train': 0.7472375929355621} 01/27/2022 09:48:03 - INFO - codeparrot_training - Step 14822: 
{'lr': 0.0004170149653354398, 'samples': 2846016, 'steps': 14822, 'loss/train': 0.6008272916078568} 01/27/2022 09:48:06 - INFO - codeparrot_training - Step 14823: {'lr': 0.0004170027895466807, 'samples': 2846208, 'steps': 14823, 'loss/train': 1.7794066071510315} 01/27/2022 09:48:12 - INFO - codeparrot_training - Step 14824: {'lr': 0.00041699061304253476, 'samples': 2846400, 'steps': 14824, 'loss/train': 1.2936800122261047} 01/27/2022 09:48:15 - INFO - codeparrot_training - Step 14825: {'lr': 0.00041697843582305406, 'samples': 2846592, 'steps': 14825, 'loss/train': 0.36899325996637344} 01/27/2022 09:48:18 - INFO - codeparrot_training - Step 14826: {'lr': 0.0004169662578882907, 'samples': 2846784, 'steps': 14826, 'loss/train': 1.0826586484909058} 01/27/2022 09:48:22 - INFO - codeparrot_training - Step 14827: {'lr': 0.0004169540792382969, 'samples': 2846976, 'steps': 14827, 'loss/train': 0.28187743574380875} 01/27/2022 09:48:25 - INFO - codeparrot_training - Step 14828: {'lr': 0.0004169418998731249, 'samples': 2847168, 'steps': 14828, 'loss/train': 0.7971208691596985} 01/27/2022 09:48:28 - INFO - codeparrot_training - Step 14829: {'lr': 0.0004169297197928268, 'samples': 2847360, 'steps': 14829, 'loss/train': 1.0158546566963196} 01/27/2022 09:48:31 - INFO - codeparrot_training - Step 14830: {'lr': 0.0004169175389974548, 'samples': 2847552, 'steps': 14830, 'loss/train': 1.2617656588554382} 01/27/2022 09:48:34 - INFO - codeparrot_training - Step 14831: {'lr': 0.0004169053574870609, 'samples': 2847744, 'steps': 14831, 'loss/train': 0.5128538310527802} 01/27/2022 09:48:37 - INFO - codeparrot_training - Step 14832: {'lr': 0.0004168931752616977, 'samples': 2847936, 'steps': 14832, 'loss/train': 1.1525744497776031} 01/27/2022 09:48:40 - INFO - codeparrot_training - Step 14833: {'lr': 0.00041688099232141694, 'samples': 2848128, 'steps': 14833, 'loss/train': 0.6738033592700958} 01/27/2022 09:48:45 - INFO - codeparrot_training - Step 14834: {'lr': 0.0004168688086662711, 'samples': 2848320, 'steps': 14834, 'loss/train': 1.0554452240467072} 01/27/2022 09:48:48 - INFO - codeparrot_training - Step 14835: {'lr': 0.0004168566242963122, 'samples': 2848512, 'steps': 14835, 'loss/train': 1.218626081943512} 01/27/2022 09:48:52 - INFO - codeparrot_training - Step 14836: {'lr': 0.00041684443921159253, 'samples': 2848704, 'steps': 14836, 'loss/train': 0.762505978345871} 01/27/2022 09:48:55 - INFO - codeparrot_training - Step 14837: {'lr': 0.00041683225341216426, 'samples': 2848896, 'steps': 14837, 'loss/train': 0.8371227085590363} 01/27/2022 09:48:58 - INFO - codeparrot_training - Step 14838: {'lr': 0.0004168200668980796, 'samples': 2849088, 'steps': 14838, 'loss/train': 0.9246532917022705} 01/27/2022 09:49:01 - INFO - codeparrot_training - Step 14839: {'lr': 0.0004168078796693908, 'samples': 2849280, 'steps': 14839, 'loss/train': 0.4371764212846756} 01/27/2022 09:49:04 - INFO - codeparrot_training - Step 14840: {'lr': 0.00041679569172614996, 'samples': 2849472, 'steps': 14840, 'loss/train': 1.0923790633678436} 01/27/2022 09:49:07 - INFO - codeparrot_training - Step 14841: {'lr': 0.0004167835030684093, 'samples': 2849664, 'steps': 14841, 'loss/train': 0.37112393975257874} 01/27/2022 09:49:12 - INFO - codeparrot_training - Step 14842: {'lr': 0.0004167713136962211, 'samples': 2849856, 'steps': 14842, 'loss/train': 1.2079759240150452} 01/27/2022 09:49:15 - INFO - codeparrot_training - Step 14843: {'lr': 0.00041675912360963766, 'samples': 2850048, 'steps': 14843, 'loss/train': 0.8484532535076141} 01/27/2022 09:49:18 - 
INFO - codeparrot_training - Step 14844: {'lr': 0.0004167469328087109, 'samples': 2850240, 'steps': 14844, 'loss/train': 0.9894675314426422} 01/27/2022 09:49:21 - INFO - codeparrot_training - Step 14845: {'lr': 0.0004167347412934933, 'samples': 2850432, 'steps': 14845, 'loss/train': 0.9104496538639069} 01/27/2022 09:49:24 - INFO - codeparrot_training - Step 14846: {'lr': 0.00041672254906403703, 'samples': 2850624, 'steps': 14846, 'loss/train': 0.9841272532939911} 01/27/2022 09:49:27 - INFO - codeparrot_training - Step 14847: {'lr': 0.00041671035612039434, 'samples': 2850816, 'steps': 14847, 'loss/train': 0.902584433555603} 01/27/2022 09:49:31 - INFO - codeparrot_training - Step 14848: {'lr': 0.0004166981624626174, 'samples': 2851008, 'steps': 14848, 'loss/train': 0.606144055724144} 01/27/2022 09:49:34 - INFO - codeparrot_training - Step 14849: {'lr': 0.00041668596809075835, 'samples': 2851200, 'steps': 14849, 'loss/train': 0.7090845108032227} 01/27/2022 09:49:37 - INFO - codeparrot_training - Step 14850: {'lr': 0.0004166737730048697, 'samples': 2851392, 'steps': 14850, 'loss/train': 0.3808351010084152} 01/27/2022 09:49:43 - INFO - codeparrot_training - Step 14851: {'lr': 0.00041666157720500344, 'samples': 2851584, 'steps': 14851, 'loss/train': 1.139484018087387} 01/27/2022 09:49:46 - INFO - codeparrot_training - Step 14852: {'lr': 0.00041664938069121195, 'samples': 2851776, 'steps': 14852, 'loss/train': 0.448562353849411} 01/27/2022 09:49:49 - INFO - codeparrot_training - Step 14853: {'lr': 0.0004166371834635474, 'samples': 2851968, 'steps': 14853, 'loss/train': 1.376759111881256} 01/27/2022 09:49:52 - INFO - codeparrot_training - Step 14854: {'lr': 0.00041662498552206206, 'samples': 2852160, 'steps': 14854, 'loss/train': 0.7254250198602676} 01/27/2022 09:49:56 - INFO - codeparrot_training - Step 14855: {'lr': 0.00041661278686680827, 'samples': 2852352, 'steps': 14855, 'loss/train': 1.3295163810253143} 01/27/2022 09:49:59 - INFO - codeparrot_training - Step 14856: {'lr': 0.00041660058749783813, 'samples': 2852544, 'steps': 14856, 'loss/train': 0.6458389610052109} 01/27/2022 09:50:02 - INFO - codeparrot_training - Step 14857: {'lr': 0.000416588387415204, 'samples': 2852736, 'steps': 14857, 'loss/train': 1.0190350413322449} 01/27/2022 09:50:05 - INFO - codeparrot_training - Step 14858: {'lr': 0.0004165761866189581, 'samples': 2852928, 'steps': 14858, 'loss/train': 0.6567938178777695} 01/27/2022 09:50:08 - INFO - codeparrot_training - Step 14859: {'lr': 0.00041656398510915273, 'samples': 2853120, 'steps': 14859, 'loss/train': 0.892786055803299} 01/27/2022 09:50:14 - INFO - codeparrot_training - Step 14860: {'lr': 0.00041655178288584006, 'samples': 2853312, 'steps': 14860, 'loss/train': 1.0964846312999725} 01/27/2022 09:50:17 - INFO - codeparrot_training - Step 14861: {'lr': 0.00041653957994907255, 'samples': 2853504, 'steps': 14861, 'loss/train': 0.9217387139797211} 01/27/2022 09:50:20 - INFO - codeparrot_training - Step 14862: {'lr': 0.0004165273762989023, 'samples': 2853696, 'steps': 14862, 'loss/train': 1.7883444428443909} 01/27/2022 09:50:23 - INFO - codeparrot_training - Step 14863: {'lr': 0.0004165151719353817, 'samples': 2853888, 'steps': 14863, 'loss/train': 0.7755110263824463} 01/27/2022 09:50:26 - INFO - codeparrot_training - Step 14864: {'lr': 0.0004165029668585629, 'samples': 2854080, 'steps': 14864, 'loss/train': 0.7723269760608673} 01/27/2022 09:50:29 - INFO - codeparrot_training - Step 14865: {'lr': 0.00041649076106849836, 'samples': 2854272, 'steps': 14865, 'loss/train': 
1.0524187982082367} 01/27/2022 09:50:33 - INFO - codeparrot_training - Step 14866: {'lr': 0.0004164785545652402, 'samples': 2854464, 'steps': 14866, 'loss/train': 1.0664228796958923} 01/27/2022 09:50:36 - INFO - codeparrot_training - Step 14867: {'lr': 0.0004164663473488408, 'samples': 2854656, 'steps': 14867, 'loss/train': 0.8680561780929565} 01/27/2022 09:50:39 - INFO - codeparrot_training - Step 14868: {'lr': 0.0004164541394193524, 'samples': 2854848, 'steps': 14868, 'loss/train': 0.8964868783950806} 01/27/2022 09:50:45 - INFO - codeparrot_training - Step 14869: {'lr': 0.00041644193077682734, 'samples': 2855040, 'steps': 14869, 'loss/train': 1.0786835253238678} 01/27/2022 09:50:48 - INFO - codeparrot_training - Step 14870: {'lr': 0.0004164297214213179, 'samples': 2855232, 'steps': 14870, 'loss/train': 0.6810383051633835} 01/27/2022 09:50:51 - INFO - codeparrot_training - Step 14871: {'lr': 0.0004164175113528763, 'samples': 2855424, 'steps': 14871, 'loss/train': 1.0401233732700348} 01/27/2022 09:50:54 - INFO - codeparrot_training - Step 14872: {'lr': 0.000416405300571555, 'samples': 2855616, 'steps': 14872, 'loss/train': 0.15714527294039726} 01/27/2022 09:50:58 - INFO - codeparrot_training - Step 14873: {'lr': 0.00041639308907740624, 'samples': 2855808, 'steps': 14873, 'loss/train': 1.154172420501709} 01/27/2022 09:51:01 - INFO - codeparrot_training - Step 14874: {'lr': 0.0004163808768704823, 'samples': 2856000, 'steps': 14874, 'loss/train': 0.67677041888237} 01/27/2022 09:51:04 - INFO - codeparrot_training - Step 14875: {'lr': 0.0004163686639508356, 'samples': 2856192, 'steps': 14875, 'loss/train': 1.0373145639896393} 01/27/2022 09:51:07 - INFO - codeparrot_training - Step 14876: {'lr': 0.00041635645031851826, 'samples': 2856384, 'steps': 14876, 'loss/train': 0.9687960147857666} 01/27/2022 09:51:10 - INFO - codeparrot_training - Step 14877: {'lr': 0.0004163442359735827, 'samples': 2856576, 'steps': 14877, 'loss/train': 0.7696496844291687} 01/27/2022 09:51:15 - INFO - codeparrot_training - Step 14878: {'lr': 0.00041633202091608136, 'samples': 2856768, 'steps': 14878, 'loss/train': 0.847482293844223} 01/27/2022 09:51:18 - INFO - codeparrot_training - Step 14879: {'lr': 0.00041631980514606636, 'samples': 2856960, 'steps': 14879, 'loss/train': 0.6754580587148666} 01/27/2022 09:51:21 - INFO - codeparrot_training - Step 14880: {'lr': 0.0004163075886635902, 'samples': 2857152, 'steps': 14880, 'loss/train': 0.7555694282054901} 01/27/2022 09:51:24 - INFO - codeparrot_training - Step 14881: {'lr': 0.0004162953714687051, 'samples': 2857344, 'steps': 14881, 'loss/train': 0.2570982053875923} 01/27/2022 09:51:27 - INFO - codeparrot_training - Step 14882: {'lr': 0.0004162831535614635, 'samples': 2857536, 'steps': 14882, 'loss/train': 1.0315191149711609} 01/27/2022 09:51:31 - INFO - codeparrot_training - Step 14883: {'lr': 0.0004162709349419176, 'samples': 2857728, 'steps': 14883, 'loss/train': 1.1975763738155365} 01/27/2022 09:51:34 - INFO - codeparrot_training - Step 14884: {'lr': 0.0004162587156101198, 'samples': 2857920, 'steps': 14884, 'loss/train': 0.8490925133228302} 01/27/2022 09:51:37 - INFO - codeparrot_training - Step 14885: {'lr': 0.0004162464955661225, 'samples': 2858112, 'steps': 14885, 'loss/train': 0.42097820341587067} 01/27/2022 09:51:40 - INFO - codeparrot_training - Step 14886: {'lr': 0.000416234274809978, 'samples': 2858304, 'steps': 14886, 'loss/train': 0.5582098513841629} 01/27/2022 09:51:44 - INFO - codeparrot_training - Step 14887: {'lr': 0.00041622205334173863, 'samples': 
2858496, 'steps': 14887, 'loss/train': 0.7581454217433929} 01/27/2022 09:51:48 - INFO - codeparrot_training - Step 14888: {'lr': 0.00041620983116145673, 'samples': 2858688, 'steps': 14888, 'loss/train': 1.0134406685829163} 01/27/2022 09:51:51 - INFO - codeparrot_training - Step 14889: {'lr': 0.00041619760826918474, 'samples': 2858880, 'steps': 14889, 'loss/train': 0.46819138526916504} 01/27/2022 09:51:54 - INFO - codeparrot_training - Step 14890: {'lr': 0.00041618538466497496, 'samples': 2859072, 'steps': 14890, 'loss/train': 0.7389763444662094} 01/27/2022 09:51:57 - INFO - codeparrot_training - Step 14891: {'lr': 0.00041617316034887983, 'samples': 2859264, 'steps': 14891, 'loss/train': 0.4043544977903366} 01/27/2022 09:52:00 - INFO - codeparrot_training - Step 14892: {'lr': 0.00041616093532095155, 'samples': 2859456, 'steps': 14892, 'loss/train': 0.6869072467088699} 01/27/2022 09:52:03 - INFO - codeparrot_training - Step 14893: {'lr': 0.00041614870958124264, 'samples': 2859648, 'steps': 14893, 'loss/train': 0.6493963748216629} 01/27/2022 09:52:06 - INFO - codeparrot_training - Step 14894: {'lr': 0.00041613648312980537, 'samples': 2859840, 'steps': 14894, 'loss/train': 1.036612182855606} 01/27/2022 09:52:10 - INFO - codeparrot_training - Step 14895: {'lr': 0.00041612425596669215, 'samples': 2860032, 'steps': 14895, 'loss/train': 2.634717106819153} 01/27/2022 09:52:16 - INFO - codeparrot_training - Step 14896: {'lr': 0.0004161120280919554, 'samples': 2860224, 'steps': 14896, 'loss/train': 1.0464337170124054} 01/27/2022 09:52:19 - INFO - codeparrot_training - Step 14897: {'lr': 0.00041609979950564747, 'samples': 2860416, 'steps': 14897, 'loss/train': 0.9059052765369415} 01/27/2022 09:52:22 - INFO - codeparrot_training - Step 14898: {'lr': 0.00041608757020782073, 'samples': 2860608, 'steps': 14898, 'loss/train': 0.9642875790596008} 01/27/2022 09:52:26 - INFO - codeparrot_training - Step 14899: {'lr': 0.0004160753401985276, 'samples': 2860800, 'steps': 14899, 'loss/train': 0.6371002346277237} 01/27/2022 09:52:29 - INFO - codeparrot_training - Step 14900: {'lr': 0.00041606310947782046, 'samples': 2860992, 'steps': 14900, 'loss/train': 0.6879485696554184} 01/27/2022 09:52:32 - INFO - codeparrot_training - Step 14901: {'lr': 0.00041605087804575167, 'samples': 2861184, 'steps': 14901, 'loss/train': 0.029866931028664112} 01/27/2022 09:52:35 - INFO - codeparrot_training - Step 14902: {'lr': 0.0004160386459023736, 'samples': 2861376, 'steps': 14902, 'loss/train': 1.1892339885234833} 01/27/2022 09:52:38 - INFO - codeparrot_training - Step 14903: {'lr': 0.00041602641304773876, 'samples': 2861568, 'steps': 14903, 'loss/train': 1.1637123227119446} 01/27/2022 09:52:43 - INFO - codeparrot_training - Step 14904: {'lr': 0.0004160141794818995, 'samples': 2861760, 'steps': 14904, 'loss/train': 1.3617397248744965} 01/27/2022 09:52:46 - INFO - codeparrot_training - Step 14905: {'lr': 0.00041600194520490815, 'samples': 2861952, 'steps': 14905, 'loss/train': 0.6246704310178757} 01/27/2022 09:52:49 - INFO - codeparrot_training - Step 14906: {'lr': 0.0004159897102168172, 'samples': 2862144, 'steps': 14906, 'loss/train': 0.9599403440952301} 01/27/2022 09:52:52 - INFO - codeparrot_training - Step 14907: {'lr': 0.00041597747451767905, 'samples': 2862336, 'steps': 14907, 'loss/train': 1.0182712376117706} 01/27/2022 09:52:56 - INFO - codeparrot_training - Step 14908: {'lr': 0.00041596523810754607, 'samples': 2862528, 'steps': 14908, 'loss/train': 0.6133207082748413} 01/27/2022 09:52:59 - INFO - codeparrot_training - Step 
14909: {'lr': 0.0004159530009864707, 'samples': 2862720, 'steps': 14909, 'loss/train': 0.7166661322116852} 01/27/2022 09:53:02 - INFO - codeparrot_training - Step 14910: {'lr': 0.0004159407631545054, 'samples': 2862912, 'steps': 14910, 'loss/train': 1.1071895956993103} 01/27/2022 09:53:05 - INFO - codeparrot_training - Step 14911: {'lr': 0.0004159285246117026, 'samples': 2863104, 'steps': 14911, 'loss/train': 0.5593938678503036} 01/27/2022 09:53:08 - INFO - codeparrot_training - Step 14912: {'lr': 0.00041591628535811464, 'samples': 2863296, 'steps': 14912, 'loss/train': 0.8118923306465149} 01/27/2022 09:53:12 - INFO - codeparrot_training - Step 14913: {'lr': 0.000415904045393794, 'samples': 2863488, 'steps': 14913, 'loss/train': 0.4378426820039749} 01/27/2022 09:53:16 - INFO - codeparrot_training - Step 14914: {'lr': 0.0004158918047187931, 'samples': 2863680, 'steps': 14914, 'loss/train': 0.9494974315166473} 01/27/2022 09:53:19 - INFO - codeparrot_training - Step 14915: {'lr': 0.0004158795633331645, 'samples': 2863872, 'steps': 14915, 'loss/train': 1.633556306362152} 01/27/2022 09:53:22 - INFO - codeparrot_training - Step 14916: {'lr': 0.00041586732123696037, 'samples': 2864064, 'steps': 14916, 'loss/train': 0.8532348275184631} 01/27/2022 09:53:25 - INFO - codeparrot_training - Step 14917: {'lr': 0.0004158550784302334, 'samples': 2864256, 'steps': 14917, 'loss/train': 0.8623372614383698} 01/27/2022 09:53:28 - INFO - codeparrot_training - Step 14918: {'lr': 0.0004158428349130359, 'samples': 2864448, 'steps': 14918, 'loss/train': 0.8054812252521515} 01/27/2022 09:53:31 - INFO - codeparrot_training - Step 14919: {'lr': 0.00041583059068542034, 'samples': 2864640, 'steps': 14919, 'loss/train': 0.5904173254966736} 01/27/2022 09:53:34 - INFO - codeparrot_training - Step 14920: {'lr': 0.0004158183457474392, 'samples': 2864832, 'steps': 14920, 'loss/train': 1.0304043889045715} 01/27/2022 09:53:38 - INFO - codeparrot_training - Step 14921: {'lr': 0.00041580610009914486, 'samples': 2865024, 'steps': 14921, 'loss/train': 0.6133888363838196} 01/27/2022 09:53:44 - INFO - codeparrot_training - Step 14922: {'lr': 0.00041579385374058996, 'samples': 2865216, 'steps': 14922, 'loss/train': 0.8990558981895447} 01/27/2022 09:53:47 - INFO - codeparrot_training - Step 14923: {'lr': 0.00041578160667182676, 'samples': 2865408, 'steps': 14923, 'loss/train': 0.8090297877788544} 01/27/2022 09:53:50 - INFO - codeparrot_training - Step 14924: {'lr': 0.00041576935889290777, 'samples': 2865600, 'steps': 14924, 'loss/train': 1.5105382204055786} 01/27/2022 09:53:53 - INFO - codeparrot_training - Step 14925: {'lr': 0.0004157571104038856, 'samples': 2865792, 'steps': 14925, 'loss/train': 0.7566186189651489} 01/27/2022 09:53:56 - INFO - codeparrot_training - Step 14926: {'lr': 0.00041574486120481255, 'samples': 2865984, 'steps': 14926, 'loss/train': 0.8501961529254913} 01/27/2022 09:54:00 - INFO - codeparrot_training - Step 14927: {'lr': 0.0004157326112957411, 'samples': 2866176, 'steps': 14927, 'loss/train': 1.0049327909946442} 01/27/2022 09:54:03 - INFO - codeparrot_training - Step 14928: {'lr': 0.0004157203606767238, 'samples': 2866368, 'steps': 14928, 'loss/train': 0.6804579198360443} 01/27/2022 09:54:06 - INFO - codeparrot_training - Step 14929: {'lr': 0.0004157081093478131, 'samples': 2866560, 'steps': 14929, 'loss/train': 0.252153642475605} 01/27/2022 09:54:09 - INFO - codeparrot_training - Step 14930: {'lr': 0.00041569585730906147, 'samples': 2866752, 'steps': 14930, 'loss/train': 1.0068927705287933} 01/27/2022 
09:54:14 - INFO - codeparrot_training - Step 14931: {'lr': 0.0004156836045605214, 'samples': 2866944, 'steps': 14931, 'loss/train': 0.563710168004036} 01/27/2022 09:54:17 - INFO - codeparrot_training - Step 14932: {'lr': 0.0004156713511022454, 'samples': 2867136, 'steps': 14932, 'loss/train': 0.42748913168907166} 01/27/2022 09:54:20 - INFO - codeparrot_training - Step 14933: {'lr': 0.00041565909693428593, 'samples': 2867328, 'steps': 14933, 'loss/train': 1.3977684080600739} 01/27/2022 09:54:23 - INFO - codeparrot_training - Step 14934: {'lr': 0.00041564684205669546, 'samples': 2867520, 'steps': 14934, 'loss/train': 0.846125453710556} 01/27/2022 09:54:26 - INFO - codeparrot_training - Step 14935: {'lr': 0.00041563458646952655, 'samples': 2867712, 'steps': 14935, 'loss/train': 0.7471296787261963} 01/27/2022 09:54:29 - INFO - codeparrot_training - Step 14936: {'lr': 0.0004156223301728316, 'samples': 2867904, 'steps': 14936, 'loss/train': 0.8648886680603027} 01/27/2022 09:54:32 - INFO - codeparrot_training - Step 14937: {'lr': 0.00041561007316666333, 'samples': 2868096, 'steps': 14937, 'loss/train': 0.28202104568481445} 01/27/2022 09:54:36 - INFO - codeparrot_training - Step 14938: {'lr': 0.00041559781545107393, 'samples': 2868288, 'steps': 14938, 'loss/train': 0.5227208286523819} 01/27/2022 09:54:40 - INFO - codeparrot_training - Step 14939: {'lr': 0.00041558555702611615, 'samples': 2868480, 'steps': 14939, 'loss/train': 0.774116188287735} 01/27/2022 09:54:43 - INFO - codeparrot_training - Step 14940: {'lr': 0.0004155732978918424, 'samples': 2868672, 'steps': 14940, 'loss/train': 1.1576021611690521} 01/27/2022 09:54:46 - INFO - codeparrot_training - Step 14941: {'lr': 0.00041556103804830523, 'samples': 2868864, 'steps': 14941, 'loss/train': 0.8798396587371826} 01/27/2022 09:54:50 - INFO - codeparrot_training - Step 14942: {'lr': 0.0004155487774955572, 'samples': 2869056, 'steps': 14942, 'loss/train': 1.0762514770030975} 01/27/2022 09:54:53 - INFO - codeparrot_training - Step 14943: {'lr': 0.00041553651623365076, 'samples': 2869248, 'steps': 14943, 'loss/train': 1.0704386830329895} 01/27/2022 09:54:56 - INFO - codeparrot_training - Step 14944: {'lr': 0.00041552425426263836, 'samples': 2869440, 'steps': 14944, 'loss/train': 0.028738743625581264} 01/27/2022 09:54:59 - INFO - codeparrot_training - Step 14945: {'lr': 0.00041551199158257264, 'samples': 2869632, 'steps': 14945, 'loss/train': 0.8262295424938202} 01/27/2022 09:55:02 - INFO - codeparrot_training - Step 14946: {'lr': 0.00041549972819350615, 'samples': 2869824, 'steps': 14946, 'loss/train': 1.181008905172348} 01/27/2022 09:55:05 - INFO - codeparrot_training - Step 14947: {'lr': 0.00041548746409549134, 'samples': 2870016, 'steps': 14947, 'loss/train': 1.2898524105548859} 01/27/2022 09:55:10 - INFO - codeparrot_training - Step 14948: {'lr': 0.0004154751992885808, 'samples': 2870208, 'steps': 14948, 'loss/train': 0.9211480915546417} 01/27/2022 09:55:13 - INFO - codeparrot_training - Step 14949: {'lr': 0.0004154629337728271, 'samples': 2870400, 'steps': 14949, 'loss/train': 0.6386914998292923} 01/27/2022 09:55:16 - INFO - codeparrot_training - Step 14950: {'lr': 0.00041545066754828264, 'samples': 2870592, 'steps': 14950, 'loss/train': 0.6085518300533295} 01/27/2022 09:55:19 - INFO - codeparrot_training - Step 14951: {'lr': 0.00041543840061500007, 'samples': 2870784, 'steps': 14951, 'loss/train': 0.8187743425369263} 01/27/2022 09:55:22 - INFO - codeparrot_training - Step 14952: {'lr': 0.000415426132973032, 'samples': 2870976, 'steps': 14952, 
'loss/train': 0.9395715594291687} 01/27/2022 09:55:26 - INFO - codeparrot_training - Step 14953: {'lr': 0.0004154138646224308, 'samples': 2871168, 'steps': 14953, 'loss/train': 0.9904250800609589} 01/27/2022 09:55:29 - INFO - codeparrot_training - Step 14954: {'lr': 0.0004154015955632492, 'samples': 2871360, 'steps': 14954, 'loss/train': 0.841172844171524} 01/27/2022 09:55:32 - INFO - codeparrot_training - Step 14955: {'lr': 0.0004153893257955397, 'samples': 2871552, 'steps': 14955, 'loss/train': 0.9992859363555908} 01/27/2022 09:55:35 - INFO - codeparrot_training - Step 14956: {'lr': 0.00041537705531935476, 'samples': 2871744, 'steps': 14956, 'loss/train': 0.5827158540487289} 01/27/2022 09:55:41 - INFO - codeparrot_training - Step 14957: {'lr': 0.0004153647841347471, 'samples': 2871936, 'steps': 14957, 'loss/train': 1.140465795993805} 01/27/2022 09:55:44 - INFO - codeparrot_training - Step 14958: {'lr': 0.0004153525122417692, 'samples': 2872128, 'steps': 14958, 'loss/train': 0.7829126715660095} 01/27/2022 09:55:48 - INFO - codeparrot_training - Step 14959: {'lr': 0.00041534023964047363, 'samples': 2872320, 'steps': 14959, 'loss/train': 1.0058841705322266} 01/27/2022 09:55:51 - INFO - codeparrot_training - Step 14960: {'lr': 0.00041532796633091297, 'samples': 2872512, 'steps': 14960, 'loss/train': 0.5565561354160309} 01/27/2022 09:55:54 - INFO - codeparrot_training - Step 14961: {'lr': 0.0004153156923131398, 'samples': 2872704, 'steps': 14961, 'loss/train': 1.2491376399993896} 01/27/2022 09:55:57 - INFO - codeparrot_training - Step 14962: {'lr': 0.0004153034175872067, 'samples': 2872896, 'steps': 14962, 'loss/train': 1.0159537196159363} 01/27/2022 09:56:00 - INFO - codeparrot_training - Step 14963: {'lr': 0.00041529114215316633, 'samples': 2873088, 'steps': 14963, 'loss/train': 0.38084784150123596} 01/27/2022 09:56:03 - INFO - codeparrot_training - Step 14964: {'lr': 0.0004152788660110711, 'samples': 2873280, 'steps': 14964, 'loss/train': 0.6386151909828186} 01/27/2022 09:56:06 - INFO - codeparrot_training - Step 14965: {'lr': 0.0004152665891609737, 'samples': 2873472, 'steps': 14965, 'loss/train': 0.6162815541028976} 01/27/2022 09:56:11 - INFO - codeparrot_training - Step 14966: {'lr': 0.0004152543116029267, 'samples': 2873664, 'steps': 14966, 'loss/train': 0.7855096757411957} 01/27/2022 09:56:14 - INFO - codeparrot_training - Step 14967: {'lr': 0.0004152420333369827, 'samples': 2873856, 'steps': 14967, 'loss/train': 1.015951931476593} 01/27/2022 09:56:17 - INFO - codeparrot_training - Step 14968: {'lr': 0.00041522975436319445, 'samples': 2874048, 'steps': 14968, 'loss/train': 0.8404653668403625} 01/27/2022 09:56:21 - INFO - codeparrot_training - Step 14969: {'lr': 0.00041521747468161417, 'samples': 2874240, 'steps': 14969, 'loss/train': 0.6196260452270508} 01/27/2022 09:56:24 - INFO - codeparrot_training - Step 14970: {'lr': 0.00041520519429229485, 'samples': 2874432, 'steps': 14970, 'loss/train': 1.113535076379776} 01/27/2022 09:56:27 - INFO - codeparrot_training - Step 14971: {'lr': 0.00041519291319528886, 'samples': 2874624, 'steps': 14971, 'loss/train': 0.39278528094291687} 01/27/2022 09:56:30 - INFO - codeparrot_training - Step 14972: {'lr': 0.00041518063139064893, 'samples': 2874816, 'steps': 14972, 'loss/train': 0.43085020780563354} 01/27/2022 09:56:33 - INFO - codeparrot_training - Step 14973: {'lr': 0.0004151683488784276, 'samples': 2875008, 'steps': 14973, 'loss/train': 0.7466789782047272} 01/27/2022 09:56:36 - INFO - codeparrot_training - Step 14974: {'lr': 
0.00041515606565867746, 'samples': 2875200, 'steps': 14974, 'loss/train': 0.965239405632019} 01/27/2022 09:56:43 - INFO - codeparrot_training - Step 14975: {'lr': 0.0004151437817314513, 'samples': 2875392, 'steps': 14975, 'loss/train': 0.8094145953655243} 01/27/2022 09:56:46 - INFO - codeparrot_training - Step 14976: {'lr': 0.00041513149709680155, 'samples': 2875584, 'steps': 14976, 'loss/train': 0.8900536000728607} 01/27/2022 09:56:49 - INFO - codeparrot_training - Step 14977: {'lr': 0.00041511921175478085, 'samples': 2875776, 'steps': 14977, 'loss/train': 0.42175009846687317} 01/27/2022 09:56:52 - INFO - codeparrot_training - Step 14978: {'lr': 0.0004151069257054419, 'samples': 2875968, 'steps': 14978, 'loss/train': 0.8767218589782715} 01/27/2022 09:56:55 - INFO - codeparrot_training - Step 14979: {'lr': 0.0004150946389488374, 'samples': 2876160, 'steps': 14979, 'loss/train': 1.1750462651252747} 01/27/2022 09:56:58 - INFO - codeparrot_training - Step 14980: {'lr': 0.0004150823514850198, 'samples': 2876352, 'steps': 14980, 'loss/train': 0.2686251327395439} 01/27/2022 09:57:01 - INFO - codeparrot_training - Step 14981: {'lr': 0.00041507006331404186, 'samples': 2876544, 'steps': 14981, 'loss/train': 0.8051825165748596} 01/27/2022 09:57:05 - INFO - codeparrot_training - Step 14982: {'lr': 0.00041505777443595615, 'samples': 2876736, 'steps': 14982, 'loss/train': 0.9834910333156586} 01/27/2022 09:57:09 - INFO - codeparrot_training - Step 14983: {'lr': 0.0004150454848508154, 'samples': 2876928, 'steps': 14983, 'loss/train': 0.6123595833778381} 01/27/2022 09:57:12 - INFO - codeparrot_training - Step 14984: {'lr': 0.00041503319455867216, 'samples': 2877120, 'steps': 14984, 'loss/train': 0.811616063117981} 01/27/2022 09:57:15 - INFO - codeparrot_training - Step 14985: {'lr': 0.0004150209035595791, 'samples': 2877312, 'steps': 14985, 'loss/train': 0.5834321826696396} 01/27/2022 09:57:18 - INFO - codeparrot_training - Step 14986: {'lr': 0.000415008611853589, 'samples': 2877504, 'steps': 14986, 'loss/train': 0.7463065981864929} 01/27/2022 09:57:22 - INFO - codeparrot_training - Step 14987: {'lr': 0.0004149963194407543, 'samples': 2877696, 'steps': 14987, 'loss/train': 0.8775812387466431} 01/27/2022 09:57:25 - INFO - codeparrot_training - Step 14988: {'lr': 0.00041498402632112776, 'samples': 2877888, 'steps': 14988, 'loss/train': 0.9469330608844757} 01/27/2022 09:57:28 - INFO - codeparrot_training - Step 14989: {'lr': 0.00041497173249476204, 'samples': 2878080, 'steps': 14989, 'loss/train': 1.1701518595218658} 01/27/2022 09:57:31 - INFO - codeparrot_training - Step 14990: {'lr': 0.0004149594379617099, 'samples': 2878272, 'steps': 14990, 'loss/train': 1.138960987329483} 01/27/2022 09:57:34 - INFO - codeparrot_training - Step 14991: {'lr': 0.00041494714272202385, 'samples': 2878464, 'steps': 14991, 'loss/train': 0.9694905281066895} 01/27/2022 09:57:39 - INFO - codeparrot_training - Step 14992: {'lr': 0.00041493484677575655, 'samples': 2878656, 'steps': 14992, 'loss/train': 0.886474996805191} 01/27/2022 09:57:42 - INFO - codeparrot_training - Step 14993: {'lr': 0.00041492255012296077, 'samples': 2878848, 'steps': 14993, 'loss/train': 1.0491735935211182} 01/27/2022 09:57:45 - INFO - codeparrot_training - Step 14994: {'lr': 0.0004149102527636892, 'samples': 2879040, 'steps': 14994, 'loss/train': 0.7877329587936401} 01/27/2022 09:57:48 - INFO - codeparrot_training - Step 14995: {'lr': 0.0004148979546979944, 'samples': 2879232, 'steps': 14995, 'loss/train': 0.7126472890377045} 01/27/2022 09:57:51 - INFO - 
codeparrot_training - Step 14996: {'lr': 0.00041488565592592917, 'samples': 2879424, 'steps': 14996, 'loss/train': 0.5102161020040512} 01/27/2022 09:57:54 - INFO - codeparrot_training - Step 14997: {'lr': 0.0004148733564475462, 'samples': 2879616, 'steps': 14997, 'loss/train': 0.8605283796787262} 01/27/2022 09:57:58 - INFO - codeparrot_training - Step 14998: {'lr': 0.000414861056262898, 'samples': 2879808, 'steps': 14998, 'loss/train': 0.6425975114107132} 01/27/2022 09:58:01 - INFO - codeparrot_training - Step 14999: {'lr': 0.0004148487553720375, 'samples': 2880000, 'steps': 14999, 'loss/train': 0.6975443065166473} 01/27/2022 09:58:04 - INFO - codeparrot_training - Step 15000: {'lr': 0.0004148364537750172, 'samples': 2880192, 'steps': 15000, 'loss/train': 1.240765392780304} 01/27/2022 09:58:10 - INFO - codeparrot_training - Step 15001: {'lr': 0.0004148241514718899, 'samples': 2880384, 'steps': 15001, 'loss/train': 0.9081425964832306} 01/27/2022 09:58:13 - INFO - codeparrot_training - Step 15002: {'lr': 0.00041481184846270836, 'samples': 2880576, 'steps': 15002, 'loss/train': 0.9218344688415527} 01/27/2022 09:58:16 - INFO - codeparrot_training - Step 15003: {'lr': 0.00041479954474752507, 'samples': 2880768, 'steps': 15003, 'loss/train': 0.5083440989255905} 01/27/2022 09:58:20 - INFO - codeparrot_training - Step 15004: {'lr': 0.0004147872403263929, 'samples': 2880960, 'steps': 15004, 'loss/train': 0.9584490358829498} 01/27/2022 09:58:23 - INFO - codeparrot_training - Step 15005: {'lr': 0.0004147749351993645, 'samples': 2881152, 'steps': 15005, 'loss/train': 0.642764613032341} 01/27/2022 09:58:26 - INFO - codeparrot_training - Step 15006: {'lr': 0.0004147626293664926, 'samples': 2881344, 'steps': 15006, 'loss/train': 0.6055668443441391} 01/27/2022 09:58:29 - INFO - codeparrot_training - Step 15007: {'lr': 0.00041475032282783, 'samples': 2881536, 'steps': 15007, 'loss/train': 0.7532553970813751} 01/27/2022 09:58:32 - INFO - codeparrot_training - Step 15008: {'lr': 0.0004147380155834293, 'samples': 2881728, 'steps': 15008, 'loss/train': 1.5778334140777588} 01/27/2022 09:58:35 - INFO - codeparrot_training - Step 15009: {'lr': 0.00041472570763334316, 'samples': 2881920, 'steps': 15009, 'loss/train': 1.0040399730205536} 01/27/2022 09:58:40 - INFO - codeparrot_training - Step 15010: {'lr': 0.00041471339897762447, 'samples': 2882112, 'steps': 15010, 'loss/train': 0.8009811043739319} 01/27/2022 09:58:43 - INFO - codeparrot_training - Step 15011: {'lr': 0.0004147010896163259, 'samples': 2882304, 'steps': 15011, 'loss/train': 1.0453627109527588} 01/27/2022 09:58:46 - INFO - codeparrot_training - Step 15012: {'lr': 0.00041468877954950006, 'samples': 2882496, 'steps': 15012, 'loss/train': 0.7006141394376755} 01/27/2022 09:58:49 - INFO - codeparrot_training - Step 15013: {'lr': 0.0004146764687771999, 'samples': 2882688, 'steps': 15013, 'loss/train': 0.48668789863586426} 01/27/2022 09:58:52 - INFO - codeparrot_training - Step 15014: {'lr': 0.00041466415729947794, 'samples': 2882880, 'steps': 15014, 'loss/train': 0.9708479046821594} 01/27/2022 09:58:55 - INFO - codeparrot_training - Step 15015: {'lr': 0.0004146518451163871, 'samples': 2883072, 'steps': 15015, 'loss/train': 0.7377860248088837} 01/27/2022 09:58:59 - INFO - codeparrot_training - Step 15016: {'lr': 0.00041463953222798, 'samples': 2883264, 'steps': 15016, 'loss/train': 0.7226182222366333} 01/27/2022 09:59:02 - INFO - codeparrot_training - Step 15017: {'lr': 0.00041462721863430943, 'samples': 2883456, 'steps': 15017, 'loss/train': 
0.9016934037208557} 01/27/2022 09:59:05 - INFO - codeparrot_training - Step 15018: {'lr': 0.0004146149043354281, 'samples': 2883648, 'steps': 15018, 'loss/train': 0.9316791594028473} 01/27/2022 09:59:10 - INFO - codeparrot_training - Step 15019: {'lr': 0.0004146025893313888, 'samples': 2883840, 'steps': 15019, 'loss/train': 0.39896075427532196} 01/27/2022 09:59:13 - INFO - codeparrot_training - Step 15020: {'lr': 0.00041459027362224433, 'samples': 2884032, 'steps': 15020, 'loss/train': 0.9398731291294098} 01/27/2022 09:59:16 - INFO - codeparrot_training - Step 15021: {'lr': 0.0004145779572080473, 'samples': 2884224, 'steps': 15021, 'loss/train': 1.0990937054157257} 01/27/2022 09:59:19 - INFO - codeparrot_training - Step 15022: {'lr': 0.0004145656400888506, 'samples': 2884416, 'steps': 15022, 'loss/train': 1.0989840924739838} 01/27/2022 09:59:22 - INFO - codeparrot_training - Step 15023: {'lr': 0.000414553322264707, 'samples': 2884608, 'steps': 15023, 'loss/train': 0.7557978630065918} 01/27/2022 09:59:25 - INFO - codeparrot_training - Step 15024: {'lr': 0.00041454100373566915, 'samples': 2884800, 'steps': 15024, 'loss/train': 0.5843474417924881} 01/27/2022 09:59:28 - INFO - codeparrot_training - Step 15025: {'lr': 0.00041452868450178994, 'samples': 2884992, 'steps': 15025, 'loss/train': 0.46224354207515717} 01/27/2022 09:59:32 - INFO - codeparrot_training - Step 15026: {'lr': 0.00041451636456312207, 'samples': 2885184, 'steps': 15026, 'loss/train': 1.195622831583023} 01/27/2022 09:59:38 - INFO - codeparrot_training - Step 15027: {'lr': 0.0004145040439197183, 'samples': 2885376, 'steps': 15027, 'loss/train': 0.8623473644256592} 01/27/2022 09:59:41 - INFO - codeparrot_training - Step 15028: {'lr': 0.00041449172257163156, 'samples': 2885568, 'steps': 15028, 'loss/train': 0.4661318063735962} 01/27/2022 09:59:44 - INFO - codeparrot_training - Step 15029: {'lr': 0.00041447940051891435, 'samples': 2885760, 'steps': 15029, 'loss/train': 1.0171912908554077} 01/27/2022 09:59:47 - INFO - codeparrot_training - Step 15030: {'lr': 0.00041446707776161975, 'samples': 2885952, 'steps': 15030, 'loss/train': 0.6461917161941528} 01/27/2022 09:59:50 - INFO - codeparrot_training - Step 15031: {'lr': 0.00041445475429980033, 'samples': 2886144, 'steps': 15031, 'loss/train': 0.7885690927505493} 01/27/2022 09:59:54 - INFO - codeparrot_training - Step 15032: {'lr': 0.000414442430133509, 'samples': 2886336, 'steps': 15032, 'loss/train': 0.6364949494600296} 01/27/2022 09:59:57 - INFO - codeparrot_training - Step 15033: {'lr': 0.0004144301052627985, 'samples': 2886528, 'steps': 15033, 'loss/train': 0.48566973209381104} 01/27/2022 10:00:00 - INFO - codeparrot_training - Step 15034: {'lr': 0.00041441777968772165, 'samples': 2886720, 'steps': 15034, 'loss/train': 0.21621376276016235} 01/27/2022 10:00:03 - INFO - codeparrot_training - Step 15035: {'lr': 0.00041440545340833124, 'samples': 2886912, 'steps': 15035, 'loss/train': 2.3699668049812317} 01/27/2022 10:00:08 - INFO - codeparrot_training - Step 15036: {'lr': 0.00041439312642468007, 'samples': 2887104, 'steps': 15036, 'loss/train': 0.9903697371482849} 01/27/2022 10:00:11 - INFO - codeparrot_training - Step 15037: {'lr': 0.000414380798736821, 'samples': 2887296, 'steps': 15037, 'loss/train': 0.6954662650823593} 01/27/2022 10:00:14 - INFO - codeparrot_training - Step 15038: {'lr': 0.0004143684703448067, 'samples': 2887488, 'steps': 15038, 'loss/train': 0.7904753386974335} 01/27/2022 10:00:17 - INFO - codeparrot_training - Step 15039: {'lr': 0.0004143561412486901, 
'samples': 2887680, 'steps': 15039, 'loss/train': 0.9077926576137543} 01/27/2022 10:00:20 - INFO - codeparrot_training - Step 15040: {'lr': 0.00041434381144852395, 'samples': 2887872, 'steps': 15040, 'loss/train': 0.7395349591970444} 01/27/2022 10:00:24 - INFO - codeparrot_training - Step 15041: {'lr': 0.00041433148094436115, 'samples': 2888064, 'steps': 15041, 'loss/train': 0.7252790182828903} 01/27/2022 10:00:27 - INFO - codeparrot_training - Step 15042: {'lr': 0.0004143191497362544, 'samples': 2888256, 'steps': 15042, 'loss/train': 0.9122817814350128} 01/27/2022 10:00:30 - INFO - codeparrot_training - Step 15043: {'lr': 0.0004143068178242566, 'samples': 2888448, 'steps': 15043, 'loss/train': 1.2050303220748901} 01/27/2022 10:00:33 - INFO - codeparrot_training - Step 15044: {'lr': 0.00041429448520842064, 'samples': 2888640, 'steps': 15044, 'loss/train': 0.7303286343812943} 01/27/2022 10:00:38 - INFO - codeparrot_training - Step 15045: {'lr': 0.00041428215188879926, 'samples': 2888832, 'steps': 15045, 'loss/train': 0.7626270353794098} 01/27/2022 10:00:41 - INFO - codeparrot_training - Step 15046: {'lr': 0.0004142698178654453, 'samples': 2889024, 'steps': 15046, 'loss/train': 0.60106261074543} 01/27/2022 10:00:44 - INFO - codeparrot_training - Step 15047: {'lr': 0.0004142574831384115, 'samples': 2889216, 'steps': 15047, 'loss/train': 1.0041577219963074} 01/27/2022 10:00:47 - INFO - codeparrot_training - Step 15048: {'lr': 0.0004142451477077509, 'samples': 2889408, 'steps': 15048, 'loss/train': 0.8390726745128632} 01/27/2022 10:00:50 - INFO - codeparrot_training - Step 15049: {'lr': 0.00041423281157351624, 'samples': 2889600, 'steps': 15049, 'loss/train': 1.0486519932746887} 01/27/2022 10:00:53 - INFO - codeparrot_training - Step 15050: {'lr': 0.00041422047473576033, 'samples': 2889792, 'steps': 15050, 'loss/train': 1.1063047349452972} 01/27/2022 10:00:56 - INFO - codeparrot_training - Step 15051: {'lr': 0.0004142081371945361, 'samples': 2889984, 'steps': 15051, 'loss/train': 0.477162167429924} 01/27/2022 10:01:00 - INFO - codeparrot_training - Step 15052: {'lr': 0.00041419579894989633, 'samples': 2890176, 'steps': 15052, 'loss/train': 0.5751157701015472} 01/27/2022 10:01:03 - INFO - codeparrot_training - Step 15053: {'lr': 0.0004141834600018939, 'samples': 2890368, 'steps': 15053, 'loss/train': 0.8466656506061554} 01/27/2022 10:01:09 - INFO - codeparrot_training - Step 15054: {'lr': 0.00041417112035058157, 'samples': 2890560, 'steps': 15054, 'loss/train': 0.935164600610733} 01/27/2022 10:01:12 - INFO - codeparrot_training - Step 15055: {'lr': 0.00041415877999601236, 'samples': 2890752, 'steps': 15055, 'loss/train': 0.9793701767921448} 01/27/2022 10:01:15 - INFO - codeparrot_training - Step 15056: {'lr': 0.0004141464389382391, 'samples': 2890944, 'steps': 15056, 'loss/train': 0.8182620406150818} 01/27/2022 10:01:18 - INFO - codeparrot_training - Step 15057: {'lr': 0.0004141340971773147, 'samples': 2891136, 'steps': 15057, 'loss/train': 0.9918774962425232} 01/27/2022 10:01:22 - INFO - codeparrot_training - Step 15058: {'lr': 0.00041412175471329174, 'samples': 2891328, 'steps': 15058, 'loss/train': 0.3347039446234703} 01/27/2022 10:01:25 - INFO - codeparrot_training - Step 15059: {'lr': 0.0004141094115462234, 'samples': 2891520, 'steps': 15059, 'loss/train': 0.7202198803424835} 01/27/2022 10:01:28 - INFO - codeparrot_training - Step 15060: {'lr': 0.00041409706767616246, 'samples': 2891712, 'steps': 15060, 'loss/train': 0.8116297423839569} 01/27/2022 10:01:31 - INFO - codeparrot_training - 
Step 15061: {'lr': 0.0004140847231031618, 'samples': 2891904, 'steps': 15061, 'loss/train': 0.2850605249404907} 01/27/2022 10:01:34 - INFO - codeparrot_training - Step 15062: {'lr': 0.00041407237782727427, 'samples': 2892096, 'steps': 15062, 'loss/train': 0.9285447299480438} 01/27/2022 10:01:39 - INFO - codeparrot_training - Step 15063: {'lr': 0.0004140600318485527, 'samples': 2892288, 'steps': 15063, 'loss/train': 1.611127495765686} 01/27/2022 10:01:42 - INFO - codeparrot_training - Step 15064: {'lr': 0.0004140476851670502, 'samples': 2892480, 'steps': 15064, 'loss/train': 0.6915925741195679} 01/27/2022 10:01:45 - INFO - codeparrot_training - Step 15065: {'lr': 0.00041403533778281934, 'samples': 2892672, 'steps': 15065, 'loss/train': 0.3854217231273651} 01/27/2022 10:01:48 - INFO - codeparrot_training - Step 15066: {'lr': 0.0004140229896959132, 'samples': 2892864, 'steps': 15066, 'loss/train': 0.7096794694662094} 01/27/2022 10:01:51 - INFO - codeparrot_training - Step 15067: {'lr': 0.00041401064090638474, 'samples': 2893056, 'steps': 15067, 'loss/train': 0.3744534105062485} 01/27/2022 10:01:54 - INFO - codeparrot_training - Step 15068: {'lr': 0.0004139982914142868, 'samples': 2893248, 'steps': 15068, 'loss/train': 1.241644710302353} 01/27/2022 10:01:58 - INFO - codeparrot_training - Step 15069: {'lr': 0.00041398594121967215, 'samples': 2893440, 'steps': 15069, 'loss/train': 0.9711930155754089} 01/27/2022 10:02:01 - INFO - codeparrot_training - Step 15070: {'lr': 0.0004139735903225939, 'samples': 2893632, 'steps': 15070, 'loss/train': 1.2107165157794952} 01/27/2022 10:02:04 - INFO - codeparrot_training - Step 15071: {'lr': 0.0004139612387231048, 'samples': 2893824, 'steps': 15071, 'loss/train': 0.13207560405135155} 01/27/2022 10:02:08 - INFO - codeparrot_training - Step 15072: {'lr': 0.0004139488864212578, 'samples': 2894016, 'steps': 15072, 'loss/train': 0.9070402085781097} 01/27/2022 10:02:11 - INFO - codeparrot_training - Step 15073: {'lr': 0.0004139365334171059, 'samples': 2894208, 'steps': 15073, 'loss/train': 0.7397717088460922} 01/27/2022 10:02:15 - INFO - codeparrot_training - Step 15074: {'lr': 0.0004139241797107019, 'samples': 2894400, 'steps': 15074, 'loss/train': 0.69002665579319} 01/27/2022 10:02:18 - INFO - codeparrot_training - Step 15075: {'lr': 0.00041391182530209873, 'samples': 2894592, 'steps': 15075, 'loss/train': 0.7817244529724121} 01/27/2022 10:02:21 - INFO - codeparrot_training - Step 15076: {'lr': 0.0004138994701913494, 'samples': 2894784, 'steps': 15076, 'loss/train': 0.8523864448070526} 01/27/2022 10:02:24 - INFO - codeparrot_training - Step 15077: {'lr': 0.00041388711437850676, 'samples': 2894976, 'steps': 15077, 'loss/train': 0.7179430425167084} 01/27/2022 10:02:27 - INFO - codeparrot_training - Step 15078: {'lr': 0.00041387475786362386, 'samples': 2895168, 'steps': 15078, 'loss/train': 0.7193187922239304} 01/27/2022 10:02:30 - INFO - codeparrot_training - Step 15079: {'lr': 0.0004138624006467534, 'samples': 2895360, 'steps': 15079, 'loss/train': 0.8820582032203674} 01/27/2022 10:02:34 - INFO - codeparrot_training - Step 15080: {'lr': 0.00041385004272794846, 'samples': 2895552, 'steps': 15080, 'loss/train': 0.695886567234993} 01/27/2022 10:02:40 - INFO - codeparrot_training - Step 15081: {'lr': 0.00041383768410726207, 'samples': 2895744, 'steps': 15081, 'loss/train': 1.1517982184886932} 01/27/2022 10:02:43 - INFO - codeparrot_training - Step 15082: {'lr': 0.000413825324784747, 'samples': 2895936, 'steps': 15082, 'loss/train': 1.1338875889778137} 01/27/2022 
10:02:46 - INFO - codeparrot_training - Step 15083: {'lr': 0.00041381296476045626, 'samples': 2896128, 'steps': 15083, 'loss/train': 0.8780824542045593} 01/27/2022 10:02:49 - INFO - codeparrot_training - Step 15084: {'lr': 0.0004138006040344428, 'samples': 2896320, 'steps': 15084, 'loss/train': 0.6773297935724258} 01/27/2022 10:02:52 - INFO - codeparrot_training - Step 15085: {'lr': 0.0004137882426067595, 'samples': 2896512, 'steps': 15085, 'loss/train': 0.568618431687355} 01/27/2022 10:02:55 - INFO - codeparrot_training - Step 15086: {'lr': 0.0004137758804774594, 'samples': 2896704, 'steps': 15086, 'loss/train': 1.054057538509369} 01/27/2022 10:02:59 - INFO - codeparrot_training - Step 15087: {'lr': 0.0004137635176465955, 'samples': 2896896, 'steps': 15087, 'loss/train': 1.364462435245514} 01/27/2022 10:03:02 - INFO - codeparrot_training - Step 15088: {'lr': 0.00041375115411422064, 'samples': 2897088, 'steps': 15088, 'loss/train': 1.2102851271629333} 01/27/2022 10:03:07 - INFO - codeparrot_training - Step 15089: {'lr': 0.0004137387898803878, 'samples': 2897280, 'steps': 15089, 'loss/train': 0.9780047535896301} 01/27/2022 10:03:10 - INFO - codeparrot_training - Step 15090: {'lr': 0.0004137264249451501, 'samples': 2897472, 'steps': 15090, 'loss/train': 0.7758292257785797} 01/27/2022 10:03:13 - INFO - codeparrot_training - Step 15091: {'lr': 0.00041371405930856026, 'samples': 2897664, 'steps': 15091, 'loss/train': 0.9431616961956024} 01/27/2022 10:03:16 - INFO - codeparrot_training - Step 15092: {'lr': 0.00041370169297067145, 'samples': 2897856, 'steps': 15092, 'loss/train': 1.7366500496864319} 01/27/2022 10:03:19 - INFO - codeparrot_training - Step 15093: {'lr': 0.0004136893259315365, 'samples': 2898048, 'steps': 15093, 'loss/train': 0.8432925045490265} 01/27/2022 10:03:22 - INFO - codeparrot_training - Step 15094: {'lr': 0.00041367695819120854, 'samples': 2898240, 'steps': 15094, 'loss/train': 0.6958901435136795} 01/27/2022 10:03:26 - INFO - codeparrot_training - Step 15095: {'lr': 0.0004136645897497404, 'samples': 2898432, 'steps': 15095, 'loss/train': 0.5880395472049713} 01/27/2022 10:03:29 - INFO - codeparrot_training - Step 15096: {'lr': 0.0004136522206071852, 'samples': 2898624, 'steps': 15096, 'loss/train': 1.138904482126236} 01/27/2022 10:03:32 - INFO - codeparrot_training - Step 15097: {'lr': 0.0004136398507635958, 'samples': 2898816, 'steps': 15097, 'loss/train': 0.9278659522533417} 01/27/2022 10:03:35 - INFO - codeparrot_training - Step 15098: {'lr': 0.00041362748021902526, 'samples': 2899008, 'steps': 15098, 'loss/train': 0.12431562691926956} 01/27/2022 10:03:41 - INFO - codeparrot_training - Step 15099: {'lr': 0.0004136151089735265, 'samples': 2899200, 'steps': 15099, 'loss/train': 1.3303684294223785} 01/27/2022 10:03:44 - INFO - codeparrot_training - Step 15100: {'lr': 0.00041360273702715263, 'samples': 2899392, 'steps': 15100, 'loss/train': 0.8454715311527252} 01/27/2022 10:03:48 - INFO - codeparrot_training - Step 15101: {'lr': 0.0004135903643799566, 'samples': 2899584, 'steps': 15101, 'loss/train': 1.3062570691108704} 01/27/2022 10:03:51 - INFO - codeparrot_training - Step 15102: {'lr': 0.00041357799103199127, 'samples': 2899776, 'steps': 15102, 'loss/train': 0.3033187910914421} 01/27/2022 10:03:54 - INFO - codeparrot_training - Step 15103: {'lr': 0.00041356561698330984, 'samples': 2899968, 'steps': 15103, 'loss/train': 0.6348081529140472} 01/27/2022 10:03:57 - INFO - codeparrot_training - Step 15104: {'lr': 0.0004135532422339653, 'samples': 2900160, 'steps': 15104, 
'loss/train': 0.9333213865756989} 01/27/2022 10:04:00 - INFO - codeparrot_training - Step 15105: {'lr': 0.00041354086678401056, 'samples': 2900352, 'steps': 15105, 'loss/train': 1.0410080552101135} 01/27/2022 10:04:03 - INFO - codeparrot_training - Step 15106: {'lr': 0.00041352849063349865, 'samples': 2900544, 'steps': 15106, 'loss/train': 0.6982188820838928} 01/27/2022 10:04:06 - INFO - codeparrot_training - Step 15107: {'lr': 0.0004135161137824827, 'samples': 2900736, 'steps': 15107, 'loss/train': 0.6585945636034012} 01/27/2022 10:04:11 - INFO - codeparrot_training - Step 15108: {'lr': 0.0004135037362310155, 'samples': 2900928, 'steps': 15108, 'loss/train': 0.29781322181224823} 01/27/2022 10:04:14 - INFO - codeparrot_training - Step 15109: {'lr': 0.0004134913579791503, 'samples': 2901120, 'steps': 15109, 'loss/train': 0.4704756885766983} 01/27/2022 10:04:17 - INFO - codeparrot_training - Step 15110: {'lr': 0.00041347897902694003, 'samples': 2901312, 'steps': 15110, 'loss/train': 0.14110201969742775} 01/27/2022 10:04:20 - INFO - codeparrot_training - Step 15111: {'lr': 0.00041346659937443775, 'samples': 2901504, 'steps': 15111, 'loss/train': 0.6410219371318817} 01/27/2022 10:04:23 - INFO - codeparrot_training - Step 15112: {'lr': 0.00041345421902169645, 'samples': 2901696, 'steps': 15112, 'loss/train': 0.9590623676776886} 01/27/2022 10:04:26 - INFO - codeparrot_training - Step 15113: {'lr': 0.0004134418379687691, 'samples': 2901888, 'steps': 15113, 'loss/train': 0.3362956792116165} 01/27/2022 10:04:30 - INFO - codeparrot_training - Step 15114: {'lr': 0.0004134294562157089, 'samples': 2902080, 'steps': 15114, 'loss/train': 0.8609730005264282} 01/27/2022 10:04:33 - INFO - codeparrot_training - Step 15115: {'lr': 0.00041341707376256877, 'samples': 2902272, 'steps': 15115, 'loss/train': 1.0970311760902405} 01/27/2022 10:04:37 - INFO - codeparrot_training - Step 15116: {'lr': 0.00041340469060940183, 'samples': 2902464, 'steps': 15116, 'loss/train': 0.9373852014541626} 01/27/2022 10:04:40 - INFO - codeparrot_training - Step 15117: {'lr': 0.0004133923067562611, 'samples': 2902656, 'steps': 15117, 'loss/train': 0.29573075473308563} 01/27/2022 10:04:44 - INFO - codeparrot_training - Step 15118: {'lr': 0.0004133799222031995, 'samples': 2902848, 'steps': 15118, 'loss/train': 0.5302037000656128} 01/27/2022 10:04:47 - INFO - codeparrot_training - Step 15119: {'lr': 0.0004133675369502703, 'samples': 2903040, 'steps': 15119, 'loss/train': 0.5095786303281784} 01/27/2022 10:04:50 - INFO - codeparrot_training - Step 15120: {'lr': 0.0004133551509975264, 'samples': 2903232, 'steps': 15120, 'loss/train': 0.9138329029083252} 01/27/2022 10:04:53 - INFO - codeparrot_training - Step 15121: {'lr': 0.0004133427643450209, 'samples': 2903424, 'steps': 15121, 'loss/train': 1.2112557291984558} 01/27/2022 10:04:56 - INFO - codeparrot_training - Step 15122: {'lr': 0.0004133303769928068, 'samples': 2903616, 'steps': 15122, 'loss/train': 1.0615377724170685} 01/27/2022 10:04:59 - INFO - codeparrot_training - Step 15123: {'lr': 0.00041331798894093735, 'samples': 2903808, 'steps': 15123, 'loss/train': 0.45769791305065155} 01/27/2022 10:05:02 - INFO - codeparrot_training - Step 15124: {'lr': 0.0004133056001894655, 'samples': 2904000, 'steps': 15124, 'loss/train': 0.7500764429569244} 01/27/2022 10:05:07 - INFO - codeparrot_training - Step 15125: {'lr': 0.0004132932107384442, 'samples': 2904192, 'steps': 15125, 'loss/train': 0.7160933464765549} 01/27/2022 10:05:10 - INFO - codeparrot_training - Step 15126: {'lr': 
0.0004132808205879267, 'samples': 2904384, 'steps': 15126, 'loss/train': 0.1793643683195114} 01/27/2022 10:05:13 - INFO - codeparrot_training - Step 15127: {'lr': 0.000413268429737966, 'samples': 2904576, 'steps': 15127, 'loss/train': 0.4053032398223877} 01/27/2022 10:05:16 - INFO - codeparrot_training - Step 15128: {'lr': 0.00041325603818861517, 'samples': 2904768, 'steps': 15128, 'loss/train': 0.7500234246253967} 01/27/2022 10:05:20 - INFO - codeparrot_training - Step 15129: {'lr': 0.00041324364593992735, 'samples': 2904960, 'steps': 15129, 'loss/train': 1.0157876908779144} 01/27/2022 10:05:23 - INFO - codeparrot_training - Step 15130: {'lr': 0.00041323125299195563, 'samples': 2905152, 'steps': 15130, 'loss/train': 0.9396091997623444} 01/27/2022 10:05:26 - INFO - codeparrot_training - Step 15131: {'lr': 0.000413218859344753, 'samples': 2905344, 'steps': 15131, 'loss/train': 0.9353137314319611} 01/27/2022 10:05:29 - INFO - codeparrot_training - Step 15132: {'lr': 0.00041320646499837254, 'samples': 2905536, 'steps': 15132, 'loss/train': 1.380929410457611} 01/27/2022 10:05:32 - INFO - codeparrot_training - Step 15133: {'lr': 0.00041319406995286753, 'samples': 2905728, 'steps': 15133, 'loss/train': 0.6920620501041412} 01/27/2022 10:05:38 - INFO - codeparrot_training - Step 15134: {'lr': 0.0004131816742082909, 'samples': 2905920, 'steps': 15134, 'loss/train': 1.0013879835605621} 01/27/2022 10:05:42 - INFO - codeparrot_training - Step 15135: {'lr': 0.00041316927776469575, 'samples': 2906112, 'steps': 15135, 'loss/train': 0.948353111743927} 01/27/2022 10:05:45 - INFO - codeparrot_training - Step 15136: {'lr': 0.00041315688062213524, 'samples': 2906304, 'steps': 15136, 'loss/train': 0.7306952029466629} 01/27/2022 10:05:48 - INFO - codeparrot_training - Step 15137: {'lr': 0.0004131444827806625, 'samples': 2906496, 'steps': 15137, 'loss/train': 0.7089412361383438} 01/27/2022 10:05:51 - INFO - codeparrot_training - Step 15138: {'lr': 0.00041313208424033056, 'samples': 2906688, 'steps': 15138, 'loss/train': 0.522646889090538} 01/27/2022 10:05:54 - INFO - codeparrot_training - Step 15139: {'lr': 0.0004131196850011926, 'samples': 2906880, 'steps': 15139, 'loss/train': 0.9984056353569031} 01/27/2022 10:05:57 - INFO - codeparrot_training - Step 15140: {'lr': 0.0004131072850633017, 'samples': 2907072, 'steps': 15140, 'loss/train': 0.9814590811729431} 01/27/2022 10:06:00 - INFO - codeparrot_training - Step 15141: {'lr': 0.00041309488442671093, 'samples': 2907264, 'steps': 15141, 'loss/train': 0.6543077677488327} 01/27/2022 10:06:04 - INFO - codeparrot_training - Step 15142: {'lr': 0.00041308248309147356, 'samples': 2907456, 'steps': 15142, 'loss/train': 0.7398658096790314} 01/27/2022 10:06:08 - INFO - codeparrot_training - Step 15143: {'lr': 0.00041307008105764256, 'samples': 2907648, 'steps': 15143, 'loss/train': 0.8784997165203094} 01/27/2022 10:06:11 - INFO - codeparrot_training - Step 15144: {'lr': 0.0004130576783252712, 'samples': 2907840, 'steps': 15144, 'loss/train': 0.7648639976978302} 01/27/2022 10:06:14 - INFO - codeparrot_training - Step 15145: {'lr': 0.00041304527489441237, 'samples': 2908032, 'steps': 15145, 'loss/train': 0.8827351927757263} 01/27/2022 10:06:18 - INFO - codeparrot_training - Step 15146: {'lr': 0.0004130328707651195, 'samples': 2908224, 'steps': 15146, 'loss/train': 0.8313217163085938} 01/27/2022 10:06:21 - INFO - codeparrot_training - Step 15147: {'lr': 0.00041302046593744547, 'samples': 2908416, 'steps': 15147, 'loss/train': 1.0079635977745056} 01/27/2022 10:06:24 - INFO - 
codeparrot_training - Step 15148: {'lr': 0.00041300806041144356, 'samples': 2908608, 'steps': 15148, 'loss/train': 0.8122485280036926} 01/27/2022 10:06:27 - INFO - codeparrot_training - Step 15149: {'lr': 0.0004129956541871669, 'samples': 2908800, 'steps': 15149, 'loss/train': 0.8511426150798798} 01/27/2022 10:06:30 - INFO - codeparrot_training - Step 15150: {'lr': 0.00041298324726466855, 'samples': 2908992, 'steps': 15150, 'loss/train': 1.2348343133926392} 01/27/2022 10:06:33 - INFO - codeparrot_training - Step 15151: {'lr': 0.0004129708396440018, 'samples': 2909184, 'steps': 15151, 'loss/train': 0.4877796918153763} 01/27/2022 10:06:38 - INFO - codeparrot_training - Step 15152: {'lr': 0.00041295843132521973, 'samples': 2909376, 'steps': 15152, 'loss/train': 0.6658680886030197} 01/27/2022 10:06:41 - INFO - codeparrot_training - Step 15153: {'lr': 0.0004129460223083754, 'samples': 2909568, 'steps': 15153, 'loss/train': 0.5225119292736053} 01/27/2022 10:06:44 - INFO - codeparrot_training - Step 15154: {'lr': 0.0004129336125935221, 'samples': 2909760, 'steps': 15154, 'loss/train': 0.78013014793396} 01/27/2022 10:06:47 - INFO - codeparrot_training - Step 15155: {'lr': 0.000412921202180713, 'samples': 2909952, 'steps': 15155, 'loss/train': 0.5048753321170807} 01/27/2022 10:06:50 - INFO - codeparrot_training - Step 15156: {'lr': 0.00041290879107000114, 'samples': 2910144, 'steps': 15156, 'loss/train': 0.9795485436916351} 01/27/2022 10:06:53 - INFO - codeparrot_training - Step 15157: {'lr': 0.00041289637926143974, 'samples': 2910336, 'steps': 15157, 'loss/train': 0.7698147296905518} 01/27/2022 10:06:57 - INFO - codeparrot_training - Step 15158: {'lr': 0.000412883966755082, 'samples': 2910528, 'steps': 15158, 'loss/train': 0.6363487243652344} 01/27/2022 10:07:00 - INFO - codeparrot_training - Step 15159: {'lr': 0.000412871553550981, 'samples': 2910720, 'steps': 15159, 'loss/train': 1.1220355331897736} 01/27/2022 10:07:03 - INFO - codeparrot_training - Step 15160: {'lr': 0.00041285913964919006, 'samples': 2910912, 'steps': 15160, 'loss/train': 0.5562449544668198} 01/27/2022 10:07:09 - INFO - codeparrot_training - Step 15161: {'lr': 0.0004128467250497623, 'samples': 2911104, 'steps': 15161, 'loss/train': 1.2295538485050201} 01/27/2022 10:07:12 - INFO - codeparrot_training - Step 15162: {'lr': 0.00041283430975275085, 'samples': 2911296, 'steps': 15162, 'loss/train': 0.7895789444446564} 01/27/2022 10:07:15 - INFO - codeparrot_training - Step 15163: {'lr': 0.0004128218937582089, 'samples': 2911488, 'steps': 15163, 'loss/train': 0.8025662004947662} 01/27/2022 10:07:18 - INFO - codeparrot_training - Step 15164: {'lr': 0.00041280947706618965, 'samples': 2911680, 'steps': 15164, 'loss/train': 0.8695378303527832} 01/27/2022 10:07:22 - INFO - codeparrot_training - Step 15165: {'lr': 0.00041279705967674636, 'samples': 2911872, 'steps': 15165, 'loss/train': 0.7063073068857193} 01/27/2022 10:07:25 - INFO - codeparrot_training - Step 15166: {'lr': 0.00041278464158993214, 'samples': 2912064, 'steps': 15166, 'loss/train': 0.5916159152984619} 01/27/2022 10:07:28 - INFO - codeparrot_training - Step 15167: {'lr': 0.0004127722228058002, 'samples': 2912256, 'steps': 15167, 'loss/train': 0.1484822742640972} 01/27/2022 10:07:31 - INFO - codeparrot_training - Step 15168: {'lr': 0.0004127598033244037, 'samples': 2912448, 'steps': 15168, 'loss/train': 0.8348661661148071} 01/27/2022 10:07:36 - INFO - codeparrot_training - Step 15169: {'lr': 0.0004127473831457959, 'samples': 2912640, 'steps': 15169, 'loss/train': 
0.6404544711112976} 01/27/2022 10:07:39 - INFO - codeparrot_training - Step 15170: {'lr': 0.00041273496227003004, 'samples': 2912832, 'steps': 15170, 'loss/train': 0.42868445813655853} 01/27/2022 10:07:43 - INFO - codeparrot_training - Step 15171: {'lr': 0.0004127225406971592, 'samples': 2913024, 'steps': 15171, 'loss/train': 0.03490063827484846} 01/27/2022 10:07:46 - INFO - codeparrot_training - Step 15172: {'lr': 0.00041271011842723676, 'samples': 2913216, 'steps': 15172, 'loss/train': 0.6420041620731354} 01/27/2022 10:07:49 - INFO - codeparrot_training - Step 15173: {'lr': 0.00041269769546031576, 'samples': 2913408, 'steps': 15173, 'loss/train': 0.8425111770629883} 01/27/2022 10:07:52 - INFO - codeparrot_training - Step 15174: {'lr': 0.0004126852717964495, 'samples': 2913600, 'steps': 15174, 'loss/train': 1.9573153853416443} 01/27/2022 10:07:55 - INFO - codeparrot_training - Step 15175: {'lr': 0.0004126728474356912, 'samples': 2913792, 'steps': 15175, 'loss/train': 0.7955903112888336} 01/27/2022 10:07:58 - INFO - codeparrot_training - Step 15176: {'lr': 0.0004126604223780941, 'samples': 2913984, 'steps': 15176, 'loss/train': 1.7590491771697998} 01/27/2022 10:08:01 - INFO - codeparrot_training - Step 15177: {'lr': 0.00041264799662371144, 'samples': 2914176, 'steps': 15177, 'loss/train': 0.7702573835849762} 01/27/2022 10:08:05 - INFO - codeparrot_training - Step 15178: {'lr': 0.0004126355701725963, 'samples': 2914368, 'steps': 15178, 'loss/train': 1.3199823796749115} 01/27/2022 10:08:11 - INFO - codeparrot_training - Step 15179: {'lr': 0.00041262314302480216, 'samples': 2914560, 'steps': 15179, 'loss/train': 0.39415620267391205} 01/27/2022 10:08:14 - INFO - codeparrot_training - Step 15180: {'lr': 0.000412610715180382, 'samples': 2914752, 'steps': 15180, 'loss/train': 0.6454146355390549} 01/27/2022 10:08:17 - INFO - codeparrot_training - Step 15181: {'lr': 0.0004125982866393892, 'samples': 2914944, 'steps': 15181, 'loss/train': 0.9572467803955078} 01/27/2022 10:08:20 - INFO - codeparrot_training - Step 15182: {'lr': 0.0004125858574018769, 'samples': 2915136, 'steps': 15182, 'loss/train': 0.5039267241954803} 01/27/2022 10:08:23 - INFO - codeparrot_training - Step 15183: {'lr': 0.0004125734274678986, 'samples': 2915328, 'steps': 15183, 'loss/train': 0.20860301703214645} 01/27/2022 10:08:26 - INFO - codeparrot_training - Step 15184: {'lr': 0.0004125609968375072, 'samples': 2915520, 'steps': 15184, 'loss/train': 1.0032720565795898} 01/27/2022 10:08:30 - INFO - codeparrot_training - Step 15185: {'lr': 0.00041254856551075616, 'samples': 2915712, 'steps': 15185, 'loss/train': 0.7500565946102142} 01/27/2022 10:08:33 - INFO - codeparrot_training - Step 15186: {'lr': 0.0004125361334876987, 'samples': 2915904, 'steps': 15186, 'loss/train': 0.8125545680522919} 01/27/2022 10:08:36 - INFO - codeparrot_training - Step 15187: {'lr': 0.000412523700768388, 'samples': 2916096, 'steps': 15187, 'loss/train': 0.8894788026809692} 01/27/2022 10:08:40 - INFO - codeparrot_training - Step 15188: {'lr': 0.0004125112673528775, 'samples': 2916288, 'steps': 15188, 'loss/train': 1.149211585521698} 01/27/2022 10:08:43 - INFO - codeparrot_training - Step 15189: {'lr': 0.0004124988332412202, 'samples': 2916480, 'steps': 15189, 'loss/train': 1.0949885845184326} 01/27/2022 10:08:46 - INFO - codeparrot_training - Step 15190: {'lr': 0.00041248639843346953, 'samples': 2916672, 'steps': 15190, 'loss/train': 0.6252223402261734} 01/27/2022 10:08:50 - INFO - codeparrot_training - Step 15191: {'lr': 0.0004124739629296787, 'samples': 
2916864, 'steps': 15191, 'loss/train': 1.060896635055542} 01/27/2022 10:08:53 - INFO - codeparrot_training - Step 15192: {'lr': 0.00041246152672990105, 'samples': 2917056, 'steps': 15192, 'loss/train': 0.7346066236495972} 01/27/2022 10:08:56 - INFO - codeparrot_training - Step 15193: {'lr': 0.00041244908983418985, 'samples': 2917248, 'steps': 15193, 'loss/train': 0.8659083545207977} 01/27/2022 10:08:59 - INFO - codeparrot_training - Step 15194: {'lr': 0.0004124366522425982, 'samples': 2917440, 'steps': 15194, 'loss/train': 1.2697033882141113} 01/27/2022 10:09:02 - INFO - codeparrot_training - Step 15195: {'lr': 0.0004124242139551796, 'samples': 2917632, 'steps': 15195, 'loss/train': 1.1108210384845734} 01/27/2022 10:09:05 - INFO - codeparrot_training - Step 15196: {'lr': 0.00041241177497198725, 'samples': 2917824, 'steps': 15196, 'loss/train': 0.9862688183784485} 01/27/2022 10:09:11 - INFO - codeparrot_training - Step 15197: {'lr': 0.00041239933529307437, 'samples': 2918016, 'steps': 15197, 'loss/train': 0.6750245690345764} 01/27/2022 10:09:14 - INFO - codeparrot_training - Step 15198: {'lr': 0.00041238689491849434, 'samples': 2918208, 'steps': 15198, 'loss/train': 1.0515141785144806} 01/27/2022 10:09:18 - INFO - codeparrot_training - Step 15199: {'lr': 0.00041237445384830043, 'samples': 2918400, 'steps': 15199, 'loss/train': 1.4679451882839203} 01/27/2022 10:09:21 - INFO - codeparrot_training - Step 15200: {'lr': 0.0004123620120825459, 'samples': 2918592, 'steps': 15200, 'loss/train': 1.0032333433628082} 01/27/2022 10:09:24 - INFO - codeparrot_training - Step 15201: {'lr': 0.0004123495696212841, 'samples': 2918784, 'steps': 15201, 'loss/train': 0.8444022238254547} 01/27/2022 10:09:27 - INFO - codeparrot_training - Step 15202: {'lr': 0.00041233712646456823, 'samples': 2918976, 'steps': 15202, 'loss/train': 0.5353946685791016} 01/27/2022 10:09:30 - INFO - codeparrot_training - Step 15203: {'lr': 0.0004123246826124517, 'samples': 2919168, 'steps': 15203, 'loss/train': 0.5479595214128494} 01/27/2022 10:09:33 - INFO - codeparrot_training - Step 15204: {'lr': 0.00041231223806498777, 'samples': 2919360, 'steps': 15204, 'loss/train': 0.7386463433504105} 01/27/2022 10:09:38 - INFO - codeparrot_training - Step 15205: {'lr': 0.0004122997928222298, 'samples': 2919552, 'steps': 15205, 'loss/train': 1.018709510564804} 01/27/2022 10:09:41 - INFO - codeparrot_training - Step 15206: {'lr': 0.000412287346884231, 'samples': 2919744, 'steps': 15206, 'loss/train': 1.233458787202835} 01/27/2022 10:09:44 - INFO - codeparrot_training - Step 15207: {'lr': 0.00041227490025104474, 'samples': 2919936, 'steps': 15207, 'loss/train': 0.904532790184021} 01/27/2022 10:09:47 - INFO - codeparrot_training - Step 15208: {'lr': 0.00041226245292272433, 'samples': 2920128, 'steps': 15208, 'loss/train': 0.3402152583003044} 01/27/2022 10:09:50 - INFO - codeparrot_training - Step 15209: {'lr': 0.00041225000489932315, 'samples': 2920320, 'steps': 15209, 'loss/train': 0.8034434616565704} 01/27/2022 10:09:53 - INFO - codeparrot_training - Step 15210: {'lr': 0.00041223755618089445, 'samples': 2920512, 'steps': 15210, 'loss/train': 1.1959684789180756} 01/27/2022 10:09:56 - INFO - codeparrot_training - Step 15211: {'lr': 0.0004122251067674915, 'samples': 2920704, 'steps': 15211, 'loss/train': 0.8464322984218597} 01/27/2022 10:10:00 - INFO - codeparrot_training - Step 15212: {'lr': 0.00041221265665916776, 'samples': 2920896, 'steps': 15212, 'loss/train': 0.2642107307910919} 01/27/2022 10:10:03 - INFO - codeparrot_training - Step 15213: 
{'lr': 0.0004122002058559765, 'samples': 2921088, 'steps': 15213, 'loss/train': 0.8393420577049255} 01/27/2022 10:10:07 - INFO - codeparrot_training - Step 15214: {'lr': 0.00041218775435797106, 'samples': 2921280, 'steps': 15214, 'loss/train': 1.1217803657054901} 01/27/2022 10:10:11 - INFO - codeparrot_training - Step 15215: {'lr': 0.0004121753021652048, 'samples': 2921472, 'steps': 15215, 'loss/train': 0.7977834641933441} 01/27/2022 10:10:14 - INFO - codeparrot_training - Step 15216: {'lr': 0.0004121628492777311, 'samples': 2921664, 'steps': 15216, 'loss/train': 0.8108708560466766} 01/27/2022 10:10:17 - INFO - codeparrot_training - Step 15217: {'lr': 0.0004121503956956031, 'samples': 2921856, 'steps': 15217, 'loss/train': 0.8680626153945923} 01/27/2022 10:10:20 - INFO - codeparrot_training - Step 15218: {'lr': 0.0004121379414188744, 'samples': 2922048, 'steps': 15218, 'loss/train': 0.6437699943780899} 01/27/2022 10:10:23 - INFO - codeparrot_training - Step 15219: {'lr': 0.0004121254864475982, 'samples': 2922240, 'steps': 15219, 'loss/train': 0.0846392959356308} 01/27/2022 10:10:26 - INFO - codeparrot_training - Step 15220: {'lr': 0.0004121130307818279, 'samples': 2922432, 'steps': 15220, 'loss/train': 0.8900691568851471} 01/27/2022 10:10:29 - INFO - codeparrot_training - Step 15221: {'lr': 0.00041210057442161687, 'samples': 2922624, 'steps': 15221, 'loss/train': 1.0904810428619385} 01/27/2022 10:10:33 - INFO - codeparrot_training - Step 15222: {'lr': 0.0004120881173670184, 'samples': 2922816, 'steps': 15222, 'loss/train': 0.8718493580818176} 01/27/2022 10:10:38 - INFO - codeparrot_training - Step 15223: {'lr': 0.000412075659618086, 'samples': 2923008, 'steps': 15223, 'loss/train': 0.9674663543701172} 01/27/2022 10:10:41 - INFO - codeparrot_training - Step 15224: {'lr': 0.0004120632011748728, 'samples': 2923200, 'steps': 15224, 'loss/train': 0.3632773831486702} 01/27/2022 10:10:44 - INFO - codeparrot_training - Step 15225: {'lr': 0.00041205074203743244, 'samples': 2923392, 'steps': 15225, 'loss/train': 0.8264859616756439} 01/27/2022 10:10:47 - INFO - codeparrot_training - Step 15226: {'lr': 0.00041203828220581805, 'samples': 2923584, 'steps': 15226, 'loss/train': 0.7806507647037506} 01/27/2022 10:10:50 - INFO - codeparrot_training - Step 15227: {'lr': 0.00041202582168008324, 'samples': 2923776, 'steps': 15227, 'loss/train': 0.6771206259727478} 01/27/2022 10:10:53 - INFO - codeparrot_training - Step 15228: {'lr': 0.00041201336046028117, 'samples': 2923968, 'steps': 15228, 'loss/train': 0.7714901268482208} 01/27/2022 10:10:56 - INFO - codeparrot_training - Step 15229: {'lr': 0.0004120008985464654, 'samples': 2924160, 'steps': 15229, 'loss/train': 0.831468254327774} 01/27/2022 10:11:00 - INFO - codeparrot_training - Step 15230: {'lr': 0.0004119884359386891, 'samples': 2924352, 'steps': 15230, 'loss/train': 0.4272100031375885} 01/27/2022 10:11:03 - INFO - codeparrot_training - Step 15231: {'lr': 0.0004119759726370058, 'samples': 2924544, 'steps': 15231, 'loss/train': 0.5351268500089645} 01/27/2022 10:11:07 - INFO - codeparrot_training - Step 15232: {'lr': 0.0004119635086414689, 'samples': 2924736, 'steps': 15232, 'loss/train': 0.843328446149826} 01/27/2022 10:11:11 - INFO - codeparrot_training - Step 15233: {'lr': 0.0004119510439521318, 'samples': 2924928, 'steps': 15233, 'loss/train': 0.7913035154342651} 01/27/2022 10:11:14 - INFO - codeparrot_training - Step 15234: {'lr': 0.0004119385785690478, 'samples': 2925120, 'steps': 15234, 'loss/train': 0.8355817794799805} 01/27/2022 10:11:17 - INFO - 
codeparrot_training - Step 15235: {'lr': 0.0004119261124922703, 'samples': 2925312, 'steps': 15235, 'loss/train': 1.0629558563232422} 01/27/2022 10:11:20 - INFO - codeparrot_training - Step 15236: {'lr': 0.00041191364572185286, 'samples': 2925504, 'steps': 15236, 'loss/train': 0.8704325258731842} 01/27/2022 10:11:23 - INFO - codeparrot_training - Step 15237: {'lr': 0.0004119011782578487, 'samples': 2925696, 'steps': 15237, 'loss/train': 0.3841683268547058} 01/27/2022 10:11:26 - INFO - codeparrot_training - Step 15238: {'lr': 0.00041188871010031135, 'samples': 2925888, 'steps': 15238, 'loss/train': 0.9356585741043091} 01/27/2022 10:11:30 - INFO - codeparrot_training - Step 15239: {'lr': 0.0004118762412492941, 'samples': 2926080, 'steps': 15239, 'loss/train': 0.7363952547311783} 01/27/2022 10:11:33 - INFO - codeparrot_training - Step 15240: {'lr': 0.00041186377170485057, 'samples': 2926272, 'steps': 15240, 'loss/train': 0.823079913854599} 01/27/2022 10:11:37 - INFO - codeparrot_training - Step 15241: {'lr': 0.00041185130146703387, 'samples': 2926464, 'steps': 15241, 'loss/train': 0.94053053855896} 01/27/2022 10:11:40 - INFO - codeparrot_training - Step 15242: {'lr': 0.0004118388305358977, 'samples': 2926656, 'steps': 15242, 'loss/train': 0.7145417332649231} 01/27/2022 10:11:43 - INFO - codeparrot_training - Step 15243: {'lr': 0.0004118263589114953, 'samples': 2926848, 'steps': 15243, 'loss/train': 0.7779580950737} 01/27/2022 10:11:47 - INFO - codeparrot_training - Step 15244: {'lr': 0.00041181388659388026, 'samples': 2927040, 'steps': 15244, 'loss/train': 1.0018833875656128} 01/27/2022 10:11:50 - INFO - codeparrot_training - Step 15245: {'lr': 0.00041180141358310586, 'samples': 2927232, 'steps': 15245, 'loss/train': 0.9527736604213715} 01/27/2022 10:11:53 - INFO - codeparrot_training - Step 15246: {'lr': 0.00041178893987922556, 'samples': 2927424, 'steps': 15246, 'loss/train': 0.579461395740509} 01/27/2022 10:11:56 - INFO - codeparrot_training - Step 15247: {'lr': 0.0004117764654822929, 'samples': 2927616, 'steps': 15247, 'loss/train': 0.7173086553812027} 01/27/2022 10:11:59 - INFO - codeparrot_training - Step 15248: {'lr': 0.0004117639903923611, 'samples': 2927808, 'steps': 15248, 'loss/train': 0.7694372534751892} 01/27/2022 10:12:04 - INFO - codeparrot_training - Step 15249: {'lr': 0.0004117515146094838, 'samples': 2928000, 'steps': 15249, 'loss/train': 0.533294141292572} 01/27/2022 10:12:08 - INFO - codeparrot_training - Step 15250: {'lr': 0.0004117390381337144, 'samples': 2928192, 'steps': 15250, 'loss/train': 0.7289541810750961} 01/27/2022 10:12:11 - INFO - codeparrot_training - Step 15251: {'lr': 0.00041172656096510624, 'samples': 2928384, 'steps': 15251, 'loss/train': 0.7778588533401489} 01/27/2022 10:12:14 - INFO - codeparrot_training - Step 15252: {'lr': 0.0004117140831037129, 'samples': 2928576, 'steps': 15252, 'loss/train': 0.36101067066192627} 01/27/2022 10:12:17 - INFO - codeparrot_training - Step 15253: {'lr': 0.00041170160454958785, 'samples': 2928768, 'steps': 15253, 'loss/train': 0.9255006909370422} 01/27/2022 10:12:20 - INFO - codeparrot_training - Step 15254: {'lr': 0.00041168912530278434, 'samples': 2928960, 'steps': 15254, 'loss/train': 0.7794992029666901} 01/27/2022 10:12:23 - INFO - codeparrot_training - Step 15255: {'lr': 0.00041167664536335605, 'samples': 2929152, 'steps': 15255, 'loss/train': 0.8711095154285431} 01/27/2022 10:12:26 - INFO - codeparrot_training - Step 15256: {'lr': 0.0004116641647313563, 'samples': 2929344, 'steps': 15256, 'loss/train': 
0.9579510390758514} 01/27/2022 10:12:30 - INFO - codeparrot_training - Step 15257: {'lr': 0.00041165168340683857, 'samples': 2929536, 'steps': 15257, 'loss/train': 1.2563281059265137} 01/27/2022 10:12:34 - INFO - codeparrot_training - Step 15258: {'lr': 0.0004116392013898564, 'samples': 2929728, 'steps': 15258, 'loss/train': 1.565624177455902} 01/27/2022 10:12:37 - INFO - codeparrot_training - Step 15259: {'lr': 0.0004116267186804632, 'samples': 2929920, 'steps': 15259, 'loss/train': 1.0207874178886414} 01/27/2022 10:12:40 - INFO - codeparrot_training - Step 15260: {'lr': 0.0004116142352787125, 'samples': 2930112, 'steps': 15260, 'loss/train': 0.8236487209796906} 01/27/2022 10:12:43 - INFO - codeparrot_training - Step 15261: {'lr': 0.0004116017511846577, 'samples': 2930304, 'steps': 15261, 'loss/train': 0.8386998474597931} 01/27/2022 10:12:47 - INFO - codeparrot_training - Step 15262: {'lr': 0.00041158926639835234, 'samples': 2930496, 'steps': 15262, 'loss/train': 0.7969809472560883} 01/27/2022 10:12:50 - INFO - codeparrot_training - Step 15263: {'lr': 0.00041157678091984987, 'samples': 2930688, 'steps': 15263, 'loss/train': 0.9494924247264862} 01/27/2022 10:12:53 - INFO - codeparrot_training - Step 15264: {'lr': 0.0004115642947492038, 'samples': 2930880, 'steps': 15264, 'loss/train': 1.0654816925525665} 01/27/2022 10:12:56 - INFO - codeparrot_training - Step 15265: {'lr': 0.0004115518078864675, 'samples': 2931072, 'steps': 15265, 'loss/train': 0.4887765794992447} 01/27/2022 10:12:59 - INFO - codeparrot_training - Step 15266: {'lr': 0.0004115393203316946, 'samples': 2931264, 'steps': 15266, 'loss/train': 0.906529426574707} 01/27/2022 10:13:04 - INFO - codeparrot_training - Step 15267: {'lr': 0.00041152683208493855, 'samples': 2931456, 'steps': 15267, 'loss/train': 0.2713078334927559} 01/27/2022 10:13:07 - INFO - codeparrot_training - Step 15268: {'lr': 0.0004115143431462529, 'samples': 2931648, 'steps': 15268, 'loss/train': 1.2759371101856232} 01/27/2022 10:13:11 - INFO - codeparrot_training - Step 15269: {'lr': 0.000411501853515691, 'samples': 2931840, 'steps': 15269, 'loss/train': 0.742437869310379} 01/27/2022 10:13:14 - INFO - codeparrot_training - Step 15270: {'lr': 0.00041148936319330656, 'samples': 2932032, 'steps': 15270, 'loss/train': 0.757684975862503} 01/27/2022 10:13:17 - INFO - codeparrot_training - Step 15271: {'lr': 0.0004114768721791529, 'samples': 2932224, 'steps': 15271, 'loss/train': 0.9529096484184265} 01/27/2022 10:13:20 - INFO - codeparrot_training - Step 15272: {'lr': 0.00041146438047328347, 'samples': 2932416, 'steps': 15272, 'loss/train': 0.4093526601791382} 01/27/2022 10:13:23 - INFO - codeparrot_training - Step 15273: {'lr': 0.00041145188807575206, 'samples': 2932608, 'steps': 15273, 'loss/train': 0.8265180587768555} 01/27/2022 10:13:26 - INFO - codeparrot_training - Step 15274: {'lr': 0.000411439394986612, 'samples': 2932800, 'steps': 15274, 'loss/train': 0.7537222802639008} 01/27/2022 10:13:29 - INFO - codeparrot_training - Step 15275: {'lr': 0.00041142690120591686, 'samples': 2932992, 'steps': 15275, 'loss/train': 0.7074666023254395} 01/27/2022 10:13:34 - INFO - codeparrot_training - Step 15276: {'lr': 0.0004114144067337201, 'samples': 2933184, 'steps': 15276, 'loss/train': 0.6893979460000992} 01/27/2022 10:13:37 - INFO - codeparrot_training - Step 15277: {'lr': 0.0004114019115700752, 'samples': 2933376, 'steps': 15277, 'loss/train': 1.6662030816078186} 01/27/2022 10:13:40 - INFO - codeparrot_training - Step 15278: {'lr': 0.00041138941571503587, 'samples': 
2933568, 'steps': 15278, 'loss/train': 0.8236050009727478} 01/27/2022 10:13:43 - INFO - codeparrot_training - Step 15279: {'lr': 0.0004113769191686555, 'samples': 2933760, 'steps': 15279, 'loss/train': 0.7639484703540802} 01/27/2022 10:13:46 - INFO - codeparrot_training - Step 15280: {'lr': 0.00041136442193098765, 'samples': 2933952, 'steps': 15280, 'loss/train': 1.1968939304351807} 01/27/2022 10:13:50 - INFO - codeparrot_training - Step 15281: {'lr': 0.00041135192400208585, 'samples': 2934144, 'steps': 15281, 'loss/train': 1.1819135248661041} 01/27/2022 10:13:53 - INFO - codeparrot_training - Step 15282: {'lr': 0.00041133942538200364, 'samples': 2934336, 'steps': 15282, 'loss/train': 0.8196035027503967} 01/27/2022 10:13:56 - INFO - codeparrot_training - Step 15283: {'lr': 0.0004113269260707946, 'samples': 2934528, 'steps': 15283, 'loss/train': 0.9080958366394043} 01/27/2022 10:14:00 - INFO - codeparrot_training - Step 15284: {'lr': 0.0004113144260685122, 'samples': 2934720, 'steps': 15284, 'loss/train': 0.6088594347238541} 01/27/2022 10:14:03 - INFO - codeparrot_training - Step 15285: {'lr': 0.00041130192537521, 'samples': 2934912, 'steps': 15285, 'loss/train': 0.5103038549423218} 01/27/2022 10:14:07 - INFO - codeparrot_training - Step 15286: {'lr': 0.0004112894239909416, 'samples': 2935104, 'steps': 15286, 'loss/train': 0.2525225132703781} 01/27/2022 10:14:10 - INFO - codeparrot_training - Step 15287: {'lr': 0.0004112769219157605, 'samples': 2935296, 'steps': 15287, 'loss/train': 0.39502689242362976} 01/27/2022 10:14:13 - INFO - codeparrot_training - Step 15288: {'lr': 0.00041126441914972036, 'samples': 2935488, 'steps': 15288, 'loss/train': 0.831643670797348} 01/27/2022 10:14:16 - INFO - codeparrot_training - Step 15289: {'lr': 0.00041125191569287456, 'samples': 2935680, 'steps': 15289, 'loss/train': 1.1580512523651123} 01/27/2022 10:14:19 - INFO - codeparrot_training - Step 15290: {'lr': 0.0004112394115452768, 'samples': 2935872, 'steps': 15290, 'loss/train': 0.6716718524694443} 01/27/2022 10:14:22 - INFO - codeparrot_training - Step 15291: {'lr': 0.00041122690670698054, 'samples': 2936064, 'steps': 15291, 'loss/train': 0.6267798095941544} 01/27/2022 10:14:25 - INFO - codeparrot_training - Step 15292: {'lr': 0.0004112144011780395, 'samples': 2936256, 'steps': 15292, 'loss/train': 0.7489550560712814} 01/27/2022 10:14:30 - INFO - codeparrot_training - Step 15293: {'lr': 0.00041120189495850713, 'samples': 2936448, 'steps': 15293, 'loss/train': 1.079516440629959} 01/27/2022 10:14:33 - INFO - codeparrot_training - Step 15294: {'lr': 0.000411189388048437, 'samples': 2936640, 'steps': 15294, 'loss/train': 0.9992784261703491} 01/27/2022 10:14:36 - INFO - codeparrot_training - Step 15295: {'lr': 0.0004111768804478827, 'samples': 2936832, 'steps': 15295, 'loss/train': 0.6946471184492111} 01/27/2022 10:14:39 - INFO - codeparrot_training - Step 15296: {'lr': 0.00041116437215689785, 'samples': 2937024, 'steps': 15296, 'loss/train': 0.9386417269706726} 01/27/2022 10:14:43 - INFO - codeparrot_training - Step 15297: {'lr': 0.000411151863175536, 'samples': 2937216, 'steps': 15297, 'loss/train': 0.9065860211849213} 01/27/2022 10:14:46 - INFO - codeparrot_training - Step 15298: {'lr': 0.00041113935350385074, 'samples': 2937408, 'steps': 15298, 'loss/train': 0.6236107349395752} 01/27/2022 10:14:49 - INFO - codeparrot_training - Step 15299: {'lr': 0.0004111268431418957, 'samples': 2937600, 'steps': 15299, 'loss/train': 0.8787169754505157} 01/27/2022 10:14:52 - INFO - codeparrot_training - Step 15300: 
{'lr': 0.0004111143320897244, 'samples': 2937792, 'steps': 15300, 'loss/train': 0.9310993552207947} 01/27/2022 10:14:55 - INFO - codeparrot_training - Step 15301: {'lr': 0.0004111018203473904, 'samples': 2937984, 'steps': 15301, 'loss/train': 0.8509276807308197} 01/27/2022 10:15:01 - INFO - codeparrot_training - Step 15302: {'lr': 0.0004110893079149474, 'samples': 2938176, 'steps': 15302, 'loss/train': 0.9827264249324799} 01/27/2022 10:15:04 - INFO - codeparrot_training - Step 15303: {'lr': 0.000411076794792449, 'samples': 2938368, 'steps': 15303, 'loss/train': 0.645241767168045} 01/27/2022 10:15:07 - INFO - codeparrot_training - Step 15304: {'lr': 0.0004110642809799487, 'samples': 2938560, 'steps': 15304, 'loss/train': 0.6606384068727493} 01/27/2022 10:15:10 - INFO - codeparrot_training - Step 15305: {'lr': 0.0004110517664775002, 'samples': 2938752, 'steps': 15305, 'loss/train': 1.1449845135211945} 01/27/2022 10:15:13 - INFO - codeparrot_training - Step 15306: {'lr': 0.00041103925128515705, 'samples': 2938944, 'steps': 15306, 'loss/train': 0.6412767469882965} 01/27/2022 10:15:17 - INFO - codeparrot_training - Step 15307: {'lr': 0.0004110267354029729, 'samples': 2939136, 'steps': 15307, 'loss/train': 0.7992390096187592} 01/27/2022 10:15:20 - INFO - codeparrot_training - Step 15308: {'lr': 0.0004110142188310013, 'samples': 2939328, 'steps': 15308, 'loss/train': 0.792874664068222} 01/27/2022 10:15:23 - INFO - codeparrot_training - Step 15309: {'lr': 0.00041100170156929596, 'samples': 2939520, 'steps': 15309, 'loss/train': 0.8168321549892426} 01/27/2022 10:15:26 - INFO - codeparrot_training - Step 15310: {'lr': 0.0004109891836179105, 'samples': 2939712, 'steps': 15310, 'loss/train': 0.7460162043571472} 01/27/2022 10:15:30 - INFO - codeparrot_training - Step 15311: {'lr': 0.0004109766649768984, 'samples': 2939904, 'steps': 15311, 'loss/train': 0.8948601186275482} 01/27/2022 10:15:34 - INFO - codeparrot_training - Step 15312: {'lr': 0.00041096414564631347, 'samples': 2940096, 'steps': 15312, 'loss/train': 0.9286184012889862} 01/27/2022 10:15:37 - INFO - codeparrot_training - Step 15313: {'lr': 0.00041095162562620915, 'samples': 2940288, 'steps': 15313, 'loss/train': 0.8153792023658752} 01/27/2022 10:15:40 - INFO - codeparrot_training - Step 15314: {'lr': 0.00041093910491663926, 'samples': 2940480, 'steps': 15314, 'loss/train': 1.0667567253112793} 01/27/2022 10:15:43 - INFO - codeparrot_training - Step 15315: {'lr': 0.0004109265835176573, 'samples': 2940672, 'steps': 15315, 'loss/train': 1.0862012207508087} 01/27/2022 10:15:46 - INFO - codeparrot_training - Step 15316: {'lr': 0.00041091406142931705, 'samples': 2940864, 'steps': 15316, 'loss/train': 0.6550306230783463} 01/27/2022 10:15:49 - INFO - codeparrot_training - Step 15317: {'lr': 0.00041090153865167196, 'samples': 2941056, 'steps': 15317, 'loss/train': 0.7640447616577148} 01/27/2022 10:15:52 - INFO - codeparrot_training - Step 15318: {'lr': 0.0004108890151847758, 'samples': 2941248, 'steps': 15318, 'loss/train': 0.6777294427156448} 01/27/2022 10:15:56 - INFO - codeparrot_training - Step 15319: {'lr': 0.0004108764910286822, 'samples': 2941440, 'steps': 15319, 'loss/train': 1.3649592697620392} 01/27/2022 10:16:00 - INFO - codeparrot_training - Step 15320: {'lr': 0.00041086396618344475, 'samples': 2941632, 'steps': 15320, 'loss/train': 0.6145138442516327} 01/27/2022 10:16:03 - INFO - codeparrot_training - Step 15321: {'lr': 0.0004108514406491172, 'samples': 2941824, 'steps': 15321, 'loss/train': 0.7728468775749207} 01/27/2022 10:16:06 - INFO 
- codeparrot_training - Step 15322: {'lr': 0.0004108389144257531, 'samples': 2942016, 'steps': 15322, 'loss/train': 1.0586694180965424} 01/27/2022 10:16:09 - INFO - codeparrot_training - Step 15323: {'lr': 0.0004108263875134062, 'samples': 2942208, 'steps': 15323, 'loss/train': 0.7949241399765015} 01/27/2022 10:16:13 - INFO - codeparrot_training - Step 15324: {'lr': 0.0004108138599121301, 'samples': 2942400, 'steps': 15324, 'loss/train': 0.5776965469121933} 01/27/2022 10:16:16 - INFO - codeparrot_training - Step 15325: {'lr': 0.00041080133162197855, 'samples': 2942592, 'steps': 15325, 'loss/train': 1.0449239015579224} 01/27/2022 10:16:19 - INFO - codeparrot_training - Step 15326: {'lr': 0.0004107888026430051, 'samples': 2942784, 'steps': 15326, 'loss/train': 1.0155149102210999} 01/27/2022 10:16:22 - INFO - codeparrot_training - Step 15327: {'lr': 0.0004107762729752635, 'samples': 2942976, 'steps': 15327, 'loss/train': 0.28637829422950745} 01/27/2022 10:16:25 - INFO - codeparrot_training - Step 15328: {'lr': 0.00041076374261880735, 'samples': 2943168, 'steps': 15328, 'loss/train': 0.9560295939445496} 01/27/2022 10:16:31 - INFO - codeparrot_training - Step 15329: {'lr': 0.0004107512115736904, 'samples': 2943360, 'steps': 15329, 'loss/train': 0.7067835330963135} 01/27/2022 10:16:34 - INFO - codeparrot_training - Step 15330: {'lr': 0.0004107386798399664, 'samples': 2943552, 'steps': 15330, 'loss/train': 0.7481270581483841} 01/27/2022 10:16:37 - INFO - codeparrot_training - Step 15331: {'lr': 0.00041072614741768877, 'samples': 2943744, 'steps': 15331, 'loss/train': 0.472157746553421} 01/27/2022 10:16:40 - INFO - codeparrot_training - Step 15332: {'lr': 0.00041071361430691143, 'samples': 2943936, 'steps': 15332, 'loss/train': 0.8282226026058197} 01/27/2022 10:16:43 - INFO - codeparrot_training - Step 15333: {'lr': 0.00041070108050768805, 'samples': 2944128, 'steps': 15333, 'loss/train': 5.481606602668762} 01/27/2022 10:16:47 - INFO - codeparrot_training - Step 15334: {'lr': 0.00041068854602007224, 'samples': 2944320, 'steps': 15334, 'loss/train': 0.7248976528644562} 01/27/2022 10:16:50 - INFO - codeparrot_training - Step 15335: {'lr': 0.0004106760108441177, 'samples': 2944512, 'steps': 15335, 'loss/train': 1.2078734636306763} 01/27/2022 10:16:53 - INFO - codeparrot_training - Step 15336: {'lr': 0.0004106634749798782, 'samples': 2944704, 'steps': 15336, 'loss/train': 1.0520188808441162} 01/27/2022 10:16:56 - INFO - codeparrot_training - Step 15337: {'lr': 0.0004106509384274073, 'samples': 2944896, 'steps': 15337, 'loss/train': 0.6389463096857071} 01/27/2022 10:17:00 - INFO - codeparrot_training - Step 15338: {'lr': 0.0004106384011867589, 'samples': 2945088, 'steps': 15338, 'loss/train': 0.5491106361150742} 01/27/2022 10:17:04 - INFO - codeparrot_training - Step 15339: {'lr': 0.00041062586325798654, 'samples': 2945280, 'steps': 15339, 'loss/train': 1.0565254390239716} 01/27/2022 10:17:07 - INFO - codeparrot_training - Step 15340: {'lr': 0.000410613324641144, 'samples': 2945472, 'steps': 15340, 'loss/train': 0.7047126889228821} 01/27/2022 10:17:10 - INFO - codeparrot_training - Step 15341: {'lr': 0.000410600785336285, 'samples': 2945664, 'steps': 15341, 'loss/train': 0.8681658804416656} 01/27/2022 10:17:13 - INFO - codeparrot_training - Step 15342: {'lr': 0.0004105882453434632, 'samples': 2945856, 'steps': 15342, 'loss/train': 0.7791650891304016} 01/27/2022 10:17:16 - INFO - codeparrot_training - Step 15343: {'lr': 0.0004105757046627323, 'samples': 2946048, 'steps': 15343, 'loss/train': 
0.44820772111415863} 01/27/2022 10:17:19 - INFO - codeparrot_training - Step 15344: {'lr': 0.00041056316329414613, 'samples': 2946240, 'steps': 15344, 'loss/train': 0.3611082136631012} 01/27/2022 10:17:22 - INFO - codeparrot_training - Step 15345: {'lr': 0.0004105506212377583, 'samples': 2946432, 'steps': 15345, 'loss/train': 1.0265438854694366} 01/27/2022 10:17:28 - INFO - codeparrot_training - Step 15346: {'lr': 0.0004105380784936227, 'samples': 2946624, 'steps': 15346, 'loss/train': 1.035628080368042} 01/27/2022 10:17:31 - INFO - codeparrot_training - Step 15347: {'lr': 0.0004105255350617928, 'samples': 2946816, 'steps': 15347, 'loss/train': 0.7671354711055756} 01/27/2022 10:17:34 - INFO - codeparrot_training - Step 15348: {'lr': 0.0004105129909423226, 'samples': 2947008, 'steps': 15348, 'loss/train': 0.9211370944976807} 01/27/2022 10:17:37 - INFO - codeparrot_training - Step 15349: {'lr': 0.0004105004461352657, 'samples': 2947200, 'steps': 15349, 'loss/train': 1.0758951902389526} 01/27/2022 10:17:40 - INFO - codeparrot_training - Step 15350: {'lr': 0.00041048790064067577, 'samples': 2947392, 'steps': 15350, 'loss/train': 1.1553095877170563} 01/27/2022 10:17:43 - INFO - codeparrot_training - Step 15351: {'lr': 0.0004104753544586067, 'samples': 2947584, 'steps': 15351, 'loss/train': 0.8947143852710724} 01/27/2022 10:17:46 - INFO - codeparrot_training - Step 15352: {'lr': 0.0004104628075891121, 'samples': 2947776, 'steps': 15352, 'loss/train': 0.7557309865951538} 01/27/2022 10:17:50 - INFO - codeparrot_training - Step 15353: {'lr': 0.00041045026003224593, 'samples': 2947968, 'steps': 15353, 'loss/train': 1.0776202976703644} 01/27/2022 10:17:53 - INFO - codeparrot_training - Step 15354: {'lr': 0.00041043771178806164, 'samples': 2948160, 'steps': 15354, 'loss/train': 0.5889621376991272} 01/27/2022 10:17:57 - INFO - codeparrot_training - Step 15355: {'lr': 0.00041042516285661325, 'samples': 2948352, 'steps': 15355, 'loss/train': 0.6216762810945511} 01/27/2022 10:18:00 - INFO - codeparrot_training - Step 15356: {'lr': 0.00041041261323795437, 'samples': 2948544, 'steps': 15356, 'loss/train': 0.7122577428817749} 01/27/2022 10:18:03 - INFO - codeparrot_training - Step 15357: {'lr': 0.00041040006293213883, 'samples': 2948736, 'steps': 15357, 'loss/train': 0.7970263659954071} 01/27/2022 10:18:07 - INFO - codeparrot_training - Step 15358: {'lr': 0.0004103875119392203, 'samples': 2948928, 'steps': 15358, 'loss/train': 1.0583327114582062} 01/27/2022 10:18:10 - INFO - codeparrot_training - Step 15359: {'lr': 0.00041037496025925256, 'samples': 2949120, 'steps': 15359, 'loss/train': 0.7347071617841721} 01/27/2022 10:18:13 - INFO - codeparrot_training - Step 15360: {'lr': 0.0004103624078922895, 'samples': 2949312, 'steps': 15360, 'loss/train': 1.1761009991168976} 01/27/2022 10:18:16 - INFO - codeparrot_training - Step 15361: {'lr': 0.0004103498548383847, 'samples': 2949504, 'steps': 15361, 'loss/train': 1.86068594455719} 01/27/2022 10:18:19 - INFO - codeparrot_training - Step 15362: {'lr': 0.00041033730109759216, 'samples': 2949696, 'steps': 15362, 'loss/train': 0.7421299070119858} 01/27/2022 10:18:22 - INFO - codeparrot_training - Step 15363: {'lr': 0.00041032474666996544, 'samples': 2949888, 'steps': 15363, 'loss/train': 0.8323457837104797} 01/27/2022 10:18:27 - INFO - codeparrot_training - Step 15364: {'lr': 0.0004103121915555585, 'samples': 2950080, 'steps': 15364, 'loss/train': 0.8781857192516327} 01/27/2022 10:18:30 - INFO - codeparrot_training - Step 15365: {'lr': 0.00041029963575442494, 
'samples': 2950272, 'steps': 15365, 'loss/train': 0.6962047219276428} 01/27/2022 10:18:33 - INFO - codeparrot_training - Step 15366: {'lr': 0.0004102870792666187, 'samples': 2950464, 'steps': 15366, 'loss/train': 0.7535512447357178} 01/27/2022 10:18:36 - INFO - codeparrot_training - Step 15367: {'lr': 0.0004102745220921935, 'samples': 2950656, 'steps': 15367, 'loss/train': 1.1577123999595642} 01/27/2022 10:18:39 - INFO - codeparrot_training - Step 15368: {'lr': 0.0004102619642312031, 'samples': 2950848, 'steps': 15368, 'loss/train': 1.0223055481910706} 01/27/2022 10:18:43 - INFO - codeparrot_training - Step 15369: {'lr': 0.0004102494056837014, 'samples': 2951040, 'steps': 15369, 'loss/train': 0.6968680769205093} 01/27/2022 10:18:46 - INFO - codeparrot_training - Step 15370: {'lr': 0.00041023684644974213, 'samples': 2951232, 'steps': 15370, 'loss/train': 0.3338806629180908} 01/27/2022 10:18:49 - INFO - codeparrot_training - Step 15371: {'lr': 0.00041022428652937905, 'samples': 2951424, 'steps': 15371, 'loss/train': 0.6201076805591583} 01/27/2022 10:18:52 - INFO - codeparrot_training - Step 15372: {'lr': 0.000410211725922666, 'samples': 2951616, 'steps': 15372, 'loss/train': 0.5547677725553513} 01/27/2022 10:18:58 - INFO - codeparrot_training - Step 15373: {'lr': 0.00041019916462965684, 'samples': 2951808, 'steps': 15373, 'loss/train': 0.3629724830389023} 01/27/2022 10:19:01 - INFO - codeparrot_training - Step 15374: {'lr': 0.0004101866026504053, 'samples': 2952000, 'steps': 15374, 'loss/train': 0.852946400642395} 01/27/2022 10:19:04 - INFO - codeparrot_training - Step 15375: {'lr': 0.00041017403998496523, 'samples': 2952192, 'steps': 15375, 'loss/train': 0.7558294236660004} 01/27/2022 10:19:07 - INFO - codeparrot_training - Step 15376: {'lr': 0.0004101614766333904, 'samples': 2952384, 'steps': 15376, 'loss/train': 0.872931182384491} 01/27/2022 10:19:10 - INFO - codeparrot_training - Step 15377: {'lr': 0.0004101489125957347, 'samples': 2952576, 'steps': 15377, 'loss/train': 0.8523485362529755} 01/27/2022 10:19:13 - INFO - codeparrot_training - Step 15378: {'lr': 0.0004101363478720519, 'samples': 2952768, 'steps': 15378, 'loss/train': 1.1727977693080902} 01/27/2022 10:19:16 - INFO - codeparrot_training - Step 15379: {'lr': 0.0004101237824623958, 'samples': 2952960, 'steps': 15379, 'loss/train': 1.3045495748519897} 01/27/2022 10:19:20 - INFO - codeparrot_training - Step 15380: {'lr': 0.00041011121636682024, 'samples': 2953152, 'steps': 15380, 'loss/train': 0.8636935651302338} 01/27/2022 10:19:23 - INFO - codeparrot_training - Step 15381: {'lr': 0.0004100986495853791, 'samples': 2953344, 'steps': 15381, 'loss/train': 0.8260681629180908} 01/27/2022 10:19:27 - INFO - codeparrot_training - Step 15382: {'lr': 0.00041008608211812625, 'samples': 2953536, 'steps': 15382, 'loss/train': 0.8720628619194031} 01/27/2022 10:19:31 - INFO - codeparrot_training - Step 15383: {'lr': 0.00041007351396511537, 'samples': 2953728, 'steps': 15383, 'loss/train': 0.48479560017585754} 01/27/2022 10:19:34 - INFO - codeparrot_training - Step 15384: {'lr': 0.00041006094512640044, 'samples': 2953920, 'steps': 15384, 'loss/train': 0.9024207293987274} 01/27/2022 10:19:37 - INFO - codeparrot_training - Step 15385: {'lr': 0.00041004837560203525, 'samples': 2954112, 'steps': 15385, 'loss/train': 0.6653470695018768} 01/27/2022 10:19:40 - INFO - codeparrot_training - Step 15386: {'lr': 0.0004100358053920736, 'samples': 2954304, 'steps': 15386, 'loss/train': 0.4877624362707138} 01/27/2022 10:19:43 - INFO - codeparrot_training - 
Step 15387: {'lr': 0.00041002323449656943, 'samples': 2954496, 'steps': 15387, 'loss/train': 0.532447099685669} 01/27/2022 10:19:46 - INFO - codeparrot_training - Step 15388: {'lr': 0.00041001066291557653, 'samples': 2954688, 'steps': 15388, 'loss/train': 0.7097757160663605} 01/27/2022 10:19:49 - INFO - codeparrot_training - Step 15389: {'lr': 0.0004099980906491487, 'samples': 2954880, 'steps': 15389, 'loss/train': 0.6709431409835815} 01/27/2022 10:19:54 - INFO - codeparrot_training - Step 15390: {'lr': 0.0004099855176973399, 'samples': 2955072, 'steps': 15390, 'loss/train': 1.2830601632595062} 01/27/2022 10:19:57 - INFO - codeparrot_training - Step 15391: {'lr': 0.0004099729440602039, 'samples': 2955264, 'steps': 15391, 'loss/train': 0.842666745185852} 01/27/2022 10:20:00 - INFO - codeparrot_training - Step 15392: {'lr': 0.0004099603697377946, 'samples': 2955456, 'steps': 15392, 'loss/train': 0.3614940941333771} 01/27/2022 10:20:03 - INFO - codeparrot_training - Step 15393: {'lr': 0.000409947794730166, 'samples': 2955648, 'steps': 15393, 'loss/train': 0.3422804921865463} 01/27/2022 10:20:07 - INFO - codeparrot_training - Step 15394: {'lr': 0.0004099352190373716, 'samples': 2955840, 'steps': 15394, 'loss/train': 1.1397319436073303} 01/27/2022 10:20:10 - INFO - codeparrot_training - Step 15395: {'lr': 0.0004099226426594657, 'samples': 2956032, 'steps': 15395, 'loss/train': 0.5593622177839279} 01/27/2022 10:20:13 - INFO - codeparrot_training - Step 15396: {'lr': 0.0004099100655965019, 'samples': 2956224, 'steps': 15396, 'loss/train': 0.3328387588262558} 01/27/2022 10:20:16 - INFO - codeparrot_training - Step 15397: {'lr': 0.0004098974878485342, 'samples': 2956416, 'steps': 15397, 'loss/train': 1.0343619883060455} 01/27/2022 10:20:19 - INFO - codeparrot_training - Step 15398: {'lr': 0.0004098849094156164, 'samples': 2956608, 'steps': 15398, 'loss/train': 0.26610899716615677} 01/27/2022 10:20:24 - INFO - codeparrot_training - Step 15399: {'lr': 0.0004098723302978025, 'samples': 2956800, 'steps': 15399, 'loss/train': 1.0840213894844055} 01/27/2022 10:20:27 - INFO - codeparrot_training - Step 15400: {'lr': 0.00040985975049514617, 'samples': 2956992, 'steps': 15400, 'loss/train': 0.974641889333725} 01/27/2022 10:20:30 - INFO - codeparrot_training - Step 15401: {'lr': 0.00040984717000770157, 'samples': 2957184, 'steps': 15401, 'loss/train': 0.5797430723905563} 01/27/2022 10:20:33 - INFO - codeparrot_training - Step 15402: {'lr': 0.00040983458883552237, 'samples': 2957376, 'steps': 15402, 'loss/train': 1.0707774460315704} 01/27/2022 10:20:36 - INFO - codeparrot_training - Step 15403: {'lr': 0.00040982200697866256, 'samples': 2957568, 'steps': 15403, 'loss/train': 0.7000829726457596} 01/27/2022 10:20:39 - INFO - codeparrot_training - Step 15404: {'lr': 0.00040980942443717596, 'samples': 2957760, 'steps': 15404, 'loss/train': 0.9079862236976624} 01/27/2022 10:20:42 - INFO - codeparrot_training - Step 15405: {'lr': 0.0004097968412111166, 'samples': 2957952, 'steps': 15405, 'loss/train': 0.5042577087879181} 01/27/2022 10:20:46 - INFO - codeparrot_training - Step 15406: {'lr': 0.0004097842573005383, 'samples': 2958144, 'steps': 15406, 'loss/train': 1.1329558789730072} 01/27/2022 10:20:49 - INFO - codeparrot_training - Step 15407: {'lr': 0.000409771672705495, 'samples': 2958336, 'steps': 15407, 'loss/train': 0.7826457917690277} 01/27/2022 10:20:54 - INFO - codeparrot_training - Step 15408: {'lr': 0.0004097590874260405, 'samples': 2958528, 'steps': 15408, 'loss/train': 0.765110582113266} 01/27/2022 
10:20:57 - INFO - codeparrot_training - Step 15409: {'lr': 0.0004097465014622289, 'samples': 2958720, 'steps': 15409, 'loss/train': 0.8616582155227661} 01/27/2022 10:21:00 - INFO - codeparrot_training - Step 15410: {'lr': 0.00040973391481411396, 'samples': 2958912, 'steps': 15410, 'loss/train': 0.803503543138504} 01/27/2022 10:21:03 - INFO - codeparrot_training - Step 15411: {'lr': 0.00040972132748174966, 'samples': 2959104, 'steps': 15411, 'loss/train': 0.6397913843393326} 01/27/2022 10:21:07 - INFO - codeparrot_training - Step 15412: {'lr': 0.00040970873946518993, 'samples': 2959296, 'steps': 15412, 'loss/train': 0.3910711705684662} 01/27/2022 10:21:10 - INFO - codeparrot_training - Step 15413: {'lr': 0.00040969615076448865, 'samples': 2959488, 'steps': 15413, 'loss/train': 0.5715521425008774} 01/27/2022 10:21:13 - INFO - codeparrot_training - Step 15414: {'lr': 0.0004096835613796998, 'samples': 2959680, 'steps': 15414, 'loss/train': 0.8066655993461609} 01/27/2022 10:21:16 - INFO - codeparrot_training - Step 15415: {'lr': 0.00040967097131087727, 'samples': 2959872, 'steps': 15415, 'loss/train': 0.832080066204071} 01/27/2022 10:21:19 - INFO - codeparrot_training - Step 15416: {'lr': 0.00040965838055807493, 'samples': 2960064, 'steps': 15416, 'loss/train': 1.0196921825408936} 01/27/2022 10:21:24 - INFO - codeparrot_training - Step 15417: {'lr': 0.00040964578912134687, 'samples': 2960256, 'steps': 15417, 'loss/train': 0.29385410249233246} 01/27/2022 10:21:27 - INFO - codeparrot_training - Step 15418: {'lr': 0.00040963319700074684, 'samples': 2960448, 'steps': 15418, 'loss/train': 0.7163080126047134} 01/27/2022 10:21:30 - INFO - codeparrot_training - Step 15419: {'lr': 0.00040962060419632906, 'samples': 2960640, 'steps': 15419, 'loss/train': 0.8143856227397919} 01/27/2022 10:21:33 - INFO - codeparrot_training - Step 15420: {'lr': 0.00040960801070814715, 'samples': 2960832, 'steps': 15420, 'loss/train': 0.7456417679786682} 01/27/2022 10:21:36 - INFO - codeparrot_training - Step 15421: {'lr': 0.00040959541653625526, 'samples': 2961024, 'steps': 15421, 'loss/train': 1.076731413602829} 01/27/2022 10:21:39 - INFO - codeparrot_training - Step 15422: {'lr': 0.0004095828216807073, 'samples': 2961216, 'steps': 15422, 'loss/train': 1.050427794456482} 01/27/2022 10:21:43 - INFO - codeparrot_training - Step 15423: {'lr': 0.00040957022614155714, 'samples': 2961408, 'steps': 15423, 'loss/train': 0.9452986121177673} 01/27/2022 10:21:46 - INFO - codeparrot_training - Step 15424: {'lr': 0.0004095576299188589, 'samples': 2961600, 'steps': 15424, 'loss/train': 0.7747955024242401} 01/27/2022 10:21:49 - INFO - codeparrot_training - Step 15425: {'lr': 0.0004095450330126663, 'samples': 2961792, 'steps': 15425, 'loss/train': 0.5942653566598892} 01/27/2022 10:21:54 - INFO - codeparrot_training - Step 15426: {'lr': 0.0004095324354230335, 'samples': 2961984, 'steps': 15426, 'loss/train': 0.6088310480117798} 01/27/2022 10:21:57 - INFO - codeparrot_training - Step 15427: {'lr': 0.0004095198371500145, 'samples': 2962176, 'steps': 15427, 'loss/train': 0.8906205296516418} 01/27/2022 10:22:00 - INFO - codeparrot_training - Step 15428: {'lr': 0.00040950723819366307, 'samples': 2962368, 'steps': 15428, 'loss/train': 0.8517653346061707} 01/27/2022 10:22:03 - INFO - codeparrot_training - Step 15429: {'lr': 0.00040949463855403326, 'samples': 2962560, 'steps': 15429, 'loss/train': 0.7726511657238007} 01/27/2022 10:22:06 - INFO - codeparrot_training - Step 15430: {'lr': 0.00040948203823117915, 'samples': 2962752, 'steps': 15430, 
'loss/train': 1.1972092688083649} 01/27/2022 10:22:10 - INFO - codeparrot_training - Step 15431: {'lr': 0.00040946943722515455, 'samples': 2962944, 'steps': 15431, 'loss/train': 0.569014772772789} 01/27/2022 10:22:13 - INFO - codeparrot_training - Step 15432: {'lr': 0.0004094568355360135, 'samples': 2963136, 'steps': 15432, 'loss/train': 0.9697061777114868} 01/27/2022 10:22:16 - INFO - codeparrot_training - Step 15433: {'lr': 0.00040944423316381006, 'samples': 2963328, 'steps': 15433, 'loss/train': 0.28715329617261887} 01/27/2022 10:22:20 - INFO - codeparrot_training - Step 15434: {'lr': 0.0004094316301085982, 'samples': 2963520, 'steps': 15434, 'loss/train': 0.8934229016304016} 01/27/2022 10:22:23 - INFO - codeparrot_training - Step 15435: {'lr': 0.00040941902637043183, 'samples': 2963712, 'steps': 15435, 'loss/train': 0.6711100190877914} 01/27/2022 10:22:27 - INFO - codeparrot_training - Step 15436: {'lr': 0.00040940642194936495, 'samples': 2963904, 'steps': 15436, 'loss/train': 0.7659207880496979} 01/27/2022 10:22:30 - INFO - codeparrot_training - Step 15437: {'lr': 0.0004093938168454515, 'samples': 2964096, 'steps': 15437, 'loss/train': 0.9133318662643433} 01/27/2022 10:22:33 - INFO - codeparrot_training - Step 15438: {'lr': 0.00040938121105874573, 'samples': 2964288, 'steps': 15438, 'loss/train': 0.9779216051101685} 01/27/2022 10:22:36 - INFO - codeparrot_training - Step 15439: {'lr': 0.0004093686045893013, 'samples': 2964480, 'steps': 15439, 'loss/train': 0.22707512229681015} 01/27/2022 10:22:39 - INFO - codeparrot_training - Step 15440: {'lr': 0.00040935599743717243, 'samples': 2964672, 'steps': 15440, 'loss/train': 0.9135463535785675} 01/27/2022 10:22:42 - INFO - codeparrot_training - Step 15441: {'lr': 0.00040934338960241305, 'samples': 2964864, 'steps': 15441, 'loss/train': 0.6397002339363098} 01/27/2022 10:22:45 - INFO - codeparrot_training - Step 15442: {'lr': 0.00040933078108507727, 'samples': 2965056, 'steps': 15442, 'loss/train': 1.0284152626991272} 01/27/2022 10:22:50 - INFO - codeparrot_training - Step 15443: {'lr': 0.00040931817188521894, 'samples': 2965248, 'steps': 15443, 'loss/train': 0.9298297762870789} 01/27/2022 10:22:53 - INFO - codeparrot_training - Step 15444: {'lr': 0.00040930556200289214, 'samples': 2965440, 'steps': 15444, 'loss/train': 0.9625587165355682} 01/27/2022 10:22:56 - INFO - codeparrot_training - Step 15445: {'lr': 0.00040929295143815093, 'samples': 2965632, 'steps': 15445, 'loss/train': 0.1810937449336052} 01/27/2022 10:22:59 - INFO - codeparrot_training - Step 15446: {'lr': 0.0004092803401910493, 'samples': 2965824, 'steps': 15446, 'loss/train': 1.0016240179538727} 01/27/2022 10:23:02 - INFO - codeparrot_training - Step 15447: {'lr': 0.00040926772826164126, 'samples': 2966016, 'steps': 15447, 'loss/train': 1.3628455102443695} 01/27/2022 10:23:06 - INFO - codeparrot_training - Step 15448: {'lr': 0.0004092551156499809, 'samples': 2966208, 'steps': 15448, 'loss/train': 0.4611998051404953} 01/27/2022 10:23:09 - INFO - codeparrot_training - Step 15449: {'lr': 0.000409242502356122, 'samples': 2966400, 'steps': 15449, 'loss/train': 0.7133510112762451} 01/27/2022 10:23:12 - INFO - codeparrot_training - Step 15450: {'lr': 0.000409229888380119, 'samples': 2966592, 'steps': 15450, 'loss/train': 0.8699155747890472} 01/27/2022 10:23:15 - INFO - codeparrot_training - Step 15451: {'lr': 0.00040921727372202565, 'samples': 2966784, 'steps': 15451, 'loss/train': 0.39516958594322205} 01/27/2022 10:23:20 - INFO - codeparrot_training - Step 15452: {'lr': 
0.000409204658381896, 'samples': 2966976, 'steps': 15452, 'loss/train': 0.8630495667457581} 01/27/2022 10:23:23 - INFO - codeparrot_training - Step 15453: {'lr': 0.00040919204235978425, 'samples': 2967168, 'steps': 15453, 'loss/train': 0.6643611788749695} 01/27/2022 10:23:26 - INFO - codeparrot_training - Step 15454: {'lr': 0.0004091794256557443, 'samples': 2967360, 'steps': 15454, 'loss/train': 1.0685891211032867} 01/27/2022 10:23:30 - INFO - codeparrot_training - Step 15455: {'lr': 0.00040916680826983017, 'samples': 2967552, 'steps': 15455, 'loss/train': 0.42240411043167114} 01/27/2022 10:23:33 - INFO - codeparrot_training - Step 15456: {'lr': 0.00040915419020209605, 'samples': 2967744, 'steps': 15456, 'loss/train': 0.2999411076307297} 01/27/2022 10:23:36 - INFO - codeparrot_training - Step 15457: {'lr': 0.0004091415714525959, 'samples': 2967936, 'steps': 15457, 'loss/train': 0.992893785238266} 01/27/2022 10:23:39 - INFO - codeparrot_training - Step 15458: {'lr': 0.0004091289520213838, 'samples': 2968128, 'steps': 15458, 'loss/train': 0.8570938110351562} 01/27/2022 10:23:42 - INFO - codeparrot_training - Step 15459: {'lr': 0.0004091163319085137, 'samples': 2968320, 'steps': 15459, 'loss/train': 0.9159821569919586} 01/27/2022 10:23:45 - INFO - codeparrot_training - Step 15460: {'lr': 0.0004091037111140399, 'samples': 2968512, 'steps': 15460, 'loss/train': 0.6374292522668839} 01/27/2022 10:23:50 - INFO - codeparrot_training - Step 15461: {'lr': 0.00040909108963801624, 'samples': 2968704, 'steps': 15461, 'loss/train': 0.33885468542575836} 01/27/2022 10:23:53 - INFO - codeparrot_training - Step 15462: {'lr': 0.0004090784674804969, 'samples': 2968896, 'steps': 15462, 'loss/train': 0.8767612874507904} 01/27/2022 10:23:56 - INFO - codeparrot_training - Step 15463: {'lr': 0.0004090658446415359, 'samples': 2969088, 'steps': 15463, 'loss/train': 1.3293474912643433} 01/27/2022 10:23:59 - INFO - codeparrot_training - Step 15464: {'lr': 0.0004090532211211874, 'samples': 2969280, 'steps': 15464, 'loss/train': 0.4809568226337433} 01/27/2022 10:24:02 - INFO - codeparrot_training - Step 15465: {'lr': 0.0004090405969195053, 'samples': 2969472, 'steps': 15465, 'loss/train': 0.7625145614147186} 01/27/2022 10:24:05 - INFO - codeparrot_training - Step 15466: {'lr': 0.0004090279720365438, 'samples': 2969664, 'steps': 15466, 'loss/train': 0.42583778500556946} 01/27/2022 10:24:09 - INFO - codeparrot_training - Step 15467: {'lr': 0.00040901534647235703, 'samples': 2969856, 'steps': 15467, 'loss/train': 0.6243401169776917} 01/27/2022 10:24:12 - INFO - codeparrot_training - Step 15468: {'lr': 0.00040900272022699897, 'samples': 2970048, 'steps': 15468, 'loss/train': 1.0161330699920654} 01/27/2022 10:24:15 - INFO - codeparrot_training - Step 15469: {'lr': 0.00040899009330052375, 'samples': 2970240, 'steps': 15469, 'loss/train': 0.6883056610822678} 01/27/2022 10:24:20 - INFO - codeparrot_training - Step 15470: {'lr': 0.00040897746569298546, 'samples': 2970432, 'steps': 15470, 'loss/train': 0.5604508370161057} 01/27/2022 10:24:23 - INFO - codeparrot_training - Step 15471: {'lr': 0.0004089648374044382, 'samples': 2970624, 'steps': 15471, 'loss/train': 0.35868293792009354} 01/27/2022 10:24:26 - INFO - codeparrot_training - Step 15472: {'lr': 0.000408952208434936, 'samples': 2970816, 'steps': 15472, 'loss/train': 1.0780014395713806} 01/27/2022 10:24:29 - INFO - codeparrot_training - Step 15473: {'lr': 0.00040893957878453314, 'samples': 2971008, 'steps': 15473, 'loss/train': 1.3253121972084045} 01/27/2022 10:24:32 - INFO - 
codeparrot_training - Step 15474: {'lr': 0.0004089269484532834, 'samples': 2971200, 'steps': 15474, 'loss/train': 0.974394679069519} 01/27/2022 10:24:36 - INFO - codeparrot_training - Step 15475: {'lr': 0.00040891431744124123, 'samples': 2971392, 'steps': 15475, 'loss/train': 0.9289584159851074} 01/27/2022 10:24:39 - INFO - codeparrot_training - Step 15476: {'lr': 0.00040890168574846055, 'samples': 2971584, 'steps': 15476, 'loss/train': 0.7677411139011383} 01/27/2022 10:24:42 - INFO - codeparrot_training - Step 15477: {'lr': 0.0004088890533749955, 'samples': 2971776, 'steps': 15477, 'loss/train': 0.4670577943325043} 01/27/2022 10:24:46 - INFO - codeparrot_training - Step 15478: {'lr': 0.0004088764203209002, 'samples': 2971968, 'steps': 15478, 'loss/train': 1.1226298213005066} 01/27/2022 10:24:49 - INFO - codeparrot_training - Step 15479: {'lr': 0.0004088637865862287, 'samples': 2972160, 'steps': 15479, 'loss/train': 0.2535514086484909} 01/27/2022 10:24:53 - INFO - codeparrot_training - Step 15480: {'lr': 0.0004088511521710352, 'samples': 2972352, 'steps': 15480, 'loss/train': 0.6481539756059647} 01/27/2022 10:24:56 - INFO - codeparrot_training - Step 15481: {'lr': 0.0004088385170753739, 'samples': 2972544, 'steps': 15481, 'loss/train': 0.8312601149082184} 01/27/2022 10:24:59 - INFO - codeparrot_training - Step 15482: {'lr': 0.00040882588129929876, 'samples': 2972736, 'steps': 15482, 'loss/train': 0.7791342437267303} 01/27/2022 10:25:02 - INFO - codeparrot_training - Step 15483: {'lr': 0.000408813244842864, 'samples': 2972928, 'steps': 15483, 'loss/train': 0.8003999590873718} 01/27/2022 10:25:05 - INFO - codeparrot_training - Step 15484: {'lr': 0.0004088006077061237, 'samples': 2973120, 'steps': 15484, 'loss/train': 0.32488737255334854} 01/27/2022 10:25:08 - INFO - codeparrot_training - Step 15485: {'lr': 0.00040878796988913204, 'samples': 2973312, 'steps': 15485, 'loss/train': 0.9065728783607483} 01/27/2022 10:25:11 - INFO - codeparrot_training - Step 15486: {'lr': 0.00040877533139194313, 'samples': 2973504, 'steps': 15486, 'loss/train': 0.6098978072404861} 01/27/2022 10:25:16 - INFO - codeparrot_training - Step 15487: {'lr': 0.00040876269221461117, 'samples': 2973696, 'steps': 15487, 'loss/train': 0.9121798574924469} 01/27/2022 10:25:19 - INFO - codeparrot_training - Step 15488: {'lr': 0.0004087500523571902, 'samples': 2973888, 'steps': 15488, 'loss/train': 0.7558278143405914} 01/27/2022 10:25:22 - INFO - codeparrot_training - Step 15489: {'lr': 0.0004087374118197344, 'samples': 2974080, 'steps': 15489, 'loss/train': 1.1846924722194672} 01/27/2022 10:25:25 - INFO - codeparrot_training - Step 15490: {'lr': 0.00040872477060229797, 'samples': 2974272, 'steps': 15490, 'loss/train': 0.8027832806110382} 01/27/2022 10:25:29 - INFO - codeparrot_training - Step 15491: {'lr': 0.00040871212870493504, 'samples': 2974464, 'steps': 15491, 'loss/train': 0.6275765597820282} 01/27/2022 10:25:32 - INFO - codeparrot_training - Step 15492: {'lr': 0.0004086994861276996, 'samples': 2974656, 'steps': 15492, 'loss/train': 0.8696827590465546} 01/27/2022 10:25:35 - INFO - codeparrot_training - Step 15493: {'lr': 0.00040868684287064617, 'samples': 2974848, 'steps': 15493, 'loss/train': 0.4872148185968399} 01/27/2022 10:25:38 - INFO - codeparrot_training - Step 15494: {'lr': 0.0004086741989338285, 'samples': 2975040, 'steps': 15494, 'loss/train': 1.120416909456253} 01/27/2022 10:25:41 - INFO - codeparrot_training - Step 15495: {'lr': 0.0004086615543173011, 'samples': 2975232, 'steps': 15495, 'loss/train': 
0.7007139176130295} 01/27/2022 10:25:46 - INFO - codeparrot_training - Step 15496: {'lr': 0.0004086489090211178, 'samples': 2975424, 'steps': 15496, 'loss/train': 0.9246970117092133} 01/27/2022 10:25:49 - INFO - codeparrot_training - Step 15497: {'lr': 0.00040863626304533316, 'samples': 2975616, 'steps': 15497, 'loss/train': 0.8666257560253143} 01/27/2022 10:25:52 - INFO - codeparrot_training - Step 15498: {'lr': 0.000408623616390001, 'samples': 2975808, 'steps': 15498, 'loss/train': 0.6772508472204208} 01/27/2022 10:25:55 - INFO - codeparrot_training - Step 15499: {'lr': 0.00040861096905517574, 'samples': 2976000, 'steps': 15499, 'loss/train': 1.1238928735256195} 01/27/2022 10:25:58 - INFO - codeparrot_training - Step 15500: {'lr': 0.0004085983210409114, 'samples': 2976192, 'steps': 15500, 'loss/train': 1.052830070257187} 01/27/2022 10:26:01 - INFO - codeparrot_training - Step 15501: {'lr': 0.00040858567234726217, 'samples': 2976384, 'steps': 15501, 'loss/train': 0.8793705403804779} 01/27/2022 10:26:05 - INFO - codeparrot_training - Step 15502: {'lr': 0.00040857302297428233, 'samples': 2976576, 'steps': 15502, 'loss/train': 0.9307697117328644} 01/27/2022 10:26:08 - INFO - codeparrot_training - Step 15503: {'lr': 0.000408560372922026, 'samples': 2976768, 'steps': 15503, 'loss/train': 0.7388522028923035} 01/27/2022 10:26:11 - INFO - codeparrot_training - Step 15504: {'lr': 0.00040854772219054737, 'samples': 2976960, 'steps': 15504, 'loss/train': 0.27265483886003494} 01/27/2022 10:26:16 - INFO - codeparrot_training - Step 15505: {'lr': 0.00040853507077990073, 'samples': 2977152, 'steps': 15505, 'loss/train': 0.4460193067789078} 01/27/2022 10:26:20 - INFO - codeparrot_training - Step 15506: {'lr': 0.00040852241869014004, 'samples': 2977344, 'steps': 15506, 'loss/train': 1.0507761240005493} 01/27/2022 10:26:23 - INFO - codeparrot_training - Step 15507: {'lr': 0.00040850976592131974, 'samples': 2977536, 'steps': 15507, 'loss/train': 1.1300164461135864} 01/27/2022 10:26:26 - INFO - codeparrot_training - Step 15508: {'lr': 0.0004084971124734939, 'samples': 2977728, 'steps': 15508, 'loss/train': 0.7711881101131439} 01/27/2022 10:26:29 - INFO - codeparrot_training - Step 15509: {'lr': 0.0004084844583467168, 'samples': 2977920, 'steps': 15509, 'loss/train': 0.37747304141521454} 01/27/2022 10:26:32 - INFO - codeparrot_training - Step 15510: {'lr': 0.00040847180354104256, 'samples': 2978112, 'steps': 15510, 'loss/train': 0.9257469177246094} 01/27/2022 10:26:35 - INFO - codeparrot_training - Step 15511: {'lr': 0.00040845914805652544, 'samples': 2978304, 'steps': 15511, 'loss/train': 0.8829046189785004} 01/27/2022 10:26:38 - INFO - codeparrot_training - Step 15512: {'lr': 0.0004084464918932197, 'samples': 2978496, 'steps': 15512, 'loss/train': 0.5658711791038513} 01/27/2022 10:26:43 - INFO - codeparrot_training - Step 15513: {'lr': 0.0004084338350511795, 'samples': 2978688, 'steps': 15513, 'loss/train': 1.0610131323337555} 01/27/2022 10:26:46 - INFO - codeparrot_training - Step 15514: {'lr': 0.00040842117753045893, 'samples': 2978880, 'steps': 15514, 'loss/train': 0.9088510572910309} 01/27/2022 10:26:49 - INFO - codeparrot_training - Step 15515: {'lr': 0.0004084085193311124, 'samples': 2979072, 'steps': 15515, 'loss/train': 0.6154107749462128} 01/27/2022 10:26:52 - INFO - codeparrot_training - Step 15516: {'lr': 0.0004083958604531941, 'samples': 2979264, 'steps': 15516, 'loss/train': 0.6783743351697922} 01/27/2022 10:26:55 - INFO - codeparrot_training - Step 15517: {'lr': 0.0004083832008967583, 
'samples': 2979456, 'steps': 15517, 'loss/train': 0.8290367424488068} 01/27/2022 10:26:59 - INFO - codeparrot_training - Step 15518: {'lr': 0.00040837054066185906, 'samples': 2979648, 'steps': 15518, 'loss/train': 0.4785505682229996} 01/27/2022 10:27:02 - INFO - codeparrot_training - Step 15519: {'lr': 0.0004083578797485508, 'samples': 2979840, 'steps': 15519, 'loss/train': 0.807129442691803} 01/27/2022 10:27:05 - INFO - codeparrot_training - Step 15520: {'lr': 0.00040834521815688753, 'samples': 2980032, 'steps': 15520, 'loss/train': 0.7868169844150543} 01/27/2022 10:27:08 - INFO - codeparrot_training - Step 15521: {'lr': 0.00040833255588692375, 'samples': 2980224, 'steps': 15521, 'loss/train': 0.6673819273710251} 01/27/2022 10:27:12 - INFO - codeparrot_training - Step 15522: {'lr': 0.0004083198929387135, 'samples': 2980416, 'steps': 15522, 'loss/train': 0.2882488891482353} 01/27/2022 10:27:16 - INFO - codeparrot_training - Step 15523: {'lr': 0.0004083072293123111, 'samples': 2980608, 'steps': 15523, 'loss/train': 0.7695490121841431} 01/27/2022 10:27:19 - INFO - codeparrot_training - Step 15524: {'lr': 0.00040829456500777084, 'samples': 2980800, 'steps': 15524, 'loss/train': 0.5395924597978592} 01/27/2022 10:27:22 - INFO - codeparrot_training - Step 15525: {'lr': 0.00040828190002514694, 'samples': 2980992, 'steps': 15525, 'loss/train': 0.39022837579250336} 01/27/2022 10:27:25 - INFO - codeparrot_training - Step 15526: {'lr': 0.0004082692343644936, 'samples': 2981184, 'steps': 15526, 'loss/train': 0.890484631061554} 01/27/2022 10:27:28 - INFO - codeparrot_training - Step 15527: {'lr': 0.00040825656802586513, 'samples': 2981376, 'steps': 15527, 'loss/train': 0.6270619332790375} 01/27/2022 10:27:31 - INFO - codeparrot_training - Step 15528: {'lr': 0.00040824390100931585, 'samples': 2981568, 'steps': 15528, 'loss/train': 0.7770337164402008} 01/27/2022 10:27:34 - INFO - codeparrot_training - Step 15529: {'lr': 0.00040823123331489985, 'samples': 2981760, 'steps': 15529, 'loss/train': 0.6430349349975586} 01/27/2022 10:27:38 - INFO - codeparrot_training - Step 15530: {'lr': 0.0004082185649426715, 'samples': 2981952, 'steps': 15530, 'loss/train': 0.26952318102121353} 01/27/2022 10:27:43 - INFO - codeparrot_training - Step 15531: {'lr': 0.0004082058958926851, 'samples': 2982144, 'steps': 15531, 'loss/train': 0.38056357204914093} 01/27/2022 10:27:46 - INFO - codeparrot_training - Step 15532: {'lr': 0.0004081932261649949, 'samples': 2982336, 'steps': 15532, 'loss/train': 0.67392897605896} 01/27/2022 10:27:49 - INFO - codeparrot_training - Step 15533: {'lr': 0.00040818055575965505, 'samples': 2982528, 'steps': 15533, 'loss/train': 1.0609095990657806} 01/27/2022 10:27:52 - INFO - codeparrot_training - Step 15534: {'lr': 0.0004081678846767199, 'samples': 2982720, 'steps': 15534, 'loss/train': 0.8146214783191681} 01/27/2022 10:27:55 - INFO - codeparrot_training - Step 15535: {'lr': 0.00040815521291624393, 'samples': 2982912, 'steps': 15535, 'loss/train': 1.0544217824935913} 01/27/2022 10:27:59 - INFO - codeparrot_training - Step 15536: {'lr': 0.0004081425404782811, 'samples': 2983104, 'steps': 15536, 'loss/train': 0.1711202748119831} 01/27/2022 10:28:02 - INFO - codeparrot_training - Step 15537: {'lr': 0.0004081298673628859, 'samples': 2983296, 'steps': 15537, 'loss/train': 0.5388596802949905} 01/27/2022 10:28:05 - INFO - codeparrot_training - Step 15538: {'lr': 0.00040811719357011257, 'samples': 2983488, 'steps': 15538, 'loss/train': 1.142158180475235} 01/27/2022 10:28:08 - INFO - codeparrot_training - 
Step 15539: {'lr': 0.00040810451910001537, 'samples': 2983680, 'steps': 15539, 'loss/train': 0.6383031606674194} 01/27/2022 10:28:12 - INFO - codeparrot_training - Step 15540: {'lr': 0.00040809184395264867, 'samples': 2983872, 'steps': 15540, 'loss/train': 0.9126244783401489} 01/27/2022 10:28:16 - INFO - codeparrot_training - Step 15541: {'lr': 0.0004080791681280667, 'samples': 2984064, 'steps': 15541, 'loss/train': 0.3040817230939865} 01/27/2022 10:28:19 - INFO - codeparrot_training - Step 15542: {'lr': 0.00040806649162632364, 'samples': 2984256, 'steps': 15542, 'loss/train': 1.0884831547737122} 01/27/2022 10:28:22 - INFO - codeparrot_training - Step 15543: {'lr': 0.000408053814447474, 'samples': 2984448, 'steps': 15543, 'loss/train': 0.9826526641845703} 01/27/2022 10:28:25 - INFO - codeparrot_training - Step 15544: {'lr': 0.00040804113659157203, 'samples': 2984640, 'steps': 15544, 'loss/train': 1.2122803330421448} 01/27/2022 10:28:28 - INFO - codeparrot_training - Step 15545: {'lr': 0.00040802845805867205, 'samples': 2984832, 'steps': 15545, 'loss/train': 1.0390942990779877} 01/27/2022 10:28:31 - INFO - codeparrot_training - Step 15546: {'lr': 0.0004080157788488282, 'samples': 2985024, 'steps': 15546, 'loss/train': 0.593425378203392} 01/27/2022 10:28:34 - INFO - codeparrot_training - Step 15547: {'lr': 0.0004080030989620951, 'samples': 2985216, 'steps': 15547, 'loss/train': 1.0299219489097595} 01/27/2022 10:28:38 - INFO - codeparrot_training - Step 15548: {'lr': 0.0004079904183985268, 'samples': 2985408, 'steps': 15548, 'loss/train': 0.3770473301410675} 01/27/2022 10:28:43 - INFO - codeparrot_training - Step 15549: {'lr': 0.0004079777371581777, 'samples': 2985600, 'steps': 15549, 'loss/train': 0.8531733155250549} 01/27/2022 10:28:46 - INFO - codeparrot_training - Step 15550: {'lr': 0.00040796505524110215, 'samples': 2985792, 'steps': 15550, 'loss/train': 0.507839173078537} 01/27/2022 10:28:50 - INFO - codeparrot_training - Step 15551: {'lr': 0.00040795237264735454, 'samples': 2985984, 'steps': 15551, 'loss/train': 0.5825046300888062} 01/27/2022 10:28:53 - INFO - codeparrot_training - Step 15552: {'lr': 0.00040793968937698905, 'samples': 2986176, 'steps': 15552, 'loss/train': 0.8076326251029968} 01/27/2022 10:28:56 - INFO - codeparrot_training - Step 15553: {'lr': 0.00040792700543006014, 'samples': 2986368, 'steps': 15553, 'loss/train': 1.4840116202831268} 01/27/2022 10:28:59 - INFO - codeparrot_training - Step 15554: {'lr': 0.000407914320806622, 'samples': 2986560, 'steps': 15554, 'loss/train': 0.1377776451408863} 01/27/2022 10:29:02 - INFO - codeparrot_training - Step 15555: {'lr': 0.0004079016355067291, 'samples': 2986752, 'steps': 15555, 'loss/train': 0.5085232704877853} 01/27/2022 10:29:05 - INFO - codeparrot_training - Step 15556: {'lr': 0.0004078889495304357, 'samples': 2986944, 'steps': 15556, 'loss/train': 0.3819936364889145} 01/27/2022 10:29:08 - INFO - codeparrot_training - Step 15557: {'lr': 0.00040787626287779624, 'samples': 2987136, 'steps': 15557, 'loss/train': 0.9636213183403015} 01/27/2022 10:29:13 - INFO - codeparrot_training - Step 15558: {'lr': 0.0004078635755488649, 'samples': 2987328, 'steps': 15558, 'loss/train': 0.603533998131752} 01/27/2022 10:29:16 - INFO - codeparrot_training - Step 15559: {'lr': 0.00040785088754369627, 'samples': 2987520, 'steps': 15559, 'loss/train': 0.9217486381530762} 01/27/2022 10:29:19 - INFO - codeparrot_training - Step 15560: {'lr': 0.00040783819886234445, 'samples': 2987712, 'steps': 15560, 'loss/train': 0.7269825786352158} 01/27/2022 
10:29:22 - INFO - codeparrot_training - Step 15561: {'lr': 0.000407825509504864, 'samples': 2987904, 'steps': 15561, 'loss/train': 0.7753691375255585} 01/27/2022 10:29:26 - INFO - codeparrot_training - Step 15562: {'lr': 0.00040781281947130897, 'samples': 2988096, 'steps': 15562, 'loss/train': 0.8595869243144989} 01/27/2022 10:29:29 - INFO - codeparrot_training - Step 15563: {'lr': 0.0004078001287617342, 'samples': 2988288, 'steps': 15563, 'loss/train': 0.883450984954834} 01/27/2022 10:29:32 - INFO - codeparrot_training - Step 15564: {'lr': 0.0004077874373761936, 'samples': 2988480, 'steps': 15564, 'loss/train': 0.5504356473684311} 01/27/2022 10:29:35 - INFO - codeparrot_training - Step 15565: {'lr': 0.0004077747453147418, 'samples': 2988672, 'steps': 15565, 'loss/train': 0.43435701727867126} 01/27/2022 10:29:40 - INFO - codeparrot_training - Step 15566: {'lr': 0.0004077620525774331, 'samples': 2988864, 'steps': 15566, 'loss/train': 0.554634690284729} 01/27/2022 10:29:43 - INFO - codeparrot_training - Step 15567: {'lr': 0.0004077493591643219, 'samples': 2989056, 'steps': 15567, 'loss/train': 1.201582431793213} 01/27/2022 10:29:46 - INFO - codeparrot_training - Step 15568: {'lr': 0.00040773666507546244, 'samples': 2989248, 'steps': 15568, 'loss/train': 0.9120202660560608} 01/27/2022 10:29:49 - INFO - codeparrot_training - Step 15569: {'lr': 0.00040772397031090923, 'samples': 2989440, 'steps': 15569, 'loss/train': 0.6953467726707458} 01/27/2022 10:29:52 - INFO - codeparrot_training - Step 15570: {'lr': 0.0004077112748707166, 'samples': 2989632, 'steps': 15570, 'loss/train': 0.7448669672012329} 01/27/2022 10:29:55 - INFO - codeparrot_training - Step 15571: {'lr': 0.000407698578754939, 'samples': 2989824, 'steps': 15571, 'loss/train': 0.8711704015731812} 01/27/2022 10:29:58 - INFO - codeparrot_training - Step 15572: {'lr': 0.0004076858819636307, 'samples': 2990016, 'steps': 15572, 'loss/train': 0.542843297123909} 01/27/2022 10:30:02 - INFO - codeparrot_training - Step 15573: {'lr': 0.0004076731844968462, 'samples': 2990208, 'steps': 15573, 'loss/train': 0.8216398358345032} 01/27/2022 10:30:05 - INFO - codeparrot_training - Step 15574: {'lr': 0.00040766048635463984, 'samples': 2990400, 'steps': 15574, 'loss/train': 0.6938819736242294} 01/27/2022 10:30:11 - INFO - codeparrot_training - Step 15575: {'lr': 0.000407647787537066, 'samples': 2990592, 'steps': 15575, 'loss/train': 5.0880206823349} 01/27/2022 10:30:14 - INFO - codeparrot_training - Step 15576: {'lr': 0.00040763508804417904, 'samples': 2990784, 'steps': 15576, 'loss/train': 5.035927176475525} 01/27/2022 10:30:17 - INFO - codeparrot_training - Step 15577: {'lr': 0.0004076223878760335, 'samples': 2990976, 'steps': 15577, 'loss/train': 0.8472124636173248} 01/27/2022 10:30:20 - INFO - codeparrot_training - Step 15578: {'lr': 0.0004076096870326837, 'samples': 2991168, 'steps': 15578, 'loss/train': 0.4127427041530609} 01/27/2022 10:30:23 - INFO - codeparrot_training - Step 15579: {'lr': 0.000407596985514184, 'samples': 2991360, 'steps': 15579, 'loss/train': 0.8911983668804169} 01/27/2022 10:30:26 - INFO - codeparrot_training - Step 15580: {'lr': 0.00040758428332058895, 'samples': 2991552, 'steps': 15580, 'loss/train': 0.4939490854740143} 01/27/2022 10:30:29 - INFO - codeparrot_training - Step 15581: {'lr': 0.00040757158045195274, 'samples': 2991744, 'steps': 15581, 'loss/train': 1.0463954508304596} 01/27/2022 10:30:33 - INFO - codeparrot_training - Step 15582: {'lr': 0.00040755887690833005, 'samples': 2991936, 'steps': 15582, 'loss/train': 
1.027804970741272} 01/27/2022 10:30:36 - INFO - codeparrot_training - Step 15583: {'lr': 0.00040754617268977503, 'samples': 2992128, 'steps': 15583, 'loss/train': 0.5191497355699539} 01/27/2022 10:30:40 - INFO - codeparrot_training - Step 15584: {'lr': 0.0004075334677963423, 'samples': 2992320, 'steps': 15584, 'loss/train': 0.8034518659114838} 01/27/2022 10:30:43 - INFO - codeparrot_training - Step 15585: {'lr': 0.00040752076222808623, 'samples': 2992512, 'steps': 15585, 'loss/train': 0.12912015616893768} 01/27/2022 10:30:47 - INFO - codeparrot_training - Step 15586: {'lr': 0.00040750805598506115, 'samples': 2992704, 'steps': 15586, 'loss/train': 0.7208998650312424} 01/27/2022 10:30:50 - INFO - codeparrot_training - Step 15587: {'lr': 0.00040749534906732167, 'samples': 2992896, 'steps': 15587, 'loss/train': 0.5922729671001434} 01/27/2022 10:30:53 - INFO - codeparrot_training - Step 15588: {'lr': 0.0004074826414749221, 'samples': 2993088, 'steps': 15588, 'loss/train': 0.5022197216749191} 01/27/2022 10:30:56 - INFO - codeparrot_training - Step 15589: {'lr': 0.00040746993320791685, 'samples': 2993280, 'steps': 15589, 'loss/train': 0.38245511054992676} 01/27/2022 10:30:59 - INFO - codeparrot_training - Step 15590: {'lr': 0.00040745722426636043, 'samples': 2993472, 'steps': 15590, 'loss/train': 0.8424258828163147} 01/27/2022 10:31:02 - INFO - codeparrot_training - Step 15591: {'lr': 0.0004074445146503073, 'samples': 2993664, 'steps': 15591, 'loss/train': 0.6391682177782059} 01/27/2022 10:31:06 - INFO - codeparrot_training - Step 15592: {'lr': 0.00040743180435981187, 'samples': 2993856, 'steps': 15592, 'loss/train': 0.6918118894100189} 01/27/2022 10:31:10 - INFO - codeparrot_training - Step 15593: {'lr': 0.0004074190933949286, 'samples': 2994048, 'steps': 15593, 'loss/train': 0.5519225299358368} 01/27/2022 10:31:13 - INFO - codeparrot_training - Step 15594: {'lr': 0.00040740638175571175, 'samples': 2994240, 'steps': 15594, 'loss/train': 0.387773260474205} 01/27/2022 10:31:17 - INFO - codeparrot_training - Step 15595: {'lr': 0.0004073936694422161, 'samples': 2994432, 'steps': 15595, 'loss/train': 0.789720743894577} 01/27/2022 10:31:20 - INFO - codeparrot_training - Step 15596: {'lr': 0.0004073809564544959, 'samples': 2994624, 'steps': 15596, 'loss/train': 0.06723910197615623} 01/27/2022 10:31:23 - INFO - codeparrot_training - Step 15597: {'lr': 0.0004073682427926057, 'samples': 2994816, 'steps': 15597, 'loss/train': 0.9839837551116943} 01/27/2022 10:31:26 - INFO - codeparrot_training - Step 15598: {'lr': 0.00040735552845659986, 'samples': 2995008, 'steps': 15598, 'loss/train': 0.8038419485092163} 01/27/2022 10:31:29 - INFO - codeparrot_training - Step 15599: {'lr': 0.00040734281344653294, 'samples': 2995200, 'steps': 15599, 'loss/train': 0.9281355142593384} 01/27/2022 10:31:32 - INFO - codeparrot_training - Step 15600: {'lr': 0.0004073300977624594, 'samples': 2995392, 'steps': 15600, 'loss/train': 0.5337085872888565} 01/27/2022 10:31:36 - INFO - codeparrot_training - Step 15601: {'lr': 0.0004073173814044336, 'samples': 2995584, 'steps': 15601, 'loss/train': 0.7302671670913696} 01/27/2022 10:31:40 - INFO - codeparrot_training - Step 15602: {'lr': 0.0004073046643725101, 'samples': 2995776, 'steps': 15602, 'loss/train': 0.8859627842903137} 01/27/2022 10:31:43 - INFO - codeparrot_training - Step 15603: {'lr': 0.0004072919466667434, 'samples': 2995968, 'steps': 15603, 'loss/train': 1.456421971321106} 01/27/2022 10:31:46 - INFO - codeparrot_training - Step 15604: {'lr': 0.000407279228287188, 'samples': 
2996160, 'steps': 15604, 'loss/train': 0.6069286018610001} 01/27/2022 10:31:49 - INFO - codeparrot_training - Step 15605: {'lr': 0.00040726650923389825, 'samples': 2996352, 'steps': 15605, 'loss/train': 0.8599977493286133} 01/27/2022 10:31:53 - INFO - codeparrot_training - Step 15606: {'lr': 0.00040725378950692874, 'samples': 2996544, 'steps': 15606, 'loss/train': 0.6981897801160812} 01/27/2022 10:31:56 - INFO - codeparrot_training - Step 15607: {'lr': 0.0004072410691063339, 'samples': 2996736, 'steps': 15607, 'loss/train': 0.973052054643631} 01/27/2022 10:31:59 - INFO - codeparrot_training - Step 15608: {'lr': 0.00040722834803216834, 'samples': 2996928, 'steps': 15608, 'loss/train': 0.6552781462669373} 01/27/2022 10:32:02 - INFO - codeparrot_training - Step 15609: {'lr': 0.0004072156262844864, 'samples': 2997120, 'steps': 15609, 'loss/train': 1.1246516704559326} 01/27/2022 10:32:07 - INFO - codeparrot_training - Step 15610: {'lr': 0.0004072029038633426, 'samples': 2997312, 'steps': 15610, 'loss/train': 0.4990398734807968} 01/27/2022 10:32:10 - INFO - codeparrot_training - Step 15611: {'lr': 0.0004071901807687915, 'samples': 2997504, 'steps': 15611, 'loss/train': 0.7242873162031174} 01/27/2022 10:32:13 - INFO - codeparrot_training - Step 15612: {'lr': 0.0004071774570008876, 'samples': 2997696, 'steps': 15612, 'loss/train': 0.6396433711051941} 01/27/2022 10:32:17 - INFO - codeparrot_training - Step 15613: {'lr': 0.00040716473255968534, 'samples': 2997888, 'steps': 15613, 'loss/train': 0.6063070446252823} 01/27/2022 10:32:20 - INFO - codeparrot_training - Step 15614: {'lr': 0.0004071520074452393, 'samples': 2998080, 'steps': 15614, 'loss/train': 1.178872436285019} 01/27/2022 10:32:23 - INFO - codeparrot_training - Step 15615: {'lr': 0.000407139281657604, 'samples': 2998272, 'steps': 15615, 'loss/train': 0.3146365284919739} 01/27/2022 10:32:26 - INFO - codeparrot_training - Step 15616: {'lr': 0.0004071265551968338, 'samples': 2998464, 'steps': 15616, 'loss/train': 0.6595892161130905} 01/27/2022 10:32:29 - INFO - codeparrot_training - Step 15617: {'lr': 0.0004071138280629835, 'samples': 2998656, 'steps': 15617, 'loss/train': 1.1717610955238342} 01/27/2022 10:32:32 - INFO - codeparrot_training - Step 15618: {'lr': 0.00040710110025610733, 'samples': 2998848, 'steps': 15618, 'loss/train': 0.7536022961139679} 01/27/2022 10:32:37 - INFO - codeparrot_training - Step 15619: {'lr': 0.00040708837177626, 'samples': 2999040, 'steps': 15619, 'loss/train': 0.9073634147644043} 01/27/2022 10:32:40 - INFO - codeparrot_training - Step 15620: {'lr': 0.00040707564262349594, 'samples': 2999232, 'steps': 15620, 'loss/train': 0.28159095346927643} 01/27/2022 10:32:43 - INFO - codeparrot_training - Step 15621: {'lr': 0.00040706291279786965, 'samples': 2999424, 'steps': 15621, 'loss/train': 0.28899912536144257} 01/27/2022 10:32:46 - INFO - codeparrot_training - Step 15622: {'lr': 0.0004070501822994358, 'samples': 2999616, 'steps': 15622, 'loss/train': 0.5352132618427277} 01/27/2022 10:32:50 - INFO - codeparrot_training - Step 15623: {'lr': 0.00040703745112824876, 'samples': 2999808, 'steps': 15623, 'loss/train': 0.3112994357943535} 01/27/2022 10:32:53 - INFO - codeparrot_training - Step 15624: {'lr': 0.00040702471928436316, 'samples': 3000000, 'steps': 15624, 'loss/train': 0.8546259999275208} 01/27/2022 10:32:56 - INFO - codeparrot_training - Step 15625: {'lr': 0.00040701198676783355, 'samples': 3000192, 'steps': 15625, 'loss/train': 0.14668841287493706} 01/27/2022 10:32:59 - INFO - codeparrot_training - Step 15626: 
{'lr': 0.00040699925357871446, 'samples': 3000384, 'steps': 15626, 'loss/train': 1.291738897562027} 01/27/2022 10:33:02 - INFO - codeparrot_training - Step 15627: {'lr': 0.00040698651971706037, 'samples': 3000576, 'steps': 15627, 'loss/train': 0.4575551301240921} 01/27/2022 10:33:07 - INFO - codeparrot_training - Step 15628: {'lr': 0.00040697378518292593, 'samples': 3000768, 'steps': 15628, 'loss/train': 0.9747738540172577} 01/27/2022 10:33:10 - INFO - codeparrot_training - Step 15629: {'lr': 0.0004069610499763656, 'samples': 3000960, 'steps': 15629, 'loss/train': 0.7424184679985046} 01/27/2022 10:33:13 - INFO - codeparrot_training - Step 15630: {'lr': 0.00040694831409743406, 'samples': 3001152, 'steps': 15630, 'loss/train': 0.7376485168933868} 01/27/2022 10:33:16 - INFO - codeparrot_training - Step 15631: {'lr': 0.00040693557754618566, 'samples': 3001344, 'steps': 15631, 'loss/train': 0.4232730567455292} 01/27/2022 10:33:19 - INFO - codeparrot_training - Step 15632: {'lr': 0.00040692284032267515, 'samples': 3001536, 'steps': 15632, 'loss/train': 0.03516738396137953} 01/27/2022 10:33:23 - INFO - codeparrot_training - Step 15633: {'lr': 0.00040691010242695696, 'samples': 3001728, 'steps': 15633, 'loss/train': 0.78569296002388} 01/27/2022 10:33:26 - INFO - codeparrot_training - Step 15634: {'lr': 0.00040689736385908574, 'samples': 3001920, 'steps': 15634, 'loss/train': 0.7876304984092712} 01/27/2022 10:33:29 - INFO - codeparrot_training - Step 15635: {'lr': 0.0004068846246191161, 'samples': 3002112, 'steps': 15635, 'loss/train': 0.5231411755084991} 01/27/2022 10:33:32 - INFO - codeparrot_training - Step 15636: {'lr': 0.00040687188470710245, 'samples': 3002304, 'steps': 15636, 'loss/train': 1.5908406972885132} 01/27/2022 10:33:38 - INFO - codeparrot_training - Step 15637: {'lr': 0.00040685914412309955, 'samples': 3002496, 'steps': 15637, 'loss/train': 0.6664626449346542} 01/27/2022 10:33:41 - INFO - codeparrot_training - Step 15638: {'lr': 0.0004068464028671618, 'samples': 3002688, 'steps': 15638, 'loss/train': 0.5473641157150269} 01/27/2022 10:33:44 - INFO - codeparrot_training - Step 15639: {'lr': 0.00040683366093934394, 'samples': 3002880, 'steps': 15639, 'loss/train': 0.45102958381175995} 01/27/2022 10:33:47 - INFO - codeparrot_training - Step 15640: {'lr': 0.0004068209183397004, 'samples': 3003072, 'steps': 15640, 'loss/train': 0.9850993752479553} 01/27/2022 10:33:50 - INFO - codeparrot_training - Step 15641: {'lr': 0.0004068081750682859, 'samples': 3003264, 'steps': 15641, 'loss/train': 0.8080179691314697} 01/27/2022 10:33:53 - INFO - codeparrot_training - Step 15642: {'lr': 0.00040679543112515494, 'samples': 3003456, 'steps': 15642, 'loss/train': 0.7773769497871399} 01/27/2022 10:33:56 - INFO - codeparrot_training - Step 15643: {'lr': 0.00040678268651036213, 'samples': 3003648, 'steps': 15643, 'loss/train': 0.8158512711524963} 01/27/2022 10:34:00 - INFO - codeparrot_training - Step 15644: {'lr': 0.0004067699412239622, 'samples': 3003840, 'steps': 15644, 'loss/train': 0.7840842604637146} 01/27/2022 10:34:03 - INFO - codeparrot_training - Step 15645: {'lr': 0.00040675719526600947, 'samples': 3004032, 'steps': 15645, 'loss/train': 0.036787248216569424} 01/27/2022 10:34:07 - INFO - codeparrot_training - Step 15646: {'lr': 0.0004067444486365587, 'samples': 3004224, 'steps': 15646, 'loss/train': 1.0905160009860992} 01/27/2022 10:34:10 - INFO - codeparrot_training - Step 15647: {'lr': 0.00040673170133566453, 'samples': 3004416, 'steps': 15647, 'loss/train': 0.5925396233797073} 01/27/2022 
10:34:14 - INFO - codeparrot_training - Step 15648: {'lr': 0.0004067189533633815, 'samples': 3004608, 'steps': 15648, 'loss/train': 1.0180284976959229} 01/27/2022 10:34:17 - INFO - codeparrot_training - Step 15649: {'lr': 0.00040670620471976426, 'samples': 3004800, 'steps': 15649, 'loss/train': 1.4313391149044037} 01/27/2022 10:34:20 - INFO - codeparrot_training - Step 15650: {'lr': 0.0004066934554048674, 'samples': 3004992, 'steps': 15650, 'loss/train': 0.9943420886993408} 01/27/2022 10:34:23 - INFO - codeparrot_training - Step 15651: {'lr': 0.00040668070541874553, 'samples': 3005184, 'steps': 15651, 'loss/train': 0.6132618337869644} 01/27/2022 10:34:26 - INFO - codeparrot_training - Step 15652: {'lr': 0.00040666795476145326, 'samples': 3005376, 'steps': 15652, 'loss/train': 1.041513741016388} 01/27/2022 10:34:29 - INFO - codeparrot_training - Step 15653: {'lr': 0.00040665520343304516, 'samples': 3005568, 'steps': 15653, 'loss/train': 0.6273356527090073} 01/27/2022 10:34:34 - INFO - codeparrot_training - Step 15654: {'lr': 0.00040664245143357604, 'samples': 3005760, 'steps': 15654, 'loss/train': 1.182309776544571} 01/27/2022 10:34:38 - INFO - codeparrot_training - Step 15655: {'lr': 0.0004066296987631003, 'samples': 3005952, 'steps': 15655, 'loss/train': 0.8292187750339508} 01/27/2022 10:34:41 - INFO - codeparrot_training - Step 15656: {'lr': 0.0004066169454216727, 'samples': 3006144, 'steps': 15656, 'loss/train': 0.621228888630867} 01/27/2022 10:34:44 - INFO - codeparrot_training - Step 15657: {'lr': 0.00040660419140934787, 'samples': 3006336, 'steps': 15657, 'loss/train': 1.289659470319748} 01/27/2022 10:34:47 - INFO - codeparrot_training - Step 15658: {'lr': 0.0004065914367261804, 'samples': 3006528, 'steps': 15658, 'loss/train': 0.7034256756305695} 01/27/2022 10:34:50 - INFO - codeparrot_training - Step 15659: {'lr': 0.00040657868137222486, 'samples': 3006720, 'steps': 15659, 'loss/train': 0.9773639738559723} 01/27/2022 10:34:53 - INFO - codeparrot_training - Step 15660: {'lr': 0.000406565925347536, 'samples': 3006912, 'steps': 15660, 'loss/train': 0.3563143461942673} 01/27/2022 10:34:57 - INFO - codeparrot_training - Step 15661: {'lr': 0.0004065531686521685, 'samples': 3007104, 'steps': 15661, 'loss/train': 0.7162356823682785} 01/27/2022 10:35:00 - INFO - codeparrot_training - Step 15662: {'lr': 0.00040654041128617693, 'samples': 3007296, 'steps': 15662, 'loss/train': 0.7455568313598633} 01/27/2022 10:35:04 - INFO - codeparrot_training - Step 15663: {'lr': 0.0004065276532496158, 'samples': 3007488, 'steps': 15663, 'loss/train': 0.6190991252660751} 01/27/2022 10:35:07 - INFO - codeparrot_training - Step 15664: {'lr': 0.0004065148945425401, 'samples': 3007680, 'steps': 15664, 'loss/train': 0.6893211901187897} 01/27/2022 10:35:10 - INFO - codeparrot_training - Step 15665: {'lr': 0.0004065021351650042, 'samples': 3007872, 'steps': 15665, 'loss/train': 1.2268229126930237} 01/27/2022 10:35:14 - INFO - codeparrot_training - Step 15666: {'lr': 0.00040648937511706285, 'samples': 3008064, 'steps': 15666, 'loss/train': 1.2060426771640778} 01/27/2022 10:35:17 - INFO - codeparrot_training - Step 15667: {'lr': 0.0004064766143987707, 'samples': 3008256, 'steps': 15667, 'loss/train': 0.6545256525278091} 01/27/2022 10:35:20 - INFO - codeparrot_training - Step 15668: {'lr': 0.00040646385301018243, 'samples': 3008448, 'steps': 15668, 'loss/train': 1.3056898713111877} 01/27/2022 10:35:23 - INFO - codeparrot_training - Step 15669: {'lr': 0.0004064510909513527, 'samples': 3008640, 'steps': 15669, 
'loss/train': 0.7700411081314087} 01/27/2022 10:35:26 - INFO - codeparrot_training - Step 15670: {'lr': 0.00040643832822233615, 'samples': 3008832, 'steps': 15670, 'loss/train': 0.7135010361671448} 01/27/2022 10:35:29 - INFO - codeparrot_training - Step 15671: {'lr': 0.0004064255648231875, 'samples': 3009024, 'steps': 15671, 'loss/train': 0.49279242753982544} 01/27/2022 10:35:34 - INFO - codeparrot_training - Step 15672: {'lr': 0.00040641280075396144, 'samples': 3009216, 'steps': 15672, 'loss/train': 1.022241622209549} 01/27/2022 10:35:37 - INFO - codeparrot_training - Step 15673: {'lr': 0.00040640003601471255, 'samples': 3009408, 'steps': 15673, 'loss/train': 0.8535444438457489} 01/27/2022 10:35:40 - INFO - codeparrot_training - Step 15674: {'lr': 0.00040638727060549556, 'samples': 3009600, 'steps': 15674, 'loss/train': 0.6760061681270599} 01/27/2022 10:35:43 - INFO - codeparrot_training - Step 15675: {'lr': 0.00040637450452636517, 'samples': 3009792, 'steps': 15675, 'loss/train': 0.7199352979660034} 01/27/2022 10:35:46 - INFO - codeparrot_training - Step 15676: {'lr': 0.00040636173777737613, 'samples': 3009984, 'steps': 15676, 'loss/train': 0.8876660764217377} 01/27/2022 10:35:50 - INFO - codeparrot_training - Step 15677: {'lr': 0.000406348970358583, 'samples': 3010176, 'steps': 15677, 'loss/train': 0.48838184773921967} 01/27/2022 10:35:53 - INFO - codeparrot_training - Step 15678: {'lr': 0.00040633620227004054, 'samples': 3010368, 'steps': 15678, 'loss/train': 0.7037598788738251} 01/27/2022 10:35:56 - INFO - codeparrot_training - Step 15679: {'lr': 0.0004063234335118033, 'samples': 3010560, 'steps': 15679, 'loss/train': 0.7762987911701202} 01/27/2022 10:35:59 - INFO - codeparrot_training - Step 15680: {'lr': 0.00040631066408392636, 'samples': 3010752, 'steps': 15680, 'loss/train': 0.5776901543140411} 01/27/2022 10:36:05 - INFO - codeparrot_training - Step 15681: {'lr': 0.000406297893986464, 'samples': 3010944, 'steps': 15681, 'loss/train': 0.5503857135772705} 01/27/2022 10:36:08 - INFO - codeparrot_training - Step 15682: {'lr': 0.0004062851232194711, 'samples': 3011136, 'steps': 15682, 'loss/train': 0.45200349390506744} 01/27/2022 10:36:11 - INFO - codeparrot_training - Step 15683: {'lr': 0.00040627235178300236, 'samples': 3011328, 'steps': 15683, 'loss/train': 0.5987409353256226} 01/27/2022 10:36:14 - INFO - codeparrot_training - Step 15684: {'lr': 0.0004062595796771126, 'samples': 3011520, 'steps': 15684, 'loss/train': 0.5401462465524673} 01/27/2022 10:36:17 - INFO - codeparrot_training - Step 15685: {'lr': 0.0004062468069018563, 'samples': 3011712, 'steps': 15685, 'loss/train': 0.9766831398010254} 01/27/2022 10:36:21 - INFO - codeparrot_training - Step 15686: {'lr': 0.0004062340334572883, 'samples': 3011904, 'steps': 15686, 'loss/train': 0.020332693587988615} 01/27/2022 10:36:24 - INFO - codeparrot_training - Step 15687: {'lr': 0.0004062212593434634, 'samples': 3012096, 'steps': 15687, 'loss/train': 0.2760489508509636} 01/27/2022 10:36:27 - INFO - codeparrot_training - Step 15688: {'lr': 0.0004062084845604361, 'samples': 3012288, 'steps': 15688, 'loss/train': 0.10797799006104469} 01/27/2022 10:36:32 - INFO - codeparrot_training - Step 15689: {'lr': 0.00040619570910826135, 'samples': 3012480, 'steps': 15689, 'loss/train': 0.43655043840408325} 01/27/2022 10:36:35 - INFO - codeparrot_training - Step 15690: {'lr': 0.0004061829329869937, 'samples': 3012672, 'steps': 15690, 'loss/train': 1.0677664875984192} 01/27/2022 10:36:38 - INFO - codeparrot_training - Step 15691: {'lr': 
0.0004061701561966881, 'samples': 3012864, 'steps': 15691, 'loss/train': 0.8104136288166046} 01/27/2022 10:36:41 - INFO - codeparrot_training - Step 15692: {'lr': 0.000406157378737399, 'samples': 3013056, 'steps': 15692, 'loss/train': 1.7483943700790405} 01/27/2022 10:36:44 - INFO - codeparrot_training - Step 15693: {'lr': 0.00040614460060918136, 'samples': 3013248, 'steps': 15693, 'loss/train': 0.8786450028419495} 01/27/2022 10:36:47 - INFO - codeparrot_training - Step 15694: {'lr': 0.0004061318218120898, 'samples': 3013440, 'steps': 15694, 'loss/train': 0.9270368814468384} 01/27/2022 10:36:50 - INFO - codeparrot_training - Step 15695: {'lr': 0.000406119042346179, 'samples': 3013632, 'steps': 15695, 'loss/train': 0.8598174154758453} 01/27/2022 10:36:54 - INFO - codeparrot_training - Step 15696: {'lr': 0.0004061062622115039, 'samples': 3013824, 'steps': 15696, 'loss/train': 0.9367427229881287} 01/27/2022 10:36:57 - INFO - codeparrot_training - Step 15697: {'lr': 0.0004060934814081192, 'samples': 3014016, 'steps': 15697, 'loss/train': 1.5508900880813599} 01/27/2022 10:37:00 - INFO - codeparrot_training - Step 15698: {'lr': 0.00040608069993607954, 'samples': 3014208, 'steps': 15698, 'loss/train': 1.8156405687332153} 01/27/2022 10:37:04 - INFO - codeparrot_training - Step 15699: {'lr': 0.00040606791779543966, 'samples': 3014400, 'steps': 15699, 'loss/train': 1.9201408624649048} 01/27/2022 10:37:07 - INFO - codeparrot_training - Step 15700: {'lr': 0.00040605513498625443, 'samples': 3014592, 'steps': 15700, 'loss/train': 1.0708368122577667} 01/27/2022 10:37:11 - INFO - codeparrot_training - Step 15701: {'lr': 0.00040604235150857855, 'samples': 3014784, 'steps': 15701, 'loss/train': 1.0034924447536469} 01/27/2022 10:37:14 - INFO - codeparrot_training - Step 15702: {'lr': 0.00040602956736246677, 'samples': 3014976, 'steps': 15702, 'loss/train': 1.296532541513443} 01/27/2022 10:37:17 - INFO - codeparrot_training - Step 15703: {'lr': 0.00040601678254797394, 'samples': 3015168, 'steps': 15703, 'loss/train': 0.604518860578537} 01/27/2022 10:37:20 - INFO - codeparrot_training - Step 15704: {'lr': 0.00040600399706515466, 'samples': 3015360, 'steps': 15704, 'loss/train': 1.0621517300605774} 01/27/2022 10:37:23 - INFO - codeparrot_training - Step 15705: {'lr': 0.0004059912109140638, 'samples': 3015552, 'steps': 15705, 'loss/train': 0.3516403064131737} 01/27/2022 10:37:26 - INFO - codeparrot_training - Step 15706: {'lr': 0.00040597842409475615, 'samples': 3015744, 'steps': 15706, 'loss/train': 0.5540933310985565} 01/27/2022 10:37:30 - INFO - codeparrot_training - Step 15707: {'lr': 0.00040596563660728646, 'samples': 3015936, 'steps': 15707, 'loss/train': 0.9122962653636932} 01/27/2022 10:37:34 - INFO - codeparrot_training - Step 15708: {'lr': 0.00040595284845170956, 'samples': 3016128, 'steps': 15708, 'loss/train': 0.9515817761421204} 01/27/2022 10:37:38 - INFO - codeparrot_training - Step 15709: {'lr': 0.0004059400596280801, 'samples': 3016320, 'steps': 15709, 'loss/train': 0.9576432108879089} 01/27/2022 10:37:41 - INFO - codeparrot_training - Step 15710: {'lr': 0.00040592727013645297, 'samples': 3016512, 'steps': 15710, 'loss/train': 0.5693067312240601} 01/27/2022 10:37:44 - INFO - codeparrot_training - Step 15711: {'lr': 0.0004059144799768829, 'samples': 3016704, 'steps': 15711, 'loss/train': 0.9694055914878845} 01/27/2022 10:37:47 - INFO - codeparrot_training - Step 15712: {'lr': 0.00040590168914942477, 'samples': 3016896, 'steps': 15712, 'loss/train': 0.7975320518016815} 01/27/2022 10:37:50 - INFO - 
codeparrot_training - Step 15713: {'lr': 0.0004058888976541333, 'samples': 3017088, 'steps': 15713, 'loss/train': 1.1337738633155823} 01/27/2022 10:37:53 - INFO - codeparrot_training - Step 15714: {'lr': 0.00040587610549106326, 'samples': 3017280, 'steps': 15714, 'loss/train': 0.8402758240699768} 01/27/2022 10:37:56 - INFO - codeparrot_training - Step 15715: {'lr': 0.00040586331266026943, 'samples': 3017472, 'steps': 15715, 'loss/train': 0.16281845793128014} 01/27/2022 10:37:59 - INFO - codeparrot_training - Step 15716: {'lr': 0.0004058505191618067, 'samples': 3017664, 'steps': 15716, 'loss/train': 0.8250508904457092} 01/27/2022 10:38:05 - INFO - codeparrot_training - Step 15717: {'lr': 0.0004058377249957299, 'samples': 3017856, 'steps': 15717, 'loss/train': 1.0698883831501007} 01/27/2022 10:38:08 - INFO - codeparrot_training - Step 15718: {'lr': 0.0004058249301620937, 'samples': 3018048, 'steps': 15718, 'loss/train': 0.7564921975135803} 01/27/2022 10:38:11 - INFO - codeparrot_training - Step 15719: {'lr': 0.00040581213466095304, 'samples': 3018240, 'steps': 15719, 'loss/train': 0.8794682621955872} 01/27/2022 10:38:14 - INFO - codeparrot_training - Step 15720: {'lr': 0.0004057993384923626, 'samples': 3018432, 'steps': 15720, 'loss/train': 0.6597152799367905} 01/27/2022 10:38:17 - INFO - codeparrot_training - Step 15721: {'lr': 0.0004057865416563773, 'samples': 3018624, 'steps': 15721, 'loss/train': 1.000141203403473} 01/27/2022 10:38:20 - INFO - codeparrot_training - Step 15722: {'lr': 0.0004057737441530519, 'samples': 3018816, 'steps': 15722, 'loss/train': 0.8694716691970825} 01/27/2022 10:38:24 - INFO - codeparrot_training - Step 15723: {'lr': 0.0004057609459824412, 'samples': 3019008, 'steps': 15723, 'loss/train': 0.6742167323827744} 01/27/2022 10:38:27 - INFO - codeparrot_training - Step 15724: {'lr': 0.00040574814714460015, 'samples': 3019200, 'steps': 15724, 'loss/train': 0.8840144276618958} 01/27/2022 10:38:31 - INFO - codeparrot_training - Step 15725: {'lr': 0.0004057353476395835, 'samples': 3019392, 'steps': 15725, 'loss/train': 0.870579332113266} 01/27/2022 10:38:35 - INFO - codeparrot_training - Step 15726: {'lr': 0.00040572254746744607, 'samples': 3019584, 'steps': 15726, 'loss/train': 0.6343060433864594} 01/27/2022 10:38:38 - INFO - codeparrot_training - Step 15727: {'lr': 0.00040570974662824266, 'samples': 3019776, 'steps': 15727, 'loss/train': 0.5992527455091476} 01/27/2022 10:38:41 - INFO - codeparrot_training - Step 15728: {'lr': 0.00040569694512202815, 'samples': 3019968, 'steps': 15728, 'loss/train': 0.689862459897995} 01/27/2022 10:38:44 - INFO - codeparrot_training - Step 15729: {'lr': 0.00040568414294885736, 'samples': 3020160, 'steps': 15729, 'loss/train': 0.3691251575946808} 01/27/2022 10:38:47 - INFO - codeparrot_training - Step 15730: {'lr': 0.00040567134010878513, 'samples': 3020352, 'steps': 15730, 'loss/train': 0.6531042605638504} 01/27/2022 10:38:50 - INFO - codeparrot_training - Step 15731: {'lr': 0.00040565853660186633, 'samples': 3020544, 'steps': 15731, 'loss/train': 0.9442330598831177} 01/27/2022 10:38:53 - INFO - codeparrot_training - Step 15732: {'lr': 0.0004056457324281557, 'samples': 3020736, 'steps': 15732, 'loss/train': 1.929390549659729} 01/27/2022 10:38:57 - INFO - codeparrot_training - Step 15733: {'lr': 0.0004056329275877083, 'samples': 3020928, 'steps': 15733, 'loss/train': 1.1133512556552887} 01/27/2022 10:39:02 - INFO - codeparrot_training - Step 15734: {'lr': 0.00040562012208057886, 'samples': 3021120, 'steps': 15734, 'loss/train': 
0.7730762958526611} 01/27/2022 10:39:05 - INFO - codeparrot_training - Step 15735: {'lr': 0.0004056073159068222, 'samples': 3021312, 'steps': 15735, 'loss/train': 1.1235328316688538} 01/27/2022 10:39:08 - INFO - codeparrot_training - Step 15736: {'lr': 0.0004055945090664931, 'samples': 3021504, 'steps': 15736, 'loss/train': 0.594909131526947} 01/27/2022 10:39:11 - INFO - codeparrot_training - Step 15737: {'lr': 0.0004055817015596467, 'samples': 3021696, 'steps': 15737, 'loss/train': 0.6950836926698685} 01/27/2022 10:39:14 - INFO - codeparrot_training - Step 15738: {'lr': 0.00040556889338633754, 'samples': 3021888, 'steps': 15738, 'loss/train': 0.5858600735664368} 01/27/2022 10:39:17 - INFO - codeparrot_training - Step 15739: {'lr': 0.00040555608454662074, 'samples': 3022080, 'steps': 15739, 'loss/train': 1.0516335368156433} 01/27/2022 10:39:21 - INFO - codeparrot_training - Step 15740: {'lr': 0.00040554327504055106, 'samples': 3022272, 'steps': 15740, 'loss/train': 1.1074225008487701} 01/27/2022 10:39:24 - INFO - codeparrot_training - Step 15741: {'lr': 0.00040553046486818336, 'samples': 3022464, 'steps': 15741, 'loss/train': 0.6780335158109665} 01/27/2022 10:39:27 - INFO - codeparrot_training - Step 15742: {'lr': 0.0004055176540295725, 'samples': 3022656, 'steps': 15742, 'loss/train': 1.765175700187683} 01/27/2022 10:39:31 - INFO - codeparrot_training - Step 15743: {'lr': 0.00040550484252477347, 'samples': 3022848, 'steps': 15743, 'loss/train': 0.8886852264404297} 01/27/2022 10:39:34 - INFO - codeparrot_training - Step 15744: {'lr': 0.00040549203035384105, 'samples': 3023040, 'steps': 15744, 'loss/train': 0.5455527752637863} 01/27/2022 10:39:38 - INFO - codeparrot_training - Step 15745: {'lr': 0.0004054792175168301, 'samples': 3023232, 'steps': 15745, 'loss/train': 0.9437538385391235} 01/27/2022 10:39:41 - INFO - codeparrot_training - Step 15746: {'lr': 0.00040546640401379556, 'samples': 3023424, 'steps': 15746, 'loss/train': 0.8903069794178009} 01/27/2022 10:39:44 - INFO - codeparrot_training - Step 15747: {'lr': 0.0004054535898447924, 'samples': 3023616, 'steps': 15747, 'loss/train': 0.19941667467355728} 01/27/2022 10:39:47 - INFO - codeparrot_training - Step 15748: {'lr': 0.0004054407750098753, 'samples': 3023808, 'steps': 15748, 'loss/train': 0.7813836336135864} 01/27/2022 10:39:50 - INFO - codeparrot_training - Step 15749: {'lr': 0.0004054279595090994, 'samples': 3024000, 'steps': 15749, 'loss/train': 0.9351506531238556} 01/27/2022 10:39:53 - INFO - codeparrot_training - Step 15750: {'lr': 0.0004054151433425194, 'samples': 3024192, 'steps': 15750, 'loss/train': 1.1451847851276398} 01/27/2022 10:39:56 - INFO - codeparrot_training - Step 15751: {'lr': 0.00040540232651019027, 'samples': 3024384, 'steps': 15751, 'loss/train': 1.0219130516052246} 01/27/2022 10:40:01 - INFO - codeparrot_training - Step 15752: {'lr': 0.0004053895090121669, 'samples': 3024576, 'steps': 15752, 'loss/train': 1.0680803954601288} 01/27/2022 10:40:04 - INFO - codeparrot_training - Step 15753: {'lr': 0.00040537669084850426, 'samples': 3024768, 'steps': 15753, 'loss/train': 1.1061465740203857} 01/27/2022 10:40:07 - INFO - codeparrot_training - Step 15754: {'lr': 0.0004053638720192572, 'samples': 3024960, 'steps': 15754, 'loss/train': 0.2594103142619133} 01/27/2022 10:40:10 - INFO - codeparrot_training - Step 15755: {'lr': 0.00040535105252448067, 'samples': 3025152, 'steps': 15755, 'loss/train': 1.0529007017612457} 01/27/2022 10:40:13 - INFO - codeparrot_training - Step 15756: {'lr': 0.0004053382323642295, 
'samples': 3025344, 'steps': 15756, 'loss/train': 0.9301646947860718} 01/27/2022 10:40:17 - INFO - codeparrot_training - Step 15757: {'lr': 0.0004053254115385587, 'samples': 3025536, 'steps': 15757, 'loss/train': 0.6411663293838501} 01/27/2022 10:40:20 - INFO - codeparrot_training - Step 15758: {'lr': 0.00040531259004752317, 'samples': 3025728, 'steps': 15758, 'loss/train': 0.19054855406284332} 01/27/2022 10:40:23 - INFO - codeparrot_training - Step 15759: {'lr': 0.00040529976789117786, 'samples': 3025920, 'steps': 15759, 'loss/train': 0.8676028847694397} 01/27/2022 10:40:26 - INFO - codeparrot_training - Step 15760: {'lr': 0.0004052869450695776, 'samples': 3026112, 'steps': 15760, 'loss/train': 1.632610559463501} 01/27/2022 10:40:31 - INFO - codeparrot_training - Step 15761: {'lr': 0.00040527412158277744, 'samples': 3026304, 'steps': 15761, 'loss/train': 0.7356557697057724} 01/27/2022 10:40:34 - INFO - codeparrot_training - Step 15762: {'lr': 0.00040526129743083216, 'samples': 3026496, 'steps': 15762, 'loss/train': 0.627466544508934} 01/27/2022 10:40:38 - INFO - codeparrot_training - Step 15763: {'lr': 0.0004052484726137968, 'samples': 3026688, 'steps': 15763, 'loss/train': 1.2493381798267365} 01/27/2022 10:40:41 - INFO - codeparrot_training - Step 15764: {'lr': 0.00040523564713172634, 'samples': 3026880, 'steps': 15764, 'loss/train': 0.673240140080452} 01/27/2022 10:40:44 - INFO - codeparrot_training - Step 15765: {'lr': 0.0004052228209846756, 'samples': 3027072, 'steps': 15765, 'loss/train': 0.5794259458780289} 01/27/2022 10:40:47 - INFO - codeparrot_training - Step 15766: {'lr': 0.0004052099941726996, 'samples': 3027264, 'steps': 15766, 'loss/train': 1.0197400152683258} 01/27/2022 10:40:50 - INFO - codeparrot_training - Step 15767: {'lr': 0.0004051971666958533, 'samples': 3027456, 'steps': 15767, 'loss/train': 0.40519653260707855} 01/27/2022 10:40:53 - INFO - codeparrot_training - Step 15768: {'lr': 0.0004051843385541916, 'samples': 3027648, 'steps': 15768, 'loss/train': 0.778348982334137} 01/27/2022 10:40:56 - INFO - codeparrot_training - Step 15769: {'lr': 0.00040517150974776945, 'samples': 3027840, 'steps': 15769, 'loss/train': 0.9875755906105042} 01/27/2022 10:41:01 - INFO - codeparrot_training - Step 15770: {'lr': 0.00040515868027664185, 'samples': 3028032, 'steps': 15770, 'loss/train': 0.34746886789798737} 01/27/2022 10:41:04 - INFO - codeparrot_training - Step 15771: {'lr': 0.00040514585014086367, 'samples': 3028224, 'steps': 15771, 'loss/train': 0.8838028907775879} 01/27/2022 10:41:07 - INFO - codeparrot_training - Step 15772: {'lr': 0.00040513301934049005, 'samples': 3028416, 'steps': 15772, 'loss/train': 0.8548937737941742} 01/27/2022 10:41:10 - INFO - codeparrot_training - Step 15773: {'lr': 0.00040512018787557574, 'samples': 3028608, 'steps': 15773, 'loss/train': 0.8801071643829346} 01/27/2022 10:41:13 - INFO - codeparrot_training - Step 15774: {'lr': 0.0004051073557461759, 'samples': 3028800, 'steps': 15774, 'loss/train': 0.967812180519104} 01/27/2022 10:41:17 - INFO - codeparrot_training - Step 15775: {'lr': 0.00040509452295234527, 'samples': 3028992, 'steps': 15775, 'loss/train': 0.8506843149662018} 01/27/2022 10:41:20 - INFO - codeparrot_training - Step 15776: {'lr': 0.00040508168949413904, 'samples': 3029184, 'steps': 15776, 'loss/train': 0.692777082324028} 01/27/2022 10:41:23 - INFO - codeparrot_training - Step 15777: {'lr': 0.0004050688553716121, 'samples': 3029376, 'steps': 15777, 'loss/train': 0.8200711905956268} 01/27/2022 10:41:26 - INFO - codeparrot_training - 
Step 15778: {'lr': 0.0004050560205848194, 'samples': 3029568, 'steps': 15778, 'loss/train': 0.34250224381685257} 01/27/2022 10:41:31 - INFO - codeparrot_training - Step 15779: {'lr': 0.0004050431851338159, 'samples': 3029760, 'steps': 15779, 'loss/train': 1.7127978801727295} 01/27/2022 10:41:34 - INFO - codeparrot_training - Step 15780: {'lr': 0.00040503034901865666, 'samples': 3029952, 'steps': 15780, 'loss/train': 1.0358848571777344} 01/27/2022 10:41:37 - INFO - codeparrot_training - Step 15781: {'lr': 0.00040501751223939665, 'samples': 3030144, 'steps': 15781, 'loss/train': 0.7337314635515213} 01/27/2022 10:41:40 - INFO - codeparrot_training - Step 15782: {'lr': 0.00040500467479609084, 'samples': 3030336, 'steps': 15782, 'loss/train': 1.134171187877655} 01/27/2022 10:41:43 - INFO - codeparrot_training - Step 15783: {'lr': 0.00040499183668879415, 'samples': 3030528, 'steps': 15783, 'loss/train': 0.7894969582557678} 01/27/2022 10:41:46 - INFO - codeparrot_training - Step 15784: {'lr': 0.0004049789979175617, 'samples': 3030720, 'steps': 15784, 'loss/train': 0.7801816463470459} 01/27/2022 10:41:49 - INFO - codeparrot_training - Step 15785: {'lr': 0.00040496615848244845, 'samples': 3030912, 'steps': 15785, 'loss/train': 1.128368228673935} 01/27/2022 10:41:53 - INFO - codeparrot_training - Step 15786: {'lr': 0.00040495331838350933, 'samples': 3031104, 'steps': 15786, 'loss/train': 0.4084232300519943} 01/27/2022 10:41:57 - INFO - codeparrot_training - Step 15787: {'lr': 0.00040494047762079953, 'samples': 3031296, 'steps': 15787, 'loss/train': 1.103697806596756} 01/27/2022 10:42:00 - INFO - codeparrot_training - Step 15788: {'lr': 0.0004049276361943738, 'samples': 3031488, 'steps': 15788, 'loss/train': 1.1364565193653107} 01/27/2022 10:42:03 - INFO - codeparrot_training - Step 15789: {'lr': 0.00040491479410428735, 'samples': 3031680, 'steps': 15789, 'loss/train': 1.4359527826309204} 01/27/2022 10:42:06 - INFO - codeparrot_training - Step 15790: {'lr': 0.00040490195135059503, 'samples': 3031872, 'steps': 15790, 'loss/train': 0.6061565279960632} 01/27/2022 10:42:10 - INFO - codeparrot_training - Step 15791: {'lr': 0.000404889107933352, 'samples': 3032064, 'steps': 15791, 'loss/train': 0.290904276072979} 01/27/2022 10:42:13 - INFO - codeparrot_training - Step 15792: {'lr': 0.0004048762638526132, 'samples': 3032256, 'steps': 15792, 'loss/train': 0.8411537110805511} 01/27/2022 10:42:16 - INFO - codeparrot_training - Step 15793: {'lr': 0.0004048634191084336, 'samples': 3032448, 'steps': 15793, 'loss/train': 0.7670989036560059} 01/27/2022 10:42:19 - INFO - codeparrot_training - Step 15794: {'lr': 0.0004048505737008684, 'samples': 3032640, 'steps': 15794, 'loss/train': 0.3804671913385391} 01/27/2022 10:42:22 - INFO - codeparrot_training - Step 15795: {'lr': 0.0004048377276299724, 'samples': 3032832, 'steps': 15795, 'loss/train': 0.4056909531354904} 01/27/2022 10:42:27 - INFO - codeparrot_training - Step 15796: {'lr': 0.00040482488089580083, 'samples': 3033024, 'steps': 15796, 'loss/train': 0.9631420969963074} 01/27/2022 10:42:30 - INFO - codeparrot_training - Step 15797: {'lr': 0.00040481203349840864, 'samples': 3033216, 'steps': 15797, 'loss/train': 0.8404028713703156} 01/27/2022 10:42:34 - INFO - codeparrot_training - Step 15798: {'lr': 0.0004047991854378508, 'samples': 3033408, 'steps': 15798, 'loss/train': 0.8815054893493652} 01/27/2022 10:42:37 - INFO - codeparrot_training - Step 15799: {'lr': 0.00040478633671418244, 'samples': 3033600, 'steps': 15799, 'loss/train': 0.9557957947254181} 01/27/2022 
10:42:40 - INFO - codeparrot_training - Step 15800: {'lr': 0.00040477348732745853, 'samples': 3033792, 'steps': 15800, 'loss/train': 0.4363967031240463} 01/27/2022 10:42:43 - INFO - codeparrot_training - Step 15801: {'lr': 0.00040476063727773416, 'samples': 3033984, 'steps': 15801, 'loss/train': 1.3895502984523773} 01/27/2022 10:42:46 - INFO - codeparrot_training - Step 15802: {'lr': 0.0004047477865650644, 'samples': 3034176, 'steps': 15802, 'loss/train': 0.6610544621944427} 01/27/2022 10:42:49 - INFO - codeparrot_training - Step 15803: {'lr': 0.00040473493518950414, 'samples': 3034368, 'steps': 15803, 'loss/train': 0.7326797693967819} 01/27/2022 10:42:52 - INFO - codeparrot_training - Step 15804: {'lr': 0.00040472208315110866, 'samples': 3034560, 'steps': 15804, 'loss/train': 0.7310878783464432} 01/27/2022 10:42:57 - INFO - codeparrot_training - Step 15805: {'lr': 0.0004047092304499329, 'samples': 3034752, 'steps': 15805, 'loss/train': 0.9037556648254395} 01/27/2022 10:43:00 - INFO - codeparrot_training - Step 15806: {'lr': 0.0004046963770860319, 'samples': 3034944, 'steps': 15806, 'loss/train': 0.9054333865642548} 01/27/2022 10:43:03 - INFO - codeparrot_training - Step 15807: {'lr': 0.0004046835230594608, 'samples': 3035136, 'steps': 15807, 'loss/train': 0.7851873636245728} 01/27/2022 10:43:06 - INFO - codeparrot_training - Step 15808: {'lr': 0.0004046706683702744, 'samples': 3035328, 'steps': 15808, 'loss/train': 1.018451303243637} 01/27/2022 10:43:09 - INFO - codeparrot_training - Step 15809: {'lr': 0.0004046578130185282, 'samples': 3035520, 'steps': 15809, 'loss/train': 0.9557715654373169} 01/27/2022 10:43:13 - INFO - codeparrot_training - Step 15810: {'lr': 0.00040464495700427694, 'samples': 3035712, 'steps': 15810, 'loss/train': 0.4876471906900406} 01/27/2022 10:43:16 - INFO - codeparrot_training - Step 15811: {'lr': 0.0004046321003275759, 'samples': 3035904, 'steps': 15811, 'loss/train': 0.900312602519989} 01/27/2022 10:43:19 - INFO - codeparrot_training - Step 15812: {'lr': 0.00040461924298847987, 'samples': 3036096, 'steps': 15812, 'loss/train': 0.43594689667224884} 01/27/2022 10:43:25 - INFO - codeparrot_training - Step 15813: {'lr': 0.0004046063849870442, 'samples': 3036288, 'steps': 15813, 'loss/train': 0.44287633895874023} 01/27/2022 10:43:28 - INFO - codeparrot_training - Step 15814: {'lr': 0.00040459352632332387, 'samples': 3036480, 'steps': 15814, 'loss/train': 0.8975049555301666} 01/27/2022 10:43:31 - INFO - codeparrot_training - Step 15815: {'lr': 0.0004045806669973739, 'samples': 3036672, 'steps': 15815, 'loss/train': 0.8538034558296204} 01/27/2022 10:43:34 - INFO - codeparrot_training - Step 15816: {'lr': 0.00040456780700924956, 'samples': 3036864, 'steps': 15816, 'loss/train': 1.7255841493606567} 01/27/2022 10:43:37 - INFO - codeparrot_training - Step 15817: {'lr': 0.0004045549463590057, 'samples': 3037056, 'steps': 15817, 'loss/train': 1.6329957246780396} 01/27/2022 10:43:40 - INFO - codeparrot_training - Step 15818: {'lr': 0.0004045420850466975, 'samples': 3037248, 'steps': 15818, 'loss/train': 0.5929710119962692} 01/27/2022 10:43:43 - INFO - codeparrot_training - Step 15819: {'lr': 0.00040452922307238016, 'samples': 3037440, 'steps': 15819, 'loss/train': 0.8172426223754883} 01/27/2022 10:43:47 - INFO - codeparrot_training - Step 15820: {'lr': 0.00040451636043610875, 'samples': 3037632, 'steps': 15820, 'loss/train': 0.7419150620698929} 01/27/2022 10:43:50 - INFO - codeparrot_training - Step 15821: {'lr': 0.0004045034971379382, 'samples': 3037824, 'steps': 15821, 
'loss/train': 0.9539506137371063} 01/27/2022 10:43:53 - INFO - codeparrot_training - Step 15822: {'lr': 0.0004044906331779238, 'samples': 3038016, 'steps': 15822, 'loss/train': 0.0652850791811943} 01/27/2022 10:43:58 - INFO - codeparrot_training - Step 15823: {'lr': 0.00040447776855612053, 'samples': 3038208, 'steps': 15823, 'loss/train': 0.44844697415828705} 01/27/2022 10:44:01 - INFO - codeparrot_training - Step 15824: {'lr': 0.0004044649032725836, 'samples': 3038400, 'steps': 15824, 'loss/train': 0.5497914254665375} 01/27/2022 10:44:04 - INFO - codeparrot_training - Step 15825: {'lr': 0.000404452037327368, 'samples': 3038592, 'steps': 15825, 'loss/train': 0.7260218113660812} 01/27/2022 10:44:07 - INFO - codeparrot_training - Step 15826: {'lr': 0.00040443917072052906, 'samples': 3038784, 'steps': 15826, 'loss/train': 0.7065645754337311} 01/27/2022 10:44:10 - INFO - codeparrot_training - Step 15827: {'lr': 0.0004044263034521216, 'samples': 3038976, 'steps': 15827, 'loss/train': 0.4435698688030243} 01/27/2022 10:44:13 - INFO - codeparrot_training - Step 15828: {'lr': 0.000404413435522201, 'samples': 3039168, 'steps': 15828, 'loss/train': 1.0625162422657013} 01/27/2022 10:44:16 - INFO - codeparrot_training - Step 15829: {'lr': 0.00040440056693082224, 'samples': 3039360, 'steps': 15829, 'loss/train': 0.3594801798462868} 01/27/2022 10:44:20 - INFO - codeparrot_training - Step 15830: {'lr': 0.0004043876976780404, 'samples': 3039552, 'steps': 15830, 'loss/train': 0.857306957244873} 01/27/2022 10:44:23 - INFO - codeparrot_training - Step 15831: {'lr': 0.0004043748277639108, 'samples': 3039744, 'steps': 15831, 'loss/train': 0.995802104473114} 01/27/2022 10:44:27 - INFO - codeparrot_training - Step 15832: {'lr': 0.0004043619571884884, 'samples': 3039936, 'steps': 15832, 'loss/train': 0.7304923832416534} 01/27/2022 10:44:30 - INFO - codeparrot_training - Step 15833: {'lr': 0.0004043490859518284, 'samples': 3040128, 'steps': 15833, 'loss/train': 0.8303113281726837} 01/27/2022 10:44:34 - INFO - codeparrot_training - Step 15834: {'lr': 0.0004043362140539859, 'samples': 3040320, 'steps': 15834, 'loss/train': 0.7152265012264252} 01/27/2022 10:44:37 - INFO - codeparrot_training - Step 15835: {'lr': 0.00040432334149501613, 'samples': 3040512, 'steps': 15835, 'loss/train': 0.9392289519309998} 01/27/2022 10:44:40 - INFO - codeparrot_training - Step 15836: {'lr': 0.00040431046827497415, 'samples': 3040704, 'steps': 15836, 'loss/train': 0.7766838669776917} 01/27/2022 10:44:43 - INFO - codeparrot_training - Step 15837: {'lr': 0.00040429759439391513, 'samples': 3040896, 'steps': 15837, 'loss/train': 0.7847301363945007} 01/27/2022 10:44:46 - INFO - codeparrot_training - Step 15838: {'lr': 0.00040428471985189416, 'samples': 3041088, 'steps': 15838, 'loss/train': 1.1564498841762543} 01/27/2022 10:44:49 - INFO - codeparrot_training - Step 15839: {'lr': 0.0004042718446489665, 'samples': 3041280, 'steps': 15839, 'loss/train': 0.35339880734682083} 01/27/2022 10:44:54 - INFO - codeparrot_training - Step 15840: {'lr': 0.0004042589687851872, 'samples': 3041472, 'steps': 15840, 'loss/train': 0.5635401606559753} 01/27/2022 10:44:57 - INFO - codeparrot_training - Step 15841: {'lr': 0.00040424609226061146, 'samples': 3041664, 'steps': 15841, 'loss/train': 0.2023550570011139} 01/27/2022 10:45:01 - INFO - codeparrot_training - Step 15842: {'lr': 0.0004042332150752944, 'samples': 3041856, 'steps': 15842, 'loss/train': 1.5135743021965027} 01/27/2022 10:45:04 - INFO - codeparrot_training - Step 15843: {'lr': 
0.0004042203372292913, 'samples': 3042048, 'steps': 15843, 'loss/train': 0.9303327798843384} 01/27/2022 10:45:07 - INFO - codeparrot_training - Step 15844: {'lr': 0.00040420745872265726, 'samples': 3042240, 'steps': 15844, 'loss/train': 1.449015587568283} 01/27/2022 10:45:10 - INFO - codeparrot_training - Step 15845: {'lr': 0.0004041945795554474, 'samples': 3042432, 'steps': 15845, 'loss/train': 0.7997302114963531} 01/27/2022 10:45:13 - INFO - codeparrot_training - Step 15846: {'lr': 0.0004041816997277169, 'samples': 3042624, 'steps': 15846, 'loss/train': 1.0357554852962494} 01/27/2022 10:45:17 - INFO - codeparrot_training - Step 15847: {'lr': 0.000404168819239521, 'samples': 3042816, 'steps': 15847, 'loss/train': 1.1277852952480316} 01/27/2022 10:45:20 - INFO - codeparrot_training - Step 15848: {'lr': 0.0004041559380909148, 'samples': 3043008, 'steps': 15848, 'loss/train': 0.5053092241287231} 01/27/2022 10:45:24 - INFO - codeparrot_training - Step 15849: {'lr': 0.00040414305628195347, 'samples': 3043200, 'steps': 15849, 'loss/train': 1.7235082983970642} 01/27/2022 10:45:27 - INFO - codeparrot_training - Step 15850: {'lr': 0.00040413017381269237, 'samples': 3043392, 'steps': 15850, 'loss/train': 0.8440417349338531} 01/27/2022 10:45:30 - INFO - codeparrot_training - Step 15851: {'lr': 0.00040411729068318635, 'samples': 3043584, 'steps': 15851, 'loss/train': 0.8654311001300812} 01/27/2022 10:45:34 - INFO - codeparrot_training - Step 15852: {'lr': 0.0004041044068934909, 'samples': 3043776, 'steps': 15852, 'loss/train': 0.9661687016487122} 01/27/2022 10:45:37 - INFO - codeparrot_training - Step 15853: {'lr': 0.00040409152244366117, 'samples': 3043968, 'steps': 15853, 'loss/train': 0.6278281956911087} 01/27/2022 10:45:40 - INFO - codeparrot_training - Step 15854: {'lr': 0.00040407863733375217, 'samples': 3044160, 'steps': 15854, 'loss/train': 1.0331066250801086} 01/27/2022 10:45:43 - INFO - codeparrot_training - Step 15855: {'lr': 0.0004040657515638193, 'samples': 3044352, 'steps': 15855, 'loss/train': 0.7883612215518951} 01/27/2022 10:45:46 - INFO - codeparrot_training - Step 15856: {'lr': 0.0004040528651339176, 'samples': 3044544, 'steps': 15856, 'loss/train': 0.6156122535467148} 01/27/2022 10:45:49 - INFO - codeparrot_training - Step 15857: {'lr': 0.00040403997804410244, 'samples': 3044736, 'steps': 15857, 'loss/train': 0.5855689197778702} 01/27/2022 10:45:54 - INFO - codeparrot_training - Step 15858: {'lr': 0.00040402709029442883, 'samples': 3044928, 'steps': 15858, 'loss/train': 0.6567723602056503} 01/27/2022 10:45:57 - INFO - codeparrot_training - Step 15859: {'lr': 0.0004040142018849521, 'samples': 3045120, 'steps': 15859, 'loss/train': 0.9871104061603546} 01/27/2022 10:46:00 - INFO - codeparrot_training - Step 15860: {'lr': 0.0004040013128157275, 'samples': 3045312, 'steps': 15860, 'loss/train': 0.8779299259185791} 01/27/2022 10:46:03 - INFO - codeparrot_training - Step 15861: {'lr': 0.0004039884230868101, 'samples': 3045504, 'steps': 15861, 'loss/train': 1.5136545896530151} 01/27/2022 10:46:06 - INFO - codeparrot_training - Step 15862: {'lr': 0.0004039755326982552, 'samples': 3045696, 'steps': 15862, 'loss/train': 0.8249252736568451} 01/27/2022 10:46:09 - INFO - codeparrot_training - Step 15863: {'lr': 0.000403962641650118, 'samples': 3045888, 'steps': 15863, 'loss/train': 1.032995492219925} 01/27/2022 10:46:12 - INFO - codeparrot_training - Step 15864: {'lr': 0.0004039497499424538, 'samples': 3046080, 'steps': 15864, 'loss/train': 1.024817705154419} 01/27/2022 10:46:16 - INFO - 
codeparrot_training - Step 15865: {'lr': 0.00040393685757531776, 'samples': 3046272, 'steps': 15865, 'loss/train': 0.7738977670669556} 01/27/2022 10:46:19 - INFO - codeparrot_training - Step 15866: {'lr': 0.000403923964548765, 'samples': 3046464, 'steps': 15866, 'loss/train': 0.737411767244339} 01/27/2022 10:46:24 - INFO - codeparrot_training - Step 15867: {'lr': 0.0004039110708628509, 'samples': 3046656, 'steps': 15867, 'loss/train': 0.4975927323102951} 01/27/2022 10:46:27 - INFO - codeparrot_training - Step 15868: {'lr': 0.00040389817651763073, 'samples': 3046848, 'steps': 15868, 'loss/train': 0.6608000546693802} 01/27/2022 10:46:30 - INFO - codeparrot_training - Step 15869: {'lr': 0.0004038852815131595, 'samples': 3047040, 'steps': 15869, 'loss/train': 1.15775665640831} 01/27/2022 10:46:33 - INFO - codeparrot_training - Step 15870: {'lr': 0.0004038723858494927, 'samples': 3047232, 'steps': 15870, 'loss/train': 0.8909288048744202} 01/27/2022 10:46:37 - INFO - codeparrot_training - Step 15871: {'lr': 0.00040385948952668537, 'samples': 3047424, 'steps': 15871, 'loss/train': 0.4548579901456833} 01/27/2022 10:46:40 - INFO - codeparrot_training - Step 15872: {'lr': 0.0004038465925447929, 'samples': 3047616, 'steps': 15872, 'loss/train': 1.8405178785324097} 01/27/2022 10:46:43 - INFO - codeparrot_training - Step 15873: {'lr': 0.00040383369490387043, 'samples': 3047808, 'steps': 15873, 'loss/train': 0.8154062032699585} 01/27/2022 10:46:46 - INFO - codeparrot_training - Step 15874: {'lr': 0.0004038207966039733, 'samples': 3048000, 'steps': 15874, 'loss/train': 1.0257627367973328} 01/27/2022 10:46:49 - INFO - codeparrot_training - Step 15875: {'lr': 0.00040380789764515667, 'samples': 3048192, 'steps': 15875, 'loss/train': 0.5186170041561127} 01/27/2022 10:46:53 - INFO - codeparrot_training - Step 15876: {'lr': 0.0004037949980274759, 'samples': 3048384, 'steps': 15876, 'loss/train': 0.7927306294441223} 01/27/2022 10:46:57 - INFO - codeparrot_training - Step 15877: {'lr': 0.0004037820977509862, 'samples': 3048576, 'steps': 15877, 'loss/train': 1.4380521476268768} 01/27/2022 10:47:00 - INFO - codeparrot_training - Step 15878: {'lr': 0.00040376919681574285, 'samples': 3048768, 'steps': 15878, 'loss/train': 1.3119031190872192} 01/27/2022 10:47:03 - INFO - codeparrot_training - Step 15879: {'lr': 0.000403756295221801, 'samples': 3048960, 'steps': 15879, 'loss/train': 0.5754084438085556} 01/27/2022 10:47:06 - INFO - codeparrot_training - Step 15880: {'lr': 0.00040374339296921606, 'samples': 3049152, 'steps': 15880, 'loss/train': 0.837433934211731} 01/27/2022 10:47:09 - INFO - codeparrot_training - Step 15881: {'lr': 0.00040373049005804323, 'samples': 3049344, 'steps': 15881, 'loss/train': 0.7528701424598694} 01/27/2022 10:47:12 - INFO - codeparrot_training - Step 15882: {'lr': 0.00040371758648833776, 'samples': 3049536, 'steps': 15882, 'loss/train': 0.8456114530563354} 01/27/2022 10:47:15 - INFO - codeparrot_training - Step 15883: {'lr': 0.00040370468226015507, 'samples': 3049728, 'steps': 15883, 'loss/train': 1.1196337938308716} 01/27/2022 10:47:20 - INFO - codeparrot_training - Step 15884: {'lr': 0.0004036917773735502, 'samples': 3049920, 'steps': 15884, 'loss/train': 0.09344386495649815} 01/27/2022 10:47:23 - INFO - codeparrot_training - Step 15885: {'lr': 0.00040367887182857866, 'samples': 3050112, 'steps': 15885, 'loss/train': 0.16653797775506973} 01/27/2022 10:47:26 - INFO - codeparrot_training - Step 15886: {'lr': 0.00040366596562529554, 'samples': 3050304, 'steps': 15886, 'loss/train': 
0.8097220659255981} 01/27/2022 10:47:29 - INFO - codeparrot_training - Step 15887: {'lr': 0.00040365305876375636, 'samples': 3050496, 'steps': 15887, 'loss/train': 1.0628576874732971} 01/27/2022 10:47:32 - INFO - codeparrot_training - Step 15888: {'lr': 0.0004036401512440161, 'samples': 3050688, 'steps': 15888, 'loss/train': 0.9778043031692505} 01/27/2022 10:47:36 - INFO - codeparrot_training - Step 15889: {'lr': 0.0004036272430661303, 'samples': 3050880, 'steps': 15889, 'loss/train': 0.8734466135501862} 01/27/2022 10:47:39 - INFO - codeparrot_training - Step 15890: {'lr': 0.0004036143342301542, 'samples': 3051072, 'steps': 15890, 'loss/train': 1.4610282182693481} 01/27/2022 10:47:42 - INFO - codeparrot_training - Step 15891: {'lr': 0.000403601424736143, 'samples': 3051264, 'steps': 15891, 'loss/train': 1.32945317029953} 01/27/2022 10:47:45 - INFO - codeparrot_training - Step 15892: {'lr': 0.0004035885145841521, 'samples': 3051456, 'steps': 15892, 'loss/train': 0.22864260524511337} 01/27/2022 10:47:50 - INFO - codeparrot_training - Step 15893: {'lr': 0.00040357560377423675, 'samples': 3051648, 'steps': 15893, 'loss/train': 0.6869247257709503} 01/27/2022 10:47:53 - INFO - codeparrot_training - Step 15894: {'lr': 0.0004035626923064524, 'samples': 3051840, 'steps': 15894, 'loss/train': 0.8503405451774597} 01/27/2022 10:47:56 - INFO - codeparrot_training - Step 15895: {'lr': 0.00040354978018085407, 'samples': 3052032, 'steps': 15895, 'loss/train': 0.8380831182003021} 01/27/2022 10:48:00 - INFO - codeparrot_training - Step 15896: {'lr': 0.00040353686739749733, 'samples': 3052224, 'steps': 15896, 'loss/train': 1.433519035577774} 01/27/2022 10:48:03 - INFO - codeparrot_training - Step 15897: {'lr': 0.00040352395395643737, 'samples': 3052416, 'steps': 15897, 'loss/train': 1.0187241733074188} 01/27/2022 10:48:06 - INFO - codeparrot_training - Step 15898: {'lr': 0.00040351103985772964, 'samples': 3052608, 'steps': 15898, 'loss/train': 0.711328312754631} 01/27/2022 10:48:09 - INFO - codeparrot_training - Step 15899: {'lr': 0.00040349812510142923, 'samples': 3052800, 'steps': 15899, 'loss/train': 1.03606715798378} 01/27/2022 10:48:12 - INFO - codeparrot_training - Step 15900: {'lr': 0.0004034852096875916, 'samples': 3052992, 'steps': 15900, 'loss/train': 0.8503400087356567} 01/27/2022 10:48:15 - INFO - codeparrot_training - Step 15901: {'lr': 0.0004034722936162721, 'samples': 3053184, 'steps': 15901, 'loss/train': 0.8401461839675903} 01/27/2022 10:48:20 - INFO - codeparrot_training - Step 15902: {'lr': 0.00040345937688752607, 'samples': 3053376, 'steps': 15902, 'loss/train': 0.9001806378364563} 01/27/2022 10:48:23 - INFO - codeparrot_training - Step 15903: {'lr': 0.0004034464595014088, 'samples': 3053568, 'steps': 15903, 'loss/train': 1.013807773590088} 01/27/2022 10:48:26 - INFO - codeparrot_training - Step 15904: {'lr': 0.00040343354145797554, 'samples': 3053760, 'steps': 15904, 'loss/train': 0.7260244488716125} 01/27/2022 10:48:29 - INFO - codeparrot_training - Step 15905: {'lr': 0.0004034206227572818, 'samples': 3053952, 'steps': 15905, 'loss/train': 0.5656226724386215} 01/27/2022 10:48:32 - INFO - codeparrot_training - Step 15906: {'lr': 0.0004034077033993828, 'samples': 3054144, 'steps': 15906, 'loss/train': 0.8087723851203918} 01/27/2022 10:48:35 - INFO - codeparrot_training - Step 15907: {'lr': 0.00040339478338433386, 'samples': 3054336, 'steps': 15907, 'loss/train': 0.606577679514885} 01/27/2022 10:48:39 - INFO - codeparrot_training - Step 15908: {'lr': 0.0004033818627121904, 'samples': 
3054528, 'steps': 15908, 'loss/train': 0.5134348422288895} 01/27/2022 10:48:42 - INFO - codeparrot_training - Step 15909: {'lr': 0.00040336894138300777, 'samples': 3054720, 'steps': 15909, 'loss/train': 0.7509387731552124} 01/27/2022 10:48:46 - INFO - codeparrot_training - Step 15910: {'lr': 0.0004033560193968413, 'samples': 3054912, 'steps': 15910, 'loss/train': 0.8957132399082184} 01/27/2022 10:48:49 - INFO - codeparrot_training - Step 15911: {'lr': 0.00040334309675374636, 'samples': 3055104, 'steps': 15911, 'loss/train': 0.29854072630405426} 01/27/2022 10:48:53 - INFO - codeparrot_training - Step 15912: {'lr': 0.0004033301734537782, 'samples': 3055296, 'steps': 15912, 'loss/train': 0.595838651061058} 01/27/2022 10:48:56 - INFO - codeparrot_training - Step 15913: {'lr': 0.0004033172494969923, 'samples': 3055488, 'steps': 15913, 'loss/train': 1.4226442873477936} 01/27/2022 10:48:59 - INFO - codeparrot_training - Step 15914: {'lr': 0.000403304324883444, 'samples': 3055680, 'steps': 15914, 'loss/train': 1.1866225898265839} 01/27/2022 10:49:02 - INFO - codeparrot_training - Step 15915: {'lr': 0.00040329139961318863, 'samples': 3055872, 'steps': 15915, 'loss/train': 0.6561603248119354} 01/27/2022 10:49:05 - INFO - codeparrot_training - Step 15916: {'lr': 0.00040327847368628163, 'samples': 3056064, 'steps': 15916, 'loss/train': 0.8509580790996552} 01/27/2022 10:49:08 - INFO - codeparrot_training - Step 15917: {'lr': 0.0004032655471027783, 'samples': 3056256, 'steps': 15917, 'loss/train': 0.4859050065279007} 01/27/2022 10:49:11 - INFO - codeparrot_training - Step 15918: {'lr': 0.000403252619862734, 'samples': 3056448, 'steps': 15918, 'loss/train': 0.5695238560438156} 01/27/2022 10:49:17 - INFO - codeparrot_training - Step 15919: {'lr': 0.0004032396919662041, 'samples': 3056640, 'steps': 15919, 'loss/train': 1.5581914186477661} 01/27/2022 10:49:20 - INFO - codeparrot_training - Step 15920: {'lr': 0.00040322676341324415, 'samples': 3056832, 'steps': 15920, 'loss/train': 0.9792717397212982} 01/27/2022 10:49:23 - INFO - codeparrot_training - Step 15921: {'lr': 0.0004032138342039093, 'samples': 3057024, 'steps': 15921, 'loss/train': 0.03607937414199114} 01/27/2022 10:49:26 - INFO - codeparrot_training - Step 15922: {'lr': 0.0004032009043382551, 'samples': 3057216, 'steps': 15922, 'loss/train': 1.1417209804058075} 01/27/2022 10:49:29 - INFO - codeparrot_training - Step 15923: {'lr': 0.0004031879738163368, 'samples': 3057408, 'steps': 15923, 'loss/train': 0.9061854779720306} 01/27/2022 10:49:32 - INFO - codeparrot_training - Step 15924: {'lr': 0.00040317504263820994, 'samples': 3057600, 'steps': 15924, 'loss/train': 1.0217056274414062} 01/27/2022 10:49:35 - INFO - codeparrot_training - Step 15925: {'lr': 0.0004031621108039298, 'samples': 3057792, 'steps': 15925, 'loss/train': 0.7702926993370056} 01/27/2022 10:49:39 - INFO - codeparrot_training - Step 15926: {'lr': 0.0004031491783135518, 'samples': 3057984, 'steps': 15926, 'loss/train': 0.46267475187778473} 01/27/2022 10:49:42 - INFO - codeparrot_training - Step 15927: {'lr': 0.0004031362451671314, 'samples': 3058176, 'steps': 15927, 'loss/train': 0.6191606372594833} 01/27/2022 10:49:46 - INFO - codeparrot_training - Step 15928: {'lr': 0.00040312331136472385, 'samples': 3058368, 'steps': 15928, 'loss/train': 0.7884693145751953} 01/27/2022 10:49:49 - INFO - codeparrot_training - Step 15929: {'lr': 0.00040311037690638477, 'samples': 3058560, 'steps': 15929, 'loss/train': 1.034155011177063} 01/27/2022 10:49:53 - INFO - codeparrot_training - Step 15930: 
{'lr': 0.00040309744179216936, 'samples': 3058752, 'steps': 15930, 'loss/train': 0.7509830296039581} 01/27/2022 10:49:56 - INFO - codeparrot_training - Step 15931: {'lr': 0.0004030845060221332, 'samples': 3058944, 'steps': 15931, 'loss/train': 0.5110121816396713} 01/27/2022 10:49:59 - INFO - codeparrot_training - Step 15932: {'lr': 0.00040307156959633154, 'samples': 3059136, 'steps': 15932, 'loss/train': 0.572925791144371} 01/27/2022 10:50:02 - INFO - codeparrot_training - Step 15933: {'lr': 0.00040305863251482, 'samples': 3059328, 'steps': 15933, 'loss/train': 0.8698444962501526} 01/27/2022 10:50:05 - INFO - codeparrot_training - Step 15934: {'lr': 0.00040304569477765375, 'samples': 3059520, 'steps': 15934, 'loss/train': 0.621804803609848} 01/27/2022 10:50:08 - INFO - codeparrot_training - Step 15935: {'lr': 0.0004030327563848885, 'samples': 3059712, 'steps': 15935, 'loss/train': 0.5994575768709183} 01/27/2022 10:50:12 - INFO - codeparrot_training - Step 15936: {'lr': 0.00040301981733657934, 'samples': 3059904, 'steps': 15936, 'loss/train': 0.7923910617828369} 01/27/2022 10:50:16 - INFO - codeparrot_training - Step 15937: {'lr': 0.00040300687763278196, 'samples': 3060096, 'steps': 15937, 'loss/train': 0.9408222734928131} 01/27/2022 10:50:19 - INFO - codeparrot_training - Step 15938: {'lr': 0.0004029939372735517, 'samples': 3060288, 'steps': 15938, 'loss/train': 1.0263008773326874} 01/27/2022 10:50:22 - INFO - codeparrot_training - Step 15939: {'lr': 0.000402980996258944, 'samples': 3060480, 'steps': 15939, 'loss/train': 0.7715176641941071} 01/27/2022 10:50:25 - INFO - codeparrot_training - Step 15940: {'lr': 0.00040296805458901427, 'samples': 3060672, 'steps': 15940, 'loss/train': 0.8964411914348602} 01/27/2022 10:50:29 - INFO - codeparrot_training - Step 15941: {'lr': 0.0004029551122638179, 'samples': 3060864, 'steps': 15941, 'loss/train': 0.034213246777653694} 01/27/2022 10:50:32 - INFO - codeparrot_training - Step 15942: {'lr': 0.0004029421692834105, 'samples': 3061056, 'steps': 15942, 'loss/train': 0.7306340932846069} 01/27/2022 10:50:35 - INFO - codeparrot_training - Step 15943: {'lr': 0.0004029292256478474, 'samples': 3061248, 'steps': 15943, 'loss/train': 0.914590448141098} 01/27/2022 10:50:38 - INFO - codeparrot_training - Step 15944: {'lr': 0.00040291628135718404, 'samples': 3061440, 'steps': 15944, 'loss/train': 0.29969096928834915} 01/27/2022 10:50:41 - INFO - codeparrot_training - Step 15945: {'lr': 0.0004029033364114759, 'samples': 3061632, 'steps': 15945, 'loss/train': 0.967570424079895} 01/27/2022 10:50:46 - INFO - codeparrot_training - Step 15946: {'lr': 0.00040289039081077837, 'samples': 3061824, 'steps': 15946, 'loss/train': 0.06510771811008453} 01/27/2022 10:50:49 - INFO - codeparrot_training - Step 15947: {'lr': 0.00040287744455514703, 'samples': 3062016, 'steps': 15947, 'loss/train': 0.5416173934936523} 01/27/2022 10:50:53 - INFO - codeparrot_training - Step 15948: {'lr': 0.00040286449764463715, 'samples': 3062208, 'steps': 15948, 'loss/train': 1.2154988944530487} 01/27/2022 10:50:56 - INFO - codeparrot_training - Step 15949: {'lr': 0.0004028515500793044, 'samples': 3062400, 'steps': 15949, 'loss/train': 0.6943444758653641} 01/27/2022 10:50:59 - INFO - codeparrot_training - Step 15950: {'lr': 0.0004028386018592041, 'samples': 3062592, 'steps': 15950, 'loss/train': 0.7367964684963226} 01/27/2022 10:51:02 - INFO - codeparrot_training - Step 15951: {'lr': 0.0004028256529843918, 'samples': 3062784, 'steps': 15951, 'loss/train': 0.1951119303703308} 01/27/2022 10:51:05 - 
INFO - codeparrot_training - Step 15952: {'lr': 0.00040281270345492295, 'samples': 3062976, 'steps': 15952, 'loss/train': 0.7404391765594482} 01/27/2022 10:51:08 - INFO - codeparrot_training - Step 15953: {'lr': 0.00040279975327085294, 'samples': 3063168, 'steps': 15953, 'loss/train': 1.019188016653061} 01/27/2022 10:51:13 - INFO - codeparrot_training - Step 15954: {'lr': 0.00040278680243223733, 'samples': 3063360, 'steps': 15954, 'loss/train': 1.4263292849063873} 01/27/2022 10:51:16 - INFO - codeparrot_training - Step 15955: {'lr': 0.00040277385093913154, 'samples': 3063552, 'steps': 15955, 'loss/train': 0.7635740339756012} 01/27/2022 10:51:19 - INFO - codeparrot_training - Step 15956: {'lr': 0.0004027608987915912, 'samples': 3063744, 'steps': 15956, 'loss/train': 0.9450821578502655} 01/27/2022 10:51:22 - INFO - codeparrot_training - Step 15957: {'lr': 0.0004027479459896716, 'samples': 3063936, 'steps': 15957, 'loss/train': 0.877367377281189} 01/27/2022 10:51:25 - INFO - codeparrot_training - Step 15958: {'lr': 0.0004027349925334282, 'samples': 3064128, 'steps': 15958, 'loss/train': 0.9186590015888214} 01/27/2022 10:51:29 - INFO - codeparrot_training - Step 15959: {'lr': 0.00040272203842291676, 'samples': 3064320, 'steps': 15959, 'loss/train': 0.5872037708759308} 01/27/2022 10:51:32 - INFO - codeparrot_training - Step 15960: {'lr': 0.00040270908365819247, 'samples': 3064512, 'steps': 15960, 'loss/train': 0.741337850689888} 01/27/2022 10:51:35 - INFO - codeparrot_training - Step 15961: {'lr': 0.000402696128239311, 'samples': 3064704, 'steps': 15961, 'loss/train': 0.7080902606248856} 01/27/2022 10:51:38 - INFO - codeparrot_training - Step 15962: {'lr': 0.00040268317216632783, 'samples': 3064896, 'steps': 15962, 'loss/train': 1.0211680233478546} 01/27/2022 10:51:43 - INFO - codeparrot_training - Step 15963: {'lr': 0.0004026702154392984, 'samples': 3065088, 'steps': 15963, 'loss/train': 1.3270134329795837} 01/27/2022 10:51:46 - INFO - codeparrot_training - Step 15964: {'lr': 0.0004026572580582783, 'samples': 3065280, 'steps': 15964, 'loss/train': 0.8921737968921661} 01/27/2022 10:51:49 - INFO - codeparrot_training - Step 15965: {'lr': 0.000402644300023323, 'samples': 3065472, 'steps': 15965, 'loss/train': 1.315031111240387} 01/27/2022 10:51:53 - INFO - codeparrot_training - Step 15966: {'lr': 0.0004026313413344879, 'samples': 3065664, 'steps': 15966, 'loss/train': 0.5468924939632416} 01/27/2022 10:51:56 - INFO - codeparrot_training - Step 15967: {'lr': 0.0004026183819918286, 'samples': 3065856, 'steps': 15967, 'loss/train': 0.7950821220874786} 01/27/2022 10:51:59 - INFO - codeparrot_training - Step 15968: {'lr': 0.00040260542199540064, 'samples': 3066048, 'steps': 15968, 'loss/train': 1.3473446667194366} 01/27/2022 10:52:02 - INFO - codeparrot_training - Step 15969: {'lr': 0.00040259246134525953, 'samples': 3066240, 'steps': 15969, 'loss/train': 0.5939890891313553} 01/27/2022 10:52:05 - INFO - codeparrot_training - Step 15970: {'lr': 0.0004025795000414608, 'samples': 3066432, 'steps': 15970, 'loss/train': 1.0007121562957764} 01/27/2022 10:52:08 - INFO - codeparrot_training - Step 15971: {'lr': 0.0004025665380840599, 'samples': 3066624, 'steps': 15971, 'loss/train': 0.45434877276420593} 01/27/2022 10:52:13 - INFO - codeparrot_training - Step 15972: {'lr': 0.00040255357547311235, 'samples': 3066816, 'steps': 15972, 'loss/train': 0.8398591876029968} 01/27/2022 10:52:16 - INFO - codeparrot_training - Step 15973: {'lr': 0.0004025406122086738, 'samples': 3067008, 'steps': 15973, 'loss/train': 
0.606710895895958} 01/27/2022 10:52:19 - INFO - codeparrot_training - Step 15974: {'lr': 0.0004025276482907996, 'samples': 3067200, 'steps': 15974, 'loss/train': 0.8103033900260925} 01/27/2022 10:52:22 - INFO - codeparrot_training - Step 15975: {'lr': 0.0004025146837195455, 'samples': 3067392, 'steps': 15975, 'loss/train': 0.8858403861522675} 01/27/2022 10:52:25 - INFO - codeparrot_training - Step 15976: {'lr': 0.00040250171849496685, 'samples': 3067584, 'steps': 15976, 'loss/train': 0.7394555658102036} 01/27/2022 10:52:28 - INFO - codeparrot_training - Step 15977: {'lr': 0.0004024887526171193, 'samples': 3067776, 'steps': 15977, 'loss/train': 1.1499018967151642} 01/27/2022 10:52:31 - INFO - codeparrot_training - Step 15978: {'lr': 0.0004024757860860584, 'samples': 3067968, 'steps': 15978, 'loss/train': 0.988450437784195} 01/27/2022 10:52:35 - INFO - codeparrot_training - Step 15979: {'lr': 0.00040246281890183954, 'samples': 3068160, 'steps': 15979, 'loss/train': 0.8495365083217621} 01/27/2022 10:52:38 - INFO - codeparrot_training - Step 15980: {'lr': 0.0004024498510645185, 'samples': 3068352, 'steps': 15980, 'loss/train': 0.8621210753917694} 01/27/2022 10:52:42 - INFO - codeparrot_training - Step 15981: {'lr': 0.00040243688257415064, 'samples': 3068544, 'steps': 15981, 'loss/train': 0.7319075167179108} 01/27/2022 10:52:46 - INFO - codeparrot_training - Step 15982: {'lr': 0.00040242391343079157, 'samples': 3068736, 'steps': 15982, 'loss/train': 0.6308471113443375} 01/27/2022 10:52:49 - INFO - codeparrot_training - Step 15983: {'lr': 0.00040241094363449684, 'samples': 3068928, 'steps': 15983, 'loss/train': 0.25226589292287827} 01/27/2022 10:52:52 - INFO - codeparrot_training - Step 15984: {'lr': 0.000402397973185322, 'samples': 3069120, 'steps': 15984, 'loss/train': 0.6666531264781952} 01/27/2022 10:52:55 - INFO - codeparrot_training - Step 15985: {'lr': 0.0004023850020833227, 'samples': 3069312, 'steps': 15985, 'loss/train': 0.8800175786018372} 01/27/2022 10:52:58 - INFO - codeparrot_training - Step 15986: {'lr': 0.00040237203032855446, 'samples': 3069504, 'steps': 15986, 'loss/train': 0.5587428063154221} 01/27/2022 10:53:01 - INFO - codeparrot_training - Step 15987: {'lr': 0.00040235905792107275, 'samples': 3069696, 'steps': 15987, 'loss/train': 0.8682463467121124} 01/27/2022 10:53:04 - INFO - codeparrot_training - Step 15988: {'lr': 0.00040234608486093326, 'samples': 3069888, 'steps': 15988, 'loss/train': 1.115193396806717} 01/27/2022 10:53:09 - INFO - codeparrot_training - Step 15989: {'lr': 0.00040233311114819156, 'samples': 3070080, 'steps': 15989, 'loss/train': 0.8071020841598511} 01/27/2022 10:53:12 - INFO - codeparrot_training - Step 15990: {'lr': 0.00040232013678290316, 'samples': 3070272, 'steps': 15990, 'loss/train': 1.0783033668994904} 01/27/2022 10:53:15 - INFO - codeparrot_training - Step 15991: {'lr': 0.0004023071617651236, 'samples': 3070464, 'steps': 15991, 'loss/train': 0.9162822961807251} 01/27/2022 10:53:18 - INFO - codeparrot_training - Step 15992: {'lr': 0.0004022941860949085, 'samples': 3070656, 'steps': 15992, 'loss/train': 0.7560935318470001} 01/27/2022 10:53:22 - INFO - codeparrot_training - Step 15993: {'lr': 0.00040228120977231355, 'samples': 3070848, 'steps': 15993, 'loss/train': 1.331439346075058} 01/27/2022 10:53:25 - INFO - codeparrot_training - Step 15994: {'lr': 0.00040226823279739427, 'samples': 3071040, 'steps': 15994, 'loss/train': 0.1973893716931343} 01/27/2022 10:53:28 - INFO - codeparrot_training - Step 15995: {'lr': 0.00040225525517020616, 
'samples': 3071232, 'steps': 15995, 'loss/train': 0.711289644241333} 01/27/2022 10:53:31 - INFO - codeparrot_training - Step 15996: {'lr': 0.0004022422768908049, 'samples': 3071424, 'steps': 15996, 'loss/train': 0.6317132413387299} 01/27/2022 10:53:34 - INFO - codeparrot_training - Step 15997: {'lr': 0.00040222929795924613, 'samples': 3071616, 'steps': 15997, 'loss/train': 0.6880833953619003} 01/27/2022 10:53:39 - INFO - codeparrot_training - Step 15998: {'lr': 0.0004022163183755853, 'samples': 3071808, 'steps': 15998, 'loss/train': 1.232497125864029} 01/27/2022 10:53:43 - INFO - codeparrot_training - Step 15999: {'lr': 0.0004022033381398781, 'samples': 3072000, 'steps': 15999, 'loss/train': 0.9505569040775299} 01/27/2022 10:53:43 - INFO - codeparrot_training - Evaluating and saving model checkpoint 01/27/2022 10:54:00 - WARNING - huggingface_hub.repository - Several commits (8) will be pushed upstream. 01/27/2022 10:54:00 - WARNING - huggingface_hub.repository - The progress bars may be unreliable. 01/27/2022 10:54:35 - WARNING - huggingface_hub.repository - To https://huggingface.co/ncoop57/codeparrot-neo-125M-py 283559f..6d4e11b royal-monkey-12 -> royal-monkey-12 01/27/2022 10:54:40 - INFO - codeparrot_training - Step 16000: {'lr': 0.0004021903572521802, 'samples': 3072192, 'steps': 16000, 'loss/train': 1.1661683320999146} 01/27/2022 10:54:43 - INFO - codeparrot_training - Step 16001: {'lr': 0.0004021773757125471, 'samples': 3072384, 'steps': 16001, 'loss/train': 0.5482946634292603} 01/27/2022 10:54:46 - INFO - codeparrot_training - Step 16002: {'lr': 0.0004021643935210344, 'samples': 3072576, 'steps': 16002, 'loss/train': 1.156781941652298} 01/27/2022 10:54:49 - INFO - codeparrot_training - Step 16003: {'lr': 0.0004021514106776978, 'samples': 3072768, 'steps': 16003, 'loss/train': 0.8858944773674011} 01/27/2022 10:54:52 - INFO - codeparrot_training - Step 16004: {'lr': 0.00040213842718259287, 'samples': 3072960, 'steps': 16004, 'loss/train': 0.805213987827301} 01/27/2022 10:54:56 - INFO - codeparrot_training - Step 16005: {'lr': 0.00040212544303577525, 'samples': 3073152, 'steps': 16005, 'loss/train': 1.2046675086021423} 01/27/2022 10:54:59 - INFO - codeparrot_training - Step 16006: {'lr': 0.00040211245823730047, 'samples': 3073344, 'steps': 16006, 'loss/train': 1.084903746843338} 01/27/2022 10:55:03 - INFO - codeparrot_training - Step 16007: {'lr': 0.00040209947278722425, 'samples': 3073536, 'steps': 16007, 'loss/train': 0.8159877955913544} 01/27/2022 10:55:06 - INFO - codeparrot_training - Step 16008: {'lr': 0.0004020864866856022, 'samples': 3073728, 'steps': 16008, 'loss/train': 0.36306392401456833} 01/27/2022 10:55:09 - INFO - codeparrot_training - Step 16009: {'lr': 0.0004020734999324899, 'samples': 3073920, 'steps': 16009, 'loss/train': 1.1039950847625732} 01/27/2022 10:55:13 - INFO - codeparrot_training - Step 16010: {'lr': 0.0004020605125279431, 'samples': 3074112, 'steps': 16010, 'loss/train': 0.9269937872886658} 01/27/2022 10:55:16 - INFO - codeparrot_training - Step 16011: {'lr': 0.0004020475244720173, 'samples': 3074304, 'steps': 16011, 'loss/train': 0.8019360601902008} 01/27/2022 10:55:19 - INFO - codeparrot_training - Step 16012: {'lr': 0.0004020345357647681, 'samples': 3074496, 'steps': 16012, 'loss/train': 1.1659138798713684} 01/27/2022 10:55:22 - INFO - codeparrot_training - Step 16013: {'lr': 0.0004020215464062513, 'samples': 3074688, 'steps': 16013, 'loss/train': 0.3456604778766632} 01/27/2022 10:55:25 - INFO - codeparrot_training - Step 16014: {'lr': 
0.0004020085563965226, 'samples': 3074880, 'steps': 16014, 'loss/train': 0.8018893003463745} 01/27/2022 10:55:28 - INFO - codeparrot_training - Step 16015: {'lr': 0.00040199556573563736, 'samples': 3075072, 'steps': 16015, 'loss/train': 0.6280533671379089} 01/27/2022 10:55:33 - INFO - codeparrot_training - Step 16016: {'lr': 0.0004019825744236514, 'samples': 3075264, 'steps': 16016, 'loss/train': 0.978644996881485} 01/27/2022 10:55:36 - INFO - codeparrot_training - Step 16017: {'lr': 0.00040196958246062033, 'samples': 3075456, 'steps': 16017, 'loss/train': 0.6406474560499191} 01/27/2022 10:55:39 - INFO - codeparrot_training - Step 16018: {'lr': 0.00040195658984659987, 'samples': 3075648, 'steps': 16018, 'loss/train': 0.7530016601085663} 01/27/2022 10:55:42 - INFO - codeparrot_training - Step 16019: {'lr': 0.0004019435965816456, 'samples': 3075840, 'steps': 16019, 'loss/train': 0.8047084808349609} 01/27/2022 10:55:45 - INFO - codeparrot_training - Step 16020: {'lr': 0.0004019306026658132, 'samples': 3076032, 'steps': 16020, 'loss/train': 0.9200859367847443} 01/27/2022 10:55:48 - INFO - codeparrot_training - Step 16021: {'lr': 0.00040191760809915833, 'samples': 3076224, 'steps': 16021, 'loss/train': 1.126099169254303} 01/27/2022 10:55:52 - INFO - codeparrot_training - Step 16022: {'lr': 0.00040190461288173675, 'samples': 3076416, 'steps': 16022, 'loss/train': 0.8763271272182465} 01/27/2022 10:55:55 - INFO - codeparrot_training - Step 16023: {'lr': 0.000401891617013604, 'samples': 3076608, 'steps': 16023, 'loss/train': 0.7621990442276001} 01/27/2022 10:56:00 - INFO - codeparrot_training - Step 16024: {'lr': 0.00040187862049481573, 'samples': 3076800, 'steps': 16024, 'loss/train': 0.8959260284900665} 01/27/2022 10:56:03 - INFO - codeparrot_training - Step 16025: {'lr': 0.00040186562332542773, 'samples': 3076992, 'steps': 16025, 'loss/train': 0.979356586933136} 01/27/2022 10:56:06 - INFO - codeparrot_training - Step 16026: {'lr': 0.0004018526255054956, 'samples': 3077184, 'steps': 16026, 'loss/train': 1.3099334836006165} 01/27/2022 10:56:10 - INFO - codeparrot_training - Step 16027: {'lr': 0.00040183962703507515, 'samples': 3077376, 'steps': 16027, 'loss/train': 0.2690383046865463} 01/27/2022 10:56:13 - INFO - codeparrot_training - Step 16028: {'lr': 0.00040182662791422185, 'samples': 3077568, 'steps': 16028, 'loss/train': 0.8848666548728943} 01/27/2022 10:56:16 - INFO - codeparrot_training - Step 16029: {'lr': 0.0004018136281429915, 'samples': 3077760, 'steps': 16029, 'loss/train': 0.7618233561515808} 01/27/2022 10:56:19 - INFO - codeparrot_training - Step 16030: {'lr': 0.0004018006277214398, 'samples': 3077952, 'steps': 16030, 'loss/train': 0.35345611721277237} 01/27/2022 10:56:22 - INFO - codeparrot_training - Step 16031: {'lr': 0.00040178762664962235, 'samples': 3078144, 'steps': 16031, 'loss/train': 0.8115330934524536} 01/27/2022 10:56:26 - INFO - codeparrot_training - Step 16032: {'lr': 0.000401774624927595, 'samples': 3078336, 'steps': 16032, 'loss/train': 0.725628063082695} 01/27/2022 10:56:30 - INFO - codeparrot_training - Step 16033: {'lr': 0.00040176162255541325, 'samples': 3078528, 'steps': 16033, 'loss/train': 1.0257490575313568} 01/27/2022 10:56:33 - INFO - codeparrot_training - Step 16034: {'lr': 0.00040174861953313297, 'samples': 3078720, 'steps': 16034, 'loss/train': 0.7061260342597961} 01/27/2022 10:56:36 - INFO - codeparrot_training - Step 16035: {'lr': 0.00040173561586080974, 'samples': 3078912, 'steps': 16035, 'loss/train': 0.744817391037941} 01/27/2022 10:56:39 - INFO - 
codeparrot_training - Step 16036: {'lr': 0.0004017226115384994, 'samples': 3079104, 'steps': 16036, 'loss/train': 0.9325784146785736} 01/27/2022 10:56:42 - INFO - codeparrot_training - Step 16037: {'lr': 0.00040170960656625744, 'samples': 3079296, 'steps': 16037, 'loss/train': 0.9887999296188354} 01/27/2022 10:56:46 - INFO - codeparrot_training - Step 16038: {'lr': 0.00040169660094413977, 'samples': 3079488, 'steps': 16038, 'loss/train': 0.9525887668132782} 01/27/2022 10:56:49 - INFO - codeparrot_training - Step 16039: {'lr': 0.00040168359467220206, 'samples': 3079680, 'steps': 16039, 'loss/train': 0.4065826088190079} 01/27/2022 10:56:52 - INFO - codeparrot_training - Step 16040: {'lr': 0.00040167058775049993, 'samples': 3079872, 'steps': 16040, 'loss/train': 0.9671700596809387} 01/27/2022 10:56:55 - INFO - codeparrot_training - Step 16041: {'lr': 0.0004016575801790892, 'samples': 3080064, 'steps': 16041, 'loss/train': 1.1109221577644348} 01/27/2022 10:57:00 - INFO - codeparrot_training - Step 16042: {'lr': 0.0004016445719580256, 'samples': 3080256, 'steps': 16042, 'loss/train': 0.507965013384819} 01/27/2022 10:57:04 - INFO - codeparrot_training - Step 16043: {'lr': 0.0004016315630873647, 'samples': 3080448, 'steps': 16043, 'loss/train': 1.3136743605136871} 01/27/2022 10:57:07 - INFO - codeparrot_training - Step 16044: {'lr': 0.00040161855356716245, 'samples': 3080640, 'steps': 16044, 'loss/train': 0.75885209441185} 01/27/2022 10:57:10 - INFO - codeparrot_training - Step 16045: {'lr': 0.00040160554339747434, 'samples': 3080832, 'steps': 16045, 'loss/train': 0.6729195713996887} 01/27/2022 10:57:13 - INFO - codeparrot_training - Step 16046: {'lr': 0.00040159253257835624, 'samples': 3081024, 'steps': 16046, 'loss/train': 0.9527392387390137} 01/27/2022 10:57:16 - INFO - codeparrot_training - Step 16047: {'lr': 0.00040157952110986397, 'samples': 3081216, 'steps': 16047, 'loss/train': 0.7149668633937836} 01/27/2022 10:57:19 - INFO - codeparrot_training - Step 16048: {'lr': 0.00040156650899205305, 'samples': 3081408, 'steps': 16048, 'loss/train': 1.0416427552700043} 01/27/2022 10:57:22 - INFO - codeparrot_training - Step 16049: {'lr': 0.00040155349622497937, 'samples': 3081600, 'steps': 16049, 'loss/train': 1.0698343813419342} 01/27/2022 10:57:26 - INFO - codeparrot_training - Step 16050: {'lr': 0.0004015404828086987, 'samples': 3081792, 'steps': 16050, 'loss/train': 0.5175999104976654} 01/27/2022 10:57:30 - INFO - codeparrot_training - Step 16051: {'lr': 0.0004015274687432667, 'samples': 3081984, 'steps': 16051, 'loss/train': 0.11417432874441147} 01/27/2022 10:57:33 - INFO - codeparrot_training - Step 16052: {'lr': 0.0004015144540287391, 'samples': 3082176, 'steps': 16052, 'loss/train': 0.7645329236984253} 01/27/2022 10:57:36 - INFO - codeparrot_training - Step 16053: {'lr': 0.00040150143866517164, 'samples': 3082368, 'steps': 16053, 'loss/train': 0.9410390853881836} 01/27/2022 10:57:39 - INFO - codeparrot_training - Step 16054: {'lr': 0.0004014884226526202, 'samples': 3082560, 'steps': 16054, 'loss/train': 0.9200019836425781} 01/27/2022 10:57:42 - INFO - codeparrot_training - Step 16055: {'lr': 0.0004014754059911405, 'samples': 3082752, 'steps': 16055, 'loss/train': 0.40017472207546234} 01/27/2022 10:57:46 - INFO - codeparrot_training - Step 16056: {'lr': 0.0004014623886807882, 'samples': 3082944, 'steps': 16056, 'loss/train': 0.702796071767807} 01/27/2022 10:57:49 - INFO - codeparrot_training - Step 16057: {'lr': 0.0004014493707216191, 'samples': 3083136, 'steps': 16057, 'loss/train': 
0.19270335137844086} 01/27/2022 10:57:52 - INFO - codeparrot_training - Step 16058: {'lr': 0.00040143635211368903, 'samples': 3083328, 'steps': 16058, 'loss/train': 0.830212265253067} 01/27/2022 10:57:56 - INFO - codeparrot_training - Step 16059: {'lr': 0.0004014233328570537, 'samples': 3083520, 'steps': 16059, 'loss/train': 0.5202421098947525} 01/27/2022 10:58:00 - INFO - codeparrot_training - Step 16060: {'lr': 0.0004014103129517689, 'samples': 3083712, 'steps': 16060, 'loss/train': 0.7666836082935333} 01/27/2022 10:58:03 - INFO - codeparrot_training - Step 16061: {'lr': 0.00040139729239789036, 'samples': 3083904, 'steps': 16061, 'loss/train': 0.41621115803718567} 01/27/2022 10:58:06 - INFO - codeparrot_training - Step 16062: {'lr': 0.0004013842711954739, 'samples': 3084096, 'steps': 16062, 'loss/train': 0.9253009557723999} 01/27/2022 10:58:09 - INFO - codeparrot_training - Step 16063: {'lr': 0.0004013712493445753, 'samples': 3084288, 'steps': 16063, 'loss/train': 0.9212969541549683} 01/27/2022 10:58:12 - INFO - codeparrot_training - Step 16064: {'lr': 0.00040135822684525036, 'samples': 3084480, 'steps': 16064, 'loss/train': 1.1886706352233887} 01/27/2022 10:58:15 - INFO - codeparrot_training - Step 16065: {'lr': 0.0004013452036975548, 'samples': 3084672, 'steps': 16065, 'loss/train': 0.8177409768104553} 01/27/2022 10:58:19 - INFO - codeparrot_training - Step 16066: {'lr': 0.0004013321799015445, 'samples': 3084864, 'steps': 16066, 'loss/train': 0.8625762462615967} 01/27/2022 10:58:22 - INFO - codeparrot_training - Step 16067: {'lr': 0.00040131915545727517, 'samples': 3085056, 'steps': 16067, 'loss/train': 0.7645277380943298} 01/27/2022 10:58:27 - INFO - codeparrot_training - Step 16068: {'lr': 0.00040130613036480265, 'samples': 3085248, 'steps': 16068, 'loss/train': 0.8315145671367645} 01/27/2022 10:58:30 - INFO - codeparrot_training - Step 16069: {'lr': 0.0004012931046241827, 'samples': 3085440, 'steps': 16069, 'loss/train': 0.3940197676420212} 01/27/2022 10:58:33 - INFO - codeparrot_training - Step 16070: {'lr': 0.00040128007823547106, 'samples': 3085632, 'steps': 16070, 'loss/train': 0.8903836011886597} 01/27/2022 10:58:36 - INFO - codeparrot_training - Step 16071: {'lr': 0.00040126705119872367, 'samples': 3085824, 'steps': 16071, 'loss/train': 0.939335972070694} 01/27/2022 10:58:39 - INFO - codeparrot_training - Step 16072: {'lr': 0.00040125402351399623, 'samples': 3086016, 'steps': 16072, 'loss/train': 0.7546872496604919} 01/27/2022 10:58:43 - INFO - codeparrot_training - Step 16073: {'lr': 0.0004012409951813446, 'samples': 3086208, 'steps': 16073, 'loss/train': 0.5747102648019791} 01/27/2022 10:58:46 - INFO - codeparrot_training - Step 16074: {'lr': 0.0004012279662008246, 'samples': 3086400, 'steps': 16074, 'loss/train': 0.7705071866512299} 01/27/2022 10:58:49 - INFO - codeparrot_training - Step 16075: {'lr': 0.000401214936572492, 'samples': 3086592, 'steps': 16075, 'loss/train': 0.8908843696117401} 01/27/2022 10:58:52 - INFO - codeparrot_training - Step 16076: {'lr': 0.0004012019062964026, 'samples': 3086784, 'steps': 16076, 'loss/train': 0.33995582163333893} 01/27/2022 10:58:57 - INFO - codeparrot_training - Step 16077: {'lr': 0.0004011888753726123, 'samples': 3086976, 'steps': 16077, 'loss/train': 0.9412097632884979} 01/27/2022 10:59:00 - INFO - codeparrot_training - Step 16078: {'lr': 0.00040117584380117675, 'samples': 3087168, 'steps': 16078, 'loss/train': 0.7451140880584717} 01/27/2022 10:59:03 - INFO - codeparrot_training - Step 16079: {'lr': 0.000401162811582152, 'samples': 
3087360, 'steps': 16079, 'loss/train': 0.9822689294815063} 01/27/2022 10:59:06 - INFO - codeparrot_training - Step 16080: {'lr': 0.00040114977871559375, 'samples': 3087552, 'steps': 16080, 'loss/train': 0.6839516311883926} 01/27/2022 10:59:09 - INFO - codeparrot_training - Step 16081: {'lr': 0.0004011367452015578, 'samples': 3087744, 'steps': 16081, 'loss/train': 1.4525760412216187} 01/27/2022 10:59:12 - INFO - codeparrot_training - Step 16082: {'lr': 0.00040112371104010004, 'samples': 3087936, 'steps': 16082, 'loss/train': 0.7872274518013} 01/27/2022 10:59:16 - INFO - codeparrot_training - Step 16083: {'lr': 0.00040111067623127626, 'samples': 3088128, 'steps': 16083, 'loss/train': 0.6307044178247452} 01/27/2022 10:59:19 - INFO - codeparrot_training - Step 16084: {'lr': 0.0004010976407751424, 'samples': 3088320, 'steps': 16084, 'loss/train': 0.6168482601642609} 01/27/2022 10:59:22 - INFO - codeparrot_training - Step 16085: {'lr': 0.00040108460467175425, 'samples': 3088512, 'steps': 16085, 'loss/train': 0.7885091006755829} 01/27/2022 10:59:26 - INFO - codeparrot_training - Step 16086: {'lr': 0.00040107156792116753, 'samples': 3088704, 'steps': 16086, 'loss/train': 0.7441748678684235} 01/27/2022 10:59:29 - INFO - codeparrot_training - Step 16087: {'lr': 0.0004010585305234382, 'samples': 3088896, 'steps': 16087, 'loss/train': 0.7913475036621094} 01/27/2022 10:59:33 - INFO - codeparrot_training - Step 16088: {'lr': 0.00040104549247862217, 'samples': 3089088, 'steps': 16088, 'loss/train': 0.8577834069728851} 01/27/2022 10:59:36 - INFO - codeparrot_training - Step 16089: {'lr': 0.0004010324537867751, 'samples': 3089280, 'steps': 16089, 'loss/train': 0.05432242155075073} 01/27/2022 10:59:39 - INFO - codeparrot_training - Step 16090: {'lr': 0.000401019414447953, 'samples': 3089472, 'steps': 16090, 'loss/train': 0.9519981443881989} 01/27/2022 10:59:42 - INFO - codeparrot_training - Step 16091: {'lr': 0.0004010063744622117, 'samples': 3089664, 'steps': 16091, 'loss/train': 1.5884395837783813} 01/27/2022 10:59:45 - INFO - codeparrot_training - Step 16092: {'lr': 0.00040099333382960707, 'samples': 3089856, 'steps': 16092, 'loss/train': 0.6136399805545807} 01/27/2022 10:59:48 - INFO - codeparrot_training - Step 16093: {'lr': 0.00040098029255019484, 'samples': 3090048, 'steps': 16093, 'loss/train': 0.8105615973472595} 01/27/2022 10:59:51 - INFO - codeparrot_training - Step 16094: {'lr': 0.0004009672506240311, 'samples': 3090240, 'steps': 16094, 'loss/train': 0.864004522562027} 01/27/2022 10:59:56 - INFO - codeparrot_training - Step 16095: {'lr': 0.00040095420805117153, 'samples': 3090432, 'steps': 16095, 'loss/train': 1.0065853893756866} 01/27/2022 10:59:59 - INFO - codeparrot_training - Step 16096: {'lr': 0.0004009411648316721, 'samples': 3090624, 'steps': 16096, 'loss/train': 0.8987221419811249} 01/27/2022 11:00:02 - INFO - codeparrot_training - Step 16097: {'lr': 0.0004009281209655886, 'samples': 3090816, 'steps': 16097, 'loss/train': 0.4887055903673172} 01/27/2022 11:00:05 - INFO - codeparrot_training - Step 16098: {'lr': 0.000400915076452977, 'samples': 3091008, 'steps': 16098, 'loss/train': 0.9994000196456909} 01/27/2022 11:00:09 - INFO - codeparrot_training - Step 16099: {'lr': 0.0004009020312938931, 'samples': 3091200, 'steps': 16099, 'loss/train': 1.2118475139141083} 01/27/2022 11:00:12 - INFO - codeparrot_training - Step 16100: {'lr': 0.0004008889854883929, 'samples': 3091392, 'steps': 16100, 'loss/train': 0.5654051452875137} 01/27/2022 11:00:15 - INFO - codeparrot_training - Step 16101: 
{'lr': 0.0004008759390365321, 'samples': 3091584, 'steps': 16101, 'loss/train': 0.5668313205242157} 01/27/2022 11:00:18 - INFO - codeparrot_training - Step 16102: {'lr': 0.00040086289193836674, 'samples': 3091776, 'steps': 16102, 'loss/train': 0.8640321493148804} 01/27/2022 11:00:23 - INFO - codeparrot_training - Step 16103: {'lr': 0.00040084984419395264, 'samples': 3091968, 'steps': 16103, 'loss/train': 0.5375676602125168} 01/27/2022 11:00:27 - INFO - codeparrot_training - Step 16104: {'lr': 0.00040083679580334565, 'samples': 3092160, 'steps': 16104, 'loss/train': 1.8602385520935059} 01/27/2022 11:00:30 - INFO - codeparrot_training - Step 16105: {'lr': 0.00040082374676660176, 'samples': 3092352, 'steps': 16105, 'loss/train': 1.322939783334732} 01/27/2022 11:00:33 - INFO - codeparrot_training - Step 16106: {'lr': 0.00040081069708377686, 'samples': 3092544, 'steps': 16106, 'loss/train': 0.7416754513978958} 01/27/2022 11:00:36 - INFO - codeparrot_training - Step 16107: {'lr': 0.0004007976467549268, 'samples': 3092736, 'steps': 16107, 'loss/train': 1.258715271949768} 01/27/2022 11:00:39 - INFO - codeparrot_training - Step 16108: {'lr': 0.0004007845957801075, 'samples': 3092928, 'steps': 16108, 'loss/train': 0.9229179918766022} 01/27/2022 11:00:42 - INFO - codeparrot_training - Step 16109: {'lr': 0.0004007715441593749, 'samples': 3093120, 'steps': 16109, 'loss/train': 1.1582191586494446} 01/27/2022 11:00:45 - INFO - codeparrot_training - Step 16110: {'lr': 0.0004007584918927849, 'samples': 3093312, 'steps': 16110, 'loss/train': 0.6482914388179779} 01/27/2022 11:00:49 - INFO - codeparrot_training - Step 16111: {'lr': 0.0004007454389803933, 'samples': 3093504, 'steps': 16111, 'loss/train': 0.8451608419418335} 01/27/2022 11:00:53 - INFO - codeparrot_training - Step 16112: {'lr': 0.00040073238542225623, 'samples': 3093696, 'steps': 16112, 'loss/train': 0.18500537797808647} 01/27/2022 11:00:57 - INFO - codeparrot_training - Step 16113: {'lr': 0.00040071933121842943, 'samples': 3093888, 'steps': 16113, 'loss/train': 1.3139043152332306} 01/27/2022 11:01:00 - INFO - codeparrot_training - Step 16114: {'lr': 0.00040070627636896886, 'samples': 3094080, 'steps': 16114, 'loss/train': 0.798805832862854} 01/27/2022 11:01:03 - INFO - codeparrot_training - Step 16115: {'lr': 0.0004006932208739304, 'samples': 3094272, 'steps': 16115, 'loss/train': 0.9381473064422607} 01/27/2022 11:01:06 - INFO - codeparrot_training - Step 16116: {'lr': 0.0004006801647333701, 'samples': 3094464, 'steps': 16116, 'loss/train': 0.601113572716713} 01/27/2022 11:01:09 - INFO - codeparrot_training - Step 16117: {'lr': 0.0004006671079473438, 'samples': 3094656, 'steps': 16117, 'loss/train': 0.7330269366502762} 01/27/2022 11:01:12 - INFO - codeparrot_training - Step 16118: {'lr': 0.00040065405051590745, 'samples': 3094848, 'steps': 16118, 'loss/train': 0.8243494927883148} 01/27/2022 11:01:15 - INFO - codeparrot_training - Step 16119: {'lr': 0.000400640992439117, 'samples': 3095040, 'steps': 16119, 'loss/train': 0.1335626095533371} 01/27/2022 11:01:19 - INFO - codeparrot_training - Step 16120: {'lr': 0.0004006279337170283, 'samples': 3095232, 'steps': 16120, 'loss/train': 0.9009931683540344} 01/27/2022 11:01:23 - INFO - codeparrot_training - Step 16121: {'lr': 0.00040061487434969744, 'samples': 3095424, 'steps': 16121, 'loss/train': 0.9683931469917297} 01/27/2022 11:01:26 - INFO - codeparrot_training - Step 16122: {'lr': 0.00040060181433718037, 'samples': 3095616, 'steps': 16122, 'loss/train': 0.9955757260322571} 01/27/2022 11:01:29 - 
INFO - codeparrot_training - Step 16123: {'lr': 0.00040058875367953285, 'samples': 3095808, 'steps': 16123, 'loss/train': 1.0257788300514221} 01/27/2022 11:01:33 - INFO - codeparrot_training - Step 16124: {'lr': 0.0004005756923768109, 'samples': 3096000, 'steps': 16124, 'loss/train': 0.36369357258081436} 01/27/2022 11:01:36 - INFO - codeparrot_training - Step 16125: {'lr': 0.0004005626304290705, 'samples': 3096192, 'steps': 16125, 'loss/train': 0.7323752492666245} 01/27/2022 11:01:39 - INFO - codeparrot_training - Step 16126: {'lr': 0.00040054956783636765, 'samples': 3096384, 'steps': 16126, 'loss/train': 0.6439166218042374} 01/27/2022 11:01:42 - INFO - codeparrot_training - Step 16127: {'lr': 0.00040053650459875823, 'samples': 3096576, 'steps': 16127, 'loss/train': 0.6893886029720306} 01/27/2022 11:01:45 - INFO - codeparrot_training - Step 16128: {'lr': 0.0004005234407162982, 'samples': 3096768, 'steps': 16128, 'loss/train': 0.7801358699798584} 01/27/2022 11:01:50 - INFO - codeparrot_training - Step 16129: {'lr': 0.00040051037618904365, 'samples': 3096960, 'steps': 16129, 'loss/train': 0.35465774685144424} 01/27/2022 11:01:53 - INFO - codeparrot_training - Step 16130: {'lr': 0.0004004973110170503, 'samples': 3097152, 'steps': 16130, 'loss/train': 0.6837345510721207} 01/27/2022 11:01:57 - INFO - codeparrot_training - Step 16131: {'lr': 0.0004004842452003743, 'samples': 3097344, 'steps': 16131, 'loss/train': 0.6895062178373337} 01/27/2022 11:02:00 - INFO - codeparrot_training - Step 16132: {'lr': 0.0004004711787390716, 'samples': 3097536, 'steps': 16132, 'loss/train': 1.583378255367279} 01/27/2022 11:02:03 - INFO - codeparrot_training - Step 16133: {'lr': 0.0004004581116331981, 'samples': 3097728, 'steps': 16133, 'loss/train': 0.34579894691705704} 01/27/2022 11:02:06 - INFO - codeparrot_training - Step 16134: {'lr': 0.00040044504388280996, 'samples': 3097920, 'steps': 16134, 'loss/train': 0.8061041235923767} 01/27/2022 11:02:09 - INFO - codeparrot_training - Step 16135: {'lr': 0.00040043197548796295, 'samples': 3098112, 'steps': 16135, 'loss/train': 0.42744599282741547} 01/27/2022 11:02:12 - INFO - codeparrot_training - Step 16136: {'lr': 0.0004004189064487131, 'samples': 3098304, 'steps': 16136, 'loss/train': 0.8541747629642487} 01/27/2022 11:02:15 - INFO - codeparrot_training - Step 16137: {'lr': 0.00040040583676511645, 'samples': 3098496, 'steps': 16137, 'loss/train': 0.5681349188089371} 01/27/2022 11:02:20 - INFO - codeparrot_training - Step 16138: {'lr': 0.0004003927664372289, 'samples': 3098688, 'steps': 16138, 'loss/train': 0.9292392432689667} 01/27/2022 11:02:23 - INFO - codeparrot_training - Step 16139: {'lr': 0.00040037969546510653, 'samples': 3098880, 'steps': 16139, 'loss/train': 0.5920339822769165} 01/27/2022 11:02:26 - INFO - codeparrot_training - Step 16140: {'lr': 0.0004003666238488053, 'samples': 3099072, 'steps': 16140, 'loss/train': 1.0657279193401337} 01/27/2022 11:02:29 - INFO - codeparrot_training - Step 16141: {'lr': 0.00040035355158838114, 'samples': 3099264, 'steps': 16141, 'loss/train': 0.6732043325901031} 01/27/2022 11:02:32 - INFO - codeparrot_training - Step 16142: {'lr': 0.0004003404786838902, 'samples': 3099456, 'steps': 16142, 'loss/train': 0.7918139398097992} 01/27/2022 11:02:36 - INFO - codeparrot_training - Step 16143: {'lr': 0.0004003274051353884, 'samples': 3099648, 'steps': 16143, 'loss/train': 0.68306864798069} 01/27/2022 11:02:39 - INFO - codeparrot_training - Step 16144: {'lr': 0.00040031433094293167, 'samples': 3099840, 'steps': 16144, 'loss/train': 
0.8096366822719574} 01/27/2022 11:02:42 - INFO - codeparrot_training - Step 16145: {'lr': 0.0004003012561065761, 'samples': 3100032, 'steps': 16145, 'loss/train': 0.7284202873706818} 01/27/2022 11:02:45 - INFO - codeparrot_training - Step 16146: {'lr': 0.0004002881806263776, 'samples': 3100224, 'steps': 16146, 'loss/train': 0.8681244850158691} 01/27/2022 11:02:51 - INFO - codeparrot_training - Step 16147: {'lr': 0.0004002751045023924, 'samples': 3100416, 'steps': 16147, 'loss/train': 1.2953082025051117} 01/27/2022 11:02:54 - INFO - codeparrot_training - Step 16148: {'lr': 0.00040026202773467623, 'samples': 3100608, 'steps': 16148, 'loss/train': 1.031703382730484} 01/27/2022 11:02:57 - INFO - codeparrot_training - Step 16149: {'lr': 0.00040024895032328536, 'samples': 3100800, 'steps': 16149, 'loss/train': 0.5150815844535828} 01/27/2022 11:03:00 - INFO - codeparrot_training - Step 16150: {'lr': 0.0004002358722682756, 'samples': 3100992, 'steps': 16150, 'loss/train': 0.39607080817222595} 01/27/2022 11:03:03 - INFO - codeparrot_training - Step 16151: {'lr': 0.00040022279356970316, 'samples': 3101184, 'steps': 16151, 'loss/train': 1.1239608228206635} 01/27/2022 11:03:06 - INFO - codeparrot_training - Step 16152: {'lr': 0.0004002097142276239, 'samples': 3101376, 'steps': 16152, 'loss/train': 0.636260524392128} 01/27/2022 11:03:10 - INFO - codeparrot_training - Step 16153: {'lr': 0.00040019663424209397, 'samples': 3101568, 'steps': 16153, 'loss/train': 0.6892773360013962} 01/27/2022 11:03:13 - INFO - codeparrot_training - Step 16154: {'lr': 0.0004001835536131693, 'samples': 3101760, 'steps': 16154, 'loss/train': 0.66307532787323} 01/27/2022 11:03:17 - INFO - codeparrot_training - Step 16155: {'lr': 0.00040017047234090596, 'samples': 3101952, 'steps': 16155, 'loss/train': 0.3124677836894989} 01/27/2022 11:03:20 - INFO - codeparrot_training - Step 16156: {'lr': 0.00040015739042536, 'samples': 3102144, 'steps': 16156, 'loss/train': 0.7959659099578857} 01/27/2022 11:03:23 - INFO - codeparrot_training - Step 16157: {'lr': 0.00040014430786658754, 'samples': 3102336, 'steps': 16157, 'loss/train': 0.6381566673517227} 01/27/2022 11:03:27 - INFO - codeparrot_training - Step 16158: {'lr': 0.0004001312246646446, 'samples': 3102528, 'steps': 16158, 'loss/train': 1.0710554122924805} 01/27/2022 11:03:30 - INFO - codeparrot_training - Step 16159: {'lr': 0.000400118140819587, 'samples': 3102720, 'steps': 16159, 'loss/train': 0.7727639973163605} 01/27/2022 11:03:33 - INFO - codeparrot_training - Step 16160: {'lr': 0.00040010505633147106, 'samples': 3102912, 'steps': 16160, 'loss/train': 0.4233178049325943} 01/27/2022 11:03:36 - INFO - codeparrot_training - Step 16161: {'lr': 0.0004000919712003526, 'samples': 3103104, 'steps': 16161, 'loss/train': 0.6450802534818649} 01/27/2022 11:03:39 - INFO - codeparrot_training - Step 16162: {'lr': 0.0004000788854262879, 'samples': 3103296, 'steps': 16162, 'loss/train': 0.7920041978359222} 01/27/2022 11:03:42 - INFO - codeparrot_training - Step 16163: {'lr': 0.00040006579900933294, 'samples': 3103488, 'steps': 16163, 'loss/train': 1.2418106496334076} 01/27/2022 11:03:47 - INFO - codeparrot_training - Step 16164: {'lr': 0.00040005271194954367, 'samples': 3103680, 'steps': 16164, 'loss/train': 0.7443996369838715} 01/27/2022 11:03:50 - INFO - codeparrot_training - Step 16165: {'lr': 0.00040003962424697625, 'samples': 3103872, 'steps': 16165, 'loss/train': 1.0874756276607513} 01/27/2022 11:03:53 - INFO - codeparrot_training - Step 16166: {'lr': 0.0004000265359016867, 'samples': 
3104064, 'steps': 16166, 'loss/train': 0.9540434181690216} 01/27/2022 11:03:56 - INFO - codeparrot_training - Step 16167: {'lr': 0.0004000134469137312, 'samples': 3104256, 'steps': 16167, 'loss/train': 0.80058154463768} 01/27/2022 11:03:59 - INFO - codeparrot_training - Step 16168: {'lr': 0.00040000035728316564, 'samples': 3104448, 'steps': 16168, 'loss/train': 1.0064590573310852} 01/27/2022 11:04:02 - INFO - codeparrot_training - Step 16169: {'lr': 0.0003999872670100462, 'samples': 3104640, 'steps': 16169, 'loss/train': 0.9559896290302277} 01/27/2022 11:04:06 - INFO - codeparrot_training - Step 16170: {'lr': 0.000399974176094429, 'samples': 3104832, 'steps': 16170, 'loss/train': 0.6497704535722733} 01/27/2022 11:04:09 - INFO - codeparrot_training - Step 16171: {'lr': 0.00039996108453637, 'samples': 3105024, 'steps': 16171, 'loss/train': 0.6208857446908951} 01/27/2022 11:04:12 - INFO - codeparrot_training - Step 16172: {'lr': 0.0003999479923359253, 'samples': 3105216, 'steps': 16172, 'loss/train': 1.4152099192142487} 01/27/2022 11:04:17 - INFO - codeparrot_training - Step 16173: {'lr': 0.00039993489949315103, 'samples': 3105408, 'steps': 16173, 'loss/train': 0.744867280125618} 01/27/2022 11:04:20 - INFO - codeparrot_training - Step 16174: {'lr': 0.0003999218060081032, 'samples': 3105600, 'steps': 16174, 'loss/train': 0.48310884833335876} 01/27/2022 11:04:23 - INFO - codeparrot_training - Step 16175: {'lr': 0.0003999087118808381, 'samples': 3105792, 'steps': 16175, 'loss/train': 0.670142412185669} 01/27/2022 11:04:27 - INFO - codeparrot_training - Step 16176: {'lr': 0.0003998956171114116, 'samples': 3105984, 'steps': 16176, 'loss/train': 0.7885953783988953} 01/27/2022 11:04:30 - INFO - codeparrot_training - Step 16177: {'lr': 0.0003998825216998799, 'samples': 3106176, 'steps': 16177, 'loss/train': 0.6725649833679199} 01/27/2022 11:04:33 - INFO - codeparrot_training - Step 16178: {'lr': 0.00039986942564629904, 'samples': 3106368, 'steps': 16178, 'loss/train': 0.9765986502170563} 01/27/2022 11:04:36 - INFO - codeparrot_training - Step 16179: {'lr': 0.0003998563289507251, 'samples': 3106560, 'steps': 16179, 'loss/train': 1.6755763292312622} 01/27/2022 11:04:39 - INFO - codeparrot_training - Step 16180: {'lr': 0.0003998432316132143, 'samples': 3106752, 'steps': 16180, 'loss/train': 0.46593882143497467} 01/27/2022 11:04:42 - INFO - codeparrot_training - Step 16181: {'lr': 0.0003998301336338227, 'samples': 3106944, 'steps': 16181, 'loss/train': 0.25301726907491684} 01/27/2022 11:04:47 - INFO - codeparrot_training - Step 16182: {'lr': 0.0003998170350126064, 'samples': 3107136, 'steps': 16182, 'loss/train': 0.05575856566429138} 01/27/2022 11:04:50 - INFO - codeparrot_training - Step 16183: {'lr': 0.0003998039357496214, 'samples': 3107328, 'steps': 16183, 'loss/train': 0.7141703814268112} 01/27/2022 11:04:53 - INFO - codeparrot_training - Step 16184: {'lr': 0.000399790835844924, 'samples': 3107520, 'steps': 16184, 'loss/train': 0.8561670184135437} 01/27/2022 11:04:56 - INFO - codeparrot_training - Step 16185: {'lr': 0.00039977773529857016, 'samples': 3107712, 'steps': 16185, 'loss/train': 1.07328262925148} 01/27/2022 11:04:59 - INFO - codeparrot_training - Step 16186: {'lr': 0.00039976463411061606, 'samples': 3107904, 'steps': 16186, 'loss/train': 0.7389503717422485} 01/27/2022 11:05:03 - INFO - codeparrot_training - Step 16187: {'lr': 0.00039975153228111784, 'samples': 3108096, 'steps': 16187, 'loss/train': 1.6909891963005066} 01/27/2022 11:05:06 - INFO - codeparrot_training - Step 16188: {'lr': 
0.0003997384298101316, 'samples': 3108288, 'steps': 16188, 'loss/train': 0.6677248477935791} 01/27/2022 11:05:09 - INFO - codeparrot_training - Step 16189: {'lr': 0.0003997253266977135, 'samples': 3108480, 'steps': 16189, 'loss/train': 0.8549251556396484} 01/27/2022 11:05:12 - INFO - codeparrot_training - Step 16190: {'lr': 0.0003997122229439196, 'samples': 3108672, 'steps': 16190, 'loss/train': 0.5774542540311813} 01/27/2022 11:05:17 - INFO - codeparrot_training - Step 16191: {'lr': 0.00039969911854880613, 'samples': 3108864, 'steps': 16191, 'loss/train': 0.9107017815113068} 01/27/2022 11:05:20 - INFO - codeparrot_training - Step 16192: {'lr': 0.0003996860135124292, 'samples': 3109056, 'steps': 16192, 'loss/train': 0.6766562908887863} 01/27/2022 11:05:23 - INFO - codeparrot_training - Step 16193: {'lr': 0.00039967290783484485, 'samples': 3109248, 'steps': 16193, 'loss/train': 0.5964110344648361} 01/27/2022 11:05:26 - INFO - codeparrot_training - Step 16194: {'lr': 0.00039965980151610925, 'samples': 3109440, 'steps': 16194, 'loss/train': 0.937375545501709} 01/27/2022 11:05:29 - INFO - codeparrot_training - Step 16195: {'lr': 0.0003996466945562787, 'samples': 3109632, 'steps': 16195, 'loss/train': 0.9767438471317291} 01/27/2022 11:05:32 - INFO - codeparrot_training - Step 16196: {'lr': 0.00039963358695540907, 'samples': 3109824, 'steps': 16196, 'loss/train': 0.5100443512201309} 01/27/2022 11:05:36 - INFO - codeparrot_training - Step 16197: {'lr': 0.00039962047871355686, 'samples': 3110016, 'steps': 16197, 'loss/train': 1.1521844565868378} 01/27/2022 11:05:39 - INFO - codeparrot_training - Step 16198: {'lr': 0.00039960736983077783, 'samples': 3110208, 'steps': 16198, 'loss/train': 1.3670508563518524} 01/27/2022 11:05:43 - INFO - codeparrot_training - Step 16199: {'lr': 0.0003995942603071285, 'samples': 3110400, 'steps': 16199, 'loss/train': 1.3524795770645142} 01/27/2022 11:05:46 - INFO - codeparrot_training - Step 16200: {'lr': 0.0003995811501426648, 'samples': 3110592, 'steps': 16200, 'loss/train': 0.49441806972026825} 01/27/2022 11:05:49 - INFO - codeparrot_training - Step 16201: {'lr': 0.0003995680393374429, 'samples': 3110784, 'steps': 16201, 'loss/train': 0.9774319231510162} 01/27/2022 11:05:53 - INFO - codeparrot_training - Step 16202: {'lr': 0.00039955492789151904, 'samples': 3110976, 'steps': 16202, 'loss/train': 1.0279123485088348} 01/27/2022 11:05:56 - INFO - codeparrot_training - Step 16203: {'lr': 0.0003995418158049494, 'samples': 3111168, 'steps': 16203, 'loss/train': 0.7564727962017059} 01/27/2022 11:05:59 - INFO - codeparrot_training - Step 16204: {'lr': 0.0003995287030777901, 'samples': 3111360, 'steps': 16204, 'loss/train': 0.6424032300710678} 01/27/2022 11:06:02 - INFO - codeparrot_training - Step 16205: {'lr': 0.0003995155897100973, 'samples': 3111552, 'steps': 16205, 'loss/train': 0.7752204537391663} 01/27/2022 11:06:05 - INFO - codeparrot_training - Step 16206: {'lr': 0.0003995024757019272, 'samples': 3111744, 'steps': 16206, 'loss/train': 0.3311643451452255} 01/27/2022 11:06:08 - INFO - codeparrot_training - Step 16207: {'lr': 0.00039948936105333593, 'samples': 3111936, 'steps': 16207, 'loss/train': 0.8803102076053619} 01/27/2022 11:06:14 - INFO - codeparrot_training - Step 16208: {'lr': 0.0003994762457643797, 'samples': 3112128, 'steps': 16208, 'loss/train': 0.6792906671762466} 01/27/2022 11:06:17 - INFO - codeparrot_training - Step 16209: {'lr': 0.0003994631298351148, 'samples': 3112320, 'steps': 16209, 'loss/train': 0.5388921350240707} 01/27/2022 11:06:20 - INFO - 
codeparrot_training - Step 16210: {'lr': 0.0003994500132655972, 'samples': 3112512, 'steps': 16210, 'loss/train': 0.5494813621044159} 01/27/2022 11:06:23 - INFO - codeparrot_training - Step 16211: {'lr': 0.0003994368960558832, 'samples': 3112704, 'steps': 16211, 'loss/train': 1.0097468197345734} 01/27/2022 11:06:26 - INFO - codeparrot_training - Step 16212: {'lr': 0.0003994237782060291, 'samples': 3112896, 'steps': 16212, 'loss/train': 1.1294676661491394} 01/27/2022 11:06:30 - INFO - codeparrot_training - Step 16213: {'lr': 0.00039941065971609084, 'samples': 3113088, 'steps': 16213, 'loss/train': 0.8697527647018433} 01/27/2022 11:06:33 - INFO - codeparrot_training - Step 16214: {'lr': 0.00039939754058612487, 'samples': 3113280, 'steps': 16214, 'loss/train': 0.687748521566391} 01/27/2022 11:06:36 - INFO - codeparrot_training - Step 16215: {'lr': 0.0003993844208161872, 'samples': 3113472, 'steps': 16215, 'loss/train': 1.127063512802124} 01/27/2022 11:06:39 - INFO - codeparrot_training - Step 16216: {'lr': 0.0003993713004063341, 'samples': 3113664, 'steps': 16216, 'loss/train': 0.7654272615909576} 01/27/2022 11:06:44 - INFO - codeparrot_training - Step 16217: {'lr': 0.0003993581793566219, 'samples': 3113856, 'steps': 16217, 'loss/train': 0.8087926805019379} 01/27/2022 11:06:47 - INFO - codeparrot_training - Step 16218: {'lr': 0.00039934505766710656, 'samples': 3114048, 'steps': 16218, 'loss/train': 0.04706752672791481} 01/27/2022 11:06:51 - INFO - codeparrot_training - Step 16219: {'lr': 0.0003993319353378445, 'samples': 3114240, 'steps': 16219, 'loss/train': 0.8821775615215302} 01/27/2022 11:06:54 - INFO - codeparrot_training - Step 16220: {'lr': 0.0003993188123688918, 'samples': 3114432, 'steps': 16220, 'loss/train': 0.7198865711688995} 01/27/2022 11:06:57 - INFO - codeparrot_training - Step 16221: {'lr': 0.00039930568876030473, 'samples': 3114624, 'steps': 16221, 'loss/train': 0.8737896680831909} 01/27/2022 11:07:00 - INFO - codeparrot_training - Step 16222: {'lr': 0.0003992925645121395, 'samples': 3114816, 'steps': 16222, 'loss/train': 0.8249053359031677} 01/27/2022 11:07:03 - INFO - codeparrot_training - Step 16223: {'lr': 0.00039927943962445234, 'samples': 3115008, 'steps': 16223, 'loss/train': 0.2131868228316307} 01/27/2022 11:07:06 - INFO - codeparrot_training - Step 16224: {'lr': 0.0003992663140972994, 'samples': 3115200, 'steps': 16224, 'loss/train': 1.0417856276035309} 01/27/2022 11:07:09 - INFO - codeparrot_training - Step 16225: {'lr': 0.0003992531879307371, 'samples': 3115392, 'steps': 16225, 'loss/train': 0.316602922976017} 01/27/2022 11:07:14 - INFO - codeparrot_training - Step 16226: {'lr': 0.0003992400611248214, 'samples': 3115584, 'steps': 16226, 'loss/train': 0.03718013595789671} 01/27/2022 11:07:18 - INFO - codeparrot_training - Step 16227: {'lr': 0.0003992269336796087, 'samples': 3115776, 'steps': 16227, 'loss/train': 0.8122426271438599} 01/27/2022 11:07:21 - INFO - codeparrot_training - Step 16228: {'lr': 0.0003992138055951552, 'samples': 3115968, 'steps': 16228, 'loss/train': 0.5651389807462692} 01/27/2022 11:07:24 - INFO - codeparrot_training - Step 16229: {'lr': 0.00039920067687151717, 'samples': 3116160, 'steps': 16229, 'loss/train': 1.0383670628070831} 01/27/2022 11:07:27 - INFO - codeparrot_training - Step 16230: {'lr': 0.0003991875475087508, 'samples': 3116352, 'steps': 16230, 'loss/train': 1.1189160346984863} 01/27/2022 11:07:30 - INFO - codeparrot_training - Step 16231: {'lr': 0.00039917441750691237, 'samples': 3116544, 'steps': 16231, 'loss/train': 
0.8673271536827087} 01/27/2022 11:07:33 - INFO - codeparrot_training - Step 16232: {'lr': 0.0003991612868660581, 'samples': 3116736, 'steps': 16232, 'loss/train': 1.0767593085765839} 01/27/2022 11:07:36 - INFO - codeparrot_training - Step 16233: {'lr': 0.0003991481555862442, 'samples': 3116928, 'steps': 16233, 'loss/train': 0.9410102963447571} 01/27/2022 11:07:40 - INFO - codeparrot_training - Step 16234: {'lr': 0.00039913502366752704, 'samples': 3117120, 'steps': 16234, 'loss/train': 1.0308907628059387} 01/27/2022 11:07:44 - INFO - codeparrot_training - Step 16235: {'lr': 0.0003991218911099627, 'samples': 3117312, 'steps': 16235, 'loss/train': 0.6291737258434296} 01/27/2022 11:07:47 - INFO - codeparrot_training - Step 16236: {'lr': 0.0003991087579136076, 'samples': 3117504, 'steps': 16236, 'loss/train': 1.1416684985160828} 01/27/2022 11:07:51 - INFO - codeparrot_training - Step 16237: {'lr': 0.00039909562407851784, 'samples': 3117696, 'steps': 16237, 'loss/train': 1.071918547153473} 01/27/2022 11:07:54 - INFO - codeparrot_training - Step 16238: {'lr': 0.0003990824896047498, 'samples': 3117888, 'steps': 16238, 'loss/train': 0.4931730329990387} 01/27/2022 11:07:57 - INFO - codeparrot_training - Step 16239: {'lr': 0.00039906935449235983, 'samples': 3118080, 'steps': 16239, 'loss/train': 0.5481891632080078} 01/27/2022 11:08:00 - INFO - codeparrot_training - Step 16240: {'lr': 0.00039905621874140396, 'samples': 3118272, 'steps': 16240, 'loss/train': 0.8554481863975525} 01/27/2022 11:08:03 - INFO - codeparrot_training - Step 16241: {'lr': 0.00039904308235193866, 'samples': 3118464, 'steps': 16241, 'loss/train': 1.6575230956077576} 01/27/2022 11:08:06 - INFO - codeparrot_training - Step 16242: {'lr': 0.00039902994532402004, 'samples': 3118656, 'steps': 16242, 'loss/train': 0.9631115198135376} 01/27/2022 11:08:10 - INFO - codeparrot_training - Step 16243: {'lr': 0.0003990168076577045, 'samples': 3118848, 'steps': 16243, 'loss/train': 1.032608985900879} 01/27/2022 11:08:14 - INFO - codeparrot_training - Step 16244: {'lr': 0.00039900366935304824, 'samples': 3119040, 'steps': 16244, 'loss/train': 1.0263049900531769} 01/27/2022 11:08:17 - INFO - codeparrot_training - Step 16245: {'lr': 0.00039899053041010765, 'samples': 3119232, 'steps': 16245, 'loss/train': 0.5330948531627655} 01/27/2022 11:08:20 - INFO - codeparrot_training - Step 16246: {'lr': 0.00039897739082893883, 'samples': 3119424, 'steps': 16246, 'loss/train': 0.9059200286865234} 01/27/2022 11:08:23 - INFO - codeparrot_training - Step 16247: {'lr': 0.0003989642506095983, 'samples': 3119616, 'steps': 16247, 'loss/train': 1.2391507029533386} 01/27/2022 11:08:27 - INFO - codeparrot_training - Step 16248: {'lr': 0.0003989511097521421, 'samples': 3119808, 'steps': 16248, 'loss/train': 0.6522131860256195} 01/27/2022 11:08:30 - INFO - codeparrot_training - Step 16249: {'lr': 0.00039893796825662676, 'samples': 3120000, 'steps': 16249, 'loss/train': 0.4519360810518265} 01/27/2022 11:08:33 - INFO - codeparrot_training - Step 16250: {'lr': 0.0003989248261231084, 'samples': 3120192, 'steps': 16250, 'loss/train': 0.3935438543558121} 01/27/2022 11:08:36 - INFO - codeparrot_training - Step 16251: {'lr': 0.0003989116833516433, 'samples': 3120384, 'steps': 16251, 'loss/train': 0.3833215981721878} 01/27/2022 11:08:39 - INFO - codeparrot_training - Step 16252: {'lr': 0.000398898539942288, 'samples': 3120576, 'steps': 16252, 'loss/train': 1.5999439358711243} 01/27/2022 11:08:44 - INFO - codeparrot_training - Step 16253: {'lr': 0.0003988853958950984, 'samples': 
3120768, 'steps': 16253, 'loss/train': 0.3693583533167839} 01/27/2022 11:08:47 - INFO - codeparrot_training - Step 16254: {'lr': 0.00039887225121013124, 'samples': 3120960, 'steps': 16254, 'loss/train': 0.772210031747818} 01/27/2022 11:08:51 - INFO - codeparrot_training - Step 16255: {'lr': 0.0003988591058874426, 'samples': 3121152, 'steps': 16255, 'loss/train': 0.4513072371482849} 01/27/2022 11:08:54 - INFO - codeparrot_training - Step 16256: {'lr': 0.00039884595992708877, 'samples': 3121344, 'steps': 16256, 'loss/train': 0.8744835555553436} 01/27/2022 11:08:57 - INFO - codeparrot_training - Step 16257: {'lr': 0.0003988328133291261, 'samples': 3121536, 'steps': 16257, 'loss/train': 0.38357050716876984} 01/27/2022 11:09:00 - INFO - codeparrot_training - Step 16258: {'lr': 0.000398819666093611, 'samples': 3121728, 'steps': 16258, 'loss/train': 0.6371843218803406} 01/27/2022 11:09:03 - INFO - codeparrot_training - Step 16259: {'lr': 0.0003988065182205996, 'samples': 3121920, 'steps': 16259, 'loss/train': 1.0179716348648071} 01/27/2022 11:09:06 - INFO - codeparrot_training - Step 16260: {'lr': 0.0003987933697101484, 'samples': 3122112, 'steps': 16260, 'loss/train': 1.2850188910961151} 01/27/2022 11:09:09 - INFO - codeparrot_training - Step 16261: {'lr': 0.0003987802205623136, 'samples': 3122304, 'steps': 16261, 'loss/train': 0.7786278426647186} 01/27/2022 11:09:14 - INFO - codeparrot_training - Step 16262: {'lr': 0.0003987670707771516, 'samples': 3122496, 'steps': 16262, 'loss/train': 0.9967038631439209} 01/27/2022 11:09:17 - INFO - codeparrot_training - Step 16263: {'lr': 0.0003987539203547187, 'samples': 3122688, 'steps': 16263, 'loss/train': 0.6211754232645035} 01/27/2022 11:09:20 - INFO - codeparrot_training - Step 16264: {'lr': 0.00039874076929507124, 'samples': 3122880, 'steps': 16264, 'loss/train': 1.1608092784881592} 01/27/2022 11:09:23 - INFO - codeparrot_training - Step 16265: {'lr': 0.0003987276175982656, 'samples': 3123072, 'steps': 16265, 'loss/train': 0.8607359826564789} 01/27/2022 11:09:26 - INFO - codeparrot_training - Step 16266: {'lr': 0.00039871446526435806, 'samples': 3123264, 'steps': 16266, 'loss/train': 0.853412389755249} 01/27/2022 11:09:30 - INFO - codeparrot_training - Step 16267: {'lr': 0.00039870131229340495, 'samples': 3123456, 'steps': 16267, 'loss/train': 1.112040102481842} 01/27/2022 11:09:33 - INFO - codeparrot_training - Step 16268: {'lr': 0.00039868815868546257, 'samples': 3123648, 'steps': 16268, 'loss/train': 1.001338541507721} 01/27/2022 11:09:36 - INFO - codeparrot_training - Step 16269: {'lr': 0.00039867500444058747, 'samples': 3123840, 'steps': 16269, 'loss/train': 0.6718313097953796} 01/27/2022 11:09:39 - INFO - codeparrot_training - Step 16270: {'lr': 0.0003986618495588358, 'samples': 3124032, 'steps': 16270, 'loss/train': 1.0366871058940887} 01/27/2022 11:09:43 - INFO - codeparrot_training - Step 16271: {'lr': 0.00039864869404026394, 'samples': 3124224, 'steps': 16271, 'loss/train': 0.8711869418621063} 01/27/2022 11:09:47 - INFO - codeparrot_training - Step 16272: {'lr': 0.0003986355378849283, 'samples': 3124416, 'steps': 16272, 'loss/train': 1.0948247909545898} 01/27/2022 11:09:50 - INFO - codeparrot_training - Step 16273: {'lr': 0.00039862238109288523, 'samples': 3124608, 'steps': 16273, 'loss/train': 1.1524816453456879} 01/27/2022 11:09:53 - INFO - codeparrot_training - Step 16274: {'lr': 0.0003986092236641911, 'samples': 3124800, 'steps': 16274, 'loss/train': 0.6929799020290375} 01/27/2022 11:09:56 - INFO - codeparrot_training - Step 16275: 
{'lr': 0.00039859606559890215, 'samples': 3124992, 'steps': 16275, 'loss/train': 0.8650272488594055} 01/27/2022 11:09:59 - INFO - codeparrot_training - Step 16276: {'lr': 0.0003985829068970749, 'samples': 3125184, 'steps': 16276, 'loss/train': 0.8608489036560059} 01/27/2022 11:10:02 - INFO - codeparrot_training - Step 16277: {'lr': 0.00039856974755876563, 'samples': 3125376, 'steps': 16277, 'loss/train': 0.7365667819976807} 01/27/2022 11:10:06 - INFO - codeparrot_training - Step 16278: {'lr': 0.0003985565875840308, 'samples': 3125568, 'steps': 16278, 'loss/train': 0.5427868813276291} 01/27/2022 11:10:10 - INFO - codeparrot_training - Step 16279: {'lr': 0.0003985434269729267, 'samples': 3125760, 'steps': 16279, 'loss/train': 0.5338517725467682} 01/27/2022 11:10:13 - INFO - codeparrot_training - Step 16280: {'lr': 0.00039853026572550965, 'samples': 3125952, 'steps': 16280, 'loss/train': 0.9932606220245361} 01/27/2022 11:10:16 - INFO - codeparrot_training - Step 16281: {'lr': 0.00039851710384183615, 'samples': 3126144, 'steps': 16281, 'loss/train': 0.502830907702446} 01/27/2022 11:10:19 - INFO - codeparrot_training - Step 16282: {'lr': 0.0003985039413219626, 'samples': 3126336, 'steps': 16282, 'loss/train': 0.8430125713348389} 01/27/2022 11:10:22 - INFO - codeparrot_training - Step 16283: {'lr': 0.0003984907781659452, 'samples': 3126528, 'steps': 16283, 'loss/train': 0.8155592679977417} 01/27/2022 11:10:26 - INFO - codeparrot_training - Step 16284: {'lr': 0.00039847761437384054, 'samples': 3126720, 'steps': 16284, 'loss/train': 0.7957356870174408} 01/27/2022 11:10:29 - INFO - codeparrot_training - Step 16285: {'lr': 0.0003984644499457049, 'samples': 3126912, 'steps': 16285, 'loss/train': 0.9324469864368439} 01/27/2022 11:10:32 - INFO - codeparrot_training - Step 16286: {'lr': 0.0003984512848815948, 'samples': 3127104, 'steps': 16286, 'loss/train': 0.6606500744819641} 01/27/2022 11:10:35 - INFO - codeparrot_training - Step 16287: {'lr': 0.00039843811918156635, 'samples': 3127296, 'steps': 16287, 'loss/train': 0.8816904723644257} 01/27/2022 11:10:40 - INFO - codeparrot_training - Step 16288: {'lr': 0.0003984249528456762, 'samples': 3127488, 'steps': 16288, 'loss/train': 0.2783263474702835} 01/27/2022 11:10:43 - INFO - codeparrot_training - Step 16289: {'lr': 0.00039841178587398074, 'samples': 3127680, 'steps': 16289, 'loss/train': 0.7745550870895386} 01/27/2022 11:10:47 - INFO - codeparrot_training - Step 16290: {'lr': 0.0003983986182665362, 'samples': 3127872, 'steps': 16290, 'loss/train': 0.5719417333602905} 01/27/2022 11:10:50 - INFO - codeparrot_training - Step 16291: {'lr': 0.00039838545002339926, 'samples': 3128064, 'steps': 16291, 'loss/train': 0.6814579367637634} 01/27/2022 11:10:53 - INFO - codeparrot_training - Step 16292: {'lr': 0.0003983722811446261, 'samples': 3128256, 'steps': 16292, 'loss/train': 0.8080460429191589} 01/27/2022 11:10:56 - INFO - codeparrot_training - Step 16293: {'lr': 0.00039835911163027315, 'samples': 3128448, 'steps': 16293, 'loss/train': 0.7708510458469391} 01/27/2022 11:10:59 - INFO - codeparrot_training - Step 16294: {'lr': 0.00039834594148039693, 'samples': 3128640, 'steps': 16294, 'loss/train': 0.754031628370285} 01/27/2022 11:11:02 - INFO - codeparrot_training - Step 16295: {'lr': 0.0003983327706950538, 'samples': 3128832, 'steps': 16295, 'loss/train': 0.89482381939888} 01/27/2022 11:11:05 - INFO - codeparrot_training - Step 16296: {'lr': 0.00039831959927430017, 'samples': 3129024, 'steps': 16296, 'loss/train': 0.6457739621400833} 01/27/2022 11:11:10 - 
INFO - codeparrot_training - Step 16297: {'lr': 0.00039830642721819254, 'samples': 3129216, 'steps': 16297, 'loss/train': 1.07470241189003} 01/27/2022 11:11:13 - INFO - codeparrot_training - Step 16298: {'lr': 0.0003982932545267872, 'samples': 3129408, 'steps': 16298, 'loss/train': 0.8556700050830841} 01/27/2022 11:11:16 - INFO - codeparrot_training - Step 16299: {'lr': 0.00039828008120014057, 'samples': 3129600, 'steps': 16299, 'loss/train': 0.7670690417289734} 01/27/2022 11:11:20 - INFO - codeparrot_training - Step 16300: {'lr': 0.00039826690723830926, 'samples': 3129792, 'steps': 16300, 'loss/train': 0.8765671849250793} 01/27/2022 11:11:23 - INFO - codeparrot_training - Step 16301: {'lr': 0.00039825373264134955, 'samples': 3129984, 'steps': 16301, 'loss/train': 0.7851921021938324} 01/27/2022 11:11:26 - INFO - codeparrot_training - Step 16302: {'lr': 0.00039824055740931804, 'samples': 3130176, 'steps': 16302, 'loss/train': 0.9035835564136505} 01/27/2022 11:11:29 - INFO - codeparrot_training - Step 16303: {'lr': 0.0003982273815422709, 'samples': 3130368, 'steps': 16303, 'loss/train': 0.7076370120048523} 01/27/2022 11:11:32 - INFO - codeparrot_training - Step 16304: {'lr': 0.00039821420504026486, 'samples': 3130560, 'steps': 16304, 'loss/train': 1.009582668542862} 01/27/2022 11:11:35 - INFO - codeparrot_training - Step 16305: {'lr': 0.0003982010279033561, 'samples': 3130752, 'steps': 16305, 'loss/train': 0.8883113265037537} 01/27/2022 11:11:40 - INFO - codeparrot_training - Step 16306: {'lr': 0.0003981878501316013, 'samples': 3130944, 'steps': 16306, 'loss/train': 0.7432819604873657} 01/27/2022 11:11:44 - INFO - codeparrot_training - Step 16307: {'lr': 0.0003981746717250567, 'samples': 3131136, 'steps': 16307, 'loss/train': 0.8210342824459076} 01/27/2022 11:11:47 - INFO - codeparrot_training - Step 16308: {'lr': 0.000398161492683779, 'samples': 3131328, 'steps': 16308, 'loss/train': 0.4928121864795685} 01/27/2022 11:11:50 - INFO - codeparrot_training - Step 16309: {'lr': 0.0003981483130078244, 'samples': 3131520, 'steps': 16309, 'loss/train': 1.050301730632782} 01/27/2022 11:11:53 - INFO - codeparrot_training - Step 16310: {'lr': 0.0003981351326972495, 'samples': 3131712, 'steps': 16310, 'loss/train': 1.2259369790554047} 01/27/2022 11:11:56 - INFO - codeparrot_training - Step 16311: {'lr': 0.00039812195175211075, 'samples': 3131904, 'steps': 16311, 'loss/train': 1.1644804179668427} 01/27/2022 11:11:59 - INFO - codeparrot_training - Step 16312: {'lr': 0.0003981087701724645, 'samples': 3132096, 'steps': 16312, 'loss/train': 0.8268392980098724} 01/27/2022 11:12:02 - INFO - codeparrot_training - Step 16313: {'lr': 0.00039809558795836743, 'samples': 3132288, 'steps': 16313, 'loss/train': 0.8993506729602814} 01/27/2022 11:12:07 - INFO - codeparrot_training - Step 16314: {'lr': 0.00039808240510987584, 'samples': 3132480, 'steps': 16314, 'loss/train': 0.3010287433862686} 01/27/2022 11:12:10 - INFO - codeparrot_training - Step 16315: {'lr': 0.0003980692216270462, 'samples': 3132672, 'steps': 16315, 'loss/train': 0.9246939718723297} 01/27/2022 11:12:13 - INFO - codeparrot_training - Step 16316: {'lr': 0.00039805603750993514, 'samples': 3132864, 'steps': 16316, 'loss/train': 0.7390566766262054} 01/27/2022 11:12:16 - INFO - codeparrot_training - Step 16317: {'lr': 0.0003980428527585989, 'samples': 3133056, 'steps': 16317, 'loss/train': 0.3761061429977417} 01/27/2022 11:12:20 - INFO - codeparrot_training - Step 16318: {'lr': 0.0003980296673730942, 'samples': 3133248, 'steps': 16318, 'loss/train': 
0.6053046137094498} 01/27/2022 11:12:23 - INFO - codeparrot_training - Step 16319: {'lr': 0.0003980164813534773, 'samples': 3133440, 'steps': 16319, 'loss/train': 0.8428775668144226} 01/27/2022 11:12:26 - INFO - codeparrot_training - Step 16320: {'lr': 0.0003980032946998049, 'samples': 3133632, 'steps': 16320, 'loss/train': 0.6845053732395172} 01/27/2022 11:12:29 - INFO - codeparrot_training - Step 16321: {'lr': 0.00039799010741213336, 'samples': 3133824, 'steps': 16321, 'loss/train': 0.34418465942144394} 01/27/2022 11:12:32 - INFO - codeparrot_training - Step 16322: {'lr': 0.0003979769194905192, 'samples': 3134016, 'steps': 16322, 'loss/train': 0.9932771623134613} 01/27/2022 11:12:37 - INFO - codeparrot_training - Step 16323: {'lr': 0.0003979637309350188, 'samples': 3134208, 'steps': 16323, 'loss/train': 1.0993483364582062} 01/27/2022 11:12:40 - INFO - codeparrot_training - Step 16324: {'lr': 0.0003979505417456889, 'samples': 3134400, 'steps': 16324, 'loss/train': 0.7312969118356705} 01/27/2022 11:12:43 - INFO - codeparrot_training - Step 16325: {'lr': 0.00039793735192258575, 'samples': 3134592, 'steps': 16325, 'loss/train': 0.8382711410522461} 01/27/2022 11:12:46 - INFO - codeparrot_training - Step 16326: {'lr': 0.000397924161465766, 'samples': 3134784, 'steps': 16326, 'loss/train': 0.6754876524209976} 01/27/2022 11:12:49 - INFO - codeparrot_training - Step 16327: {'lr': 0.0003979109703752861, 'samples': 3134976, 'steps': 16327, 'loss/train': 0.7692593336105347} 01/27/2022 11:12:52 - INFO - codeparrot_training - Step 16328: {'lr': 0.00039789777865120257, 'samples': 3135168, 'steps': 16328, 'loss/train': 0.8904886543750763} 01/27/2022 11:12:56 - INFO - codeparrot_training - Step 16329: {'lr': 0.00039788458629357195, 'samples': 3135360, 'steps': 16329, 'loss/train': 0.2693357840180397} 01/27/2022 11:12:59 - INFO - codeparrot_training - Step 16330: {'lr': 0.0003978713933024507, 'samples': 3135552, 'steps': 16330, 'loss/train': 0.7962928712368011} 01/27/2022 11:13:02 - INFO - codeparrot_training - Step 16331: {'lr': 0.0003978581996778954, 'samples': 3135744, 'steps': 16331, 'loss/train': 0.46128053963184357} 01/27/2022 11:13:07 - INFO - codeparrot_training - Step 16332: {'lr': 0.0003978450054199625, 'samples': 3135936, 'steps': 16332, 'loss/train': 1.4301784336566925} 01/27/2022 11:13:11 - INFO - codeparrot_training - Step 16333: {'lr': 0.0003978318105287085, 'samples': 3136128, 'steps': 16333, 'loss/train': 0.9687565863132477} 01/27/2022 11:13:14 - INFO - codeparrot_training - Step 16334: {'lr': 0.00039781861500419, 'samples': 3136320, 'steps': 16334, 'loss/train': 0.1528499834239483} 01/27/2022 11:13:17 - INFO - codeparrot_training - Step 16335: {'lr': 0.00039780541884646347, 'samples': 3136512, 'steps': 16335, 'loss/train': 0.970037430524826} 01/27/2022 11:13:20 - INFO - codeparrot_training - Step 16336: {'lr': 0.0003977922220555855, 'samples': 3136704, 'steps': 16336, 'loss/train': 0.7505045235157013} 01/27/2022 11:13:23 - INFO - codeparrot_training - Step 16337: {'lr': 0.0003977790246316125, 'samples': 3136896, 'steps': 16337, 'loss/train': 0.7529717087745667} 01/27/2022 11:13:26 - INFO - codeparrot_training - Step 16338: {'lr': 0.00039776582657460115, 'samples': 3137088, 'steps': 16338, 'loss/train': 0.5797615349292755} 01/27/2022 11:13:29 - INFO - codeparrot_training - Step 16339: {'lr': 0.000397752627884608, 'samples': 3137280, 'steps': 16339, 'loss/train': 0.9103802740573883} 01/27/2022 11:13:33 - INFO - codeparrot_training - Step 16340: {'lr': 0.0003977394285616893, 'samples': 
3137472, 'steps': 16340, 'loss/train': 0.9499474167823792} 01/27/2022 11:13:37 - INFO - codeparrot_training - Step 16341: {'lr': 0.000397726228605902, 'samples': 3137664, 'steps': 16341, 'loss/train': 1.1836229860782623} 01/27/2022 11:13:40 - INFO - codeparrot_training - Step 16342: {'lr': 0.00039771302801730235, 'samples': 3137856, 'steps': 16342, 'loss/train': 1.1383735835552216} 01/27/2022 11:13:43 - INFO - codeparrot_training - Step 16343: {'lr': 0.00039769982679594703, 'samples': 3138048, 'steps': 16343, 'loss/train': 0.8605233728885651} 01/27/2022 11:13:46 - INFO - codeparrot_training - Step 16344: {'lr': 0.0003976866249418925, 'samples': 3138240, 'steps': 16344, 'loss/train': 0.7122722268104553} 01/27/2022 11:13:50 - INFO - codeparrot_training - Step 16345: {'lr': 0.0003976734224551954, 'samples': 3138432, 'steps': 16345, 'loss/train': 1.660026490688324} 01/27/2022 11:13:53 - INFO - codeparrot_training - Step 16346: {'lr': 0.0003976602193359122, 'samples': 3138624, 'steps': 16346, 'loss/train': 1.1642853319644928} 01/27/2022 11:13:56 - INFO - codeparrot_training - Step 16347: {'lr': 0.00039764701558409955, 'samples': 3138816, 'steps': 16347, 'loss/train': 0.8584884703159332} 01/27/2022 11:13:59 - INFO - codeparrot_training - Step 16348: {'lr': 0.000397633811199814, 'samples': 3139008, 'steps': 16348, 'loss/train': 0.7647995352745056} 01/27/2022 11:14:02 - INFO - codeparrot_training - Step 16349: {'lr': 0.000397620606183112, 'samples': 3139200, 'steps': 16349, 'loss/train': 0.8294972777366638} 01/27/2022 11:14:07 - INFO - codeparrot_training - Step 16350: {'lr': 0.00039760740053405033, 'samples': 3139392, 'steps': 16350, 'loss/train': 0.8636766672134399} 01/27/2022 11:14:11 - INFO - codeparrot_training - Step 16351: {'lr': 0.00039759419425268526, 'samples': 3139584, 'steps': 16351, 'loss/train': 0.9668554365634918} 01/27/2022 11:14:14 - INFO - codeparrot_training - Step 16352: {'lr': 0.00039758098733907364, 'samples': 3139776, 'steps': 16352, 'loss/train': 0.9940418601036072} 01/27/2022 11:14:17 - INFO - codeparrot_training - Step 16353: {'lr': 0.00039756777979327193, 'samples': 3139968, 'steps': 16353, 'loss/train': 0.7608877122402191} 01/27/2022 11:14:20 - INFO - codeparrot_training - Step 16354: {'lr': 0.0003975545716153367, 'samples': 3140160, 'steps': 16354, 'loss/train': 0.759500652551651} 01/27/2022 11:14:23 - INFO - codeparrot_training - Step 16355: {'lr': 0.0003975413628053245, 'samples': 3140352, 'steps': 16355, 'loss/train': 0.6195583194494247} 01/27/2022 11:14:26 - INFO - codeparrot_training - Step 16356: {'lr': 0.000397528153363292, 'samples': 3140544, 'steps': 16356, 'loss/train': 0.7404384166002274} 01/27/2022 11:14:29 - INFO - codeparrot_training - Step 16357: {'lr': 0.00039751494328929565, 'samples': 3140736, 'steps': 16357, 'loss/train': 1.1545848548412323} 01/27/2022 11:14:33 - INFO - codeparrot_training - Step 16358: {'lr': 0.00039750173258339225, 'samples': 3140928, 'steps': 16358, 'loss/train': 1.3863947689533234} 01/27/2022 11:14:37 - INFO - codeparrot_training - Step 16359: {'lr': 0.00039748852124563816, 'samples': 3141120, 'steps': 16359, 'loss/train': 1.242002785205841} 01/27/2022 11:14:40 - INFO - codeparrot_training - Step 16360: {'lr': 0.0003974753092760901, 'samples': 3141312, 'steps': 16360, 'loss/train': 0.4398142397403717} 01/27/2022 11:14:44 - INFO - codeparrot_training - Step 16361: {'lr': 0.00039746209667480473, 'samples': 3141504, 'steps': 16361, 'loss/train': 0.8595219254493713} 01/27/2022 11:14:47 - INFO - codeparrot_training - Step 16362: 
{'lr': 0.00039744888344183846, 'samples': 3141696, 'steps': 16362, 'loss/train': 0.8012799024581909} 01/27/2022 11:14:50 - INFO - codeparrot_training - Step 16363: {'lr': 0.00039743566957724805, 'samples': 3141888, 'steps': 16363, 'loss/train': 0.24459897726774216} 01/27/2022 11:14:53 - INFO - codeparrot_training - Step 16364: {'lr': 0.00039742245508109, 'samples': 3142080, 'steps': 16364, 'loss/train': 0.8025190830230713} 01/27/2022 11:14:56 - INFO - codeparrot_training - Step 16365: {'lr': 0.000397409239953421, 'samples': 3142272, 'steps': 16365, 'loss/train': 0.6357373595237732} 01/27/2022 11:14:59 - INFO - codeparrot_training - Step 16366: {'lr': 0.00039739602419429755, 'samples': 3142464, 'steps': 16366, 'loss/train': 0.8606222569942474} 01/27/2022 11:15:02 - INFO - codeparrot_training - Step 16367: {'lr': 0.00039738280780377645, 'samples': 3142656, 'steps': 16367, 'loss/train': 0.6234107315540314} 01/27/2022 11:15:07 - INFO - codeparrot_training - Step 16368: {'lr': 0.0003973695907819141, 'samples': 3142848, 'steps': 16368, 'loss/train': 0.5991390198469162} 01/27/2022 11:15:10 - INFO - codeparrot_training - Step 16369: {'lr': 0.0003973563731287673, 'samples': 3143040, 'steps': 16369, 'loss/train': 0.7862134873867035} 01/27/2022 11:15:14 - INFO - codeparrot_training - Step 16370: {'lr': 0.00039734315484439255, 'samples': 3143232, 'steps': 16370, 'loss/train': 0.5914807319641113} 01/27/2022 11:15:17 - INFO - codeparrot_training - Step 16371: {'lr': 0.0003973299359288465, 'samples': 3143424, 'steps': 16371, 'loss/train': 0.4683467298746109} 01/27/2022 11:15:20 - INFO - codeparrot_training - Step 16372: {'lr': 0.0003973167163821858, 'samples': 3143616, 'steps': 16372, 'loss/train': 0.8322366178035736} 01/27/2022 11:15:23 - INFO - codeparrot_training - Step 16373: {'lr': 0.0003973034962044671, 'samples': 3143808, 'steps': 16373, 'loss/train': 0.7112137377262115} 01/27/2022 11:15:26 - INFO - codeparrot_training - Step 16374: {'lr': 0.00039729027539574696, 'samples': 3144000, 'steps': 16374, 'loss/train': 0.5096184611320496} 01/27/2022 11:15:29 - INFO - codeparrot_training - Step 16375: {'lr': 0.00039727705395608203, 'samples': 3144192, 'steps': 16375, 'loss/train': 1.239459067583084} 01/27/2022 11:15:34 - INFO - codeparrot_training - Step 16376: {'lr': 0.00039726383188552907, 'samples': 3144384, 'steps': 16376, 'loss/train': 0.7693246901035309} 01/27/2022 11:15:37 - INFO - codeparrot_training - Step 16377: {'lr': 0.00039725060918414446, 'samples': 3144576, 'steps': 16377, 'loss/train': 1.109573632478714} 01/27/2022 11:15:40 - INFO - codeparrot_training - Step 16378: {'lr': 0.0003972373858519851, 'samples': 3144768, 'steps': 16378, 'loss/train': 1.333444207906723} 01/27/2022 11:15:43 - INFO - codeparrot_training - Step 16379: {'lr': 0.00039722416188910754, 'samples': 3144960, 'steps': 16379, 'loss/train': 1.1786867380142212} 01/27/2022 11:15:46 - INFO - codeparrot_training - Step 16380: {'lr': 0.00039721093729556836, 'samples': 3145152, 'steps': 16380, 'loss/train': 0.9484879374504089} 01/27/2022 11:15:49 - INFO - codeparrot_training - Step 16381: {'lr': 0.0003971977120714243, 'samples': 3145344, 'steps': 16381, 'loss/train': 0.8165819048881531} 01/27/2022 11:15:53 - INFO - codeparrot_training - Step 16382: {'lr': 0.000397184486216732, 'samples': 3145536, 'steps': 16382, 'loss/train': 0.8462325632572174} 01/27/2022 11:15:56 - INFO - codeparrot_training - Step 16383: {'lr': 0.0003971712597315481, 'samples': 3145728, 'steps': 16383, 'loss/train': 1.2614806294441223} 01/27/2022 11:15:59 - INFO 
- codeparrot_training - Step 16384: {'lr': 0.0003971580326159292, 'samples': 3145920, 'steps': 16384, 'loss/train': 0.8884590268135071} 01/27/2022 11:16:04 - INFO - codeparrot_training - Step 16385: {'lr': 0.0003971448048699321, 'samples': 3146112, 'steps': 16385, 'loss/train': 0.6288697868585587} 01/27/2022 11:16:07 - INFO - codeparrot_training - Step 16386: {'lr': 0.00039713157649361327, 'samples': 3146304, 'steps': 16386, 'loss/train': 1.0685488879680634} 01/27/2022 11:16:10 - INFO - codeparrot_training - Step 16387: {'lr': 0.00039711834748702956, 'samples': 3146496, 'steps': 16387, 'loss/train': 0.9877242743968964} 01/27/2022 11:16:13 - INFO - codeparrot_training - Step 16388: {'lr': 0.0003971051178502375, 'samples': 3146688, 'steps': 16388, 'loss/train': 0.6256601214408875} 01/27/2022 11:16:16 - INFO - codeparrot_training - Step 16389: {'lr': 0.00039709188758329394, 'samples': 3146880, 'steps': 16389, 'loss/train': 0.6501017957925797} 01/27/2022 11:16:20 - INFO - codeparrot_training - Step 16390: {'lr': 0.0003970786566862553, 'samples': 3147072, 'steps': 16390, 'loss/train': 1.1275242269039154} 01/27/2022 11:16:23 - INFO - codeparrot_training - Step 16391: {'lr': 0.00039706542515917853, 'samples': 3147264, 'steps': 16391, 'loss/train': 0.6267930418252945} 01/27/2022 11:16:26 - INFO - codeparrot_training - Step 16392: {'lr': 0.00039705219300212015, 'samples': 3147456, 'steps': 16392, 'loss/train': 0.8122687339782715} 01/27/2022 11:16:29 - INFO - codeparrot_training - Step 16393: {'lr': 0.00039703896021513684, 'samples': 3147648, 'steps': 16393, 'loss/train': 0.8229024410247803} 01/27/2022 11:16:36 - INFO - codeparrot_training - Step 16394: {'lr': 0.0003970257267982853, 'samples': 3147840, 'steps': 16394, 'loss/train': 1.2244775891304016} 01/27/2022 11:16:39 - INFO - codeparrot_training - Step 16395: {'lr': 0.0003970124927516222, 'samples': 3148032, 'steps': 16395, 'loss/train': 0.433227002620697} 01/27/2022 11:16:42 - INFO - codeparrot_training - Step 16396: {'lr': 0.0003969992580752043, 'samples': 3148224, 'steps': 16396, 'loss/train': 0.5091579258441925} 01/27/2022 11:16:45 - INFO - codeparrot_training - Step 16397: {'lr': 0.00039698602276908826, 'samples': 3148416, 'steps': 16397, 'loss/train': 0.7512785196304321} 01/27/2022 11:16:48 - INFO - codeparrot_training - Step 16398: {'lr': 0.0003969727868333308, 'samples': 3148608, 'steps': 16398, 'loss/train': 0.41959767043590546} 01/27/2022 11:16:51 - INFO - codeparrot_training - Step 16399: {'lr': 0.00039695955026798857, 'samples': 3148800, 'steps': 16399, 'loss/train': 0.6502413153648376} 01/27/2022 11:16:55 - INFO - codeparrot_training - Step 16400: {'lr': 0.0003969463130731183, 'samples': 3148992, 'steps': 16400, 'loss/train': 0.9085874855518341} 01/27/2022 11:16:58 - INFO - codeparrot_training - Step 16401: {'lr': 0.00039693307524877664, 'samples': 3149184, 'steps': 16401, 'loss/train': 0.9677371680736542} 01/27/2022 11:17:01 - INFO - codeparrot_training - Step 16402: {'lr': 0.0003969198367950204, 'samples': 3149376, 'steps': 16402, 'loss/train': 0.7147872000932693} 01/27/2022 11:17:05 - INFO - codeparrot_training - Step 16403: {'lr': 0.00039690659771190616, 'samples': 3149568, 'steps': 16403, 'loss/train': 0.10759841278195381} 01/27/2022 11:17:08 - INFO - codeparrot_training - Step 16404: {'lr': 0.0003968933579994908, 'samples': 3149760, 'steps': 16404, 'loss/train': 0.987971305847168} 01/27/2022 11:17:12 - INFO - codeparrot_training - Step 16405: {'lr': 0.0003968801176578309, 'samples': 3149952, 'steps': 16405, 'loss/train': 
0.45827150344848633} 01/27/2022 11:17:15 - INFO - codeparrot_training - Step 16406: {'lr': 0.00039686687668698316, 'samples': 3150144, 'steps': 16406, 'loss/train': 1.1288044452667236} 01/27/2022 11:17:18 - INFO - codeparrot_training - Step 16407: {'lr': 0.00039685363508700443, 'samples': 3150336, 'steps': 16407, 'loss/train': 0.8957827091217041} 01/27/2022 11:17:21 - INFO - codeparrot_training - Step 16408: {'lr': 0.00039684039285795133, 'samples': 3150528, 'steps': 16408, 'loss/train': 0.41786250472068787} 01/27/2022 11:17:24 - INFO - codeparrot_training - Step 16409: {'lr': 0.0003968271499998806, 'samples': 3150720, 'steps': 16409, 'loss/train': 0.616459384560585} 01/27/2022 11:17:27 - INFO - codeparrot_training - Step 16410: {'lr': 0.000396813906512849, 'samples': 3150912, 'steps': 16410, 'loss/train': 0.1511295922100544} 01/27/2022 11:17:30 - INFO - codeparrot_training - Step 16411: {'lr': 0.00039680066239691325, 'samples': 3151104, 'steps': 16411, 'loss/train': 0.7987346649169922} 01/27/2022 11:17:35 - INFO - codeparrot_training - Step 16412: {'lr': 0.00039678741765213006, 'samples': 3151296, 'steps': 16412, 'loss/train': 0.8277180790901184} 01/27/2022 11:17:38 - INFO - codeparrot_training - Step 16413: {'lr': 0.00039677417227855624, 'samples': 3151488, 'steps': 16413, 'loss/train': 0.8293441236019135} 01/27/2022 11:17:41 - INFO - codeparrot_training - Step 16414: {'lr': 0.0003967609262762484, 'samples': 3151680, 'steps': 16414, 'loss/train': 0.6646831780672073} 01/27/2022 11:17:44 - INFO - codeparrot_training - Step 16415: {'lr': 0.0003967476796452634, 'samples': 3151872, 'steps': 16415, 'loss/train': 1.3371411859989166} 01/27/2022 11:17:47 - INFO - codeparrot_training - Step 16416: {'lr': 0.00039673443238565786, 'samples': 3152064, 'steps': 16416, 'loss/train': 0.9867611825466156} 01/27/2022 11:17:51 - INFO - codeparrot_training - Step 16417: {'lr': 0.0003967211844974887, 'samples': 3152256, 'steps': 16417, 'loss/train': 0.7275371253490448} 01/27/2022 11:17:54 - INFO - codeparrot_training - Step 16418: {'lr': 0.0003967079359808125, 'samples': 3152448, 'steps': 16418, 'loss/train': 0.048546502366662025} 01/27/2022 11:17:57 - INFO - codeparrot_training - Step 16419: {'lr': 0.0003966946868356861, 'samples': 3152640, 'steps': 16419, 'loss/train': 0.9831126630306244} 01/27/2022 11:18:03 - INFO - codeparrot_training - Step 16420: {'lr': 0.0003966814370621663, 'samples': 3152832, 'steps': 16420, 'loss/train': 0.7818227112293243} 01/27/2022 11:18:06 - INFO - codeparrot_training - Step 16421: {'lr': 0.00039666818666030974, 'samples': 3153024, 'steps': 16421, 'loss/train': 1.1001561284065247} 01/27/2022 11:18:09 - INFO - codeparrot_training - Step 16422: {'lr': 0.0003966549356301733, 'samples': 3153216, 'steps': 16422, 'loss/train': 0.5559077560901642} 01/27/2022 11:18:13 - INFO - codeparrot_training - Step 16423: {'lr': 0.0003966416839718136, 'samples': 3153408, 'steps': 16423, 'loss/train': 0.8029322326183319} 01/27/2022 11:18:16 - INFO - codeparrot_training - Step 16424: {'lr': 0.00039662843168528756, 'samples': 3153600, 'steps': 16424, 'loss/train': 0.7819757759571075} 01/27/2022 11:18:19 - INFO - codeparrot_training - Step 16425: {'lr': 0.00039661517877065183, 'samples': 3153792, 'steps': 16425, 'loss/train': 0.7896585166454315} 01/27/2022 11:18:22 - INFO - codeparrot_training - Step 16426: {'lr': 0.0003966019252279633, 'samples': 3153984, 'steps': 16426, 'loss/train': 1.0619692504405975} 01/27/2022 11:18:25 - INFO - codeparrot_training - Step 16427: {'lr': 0.00039658867105727856, 
'samples': 3154176, 'steps': 16427, 'loss/train': 0.4249438941478729} 01/27/2022 11:18:28 - INFO - codeparrot_training - Step 16428: {'lr': 0.0003965754162586547, 'samples': 3154368, 'steps': 16428, 'loss/train': 1.3989788889884949} 01/27/2022 11:18:33 - INFO - codeparrot_training - Step 16429: {'lr': 0.0003965621608321481, 'samples': 3154560, 'steps': 16429, 'loss/train': 0.5543275773525238} 01/27/2022 11:18:36 - INFO - codeparrot_training - Step 16430: {'lr': 0.0003965489047778158, 'samples': 3154752, 'steps': 16430, 'loss/train': 0.22664880752563477} 01/27/2022 11:18:39 - INFO - codeparrot_training - Step 16431: {'lr': 0.0003965356480957145, 'samples': 3154944, 'steps': 16431, 'loss/train': 0.4697231501340866} 01/27/2022 11:18:42 - INFO - codeparrot_training - Step 16432: {'lr': 0.0003965223907859011, 'samples': 3155136, 'steps': 16432, 'loss/train': 0.6264688074588776} 01/27/2022 11:18:45 - INFO - codeparrot_training - Step 16433: {'lr': 0.00039650913284843225, 'samples': 3155328, 'steps': 16433, 'loss/train': 0.8311588168144226} 01/27/2022 11:18:49 - INFO - codeparrot_training - Step 16434: {'lr': 0.00039649587428336474, 'samples': 3155520, 'steps': 16434, 'loss/train': 0.1469871997833252} 01/27/2022 11:18:52 - INFO - codeparrot_training - Step 16435: {'lr': 0.00039648261509075554, 'samples': 3155712, 'steps': 16435, 'loss/train': 0.458076149225235} 01/27/2022 11:18:55 - INFO - codeparrot_training - Step 16436: {'lr': 0.00039646935527066124, 'samples': 3155904, 'steps': 16436, 'loss/train': 1.2633817791938782} 01/27/2022 11:18:58 - INFO - codeparrot_training - Step 16437: {'lr': 0.0003964560948231388, 'samples': 3156096, 'steps': 16437, 'loss/train': 0.6900997012853622} 01/27/2022 11:19:05 - INFO - codeparrot_training - Step 16438: {'lr': 0.0003964428337482449, 'samples': 3156288, 'steps': 16438, 'loss/train': 0.8631270825862885} 01/27/2022 11:19:08 - INFO - codeparrot_training - Step 16439: {'lr': 0.00039642957204603647, 'samples': 3156480, 'steps': 16439, 'loss/train': 0.4677002280950546} 01/27/2022 11:19:11 - INFO - codeparrot_training - Step 16440: {'lr': 0.0003964163097165702, 'samples': 3156672, 'steps': 16440, 'loss/train': 0.7438879609107971} 01/27/2022 11:19:14 - INFO - codeparrot_training - Step 16441: {'lr': 0.0003964030467599029, 'samples': 3156864, 'steps': 16441, 'loss/train': 0.5567301660776138} 01/27/2022 11:19:17 - INFO - codeparrot_training - Step 16442: {'lr': 0.00039638978317609155, 'samples': 3157056, 'steps': 16442, 'loss/train': 0.8895531892776489} 01/27/2022 11:19:20 - INFO - codeparrot_training - Step 16443: {'lr': 0.0003963765189651928, 'samples': 3157248, 'steps': 16443, 'loss/train': 0.70783831179142} 01/27/2022 11:19:23 - INFO - codeparrot_training - Step 16444: {'lr': 0.0003963632541272635, 'samples': 3157440, 'steps': 16444, 'loss/train': 0.9935088157653809} 01/27/2022 11:19:27 - INFO - codeparrot_training - Step 16445: {'lr': 0.00039634998866236047, 'samples': 3157632, 'steps': 16445, 'loss/train': 0.30726467818021774} 01/27/2022 11:19:31 - INFO - codeparrot_training - Step 16446: {'lr': 0.0003963367225705406, 'samples': 3157824, 'steps': 16446, 'loss/train': 1.2245839834213257} 01/27/2022 11:19:34 - INFO - codeparrot_training - Step 16447: {'lr': 0.0003963234558518607, 'samples': 3158016, 'steps': 16447, 'loss/train': 0.6575295925140381} 01/27/2022 11:19:37 - INFO - codeparrot_training - Step 16448: {'lr': 0.0003963101885063776, 'samples': 3158208, 'steps': 16448, 'loss/train': 1.1062726378440857} 01/27/2022 11:19:41 - INFO - codeparrot_training - Step 
16449: {'lr': 0.000396296920534148, 'samples': 3158400, 'steps': 16449, 'loss/train': 1.0263479053974152} 01/27/2022 11:19:44 - INFO - codeparrot_training - Step 16450: {'lr': 0.000396283651935229, 'samples': 3158592, 'steps': 16450, 'loss/train': 1.0904256999492645} 01/27/2022 11:19:47 - INFO - codeparrot_training - Step 16451: {'lr': 0.0003962703827096771, 'samples': 3158784, 'steps': 16451, 'loss/train': 1.1059216260910034} 01/27/2022 11:19:50 - INFO - codeparrot_training - Step 16452: {'lr': 0.00039625711285754943, 'samples': 3158976, 'steps': 16452, 'loss/train': 1.5165544152259827} 01/27/2022 11:19:53 - INFO - codeparrot_training - Step 16453: {'lr': 0.00039624384237890275, 'samples': 3159168, 'steps': 16453, 'loss/train': 0.34100617468357086} 01/27/2022 11:19:56 - INFO - codeparrot_training - Step 16454: {'lr': 0.00039623057127379386, 'samples': 3159360, 'steps': 16454, 'loss/train': 0.6216109246015549} 01/27/2022 11:20:01 - INFO - codeparrot_training - Step 16455: {'lr': 0.0003962172995422796, 'samples': 3159552, 'steps': 16455, 'loss/train': 1.0587633848190308} 01/27/2022 11:20:04 - INFO - codeparrot_training - Step 16456: {'lr': 0.00039620402718441687, 'samples': 3159744, 'steps': 16456, 'loss/train': 0.8901599049568176} 01/27/2022 11:20:07 - INFO - codeparrot_training - Step 16457: {'lr': 0.0003961907542002626, 'samples': 3159936, 'steps': 16457, 'loss/train': 0.12307662516832352} 01/27/2022 11:20:10 - INFO - codeparrot_training - Step 16458: {'lr': 0.00039617748058987345, 'samples': 3160128, 'steps': 16458, 'loss/train': 0.546050637960434} 01/27/2022 11:20:13 - INFO - codeparrot_training - Step 16459: {'lr': 0.0003961642063533065, 'samples': 3160320, 'steps': 16459, 'loss/train': 0.47912295162677765} 01/27/2022 11:20:17 - INFO - codeparrot_training - Step 16460: {'lr': 0.0003961509314906184, 'samples': 3160512, 'steps': 16460, 'loss/train': 0.437875896692276} 01/27/2022 11:20:20 - INFO - codeparrot_training - Step 16461: {'lr': 0.0003961376560018662, 'samples': 3160704, 'steps': 16461, 'loss/train': 0.9406188726425171} 01/27/2022 11:20:23 - INFO - codeparrot_training - Step 16462: {'lr': 0.0003961243798871066, 'samples': 3160896, 'steps': 16462, 'loss/train': 0.7530031800270081} 01/27/2022 11:20:26 - INFO - codeparrot_training - Step 16463: {'lr': 0.00039611110314639663, 'samples': 3161088, 'steps': 16463, 'loss/train': 0.8586059510707855} 01/27/2022 11:20:32 - INFO - codeparrot_training - Step 16464: {'lr': 0.00039609782577979306, 'samples': 3161280, 'steps': 16464, 'loss/train': 0.9285542070865631} 01/27/2022 11:20:36 - INFO - codeparrot_training - Step 16465: {'lr': 0.0003960845477873528, 'samples': 3161472, 'steps': 16465, 'loss/train': 0.984802633523941} 01/27/2022 11:20:39 - INFO - codeparrot_training - Step 16466: {'lr': 0.00039607126916913274, 'samples': 3161664, 'steps': 16466, 'loss/train': 0.9545544683933258} 01/27/2022 11:20:42 - INFO - codeparrot_training - Step 16467: {'lr': 0.00039605798992518973, 'samples': 3161856, 'steps': 16467, 'loss/train': 1.2074414491653442} 01/27/2022 11:20:45 - INFO - codeparrot_training - Step 16468: {'lr': 0.00039604471005558065, 'samples': 3162048, 'steps': 16468, 'loss/train': 0.8222541511058807} 01/27/2022 11:20:48 - INFO - codeparrot_training - Step 16469: {'lr': 0.0003960314295603624, 'samples': 3162240, 'steps': 16469, 'loss/train': 1.9956028461456299} 01/27/2022 11:20:51 - INFO - codeparrot_training - Step 16470: {'lr': 0.00039601814843959193, 'samples': 3162432, 'steps': 16470, 'loss/train': 1.0499126315116882} 01/27/2022 
11:20:54 - INFO - codeparrot_training - Step 16471: {'lr': 0.00039600486669332603, 'samples': 3162624, 'steps': 16471, 'loss/train': 0.0696211326867342} 01/27/2022 11:20:58 - INFO - codeparrot_training - Step 16472: {'lr': 0.00039599158432162163, 'samples': 3162816, 'steps': 16472, 'loss/train': 0.3520548865199089} 01/27/2022 11:21:02 - INFO - codeparrot_training - Step 16473: {'lr': 0.0003959783013245357, 'samples': 3163008, 'steps': 16473, 'loss/train': 0.927079439163208} 01/27/2022 11:21:05 - INFO - codeparrot_training - Step 16474: {'lr': 0.000395965017702125, 'samples': 3163200, 'steps': 16474, 'loss/train': 0.4509286880493164} 01/27/2022 11:21:08 - INFO - codeparrot_training - Step 16475: {'lr': 0.00039595173345444656, 'samples': 3163392, 'steps': 16475, 'loss/train': 1.194792777299881} 01/27/2022 11:21:11 - INFO - codeparrot_training - Step 16476: {'lr': 0.0003959384485815573, 'samples': 3163584, 'steps': 16476, 'loss/train': 0.5340692549943924} 01/27/2022 11:21:15 - INFO - codeparrot_training - Step 16477: {'lr': 0.000395925163083514, 'samples': 3163776, 'steps': 16477, 'loss/train': 1.0951342284679413} 01/27/2022 11:21:18 - INFO - codeparrot_training - Step 16478: {'lr': 0.00039591187696037366, 'samples': 3163968, 'steps': 16478, 'loss/train': 0.8578071892261505} 01/27/2022 11:21:21 - INFO - codeparrot_training - Step 16479: {'lr': 0.0003958985902121931, 'samples': 3164160, 'steps': 16479, 'loss/train': 0.8020130395889282} 01/27/2022 11:21:24 - INFO - codeparrot_training - Step 16480: {'lr': 0.00039588530283902936, 'samples': 3164352, 'steps': 16480, 'loss/train': 0.36279820650815964} 01/27/2022 11:21:27 - INFO - codeparrot_training - Step 16481: {'lr': 0.00039587201484093937, 'samples': 3164544, 'steps': 16481, 'loss/train': 0.9909215569496155} 01/27/2022 11:21:33 - INFO - codeparrot_training - Step 16482: {'lr': 0.0003958587262179799, 'samples': 3164736, 'steps': 16482, 'loss/train': 0.7821823954582214} 01/27/2022 11:21:37 - INFO - codeparrot_training - Step 16483: {'lr': 0.00039584543697020804, 'samples': 3164928, 'steps': 16483, 'loss/train': 0.8446026742458344} 01/27/2022 11:21:40 - INFO - codeparrot_training - Step 16484: {'lr': 0.00039583214709768054, 'samples': 3165120, 'steps': 16484, 'loss/train': 1.0304425656795502} 01/27/2022 11:21:43 - INFO - codeparrot_training - Step 16485: {'lr': 0.00039581885660045445, 'samples': 3165312, 'steps': 16485, 'loss/train': 1.0588009357452393} 01/27/2022 11:21:46 - INFO - codeparrot_training - Step 16486: {'lr': 0.0003958055654785867, 'samples': 3165504, 'steps': 16486, 'loss/train': 0.6049697399139404} 01/27/2022 11:21:49 - INFO - codeparrot_training - Step 16487: {'lr': 0.0003957922737321343, 'samples': 3165696, 'steps': 16487, 'loss/train': 0.9308875501155853} 01/27/2022 11:21:52 - INFO - codeparrot_training - Step 16488: {'lr': 0.00039577898136115397, 'samples': 3165888, 'steps': 16488, 'loss/train': 0.728198915719986} 01/27/2022 11:21:56 - INFO - codeparrot_training - Step 16489: {'lr': 0.00039576568836570283, 'samples': 3166080, 'steps': 16489, 'loss/train': 0.5297000259160995} 01/27/2022 11:22:00 - INFO - codeparrot_training - Step 16490: {'lr': 0.0003957523947458377, 'samples': 3166272, 'steps': 16490, 'loss/train': 0.6964269429445267} 01/27/2022 11:22:03 - INFO - codeparrot_training - Step 16491: {'lr': 0.00039573910050161564, 'samples': 3166464, 'steps': 16491, 'loss/train': 0.696523904800415} 01/27/2022 11:22:06 - INFO - codeparrot_training - Step 16492: {'lr': 0.0003957258056330936, 'samples': 3166656, 'steps': 16492, 
'loss/train': 1.0968183875083923} 01/27/2022 11:22:09 - INFO - codeparrot_training - Step 16493: {'lr': 0.00039571251014032847, 'samples': 3166848, 'steps': 16493, 'loss/train': 1.0563878417015076} 01/27/2022 11:22:12 - INFO - codeparrot_training - Step 16494: {'lr': 0.00039569921402337715, 'samples': 3167040, 'steps': 16494, 'loss/train': 0.7124888151884079} 01/27/2022 11:22:16 - INFO - codeparrot_training - Step 16495: {'lr': 0.00039568591728229667, 'samples': 3167232, 'steps': 16495, 'loss/train': 0.4610670804977417} 01/27/2022 11:22:19 - INFO - codeparrot_training - Step 16496: {'lr': 0.00039567261991714406, 'samples': 3167424, 'steps': 16496, 'loss/train': 1.0298214554786682} 01/27/2022 11:22:22 - INFO - codeparrot_training - Step 16497: {'lr': 0.0003956593219279761, 'samples': 3167616, 'steps': 16497, 'loss/train': 0.6108525842428207} 01/27/2022 11:22:25 - INFO - codeparrot_training - Step 16498: {'lr': 0.00039564602331484993, 'samples': 3167808, 'steps': 16498, 'loss/train': 0.7337829619646072} 01/27/2022 11:22:29 - INFO - codeparrot_training - Step 16499: {'lr': 0.0003956327240778224, 'samples': 3168000, 'steps': 16499, 'loss/train': 0.33012619614601135} 01/27/2022 11:22:33 - INFO - codeparrot_training - Step 16500: {'lr': 0.00039561942421695057, 'samples': 3168192, 'steps': 16500, 'loss/train': 1.0325500667095184} 01/27/2022 11:22:36 - INFO - codeparrot_training - Step 16501: {'lr': 0.00039560612373229135, 'samples': 3168384, 'steps': 16501, 'loss/train': 1.0396359264850616} 01/27/2022 11:22:39 - INFO - codeparrot_training - Step 16502: {'lr': 0.0003955928226239017, 'samples': 3168576, 'steps': 16502, 'loss/train': 0.7965084314346313} 01/27/2022 11:22:42 - INFO - codeparrot_training - Step 16503: {'lr': 0.00039557952089183863, 'samples': 3168768, 'steps': 16503, 'loss/train': 0.3303249031305313} 01/27/2022 11:22:45 - INFO - codeparrot_training - Step 16504: {'lr': 0.00039556621853615914, 'samples': 3168960, 'steps': 16504, 'loss/train': 0.9424332976341248} 01/27/2022 11:22:48 - INFO - codeparrot_training - Step 16505: {'lr': 0.0003955529155569202, 'samples': 3169152, 'steps': 16505, 'loss/train': 1.0110992789268494} 01/27/2022 11:22:51 - INFO - codeparrot_training - Step 16506: {'lr': 0.0003955396119541788, 'samples': 3169344, 'steps': 16506, 'loss/train': 0.050900084897875786} 01/27/2022 11:22:55 - INFO - codeparrot_training - Step 16507: {'lr': 0.00039552630772799185, 'samples': 3169536, 'steps': 16507, 'loss/train': 0.9185109436511993} 01/27/2022 11:22:59 - INFO - codeparrot_training - Step 16508: {'lr': 0.0003955130028784165, 'samples': 3169728, 'steps': 16508, 'loss/train': 0.6492849290370941} 01/27/2022 11:23:02 - INFO - codeparrot_training - Step 16509: {'lr': 0.00039549969740550954, 'samples': 3169920, 'steps': 16509, 'loss/train': 0.6369474828243256} 01/27/2022 11:23:05 - INFO - codeparrot_training - Step 16510: {'lr': 0.00039548639130932816, 'samples': 3170112, 'steps': 16510, 'loss/train': 0.8093358278274536} 01/27/2022 11:23:09 - INFO - codeparrot_training - Step 16511: {'lr': 0.00039547308458992927, 'samples': 3170304, 'steps': 16511, 'loss/train': 1.0987606644630432} 01/27/2022 11:23:12 - INFO - codeparrot_training - Step 16512: {'lr': 0.00039545977724736984, 'samples': 3170496, 'steps': 16512, 'loss/train': 0.16172479838132858} 01/27/2022 11:23:15 - INFO - codeparrot_training - Step 16513: {'lr': 0.00039544646928170695, 'samples': 3170688, 'steps': 16513, 'loss/train': 0.6994959264993668} 01/27/2022 11:23:18 - INFO - codeparrot_training - Step 16514: {'lr': 
0.0003954331606929976, 'samples': 3170880, 'steps': 16514, 'loss/train': 0.30243629962205887} 01/27/2022 11:23:21 - INFO - codeparrot_training - Step 16515: {'lr': 0.00039541985148129865, 'samples': 3171072, 'steps': 16515, 'loss/train': 0.844240665435791} 01/27/2022 11:23:24 - INFO - codeparrot_training - Step 16516: {'lr': 0.00039540654164666735, 'samples': 3171264, 'steps': 16516, 'loss/train': 0.8916550576686859} 01/27/2022 11:23:31 - INFO - codeparrot_training - Step 16517: {'lr': 0.00039539323118916055, 'samples': 3171456, 'steps': 16517, 'loss/train': 1.0751637518405914} 01/27/2022 11:23:34 - INFO - codeparrot_training - Step 16518: {'lr': 0.0003953799201088353, 'samples': 3171648, 'steps': 16518, 'loss/train': 1.0046809315681458} 01/27/2022 11:23:37 - INFO - codeparrot_training - Step 16519: {'lr': 0.00039536660840574866, 'samples': 3171840, 'steps': 16519, 'loss/train': 0.5885586440563202} 01/27/2022 11:23:40 - INFO - codeparrot_training - Step 16520: {'lr': 0.0003953532960799577, 'samples': 3172032, 'steps': 16520, 'loss/train': 1.9914727807044983} 01/27/2022 11:23:43 - INFO - codeparrot_training - Step 16521: {'lr': 0.00039533998313151926, 'samples': 3172224, 'steps': 16521, 'loss/train': 0.8456672430038452} 01/27/2022 11:23:46 - INFO - codeparrot_training - Step 16522: {'lr': 0.0003953266695604906, 'samples': 3172416, 'steps': 16522, 'loss/train': 1.4317070245742798} 01/27/2022 11:23:50 - INFO - codeparrot_training - Step 16523: {'lr': 0.0003953133553669285, 'samples': 3172608, 'steps': 16523, 'loss/train': 0.8191746175289154} 01/27/2022 11:23:53 - INFO - codeparrot_training - Step 16524: {'lr': 0.0003953000405508902, 'samples': 3172800, 'steps': 16524, 'loss/train': 1.1051898300647736} 01/27/2022 11:23:56 - INFO - codeparrot_training - Step 16525: {'lr': 0.00039528672511243256, 'samples': 3172992, 'steps': 16525, 'loss/train': 0.8391530513763428} 01/27/2022 11:24:00 - INFO - codeparrot_training - Step 16526: {'lr': 0.0003952734090516129, 'samples': 3173184, 'steps': 16526, 'loss/train': 0.9399526119232178} 01/27/2022 11:24:03 - INFO - codeparrot_training - Step 16527: {'lr': 0.000395260092368488, 'samples': 3173376, 'steps': 16527, 'loss/train': 0.9037066698074341} 01/27/2022 11:24:06 - INFO - codeparrot_training - Step 16528: {'lr': 0.000395246775063115, 'samples': 3173568, 'steps': 16528, 'loss/train': 0.5166259557008743} 01/27/2022 11:24:10 - INFO - codeparrot_training - Step 16529: {'lr': 0.0003952334571355509, 'samples': 3173760, 'steps': 16529, 'loss/train': 0.9063565135002136} 01/27/2022 11:24:13 - INFO - codeparrot_training - Step 16530: {'lr': 0.0003952201385858528, 'samples': 3173952, 'steps': 16530, 'loss/train': 1.3277464807033539} 01/27/2022 11:24:16 - INFO - codeparrot_training - Step 16531: {'lr': 0.00039520681941407777, 'samples': 3174144, 'steps': 16531, 'loss/train': 0.8441122770309448} 01/27/2022 11:24:19 - INFO - codeparrot_training - Step 16532: {'lr': 0.00039519349962028276, 'samples': 3174336, 'steps': 16532, 'loss/train': 0.8823607563972473} 01/27/2022 11:24:22 - INFO - codeparrot_training - Step 16533: {'lr': 0.000395180179204525, 'samples': 3174528, 'steps': 16533, 'loss/train': 0.9904004037380219} 01/27/2022 11:24:27 - INFO - codeparrot_training - Step 16534: {'lr': 0.0003951668581668614, 'samples': 3174720, 'steps': 16534, 'loss/train': 0.7316027283668518} 01/27/2022 11:24:30 - INFO - codeparrot_training - Step 16535: {'lr': 0.0003951535365073491, 'samples': 3174912, 'steps': 16535, 'loss/train': 0.8587291538715363} 01/27/2022 11:24:33 - INFO - 
codeparrot_training - Step 16536: {'lr': 0.00039514021422604515, 'samples': 3175104, 'steps': 16536, 'loss/train': 0.7947175204753876} 01/27/2022 11:24:36 - INFO - codeparrot_training - Step 16537: {'lr': 0.0003951268913230066, 'samples': 3175296, 'steps': 16537, 'loss/train': 0.48509278893470764} 01/27/2022 11:24:39 - INFO - codeparrot_training - Step 16538: {'lr': 0.0003951135677982904, 'samples': 3175488, 'steps': 16538, 'loss/train': 0.36743807047605515} 01/27/2022 11:24:43 - INFO - codeparrot_training - Step 16539: {'lr': 0.000395100243651954, 'samples': 3175680, 'steps': 16539, 'loss/train': 0.961525171995163} 01/27/2022 11:24:46 - INFO - codeparrot_training - Step 16540: {'lr': 0.00039508691888405403, 'samples': 3175872, 'steps': 16540, 'loss/train': 0.731463298201561} 01/27/2022 11:24:49 - INFO - codeparrot_training - Step 16541: {'lr': 0.0003950735934946478, 'samples': 3176064, 'steps': 16541, 'loss/train': 0.6246278285980225} 01/27/2022 11:24:52 - INFO - codeparrot_training - Step 16542: {'lr': 0.0003950602674837924, 'samples': 3176256, 'steps': 16542, 'loss/train': 1.0293222069740295} 01/27/2022 11:24:58 - INFO - codeparrot_training - Step 16543: {'lr': 0.0003950469408515449, 'samples': 3176448, 'steps': 16543, 'loss/train': 1.0070224106311798} 01/27/2022 11:25:01 - INFO - codeparrot_training - Step 16544: {'lr': 0.00039503361359796235, 'samples': 3176640, 'steps': 16544, 'loss/train': 0.6476696133613586} 01/27/2022 11:25:04 - INFO - codeparrot_training - Step 16545: {'lr': 0.00039502028572310186, 'samples': 3176832, 'steps': 16545, 'loss/train': 0.918239951133728} 01/27/2022 11:25:08 - INFO - codeparrot_training - Step 16546: {'lr': 0.0003950069572270205, 'samples': 3177024, 'steps': 16546, 'loss/train': 0.9084859192371368} 01/27/2022 11:25:11 - INFO - codeparrot_training - Step 16547: {'lr': 0.00039499362810977535, 'samples': 3177216, 'steps': 16547, 'loss/train': 0.5326113402843475} 01/27/2022 11:25:14 - INFO - codeparrot_training - Step 16548: {'lr': 0.00039498029837142356, 'samples': 3177408, 'steps': 16548, 'loss/train': 0.6391774266958237} 01/27/2022 11:25:17 - INFO - codeparrot_training - Step 16549: {'lr': 0.0003949669680120223, 'samples': 3177600, 'steps': 16549, 'loss/train': 0.9387180805206299} 01/27/2022 11:25:20 - INFO - codeparrot_training - Step 16550: {'lr': 0.00039495363703162843, 'samples': 3177792, 'steps': 16550, 'loss/train': 0.5353071838617325} 01/27/2022 11:25:23 - INFO - codeparrot_training - Step 16551: {'lr': 0.00039494030543029925, 'samples': 3177984, 'steps': 16551, 'loss/train': 0.7367417514324188} 01/27/2022 11:25:28 - INFO - codeparrot_training - Step 16552: {'lr': 0.0003949269732080919, 'samples': 3178176, 'steps': 16552, 'loss/train': 0.39576494693756104} 01/27/2022 11:25:31 - INFO - codeparrot_training - Step 16553: {'lr': 0.0003949136403650633, 'samples': 3178368, 'steps': 16553, 'loss/train': 0.7498573064804077} 01/27/2022 11:25:34 - INFO - codeparrot_training - Step 16554: {'lr': 0.0003949003069012708, 'samples': 3178560, 'steps': 16554, 'loss/train': 0.422603040933609} 01/27/2022 11:25:37 - INFO - codeparrot_training - Step 16555: {'lr': 0.0003948869728167713, 'samples': 3178752, 'steps': 16555, 'loss/train': 1.057502567768097} 01/27/2022 11:25:40 - INFO - codeparrot_training - Step 16556: {'lr': 0.0003948736381116221, 'samples': 3178944, 'steps': 16556, 'loss/train': 0.19354160875082016} 01/27/2022 11:25:44 - INFO - codeparrot_training - Step 16557: {'lr': 0.0003948603027858802, 'samples': 3179136, 'steps': 16557, 'loss/train': 
0.2749708369374275} 01/27/2022 11:25:47 - INFO - codeparrot_training - Step 16558: {'lr': 0.00039484696683960276, 'samples': 3179328, 'steps': 16558, 'loss/train': 0.7165920585393906} 01/27/2022 11:25:50 - INFO - codeparrot_training - Step 16559: {'lr': 0.0003948336302728469, 'samples': 3179520, 'steps': 16559, 'loss/train': 0.99613156914711} 01/27/2022 11:25:56 - INFO - codeparrot_training - Step 16560: {'lr': 0.0003948202930856697, 'samples': 3179712, 'steps': 16560, 'loss/train': 0.8270643353462219} 01/27/2022 11:25:59 - INFO - codeparrot_training - Step 16561: {'lr': 0.0003948069552781285, 'samples': 3179904, 'steps': 16561, 'loss/train': 0.676588699221611} 01/27/2022 11:26:03 - INFO - codeparrot_training - Step 16562: {'lr': 0.00039479361685028016, 'samples': 3180096, 'steps': 16562, 'loss/train': 0.783758819103241} 01/27/2022 11:26:06 - INFO - codeparrot_training - Step 16563: {'lr': 0.00039478027780218193, 'samples': 3180288, 'steps': 16563, 'loss/train': 0.6258453726768494} 01/27/2022 11:26:09 - INFO - codeparrot_training - Step 16564: {'lr': 0.00039476693813389105, 'samples': 3180480, 'steps': 16564, 'loss/train': 0.8174875974655151} 01/27/2022 11:26:12 - INFO - codeparrot_training - Step 16565: {'lr': 0.0003947535978454645, 'samples': 3180672, 'steps': 16565, 'loss/train': 0.8849004507064819} 01/27/2022 11:26:15 - INFO - codeparrot_training - Step 16566: {'lr': 0.0003947402569369596, 'samples': 3180864, 'steps': 16566, 'loss/train': 1.084971696138382} 01/27/2022 11:26:18 - INFO - codeparrot_training - Step 16567: {'lr': 0.0003947269154084333, 'samples': 3181056, 'steps': 16567, 'loss/train': 1.10421484708786} 01/27/2022 11:26:21 - INFO - codeparrot_training - Step 16568: {'lr': 0.0003947135732599428, 'samples': 3181248, 'steps': 16568, 'loss/train': 1.1927977502346039} 01/27/2022 11:26:26 - INFO - codeparrot_training - Step 16569: {'lr': 0.00039470023049154544, 'samples': 3181440, 'steps': 16569, 'loss/train': 1.3085245192050934} 01/27/2022 11:26:29 - INFO - codeparrot_training - Step 16570: {'lr': 0.00039468688710329826, 'samples': 3181632, 'steps': 16570, 'loss/train': 0.7902727425098419} 01/27/2022 11:26:32 - INFO - codeparrot_training - Step 16571: {'lr': 0.0003946735430952583, 'samples': 3181824, 'steps': 16571, 'loss/train': 0.7922590970993042} 01/27/2022 11:26:36 - INFO - codeparrot_training - Step 16572: {'lr': 0.0003946601984674828, 'samples': 3182016, 'steps': 16572, 'loss/train': 0.5704865902662277} 01/27/2022 11:26:39 - INFO - codeparrot_training - Step 16573: {'lr': 0.00039464685322002904, 'samples': 3182208, 'steps': 16573, 'loss/train': 0.6855838000774384} 01/27/2022 11:26:42 - INFO - codeparrot_training - Step 16574: {'lr': 0.000394633507352954, 'samples': 3182400, 'steps': 16574, 'loss/train': 0.7145133018493652} 01/27/2022 11:26:45 - INFO - codeparrot_training - Step 16575: {'lr': 0.00039462016086631505, 'samples': 3182592, 'steps': 16575, 'loss/train': 0.972789466381073} 01/27/2022 11:26:48 - INFO - codeparrot_training - Step 16576: {'lr': 0.00039460681376016915, 'samples': 3182784, 'steps': 16576, 'loss/train': 0.35083360970020294} 01/27/2022 11:26:51 - INFO - codeparrot_training - Step 16577: {'lr': 0.0003945934660345736, 'samples': 3182976, 'steps': 16577, 'loss/train': 0.3271612375974655} 01/27/2022 11:26:56 - INFO - codeparrot_training - Step 16578: {'lr': 0.00039458011768958557, 'samples': 3183168, 'steps': 16578, 'loss/train': 0.8080206513404846} 01/27/2022 11:26:59 - INFO - codeparrot_training - Step 16579: {'lr': 0.00039456676872526227, 'samples': 
3183360, 'steps': 16579, 'loss/train': 1.1806439459323883} 01/27/2022 11:27:02 - INFO - codeparrot_training - Step 16580: {'lr': 0.00039455341914166074, 'samples': 3183552, 'steps': 16580, 'loss/train': 0.8876824378967285} 01/27/2022 11:27:05 - INFO - codeparrot_training - Step 16581: {'lr': 0.0003945400689388384, 'samples': 3183744, 'steps': 16581, 'loss/train': 0.9121866524219513} 01/27/2022 11:27:08 - INFO - codeparrot_training - Step 16582: {'lr': 0.00039452671811685214, 'samples': 3183936, 'steps': 16582, 'loss/train': 1.10377836227417} 01/27/2022 11:27:11 - INFO - codeparrot_training - Step 16583: {'lr': 0.00039451336667575945, 'samples': 3184128, 'steps': 16583, 'loss/train': 0.6815927177667618} 01/27/2022 11:27:15 - INFO - codeparrot_training - Step 16584: {'lr': 0.0003945000146156173, 'samples': 3184320, 'steps': 16584, 'loss/train': 0.2676375210285187} 01/27/2022 11:27:18 - INFO - codeparrot_training - Step 16585: {'lr': 0.00039448666193648305, 'samples': 3184512, 'steps': 16585, 'loss/train': 0.5688890665769577} 01/27/2022 11:27:21 - INFO - codeparrot_training - Step 16586: {'lr': 0.0003944733086384137, 'samples': 3184704, 'steps': 16586, 'loss/train': 0.8929148018360138} 01/27/2022 11:27:25 - INFO - codeparrot_training - Step 16587: {'lr': 0.00039445995472146665, 'samples': 3184896, 'steps': 16587, 'loss/train': 0.7771497666835785} 01/27/2022 11:27:28 - INFO - codeparrot_training - Step 16588: {'lr': 0.000394446600185699, 'samples': 3185088, 'steps': 16588, 'loss/train': 1.4753319025039673} 01/27/2022 11:27:32 - INFO - codeparrot_training - Step 16589: {'lr': 0.000394433245031168, 'samples': 3185280, 'steps': 16589, 'loss/train': 0.16334032639861107} 01/27/2022 11:27:35 - INFO - codeparrot_training - Step 16590: {'lr': 0.0003944198892579309, 'samples': 3185472, 'steps': 16590, 'loss/train': 1.1409006714820862} 01/27/2022 11:27:38 - INFO - codeparrot_training - Step 16591: {'lr': 0.0003944065328660447, 'samples': 3185664, 'steps': 16591, 'loss/train': 0.2998780086636543} 01/27/2022 11:27:41 - INFO - codeparrot_training - Step 16592: {'lr': 0.0003943931758555669, 'samples': 3185856, 'steps': 16592, 'loss/train': 1.0056205093860626} 01/27/2022 11:27:44 - INFO - codeparrot_training - Step 16593: {'lr': 0.00039437981822655453, 'samples': 3186048, 'steps': 16593, 'loss/train': 1.02582648396492} 01/27/2022 11:27:47 - INFO - codeparrot_training - Step 16594: {'lr': 0.00039436645997906487, 'samples': 3186240, 'steps': 16594, 'loss/train': 0.6022307574748993} 01/27/2022 11:27:50 - INFO - codeparrot_training - Step 16595: {'lr': 0.00039435310111315513, 'samples': 3186432, 'steps': 16595, 'loss/train': 0.6373828500509262} 01/27/2022 11:27:55 - INFO - codeparrot_training - Step 16596: {'lr': 0.00039433974162888266, 'samples': 3186624, 'steps': 16596, 'loss/train': 1.0721024572849274} 01/27/2022 11:27:58 - INFO - codeparrot_training - Step 16597: {'lr': 0.0003943263815263044, 'samples': 3186816, 'steps': 16597, 'loss/train': 0.6327148228883743} 01/27/2022 11:28:01 - INFO - codeparrot_training - Step 16598: {'lr': 0.0003943130208054778, 'samples': 3187008, 'steps': 16598, 'loss/train': 0.9300316572189331} 01/27/2022 11:28:04 - INFO - codeparrot_training - Step 16599: {'lr': 0.0003942996594664601, 'samples': 3187200, 'steps': 16599, 'loss/train': 0.8318009376525879} 01/27/2022 11:28:08 - INFO - codeparrot_training - Step 16600: {'lr': 0.00039428629750930846, 'samples': 3187392, 'steps': 16600, 'loss/train': 0.79554882645607} 01/27/2022 11:28:11 - INFO - codeparrot_training - Step 16601: {'lr': 
0.0003942729349340801, 'samples': 3187584, 'steps': 16601, 'loss/train': 0.7911353409290314} 01/27/2022 11:28:14 - INFO - codeparrot_training - Step 16602: {'lr': 0.00039425957174083224, 'samples': 3187776, 'steps': 16602, 'loss/train': 0.3604074865579605} 01/27/2022 11:28:17 - INFO - codeparrot_training - Step 16603: {'lr': 0.0003942462079296223, 'samples': 3187968, 'steps': 16603, 'loss/train': 0.7541225552558899} 01/27/2022 11:28:23 - INFO - codeparrot_training - Step 16604: {'lr': 0.00039423284350050735, 'samples': 3188160, 'steps': 16604, 'loss/train': 0.6731172949075699} 01/27/2022 11:28:26 - INFO - codeparrot_training - Step 16605: {'lr': 0.00039421947845354476, 'samples': 3188352, 'steps': 16605, 'loss/train': 0.5511913150548935} 01/27/2022 11:28:29 - INFO - codeparrot_training - Step 16606: {'lr': 0.0003942061127887916, 'samples': 3188544, 'steps': 16606, 'loss/train': 0.9432304501533508} 01/27/2022 11:28:33 - INFO - codeparrot_training - Step 16607: {'lr': 0.00039419274650630536, 'samples': 3188736, 'steps': 16607, 'loss/train': 1.0078726708889008} 01/27/2022 11:28:36 - INFO - codeparrot_training - Step 16608: {'lr': 0.00039417937960614316, 'samples': 3188928, 'steps': 16608, 'loss/train': 1.0350541770458221} 01/27/2022 11:28:39 - INFO - codeparrot_training - Step 16609: {'lr': 0.0003941660120883622, 'samples': 3189120, 'steps': 16609, 'loss/train': 0.5650066137313843} 01/27/2022 11:28:42 - INFO - codeparrot_training - Step 16610: {'lr': 0.0003941526439530199, 'samples': 3189312, 'steps': 16610, 'loss/train': 1.2736766338348389} 01/27/2022 11:28:45 - INFO - codeparrot_training - Step 16611: {'lr': 0.00039413927520017347, 'samples': 3189504, 'steps': 16611, 'loss/train': 0.9660032093524933} 01/27/2022 11:28:48 - INFO - codeparrot_training - Step 16612: {'lr': 0.00039412590582988007, 'samples': 3189696, 'steps': 16612, 'loss/train': 0.38302333652973175} 01/27/2022 11:28:53 - INFO - codeparrot_training - Step 16613: {'lr': 0.00039411253584219707, 'samples': 3189888, 'steps': 16613, 'loss/train': 0.6929687261581421} 01/27/2022 11:28:56 - INFO - codeparrot_training - Step 16614: {'lr': 0.0003940991652371818, 'samples': 3190080, 'steps': 16614, 'loss/train': 0.8751102089881897} 01/27/2022 11:28:59 - INFO - codeparrot_training - Step 16615: {'lr': 0.0003940857940148914, 'samples': 3190272, 'steps': 16615, 'loss/train': 1.2635843753814697} 01/27/2022 11:29:02 - INFO - codeparrot_training - Step 16616: {'lr': 0.00039407242217538317, 'samples': 3190464, 'steps': 16616, 'loss/train': 0.8175921142101288} 01/27/2022 11:29:05 - INFO - codeparrot_training - Step 16617: {'lr': 0.00039405904971871454, 'samples': 3190656, 'steps': 16617, 'loss/train': 0.6919123381376266} 01/27/2022 11:29:08 - INFO - codeparrot_training - Step 16618: {'lr': 0.00039404567664494264, 'samples': 3190848, 'steps': 16618, 'loss/train': 0.508542001247406} 01/27/2022 11:29:12 - INFO - codeparrot_training - Step 16619: {'lr': 0.0003940323029541248, 'samples': 3191040, 'steps': 16619, 'loss/train': 0.5910428613424301} 01/27/2022 11:29:15 - INFO - codeparrot_training - Step 16620: {'lr': 0.00039401892864631826, 'samples': 3191232, 'steps': 16620, 'loss/train': 0.9494670331478119} 01/27/2022 11:29:18 - INFO - codeparrot_training - Step 16621: {'lr': 0.0003940055537215804, 'samples': 3191424, 'steps': 16621, 'loss/train': 0.7614655494689941} 01/27/2022 11:29:24 - INFO - codeparrot_training - Step 16622: {'lr': 0.0003939921781799685, 'samples': 3191616, 'steps': 16622, 'loss/train': 0.7803365886211395} 01/27/2022 11:29:27 - INFO 
- codeparrot_training - Step 16623: {'lr': 0.0003939788020215398, 'samples': 3191808, 'steps': 16623, 'loss/train': 0.5933124125003815} 01/27/2022 11:29:31 - INFO - codeparrot_training - Step 16624: {'lr': 0.0003939654252463517, 'samples': 3192000, 'steps': 16624, 'loss/train': 0.6415375471115112} 01/27/2022 11:29:34 - INFO - codeparrot_training - Step 16625: {'lr': 0.00039395204785446137, 'samples': 3192192, 'steps': 16625, 'loss/train': 0.9080023169517517} 01/27/2022 11:29:37 - INFO - codeparrot_training - Step 16626: {'lr': 0.00039393866984592616, 'samples': 3192384, 'steps': 16626, 'loss/train': 0.2179461568593979} 01/27/2022 11:29:40 - INFO - codeparrot_training - Step 16627: {'lr': 0.00039392529122080343, 'samples': 3192576, 'steps': 16627, 'loss/train': 0.790966272354126} 01/27/2022 11:29:43 - INFO - codeparrot_training - Step 16628: {'lr': 0.0003939119119791504, 'samples': 3192768, 'steps': 16628, 'loss/train': 0.6232640594244003} 01/27/2022 11:29:46 - INFO - codeparrot_training - Step 16629: {'lr': 0.0003938985321210245, 'samples': 3192960, 'steps': 16629, 'loss/train': 0.29216165095567703} 01/27/2022 11:29:49 - INFO - codeparrot_training - Step 16630: {'lr': 0.00039388515164648293, 'samples': 3193152, 'steps': 16630, 'loss/train': 0.6077028661966324} 01/27/2022 11:29:54 - INFO - codeparrot_training - Step 16631: {'lr': 0.0003938717705555831, 'samples': 3193344, 'steps': 16631, 'loss/train': 0.9190243184566498} 01/27/2022 11:29:57 - INFO - codeparrot_training - Step 16632: {'lr': 0.0003938583888483823, 'samples': 3193536, 'steps': 16632, 'loss/train': 0.7955794036388397} 01/27/2022 11:30:00 - INFO - codeparrot_training - Step 16633: {'lr': 0.0003938450065249378, 'samples': 3193728, 'steps': 16633, 'loss/train': 0.5880499631166458} 01/27/2022 11:30:03 - INFO - codeparrot_training - Step 16634: {'lr': 0.00039383162358530696, 'samples': 3193920, 'steps': 16634, 'loss/train': 0.5856458097696304} 01/27/2022 11:30:07 - INFO - codeparrot_training - Step 16635: {'lr': 0.0003938182400295471, 'samples': 3194112, 'steps': 16635, 'loss/train': 0.9202377498149872} 01/27/2022 11:30:10 - INFO - codeparrot_training - Step 16636: {'lr': 0.00039380485585771563, 'samples': 3194304, 'steps': 16636, 'loss/train': 0.4655890166759491} 01/27/2022 11:30:13 - INFO - codeparrot_training - Step 16637: {'lr': 0.00039379147106986985, 'samples': 3194496, 'steps': 16637, 'loss/train': 0.5169586837291718} 01/27/2022 11:30:16 - INFO - codeparrot_training - Step 16638: {'lr': 0.00039377808566606697, 'samples': 3194688, 'steps': 16638, 'loss/train': 1.0658023059368134} 01/27/2022 11:30:19 - INFO - codeparrot_training - Step 16639: {'lr': 0.00039376469964636445, 'samples': 3194880, 'steps': 16639, 'loss/train': 1.6631666421890259} 01/27/2022 11:30:24 - INFO - codeparrot_training - Step 16640: {'lr': 0.0003937513130108197, 'samples': 3195072, 'steps': 16640, 'loss/train': 0.7431111931800842} 01/27/2022 11:30:27 - INFO - codeparrot_training - Step 16641: {'lr': 0.00039373792575948986, 'samples': 3195264, 'steps': 16641, 'loss/train': 0.8739589154720306} 01/27/2022 11:30:30 - INFO - codeparrot_training - Step 16642: {'lr': 0.00039372453789243245, 'samples': 3195456, 'steps': 16642, 'loss/train': 1.3651200234889984} 01/27/2022 11:30:33 - INFO - codeparrot_training - Step 16643: {'lr': 0.0003937111494097047, 'samples': 3195648, 'steps': 16643, 'loss/train': 0.9363327026367188} 01/27/2022 11:30:36 - INFO - codeparrot_training - Step 16644: {'lr': 0.0003936977603113641, 'samples': 3195840, 'steps': 16644, 'loss/train': 
0.73640276491642} 01/27/2022 11:30:39 - INFO - codeparrot_training - Step 16645: {'lr': 0.00039368437059746785, 'samples': 3196032, 'steps': 16645, 'loss/train': 0.8628147840499878} 01/27/2022 11:30:42 - INFO - codeparrot_training - Step 16646: {'lr': 0.0003936709802680734, 'samples': 3196224, 'steps': 16646, 'loss/train': 0.8485980033874512} 01/27/2022 11:30:46 - INFO - codeparrot_training - Step 16647: {'lr': 0.0003936575893232381, 'samples': 3196416, 'steps': 16647, 'loss/train': 0.6648914963006973} 01/27/2022 11:30:52 - INFO - codeparrot_training - Step 16648: {'lr': 0.0003936441977630193, 'samples': 3196608, 'steps': 16648, 'loss/train': 0.5970194488763809} 01/27/2022 11:30:56 - INFO - codeparrot_training - Step 16649: {'lr': 0.0003936308055874744, 'samples': 3196800, 'steps': 16649, 'loss/train': 0.8995825946331024} 01/27/2022 11:30:59 - INFO - codeparrot_training - Step 16650: {'lr': 0.00039361741279666065, 'samples': 3196992, 'steps': 16650, 'loss/train': 1.0453486740589142} 01/27/2022 11:31:02 - INFO - codeparrot_training - Step 16651: {'lr': 0.0003936040193906356, 'samples': 3197184, 'steps': 16651, 'loss/train': 0.5045066624879837} 01/27/2022 11:31:05 - INFO - codeparrot_training - Step 16652: {'lr': 0.00039359062536945645, 'samples': 3197376, 'steps': 16652, 'loss/train': 0.6516699492931366} 01/27/2022 11:31:08 - INFO - codeparrot_training - Step 16653: {'lr': 0.00039357723073318076, 'samples': 3197568, 'steps': 16653, 'loss/train': 0.8535715341567993} 01/27/2022 11:31:11 - INFO - codeparrot_training - Step 16654: {'lr': 0.0003935638354818657, 'samples': 3197760, 'steps': 16654, 'loss/train': 1.2044422030448914} 01/27/2022 11:31:14 - INFO - codeparrot_training - Step 16655: {'lr': 0.0003935504396155688, 'samples': 3197952, 'steps': 16655, 'loss/train': 0.6183807402849197} 01/27/2022 11:31:18 - INFO - codeparrot_training - Step 16656: {'lr': 0.00039353704313434745, 'samples': 3198144, 'steps': 16656, 'loss/train': 0.3731156140565872} 01/27/2022 11:31:22 - INFO - codeparrot_training - Step 16657: {'lr': 0.000393523646038259, 'samples': 3198336, 'steps': 16657, 'loss/train': 0.6135188341140747} 01/27/2022 11:31:25 - INFO - codeparrot_training - Step 16658: {'lr': 0.0003935102483273607, 'samples': 3198528, 'steps': 16658, 'loss/train': 0.8142899572849274} 01/27/2022 11:31:29 - INFO - codeparrot_training - Step 16659: {'lr': 0.0003934968500017101, 'samples': 3198720, 'steps': 16659, 'loss/train': 0.7979467213153839} 01/27/2022 11:31:32 - INFO - codeparrot_training - Step 16660: {'lr': 0.0003934834510613646, 'samples': 3198912, 'steps': 16660, 'loss/train': 0.4622219502925873} 01/27/2022 11:31:35 - INFO - codeparrot_training - Step 16661: {'lr': 0.00039347005150638156, 'samples': 3199104, 'steps': 16661, 'loss/train': 1.2901637256145477} 01/27/2022 11:31:38 - INFO - codeparrot_training - Step 16662: {'lr': 0.0003934566513368183, 'samples': 3199296, 'steps': 16662, 'loss/train': 0.14947481453418732} 01/27/2022 11:31:41 - INFO - codeparrot_training - Step 16663: {'lr': 0.00039344325055273236, 'samples': 3199488, 'steps': 16663, 'loss/train': 0.6703546196222305} 01/27/2022 11:31:44 - INFO - codeparrot_training - Step 16664: {'lr': 0.0003934298491541811, 'samples': 3199680, 'steps': 16664, 'loss/train': 0.8610442578792572} 01/27/2022 11:31:47 - INFO - codeparrot_training - Step 16665: {'lr': 0.00039341644714122195, 'samples': 3199872, 'steps': 16665, 'loss/train': 0.700242206454277} 01/27/2022 11:31:54 - INFO - codeparrot_training - Step 16666: {'lr': 0.00039340304451391216, 'samples': 
3200064, 'steps': 16666, 'loss/train': 1.6794821619987488} 01/27/2022 11:31:57 - INFO - codeparrot_training - Step 16667: {'lr': 0.00039338964127230935, 'samples': 3200256, 'steps': 16667, 'loss/train': 0.14145388081669807} 01/27/2022 11:32:00 - INFO - codeparrot_training - Step 16668: {'lr': 0.00039337623741647084, 'samples': 3200448, 'steps': 16668, 'loss/train': 0.5690004676580429} 01/27/2022 11:32:03 - INFO - codeparrot_training - Step 16669: {'lr': 0.000393362832946454, 'samples': 3200640, 'steps': 16669, 'loss/train': 5.116445302963257} 01/27/2022 11:32:07 - INFO - codeparrot_training - Step 16670: {'lr': 0.0003933494278623164, 'samples': 3200832, 'steps': 16670, 'loss/train': 0.8027774691581726} 01/27/2022 11:32:10 - INFO - codeparrot_training - Step 16671: {'lr': 0.0003933360221641153, 'samples': 3201024, 'steps': 16671, 'loss/train': 0.8326976001262665} 01/27/2022 11:32:13 - INFO - codeparrot_training - Step 16672: {'lr': 0.0003933226158519082, 'samples': 3201216, 'steps': 16672, 'loss/train': 0.7170132994651794} 01/27/2022 11:32:16 - INFO - codeparrot_training - Step 16673: {'lr': 0.0003933092089257525, 'samples': 3201408, 'steps': 16673, 'loss/train': 1.3849074840545654} 01/27/2022 11:32:19 - INFO - codeparrot_training - Step 16674: {'lr': 0.0003932958013857057, 'samples': 3201600, 'steps': 16674, 'loss/train': 1.15932235121727} 01/27/2022 11:32:23 - INFO - codeparrot_training - Step 16675: {'lr': 0.0003932823932318252, 'samples': 3201792, 'steps': 16675, 'loss/train': 0.7461849600076675} 01/27/2022 11:32:27 - INFO - codeparrot_training - Step 16676: {'lr': 0.0003932689844641684, 'samples': 3201984, 'steps': 16676, 'loss/train': 1.2937967777252197} 01/27/2022 11:32:30 - INFO - codeparrot_training - Step 16677: {'lr': 0.00039325557508279276, 'samples': 3202176, 'steps': 16677, 'loss/train': 1.228144884109497} 01/27/2022 11:32:33 - INFO - codeparrot_training - Step 16678: {'lr': 0.00039324216508775567, 'samples': 3202368, 'steps': 16678, 'loss/train': 1.1171397864818573} 01/27/2022 11:32:36 - INFO - codeparrot_training - Step 16679: {'lr': 0.0003932287544791148, 'samples': 3202560, 'steps': 16679, 'loss/train': 0.6430477201938629} 01/27/2022 11:32:39 - INFO - codeparrot_training - Step 16680: {'lr': 0.00039321534325692726, 'samples': 3202752, 'steps': 16680, 'loss/train': 0.35831549763679504} 01/27/2022 11:32:42 - INFO - codeparrot_training - Step 16681: {'lr': 0.0003932019314212507, 'samples': 3202944, 'steps': 16681, 'loss/train': 0.5996710360050201} 01/27/2022 11:32:45 - INFO - codeparrot_training - Step 16682: {'lr': 0.0003931885189721426, 'samples': 3203136, 'steps': 16682, 'loss/train': 0.999709278345108} 01/27/2022 11:32:49 - INFO - codeparrot_training - Step 16683: {'lr': 0.00039317510590966033, 'samples': 3203328, 'steps': 16683, 'loss/train': 1.5889348983764648} 01/27/2022 11:32:53 - INFO - codeparrot_training - Step 16684: {'lr': 0.0003931616922338613, 'samples': 3203520, 'steps': 16684, 'loss/train': 0.8786716461181641} 01/27/2022 11:32:56 - INFO - codeparrot_training - Step 16685: {'lr': 0.00039314827794480314, 'samples': 3203712, 'steps': 16685, 'loss/train': 1.4999263286590576} 01/27/2022 11:32:59 - INFO - codeparrot_training - Step 16686: {'lr': 0.00039313486304254315, 'samples': 3203904, 'steps': 16686, 'loss/train': 0.8605344593524933} 01/27/2022 11:33:03 - INFO - codeparrot_training - Step 16687: {'lr': 0.00039312144752713885, 'samples': 3204096, 'steps': 16687, 'loss/train': 0.9058350026607513} 01/27/2022 11:33:06 - INFO - codeparrot_training - Step 16688: 
{'lr': 0.00039310803139864777, 'samples': 3204288, 'steps': 16688, 'loss/train': 1.2467730939388275} 01/27/2022 11:33:09 - INFO - codeparrot_training - Step 16689: {'lr': 0.00039309461465712725, 'samples': 3204480, 'steps': 16689, 'loss/train': 0.6246209442615509} 01/27/2022 11:33:12 - INFO - codeparrot_training - Step 16690: {'lr': 0.00039308119730263494, 'samples': 3204672, 'steps': 16690, 'loss/train': 0.9108148813247681} 01/27/2022 11:33:15 - INFO - codeparrot_training - Step 16691: {'lr': 0.00039306777933522806, 'samples': 3204864, 'steps': 16691, 'loss/train': 0.9141234755516052} 01/27/2022 11:33:18 - INFO - codeparrot_training - Step 16692: {'lr': 0.00039305436075496436, 'samples': 3205056, 'steps': 16692, 'loss/train': 0.856312483549118} 01/27/2022 11:33:23 - INFO - codeparrot_training - Step 16693: {'lr': 0.0003930409415619012, 'samples': 3205248, 'steps': 16693, 'loss/train': 0.6361477375030518} 01/27/2022 11:33:26 - INFO - codeparrot_training - Step 16694: {'lr': 0.000393027521756096, 'samples': 3205440, 'steps': 16694, 'loss/train': 0.8870329856872559} 01/27/2022 11:33:29 - INFO - codeparrot_training - Step 16695: {'lr': 0.0003930141013376064, 'samples': 3205632, 'steps': 16695, 'loss/train': 0.9518773555755615} 01/27/2022 11:33:32 - INFO - codeparrot_training - Step 16696: {'lr': 0.00039300068030648976, 'samples': 3205824, 'steps': 16696, 'loss/train': 0.4598803371191025} 01/27/2022 11:33:35 - INFO - codeparrot_training - Step 16697: {'lr': 0.0003929872586628036, 'samples': 3206016, 'steps': 16697, 'loss/train': 0.6698092371225357} 01/27/2022 11:33:38 - INFO - codeparrot_training - Step 16698: {'lr': 0.00039297383640660545, 'samples': 3206208, 'steps': 16698, 'loss/train': 0.8733636438846588} 01/27/2022 11:33:42 - INFO - codeparrot_training - Step 16699: {'lr': 0.0003929604135379528, 'samples': 3206400, 'steps': 16699, 'loss/train': 0.6255200654268265} 01/27/2022 11:33:45 - INFO - codeparrot_training - Step 16700: {'lr': 0.000392946990056903, 'samples': 3206592, 'steps': 16700, 'loss/train': 0.5678131878376007} 01/27/2022 11:33:51 - INFO - codeparrot_training - Step 16701: {'lr': 0.0003929335659635139, 'samples': 3206784, 'steps': 16701, 'loss/train': 0.9285567104816437} 01/27/2022 11:33:54 - INFO - codeparrot_training - Step 16702: {'lr': 0.00039292014125784266, 'samples': 3206976, 'steps': 16702, 'loss/train': 0.9886830747127533} 01/27/2022 11:33:57 - INFO - codeparrot_training - Step 16703: {'lr': 0.00039290671593994697, 'samples': 3207168, 'steps': 16703, 'loss/train': 0.555996760725975} 01/27/2022 11:34:01 - INFO - codeparrot_training - Step 16704: {'lr': 0.0003928932900098842, 'samples': 3207360, 'steps': 16704, 'loss/train': 0.04711763933300972} 01/27/2022 11:34:04 - INFO - codeparrot_training - Step 16705: {'lr': 0.00039287986346771205, 'samples': 3207552, 'steps': 16705, 'loss/train': 1.9233128428459167} 01/27/2022 11:34:07 - INFO - codeparrot_training - Step 16706: {'lr': 0.0003928664363134879, 'samples': 3207744, 'steps': 16706, 'loss/train': 0.8812556862831116} 01/27/2022 11:34:10 - INFO - codeparrot_training - Step 16707: {'lr': 0.00039285300854726926, 'samples': 3207936, 'steps': 16707, 'loss/train': 0.876979798078537} 01/27/2022 11:34:13 - INFO - codeparrot_training - Step 16708: {'lr': 0.00039283958016911373, 'samples': 3208128, 'steps': 16708, 'loss/train': 0.8022417426109314} 01/27/2022 11:34:16 - INFO - codeparrot_training - Step 16709: {'lr': 0.00039282615117907884, 'samples': 3208320, 'steps': 16709, 'loss/train': 0.7989880442619324} 01/27/2022 11:34:21 - 
INFO - codeparrot_training - Step 16710: {'lr': 0.00039281272157722205, 'samples': 3208512, 'steps': 16710, 'loss/train': 0.514949306845665} 01/27/2022 11:34:24 - INFO - codeparrot_training - Step 16711: {'lr': 0.0003927992913636008, 'samples': 3208704, 'steps': 16711, 'loss/train': 0.10661918297410011} 01/27/2022 11:34:27 - INFO - codeparrot_training - Step 16712: {'lr': 0.0003927858605382728, 'samples': 3208896, 'steps': 16712, 'loss/train': 1.2503102123737335} 01/27/2022 11:34:30 - INFO - codeparrot_training - Step 16713: {'lr': 0.0003927724291012955, 'samples': 3209088, 'steps': 16713, 'loss/train': 0.765852302312851} 01/27/2022 11:34:33 - INFO - codeparrot_training - Step 16714: {'lr': 0.00039275899705272656, 'samples': 3209280, 'steps': 16714, 'loss/train': 0.6418425589799881} 01/27/2022 11:34:37 - INFO - codeparrot_training - Step 16715: {'lr': 0.00039274556439262325, 'samples': 3209472, 'steps': 16715, 'loss/train': 0.7408009618520737} 01/27/2022 11:34:40 - INFO - codeparrot_training - Step 16716: {'lr': 0.0003927321311210434, 'samples': 3209664, 'steps': 16716, 'loss/train': 0.6624245792627335} 01/27/2022 11:34:43 - INFO - codeparrot_training - Step 16717: {'lr': 0.00039271869723804434, 'samples': 3209856, 'steps': 16717, 'loss/train': 0.47735631465911865} 01/27/2022 11:34:46 - INFO - codeparrot_training - Step 16718: {'lr': 0.0003927052627436837, 'samples': 3210048, 'steps': 16718, 'loss/train': 0.68913334608078} 01/27/2022 11:34:50 - INFO - codeparrot_training - Step 16719: {'lr': 0.000392691827638019, 'samples': 3210240, 'steps': 16719, 'loss/train': 0.9620359539985657} 01/27/2022 11:34:54 - INFO - codeparrot_training - Step 16720: {'lr': 0.000392678391921108, 'samples': 3210432, 'steps': 16720, 'loss/train': 0.8534120321273804} 01/27/2022 11:34:57 - INFO - codeparrot_training - Step 16721: {'lr': 0.00039266495559300786, 'samples': 3210624, 'steps': 16721, 'loss/train': 0.7651798725128174} 01/27/2022 11:35:00 - INFO - codeparrot_training - Step 16722: {'lr': 0.00039265151865377644, 'samples': 3210816, 'steps': 16722, 'loss/train': 0.8505843579769135} 01/27/2022 11:35:03 - INFO - codeparrot_training - Step 16723: {'lr': 0.0003926380811034712, 'samples': 3211008, 'steps': 16723, 'loss/train': 0.688309907913208} 01/27/2022 11:35:06 - INFO - codeparrot_training - Step 16724: {'lr': 0.0003926246429421497, 'samples': 3211200, 'steps': 16724, 'loss/train': 0.560474082827568} 01/27/2022 11:35:09 - INFO - codeparrot_training - Step 16725: {'lr': 0.0003926112041698696, 'samples': 3211392, 'steps': 16725, 'loss/train': 0.8942853212356567} 01/27/2022 11:35:12 - INFO - codeparrot_training - Step 16726: {'lr': 0.0003925977647866883, 'samples': 3211584, 'steps': 16726, 'loss/train': 0.6951418071985245} 01/27/2022 11:35:16 - INFO - codeparrot_training - Step 16727: {'lr': 0.0003925843247926635, 'samples': 3211776, 'steps': 16727, 'loss/train': 0.31534988433122635} 01/27/2022 11:35:22 - INFO - codeparrot_training - Step 16728: {'lr': 0.00039257088418785267, 'samples': 3211968, 'steps': 16728, 'loss/train': 1.0744222104549408} 01/27/2022 11:35:25 - INFO - codeparrot_training - Step 16729: {'lr': 0.00039255744297231354, 'samples': 3212160, 'steps': 16729, 'loss/train': 0.7197470963001251} 01/27/2022 11:35:28 - INFO - codeparrot_training - Step 16730: {'lr': 0.0003925440011461035, 'samples': 3212352, 'steps': 16730, 'loss/train': 0.9369017779827118} 01/27/2022 11:35:31 - INFO - codeparrot_training - Step 16731: {'lr': 0.0003925305587092802, 'samples': 3212544, 'steps': 16731, 'loss/train': 
1.1171679496765137} 01/27/2022 11:35:34 - INFO - codeparrot_training - Step 16732: {'lr': 0.00039251711566190133, 'samples': 3212736, 'steps': 16732, 'loss/train': 0.5926946997642517} 01/27/2022 11:35:38 - INFO - codeparrot_training - Step 16733: {'lr': 0.0003925036720040244, 'samples': 3212928, 'steps': 16733, 'loss/train': 0.6009356081485748} 01/27/2022 11:35:41 - INFO - codeparrot_training - Step 16734: {'lr': 0.000392490227735707, 'samples': 3213120, 'steps': 16734, 'loss/train': 1.1935933828353882} 01/27/2022 11:35:44 - INFO - codeparrot_training - Step 16735: {'lr': 0.0003924767828570066, 'samples': 3213312, 'steps': 16735, 'loss/train': 0.8786973059177399} 01/27/2022 11:35:47 - INFO - codeparrot_training - Step 16736: {'lr': 0.00039246333736798095, 'samples': 3213504, 'steps': 16736, 'loss/train': 0.8101111650466919} 01/27/2022 11:35:51 - INFO - codeparrot_training - Step 16737: {'lr': 0.00039244989126868755, 'samples': 3213696, 'steps': 16737, 'loss/train': 0.9912816882133484} 01/27/2022 11:35:55 - INFO - codeparrot_training - Step 16738: {'lr': 0.0003924364445591842, 'samples': 3213888, 'steps': 16738, 'loss/train': 1.0460240542888641} 01/27/2022 11:35:58 - INFO - codeparrot_training - Step 16739: {'lr': 0.0003924229972395282, 'samples': 3214080, 'steps': 16739, 'loss/train': 1.021691232919693} 01/27/2022 11:36:01 - INFO - codeparrot_training - Step 16740: {'lr': 0.00039240954930977744, 'samples': 3214272, 'steps': 16740, 'loss/train': 0.6755223423242569} 01/27/2022 11:36:04 - INFO - codeparrot_training - Step 16741: {'lr': 0.0003923961007699893, 'samples': 3214464, 'steps': 16741, 'loss/train': 0.8301410973072052} 01/27/2022 11:36:07 - INFO - codeparrot_training - Step 16742: {'lr': 0.00039238265162022147, 'samples': 3214656, 'steps': 16742, 'loss/train': 1.0666493475437164} 01/27/2022 11:36:10 - INFO - codeparrot_training - Step 16743: {'lr': 0.0003923692018605316, 'samples': 3214848, 'steps': 16743, 'loss/train': 1.1022498607635498} 01/27/2022 11:36:13 - INFO - codeparrot_training - Step 16744: {'lr': 0.0003923557514909773, 'samples': 3215040, 'steps': 16744, 'loss/train': 1.1632480323314667} 01/27/2022 11:36:20 - INFO - codeparrot_training - Step 16745: {'lr': 0.00039234230051161614, 'samples': 3215232, 'steps': 16745, 'loss/train': 1.0130780339241028} 01/27/2022 11:36:23 - INFO - codeparrot_training - Step 16746: {'lr': 0.00039232884892250575, 'samples': 3215424, 'steps': 16746, 'loss/train': 0.9194342494010925} 01/27/2022 11:36:26 - INFO - codeparrot_training - Step 16747: {'lr': 0.00039231539672370376, 'samples': 3215616, 'steps': 16747, 'loss/train': 0.7772029638290405} 01/27/2022 11:36:29 - INFO - codeparrot_training - Step 16748: {'lr': 0.00039230194391526784, 'samples': 3215808, 'steps': 16748, 'loss/train': 1.2626834213733673} 01/27/2022 11:36:32 - INFO - codeparrot_training - Step 16749: {'lr': 0.0003922884904972556, 'samples': 3216000, 'steps': 16749, 'loss/train': 0.8118246495723724} 01/27/2022 11:36:36 - INFO - codeparrot_training - Step 16750: {'lr': 0.0003922750364697246, 'samples': 3216192, 'steps': 16750, 'loss/train': 0.8513975143432617} 01/27/2022 11:36:39 - INFO - codeparrot_training - Step 16751: {'lr': 0.0003922615818327325, 'samples': 3216384, 'steps': 16751, 'loss/train': 0.9015172719955444} 01/27/2022 11:36:42 - INFO - codeparrot_training - Step 16752: {'lr': 0.000392248126586337, 'samples': 3216576, 'steps': 16752, 'loss/train': 0.9106683433055878} 01/27/2022 11:36:45 - INFO - codeparrot_training - Step 16753: {'lr': 0.0003922346707305957, 'samples': 
3216768, 'steps': 16753, 'loss/train': 0.7612093985080719} 01/27/2022 11:36:49 - INFO - codeparrot_training - Step 16754: {'lr': 0.00039222121426556617, 'samples': 3216960, 'steps': 16754, 'loss/train': 0.3005252256989479} 01/27/2022 11:36:53 - INFO - codeparrot_training - Step 16755: {'lr': 0.0003922077571913062, 'samples': 3217152, 'steps': 16755, 'loss/train': 0.6153963804244995} 01/27/2022 11:36:56 - INFO - codeparrot_training - Step 16756: {'lr': 0.00039219429950787326, 'samples': 3217344, 'steps': 16756, 'loss/train': 0.860878586769104} 01/27/2022 11:36:59 - INFO - codeparrot_training - Step 16757: {'lr': 0.0003921808412153252, 'samples': 3217536, 'steps': 16757, 'loss/train': 0.6527019292116165} 01/27/2022 11:37:02 - INFO - codeparrot_training - Step 16758: {'lr': 0.0003921673823137195, 'samples': 3217728, 'steps': 16758, 'loss/train': 0.58095283806324} 01/27/2022 11:37:05 - INFO - codeparrot_training - Step 16759: {'lr': 0.00039215392280311383, 'samples': 3217920, 'steps': 16759, 'loss/train': 0.9643063545227051} 01/27/2022 11:37:08 - INFO - codeparrot_training - Step 16760: {'lr': 0.000392140462683566, 'samples': 3218112, 'steps': 16760, 'loss/train': 0.8284721374511719} 01/27/2022 11:37:11 - INFO - codeparrot_training - Step 16761: {'lr': 0.0003921270019551335, 'samples': 3218304, 'steps': 16761, 'loss/train': 5.530507922172546} 01/27/2022 11:37:15 - INFO - codeparrot_training - Step 16762: {'lr': 0.00039211354061787407, 'samples': 3218496, 'steps': 16762, 'loss/train': 0.6056576818227768} 01/27/2022 11:37:20 - INFO - codeparrot_training - Step 16763: {'lr': 0.0003921000786718454, 'samples': 3218688, 'steps': 16763, 'loss/train': 0.5534693598747253} 01/27/2022 11:37:23 - INFO - codeparrot_training - Step 16764: {'lr': 0.0003920866161171051, 'samples': 3218880, 'steps': 16764, 'loss/train': 0.796954482793808} 01/27/2022 11:37:26 - INFO - codeparrot_training - Step 16765: {'lr': 0.0003920731529537108, 'samples': 3219072, 'steps': 16765, 'loss/train': 0.9207988679409027} 01/27/2022 11:37:29 - INFO - codeparrot_training - Step 16766: {'lr': 0.00039205968918172026, 'samples': 3219264, 'steps': 16766, 'loss/train': 1.3204565048217773} 01/27/2022 11:37:32 - INFO - codeparrot_training - Step 16767: {'lr': 0.00039204622480119107, 'samples': 3219456, 'steps': 16767, 'loss/train': 1.726703703403473} 01/27/2022 11:37:35 - INFO - codeparrot_training - Step 16768: {'lr': 0.000392032759812181, 'samples': 3219648, 'steps': 16768, 'loss/train': 0.7946044206619263} 01/27/2022 11:37:38 - INFO - codeparrot_training - Step 16769: {'lr': 0.0003920192942147477, 'samples': 3219840, 'steps': 16769, 'loss/train': 0.9263950288295746} 01/27/2022 11:37:42 - INFO - codeparrot_training - Step 16770: {'lr': 0.00039200582800894885, 'samples': 3220032, 'steps': 16770, 'loss/train': 1.0182404816150665} 01/27/2022 11:37:45 - INFO - codeparrot_training - Step 16771: {'lr': 0.00039199236119484207, 'samples': 3220224, 'steps': 16771, 'loss/train': 0.31293871253728867} 01/27/2022 11:37:50 - INFO - codeparrot_training - Step 16772: {'lr': 0.0003919788937724852, 'samples': 3220416, 'steps': 16772, 'loss/train': 0.9355471730232239} 01/27/2022 11:37:54 - INFO - codeparrot_training - Step 16773: {'lr': 0.0003919654257419357, 'samples': 3220608, 'steps': 16773, 'loss/train': 0.6336187720298767} 01/27/2022 11:37:57 - INFO - codeparrot_training - Step 16774: {'lr': 0.0003919519571032515, 'samples': 3220800, 'steps': 16774, 'loss/train': 1.2122949957847595} 01/27/2022 11:38:00 - INFO - codeparrot_training - Step 16775: {'lr': 
0.00039193848785649016, 'samples': 3220992, 'steps': 16775, 'loss/train': 0.5471028238534927} 01/27/2022 11:38:03 - INFO - codeparrot_training - Step 16776: {'lr': 0.0003919250180017094, 'samples': 3221184, 'steps': 16776, 'loss/train': 0.05610623583197594} 01/27/2022 11:38:06 - INFO - codeparrot_training - Step 16777: {'lr': 0.00039191154753896696, 'samples': 3221376, 'steps': 16777, 'loss/train': 0.3731239289045334} 01/27/2022 11:38:09 - INFO - codeparrot_training - Step 16778: {'lr': 0.00039189807646832045, 'samples': 3221568, 'steps': 16778, 'loss/train': 0.3794650733470917} 01/27/2022 11:38:12 - INFO - codeparrot_training - Step 16779: {'lr': 0.0003918846047898277, 'samples': 3221760, 'steps': 16779, 'loss/train': 1.1759809255599976} 01/27/2022 11:38:16 - INFO - codeparrot_training - Step 16780: {'lr': 0.00039187113250354635, 'samples': 3221952, 'steps': 16780, 'loss/train': 0.6449707746505737} 01/27/2022 11:38:20 - INFO - codeparrot_training - Step 16781: {'lr': 0.00039185765960953405, 'samples': 3222144, 'steps': 16781, 'loss/train': 0.2099103033542633} 01/27/2022 11:38:23 - INFO - codeparrot_training - Step 16782: {'lr': 0.0003918441861078486, 'samples': 3222336, 'steps': 16782, 'loss/train': 0.8961345255374908} 01/27/2022 11:38:26 - INFO - codeparrot_training - Step 16783: {'lr': 0.0003918307119985477, 'samples': 3222528, 'steps': 16783, 'loss/train': 0.732565239071846} 01/27/2022 11:38:30 - INFO - codeparrot_training - Step 16784: {'lr': 0.0003918172372816892, 'samples': 3222720, 'steps': 16784, 'loss/train': 0.7673167884349823} 01/27/2022 11:38:33 - INFO - codeparrot_training - Step 16785: {'lr': 0.0003918037619573305, 'samples': 3222912, 'steps': 16785, 'loss/train': 0.6072016060352325} 01/27/2022 11:38:36 - INFO - codeparrot_training - Step 16786: {'lr': 0.0003917902860255296, 'samples': 3223104, 'steps': 16786, 'loss/train': 0.8642038106918335} 01/27/2022 11:38:39 - INFO - codeparrot_training - Step 16787: {'lr': 0.0003917768094863441, 'samples': 3223296, 'steps': 16787, 'loss/train': 0.08242721110582352} 01/27/2022 11:38:42 - INFO - codeparrot_training - Step 16788: {'lr': 0.00039176333233983187, 'samples': 3223488, 'steps': 16788, 'loss/train': 0.5627690702676773} 01/27/2022 11:38:45 - INFO - codeparrot_training - Step 16789: {'lr': 0.0003917498545860504, 'samples': 3223680, 'steps': 16789, 'loss/train': 2.5399033427238464} 01/27/2022 11:38:50 - INFO - codeparrot_training - Step 16790: {'lr': 0.0003917363762250576, 'samples': 3223872, 'steps': 16790, 'loss/train': 0.9434401094913483} 01/27/2022 11:38:53 - INFO - codeparrot_training - Step 16791: {'lr': 0.00039172289725691124, 'samples': 3224064, 'steps': 16791, 'loss/train': 0.8442219793796539} 01/27/2022 11:38:56 - INFO - codeparrot_training - Step 16792: {'lr': 0.000391709417681669, 'samples': 3224256, 'steps': 16792, 'loss/train': 0.9742890894412994} 01/27/2022 11:38:59 - INFO - codeparrot_training - Step 16793: {'lr': 0.0003916959374993885, 'samples': 3224448, 'steps': 16793, 'loss/train': 0.5274993628263474} 01/27/2022 11:39:02 - INFO - codeparrot_training - Step 16794: {'lr': 0.0003916824567101277, 'samples': 3224640, 'steps': 16794, 'loss/train': 1.027316004037857} 01/27/2022 11:39:05 - INFO - codeparrot_training - Step 16795: {'lr': 0.0003916689753139442, 'samples': 3224832, 'steps': 16795, 'loss/train': 1.153609961271286} 01/27/2022 11:39:08 - INFO - codeparrot_training - Step 16796: {'lr': 0.0003916554933108958, 'samples': 3225024, 'steps': 16796, 'loss/train': 0.6559967547655106} 01/27/2022 11:39:12 - INFO - 
codeparrot_training - Step 16797: {'lr': 0.0003916420107010402, 'samples': 3225216, 'steps': 16797, 'loss/train': 0.15351517125964165} 01/27/2022 11:39:18 - INFO - codeparrot_training - Step 16798: {'lr': 0.0003916285274844353, 'samples': 3225408, 'steps': 16798, 'loss/train': 0.991411417722702} 01/27/2022 11:39:21 - INFO - codeparrot_training - Step 16799: {'lr': 0.0003916150436611387, 'samples': 3225600, 'steps': 16799, 'loss/train': 0.7129653096199036} 01/27/2022 11:39:24 - INFO - codeparrot_training - Step 16800: {'lr': 0.0003916015592312082, 'samples': 3225792, 'steps': 16800, 'loss/train': 0.6359747350215912} 01/27/2022 11:39:27 - INFO - codeparrot_training - Step 16801: {'lr': 0.00039158807419470166, 'samples': 3225984, 'steps': 16801, 'loss/train': 1.3880435228347778} 01/27/2022 11:39:30 - INFO - codeparrot_training - Step 16802: {'lr': 0.0003915745885516767, 'samples': 3226176, 'steps': 16802, 'loss/train': 0.6845333129167557} 01/27/2022 11:39:34 - INFO - codeparrot_training - Step 16803: {'lr': 0.0003915611023021912, 'samples': 3226368, 'steps': 16803, 'loss/train': 0.7006503939628601} 01/27/2022 11:39:37 - INFO - codeparrot_training - Step 16804: {'lr': 0.00039154761544630287, 'samples': 3226560, 'steps': 16804, 'loss/train': 1.3191989064216614} 01/27/2022 11:39:40 - INFO - codeparrot_training - Step 16805: {'lr': 0.0003915341279840695, 'samples': 3226752, 'steps': 16805, 'loss/train': 1.1211770474910736} 01/27/2022 11:39:43 - INFO - codeparrot_training - Step 16806: {'lr': 0.00039152063991554885, 'samples': 3226944, 'steps': 16806, 'loss/train': 0.77216836810112} 01/27/2022 11:39:47 - INFO - codeparrot_training - Step 16807: {'lr': 0.0003915071512407987, 'samples': 3227136, 'steps': 16807, 'loss/train': 1.017994076013565} 01/27/2022 11:39:51 - INFO - codeparrot_training - Step 16808: {'lr': 0.0003914936619598769, 'samples': 3227328, 'steps': 16808, 'loss/train': 0.9824958443641663} 01/27/2022 11:39:54 - INFO - codeparrot_training - Step 16809: {'lr': 0.00039148017207284115, 'samples': 3227520, 'steps': 16809, 'loss/train': 0.7405836582183838} 01/27/2022 11:39:57 - INFO - codeparrot_training - Step 16810: {'lr': 0.0003914666815797493, 'samples': 3227712, 'steps': 16810, 'loss/train': 1.2057855427265167} 01/27/2022 11:40:00 - INFO - codeparrot_training - Step 16811: {'lr': 0.00039145319048065907, 'samples': 3227904, 'steps': 16811, 'loss/train': 0.8157295882701874} 01/27/2022 11:40:03 - INFO - codeparrot_training - Step 16812: {'lr': 0.00039143969877562833, 'samples': 3228096, 'steps': 16812, 'loss/train': 0.5855377167463303} 01/27/2022 11:40:06 - INFO - codeparrot_training - Step 16813: {'lr': 0.00039142620646471485, 'samples': 3228288, 'steps': 16813, 'loss/train': 0.38132715225219727} 01/27/2022 11:40:09 - INFO - codeparrot_training - Step 16814: {'lr': 0.00039141271354797635, 'samples': 3228480, 'steps': 16814, 'loss/train': 1.009420484304428} 01/27/2022 11:40:13 - INFO - codeparrot_training - Step 16815: {'lr': 0.0003913992200254707, 'samples': 3228672, 'steps': 16815, 'loss/train': 0.924778014421463} 01/27/2022 11:40:17 - INFO - codeparrot_training - Step 16816: {'lr': 0.0003913857258972557, 'samples': 3228864, 'steps': 16816, 'loss/train': 1.2576481997966766} 01/27/2022 11:40:20 - INFO - codeparrot_training - Step 16817: {'lr': 0.0003913722311633892, 'samples': 3229056, 'steps': 16817, 'loss/train': 1.390502393245697} 01/27/2022 11:40:23 - INFO - codeparrot_training - Step 16818: {'lr': 0.0003913587358239288, 'samples': 3229248, 'steps': 16818, 'loss/train': 
1.2251645922660828} 01/27/2022 11:40:26 - INFO - codeparrot_training - Step 16819: {'lr': 0.0003913452398789326, 'samples': 3229440, 'steps': 16819, 'loss/train': 0.6522023230791092} 01/27/2022 11:40:30 - INFO - codeparrot_training - Step 16820: {'lr': 0.0003913317433284582, 'samples': 3229632, 'steps': 16820, 'loss/train': 0.8496522903442383} 01/27/2022 11:40:33 - INFO - codeparrot_training - Step 16821: {'lr': 0.00039131824617256354, 'samples': 3229824, 'steps': 16821, 'loss/train': 0.6969248503446579} 01/27/2022 11:40:36 - INFO - codeparrot_training - Step 16822: {'lr': 0.0003913047484113064, 'samples': 3230016, 'steps': 16822, 'loss/train': 0.4682147651910782} 01/27/2022 11:40:39 - INFO - codeparrot_training - Step 16823: {'lr': 0.0003912912500447445, 'samples': 3230208, 'steps': 16823, 'loss/train': 0.7875845432281494} 01/27/2022 11:40:42 - INFO - codeparrot_training - Step 16824: {'lr': 0.0003912777510729358, 'samples': 3230400, 'steps': 16824, 'loss/train': 0.5978379249572754} 01/27/2022 11:40:49 - INFO - codeparrot_training - Step 16825: {'lr': 0.0003912642514959381, 'samples': 3230592, 'steps': 16825, 'loss/train': 0.11844279617071152} 01/27/2022 11:40:52 - INFO - codeparrot_training - Step 16826: {'lr': 0.00039125075131380923, 'samples': 3230784, 'steps': 16826, 'loss/train': 1.0011006295681} 01/27/2022 11:40:55 - INFO - codeparrot_training - Step 16827: {'lr': 0.00039123725052660696, 'samples': 3230976, 'steps': 16827, 'loss/train': 0.04758444428443909} 01/27/2022 11:40:58 - INFO - codeparrot_training - Step 16828: {'lr': 0.00039122374913438913, 'samples': 3231168, 'steps': 16828, 'loss/train': 1.0879873931407928} 01/27/2022 11:41:01 - INFO - codeparrot_training - Step 16829: {'lr': 0.00039121024713721365, 'samples': 3231360, 'steps': 16829, 'loss/train': 0.9653832614421844} 01/27/2022 11:41:04 - INFO - codeparrot_training - Step 16830: {'lr': 0.0003911967445351382, 'samples': 3231552, 'steps': 16830, 'loss/train': 0.9355493187904358} 01/27/2022 11:41:08 - INFO - codeparrot_training - Step 16831: {'lr': 0.00039118324132822083, 'samples': 3231744, 'steps': 16831, 'loss/train': 0.6660778373479843} 01/27/2022 11:41:11 - INFO - codeparrot_training - Step 16832: {'lr': 0.0003911697375165193, 'samples': 3231936, 'steps': 16832, 'loss/train': 0.9991978704929352} 01/27/2022 11:41:15 - INFO - codeparrot_training - Step 16833: {'lr': 0.00039115623310009135, 'samples': 3232128, 'steps': 16833, 'loss/train': 0.8684123754501343} 01/27/2022 11:41:18 - INFO - codeparrot_training - Step 16834: {'lr': 0.00039114272807899496, 'samples': 3232320, 'steps': 16834, 'loss/train': 0.8950560986995697} 01/27/2022 11:41:22 - INFO - codeparrot_training - Step 16835: {'lr': 0.000391129222453288, 'samples': 3232512, 'steps': 16835, 'loss/train': 1.0092829763889313} 01/27/2022 11:41:25 - INFO - codeparrot_training - Step 16836: {'lr': 0.00039111571622302824, 'samples': 3232704, 'steps': 16836, 'loss/train': 0.9063351452350616} 01/27/2022 11:41:28 - INFO - codeparrot_training - Step 16837: {'lr': 0.0003911022093882736, 'samples': 3232896, 'steps': 16837, 'loss/train': 0.598248302936554} 01/27/2022 11:41:31 - INFO - codeparrot_training - Step 16838: {'lr': 0.00039108870194908175, 'samples': 3233088, 'steps': 16838, 'loss/train': 0.8673560321331024} 01/27/2022 11:41:34 - INFO - codeparrot_training - Step 16839: {'lr': 0.00039107519390551085, 'samples': 3233280, 'steps': 16839, 'loss/train': 0.9127008318901062} 01/27/2022 11:41:37 - INFO - codeparrot_training - Step 16840: {'lr': 0.00039106168525761855, 
'samples': 3233472, 'steps': 16840, 'loss/train': 0.7007948756217957} 01/27/2022 11:41:40 - INFO - codeparrot_training - Step 16841: {'lr': 0.00039104817600546277, 'samples': 3233664, 'steps': 16841, 'loss/train': 0.7678492069244385} 01/27/2022 11:41:45 - INFO - codeparrot_training - Step 16842: {'lr': 0.00039103466614910144, 'samples': 3233856, 'steps': 16842, 'loss/train': 0.9272978603839874} 01/27/2022 11:41:48 - INFO - codeparrot_training - Step 16843: {'lr': 0.0003910211556885923, 'samples': 3234048, 'steps': 16843, 'loss/train': 0.6041042357683182} 01/27/2022 11:41:51 - INFO - codeparrot_training - Step 16844: {'lr': 0.0003910076446239934, 'samples': 3234240, 'steps': 16844, 'loss/train': 0.9683482646942139} 01/27/2022 11:41:54 - INFO - codeparrot_training - Step 16845: {'lr': 0.00039099413295536246, 'samples': 3234432, 'steps': 16845, 'loss/train': 0.6289411336183548} 01/27/2022 11:41:58 - INFO - codeparrot_training - Step 16846: {'lr': 0.0003909806206827575, 'samples': 3234624, 'steps': 16846, 'loss/train': 0.3367649093270302} 01/27/2022 11:42:01 - INFO - codeparrot_training - Step 16847: {'lr': 0.00039096710780623625, 'samples': 3234816, 'steps': 16847, 'loss/train': 1.2910743355751038} 01/27/2022 11:42:04 - INFO - codeparrot_training - Step 16848: {'lr': 0.0003909535943258567, 'samples': 3235008, 'steps': 16848, 'loss/train': 0.6661932170391083} 01/27/2022 11:42:07 - INFO - codeparrot_training - Step 16849: {'lr': 0.0003909400802416767, 'samples': 3235200, 'steps': 16849, 'loss/train': 0.4175412654876709} 01/27/2022 11:42:10 - INFO - codeparrot_training - Step 16850: {'lr': 0.00039092656555375416, 'samples': 3235392, 'steps': 16850, 'loss/train': 1.1915354132652283} 01/27/2022 11:42:16 - INFO - codeparrot_training - Step 16851: {'lr': 0.00039091305026214704, 'samples': 3235584, 'steps': 16851, 'loss/train': 0.780450314283371} 01/27/2022 11:42:19 - INFO - codeparrot_training - Step 16852: {'lr': 0.0003908995343669131, 'samples': 3235776, 'steps': 16852, 'loss/train': 0.6312952190637589} 01/27/2022 11:42:22 - INFO - codeparrot_training - Step 16853: {'lr': 0.0003908860178681102, 'samples': 3235968, 'steps': 16853, 'loss/train': 1.2336612045764923} 01/27/2022 11:42:25 - INFO - codeparrot_training - Step 16854: {'lr': 0.0003908725007657964, 'samples': 3236160, 'steps': 16854, 'loss/train': 0.5714336335659027} 01/27/2022 11:42:28 - INFO - codeparrot_training - Step 16855: {'lr': 0.0003908589830600296, 'samples': 3236352, 'steps': 16855, 'loss/train': 0.7534431517124176} 01/27/2022 11:42:31 - INFO - codeparrot_training - Step 16856: {'lr': 0.0003908454647508676, 'samples': 3236544, 'steps': 16856, 'loss/train': 1.0875763893127441} 01/27/2022 11:42:35 - INFO - codeparrot_training - Step 16857: {'lr': 0.00039083194583836836, 'samples': 3236736, 'steps': 16857, 'loss/train': 0.6401609480381012} 01/27/2022 11:42:38 - INFO - codeparrot_training - Step 16858: {'lr': 0.0003908184263225898, 'samples': 3236928, 'steps': 16858, 'loss/train': 2.4269081354141235} 01/27/2022 11:42:41 - INFO - codeparrot_training - Step 16859: {'lr': 0.0003908049062035898, 'samples': 3237120, 'steps': 16859, 'loss/train': 0.7410305142402649} 01/27/2022 11:42:45 - INFO - codeparrot_training - Step 16860: {'lr': 0.0003907913854814262, 'samples': 3237312, 'steps': 16860, 'loss/train': 0.9646310806274414} 01/27/2022 11:42:48 - INFO - codeparrot_training - Step 16861: {'lr': 0.00039077786415615714, 'samples': 3237504, 'steps': 16861, 'loss/train': 0.7373970597982407} 01/27/2022 11:42:52 - INFO - codeparrot_training - 
Step 16862: {'lr': 0.0003907643422278404, 'samples': 3237696, 'steps': 16862, 'loss/train': 0.5253083556890488} 01/27/2022 11:42:55 - INFO - codeparrot_training - Step 16863: {'lr': 0.00039075081969653383, 'samples': 3237888, 'steps': 16863, 'loss/train': 0.9523229598999023} 01/27/2022 11:42:58 - INFO - codeparrot_training - Step 16864: {'lr': 0.0003907372965622955, 'samples': 3238080, 'steps': 16864, 'loss/train': 0.7623406648635864} 01/27/2022 11:43:01 - INFO - codeparrot_training - Step 16865: {'lr': 0.0003907237728251833, 'samples': 3238272, 'steps': 16865, 'loss/train': 0.6992100030183792} 01/27/2022 11:43:04 - INFO - codeparrot_training - Step 16866: {'lr': 0.0003907102484852551, 'samples': 3238464, 'steps': 16866, 'loss/train': 0.8990136086940765} 01/27/2022 11:43:07 - INFO - codeparrot_training - Step 16867: {'lr': 0.0003906967235425689, 'samples': 3238656, 'steps': 16867, 'loss/train': 0.8337095081806183} 01/27/2022 11:43:11 - INFO - codeparrot_training - Step 16868: {'lr': 0.0003906831979971826, 'samples': 3238848, 'steps': 16868, 'loss/train': 0.8573309183120728} 01/27/2022 11:43:15 - INFO - codeparrot_training - Step 16869: {'lr': 0.0003906696718491541, 'samples': 3239040, 'steps': 16869, 'loss/train': 0.4679349213838577} 01/27/2022 11:43:18 - INFO - codeparrot_training - Step 16870: {'lr': 0.0003906561450985415, 'samples': 3239232, 'steps': 16870, 'loss/train': 0.4599645584821701} 01/27/2022 11:43:21 - INFO - codeparrot_training - Step 16871: {'lr': 0.00039064261774540254, 'samples': 3239424, 'steps': 16871, 'loss/train': 0.8348590135574341} 01/27/2022 11:43:24 - INFO - codeparrot_training - Step 16872: {'lr': 0.0003906290897897953, 'samples': 3239616, 'steps': 16872, 'loss/train': 0.7666544616222382} 01/27/2022 11:43:28 - INFO - codeparrot_training - Step 16873: {'lr': 0.00039061556123177777, 'samples': 3239808, 'steps': 16873, 'loss/train': 0.4653368890285492} 01/27/2022 11:43:31 - INFO - codeparrot_training - Step 16874: {'lr': 0.00039060203207140774, 'samples': 3240000, 'steps': 16874, 'loss/train': 0.7820010781288147} 01/27/2022 11:43:34 - INFO - codeparrot_training - Step 16875: {'lr': 0.0003905885023087433, 'samples': 3240192, 'steps': 16875, 'loss/train': 1.2156483829021454} 01/27/2022 11:43:37 - INFO - codeparrot_training - Step 16876: {'lr': 0.0003905749719438423, 'samples': 3240384, 'steps': 16876, 'loss/train': 0.2888473570346832} 01/27/2022 11:43:43 - INFO - codeparrot_training - Step 16877: {'lr': 0.00039056144097676285, 'samples': 3240576, 'steps': 16877, 'loss/train': 0.2498508095741272} 01/27/2022 11:43:46 - INFO - codeparrot_training - Step 16878: {'lr': 0.0003905479094075627, 'samples': 3240768, 'steps': 16878, 'loss/train': 0.9661105871200562} 01/27/2022 11:43:49 - INFO - codeparrot_training - Step 16879: {'lr': 0.00039053437723630003, 'samples': 3240960, 'steps': 16879, 'loss/train': 0.9111748337745667} 01/27/2022 11:43:53 - INFO - codeparrot_training - Step 16880: {'lr': 0.00039052084446303264, 'samples': 3241152, 'steps': 16880, 'loss/train': 0.5124429166316986} 01/27/2022 11:43:56 - INFO - codeparrot_training - Step 16881: {'lr': 0.0003905073110878186, 'samples': 3241344, 'steps': 16881, 'loss/train': 0.6561328321695328} 01/27/2022 11:43:59 - INFO - codeparrot_training - Step 16882: {'lr': 0.00039049377711071595, 'samples': 3241536, 'steps': 16882, 'loss/train': 0.9551517069339752} 01/27/2022 11:44:02 - INFO - codeparrot_training - Step 16883: {'lr': 0.00039048024253178243, 'samples': 3241728, 'steps': 16883, 'loss/train': 0.8015717267990112} 01/27/2022 
11:44:05 - INFO - codeparrot_training - Step 16884: {'lr': 0.00039046670735107627, 'samples': 3241920, 'steps': 16884, 'loss/train': 0.7641280889511108} 01/27/2022 11:44:08 - INFO - codeparrot_training - Step 16885: {'lr': 0.00039045317156865525, 'samples': 3242112, 'steps': 16885, 'loss/train': 0.8805055618286133} 01/27/2022 11:44:13 - INFO - codeparrot_training - Step 16886: {'lr': 0.0003904396351845775, 'samples': 3242304, 'steps': 16886, 'loss/train': 0.6275840252637863} 01/27/2022 11:44:16 - INFO - codeparrot_training - Step 16887: {'lr': 0.00039042609819890087, 'samples': 3242496, 'steps': 16887, 'loss/train': 2.1334437131881714} 01/27/2022 11:44:19 - INFO - codeparrot_training - Step 16888: {'lr': 0.0003904125606116835, 'samples': 3242688, 'steps': 16888, 'loss/train': 0.7131605297327042} 01/27/2022 11:44:22 - INFO - codeparrot_training - Step 16889: {'lr': 0.0003903990224229833, 'samples': 3242880, 'steps': 16889, 'loss/train': 1.0128754377365112} 01/27/2022 11:44:25 - INFO - codeparrot_training - Step 16890: {'lr': 0.00039038548363285825, 'samples': 3243072, 'steps': 16890, 'loss/train': 0.5173426866531372} 01/27/2022 11:44:28 - INFO - codeparrot_training - Step 16891: {'lr': 0.00039037194424136634, 'samples': 3243264, 'steps': 16891, 'loss/train': 1.3186720311641693} 01/27/2022 11:44:32 - INFO - codeparrot_training - Step 16892: {'lr': 0.0003903584042485656, 'samples': 3243456, 'steps': 16892, 'loss/train': 0.9015749394893646} 01/27/2022 11:44:35 - INFO - codeparrot_training - Step 16893: {'lr': 0.00039034486365451405, 'samples': 3243648, 'steps': 16893, 'loss/train': 0.924083411693573} 01/27/2022 11:44:38 - INFO - codeparrot_training - Step 16894: {'lr': 0.00039033132245926974, 'samples': 3243840, 'steps': 16894, 'loss/train': 0.9485701024532318} 01/27/2022 11:44:44 - INFO - codeparrot_training - Step 16895: {'lr': 0.0003903177806628905, 'samples': 3244032, 'steps': 16895, 'loss/train': 0.5335685759782791} 01/27/2022 11:44:47 - INFO - codeparrot_training - Step 16896: {'lr': 0.00039030423826543446, 'samples': 3244224, 'steps': 16896, 'loss/train': 0.8597194254398346} 01/27/2022 11:44:50 - INFO - codeparrot_training - Step 16897: {'lr': 0.0003902906952669596, 'samples': 3244416, 'steps': 16897, 'loss/train': 1.0436486899852753} 01/27/2022 11:44:54 - INFO - codeparrot_training - Step 16898: {'lr': 0.000390277151667524, 'samples': 3244608, 'steps': 16898, 'loss/train': 1.1486335694789886} 01/27/2022 11:44:57 - INFO - codeparrot_training - Step 16899: {'lr': 0.0003902636074671856, 'samples': 3244800, 'steps': 16899, 'loss/train': 1.0118916928768158} 01/27/2022 11:45:00 - INFO - codeparrot_training - Step 16900: {'lr': 0.0003902500626660025, 'samples': 3244992, 'steps': 16900, 'loss/train': 1.076905757188797} 01/27/2022 11:45:03 - INFO - codeparrot_training - Step 16901: {'lr': 0.00039023651726403263, 'samples': 3245184, 'steps': 16901, 'loss/train': 0.7469543516635895} 01/27/2022 11:45:06 - INFO - codeparrot_training - Step 16902: {'lr': 0.00039022297126133397, 'samples': 3245376, 'steps': 16902, 'loss/train': 0.1911558285355568} 01/27/2022 11:45:09 - INFO - codeparrot_training - Step 16903: {'lr': 0.0003902094246579647, 'samples': 3245568, 'steps': 16903, 'loss/train': 0.6816944181919098} 01/27/2022 11:45:14 - INFO - codeparrot_training - Step 16904: {'lr': 0.00039019587745398276, 'samples': 3245760, 'steps': 16904, 'loss/train': 0.6018802374601364} 01/27/2022 11:45:17 - INFO - codeparrot_training - Step 16905: {'lr': 0.00039018232964944623, 'samples': 3245952, 'steps': 16905, 
'loss/train': 0.9022212624549866} 01/27/2022 11:45:20 - INFO - codeparrot_training - Step 16906: {'lr': 0.0003901687812444131, 'samples': 3246144, 'steps': 16906, 'loss/train': 1.492704838514328} 01/27/2022 11:45:24 - INFO - codeparrot_training - Step 16907: {'lr': 0.0003901552322389414, 'samples': 3246336, 'steps': 16907, 'loss/train': 0.6090226918458939} 01/27/2022 11:45:27 - INFO - codeparrot_training - Step 16908: {'lr': 0.00039014168263308926, 'samples': 3246528, 'steps': 16908, 'loss/train': 0.7020551115274429} 01/27/2022 11:45:30 - INFO - codeparrot_training - Step 16909: {'lr': 0.00039012813242691454, 'samples': 3246720, 'steps': 16909, 'loss/train': 0.5655768066644669} 01/27/2022 11:45:33 - INFO - codeparrot_training - Step 16910: {'lr': 0.00039011458162047547, 'samples': 3246912, 'steps': 16910, 'loss/train': 0.5809158682823181} 01/27/2022 11:45:36 - INFO - codeparrot_training - Step 16911: {'lr': 0.00039010103021383, 'samples': 3247104, 'steps': 16911, 'loss/train': 0.8071163892745972} 01/27/2022 11:45:39 - INFO - codeparrot_training - Step 16912: {'lr': 0.00039008747820703615, 'samples': 3247296, 'steps': 16912, 'loss/train': 0.6788030862808228} 01/27/2022 11:45:44 - INFO - codeparrot_training - Step 16913: {'lr': 0.0003900739256001521, 'samples': 3247488, 'steps': 16913, 'loss/train': 0.7524007558822632} 01/27/2022 11:45:47 - INFO - codeparrot_training - Step 16914: {'lr': 0.00039006037239323584, 'samples': 3247680, 'steps': 16914, 'loss/train': 0.777653306722641} 01/27/2022 11:45:50 - INFO - codeparrot_training - Step 16915: {'lr': 0.00039004681858634537, 'samples': 3247872, 'steps': 16915, 'loss/train': 0.6411704868078232} 01/27/2022 11:45:53 - INFO - codeparrot_training - Step 16916: {'lr': 0.0003900332641795388, 'samples': 3248064, 'steps': 16916, 'loss/train': 1.066002756357193} 01/27/2022 11:45:56 - INFO - codeparrot_training - Step 16917: {'lr': 0.0003900197091728742, 'samples': 3248256, 'steps': 16917, 'loss/train': 1.163102924823761} 01/27/2022 11:45:59 - INFO - codeparrot_training - Step 16918: {'lr': 0.0003900061535664097, 'samples': 3248448, 'steps': 16918, 'loss/train': 0.4907820224761963} 01/27/2022 11:46:03 - INFO - codeparrot_training - Step 16919: {'lr': 0.0003899925973602032, 'samples': 3248640, 'steps': 16919, 'loss/train': 1.062075287103653} 01/27/2022 11:46:06 - INFO - codeparrot_training - Step 16920: {'lr': 0.0003899790405543129, 'samples': 3248832, 'steps': 16920, 'loss/train': 0.711375430226326} 01/27/2022 11:46:09 - INFO - codeparrot_training - Step 16921: {'lr': 0.0003899654831487969, 'samples': 3249024, 'steps': 16921, 'loss/train': 0.5722496062517166} 01/27/2022 11:46:14 - INFO - codeparrot_training - Step 16922: {'lr': 0.0003899519251437131, 'samples': 3249216, 'steps': 16922, 'loss/train': 0.1864769607782364} 01/27/2022 11:46:17 - INFO - codeparrot_training - Step 16923: {'lr': 0.00038993836653911974, 'samples': 3249408, 'steps': 16923, 'loss/train': 0.9627458453178406} 01/27/2022 11:46:20 - INFO - codeparrot_training - Step 16924: {'lr': 0.00038992480733507487, 'samples': 3249600, 'steps': 16924, 'loss/train': 0.4025430679321289} 01/27/2022 11:46:23 - INFO - codeparrot_training - Step 16925: {'lr': 0.0003899112475316365, 'samples': 3249792, 'steps': 16925, 'loss/train': 0.7798760533332825} 01/27/2022 11:46:26 - INFO - codeparrot_training - Step 16926: {'lr': 0.00038989768712886287, 'samples': 3249984, 'steps': 16926, 'loss/train': 1.0067017078399658} 01/27/2022 11:46:30 - INFO - codeparrot_training - Step 16927: {'lr': 0.0003898841261268119, 
'samples': 3250176, 'steps': 16927, 'loss/train': 1.3577590584754944} 01/27/2022 11:46:33 - INFO - codeparrot_training - Step 16928: {'lr': 0.00038987056452554177, 'samples': 3250368, 'steps': 16928, 'loss/train': 0.967994213104248} 01/27/2022 11:46:36 - INFO - codeparrot_training - Step 16929: {'lr': 0.00038985700232511055, 'samples': 3250560, 'steps': 16929, 'loss/train': 0.9476949870586395} 01/27/2022 11:46:42 - INFO - codeparrot_training - Step 16930: {'lr': 0.0003898434395255763, 'samples': 3250752, 'steps': 16930, 'loss/train': 1.0211888551712036} 01/27/2022 11:46:45 - INFO - codeparrot_training - Step 16931: {'lr': 0.0003898298761269973, 'samples': 3250944, 'steps': 16931, 'loss/train': 0.5557549595832825} 01/27/2022 11:46:49 - INFO - codeparrot_training - Step 16932: {'lr': 0.0003898163121294314, 'samples': 3251136, 'steps': 16932, 'loss/train': 0.7581929862499237} 01/27/2022 11:46:52 - INFO - codeparrot_training - Step 16933: {'lr': 0.0003898027475329368, 'samples': 3251328, 'steps': 16933, 'loss/train': 0.6421479284763336} 01/27/2022 11:46:55 - INFO - codeparrot_training - Step 16934: {'lr': 0.00038978918233757167, 'samples': 3251520, 'steps': 16934, 'loss/train': 1.3237614333629608} 01/27/2022 11:46:58 - INFO - codeparrot_training - Step 16935: {'lr': 0.000389775616543394, 'samples': 3251712, 'steps': 16935, 'loss/train': 0.7780162990093231} 01/27/2022 11:47:01 - INFO - codeparrot_training - Step 16936: {'lr': 0.00038976205015046206, 'samples': 3251904, 'steps': 16936, 'loss/train': 0.6622701287269592} 01/27/2022 11:47:04 - INFO - codeparrot_training - Step 16937: {'lr': 0.00038974848315883383, 'samples': 3252096, 'steps': 16937, 'loss/train': 0.712713897228241} 01/27/2022 11:47:08 - INFO - codeparrot_training - Step 16938: {'lr': 0.00038973491556856755, 'samples': 3252288, 'steps': 16938, 'loss/train': 0.920296847820282} 01/27/2022 11:47:12 - INFO - codeparrot_training - Step 16939: {'lr': 0.0003897213473797212, 'samples': 3252480, 'steps': 16939, 'loss/train': 1.2752052247524261} 01/27/2022 11:47:15 - INFO - codeparrot_training - Step 16940: {'lr': 0.0003897077785923529, 'samples': 3252672, 'steps': 16940, 'loss/train': 0.941489964723587} 01/27/2022 11:47:18 - INFO - codeparrot_training - Step 16941: {'lr': 0.0003896942092065209, 'samples': 3252864, 'steps': 16941, 'loss/train': 0.8200270235538483} 01/27/2022 11:47:21 - INFO - codeparrot_training - Step 16942: {'lr': 0.0003896806392222833, 'samples': 3253056, 'steps': 16942, 'loss/train': 1.1924895644187927} 01/27/2022 11:47:25 - INFO - codeparrot_training - Step 16943: {'lr': 0.00038966706863969815, 'samples': 3253248, 'steps': 16943, 'loss/train': 0.82960644364357} 01/27/2022 11:47:28 - INFO - codeparrot_training - Step 16944: {'lr': 0.00038965349745882365, 'samples': 3253440, 'steps': 16944, 'loss/train': 0.6690224558115005} 01/27/2022 11:47:31 - INFO - codeparrot_training - Step 16945: {'lr': 0.00038963992567971794, 'samples': 3253632, 'steps': 16945, 'loss/train': 0.07631062343716621} 01/27/2022 11:47:34 - INFO - codeparrot_training - Step 16946: {'lr': 0.0003896263533024391, 'samples': 3253824, 'steps': 16946, 'loss/train': 0.6476037204265594} 01/27/2022 11:47:37 - INFO - codeparrot_training - Step 16947: {'lr': 0.0003896127803270453, 'samples': 3254016, 'steps': 16947, 'loss/train': 0.7610760927200317} 01/27/2022 11:47:41 - INFO - codeparrot_training - Step 16948: {'lr': 0.0003895992067535946, 'samples': 3254208, 'steps': 16948, 'loss/train': 0.7775023877620697} 01/27/2022 11:47:45 - INFO - codeparrot_training - Step 
16949: {'lr': 0.0003895856325821454, 'samples': 3254400, 'steps': 16949, 'loss/train': 0.8749565184116364} 01/27/2022 11:47:48 - INFO - codeparrot_training - Step 16950: {'lr': 0.00038957205781275554, 'samples': 3254592, 'steps': 16950, 'loss/train': 0.7663524448871613} 01/27/2022 11:47:51 - INFO - codeparrot_training - Step 16951: {'lr': 0.00038955848244548333, 'samples': 3254784, 'steps': 16951, 'loss/train': 0.6056289821863174} 01/27/2022 11:47:54 - INFO - codeparrot_training - Step 16952: {'lr': 0.00038954490648038687, 'samples': 3254976, 'steps': 16952, 'loss/train': 1.1136817932128906} 01/27/2022 11:47:57 - INFO - codeparrot_training - Step 16953: {'lr': 0.0003895313299175244, 'samples': 3255168, 'steps': 16953, 'loss/train': 0.7329642623662949} 01/27/2022 11:48:00 - INFO - codeparrot_training - Step 16954: {'lr': 0.000389517752756954, 'samples': 3255360, 'steps': 16954, 'loss/train': 0.5365540087223053} 01/27/2022 11:48:03 - INFO - codeparrot_training - Step 16955: {'lr': 0.0003895041749987338, 'samples': 3255552, 'steps': 16955, 'loss/train': 0.8544944822788239} 01/27/2022 11:48:07 - INFO - codeparrot_training - Step 16956: {'lr': 0.00038949059664292207, 'samples': 3255744, 'steps': 16956, 'loss/train': 0.9806711375713348} 01/27/2022 11:48:13 - INFO - codeparrot_training - Step 16957: {'lr': 0.0003894770176895769, 'samples': 3255936, 'steps': 16957, 'loss/train': 0.6531978249549866} 01/27/2022 11:48:16 - INFO - codeparrot_training - Step 16958: {'lr': 0.0003894634381387565, 'samples': 3256128, 'steps': 16958, 'loss/train': 0.9627905488014221} 01/27/2022 11:48:19 - INFO - codeparrot_training - Step 16959: {'lr': 0.00038944985799051896, 'samples': 3256320, 'steps': 16959, 'loss/train': 1.0260493755340576} 01/27/2022 11:48:22 - INFO - codeparrot_training - Step 16960: {'lr': 0.0003894362772449226, 'samples': 3256512, 'steps': 16960, 'loss/train': 0.8106839060783386} 01/27/2022 11:48:26 - INFO - codeparrot_training - Step 16961: {'lr': 0.0003894226959020254, 'samples': 3256704, 'steps': 16961, 'loss/train': 1.4848626852035522} 01/27/2022 11:48:29 - INFO - codeparrot_training - Step 16962: {'lr': 0.00038940911396188573, 'samples': 3256896, 'steps': 16962, 'loss/train': 0.9038337171077728} 01/27/2022 11:48:32 - INFO - codeparrot_training - Step 16963: {'lr': 0.0003893955314245616, 'samples': 3257088, 'steps': 16963, 'loss/train': 0.5744192451238632} 01/27/2022 11:48:35 - INFO - codeparrot_training - Step 16964: {'lr': 0.0003893819482901113, 'samples': 3257280, 'steps': 16964, 'loss/train': 0.911381721496582} 01/27/2022 11:48:40 - INFO - codeparrot_training - Step 16965: {'lr': 0.000389368364558593, 'samples': 3257472, 'steps': 16965, 'loss/train': 0.8829802572727203} 01/27/2022 11:48:43 - INFO - codeparrot_training - Step 16966: {'lr': 0.00038935478023006487, 'samples': 3257664, 'steps': 16966, 'loss/train': 0.8382733762264252} 01/27/2022 11:48:46 - INFO - codeparrot_training - Step 16967: {'lr': 0.0003893411953045852, 'samples': 3257856, 'steps': 16967, 'loss/train': 0.9348568618297577} 01/27/2022 11:48:49 - INFO - codeparrot_training - Step 16968: {'lr': 0.000389327609782212, 'samples': 3258048, 'steps': 16968, 'loss/train': 1.2356248497962952} 01/27/2022 11:48:52 - INFO - codeparrot_training - Step 16969: {'lr': 0.0003893140236630036, 'samples': 3258240, 'steps': 16969, 'loss/train': 0.057005250826478004} 01/27/2022 11:48:55 - INFO - codeparrot_training - Step 16970: {'lr': 0.0003893004369470181, 'samples': 3258432, 'steps': 16970, 'loss/train': 0.5362645536661148} 01/27/2022 11:48:59 
- INFO - codeparrot_training - Step 16971: {'lr': 0.00038928684963431383, 'samples': 3258624, 'steps': 16971, 'loss/train': 0.9633254706859589} 01/27/2022 11:49:02 - INFO - codeparrot_training - Step 16972: {'lr': 0.00038927326172494894, 'samples': 3258816, 'steps': 16972, 'loss/train': 0.6587191969156265} 01/27/2022 11:49:05 - INFO - codeparrot_training - Step 16973: {'lr': 0.0003892596732189816, 'samples': 3259008, 'steps': 16973, 'loss/train': 0.7833366394042969} 01/27/2022 11:49:11 - INFO - codeparrot_training - Step 16974: {'lr': 0.00038924608411647, 'samples': 3259200, 'steps': 16974, 'loss/train': 1.1013591885566711} 01/27/2022 11:49:14 - INFO - codeparrot_training - Step 16975: {'lr': 0.00038923249441747245, 'samples': 3259392, 'steps': 16975, 'loss/train': 0.8554796576499939} 01/27/2022 11:49:17 - INFO - codeparrot_training - Step 16976: {'lr': 0.000389218904122047, 'samples': 3259584, 'steps': 16976, 'loss/train': 0.35823018103837967} 01/27/2022 11:49:20 - INFO - codeparrot_training - Step 16977: {'lr': 0.00038920531323025206, 'samples': 3259776, 'steps': 16977, 'loss/train': 0.28835946321487427} 01/27/2022 11:49:24 - INFO - codeparrot_training - Step 16978: {'lr': 0.0003891917217421458, 'samples': 3259968, 'steps': 16978, 'loss/train': 0.9356774389743805} 01/27/2022 11:49:27 - INFO - codeparrot_training - Step 16979: {'lr': 0.00038917812965778625, 'samples': 3260160, 'steps': 16979, 'loss/train': 0.8370275795459747} 01/27/2022 11:49:30 - INFO - codeparrot_training - Step 16980: {'lr': 0.00038916453697723194, 'samples': 3260352, 'steps': 16980, 'loss/train': 1.3011333346366882} 01/27/2022 11:49:33 - INFO - codeparrot_training - Step 16981: {'lr': 0.00038915094370054083, 'samples': 3260544, 'steps': 16981, 'loss/train': 0.8093619346618652} 01/27/2022 11:49:36 - INFO - codeparrot_training - Step 16982: {'lr': 0.00038913734982777136, 'samples': 3260736, 'steps': 16982, 'loss/train': 0.8828321099281311} 01/27/2022 11:49:40 - INFO - codeparrot_training - Step 16983: {'lr': 0.0003891237553589816, 'samples': 3260928, 'steps': 16983, 'loss/train': 1.0371741950511932} 01/27/2022 11:49:44 - INFO - codeparrot_training - Step 16984: {'lr': 0.00038911016029422984, 'samples': 3261120, 'steps': 16984, 'loss/train': 0.4332704097032547} 01/27/2022 11:49:47 - INFO - codeparrot_training - Step 16985: {'lr': 0.0003890965646335744, 'samples': 3261312, 'steps': 16985, 'loss/train': 1.1151655912399292} 01/27/2022 11:49:50 - INFO - codeparrot_training - Step 16986: {'lr': 0.0003890829683770734, 'samples': 3261504, 'steps': 16986, 'loss/train': 0.3305295333266258} 01/27/2022 11:49:53 - INFO - codeparrot_training - Step 16987: {'lr': 0.0003890693715247851, 'samples': 3261696, 'steps': 16987, 'loss/train': 0.666034609079361} 01/27/2022 11:49:56 - INFO - codeparrot_training - Step 16988: {'lr': 0.0003890557740767678, 'samples': 3261888, 'steps': 16988, 'loss/train': 0.8982296884059906} 01/27/2022 11:49:59 - INFO - codeparrot_training - Step 16989: {'lr': 0.0003890421760330798, 'samples': 3262080, 'steps': 16989, 'loss/train': 1.086978167295456} 01/27/2022 11:50:02 - INFO - codeparrot_training - Step 16990: {'lr': 0.0003890285773937792, 'samples': 3262272, 'steps': 16990, 'loss/train': 0.8090126216411591} 01/27/2022 11:50:06 - INFO - codeparrot_training - Step 16991: {'lr': 0.0003890149781589243, 'samples': 3262464, 'steps': 16991, 'loss/train': 0.6847304552793503} 01/27/2022 11:50:10 - INFO - codeparrot_training - Step 16992: {'lr': 0.0003890013783285733, 'samples': 3262656, 'steps': 16992, 'loss/train': 
0.5485451817512512} 01/27/2022 11:50:13 - INFO - codeparrot_training - Step 16993: {'lr': 0.00038898777790278465, 'samples': 3262848, 'steps': 16993, 'loss/train': 0.7054782807826996} 01/27/2022 11:50:16 - INFO - codeparrot_training - Step 16994: {'lr': 0.00038897417688161644, 'samples': 3263040, 'steps': 16994, 'loss/train': 0.948241800069809} 01/27/2022 11:50:19 - INFO - codeparrot_training - Step 16995: {'lr': 0.0003889605752651271, 'samples': 3263232, 'steps': 16995, 'loss/train': 1.1427498757839203} 01/27/2022 11:50:23 - INFO - codeparrot_training - Step 16996: {'lr': 0.0003889469730533746, 'samples': 3263424, 'steps': 16996, 'loss/train': 0.4091653972864151} 01/27/2022 11:50:26 - INFO - codeparrot_training - Step 16997: {'lr': 0.0003889333702464175, 'samples': 3263616, 'steps': 16997, 'loss/train': 1.4571962356567383} 01/27/2022 11:50:29 - INFO - codeparrot_training - Step 16998: {'lr': 0.00038891976684431395, 'samples': 3263808, 'steps': 16998, 'loss/train': 0.8151625692844391} 01/27/2022 11:50:32 - INFO - codeparrot_training - Step 16999: {'lr': 0.0003889061628471222, 'samples': 3264000, 'steps': 16999, 'loss/train': 0.5776092857122421} 01/27/2022 11:50:35 - INFO - codeparrot_training - Step 17000: {'lr': 0.00038889255825490053, 'samples': 3264192, 'steps': 17000, 'loss/train': 0.7305838465690613} 01/27/2022 11:50:42 - INFO - codeparrot_training - Step 17001: {'lr': 0.0003888789530677073, 'samples': 3264384, 'steps': 17001, 'loss/train': 0.32645507901906967} 01/27/2022 11:50:45 - INFO - codeparrot_training - Step 17002: {'lr': 0.00038886534728560073, 'samples': 3264576, 'steps': 17002, 'loss/train': 0.4996393471956253} 01/27/2022 11:50:48 - INFO - codeparrot_training - Step 17003: {'lr': 0.0003888517409086391, 'samples': 3264768, 'steps': 17003, 'loss/train': 0.9575718641281128} 01/27/2022 11:50:51 - INFO - codeparrot_training - Step 17004: {'lr': 0.0003888381339368807, 'samples': 3264960, 'steps': 17004, 'loss/train': 0.9887277781963348} 01/27/2022 11:50:54 - INFO - codeparrot_training - Step 17005: {'lr': 0.00038882452637038377, 'samples': 3265152, 'steps': 17005, 'loss/train': 0.2427799478173256} 01/27/2022 11:50:58 - INFO - codeparrot_training - Step 17006: {'lr': 0.00038881091820920676, 'samples': 3265344, 'steps': 17006, 'loss/train': 1.0604884922504425} 01/27/2022 11:51:01 - INFO - codeparrot_training - Step 17007: {'lr': 0.00038879730945340775, 'samples': 3265536, 'steps': 17007, 'loss/train': 0.8758994936943054} 01/27/2022 11:51:04 - INFO - codeparrot_training - Step 17008: {'lr': 0.0003887837001030452, 'samples': 3265728, 'steps': 17008, 'loss/train': 0.1941739171743393} 01/27/2022 11:51:08 - INFO - codeparrot_training - Step 17009: {'lr': 0.00038877009015817734, 'samples': 3265920, 'steps': 17009, 'loss/train': 1.8740796446800232} 01/27/2022 11:51:11 - INFO - codeparrot_training - Step 17010: {'lr': 0.0003887564796188625, 'samples': 3266112, 'steps': 17010, 'loss/train': 0.6667362749576569} 01/27/2022 11:51:15 - INFO - codeparrot_training - Step 17011: {'lr': 0.0003887428684851589, 'samples': 3266304, 'steps': 17011, 'loss/train': 1.0712314546108246} 01/27/2022 11:51:18 - INFO - codeparrot_training - Step 17012: {'lr': 0.00038872925675712493, 'samples': 3266496, 'steps': 17012, 'loss/train': 0.8743101954460144} 01/27/2022 11:51:21 - INFO - codeparrot_training - Step 17013: {'lr': 0.00038871564443481886, 'samples': 3266688, 'steps': 17013, 'loss/train': 0.5236360430717468} 01/27/2022 11:51:24 - INFO - codeparrot_training - Step 17014: {'lr': 0.0003887020315182991, 
'samples': 3266880, 'steps': 17014, 'loss/train': 1.0370870232582092} 01/27/2022 11:51:27 - INFO - codeparrot_training - Step 17015: {'lr': 0.0003886884180076238, 'samples': 3267072, 'steps': 17015, 'loss/train': 1.2306594550609589} 01/27/2022 11:51:30 - INFO - codeparrot_training - Step 17016: {'lr': 0.0003886748039028514, 'samples': 3267264, 'steps': 17016, 'loss/train': 0.7918047308921814} 01/27/2022 11:51:33 - INFO - codeparrot_training - Step 17017: {'lr': 0.00038866118920404013, 'samples': 3267456, 'steps': 17017, 'loss/train': 1.0017724335193634} 01/27/2022 11:51:38 - INFO - codeparrot_training - Step 17018: {'lr': 0.0003886475739112484, 'samples': 3267648, 'steps': 17018, 'loss/train': 0.662424311041832} 01/27/2022 11:51:41 - INFO - codeparrot_training - Step 17019: {'lr': 0.0003886339580245344, 'samples': 3267840, 'steps': 17019, 'loss/train': 0.8934758305549622} 01/27/2022 11:51:44 - INFO - codeparrot_training - Step 17020: {'lr': 0.00038862034154395664, 'samples': 3268032, 'steps': 17020, 'loss/train': 0.9115326404571533} 01/27/2022 11:51:47 - INFO - codeparrot_training - Step 17021: {'lr': 0.00038860672446957336, 'samples': 3268224, 'steps': 17021, 'loss/train': 0.839954674243927} 01/27/2022 11:51:51 - INFO - codeparrot_training - Step 17022: {'lr': 0.00038859310680144276, 'samples': 3268416, 'steps': 17022, 'loss/train': 0.7890474200248718} 01/27/2022 11:51:54 - INFO - codeparrot_training - Step 17023: {'lr': 0.0003885794885396234, 'samples': 3268608, 'steps': 17023, 'loss/train': 0.7510886192321777} 01/27/2022 11:51:57 - INFO - codeparrot_training - Step 17024: {'lr': 0.00038856586968417353, 'samples': 3268800, 'steps': 17024, 'loss/train': 1.0419365465641022} 01/27/2022 11:52:00 - INFO - codeparrot_training - Step 17025: {'lr': 0.0003885522502351514, 'samples': 3268992, 'steps': 17025, 'loss/train': 0.8937330543994904} 01/27/2022 11:52:03 - INFO - codeparrot_training - Step 17026: {'lr': 0.0003885386301926155, 'samples': 3269184, 'steps': 17026, 'loss/train': 1.1400487124919891} 01/27/2022 11:52:08 - INFO - codeparrot_training - Step 17027: {'lr': 0.00038852500955662407, 'samples': 3269376, 'steps': 17027, 'loss/train': 0.9712927043437958} 01/27/2022 11:52:11 - INFO - codeparrot_training - Step 17028: {'lr': 0.0003885113883272355, 'samples': 3269568, 'steps': 17028, 'loss/train': 0.6891266405582428} 01/27/2022 11:52:14 - INFO - codeparrot_training - Step 17029: {'lr': 0.0003884977665045081, 'samples': 3269760, 'steps': 17029, 'loss/train': 1.0174627304077148} 01/27/2022 11:52:17 - INFO - codeparrot_training - Step 17030: {'lr': 0.0003884841440885003, 'samples': 3269952, 'steps': 17030, 'loss/train': 1.2079386413097382} 01/27/2022 11:52:20 - INFO - codeparrot_training - Step 17031: {'lr': 0.0003884705210792703, 'samples': 3270144, 'steps': 17031, 'loss/train': 0.0451526390388608} 01/27/2022 11:52:23 - INFO - codeparrot_training - Step 17032: {'lr': 0.00038845689747687664, 'samples': 3270336, 'steps': 17032, 'loss/train': 0.22059445828199387} 01/27/2022 11:52:26 - INFO - codeparrot_training - Step 17033: {'lr': 0.0003884432732813776, 'samples': 3270528, 'steps': 17033, 'loss/train': 0.7348201721906662} 01/27/2022 11:52:30 - INFO - codeparrot_training - Step 17034: {'lr': 0.00038842964849283146, 'samples': 3270720, 'steps': 17034, 'loss/train': 0.9016899168491364} 01/27/2022 11:52:33 - INFO - codeparrot_training - Step 17035: {'lr': 0.0003884160231112968, 'samples': 3270912, 'steps': 17035, 'loss/train': 0.6028322875499725} 01/27/2022 11:52:39 - INFO - codeparrot_training - 
Step 17036: {'lr': 0.00038840239713683165, 'samples': 3271104, 'steps': 17036, 'loss/train': 0.7409137040376663} 01/27/2022 11:52:42 - INFO - codeparrot_training - Step 17037: {'lr': 0.00038838877056949475, 'samples': 3271296, 'steps': 17037, 'loss/train': 0.6765693426132202} 01/27/2022 11:52:45 - INFO - codeparrot_training - Step 17038: {'lr': 0.00038837514340934424, 'samples': 3271488, 'steps': 17038, 'loss/train': 0.10178080201148987} 01/27/2022 11:52:48 - INFO - codeparrot_training - Step 17039: {'lr': 0.0003883615156564385, 'samples': 3271680, 'steps': 17039, 'loss/train': 0.5524561554193497} 01/27/2022 11:52:52 - INFO - codeparrot_training - Step 17040: {'lr': 0.000388347887310836, 'samples': 3271872, 'steps': 17040, 'loss/train': 0.6607821732759476} 01/27/2022 11:52:55 - INFO - codeparrot_training - Step 17041: {'lr': 0.0003883342583725952, 'samples': 3272064, 'steps': 17041, 'loss/train': 1.474579095840454} 01/27/2022 11:52:58 - INFO - codeparrot_training - Step 17042: {'lr': 0.0003883206288417742, 'samples': 3272256, 'steps': 17042, 'loss/train': 0.7803474068641663} 01/27/2022 11:53:01 - INFO - codeparrot_training - Step 17043: {'lr': 0.0003883069987184316, 'samples': 3272448, 'steps': 17043, 'loss/train': 0.8445212244987488} 01/27/2022 11:53:06 - INFO - codeparrot_training - Step 17044: {'lr': 0.0003882933680026257, 'samples': 3272640, 'steps': 17044, 'loss/train': 0.728143036365509} 01/27/2022 11:53:09 - INFO - codeparrot_training - Step 17045: {'lr': 0.000388279736694415, 'samples': 3272832, 'steps': 17045, 'loss/train': 0.9062857031822205} 01/27/2022 11:53:12 - INFO - codeparrot_training - Step 17046: {'lr': 0.00038826610479385774, 'samples': 3273024, 'steps': 17046, 'loss/train': 1.0894111096858978} 01/27/2022 11:53:15 - INFO - codeparrot_training - Step 17047: {'lr': 0.00038825247230101244, 'samples': 3273216, 'steps': 17047, 'loss/train': 0.7716476619243622} 01/27/2022 11:53:18 - INFO - codeparrot_training - Step 17048: {'lr': 0.0003882388392159375, 'samples': 3273408, 'steps': 17048, 'loss/train': 0.20042574405670166} 01/27/2022 11:53:21 - INFO - codeparrot_training - Step 17049: {'lr': 0.0003882252055386912, 'samples': 3273600, 'steps': 17049, 'loss/train': 0.42289701104164124} 01/27/2022 11:53:25 - INFO - codeparrot_training - Step 17050: {'lr': 0.00038821157126933204, 'samples': 3273792, 'steps': 17050, 'loss/train': 0.8383811116218567} 01/27/2022 11:53:28 - INFO - codeparrot_training - Step 17051: {'lr': 0.00038819793640791834, 'samples': 3273984, 'steps': 17051, 'loss/train': 0.6655778288841248} 01/27/2022 11:53:31 - INFO - codeparrot_training - Step 17052: {'lr': 0.0003881843009545086, 'samples': 3274176, 'steps': 17052, 'loss/train': 0.8225883543491364} 01/27/2022 11:53:37 - INFO - codeparrot_training - Step 17053: {'lr': 0.0003881706649091612, 'samples': 3274368, 'steps': 17053, 'loss/train': 0.4879691004753113} 01/27/2022 11:53:41 - INFO - codeparrot_training - Step 17054: {'lr': 0.0003881570282719346, 'samples': 3274560, 'steps': 17054, 'loss/train': 0.7611680030822754} 01/27/2022 11:53:44 - INFO - codeparrot_training - Step 17055: {'lr': 0.00038814339104288706, 'samples': 3274752, 'steps': 17055, 'loss/train': 0.8038306832313538} 01/27/2022 11:53:47 - INFO - codeparrot_training - Step 17056: {'lr': 0.00038812975322207713, 'samples': 3274944, 'steps': 17056, 'loss/train': 0.5895051509141922} 01/27/2022 11:53:50 - INFO - codeparrot_training - Step 17057: {'lr': 0.0003881161148095632, 'samples': 3275136, 'steps': 17057, 'loss/train': 0.7737646400928497} 01/27/2022 
11:53:53 - INFO - codeparrot_training - Step 17058: {'lr': 0.0003881024758054037, 'samples': 3275328, 'steps': 17058, 'loss/train': 0.858266294002533} 01/27/2022 11:53:56 - INFO - codeparrot_training - Step 17059: {'lr': 0.00038808883620965705, 'samples': 3275520, 'steps': 17059, 'loss/train': 0.3931390643119812} 01/27/2022 11:53:59 - INFO - codeparrot_training - Step 17060: {'lr': 0.00038807519602238174, 'samples': 3275712, 'steps': 17060, 'loss/train': 0.4802197515964508} 01/27/2022 11:54:03 - INFO - codeparrot_training - Step 17061: {'lr': 0.00038806155524363594, 'samples': 3275904, 'steps': 17061, 'loss/train': 0.6924887448549271} 01/27/2022 11:54:07 - INFO - codeparrot_training - Step 17062: {'lr': 0.00038804791387347844, 'samples': 3276096, 'steps': 17062, 'loss/train': 0.8739460408687592} 01/27/2022 11:54:10 - INFO - codeparrot_training - Step 17063: {'lr': 0.0003880342719119675, 'samples': 3276288, 'steps': 17063, 'loss/train': 0.8691548109054565} 01/27/2022 11:54:13 - INFO - codeparrot_training - Step 17064: {'lr': 0.0003880206293591615, 'samples': 3276480, 'steps': 17064, 'loss/train': 0.5927909910678864} 01/27/2022 11:54:16 - INFO - codeparrot_training - Step 17065: {'lr': 0.000388006986215119, 'samples': 3276672, 'steps': 17065, 'loss/train': 0.859227329492569} 01/27/2022 11:54:20 - INFO - codeparrot_training - Step 17066: {'lr': 0.0003879933424798984, 'samples': 3276864, 'steps': 17066, 'loss/train': 0.732380211353302} 01/27/2022 11:54:23 - INFO - codeparrot_training - Step 17067: {'lr': 0.0003879796981535582, 'samples': 3277056, 'steps': 17067, 'loss/train': 0.5548969656229019} 01/27/2022 11:54:26 - INFO - codeparrot_training - Step 17068: {'lr': 0.00038796605323615664, 'samples': 3277248, 'steps': 17068, 'loss/train': 0.3936801105737686} 01/27/2022 11:54:29 - INFO - codeparrot_training - Step 17069: {'lr': 0.00038795240772775244, 'samples': 3277440, 'steps': 17069, 'loss/train': 0.7332470566034317} 01/27/2022 11:54:32 - INFO - codeparrot_training - Step 17070: {'lr': 0.0003879387616284038, 'samples': 3277632, 'steps': 17070, 'loss/train': 1.0897755324840546} 01/27/2022 11:54:37 - INFO - codeparrot_training - Step 17071: {'lr': 0.0003879251149381694, 'samples': 3277824, 'steps': 17071, 'loss/train': 0.9200559854507446} 01/27/2022 11:54:40 - INFO - codeparrot_training - Step 17072: {'lr': 0.0003879114676571076, 'samples': 3278016, 'steps': 17072, 'loss/train': 0.5846855789422989} 01/27/2022 11:54:43 - INFO - codeparrot_training - Step 17073: {'lr': 0.00038789781978527683, 'samples': 3278208, 'steps': 17073, 'loss/train': 1.0526083409786224} 01/27/2022 11:54:46 - INFO - codeparrot_training - Step 17074: {'lr': 0.0003878841713227356, 'samples': 3278400, 'steps': 17074, 'loss/train': 0.8529087603092194} 01/27/2022 11:54:49 - INFO - codeparrot_training - Step 17075: {'lr': 0.00038787052226954235, 'samples': 3278592, 'steps': 17075, 'loss/train': 0.7341953963041306} 01/27/2022 11:54:52 - INFO - codeparrot_training - Step 17076: {'lr': 0.0003878568726257556, 'samples': 3278784, 'steps': 17076, 'loss/train': 0.8961077928543091} 01/27/2022 11:54:55 - INFO - codeparrot_training - Step 17077: {'lr': 0.0003878432223914338, 'samples': 3278976, 'steps': 17077, 'loss/train': 0.7210025489330292} 01/27/2022 11:54:59 - INFO - codeparrot_training - Step 17078: {'lr': 0.00038782957156663535, 'samples': 3279168, 'steps': 17078, 'loss/train': 0.9196067154407501} 01/27/2022 11:55:02 - INFO - codeparrot_training - Step 17079: {'lr': 0.0003878159201514188, 'samples': 3279360, 'steps': 17079, 
'loss/train': 0.5812731385231018} 01/27/2022 11:55:08 - INFO - codeparrot_training - Step 17080: {'lr': 0.00038780226814584263, 'samples': 3279552, 'steps': 17080, 'loss/train': 0.7888712882995605} 01/27/2022 11:55:11 - INFO - codeparrot_training - Step 17081: {'lr': 0.00038778861554996524, 'samples': 3279744, 'steps': 17081, 'loss/train': 0.8665232956409454} 01/27/2022 11:55:14 - INFO - codeparrot_training - Step 17082: {'lr': 0.00038777496236384526, 'samples': 3279936, 'steps': 17082, 'loss/train': 0.5821090936660767} 01/27/2022 11:55:17 - INFO - codeparrot_training - Step 17083: {'lr': 0.000387761308587541, 'samples': 3280128, 'steps': 17083, 'loss/train': 0.45525088906288147} 01/27/2022 11:55:21 - INFO - codeparrot_training - Step 17084: {'lr': 0.0003877476542211111, 'samples': 3280320, 'steps': 17084, 'loss/train': 0.28613872826099396} 01/27/2022 11:55:24 - INFO - codeparrot_training - Step 17085: {'lr': 0.00038773399926461395, 'samples': 3280512, 'steps': 17085, 'loss/train': 1.1982513964176178} 01/27/2022 11:55:27 - INFO - codeparrot_training - Step 17086: {'lr': 0.0003877203437181081, 'samples': 3280704, 'steps': 17086, 'loss/train': 0.7807973921298981} 01/27/2022 11:55:30 - INFO - codeparrot_training - Step 17087: {'lr': 0.0003877066875816521, 'samples': 3280896, 'steps': 17087, 'loss/train': 0.6829292625188828} 01/27/2022 11:55:34 - INFO - codeparrot_training - Step 17088: {'lr': 0.00038769303085530425, 'samples': 3281088, 'steps': 17088, 'loss/train': 1.1501461565494537} 01/27/2022 11:55:38 - INFO - codeparrot_training - Step 17089: {'lr': 0.0003876793735391233, 'samples': 3281280, 'steps': 17089, 'loss/train': 0.8927645981311798} 01/27/2022 11:55:41 - INFO - codeparrot_training - Step 17090: {'lr': 0.00038766571563316756, 'samples': 3281472, 'steps': 17090, 'loss/train': 0.6974459141492844} 01/27/2022 11:55:44 - INFO - codeparrot_training - Step 17091: {'lr': 0.00038765205713749563, 'samples': 3281664, 'steps': 17091, 'loss/train': 0.6338396072387695} 01/27/2022 11:55:47 - INFO - codeparrot_training - Step 17092: {'lr': 0.0003876383980521659, 'samples': 3281856, 'steps': 17092, 'loss/train': 1.0689464807510376} 01/27/2022 11:55:50 - INFO - codeparrot_training - Step 17093: {'lr': 0.0003876247383772371, 'samples': 3282048, 'steps': 17093, 'loss/train': 1.0504281520843506} 01/27/2022 11:55:53 - INFO - codeparrot_training - Step 17094: {'lr': 0.00038761107811276756, 'samples': 3282240, 'steps': 17094, 'loss/train': 0.5748596638441086} 01/27/2022 11:55:56 - INFO - codeparrot_training - Step 17095: {'lr': 0.00038759741725881593, 'samples': 3282432, 'steps': 17095, 'loss/train': 2.8273560404777527} 01/27/2022 11:56:00 - INFO - codeparrot_training - Step 17096: {'lr': 0.0003875837558154406, 'samples': 3282624, 'steps': 17096, 'loss/train': 0.8496375381946564} 01/27/2022 11:56:11 - INFO - codeparrot_training - Step 17097: {'lr': 0.00038757009378270014, 'samples': 3282816, 'steps': 17097, 'loss/train': 0.9595415890216827} 01/27/2022 11:56:15 - INFO - codeparrot_training - Step 17098: {'lr': 0.0003875564311606531, 'samples': 3283008, 'steps': 17098, 'loss/train': 0.9611683487892151} 01/27/2022 11:56:18 - INFO - codeparrot_training - Step 17099: {'lr': 0.000387542767949358, 'samples': 3283200, 'steps': 17099, 'loss/train': 0.5808737128973007} 01/27/2022 11:56:21 - INFO - codeparrot_training - Step 17100: {'lr': 0.0003875291041488734, 'samples': 3283392, 'steps': 17100, 'loss/train': 0.9940761923789978} 01/27/2022 11:56:24 - INFO - codeparrot_training - Step 17101: {'lr': 
0.00038751543975925766, 'samples': 3283584, 'steps': 17101, 'loss/train': 0.659049466252327} 01/27/2022 11:56:27 - INFO - codeparrot_training - Step 17102: {'lr': 0.00038750177478056956, 'samples': 3283776, 'steps': 17102, 'loss/train': 0.3995637148618698} 01/27/2022 11:56:30 - INFO - codeparrot_training - Step 17103: {'lr': 0.0003874881092128675, 'samples': 3283968, 'steps': 17103, 'loss/train': 1.7050702571868896} 01/27/2022 11:56:34 - INFO - codeparrot_training - Step 17104: {'lr': 0.00038747444305621, 'samples': 3284160, 'steps': 17104, 'loss/train': 1.6842021346092224} 01/27/2022 11:56:37 - INFO - codeparrot_training - Step 17105: {'lr': 0.0003874607763106556, 'samples': 3284352, 'steps': 17105, 'loss/train': 0.5757706314325333} 01/27/2022 11:56:40 - INFO - codeparrot_training - Step 17106: {'lr': 0.00038744710897626293, 'samples': 3284544, 'steps': 17106, 'loss/train': 1.452052116394043} 01/27/2022 11:56:44 - INFO - codeparrot_training - Step 17107: {'lr': 0.00038743344105309055, 'samples': 3284736, 'steps': 17107, 'loss/train': 1.1913694739341736} 01/27/2022 11:56:47 - INFO - codeparrot_training - Step 17108: {'lr': 0.0003874197725411969, 'samples': 3284928, 'steps': 17108, 'loss/train': 0.6733482331037521} 01/27/2022 11:56:50 - INFO - codeparrot_training - Step 17109: {'lr': 0.0003874061034406405, 'samples': 3285120, 'steps': 17109, 'loss/train': 0.9188793003559113} 01/27/2022 11:56:54 - INFO - codeparrot_training - Step 17110: {'lr': 0.00038739243375148, 'samples': 3285312, 'steps': 17110, 'loss/train': 1.4206112623214722} 01/27/2022 11:56:57 - INFO - codeparrot_training - Step 17111: {'lr': 0.0003873787634737741, 'samples': 3285504, 'steps': 17111, 'loss/train': 0.8982879817485809} 01/27/2022 11:57:00 - INFO - codeparrot_training - Step 17112: {'lr': 0.00038736509260758103, 'samples': 3285696, 'steps': 17112, 'loss/train': 0.5065480917692184} 01/27/2022 11:57:03 - INFO - codeparrot_training - Step 17113: {'lr': 0.00038735142115295965, 'samples': 3285888, 'steps': 17113, 'loss/train': 1.072641670703888} 01/27/2022 11:57:06 - INFO - codeparrot_training - Step 17114: {'lr': 0.00038733774910996825, 'samples': 3286080, 'steps': 17114, 'loss/train': 0.877034604549408} 01/27/2022 11:57:09 - INFO - codeparrot_training - Step 17115: {'lr': 0.00038732407647866567, 'samples': 3286272, 'steps': 17115, 'loss/train': 0.8730863928794861} 01/27/2022 11:57:14 - INFO - codeparrot_training - Step 17116: {'lr': 0.00038731040325911027, 'samples': 3286464, 'steps': 17116, 'loss/train': 0.7605755925178528} 01/27/2022 11:57:17 - INFO - codeparrot_training - Step 17117: {'lr': 0.0003872967294513608, 'samples': 3286656, 'steps': 17117, 'loss/train': 0.720842108130455} 01/27/2022 11:57:20 - INFO - codeparrot_training - Step 17118: {'lr': 0.0003872830550554757, 'samples': 3286848, 'steps': 17118, 'loss/train': 0.7333067357540131} 01/27/2022 11:57:23 - INFO - codeparrot_training - Step 17119: {'lr': 0.0003872693800715135, 'samples': 3287040, 'steps': 17119, 'loss/train': 0.7090850472450256} 01/27/2022 11:57:26 - INFO - codeparrot_training - Step 17120: {'lr': 0.00038725570449953296, 'samples': 3287232, 'steps': 17120, 'loss/train': 0.8487252295017242} 01/27/2022 11:57:30 - INFO - codeparrot_training - Step 17121: {'lr': 0.00038724202833959254, 'samples': 3287424, 'steps': 17121, 'loss/train': 1.102437973022461} 01/27/2022 11:57:33 - INFO - codeparrot_training - Step 17122: {'lr': 0.00038722835159175087, 'samples': 3287616, 'steps': 17122, 'loss/train': 0.7238404154777527} 01/27/2022 11:57:36 - INFO - 
codeparrot_training - Step 17123: {'lr': 0.00038721467425606644, 'samples': 3287808, 'steps': 17123, 'loss/train': 0.7338927090167999} 01/27/2022 11:57:40 - INFO - codeparrot_training - Step 17124: {'lr': 0.000387200996332598, 'samples': 3288000, 'steps': 17124, 'loss/train': 0.5847375243902206} 01/27/2022 11:57:44 - INFO - codeparrot_training - Step 17125: {'lr': 0.000387187317821404, 'samples': 3288192, 'steps': 17125, 'loss/train': 0.43509162962436676} 01/27/2022 11:57:47 - INFO - codeparrot_training - Step 17126: {'lr': 0.0003871736387225431, 'samples': 3288384, 'steps': 17126, 'loss/train': 0.8111625909805298} 01/27/2022 11:57:50 - INFO - codeparrot_training - Step 17127: {'lr': 0.0003871599590360739, 'samples': 3288576, 'steps': 17127, 'loss/train': 0.8504003584384918} 01/27/2022 11:57:53 - INFO - codeparrot_training - Step 17128: {'lr': 0.000387146278762055, 'samples': 3288768, 'steps': 17128, 'loss/train': 0.8934609889984131} 01/27/2022 11:57:56 - INFO - codeparrot_training - Step 17129: {'lr': 0.000387132597900545, 'samples': 3288960, 'steps': 17129, 'loss/train': 1.182650774717331} 01/27/2022 11:57:59 - INFO - codeparrot_training - Step 17130: {'lr': 0.0003871189164516025, 'samples': 3289152, 'steps': 17130, 'loss/train': 0.8245615661144257} 01/27/2022 11:58:03 - INFO - codeparrot_training - Step 17131: {'lr': 0.000387105234415286, 'samples': 3289344, 'steps': 17131, 'loss/train': 0.43334394693374634} 01/27/2022 11:58:06 - INFO - codeparrot_training - Step 17132: {'lr': 0.00038709155179165436, 'samples': 3289536, 'steps': 17132, 'loss/train': 0.9035642445087433} 01/27/2022 11:58:12 - INFO - codeparrot_training - Step 17133: {'lr': 0.000387077868580766, 'samples': 3289728, 'steps': 17133, 'loss/train': 4.857071042060852} 01/27/2022 11:58:15 - INFO - codeparrot_training - Step 17134: {'lr': 0.00038706418478267945, 'samples': 3289920, 'steps': 17134, 'loss/train': 1.1010225713253021} 01/27/2022 11:58:18 - INFO - codeparrot_training - Step 17135: {'lr': 0.0003870505003974536, 'samples': 3290112, 'steps': 17135, 'loss/train': 0.35707008093595505} 01/27/2022 11:58:21 - INFO - codeparrot_training - Step 17136: {'lr': 0.0003870368154251469, 'samples': 3290304, 'steps': 17136, 'loss/train': 0.6170548796653748} 01/27/2022 11:58:25 - INFO - codeparrot_training - Step 17137: {'lr': 0.000387023129865818, 'samples': 3290496, 'steps': 17137, 'loss/train': 1.501773476600647} 01/27/2022 11:58:28 - INFO - codeparrot_training - Step 17138: {'lr': 0.00038700944371952543, 'samples': 3290688, 'steps': 17138, 'loss/train': 0.845130980014801} 01/27/2022 11:58:31 - INFO - codeparrot_training - Step 17139: {'lr': 0.00038699575698632806, 'samples': 3290880, 'steps': 17139, 'loss/train': 0.9448512196540833} 01/27/2022 11:58:34 - INFO - codeparrot_training - Step 17140: {'lr': 0.00038698206966628426, 'samples': 3291072, 'steps': 17140, 'loss/train': 0.7295672446489334} 01/27/2022 11:58:37 - INFO - codeparrot_training - Step 17141: {'lr': 0.00038696838175945284, 'samples': 3291264, 'steps': 17141, 'loss/train': 0.8344963788986206} 01/27/2022 11:58:41 - INFO - codeparrot_training - Step 17142: {'lr': 0.0003869546932658923, 'samples': 3291456, 'steps': 17142, 'loss/train': 0.9255821406841278} 01/27/2022 11:58:45 - INFO - codeparrot_training - Step 17143: {'lr': 0.0003869410041856614, 'samples': 3291648, 'steps': 17143, 'loss/train': 0.6557753831148148} 01/27/2022 11:58:48 - INFO - codeparrot_training - Step 17144: {'lr': 0.0003869273145188186, 'samples': 3291840, 'steps': 17144, 'loss/train': 
0.8829418122768402} 01/27/2022 11:58:51 - INFO - codeparrot_training - Step 17145: {'lr': 0.00038691362426542273, 'samples': 3292032, 'steps': 17145, 'loss/train': 0.7441485822200775} 01/27/2022 11:58:54 - INFO - codeparrot_training - Step 17146: {'lr': 0.0003868999334255324, 'samples': 3292224, 'steps': 17146, 'loss/train': 0.7426473051309586} 01/27/2022 11:58:57 - INFO - codeparrot_training - Step 17147: {'lr': 0.00038688624199920623, 'samples': 3292416, 'steps': 17147, 'loss/train': 1.040901392698288} 01/27/2022 11:59:00 - INFO - codeparrot_training - Step 17148: {'lr': 0.0003868725499865029, 'samples': 3292608, 'steps': 17148, 'loss/train': 0.32856787741184235} 01/27/2022 11:59:03 - INFO - codeparrot_training - Step 17149: {'lr': 0.00038685885738748096, 'samples': 3292800, 'steps': 17149, 'loss/train': 0.867857962846756} 01/27/2022 11:59:07 - INFO - codeparrot_training - Step 17150: {'lr': 0.0003868451642021992, 'samples': 3292992, 'steps': 17150, 'loss/train': 1.0771827399730682} 01/27/2022 11:59:12 - INFO - codeparrot_training - Step 17151: {'lr': 0.0003868314704307161, 'samples': 3293184, 'steps': 17151, 'loss/train': 0.4406020939350128} 01/27/2022 11:59:15 - INFO - codeparrot_training - Step 17152: {'lr': 0.0003868177760730905, 'samples': 3293376, 'steps': 17152, 'loss/train': 1.188736081123352} 01/27/2022 11:59:18 - INFO - codeparrot_training - Step 17153: {'lr': 0.00038680408112938097, 'samples': 3293568, 'steps': 17153, 'loss/train': 0.5251313745975494} 01/27/2022 11:59:21 - INFO - codeparrot_training - Step 17154: {'lr': 0.00038679038559964626, 'samples': 3293760, 'steps': 17154, 'loss/train': 1.7475634217262268} 01/27/2022 11:59:24 - INFO - codeparrot_training - Step 17155: {'lr': 0.0003867766894839449, 'samples': 3293952, 'steps': 17155, 'loss/train': 1.6415782570838928} 01/27/2022 11:59:28 - INFO - codeparrot_training - Step 17156: {'lr': 0.0003867629927823357, 'samples': 3294144, 'steps': 17156, 'loss/train': 0.8078372776508331} 01/27/2022 11:59:31 - INFO - codeparrot_training - Step 17157: {'lr': 0.00038674929549487714, 'samples': 3294336, 'steps': 17157, 'loss/train': 1.0802510976791382} 01/27/2022 11:59:34 - INFO - codeparrot_training - Step 17158: {'lr': 0.00038673559762162816, 'samples': 3294528, 'steps': 17158, 'loss/train': 0.475567951798439} 01/27/2022 11:59:37 - INFO - codeparrot_training - Step 17159: {'lr': 0.0003867218991626472, 'samples': 3294720, 'steps': 17159, 'loss/train': 0.90149787068367} 01/27/2022 11:59:43 - INFO - codeparrot_training - Step 17160: {'lr': 0.0003867082001179931, 'samples': 3294912, 'steps': 17160, 'loss/train': 0.987870454788208} 01/27/2022 11:59:47 - INFO - codeparrot_training - Step 17161: {'lr': 0.0003866945004877245, 'samples': 3295104, 'steps': 17161, 'loss/train': 0.8352283537387848} 01/27/2022 11:59:50 - INFO - codeparrot_training - Step 17162: {'lr': 0.0003866808002719, 'samples': 3295296, 'steps': 17162, 'loss/train': 0.9312686026096344} 01/27/2022 11:59:53 - INFO - codeparrot_training - Step 17163: {'lr': 0.00038666709947057836, 'samples': 3295488, 'steps': 17163, 'loss/train': 0.39049476385116577} 01/27/2022 11:59:56 - INFO - codeparrot_training - Step 17164: {'lr': 0.0003866533980838183, 'samples': 3295680, 'steps': 17164, 'loss/train': 1.055180311203003} 01/27/2022 11:59:59 - INFO - codeparrot_training - Step 17165: {'lr': 0.0003866396961116785, 'samples': 3295872, 'steps': 17165, 'loss/train': 0.7023611068725586} 01/27/2022 12:00:02 - INFO - codeparrot_training - Step 17166: {'lr': 0.00038662599355421756, 'samples': 
3296064, 'steps': 17166, 'loss/train': 1.2058171033859253} 01/27/2022 12:00:05 - INFO - codeparrot_training - Step 17167: {'lr': 0.00038661229041149427, 'samples': 3296256, 'steps': 17167, 'loss/train': 1.3383604288101196} 01/27/2022 12:00:09 - INFO - codeparrot_training - Step 17168: {'lr': 0.0003865985866835673, 'samples': 3296448, 'steps': 17168, 'loss/train': 0.9030220806598663} 01/27/2022 12:00:13 - INFO - codeparrot_training - Step 17169: {'lr': 0.0003865848823704954, 'samples': 3296640, 'steps': 17169, 'loss/train': 1.2047575414180756} 01/27/2022 12:00:16 - INFO - codeparrot_training - Step 17170: {'lr': 0.00038657117747233717, 'samples': 3296832, 'steps': 17170, 'loss/train': 0.788334310054779} 01/27/2022 12:00:19 - INFO - codeparrot_training - Step 17171: {'lr': 0.00038655747198915137, 'samples': 3297024, 'steps': 17171, 'loss/train': 0.8884812891483307} 01/27/2022 12:00:23 - INFO - codeparrot_training - Step 17172: {'lr': 0.0003865437659209968, 'samples': 3297216, 'steps': 17172, 'loss/train': 0.7638389468193054} 01/27/2022 12:00:26 - INFO - codeparrot_training - Step 17173: {'lr': 0.00038653005926793203, 'samples': 3297408, 'steps': 17173, 'loss/train': 0.7381017208099365} 01/27/2022 12:00:29 - INFO - codeparrot_training - Step 17174: {'lr': 0.0003865163520300159, 'samples': 3297600, 'steps': 17174, 'loss/train': 5.271773099899292} 01/27/2022 12:00:32 - INFO - codeparrot_training - Step 17175: {'lr': 0.00038650264420730707, 'samples': 3297792, 'steps': 17175, 'loss/train': 1.0470133423805237} 01/27/2022 12:00:35 - INFO - codeparrot_training - Step 17176: {'lr': 0.00038648893579986424, 'samples': 3297984, 'steps': 17176, 'loss/train': 0.8803399801254272} 01/27/2022 12:00:38 - INFO - codeparrot_training - Step 17177: {'lr': 0.00038647522680774603, 'samples': 3298176, 'steps': 17177, 'loss/train': 0.9328750669956207} 01/27/2022 12:00:44 - INFO - codeparrot_training - Step 17178: {'lr': 0.0003864615172310115, 'samples': 3298368, 'steps': 17178, 'loss/train': 0.7722419500350952} 01/27/2022 12:00:47 - INFO - codeparrot_training - Step 17179: {'lr': 0.000386447807069719, 'samples': 3298560, 'steps': 17179, 'loss/train': 0.9769330322742462} 01/27/2022 12:00:51 - INFO - codeparrot_training - Step 17180: {'lr': 0.0003864340963239275, 'samples': 3298752, 'steps': 17180, 'loss/train': 0.7703602015972137} 01/27/2022 12:00:54 - INFO - codeparrot_training - Step 17181: {'lr': 0.00038642038499369556, 'samples': 3298944, 'steps': 17181, 'loss/train': 0.6687292903661728} 01/27/2022 12:00:57 - INFO - codeparrot_training - Step 17182: {'lr': 0.0003864066730790821, 'samples': 3299136, 'steps': 17182, 'loss/train': 0.8582193553447723} 01/27/2022 12:01:00 - INFO - codeparrot_training - Step 17183: {'lr': 0.00038639296058014575, 'samples': 3299328, 'steps': 17183, 'loss/train': 2.0792430639266968} 01/27/2022 12:01:03 - INFO - codeparrot_training - Step 17184: {'lr': 0.0003863792474969453, 'samples': 3299520, 'steps': 17184, 'loss/train': 1.1491015255451202} 01/27/2022 12:01:06 - INFO - codeparrot_training - Step 17185: {'lr': 0.00038636553382953944, 'samples': 3299712, 'steps': 17185, 'loss/train': 0.8016312718391418} 01/27/2022 12:01:09 - INFO - codeparrot_training - Step 17186: {'lr': 0.00038635181957798686, 'samples': 3299904, 'steps': 17186, 'loss/train': 0.9545815587043762} 01/27/2022 12:01:14 - INFO - codeparrot_training - Step 17187: {'lr': 0.00038633810474234643, 'samples': 3300096, 'steps': 17187, 'loss/train': 0.7852778434753418} 01/27/2022 12:01:17 - INFO - codeparrot_training - Step 17188: 
{'lr': 0.00038632438932267686, 'samples': 3300288, 'steps': 17188, 'loss/train': 0.8019550144672394} 01/27/2022 12:01:20 - INFO - codeparrot_training - Step 17189: {'lr': 0.0003863106733190369, 'samples': 3300480, 'steps': 17189, 'loss/train': 0.20018691569566727} 01/27/2022 12:01:23 - INFO - codeparrot_training - Step 17190: {'lr': 0.0003862969567314852, 'samples': 3300672, 'steps': 17190, 'loss/train': 4.099797606468201} 01/27/2022 12:01:26 - INFO - codeparrot_training - Step 17191: {'lr': 0.0003862832395600808, 'samples': 3300864, 'steps': 17191, 'loss/train': 0.73240926861763} 01/27/2022 12:01:30 - INFO - codeparrot_training - Step 17192: {'lr': 0.0003862695218048822, 'samples': 3301056, 'steps': 17192, 'loss/train': 0.8167985379695892} 01/27/2022 12:01:33 - INFO - codeparrot_training - Step 17193: {'lr': 0.00038625580346594824, 'samples': 3301248, 'steps': 17193, 'loss/train': 0.9045749008655548} 01/27/2022 12:01:36 - INFO - codeparrot_training - Step 17194: {'lr': 0.00038624208454333763, 'samples': 3301440, 'steps': 17194, 'loss/train': 1.2144889533519745} 01/27/2022 12:01:39 - INFO - codeparrot_training - Step 17195: {'lr': 0.00038622836503710917, 'samples': 3301632, 'steps': 17195, 'loss/train': 0.7750020325183868} 01/27/2022 12:01:44 - INFO - codeparrot_training - Step 17196: {'lr': 0.00038621464494732174, 'samples': 3301824, 'steps': 17196, 'loss/train': 0.7740617394447327} 01/27/2022 12:01:47 - INFO - codeparrot_training - Step 17197: {'lr': 0.00038620092427403395, 'samples': 3302016, 'steps': 17197, 'loss/train': 0.9700503051280975} 01/27/2022 12:01:50 - INFO - codeparrot_training - Step 17198: {'lr': 0.0003861872030173047, 'samples': 3302208, 'steps': 17198, 'loss/train': 1.0479870736598969} 01/27/2022 12:01:53 - INFO - codeparrot_training - Step 17199: {'lr': 0.0003861734811771928, 'samples': 3302400, 'steps': 17199, 'loss/train': 1.3490813970565796} 01/27/2022 12:01:56 - INFO - codeparrot_training - Step 17200: {'lr': 0.00038615975875375683, 'samples': 3302592, 'steps': 17200, 'loss/train': 0.7893092036247253} 01/27/2022 12:01:59 - INFO - codeparrot_training - Step 17201: {'lr': 0.0003861460357470556, 'samples': 3302784, 'steps': 17201, 'loss/train': 0.7958753407001495} 01/27/2022 12:02:03 - INFO - codeparrot_training - Step 17202: {'lr': 0.0003861323121571482, 'samples': 3302976, 'steps': 17202, 'loss/train': 0.8382782936096191} 01/27/2022 12:02:06 - INFO - codeparrot_training - Step 17203: {'lr': 0.0003861185879840931, 'samples': 3303168, 'steps': 17203, 'loss/train': 0.6341733634471893} 01/27/2022 12:02:12 - INFO - codeparrot_training - Step 17204: {'lr': 0.00038610486322794915, 'samples': 3303360, 'steps': 17204, 'loss/train': 1.0078166127204895} 01/27/2022 12:02:15 - INFO - codeparrot_training - Step 17205: {'lr': 0.0003860911378887752, 'samples': 3303552, 'steps': 17205, 'loss/train': 0.9842389225959778} 01/27/2022 12:02:18 - INFO - codeparrot_training - Step 17206: {'lr': 0.00038607741196663005, 'samples': 3303744, 'steps': 17206, 'loss/train': 0.3809104710817337} 01/27/2022 12:02:22 - INFO - codeparrot_training - Step 17207: {'lr': 0.0003860636854615725, 'samples': 3303936, 'steps': 17207, 'loss/train': 0.9755906760692596} 01/27/2022 12:02:25 - INFO - codeparrot_training - Step 17208: {'lr': 0.0003860499583736613, 'samples': 3304128, 'steps': 17208, 'loss/train': 1.0893445014953613} 01/27/2022 12:02:28 - INFO - codeparrot_training - Step 17209: {'lr': 0.00038603623070295536, 'samples': 3304320, 'steps': 17209, 'loss/train': 1.2445896863937378} 01/27/2022 12:02:31 - 
INFO - codeparrot_training - Step 17210: {'lr': 0.0003860225024495133, 'samples': 3304512, 'steps': 17210, 'loss/train': 0.7934837937355042} 01/27/2022 12:02:34 - INFO - codeparrot_training - Step 17211: {'lr': 0.000386008773613394, 'samples': 3304704, 'steps': 17211, 'loss/train': 0.9850592315196991} 01/27/2022 12:02:37 - INFO - codeparrot_training - Step 17212: {'lr': 0.0003859950441946564, 'samples': 3304896, 'steps': 17212, 'loss/train': 0.8522422313690186} 01/27/2022 12:02:42 - INFO - codeparrot_training - Step 17213: {'lr': 0.0003859813141933592, 'samples': 3305088, 'steps': 17213, 'loss/train': 1.0689623951911926} 01/27/2022 12:02:45 - INFO - codeparrot_training - Step 17214: {'lr': 0.0003859675836095612, 'samples': 3305280, 'steps': 17214, 'loss/train': 0.9115203022956848} 01/27/2022 12:02:48 - INFO - codeparrot_training - Step 17215: {'lr': 0.00038595385244332125, 'samples': 3305472, 'steps': 17215, 'loss/train': 0.9020907282829285} 01/27/2022 12:02:51 - INFO - codeparrot_training - Step 17216: {'lr': 0.00038594012069469814, 'samples': 3305664, 'steps': 17216, 'loss/train': 1.027454674243927} 01/27/2022 12:02:54 - INFO - codeparrot_training - Step 17217: {'lr': 0.00038592638836375075, 'samples': 3305856, 'steps': 17217, 'loss/train': 0.9640725553035736} 01/27/2022 12:02:58 - INFO - codeparrot_training - Step 17218: {'lr': 0.0003859126554505379, 'samples': 3306048, 'steps': 17218, 'loss/train': 1.1395648419857025} 01/27/2022 12:03:01 - INFO - codeparrot_training - Step 17219: {'lr': 0.00038589892195511834, 'samples': 3306240, 'steps': 17219, 'loss/train': 1.263746201992035} 01/27/2022 12:03:04 - INFO - codeparrot_training - Step 17220: {'lr': 0.00038588518787755096, 'samples': 3306432, 'steps': 17220, 'loss/train': 0.7372184693813324} 01/27/2022 12:03:07 - INFO - codeparrot_training - Step 17221: {'lr': 0.00038587145321789456, 'samples': 3306624, 'steps': 17221, 'loss/train': 0.6328811198472977} 01/27/2022 12:03:11 - INFO - codeparrot_training - Step 17222: {'lr': 0.00038585771797620803, 'samples': 3306816, 'steps': 17222, 'loss/train': 1.1549644768238068} 01/27/2022 12:03:14 - INFO - codeparrot_training - Step 17223: {'lr': 0.00038584398215255023, 'samples': 3307008, 'steps': 17223, 'loss/train': 1.0643807351589203} 01/27/2022 12:03:18 - INFO - codeparrot_training - Step 17224: {'lr': 0.0003858302457469799, 'samples': 3307200, 'steps': 17224, 'loss/train': 0.5199750512838364} 01/27/2022 12:03:21 - INFO - codeparrot_training - Step 17225: {'lr': 0.0003858165087595559, 'samples': 3307392, 'steps': 17225, 'loss/train': 0.8222046196460724} 01/27/2022 12:03:24 - INFO - codeparrot_training - Step 17226: {'lr': 0.00038580277119033715, 'samples': 3307584, 'steps': 17226, 'loss/train': 1.2518862783908844} 01/27/2022 12:03:27 - INFO - codeparrot_training - Step 17227: {'lr': 0.0003857890330393824, 'samples': 3307776, 'steps': 17227, 'loss/train': 0.21527986228466034} 01/27/2022 12:03:30 - INFO - codeparrot_training - Step 17228: {'lr': 0.0003857752943067506, 'samples': 3307968, 'steps': 17228, 'loss/train': 0.5521108657121658} 01/27/2022 12:03:33 - INFO - codeparrot_training - Step 17229: {'lr': 0.00038576155499250056, 'samples': 3308160, 'steps': 17229, 'loss/train': 0.8426592350006104} 01/27/2022 12:03:37 - INFO - codeparrot_training - Step 17230: {'lr': 0.000385747815096691, 'samples': 3308352, 'steps': 17230, 'loss/train': 0.13608907163143158} 01/27/2022 12:03:41 - INFO - codeparrot_training - Step 17231: {'lr': 0.00038573407461938103, 'samples': 3308544, 'steps': 17231, 'loss/train': 
1.2773436605930328} 01/27/2022 12:03:44 - INFO - codeparrot_training - Step 17232: {'lr': 0.0003857203335606294, 'samples': 3308736, 'steps': 17232, 'loss/train': 0.9917554557323456} 01/27/2022 12:03:47 - INFO - codeparrot_training - Step 17233: {'lr': 0.00038570659192049497, 'samples': 3308928, 'steps': 17233, 'loss/train': 1.2190384268760681} 01/27/2022 12:03:50 - INFO - codeparrot_training - Step 17234: {'lr': 0.0003856928496990364, 'samples': 3309120, 'steps': 17234, 'loss/train': 0.95966836810112} 01/27/2022 12:03:54 - INFO - codeparrot_training - Step 17235: {'lr': 0.000385679106896313, 'samples': 3309312, 'steps': 17235, 'loss/train': 1.0819309651851654} 01/27/2022 12:03:57 - INFO - codeparrot_training - Step 17236: {'lr': 0.0003856653635123832, 'samples': 3309504, 'steps': 17236, 'loss/train': 1.0137571692466736} 01/27/2022 12:04:00 - INFO - codeparrot_training - Step 17237: {'lr': 0.0003856516195473062, 'samples': 3309696, 'steps': 17237, 'loss/train': 0.754884660243988} 01/27/2022 12:04:03 - INFO - codeparrot_training - Step 17238: {'lr': 0.0003856378750011407, 'samples': 3309888, 'steps': 17238, 'loss/train': 0.8089282214641571} 01/27/2022 12:04:06 - INFO - codeparrot_training - Step 17239: {'lr': 0.0003856241298739456, 'samples': 3310080, 'steps': 17239, 'loss/train': 0.25753019750118256} 01/27/2022 12:04:12 - INFO - codeparrot_training - Step 17240: {'lr': 0.0003856103841657797, 'samples': 3310272, 'steps': 17240, 'loss/train': 0.8107045590877533} 01/27/2022 12:04:15 - INFO - codeparrot_training - Step 17241: {'lr': 0.0003855966378767021, 'samples': 3310464, 'steps': 17241, 'loss/train': 1.0111234188079834} 01/27/2022 12:04:19 - INFO - codeparrot_training - Step 17242: {'lr': 0.00038558289100677144, 'samples': 3310656, 'steps': 17242, 'loss/train': 0.8398600816726685} 01/27/2022 12:04:22 - INFO - codeparrot_training - Step 17243: {'lr': 0.00038556914355604676, 'samples': 3310848, 'steps': 17243, 'loss/train': 0.49779273569583893} 01/27/2022 12:04:25 - INFO - codeparrot_training - Step 17244: {'lr': 0.0003855553955245871, 'samples': 3311040, 'steps': 17244, 'loss/train': 0.7260745167732239} 01/27/2022 12:04:28 - INFO - codeparrot_training - Step 17245: {'lr': 0.00038554164691245095, 'samples': 3311232, 'steps': 17245, 'loss/train': 0.6497770249843597} 01/27/2022 12:04:31 - INFO - codeparrot_training - Step 17246: {'lr': 0.00038552789771969755, 'samples': 3311424, 'steps': 17246, 'loss/train': 0.8452306687831879} 01/27/2022 12:04:34 - INFO - codeparrot_training - Step 17247: {'lr': 0.00038551414794638555, 'samples': 3311616, 'steps': 17247, 'loss/train': 0.8565156161785126} 01/27/2022 12:04:39 - INFO - codeparrot_training - Step 17248: {'lr': 0.00038550039759257404, 'samples': 3311808, 'steps': 17248, 'loss/train': 0.6468378603458405} 01/27/2022 12:04:42 - INFO - codeparrot_training - Step 17249: {'lr': 0.0003854866466583219, 'samples': 3312000, 'steps': 17249, 'loss/train': 0.6167835742235184} 01/27/2022 12:04:45 - INFO - codeparrot_training - Step 17250: {'lr': 0.00038547289514368795, 'samples': 3312192, 'steps': 17250, 'loss/train': 1.1010775566101074} 01/27/2022 12:04:48 - INFO - codeparrot_training - Step 17251: {'lr': 0.00038545914304873117, 'samples': 3312384, 'steps': 17251, 'loss/train': 0.5249023139476776} 01/27/2022 12:04:51 - INFO - codeparrot_training - Step 17252: {'lr': 0.00038544539037351037, 'samples': 3312576, 'steps': 17252, 'loss/train': 0.491867333650589} 01/27/2022 12:04:54 - INFO - codeparrot_training - Step 17253: {'lr': 0.00038543163711808457, 'samples': 
3312768, 'steps': 17253, 'loss/train': 0.4940957576036453} 01/27/2022 12:04:58 - INFO - codeparrot_training - Step 17254: {'lr': 0.0003854178832825126, 'samples': 3312960, 'steps': 17254, 'loss/train': 1.127654492855072} 01/27/2022 12:05:01 - INFO - codeparrot_training - Step 17255: {'lr': 0.0003854041288668534, 'samples': 3313152, 'steps': 17255, 'loss/train': 0.6838559210300446} 01/27/2022 12:05:04 - INFO - codeparrot_training - Step 17256: {'lr': 0.00038539037387116595, 'samples': 3313344, 'steps': 17256, 'loss/train': 0.8344090282917023} 01/27/2022 12:05:08 - INFO - codeparrot_training - Step 17257: {'lr': 0.0003853766182955092, 'samples': 3313536, 'steps': 17257, 'loss/train': 0.8003952205181122} 01/27/2022 12:05:11 - INFO - codeparrot_training - Step 17258: {'lr': 0.0003853628621399419, 'samples': 3313728, 'steps': 17258, 'loss/train': 0.4011252522468567} 01/27/2022 12:05:14 - INFO - codeparrot_training - Step 17259: {'lr': 0.00038534910540452305, 'samples': 3313920, 'steps': 17259, 'loss/train': 0.9199228584766388} 01/27/2022 12:05:18 - INFO - codeparrot_training - Step 17260: {'lr': 0.0003853353480893117, 'samples': 3314112, 'steps': 17260, 'loss/train': 1.2258745729923248} 01/27/2022 12:05:21 - INFO - codeparrot_training - Step 17261: {'lr': 0.0003853215901943667, 'samples': 3314304, 'steps': 17261, 'loss/train': 0.4479082077741623} 01/27/2022 12:05:24 - INFO - codeparrot_training - Step 17262: {'lr': 0.00038530783171974694, 'samples': 3314496, 'steps': 17262, 'loss/train': 1.0066815912723541} 01/27/2022 12:05:27 - INFO - codeparrot_training - Step 17263: {'lr': 0.0003852940726655114, 'samples': 3314688, 'steps': 17263, 'loss/train': 0.7901995182037354} 01/27/2022 12:05:30 - INFO - codeparrot_training - Step 17264: {'lr': 0.000385280313031719, 'samples': 3314880, 'steps': 17264, 'loss/train': 0.5094643235206604} 01/27/2022 12:05:33 - INFO - codeparrot_training - Step 17265: {'lr': 0.0003852665528184287, 'samples': 3315072, 'steps': 17265, 'loss/train': 0.9375911056995392} 01/27/2022 12:05:40 - INFO - codeparrot_training - Step 17266: {'lr': 0.0003852527920256994, 'samples': 3315264, 'steps': 17266, 'loss/train': 0.8437194228172302} 01/27/2022 12:05:43 - INFO - codeparrot_training - Step 17267: {'lr': 0.00038523903065359013, 'samples': 3315456, 'steps': 17267, 'loss/train': 0.5410618185997009} 01/27/2022 12:05:46 - INFO - codeparrot_training - Step 17268: {'lr': 0.0003852252687021598, 'samples': 3315648, 'steps': 17268, 'loss/train': 0.4964694678783417} 01/27/2022 12:05:49 - INFO - codeparrot_training - Step 17269: {'lr': 0.00038521150617146737, 'samples': 3315840, 'steps': 17269, 'loss/train': 0.6763711273670197} 01/27/2022 12:05:52 - INFO - codeparrot_training - Step 17270: {'lr': 0.00038519774306157174, 'samples': 3316032, 'steps': 17270, 'loss/train': 0.8893589079380035} 01/27/2022 12:05:55 - INFO - codeparrot_training - Step 17271: {'lr': 0.00038518397937253195, 'samples': 3316224, 'steps': 17271, 'loss/train': 0.7312084883451462} 01/27/2022 12:05:58 - INFO - codeparrot_training - Step 17272: {'lr': 0.00038517021510440694, 'samples': 3316416, 'steps': 17272, 'loss/train': 0.7250602394342422} 01/27/2022 12:06:02 - INFO - codeparrot_training - Step 17273: {'lr': 0.0003851564502572556, 'samples': 3316608, 'steps': 17273, 'loss/train': 1.248950868844986} 01/27/2022 12:06:06 - INFO - codeparrot_training - Step 17274: {'lr': 0.00038514268483113694, 'samples': 3316800, 'steps': 17274, 'loss/train': 0.7562805712223053} 01/27/2022 12:06:09 - INFO - codeparrot_training - Step 17275: 
{'lr': 0.00038512891882610997, 'samples': 3316992, 'steps': 17275, 'loss/train': 5.618906021118164} 01/27/2022 12:06:12 - INFO - codeparrot_training - Step 17276: {'lr': 0.0003851151522422336, 'samples': 3317184, 'steps': 17276, 'loss/train': 0.604311615228653} 01/27/2022 12:06:16 - INFO - codeparrot_training - Step 17277: {'lr': 0.0003851013850795668, 'samples': 3317376, 'steps': 17277, 'loss/train': 0.821129322052002} 01/27/2022 12:06:19 - INFO - codeparrot_training - Step 17278: {'lr': 0.00038508761733816864, 'samples': 3317568, 'steps': 17278, 'loss/train': 1.0323987901210785} 01/27/2022 12:06:22 - INFO - codeparrot_training - Step 17279: {'lr': 0.00038507384901809795, 'samples': 3317760, 'steps': 17279, 'loss/train': 0.8463170528411865} 01/27/2022 12:06:25 - INFO - codeparrot_training - Step 17280: {'lr': 0.00038506008011941376, 'samples': 3317952, 'steps': 17280, 'loss/train': 1.1559272110462189} 01/27/2022 12:06:28 - INFO - codeparrot_training - Step 17281: {'lr': 0.0003850463106421751, 'samples': 3318144, 'steps': 17281, 'loss/train': 0.8020239472389221} 01/27/2022 12:06:31 - INFO - codeparrot_training - Step 17282: {'lr': 0.000385032540586441, 'samples': 3318336, 'steps': 17282, 'loss/train': 1.0159482657909393} 01/27/2022 12:06:38 - INFO - codeparrot_training - Step 17283: {'lr': 0.00038501876995227023, 'samples': 3318528, 'steps': 17283, 'loss/train': 0.28964464366436005} 01/27/2022 12:06:41 - INFO - codeparrot_training - Step 17284: {'lr': 0.00038500499873972204, 'samples': 3318720, 'steps': 17284, 'loss/train': 0.4893749803304672} 01/27/2022 12:06:44 - INFO - codeparrot_training - Step 17285: {'lr': 0.0003849912269488552, 'samples': 3318912, 'steps': 17285, 'loss/train': 0.8351521790027618} 01/27/2022 12:06:47 - INFO - codeparrot_training - Step 17286: {'lr': 0.000384977454579729, 'samples': 3319104, 'steps': 17286, 'loss/train': 0.9028458595275879} 01/27/2022 12:06:50 - INFO - codeparrot_training - Step 17287: {'lr': 0.00038496368163240215, 'samples': 3319296, 'steps': 17287, 'loss/train': 0.7461619824171066} 01/27/2022 12:06:53 - INFO - codeparrot_training - Step 17288: {'lr': 0.00038494990810693366, 'samples': 3319488, 'steps': 17288, 'loss/train': 0.8105019629001617} 01/27/2022 12:06:57 - INFO - codeparrot_training - Step 17289: {'lr': 0.00038493613400338267, 'samples': 3319680, 'steps': 17289, 'loss/train': 0.9958089888095856} 01/27/2022 12:07:00 - INFO - codeparrot_training - Step 17290: {'lr': 0.0003849223593218082, 'samples': 3319872, 'steps': 17290, 'loss/train': 0.7786571681499481} 01/27/2022 12:07:03 - INFO - codeparrot_training - Step 17291: {'lr': 0.00038490858406226903, 'samples': 3320064, 'steps': 17291, 'loss/train': 1.0172542333602905} 01/27/2022 12:07:07 - INFO - codeparrot_training - Step 17292: {'lr': 0.00038489480822482446, 'samples': 3320256, 'steps': 17292, 'loss/train': 0.5405485779047012} 01/27/2022 12:07:10 - INFO - codeparrot_training - Step 17293: {'lr': 0.00038488103180953326, 'samples': 3320448, 'steps': 17293, 'loss/train': 0.4422297030687332} 01/27/2022 12:07:14 - INFO - codeparrot_training - Step 17294: {'lr': 0.00038486725481645467, 'samples': 3320640, 'steps': 17294, 'loss/train': 1.043163388967514} 01/27/2022 12:07:17 - INFO - codeparrot_training - Step 17295: {'lr': 0.00038485347724564746, 'samples': 3320832, 'steps': 17295, 'loss/train': 0.8734130859375} 01/27/2022 12:07:20 - INFO - codeparrot_training - Step 17296: {'lr': 0.0003848396990971709, 'samples': 3321024, 'steps': 17296, 'loss/train': 0.9932441711425781} 01/27/2022 12:07:23 - 
INFO - codeparrot_training - Step 17297: {'lr': 0.00038482592037108375, 'samples': 3321216, 'steps': 17297, 'loss/train': 0.7481312602758408} 01/27/2022 12:07:26 - INFO - codeparrot_training - Step 17298: {'lr': 0.0003848121410674453, 'samples': 3321408, 'steps': 17298, 'loss/train': 1.0444898307323456} 01/27/2022 12:07:29 - INFO - codeparrot_training - Step 17299: {'lr': 0.0003847983611863144, 'samples': 3321600, 'steps': 17299, 'loss/train': 0.36014796048402786} 01/27/2022 12:07:32 - INFO - codeparrot_training - Step 17300: {'lr': 0.0003847845807277501, 'samples': 3321792, 'steps': 17300, 'loss/train': 0.9920140206813812} 01/27/2022 12:07:37 - INFO - codeparrot_training - Step 17301: {'lr': 0.00038477079969181146, 'samples': 3321984, 'steps': 17301, 'loss/train': 0.7235189527273178} 01/27/2022 12:07:40 - INFO - codeparrot_training - Step 17302: {'lr': 0.00038475701807855753, 'samples': 3322176, 'steps': 17302, 'loss/train': 1.3511719107627869} 01/27/2022 12:07:43 - INFO - codeparrot_training - Step 17303: {'lr': 0.00038474323588804727, 'samples': 3322368, 'steps': 17303, 'loss/train': 0.7229753583669662} 01/27/2022 12:07:46 - INFO - codeparrot_training - Step 17304: {'lr': 0.0003847294531203398, 'samples': 3322560, 'steps': 17304, 'loss/train': 0.42376287281513214} 01/27/2022 12:07:49 - INFO - codeparrot_training - Step 17305: {'lr': 0.0003847156697754942, 'samples': 3322752, 'steps': 17305, 'loss/train': 0.7098683416843414} 01/27/2022 12:07:53 - INFO - codeparrot_training - Step 17306: {'lr': 0.00038470188585356936, 'samples': 3322944, 'steps': 17306, 'loss/train': 0.8706230521202087} 01/27/2022 12:07:56 - INFO - codeparrot_training - Step 17307: {'lr': 0.00038468810135462445, 'samples': 3323136, 'steps': 17307, 'loss/train': 0.9052166640758514} 01/27/2022 12:07:59 - INFO - codeparrot_training - Step 17308: {'lr': 0.00038467431627871844, 'samples': 3323328, 'steps': 17308, 'loss/train': 0.8047813475131989} 01/27/2022 12:08:02 - INFO - codeparrot_training - Step 17309: {'lr': 0.0003846605306259105, 'samples': 3323520, 'steps': 17309, 'loss/train': 0.6914016902446747} 01/27/2022 12:08:07 - INFO - codeparrot_training - Step 17310: {'lr': 0.0003846467443962596, 'samples': 3323712, 'steps': 17310, 'loss/train': 0.8374458253383636} 01/27/2022 12:08:11 - INFO - codeparrot_training - Step 17311: {'lr': 0.00038463295758982475, 'samples': 3323904, 'steps': 17311, 'loss/train': 1.0438011288642883} 01/27/2022 12:08:14 - INFO - codeparrot_training - Step 17312: {'lr': 0.00038461917020666506, 'samples': 3324096, 'steps': 17312, 'loss/train': 0.42586031556129456} 01/27/2022 12:08:17 - INFO - codeparrot_training - Step 17313: {'lr': 0.0003846053822468396, 'samples': 3324288, 'steps': 17313, 'loss/train': 1.1752041578292847} 01/27/2022 12:08:20 - INFO - codeparrot_training - Step 17314: {'lr': 0.00038459159371040743, 'samples': 3324480, 'steps': 17314, 'loss/train': 0.5639998465776443} 01/27/2022 12:08:23 - INFO - codeparrot_training - Step 17315: {'lr': 0.0003845778045974276, 'samples': 3324672, 'steps': 17315, 'loss/train': 0.9631868004798889} 01/27/2022 12:08:26 - INFO - codeparrot_training - Step 17316: {'lr': 0.0003845640149079592, 'samples': 3324864, 'steps': 17316, 'loss/train': 0.9095828533172607} 01/27/2022 12:08:29 - INFO - codeparrot_training - Step 17317: {'lr': 0.0003845502246420613, 'samples': 3325056, 'steps': 17317, 'loss/train': 0.9990110993385315} 01/27/2022 12:08:34 - INFO - codeparrot_training - Step 17318: {'lr': 0.00038453643379979295, 'samples': 3325248, 'steps': 17318, 
'loss/train': 0.08117925748229027} 01/27/2022 12:08:37 - INFO - codeparrot_training - Step 17319: {'lr': 0.00038452264238121326, 'samples': 3325440, 'steps': 17319, 'loss/train': 1.002740889787674} 01/27/2022 12:08:40 - INFO - codeparrot_training - Step 17320: {'lr': 0.0003845088503863813, 'samples': 3325632, 'steps': 17320, 'loss/train': 0.734030082821846} 01/27/2022 12:08:43 - INFO - codeparrot_training - Step 17321: {'lr': 0.0003844950578153561, 'samples': 3325824, 'steps': 17321, 'loss/train': 1.7182039618492126} 01/27/2022 12:08:46 - INFO - codeparrot_training - Step 17322: {'lr': 0.00038448126466819675, 'samples': 3326016, 'steps': 17322, 'loss/train': 0.9405626356601715} 01/27/2022 12:08:50 - INFO - codeparrot_training - Step 17323: {'lr': 0.00038446747094496243, 'samples': 3326208, 'steps': 17323, 'loss/train': 0.6842790842056274} 01/27/2022 12:08:53 - INFO - codeparrot_training - Step 17324: {'lr': 0.00038445367664571216, 'samples': 3326400, 'steps': 17324, 'loss/train': 0.4445958137512207} 01/27/2022 12:08:56 - INFO - codeparrot_training - Step 17325: {'lr': 0.000384439881770505, 'samples': 3326592, 'steps': 17325, 'loss/train': 0.7818026840686798} 01/27/2022 12:08:59 - INFO - codeparrot_training - Step 17326: {'lr': 0.0003844260863194001, 'samples': 3326784, 'steps': 17326, 'loss/train': 0.3377355560660362} 01/27/2022 12:09:03 - INFO - codeparrot_training - Step 17327: {'lr': 0.0003844122902924565, 'samples': 3326976, 'steps': 17327, 'loss/train': 1.2799673974514008} 01/27/2022 12:09:07 - INFO - codeparrot_training - Step 17328: {'lr': 0.00038439849368973334, 'samples': 3327168, 'steps': 17328, 'loss/train': 0.967228889465332} 01/27/2022 12:09:10 - INFO - codeparrot_training - Step 17329: {'lr': 0.0003843846965112897, 'samples': 3327360, 'steps': 17329, 'loss/train': 0.611377939581871} 01/27/2022 12:09:13 - INFO - codeparrot_training - Step 17330: {'lr': 0.0003843708987571847, 'samples': 3327552, 'steps': 17330, 'loss/train': 1.3449221849441528} 01/27/2022 12:09:16 - INFO - codeparrot_training - Step 17331: {'lr': 0.0003843571004274775, 'samples': 3327744, 'steps': 17331, 'loss/train': 0.6128996014595032} 01/27/2022 12:09:19 - INFO - codeparrot_training - Step 17332: {'lr': 0.0003843433015222271, 'samples': 3327936, 'steps': 17332, 'loss/train': 0.9001774191856384} 01/27/2022 12:09:22 - INFO - codeparrot_training - Step 17333: {'lr': 0.0003843295020414926, 'samples': 3328128, 'steps': 17333, 'loss/train': 1.0729883015155792} 01/27/2022 12:09:26 - INFO - codeparrot_training - Step 17334: {'lr': 0.0003843157019853332, 'samples': 3328320, 'steps': 17334, 'loss/train': 0.8696693480014801} 01/27/2022 12:09:29 - INFO - codeparrot_training - Step 17335: {'lr': 0.00038430190135380803, 'samples': 3328512, 'steps': 17335, 'loss/train': 1.391870230436325} 01/27/2022 12:09:33 - INFO - codeparrot_training - Step 17336: {'lr': 0.00038428810014697615, 'samples': 3328704, 'steps': 17336, 'loss/train': 1.1846664547920227} 01/27/2022 12:09:36 - INFO - codeparrot_training - Step 17337: {'lr': 0.00038427429836489663, 'samples': 3328896, 'steps': 17337, 'loss/train': 1.3876873254776} 01/27/2022 12:09:39 - INFO - codeparrot_training - Step 17338: {'lr': 0.00038426049600762867, 'samples': 3329088, 'steps': 17338, 'loss/train': 1.1478002965450287} 01/27/2022 12:09:43 - INFO - codeparrot_training - Step 17339: {'lr': 0.00038424669307523135, 'samples': 3329280, 'steps': 17339, 'loss/train': 0.34568506479263306} 01/27/2022 12:09:46 - INFO - codeparrot_training - Step 17340: {'lr': 0.00038423288956776394, 
'samples': 3329472, 'steps': 17340, 'loss/train': 0.7177906483411789} 01/27/2022 12:09:49 - INFO - codeparrot_training - Step 17341: {'lr': 0.00038421908548528534, 'samples': 3329664, 'steps': 17341, 'loss/train': 0.7663499414920807} 01/27/2022 12:09:52 - INFO - codeparrot_training - Step 17342: {'lr': 0.0003842052808278549, 'samples': 3329856, 'steps': 17342, 'loss/train': 0.02663511224091053} 01/27/2022 12:09:55 - INFO - codeparrot_training - Step 17343: {'lr': 0.0003841914755955315, 'samples': 3330048, 'steps': 17343, 'loss/train': 0.9332700669765472} 01/27/2022 12:09:58 - INFO - codeparrot_training - Step 17344: {'lr': 0.00038417766978837453, 'samples': 3330240, 'steps': 17344, 'loss/train': 1.1043168604373932} 01/27/2022 12:10:04 - INFO - codeparrot_training - Step 17345: {'lr': 0.00038416386340644305, 'samples': 3330432, 'steps': 17345, 'loss/train': 0.8539256751537323} 01/27/2022 12:10:08 - INFO - codeparrot_training - Step 17346: {'lr': 0.00038415005644979616, 'samples': 3330624, 'steps': 17346, 'loss/train': 0.8399129211902618} 01/27/2022 12:10:11 - INFO - codeparrot_training - Step 17347: {'lr': 0.00038413624891849295, 'samples': 3330816, 'steps': 17347, 'loss/train': 1.2483892142772675} 01/27/2022 12:10:14 - INFO - codeparrot_training - Step 17348: {'lr': 0.00038412244081259273, 'samples': 3331008, 'steps': 17348, 'loss/train': 0.7914360165596008} 01/27/2022 12:10:17 - INFO - codeparrot_training - Step 17349: {'lr': 0.00038410863213215454, 'samples': 3331200, 'steps': 17349, 'loss/train': 0.9063577651977539} 01/27/2022 12:10:20 - INFO - codeparrot_training - Step 17350: {'lr': 0.0003840948228772376, 'samples': 3331392, 'steps': 17350, 'loss/train': 0.9410102069377899} 01/27/2022 12:10:23 - INFO - codeparrot_training - Step 17351: {'lr': 0.00038408101304790096, 'samples': 3331584, 'steps': 17351, 'loss/train': 1.0353861451148987} 01/27/2022 12:10:27 - INFO - codeparrot_training - Step 17352: {'lr': 0.0003840672026442038, 'samples': 3331776, 'steps': 17352, 'loss/train': 1.4229017794132233} 01/27/2022 12:10:30 - INFO - codeparrot_training - Step 17353: {'lr': 0.0003840533916662054, 'samples': 3331968, 'steps': 17353, 'loss/train': 0.45604442059993744} 01/27/2022 12:10:34 - INFO - codeparrot_training - Step 17354: {'lr': 0.00038403958011396476, 'samples': 3332160, 'steps': 17354, 'loss/train': 0.7084895968437195} 01/27/2022 12:10:37 - INFO - codeparrot_training - Step 17355: {'lr': 0.0003840257679875412, 'samples': 3332352, 'steps': 17355, 'loss/train': 0.7983041703701019} 01/27/2022 12:10:40 - INFO - codeparrot_training - Step 17356: {'lr': 0.00038401195528699374, 'samples': 3332544, 'steps': 17356, 'loss/train': 0.577332079410553} 01/27/2022 12:10:44 - INFO - codeparrot_training - Step 17357: {'lr': 0.0003839981420123817, 'samples': 3332736, 'steps': 17357, 'loss/train': 1.1558298468589783} 01/27/2022 12:10:47 - INFO - codeparrot_training - Step 17358: {'lr': 0.00038398432816376404, 'samples': 3332928, 'steps': 17358, 'loss/train': 0.12298128381371498} 01/27/2022 12:10:50 - INFO - codeparrot_training - Step 17359: {'lr': 0.00038397051374120016, 'samples': 3333120, 'steps': 17359, 'loss/train': 1.0058072805404663} 01/27/2022 12:10:53 - INFO - codeparrot_training - Step 17360: {'lr': 0.00038395669874474915, 'samples': 3333312, 'steps': 17360, 'loss/train': 1.0320155024528503} 01/27/2022 12:10:56 - INFO - codeparrot_training - Step 17361: {'lr': 0.0003839428831744702, 'samples': 3333504, 'steps': 17361, 'loss/train': 0.7938851416110992} 01/27/2022 12:11:02 - INFO - 
codeparrot_training - Step 17362: {'lr': 0.0003839290670304224, 'samples': 3333696, 'steps': 17362, 'loss/train': 0.9845447838306427} 01/27/2022 12:11:05 - INFO - codeparrot_training - Step 17363: {'lr': 0.00038391525031266494, 'samples': 3333888, 'steps': 17363, 'loss/train': 0.7631959319114685} 01/27/2022 12:11:08 - INFO - codeparrot_training - Step 17364: {'lr': 0.0003839014330212572, 'samples': 3334080, 'steps': 17364, 'loss/train': 0.35819772630929947} 01/27/2022 12:11:12 - INFO - codeparrot_training - Step 17365: {'lr': 0.00038388761515625815, 'samples': 3334272, 'steps': 17365, 'loss/train': 0.6109247803688049} 01/27/2022 12:11:15 - INFO - codeparrot_training - Step 17366: {'lr': 0.0003838737967177271, 'samples': 3334464, 'steps': 17366, 'loss/train': 0.3478442654013634} 01/27/2022 12:11:18 - INFO - codeparrot_training - Step 17367: {'lr': 0.00038385997770572336, 'samples': 3334656, 'steps': 17367, 'loss/train': 0.734886422753334} 01/27/2022 12:11:21 - INFO - codeparrot_training - Step 17368: {'lr': 0.0003838461581203058, 'samples': 3334848, 'steps': 17368, 'loss/train': 0.6378163844347} 01/27/2022 12:11:24 - INFO - codeparrot_training - Step 17369: {'lr': 0.00038383233796153383, 'samples': 3335040, 'steps': 17369, 'loss/train': 0.9703617095947266} 01/27/2022 12:11:27 - INFO - codeparrot_training - Step 17370: {'lr': 0.00038381851722946663, 'samples': 3335232, 'steps': 17370, 'loss/train': 0.8987545967102051} 01/27/2022 12:11:32 - INFO - codeparrot_training - Step 17371: {'lr': 0.00038380469592416347, 'samples': 3335424, 'steps': 17371, 'loss/train': 0.7706951200962067} 01/27/2022 12:11:35 - INFO - codeparrot_training - Step 17372: {'lr': 0.00038379087404568333, 'samples': 3335616, 'steps': 17372, 'loss/train': 1.4598216712474823} 01/27/2022 12:11:38 - INFO - codeparrot_training - Step 17373: {'lr': 0.0003837770515940857, 'samples': 3335808, 'steps': 17373, 'loss/train': 0.5704761296510696} 01/27/2022 12:11:41 - INFO - codeparrot_training - Step 17374: {'lr': 0.0003837632285694296, 'samples': 3336000, 'steps': 17374, 'loss/train': 0.8199278712272644} 01/27/2022 12:11:44 - INFO - codeparrot_training - Step 17375: {'lr': 0.00038374940497177434, 'samples': 3336192, 'steps': 17375, 'loss/train': 0.8777465522289276} 01/27/2022 12:11:47 - INFO - codeparrot_training - Step 17376: {'lr': 0.000383735580801179, 'samples': 3336384, 'steps': 17376, 'loss/train': 1.1777991950511932} 01/27/2022 12:11:50 - INFO - codeparrot_training - Step 17377: {'lr': 0.00038372175605770305, 'samples': 3336576, 'steps': 17377, 'loss/train': 0.5687758773565292} 01/27/2022 12:11:54 - INFO - codeparrot_training - Step 17378: {'lr': 0.00038370793074140545, 'samples': 3336768, 'steps': 17378, 'loss/train': 0.6160747110843658} 01/27/2022 12:11:57 - INFO - codeparrot_training - Step 17379: {'lr': 0.00038369410485234557, 'samples': 3336960, 'steps': 17379, 'loss/train': 0.7546310126781464} 01/27/2022 12:12:01 - INFO - codeparrot_training - Step 17380: {'lr': 0.0003836802783905826, 'samples': 3337152, 'steps': 17380, 'loss/train': 0.4948379248380661} 01/27/2022 12:12:04 - INFO - codeparrot_training - Step 17381: {'lr': 0.0003836664513561758, 'samples': 3337344, 'steps': 17381, 'loss/train': 1.0011994242668152} 01/27/2022 12:12:07 - INFO - codeparrot_training - Step 17382: {'lr': 0.0003836526237491843, 'samples': 3337536, 'steps': 17382, 'loss/train': 0.876231461763382} 01/27/2022 12:12:10 - INFO - codeparrot_training - Step 17383: {'lr': 0.0003836387955696674, 'samples': 3337728, 'steps': 17383, 'loss/train': 
0.9364541172981262} 01/27/2022 12:12:14 - INFO - codeparrot_training - Step 17384: {'lr': 0.00038362496681768434, 'samples': 3337920, 'steps': 17384, 'loss/train': 1.2400301098823547} 01/27/2022 12:12:17 - INFO - codeparrot_training - Step 17385: {'lr': 0.00038361113749329443, 'samples': 3338112, 'steps': 17385, 'loss/train': 1.4865602552890778} 01/27/2022 12:12:20 - INFO - codeparrot_training - Step 17386: {'lr': 0.00038359730759655674, 'samples': 3338304, 'steps': 17386, 'loss/train': 0.679748922586441} 01/27/2022 12:12:23 - INFO - codeparrot_training - Step 17387: {'lr': 0.00038358347712753063, 'samples': 3338496, 'steps': 17387, 'loss/train': 0.8711357116699219} 01/27/2022 12:12:26 - INFO - codeparrot_training - Step 17388: {'lr': 0.0003835696460862753, 'samples': 3338688, 'steps': 17388, 'loss/train': 0.712509423494339} 01/27/2022 12:12:33 - INFO - codeparrot_training - Step 17389: {'lr': 0.00038355581447285005, 'samples': 3338880, 'steps': 17389, 'loss/train': 0.57874895632267} 01/27/2022 12:12:36 - INFO - codeparrot_training - Step 17390: {'lr': 0.00038354198228731414, 'samples': 3339072, 'steps': 17390, 'loss/train': 0.8388234078884125} 01/27/2022 12:12:39 - INFO - codeparrot_training - Step 17391: {'lr': 0.0003835281495297267, 'samples': 3339264, 'steps': 17391, 'loss/train': 0.6834256052970886} 01/27/2022 12:12:43 - INFO - codeparrot_training - Step 17392: {'lr': 0.0003835143162001472, 'samples': 3339456, 'steps': 17392, 'loss/train': 0.38365624845027924} 01/27/2022 12:12:46 - INFO - codeparrot_training - Step 17393: {'lr': 0.0003835004822986346, 'samples': 3339648, 'steps': 17393, 'loss/train': 1.2774766087532043} 01/27/2022 12:12:49 - INFO - codeparrot_training - Step 17394: {'lr': 0.00038348664782524846, 'samples': 3339840, 'steps': 17394, 'loss/train': 0.8322208821773529} 01/27/2022 12:12:52 - INFO - codeparrot_training - Step 17395: {'lr': 0.00038347281278004774, 'samples': 3340032, 'steps': 17395, 'loss/train': 0.9303825795650482} 01/27/2022 12:12:55 - INFO - codeparrot_training - Step 17396: {'lr': 0.0003834589771630921, 'samples': 3340224, 'steps': 17396, 'loss/train': 0.9360744953155518} 01/27/2022 12:13:00 - INFO - codeparrot_training - Step 17397: {'lr': 0.0003834451409744404, 'samples': 3340416, 'steps': 17397, 'loss/train': 0.9696314334869385} 01/27/2022 12:13:03 - INFO - codeparrot_training - Step 17398: {'lr': 0.0003834313042141522, 'samples': 3340608, 'steps': 17398, 'loss/train': 0.8427042067050934} 01/27/2022 12:13:06 - INFO - codeparrot_training - Step 17399: {'lr': 0.0003834174668822865, 'samples': 3340800, 'steps': 17399, 'loss/train': 0.32703063637018204} 01/27/2022 12:13:09 - INFO - codeparrot_training - Step 17400: {'lr': 0.0003834036289789029, 'samples': 3340992, 'steps': 17400, 'loss/train': 0.49507784843444824} 01/27/2022 12:13:12 - INFO - codeparrot_training - Step 17401: {'lr': 0.0003833897905040604, 'samples': 3341184, 'steps': 17401, 'loss/train': 0.8647011816501617} 01/27/2022 12:13:15 - INFO - codeparrot_training - Step 17402: {'lr': 0.00038337595145781844, 'samples': 3341376, 'steps': 17402, 'loss/train': 0.5454166531562805} 01/27/2022 12:13:19 - INFO - codeparrot_training - Step 17403: {'lr': 0.00038336211184023634, 'samples': 3341568, 'steps': 17403, 'loss/train': 0.7052793502807617} 01/27/2022 12:13:22 - INFO - codeparrot_training - Step 17404: {'lr': 0.0003833482716513732, 'samples': 3341760, 'steps': 17404, 'loss/train': 1.0739258229732513} 01/27/2022 12:13:25 - INFO - codeparrot_training - Step 17405: {'lr': 0.0003833344308912885, 
'samples': 3341952, 'steps': 17405, 'loss/train': 1.2584266662597656} 01/27/2022 12:13:31 - INFO - codeparrot_training - Step 17406: {'lr': 0.00038332058956004134, 'samples': 3342144, 'steps': 17406, 'loss/train': 0.7941047251224518} 01/27/2022 12:13:34 - INFO - codeparrot_training - Step 17407: {'lr': 0.0003833067476576911, 'samples': 3342336, 'steps': 17407, 'loss/train': 0.9457806944847107} 01/27/2022 12:13:37 - INFO - codeparrot_training - Step 17408: {'lr': 0.0003832929051842972, 'samples': 3342528, 'steps': 17408, 'loss/train': 0.9787053465843201} 01/27/2022 12:13:40 - INFO - codeparrot_training - Step 17409: {'lr': 0.0003832790621399187, 'samples': 3342720, 'steps': 17409, 'loss/train': 0.7154784053564072} 01/27/2022 12:13:44 - INFO - codeparrot_training - Step 17410: {'lr': 0.00038326521852461505, 'samples': 3342912, 'steps': 17410, 'loss/train': 0.6060643941164017} 01/27/2022 12:13:47 - INFO - codeparrot_training - Step 17411: {'lr': 0.0003832513743384456, 'samples': 3343104, 'steps': 17411, 'loss/train': 1.202091783285141} 01/27/2022 12:13:50 - INFO - codeparrot_training - Step 17412: {'lr': 0.0003832375295814695, 'samples': 3343296, 'steps': 17412, 'loss/train': 0.889755517244339} 01/27/2022 12:13:53 - INFO - codeparrot_training - Step 17413: {'lr': 0.0003832236842537461, 'samples': 3343488, 'steps': 17413, 'loss/train': 0.6376149952411652} 01/27/2022 12:13:56 - INFO - codeparrot_training - Step 17414: {'lr': 0.0003832098383553347, 'samples': 3343680, 'steps': 17414, 'loss/train': 0.6875883489847183} 01/27/2022 12:14:01 - INFO - codeparrot_training - Step 17415: {'lr': 0.00038319599188629485, 'samples': 3343872, 'steps': 17415, 'loss/train': 0.9991341233253479} 01/27/2022 12:14:04 - INFO - codeparrot_training - Step 17416: {'lr': 0.00038318214484668557, 'samples': 3344064, 'steps': 17416, 'loss/train': 0.9859030544757843} 01/27/2022 12:14:07 - INFO - codeparrot_training - Step 17417: {'lr': 0.0003831682972365662, 'samples': 3344256, 'steps': 17417, 'loss/train': 1.0002467036247253} 01/27/2022 12:14:10 - INFO - codeparrot_training - Step 17418: {'lr': 0.0003831544490559962, 'samples': 3344448, 'steps': 17418, 'loss/train': 0.6348497718572617} 01/27/2022 12:14:13 - INFO - codeparrot_training - Step 17419: {'lr': 0.00038314060030503476, 'samples': 3344640, 'steps': 17419, 'loss/train': 0.6488919407129288} 01/27/2022 12:14:16 - INFO - codeparrot_training - Step 17420: {'lr': 0.00038312675098374136, 'samples': 3344832, 'steps': 17420, 'loss/train': 0.7642919719219208} 01/27/2022 12:14:19 - INFO - codeparrot_training - Step 17421: {'lr': 0.0003831129010921751, 'samples': 3345024, 'steps': 17421, 'loss/train': 0.3899569809436798} 01/27/2022 12:14:23 - INFO - codeparrot_training - Step 17422: {'lr': 0.0003830990506303956, 'samples': 3345216, 'steps': 17422, 'loss/train': 0.7257312387228012} 01/27/2022 12:14:27 - INFO - codeparrot_training - Step 17423: {'lr': 0.0003830851995984619, 'samples': 3345408, 'steps': 17423, 'loss/train': 0.932055652141571} 01/27/2022 12:14:30 - INFO - codeparrot_training - Step 17424: {'lr': 0.0003830713479964335, 'samples': 3345600, 'steps': 17424, 'loss/train': 0.968845009803772} 01/27/2022 12:14:33 - INFO - codeparrot_training - Step 17425: {'lr': 0.0003830574958243697, 'samples': 3345792, 'steps': 17425, 'loss/train': 0.5606117695569992} 01/27/2022 12:14:37 - INFO - codeparrot_training - Step 17426: {'lr': 0.00038304364308232986, 'samples': 3345984, 'steps': 17426, 'loss/train': 0.6127442568540573} 01/27/2022 12:14:40 - INFO - codeparrot_training - Step 
17427: {'lr': 0.0003830297897703733, 'samples': 3346176, 'steps': 17427, 'loss/train': 0.9435603618621826} 01/27/2022 12:14:43 - INFO - codeparrot_training - Step 17428: {'lr': 0.0003830159358885593, 'samples': 3346368, 'steps': 17428, 'loss/train': 0.7075652182102203} 01/27/2022 12:14:46 - INFO - codeparrot_training - Step 17429: {'lr': 0.00038300208143694737, 'samples': 3346560, 'steps': 17429, 'loss/train': 0.33063722401857376} 01/27/2022 12:14:49 - INFO - codeparrot_training - Step 17430: {'lr': 0.00038298822641559673, 'samples': 3346752, 'steps': 17430, 'loss/train': 0.7688887417316437} 01/27/2022 12:14:52 - INFO - codeparrot_training - Step 17431: {'lr': 0.0003829743708245667, 'samples': 3346944, 'steps': 17431, 'loss/train': 0.7742585241794586} 01/27/2022 12:14:57 - INFO - codeparrot_training - Step 17432: {'lr': 0.0003829605146639167, 'samples': 3347136, 'steps': 17432, 'loss/train': 0.7253496944904327} 01/27/2022 12:15:00 - INFO - codeparrot_training - Step 17433: {'lr': 0.0003829466579337061, 'samples': 3347328, 'steps': 17433, 'loss/train': 0.8708010613918304} 01/27/2022 12:15:03 - INFO - codeparrot_training - Step 17434: {'lr': 0.00038293280063399427, 'samples': 3347520, 'steps': 17434, 'loss/train': 0.8003986179828644} 01/27/2022 12:15:06 - INFO - codeparrot_training - Step 17435: {'lr': 0.00038291894276484053, 'samples': 3347712, 'steps': 17435, 'loss/train': 0.7516883611679077} 01/27/2022 12:15:09 - INFO - codeparrot_training - Step 17436: {'lr': 0.0003829050843263041, 'samples': 3347904, 'steps': 17436, 'loss/train': 0.8640706837177277} 01/27/2022 12:15:13 - INFO - codeparrot_training - Step 17437: {'lr': 0.0003828912253184446, 'samples': 3348096, 'steps': 17437, 'loss/train': 0.4093736708164215} 01/27/2022 12:15:16 - INFO - codeparrot_training - Step 17438: {'lr': 0.0003828773657413213, 'samples': 3348288, 'steps': 17438, 'loss/train': 0.8513549566268921} 01/27/2022 12:15:19 - INFO - codeparrot_training - Step 17439: {'lr': 0.0003828635055949935, 'samples': 3348480, 'steps': 17439, 'loss/train': 1.0270476043224335} 01/27/2022 12:15:22 - INFO - codeparrot_training - Step 17440: {'lr': 0.0003828496448795207, 'samples': 3348672, 'steps': 17440, 'loss/train': 0.7145764231681824} 01/27/2022 12:15:28 - INFO - codeparrot_training - Step 17441: {'lr': 0.0003828357835949622, 'samples': 3348864, 'steps': 17441, 'loss/train': 0.48617054522037506} 01/27/2022 12:15:31 - INFO - codeparrot_training - Step 17442: {'lr': 0.00038282192174137744, 'samples': 3349056, 'steps': 17442, 'loss/train': 1.292456030845642} 01/27/2022 12:15:35 - INFO - codeparrot_training - Step 17443: {'lr': 0.00038280805931882557, 'samples': 3349248, 'steps': 17443, 'loss/train': 0.8344650864601135} 01/27/2022 12:15:38 - INFO - codeparrot_training - Step 17444: {'lr': 0.0003827941963273663, 'samples': 3349440, 'steps': 17444, 'loss/train': 0.596059575676918} 01/27/2022 12:15:41 - INFO - codeparrot_training - Step 17445: {'lr': 0.00038278033276705875, 'samples': 3349632, 'steps': 17445, 'loss/train': 0.6150305271148682} 01/27/2022 12:15:44 - INFO - codeparrot_training - Step 17446: {'lr': 0.0003827664686379625, 'samples': 3349824, 'steps': 17446, 'loss/train': 0.24077432602643967} 01/27/2022 12:15:47 - INFO - codeparrot_training - Step 17447: {'lr': 0.00038275260394013676, 'samples': 3350016, 'steps': 17447, 'loss/train': 0.5864391177892685} 01/27/2022 12:15:50 - INFO - codeparrot_training - Step 17448: {'lr': 0.0003827387386736411, 'samples': 3350208, 'steps': 17448, 'loss/train': 0.07031954638659954} 01/27/2022 
12:15:53 - INFO - codeparrot_training - Step 17449: {'lr': 0.0003827248728385349, 'samples': 3350400, 'steps': 17449, 'loss/train': 0.7389795184135437} 01/27/2022 12:15:58 - INFO - codeparrot_training - Step 17450: {'lr': 0.0003827110064348773, 'samples': 3350592, 'steps': 17450, 'loss/train': 1.1510248482227325} 01/27/2022 12:16:01 - INFO - codeparrot_training - Step 17451: {'lr': 0.000382697139462728, 'samples': 3350784, 'steps': 17451, 'loss/train': 0.8760152757167816} 01/27/2022 12:16:04 - INFO - codeparrot_training - Step 17452: {'lr': 0.00038268327192214635, 'samples': 3350976, 'steps': 17452, 'loss/train': 0.7420395165681839} 01/27/2022 12:16:07 - INFO - codeparrot_training - Step 17453: {'lr': 0.0003826694038131916, 'samples': 3351168, 'steps': 17453, 'loss/train': 0.7792365252971649} 01/27/2022 12:16:10 - INFO - codeparrot_training - Step 17454: {'lr': 0.00038265553513592334, 'samples': 3351360, 'steps': 17454, 'loss/train': 0.2625616192817688} 01/27/2022 12:16:13 - INFO - codeparrot_training - Step 17455: {'lr': 0.00038264166589040084, 'samples': 3351552, 'steps': 17455, 'loss/train': 0.5943495333194733} 01/27/2022 12:16:17 - INFO - codeparrot_training - Step 17456: {'lr': 0.00038262779607668354, 'samples': 3351744, 'steps': 17456, 'loss/train': 0.800645649433136} 01/27/2022 12:16:20 - INFO - codeparrot_training - Step 17457: {'lr': 0.00038261392569483087, 'samples': 3351936, 'steps': 17457, 'loss/train': 0.7355403453111649} 01/27/2022 12:16:23 - INFO - codeparrot_training - Step 17458: {'lr': 0.0003826000547449023, 'samples': 3352128, 'steps': 17458, 'loss/train': 1.0187968611717224} 01/27/2022 12:16:27 - INFO - codeparrot_training - Step 17459: {'lr': 0.0003825861832269571, 'samples': 3352320, 'steps': 17459, 'loss/train': 1.09365713596344} 01/27/2022 12:16:31 - INFO - codeparrot_training - Step 17460: {'lr': 0.00038257231114105495, 'samples': 3352512, 'steps': 17460, 'loss/train': 0.7036960870027542} 01/27/2022 12:16:34 - INFO - codeparrot_training - Step 17461: {'lr': 0.00038255843848725504, 'samples': 3352704, 'steps': 17461, 'loss/train': 0.8107507824897766} 01/27/2022 12:16:37 - INFO - codeparrot_training - Step 17462: {'lr': 0.0003825445652656169, 'samples': 3352896, 'steps': 17462, 'loss/train': 0.35699623078107834} 01/27/2022 12:16:40 - INFO - codeparrot_training - Step 17463: {'lr': 0.00038253069147619977, 'samples': 3353088, 'steps': 17463, 'loss/train': 0.6396523118019104} 01/27/2022 12:16:43 - INFO - codeparrot_training - Step 17464: {'lr': 0.00038251681711906345, 'samples': 3353280, 'steps': 17464, 'loss/train': 0.85936439037323} 01/27/2022 12:16:46 - INFO - codeparrot_training - Step 17465: {'lr': 0.00038250294219426706, 'samples': 3353472, 'steps': 17465, 'loss/train': 0.860742598772049} 01/27/2022 12:16:49 - INFO - codeparrot_training - Step 17466: {'lr': 0.00038248906670187017, 'samples': 3353664, 'steps': 17466, 'loss/train': 0.18838533014059067} 01/27/2022 12:16:53 - INFO - codeparrot_training - Step 17467: {'lr': 0.00038247519064193216, 'samples': 3353856, 'steps': 17467, 'loss/train': 0.7119137048721313} 01/27/2022 12:16:59 - INFO - codeparrot_training - Step 17468: {'lr': 0.0003824613140145125, 'samples': 3354048, 'steps': 17468, 'loss/train': 0.8265596330165863} 01/27/2022 12:17:02 - INFO - codeparrot_training - Step 17469: {'lr': 0.00038244743681967066, 'samples': 3354240, 'steps': 17469, 'loss/train': 0.8623052537441254} 01/27/2022 12:17:05 - INFO - codeparrot_training - Step 17470: {'lr': 0.000382433559057466, 'samples': 3354432, 'steps': 17470, 
'loss/train': 1.085395485162735} 01/27/2022 12:17:08 - INFO - codeparrot_training - Step 17471: {'lr': 0.00038241968072795805, 'samples': 3354624, 'steps': 17471, 'loss/train': 0.5383385717868805} 01/27/2022 12:17:12 - INFO - codeparrot_training - Step 17472: {'lr': 0.00038240580183120624, 'samples': 3354816, 'steps': 17472, 'loss/train': 0.6566796898841858} 01/27/2022 12:17:15 - INFO - codeparrot_training - Step 17473: {'lr': 0.0003823919223672701, 'samples': 3355008, 'steps': 17473, 'loss/train': 0.8781140148639679} 01/27/2022 12:17:18 - INFO - codeparrot_training - Step 17474: {'lr': 0.00038237804233620887, 'samples': 3355200, 'steps': 17474, 'loss/train': 0.47033657133579254} 01/27/2022 12:17:21 - INFO - codeparrot_training - Step 17475: {'lr': 0.0003823641617380823, 'samples': 3355392, 'steps': 17475, 'loss/train': 0.7462128549814224} 01/27/2022 12:17:26 - INFO - codeparrot_training - Step 17476: {'lr': 0.00038235028057294953, 'samples': 3355584, 'steps': 17476, 'loss/train': 1.4378129839897156} 01/27/2022 12:17:29 - INFO - codeparrot_training - Step 17477: {'lr': 0.0003823363988408703, 'samples': 3355776, 'steps': 17477, 'loss/train': 1.2519384026527405} 01/27/2022 12:17:32 - INFO - codeparrot_training - Step 17478: {'lr': 0.00038232251654190386, 'samples': 3355968, 'steps': 17478, 'loss/train': 0.7136898636817932} 01/27/2022 12:17:35 - INFO - codeparrot_training - Step 17479: {'lr': 0.0003823086336761099, 'samples': 3356160, 'steps': 17479, 'loss/train': 1.0080040991306305} 01/27/2022 12:17:38 - INFO - codeparrot_training - Step 17480: {'lr': 0.00038229475024354766, 'samples': 3356352, 'steps': 17480, 'loss/train': 0.7654073238372803} 01/27/2022 12:17:42 - INFO - codeparrot_training - Step 17481: {'lr': 0.00038228086624427675, 'samples': 3356544, 'steps': 17481, 'loss/train': 1.0931824743747711} 01/27/2022 12:17:45 - INFO - codeparrot_training - Step 17482: {'lr': 0.0003822669816783566, 'samples': 3356736, 'steps': 17482, 'loss/train': 0.8471819758415222} 01/27/2022 12:17:48 - INFO - codeparrot_training - Step 17483: {'lr': 0.0003822530965458467, 'samples': 3356928, 'steps': 17483, 'loss/train': 0.9397275745868683} 01/27/2022 12:17:51 - INFO - codeparrot_training - Step 17484: {'lr': 0.0003822392108468066, 'samples': 3357120, 'steps': 17484, 'loss/train': 0.6372285783290863} 01/27/2022 12:17:56 - INFO - codeparrot_training - Step 17485: {'lr': 0.00038222532458129563, 'samples': 3357312, 'steps': 17485, 'loss/train': 1.007326751947403} 01/27/2022 12:17:59 - INFO - codeparrot_training - Step 17486: {'lr': 0.0003822114377493734, 'samples': 3357504, 'steps': 17486, 'loss/train': 0.0823157038539648} 01/27/2022 12:18:02 - INFO - codeparrot_training - Step 17487: {'lr': 0.0003821975503510993, 'samples': 3357696, 'steps': 17487, 'loss/train': 0.8822834193706512} 01/27/2022 12:18:05 - INFO - codeparrot_training - Step 17488: {'lr': 0.0003821836623865329, 'samples': 3357888, 'steps': 17488, 'loss/train': 0.6713958531618118} 01/27/2022 12:18:08 - INFO - codeparrot_training - Step 17489: {'lr': 0.0003821697738557337, 'samples': 3358080, 'steps': 17489, 'loss/train': 3.5097720623016357} 01/27/2022 12:18:11 - INFO - codeparrot_training - Step 17490: {'lr': 0.00038215588475876117, 'samples': 3358272, 'steps': 17490, 'loss/train': 0.7954971492290497} 01/27/2022 12:18:14 - INFO - codeparrot_training - Step 17491: {'lr': 0.0003821419950956747, 'samples': 3358464, 'steps': 17491, 'loss/train': 1.2565458118915558} 01/27/2022 12:18:17 - INFO - codeparrot_training - Step 17492: {'lr': 
0.00038212810486653394, 'samples': 3358656, 'steps': 17492, 'loss/train': 1.0412573218345642} 01/27/2022 12:18:21 - INFO - codeparrot_training - Step 17493: {'lr': 0.0003821142140713983, 'samples': 3358848, 'steps': 17493, 'loss/train': 0.6795124858617783} 01/27/2022 12:18:27 - INFO - codeparrot_training - Step 17494: {'lr': 0.0003821003227103274, 'samples': 3359040, 'steps': 17494, 'loss/train': 1.0772932469844818} 01/27/2022 12:18:30 - INFO - codeparrot_training - Step 17495: {'lr': 0.00038208643078338055, 'samples': 3359232, 'steps': 17495, 'loss/train': 0.9373748302459717} 01/27/2022 12:18:33 - INFO - codeparrot_training - Step 17496: {'lr': 0.0003820725382906175, 'samples': 3359424, 'steps': 17496, 'loss/train': 0.7637994289398193} 01/27/2022 12:18:37 - INFO - codeparrot_training - Step 17497: {'lr': 0.0003820586452320975, 'samples': 3359616, 'steps': 17497, 'loss/train': 0.0925069022923708} 01/27/2022 12:18:40 - INFO - codeparrot_training - Step 17498: {'lr': 0.0003820447516078803, 'samples': 3359808, 'steps': 17498, 'loss/train': 0.43254344165325165} 01/27/2022 12:18:43 - INFO - codeparrot_training - Step 17499: {'lr': 0.0003820308574180253, 'samples': 3360000, 'steps': 17499, 'loss/train': 0.8230116963386536} 01/27/2022 12:18:46 - INFO - codeparrot_training - Step 17500: {'lr': 0.000382016962662592, 'samples': 3360192, 'steps': 17500, 'loss/train': 0.4130830317735672} 01/27/2022 12:18:49 - INFO - codeparrot_training - Step 17501: {'lr': 0.0003820030673416399, 'samples': 3360384, 'steps': 17501, 'loss/train': 0.45884111523628235} 01/27/2022 12:18:52 - INFO - codeparrot_training - Step 17502: {'lr': 0.0003819891714552287, 'samples': 3360576, 'steps': 17502, 'loss/train': 0.5004651993513107} 01/27/2022 12:18:57 - INFO - codeparrot_training - Step 17503: {'lr': 0.00038197527500341777, 'samples': 3360768, 'steps': 17503, 'loss/train': 0.9714834988117218} 01/27/2022 12:19:00 - INFO - codeparrot_training - Step 17504: {'lr': 0.00038196137798626663, 'samples': 3360960, 'steps': 17504, 'loss/train': 0.8076061606407166} 01/27/2022 12:19:03 - INFO - codeparrot_training - Step 17505: {'lr': 0.00038194748040383487, 'samples': 3361152, 'steps': 17505, 'loss/train': 1.1978643536567688} 01/27/2022 12:19:06 - INFO - codeparrot_training - Step 17506: {'lr': 0.00038193358225618195, 'samples': 3361344, 'steps': 17506, 'loss/train': 0.8706175088882446} 01/27/2022 12:19:10 - INFO - codeparrot_training - Step 17507: {'lr': 0.0003819196835433675, 'samples': 3361536, 'steps': 17507, 'loss/train': 0.5156856179237366} 01/27/2022 12:19:13 - INFO - codeparrot_training - Step 17508: {'lr': 0.000381905784265451, 'samples': 3361728, 'steps': 17508, 'loss/train': 0.6851937621831894} 01/27/2022 12:19:16 - INFO - codeparrot_training - Step 17509: {'lr': 0.000381891884422492, 'samples': 3361920, 'steps': 17509, 'loss/train': 0.5026418119668961} 01/27/2022 12:19:19 - INFO - codeparrot_training - Step 17510: {'lr': 0.0003818779840145501, 'samples': 3362112, 'steps': 17510, 'loss/train': 1.190850019454956} 01/27/2022 12:19:25 - INFO - codeparrot_training - Step 17511: {'lr': 0.00038186408304168474, 'samples': 3362304, 'steps': 17511, 'loss/train': 0.4246915429830551} 01/27/2022 12:19:28 - INFO - codeparrot_training - Step 17512: {'lr': 0.00038185018150395557, 'samples': 3362496, 'steps': 17512, 'loss/train': 0.46924401819705963} 01/27/2022 12:19:32 - INFO - codeparrot_training - Step 17513: {'lr': 0.000381836279401422, 'samples': 3362688, 'steps': 17513, 'loss/train': 0.9070177674293518} 01/27/2022 12:19:35 - INFO - 
codeparrot_training - Step 17514: {'lr': 0.00038182237673414375, 'samples': 3362880, 'steps': 17514, 'loss/train': 0.9267464876174927} 01/27/2022 12:19:38 - INFO - codeparrot_training - Step 17515: {'lr': 0.0003818084735021803, 'samples': 3363072, 'steps': 17515, 'loss/train': 0.6759776920080185} 01/27/2022 12:19:41 - INFO - codeparrot_training - Step 17516: {'lr': 0.00038179456970559116, 'samples': 3363264, 'steps': 17516, 'loss/train': 1.1087413430213928} 01/27/2022 12:19:44 - INFO - codeparrot_training - Step 17517: {'lr': 0.00038178066534443587, 'samples': 3363456, 'steps': 17517, 'loss/train': 0.7099573016166687} 01/27/2022 12:19:47 - INFO - codeparrot_training - Step 17518: {'lr': 0.00038176676041877424, 'samples': 3363648, 'steps': 17518, 'loss/train': 0.48759596049785614} 01/27/2022 12:19:50 - INFO - codeparrot_training - Step 17519: {'lr': 0.0003817528549286655, 'samples': 3363840, 'steps': 17519, 'loss/train': 0.6598745137453079} 01/27/2022 12:19:55 - INFO - codeparrot_training - Step 17520: {'lr': 0.00038173894887416946, 'samples': 3364032, 'steps': 17520, 'loss/train': 0.7835880517959595} 01/27/2022 12:19:58 - INFO - codeparrot_training - Step 17521: {'lr': 0.0003817250422553455, 'samples': 3364224, 'steps': 17521, 'loss/train': 0.9134827852249146} 01/27/2022 12:20:01 - INFO - codeparrot_training - Step 17522: {'lr': 0.0003817111350722533, 'samples': 3364416, 'steps': 17522, 'loss/train': 0.5483958721160889} 01/27/2022 12:20:05 - INFO - codeparrot_training - Step 17523: {'lr': 0.0003816972273249525, 'samples': 3364608, 'steps': 17523, 'loss/train': 0.6918669193983078} 01/27/2022 12:20:08 - INFO - codeparrot_training - Step 17524: {'lr': 0.00038168331901350253, 'samples': 3364800, 'steps': 17524, 'loss/train': 0.8949649035930634} 01/27/2022 12:20:11 - INFO - codeparrot_training - Step 17525: {'lr': 0.0003816694101379631, 'samples': 3364992, 'steps': 17525, 'loss/train': 0.6667816489934921} 01/27/2022 12:20:14 - INFO - codeparrot_training - Step 17526: {'lr': 0.0003816555006983936, 'samples': 3365184, 'steps': 17526, 'loss/train': 0.8910436928272247} 01/27/2022 12:20:17 - INFO - codeparrot_training - Step 17527: {'lr': 0.0003816415906948538, 'samples': 3365376, 'steps': 17527, 'loss/train': 0.7835187613964081} 01/27/2022 12:20:20 - INFO - codeparrot_training - Step 17528: {'lr': 0.00038162768012740323, 'samples': 3365568, 'steps': 17528, 'loss/train': 0.6811739355325699} 01/27/2022 12:20:25 - INFO - codeparrot_training - Step 17529: {'lr': 0.00038161376899610154, 'samples': 3365760, 'steps': 17529, 'loss/train': 1.2958041429519653} 01/27/2022 12:20:28 - INFO - codeparrot_training - Step 17530: {'lr': 0.0003815998573010082, 'samples': 3365952, 'steps': 17530, 'loss/train': 0.6818699240684509} 01/27/2022 12:20:31 - INFO - codeparrot_training - Step 17531: {'lr': 0.0003815859450421829, 'samples': 3366144, 'steps': 17531, 'loss/train': 0.8118098080158234} 01/27/2022 12:20:34 - INFO - codeparrot_training - Step 17532: {'lr': 0.00038157203221968514, 'samples': 3366336, 'steps': 17532, 'loss/train': 0.5982089638710022} 01/27/2022 12:20:37 - INFO - codeparrot_training - Step 17533: {'lr': 0.00038155811883357454, 'samples': 3366528, 'steps': 17533, 'loss/train': 0.7265773862600327} 01/27/2022 12:20:40 - INFO - codeparrot_training - Step 17534: {'lr': 0.0003815442048839108, 'samples': 3366720, 'steps': 17534, 'loss/train': 1.0042346119880676} 01/27/2022 12:20:44 - INFO - codeparrot_training - Step 17535: {'lr': 0.0003815302903707534, 'samples': 3366912, 'steps': 17535, 'loss/train': 
0.8042386472225189} 01/27/2022 12:20:47 - INFO - codeparrot_training - Step 17536: {'lr': 0.0003815163752941621, 'samples': 3367104, 'steps': 17536, 'loss/train': 0.7953372895717621} 01/27/2022 12:20:50 - INFO - codeparrot_training - Step 17537: {'lr': 0.00038150245965419636, 'samples': 3367296, 'steps': 17537, 'loss/train': 0.6973494440317154} 01/27/2022 12:20:54 - INFO - codeparrot_training - Step 17538: {'lr': 0.0003814885434509158, 'samples': 3367488, 'steps': 17538, 'loss/train': 0.801744818687439} 01/27/2022 12:20:58 - INFO - codeparrot_training - Step 17539: {'lr': 0.0003814746266843801, 'samples': 3367680, 'steps': 17539, 'loss/train': 1.0285227298736572} 01/27/2022 12:21:01 - INFO - codeparrot_training - Step 17540: {'lr': 0.0003814607093546489, 'samples': 3367872, 'steps': 17540, 'loss/train': 0.6992193460464478} 01/27/2022 12:21:04 - INFO - codeparrot_training - Step 17541: {'lr': 0.00038144679146178166, 'samples': 3368064, 'steps': 17541, 'loss/train': 0.9193615615367889} 01/27/2022 12:21:07 - INFO - codeparrot_training - Step 17542: {'lr': 0.00038143287300583816, 'samples': 3368256, 'steps': 17542, 'loss/train': 0.8259209990501404} 01/27/2022 12:21:10 - INFO - codeparrot_training - Step 17543: {'lr': 0.00038141895398687806, 'samples': 3368448, 'steps': 17543, 'loss/train': 0.5555993020534515} 01/27/2022 12:21:13 - INFO - codeparrot_training - Step 17544: {'lr': 0.0003814050344049608, 'samples': 3368640, 'steps': 17544, 'loss/train': 1.0293809473514557} 01/27/2022 12:21:16 - INFO - codeparrot_training - Step 17545: {'lr': 0.00038139111426014607, 'samples': 3368832, 'steps': 17545, 'loss/train': 1.2555532157421112} 01/27/2022 12:21:20 - INFO - codeparrot_training - Step 17546: {'lr': 0.00038137719355249355, 'samples': 3369024, 'steps': 17546, 'loss/train': 0.9436326026916504} 01/27/2022 12:21:26 - INFO - codeparrot_training - Step 17547: {'lr': 0.00038136327228206285, 'samples': 3369216, 'steps': 17547, 'loss/train': 0.6809262335300446} 01/27/2022 12:21:29 - INFO - codeparrot_training - Step 17548: {'lr': 0.0003813493504489136, 'samples': 3369408, 'steps': 17548, 'loss/train': 0.9901327192783356} 01/27/2022 12:21:32 - INFO - codeparrot_training - Step 17549: {'lr': 0.0003813354280531055, 'samples': 3369600, 'steps': 17549, 'loss/train': 0.6241798549890518} 01/27/2022 12:21:35 - INFO - codeparrot_training - Step 17550: {'lr': 0.00038132150509469806, 'samples': 3369792, 'steps': 17550, 'loss/train': 0.7401200383901596} 01/27/2022 12:21:39 - INFO - codeparrot_training - Step 17551: {'lr': 0.000381307581573751, 'samples': 3369984, 'steps': 17551, 'loss/train': 0.8443306088447571} 01/27/2022 12:21:42 - INFO - codeparrot_training - Step 17552: {'lr': 0.00038129365749032395, 'samples': 3370176, 'steps': 17552, 'loss/train': 0.697687491774559} 01/27/2022 12:21:45 - INFO - codeparrot_training - Step 17553: {'lr': 0.0003812797328444766, 'samples': 3370368, 'steps': 17553, 'loss/train': 0.9391675293445587} 01/27/2022 12:21:48 - INFO - codeparrot_training - Step 17554: {'lr': 0.0003812658076362685, 'samples': 3370560, 'steps': 17554, 'loss/train': 0.3838339000940323} 01/27/2022 12:21:52 - INFO - codeparrot_training - Step 17555: {'lr': 0.00038125188186575944, 'samples': 3370752, 'steps': 17555, 'loss/train': 0.4677797108888626} 01/27/2022 12:21:55 - INFO - codeparrot_training - Step 17556: {'lr': 0.00038123795553300893, 'samples': 3370944, 'steps': 17556, 'loss/train': 0.8074381649494171} 01/27/2022 12:21:59 - INFO - codeparrot_training - Step 17557: {'lr': 0.0003812240286380767, 'samples': 
3371136, 'steps': 17557, 'loss/train': 0.8397026360034943} 01/27/2022 12:22:02 - INFO - codeparrot_training - Step 17558: {'lr': 0.0003812101011810224, 'samples': 3371328, 'steps': 17558, 'loss/train': 0.6931201368570328} 01/27/2022 12:22:05 - INFO - codeparrot_training - Step 17559: {'lr': 0.0003811961731619057, 'samples': 3371520, 'steps': 17559, 'loss/train': 0.759276419878006} 01/27/2022 12:22:08 - INFO - codeparrot_training - Step 17560: {'lr': 0.0003811822445807863, 'samples': 3371712, 'steps': 17560, 'loss/train': 0.9370661973953247} 01/27/2022 12:22:11 - INFO - codeparrot_training - Step 17561: {'lr': 0.00038116831543772377, 'samples': 3371904, 'steps': 17561, 'loss/train': 0.4099772125482559} 01/27/2022 12:22:14 - INFO - codeparrot_training - Step 17562: {'lr': 0.00038115438573277784, 'samples': 3372096, 'steps': 17562, 'loss/train': 0.9384652376174927} 01/27/2022 12:22:18 - INFO - codeparrot_training - Step 17563: {'lr': 0.0003811404554660082, 'samples': 3372288, 'steps': 17563, 'loss/train': 0.5200569480657578} 01/27/2022 12:22:22 - INFO - codeparrot_training - Step 17564: {'lr': 0.00038112652463747444, 'samples': 3372480, 'steps': 17564, 'loss/train': 0.8636096119880676} 01/27/2022 12:22:25 - INFO - codeparrot_training - Step 17565: {'lr': 0.00038111259324723624, 'samples': 3372672, 'steps': 17565, 'loss/train': 0.5550636649131775} 01/27/2022 12:22:28 - INFO - codeparrot_training - Step 17566: {'lr': 0.0003810986612953534, 'samples': 3372864, 'steps': 17566, 'loss/train': 0.6710767149925232} 01/27/2022 12:22:31 - INFO - codeparrot_training - Step 17567: {'lr': 0.0003810847287818855, 'samples': 3373056, 'steps': 17567, 'loss/train': 0.7653064727783203} 01/27/2022 12:22:35 - INFO - codeparrot_training - Step 17568: {'lr': 0.0003810707957068923, 'samples': 3373248, 'steps': 17568, 'loss/train': 0.6420416235923767} 01/27/2022 12:22:38 - INFO - codeparrot_training - Step 17569: {'lr': 0.0003810568620704334, 'samples': 3373440, 'steps': 17569, 'loss/train': 0.7790663838386536} 01/27/2022 12:22:41 - INFO - codeparrot_training - Step 17570: {'lr': 0.00038104292787256844, 'samples': 3373632, 'steps': 17570, 'loss/train': 1.198676437139511} 01/27/2022 12:22:44 - INFO - codeparrot_training - Step 17571: {'lr': 0.0003810289931133573, 'samples': 3373824, 'steps': 17571, 'loss/train': 0.3762316256761551} 01/27/2022 12:22:47 - INFO - codeparrot_training - Step 17572: {'lr': 0.0003810150577928595, 'samples': 3374016, 'steps': 17572, 'loss/train': 1.0040989816188812} 01/27/2022 12:22:53 - INFO - codeparrot_training - Step 17573: {'lr': 0.0003810011219111348, 'samples': 3374208, 'steps': 17573, 'loss/train': 1.1294004321098328} 01/27/2022 12:22:56 - INFO - codeparrot_training - Step 17574: {'lr': 0.00038098718546824287, 'samples': 3374400, 'steps': 17574, 'loss/train': 0.9147258102893829} 01/27/2022 12:22:59 - INFO - codeparrot_training - Step 17575: {'lr': 0.00038097324846424354, 'samples': 3374592, 'steps': 17575, 'loss/train': 0.398577556014061} 01/27/2022 12:23:03 - INFO - codeparrot_training - Step 17576: {'lr': 0.0003809593108991962, 'samples': 3374784, 'steps': 17576, 'loss/train': 0.9043717682361603} 01/27/2022 12:23:06 - INFO - codeparrot_training - Step 17577: {'lr': 0.0003809453727731609, 'samples': 3374976, 'steps': 17577, 'loss/train': 0.7413520663976669} 01/27/2022 12:23:09 - INFO - codeparrot_training - Step 17578: {'lr': 0.00038093143408619726, 'samples': 3375168, 'steps': 17578, 'loss/train': 0.9380393922328949} 01/27/2022 12:23:12 - INFO - codeparrot_training - Step 17579: 
{'lr': 0.0003809174948383648, 'samples': 3375360, 'steps': 17579, 'loss/train': 0.48193618655204773} 01/27/2022 12:23:15 - INFO - codeparrot_training - Step 17580: {'lr': 0.0003809035550297234, 'samples': 3375552, 'steps': 17580, 'loss/train': 0.7141970247030258} 01/27/2022 12:23:18 - INFO - codeparrot_training - Step 17581: {'lr': 0.00038088961466033276, 'samples': 3375744, 'steps': 17581, 'loss/train': 1.3027331829071045} 01/27/2022 12:23:23 - INFO - codeparrot_training - Step 17582: {'lr': 0.00038087567373025255, 'samples': 3375936, 'steps': 17582, 'loss/train': 0.8489871919155121} 01/27/2022 12:23:26 - INFO - codeparrot_training - Step 17583: {'lr': 0.0003808617322395425, 'samples': 3376128, 'steps': 17583, 'loss/train': 0.599931076169014} 01/27/2022 12:23:29 - INFO - codeparrot_training - Step 17584: {'lr': 0.00038084779018826245, 'samples': 3376320, 'steps': 17584, 'loss/train': 0.9065417647361755} 01/27/2022 12:23:32 - INFO - codeparrot_training - Step 17585: {'lr': 0.00038083384757647186, 'samples': 3376512, 'steps': 17585, 'loss/train': 0.9769366085529327} 01/27/2022 12:23:36 - INFO - codeparrot_training - Step 17586: {'lr': 0.0003808199044042308, 'samples': 3376704, 'steps': 17586, 'loss/train': 0.7999990582466125} 01/27/2022 12:23:39 - INFO - codeparrot_training - Step 17587: {'lr': 0.00038080596067159865, 'samples': 3376896, 'steps': 17587, 'loss/train': 0.5833088010549545} 01/27/2022 12:23:42 - INFO - codeparrot_training - Step 17588: {'lr': 0.0003807920163786353, 'samples': 3377088, 'steps': 17588, 'loss/train': 1.1053553223609924} 01/27/2022 12:23:45 - INFO - codeparrot_training - Step 17589: {'lr': 0.0003807780715254006, 'samples': 3377280, 'steps': 17589, 'loss/train': 1.0071618854999542} 01/27/2022 12:23:52 - INFO - codeparrot_training - Step 17590: {'lr': 0.000380764126111954, 'samples': 3377472, 'steps': 17590, 'loss/train': 0.7911767363548279} 01/27/2022 12:23:55 - INFO - codeparrot_training - Step 17591: {'lr': 0.0003807501801383555, 'samples': 3377664, 'steps': 17591, 'loss/train': 0.935079038143158} 01/27/2022 12:23:58 - INFO - codeparrot_training - Step 17592: {'lr': 0.0003807362336046648, 'samples': 3377856, 'steps': 17592, 'loss/train': 1.7963011264801025} 01/27/2022 12:24:01 - INFO - codeparrot_training - Step 17593: {'lr': 0.00038072228651094155, 'samples': 3378048, 'steps': 17593, 'loss/train': 0.6747234910726547} 01/27/2022 12:24:04 - INFO - codeparrot_training - Step 17594: {'lr': 0.0003807083388572455, 'samples': 3378240, 'steps': 17594, 'loss/train': 0.7520534098148346} 01/27/2022 12:24:07 - INFO - codeparrot_training - Step 17595: {'lr': 0.0003806943906436364, 'samples': 3378432, 'steps': 17595, 'loss/train': 0.8757303357124329} 01/27/2022 12:24:10 - INFO - codeparrot_training - Step 17596: {'lr': 0.0003806804418701741, 'samples': 3378624, 'steps': 17596, 'loss/train': 0.901152491569519} 01/27/2022 12:24:14 - INFO - codeparrot_training - Step 17597: {'lr': 0.0003806664925369183, 'samples': 3378816, 'steps': 17597, 'loss/train': 0.6177244484424591} 01/27/2022 12:24:17 - INFO - codeparrot_training - Step 17598: {'lr': 0.0003806525426439287, 'samples': 3379008, 'steps': 17598, 'loss/train': 0.42204125225543976} 01/27/2022 12:24:21 - INFO - codeparrot_training - Step 17599: {'lr': 0.00038063859219126514, 'samples': 3379200, 'steps': 17599, 'loss/train': 0.8274960815906525} 01/27/2022 12:24:24 - INFO - codeparrot_training - Step 17600: {'lr': 0.0003806246411789872, 'samples': 3379392, 'steps': 17600, 'loss/train': 0.6486605554819107} 01/27/2022 12:24:27 - INFO 
- codeparrot_training - Step 17601: {'lr': 0.00038061068960715494, 'samples': 3379584, 'steps': 17601, 'loss/train': 0.8894058465957642} 01/27/2022 12:24:31 - INFO - codeparrot_training - Step 17602: {'lr': 0.00038059673747582783, 'samples': 3379776, 'steps': 17602, 'loss/train': 0.5349358767271042} 01/27/2022 12:24:34 - INFO - codeparrot_training - Step 17603: {'lr': 0.00038058278478506584, 'samples': 3379968, 'steps': 17603, 'loss/train': 0.5744879096746445} 01/27/2022 12:24:37 - INFO - codeparrot_training - Step 17604: {'lr': 0.0003805688315349286, 'samples': 3380160, 'steps': 17604, 'loss/train': 0.4258686304092407} 01/27/2022 12:24:40 - INFO - codeparrot_training - Step 17605: {'lr': 0.00038055487772547603, 'samples': 3380352, 'steps': 17605, 'loss/train': 0.8670195043087006} 01/27/2022 12:24:43 - INFO - codeparrot_training - Step 17606: {'lr': 0.00038054092335676774, 'samples': 3380544, 'steps': 17606, 'loss/train': 0.5635651499032974} 01/27/2022 12:24:46 - INFO - codeparrot_training - Step 17607: {'lr': 0.00038052696842886364, 'samples': 3380736, 'steps': 17607, 'loss/train': 0.7464081197977066} 01/27/2022 12:24:51 - INFO - codeparrot_training - Step 17608: {'lr': 0.0003805130129418235, 'samples': 3380928, 'steps': 17608, 'loss/train': 0.7785082161426544} 01/27/2022 12:24:54 - INFO - codeparrot_training - Step 17609: {'lr': 0.00038049905689570697, 'samples': 3381120, 'steps': 17609, 'loss/train': 0.9789430797100067} 01/27/2022 12:24:57 - INFO - codeparrot_training - Step 17610: {'lr': 0.00038048510029057393, 'samples': 3381312, 'steps': 17610, 'loss/train': 1.1393182575702667} 01/27/2022 12:25:00 - INFO - codeparrot_training - Step 17611: {'lr': 0.00038047114312648414, 'samples': 3381504, 'steps': 17611, 'loss/train': 0.9507156014442444} 01/27/2022 12:25:03 - INFO - codeparrot_training - Step 17612: {'lr': 0.0003804571854034975, 'samples': 3381696, 'steps': 17612, 'loss/train': 0.7753790616989136} 01/27/2022 12:25:06 - INFO - codeparrot_training - Step 17613: {'lr': 0.0003804432271216736, 'samples': 3381888, 'steps': 17613, 'loss/train': 0.6945864111185074} 01/27/2022 12:25:10 - INFO - codeparrot_training - Step 17614: {'lr': 0.0003804292682810724, 'samples': 3382080, 'steps': 17614, 'loss/train': 0.8680836260318756} 01/27/2022 12:25:13 - INFO - codeparrot_training - Step 17615: {'lr': 0.00038041530888175356, 'samples': 3382272, 'steps': 17615, 'loss/train': 0.8822807371616364} 01/27/2022 12:25:16 - INFO - codeparrot_training - Step 17616: {'lr': 0.00038040134892377696, 'samples': 3382464, 'steps': 17616, 'loss/train': 0.7911579608917236} 01/27/2022 12:25:23 - INFO - codeparrot_training - Step 17617: {'lr': 0.00038038738840720244, 'samples': 3382656, 'steps': 17617, 'loss/train': 1.108813852071762} 01/27/2022 12:25:26 - INFO - codeparrot_training - Step 17618: {'lr': 0.0003803734273320897, 'samples': 3382848, 'steps': 17618, 'loss/train': 1.0023580491542816} 01/27/2022 12:25:29 - INFO - codeparrot_training - Step 17619: {'lr': 0.0003803594656984986, 'samples': 3383040, 'steps': 17619, 'loss/train': 0.7388640940189362} 01/27/2022 12:25:32 - INFO - codeparrot_training - Step 17620: {'lr': 0.000380345503506489, 'samples': 3383232, 'steps': 17620, 'loss/train': 0.8199170529842377} 01/27/2022 12:25:35 - INFO - codeparrot_training - Step 17621: {'lr': 0.00038033154075612063, 'samples': 3383424, 'steps': 17621, 'loss/train': 0.9106069207191467} 01/27/2022 12:25:38 - INFO - codeparrot_training - Step 17622: {'lr': 0.00038031757744745327, 'samples': 3383616, 'steps': 17622, 'loss/train': 
1.0614877045154572} 01/27/2022 12:25:42 - INFO - codeparrot_training - Step 17623: {'lr': 0.0003803036135805469, 'samples': 3383808, 'steps': 17623, 'loss/train': 0.706055760383606} 01/27/2022 12:25:45 - INFO - codeparrot_training - Step 17624: {'lr': 0.00038028964915546107, 'samples': 3384000, 'steps': 17624, 'loss/train': 0.8001103699207306} 01/27/2022 12:25:49 - INFO - codeparrot_training - Step 17625: {'lr': 0.00038027568417225586, 'samples': 3384192, 'steps': 17625, 'loss/train': 0.7546382546424866} 01/27/2022 12:25:52 - INFO - codeparrot_training - Step 17626: {'lr': 0.00038026171863099093, 'samples': 3384384, 'steps': 17626, 'loss/train': 0.7697919309139252} 01/27/2022 12:25:56 - INFO - codeparrot_training - Step 17627: {'lr': 0.0003802477525317263, 'samples': 3384576, 'steps': 17627, 'loss/train': 0.9213748276233673} 01/27/2022 12:25:59 - INFO - codeparrot_training - Step 17628: {'lr': 0.00038023378587452144, 'samples': 3384768, 'steps': 17628, 'loss/train': 0.7434128075838089} 01/27/2022 12:26:02 - INFO - codeparrot_training - Step 17629: {'lr': 0.0003802198186594366, 'samples': 3384960, 'steps': 17629, 'loss/train': 0.8375399708747864} 01/27/2022 12:26:05 - INFO - codeparrot_training - Step 17630: {'lr': 0.00038020585088653126, 'samples': 3385152, 'steps': 17630, 'loss/train': 0.7644470930099487} 01/27/2022 12:26:08 - INFO - codeparrot_training - Step 17631: {'lr': 0.00038019188255586546, 'samples': 3385344, 'steps': 17631, 'loss/train': 0.4997047930955887} 01/27/2022 12:26:11 - INFO - codeparrot_training - Step 17632: {'lr': 0.00038017791366749896, 'samples': 3385536, 'steps': 17632, 'loss/train': 0.8453731834888458} 01/27/2022 12:26:14 - INFO - codeparrot_training - Step 17633: {'lr': 0.0003801639442214916, 'samples': 3385728, 'steps': 17633, 'loss/train': 1.596295416355133} 01/27/2022 12:26:21 - INFO - codeparrot_training - Step 17634: {'lr': 0.0003801499742179033, 'samples': 3385920, 'steps': 17634, 'loss/train': 0.7018159478902817} 01/27/2022 12:26:24 - INFO - codeparrot_training - Step 17635: {'lr': 0.0003801360036567938, 'samples': 3386112, 'steps': 17635, 'loss/train': 1.0462090373039246} 01/27/2022 12:26:27 - INFO - codeparrot_training - Step 17636: {'lr': 0.000380122032538223, 'samples': 3386304, 'steps': 17636, 'loss/train': 0.40242840349674225} 01/27/2022 12:26:30 - INFO - codeparrot_training - Step 17637: {'lr': 0.0003801080608622507, 'samples': 3386496, 'steps': 17637, 'loss/train': 0.5320856720209122} 01/27/2022 12:26:33 - INFO - codeparrot_training - Step 17638: {'lr': 0.0003800940886289368, 'samples': 3386688, 'steps': 17638, 'loss/train': 0.7203127294778824} 01/27/2022 12:26:36 - INFO - codeparrot_training - Step 17639: {'lr': 0.0003800801158383411, 'samples': 3386880, 'steps': 17639, 'loss/train': 0.4303123354911804} 01/27/2022 12:26:40 - INFO - codeparrot_training - Step 17640: {'lr': 0.00038006614249052353, 'samples': 3387072, 'steps': 17640, 'loss/train': 0.8024492561817169} 01/27/2022 12:26:43 - INFO - codeparrot_training - Step 17641: {'lr': 0.0003800521685855439, 'samples': 3387264, 'steps': 17641, 'loss/train': 1.227532535791397} 01/27/2022 12:26:46 - INFO - codeparrot_training - Step 17642: {'lr': 0.000380038194123462, 'samples': 3387456, 'steps': 17642, 'loss/train': 0.6158351451158524} 01/27/2022 12:26:50 - INFO - codeparrot_training - Step 17643: {'lr': 0.0003800242191043379, 'samples': 3387648, 'steps': 17643, 'loss/train': 1.0035163164138794} 01/27/2022 12:26:53 - INFO - codeparrot_training - Step 17644: {'lr': 0.00038001024352823123, 'samples': 
3387840, 'steps': 17644, 'loss/train': 2.6012887358665466} 01/27/2022 12:26:56 - INFO - codeparrot_training - Step 17645: {'lr': 0.00037999626739520197, 'samples': 3388032, 'steps': 17645, 'loss/train': 1.234763503074646} 01/27/2022 12:27:00 - INFO - codeparrot_training - Step 17646: {'lr': 0.00037998229070531, 'samples': 3388224, 'steps': 17646, 'loss/train': 1.2936771512031555} 01/27/2022 12:27:03 - INFO - codeparrot_training - Step 17647: {'lr': 0.0003799683134586152, 'samples': 3388416, 'steps': 17647, 'loss/train': 0.7855320274829865} 01/27/2022 12:27:06 - INFO - codeparrot_training - Step 17648: {'lr': 0.0003799543356551773, 'samples': 3388608, 'steps': 17648, 'loss/train': 0.6672200113534927} 01/27/2022 12:27:09 - INFO - codeparrot_training - Step 17649: {'lr': 0.0003799403572950565, 'samples': 3388800, 'steps': 17649, 'loss/train': 0.6547168493270874} 01/27/2022 12:27:12 - INFO - codeparrot_training - Step 17650: {'lr': 0.00037992637837831235, 'samples': 3388992, 'steps': 17650, 'loss/train': 1.073117583990097} 01/27/2022 12:27:17 - INFO - codeparrot_training - Step 17651: {'lr': 0.00037991239890500483, 'samples': 3389184, 'steps': 17651, 'loss/train': 0.43304094672203064} 01/27/2022 12:27:20 - INFO - codeparrot_training - Step 17652: {'lr': 0.00037989841887519385, 'samples': 3389376, 'steps': 17652, 'loss/train': 0.5141791999340057} 01/27/2022 12:27:23 - INFO - codeparrot_training - Step 17653: {'lr': 0.00037988443828893936, 'samples': 3389568, 'steps': 17653, 'loss/train': 0.8803211152553558} 01/27/2022 12:27:26 - INFO - codeparrot_training - Step 17654: {'lr': 0.0003798704571463011, 'samples': 3389760, 'steps': 17654, 'loss/train': 0.9142555296421051} 01/27/2022 12:27:29 - INFO - codeparrot_training - Step 17655: {'lr': 0.00037985647544733903, 'samples': 3389952, 'steps': 17655, 'loss/train': 1.0644122064113617} 01/27/2022 12:27:33 - INFO - codeparrot_training - Step 17656: {'lr': 0.0003798424931921131, 'samples': 3390144, 'steps': 17656, 'loss/train': 0.7376801669597626} 01/27/2022 12:27:36 - INFO - codeparrot_training - Step 17657: {'lr': 0.0003798285103806831, 'samples': 3390336, 'steps': 17657, 'loss/train': 0.8549390137195587} 01/27/2022 12:27:39 - INFO - codeparrot_training - Step 17658: {'lr': 0.0003798145270131091, 'samples': 3390528, 'steps': 17658, 'loss/train': 0.39151203632354736} 01/27/2022 12:27:42 - INFO - codeparrot_training - Step 17659: {'lr': 0.00037980054308945076, 'samples': 3390720, 'steps': 17659, 'loss/train': 1.1263719499111176} 01/27/2022 12:27:48 - INFO - codeparrot_training - Step 17660: {'lr': 0.00037978655860976826, 'samples': 3390912, 'steps': 17660, 'loss/train': 1.106958121061325} 01/27/2022 12:27:51 - INFO - codeparrot_training - Step 17661: {'lr': 0.0003797725735741212, 'samples': 3391104, 'steps': 17661, 'loss/train': 0.5523396581411362} 01/27/2022 12:27:54 - INFO - codeparrot_training - Step 17662: {'lr': 0.0003797585879825698, 'samples': 3391296, 'steps': 17662, 'loss/train': 0.8923992812633514} 01/27/2022 12:27:57 - INFO - codeparrot_training - Step 17663: {'lr': 0.00037974460183517366, 'samples': 3391488, 'steps': 17663, 'loss/train': 0.6943888664245605} 01/27/2022 12:28:00 - INFO - codeparrot_training - Step 17664: {'lr': 0.0003797306151319929, 'samples': 3391680, 'steps': 17664, 'loss/train': 0.6808324903249741} 01/27/2022 12:28:03 - INFO - codeparrot_training - Step 17665: {'lr': 0.00037971662787308734, 'samples': 3391872, 'steps': 17665, 'loss/train': 1.6969846487045288} 01/27/2022 12:28:07 - INFO - codeparrot_training - Step 17666: 
{'lr': 0.00037970264005851703, 'samples': 3392064, 'steps': 17666, 'loss/train': 1.025417000055313} 01/27/2022 12:28:10 - INFO - codeparrot_training - Step 17667: {'lr': 0.0003796886516883418, 'samples': 3392256, 'steps': 17667, 'loss/train': 2.6182313561439514} 01/27/2022 12:28:13 - INFO - codeparrot_training - Step 17668: {'lr': 0.0003796746627626214, 'samples': 3392448, 'steps': 17668, 'loss/train': 0.5651771575212479} 01/27/2022 12:28:18 - INFO - codeparrot_training - Step 17669: {'lr': 0.00037966067328141606, 'samples': 3392640, 'steps': 17669, 'loss/train': 0.7804397642612457} 01/27/2022 12:28:21 - INFO - codeparrot_training - Step 17670: {'lr': 0.0003796466832447856, 'samples': 3392832, 'steps': 17670, 'loss/train': 0.45675739645957947} 01/27/2022 12:28:24 - INFO - codeparrot_training - Step 17671: {'lr': 0.00037963269265278986, 'samples': 3393024, 'steps': 17671, 'loss/train': 0.6983577758073807} 01/27/2022 12:28:27 - INFO - codeparrot_training - Step 17672: {'lr': 0.0003796187015054888, 'samples': 3393216, 'steps': 17672, 'loss/train': 1.200396180152893} 01/27/2022 12:28:30 - INFO - codeparrot_training - Step 17673: {'lr': 0.0003796047098029424, 'samples': 3393408, 'steps': 17673, 'loss/train': 0.9162155091762543} 01/27/2022 12:28:33 - INFO - codeparrot_training - Step 17674: {'lr': 0.0003795907175452106, 'samples': 3393600, 'steps': 17674, 'loss/train': 1.4042057991027832} 01/27/2022 12:28:36 - INFO - codeparrot_training - Step 17675: {'lr': 0.0003795767247323533, 'samples': 3393792, 'steps': 17675, 'loss/train': 1.6248448491096497} 01/27/2022 12:28:40 - INFO - codeparrot_training - Step 17676: {'lr': 0.00037956273136443056, 'samples': 3393984, 'steps': 17676, 'loss/train': 0.7945581078529358} 01/27/2022 12:28:43 - INFO - codeparrot_training - Step 17677: {'lr': 0.000379548737441502, 'samples': 3394176, 'steps': 17677, 'loss/train': 0.3918060064315796} 01/27/2022 12:28:48 - INFO - codeparrot_training - Step 17678: {'lr': 0.00037953474296362796, 'samples': 3394368, 'steps': 17678, 'loss/train': 1.3041239082813263} 01/27/2022 12:28:51 - INFO - codeparrot_training - Step 17679: {'lr': 0.0003795207479308681, 'samples': 3394560, 'steps': 17679, 'loss/train': 1.0668982565402985} 01/27/2022 12:28:54 - INFO - codeparrot_training - Step 17680: {'lr': 0.00037950675234328256, 'samples': 3394752, 'steps': 17680, 'loss/train': 0.4474448561668396} 01/27/2022 12:28:57 - INFO - codeparrot_training - Step 17681: {'lr': 0.00037949275620093124, 'samples': 3394944, 'steps': 17681, 'loss/train': 0.5069480091333389} 01/27/2022 12:29:00 - INFO - codeparrot_training - Step 17682: {'lr': 0.000379478759503874, 'samples': 3395136, 'steps': 17682, 'loss/train': 1.1141667366027832} 01/27/2022 12:29:03 - INFO - codeparrot_training - Step 17683: {'lr': 0.00037946476225217087, 'samples': 3395328, 'steps': 17683, 'loss/train': 0.39164958894252777} 01/27/2022 12:29:06 - INFO - codeparrot_training - Step 17684: {'lr': 0.0003794507644458819, 'samples': 3395520, 'steps': 17684, 'loss/train': 1.0509098768234253} 01/27/2022 12:29:10 - INFO - codeparrot_training - Step 17685: {'lr': 0.00037943676608506683, 'samples': 3395712, 'steps': 17685, 'loss/train': 1.186167687177658} 01/27/2022 12:29:14 - INFO - codeparrot_training - Step 17686: {'lr': 0.00037942276716978584, 'samples': 3395904, 'steps': 17686, 'loss/train': 1.0408623218536377} 01/27/2022 12:29:17 - INFO - codeparrot_training - Step 17687: {'lr': 0.0003794087677000988, 'samples': 3396096, 'steps': 17687, 'loss/train': 0.8815032541751862} 01/27/2022 12:29:20 - 
INFO - codeparrot_training - Step 17688: {'lr': 0.0003793947676760657, 'samples': 3396288, 'steps': 17688, 'loss/train': 1.6696629524230957} 01/27/2022 12:29:24 - INFO - codeparrot_training - Step 17689: {'lr': 0.00037938076709774645, 'samples': 3396480, 'steps': 17689, 'loss/train': 0.6941552013158798} 01/27/2022 12:29:27 - INFO - codeparrot_training - Step 17690: {'lr': 0.0003793667659652011, 'samples': 3396672, 'steps': 17690, 'loss/train': 0.9092665314674377} 01/27/2022 12:29:30 - INFO - codeparrot_training - Step 17691: {'lr': 0.0003793527642784896, 'samples': 3396864, 'steps': 17691, 'loss/train': 0.7028269171714783} 01/27/2022 12:29:33 - INFO - codeparrot_training - Step 17692: {'lr': 0.0003793387620376719, 'samples': 3397056, 'steps': 17692, 'loss/train': 1.2979472279548645} 01/27/2022 12:29:36 - INFO - codeparrot_training - Step 17693: {'lr': 0.0003793247592428081, 'samples': 3397248, 'steps': 17693, 'loss/train': 1.18478125333786} 01/27/2022 12:29:39 - INFO - codeparrot_training - Step 17694: {'lr': 0.00037931075589395805, 'samples': 3397440, 'steps': 17694, 'loss/train': 0.37494785338640213} 01/27/2022 12:29:45 - INFO - codeparrot_training - Step 17695: {'lr': 0.00037929675199118183, 'samples': 3397632, 'steps': 17695, 'loss/train': 0.6255338788032532} 01/27/2022 12:29:48 - INFO - codeparrot_training - Step 17696: {'lr': 0.0003792827475345393, 'samples': 3397824, 'steps': 17696, 'loss/train': 0.731553465127945} 01/27/2022 12:29:51 - INFO - codeparrot_training - Step 17697: {'lr': 0.0003792687425240906, 'samples': 3398016, 'steps': 17697, 'loss/train': 1.0597515106201172} 01/27/2022 12:29:54 - INFO - codeparrot_training - Step 17698: {'lr': 0.0003792547369598956, 'samples': 3398208, 'steps': 17698, 'loss/train': 0.6732043772935867} 01/27/2022 12:29:57 - INFO - codeparrot_training - Step 17699: {'lr': 0.0003792407308420144, 'samples': 3398400, 'steps': 17699, 'loss/train': 1.070304036140442} 01/27/2022 12:30:01 - INFO - codeparrot_training - Step 17700: {'lr': 0.00037922672417050685, 'samples': 3398592, 'steps': 17700, 'loss/train': 0.8304772675037384} 01/27/2022 12:30:04 - INFO - codeparrot_training - Step 17701: {'lr': 0.00037921271694543317, 'samples': 3398784, 'steps': 17701, 'loss/train': 1.18225559592247} 01/27/2022 12:30:07 - INFO - codeparrot_training - Step 17702: {'lr': 0.0003791987091668532, 'samples': 3398976, 'steps': 17702, 'loss/train': 0.9208369553089142} 01/27/2022 12:30:10 - INFO - codeparrot_training - Step 17703: {'lr': 0.00037918470083482693, 'samples': 3399168, 'steps': 17703, 'loss/train': 1.2328450977802277} 01/27/2022 12:30:14 - INFO - codeparrot_training - Step 17704: {'lr': 0.0003791706919494145, 'samples': 3399360, 'steps': 17704, 'loss/train': 0.9051245748996735} 01/27/2022 12:30:18 - INFO - codeparrot_training - Step 17705: {'lr': 0.0003791566825106758, 'samples': 3399552, 'steps': 17705, 'loss/train': 0.769343376159668} 01/27/2022 12:30:21 - INFO - codeparrot_training - Step 17706: {'lr': 0.0003791426725186709, 'samples': 3399744, 'steps': 17706, 'loss/train': 0.621571496129036} 01/27/2022 12:30:24 - INFO - codeparrot_training - Step 17707: {'lr': 0.0003791286619734597, 'samples': 3399936, 'steps': 17707, 'loss/train': 0.4595901221036911} 01/27/2022 12:30:27 - INFO - codeparrot_training - Step 17708: {'lr': 0.0003791146508751025, 'samples': 3400128, 'steps': 17708, 'loss/train': 0.8479340672492981} 01/27/2022 12:30:30 - INFO - codeparrot_training - Step 17709: {'lr': 0.00037910063922365903, 'samples': 3400320, 'steps': 17709, 'loss/train': 
1.076926052570343} 01/27/2022 12:30:33 - INFO - codeparrot_training - Step 17710: {'lr': 0.00037908662701918944, 'samples': 3400512, 'steps': 17710, 'loss/train': 0.6719412803649902} 01/27/2022 12:30:36 - INFO - codeparrot_training - Step 17711: {'lr': 0.00037907261426175365, 'samples': 3400704, 'steps': 17711, 'loss/train': 0.6751941740512848} 01/27/2022 12:30:40 - INFO - codeparrot_training - Step 17712: {'lr': 0.0003790586009514119, 'samples': 3400896, 'steps': 17712, 'loss/train': 0.9057995975017548} 01/27/2022 12:30:45 - INFO - codeparrot_training - Step 17713: {'lr': 0.000379044587088224, 'samples': 3401088, 'steps': 17713, 'loss/train': 0.6229561418294907} 01/27/2022 12:30:48 - INFO - codeparrot_training - Step 17714: {'lr': 0.0003790305726722501, 'samples': 3401280, 'steps': 17714, 'loss/train': 0.8434931337833405} 01/27/2022 12:30:52 - INFO - codeparrot_training - Step 17715: {'lr': 0.00037901655770355015, 'samples': 3401472, 'steps': 17715, 'loss/train': 1.2113399505615234} 01/27/2022 12:30:55 - INFO - codeparrot_training - Step 17716: {'lr': 0.0003790025421821843, 'samples': 3401664, 'steps': 17716, 'loss/train': 0.7394381761550903} 01/27/2022 12:30:58 - INFO - codeparrot_training - Step 17717: {'lr': 0.0003789885261082124, 'samples': 3401856, 'steps': 17717, 'loss/train': 0.28448040783405304} 01/27/2022 12:31:01 - INFO - codeparrot_training - Step 17718: {'lr': 0.00037897450948169476, 'samples': 3402048, 'steps': 17718, 'loss/train': 1.1871600151062012} 01/27/2022 12:31:04 - INFO - codeparrot_training - Step 17719: {'lr': 0.0003789604923026912, 'samples': 3402240, 'steps': 17719, 'loss/train': 0.1921374723315239} 01/27/2022 12:31:07 - INFO - codeparrot_training - Step 17720: {'lr': 0.00037894647457126186, 'samples': 3402432, 'steps': 17720, 'loss/train': 0.21144280582666397} 01/27/2022 12:31:10 - INFO - codeparrot_training - Step 17721: {'lr': 0.0003789324562874668, 'samples': 3402624, 'steps': 17721, 'loss/train': 0.2824966236948967} 01/27/2022 12:31:15 - INFO - codeparrot_training - Step 17722: {'lr': 0.000378918437451366, 'samples': 3402816, 'steps': 17722, 'loss/train': 1.2950773537158966} 01/27/2022 12:31:18 - INFO - codeparrot_training - Step 17723: {'lr': 0.00037890441806301954, 'samples': 3403008, 'steps': 17723, 'loss/train': 0.850862592458725} 01/27/2022 12:31:21 - INFO - codeparrot_training - Step 17724: {'lr': 0.0003788903981224875, 'samples': 3403200, 'steps': 17724, 'loss/train': 1.2488656640052795} 01/27/2022 12:31:24 - INFO - codeparrot_training - Step 17725: {'lr': 0.00037887637762982996, 'samples': 3403392, 'steps': 17725, 'loss/train': 0.6582675576210022} 01/27/2022 12:31:27 - INFO - codeparrot_training - Step 17726: {'lr': 0.0003788623565851068, 'samples': 3403584, 'steps': 17726, 'loss/train': 0.7107021063566208} 01/27/2022 12:31:31 - INFO - codeparrot_training - Step 17727: {'lr': 0.00037884833498837833, 'samples': 3403776, 'steps': 17727, 'loss/train': 0.5869897753000259} 01/27/2022 12:31:34 - INFO - codeparrot_training - Step 17728: {'lr': 0.00037883431283970454, 'samples': 3403968, 'steps': 17728, 'loss/train': 0.29002057760953903} 01/27/2022 12:31:37 - INFO - codeparrot_training - Step 17729: {'lr': 0.00037882029013914544, 'samples': 3404160, 'steps': 17729, 'loss/train': 0.5466892719268799} 01/27/2022 12:31:41 - INFO - codeparrot_training - Step 17730: {'lr': 0.0003788062668867611, 'samples': 3404352, 'steps': 17730, 'loss/train': 0.41187572479248047} 01/27/2022 12:31:44 - INFO - codeparrot_training - Step 17731: {'lr': 0.00037879224308261163, 
'samples': 3404544, 'steps': 17731, 'loss/train': 0.934820830821991} 01/27/2022 12:31:48 - INFO - codeparrot_training - Step 17732: {'lr': 0.00037877821872675705, 'samples': 3404736, 'steps': 17732, 'loss/train': 1.040990173816681} 01/27/2022 12:31:51 - INFO - codeparrot_training - Step 17733: {'lr': 0.0003787641938192575, 'samples': 3404928, 'steps': 17733, 'loss/train': 0.78511181473732} 01/27/2022 12:31:54 - INFO - codeparrot_training - Step 17734: {'lr': 0.00037875016836017304, 'samples': 3405120, 'steps': 17734, 'loss/train': 0.7643394470214844} 01/27/2022 12:31:57 - INFO - codeparrot_training - Step 17735: {'lr': 0.0003787361423495637, 'samples': 3405312, 'steps': 17735, 'loss/train': 0.06435721926391125} 01/27/2022 12:32:00 - INFO - codeparrot_training - Step 17736: {'lr': 0.0003787221157874897, 'samples': 3405504, 'steps': 17736, 'loss/train': 1.6632746458053589} 01/27/2022 12:32:03 - INFO - codeparrot_training - Step 17737: {'lr': 0.00037870808867401085, 'samples': 3405696, 'steps': 17737, 'loss/train': 0.923375129699707} 01/27/2022 12:32:06 - INFO - codeparrot_training - Step 17738: {'lr': 0.00037869406100918756, 'samples': 3405888, 'steps': 17738, 'loss/train': 0.9484435021877289} 01/27/2022 12:32:12 - INFO - codeparrot_training - Step 17739: {'lr': 0.0003786800327930797, 'samples': 3406080, 'steps': 17739, 'loss/train': 0.32493162900209427} 01/27/2022 12:32:15 - INFO - codeparrot_training - Step 17740: {'lr': 0.0003786660040257475, 'samples': 3406272, 'steps': 17740, 'loss/train': 0.8537809252738953} 01/27/2022 12:32:18 - INFO - codeparrot_training - Step 17741: {'lr': 0.00037865197470725103, 'samples': 3406464, 'steps': 17741, 'loss/train': 0.8624403476715088} 01/27/2022 12:32:22 - INFO - codeparrot_training - Step 17742: {'lr': 0.0003786379448376503, 'samples': 3406656, 'steps': 17742, 'loss/train': 1.2402792870998383} 01/27/2022 12:32:25 - INFO - codeparrot_training - Step 17743: {'lr': 0.0003786239144170055, 'samples': 3406848, 'steps': 17743, 'loss/train': 0.645303949713707} 01/27/2022 12:32:28 - INFO - codeparrot_training - Step 17744: {'lr': 0.0003786098834453766, 'samples': 3407040, 'steps': 17744, 'loss/train': 0.6766538769006729} 01/27/2022 12:32:31 - INFO - codeparrot_training - Step 17745: {'lr': 0.00037859585192282386, 'samples': 3407232, 'steps': 17745, 'loss/train': 1.2060854136943817} 01/27/2022 12:32:34 - INFO - codeparrot_training - Step 17746: {'lr': 0.00037858181984940734, 'samples': 3407424, 'steps': 17746, 'loss/train': 0.17646407335996628} 01/27/2022 12:32:37 - INFO - codeparrot_training - Step 17747: {'lr': 0.0003785677872251871, 'samples': 3407616, 'steps': 17747, 'loss/train': 0.7733698189258575} 01/27/2022 12:32:42 - INFO - codeparrot_training - Step 17748: {'lr': 0.0003785537540502233, 'samples': 3407808, 'steps': 17748, 'loss/train': 1.0578591227531433} 01/27/2022 12:32:45 - INFO - codeparrot_training - Step 17749: {'lr': 0.0003785397203245761, 'samples': 3408000, 'steps': 17749, 'loss/train': 0.7659654915332794} 01/27/2022 12:32:48 - INFO - codeparrot_training - Step 17750: {'lr': 0.0003785256860483054, 'samples': 3408192, 'steps': 17750, 'loss/train': 1.178657054901123} 01/27/2022 12:32:51 - INFO - codeparrot_training - Step 17751: {'lr': 0.0003785116512214716, 'samples': 3408384, 'steps': 17751, 'loss/train': 1.1738046705722809} 01/27/2022 12:32:54 - INFO - codeparrot_training - Step 17752: {'lr': 0.0003784976158441347, 'samples': 3408576, 'steps': 17752, 'loss/train': 0.04111298080533743} 01/27/2022 12:32:57 - INFO - codeparrot_training - Step 
17753: {'lr': 0.0003784835799163547, 'samples': 3408768, 'steps': 17753, 'loss/train': 0.7648500502109528} 01/27/2022 12:33:01 - INFO - codeparrot_training - Step 17754: {'lr': 0.00037846954343819195, 'samples': 3408960, 'steps': 17754, 'loss/train': 0.7031485587358475} 01/27/2022 12:33:04 - INFO - codeparrot_training - Step 17755: {'lr': 0.00037845550640970636, 'samples': 3409152, 'steps': 17755, 'loss/train': 0.6835691928863525} 01/27/2022 12:33:07 - INFO - codeparrot_training - Step 17756: {'lr': 0.0003784414688309583, 'samples': 3409344, 'steps': 17756, 'loss/train': 0.999997079372406} 01/27/2022 12:33:11 - INFO - codeparrot_training - Step 17757: {'lr': 0.00037842743070200767, 'samples': 3409536, 'steps': 17757, 'loss/train': 0.6320207566022873} 01/27/2022 12:33:15 - INFO - codeparrot_training - Step 17758: {'lr': 0.0003784133920229148, 'samples': 3409728, 'steps': 17758, 'loss/train': 1.1022075712680817} 01/27/2022 12:33:18 - INFO - codeparrot_training - Step 17759: {'lr': 0.0003783993527937397, 'samples': 3409920, 'steps': 17759, 'loss/train': 1.1408203840255737} 01/27/2022 12:33:21 - INFO - codeparrot_training - Step 17760: {'lr': 0.0003783853130145425, 'samples': 3410112, 'steps': 17760, 'loss/train': 0.6201674938201904} 01/27/2022 12:33:24 - INFO - codeparrot_training - Step 17761: {'lr': 0.0003783712726853835, 'samples': 3410304, 'steps': 17761, 'loss/train': 0.5229577124118805} 01/27/2022 12:33:27 - INFO - codeparrot_training - Step 17762: {'lr': 0.00037835723180632263, 'samples': 3410496, 'steps': 17762, 'loss/train': 0.9081635177135468} 01/27/2022 12:33:30 - INFO - codeparrot_training - Step 17763: {'lr': 0.00037834319037742016, 'samples': 3410688, 'steps': 17763, 'loss/train': 0.42422300577163696} 01/27/2022 12:33:33 - INFO - codeparrot_training - Step 17764: {'lr': 0.00037832914839873623, 'samples': 3410880, 'steps': 17764, 'loss/train': 0.7212165892124176} 01/27/2022 12:33:40 - INFO - codeparrot_training - Step 17765: {'lr': 0.0003783151058703309, 'samples': 3411072, 'steps': 17765, 'loss/train': 1.0867212116718292} 01/27/2022 12:33:43 - INFO - codeparrot_training - Step 17766: {'lr': 0.0003783010627922645, 'samples': 3411264, 'steps': 17766, 'loss/train': 1.6776471734046936} 01/27/2022 12:33:46 - INFO - codeparrot_training - Step 17767: {'lr': 0.0003782870191645971, 'samples': 3411456, 'steps': 17767, 'loss/train': 0.5070259273052216} 01/27/2022 12:33:49 - INFO - codeparrot_training - Step 17768: {'lr': 0.0003782729749873887, 'samples': 3411648, 'steps': 17768, 'loss/train': 0.8707565367221832} 01/27/2022 12:33:53 - INFO - codeparrot_training - Step 17769: {'lr': 0.00037825893026069977, 'samples': 3411840, 'steps': 17769, 'loss/train': 1.1111523807048798} 01/27/2022 12:33:56 - INFO - codeparrot_training - Step 17770: {'lr': 0.0003782448849845902, 'samples': 3412032, 'steps': 17770, 'loss/train': 0.8579094707965851} 01/27/2022 12:33:59 - INFO - codeparrot_training - Step 17771: {'lr': 0.0003782308391591203, 'samples': 3412224, 'steps': 17771, 'loss/train': 0.7559394836425781} 01/27/2022 12:34:02 - INFO - codeparrot_training - Step 17772: {'lr': 0.00037821679278435017, 'samples': 3412416, 'steps': 17772, 'loss/train': 1.293379157781601} 01/27/2022 12:34:05 - INFO - codeparrot_training - Step 17773: {'lr': 0.0003782027458603401, 'samples': 3412608, 'steps': 17773, 'loss/train': 1.2864887416362762} 01/27/2022 12:34:10 - INFO - codeparrot_training - Step 17774: {'lr': 0.0003781886983871501, 'samples': 3412800, 'steps': 17774, 'loss/train': 0.8114047050476074} 01/27/2022 
12:34:13 - INFO - codeparrot_training - Step 17775: {'lr': 0.00037817465036484043, 'samples': 3412992, 'steps': 17775, 'loss/train': 0.9736630618572235} 01/27/2022 12:34:16 - INFO - codeparrot_training - Step 17776: {'lr': 0.0003781606017934713, 'samples': 3413184, 'steps': 17776, 'loss/train': 0.747618243098259} 01/27/2022 12:34:19 - INFO - codeparrot_training - Step 17777: {'lr': 0.0003781465526731028, 'samples': 3413376, 'steps': 17777, 'loss/train': 0.7631416618824005} 01/27/2022 12:34:22 - INFO - codeparrot_training - Step 17778: {'lr': 0.0003781325030037952, 'samples': 3413568, 'steps': 17778, 'loss/train': 0.38618794083595276} 01/27/2022 12:34:25 - INFO - codeparrot_training - Step 17779: {'lr': 0.00037811845278560864, 'samples': 3413760, 'steps': 17779, 'loss/train': 0.606358990073204} 01/27/2022 12:34:28 - INFO - codeparrot_training - Step 17780: {'lr': 0.0003781044020186033, 'samples': 3413952, 'steps': 17780, 'loss/train': 0.17154952883720398} 01/27/2022 12:34:31 - INFO - codeparrot_training - Step 17781: {'lr': 0.0003780903507028393, 'samples': 3414144, 'steps': 17781, 'loss/train': 0.9237264096736908} 01/27/2022 12:34:35 - INFO - codeparrot_training - Step 17782: {'lr': 0.00037807629883837703, 'samples': 3414336, 'steps': 17782, 'loss/train': 0.40533529222011566} 01/27/2022 12:34:40 - INFO - codeparrot_training - Step 17783: {'lr': 0.00037806224642527653, 'samples': 3414528, 'steps': 17783, 'loss/train': 0.7241041213274002} 01/27/2022 12:34:43 - INFO - codeparrot_training - Step 17784: {'lr': 0.000378048193463598, 'samples': 3414720, 'steps': 17784, 'loss/train': 0.8158173859119415} 01/27/2022 12:34:46 - INFO - codeparrot_training - Step 17785: {'lr': 0.0003780341399534017, 'samples': 3414912, 'steps': 17785, 'loss/train': 0.7599489390850067} 01/27/2022 12:34:50 - INFO - codeparrot_training - Step 17786: {'lr': 0.00037802008589474777, 'samples': 3415104, 'steps': 17786, 'loss/train': 1.3232737183570862} 01/27/2022 12:34:53 - INFO - codeparrot_training - Step 17787: {'lr': 0.0003780060312876965, 'samples': 3415296, 'steps': 17787, 'loss/train': 0.8640496730804443} 01/27/2022 12:34:56 - INFO - codeparrot_training - Step 17788: {'lr': 0.00037799197613230795, 'samples': 3415488, 'steps': 17788, 'loss/train': 0.9790913164615631} 01/27/2022 12:34:59 - INFO - codeparrot_training - Step 17789: {'lr': 0.00037797792042864247, 'samples': 3415680, 'steps': 17789, 'loss/train': 0.9319522082805634} 01/27/2022 12:35:02 - INFO - codeparrot_training - Step 17790: {'lr': 0.0003779638641767602, 'samples': 3415872, 'steps': 17790, 'loss/train': 0.44592931866645813} 01/27/2022 12:35:05 - INFO - codeparrot_training - Step 17791: {'lr': 0.0003779498073767214, 'samples': 3416064, 'steps': 17791, 'loss/train': 0.9241170287132263} 01/27/2022 12:35:10 - INFO - codeparrot_training - Step 17792: {'lr': 0.00037793575002858625, 'samples': 3416256, 'steps': 17792, 'loss/train': 1.2508339583873749} 01/27/2022 12:35:13 - INFO - codeparrot_training - Step 17793: {'lr': 0.00037792169213241494, 'samples': 3416448, 'steps': 17793, 'loss/train': 0.818234771490097} 01/27/2022 12:35:16 - INFO - codeparrot_training - Step 17794: {'lr': 0.00037790763368826774, 'samples': 3416640, 'steps': 17794, 'loss/train': 1.0224229395389557} 01/27/2022 12:35:19 - INFO - codeparrot_training - Step 17795: {'lr': 0.00037789357469620487, 'samples': 3416832, 'steps': 17795, 'loss/train': 0.9671560227870941} 01/27/2022 12:35:22 - INFO - codeparrot_training - Step 17796: {'lr': 0.0003778795151562865, 'samples': 3417024, 'steps': 17796, 
'loss/train': 0.18831110000610352} 01/27/2022 12:35:25 - INFO - codeparrot_training - Step 17797: {'lr': 0.00037786545506857295, 'samples': 3417216, 'steps': 17797, 'loss/train': 0.7587812840938568} 01/27/2022 12:35:29 - INFO - codeparrot_training - Step 17798: {'lr': 0.0003778513944331243, 'samples': 3417408, 'steps': 17798, 'loss/train': 1.1899133026599884} 01/27/2022 12:35:32 - INFO - codeparrot_training - Step 17799: {'lr': 0.0003778373332500009, 'samples': 3417600, 'steps': 17799, 'loss/train': 0.8800412714481354} 01/27/2022 12:35:35 - INFO - codeparrot_training - Step 17800: {'lr': 0.00037782327151926297, 'samples': 3417792, 'steps': 17800, 'loss/train': 0.3728077858686447} 01/27/2022 12:35:39 - INFO - codeparrot_training - Step 17801: {'lr': 0.00037780920924097085, 'samples': 3417984, 'steps': 17801, 'loss/train': 0.7881158888339996} 01/27/2022 12:35:42 - INFO - codeparrot_training - Step 17802: {'lr': 0.00037779514641518455, 'samples': 3418176, 'steps': 17802, 'loss/train': 0.8491335511207581} 01/27/2022 12:35:46 - INFO - codeparrot_training - Step 17803: {'lr': 0.0003777810830419644, 'samples': 3418368, 'steps': 17803, 'loss/train': 0.9170521795749664} 01/27/2022 12:35:49 - INFO - codeparrot_training - Step 17804: {'lr': 0.00037776701912137066, 'samples': 3418560, 'steps': 17804, 'loss/train': 0.8212615549564362} 01/27/2022 12:35:52 - INFO - codeparrot_training - Step 17805: {'lr': 0.00037775295465346373, 'samples': 3418752, 'steps': 17805, 'loss/train': 0.5756540447473526} 01/27/2022 12:35:55 - INFO - codeparrot_training - Step 17806: {'lr': 0.0003777388896383035, 'samples': 3418944, 'steps': 17806, 'loss/train': 1.1730384528636932} 01/27/2022 12:35:58 - INFO - codeparrot_training - Step 17807: {'lr': 0.00037772482407595056, 'samples': 3419136, 'steps': 17807, 'loss/train': 0.7193736881017685} 01/27/2022 12:36:01 - INFO - codeparrot_training - Step 17808: {'lr': 0.000377710757966465, 'samples': 3419328, 'steps': 17808, 'loss/train': 0.8357550501823425} 01/27/2022 12:36:04 - INFO - codeparrot_training - Step 17809: {'lr': 0.0003776966913099071, 'samples': 3419520, 'steps': 17809, 'loss/train': 0.6030236184597015} 01/27/2022 12:36:09 - INFO - codeparrot_training - Step 17810: {'lr': 0.00037768262410633715, 'samples': 3419712, 'steps': 17810, 'loss/train': 1.5301258563995361} 01/27/2022 12:36:12 - INFO - codeparrot_training - Step 17811: {'lr': 0.0003776685563558153, 'samples': 3419904, 'steps': 17811, 'loss/train': 0.6363756358623505} 01/27/2022 12:36:15 - INFO - codeparrot_training - Step 17812: {'lr': 0.00037765448805840196, 'samples': 3420096, 'steps': 17812, 'loss/train': 0.8167665302753448} 01/27/2022 12:36:18 - INFO - codeparrot_training - Step 17813: {'lr': 0.00037764041921415736, 'samples': 3420288, 'steps': 17813, 'loss/train': 5.503037810325623} 01/27/2022 12:36:22 - INFO - codeparrot_training - Step 17814: {'lr': 0.00037762634982314164, 'samples': 3420480, 'steps': 17814, 'loss/train': 0.6128091663122177} 01/27/2022 12:36:25 - INFO - codeparrot_training - Step 17815: {'lr': 0.00037761227988541523, 'samples': 3420672, 'steps': 17815, 'loss/train': 1.2876260876655579} 01/27/2022 12:36:28 - INFO - codeparrot_training - Step 17816: {'lr': 0.00037759820940103827, 'samples': 3420864, 'steps': 17816, 'loss/train': 0.6718382835388184} 01/27/2022 12:36:31 - INFO - codeparrot_training - Step 17817: {'lr': 0.00037758413837007124, 'samples': 3421056, 'steps': 17817, 'loss/train': 0.7165743559598923} 01/27/2022 12:36:37 - INFO - codeparrot_training - Step 17818: {'lr': 
0.0003775700667925741, 'samples': 3421248, 'steps': 17818, 'loss/train': 0.8752159774303436} 01/27/2022 12:36:40 - INFO - codeparrot_training - Step 17819: {'lr': 0.0003775559946686075, 'samples': 3421440, 'steps': 17819, 'loss/train': 0.968622475862503} 01/27/2022 12:36:43 - INFO - codeparrot_training - Step 17820: {'lr': 0.00037754192199823135, 'samples': 3421632, 'steps': 17820, 'loss/train': 1.2922334969043732} 01/27/2022 12:36:46 - INFO - codeparrot_training - Step 17821: {'lr': 0.00037752784878150613, 'samples': 3421824, 'steps': 17821, 'loss/train': 0.7727307379245758} 01/27/2022 12:36:49 - INFO - codeparrot_training - Step 17822: {'lr': 0.00037751377501849215, 'samples': 3422016, 'steps': 17822, 'loss/train': 0.8676148653030396} 01/27/2022 12:36:53 - INFO - codeparrot_training - Step 17823: {'lr': 0.0003774997007092496, 'samples': 3422208, 'steps': 17823, 'loss/train': 0.9425946772098541} 01/27/2022 12:36:56 - INFO - codeparrot_training - Step 17824: {'lr': 0.00037748562585383886, 'samples': 3422400, 'steps': 17824, 'loss/train': 0.9956670105457306} 01/27/2022 12:36:59 - INFO - codeparrot_training - Step 17825: {'lr': 0.00037747155045232016, 'samples': 3422592, 'steps': 17825, 'loss/train': 0.7316556125879288} 01/27/2022 12:37:02 - INFO - codeparrot_training - Step 17826: {'lr': 0.0003774574745047539, 'samples': 3422784, 'steps': 17826, 'loss/train': 0.9868363738059998} 01/27/2022 12:37:05 - INFO - codeparrot_training - Step 17827: {'lr': 0.0003774433980112001, 'samples': 3422976, 'steps': 17827, 'loss/train': 0.16220640018582344} 01/27/2022 12:37:10 - INFO - codeparrot_training - Step 17828: {'lr': 0.00037742932097171945, 'samples': 3423168, 'steps': 17828, 'loss/train': 0.806742399930954} 01/27/2022 12:37:13 - INFO - codeparrot_training - Step 17829: {'lr': 0.0003774152433863719, 'samples': 3423360, 'steps': 17829, 'loss/train': 0.6970316469669342} 01/27/2022 12:37:16 - INFO - codeparrot_training - Step 17830: {'lr': 0.000377401165255218, 'samples': 3423552, 'steps': 17830, 'loss/train': 0.7486716359853745} 01/27/2022 12:37:19 - INFO - codeparrot_training - Step 17831: {'lr': 0.0003773870865783179, 'samples': 3423744, 'steps': 17831, 'loss/train': 0.48460453748703003} 01/27/2022 12:37:23 - INFO - codeparrot_training - Step 17832: {'lr': 0.00037737300735573204, 'samples': 3423936, 'steps': 17832, 'loss/train': 1.0549482107162476} 01/27/2022 12:37:26 - INFO - codeparrot_training - Step 17833: {'lr': 0.00037735892758752063, 'samples': 3424128, 'steps': 17833, 'loss/train': 1.028437614440918} 01/27/2022 12:37:29 - INFO - codeparrot_training - Step 17834: {'lr': 0.000377344847273744, 'samples': 3424320, 'steps': 17834, 'loss/train': 1.5317743420600891} 01/27/2022 12:37:32 - INFO - codeparrot_training - Step 17835: {'lr': 0.0003773307664144625, 'samples': 3424512, 'steps': 17835, 'loss/train': 1.03023362159729} 01/27/2022 12:37:35 - INFO - codeparrot_training - Step 17836: {'lr': 0.00037731668500973637, 'samples': 3424704, 'steps': 17836, 'loss/train': 0.6799329668283463} 01/27/2022 12:37:40 - INFO - codeparrot_training - Step 17837: {'lr': 0.00037730260305962604, 'samples': 3424896, 'steps': 17837, 'loss/train': 0.8980058133602142} 01/27/2022 12:37:43 - INFO - codeparrot_training - Step 17838: {'lr': 0.00037728852056419183, 'samples': 3425088, 'steps': 17838, 'loss/train': 0.7650082111358643} 01/27/2022 12:37:46 - INFO - codeparrot_training - Step 17839: {'lr': 0.000377274437523494, 'samples': 3425280, 'steps': 17839, 'loss/train': 1.0262062847614288} 01/27/2022 12:37:49 - INFO - 
codeparrot_training - Step 17840: {'lr': 0.00037726035393759286, 'samples': 3425472, 'steps': 17840, 'loss/train': 1.0618519484996796} 01/27/2022 12:37:52 - INFO - codeparrot_training - Step 17841: {'lr': 0.00037724626980654877, 'samples': 3425664, 'steps': 17841, 'loss/train': 1.1169151067733765} 01/27/2022 12:37:55 - INFO - codeparrot_training - Step 17842: {'lr': 0.00037723218513042203, 'samples': 3425856, 'steps': 17842, 'loss/train': 0.891390323638916} 01/27/2022 12:37:59 - INFO - codeparrot_training - Step 17843: {'lr': 0.0003772180999092731, 'samples': 3426048, 'steps': 17843, 'loss/train': 1.1056045889854431} 01/27/2022 12:38:02 - INFO - codeparrot_training - Step 17844: {'lr': 0.00037720401414316213, 'samples': 3426240, 'steps': 17844, 'loss/train': 1.03940749168396} 01/27/2022 12:38:07 - INFO - codeparrot_training - Step 17845: {'lr': 0.00037718992783214965, 'samples': 3426432, 'steps': 17845, 'loss/train': 0.9667668342590332} 01/27/2022 12:38:11 - INFO - codeparrot_training - Step 17846: {'lr': 0.0003771758409762958, 'samples': 3426624, 'steps': 17846, 'loss/train': 0.3835897296667099} 01/27/2022 12:38:14 - INFO - codeparrot_training - Step 17847: {'lr': 0.0003771617535756611, 'samples': 3426816, 'steps': 17847, 'loss/train': 0.45386382937431335} 01/27/2022 12:38:17 - INFO - codeparrot_training - Step 17848: {'lr': 0.00037714766563030585, 'samples': 3427008, 'steps': 17848, 'loss/train': 1.0644161403179169} 01/27/2022 12:38:20 - INFO - codeparrot_training - Step 17849: {'lr': 0.00037713357714029035, 'samples': 3427200, 'steps': 17849, 'loss/train': 0.8950679898262024} 01/27/2022 12:38:23 - INFO - codeparrot_training - Step 17850: {'lr': 0.000377119488105675, 'samples': 3427392, 'steps': 17850, 'loss/train': 0.5049117654561996} 01/27/2022 12:38:26 - INFO - codeparrot_training - Step 17851: {'lr': 0.00037710539852652003, 'samples': 3427584, 'steps': 17851, 'loss/train': 1.0289808511734009} 01/27/2022 12:38:30 - INFO - codeparrot_training - Step 17852: {'lr': 0.00037709130840288605, 'samples': 3427776, 'steps': 17852, 'loss/train': 0.047231873497366905} 01/27/2022 12:38:33 - INFO - codeparrot_training - Step 17853: {'lr': 0.0003770772177348331, 'samples': 3427968, 'steps': 17853, 'loss/train': 0.7396425157785416} 01/27/2022 12:38:37 - INFO - codeparrot_training - Step 17854: {'lr': 0.0003770631265224218, 'samples': 3428160, 'steps': 17854, 'loss/train': 0.47764505445957184} 01/27/2022 12:38:40 - INFO - codeparrot_training - Step 17855: {'lr': 0.0003770490347657124, 'samples': 3428352, 'steps': 17855, 'loss/train': 0.8726488351821899} 01/27/2022 12:38:43 - INFO - codeparrot_training - Step 17856: {'lr': 0.00037703494246476524, 'samples': 3428544, 'steps': 17856, 'loss/train': 0.7103675007820129} 01/27/2022 12:38:47 - INFO - codeparrot_training - Step 17857: {'lr': 0.00037702084961964075, 'samples': 3428736, 'steps': 17857, 'loss/train': 0.842792272567749} 01/27/2022 12:38:50 - INFO - codeparrot_training - Step 17858: {'lr': 0.00037700675623039925, 'samples': 3428928, 'steps': 17858, 'loss/train': 0.7852815985679626} 01/27/2022 12:38:53 - INFO - codeparrot_training - Step 17859: {'lr': 0.00037699266229710115, 'samples': 3429120, 'steps': 17859, 'loss/train': 0.5865557491779327} 01/27/2022 12:38:56 - INFO - codeparrot_training - Step 17860: {'lr': 0.0003769785678198068, 'samples': 3429312, 'steps': 17860, 'loss/train': 0.5531609952449799} 01/27/2022 12:38:59 - INFO - codeparrot_training - Step 17861: {'lr': 0.0003769644727985766, 'samples': 3429504, 'steps': 17861, 'loss/train': 
0.8094476759433746} 01/27/2022 12:39:02 - INFO - codeparrot_training - Step 17862: {'lr': 0.00037695037723347094, 'samples': 3429696, 'steps': 17862, 'loss/train': 0.43270061910152435} 01/27/2022 12:39:08 - INFO - codeparrot_training - Step 17863: {'lr': 0.00037693628112455015, 'samples': 3429888, 'steps': 17863, 'loss/train': 0.7407862544059753} 01/27/2022 12:39:11 - INFO - codeparrot_training - Step 17864: {'lr': 0.0003769221844718746, 'samples': 3430080, 'steps': 17864, 'loss/train': 0.7080651372671127} 01/27/2022 12:39:14 - INFO - codeparrot_training - Step 17865: {'lr': 0.00037690808727550477, 'samples': 3430272, 'steps': 17865, 'loss/train': 0.7394076436758041} 01/27/2022 12:39:17 - INFO - codeparrot_training - Step 17866: {'lr': 0.0003768939895355009, 'samples': 3430464, 'steps': 17866, 'loss/train': 0.3710573762655258} 01/27/2022 12:39:21 - INFO - codeparrot_training - Step 17867: {'lr': 0.0003768798912519236, 'samples': 3430656, 'steps': 17867, 'loss/train': 0.8106890916824341} 01/27/2022 12:39:24 - INFO - codeparrot_training - Step 17868: {'lr': 0.0003768657924248331, 'samples': 3430848, 'steps': 17868, 'loss/train': 0.7833350300788879} 01/27/2022 12:39:27 - INFO - codeparrot_training - Step 17869: {'lr': 0.0003768516930542898, 'samples': 3431040, 'steps': 17869, 'loss/train': 0.9321344196796417} 01/27/2022 12:39:30 - INFO - codeparrot_training - Step 17870: {'lr': 0.00037683759314035414, 'samples': 3431232, 'steps': 17870, 'loss/train': 0.7998980283737183} 01/27/2022 12:39:33 - INFO - codeparrot_training - Step 17871: {'lr': 0.0003768234926830865, 'samples': 3431424, 'steps': 17871, 'loss/train': 0.7183521240949631} 01/27/2022 12:39:38 - INFO - codeparrot_training - Step 17872: {'lr': 0.0003768093916825473, 'samples': 3431616, 'steps': 17872, 'loss/train': 0.7743760049343109} 01/27/2022 12:39:41 - INFO - codeparrot_training - Step 17873: {'lr': 0.00037679529013879686, 'samples': 3431808, 'steps': 17873, 'loss/train': 1.0870539844036102} 01/27/2022 12:39:44 - INFO - codeparrot_training - Step 17874: {'lr': 0.00037678118805189575, 'samples': 3432000, 'steps': 17874, 'loss/train': 0.9091189205646515} 01/27/2022 12:39:47 - INFO - codeparrot_training - Step 17875: {'lr': 0.0003767670854219043, 'samples': 3432192, 'steps': 17875, 'loss/train': 1.3348017632961273} 01/27/2022 12:39:50 - INFO - codeparrot_training - Step 17876: {'lr': 0.00037675298224888287, 'samples': 3432384, 'steps': 17876, 'loss/train': 1.0138724148273468} 01/27/2022 12:39:53 - INFO - codeparrot_training - Step 17877: {'lr': 0.0003767388785328919, 'samples': 3432576, 'steps': 17877, 'loss/train': 0.8898642361164093} 01/27/2022 12:39:56 - INFO - codeparrot_training - Step 17878: {'lr': 0.0003767247742739918, 'samples': 3432768, 'steps': 17878, 'loss/train': 0.4390469491481781} 01/27/2022 12:40:00 - INFO - codeparrot_training - Step 17879: {'lr': 0.0003767106694722431, 'samples': 3432960, 'steps': 17879, 'loss/train': 0.5408623069524765} 01/27/2022 12:40:04 - INFO - codeparrot_training - Step 17880: {'lr': 0.000376696564127706, 'samples': 3433152, 'steps': 17880, 'loss/train': 0.7494834065437317} 01/27/2022 12:40:07 - INFO - codeparrot_training - Step 17881: {'lr': 0.0003766824582404411, 'samples': 3433344, 'steps': 17881, 'loss/train': 0.8861905932426453} 01/27/2022 12:40:10 - INFO - codeparrot_training - Step 17882: {'lr': 0.00037666835181050887, 'samples': 3433536, 'steps': 17882, 'loss/train': 1.2920831143856049} 01/27/2022 12:40:14 - INFO - codeparrot_training - Step 17883: {'lr': 0.0003766542448379695, 'samples': 
3433728, 'steps': 17883, 'loss/train': 0.8060343861579895} 01/27/2022 12:40:17 - INFO - codeparrot_training - Step 17884: {'lr': 0.0003766401373228836, 'samples': 3433920, 'steps': 17884, 'loss/train': 0.8285459876060486} 01/27/2022 12:40:20 - INFO - codeparrot_training - Step 17885: {'lr': 0.00037662602926531166, 'samples': 3434112, 'steps': 17885, 'loss/train': 0.9514822661876678} 01/27/2022 12:40:23 - INFO - codeparrot_training - Step 17886: {'lr': 0.0003766119206653139, 'samples': 3434304, 'steps': 17886, 'loss/train': 0.1768232323229313} 01/27/2022 12:40:26 - INFO - codeparrot_training - Step 17887: {'lr': 0.00037659781152295094, 'samples': 3434496, 'steps': 17887, 'loss/train': 0.5970326364040375} 01/27/2022 12:40:29 - INFO - codeparrot_training - Step 17888: {'lr': 0.0003765837018382831, 'samples': 3434688, 'steps': 17888, 'loss/train': 0.8003620505332947} 01/27/2022 12:40:35 - INFO - codeparrot_training - Step 17889: {'lr': 0.00037656959161137094, 'samples': 3434880, 'steps': 17889, 'loss/train': 0.8234326243400574} 01/27/2022 12:40:38 - INFO - codeparrot_training - Step 17890: {'lr': 0.00037655548084227484, 'samples': 3435072, 'steps': 17890, 'loss/train': 0.6930548250675201} 01/27/2022 12:40:41 - INFO - codeparrot_training - Step 17891: {'lr': 0.0003765413695310552, 'samples': 3435264, 'steps': 17891, 'loss/train': 0.8977358043193817} 01/27/2022 12:40:45 - INFO - codeparrot_training - Step 17892: {'lr': 0.00037652725767777255, 'samples': 3435456, 'steps': 17892, 'loss/train': 0.42084574699401855} 01/27/2022 12:40:48 - INFO - codeparrot_training - Step 17893: {'lr': 0.00037651314528248724, 'samples': 3435648, 'steps': 17893, 'loss/train': 0.40352632105350494} 01/27/2022 12:40:51 - INFO - codeparrot_training - Step 17894: {'lr': 0.00037649903234525996, 'samples': 3435840, 'steps': 17894, 'loss/train': 1.012834757566452} 01/27/2022 12:40:54 - INFO - codeparrot_training - Step 17895: {'lr': 0.00037648491886615077, 'samples': 3436032, 'steps': 17895, 'loss/train': 1.1155715882778168} 01/27/2022 12:40:57 - INFO - codeparrot_training - Step 17896: {'lr': 0.0003764708048452205, 'samples': 3436224, 'steps': 17896, 'loss/train': 0.9174368977546692} 01/27/2022 12:41:00 - INFO - codeparrot_training - Step 17897: {'lr': 0.0003764566902825294, 'samples': 3436416, 'steps': 17897, 'loss/train': 0.7664187848567963} 01/27/2022 12:41:03 - INFO - codeparrot_training - Step 17898: {'lr': 0.0003764425751781381, 'samples': 3436608, 'steps': 17898, 'loss/train': 0.6826503574848175} 01/27/2022 12:41:08 - INFO - codeparrot_training - Step 17899: {'lr': 0.0003764284595321068, 'samples': 3436800, 'steps': 17899, 'loss/train': 0.7936385571956635} 01/27/2022 12:41:11 - INFO - codeparrot_training - Step 17900: {'lr': 0.0003764143433444962, 'samples': 3436992, 'steps': 17900, 'loss/train': 0.6782476007938385} 01/27/2022 12:41:15 - INFO - codeparrot_training - Step 17901: {'lr': 0.00037640022661536665, 'samples': 3437184, 'steps': 17901, 'loss/train': 0.9428253471851349} 01/27/2022 12:41:18 - INFO - codeparrot_training - Step 17902: {'lr': 0.0003763861093447787, 'samples': 3437376, 'steps': 17902, 'loss/train': 1.848872423171997} 01/27/2022 12:41:21 - INFO - codeparrot_training - Step 17903: {'lr': 0.0003763719915327928, 'samples': 3437568, 'steps': 17903, 'loss/train': 0.9533161818981171} 01/27/2022 12:41:24 - INFO - codeparrot_training - Step 17904: {'lr': 0.00037635787317946945, 'samples': 3437760, 'steps': 17904, 'loss/train': 0.5548265129327774} 01/27/2022 12:41:27 - INFO - codeparrot_training - Step 17905: 
{'lr': 0.000376343754284869, 'samples': 3437952, 'steps': 17905, 'loss/train': 0.9358738660812378} 01/27/2022 12:41:30 - INFO - codeparrot_training - Step 17906: {'lr': 0.00037632963484905213, 'samples': 3438144, 'steps': 17906, 'loss/train': 1.0115763545036316} 01/27/2022 12:41:35 - INFO - codeparrot_training - Step 17907: {'lr': 0.0003763155148720791, 'samples': 3438336, 'steps': 17907, 'loss/train': 0.6282755881547928} 01/27/2022 12:41:38 - INFO - codeparrot_training - Step 17908: {'lr': 0.00037630139435401055, 'samples': 3438528, 'steps': 17908, 'loss/train': 0.8558928072452545} 01/27/2022 12:41:41 - INFO - codeparrot_training - Step 17909: {'lr': 0.000376287273294907, 'samples': 3438720, 'steps': 17909, 'loss/train': 1.0716103613376617} 01/27/2022 12:41:44 - INFO - codeparrot_training - Step 17910: {'lr': 0.0003762731516948288, 'samples': 3438912, 'steps': 17910, 'loss/train': 0.6623565852642059} 01/27/2022 12:41:47 - INFO - codeparrot_training - Step 17911: {'lr': 0.00037625902955383664, 'samples': 3439104, 'steps': 17911, 'loss/train': 1.1786466836929321} 01/27/2022 12:41:51 - INFO - codeparrot_training - Step 17912: {'lr': 0.0003762449068719907, 'samples': 3439296, 'steps': 17912, 'loss/train': 0.8526237308979034} 01/27/2022 12:41:54 - INFO - codeparrot_training - Step 17913: {'lr': 0.0003762307836493518, 'samples': 3439488, 'steps': 17913, 'loss/train': 0.711031123995781} 01/27/2022 12:41:57 - INFO - codeparrot_training - Step 17914: {'lr': 0.00037621665988598024, 'samples': 3439680, 'steps': 17914, 'loss/train': 1.1132463812828064} 01/27/2022 12:42:00 - INFO - codeparrot_training - Step 17915: {'lr': 0.0003762025355819366, 'samples': 3439872, 'steps': 17915, 'loss/train': 0.9389703869819641} 01/27/2022 12:42:04 - INFO - codeparrot_training - Step 17916: {'lr': 0.0003761884107372814, 'samples': 3440064, 'steps': 17916, 'loss/train': 1.1480253338813782} 01/27/2022 12:42:07 - INFO - codeparrot_training - Step 17917: {'lr': 0.0003761742853520751, 'samples': 3440256, 'steps': 17917, 'loss/train': 0.64650097489357} 01/27/2022 12:42:11 - INFO - codeparrot_training - Step 17918: {'lr': 0.00037616015942637824, 'samples': 3440448, 'steps': 17918, 'loss/train': 0.7604730427265167} 01/27/2022 12:42:14 - INFO - codeparrot_training - Step 17919: {'lr': 0.0003761460329602513, 'samples': 3440640, 'steps': 17919, 'loss/train': 0.5834275335073471} 01/27/2022 12:42:17 - INFO - codeparrot_training - Step 17920: {'lr': 0.0003761319059537548, 'samples': 3440832, 'steps': 17920, 'loss/train': 0.9251265227794647} 01/27/2022 12:42:20 - INFO - codeparrot_training - Step 17921: {'lr': 0.0003761177784069493, 'samples': 3441024, 'steps': 17921, 'loss/train': 0.4388592392206192} 01/27/2022 12:42:23 - INFO - codeparrot_training - Step 17922: {'lr': 0.00037610365031989524, 'samples': 3441216, 'steps': 17922, 'loss/train': 1.2855013310909271} 01/27/2022 12:42:26 - INFO - codeparrot_training - Step 17923: {'lr': 0.0003760895216926532, 'samples': 3441408, 'steps': 17923, 'loss/train': 1.125196248292923} 01/27/2022 12:42:30 - INFO - codeparrot_training - Step 17924: {'lr': 0.0003760753925252838, 'samples': 3441600, 'steps': 17924, 'loss/train': 0.3592373952269554} 01/27/2022 12:42:36 - INFO - codeparrot_training - Step 17925: {'lr': 0.00037606126281784725, 'samples': 3441792, 'steps': 17925, 'loss/train': 0.8088610768318176} 01/27/2022 12:42:39 - INFO - codeparrot_training - Step 17926: {'lr': 0.0003760471325704045, 'samples': 3441984, 'steps': 17926, 'loss/train': 0.7711814045906067} 01/27/2022 12:42:42 - INFO - 
codeparrot_training - Step 17927: {'lr': 0.0003760330017830157, 'samples': 3442176, 'steps': 17927, 'loss/train': 1.3911140263080597} 01/27/2022 12:42:45 - INFO - codeparrot_training - Step 17928: {'lr': 0.00037601887045574155, 'samples': 3442368, 'steps': 17928, 'loss/train': 0.8131018280982971} 01/27/2022 12:42:48 - INFO - codeparrot_training - Step 17929: {'lr': 0.0003760047385886426, 'samples': 3442560, 'steps': 17929, 'loss/train': 0.23695486038923264} 01/27/2022 12:42:52 - INFO - codeparrot_training - Step 17930: {'lr': 0.0003759906061817794, 'samples': 3442752, 'steps': 17930, 'loss/train': 0.27896542847156525} 01/27/2022 12:42:55 - INFO - codeparrot_training - Step 17931: {'lr': 0.00037597647323521234, 'samples': 3442944, 'steps': 17931, 'loss/train': 1.0130764245986938} 01/27/2022 12:42:58 - INFO - codeparrot_training - Step 17932: {'lr': 0.0003759623397490022, 'samples': 3443136, 'steps': 17932, 'loss/train': 0.7405153959989548} 01/27/2022 12:43:01 - INFO - codeparrot_training - Step 17933: {'lr': 0.00037594820572320933, 'samples': 3443328, 'steps': 17933, 'loss/train': 0.7939205467700958} 01/27/2022 12:43:05 - INFO - codeparrot_training - Step 17934: {'lr': 0.0003759340711578944, 'samples': 3443520, 'steps': 17934, 'loss/train': 0.5022717118263245} 01/27/2022 12:43:08 - INFO - codeparrot_training - Step 17935: {'lr': 0.0003759199360531178, 'samples': 3443712, 'steps': 17935, 'loss/train': 1.0045287609100342} 01/27/2022 12:43:12 - INFO - codeparrot_training - Step 17936: {'lr': 0.00037590580040894024, 'samples': 3443904, 'steps': 17936, 'loss/train': 0.8051446080207825} 01/27/2022 12:43:15 - INFO - codeparrot_training - Step 17937: {'lr': 0.0003758916642254222, 'samples': 3444096, 'steps': 17937, 'loss/train': 0.9203100800514221} 01/27/2022 12:43:18 - INFO - codeparrot_training - Step 17938: {'lr': 0.00037587752750262426, 'samples': 3444288, 'steps': 17938, 'loss/train': 0.30617518723011017} 01/27/2022 12:43:21 - INFO - codeparrot_training - Step 17939: {'lr': 0.00037586339024060696, 'samples': 3444480, 'steps': 17939, 'loss/train': 0.6931801289319992} 01/27/2022 12:43:24 - INFO - codeparrot_training - Step 17940: {'lr': 0.0003758492524394308, 'samples': 3444672, 'steps': 17940, 'loss/train': 0.46619197726249695} 01/27/2022 12:43:27 - INFO - codeparrot_training - Step 17941: {'lr': 0.0003758351140991565, 'samples': 3444864, 'steps': 17941, 'loss/train': 0.8829436898231506} 01/27/2022 12:43:33 - INFO - codeparrot_training - Step 17942: {'lr': 0.0003758209752198444, 'samples': 3445056, 'steps': 17942, 'loss/train': 0.6434184908866882} 01/27/2022 12:43:37 - INFO - codeparrot_training - Step 17943: {'lr': 0.0003758068358015553, 'samples': 3445248, 'steps': 17943, 'loss/train': 0.48153796792030334} 01/27/2022 12:43:40 - INFO - codeparrot_training - Step 17944: {'lr': 0.0003757926958443496, 'samples': 3445440, 'steps': 17944, 'loss/train': 0.7031002342700958} 01/27/2022 12:43:43 - INFO - codeparrot_training - Step 17945: {'lr': 0.000375778555348288, 'samples': 3445632, 'steps': 17945, 'loss/train': 1.113799273967743} 01/27/2022 12:43:46 - INFO - codeparrot_training - Step 17946: {'lr': 0.000375764414313431, 'samples': 3445824, 'steps': 17946, 'loss/train': 0.6074004918336868} 01/27/2022 12:43:49 - INFO - codeparrot_training - Step 17947: {'lr': 0.0003757502727398391, 'samples': 3446016, 'steps': 17947, 'loss/train': 0.9972122311592102} 01/27/2022 12:43:52 - INFO - codeparrot_training - Step 17948: {'lr': 0.00037573613062757304, 'samples': 3446208, 'steps': 17948, 'loss/train': 
0.8115869164466858} 01/27/2022 12:43:55 - INFO - codeparrot_training - Step 17949: {'lr': 0.0003757219879766933, 'samples': 3446400, 'steps': 17949, 'loss/train': 0.9752765893936157} 01/27/2022 12:43:59 - INFO - codeparrot_training - Step 17950: {'lr': 0.00037570784478726057, 'samples': 3446592, 'steps': 17950, 'loss/train': 0.7239139080047607} 01/27/2022 12:44:03 - INFO - codeparrot_training - Step 17951: {'lr': 0.00037569370105933523, 'samples': 3446784, 'steps': 17951, 'loss/train': 0.3716737926006317} 01/27/2022 12:44:06 - INFO - codeparrot_training - Step 17952: {'lr': 0.00037567955679297806, 'samples': 3446976, 'steps': 17952, 'loss/train': 0.6268345713615417} 01/27/2022 12:44:10 - INFO - codeparrot_training - Step 17953: {'lr': 0.0003756654119882496, 'samples': 3447168, 'steps': 17953, 'loss/train': 0.5763550400733948} 01/27/2022 12:44:13 - INFO - codeparrot_training - Step 17954: {'lr': 0.0003756512666452103, 'samples': 3447360, 'steps': 17954, 'loss/train': 0.262355737388134} 01/27/2022 12:44:16 - INFO - codeparrot_training - Step 17955: {'lr': 0.0003756371207639209, 'samples': 3447552, 'steps': 17955, 'loss/train': 0.9930324554443359} 01/27/2022 12:44:19 - INFO - codeparrot_training - Step 17956: {'lr': 0.00037562297434444203, 'samples': 3447744, 'steps': 17956, 'loss/train': 1.2762583494186401} 01/27/2022 12:44:22 - INFO - codeparrot_training - Step 17957: {'lr': 0.0003756088273868342, 'samples': 3447936, 'steps': 17957, 'loss/train': 0.8031097054481506} 01/27/2022 12:44:25 - INFO - codeparrot_training - Step 17958: {'lr': 0.00037559467989115806, 'samples': 3448128, 'steps': 17958, 'loss/train': 0.5294175893068314} 01/27/2022 12:44:29 - INFO - codeparrot_training - Step 17959: {'lr': 0.00037558053185747416, 'samples': 3448320, 'steps': 17959, 'loss/train': 0.7924456894397736} 01/27/2022 12:44:33 - INFO - codeparrot_training - Step 17960: {'lr': 0.00037556638328584314, 'samples': 3448512, 'steps': 17960, 'loss/train': 0.827648788690567} 01/27/2022 12:44:37 - INFO - codeparrot_training - Step 17961: {'lr': 0.00037555223417632565, 'samples': 3448704, 'steps': 17961, 'loss/train': 0.6637343019247055} 01/27/2022 12:44:40 - INFO - codeparrot_training - Step 17962: {'lr': 0.0003755380845289822, 'samples': 3448896, 'steps': 17962, 'loss/train': 1.1260019838809967} 01/27/2022 12:44:43 - INFO - codeparrot_training - Step 17963: {'lr': 0.0003755239343438735, 'samples': 3449088, 'steps': 17963, 'loss/train': 0.785476416349411} 01/27/2022 12:44:46 - INFO - codeparrot_training - Step 17964: {'lr': 0.00037550978362106, 'samples': 3449280, 'steps': 17964, 'loss/train': 1.199933409690857} 01/27/2022 12:44:49 - INFO - codeparrot_training - Step 17965: {'lr': 0.0003754956323606026, 'samples': 3449472, 'steps': 17965, 'loss/train': 0.7535121738910675} 01/27/2022 12:44:52 - INFO - codeparrot_training - Step 17966: {'lr': 0.0003754814805625617, 'samples': 3449664, 'steps': 17966, 'loss/train': 0.4501775801181793} 01/27/2022 12:44:56 - INFO - codeparrot_training - Step 17967: {'lr': 0.00037546732822699803, 'samples': 3449856, 'steps': 17967, 'loss/train': 0.41887445747852325} 01/27/2022 12:44:59 - INFO - codeparrot_training - Step 17968: {'lr': 0.0003754531753539721, 'samples': 3450048, 'steps': 17968, 'loss/train': 0.5372181683778763} 01/27/2022 12:45:07 - INFO - codeparrot_training - Step 17969: {'lr': 0.0003754390219435446, 'samples': 3450240, 'steps': 17969, 'loss/train': 0.8379674255847931} 01/27/2022 12:45:10 - INFO - codeparrot_training - Step 17970: {'lr': 0.00037542486799577624, 'samples': 
3450432, 'steps': 17970, 'loss/train': 0.9128230512142181} 01/27/2022 12:45:13 - INFO - codeparrot_training - Step 17971: {'lr': 0.00037541071351072746, 'samples': 3450624, 'steps': 17971, 'loss/train': 0.5380378067493439} 01/27/2022 12:45:17 - INFO - codeparrot_training - Step 17972: {'lr': 0.0003753965584884591, 'samples': 3450816, 'steps': 17972, 'loss/train': 0.6869356334209442} 01/27/2022 12:45:20 - INFO - codeparrot_training - Step 17973: {'lr': 0.00037538240292903167, 'samples': 3451008, 'steps': 17973, 'loss/train': 0.3067547678947449} 01/27/2022 12:45:23 - INFO - codeparrot_training - Step 17974: {'lr': 0.0003753682468325059, 'samples': 3451200, 'steps': 17974, 'loss/train': 0.6076724678277969} 01/27/2022 12:45:26 - INFO - codeparrot_training - Step 17975: {'lr': 0.0003753540901989422, 'samples': 3451392, 'steps': 17975, 'loss/train': 0.49873723089694977} 01/27/2022 12:45:29 - INFO - codeparrot_training - Step 17976: {'lr': 0.00037533993302840153, 'samples': 3451584, 'steps': 17976, 'loss/train': 0.3396426737308502} 01/27/2022 12:45:32 - INFO - codeparrot_training - Step 17977: {'lr': 0.00037532577532094436, 'samples': 3451776, 'steps': 17977, 'loss/train': 1.7281374335289001} 01/27/2022 12:45:37 - INFO - codeparrot_training - Step 17978: {'lr': 0.00037531161707663136, 'samples': 3451968, 'steps': 17978, 'loss/train': 0.6896446198225021} 01/27/2022 12:45:40 - INFO - codeparrot_training - Step 17979: {'lr': 0.0003752974582955232, 'samples': 3452160, 'steps': 17979, 'loss/train': 0.6871363073587418} 01/27/2022 12:45:43 - INFO - codeparrot_training - Step 17980: {'lr': 0.0003752832989776804, 'samples': 3452352, 'steps': 17980, 'loss/train': 0.7854113280773163} 01/27/2022 12:45:46 - INFO - codeparrot_training - Step 17981: {'lr': 0.0003752691391231639, 'samples': 3452544, 'steps': 17981, 'loss/train': 0.8390191197395325} 01/27/2022 12:45:49 - INFO - codeparrot_training - Step 17982: {'lr': 0.00037525497873203405, 'samples': 3452736, 'steps': 17982, 'loss/train': 2.324696123600006} 01/27/2022 12:45:53 - INFO - codeparrot_training - Step 17983: {'lr': 0.0003752408178043518, 'samples': 3452928, 'steps': 17983, 'loss/train': 0.8827296495437622} 01/27/2022 12:45:56 - INFO - codeparrot_training - Step 17984: {'lr': 0.0003752266563401775, 'samples': 3453120, 'steps': 17984, 'loss/train': 0.8177593946456909} 01/27/2022 12:45:59 - INFO - codeparrot_training - Step 17985: {'lr': 0.00037521249433957203, 'samples': 3453312, 'steps': 17985, 'loss/train': 0.7612828016281128} 01/27/2022 12:46:02 - INFO - codeparrot_training - Step 17986: {'lr': 0.000375198331802596, 'samples': 3453504, 'steps': 17986, 'loss/train': 0.8115473985671997} 01/27/2022 12:46:06 - INFO - codeparrot_training - Step 17987: {'lr': 0.00037518416872931007, 'samples': 3453696, 'steps': 17987, 'loss/train': 1.4162762761116028} 01/27/2022 12:46:10 - INFO - codeparrot_training - Step 17988: {'lr': 0.00037517000511977486, 'samples': 3453888, 'steps': 17988, 'loss/train': 1.028537929058075} 01/27/2022 12:46:13 - INFO - codeparrot_training - Step 17989: {'lr': 0.00037515584097405115, 'samples': 3454080, 'steps': 17989, 'loss/train': 0.6136468201875687} 01/27/2022 12:46:16 - INFO - codeparrot_training - Step 17990: {'lr': 0.00037514167629219955, 'samples': 3454272, 'steps': 17990, 'loss/train': 0.60965596139431} 01/27/2022 12:46:19 - INFO - codeparrot_training - Step 17991: {'lr': 0.0003751275110742807, 'samples': 3454464, 'steps': 17991, 'loss/train': 0.6522223949432373} 01/27/2022 12:46:22 - INFO - codeparrot_training - Step 17992: 
{'lr': 0.00037511334532035537, 'samples': 3454656, 'steps': 17992, 'loss/train': 0.7448626756668091} 01/27/2022 12:46:25 - INFO - codeparrot_training - Step 17993: {'lr': 0.00037509917903048417, 'samples': 3454848, 'steps': 17993, 'loss/train': 0.7819700539112091} 01/27/2022 12:46:28 - INFO - codeparrot_training - Step 17994: {'lr': 0.00037508501220472783, 'samples': 3455040, 'steps': 17994, 'loss/train': 0.9454674124717712} 01/27/2022 12:46:35 - INFO - codeparrot_training - Step 17995: {'lr': 0.000375070844843147, 'samples': 3455232, 'steps': 17995, 'loss/train': 0.8421874344348907} 01/27/2022 12:46:38 - INFO - codeparrot_training - Step 17996: {'lr': 0.00037505667694580244, 'samples': 3455424, 'steps': 17996, 'loss/train': 1.1058039665222168} 01/27/2022 12:46:41 - INFO - codeparrot_training - Step 17997: {'lr': 0.00037504250851275466, 'samples': 3455616, 'steps': 17997, 'loss/train': 0.9111911952495575} 01/27/2022 12:46:44 - INFO - codeparrot_training - Step 17998: {'lr': 0.0003750283395440647, 'samples': 3455808, 'steps': 17998, 'loss/train': 0.6563953310251236} 01/27/2022 12:46:47 - INFO - codeparrot_training - Step 17999: {'lr': 0.0003750141700397928, 'samples': 3456000, 'steps': 17999, 'loss/train': 0.7130634784698486} 01/27/2022 12:46:47 - INFO - codeparrot_training - Evaluating and saving model checkpoint
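Note on working with the log above: every entry follows one fixed pattern, a timestamp, the codeparrot_training logger name, and a dict with 'lr', 'samples', 'steps', and 'loss/train'. The entries can therefore be pulled back into structured form, for example to check the roughly 1.4e-8 per-step decrease of 'lr' around step 17,600 or the constant 192-sample increment per step that the records show. Below is a minimal parsing sketch under stated assumptions: the log path "training.log" and the 100-step trailing window are illustrative choices, not part of the run; only the record format visible in the log itself is relied on.

# Minimal sketch: parse the raw codeparrot_training log above into records.
# Assumptions (not from the run): the log is stored in "training.log" and a
# 100-step trailing window is used for the loss summary.
import re
from statistics import mean

ENTRY = re.compile(
    r"Step (\d+): \{'lr': ([\d.e-]+), 'samples': (\d+), "
    r"'steps': \d+, 'loss/train': ([\d.e-]+)\}"
)

def parse_log(path="training.log"):
    """Yield (step, lr, samples, loss) tuples from the log."""
    with open(path) as f:
        # Entries may be wrapped across physical lines, so collapse all
        # whitespace before matching.
        text = " ".join(f.read().split())
    for match in ENTRY.finditer(text):
        step, lr, samples, loss = match.groups()
        yield int(step), float(lr), int(samples), float(loss)

if __name__ == "__main__":
    records = list(parse_log())
    last_losses = [loss for *_, loss in records[-100:]]
    print(f"parsed {len(records)} steps, last lr={records[-1][1]:.6g}, "
          f"mean loss over last {len(last_losses)} steps: {mean(last_losses):.3f}")

Run against the portion of the log shown here, the reported step count, final learning rate, and trailing-loss average come straight from the parsed entries, which end at step 17999 immediately before the 'Evaluating and saving model checkpoint' line.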