[2023-08-07 22:54:39,387][00422] Saving configuration to /content/train_dir/default_experiment/config.json...
[2023-08-07 22:54:39,391][00422] Rollout worker 0 uses device cpu
[2023-08-07 22:54:39,392][00422] Rollout worker 1 uses device cpu
[2023-08-07 22:54:39,395][00422] Rollout worker 2 uses device cpu
[2023-08-07 22:54:39,399][00422] Rollout worker 3 uses device cpu
[2023-08-07 22:54:39,401][00422] Rollout worker 4 uses device cpu
[2023-08-07 22:54:39,402][00422] Rollout worker 5 uses device cpu
[2023-08-07 22:54:39,406][00422] Rollout worker 6 uses device cpu
[2023-08-07 22:54:39,407][00422] Rollout worker 7 uses device cpu
[2023-08-07 22:54:39,562][00422] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-08-07 22:54:39,564][00422] InferenceWorker_p0-w0: min num requests: 2
[2023-08-07 22:54:39,594][00422] Starting all processes...
[2023-08-07 22:54:39,595][00422] Starting process learner_proc0
[2023-08-07 22:54:39,644][00422] Starting all processes...
[2023-08-07 22:54:39,653][00422] Starting process inference_proc0-0
[2023-08-07 22:54:39,653][00422] Starting process rollout_proc0
[2023-08-07 22:54:39,655][00422] Starting process rollout_proc1
[2023-08-07 22:54:39,656][00422] Starting process rollout_proc2
[2023-08-07 22:54:39,656][00422] Starting process rollout_proc3
[2023-08-07 22:54:39,656][00422] Starting process rollout_proc4
[2023-08-07 22:54:39,656][00422] Starting process rollout_proc5
[2023-08-07 22:54:39,656][00422] Starting process rollout_proc6
[2023-08-07 22:54:39,656][00422] Starting process rollout_proc7
[2023-08-07 22:54:55,630][09958] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-08-07 22:54:55,631][09958] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for learning process 0
[2023-08-07 22:54:55,671][09972] Worker 0 uses CPU cores [0]
[2023-08-07 22:54:55,684][09958] Num visible devices: 1
[2023-08-07 22:54:55,723][09958] Starting seed is not provided
[2023-08-07 22:54:55,724][09958] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-08-07 22:54:55,724][09958] Initializing actor-critic model on device cuda:0
[2023-08-07 22:54:55,725][09958] RunningMeanStd input shape: (3, 72, 128)
[2023-08-07 22:54:55,729][09958] RunningMeanStd input shape: (1,)
[2023-08-07 22:54:55,811][09958] ConvEncoder: input_channels=3
[2023-08-07 22:54:56,010][09971] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-08-07 22:54:56,012][09971] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for inference process 0
[2023-08-07 22:54:56,086][09971] Num visible devices: 1
[2023-08-07 22:54:56,107][09973] Worker 1 uses CPU cores [1]
[2023-08-07 22:54:56,246][09979] Worker 7 uses CPU cores [1]
[2023-08-07 22:54:56,271][09974] Worker 3 uses CPU cores [1]
[2023-08-07 22:54:56,282][09976] Worker 2 uses CPU cores [0]
[2023-08-07 22:54:56,320][09975] Worker 4 uses CPU cores [0]
[2023-08-07 22:54:56,337][09978] Worker 6 uses CPU cores [0]
[2023-08-07 22:54:56,346][09977] Worker 5 uses CPU cores [1]
[2023-08-07 22:54:56,407][09958] Conv encoder output size: 512
[2023-08-07 22:54:56,407][09958] Policy head output size: 512
[2023-08-07 22:54:56,450][09958] Created Actor Critic model with architecture:
[2023-08-07 22:54:56,450][09958] ActorCriticSharedWeights(
  (obs_normalizer): ObservationNormalizer(
    (running_mean_std): RunningMeanStdDictInPlace(
      (running_mean_std): ModuleDict(
        (obs): RunningMeanStdInPlace()
      )
    )
  )
  (returns_normalizer): RecursiveScriptModule(original_name=RunningMeanStdInPlace)
  (encoder): VizdoomEncoder(
    (basic_encoder): ConvEncoder(
      (enc): RecursiveScriptModule(
        original_name=ConvEncoderImpl
        (conv_head): RecursiveScriptModule(
          original_name=Sequential
          (0): RecursiveScriptModule(original_name=Conv2d)
          (1): RecursiveScriptModule(original_name=ELU)
          (2): RecursiveScriptModule(original_name=Conv2d)
          (3): RecursiveScriptModule(original_name=ELU)
          (4): RecursiveScriptModule(original_name=Conv2d)
          (5): RecursiveScriptModule(original_name=ELU)
        )
        (mlp_layers): RecursiveScriptModule(
          original_name=Sequential
          (0): RecursiveScriptModule(original_name=Linear)
          (1): RecursiveScriptModule(original_name=ELU)
        )
      )
    )
  )
  (core): ModelCoreRNN(
    (core): GRU(512, 512)
  )
  (decoder): MlpDecoder(
    (mlp): Identity()
  )
  (critic_linear): Linear(in_features=512, out_features=1, bias=True)
  (action_parameterization): ActionParameterizationDefault(
    (distribution_linear): Linear(in_features=512, out_features=6, bias=True)
  )
)
[2023-08-07 22:54:59,555][00422] Heartbeat connected on Batcher_0
[2023-08-07 22:54:59,562][00422] Heartbeat connected on InferenceWorker_p0-w0
[2023-08-07 22:54:59,571][00422] Heartbeat connected on RolloutWorker_w0
[2023-08-07 22:54:59,574][00422] Heartbeat connected on RolloutWorker_w1
[2023-08-07 22:54:59,578][00422] Heartbeat connected on RolloutWorker_w2
[2023-08-07 22:54:59,585][00422] Heartbeat connected on RolloutWorker_w4
[2023-08-07 22:54:59,586][00422] Heartbeat connected on RolloutWorker_w3
[2023-08-07 22:54:59,591][00422] Heartbeat connected on RolloutWorker_w5
[2023-08-07 22:54:59,593][00422] Heartbeat connected on RolloutWorker_w6
[2023-08-07 22:54:59,598][00422] Heartbeat connected on RolloutWorker_w7
[2023-08-07 22:55:05,518][09958] Using optimizer <class 'torch.optim.adam.Adam'>
[2023-08-07 22:55:05,519][09958] No checkpoints found
[2023-08-07 22:55:05,519][09958] Did not load from checkpoint, starting from scratch!
[2023-08-07 22:55:05,520][09958] Initialized policy 0 weights for model version 0
[2023-08-07 22:55:05,524][09958] LearnerWorker_p0 finished initialization!
[2023-08-07 22:55:05,525][00422] Heartbeat connected on LearnerWorker_p0
[2023-08-07 22:55:05,525][09958] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-08-07 22:55:05,734][09971] RunningMeanStd input shape: (3, 72, 128)
[2023-08-07 22:55:05,735][09971] RunningMeanStd input shape: (1,)
[2023-08-07 22:55:05,747][09971] ConvEncoder: input_channels=3
[2023-08-07 22:55:05,849][09971] Conv encoder output size: 512
[2023-08-07 22:55:05,849][09971] Policy head output size: 512
[2023-08-07 22:55:06,042][00422] Inference worker 0-0 is ready!
[2023-08-07 22:55:06,044][00422] All inference workers are ready! Signal rollout workers to start!
[2023-08-07 22:55:06,309][09972] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:06,312][09975] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:06,333][09978] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:06,339][09974] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:06,347][09973] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:06,351][09976] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:06,379][09977] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:06,488][09979] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-08-07 22:55:07,406][09974] Decorrelating experience for 0 frames...
[2023-08-07 22:55:07,407][09973] Decorrelating experience for 0 frames...
[2023-08-07 22:55:07,712][09976] Decorrelating experience for 0 frames...
[2023-08-07 22:55:07,716][09972] Decorrelating experience for 0 frames...
[2023-08-07 22:55:07,718][09975] Decorrelating experience for 0 frames...
[2023-08-07 22:55:07,898][09977] Decorrelating experience for 0 frames...
[2023-08-07 22:55:08,478][09974] Decorrelating experience for 32 frames...
[2023-08-07 22:55:08,798][09977] Decorrelating experience for 32 frames...
[2023-08-07 22:55:08,864][09976] Decorrelating experience for 32 frames...
[2023-08-07 22:55:08,871][09972] Decorrelating experience for 32 frames...
[2023-08-07 22:55:08,988][09978] Decorrelating experience for 0 frames...
[2023-08-07 22:55:09,463][00422] Fps is (10 sec: nan, 60 sec: nan, 300 sec: nan). Total num frames: 0. Throughput: 0: nan. Samples: 0. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0)
[2023-08-07 22:55:09,668][09973] Decorrelating experience for 32 frames...
[2023-08-07 22:55:10,148][09974] Decorrelating experience for 64 frames...
[2023-08-07 22:55:10,223][09975] Decorrelating experience for 32 frames...
[2023-08-07 22:55:10,301][09978] Decorrelating experience for 32 frames...
[2023-08-07 22:55:10,525][09977] Decorrelating experience for 64 frames...
[2023-08-07 22:55:10,708][09972] Decorrelating experience for 64 frames...
[2023-08-07 22:55:11,075][09979] Decorrelating experience for 0 frames...
[2023-08-07 22:55:11,376][09976] Decorrelating experience for 64 frames...
[2023-08-07 22:55:11,507][09973] Decorrelating experience for 64 frames...
[2023-08-07 22:55:11,722][09978] Decorrelating experience for 64 frames...
[2023-08-07 22:55:12,145][09977] Decorrelating experience for 96 frames...
[2023-08-07 22:55:12,353][09972] Decorrelating experience for 96 frames...
[2023-08-07 22:55:12,464][09976] Decorrelating experience for 96 frames...
[2023-08-07 22:55:12,645][09979] Decorrelating experience for 32 frames...
[2023-08-07 22:55:12,654][09974] Decorrelating experience for 96 frames...
[2023-08-07 22:55:13,028][09973] Decorrelating experience for 96 frames...
[2023-08-07 22:55:13,389][09975] Decorrelating experience for 64 frames...
[2023-08-07 22:55:14,463][00422] Fps is (10 sec: 0.0, 60 sec: 0.0, 300 sec: 0.0). Total num frames: 0. Throughput: 0: 0.0. Samples: 0. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0)
[2023-08-07 22:55:14,961][09978] Decorrelating experience for 96 frames...
[2023-08-07 22:55:16,574][09979] Decorrelating experience for 64 frames...
[2023-08-07 22:55:17,172][09975] Decorrelating experience for 96 frames...
[2023-08-07 22:55:19,464][00422] Fps is (10 sec: 0.0, 60 sec: 0.0, 300 sec: 0.0). Total num frames: 0. Throughput: 0: 182.4. Samples: 1824. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0)
[2023-08-07 22:55:19,578][09958] Signal inference workers to stop experience collection...
[2023-08-07 22:55:19,600][09971] InferenceWorker_p0-w0: stopping experience collection
[2023-08-07 22:55:20,081][09979] Decorrelating experience for 96 frames...
[2023-08-07 22:55:24,044][09958] Signal inference workers to resume experience collection...
[2023-08-07 22:55:24,045][09971] InferenceWorker_p0-w0: resuming experience collection
[2023-08-07 22:55:24,463][00422] Fps is (10 sec: 409.6, 60 sec: 273.1, 300 sec: 273.1). Total num frames: 4096. Throughput: 0: 154.8. Samples: 2322. Policy #0 lag: (min: 0.0, avg: 0.0, max: 0.0)
[2023-08-07 22:55:29,463][00422] Fps is (10 sec: 2048.0, 60 sec: 1024.0, 300 sec: 1024.0). Total num frames: 20480. Throughput: 0: 190.0. Samples: 3800. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 22:55:29,470][00422] Avg episode reward: [(0, '0.950')]
[2023-08-07 22:55:34,464][00422] Fps is (10 sec: 3276.5, 60 sec: 1474.5, 300 sec: 1474.5). Total num frames: 36864. Throughput: 0: 386.2. Samples: 9656. Policy #0 lag: (min: 0.0, avg: 0.3, max: 1.0)
[2023-08-07 22:55:34,473][00422] Avg episode reward: [(0, '0.913')]
[2023-08-07 22:55:34,539][09971] Updated weights for policy 0, policy_version 10 (0.0031)
[2023-08-07 22:55:39,463][00422] Fps is (10 sec: 2867.2, 60 sec: 1638.4, 300 sec: 1638.4). Total num frames: 49152. Throughput: 0: 445.6. Samples: 13368. Policy #0 lag: (min: 0.0, avg: 0.3, max: 1.0)
[2023-08-07 22:55:39,469][00422] Avg episode reward: [(0, '0.256')]
[2023-08-07 22:55:44,463][00422] Fps is (10 sec: 2867.4, 60 sec: 1872.5, 300 sec: 1872.5). Total num frames: 65536. Throughput: 0: 437.3. Samples: 15306. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 22:55:44,466][00422] Avg episode reward: [(0, '-0.044')]
[2023-08-07 22:55:47,612][09971] Updated weights for policy 0, policy_version 20 (0.0022)
[2023-08-07 22:55:49,463][00422] Fps is (10 sec: 3686.4, 60 sec: 2150.4, 300 sec: 2150.4). Total num frames: 86016. Throughput: 0: 540.0. Samples: 21602. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:55:49,465][00422] Avg episode reward: [(0, '-0.031')]
[2023-08-07 22:55:54,463][00422] Fps is (10 sec: 4096.0, 60 sec: 2366.6, 300 sec: 2366.6). Total num frames: 106496. Throughput: 0: 605.9. Samples: 27264. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:55:54,469][00422] Avg episode reward: [(0, '0.014')]
[2023-08-07 22:55:54,476][09958] Saving new best policy, reward=0.014!
[2023-08-07 22:55:59,463][00422] Fps is (10 sec: 3276.8, 60 sec: 2375.7, 300 sec: 2375.7). Total num frames: 118784. Throughput: 0: 647.5. Samples: 29136. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:55:59,470][00422] Avg episode reward: [(0, '-0.030')]
[2023-08-07 22:56:00,502][09971] Updated weights for policy 0, policy_version 30 (0.0021)
[2023-08-07 22:56:04,463][00422] Fps is (10 sec: 2867.2, 60 sec: 2457.6, 300 sec: 2457.6). Total num frames: 135168. Throughput: 0: 703.1. Samples: 33462. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:56:04,466][00422] Avg episode reward: [(0, '-0.059')]
[2023-08-07 22:56:09,463][00422] Fps is (10 sec: 4096.0, 60 sec: 2662.4, 300 sec: 2662.4). Total num frames: 159744. Throughput: 0: 842.9. Samples: 40254. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:56:09,467][00422] Avg episode reward: [(0, '-0.070')]
[2023-08-07 22:56:10,282][09971] Updated weights for policy 0, policy_version 40 (0.0026)
[2023-08-07 22:56:14,463][00422] Fps is (10 sec: 4096.0, 60 sec: 2935.5, 300 sec: 2709.7). Total num frames: 176128. Throughput: 0: 883.3. Samples: 43550. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:56:14,465][00422] Avg episode reward: [(0, '-0.062')]
[2023-08-07 22:56:19,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3140.3, 300 sec: 2691.7). Total num frames: 188416. Throughput: 0: 843.2. Samples: 47598. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:56:19,473][00422] Avg episode reward: [(0, '-0.075')]
[2023-08-07 22:56:23,454][09971] Updated weights for policy 0, policy_version 50 (0.0023)
[2023-08-07 22:56:24,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3413.3, 300 sec: 2785.3). Total num frames: 208896. Throughput: 0: 871.0. Samples: 52562. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:56:24,472][00422] Avg episode reward: [(0, '-0.069')]
[2023-08-07 22:56:29,469][00422] Fps is (10 sec: 4093.7, 60 sec: 3481.3, 300 sec: 2867.0). Total num frames: 229376. Throughput: 0: 900.6. Samples: 55840. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:56:29,472][00422] Avg episode reward: [(0, '-0.081')]
[2023-08-07 22:56:33,027][09971] Updated weights for policy 0, policy_version 60 (0.0015)
[2023-08-07 22:56:34,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 2891.3). Total num frames: 245760. Throughput: 0: 896.6. Samples: 61950. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:56:34,470][00422] Avg episode reward: [(0, '-0.061')]
[2023-08-07 22:56:34,481][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000060_245760.pth...
[2023-08-07 22:56:39,467][00422] Fps is (10 sec: 3277.5, 60 sec: 3549.7, 300 sec: 2912.6). Total num frames: 262144. Throughput: 0: 857.4. Samples: 65852. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:56:39,469][00422] Avg episode reward: [(0, '-0.051')]
[2023-08-07 22:56:44,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 2931.9). Total num frames: 278528. Throughput: 0: 858.9. Samples: 67788. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:56:44,465][00422] Avg episode reward: [(0, '-0.050')]
[2023-08-07 22:56:46,122][09971] Updated weights for policy 0, policy_version 70 (0.0021)
[2023-08-07 22:56:49,463][00422] Fps is (10 sec: 3687.7, 60 sec: 3549.9, 300 sec: 2990.1). Total num frames: 299008. Throughput: 0: 907.2. Samples: 74284. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 22:56:49,466][00422] Avg episode reward: [(0, '-0.050')]
[2023-08-07 22:56:54,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3042.7). Total num frames: 319488. Throughput: 0: 888.0. Samples: 80214. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:56:54,466][00422] Avg episode reward: [(0, '-0.050')]
[2023-08-07 22:56:57,393][09971] Updated weights for policy 0, policy_version 80 (0.0019)
[2023-08-07 22:56:59,469][00422] Fps is (10 sec: 3275.0, 60 sec: 3549.5, 300 sec: 3016.0). Total num frames: 331776. Throughput: 0: 859.1. Samples: 82212. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 22:56:59,471][00422] Avg episode reward: [(0, '-0.050')]
[2023-08-07 22:57:04,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3549.9, 300 sec: 3027.5). Total num frames: 348160. Throughput: 0: 865.2. Samples: 86532. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 22:57:04,467][00422] Avg episode reward: [(0, '-0.051')]
[2023-08-07 22:57:08,733][09971] Updated weights for policy 0, policy_version 90 (0.0033)
[2023-08-07 22:57:09,463][00422] Fps is (10 sec: 3688.4, 60 sec: 3481.6, 300 sec: 3072.0). Total num frames: 368640. Throughput: 0: 902.0. Samples: 93154. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 22:57:09,466][00422] Avg episode reward: [(0, '-0.072')]
[2023-08-07 22:57:14,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3113.0). Total num frames: 389120. Throughput: 0: 902.9. Samples: 96466. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 22:57:14,469][00422] Avg episode reward: [(0, '-0.061')]
[2023-08-07 22:57:19,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3087.8). Total num frames: 401408. Throughput: 0: 856.9. Samples: 100512. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 22:57:19,470][00422] Avg episode reward: [(0, '-0.072')]
[2023-08-07 22:57:21,649][09971] Updated weights for policy 0, policy_version 100 (0.0038)
[2023-08-07 22:57:24,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3094.8). Total num frames: 417792. Throughput: 0: 877.0. Samples: 105312. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 22:57:24,469][00422] Avg episode reward: [(0, '-0.107')]
[2023-08-07 22:57:29,473][00422] Fps is (10 sec: 3682.9, 60 sec: 3481.4, 300 sec: 3130.3). Total num frames: 438272. Throughput: 0: 905.6. Samples: 108550. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 22:57:29,479][00422] Avg episode reward: [(0, '-0.119')]
[2023-08-07 22:57:33,652][09971] Updated weights for policy 0, policy_version 110 (0.0018)
[2023-08-07 22:57:34,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3413.3, 300 sec: 3107.3). Total num frames: 450560. Throughput: 0: 858.5. Samples: 112918. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 22:57:34,465][00422] Avg episode reward: [(0, '-0.131')]
[2023-08-07 22:57:39,464][00422] Fps is (10 sec: 2459.8, 60 sec: 3345.2, 300 sec: 3085.6). Total num frames: 462848. Throughput: 0: 799.5. Samples: 116190. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:57:39,466][00422] Avg episode reward: [(0, '-0.131')]
[2023-08-07 22:57:44,464][00422] Fps is (10 sec: 2457.4, 60 sec: 3276.8, 300 sec: 3065.4). Total num frames: 475136. Throughput: 0: 791.7. Samples: 117834. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:57:44,471][00422] Avg episode reward: [(0, '-0.131')]
[2023-08-07 22:57:48,630][09971] Updated weights for policy 0, policy_version 120 (0.0016)
[2023-08-07 22:57:49,463][00422] Fps is (10 sec: 3277.0, 60 sec: 3276.8, 300 sec: 3097.6). Total num frames: 495616. Throughput: 0: 809.7. Samples: 122968. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0)
[2023-08-07 22:57:49,468][00422] Avg episode reward: [(0, '-0.154')]
[2023-08-07 22:57:54,463][00422] Fps is (10 sec: 4096.3, 60 sec: 3276.8, 300 sec: 3127.9). Total num frames: 516096. Throughput: 0: 810.5. Samples: 129626. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0)
[2023-08-07 22:57:54,469][00422] Avg episode reward: [(0, '-0.154')]
[2023-08-07 22:57:59,356][09971] Updated weights for policy 0, policy_version 130 (0.0029)
[2023-08-07 22:57:59,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3345.4, 300 sec: 3132.2). Total num frames: 532480. Throughput: 0: 794.8. Samples: 132230. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:57:59,471][00422] Avg episode reward: [(0, '-0.154')]
[2023-08-07 22:58:04,464][00422] Fps is (10 sec: 2866.9, 60 sec: 3276.7, 300 sec: 3112.9). Total num frames: 544768. Throughput: 0: 791.2. Samples: 136118. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:58:04,471][00422] Avg episode reward: [(0, '-0.165')]
[2023-08-07 22:58:09,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3276.8, 300 sec: 3140.3). Total num frames: 565248. Throughput: 0: 809.0. Samples: 141716. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 22:58:09,465][00422] Avg episode reward: [(0, '-0.165')]
[2023-08-07 22:58:11,044][09971] Updated weights for policy 0, policy_version 140 (0.0034)
[2023-08-07 22:58:14,463][00422] Fps is (10 sec: 4096.5, 60 sec: 3276.8, 300 sec: 3166.1). Total num frames: 585728. Throughput: 0: 810.5. Samples: 145014. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:58:14,465][00422] Avg episode reward: [(0, '-0.166')]
[2023-08-07 22:58:19,466][00422] Fps is (10 sec: 3685.5, 60 sec: 3344.9, 300 sec: 3169.0). Total num frames: 602112. Throughput: 0: 837.8. Samples: 150620. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:58:19,468][00422] Avg episode reward: [(0, '-0.166')]
[2023-08-07 22:58:23,469][09971] Updated weights for policy 0, policy_version 150 (0.0029)
[2023-08-07 22:58:24,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3276.8, 300 sec: 3150.8). Total num frames: 614400. Throughput: 0: 854.1. Samples: 154626. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 22:58:24,466][00422] Avg episode reward: [(0, '-0.166')]
[2023-08-07 22:58:29,463][00422] Fps is (10 sec: 3277.5, 60 sec: 3277.3, 300 sec: 3174.4). Total num frames: 634880. Throughput: 0: 875.9. Samples: 157248. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 22:58:29,466][00422] Avg episode reward: [(0, '-0.166')]
[2023-08-07 22:58:33,627][09971] Updated weights for policy 0, policy_version 160 (0.0018)
[2023-08-07 22:58:34,464][00422] Fps is (10 sec: 4095.7, 60 sec: 3413.3, 300 sec: 3196.9). Total num frames: 655360. Throughput: 0: 907.1. Samples: 163786. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:58:34,470][00422] Avg episode reward: [(0, '-0.176')]
[2023-08-07 22:58:34,521][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000161_659456.pth...
[2023-08-07 22:58:39,463][00422] Fps is (10 sec: 3686.5, 60 sec: 3481.6, 300 sec: 3198.8). Total num frames: 671744. Throughput: 0: 858.3. Samples: 168248. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 22:58:39,468][00422] Avg episode reward: [(0, '-0.176')]
[2023-08-07 22:58:44,464][00422] Fps is (10 sec: 2867.3, 60 sec: 3481.6, 300 sec: 3181.5). Total num frames: 684032. Throughput: 0: 842.8. Samples: 170156. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 22:58:44,470][00422] Avg episode reward: [(0, '-0.187')]
[2023-08-07 22:58:47,713][09971] Updated weights for policy 0, policy_version 170 (0.0022)
[2023-08-07 22:58:49,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3413.3, 300 sec: 3183.7). Total num frames: 700416. Throughput: 0: 867.3. Samples: 175144. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:58:49,466][00422] Avg episode reward: [(0, '-0.187')]
[2023-08-07 22:58:54,463][00422] Fps is (10 sec: 4096.2, 60 sec: 3481.6, 300 sec: 3222.2). Total num frames: 724992. Throughput: 0: 892.8. Samples: 181894. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:58:54,465][00422] Avg episode reward: [(0, '-0.187')]
[2023-08-07 22:58:57,366][09971] Updated weights for policy 0, policy_version 180 (0.0030)
[2023-08-07 22:58:59,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3481.6, 300 sec: 3223.4). Total num frames: 741376. Throughput: 0: 881.6. Samples: 184684. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 22:58:59,467][00422] Avg episode reward: [(0, '-0.187')]
[2023-08-07 22:59:04,466][00422] Fps is (10 sec: 2866.3, 60 sec: 3481.5, 300 sec: 3207.0). Total num frames: 753664. Throughput: 0: 841.8. Samples: 188500. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 22:59:04,475][00422] Avg episode reward: [(0, '-0.187')]
[2023-08-07 22:59:09,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3225.6). Total num frames: 774144. Throughput: 0: 873.5. Samples: 193932. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:59:09,466][00422] Avg episode reward: [(0, '-0.187')]
[2023-08-07 22:59:10,251][09971] Updated weights for policy 0, policy_version 190 (0.0017)
[2023-08-07 22:59:14,463][00422] Fps is (10 sec: 4097.3, 60 sec: 3481.6, 300 sec: 3243.4). Total num frames: 794624. Throughput: 0: 888.0. Samples: 197206. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 22:59:14,466][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 22:59:19,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.7, 300 sec: 3244.0). Total num frames: 811008. Throughput: 0: 866.5. Samples: 202776. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 22:59:19,466][00422] Avg episode reward: [(0, '-0.198')] |
|
[2023-08-07 22:59:21,979][09971] Updated weights for policy 0, policy_version 200 (0.0034) |
|
[2023-08-07 22:59:24,465][00422] Fps is (10 sec: 2866.7, 60 sec: 3481.5, 300 sec: 3228.6). Total num frames: 823296. Throughput: 0: 854.8. Samples: 206714. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0) |
|
[2023-08-07 22:59:24,472][00422] Avg episode reward: [(0, '-0.198')] |
|
[2023-08-07 22:59:29,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3245.3). Total num frames: 843776. Throughput: 0: 869.7. Samples: 209292. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 22:59:29,466][00422] Avg episode reward: [(0, '-0.198')] |
|
[2023-08-07 22:59:32,817][09971] Updated weights for policy 0, policy_version 210 (0.0018) |
|
[2023-08-07 22:59:34,463][00422] Fps is (10 sec: 4096.7, 60 sec: 3481.6, 300 sec: 3261.3). Total num frames: 864256. Throughput: 0: 904.6. Samples: 215852. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 22:59:34,466][00422] Avg episode reward: [(0, '-0.187')] |
|
[2023-08-07 22:59:39,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 3261.6). Total num frames: 880640. Throughput: 0: 868.5. Samples: 220978. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 22:59:39,469][00422] Avg episode reward: [(0, '-0.187')] |
|
[2023-08-07 22:59:44,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3247.0). Total num frames: 892928. Throughput: 0: 851.4. Samples: 222996. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 22:59:44,472][00422] Avg episode reward: [(0, '-0.199')] |
|
[2023-08-07 22:59:46,126][09971] Updated weights for policy 0, policy_version 220 (0.0034) |
|
[2023-08-07 22:59:49,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3262.2). Total num frames: 913408. Throughput: 0: 876.0. Samples: 227916. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0) |
|
[2023-08-07 22:59:49,465][00422] Avg episode reward: [(0, '-0.199')] |
|
[2023-08-07 22:59:54,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3481.6, 300 sec: 3276.8). Total num frames: 933888. Throughput: 0: 900.7. Samples: 234464. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 22:59:54,465][00422] Avg episode reward: [(0, '-0.199')] |
|
[2023-08-07 22:59:55,513][09971] Updated weights for policy 0, policy_version 230 (0.0013) |
|
[2023-08-07 22:59:59,463][00422] Fps is (10 sec: 3686.3, 60 sec: 3481.6, 300 sec: 3276.8). Total num frames: 950272. Throughput: 0: 887.9. Samples: 237162. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 22:59:59,468][00422] Avg episode reward: [(0, '-0.199')] |
|
[2023-08-07 23:00:04,466][00422] Fps is (10 sec: 2866.5, 60 sec: 3481.6, 300 sec: 3262.9). Total num frames: 962560. Throughput: 0: 844.9. Samples: 240798. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:00:04,470][00422] Avg episode reward: [(0, '-0.199')] |
|
[2023-08-07 23:00:09,463][00422] Fps is (10 sec: 2457.7, 60 sec: 3345.1, 300 sec: 3304.6). Total num frames: 974848. Throughput: 0: 832.0. Samples: 244152. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:00:09,466][00422] Avg episode reward: [(0, '-0.199')] |
|
[2023-08-07 23:00:12,359][09971] Updated weights for policy 0, policy_version 240 (0.0022) |
|
[2023-08-07 23:00:14,463][00422] Fps is (10 sec: 2458.2, 60 sec: 3208.5, 300 sec: 3346.2). Total num frames: 987136. Throughput: 0: 819.0. Samples: 246148. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:00:14,466][00422] Avg episode reward: [(0, '-0.199')] |
|
[2023-08-07 23:00:19,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3276.8, 300 sec: 3401.8). Total num frames: 1007616. Throughput: 0: 807.9. Samples: 252206. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:00:19,468][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:00:24,423][09971] Updated weights for policy 0, policy_version 250 (0.0037)
[2023-08-07 23:00:24,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3345.2, 300 sec: 3401.8). Total num frames: 1024000. Throughput: 0: 783.4. Samples: 256230. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:00:24,469][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:00:29,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3208.5, 300 sec: 3387.9). Total num frames: 1036288. Throughput: 0: 784.5. Samples: 258298. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:00:29,468][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:00:34,464][00422] Fps is (10 sec: 3686.2, 60 sec: 3276.8, 300 sec: 3429.5). Total num frames: 1060864. Throughput: 0: 815.4. Samples: 264610. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:00:34,466][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:00:34,479][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000259_1060864.pth...
[2023-08-07 23:00:34,592][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000060_245760.pth
[2023-08-07 23:00:35,098][09971] Updated weights for policy 0, policy_version 260 (0.0018)
[2023-08-07 23:00:39,470][00422] Fps is (10 sec: 4502.5, 60 sec: 3344.7, 300 sec: 3443.3). Total num frames: 1081344. Throughput: 0: 803.3. Samples: 270618. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 23:00:39,472][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:00:44,468][00422] Fps is (10 sec: 3275.4, 60 sec: 3344.8, 300 sec: 3415.6). Total num frames: 1093632. Throughput: 0: 785.9. Samples: 272530. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:00:44,475][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:00:48,561][09971] Updated weights for policy 0, policy_version 270 (0.0041)
[2023-08-07 23:00:49,463][00422] Fps is (10 sec: 2459.3, 60 sec: 3208.5, 300 sec: 3387.9). Total num frames: 1105920. Throughput: 0: 792.7. Samples: 276468. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:00:49,471][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:00:54,463][00422] Fps is (10 sec: 3688.1, 60 sec: 3276.8, 300 sec: 3429.5). Total num frames: 1130496. Throughput: 0: 864.5. Samples: 283056. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 23:00:54,472][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:00:57,826][09971] Updated weights for policy 0, policy_version 280 (0.0019)
[2023-08-07 23:00:59,463][00422] Fps is (10 sec: 4505.6, 60 sec: 3345.1, 300 sec: 3443.4). Total num frames: 1150976. Throughput: 0: 891.6. Samples: 286270. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 23:00:59,467][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:01:04,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3276.9, 300 sec: 3387.9). Total num frames: 1159168. Throughput: 0: 850.6. Samples: 290484. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 23:01:04,469][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:01:09,463][00422] Fps is (10 sec: 2457.6, 60 sec: 3345.1, 300 sec: 3387.9). Total num frames: 1175552. Throughput: 0: 859.9. Samples: 294926. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:01:09,470][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:01:11,475][09971] Updated weights for policy 0, policy_version 290 (0.0037)
[2023-08-07 23:01:14,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3429.5). Total num frames: 1200128. Throughput: 0: 886.4. Samples: 298184. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:01:14,469][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:01:19,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3481.6, 300 sec: 3415.6). Total num frames: 1216512. Throughput: 0: 888.2. Samples: 304578. Policy #0 lag: (min: 0.0, avg: 0.3, max: 2.0)
[2023-08-07 23:01:19,473][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:23,354][09971] Updated weights for policy 0, policy_version 300 (0.0033)
[2023-08-07 23:01:24,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3413.3, 300 sec: 3387.9). Total num frames: 1228800. Throughput: 0: 836.5. Samples: 308256. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:01:24,465][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:29,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3387.9). Total num frames: 1245184. Throughput: 0: 838.1. Samples: 310240. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:01:29,468][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:34,383][09971] Updated weights for policy 0, policy_version 310 (0.0023)
[2023-08-07 23:01:34,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3481.6, 300 sec: 3415.7). Total num frames: 1269760. Throughput: 0: 888.7. Samples: 316460. Policy #0 lag: (min: 0.0, avg: 0.3, max: 2.0)
[2023-08-07 23:01:34,468][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:39,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3413.7, 300 sec: 3415.6). Total num frames: 1286144. Throughput: 0: 876.2. Samples: 322484. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 23:01:39,466][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:44,463][00422] Fps is (10 sec: 3276.7, 60 sec: 3481.9, 300 sec: 3401.8). Total num frames: 1302528. Throughput: 0: 850.1. Samples: 324524. Policy #0 lag: (min: 0.0, avg: 0.3, max: 2.0)
[2023-08-07 23:01:44,468][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:47,629][09971] Updated weights for policy 0, policy_version 320 (0.0031)
[2023-08-07 23:01:49,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3387.9). Total num frames: 1318912. Throughput: 0: 848.3. Samples: 328658. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 23:01:49,466][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:54,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 3415.7). Total num frames: 1339392. Throughput: 0: 897.5. Samples: 335312. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:01:54,468][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:01:57,060][09971] Updated weights for policy 0, policy_version 330 (0.0012)
[2023-08-07 23:01:59,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3413.3, 300 sec: 3415.6). Total num frames: 1355776. Throughput: 0: 896.7. Samples: 338534. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:01:59,466][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:02:04,463][00422] Fps is (10 sec: 3276.9, 60 sec: 3549.9, 300 sec: 3401.8). Total num frames: 1372160. Throughput: 0: 847.1. Samples: 342696. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:02:04,467][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:02:09,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3374.0). Total num frames: 1384448. Throughput: 0: 865.3. Samples: 347196. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:02:09,471][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:02:10,525][09971] Updated weights for policy 0, policy_version 340 (0.0046)
[2023-08-07 23:02:14,465][00422] Fps is (10 sec: 3685.9, 60 sec: 3481.5, 300 sec: 3415.6). Total num frames: 1409024. Throughput: 0: 893.6. Samples: 350452. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:02:14,467][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:02:19,465][00422] Fps is (10 sec: 4095.4, 60 sec: 3481.5, 300 sec: 3415.6). Total num frames: 1425408. Throughput: 0: 896.1. Samples: 356784. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:02:19,470][00422] Avg episode reward: [(0, '-0.186')]
[2023-08-07 23:02:21,552][09971] Updated weights for policy 0, policy_version 350 (0.0016)
[2023-08-07 23:02:24,463][00422] Fps is (10 sec: 2867.6, 60 sec: 3481.6, 300 sec: 3388.0). Total num frames: 1437696. Throughput: 0: 843.1. Samples: 360422. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:02:24,465][00422] Avg episode reward: [(0, '-0.175')]
[2023-08-07 23:02:29,463][00422] Fps is (10 sec: 2867.6, 60 sec: 3481.6, 300 sec: 3401.8). Total num frames: 1454080. Throughput: 0: 838.4. Samples: 362250. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:02:29,466][00422] Avg episode reward: [(0, '-0.175')]
[2023-08-07 23:02:34,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3345.1, 300 sec: 3415.7). Total num frames: 1470464. Throughput: 0: 874.0. Samples: 367986. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:02:34,469][00422] Avg episode reward: [(0, '-0.175')]
[2023-08-07 23:02:34,491][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000359_1470464.pth...
[2023-08-07 23:02:34,664][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000161_659456.pth
[2023-08-07 23:02:35,116][09971] Updated weights for policy 0, policy_version 360 (0.0021)
[2023-08-07 23:02:39,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3276.8, 300 sec: 3415.7). Total num frames: 1482752. Throughput: 0: 802.1. Samples: 371406. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 23:02:39,466][00422] Avg episode reward: [(0, '-0.175')]
[2023-08-07 23:02:44,463][00422] Fps is (10 sec: 2048.0, 60 sec: 3140.3, 300 sec: 3374.0). Total num frames: 1490944. Throughput: 0: 764.8. Samples: 372950. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:02:44,468][00422] Avg episode reward: [(0, '-0.175')]
[2023-08-07 23:02:49,463][00422] Fps is (10 sec: 2457.7, 60 sec: 3140.3, 300 sec: 3360.1). Total num frames: 1507328. Throughput: 0: 746.9. Samples: 376308. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 23:02:49,469][00422] Avg episode reward: [(0, '-0.175')]
[2023-08-07 23:02:51,787][09971] Updated weights for policy 0, policy_version 370 (0.0030)
[2023-08-07 23:02:54,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3072.0, 300 sec: 3360.1). Total num frames: 1523712. Throughput: 0: 769.2. Samples: 381810. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:02:54,471][00422] Avg episode reward: [(0, '-0.165')]
[2023-08-07 23:02:59,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3140.3, 300 sec: 3387.9). Total num frames: 1544192. Throughput: 0: 767.7. Samples: 384998. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:02:59,466][00422] Avg episode reward: [(0, '-0.165')]
[2023-08-07 23:03:02,097][09971] Updated weights for policy 0, policy_version 380 (0.0042)
[2023-08-07 23:03:04,464][00422] Fps is (10 sec: 3686.2, 60 sec: 3140.2, 300 sec: 3374.0). Total num frames: 1560576. Throughput: 0: 747.1. Samples: 390402. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:03:04,466][00422] Avg episode reward: [(0, '-0.165')]
[2023-08-07 23:03:09,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3140.3, 300 sec: 3346.2). Total num frames: 1572864. Throughput: 0: 751.4. Samples: 394234. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 23:03:09,468][00422] Avg episode reward: [(0, '-0.154')]
[2023-08-07 23:03:14,463][00422] Fps is (10 sec: 3277.0, 60 sec: 3072.1, 300 sec: 3360.1). Total num frames: 1593344. Throughput: 0: 770.2. Samples: 396908. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:03:14,470][00422] Avg episode reward: [(0, '-0.154')]
[2023-08-07 23:03:14,829][09971] Updated weights for policy 0, policy_version 390 (0.0026)
[2023-08-07 23:03:19,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3140.3, 300 sec: 3387.9). Total num frames: 1613824. Throughput: 0: 785.5. Samples: 403334. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 23:03:19,470][00422] Avg episode reward: [(0, '-0.166')]
[2023-08-07 23:03:24,464][00422] Fps is (10 sec: 3686.3, 60 sec: 3208.5, 300 sec: 3374.0). Total num frames: 1630208. Throughput: 0: 816.0. Samples: 408128. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 23:03:24,472][00422] Avg episode reward: [(0, '-0.178')]
[2023-08-07 23:03:27,183][09971] Updated weights for policy 0, policy_version 400 (0.0047)
[2023-08-07 23:03:29,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3140.3, 300 sec: 3346.2). Total num frames: 1642496. Throughput: 0: 825.1. Samples: 410080. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:03:29,467][00422] Avg episode reward: [(0, '-0.178')]
[2023-08-07 23:03:34,463][00422] Fps is (10 sec: 2867.3, 60 sec: 3140.3, 300 sec: 3346.2). Total num frames: 1658880. Throughput: 0: 856.5. Samples: 414852. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:03:34,466][00422] Avg episode reward: [(0, '-0.188')]
[2023-08-07 23:03:38,634][09971] Updated weights for policy 0, policy_version 410 (0.0030)
[2023-08-07 23:03:39,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3276.8, 300 sec: 3374.0). Total num frames: 1679360. Throughput: 0: 868.9. Samples: 420910. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:03:39,474][00422] Avg episode reward: [(0, '-0.188')]
[2023-08-07 23:03:44,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3413.3, 300 sec: 3374.0). Total num frames: 1695744. Throughput: 0: 852.1. Samples: 423342. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:03:44,469][00422] Avg episode reward: [(0, '-0.188')]
[2023-08-07 23:03:49,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3345.1, 300 sec: 3332.3). Total num frames: 1708032. Throughput: 0: 817.7. Samples: 427200. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:03:49,468][00422] Avg episode reward: [(0, '-0.188')]
[2023-08-07 23:03:52,128][09971] Updated weights for policy 0, policy_version 420 (0.0041)
[2023-08-07 23:03:54,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3413.3, 300 sec: 3346.2). Total num frames: 1728512. Throughput: 0: 854.4. Samples: 432682. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:03:54,466][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:03:59,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3413.3, 300 sec: 3374.0). Total num frames: 1748992. Throughput: 0: 868.5. Samples: 435992. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:03:59,471][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:04:02,483][09971] Updated weights for policy 0, policy_version 430 (0.0023)
[2023-08-07 23:04:04,467][00422] Fps is (10 sec: 3685.1, 60 sec: 3413.2, 300 sec: 3360.1). Total num frames: 1765376. Throughput: 0: 843.0. Samples: 441272. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:04:04,469][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:04:09,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3413.3, 300 sec: 3332.3). Total num frames: 1777664. Throughput: 0: 822.2. Samples: 445126. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:04:09,470][00422] Avg episode reward: [(0, '-0.199')]
[2023-08-07 23:04:14,463][00422] Fps is (10 sec: 3278.0, 60 sec: 3413.3, 300 sec: 3346.2). Total num frames: 1798144. Throughput: 0: 839.9. Samples: 447876. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0)
[2023-08-07 23:04:14,466][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:04:15,025][09971] Updated weights for policy 0, policy_version 440 (0.0028)
[2023-08-07 23:04:19,463][00422] Fps is (10 sec: 4096.1, 60 sec: 3413.3, 300 sec: 3374.0). Total num frames: 1818624. Throughput: 0: 881.6. Samples: 454522. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:04:19,465][00422] Avg episode reward: [(0, '-0.210')]
[2023-08-07 23:04:24,469][00422] Fps is (10 sec: 3684.4, 60 sec: 3413.0, 300 sec: 3360.0). Total num frames: 1835008. Throughput: 0: 860.3. Samples: 459628. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:04:24,471][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:04:26,729][09971] Updated weights for policy 0, policy_version 450 (0.0027)
[2023-08-07 23:04:29,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3413.3, 300 sec: 3332.3). Total num frames: 1847296. Throughput: 0: 850.0. Samples: 461590. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:04:29,467][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:04:34,463][00422] Fps is (10 sec: 3278.6, 60 sec: 3481.6, 300 sec: 3346.2). Total num frames: 1867776. Throughput: 0: 882.6. Samples: 466918. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:04:34,470][00422] Avg episode reward: [(0, '-0.186')]
[2023-08-07 23:04:34,533][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000457_1871872.pth...
[2023-08-07 23:04:34,640][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000259_1060864.pth
[2023-08-07 23:04:37,283][09971] Updated weights for policy 0, policy_version 460 (0.0026)
[2023-08-07 23:04:39,463][00422] Fps is (10 sec: 4505.6, 60 sec: 3549.9, 300 sec: 3387.9). Total num frames: 1892352. Throughput: 0: 910.0. Samples: 473632. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:04:39,465][00422] Avg episode reward: [(0, '-0.186')]
[2023-08-07 23:04:44,466][00422] Fps is (10 sec: 3685.5, 60 sec: 3481.5, 300 sec: 3360.1). Total num frames: 1904640. Throughput: 0: 891.2. Samples: 476100. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:04:44,473][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:04:49,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3549.9, 300 sec: 3346.2). Total num frames: 1921024. Throughput: 0: 863.6. Samples: 480130. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:04:49,466][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:04:50,512][09971] Updated weights for policy 0, policy_version 470 (0.0018)
[2023-08-07 23:04:54,463][00422] Fps is (10 sec: 3687.3, 60 sec: 3549.9, 300 sec: 3360.1). Total num frames: 1941504. Throughput: 0: 905.8. Samples: 485886. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:04:54,466][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:04:59,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3387.9). Total num frames: 1961984. Throughput: 0: 918.7. Samples: 489218. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:04:59,470][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:04:59,728][09971] Updated weights for policy 0, policy_version 480 (0.0013)
[2023-08-07 23:05:04,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3550.1, 300 sec: 3401.8). Total num frames: 1978368. Throughput: 0: 889.8. Samples: 494562. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:05:04,468][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:09,467][00422] Fps is (10 sec: 2456.8, 60 sec: 3481.4, 300 sec: 3387.8). Total num frames: 1986560. Throughput: 0: 843.1. Samples: 497564. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0)
[2023-08-07 23:05:09,468][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:14,467][00422] Fps is (10 sec: 1637.8, 60 sec: 3276.6, 300 sec: 3346.2). Total num frames: 1994752. Throughput: 0: 829.5. Samples: 498922. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0)
[2023-08-07 23:05:14,470][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:18,195][09971] Updated weights for policy 0, policy_version 490 (0.0047)
[2023-08-07 23:05:19,463][00422] Fps is (10 sec: 2458.5, 60 sec: 3208.5, 300 sec: 3346.2). Total num frames: 2011136. Throughput: 0: 786.9. Samples: 502328. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:05:19,466][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:24,463][00422] Fps is (10 sec: 3687.7, 60 sec: 3277.1, 300 sec: 3374.0). Total num frames: 2031616. Throughput: 0: 772.0. Samples: 508370. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:05:24,470][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:29,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3276.8, 300 sec: 3332.3). Total num frames: 2043904. Throughput: 0: 759.4. Samples: 510270. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:05:29,473][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:30,636][09971] Updated weights for policy 0, policy_version 500 (0.0025)
[2023-08-07 23:05:34,463][00422] Fps is (10 sec: 2457.6, 60 sec: 3140.3, 300 sec: 3304.6). Total num frames: 2056192. Throughput: 0: 758.2. Samples: 514250. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:05:34,465][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:39,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3072.0, 300 sec: 3332.4). Total num frames: 2076672. Throughput: 0: 764.1. Samples: 520270. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:05:39,470][00422] Avg episode reward: [(0, '-0.174')]
[2023-08-07 23:05:41,441][09971] Updated weights for policy 0, policy_version 510 (0.0036)
[2023-08-07 23:05:44,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3208.7, 300 sec: 3360.1). Total num frames: 2097152. Throughput: 0: 759.8. Samples: 523408. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:05:44,470][00422] Avg episode reward: [(0, '-0.165')]
[2023-08-07 23:05:49,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3140.3, 300 sec: 3318.5). Total num frames: 2109440. Throughput: 0: 737.7. Samples: 527760. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:05:49,468][00422] Avg episode reward: [(0, '-0.178')]
[2023-08-07 23:05:54,463][00422] Fps is (10 sec: 2457.6, 60 sec: 3003.7, 300 sec: 3290.7). Total num frames: 2121728. Throughput: 0: 744.7. Samples: 531072. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:05:54,467][00422] Avg episode reward: [(0, '-0.178')]
[2023-08-07 23:05:56,497][09971] Updated weights for policy 0, policy_version 520 (0.0043)
[2023-08-07 23:05:59,463][00422] Fps is (10 sec: 2867.2, 60 sec: 2935.5, 300 sec: 3318.5). Total num frames: 2138112. Throughput: 0: 769.1. Samples: 533528. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0)
[2023-08-07 23:05:59,466][00422] Avg episode reward: [(0, '-0.178')]
[2023-08-07 23:06:04,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3003.7, 300 sec: 3332.3). Total num frames: 2158592. Throughput: 0: 818.7. Samples: 539170. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:06:04,470][00422] Avg episode reward: [(0, '-0.167')]
[2023-08-07 23:06:09,467][09971] Updated weights for policy 0, policy_version 530 (0.0030)
[2023-08-07 23:06:09,477][00422] Fps is (10 sec: 3272.4, 60 sec: 3071.5, 300 sec: 3290.5). Total num frames: 2170880. Throughput: 0: 771.1. Samples: 543078. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:06:09,480][00422] Avg episode reward: [(0, '-0.167')]
[2023-08-07 23:06:14,464][00422] Fps is (10 sec: 2047.9, 60 sec: 3072.1, 300 sec: 3262.9). Total num frames: 2179072. Throughput: 0: 766.8. Samples: 544778. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 23:06:14,470][00422] Avg episode reward: [(0, '-0.167')]
[2023-08-07 23:06:19,463][00422] Fps is (10 sec: 2050.7, 60 sec: 3003.7, 300 sec: 3262.9). Total num frames: 2191360. Throughput: 0: 754.6. Samples: 548208. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:06:19,470][00422] Avg episode reward: [(0, '-0.156')]
[2023-08-07 23:06:23,731][09971] Updated weights for policy 0, policy_version 540 (0.0031)
[2023-08-07 23:06:24,463][00422] Fps is (10 sec: 3277.0, 60 sec: 3003.7, 300 sec: 3276.8). Total num frames: 2211840. Throughput: 0: 746.8. Samples: 553874. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:06:24,469][00422] Avg episode reward: [(0, '-0.156')]
[2023-08-07 23:06:29,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3072.0, 300 sec: 3249.0). Total num frames: 2228224. Throughput: 0: 736.0. Samples: 556530. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:06:29,467][00422] Avg episode reward: [(0, '-0.156')]
[2023-08-07 23:06:34,464][00422] Fps is (10 sec: 2867.0, 60 sec: 3072.0, 300 sec: 3235.1). Total num frames: 2240512. Throughput: 0: 726.6. Samples: 560456. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:06:34,466][00422] Avg episode reward: [(0, '-0.157')]
[2023-08-07 23:06:34,482][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000547_2240512.pth...
[2023-08-07 23:06:34,637][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000359_1470464.pth
[2023-08-07 23:06:37,211][09971] Updated weights for policy 0, policy_version 550 (0.0016)
[2023-08-07 23:06:39,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3072.0, 300 sec: 3249.0). Total num frames: 2260992. Throughput: 0: 773.3. Samples: 565872. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:06:39,465][00422] Avg episode reward: [(0, '-0.157')]
[2023-08-07 23:06:44,463][00422] Fps is (10 sec: 4096.2, 60 sec: 3072.0, 300 sec: 3262.9). Total num frames: 2281472. Throughput: 0: 792.0. Samples: 569170. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:06:44,466][00422] Avg episode reward: [(0, '-0.167')]
[2023-08-07 23:06:46,766][09971] Updated weights for policy 0, policy_version 560 (0.0029)
[2023-08-07 23:06:49,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3140.3, 300 sec: 3249.0). Total num frames: 2297856. Throughput: 0: 793.7. Samples: 574886. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:06:49,471][00422] Avg episode reward: [(0, '-0.167')]
[2023-08-07 23:06:54,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3140.3, 300 sec: 3235.1). Total num frames: 2310144. Throughput: 0: 794.3. Samples: 578810. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:06:54,473][00422] Avg episode reward: [(0, '-0.178')]
[2023-08-07 23:06:59,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3208.5, 300 sec: 3249.0). Total num frames: 2330624. Throughput: 0: 805.2. Samples: 581012. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:06:59,470][00422] Avg episode reward: [(0, '-0.178')]
[2023-08-07 23:07:00,096][09971] Updated weights for policy 0, policy_version 570 (0.0024)
[2023-08-07 23:07:04,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3208.5, 300 sec: 3276.8). Total num frames: 2351104. Throughput: 0: 875.2. Samples: 587594. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:07:04,471][00422] Avg episode reward: [(0, '-0.166')]
[2023-08-07 23:07:09,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3277.5, 300 sec: 3249.0). Total num frames: 2367488. Throughput: 0: 856.8. Samples: 592432. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:07:09,467][00422] Avg episode reward: [(0, '-0.176')]
[2023-08-07 23:07:12,316][09971] Updated weights for policy 0, policy_version 580 (0.0017)
[2023-08-07 23:07:14,464][00422] Fps is (10 sec: 2866.8, 60 sec: 3345.0, 300 sec: 3235.1). Total num frames: 2379776. Throughput: 0: 841.9. Samples: 594416. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:07:14,467][00422] Avg episode reward: [(0, '-0.176')]
[2023-08-07 23:07:19,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3262.9). Total num frames: 2400256. Throughput: 0: 858.1. Samples: 599070. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:07:19,466][00422] Avg episode reward: [(0, '-0.188')]
[2023-08-07 23:07:23,294][09971] Updated weights for policy 0, policy_version 590 (0.0016)
[2023-08-07 23:07:24,463][00422] Fps is (10 sec: 4096.5, 60 sec: 3481.6, 300 sec: 3276.8). Total num frames: 2420736. Throughput: 0: 883.5. Samples: 605628. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:07:24,470][00422] Avg episode reward: [(0, '-0.188')]
[2023-08-07 23:07:29,471][00422] Fps is (10 sec: 3683.7, 60 sec: 3481.2, 300 sec: 3276.7). Total num frames: 2437120. Throughput: 0: 874.9. Samples: 608546. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:07:29,473][00422] Avg episode reward: [(0, '-0.188')]
[2023-08-07 23:07:34,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3276.8). Total num frames: 2449408. Throughput: 0: 829.8. Samples: 612226. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-08-07 23:07:34,470][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:07:36,862][09971] Updated weights for policy 0, policy_version 600 (0.0032)
[2023-08-07 23:07:39,464][00422] Fps is (10 sec: 2459.3, 60 sec: 3345.0, 300 sec: 3290.7). Total num frames: 2461696. Throughput: 0: 836.1. Samples: 616436. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:07:39,469][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:07:44,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3276.8, 300 sec: 3290.7). Total num frames: 2478080. Throughput: 0: 831.1. Samples: 618412. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0)
[2023-08-07 23:07:44,471][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:07:49,463][00422] Fps is (10 sec: 2867.4, 60 sec: 3208.5, 300 sec: 3276.8). Total num frames: 2490368. Throughput: 0: 772.9. Samples: 622376. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:07:49,467][00422] Avg episode reward: [(0, '-0.198')]
[2023-08-07 23:07:52,319][09971] Updated weights for policy 0, policy_version 610 (0.0028)
[2023-08-07 23:07:54,463][00422] Fps is (10 sec: 2457.6, 60 sec: 3208.5, 300 sec: 3249.0). Total num frames: 2502656. Throughput: 0: 755.3. Samples: 626420. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0)
[2023-08-07 23:07:54,472][00422] Avg episode reward: [(0, '-0.187')]
[2023-08-07 23:07:59,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3140.3, 300 sec: 3249.0). Total num frames: 2519040. Throughput: 0: 758.2. Samples: 628536. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:07:59,466][00422] Avg episode reward: [(0, '-0.164')]
[2023-08-07 23:08:03,336][09971] Updated weights for policy 0, policy_version 620 (0.0043)
[2023-08-07 23:08:04,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3208.5, 300 sec: 3290.7). Total num frames: 2543616. Throughput: 0: 802.5. Samples: 635184. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:08:04,468][00422] Avg episode reward: [(0, '-0.155')]
[2023-08-07 23:08:09,464][00422] Fps is (10 sec: 4096.0, 60 sec: 3208.5, 300 sec: 3276.8). Total num frames: 2560000. Throughput: 0: 783.1. Samples: 640868. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:08:09,472][00422] Avg episode reward: [(0, '-0.155')]
[2023-08-07 23:08:14,464][00422] Fps is (10 sec: 2867.0, 60 sec: 3208.6, 300 sec: 3249.0). Total num frames: 2572288. Throughput: 0: 761.9. Samples: 642828. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:08:14,471][00422] Avg episode reward: [(0, '-0.155')]
[2023-08-07 23:08:16,595][09971] Updated weights for policy 0, policy_version 630 (0.0030)
[2023-08-07 23:08:19,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3140.3, 300 sec: 3249.0). Total num frames: 2588672. Throughput: 0: 778.8. Samples: 647272. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:08:19,465][00422] Avg episode reward: [(0, '-0.155')]
[2023-08-07 23:08:24,463][00422] Fps is (10 sec: 4096.3, 60 sec: 3208.5, 300 sec: 3290.7). Total num frames: 2613248. Throughput: 0: 832.4. Samples: 653894. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:08:24,465][00422] Avg episode reward: [(0, '-0.144')]
[2023-08-07 23:08:25,985][09971] Updated weights for policy 0, policy_version 640 (0.0024)
[2023-08-07 23:08:29,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3208.9, 300 sec: 3290.7). Total num frames: 2629632. Throughput: 0: 861.2. Samples: 657164. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:08:29,472][00422] Avg episode reward: [(0, '-0.144')]
[2023-08-07 23:08:34,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3208.5, 300 sec: 3262.9). Total num frames: 2641920. Throughput: 0: 859.8. Samples: 661068. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:08:34,472][00422] Avg episode reward: [(0, '-0.133')]
[2023-08-07 23:08:34,574][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000646_2646016.pth...
[2023-08-07 23:08:34,733][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000457_1871872.pth
[2023-08-07 23:08:39,270][09971] Updated weights for policy 0, policy_version 650 (0.0023)
[2023-08-07 23:08:39,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3345.1, 300 sec: 3276.8). Total num frames: 2662400. Throughput: 0: 878.4. Samples: 665946. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:08:39,472][00422] Avg episode reward: [(0, '-0.133')]
[2023-08-07 23:08:44,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3413.3, 300 sec: 3304.6). Total num frames: 2682880. Throughput: 0: 904.1. Samples: 669220. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:08:44,471][00422] Avg episode reward: [(0, '-0.057')]
[2023-08-07 23:08:49,464][00422] Fps is (10 sec: 3686.3, 60 sec: 3481.6, 300 sec: 3290.7). Total num frames: 2699264. Throughput: 0: 895.0. Samples: 675460. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-08-07 23:08:49,471][00422] Avg episode reward: [(0, '-0.035')]
[2023-08-07 23:08:49,662][09971] Updated weights for policy 0, policy_version 660 (0.0021)
[2023-08-07 23:08:54,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3276.8). Total num frames: 2715648. Throughput: 0: 858.2. Samples: 679486. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-08-07 23:08:54,469][00422] Avg episode reward: [(0, '-0.047')]
[2023-08-07 23:08:59,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3276.8). Total num frames: 2732032. Throughput: 0: 857.9. Samples: 681434. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-08-07 23:08:59,465][00422] Avg episode reward: [(0, '-0.046')] |
|
[2023-08-07 23:09:01,980][09971] Updated weights for policy 0, policy_version 670 (0.0025) |
|
[2023-08-07 23:09:04,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 3304.6). Total num frames: 2752512. Throughput: 0: 903.1. Samples: 687912. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:09:04,469][00422] Avg episode reward: [(0, '-0.057')] |
|
[2023-08-07 23:09:09,463][00422] Fps is (10 sec: 4096.1, 60 sec: 3549.9, 300 sec: 3304.6). Total num frames: 2772992. Throughput: 0: 890.3. Samples: 693956. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:09:09,466][00422] Avg episode reward: [(0, '-0.057')] |
|
[2023-08-07 23:09:13,537][09971] Updated weights for policy 0, policy_version 680 (0.0023) |
|
[2023-08-07 23:09:14,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3276.8). Total num frames: 2785280. Throughput: 0: 861.3. Samples: 695922. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:09:14,469][00422] Avg episode reward: [(0, '-0.057')] |
|
[2023-08-07 23:09:19,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3549.9, 300 sec: 3276.9). Total num frames: 2801664. Throughput: 0: 867.1. Samples: 700088. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:09:19,466][00422] Avg episode reward: [(0, '-0.057')] |
|
[2023-08-07 23:09:24,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3318.5). Total num frames: 2826240. Throughput: 0: 907.9. Samples: 706800. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:09:24,469][00422] Avg episode reward: [(0, '-0.068')] |
|
[2023-08-07 23:09:24,480][09971] Updated weights for policy 0, policy_version 690 (0.0025) |
|
[2023-08-07 23:09:29,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3304.6). Total num frames: 2842624. Throughput: 0: 908.7. Samples: 710112. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0) |
|
[2023-08-07 23:09:29,468][00422] Avg episode reward: [(0, '-0.080')] |
|
[2023-08-07 23:09:34,464][00422] Fps is (10 sec: 2867.1, 60 sec: 3549.8, 300 sec: 3262.9). Total num frames: 2854912. Throughput: 0: 863.1. Samples: 714300. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:09:34,471][00422] Avg episode reward: [(0, '-0.080')] |
|
[2023-08-07 23:09:37,662][09971] Updated weights for policy 0, policy_version 700 (0.0056) |
|
[2023-08-07 23:09:39,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3276.8). Total num frames: 2871296. Throughput: 0: 873.6. Samples: 718798. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:09:39,468][00422] Avg episode reward: [(0, '-0.090')] |
|
[2023-08-07 23:09:44,463][00422] Fps is (10 sec: 4096.1, 60 sec: 3549.9, 300 sec: 3304.6). Total num frames: 2895872. Throughput: 0: 903.2. Samples: 722080. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:09:44,466][00422] Avg episode reward: [(0, '-0.157')] |
|
[2023-08-07 23:09:47,247][09971] Updated weights for policy 0, policy_version 710 (0.0020) |
|
[2023-08-07 23:09:49,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3290.7). Total num frames: 2912256. Throughput: 0: 904.8. Samples: 728630. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:09:49,466][00422] Avg episode reward: [(0, '-0.179')] |
|
[2023-08-07 23:09:54,463][00422] Fps is (10 sec: 3276.7, 60 sec: 3549.9, 300 sec: 3276.8). Total num frames: 2928640. Throughput: 0: 860.3. Samples: 732668. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0) |
|
[2023-08-07 23:09:54,470][00422] Avg episode reward: [(0, '-0.178')] |
|
[2023-08-07 23:09:59,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3276.8). Total num frames: 2945024. Throughput: 0: 861.5. Samples: 734690. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:09:59,470][00422] Avg episode reward: [(0, '-0.167')] |
|
[2023-08-07 23:10:00,235][09971] Updated weights for policy 0, policy_version 720 (0.0027) |
|
[2023-08-07 23:10:04,463][00422] Fps is (10 sec: 3686.5, 60 sec: 3549.9, 300 sec: 3318.5). Total num frames: 2965504. Throughput: 0: 906.6. Samples: 740886. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0) |
|
[2023-08-07 23:10:04,466][00422] Avg episode reward: [(0, '-0.134')] |
|
[2023-08-07 23:10:09,466][00422] Fps is (10 sec: 3276.0, 60 sec: 3413.2, 300 sec: 3332.4). Total num frames: 2977792. Throughput: 0: 861.3. Samples: 745562. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0) |
|
[2023-08-07 23:10:09,468][00422] Avg episode reward: [(0, '-0.134')] |
|
[2023-08-07 23:10:13,940][09971] Updated weights for policy 0, policy_version 730 (0.0031) |
|
[2023-08-07 23:10:14,463][00422] Fps is (10 sec: 2457.6, 60 sec: 3413.3, 300 sec: 3318.5). Total num frames: 2990080. Throughput: 0: 822.5. Samples: 747124. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:10:14,466][00422] Avg episode reward: [(0, '-0.134')] |
|
[2023-08-07 23:10:19,463][00422] Fps is (10 sec: 2048.5, 60 sec: 3276.8, 300 sec: 3276.8). Total num frames: 2998272. Throughput: 0: 801.9. Samples: 750386. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:10:19,466][00422] Avg episode reward: [(0, '-0.134')] |
|
[2023-08-07 23:10:24,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3208.5, 300 sec: 3304.6). Total num frames: 3018752. Throughput: 0: 807.4. Samples: 755132. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:10:24,468][00422] Avg episode reward: [(0, '-0.134')] |
|
[2023-08-07 23:10:26,934][09971] Updated weights for policy 0, policy_version 740 (0.0033) |
|
[2023-08-07 23:10:29,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3276.8, 300 sec: 3332.3). Total num frames: 3039232. Throughput: 0: 809.2. Samples: 758496. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:10:29,470][00422] Avg episode reward: [(0, '-0.134')] |
|
[2023-08-07 23:10:34,466][00422] Fps is (10 sec: 4095.0, 60 sec: 3413.2, 300 sec: 3332.3). Total num frames: 3059712. Throughput: 0: 800.8. Samples: 764666. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:10:34,473][00422] Avg episode reward: [(0, '-0.124')] |
|
[2023-08-07 23:10:34,484][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000747_3059712.pth... |
|
[2023-08-07 23:10:34,624][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000547_2240512.pth |
|
[2023-08-07 23:10:38,962][09971] Updated weights for policy 0, policy_version 750 (0.0019) |
|
[2023-08-07 23:10:39,466][00422] Fps is (10 sec: 3276.0, 60 sec: 3344.9, 300 sec: 3304.5). Total num frames: 3072000. Throughput: 0: 798.7. Samples: 768612. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:10:39,472][00422] Avg episode reward: [(0, '-0.113')] |
|
[2023-08-07 23:10:44,463][00422] Fps is (10 sec: 2867.9, 60 sec: 3208.5, 300 sec: 3318.5). Total num frames: 3088384. Throughput: 0: 799.0. Samples: 770646. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0) |
|
[2023-08-07 23:10:44,466][00422] Avg episode reward: [(0, '-0.114')] |
|
[2023-08-07 23:10:49,463][00422] Fps is (10 sec: 3687.3, 60 sec: 3276.8, 300 sec: 3346.2). Total num frames: 3108864. Throughput: 0: 811.2. Samples: 777392. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0) |
|
[2023-08-07 23:10:49,465][00422] Avg episode reward: [(0, '-0.114')] |
|
[2023-08-07 23:10:49,498][09971] Updated weights for policy 0, policy_version 760 (0.0024) |
|
[2023-08-07 23:10:54,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3345.1, 300 sec: 3360.1). Total num frames: 3129344. Throughput: 0: 834.0. Samples: 783088. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:10:54,470][00422] Avg episode reward: [(0, '-0.114')] |
|
[2023-08-07 23:10:59,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3276.8, 300 sec: 3332.3). Total num frames: 3141632. Throughput: 0: 842.4. Samples: 785030. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0) |
|
[2023-08-07 23:10:59,472][00422] Avg episode reward: [(0, '-0.125')] |
|
[2023-08-07 23:11:02,845][09971] Updated weights for policy 0, policy_version 770 (0.0029) |
|
[2023-08-07 23:11:04,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3208.5, 300 sec: 3346.4). Total num frames: 3158016. Throughput: 0: 862.7. Samples: 789206. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:11:04,471][00422] Avg episode reward: [(0, '-0.136')] |
|
[2023-08-07 23:11:09,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3413.5, 300 sec: 3401.8). Total num frames: 3182592. Throughput: 0: 902.2. Samples: 795732. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0) |
|
[2023-08-07 23:11:09,469][00422] Avg episode reward: [(0, '-0.135')] |
|
[2023-08-07 23:11:12,417][09971] Updated weights for policy 0, policy_version 780 (0.0034) |
|
[2023-08-07 23:11:14,469][00422] Fps is (10 sec: 4093.7, 60 sec: 3481.3, 300 sec: 3415.6). Total num frames: 3198976. Throughput: 0: 899.8. Samples: 798990. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:11:14,474][00422] Avg episode reward: [(0, '-0.135')] |
|
[2023-08-07 23:11:19,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3549.9, 300 sec: 3387.9). Total num frames: 3211264. Throughput: 0: 846.3. Samples: 802748. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:11:19,471][00422] Avg episode reward: [(0, '-0.135')] |
|
[2023-08-07 23:11:24,463][00422] Fps is (10 sec: 2868.7, 60 sec: 3481.6, 300 sec: 3387.9). Total num frames: 3227648. Throughput: 0: 857.9. Samples: 807214. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:11:24,466][00422] Avg episode reward: [(0, '-0.113')] |
|
[2023-08-07 23:11:26,306][09971] Updated weights for policy 0, policy_version 790 (0.0036) |
|
[2023-08-07 23:11:29,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 3415.7). Total num frames: 3248128. Throughput: 0: 885.8. Samples: 810508. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:11:29,473][00422] Avg episode reward: [(0, '-0.113')] |
|
[2023-08-07 23:11:34,463][00422] Fps is (10 sec: 3686.5, 60 sec: 3413.5, 300 sec: 3401.8). Total num frames: 3264512. Throughput: 0: 871.9. Samples: 816628. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0) |
|
[2023-08-07 23:11:34,472][00422] Avg episode reward: [(0, '-0.123')] |
|
[2023-08-07 23:11:37,841][09971] Updated weights for policy 0, policy_version 800 (0.0032) |
|
[2023-08-07 23:11:39,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.7, 300 sec: 3387.9). Total num frames: 3280896. Throughput: 0: 835.0. Samples: 820664. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:11:39,466][00422] Avg episode reward: [(0, '-0.123')] |
|
[2023-08-07 23:11:44,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3387.9). Total num frames: 3297280. Throughput: 0: 835.3. Samples: 822618. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0) |
|
[2023-08-07 23:11:44,468][00422] Avg episode reward: [(0, '-0.123')] |
|
[2023-08-07 23:11:48,956][09971] Updated weights for policy 0, policy_version 810 (0.0026) |
|
[2023-08-07 23:11:49,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 3415.6). Total num frames: 3317760. Throughput: 0: 886.4. Samples: 829096. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:11:49,466][00422] Avg episode reward: [(0, '-0.101')] |
|
[2023-08-07 23:11:54,463][00422] Fps is (10 sec: 4095.9, 60 sec: 3481.6, 300 sec: 3415.6). Total num frames: 3338240. Throughput: 0: 871.8. Samples: 834964. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:11:54,466][00422] Avg episode reward: [(0, '-0.111')] |
|
[2023-08-07 23:11:59,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3387.9). Total num frames: 3350528. Throughput: 0: 843.2. Samples: 836928. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:11:59,467][00422] Avg episode reward: [(0, '-0.111')] |
|
[2023-08-07 23:12:02,248][09971] Updated weights for policy 0, policy_version 820 (0.0015) |
|
[2023-08-07 23:12:04,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3387.9). Total num frames: 3366912. Throughput: 0: 854.1. Samples: 841182. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:12:04,465][00422] Avg episode reward: [(0, '-0.123')] |
|
[2023-08-07 23:12:09,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3413.3, 300 sec: 3415.7). Total num frames: 3387392. Throughput: 0: 901.8. Samples: 847796. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:12:09,465][00422] Avg episode reward: [(0, '-0.145')] |
|
[2023-08-07 23:12:11,532][09971] Updated weights for policy 0, policy_version 830 (0.0018) |
|
[2023-08-07 23:12:14,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3481.9, 300 sec: 3415.6). Total num frames: 3407872. Throughput: 0: 903.2. Samples: 851154. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:12:14,469][00422] Avg episode reward: [(0, '-0.145')] |
|
[2023-08-07 23:12:19,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3387.9). Total num frames: 3420160. Throughput: 0: 859.6. Samples: 855308. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0) |
|
[2023-08-07 23:12:19,473][00422] Avg episode reward: [(0, '-0.155')] |
|
[2023-08-07 23:12:24,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3388.0). Total num frames: 3436544. Throughput: 0: 872.1. Samples: 859910. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0) |
|
[2023-08-07 23:12:24,467][00422] Avg episode reward: [(0, '-0.154')] |
|
[2023-08-07 23:12:24,705][09971] Updated weights for policy 0, policy_version 840 (0.0019) |
|
[2023-08-07 23:12:29,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3549.9, 300 sec: 3429.5). Total num frames: 3461120. Throughput: 0: 904.5. Samples: 863320. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:12:29,471][00422] Avg episode reward: [(0, '-0.154')] |
|
[2023-08-07 23:12:34,466][00422] Fps is (10 sec: 4095.0, 60 sec: 3549.7, 300 sec: 3443.4). Total num frames: 3477504. Throughput: 0: 907.5. Samples: 869936. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:12:34,467][00422] Avg episode reward: [(0, '-0.165')] |
|
[2023-08-07 23:12:34,479][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000849_3477504.pth... |
|
[2023-08-07 23:12:34,693][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000646_2646016.pth |
|
[2023-08-07 23:12:35,032][09971] Updated weights for policy 0, policy_version 850 (0.0019) |
|
[2023-08-07 23:12:39,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3429.5). Total num frames: 3489792. Throughput: 0: 849.5. Samples: 873192. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:12:39,467][00422] Avg episode reward: [(0, '-0.165')] |
|
[2023-08-07 23:12:44,463][00422] Fps is (10 sec: 2458.2, 60 sec: 3413.3, 300 sec: 3429.5). Total num frames: 3502080. Throughput: 0: 842.8. Samples: 874854. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:12:44,466][00422] Avg episode reward: [(0, '-0.176')] |
|
[2023-08-07 23:12:49,463][00422] Fps is (10 sec: 2457.6, 60 sec: 3276.8, 300 sec: 3429.5). Total num frames: 3514368. Throughput: 0: 826.5. Samples: 878376. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:12:49,471][00422] Avg episode reward: [(0, '-0.198')] |
|
[2023-08-07 23:12:50,948][09971] Updated weights for policy 0, policy_version 860 (0.0032) |
|
[2023-08-07 23:12:54,463][00422] Fps is (10 sec: 3686.3, 60 sec: 3345.1, 300 sec: 3457.3). Total num frames: 3538944. Throughput: 0: 825.0. Samples: 884920. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:12:54,471][00422] Avg episode reward: [(0, '-0.176')] |
|
[2023-08-07 23:12:59,466][00422] Fps is (10 sec: 4504.5, 60 sec: 3481.5, 300 sec: 3443.4). Total num frames: 3559424. Throughput: 0: 826.1. Samples: 888332. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:12:59,471][00422] Avg episode reward: [(0, '-0.155')] |
|
[2023-08-07 23:13:00,737][09971] Updated weights for policy 0, policy_version 870 (0.0013) |
|
[2023-08-07 23:13:04,469][00422] Fps is (10 sec: 3275.1, 60 sec: 3413.0, 300 sec: 3429.5). Total num frames: 3571712. Throughput: 0: 839.7. Samples: 893098. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:13:04,471][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:09,463][00422] Fps is (10 sec: 2867.9, 60 sec: 3345.1, 300 sec: 3443.4). Total num frames: 3588096. Throughput: 0: 831.6. Samples: 897334. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0) |
|
[2023-08-07 23:13:09,470][00422] Avg episode reward: [(0, '-0.132')] |
|
[2023-08-07 23:13:12,952][09971] Updated weights for policy 0, policy_version 880 (0.0020) |
|
[2023-08-07 23:13:14,463][00422] Fps is (10 sec: 3688.4, 60 sec: 3345.1, 300 sec: 3457.3). Total num frames: 3608576. Throughput: 0: 828.6. Samples: 900608. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:13:14,470][00422] Avg episode reward: [(0, '-0.132')] |
|
[2023-08-07 23:13:19,465][00422] Fps is (10 sec: 4095.4, 60 sec: 3481.5, 300 sec: 3443.4). Total num frames: 3629056. Throughput: 0: 831.4. Samples: 907346. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0) |
|
[2023-08-07 23:13:19,471][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:24,371][09971] Updated weights for policy 0, policy_version 890 (0.0027) |
|
[2023-08-07 23:13:24,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 3443.4). Total num frames: 3645440. Throughput: 0: 853.0. Samples: 911576. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:13:24,470][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:29,463][00422] Fps is (10 sec: 2867.6, 60 sec: 3276.8, 300 sec: 3443.4). Total num frames: 3657728. Throughput: 0: 859.9. Samples: 913548. Policy #0 lag: (min: 0.0, avg: 0.4, max: 1.0) |
|
[2023-08-07 23:13:29,466][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:34,463][00422] Fps is (10 sec: 3686.5, 60 sec: 3413.5, 300 sec: 3457.3). Total num frames: 3682304. Throughput: 0: 914.8. Samples: 919544. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:13:34,468][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:35,370][09971] Updated weights for policy 0, policy_version 900 (0.0059) |
|
[2023-08-07 23:13:39,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3481.6, 300 sec: 3443.4). Total num frames: 3698688. Throughput: 0: 912.3. Samples: 925972. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0) |
|
[2023-08-07 23:13:39,470][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:44,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3549.9, 300 sec: 3443.4). Total num frames: 3715072. Throughput: 0: 880.4. Samples: 927950. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:13:44,470][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:48,612][09971] Updated weights for policy 0, policy_version 910 (0.0023) |
|
[2023-08-07 23:13:49,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3549.9, 300 sec: 3429.5). Total num frames: 3727360. Throughput: 0: 862.1. Samples: 931890. Policy #0 lag: (min: 0.0, avg: 0.4, max: 2.0) |
|
[2023-08-07 23:13:49,472][00422] Avg episode reward: [(0, '-0.144')] |
|
[2023-08-07 23:13:54,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3549.9, 300 sec: 3457.3). Total num frames: 3751936. Throughput: 0: 913.5. Samples: 938440. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:13:54,470][00422] Avg episode reward: [(0, '-0.187')] |
|
[2023-08-07 23:13:57,945][09971] Updated weights for policy 0, policy_version 920 (0.0024) |
|
[2023-08-07 23:13:59,463][00422] Fps is (10 sec: 4505.6, 60 sec: 3550.0, 300 sec: 3457.3). Total num frames: 3772416. Throughput: 0: 913.4. Samples: 941710. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:13:59,469][00422] Avg episode reward: [(0, '-0.187')] |
|
[2023-08-07 23:14:04,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3550.2, 300 sec: 3429.5). Total num frames: 3784704. Throughput: 0: 865.8. Samples: 946306. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0) |
|
[2023-08-07 23:14:04,467][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:09,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3549.9, 300 sec: 3443.4). Total num frames: 3801088. Throughput: 0: 867.3. Samples: 950604. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:14:09,470][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:11,320][09971] Updated weights for policy 0, policy_version 930 (0.0035) |
|
[2023-08-07 23:14:14,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3549.9, 300 sec: 3457.3). Total num frames: 3821568. Throughput: 0: 896.0. Samples: 953868. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:14:14,465][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:19,464][00422] Fps is (10 sec: 4095.9, 60 sec: 3549.9, 300 sec: 3443.4). Total num frames: 3842048. Throughput: 0: 910.1. Samples: 960500. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:14:19,471][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:21,962][09971] Updated weights for policy 0, policy_version 940 (0.0020) |
|
[2023-08-07 23:14:24,463][00422] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3429.5). Total num frames: 3854336. Throughput: 0: 858.1. Samples: 964586. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:14:24,466][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:29,463][00422] Fps is (10 sec: 2867.3, 60 sec: 3549.9, 300 sec: 3443.4). Total num frames: 3870720. Throughput: 0: 856.8. Samples: 966504. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0) |
|
[2023-08-07 23:14:29,467][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:34,463][00422] Fps is (10 sec: 3276.7, 60 sec: 3413.3, 300 sec: 3443.4). Total num frames: 3887104. Throughput: 0: 879.4. Samples: 971462. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:14:34,469][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:34,490][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000949_3887104.pth... |
|
[2023-08-07 23:14:34,609][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000747_3059712.pth |
|
[2023-08-07 23:14:34,959][09971] Updated weights for policy 0, policy_version 950 (0.0034) |
|
[2023-08-07 23:14:39,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3481.6, 300 sec: 3429.5). Total num frames: 3907584. Throughput: 0: 874.8. Samples: 977804. Policy #0 lag: (min: 0.0, avg: 0.7, max: 1.0) |
|
[2023-08-07 23:14:39,466][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:44,463][00422] Fps is (10 sec: 3276.9, 60 sec: 3413.3, 300 sec: 3415.6). Total num frames: 3919872. Throughput: 0: 845.6. Samples: 979764. Policy #0 lag: (min: 0.0, avg: 0.6, max: 1.0) |
|
[2023-08-07 23:14:44,466][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:48,120][09971] Updated weights for policy 0, policy_version 960 (0.0035) |
|
[2023-08-07 23:14:49,463][00422] Fps is (10 sec: 2867.2, 60 sec: 3481.6, 300 sec: 3415.7). Total num frames: 3936256. Throughput: 0: 829.5. Samples: 983632. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:14:49,468][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:54,463][00422] Fps is (10 sec: 3686.4, 60 sec: 3413.3, 300 sec: 3429.5). Total num frames: 3956736. Throughput: 0: 877.4. Samples: 990086. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:14:54,471][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:14:57,390][09971] Updated weights for policy 0, policy_version 970 (0.0016) |
|
[2023-08-07 23:14:59,463][00422] Fps is (10 sec: 4096.0, 60 sec: 3413.3, 300 sec: 3429.5). Total num frames: 3977216. Throughput: 0: 878.8. Samples: 993412. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0) |
|
[2023-08-07 23:14:59,468][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:15:04,466][00422] Fps is (10 sec: 3276.0, 60 sec: 3413.2, 300 sec: 3429.5). Total num frames: 3989504. Throughput: 0: 831.5. Samples: 997918. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:15:04,471][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:15:09,463][00422] Fps is (10 sec: 2457.6, 60 sec: 3345.1, 300 sec: 3429.5). Total num frames: 4001792. Throughput: 0: 811.4. Samples: 1001098. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0) |
|
[2023-08-07 23:15:09,466][00422] Avg episode reward: [(0, '-0.210')] |
|
[2023-08-07 23:15:10,816][09958] Stopping Batcher_0... |
|
[2023-08-07 23:15:10,817][09958] Loop batcher_evt_loop terminating... |
|
[2023-08-07 23:15:10,818][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth... |
|
[2023-08-07 23:15:10,831][00422] Component Batcher_0 stopped! |
|
[2023-08-07 23:15:10,894][09976] Stopping RolloutWorker_w2... |
|
[2023-08-07 23:15:10,895][09976] Loop rollout_proc2_evt_loop terminating... |
|
[2023-08-07 23:15:10,897][00422] Component RolloutWorker_w2 stopped! |
|
[2023-08-07 23:15:10,924][00422] Component RolloutWorker_w1 stopped! |
|
[2023-08-07 23:15:10,927][09973] Stopping RolloutWorker_w1... |
|
[2023-08-07 23:15:10,963][09972] Stopping RolloutWorker_w0... |
|
[2023-08-07 23:15:10,963][09972] Loop rollout_proc0_evt_loop terminating... |
|
[2023-08-07 23:15:10,963][00422] Component RolloutWorker_w0 stopped! |
|
[2023-08-07 23:15:10,962][09973] Loop rollout_proc1_evt_loop terminating... |
|
[2023-08-07 23:15:10,974][09971] Weights refcount: 2 0 |
|
[2023-08-07 23:15:10,977][09975] Stopping RolloutWorker_w4... |
|
[2023-08-07 23:15:10,978][09975] Loop rollout_proc4_evt_loop terminating... |
|
[2023-08-07 23:15:10,975][00422] Component RolloutWorker_w4 stopped! |
|
[2023-08-07 23:15:10,989][00422] Component RolloutWorker_w5 stopped! |
|
[2023-08-07 23:15:10,991][09977] Stopping RolloutWorker_w5... |
|
[2023-08-07 23:15:10,992][09977] Loop rollout_proc5_evt_loop terminating... |
|
[2023-08-07 23:15:11,002][09978] Stopping RolloutWorker_w6... |
|
[2023-08-07 23:15:11,005][09978] Loop rollout_proc6_evt_loop terminating... |
|
[2023-08-07 23:15:11,002][00422] Component RolloutWorker_w6 stopped! |
|
[2023-08-07 23:15:11,011][00422] Component InferenceWorker_p0-w0 stopped! |
|
[2023-08-07 23:15:11,015][09971] Stopping InferenceWorker_p0-w0... |
|
[2023-08-07 23:15:11,016][09971] Loop inference_proc0-0_evt_loop terminating... |
|
[2023-08-07 23:15:11,067][00422] Component RolloutWorker_w3 stopped! |
|
[2023-08-07 23:15:11,069][09974] Stopping RolloutWorker_w3... |
|
[2023-08-07 23:15:11,074][00422] Component RolloutWorker_w7 stopped! |
|
[2023-08-07 23:15:11,076][09979] Stopping RolloutWorker_w7... |
|
[2023-08-07 23:15:11,070][09974] Loop rollout_proc3_evt_loop terminating... |
|
[2023-08-07 23:15:11,079][09979] Loop rollout_proc7_evt_loop terminating... |
|
[2023-08-07 23:15:11,085][09958] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000849_3477504.pth |
|
[2023-08-07 23:15:11,106][09958] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth... |
|
[2023-08-07 23:15:11,389][09958] Stopping LearnerWorker_p0... |
|
[2023-08-07 23:15:11,390][09958] Loop learner_proc0_evt_loop terminating... |
|
[2023-08-07 23:15:11,390][00422] Component LearnerWorker_p0 stopped! |
|
[2023-08-07 23:15:11,398][00422] Waiting for process learner_proc0 to stop... |
|
[2023-08-07 23:15:13,857][00422] Waiting for process inference_proc0-0 to join... |
|
[2023-08-07 23:15:13,864][00422] Waiting for process rollout_proc0 to join... |
|
[2023-08-07 23:15:16,284][00422] Waiting for process rollout_proc1 to join... |
|
[2023-08-07 23:15:16,287][00422] Waiting for process rollout_proc2 to join... |
|
[2023-08-07 23:15:16,289][00422] Waiting for process rollout_proc3 to join... |
|
[2023-08-07 23:15:16,291][00422] Waiting for process rollout_proc4 to join... |
|
[2023-08-07 23:15:16,293][00422] Waiting for process rollout_proc5 to join... |
|
[2023-08-07 23:15:16,297][00422] Waiting for process rollout_proc6 to join... |
|
[2023-08-07 23:15:16,299][00422] Waiting for process rollout_proc7 to join... |
|
[2023-08-07 23:15:16,301][00422] Batcher 0 profile tree view: |
|
batching: 29.0745, releasing_batches: 0.0248 |
|
[2023-08-07 23:15:16,302][00422] InferenceWorker_p0-w0 profile tree view: |
|
wait_policy: 0.0000 |
|
wait_policy_total: 419.5711 |
|
update_model: 8.7910 |
|
weight_update: 0.0063 |
|
one_step: 0.0190 |
|
handle_policy_step: 728.4103 |
|
deserialize: 16.8792, stack: 3.1577, obs_to_device_normalize: 115.9818, forward: 458.2508, send_messages: 30.4931 |
|
prepare_outputs: 74.8187 |
|
to_cpu: 41.8246 |
|
[2023-08-07 23:15:16,304][00422] Learner 0 profile tree view: |
|
misc: 0.0051, prepare_batch: 19.7825 |
|
train: 74.2731 |
|
epoch_init: 0.0057, minibatch_init: 0.0149, losses_postprocess: 0.5562, kl_divergence: 1.1512, after_optimizer: 3.8309 |
|
calculate_losses: 25.0094 |
|
losses_init: 0.0039, forward_head: 1.2819, bptt_initial: 16.1683, tail: 1.8119, advantages_returns: 0.2799, losses: 3.1712 |
|
bptt: 2.0099 |
|
bptt_forward_core: 1.9358 |
|
update: 42.9842 |
|
clip: 31.5160 |
|
[2023-08-07 23:15:16,307][00422] RolloutWorker_w0 profile tree view: |
|
wait_for_trajectories: 0.3667, enqueue_policy_requests: 123.0260, env_step: 912.1798, overhead: 23.9594, complete_rollouts: 7.1605 |
|
save_policy_outputs: 22.4593 |
|
split_output_tensors: 10.5630 |
|
[2023-08-07 23:15:16,309][00422] RolloutWorker_w7 profile tree view: |
|
wait_for_trajectories: 0.3031, enqueue_policy_requests: 122.6129, env_step: 908.8991, overhead: 23.9091, complete_rollouts: 7.9349 |
|
save_policy_outputs: 22.0991 |
|
split_output_tensors: 10.7511 |
|
[2023-08-07 23:15:16,310][00422] Loop Runner_EvtLoop terminating... |
|
[2023-08-07 23:15:16,312][00422] Runner profile tree view: |
|
main_loop: 1236.7185 |
|
[2023-08-07 23:15:16,317][00422] Collected {0: 4005888}, FPS: 3239.1 |
|
[2023-08-07 23:15:58,033][00422] Loading existing experiment configuration from /content/train_dir/default_experiment/config.json |
|
[2023-08-07 23:15:58,035][00422] Overriding arg 'num_workers' with value 1 passed from command line |
|
[2023-08-07 23:15:58,039][00422] Adding new argument 'no_render'=True that is not in the saved config file! |
|
[2023-08-07 23:15:58,042][00422] Adding new argument 'save_video'=True that is not in the saved config file! |
|
[2023-08-07 23:15:58,046][00422] Adding new argument 'video_frames'=1000000000.0 that is not in the saved config file! |
|
[2023-08-07 23:15:58,049][00422] Adding new argument 'video_name'=None that is not in the saved config file! |
|
[2023-08-07 23:15:58,050][00422] Adding new argument 'max_num_frames'=1000000000.0 that is not in the saved config file! |
|
[2023-08-07 23:15:58,051][00422] Adding new argument 'max_num_episodes'=10 that is not in the saved config file! |
|
[2023-08-07 23:15:58,053][00422] Adding new argument 'push_to_hub'=False that is not in the saved config file! |
|
[2023-08-07 23:15:58,054][00422] Adding new argument 'hf_repository'=None that is not in the saved config file! |
|
[2023-08-07 23:15:58,056][00422] Adding new argument 'policy_index'=0 that is not in the saved config file! |
|
[2023-08-07 23:15:58,057][00422] Adding new argument 'eval_deterministic'=False that is not in the saved config file! |
|
[2023-08-07 23:15:58,058][00422] Adding new argument 'train_script'=None that is not in the saved config file! |
|
[2023-08-07 23:15:58,060][00422] Adding new argument 'enjoy_script'=None that is not in the saved config file! |
|
[2023-08-07 23:15:58,061][00422] Using frameskip 1 and render_action_repeat=4 for evaluation |
|
[2023-08-07 23:15:58,104][00422] Doom resolution: 160x120, resize resolution: (128, 72) |
|
[2023-08-07 23:15:58,109][00422] RunningMeanStd input shape: (3, 72, 128) |
|
[2023-08-07 23:15:58,110][00422] RunningMeanStd input shape: (1,) |
|
[2023-08-07 23:15:58,133][00422] ConvEncoder: input_channels=3 |
|
[2023-08-07 23:15:58,270][00422] Conv encoder output size: 512 |
|
[2023-08-07 23:15:58,275][00422] Policy head output size: 512 |
|
[2023-08-07 23:16:01,316][00422] Loading state from checkpoint /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth... |
|
[2023-08-07 23:16:03,489][00422] Num frames 100... |
|
[2023-08-07 23:16:03,625][00422] Num frames 200... |
|
[2023-08-07 23:16:03,764][00422] Num frames 300... |
|
[2023-08-07 23:16:03,898][00422] Num frames 400... |
|
[2023-08-07 23:16:04,040][00422] Num frames 500... |
|
[2023-08-07 23:16:04,189][00422] Num frames 600... |
|
[2023-08-07 23:16:04,328][00422] Num frames 700... |
|
[2023-08-07 23:16:04,469][00422] Num frames 800... |
|
[2023-08-07 23:16:04,607][00422] Num frames 900... |
|
[2023-08-07 23:16:04,749][00422] Num frames 1000... |
|
[2023-08-07 23:16:04,887][00422] Num frames 1100... |
|
[2023-08-07 23:16:05,026][00422] Num frames 1200... |
|
[2023-08-07 23:16:05,176][00422] Num frames 1300... |
|
[2023-08-07 23:16:05,315][00422] Num frames 1400... |
|
[2023-08-07 23:16:05,446][00422] Num frames 1500... |
|
[2023-08-07 23:16:05,585][00422] Num frames 1600... |
|
[2023-08-07 23:16:05,721][00422] Num frames 1700... |
|
[2023-08-07 23:16:05,862][00422] Num frames 1800... |
|
[2023-08-07 23:16:06,005][00422] Num frames 1900... |
|
[2023-08-07 23:16:06,155][00422] Num frames 2000... |
|
[2023-08-07 23:16:06,300][00422] Num frames 2100... |
|
[2023-08-07 23:16:06,355][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:06,356][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:06,494][00422] Num frames 2200... |
|
[2023-08-07 23:16:06,627][00422] Num frames 2300... |
|
[2023-08-07 23:16:06,767][00422] Num frames 2400... |
|
[2023-08-07 23:16:06,908][00422] Num frames 2500... |
|
[2023-08-07 23:16:07,047][00422] Num frames 2600... |
|
[2023-08-07 23:16:07,193][00422] Num frames 2700... |
|
[2023-08-07 23:16:07,334][00422] Num frames 2800... |
|
[2023-08-07 23:16:07,474][00422] Num frames 2900... |
|
[2023-08-07 23:16:07,613][00422] Num frames 3000... |
|
[2023-08-07 23:16:07,758][00422] Num frames 3100... |
|
[2023-08-07 23:16:07,899][00422] Num frames 3200... |
|
[2023-08-07 23:16:08,039][00422] Num frames 3300... |
|
[2023-08-07 23:16:08,183][00422] Num frames 3400... |
|
[2023-08-07 23:16:08,325][00422] Num frames 3500... |
|
[2023-08-07 23:16:08,463][00422] Num frames 3600... |
|
[2023-08-07 23:16:08,602][00422] Num frames 3700... |
|
[2023-08-07 23:16:08,735][00422] Num frames 3800... |
|
[2023-08-07 23:16:08,875][00422] Num frames 3900... |
|
[2023-08-07 23:16:09,016][00422] Num frames 4000... |
|
[2023-08-07 23:16:09,153][00422] Num frames 4100... |
|
[2023-08-07 23:16:09,315][00422] Num frames 4200... |
|
[2023-08-07 23:16:09,368][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:09,370][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:09,511][00422] Num frames 4300... |
|
[2023-08-07 23:16:09,654][00422] Num frames 4400... |
|
[2023-08-07 23:16:09,796][00422] Num frames 4500... |
|
[2023-08-07 23:16:09,938][00422] Num frames 4600... |
|
[2023-08-07 23:16:10,082][00422] Num frames 4700... |
|
[2023-08-07 23:16:10,229][00422] Num frames 4800... |
|
[2023-08-07 23:16:10,373][00422] Num frames 4900... |
|
[2023-08-07 23:16:10,518][00422] Num frames 5000... |
|
[2023-08-07 23:16:10,657][00422] Num frames 5100... |
|
[2023-08-07 23:16:10,801][00422] Num frames 5200... |
|
[2023-08-07 23:16:10,942][00422] Num frames 5300... |
|
[2023-08-07 23:16:11,084][00422] Num frames 5400... |
|
[2023-08-07 23:16:11,235][00422] Num frames 5500... |
|
[2023-08-07 23:16:11,367][00422] Num frames 5600... |
|
[2023-08-07 23:16:11,499][00422] Num frames 5700... |
|
[2023-08-07 23:16:11,636][00422] Num frames 5800... |
|
[2023-08-07 23:16:11,774][00422] Num frames 5900... |
|
[2023-08-07 23:16:11,917][00422] Num frames 6000... |
|
[2023-08-07 23:16:12,062][00422] Num frames 6100... |
|
[2023-08-07 23:16:12,204][00422] Num frames 6200... |
|
[2023-08-07 23:16:12,353][00422] Num frames 6300... |
|
[2023-08-07 23:16:12,405][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:12,407][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:12,550][00422] Num frames 6400... |
|
[2023-08-07 23:16:12,692][00422] Num frames 6500... |
|
[2023-08-07 23:16:12,837][00422] Num frames 6600... |
|
[2023-08-07 23:16:12,976][00422] Num frames 6700... |
|
[2023-08-07 23:16:13,123][00422] Num frames 6800... |
|
[2023-08-07 23:16:13,275][00422] Num frames 6900... |
|
[2023-08-07 23:16:13,469][00422] Num frames 7000... |
|
[2023-08-07 23:16:13,673][00422] Num frames 7100... |
|
[2023-08-07 23:16:13,873][00422] Num frames 7200... |
|
[2023-08-07 23:16:14,067][00422] Num frames 7300... |
|
[2023-08-07 23:16:14,269][00422] Num frames 7400... |
|
[2023-08-07 23:16:14,474][00422] Num frames 7500... |
|
[2023-08-07 23:16:14,675][00422] Num frames 7600... |
|
[2023-08-07 23:16:14,880][00422] Num frames 7700... |
|
[2023-08-07 23:16:15,079][00422] Num frames 7800... |
|
[2023-08-07 23:16:15,280][00422] Num frames 7900... |
|
[2023-08-07 23:16:15,484][00422] Num frames 8000... |
|
[2023-08-07 23:16:15,678][00422] Num frames 8100... |
|
[2023-08-07 23:16:15,883][00422] Num frames 8200... |
|
[2023-08-07 23:16:16,085][00422] Num frames 8300... |
|
[2023-08-07 23:16:16,297][00422] Num frames 8400... |
|
[2023-08-07 23:16:16,351][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:16,353][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:16,577][00422] Num frames 8500... |
|
[2023-08-07 23:16:16,788][00422] Num frames 8600... |
|
[2023-08-07 23:16:16,993][00422] Num frames 8700... |
|
[2023-08-07 23:16:17,202][00422] Num frames 8800... |
|
[2023-08-07 23:16:17,405][00422] Num frames 8900... |
|
[2023-08-07 23:16:17,583][00422] Num frames 9000... |
|
[2023-08-07 23:16:17,722][00422] Num frames 9100... |
|
[2023-08-07 23:16:17,863][00422] Num frames 9200... |
|
[2023-08-07 23:16:18,021][00422] Num frames 9300... |
|
[2023-08-07 23:16:18,156][00422] Num frames 9400... |
|
[2023-08-07 23:16:18,296][00422] Num frames 9500... |
|
[2023-08-07 23:16:18,438][00422] Num frames 9600... |
|
[2023-08-07 23:16:18,586][00422] Num frames 9700... |
|
[2023-08-07 23:16:18,726][00422] Num frames 9800... |
|
[2023-08-07 23:16:18,878][00422] Num frames 9900... |
|
[2023-08-07 23:16:19,057][00422] Num frames 10000... |
|
[2023-08-07 23:16:19,347][00422] Num frames 10100... |
|
[2023-08-07 23:16:19,652][00422] Num frames 10200... |
|
[2023-08-07 23:16:19,845][00422] Num frames 10300... |
|
[2023-08-07 23:16:19,989][00422] Num frames 10400... |
|
[2023-08-07 23:16:20,156][00422] Num frames 10500... |
|
[2023-08-07 23:16:20,210][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:20,216][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:20,377][00422] Num frames 10600... |
|
[2023-08-07 23:16:20,553][00422] Num frames 10700... |
|
[2023-08-07 23:16:20,710][00422] Num frames 10800... |
|
[2023-08-07 23:16:20,863][00422] Num frames 10900... |
|
[2023-08-07 23:16:21,006][00422] Num frames 11000... |
|
[2023-08-07 23:16:21,155][00422] Num frames 11100... |
|
[2023-08-07 23:16:21,299][00422] Num frames 11200... |
|
[2023-08-07 23:16:21,440][00422] Num frames 11300... |
|
[2023-08-07 23:16:21,587][00422] Num frames 11400... |
|
[2023-08-07 23:16:21,734][00422] Num frames 11500... |
|
[2023-08-07 23:16:21,880][00422] Num frames 11600... |
|
[2023-08-07 23:16:22,019][00422] Num frames 11700... |
|
[2023-08-07 23:16:22,166][00422] Num frames 11800... |
|
[2023-08-07 23:16:22,304][00422] Num frames 11900... |
|
[2023-08-07 23:16:22,444][00422] Num frames 12000... |
|
[2023-08-07 23:16:22,587][00422] Num frames 12100... |
|
[2023-08-07 23:16:22,725][00422] Num frames 12200... |
|
[2023-08-07 23:16:22,863][00422] Num frames 12300... |
|
[2023-08-07 23:16:22,999][00422] Num frames 12400... |
|
[2023-08-07 23:16:23,138][00422] Num frames 12500... |
|
[2023-08-07 23:16:23,282][00422] Num frames 12600... |
|
[2023-08-07 23:16:23,334][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:23,335][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:23,478][00422] Num frames 12700... |
|
[2023-08-07 23:16:23,623][00422] Num frames 12800... |
|
[2023-08-07 23:16:23,761][00422] Num frames 12900... |
|
[2023-08-07 23:16:23,905][00422] Num frames 13000... |
|
[2023-08-07 23:16:24,047][00422] Num frames 13100... |
|
[2023-08-07 23:16:24,187][00422] Num frames 13200... |
|
[2023-08-07 23:16:24,370][00422] Num frames 13300... |
|
[2023-08-07 23:16:24,536][00422] Num frames 13400... |
|
[2023-08-07 23:16:24,678][00422] Num frames 13500... |
|
[2023-08-07 23:16:24,821][00422] Num frames 13600... |
|
[2023-08-07 23:16:24,962][00422] Num frames 13700... |
|
[2023-08-07 23:16:25,103][00422] Num frames 13800... |
|
[2023-08-07 23:16:25,252][00422] Num frames 13900... |
|
[2023-08-07 23:16:25,394][00422] Num frames 14000... |
|
[2023-08-07 23:16:25,534][00422] Num frames 14100... |
|
[2023-08-07 23:16:25,681][00422] Num frames 14200... |
|
[2023-08-07 23:16:25,827][00422] Num frames 14300... |
|
[2023-08-07 23:16:25,962][00422] Num frames 14400... |
|
[2023-08-07 23:16:26,096][00422] Num frames 14500... |
|
[2023-08-07 23:16:26,238][00422] Num frames 14600... |
|
[2023-08-07 23:16:26,384][00422] Num frames 14700... |
|
[2023-08-07 23:16:26,440][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:26,442][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:26,579][00422] Num frames 14800... |
|
[2023-08-07 23:16:26,736][00422] Num frames 14900... |
|
[2023-08-07 23:16:26,877][00422] Num frames 15000... |
|
[2023-08-07 23:16:27,015][00422] Num frames 15100... |
|
[2023-08-07 23:16:27,156][00422] Num frames 15200... |
|
[2023-08-07 23:16:27,305][00422] Num frames 15300... |
|
[2023-08-07 23:16:27,441][00422] Num frames 15400... |
|
[2023-08-07 23:16:27,608][00422] Num frames 15500... |
|
[2023-08-07 23:16:27,809][00422] Num frames 15600... |
|
[2023-08-07 23:16:28,006][00422] Num frames 15700... |
|
[2023-08-07 23:16:28,204][00422] Num frames 15800... |
|
[2023-08-07 23:16:28,400][00422] Num frames 15900... |
|
[2023-08-07 23:16:28,593][00422] Num frames 16000... |
|
[2023-08-07 23:16:28,810][00422] Num frames 16100... |
|
[2023-08-07 23:16:29,009][00422] Num frames 16200... |
|
[2023-08-07 23:16:29,225][00422] Num frames 16300... |
|
[2023-08-07 23:16:29,439][00422] Num frames 16400... |
|
[2023-08-07 23:16:29,651][00422] Num frames 16500... |
|
[2023-08-07 23:16:29,866][00422] Num frames 16600... |
|
[2023-08-07 23:16:30,070][00422] Num frames 16700... |
|
[2023-08-07 23:16:30,268][00422] Num frames 16800... |
|
[2023-08-07 23:16:30,323][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:30,326][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:30,519][00422] Num frames 16900... |
|
[2023-08-07 23:16:30,716][00422] Num frames 17000... |
|
[2023-08-07 23:16:30,917][00422] Num frames 17100... |
|
[2023-08-07 23:16:31,109][00422] Num frames 17200... |
|
[2023-08-07 23:16:31,303][00422] Num frames 17300... |
|
[2023-08-07 23:16:31,497][00422] Num frames 17400... |
|
[2023-08-07 23:16:31,688][00422] Num frames 17500... |
|
[2023-08-07 23:16:31,831][00422] Num frames 17600... |
|
[2023-08-07 23:16:31,980][00422] Num frames 17700... |
|
[2023-08-07 23:16:32,123][00422] Num frames 17800... |
|
[2023-08-07 23:16:32,280][00422] Num frames 17900... |
|
[2023-08-07 23:16:32,430][00422] Num frames 18000... |
|
[2023-08-07 23:16:32,573][00422] Num frames 18100... |
|
[2023-08-07 23:16:32,711][00422] Num frames 18200... |
|
[2023-08-07 23:16:32,850][00422] Num frames 18300... |
|
[2023-08-07 23:16:33,013][00422] Num frames 18400... |
|
[2023-08-07 23:16:33,156][00422] Num frames 18500... |
|
[2023-08-07 23:16:33,294][00422] Num frames 18600... |
|
[2023-08-07 23:16:33,430][00422] Num frames 18700... |
|
[2023-08-07 23:16:33,568][00422] Num frames 18800... |
|
[2023-08-07 23:16:33,709][00422] Num frames 18900... |
|
[2023-08-07 23:16:33,761][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:33,763][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:16:33,919][00422] Num frames 19000... |
|
[2023-08-07 23:16:34,069][00422] Num frames 19100... |
|
[2023-08-07 23:16:34,213][00422] Num frames 19200... |
|
[2023-08-07 23:16:34,352][00422] Num frames 19300... |
|
[2023-08-07 23:16:34,489][00422] Num frames 19400... |
|
[2023-08-07 23:16:34,630][00422] Num frames 19500... |
|
[2023-08-07 23:16:34,774][00422] Num frames 19600... |
|
[2023-08-07 23:16:34,918][00422] Num frames 19700... |
|
[2023-08-07 23:16:35,060][00422] Num frames 19800... |
|
[2023-08-07 23:16:35,201][00422] Num frames 19900... |
|
[2023-08-07 23:16:35,345][00422] Num frames 20000... |
|
[2023-08-07 23:16:35,474][00422] Num frames 20100... |
|
[2023-08-07 23:16:35,611][00422] Num frames 20200... |
|
[2023-08-07 23:16:35,754][00422] Num frames 20300... |
|
[2023-08-07 23:16:35,893][00422] Num frames 20400... |
|
[2023-08-07 23:16:36,035][00422] Num frames 20500... |
|
[2023-08-07 23:16:36,179][00422] Num frames 20600... |
|
[2023-08-07 23:16:36,317][00422] Num frames 20700... |
|
[2023-08-07 23:16:36,457][00422] Num frames 20800... |
|
[2023-08-07 23:16:36,601][00422] Num frames 20900... |
|
[2023-08-07 23:16:36,751][00422] Num frames 21000... |
|
[2023-08-07 23:16:36,806][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:16:36,808][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:18:50,270][00422] Replay video saved to /content/train_dir/default_experiment/replay.mp4! |
|
[2023-08-07 23:20:08,430][00422] Loading existing experiment configuration from /content/train_dir/default_experiment/config.json |
|
[2023-08-07 23:20:08,432][00422] Overriding arg 'num_workers' with value 1 passed from command line |
|
[2023-08-07 23:20:08,434][00422] Adding new argument 'no_render'=True that is not in the saved config file! |
|
[2023-08-07 23:20:08,438][00422] Adding new argument 'save_video'=True that is not in the saved config file! |
|
[2023-08-07 23:20:08,440][00422] Adding new argument 'video_frames'=1000000000.0 that is not in the saved config file! |
|
[2023-08-07 23:20:08,442][00422] Adding new argument 'video_name'=None that is not in the saved config file! |
|
[2023-08-07 23:20:08,444][00422] Adding new argument 'max_num_frames'=100000 that is not in the saved config file! |
|
[2023-08-07 23:20:08,445][00422] Adding new argument 'max_num_episodes'=10 that is not in the saved config file! |
|
[2023-08-07 23:20:08,446][00422] Adding new argument 'push_to_hub'=True that is not in the saved config file! |
|
[2023-08-07 23:20:08,448][00422] Adding new argument 'hf_repository'='rzambrano/rl_course_vizdoom_my_way_home' that is not in the saved config file! |
|
[2023-08-07 23:20:08,449][00422] Adding new argument 'policy_index'=0 that is not in the saved config file! |
|
[2023-08-07 23:20:08,450][00422] Adding new argument 'eval_deterministic'=False that is not in the saved config file! |
|
[2023-08-07 23:20:08,451][00422] Adding new argument 'train_script'=None that is not in the saved config file! |
|
[2023-08-07 23:20:08,453][00422] Adding new argument 'enjoy_script'=None that is not in the saved config file! |
|
[2023-08-07 23:20:08,454][00422] Using frameskip 1 and render_action_repeat=4 for evaluation |
|
[2023-08-07 23:20:08,497][00422] RunningMeanStd input shape: (3, 72, 128) |
|
[2023-08-07 23:20:08,499][00422] RunningMeanStd input shape: (1,) |
|
[2023-08-07 23:20:08,513][00422] ConvEncoder: input_channels=3 |
|
[2023-08-07 23:20:08,554][00422] Conv encoder output size: 512 |
|
[2023-08-07 23:20:08,557][00422] Policy head output size: 512 |
|
[2023-08-07 23:20:08,578][00422] Loading state from checkpoint /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth... |
|
[2023-08-07 23:20:09,323][00422] Num frames 100... |
|
[2023-08-07 23:20:09,574][00422] Num frames 200... |
|
[2023-08-07 23:20:09,800][00422] Num frames 300... |
|
[2023-08-07 23:20:09,954][00422] Num frames 400... |
|
[2023-08-07 23:20:10,116][00422] Num frames 500... |
|
[2023-08-07 23:20:10,267][00422] Num frames 600... |
|
[2023-08-07 23:20:10,419][00422] Num frames 700... |
|
[2023-08-07 23:20:10,567][00422] Num frames 800... |
|
[2023-08-07 23:20:10,717][00422] Num frames 900... |
|
[2023-08-07 23:20:10,870][00422] Num frames 1000... |
|
[2023-08-07 23:20:11,032][00422] Num frames 1100... |
|
[2023-08-07 23:20:11,188][00422] Num frames 1200... |
|
[2023-08-07 23:20:11,342][00422] Num frames 1300... |
|
[2023-08-07 23:20:11,493][00422] Num frames 1400... |
|
[2023-08-07 23:20:11,636][00422] Num frames 1500... |
|
[2023-08-07 23:20:11,790][00422] Num frames 1600... |
|
[2023-08-07 23:20:11,943][00422] Num frames 1700... |
|
[2023-08-07 23:20:12,099][00422] Num frames 1800... |
|
[2023-08-07 23:20:12,264][00422] Num frames 1900... |
|
[2023-08-07 23:20:12,411][00422] Num frames 2000... |
|
[2023-08-07 23:20:12,564][00422] Num frames 2100... |
|
[2023-08-07 23:20:12,616][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:12,618][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:12,774][00422] Num frames 2200... |
|
[2023-08-07 23:20:12,927][00422] Num frames 2300... |
|
[2023-08-07 23:20:13,086][00422] Num frames 2400... |
|
[2023-08-07 23:20:13,243][00422] Num frames 2500... |
|
[2023-08-07 23:20:13,392][00422] Num frames 2600... |
|
[2023-08-07 23:20:13,536][00422] Num frames 2700... |
|
[2023-08-07 23:20:13,689][00422] Num frames 2800... |
|
[2023-08-07 23:20:13,842][00422] Num frames 2900... |
|
[2023-08-07 23:20:13,999][00422] Num frames 3000... |
|
[2023-08-07 23:20:14,160][00422] Num frames 3100... |
|
[2023-08-07 23:20:14,316][00422] Num frames 3200... |
|
[2023-08-07 23:20:14,460][00422] Num frames 3300... |
|
[2023-08-07 23:20:14,608][00422] Num frames 3400... |
|
[2023-08-07 23:20:14,761][00422] Num frames 3500... |
|
[2023-08-07 23:20:14,912][00422] Num frames 3600... |
|
[2023-08-07 23:20:15,065][00422] Num frames 3700... |
|
[2023-08-07 23:20:15,237][00422] Num frames 3800... |
|
[2023-08-07 23:20:15,389][00422] Num frames 3900... |
|
[2023-08-07 23:20:15,539][00422] Num frames 4000... |
|
[2023-08-07 23:20:15,696][00422] Num frames 4100... |
|
[2023-08-07 23:20:15,854][00422] Num frames 4200... |
|
[2023-08-07 23:20:15,906][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:15,909][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:16,068][00422] Num frames 4300... |
|
[2023-08-07 23:20:16,232][00422] Num frames 4400... |
|
[2023-08-07 23:20:16,383][00422] Num frames 4500... |
|
[2023-08-07 23:20:16,533][00422] Num frames 4600... |
|
[2023-08-07 23:20:16,691][00422] Num frames 4700... |
|
[2023-08-07 23:20:16,857][00422] Num frames 4800... |
|
[2023-08-07 23:20:17,073][00422] Num frames 4900... |
|
[2023-08-07 23:20:17,297][00422] Num frames 5000... |
|
[2023-08-07 23:20:17,504][00422] Num frames 5100... |
|
[2023-08-07 23:20:17,712][00422] Num frames 5200... |
|
[2023-08-07 23:20:17,929][00422] Num frames 5300... |
|
[2023-08-07 23:20:18,144][00422] Num frames 5400... |
|
[2023-08-07 23:20:18,363][00422] Num frames 5500... |
|
[2023-08-07 23:20:18,574][00422] Num frames 5600... |
|
[2023-08-07 23:20:18,807][00422] Num frames 5700... |
|
[2023-08-07 23:20:19,020][00422] Num frames 5800... |
|
[2023-08-07 23:20:19,236][00422] Num frames 5900... |
|
[2023-08-07 23:20:19,453][00422] Num frames 6000... |
|
[2023-08-07 23:20:19,663][00422] Num frames 6100... |
|
[2023-08-07 23:20:19,874][00422] Num frames 6200... |
|
[2023-08-07 23:20:20,092][00422] Num frames 6300... |
|
[2023-08-07 23:20:20,147][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:20,149][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:20,377][00422] Num frames 6400... |
|
[2023-08-07 23:20:20,593][00422] Num frames 6500... |
|
[2023-08-07 23:20:20,807][00422] Num frames 6600... |
|
[2023-08-07 23:20:21,020][00422] Num frames 6700... |
|
[2023-08-07 23:20:21,232][00422] Num frames 6800... |
|
[2023-08-07 23:20:21,435][00422] Num frames 6900... |
|
[2023-08-07 23:20:21,585][00422] Num frames 7000... |
|
[2023-08-07 23:20:21,732][00422] Num frames 7100... |
|
[2023-08-07 23:20:21,883][00422] Num frames 7200... |
|
[2023-08-07 23:20:22,030][00422] Num frames 7300... |
|
[2023-08-07 23:20:22,180][00422] Num frames 7400... |
|
[2023-08-07 23:20:22,332][00422] Num frames 7500... |
|
[2023-08-07 23:20:22,490][00422] Num frames 7600... |
|
[2023-08-07 23:20:22,640][00422] Num frames 7700... |
|
[2023-08-07 23:20:22,793][00422] Num frames 7800... |
|
[2023-08-07 23:20:22,950][00422] Num frames 7900... |
|
[2023-08-07 23:20:23,107][00422] Num frames 8000... |
|
[2023-08-07 23:20:23,265][00422] Num frames 8100... |
|
[2023-08-07 23:20:23,433][00422] Num frames 8200... |
|
[2023-08-07 23:20:23,585][00422] Num frames 8300... |
|
[2023-08-07 23:20:23,739][00422] Num frames 8400... |
|
[2023-08-07 23:20:23,794][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:23,796][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:23,950][00422] Num frames 8500... |
|
[2023-08-07 23:20:24,098][00422] Num frames 8600... |
|
[2023-08-07 23:20:24,253][00422] Num frames 8700... |
|
[2023-08-07 23:20:24,411][00422] Num frames 8800... |
|
[2023-08-07 23:20:24,567][00422] Num frames 8900... |
|
[2023-08-07 23:20:24,716][00422] Num frames 9000... |
|
[2023-08-07 23:20:24,870][00422] Num frames 9100... |
|
[2023-08-07 23:20:25,025][00422] Num frames 9200... |
|
[2023-08-07 23:20:25,183][00422] Num frames 9300... |
|
[2023-08-07 23:20:25,346][00422] Num frames 9400... |
|
[2023-08-07 23:20:25,511][00422] Num frames 9500... |
|
[2023-08-07 23:20:25,658][00422] Num frames 9600... |
|
[2023-08-07 23:20:25,808][00422] Num frames 9700... |
|
[2023-08-07 23:20:25,968][00422] Num frames 9800... |
|
[2023-08-07 23:20:26,115][00422] Num frames 9900... |
|
[2023-08-07 23:20:26,270][00422] Num frames 10000... |
|
[2023-08-07 23:20:26,422][00422] Num frames 10100... |
|
[2023-08-07 23:20:26,579][00422] Num frames 10200... |
|
[2023-08-07 23:20:26,733][00422] Num frames 10300... |
|
[2023-08-07 23:20:26,889][00422] Num frames 10400... |
|
[2023-08-07 23:20:27,047][00422] Num frames 10500... |
|
[2023-08-07 23:20:27,100][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:27,101][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:27,260][00422] Num frames 10600... |
|
[2023-08-07 23:20:27,413][00422] Num frames 10700... |
|
[2023-08-07 23:20:27,572][00422] Num frames 10800... |
|
[2023-08-07 23:20:27,722][00422] Num frames 10900... |
|
[2023-08-07 23:20:27,872][00422] Num frames 11000... |
|
[2023-08-07 23:20:28,022][00422] Num frames 11100... |
|
[2023-08-07 23:20:28,175][00422] Num frames 11200... |
|
[2023-08-07 23:20:28,327][00422] Num frames 11300... |
|
[2023-08-07 23:20:28,481][00422] Num frames 11400... |
|
[2023-08-07 23:20:28,637][00422] Num frames 11500... |
|
[2023-08-07 23:20:28,786][00422] Num frames 11600... |
|
[2023-08-07 23:20:28,957][00422] Num frames 11700... |
|
[2023-08-07 23:20:29,104][00422] Num frames 11800... |
|
[2023-08-07 23:20:29,258][00422] Num frames 11900... |
|
[2023-08-07 23:20:29,407][00422] Num frames 12000... |
|
[2023-08-07 23:20:29,563][00422] Num frames 12100... |
|
[2023-08-07 23:20:29,718][00422] Num frames 12200... |
|
[2023-08-07 23:20:29,867][00422] Num frames 12300... |
|
[2023-08-07 23:20:30,016][00422] Num frames 12400... |
|
[2023-08-07 23:20:30,168][00422] Num frames 12500... |
|
[2023-08-07 23:20:30,325][00422] Num frames 12600... |
|
[2023-08-07 23:20:30,380][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:30,382][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:30,533][00422] Num frames 12700... |
|
[2023-08-07 23:20:30,694][00422] Num frames 12800... |
|
[2023-08-07 23:20:30,847][00422] Num frames 12900... |
|
[2023-08-07 23:20:30,993][00422] Num frames 13000... |
|
[2023-08-07 23:20:31,146][00422] Num frames 13100... |
|
[2023-08-07 23:20:31,295][00422] Num frames 13200... |
|
[2023-08-07 23:20:31,471][00422] Num frames 13300... |
|
[2023-08-07 23:20:31,689][00422] Num frames 13400... |
|
[2023-08-07 23:20:31,903][00422] Num frames 13500... |
|
[2023-08-07 23:20:32,108][00422] Num frames 13600... |
|
[2023-08-07 23:20:32,327][00422] Num frames 13700... |
|
[2023-08-07 23:20:32,533][00422] Num frames 13800... |
|
[2023-08-07 23:20:32,752][00422] Num frames 13900... |
|
[2023-08-07 23:20:32,966][00422] Num frames 14000... |
|
[2023-08-07 23:20:33,180][00422] Num frames 14100... |
|
[2023-08-07 23:20:33,391][00422] Num frames 14200... |
|
[2023-08-07 23:20:33,601][00422] Num frames 14300... |
|
[2023-08-07 23:20:33,841][00422] Num frames 14400... |
|
[2023-08-07 23:20:34,064][00422] Num frames 14500... |
|
[2023-08-07 23:20:34,298][00422] Num frames 14600... |
|
[2023-08-07 23:20:34,522][00422] Num frames 14700... |
|
[2023-08-07 23:20:34,575][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:34,578][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:34,800][00422] Num frames 14800... |
|
[2023-08-07 23:20:35,011][00422] Num frames 14900... |
|
[2023-08-07 23:20:35,224][00422] Num frames 15000... |
|
[2023-08-07 23:20:35,432][00422] Num frames 15100... |
|
[2023-08-07 23:20:35,641][00422] Num frames 15200... |
|
[2023-08-07 23:20:35,864][00422] Num frames 15300... |
|
[2023-08-07 23:20:36,067][00422] Num frames 15400... |
|
[2023-08-07 23:20:36,223][00422] Num frames 15500... |
|
[2023-08-07 23:20:36,380][00422] Num frames 15600... |
|
[2023-08-07 23:20:36,531][00422] Num frames 15700... |
|
[2023-08-07 23:20:36,681][00422] Num frames 15800... |
|
[2023-08-07 23:20:36,844][00422] Num frames 15900... |
|
[2023-08-07 23:20:36,997][00422] Num frames 16000... |
|
[2023-08-07 23:20:37,155][00422] Num frames 16100... |
|
[2023-08-07 23:20:37,305][00422] Num frames 16200... |
|
[2023-08-07 23:20:37,454][00422] Num frames 16300... |
|
[2023-08-07 23:20:37,602][00422] Num frames 16400... |
|
[2023-08-07 23:20:37,751][00422] Num frames 16500... |
|
[2023-08-07 23:20:37,906][00422] Num frames 16600... |
|
[2023-08-07 23:20:38,048][00422] Num frames 16700... |
|
[2023-08-07 23:20:38,199][00422] Num frames 16800... |
|
[2023-08-07 23:20:38,251][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:38,255][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:38,403][00422] Num frames 16900... |
|
[2023-08-07 23:20:38,547][00422] Num frames 17000... |
|
[2023-08-07 23:20:38,694][00422] Num frames 17100... |
|
[2023-08-07 23:20:38,856][00422] Num frames 17200... |
|
[2023-08-07 23:20:39,005][00422] Num frames 17300... |
|
[2023-08-07 23:20:39,170][00422] Num frames 17400... |
|
[2023-08-07 23:20:39,321][00422] Num frames 17500... |
|
[2023-08-07 23:20:39,469][00422] Num frames 17600... |
|
[2023-08-07 23:20:39,617][00422] Num frames 17700... |
|
[2023-08-07 23:20:39,766][00422] Num frames 17800... |
|
[2023-08-07 23:20:39,929][00422] Num frames 17900... |
|
[2023-08-07 23:20:40,080][00422] Num frames 18000... |
|
[2023-08-07 23:20:40,229][00422] Num frames 18100... |
|
[2023-08-07 23:20:40,384][00422] Num frames 18200... |
|
[2023-08-07 23:20:40,535][00422] Num frames 18300... |
|
[2023-08-07 23:20:40,685][00422] Num frames 18400... |
|
[2023-08-07 23:20:40,845][00422] Num frames 18500... |
|
[2023-08-07 23:20:41,002][00422] Num frames 18600... |
|
[2023-08-07 23:20:41,156][00422] Num frames 18700... |
|
[2023-08-07 23:20:41,308][00422] Num frames 18800... |
|
[2023-08-07 23:20:41,471][00422] Num frames 18900... |
|
[2023-08-07 23:20:41,524][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:41,525][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:20:41,684][00422] Num frames 19000... |
|
[2023-08-07 23:20:41,835][00422] Num frames 19100... |
|
[2023-08-07 23:20:41,993][00422] Num frames 19200... |
|
[2023-08-07 23:20:42,156][00422] Num frames 19300... |
|
[2023-08-07 23:20:42,311][00422] Num frames 19400... |
|
[2023-08-07 23:20:42,463][00422] Num frames 19500... |
|
[2023-08-07 23:20:42,615][00422] Num frames 19600... |
|
[2023-08-07 23:20:42,772][00422] Num frames 19700... |
|
[2023-08-07 23:20:42,935][00422] Num frames 19800... |
|
[2023-08-07 23:20:43,089][00422] Num frames 19900... |
|
[2023-08-07 23:20:43,250][00422] Num frames 20000... |
|
[2023-08-07 23:20:43,400][00422] Num frames 20100... |
|
[2023-08-07 23:20:43,552][00422] Num frames 20200... |
|
[2023-08-07 23:20:43,709][00422] Num frames 20300... |
|
[2023-08-07 23:20:43,864][00422] Num frames 20400... |
|
[2023-08-07 23:20:44,022][00422] Num frames 20500... |
|
[2023-08-07 23:20:44,174][00422] Num frames 20600... |
|
[2023-08-07 23:20:44,327][00422] Num frames 20700... |
|
[2023-08-07 23:20:44,485][00422] Num frames 20800... |
|
[2023-08-07 23:20:44,638][00422] Num frames 20900... |
|
[2023-08-07 23:20:44,795][00422] Num frames 21000... |
|
[2023-08-07 23:20:44,848][00422] Avg episode rewards: #0: -0.210, true rewards: #0: -0.210 |
|
[2023-08-07 23:20:44,850][00422] Avg episode reward: -0.210, avg true_objective: -0.210 |
|
[2023-08-07 23:22:52,855][00422] Replay video saved to /content/train_dir/default_experiment/replay.mp4! |
|