ppo-LunarLander-v2 / results.json
{"mean_reward": 291.71720960000005, "std_reward": 20.983960447857566, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-07-20T18:50:20.992874"}