ppo-LunarLander-v2 / results.json
{"mean_reward": 262.240407, "std_reward": 23.470728764826504, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-10-08T17:27:26.490339"}