ppo-LunarLander-v2 / results.json
{"mean_reward": 276.6080709, "std_reward": 19.488516808454857, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-12-26T12:23:03.733682"}