ppo-LunarLander-v2 / results.json
{"mean_reward": 263.844807, "std_reward": 7.90495857316147, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-07-10T23:36:53.620631"}