ppo-LunarLander-v2_v3 / results.json
DBusAI · ADD PPO model for LunarLander-v2_v3 · commit c437d5e · 164 Bytes
{
  "mean_reward": 296.3646097727046,
  "std_reward": 13.541001834239337,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-05-05T15:55:14.376078"
}
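The fields above summarize a deterministic evaluation: mean and standard deviation of the episode returns over `n_eval_episodes` rollouts. A minimal stdlib-only sketch of how such a record could be produced is below; the `episode_rewards` list is a hypothetical stand-in for actual PPO rollouts in LunarLander-v2, and the population standard deviation (ddof=0) is assumed here to match the convention of `numpy.std`, which evaluation helpers such as Stable-Baselines3's `evaluate_policy` rely on.

```python
import json
import math
from datetime import datetime, timezone

def summarize_rewards(rewards, deterministic=True):
    """Build a results.json-style dict from per-episode returns."""
    n = len(rewards)
    mean = sum(rewards) / n
    # Population standard deviation (ddof=0), matching numpy.std's default.
    std = math.sqrt(sum((r - mean) ** 2 for r in rewards) / n)
    return {
        "mean_reward": mean,
        "std_reward": std,
        "is_deterministic": deterministic,
        "n_eval_episodes": n,
        "eval_datetime": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical episode returns standing in for 10 evaluation rollouts.
episode_rewards = [280.1, 301.5, 295.0, 310.2, 289.7,
                   300.3, 276.8, 305.9, 298.4, 292.6]
results = summarize_rewards(episode_rewards)
print(json.dumps(results, indent=2))
```

Writing `results` to disk with `json.dump` would yield a file in the same shape as this one.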