PPO-LunarLander / results.json
Added first model
c41600a
{
  "mean_reward": 270.8242764732631,
  "std_reward": 17.82188887259743,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-05-07T04:36:00.580716"
}
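A file like this is typically generated by rolling out the trained policy for `n_eval_episodes` episodes with deterministic (greedy) actions and aggregating the per-episode returns. The sketch below shows one plausible way to produce this schema; the function name and the sample reward values are illustrative assumptions, not the repository's actual evaluation script.

```python
import json
import math
from datetime import datetime, timezone

def summarize_rewards(episode_rewards):
    """Aggregate per-episode returns into the results.json schema above."""
    n = len(episode_rewards)
    mean_reward = sum(episode_rewards) / n
    # Population standard deviation, matching numpy's np.std default.
    std_reward = math.sqrt(
        sum((r - mean_reward) ** 2 for r in episode_rewards) / n
    )
    return {
        "mean_reward": mean_reward,
        "std_reward": std_reward,
        "is_deterministic": True,  # greedy actions during evaluation
        "n_eval_episodes": n,
        "eval_datetime": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical per-episode returns; in practice these come from rolling
# out the PPO policy in the LunarLander environment.
rewards = [265.1, 280.4, 250.9, 290.2, 271.5,
           260.3, 278.8, 255.6, 284.0, 269.7]
results = summarize_rewards(rewards)
print(json.dumps(results, indent=2))
```

With stable-baselines3, the per-episode returns would usually come from `evaluate_policy(model, env, n_eval_episodes=10, deterministic=True, return_episode_rewards=True)`; the summary dict is then dumped to `results.json`.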