ppo-MountainCar-v0 / results.json
{"mean_reward": -200.0, "std_reward": 0.0, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-27T17:58:45.680251"}