ppo-mountan_car / results.json
Created and trained PPO model
8e062f7
{"mean_reward": -92.03675462990489, "std_reward": 3.968297130020583, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-06-20T17:22:40.216094"}