deep-rl/ppo-LunarLander-v2/_stable_baselines3_version
first try at LunarLander-v2 with PPO (commit 937bc8b)
7 bytes
2.0.0a5