deep-rl / ppo-LunarLander-v2
first try at LunarLander-v2 with PPO
937bc8b
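
The repo name and commit message indicate a PPO agent trained on the LunarLander-v2 environment. A minimal training sketch is below, assuming the usual stable-baselines3 + Gymnasium setup for this task; the policy choice, timestep count, and save path are illustrative, not taken from this commit.

```python
# Minimal PPO training sketch for LunarLander-v2.
# Assumes: pip install stable-baselines3 "gymnasium[box2d]"
import gymnasium as gym
from stable_baselines3 import PPO

# LunarLander-v2 needs the Box2D extra of Gymnasium.
env = gym.make("LunarLander-v2")

# MlpPolicy is the standard feed-forward policy for this low-dimensional task.
model = PPO("MlpPolicy", env, verbose=1)

# Timestep budget is illustrative; longer runs generally score higher.
model.learn(total_timesteps=100_000)

# Save the trained agent under the same name as this repo.
model.save("ppo-LunarLander-v2")
```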