deep-rl / ppo-LunarLander-v2 / policy.optimizer.pth

Commit History

first try at LunarLander-v2 with PPO
937bc8b

EmmaRo committed on