ppo-MountainCar-v0 / environment.yml
PPO playing MountainCar-v0 from https://github.com/sgoodfriend/rl-algo-impls/tree/983cb75e43e51cf4ef57f177194ab9a4a1a8808b
name: rl_algo_impls
channels:
- pytorch
- conda-forge
- nodefaults
dependencies:
- python>=3.8, <3.10
- mamba
- pip
- pytorch
- torchvision
- torchaudio
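The file above can be used to reproduce the environment locally. A minimal sketch, assuming `conda` (or the `mamba` drop-in listed in the dependencies) is installed and the file is saved as `environment.yml`:

```shell
# Create the environment from the spec file; the env name
# (rl_algo_impls) is taken from the "name:" field in the YAML.
conda env create -f environment.yml

# Activate it before running anything from the repo.
conda activate rl_algo_impls
```

Because `nodefaults` is listed under `channels`, packages are resolved only from `pytorch` and `conda-forge`, never from the default Anaconda channel.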