ppo-Swimmer-v3 / config.yml
!!python/object/apply:collections.OrderedDict
- - - batch_size
    - 256
  - - gae_lambda
    - 0.98
  - - gamma
    - 0.9999
  - - learning_rate
    - 0.0006
  - - n_envs
    - 4
  - - n_steps
    - 1024
  - - n_timesteps
    - 1000000.0
  - - normalize
    - true
  - - policy
    - MlpPolicy
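
The file stores the PPO hyperparameters as a Python OrderedDict serialized to YAML, in the style of rl-baselines3-zoo. Below is a minimal sketch of how such a file could be loaded and turned into a model, assuming PyYAML, Gym with MuJoCo (for Swimmer-v3), and stable-baselines3 are installed. The !!python/object/apply tag requires PyYAML's unsafe loader, and n_envs, normalize, and n_timesteps are treated here as environment/training-loop settings rather than PPO constructor arguments, which matches the zoo's convention; this is an illustration, not the repository author's training script.

    import yaml
    from stable_baselines3 import PPO
    from stable_baselines3.common.env_util import make_vec_env
    from stable_baselines3.common.vec_env import VecNormalize

    # The !!python/object/apply:collections.OrderedDict tag constructs a
    # Python object, so the safe/full loaders reject it; unsafe_load is
    # needed. Only do this with files you trust.
    with open("config.yml") as f:
        cfg = yaml.unsafe_load(f)

    # n_envs and normalize apply to the vectorized environment, not to PPO.
    env = make_vec_env("Swimmer-v3", n_envs=cfg.pop("n_envs"))
    if cfg.pop("normalize"):
        env = VecNormalize(env)

    # n_timesteps is a training-loop budget; the remaining keys
    # (batch_size, gae_lambda, gamma, learning_rate, n_steps) are
    # valid PPO keyword arguments.
    n_timesteps = int(cfg.pop("n_timesteps"))
    model = PPO(cfg.pop("policy"), env, **cfg)
    model.learn(total_timesteps=n_timesteps)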