ppo-Swimmer-v3 / config.yml
!!python/object/apply:collections.OrderedDict
- - - env_wrapper
    - sb3_contrib.common.wrappers.TimeFeatureWrapper
  - - gamma
    - 0.9999
  - - n_timesteps
    - 1000000.0
  - - normalize
    - true
  - - policy
    - MlpPolicy
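
This is the hyperparameter file that rl-baselines3-zoo saves alongside a trained agent. Because the document is tagged !!python/object/apply:collections.OrderedDict, PyYAML's safe_load rejects it; a minimal sketch for reading it back, assuming the file is saved locally as config.yml:

    import yaml

    # The !!python/object/apply tag constructs a collections.OrderedDict,
    # so yaml.safe_load refuses it; unsafe_load is required (trusted files only).
    with open("config.yml") as f:
        hyperparams = yaml.unsafe_load(f)

    print(hyperparams["gamma"])        # 0.9999
    print(hyperparams["env_wrapper"])  # sb3_contrib.common.wrappers.TimeFeatureWrapper

For context, the settings roughly map onto the following stable-baselines3 setup. This is a hedged reconstruction, not the exact zoo training invocation: the zoo's train.py applies further defaults, and normalize: true is assumed here to mean observation/reward normalization via VecNormalize.

    import gym
    from sb3_contrib.common.wrappers import TimeFeatureWrapper
    from stable_baselines3 import PPO
    from stable_baselines3.common.vec_env import DummyVecEnv, VecNormalize

    # env_wrapper: wrap Swimmer-v3 with the time-feature wrapper from sb3_contrib.
    def make_env():
        return TimeFeatureWrapper(gym.make("Swimmer-v3"))

    # normalize: true -> VecNormalize handles observation/reward normalization.
    env = VecNormalize(DummyVecEnv([make_env]))

    # gamma and policy come from the config; other PPO settings stay at SB3 defaults.
    model = PPO("MlpPolicy", env, gamma=0.9999, verbose=1)

    # n_timesteps: 1000000.0
    model.learn(total_timesteps=1_000_000)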