# ppo-Walker2d-v3 / config.yml
!!python/object/apply:collections.OrderedDict
- - - env_wrapper
    - sb3_contrib.common.wrappers.TimeFeatureWrapper
  - - n_timesteps
    - 1000000.0
  - - normalize
    - true
  - - policy
    - MlpPolicy
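Because the file is tagged `!!python/object/apply:collections.OrderedDict`, PyYAML's safe loader will refuse it; it must be read with an unsafe loader that is allowed to construct Python objects. A minimal sketch of loading such a config, assuming PyYAML (>= 5.1) is installed and the file comes from a trusted source:

```python
import collections
import yaml  # PyYAML; third-party dependency

# Inline copy of the config above, for a self-contained example.
# The !!python/object/apply tag makes yaml.safe_load raise a
# ConstructorError, so UnsafeLoader is required. Only use it on
# files you trust, since it can execute arbitrary constructors.
CONFIG = """\
!!python/object/apply:collections.OrderedDict
- - - env_wrapper
    - sb3_contrib.common.wrappers.TimeFeatureWrapper
  - - n_timesteps
    - 1000000.0
  - - normalize
    - true
  - - policy
    - MlpPolicy
"""

# The tag applies OrderedDict to the list of [key, value] pairs,
# yielding an ordinary ordered mapping of hyperparameters.
hyperparams = yaml.load(CONFIG, Loader=yaml.UnsafeLoader)
print(hyperparams["policy"])       # MlpPolicy
print(hyperparams["n_timesteps"])  # 1000000.0
```

In a checkout of a trained agent, the same call on the saved `config.yml` recovers the hyperparameters used for training (here: a `TimeFeatureWrapper` around the env, 1e6 timesteps, observation/reward normalization, and an MLP policy).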