ppo-MountainCar-v0 / huggingface_publish.py
PPO playing MountainCar-v0 from https://github.com/sgoodfriend/rl-algo-impls/tree/2067e21d62fff5db60168687e7d9e89019a8bfc0
b18ddcc
from rl_algo_impls.huggingface_publish import huggingface_publish

if __name__ == "__main__":
    huggingface_publish()
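The script above is a thin entry point that delegates all work to `rl_algo_impls.huggingface_publish`. As a rough sketch of what such a publish entry point typically does, the hypothetical `huggingface_publish_demo` below parses CLI-style arguments for the algorithm and environment and derives the Hub repo name (here matching the `ppo-MountainCar-v0` repo this file lives in); the real function's arguments and upload logic are not shown in the source and are assumptions.

```python
import argparse


def huggingface_publish_demo(argv=None):
    # Hypothetical sketch: the real huggingface_publish presumably also
    # collects model artifacts and uploads them to the Hugging Face Hub.
    # This demo only parses arguments and returns the derived repo name.
    parser = argparse.ArgumentParser(
        description="Publish a trained model to the Hugging Face Hub (sketch)"
    )
    parser.add_argument("--algo", default="ppo", help="algorithm name")
    parser.add_argument("--env", default="MountainCar-v0", help="Gym environment ID")
    args = parser.parse_args(argv)
    # Repo naming convention assumed from this repo's own name.
    return f"{args.algo}-{args.env}"


if __name__ == "__main__":
    print(huggingface_publish_demo())
```

Keeping the entry-point file this small means the same packaged function can be invoked either via `python huggingface_publish.py` or programmatically from other code.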