ppo-LunarLander-v2 / results.json
PPO with 3e6 iterations
81e6c97
{"mean_reward": 281.43437720458627, "std_reward": 23.892504554624935, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-29T21:31:39.071226"}