Add first learning algorithm (PPO)
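Since the tag only names the algorithm, here is a minimal sketch of PPO's clipped surrogate objective, the core update the tag refers to. The function name, array shapes, and the clip range below are illustrative assumptions, not this repository's API.

```python
# Minimal sketch of the PPO clipped surrogate loss (illustrative only;
# names, shapes, and clip_eps are assumptions, not this repo's interface).
import numpy as np

def ppo_clip_loss(log_probs_new, log_probs_old, advantages, clip_eps=0.2):
    """Clipped surrogate objective (Schulman et al., 2017), negated for minimization."""
    ratio = np.exp(log_probs_new - log_probs_old)           # pi_new(a|s) / pi_old(a|s)
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    return -np.mean(np.minimum(unclipped, clipped))         # maximize objective => minimize negative

# Tiny usage example with made-up numbers.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lp_new = rng.normal(size=8)
    lp_old = lp_new + rng.normal(scale=0.1, size=8)
    adv = rng.normal(size=8)
    print("PPO clip loss:", ppo_clip_loss(lp_new, lp_old, adv))
```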