AI Explainer: What Are Reinforcement Learning 'Rewards'?
In a previous blog post, a glossary of artificial intelligence terms, I included a brief definition of "reinforcement learning." I expect that definition would prompt many readers to ask, "What rewards can you give a machine learning agent?" A gold star? Praise? No, the short answer is: numerical values. In reinforcement learning, a reward is simply a number the agent receives after taking an action, and these reward signals are what train the agent to make decisions that maximize its performance in a given environment.
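To make that concrete, here is a minimal sketch of numerical rewards at work. The two-armed bandit setup, its payout probabilities, and all of the names below are illustrative assumptions of mine, not something from the earlier glossary post; the point is only that the agent sees a number after each action and uses it to update its estimates of which action is best.

```python
import random

# Hypothetical example: a two-armed bandit where each "arm" pays out a
# numerical reward. The payout probabilities and names here are
# illustrative assumptions, not part of the original post.
ARM_REWARD_PROBS = [0.3, 0.7]  # chance each arm returns a reward of 1.0

def pull(arm: int) -> float:
    """Return a numerical reward: 1.0 on a payout, 0.0 otherwise."""
    return 1.0 if random.random() < ARM_REWARD_PROBS[arm] else 0.0

def run(episodes: int = 5000, epsilon: float = 0.1) -> list[float]:
    # Estimated value of each arm, learned purely from observed rewards.
    estimates = [0.0, 0.0]
    counts = [0, 0]
    for _ in range(episodes):
        # Epsilon-greedy: usually exploit the best estimate, sometimes explore.
        if random.random() < epsilon:
            arm = random.randrange(len(estimates))
        else:
            arm = max(range(len(estimates)), key=lambda a: estimates[a])
        reward = pull(arm)  # the numerical reward signal
        counts[arm] += 1
        # Incremental average: nudge the estimate toward the observed reward.
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates

if __name__ == "__main__":
    print(run())  # the estimate for arm 1 should approach 0.7
```

After enough pulls, the agent's estimate for the better arm converges toward its true payout rate, so it learns to prefer that arm. Nothing about gold stars or praise is involved; the entire learning signal is a stream of numbers.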