Markov Decision Process

Intermediate

Formal framework for sequential decision-making under uncertainty.

Full Definition

A Markov decision process (MDP) is a formal framework for sequential decision-making under uncertainty. It is typically specified as a tuple (S, A, P, R, γ): a set of states S, a set of actions A, a transition function P(s' | s, a) giving the probability of reaching state s' after taking action a in state s, a reward function R(s, a), and a discount factor γ in [0, 1). The defining Markov property is that transitions and rewards depend only on the current state and action, not on the earlier history. The objective is to find a policy π that maps states to actions so as to maximize expected cumulative discounted reward.
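As a minimal sketch (not part of the glossary entry itself), the Python snippet below builds a tiny two-state, two-action MDP and solves it with value iteration; the state names, transition probabilities, and rewards are invented purely for illustration.

```python
import numpy as np

# Hypothetical two-state, two-action MDP used only for illustration.
# States: 0 = "cool", 1 = "overheated"; actions: 0 = "slow", 1 = "fast".
# P[a][s][s'] is the transition probability; R[a][s] is the expected reward.
P = np.array([
    [[1.0, 0.0],   # action "slow" taken in each state
     [0.5, 0.5]],
    [[0.7, 0.3],   # action "fast" taken in each state
     [0.0, 1.0]],
])
R = np.array([
    [1.0, 0.0],    # reward for "slow" in each state
    [2.0, -1.0],   # reward for "fast" in each state
])
gamma = 0.9        # discount factor

# Value iteration: repeatedly apply the Bellman optimality update
#   V(s) <- max_a [ R(s, a) + gamma * sum_s' P(s' | s, a) * V(s') ]
V = np.zeros(2)
for _ in range(1000):
    Q = R + gamma * (P @ V)      # Q[a, s] for every action/state pair
    V_new = Q.max(axis=0)        # greedy backup over actions
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)        # greedy action in each state
print("Optimal values:", V)
print("Optimal policy:", policy)
```

The greedy policy extracted at the end is optimal for this toy MDP because value iteration converges to the fixed point of the Bellman optimality operator.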

Keywords

MDP

Domains

Related Terms
