Introduction To Statistical Decision Theory
Group 7
Decision theory, in statistics, is a set of quantitative methods for reaching optimal decisions. A solvable decision problem must be capable of being tightly formulated in terms of initial conditions and choices or courses of action, together with their consequences. In general, such consequences are not known with certainty but are expressed as a set of probabilistic outcomes. Each outcome is assigned a "utility" value based on the preferences of the decision maker. An optimal decision, following the logic of the theory, is one that maximizes the expected utility.
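The definition above can be sketched in a few lines of code. This is a minimal illustration with made-up probabilities and utilities (the action names and numbers are assumptions, not from the source): each action leads to a set of probabilistic outcomes, each outcome carries a utility, and the optimal decision is the action whose probability-weighted utility is largest.

```python
# Each action maps to a list of (probability, utility) pairs.
# The numbers here are illustrative only.
actions = {
    "launch_product": [(0.6, 100), (0.4, -50)],
    "wait":           [(1.0, 10)],
}

def expected_utility(outcomes):
    """Sum of probability-weighted utilities for one action."""
    return sum(p * u for p, u in outcomes)

# Optimal decision: the action that maximizes expected utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best, expected_utility(actions[best]))  # launch_product 40.0
```

Here "launch_product" wins because 0.6 × 100 + 0.4 × (−50) = 40, which exceeds the certain utility of 10 from waiting.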
Decision Making Under Certainty
Making decisions under certainty is easy: the cause and effect are known, and the risk involved is minimal. What is difficult is making decisions under risk and uncertainty, where the outcome is unpredictable because you do not have complete information about the alternatives. Before we look more deeply at decision-making under risk and uncertainty, let's examine each of these situations.