Thursday, March 22, 2012

Summary of Findings (White Team): Decision Trees (3 out of 5 stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst University on 22 March 2012 regarding Decision Trees specifically. This technique was evaluated based on its overall validity, simplicity, flexibility, its ability to effectively use unstructured data, and its ease of communication to a decision maker.

Description:
Decision tree analysis is a flexible method that visually represents sequential decisions and quantifies their probable outcomes in order to aid decision making. It can use quantitative or qualitative data to assign probabilities and help identify the optimal decision. A decision tree divides the topic into potential outcomes that branch from variables represented as nodes, producing a tree-shaped structure. Depending on the variables involved, a decision tree can convey either estimative forecasts or more deterministic probability calculations.
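
To make the mechanics concrete, here is a minimal sketch in Python of how such a tree can be evaluated: decision nodes pick the branch with the best expected value, and chance nodes take the probability-weighted average of their branches. The node structure, probabilities, and payoffs are all hypothetical and purely illustrative.

# A minimal sketch of the idea, not any specific tool: a decision node chooses the
# branch with the best expected value, while a chance node weights its branches by
# probability. All names, probabilities, and payoffs below are hypothetical.

def expected_value(node):
    """Recursively roll a decision tree back to a single expected value."""
    kind = node["type"]
    if kind == "outcome":                      # leaf: a terminal payoff
        return node["value"]
    if kind == "chance":                       # circle node: probability-weighted average
        return sum(p * expected_value(child) for p, child in node["branches"])
    if kind == "decision":                     # square node: pick the best alternative
        return max(expected_value(child) for _, child in node["branches"])
    raise ValueError(f"unknown node type: {kind}")

# Hypothetical example: launch a product now, or wait for more market research.
tree = {
    "type": "decision",
    "branches": [
        ("launch now", {
            "type": "chance",
            "branches": [
                (0.6, {"type": "outcome", "value": 100}),   # strong demand
                (0.4, {"type": "outcome", "value": -40}),   # weak demand
            ],
        }),
        ("wait", {"type": "outcome", "value": 20}),          # safe, modest payoff
    ],
}

print(expected_value(tree))   # 0.6*100 + 0.4*(-40) = 44.0 for "launch now"; max(44, 20) = 44.0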

Strengths:
  • Provides a clear visual representation of all possible decisions
  • Allows quantifiable analysis of the possible consequences of a decision
  • Can be used for both quantitative and qualitative data
  • Can be used to determine probabilities of different options or outcomes
  • Can be easily edited if new alternatives are found

Weaknesses:
  • Can rapidly grow to represent too many possible outcomes; when 90 outcomes must share 100% of the probability, no single branch carries enough weight to support a meaningful analytic conclusion
  • Adding variables or decisions high in the tree multiplies the work required below them, demanding a great deal more time and effort (see the short calculation after this list).
  • Susceptible to psychological bias and blind spots if one course of action is favored from the start.
  • Assumes there are a finite number of variables
  • Assumes the creator of the decision tree knows all possible variables
  • Editing something at the beginning of the tree alters the outcomes of all nodes below it.
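
The combinatorial growth behind the second weakness is easy to quantify: every added variable multiplies the number of terminal outcomes by its branching factor. A back-of-the-envelope calculation in Python (with hypothetical branching factors) makes the point:

# Illustrative only: each variable multiplies the number of leaves by its branching factor.
from math import prod

branching_factors = [2, 3, 2, 4, 3]   # hypothetical variables with 2, 3, 2, 4, and 3 possible outcomes
leaves = prod(branching_factors)
print(leaves)                          # 144 terminal outcomes from just five variables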

How-To:
1. Start with a single node (your most important variable) and connect it to outcomes that are mutually exclusive and, together, cover as many of the possibilities as you can. The initial node forms the root of the tree, and all variables branch out from there.
2. Continue drawing branches (connecting lines) outward from each decision node until you reach the last possible outcome.
3. Define the probability of occurrence for each possible outcome. This number should be between 0 and 1 (or 0% and 100% if using percentages). The sum of all outcome-probabilities stemming from a single chance node should be 1 (or 100%).
  • Example: Outcome A is 20%, Outcome B is 40%, and Outcome C is 40%, totaling 100%. Probabilities that sum to 100% reflect the assumption that the listed outcomes are mutually exclusive and collectively exhaustive, not a guarantee that every possibility has been captured.
4. Circle the chance nodes: the points where the branch taken depends on probability rather than on a choice. (Decision points are conventionally drawn as squares, and the nodes at the ends of branches represent terminal outcomes.)
5. Begin removing the least likely or least desirable branches and leaves of the decision tree, according to the individual or team goals affected by the decisions under consideration. This helps identify which decisions have the most favorable potential outcomes, which is the primary point of creating the decision tree (a simple version of this pruning step is sketched below).
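
Steps 3 and 5 lend themselves to a quick automated check. The sketch below reuses the hypothetical node structure from the earlier example: it verifies that each chance node's branch probabilities sum to 1, then prunes branches that fall below a chosen likelihood cutoff and renormalizes the remainder. Both the cutoff and the structure are assumptions made for illustration.

def check_probabilities(node, tolerance=1e-9):
    """Verify that every chance node's outgoing probabilities sum to 1 (step 3)."""
    if node["type"] == "chance":
        total = sum(p for p, _ in node["branches"])
        if abs(total - 1.0) > tolerance:
            raise ValueError(f"chance node probabilities sum to {total}, not 1")
    for _, child in node.get("branches", []):
        check_probabilities(child, tolerance)

def prune(node, cutoff=0.05):
    """Drop chance branches less likely than the cutoff (step 5) and renormalize the rest."""
    if node["type"] == "chance":
        kept = [(p, child) for p, child in node["branches"] if p >= cutoff]
        total = sum(p for p, _ in kept)          # assumes at least one branch survives the cutoff
        node["branches"] = [(p / total, child) for p, child in kept]
    for _, child in node.get("branches", []):
        prune(child, cutoff)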

Personal Application of Technique:
For the activity, the class was divided into three teams, and each team charted a decision tree for the same situation. The situation was as follows:

Imagine you only ever do four things on Saturday: go shopping, watch a movie, play tennis, or just stay in. What you do depends on three things: how much money you have (rich or poor), the weather (windy, rainy, or sunny), and whether or not your parents are visiting. Draw a decision tree to represent the best choices given the variables.

Teams then charted out the effects of each of these branching variables – “If parents are visiting, what is possible?” versus “If parents are NOT visiting, what is possible?” and so on. Ultimately, the trees indicated which of the four activity options would be available given each possible combination of variables.
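
Written out as code, one plausible reading of the finished exercise looks like the sketch below. The specific assignment of activities to branches is a hypothetical reconstruction rather than the trees the teams actually drew; the point is that every combination of the three variables resolves to exactly one of the four Saturday options.

# One hypothetical reading of the exercise; the class teams' trees may have
# assigned the activities differently. Every combination of the three
# variables resolves to exactly one of the four Saturday options.

def saturday_plan(parents_visiting: bool, money: str, weather: str) -> str:
    if parents_visiting:                 # parents in town overrides everything else
        return "watch a movie"
    if weather == "sunny":
        return "play tennis"
    if weather == "windy":
        return "go shopping" if money == "rich" else "watch a movie"
    return "stay in"                     # rainy and no visitors

# Walk the full tree: 2 x 2 x 3 = 12 combinations, each mapped to one activity.
for parents in (True, False):
    for money in ("rich", "poor"):
        for weather in ("windy", "rainy", "sunny"):
            print(parents, money, weather, "->", saturday_plan(parents, money, weather))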

Rating: 3 of 5 Intelligence Stars

For Further Information:
Decision Tree (Wikipedia)
