Note:
This post represents the synthesis of the thoughts, procedures and
experiences of others as represented in the 8 articles read in advance
(see previous posts) and the discussion among the students and
instructor during the Advanced Analytic Techniques class at Mercyhurst
College in March 2013 regarding Decision Tree Analysis specifically.
This technique was evaluated based on its overall validity, simplicity,
flexibility and its ability to effectively use unstructured data.

Description:

A decision tree is a diagram of nodes and branches. The nodes indicate decision points, chance events, or branch terminals. The branches correspond to each decision alternative or event outcome connected to a node. Decision trees can also be used as a predictive tool.

Strengths:

1. Able to graphically display alternative choices for a decision-maker and the probability of certain courses of action.

2. Able to display different indicators for an indications-and-warnings list.

3. Displays the interrelationships between nodes and branches.

4. Can be useful for reducing uncertainty by framing estimates in terms of probabilistic thinking.

Weaknesses:

1. Can become very large, with many decision paths; beyond a certain size, the user becomes reliant on software to build and maintain the tree.

2. Relies on calculated probabilities and on the judgment of the individual who calculated them.

3. Lacks the ability to account for other information that may eventually become a factor in the decision; the tree only displays information known to its creator.

4. Decision trees lack the ability to take into account deceptive information and how it could affect both final outcomes and calculated probabilities within the tree.

Step by Step Action:

1. Draw a root node and extend branches for each decision alternative, including the decision to do nothing.

2. Label each branch and include the cost of each decision.

3. Draw a chance node for each decision and extend two branches from each node, labeling them success and failure.

4. Label the payoff of each success and failure by subtracting the cost from the expected revenue.

5. Label the probability of each success and failure in decimal form, then apply a formula to calculate the expected value (EV); the exact formula will depend on how the decision tree is constructed.

6. The alternative with the highest expected value is the best decision: the one that most reliably reduces uncertainty.
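The roll-back in the steps above can be sketched in a few lines of Python. All of the alternatives, payoffs, and probabilities below are hypothetical numbers invented for illustration, not figures from the class.

```python
# Minimal sketch of steps 1-6: a root decision node whose alternatives
# each lead to a chance node with success/failure branches.
# All figures are illustrative assumptions (payoffs are net of cost).

def chance_node(outcomes):
    """Expected value of a chance node: sum of payoff x probability
    over its branches."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Root decision node, including the decision to do nothing (step 1).
decision = {
    "Launch product": chance_node([(80_000, 0.6), (-20_000, 0.4)]),  # success, failure
    "License design": chance_node([(30_000, 0.9), (-5_000, 0.1)]),
    "Do nothing": 0.0,
}

# Step 6: the alternative with the highest expected value is preferred.
best = max(decision, key=decision.get)
print(best, decision[best])  # Launch product 40000.0
```

Real decision-tree software generalizes this by nesting chance and decision nodes to arbitrary depth, but the roll-back logic is the same.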

Exercise:

The class conducted a decision tree exercise on two potential products, asking which the decision-maker should choose based on profitability. The probability of success, the cost to make each product, and the potential economic gain for each product were given. A decision tree was drawn and an expected value calculated at the end, highlighting which product would be the best for the company to produce. Working through the flow of the decision tree, the class was able to calculate which product was the more reliable choice to produce using the formula: EV = (Payoff x Prob of Success) + (Cost x Prob of Failure).
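The class formula can be applied directly in code. The figures from the actual exercise were not recorded here, so the two products below use assumed numbers purely for illustration; note that cost is entered as a negative value, since the failure branch loses money.

```python
# The class formula: EV = (Payoff x Prob of Success) + (Cost x Prob of Failure).
# Product names, payoffs, costs, and probabilities are hypothetical.

def ev(payoff, p_success, cost, p_failure):
    """Expected value of one product's chance node."""
    return payoff * p_success + cost * p_failure

product_a = ev(payoff=120_000, p_success=0.5, cost=-50_000, p_failure=0.5)
product_b = ev(payoff=200_000, p_success=0.2, cost=-30_000, p_failure=0.8)

# The higher EV identifies the more reliable product to produce.
print(product_a, product_b)  # 35000.0 16000.0
```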

Further Information:

This chapter from a book on decision trees gives an excellent description of how decision trees are utilized, along with essential background information on them. A summary and critique of this chapter, written by Ethan Robinson, can be found on this blog.

http://www.public.asu.edu/~kirkwood/DAStuff/decisiontrees/DecisionTreePrimer-1.pdf

