Tuesday, April 20, 2010

The Delphi Technique in Developing International Health Policies: Experience from the SARSControl Project

Intro: The Delphi technique, according to the authors, helps structure a group communication process and is particularly useful when there is little knowledge, or much uncertainty, surrounding the complex area being investigated. This paper uses criteria defined in the literature to assess the process of using the Delphi technique to develop international emerging infectious disease policies. The evaluation was done by qualitatively describing the SARSControl Delphi and critically analyzing different aspects of each criterion.

The Delphi technique can be classified as follows:
  • The Classical Delphi: to establish facts
  • The Policy Delphi: to generate ideas
  • The Decision Delphi: to make decisions
  • The Group Delphi: for group discussions
Criteria: The Delphi technique must be applied systematically using the following 5 criteria:
  1. Panel Composition (geographic and professional representativeness, size, heterogeneity)
  2. Participant Motivation (response rate, written consent, clarity of questions, reminders)
  3. Problem Exploration
  4. Consensus Definition, e.g. as percentage of agreement /medians
  5. Format of Feedback, e.g. individual responses, measures of tendency and spread of responses
Other criteria include the number of rounds, anonymity to encourage open expression of opinions, and sufficient resources, including time and administrative services.

How Delphi was used for this project:

Panel Composition:
The Delphi process, which was carried out over a period of nine months, consisted of a pilot round, two written rounds, and a final face-to-face meeting. Thirty-eight infectious disease experts from 22 countries participated in the 1st written round and 28 experts from 19 countries in the 2nd written round; 11 newly recruited experts from 9 countries, with expertise similar to that of the written-round participants, took part in the face-to-face meeting.

Problem Exploration: Thirteen policy components considered important in terms of emerging diseases were used to formulate statements for the written Delphi rounds.
The two written rounds of the Delphi and the face-to-face meeting were found sufficient to obtain the intended outcomes in terms of generating policy options and alternatives. The 2nd Delphi round was crucial in clarifying issues from the 1st round.

Round 1: Closed-ended questionnaire with a possibility to make comments.
The written comments (qualitative data) provided by the respondents were individually analyzed. (Took 2 months to complete)

Round 2: The questionnaire for the 2nd round was based on the results and comments received in the 1st round. (Took 2 months to complete)
  • Questions which had reached consensus in the 1st round were excluded from the 2nd.
  • Unclear or confusing questions were either rephrased or replaced with new questions formulated from the panellists' comments.
  • Questions that remained unclear from the 1st round to the 2nd were dropped in the course of the written Delphi process.
Feedback for the Face-to-Face Meeting: For each statement in every round, panel members received the percentage of agreement within the panel. Summaries of the comments from the previous rounds were also provided.
  • Prior to the face-to-face Delphi meeting, its participants were informed via email about the Delphi technique and about the results of the written Delphi rounds. This was done to ensure that the participants had a common starting point as none of them had participated in the written rounds. The questionnaire responses were anonymous to other participants.

Conclusion:
The SARSControl Delphi process succeeded in its aim to generate policy options and alternatives, in spite of the discontinuity in the involvement of the experts throughout the process. It can be concluded from the results that the Delphi technique, when rigorously administered, analyzed, and reported, is a valuable method for developing international health policy recommendations for emerging infectious diseases, and that it can aid the international policy development process by creatively collecting expert opinions and suggestions.

Source:
Syed, A., Hjarnoe, L., & Aro, A. (2009). The Delphi Technique In Developing International Health Policies: Experience From The SARSControl Project. Internet Journal of Health, 8(2), 5. Retrieved from Academic Search Complete database.

The Delphi Technique

What is it?
The Delphi Technique is a method for structuring a group communication process so that the process is effective in allowing a group of individuals, as a whole, to deal with a complex problem, according to Linstone and Turoff.

Where does it come from?
Delphi was developed by the Rand Corporation in the 1950s, funded by the U.S. Air Force, to find a way to establish reliable consensus of opinion among a group of experts about how Soviet military planners might target the US industrial system in an attack and how many atomic bombs would be needed to have a specified level of impact on U.S. military capability.

What is it used for?
It is widely used for more peaceful purposes today, but with the same underlying rationale: to establish as objectively as possible a consensus on a complex problem, in circumstances where accurate information does not exist or is impossible to obtain economically, or inputs to conventional decision making for example by a committee meeting face-to-face are so subjective that they risk drowning out individuals’ critical judgments.

The typical features of a Delphi procedure are an expert panel; a series of rounds in which information is collected from panelists, analyzed and fed back to them as the basis for subsequent rounds; an opportunity for individuals to revise their judgments on the basis of this feedback; and some degree of anonymity for their individual contributions.

The UK government commonly uses Delphi to make decisions or allocate resources in the health service, a classic context in which demand for resources will always outstrip their availability. Delphi is also mentioned in business texts under decision making techniques, along with other structured approaches such as the Nominal Group Technique. They allow complex decision-making and creative problem-solving in a way which avoids the drawbacks of conventional meetings with unstructured, free-flowing interaction and minimal direction, such as
  • High variability in participant behavior and group social behavior
  • Discussion falls into a rut or goes off at tangents
  • The absence of an opportunity to think through independent ideas results in generalizations
  • High status or dominant personalities dominate discussions and decisions
  • Unequal participation among those present
  • Meetings conclude with a perceived lack of accomplishment
The basic method to implement Delphi as described by Delbecq et al is:
  1. develop an initial questionnaire and distribute it to the panel
  2. panelists independently form ideas to answer the questionnaire and return it
  3. the moderator summarizes the responses to the first questionnaire and
    develops a feedback report along with the second set of questionnaires
    for the panelists
  4. panelists independently evaluate earlier responses and vote on the second questionnaire
  5. the moderator develops a final summary and feedback report to the group
    and decision makers
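The five steps above amount to a simple iterate, summarize, and feed-back loop. As a rough illustration only, here is a minimal sketch in Python; the numeric ratings, the median-based feedback report, and the rule that panelists drift partway toward the group median are all assumptions for demonstration, not part of Delbecq's description:

```python
# Minimal sketch of a Delphi feedback loop over numeric ratings.
# Assumptions (not from Delbecq): the moderator's feedback report is the
# group median, and each panelist revises partway toward that median.
from statistics import median

def delphi_rounds(initial_ratings, rounds=2, pull=0.5):
    """Return the ratings after `rounds` feedback iterations, plus the
    final group median. `pull` is how far each panelist moves toward
    the fed-back median per round."""
    ratings = list(initial_ratings)
    for _ in range(rounds):
        feedback = median(ratings)  # moderator's summary report
        # Panelists independently revise after seeing the feedback.
        ratings = [r + pull * (feedback - r) for r in ratings]
    return ratings, median(ratings)

ratings, final = delphi_rounds([2, 5, 5, 6, 9], rounds=2)
```

In a real Delphi the "revision" step is each expert's independent judgment, not a formula; the sketch just shows how repeated feedback rounds tend to narrow the spread of responses.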
Some variations to this very basic method include:
  • The number of iterations (the more rounds, the closer the consensus likely to be reached)
  • The method of selection and size of the panel
  • The scoring system and the rules used to aggregate the judgments of the panelists
  • The extent of anonymity afforded to the panelists
  • How consensus is defined and how disagreements are dealt with
Conclusion
This article, written by Nic Underhill, provides a very concise overview of the Delphi technique for the average person. Unfortunately, he does not identify any potential strengths or weaknesses of the technique, which significantly diminishes the utility of his article. Although he introduces the reader to the Delphi technique, he gives that person no direction on how to apply it to his or her own unique situation.

Using Delphi Technique in a Consensual Curriculum for Periodontics

Fried and Leao conducted an experiment using the Delphi technique to obtain a group consensus on the curriculum for periodontics. The goal of this study was to use the Delphi technique to identify a consensus about what topics should be included in a periodontics curriculum for undergraduate dental students.

METHODS
Nine periodontics faculties at dental schools in two Brazilian cities participated in this curriculum planning process. Forty undergraduate professors, all dentists, were invited to participate in the study. For the first phase of the experiment, lecturers were asked to list, in writing, items that should be included in a periodontics curriculum for dental students. Suggested items were split into two groups. The first group involved theory-related items associated with foundational concepts and basic principles of periodontics. The second group of items, related to dental practice, was associated with laboratory training and clinical experience with patients.

For the second phase of the experiment, previously obtained responses were scrutinized and collated into items. Then, a comprehensive Likert-scale questionnaire was compiled for submission to the panel. The questionnaire offered the following options for rating the importance of proposed curriculum items: "indispensable"; "important"; "relatively important"; "of little importance"; and "should not be included." The researchers sent the questionnaire to all forty respondents and evaluated the answers as follows:
  • item was "kept" if it reached a "50 percent plus one" consensus classification as "essential/important"
  • item was eliminated if it reached a "50 percent plus one" consensus classification as "of little importance/should not be included"
  • item was included in the next round of the questionnaire if it did not reach a "50 percent plus one" consensus agreement or a "50 percent plus one" consensus rejection
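The "50 percent plus one" rule above could be tallied roughly as follows; the rating labels match the Likert options quoted earlier, but the vote counts in the example are made up for illustration:

```python
# Sketch of the "50 percent plus one" decision rule for one curriculum item.
# The rating labels mirror the questionnaire options quoted above; the
# example vote counts are invented, not data from the study.
KEEP = {"indispensable", "important"}
DROP = {"of little importance", "should not be included"}

def classify_item(votes):
    majority = len(votes) // 2 + 1          # "50 percent plus one"
    if sum(v in KEEP for v in votes) >= majority:
        return "kept"
    if sum(v in DROP for v in votes) >= majority:
        return "eliminated"
    return "next round"                      # no consensus either way

# With 21 of 40 panelists rating the item essential/important, it is kept.
decision = classify_item(["indispensable"] * 21 + ["relatively important"] * 19)
```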
During stage three, the only participants were the individuals who had given the most extreme positive or the most extreme negative ratings during the previous stage, for items where no clear consensus had been reached.

For the fourth and last stage, the positive and negative justifications collected from third-stage participants for each non-consensual item were transcribed into a new questionnaire. All the original participants were then asked to vote for or against inclusion of each pending item. The fourth-stage questionnaire presented a sequence of items, each carrying its associated positive and negative justifications, followed by corresponding "yes" and "no" answer options. Frequencies of the answers thus obtained were calculated with the SPSS statistical package.

RESULTS
Out of the forty initial participants in the study, six (15 percent) eventually dropped out. Two moved abroad, and the other four were unable to stay involved until the end of the experiment. In accordance with the literature, the resulting 85 percent response rate may be considered "good." Out of the thirty-four lecturers who participated in each of the four Delphi stages, thirteen (38 percent) were female and twenty-one (62 percent) were male. They had an average of ten years (±8.19) of teaching experience.

Throughout the four stages of the consensus-building Delphi approach used in this study, participants identified various up-to-date scientific research trends within periodontics. In spite of that, however, items associated with some recent technological advances were not included in the final set of "indispensable" items. Items related to technological advances in odontology, such as human molecular genetics and DNA probing, were initially listed as important to a periodontics curriculum, but halfway through the study they were excluded from the final list of selected items. The results also indicate that some issues, such as a public health-oriented vision, were not addressed by the panel, suggesting that the majority of the individuals involved are focused on the treatment model rather than on health promotion. Such a drawback may also be credited to a limitation of the Delphi technique, which does not allow interpersonal discussion of topics. Overall, use of the Delphi process accomplished the primary goal of the study: identifying a consensus about what items should be included in the periodontics curriculum for undergraduate dental students, based on the perspectives of a panel of faculty who teach periodontics at several different institutions.

Source: http://www.jdentaled.org/cgi/content/full/71/11/1441

An Overview of Four Futures Methodologies (Delphi, Environmental Scanning, Issues Management and Emerging Issue Analysis)

In this paper on futures research by Trudi Lang, the author compares and contrasts the Delphi method with other futures methodologies. The Delphi technique is found to be different from the other three methodologies chosen by the author for this study viz., Environmental Scanning, Issues Management and Emerging Issues Analysis (EIA).

For the Delphi technique, the positives are that Delphi studies have an excellent record of forecasting advances in computer capability, nuclear energy expansion, energy demand, and population growth. The technique is also said to expose real agreements and disagreements among respondents, and to give the facilitator simple, direct control over the scope of the study.

The key drawbacks of the technique are that there is a strong pull on the group to conform to the statistical feedback of the panel, so extreme points of view, which may provide new insights, tend to be suppressed. The way the questionnaire and process are structured can also introduce bias, and a Delphi study is at the mercy of the views and biases of the coordinating team, who choose the respondents, interpret the information, and structure the questions.

The author's evaluation is that it is difficult to evaluate the accuracy and reliability of the Delphi method, because the technique is based on determining the opinion of panel members and the findings thus become person and situation specific. In addition, much of the work undertaken to evaluate the Delphi technique has been done with university students asking almanac-type questions.

The author does mention, however, that the Delphi technique is best suited for exploring issues that involve a mixture of scientific evidence and social values, and that the Delphi method should be considered one of last resort, for dealing with extremely complex problems for which there are no other models.

In conclusion, the author states that the Delphi technique is an "inside-out" methodology when compared to the other techniques, which are "outside-in" and by their very nature are all inter-related. The author advocates an integrated approach to using the available futures methodologies, so that the strengths of one can make up for the weaknesses of another.

Comments:

The author's research is thorough, and a wide variety of resources are referenced in her article. Her conclusion is also valid: for instance, the Delphi technique is far more successful than other techniques at gaining consensus from a group, but weaker at allowing independent voices to be heard. In this regard, EIA provides a methodology that is more sensitive to these independent voices and could compensate for this weakness of the Delphi methodology.


Pragmatic Research Design: an Illustration of the Use of the Delphi Technique

This study, conducted by two scholars at Rhodes University in South Africa, used the Delphi Technique in hopes of forecasting the educational needs that will have to be met to prepare students to be entrepreneurs over the next 20-40 years. The paper does not walk readers through the results of the study, but rather the practical challenges of definition and organization in applying the Delphi Technique.

Definitions:
Based on their review of the literature, the authors identified five main characteristics which define the technique:
  1. Its focus on researching the future or things about which little is known,
  2. Reliance on the use of expert opinion,
  3. Utilizing remote group processes,
  4. The adoption of an iterative research process, and
  5. The creation of a consensus of opinion.
The authors also identify three versions of the Delphi Technique:
  • Numeric – aims to specify a single or minimum range of numeric estimates through the use of summary statistics.
  • Policy – focuses on the exploration, generation and definition of several alternatives and the arguments for and against each of them.
  • Historic – aims to explain the range of issues that fostered a specific decision, or to identify several scenarios that could have led to the resolution of a past problem.
An entrepreneur was defined as a person who provides innovation in an economy, not the owner of a micro-business in a saturated market. The foundation of the study is a review of literature suggesting that entrepreneurship can be formed through education.

The Experiment:
To comprehensively forecast the answer to their question, they asked three separate questions. The questions were the following:
  1. What sector of the South African economy will most likely offer the greatest potential for entrepreneurial opportunities in the next 25 to 40 years?
  2. What qualities are needed by graduates to equip them to be innovative entrepreneurs in the future?
  3. What should Higher Education in South Africa do to prepare/develop students to constructively participate in the future economy as innovative entrepreneurs?
Three separate panels were created because each question required answers from a different set of experts.
  1. The panel for the first question was drawn from members of government departments and research councils.
  2. The panel for the second question was drawn from endowed Chairs in the area of entrepreneurship.
  3. The panel for the third question was drawn from alumni of entrepreneurship programs and from educationalists and academics in those programs.

Monday, April 19, 2010

The Future of High-Technology Crime: A Parallel Delphi Study

This study conducted by Larry E. Coutorie in 1995 is a follow-up to a 1980s study using the Delphi Technique to forecast the future of high-technology crimes. One of the purposes of this study is to give law enforcement a forecast of where high-tech crimes are headed, since most other techniques only allow reactionary responses.

The Experiment:
The study was conducted using two panels. One was comprised of "traditional" experts, people already in the high-tech law enforcement field; the other of "nontraditional" experts, members of hacker and cracker groups recommended by other experts. Both groups of experts were sent three rounds of questionnaires with the following questions, refined each time by the groups' responses to the previous questionnaire.
  1. In your opinion, what area(s) of high technology will be the focus of criminal activity in the next ten years?
  2. What form(s) do you believe this activity will take?
  3. What steps should be taken now to prepare the police to combat this criminal activity?
  4. Do you believe the responsibility for criminal investigation of high-technology crimes will be primarily that of government or private businesses? Why?
  5. Do you believe the responsibility for crime prevention activities regarding high-technology crimes will be primarily that of government or private businesses? Why?
Findings:
The two groups' perspectives diverged significantly from the first round of questioning onward. However, by the end of the three questionnaires, a consensus on several issues had been identified.
  • Likely high-tech future crime areas include attacks on computer systems via telecommunications, growth in computer-assisted fraud, and computer-assisted data manipulation or theft.
  • Crime will take the form of software piracy, increased incidents of computer-assisted counterfeiting, increased incidents of financial fraud, and increased attacks on computer systems via advanced technologies.
  • Recommended preventative steps include recruiting individuals with computer knowledge, increased public/private partnership, more training for law enforcement officers earlier in their careers, and legislation that better defines jurisdiction.
  • At the time of the study, experts forecasted that private business would conduct initial investigations and have an active participatory role in government investigations.
  • They also forecasted that private businesses would be responsible for protecting their own assets, with government assistance in identifying potential threats.

Use of the Delphi Method to Generate a Gold Standard for the Medical Coding Industry

The Delphi Method was developed in the 1950s at the RAND Corporation, and the original process is still in use today. In this article the authors propose a modified version of the process intended to provide a "gold standard" for medical record coding that eliminates coder variability. They call their new, patent-pending process the Delphi Method of Medical Coding.

This proposed variant of the Delphi process was created in hopes of eliminating some of the biases that can arise in the original method. It also eliminates some of the cumbersome written communication the old method requires.

The Delphi Method of Medical Coding

1. Multiple coders are given identical sets of medical records and asked to code them as they normally would. The results show the different possible ways of coding each record; each distinct candidate code is shown on a consolidated list.

2. The consolidated list from step 1 is presented to a second set of coders, who make a simple yes/no determination about whether each code should be applied to the associated record.

3. For each code on the list, the percentage of coders who decided the code should be applied is calculated. Based on this percentage and a threshold (either a fixed percentage or some derived statistic), one of three decisions is made: the code should be applied to the record, the code should not be applied, or the code is indeterminate.
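A minimal sketch of this three-way decision, assuming fixed cut-offs of 75% and 25% (the article leaves the exact thresholds open, so these values are purely illustrative):

```python
# Sketch of the step-3 decision. The 0.75 / 0.25 cut-offs are illustrative
# assumptions; the article allows a fixed percentage or a derived statistic.
def code_decision(yes_votes, total_coders, apply_at=0.75, reject_at=0.25):
    share = yes_votes / total_coders  # fraction of coders voting "yes"
    if share >= apply_at:
        return "apply code"
    if share <= reject_at:
        return "do not apply"
    return "indeterminate"
```

Indeterminate codes are the interesting middle band: they mark exactly the records where coder variability remains, which is what the gold-standard process is trying to resolve.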

Conclusion

Although this study showed only one use of the method, it could possibly be applied to other areas, much like the versatile original Delphi Method. However, it would be a good idea to conduct research comparing this method with the original to see whether it truly eliminates biases while arriving at a similar outcome. The new method may encourage more active participation because it has fewer steps and takes less time overall. The article even discusses possibly implementing the method as a game, which might make the interaction easier for the people involved in the process.

Engaging communication experts in a Delphi process to identify patient behaviors that could impact communication in medical encounters.

Communication between patients and physicians is a very important area of study in the field of communication. The article states that evidence supports the Four Habits Model's link between specific physician behaviors and improved outcomes of care.

Method


Utilizing the Four Habits Model as a starting point, a four round Delphi process was conducted with 17 international experts in communication research, medical education, and health care delivery. Each round was conducted via the internet.
Implementation of the Delphi Process:
Round 1: The experts reviewed a list of proposed patient verbal communication behaviors, identified based on a review of communications literature, within the Four Habits Model framework. The experts chose from the options of: approving the proposed list, adding new behaviors, or modifying existing behaviors from the list.
Rounds 2, 3, and 4: Each of these rounds followed the same format. Each behavior was rated for its fit with a particular habit, by agreeing or disagreeing. After each round, the percentage of agreement for each behavior was calculated, and the data were used to determine which behaviors to analyze in the following rounds. Behaviors that received more than 70% of the experts' votes were considered to have achieved consensus.
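The 70% rule amounts to a simple filter over the vote tallies; the behavior names and vote counts below are invented for illustration, not data from the study:

```python
# Sketch of the rounds 2-4 rule: a behavior achieves consensus when more
# than 70% of the 17 experts agree it fits its habit. Data are invented.
agree_votes = {"asks questions": 15, "expresses preferences": 13, "interrupts": 5}
N_EXPERTS = 17

consensus = [b for b, v in agree_votes.items() if v / N_EXPERTS > 0.70]
carried_forward = [b for b, v in agree_votes.items() if v / N_EXPERTS <= 0.70]
```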

Results

Over the course of the four rounds, the experts started with the 14 originally proposed patient verbal communication behaviors, modified 12 of them, and added 20 new behaviors. They eventually retained only 22 behaviors, including such obvious ones as asking questions, expressing preferences, and summarizing information.

Conclusion

The Delphi process appears to work very well for identifying more effective ways for physicians and patients to communicate. However, without the notes from the actual rounds showing how the experts voted on what they believed was right or wrong, it is difficult to make a real assessment of the Delphi process.

Essential Components Of Curricular Learning Communities In Higher Education

The purpose of this study was to identify the essential components of curricular learning communities in higher education. A panel of experts participated in a four-round Delphi process designed to identify these essential features. The writer used a modified Delphi process to first elicit and then rate the importance of characteristics of curricular learning communities in higher education.

Delphi Advantages:
-The Delphi technique offers the advantage of group response without the attendant disadvantages sometimes experienced with group problem solving or decision-making.
-Expert participants are more likely to generate reasoned, independent, and well-considered opinions in the absence of exposure to the "persuasively stated opinions of others". Because the experts do not ever participate in a face-to-face discussion, there is no danger of one or more individuals’ opinions being swayed by a more dominant or more experienced individual.
-Efficiency and flexibility, especially in light of modern communication technologies such as e-mail and the Internet. Experts may be drawn from a wide geographic area, and the participants’ commitment in terms of time and money invested is minimal.
-Delphi method has been shown to be an effective way to conduct research when the responses being sought are value judgments rather than factual information. Although it is more difficult to assess the "correctness" of value judgments, it is generally agreed upon that value judgments are not all equal but can in fact be more "right" or more "wrong."

Delphi Limitations:
-Delphi should not be used when any of the following three critical conditions are not present: adequate time, participant skill in written communication, and high participant motivation. It is estimated that a minimum of 45 days is required to carry out a Delphi study.
-Participants must be knowledgeable and able to clearly communicate their ideas. A high degree of motivation is needed to offset the tendency for participant dropout as the study progresses. Because there is no direct contact between participants, those who are not highly motivated and interested in the subject at hand may feel isolated or detached from the process.
-Another limitation is the bias that can arise in Delphi studies from poorly worded or leading questions or from selective interpretation of the results.

Instrument Design and Implementation
Round One: Initial Survey:
The first round in the current study consisted of a brief survey, designed to collect some demographic data on the participants, and one open-ended question.

Round Two: Questionnaire One:
A list of 79 features was compiled from the information obtained in the initial survey. Obvious repetitions were eliminated, though items that were similar but not exactly the same were maintained. Items were sorted into four categories: Curricular Features, Pedagogical Features, Structural Features, and Environmental Features. Participants were asked to rate each feature on a Likert-type scale, identifying each feature as an "essential" (5), "very important" (4), "moderately important" (3), "slightly important" (2), or "not important"(1) characteristic of a curricular learning community.

Round Three: Questionnaire Two:
Questionnaire Two listed only the features that had received a mean rating of 4.0 or higher in the previous round. Once again the items were placed into the four categories of Curricular, Pedagogical, Structural, and Environmental.
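The screening rule for this round can be sketched as follows; the feature names and ratings are invented for illustration, not data from the study:

```python
# Sketch of the Round Three screen: only features with a mean Likert
# rating of 4.0 or higher on the previous questionnaire advance.
# Feature names and ratings below are made-up examples.
from statistics import mean

prior_ratings = {
    "shared cohort of students": [5, 5, 4, 4, 5],   # mean 4.6, advances
    "linked course assignments": [4, 4, 5, 3, 5],   # mean 4.2, advances
    "common residence hall":     [2, 3, 3, 4, 2],   # mean 2.8, dropped
}

advancing = {f: mean(r) for f, r in prior_ratings.items() if mean(r) >= 4.0}
```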

Round Four: Questionnaire Three:
The third and final questionnaire listed the forty features that received the highest ratings (determined by mean and mode) on the previous questionnaire. Panelists were given the following information: the ranking of the items from the first and second questionnaires, the mean score of the items from both rounds, and the number of times each item was selected as one of the three to five most important items.

In this round, panelists were asked to distribute a total of 100 value points across the forty items. At the end of the questionnaire, participants were asked to answer another open-ended question.

Summary of Data Collection and Analysis Procedures:

The following table outlines the four-round Delphi procedure that was followed in this study: (see link for table)

Link: http://www.winona.edu/advising/lcchapter3.htm

Using the Delphi Technique to Achieve Consensus

How it is leading us away from representative government to an illusion of citizen participation.

This article explains the Delphi Technique and how it is used as a methodology in the education system. It is critical of the technique and describes the consequences and motivations of those who use it.

The Delphi Technique and consensus building are both founded in the same principle - the Hegelian dialectic of thesis, antithesis, and synthesis, with synthesis becoming the new thesis. The goal is a continual evolution to "oneness of mind" (consensus means solidarity of belief) -the collective mind, the wholistic society, the wholistic earth, etc. In thesis and antithesis, opinions or views are presented on a subject to establish views and opposing views. In synthesis, opposites are brought together to form the new thesis. All participants in the process are then to accept ownership of the new thesis and support it, changing their views to align with the new thesis. Through a continual process of evolution, "oneness of mind" will supposedly occur.
In group settings, the Delphi Technique is an unethical method of achieving consensus on controversial topics. It requires well-trained professionals, known as "facilitators" or "change agents," who deliberately escalate tension among group members, pitting one faction against another to make a preordained viewpoint appear "sensible," while making opposing views appear ridiculous.

The facilitators or change agents encourage each person in a group to express concerns about the programs, projects, or policies in question. They listen attentively, elicit input from group members, form "task forces," urge participants to make lists, and in going through these motions, learn about each member of a group. They are trained to identify the "leaders," the "loud mouths," the "weak or non-committal members," and those who are apt to change sides frequently during an argument.

Suddenly, the amiable facilitators become professional agitators and "devil's advocates." Using the "divide and conquer" principle, they manipulate one opinion against another, making those who are out of step appear "ridiculous, unknowledgeable, inarticulate, or dogmatic." They attempt to anger certain participants, thereby accelerating tensions. The facilitators are well trained in psychological manipulation. They are able to predict the reactions of each member in a group. Individuals in opposition to the desired policy or program will be shut out.

The Delphi Technique works. It is very effective with parents, teachers, school children, and community groups. The "targets" rarely, if ever, realize that they are being manipulated. If they do suspect what is happening, they do not know how to end the process. The facilitator seeks to polarize the group in order to become an accepted member of the group and of the process. The desired idea is then placed on the table and individual opinions are sought during discussion. Soon, associates from the divided group begin to adopt the idea as if it were their own, and they pressure the entire group to accept their proposition.

How the Delphi Technique Works
First, a facilitator is hired. While his job is supposedly neutral and non-judgmental, the opposite is actually true. The facilitator is there to direct the meeting to a preset conclusion. The facilitator begins by working the crowd to establish a good-guy-bad-guy scenario. Anyone disagreeing with the facilitator must be made to appear as the bad guy, with the facilitator appearing as the good guy.

Next, the attendees are broken up into smaller groups of seven or eight people. Each group has its own facilitator. The group facilitators steer participants to discuss preset issues, employing the same tactics as the lead facilitator. Why hold such meetings at all if the outcomes are already established? The answer is because it is imperative for the acceptance of the School-to-Work agenda, or the environmental agenda, or whatever the agenda, that ordinary people assume ownership of the preset outcomes. If people believe an idea is theirs, they'll support it. If they believe an idea is being forced on them, they'll resist.


How to Defuse the Delphi Technique


Three steps can defuse the Delphi Technique as facilitators attempt to steer a meeting in a specific direction.

Always be charming, courteous, and pleasant. Smile. Moderate your voice so as not to come across as belligerent or aggressive.

Stay focused. If possible, jot down your thoughts or questions. When facilitators are asked questions they don't want to answer, they often digress from the issue that was raised and try instead to put the questioner on the defensive. Do not fall for this tactic. Courteously bring the facilitator back to your original question. If he rephrases it so that it becomes an accusatory statement (a popular tactic), simply say, "That is not what I asked. What I asked was . . ." and repeat your question.

Be persistent. If putting you on the defensive doesn't work, facilitators often resort to long monologues that drag on for several minutes. During that time, the group usually forgets the question that was asked, which is the intent. Let the facilitator finish. Then with polite persistence state: "But you didn't answer my question. My question was . . ." and repeat your question.

Never become angry under any circumstances. Anger directed at the facilitator will immediately make the facilitator the victim. This defeats the purpose. The goal of facilitators is to make the majority of the group members like them, and to alienate anyone who might pose a threat to the realization of their agenda. People with firm, fixed beliefs, who are not afraid to stand up for what they believe in, are obvious threats.

At a meeting, have two or three people who know the Delphi Technique dispersed through the crowd so that, when the facilitator digresses from a question, they can stand up and politely say: "But you didn't answer that lady/gentleman's question."

Establish a plan of action before a meeting. Everyone on your team should know his part. Later, analyze what went right, what went wrong and why, and what needs to happen the next time. Never strategize during a meeting.

link: http://www.eagleforum.org/educate/1998/nov98/focus.html

In this type of setting the process sounds very political, and personal agendas or vendettas may come into play if a facilitator is not in control of the situation. The illusion of representative participation is obvious here.

Apprehending the Future: Emerging Technologies, from Science Fiction to Campus Reality

In his article, Bryan Alexander, Director of Research at the National Institute for Technology and Liberal Education (NITLE), discusses the various techniques used to decipher emerging technology trends from the perspective of higher education. He notes that the Delphi technique has been particularly useful in forecasting technology in higher education.

One of the cases where the Delphi technique is used is in the Horizon Project. Launched in 2002, this project draws on a large body of experts across academia. Over several months, the group identifies trends, ranks their impact, compares estimates, and progressively builds up a profile of emerging technologies. This is then published as the annual Horizon Report. The January 2009 report identified the following technologies:

  • Mobiles (time-to-adoption: one year or less)
  • Cloud computing (time-to-adoption: one year or less)
  • Geo-Everything (time-to-adoption: two to three years)
  • The Personal Web (time-to-adoption: two to three years)
  • Semantic-Aware Applications (time-to-adoption: four to five years)
  • Smart Objects (time-to-adoption: four to five years)
Another application of the Delphi Technique in higher education was the "Future of the Internet III" project by the Pew Internet & American Life Project and Elon University. This study was much broader in scope and had a much longer timeline. The key findings from this exercise were:

  • The mobile device will be the primary connection tool to the Internet for most people in the world in 2020.
  • The transparency of people and organizations will increase, but that will not necessarily yield more personal integrity, social tolerance, or forgiveness.
  • Talk and touch user-interfaces with the Internet will be more prevalent and accepted by 2020.
  • Those working to enforce intellectual property law and copyright protection will remain in a continuing "arms race" with crackers who will find ways to copy and share content without payment.
  • The divisions between "personal" time and work time and between physical and virtual reality will be further erased for everyone who's connected, and the results will be mixed in terms of social relations.
  • Next-generation engineering of the network to improve the current Internet architecture is more likely than an effort to rebuild the architecture from scratch.
In conclusion, the author states that no single technique can effectively predict the future; using a combination of techniques can only give us some idea of it. The future is increasingly complex, and "black swans" continue to occur and have enormous effects.

Comments:

As can be seen from the conclusions above, they do not seem too radical or innovative; instead, they toe the line of what is believed to be the general consensus (for instance, that mobile devices will be the primary connection tool in 2020, or that touch and talk user interfaces will be prevalent, hardly needs to be deciphered by experts). This is in fact the major drawback of this technique: Delphi outcomes can be driven by a desire for consensus rather than actual agreement, meaning that divergent ideas can, and often do, get quashed.

Saturday, April 17, 2010

The Delphi Technique: Let's Stop Being Manipulated

More and more citizens are being invited to participate in various forms of meetings, councils, or boards to "help determine" public policy in one field or another. According to the author of this article, Albert V. Burns, this sounds great, but in reality it is deception.

Generally, each meeting has someone designated to facilitate it. The facilitator's job is supposedly to be a neutral, non-directing helper who sees that the meeting flows smoothly. Actually, he or she is there for exactly the opposite reason: to see that the conclusions reached during the meeting are in accord with a plan already decided upon by those who called the meeting.

The process used to "facilitate" the meeting is called the Delphi Technique, developed by the RAND Corporation for the U.S. Department of Defense in the 1950s.

How does the process take place?

First, the person who will be leading the meeting, the facilitator, must be a likable person with whom the meeting participants can agree and sympathize.

Facilitators are trained to recognize potential opponents and to make such people appear aggressive and foolish. The audience is broken up into groups of seven or eight people each.

Within each group, discussion takes place on issues already decided upon by the leadership of the meeting. Generally, the participants are asked to write down their ideas and disagreements, with the papers to be turned in and "compiled" for general discussion.

How do you know that the ideas on your notes were included in the final result? You do not! You are simply led to conclude that you were probably in the minority. You do not even know whether anyone's ideas made it into the final conclusion.
Those who organized the meeting are then able to tell the community that the conclusions reached at the meeting are the result of public participation.

Actually, the desired conclusions had been established, long before the meeting ever took place.

Friday, April 16, 2010

Prioritization Process Using Delphi Technique

In a white paper, Alan Cline, a member of Carolla Development, Inc., gives some insights on how to prioritize using the Delphi technique.

Background and Motivation

The Delphi technique was developed in the 1960s as a forecasting methodology. Later, the U.S. government enhanced it as a group decision-making tool. Delphi is particularly appropriate when decision-making is required in a political or emotional environment. The tool works formally or informally, in large or small contexts. For example, Taiwan used the method to prioritize its entire Information Technology industry.

Delphi Prioritization Procedure

1- Pick a facilitation leader: The facilitator is an expert in research data collection and is not a stakeholder.

2- Select a panel of experts: The panelists should have intimate knowledge of the projects.

3- Identify a straw-man criteria list from the panel: Brainstorm a list of criteria that are appropriate to the projects.

4- The panel ranks the criteria: Each panelist ranks the criteria from 1 (very important) to 3 (not important). The ranking should be done individually and anonymously.

5- Calculate the mean and deviation of the rankings

6- Re-rank the criteria

7- Identify project constraints and preferences: Constraints could be budget limits or mandatory regulations.

8- Rank the projects by constraint and preference

9- Analyze the results and feed them back to the panel

10- Re-rank the projects until the ranking stabilizes
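The statistical core of Cline's procedure (ranking, computing the mean and deviation, feeding the results back, and re-ranking until stable) can be sketched in a few lines of Python. This is a minimal illustration of the loop, not Cline's actual tooling; the panel data, the three-point scale, and the stopping rule are assumptions made for the example.

```python
from statistics import mean, stdev

def summarize(rankings):
    """rankings: {criterion: [score per panelist]}, 1 = very important.
    Returns {criterion: (mean, stdev)} to feed back to the panel."""
    return {c: (mean(s), stdev(s)) for c, s in rankings.items()}

def aggregate_order(rankings):
    """Order criteria by mean score (lowest mean = most important)."""
    return sorted(rankings, key=lambda c: mean(rankings[c]))

def delphi_rounds(collect_round, max_rounds=5):
    """Run re-ranking rounds until the aggregate order stabilizes.
    collect_round(feedback) returns the next round's rankings dict."""
    feedback, order = None, None
    for _ in range(max_rounds):
        rankings = collect_round(feedback)
        new_order = aggregate_order(rankings)
        if new_order == order:   # unchanged between rounds: stable
            break
        order, feedback = new_order, summarize(rankings)
    return order

# Hypothetical panel of three experts converging over two rounds:
rounds = iter([
    {"cost": [1, 2, 1], "risk": [2, 1, 3], "value": [3, 3, 2]},
    {"cost": [1, 1, 1], "risk": [2, 2, 2], "value": [3, 3, 3]},
    {"cost": [1, 1, 1], "risk": [2, 2, 2], "value": [3, 3, 3]},
])
order = delphi_rounds(lambda feedback: next(rounds))
print(order)  # ['cost', 'risk', 'value']
```

The anonymity requirement from step 4 is what the `collect_round` callback stands in for: each panelist's scores arrive without attribution, and only the aggregate statistics go back out.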

Thursday, April 15, 2010

Green Team Summary of Findings: Imagery Analysis (4 out of 5 stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 16 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 15 April 2010 regarding Imagery Analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description:

Imagery analysis is the extraction of meaningful information from images. It is used by both the military and civilian sectors, and is highly useful in state-related matters as well as for natural-resource businesses and scientific research institutes. The groups that use imagery analysis run the gamut from NGOs to multinational organizations.

There are many different types of imagery that can be used alone or in combination with others, including satellite, infrared, electro-optical, and multispectral imagery. One of the best-known types is satellite imagery. Since the early days of satellite imagery there have been dramatic improvements in availability and quality that have led to increased accessibility. Some of the largest commercial vendors are GeoEye, DigitalGlobe, ImageSat International, and Satellite Imaging Corporation. Google Earth is one of the most popular sources for images because it is commercially available to anyone with Internet access.


STRENGTHS AND WEAKNESSES:

Strengths
  • Capable of providing a detailed overview of a large area
  • More manageable than on-the-ground classification efforts
  • Accessibility and capability of systems continue to improve
  • Data provided is clear and highly detailed
  • Can be used to corroborate other collected data
  • Can be applied to a variety of problems, and is very adaptable
  • Allows for data collection in remote areas
  • Relatively easy to perform due to commercial availability
  • Allows for a retrospective view of an area

Weaknesses
  • Software and systems can be expensive
  • Limited by resolution, image quality, atmospheric haze, and contrast
  • Hard to compare images taken from different angles and at different resolutions
  • This methodology still requires a trained analyst to go over the tentative findings and discriminate between objects
  • Need for trained professionals (level of training debatable)
  • Chance for error if preparatory steps not followed such as accounting for clouds, resolution, etc.
  • Does not account for other factors such as change between the images
  • Can be subject to deception

How To:

There appear to be no standard procedures applicable to all images at all times or for all purposes. However, several standard processes for specific purposes were identified through the literature reviewed in advance of this class. For example, with respect to classifying types of vegetation:

1. Identify landmarks
2. Align landmarks (if using photos from different time periods)
3. Standardize levels of contrast within an image (necessary when images were taken by different sensors)
4. Image classification: divide the photograph into "clusters", spatial areas with the same characteristics, then decide what those are with respect to the region and culture of the area.
5. Accuracy assessment: compare classification results to what the ground areas truly are, as revealed by aerial photographs or teams on the ground.
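The image-classification step above is essentially unsupervised clustering: group pixels with similar spectral characteristics, then label the clusters. A minimal sketch of that idea is a one-dimensional k-means on pixel brightness values; real multispectral classification works the same way but in several band dimensions at once. The pixel values, cluster count, and deterministic seeding below are invented for illustration.

```python
def kmeans_1d(values, k, iters=20):
    """Cluster scalar pixel values into k spectral clusters (naive k-means)."""
    lo, hi = min(values), max(values)
    # Seed centers evenly across the brightness range (deterministic start).
    centers = [lo + (i + 0.5) * (hi - lo) / k for i in range(k)]
    for _ in range(iters):
        # Assignment step: each pixel joins its nearest cluster center.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Update step: move each center to the mean of its assigned pixels.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical brightness values: dark water, mid-tone vegetation, bright rooftops.
pixels = [12, 15, 11, 14, 80, 85, 78, 82, 200, 210, 195, 205]
centers = kmeans_1d(pixels, k=3)
print(centers)  # [13.0, 81.25, 202.5]
```

Deciding what each cluster actually represents (step 4's "with respect to the region and culture of the area") and checking the result against ground truth (step 5) remain human tasks.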


Personal Application:

The class looked at ten designated images: five different cities at two different heights. The class had three minutes to look at the two images of each city and identify the following: Density, Bodies of Water, Building Shadows, Landmarks, and Vegetation. These five criteria helped to narrow the range of possible cities. For example, when looking at New York City, the identification was made possible by looking for vegetation (Central Park), building shadows (consistently tall buildings), and density (buildings very close together over a wide area).


Summary of Findings (White Team): Geospatial Analysis (4 out of 5 stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 16 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 15 April 2010 regarding Geospatial Analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description:
Geospatial Analysis is a method that uses aerial and satellite imagery for analysis. These technologies include electro-optical, infrared, multispectral, and radar sensors. This type of imagery is commercially available and useful for a multitude of applications. Analyzing geospatial imagery can provide decision makers with visual indicators that offer evidence of internal operations. Static images aid the intelligence process by contributing to the all-source analysis provided to decision makers. Imagery can be combined with cartography and mapping to determine distances, heights, demographics, and population density.

Strengths:
  • Objective in nature, data is structured.
  • Easy to perform, even for untrained analysts.
  • Readily available through open sources such as Google Earth; it is commercially available.
  • Satellite imagery can provide data in remote areas that lack accurate geographic maps.
  • Usually more cost efficient than sending personnel to a site.
  • Can also provide powerful visualization to assist when shaping policy
  • On the ground situation can be assessed, even when outside parties don't have access to the area.
  • The same data can be used on multiple types of problems/issues.
  • Technology is continually being updated to allow for easier use.
Weaknesses:
  • Potentially expensive.
  • Static images (only a snapshot in time).
  • May require training depending on the level of analysis required.
  • Errors may occur due to lack of resolution, deception efforts, or even weather.
  • Imagery can generally be obtained at specific time intervals (i.e.- satellites cannot maintain stationary low earth orbit).
  • Unable to view objects/structures located beneath the surface of the Earth.
  • May require additional resources to provide specific measurements (e.g.-distance between objects, or heights of structures).
  • Privacy and confidentiality issues may arise as technologies and resolution advance in sophistication.
How to:
There appear to be no standard procedures applicable to all images at all times or for all purposes. However, several standard processes for specific purposes were identified through the literature reviewed in advance of this class. For example, with respect to classifying types of vegetation:

  • Images to be analyzed should show the same season (in case multiple images are used for analysis) and must be accurately registered (matched up) to the ground and to each other.
  • The images must be radiometrically calibrated to minimize effects of instrument variations and atmospheric haze (Radiometric Normalization (RRN)). This is necessary when images are taken by different sensors, to standardize radiometry, or contrast within an image
  • The landmarks in the images need to be aligned (geometric rectification). This is necessary for analysis over time.
  • A classification scheme should be decided on and designed. Classification categories, e.g. cultivated land vs. grassland or forested vs. non-forested land, can be used.
  • Classify the images: divide the photograph into "clusters", spatial areas with the same characteristics, then decide which category each should be placed in.
  • To reduce errors in classification, techniques such as having a human check portions of computer-classified work are also used (spatial re-classification).
  • Classification results are compared to what the ground areas truly are (as revealed by aerial photographs) to assess the accuracy of the classification.
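The radiometric normalization step in the list above can be sketched as fitting a linear mapping from one image's band values to a reference image's values over pixels assumed to be unchanged between acquisitions (pseudo-invariant features), then applying that mapping to the whole subject image. This is one common approach to relative normalization, not the only one, and the pixel values below are invented for illustration.

```python
def fit_linear(xs, ys):
    """Least-squares fit ys ~ a * xs + b over pseudo-invariant pixels."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def normalize(image, a, b):
    """Map the subject image's radiometry onto the reference image's scale."""
    return [a * v + b for v in image]

# Hypothetical unchanged rooftop/road pixels seen in both images:
reference = [40, 100, 160, 220]   # values on the reference date
subject   = [30, 60, 90, 120]     # same pixels in a hazier, darker acquisition
a, b = fit_linear(subject, reference)
print(round(a, 2), round(b, 2))   # 2.0 -20.0

corrected = normalize([30, 75, 120], a, b)
print(corrected)  # [40.0, 130.0, 220.0]
```

Once both images share the same radiometric scale, the later classification and change-comparison steps can treat brightness differences as real ground change rather than sensor or haze artifacts.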
Application:
Geospatial Analysis, as a method, was demonstrated by looking at various images of major international cities. We had to identify them by viewing wide screenshots taken from Google Earth, followed by a close-up of each image. By recognizing key indicators, i.e. population density, landmarks, shadows, terrain, etc., we were able to identify the cities (some, but not all), which taught us that, as a method, it is best practiced using modifiers, particularly when resolution is lacking or distorted. The depth of imagery analysis is dependent upon the source of the image (i.e. aerial vs. satellite) and the skill set of the analyst. This exercise, a descriptive analysis, would be an excellent starting point for those with little exposure to imagery analysis, as it is low cost and easily accessible.


Tuesday, April 13, 2010

Satellite Imagery Activism: Sharpening the Focus on Tropical Deforestation

The Rise of Satellite Imagery Activism
According to the authors (from the RAND Corp. & George Washington University), the first Landsat satellite was launched by the U.S. in 1972, and its launch opened the doors to satellite remote sensing, especially in terms of natural resource sustainability and human activity monitoring. The original users of satellite imagery were state agencies (civil & military), natural-resource businesses, and scientific research institutes. Today, however, the groups that use satellite imagery run the gamut from NGOs to multinational organizations. Since the early days of satellite imagery there have been dramatic improvements in availability and quality that have led to increased accessibility. First, cheap and highly powerful computing abilities have broadened the range of individuals who can work with imagery data. Second, "the advent of user-friendly software for image processing and analysis has made imagery analysis much less the purview of remote sensing specialists". Third, the cost of imagery data has drastically declined with the onset of cheaper Landsat/SPOT images, declassified U.S. & Russian intelligence imagery, and many other forms of commercially available imagery. Lastly, the Internet and CD-ROMs have facilitated the spread of imagery, with giants like Google moving into the arena.

Imagery Activism & Tropical Forests
Due to "rising levels of global transparency" and recent software and Internet capabilities, there are many avenues through which citizens can bring awareness to public policy issues through the use of satellite imagery. Many environmental factors can be analyzed via satellite imagery, such as changes in vegetation, biological stress, and habitat characterization. Of these issues, the greatest focus has been on monitoring trends in tropical deforestation. Beyond simple deforestation, "satellite imagery can be used to monitor growth, as well as the effects of human activity and exploitation...on the surrounding ecosystems". Satellite images can provide objective proof of potential logging infractions and at the same time be built up into databases to provide long-term monitoring of forests. In the last several years, the international community's gaze has fallen on the tropical rainforests of Brazil and Indonesia. Brazil in particular has seen the dramatic expansion of road networks, farms, pastures, and plantations, all at the expense of the rainforests.

Conclusions
In summary, this paper examined the rise of satellite imagery in today's society and how this imagery affects the world's tropical rainforests, focusing specifically on deforestation in Brazil and Indonesia. The paper's aim is to attract global attention to the problem of deforestation by utilizing satellite imagery data and other geospatial technologies.

Source
http://www.aseanenvironment.info/Abstract/41016424.pdf

Tropical Cyclone Intensity Analysis and Forecasting from Satellite Imagery

Introduction
This article by Vernon F. Dvorak discusses the implications of analyzing cyclone intensity using satellite platforms. Dvorak notes that satellite imagery is highly useful in monitoring tropical cyclones because methods have been developed that allow analysts to estimate the intensity of a cyclone based on certain distinguishing cloud features.
The Technique
The technique of combining intensity analysis with satellite imagery yields rather good estimates in terms of forecasting storm intensity. The specific cloud features that are used to estimate cyclone intensity are derived from two categories of features: "central features" and "outer-banding features". Each of these features is assigned a "T-number" ("T" for tropical) which denotes tropical disturbances. T1 describes disturbances "exhibiting minimal but significant signs of tropical cyclone intensity", whereas T8 denotes the "maximum possible" cyclone intensity. Next, each of the cloud features is analyzed to determine whether the cyclone will remain at its modeled intensity over the next 24 hours.

The Analysis Procedure

The intensity analysis via the satellite platform consists of three stages. The first stage "requires a qualitative judgment as to how cloud features related to cyclone intensity have changed between yesterday's picture and today's". The second and third stages of analysis involve examining the overall cloud pattern and component features to see if they agree with the modeled intensity estimate that the forecasters arrived at during the first stage of analysis.
The Forecast Procedure
To come up with the actual intensity forecast, the researchers either use the cyclone's model curve to obtain tomorrow's intensity or adjust the curve when an interruption due to landfall or the approach of some other unfavorable circumstance is near. These changes can be detected by an unexplained change in T-number or by changes to the cloud features.
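The mechanics of the T-number scale and the forecast adjustment can be sketched roughly as follows. The wind-speed figures are approximate values commonly associated with the Dvorak scale and are included only for illustration; Dvorak's paper should be consulted for the actual tables, constraint rules, and adjustment criteria.

```python
# Approximate T-number -> maximum sustained wind (knots); illustrative values.
T_TO_WIND = {1.0: 25, 2.0: 30, 3.0: 45, 4.0: 65,
             5.0: 90, 6.0: 115, 7.0: 140, 8.0: 170}

def wind_from_t(t):
    """Linearly interpolate a wind estimate for a fractional T-number."""
    lo, hi = int(t), min(int(t) + 1, 8)
    if lo == hi:
        return T_TO_WIND[float(lo)]
    frac = t - lo
    return T_TO_WIND[float(lo)] + frac * (T_TO_WIND[float(hi)] - T_TO_WIND[float(lo)])

def forecast_t(current_t, daily_trend, landfall_expected=False):
    """Tomorrow's T-number from the model curve's daily trend; hold the
    curve flat when landfall or another unfavorable circumstance is near."""
    if landfall_expected:
        daily_trend = min(daily_trend, 0.0)
    return max(1.0, min(8.0, current_t + daily_trend))

t_now = 4.5
print(wind_from_t(t_now))            # 77.5 (knots)
print(forecast_t(t_now, 1.0))        # 5.5
print(forecast_t(t_now, 1.0, True))  # 4.5
```

The point of the sketch is the shape of the procedure: a bounded scale (T1 to T8), a curve that carries the intensity forward, and an override when conditions make the curve untrustworthy.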
Performance
A researcher named Erickson conducted the first tests using intensity analysis via satellite imagery in 1972, and his results showed good consistency. Another researcher, Arnold (1972), used the T-number estimates in tandem with the Joint Typhoon Warning Center estimates for the west Pacific and had good results as well.

(Satellite Imagery of Tropical Cyclone patterns with "T" Numbers)
Source
http://journals.ametsoc.org/doi/pdf/10.1175/1520-0493%281975%29103%3C0420%3ATCIAAF%3E2.0.CO%3B2

Validation of SPOT-5 Satellite Imagery for Geological Hazard Identification

Summary
This study was developed to determine whether SPOT-5 satellite imagery could be used to develop geographic hazard maps for areas lacking available resources. The study took place in Nicaragua, and compared SPOT-5 analysis with newly created threat maps and additional geographic monitoring. Maps were created using SPOT-5 imagery for hazard inventory, hazard susceptibility, and vulnerability mapping. The study concluded that SPOT-5 imagery could be used to effectively develop geographic hazard maps. The system was, however, limited by the level of analysis needed to create the hazard maps.
Strengths
Satellite imagery was capable of providing the necessary data in remote areas that lack accurate geographic maps. Additionally, it allowed for direct input into a GIS and the ability to gather data despite poor weather conditions.
Limitations
Despite the SPOT-5 test's success, the full process of creating the maps required large amounts of additional analysis. The system was also unable to easily identify certain features needed for hazard mapping, which could lead to possible issues without additional support from standard measurement systems.

Source:http://www.cartesia.org/geodoc/isprs2004/comm1/papers/51.pdf

The Use Of Satellite Imagery In Landslide Studies In High Mountain Areas

Summary

This is a case-study that compares the application of Landsat ETM+ and IKONOS imagery when assessing a natural terrain's susceptibility to landslides. This study looked at six areas located in the upland areas of Nepal and Bhutan. In each case, the imagery has been used both to directly map landslides and to examine the occurrence of factors that might be important in landslide initiation, such as water seepage. The results from the imagery were bench-marked using field surveys.

The results of this study demonstrated that Landsat ETM+ continues to be the most cost-effective imagery tool for mapping landslide susceptibility, because it is relatively low-cost and has high spectral resolution. However, the researchers state that spatial resolution is still a significant limitation of Landsat ETM+.

The high-resolution, multispectral IKONOS imagery is not limited in the same way. With IKONOS, even small landslides can be mapped in great detail. However, according to the report, this type of imagery is less useful for factor-type mapping. The report concluded that the high cost of IKONOS will prevent most developing countries from adopting and utilizing the tool, leaving Landsat ETM+ the better option for mapping landslide susceptibility.

Source: http://www.securinglivelihoods.org/nepal/files/nepal%20case%20study.pdf

Processing Satellite Imagery To Detect Waste Tire Piles


Summary


This article, posted on www.techbriefs.com, discusses the development of a new methodology in satellite imagery analysis. The developers, Joseph Skiles, Cynthia Schmidt, Becky Quinlan, and Catherine Huybrechts, state that they have created a new methodology for processing commercially available satellite spectral imagery to identify and map waste tire piles in California. The methodology uses a combination of previously available commercial image-processing and georeferencing software, which is then used to develop a model that identifies tire piles.

Tire piles are difficult to distinguish in satellite imagery because of their low reflectance levels; they are often mistaken for shadows or deep water. The developers claim the methodology corrects these misinterpretations using software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model incorporated lessons learned from previous research on the detection and mapping of tire piles by various methods of analysis (manual/visual and/or computational) of aerial and satellite imagery.
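The core idea — flag low-reflectance pixels, then discriminate tires from shadows and deep water — might be sketched like this. The band values, thresholds, and the NIR-based water check are all invented for the sake of illustration; they are not the actual TIRe model's rules.

```python
def is_dark(p, threshold=0.08):
    """Low visible reflectance: the signature shared by tires, shadows, water."""
    return p["red"] < threshold

def looks_like_water(p):
    # Water absorbs strongly in the near-infrared; rubber is assumed here to
    # retain somewhat more NIR signal (an invented discriminator for this sketch).
    return p["nir"] < p["red"] * 0.5

def tire_candidates(pixels):
    """Keep dark pixels that do not show a water-like NIR collapse."""
    return [p["id"] for p in pixels if is_dark(p) and not looks_like_water(p)]

# Invented two-band reflectance samples:
pixels = [
    {"id": "A", "red": 0.05, "nir": 0.06},  # dark with some NIR left: candidate
    {"id": "B", "red": 0.06, "nir": 0.01},  # dark, collapsed NIR: likely water
    {"id": "C", "red": 0.30, "nir": 0.40},  # bright: vegetation or bare soil
]
print(tire_candidates(pixels))  # ['A']
```

As the article's weakness list notes, a filter like this only produces tentative candidates; a trained analyst still has to review them and rule out the remaining shadows, water, and vegetation.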

Strengths
  • Reduces time spent surveying regions for tire sites
Weaknesses
  • This methodology still requires a trained analyst to go over the tentative findings and discriminate between tires, water, vegetation, etc.

Source: http://www.techbriefs.com/component/content/article/2486