Monday, April 19, 2010

Use of the Delphi Method to Generate a Gold Standard for the Medical Coding Industry

The Delphi Method was developed in the 1950s at the RAND Corporation, and the original process is still in use today. In this article the authors propose a modified version of the process intended to provide a "gold standard" for medical record coding that eliminates coder variability. They call their new process the Delphi Method of Medical Coding; a patent is pending.

This modified Delphi process was created in hopes of eliminating some of the biases that can arise in the original method. It also eliminates some of the cumbersome written communication the old method requires.

The Delphi Method of Medical Coding

1. Multiple coders are given identical sets of medical records and asked to code them as they normally would. The results show the different possible ways of coding each record, and each distinct candidate code is placed on a consolidated list.

2. The consolidated list from step 1 is presented to a second set of coders, who make a simple yes/no determination about whether each code should be applied to the associated record.

3. For each code on the list, the percentage of coders who decided the code should be applied is calculated. From this percentage, using either a fixed cutoff or some derived statistic, one of three decisions is made: the code should be applied to the record, the code should not be applied, or the code is indeterminate. A minimal sketch of this decision rule follows.
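To make step 3 concrete, here is a short Python sketch of the decision rule. The 80%/20% cutoffs, the coder count, and the example codes are illustrative assumptions only; the article says just that the cutoff is either a fixed percentage or a derived statistic.

# Sketch of step 3's three-way decision. The cutoffs below are assumed
# for illustration; the article leaves them to be fixed in advance or
# statistically derived.
def classify_code(yes_votes, total_voters, apply_cutoff=0.80, reject_cutoff=0.20):
    """Return 'apply', 'reject', or 'indeterminate' for one candidate code."""
    pct = yes_votes / total_voters
    if pct >= apply_cutoff:
        return "apply"
    if pct <= reject_cutoff:
        return "reject"
    return "indeterminate"

# Hypothetical example: 10 second-round coders voting on three candidate codes.
votes = {"401.9": 9, "250.00": 2, "V58.69": 5}
for code, yes in votes.items():
    print(code, classify_code(yes, 10))  # apply, reject, indeterminate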

Conclusion

Although this study showed only one use of the method, it could likely be applied to other areas, much like the original Delphi Method's versatility. However, it would be worthwhile to conduct research comparing this method with the original to see whether it truly eliminates biases while arriving at a similar outcome. The new method may encourage more active participation because it has fewer steps and takes less time overall. The article even discusses implementing the method as a game, which could make the process easier for the people involved in it.

Engaging communication experts in a Delphi process to identify patient behaviors that could impact communication in medical encounters.

Communication between patients and physicians is an important area of study in the communication field. The article states that evidence supporting the Four Habits Model links specific physician behaviors to improved outcomes of the care provided.

Method


Utilizing the Four Habits Model as a starting point, a four-round Delphi process was conducted with 17 international experts in communication research, medical education, and health care delivery. Each round was conducted via the internet.
Implementation of the Delphi Process:
Round 1: The experts reviewed a list of proposed patient verbal communication behaviors, identified through a review of the communication literature, within the Four Habits Model framework. The experts could approve the proposed list, add new behaviors, or modify existing behaviors on the list.
Rounds 2, 3, and 4: These rounds followed the same format. Each behavior was rated for its fit with a particular habit, by agreeing or disagreeing. After each round, the percentage of agreement for each behavior was calculated, and that data determined which behaviors would be analyzed in the following rounds. Behaviors that received more than 70% of the experts' votes were considered to have achieved consensus.
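As a rough illustration of the per-round tally, the sketch below assumes each of the 17 experts casts a binary agree/disagree vote per behavior; the behavior names and vote counts are hypothetical.

# One rating round: a behavior achieves consensus when more than 70% of
# the experts agree it fits the habit; the rest carry forward for
# re-rating (possibly reworded) in the next round.
def tally_round(agree_votes, n_experts=17, threshold=0.70):
    consensus, unresolved = [], []
    for behavior, agrees in agree_votes.items():
        (consensus if agrees / n_experts > threshold else unresolved).append(behavior)
    return consensus, unresolved

round2 = {"asks questions": 16, "expresses preferences": 13, "interrupts": 7}
kept, carried_forward = tally_round(round2)
print(kept)             # ['asks questions', 'expresses preferences']
print(carried_forward)  # ['interrupts'] is re-rated in rounds 3 and 4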

Results

Over the course of the four rounds, the experts started with the 14 originally proposed patient verbal communication behaviors, modified 12 of them, and added 20 more. They eventually retained only 22 behaviors, including such obvious behaviors as asking questions, expressing preferences, and summarizing information.

Conclusion

The Delphi process appears to work well for identifying more effective ways for physicians and patients to communicate. However, without notes from the actual rounds showing how the experts voted on what they believed was right or wrong, it is difficult to make a real assessment of the Delphi process here.

Essential Components Of Curricular Learning Communities In Higher Education

The purpose of this study was to identify the essential components of curricular learning communities in higher education. A panel of experts participated in a four-round Delphi process designed to identify these essential features. The writer used a modified Delphi process to first elicit and then rate the importance of characteristics of curricular learning communities in higher education.

Delphi Advantages:
-The Delphi technique offers the advantage of group response without the attendant disadvantages sometimes experienced with group problem solving or decision-making.
-Expert participants are more likely to generate reasoned, independent, and well-considered opinions in the absence of exposure to the "persuasively stated opinions of others". Because the experts do not ever participate in a face-to-face discussion, there is no danger of one or more individuals’ opinions being swayed by a more dominant or more experienced individual.
-Efficiency and flexibility, especially in light of modern communication technologies such as e-mail and the Internet. Experts may be drawn from a wide geographic area, and the participants’ commitment in terms of time and money invested is minimal.
-Delphi method has been shown to be an effective way to conduct research when the responses being sought are value judgments rather than factual information. Although it is more difficult to assess the "correctness" of value judgments, it is generally agreed upon that value judgments are not all equal but can in fact be more "right" or more "wrong."

Delphi Limitations:
-Delphi should not be used when any of the following three critical conditions are not present: adequate time, participant skill in written communication, and high participant motivation. It is estimated that a minimum of 45 days is required to carry out a Delphi study.
-Participants must be knowledgeable and able to clearly communicate their ideas. A high degree of motivation is needed to offset the tendency for participant dropout as the study progresses. Because there is no direct contact between participants, those who are not highly motivated and interested in the subject at hand may feel isolated or detached from the process.
-Another limitation is the potential for bias in Delphi studies, which can arise from poorly worded or leading questions or from selective interpretation of the results.

Instrument Design and Implementation
Round One: Initial Survey:
The first round in the current study consisted of a brief survey, designed to collect some demographic data on the participants, and one open-ended question.

Round Two: Questionnaire One:
A list of 79 features was compiled from the information obtained in the initial survey. Obvious repetitions were eliminated, though items that were similar but not exactly the same were maintained. Items were sorted into four categories: Curricular Features, Pedagogical Features, Structural Features, and Environmental Features. Participants were asked to rate each feature on a Likert-type scale, identifying each feature as an "essential" (5), "very important" (4), "moderately important" (3), "slightly important" (2), or "not important" (1) characteristic of a curricular learning community.

Round Three: Questionnaire Two:
Questionnaire Two listed only the features that had received a mean rating of 4.0 or higher in the previous round. Once again the items were placed into the four categories of Curricular, Pedagogical, Structural, and Environmental.

Round Four: Questionnaire Three:
The third and final questionnaire listed the forty features that received the highest ratings (determined by mean and mode) on the previous questionnaire. Panelists were given the following information: the ranking of the items from the first and second questionnaires, the mean score of the items from both rounds, and the number of times each item was selected as one of the three to five most important items.

In this round, panelists were asked to assign a total of 100 value points to the forty items. At the end of this questionnaire, participants were asked to answer another open-ended question.
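The scoring pipeline across the three questionnaires can be sketched as follows. The 5-point scale and the 4.0 mean cutoff come from the study; the item names, panel size, and point allocations below are hypothetical.

import statistics

# Ratings use the study's scale: 5 = essential ... 1 = not important.
ratings = {  # item -> one Likert rating per panelist (hypothetical data)
    "shared courses": [5, 5, 4, 4, 5],
    "faculty collaboration": [4, 4, 5, 3, 4],
    "themed housing": [3, 2, 3, 4, 2],
}

# Questionnaire Two -> Three: keep items with a mean rating of 4.0 or higher.
retained = {item: r for item, r in ratings.items() if statistics.mean(r) >= 4.0}
print(sorted(retained))  # "themed housing" (mean 2.8) drops out

# Questionnaire Three: each panelist spreads 100 value points over the
# retained items; summing per item yields the final importance ranking.
points = {"shared courses": [40, 35, 50], "faculty collaboration": [60, 65, 50]}
final_ranking = sorted(points, key=lambda item: sum(points[item]), reverse=True)
print(final_ranking)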

Summary of Data Collection and Analysis Procedures:

The following table outlines the four-round Delphi procedure that was followed in this study: (see link for table)

Link: http://www.winona.edu/advising/lcchapter3.htm

Using the Delphi Technique to Achieve Consensus

How it is leading us away from representative government to an illusion of citizen participation.

This article explains the Delphi Technique and how it is used as a methodology in the education system. It is critical of the technique and describes the consequences and motivations of those who use it.

The Delphi Technique and consensus building are both founded in the same principle - the Hegelian dialectic of thesis, antithesis, and synthesis, with synthesis becoming the new thesis. The goal is a continual evolution to "oneness of mind" (consensus means solidarity of belief) -the collective mind, the wholistic society, the wholistic earth, etc. In thesis and antithesis, opinions or views are presented on a subject to establish views and opposing views. In synthesis, opposites are brought together to form the new thesis. All participants in the process are then to accept ownership of the new thesis and support it, changing their views to align with the new thesis. Through a continual process of evolution, "oneness of mind" will supposedly occur.
In group settings, the Delphi Technique is an unethical method of achieving consensus on controversial topics. It requires well-trained professionals, known as "facilitators" or "change agents," who deliberately escalate tension among group members, pitting one faction against another to make a preordained viewpoint appear "sensible," while making opposing views appear ridiculous.

The facilitators or change agents encourage each person in a group to express concerns about the programs, projects, or policies in question. They listen attentively, elicit input from group members, form "task forces," urge participants to make lists, and in going through these motions, learn about each member of a group. They are trained to identify the "leaders," the "loud mouths," the "weak or non-committal members," and those who are apt to change sides frequently during an argument.

Suddenly, the amiable facilitators become professional agitators and "devil's advocates." Using the "divide and conquer" principle, they manipulate one opinion against another, making those who are out of step appear "ridiculous, unknowledgeable, inarticulate, or dogmatic." They attempt to anger certain participants, thereby accelerating tensions. The facilitators are well trained in psychological manipulation. They are able to predict the reactions of each member in a group. Individuals in opposition to the desired policy or program will be shut out.

The Delphi Technique works. It is very effective with parents, teachers, school children, and community groups. The "targets" rarely, if ever, realize that they are being manipulated. If they do suspect what is happening, they do not know how to end the process. The facilitator seeks to polarize the group in order to become an accepted member of the group and of the process. The desired idea is then placed on the table and individual opinions are sought during discussion. Soon, associates from the divided group begin to adopt the idea as if it were their own, and they pressure the entire group to accept their proposition.

How the Delphi Technique Works
First, a facilitator is hired. While his job is supposedly neutral and non-judgmental, the opposite is actually true. The facilitator is there to direct the meeting to a preset conclusion. The facilitator begins by working the crowd to establish a good-guy-bad-guy scenario. Anyone disagreeing with the facilitator must be made to appear as the bad guy, with the facilitator appearing as the good guy.

Next, the attendees are broken up into smaller groups of seven or eight people. Each group has its own facilitator. The group facilitators steer participants to discuss preset issues, employing the same tactics as the lead facilitator. Why hold such meetings at all if the outcomes are already established? The answer is because it is imperative for the acceptance of the School-to-Work agenda, or the environmental agenda, or whatever the agenda, that ordinary people assume ownership of the preset outcomes. If people believe an idea is theirs, they'll support it. If they believe an idea is being forced on them, they'll resist.


How to Defuse the Delphi Technique


Three steps can defuse the Delphi Technique when facilitators attempt to steer a meeting in a specific direction.

Always be charming, courteous, and pleasant. Smile. Moderate your voice so as not to come across as belligerent or aggressive.

Stay focused. If possible, jot down your thoughts or questions. When facilitators are asked questions they don't want to answer, they often digress from the issue that was raised and try instead to put the questioner on the defensive. Do not fall for this tactic. Courteously bring the facilitator back to your original question. If he rephrases it so that it becomes an accusatory statement (a popular tactic), simply say, "That is not what I asked. What I asked was . . ." and repeat your question.

Be persistent. If putting you on the defensive doesn't work, facilitators often resort to long monologues that drag on for several minutes. During that time, the group usually forgets the question that was asked, which is the intent. Let the facilitator finish. Then with polite persistence state: "But you didn't answer my question. My question was . . ." and repeat your question.

Never become angry under any circumstances. Anger directed at the facilitator will immediately make the facilitator the victim. This defeats the purpose. The goal of facilitators is to make the majority of the group members like them, and to alienate anyone who might pose a threat to the realization of their agenda. People with firm, fixed beliefs, who are not afraid to stand up for what they believe in, are obvious threats.

At a meeting, have two or three people who know the Delphi Technique dispersed through the crowd so that, when the facilitator digresses from a question, they can stand up and politely say: "But you didn't answer that lady/gentleman's question."

Establish a plan of action before a meeting. Everyone on your team should know his part. Later, analyze what went right, what went wrong and why, and what needs to happen the next time. Never strategize during a meeting.

link: http://www.eagleforum.org/educate/1998/nov98/focus.html

In this kind of setting the process sounds very political, and personal agendas or vendettas may come into play if a facilitator is not in control of the situation. The illusion of representative participation is obvious in this type of situation.

Apprehending the Future: Emerging Technologies, from Science Fiction to Campus Reality

In this article, Bryan Alexander, Director of Research at the National Institute for Technology and Liberal Education (NITLE), surveys the various techniques used to decipher emerging technology trends from the perspective of higher education. He notes that the Delphi technique has been particularly useful in forecasting technology in higher education.

One of the cases where the Delphi technique is used is the Horizon Project. Launched in 2002, the project draws on a large body of experts across academia. Over several months, the group identifies trends, ranks their impact, compares estimates, and progressively builds up a profile of emerging technologies, which is then published as the annual Horizon Report. The January 2009 report identified the following technologies:

  • Mobiles (time-to-adoption: one year or less)
  • Cloud computing (time-to-adoption: one year or less)
  • Geo-Everything (time-to-adoption: two to three years)
  • The Personal Web (time-to-adoption: two to three years)
  • Semantic-Aware Applications (time-to-adoption: four to five years)
  • Smart Objects (time-to-adoption: four to five years)
Another application of the Delphi Technique in higher education was "The Future of the Internet III" project by the Pew Internet & American Life Project and Elon University. This study was much broader in scope and had a much longer timeline. Its key findings were:

  • The mobile device will be the primary connection tool to the Internet for most people in the world in 2020.
  • The transparency of people and organizations will increase, but that will not necessarily yield more personal integrity, social tolerance, or forgiveness.
  • Talk and touch user-interfaces with the Internet will be more prevalent and accepted by 2020.
  • Those working to enforce intellectual property law and copyright protection will remain in a continuing "arms race," with the crackers who will find ways to copy and share content without payment.
  • The divisions between "personal" time and work time and between physical and virtual reality will be further erased for everyone who's connected, and the results will be mixed in terms of social relations.
  • Next-generation engineering of the network to improve the current Internet architecture is more likely than an effort to rebuild the architecture from scratch.
In conclusion, the author states that no technique can effectively predict the future; using a combination of techniques can, at best, give us some idea of it. The future is increasingly complex, and "black swans" continue to occur, with enormous effects on what comes next.

Comments:

As can be seen from the conclusions above, they do not seem too radical or innovative; instead they toe the line of what is believed to be the general consensus (for instance, mobile devices being the primary connection tool in 2020, or touch and talk user interfaces becoming prevalent, does not necessarily need to be deciphered by experts). This is in fact the major drawback of the technique: Delphi outcomes can be driven by a desire for consensus rather than actual agreement, meaning that divergent ideas can, and often do, get quashed.

Saturday, April 17, 2010

The Delphi Technique: Let's Stop Being Manipulated

More and more citizens are being invited to participate in various forms of meetings, councils, or boards to "help determine" public policy in one field or another. According to the author of this article, Albert V. Burns, this sounds great, but in reality it is deception.

Generally, each meeting has someone designated to facilitate it. The facilitator's job is supposedly to be a neutral, non-directing helper who sees that the meeting flows smoothly. Actually, he or she is there for exactly the opposite reason: to see that the conclusions reached during the meeting are in accord with a plan already decided upon by those who called the meeting.

The process used to "facilitate" the meeting is called the Delphi Technique. It was developed by the RAND Corporation for the U.S. Department of Defense in the 1950s.

How does the process take place?

First, the person who will be leading the meeting, the facilitator, must be a likable person with whom the meeting participants can agree and sympathize.

Facilitators are trained to recognize potential opponents and to make such people appear aggressive and foolish. The audience is broken up into groups of seven or eight people each.

Within each group, discussion takes place on issues already decided upon by the meeting's leadership. Generally, the participants are asked to write down their ideas and disagreements, with the papers to be turned in and "compiled" for general discussion.

How do you know that the ideas on your notes were included in the final result? You do not! You are led to conclude that you were probably in the minority, and you do not even know whether anyone's ideas made it into the final conclusion.
Those who organized the meeting are then able to tell the community that the conclusions reached at the meeting are the result of public participation.

Actually, the desired conclusions had been established, long before the meeting ever took place.

Friday, April 16, 2010

Prioritization Process Using Delphi Technique

In a white paper, Alan Cline of Carolla Development, Inc. gives some insights on how to prioritize using the Delphi technique.

Background and Motivation

The Delphi technique was developed in the 1960s as a forecasting methodology. Later, the U.S. government enhanced it as a group decision-making tool. Delphi is particularly appropriate when decision-making is required in a political or emotional environment. The tool works formally or informally, in large or small contexts; for example, Taiwan used the method to prioritize its entire information technology industry.

Delphi Prioritization Procedure

1- Pick a facilitation leader: The facilitator is an expert in research data collection, and is not a stakeholder.

2- Select a panel of experts: The panelists should have intimate knowledge of the projects.

3- Identify a straw man criteria list from the panel: Brainstorm a list of criteria that are appropriate to the projects.

4- The panel ranks the criteria: Each panelist ranks the criteria from 1 (very important) to 3 (not important). The ranking should be done individually and anonymously.

5- Calculate the mean and standard deviation

6- Re-rank the criteria

7- Identify project constraints and preferences: Constraints could be budget limits or mandatory regulations.

8- Rank projects by constraint and preference

9- Analyze the results and feed them back to the panel

10- Re-rank the projects until the ranking stabilizes (a sketch of steps 4 through 6 follows)
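Here is a minimal sketch of steps 4 through 6 under stated assumptions: the criteria, the individual ranks, and the 0.5 standard-deviation cutoff for calling a ranking stable are all illustrative, since the white paper does not fix a numeric stopping rule.

import statistics

def summarize(rankings):
    """rankings: criterion -> list of individual 1-3 ranks (1 = very important)."""
    return {c: (statistics.mean(r), statistics.stdev(r)) for c, r in rankings.items()}

def is_stable(summary, max_stdev=0.5):
    # Treat the round as converged when every criterion's spread is small.
    return all(sd <= max_stdev for _, sd in summary.values())

round1 = {
    "strategic fit": [1, 1, 2, 1],
    "cost":          [2, 3, 1, 2],   # wide spread: discuss and re-rank
    "risk":          [3, 2, 3, 3],
}
summary = summarize(round1)
for criterion, (mean, sd) in sorted(summary.items(), key=lambda kv: kv[1][0]):
    print(f"{criterion}: mean={mean:.2f} stdev={sd:.2f}")
print("another round needed:", not is_stable(summary))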

Thursday, April 15, 2010

Green Team Summary of Findings: Imagery Analysis (4 out of 5 stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 16 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 15 April 2010 regarding Imagery Analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description:

Imagery analysis is the extraction of meaningful information from images. It is used by both the military and civilian sectors. Imagery analysis is highly useful for state-related matters, natural resource-related businesses, and scientific research institutes. The groups that use imagery analysis run the gamut from NGOs to multinational organizations.

There are many different types of imagery that can be used alone or in combination with others, including satellite, infrared, electro-optical, and multispectral imagery. One of the most well-known types is satellite imagery. Since the early days of satellite imagery there have been dramatic improvements in availability and quality that have led to increased accessibility. Some of the largest commercial vendors are GeoEye, DigitalGlobe, ImageSat International, and Satellite Imaging Corporation. Google Earth is one of the most popular sources for images because it is commercially available to anyone with Internet access.


STRENGTHS AND WEAKNESSES:

Strengths
  • Capable of providing a detailed overview of a large area
  • More manageable than on-the-ground classification efforts
  • Accessibility and capability of systems continue to improve
  • Data provided is clear and highly detailed
  • Can be used to corroborate other collected data
  • Can be applied to a variety of problems, and is very adaptable
  • Allows for data collection in remote areas
  • Relatively easy to perform due to commercial availability
  • Allows for a retrospective view for an area

Weaknesses
  • Software and systems can be expensive
  • Limited by resolution, image quality, atmospheric haze, and contrast
  • Hard to compare images taken from different angles and at different resolutions
  • This methodology still requires a trained analyst to go over the tentative findings and discriminate between objects
  • Need for trained professionals (level of training debatable)
  • Chance for error if preparatory steps, such as accounting for clouds, resolution, etc., are not followed
  • Does not account for other factors such as change between the images
  • Can be subject to deception

How To:

There appear to be no standard procedures applicable to all images at all times or for all purposes. However, several standard processes for specific purposes were identified through the literature reviewed in advance of this class. For example, with respect to classifying types of vegetation:

1. Identify landmarks
2. Align landmarks (if using photos from different time periods)
3. Standardize levels of contrast within an image (necessary when images were taken by different sensors)
4. Image classification: divide the photograph into "clusters", spatial areas with the same characteristics, then decide what those are with respect to the region and culture of the area (a clustering sketch follows this list)
5. Accuracy assessment: compare classification results to what the ground areas truly are, as revealed by aerial photographs or teams on the ground
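Step 4's "clusters" are often produced by an unsupervised algorithm such as k-means. The sketch below runs k-means on a synthetic three-band image and assumes scikit-learn is available; the choice of three clusters and the eventual cluster names are illustrative, and per step 4 a human analyst still has to assign each cluster a meaning.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.random((100, 100, 3))       # synthetic height x width x bands image
pixels = image.reshape(-1, 3)           # one row per pixel

# Group pixels with similar spectral values into k clusters.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
cluster_map = labels.reshape(100, 100)  # the spatial "clusters" of step 4

# An analyst would now inspect each cluster against known ground cover and
# assign meanings, e.g. {0: "water", 1: "forest", 2: "grassland"}.
print(np.bincount(labels))              # pixel count per cluster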


Personal Application:

The class looked at ten designated images; five different cities at two different heights. The class had three minutes to look at the two images of the city and identify the following: Density, Bodies of Water, Building Shadows, Landmarks, and Vegetation. These five criteria helped to narrow the range of possible cities. For example, when looking at New York City, the identification was made possible by looking for vegetation (Central Park), building shadows (consistently tall buildings), and density (buildings very close together over a wide area).


Summary of Findings (White Team): Geospatial Analysis (4 out of 5 stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 16 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 15 April 2010 regarding Geospatial Analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description:
Geospatial Analysis is a method that uses aerial and satellite imagery technologies for analysis. These technologies include electro-optical, infrared, multispectral, and radar sensors. This type of imagery is commercially available and useful for a multitude of applications. Analyzing geospatial imagery can provide decision makers with visual indicators that offer evidence of internal operations. Static images aid the intelligence process by contributing to the all-source analysis provided to decision makers. Imagery can be combined with cartography and mapping to determine distances, heights, demographics, and population density.

Strengths:
  • Objective in nature; the data is structured.
  • Easy to perform, even for untrained analysts.
  • Readily available open sources, such as Google Earth, since it is commercially available.
  • Satellite imagery can provide data in remote areas that lack accurate geographic maps.
  • Usually more cost efficient than sending personnel to a site.
  • Can also provide powerful visualization to assist when shaping policy
  • The on-the-ground situation can be assessed, even when outside parties don't have access to the area.
  • The same data can be used on multiple types of problems/issues.
  • Technology is continually being updated to allow for easier use.
Weaknesses:
  • Potentially expensive.
  • Static images (only a snapshot in time).
  • May require training depending on the level of analysis required.
  • Errors may occur due to lack of resolution, deception efforts, or even weather.
  • Imagery can generally be obtained at specific time intervals (i.e.- satellites cannot maintain stationary low earth orbit).
  • Unable to view objects/structures located beneath the surface of the Earth.
  • May require additional resources to provide specific measurements (e.g.-distance between objects, or heights of structures).
  • Privacy and confidentiality issues may arise as technologies and resolution advance in sophistication.
How to:
There appear to be no standard procedures applicable to all images at all times or for all purposes. However, several standard processes for specific purposes were identified through the literature reviewed in advance of this class. For example, with respect to classifying types of vegetation:

  • Images to be analyzed should show the same season (in case multiple images are used for analysis) and must be accurately registered (matched up) to the ground and to each other.
  • The images must be radiometrically calibrated to minimize the effects of instrument variations and atmospheric haze (relative radiometric normalization, RRN). This is necessary when images are taken by different sensors, in order to standardize radiometry (contrast) within an image.
  • The landmarks in the images need to be aligned (geometric rectification). This is necessary for analysis over time.
  • A classification scheme should be decided on and designed. Categories such as cultivated land vs. grassland or forested vs. non-forested land can be used.
  • Classify the images: divide the photograph into "clusters", spatial areas with the same characteristics, then decide which category each cluster belongs in.
  • To reduce classification errors, techniques such as having a human check portions of the computer-classified work are also used (spatial re-classification).
  • Classification results are compared to what the ground areas truly are (as revealed by aerial photographs) to assess the accuracy of the classification; a confusion-matrix sketch of this step follows.
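The accuracy assessment in the last bullet is commonly summarized as a confusion matrix. Here is a small numpy sketch with hypothetical labels (0 = non-forested, 1 = forested) standing in for real checked locations.

import numpy as np

# Mapped class for ten checked locations vs. reference ("ground truth").
classified = np.array([0, 1, 1, 0, 1, 1, 0, 0, 1, 0])
reference  = np.array([0, 1, 0, 0, 1, 1, 0, 1, 1, 0])

confusion = np.zeros((2, 2), dtype=int)
for truth, pred in zip(reference, classified):
    confusion[truth, pred] += 1   # rows: reference class, cols: mapped class

overall_accuracy = np.trace(confusion) / confusion.sum()
print(confusion)
print(f"overall accuracy: {overall_accuracy:.0%}")  # 80% for this sample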
Application:
Geospatial Analysis, as a method, was demonstrated by looking at various images of major international cities. We had to identify them by viewing wide shots taken from Google Earth, followed by a close-up of each image. By recognizing key indicators (population density, landmarks, shadows, terrain, etc.), we were able to identify some, but not all, of the cities, which taught us that the method is best practiced with modifiers, particularly when resolution is lacking or distorted. The depth of imagery analysis depends on the source of the image (aerial vs. satellite) and the skill set of the analyst. This exercise, a descriptive analysis, would be an excellent starting point for those with little exposure to imagery analysis, as it is low cost and easily accessible.


Tuesday, April 13, 2010

Satellite Imagery Activism: Sharpening the Focus on Tropical Deforestation

The Rise of Satellite Imagery Activism
According to the authors (from the RAND Corp. and George Washington University), the first Landsat satellite was launched by the U.S. in 1972 and has since opened the door to satellite remote sensing, especially in terms of natural resource sustainability and human activity monitoring. The original users of satellite imagery were state agencies (civil and military), natural resource-related businesses, and scientific research institutes; today, however, the groups that use satellite imagery run the gamut from NGOs to multinational organizations. Since the early days of satellite imagery there have been dramatic improvements in availability and quality that have led to increased accessibility. First, cheap and highly powerful computing has broadened the range of individuals who can work with imagery data. Second, "the advent of user-friendly software for image processing and analysis has made imagery analysis much less the purview of remote sensing specialists". Third, the cost of imagery data has drastically declined with the onset of cheaper Landsat/SPOT images, declassified U.S. and Russian intelligence imagery, and many other forms of commercially available imagery. Lastly, the internet and CD-ROMs have facilitated the spread of imagery, with giants like Google moving into the arena.

Imagery Activism & Tropical Forests
Due to "rising levels of global transparency" and recent software and internet abilities, there are many avenues in which citizens can bring awareness to public policy issues through the use of satellite imagery. A lot of environmental factors can be analyzed via satellite imagery such as changes in vegetation, biological stress and habitat characterization. Out of these issues, there has been the greatest focus on monitoring trends in tropical deforestation. Besides simple deforestation, "satellite imagery can be used to monitor growth, as well as the effects of human activity and exploitation...on the surrounding ecosystems". Satellite images can provide objective proof of potential logging infractions and at the same time be built up into databases to provide long-term monitoring of forests. In the last several years, the international community's gaze has fallen on the tropical rainforests of Brazil and Indonesia. Brazil in particular has seen the dramatic expansion of road networks, farms, pastures and plantations, all at the expense of the rainforests.

Conclusions
In summary, this paper examined the rise of satellite imagery in today's society and how that imagery is used to monitor the world's tropical rainforests. The paper focused on deforestation, specifically in Brazil and Indonesia, and aims to attract global attention to the problem of deforestation by utilizing satellite imagery data and other geospatial technologies.

Source
http://www.aseanenvironment.info/Abstract/41016424.pdf

Tropical Cyclone Intensity Analysis and Forecasting from Satellite Imagery

Introduction
This article by Vernon F. Dvorak discusses the implications of analyzing cyclones with intensity analysis using satellite platforms. Dvorak notes that satellite imagery is highly useful in monitoring tropical cyclones because methods have been developed that allow analysts to estimate the intensity of a cyclone based on certain distinguishing cloud features.
The Technique
The technique of combining intensity analysis with satellite imagery yields rather good estimates of storm intensity. The specific cloud features used to estimate cyclone intensity fall into two categories: "central features" and "outer-banding features". Each feature is assigned a "T-number" ("T" for tropical). T1 describes disturbances "exhibiting minimal but significant signs of tropical cyclone intensity", whereas T8 denotes the "maximum possible" cyclone intensity. Next, each of the cloud features is analyzed to determine whether the cyclone will remain at its modeled intensity over the next 24 hours.

The Analysis Procedure

The intensity analysis via the satellite platform consists of three stages. The first stage "requires a qualitative judgment as to how cloud features related to cyclone intensity have changed between yesterday's picture and today's". The second and third stages of analysis involve examining the overall cloud pattern and component features to see if they agree with the modeled intensity estimate that the forecasters arrived at during the first stage of analysis.
The Forecast Procedure
To produce the actual intensity forecast, forecasters either use the cyclone's model curve to obtain tomorrow's intensity or adjust the curve when an interruption due to landfall or the approach of some other unfavorable circumstance is near. Such disruptions can be detected through an unexplained change in T-number or in the cloud features.
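The consistency check lends itself to a small sketch. Everything numeric below is an illustrative assumption, not Dvorak's actual rule set: a model curve of one T-number of development per day and a 0.5 tolerance before the forecaster re-examines the cloud features.

def expected_t(yesterday_t, rate_per_day=1.0):
    # Assumed model development curve: steady strengthening, capped at T8.
    return min(yesterday_t + rate_per_day, 8.0)

def check_trend(yesterday_t, today_t, tolerance=0.5):
    """Flag unexplained T-number changes that may signal landfall or another
    unfavorable circumstance, per the forecast procedure above."""
    deviation = today_t - expected_t(yesterday_t)
    if abs(deviation) > tolerance:
        return f"adjust curve: unexplained change of {deviation:+.1f} T-numbers"
    return "intensity following model curve"

print(check_trend(3.0, 4.0))  # following the curve
print(check_trend(3.0, 2.0))  # -2.0 T-numbers: re-examine cloud features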
Performance
A researcher named Erickson conducted the first tests of intensity analysis via satellite imagery in 1972, and his results showed good consistency. Another researcher, Arnold (1972), used the T-number estimates in tandem with the Joint Typhoon Warning Center estimates for the west Pacific, with similarly good results.

(Satellite Imagery of Tropical Cyclone patterns with "T" Numbers)
Source
http://journals.ametsoc.org/doi/pdf/10.1175/1520-0493%281975%29103%3C0420%3ATCIAAF%3E2.0.CO%3B2

Validation of SPOT-5 Satellite Imagery for Geological Hazard Identification

Summary
This study was developed to determine whether SPOT-5 satellite imagery could be used to develop geographic hazard maps for areas lacking available resources. The study took place in Nicaragua and compared SPOT-5 analysis with newly created threat maps and additional geographic monitoring. Maps were created using SPOT-5 imagery for hazard inventory, hazard susceptibility, and vulnerability mapping. The study concluded that SPOT-5 imagery could be used to effectively develop geographic hazard maps. The system was, however, limited by the level of analysis needed to create the hazard maps.
Strengths
Satellite imagery was capable of providing the necessary data in remote areas that lack accurate geographic maps. Additionally, it allowed for direct GIS integration and an ability to gather data despite poor weather conditions.
Limitations
Despite the success of the SPOT-5 test, the full process of creating the maps required large amounts of additional analysis. The system was also unable to easily identify certain features needed for hazard mapping, which could lead to problems without additional support from standard measurement systems.

Source:http://www.cartesia.org/geodoc/isprs2004/comm1/papers/51.pdf

The Use Of Satellite Imagery In Landslide Studies In High Mountain Areas

Summary

This is a case-study that compares the application of Landsat ETM+ and IKONOS imagery when assessing a natural terrain's susceptibility to landslides. This study looked at six areas located in the upland areas of Nepal and Bhutan. In each case, the imagery has been used both to directly map landslides and to examine the occurrence of factors that might be important in landslide initiation, such as water seepage. The results from the imagery were bench-marked using field surveys.

The results of this study demonstrated that Landsat ETM+ continues to be the most cost-effective imagery tool for mapping landslide susceptibility. This is due to the fact that it is relatively low-cost and has high spectral resolution. However, the researchers state that the spatial resolution is still a significant limitation to Landsat ETM+.

The high-resolution, multispectral IKONOS imagery is not limited in the same way. With IKONOS, even small landslides can be mapped in great detail. However, according to the report, this type of imagery is less useful for factor-type mapping. The report concluded that the high cost of IKONOS will prevent most developing countries from developing and utilizing the tool, leaving Landsat ETM+ the better tool for mapping landslide susceptibility.

Source: http://www.securinglivelihoods.org/nepal/files/nepal%20case%20study.pdf

Processing Satellite Imagery To Detect Waste Tire Piles


Summary


This article, posted on www.techbriefs.com, discusses the development of a new methodology in satellite imagery analysis. The developers, Joseph Skiles, Cynthia Schmidt, Becky Quinlan, and Catherine Huybrechts, state that they have created a new methodology for processing commercially available satellite spectral imagery to identify and map waste tire piles in California. The methodology combines previously available commercial image-processing and georeferencing software, which is then used to develop a model that identifies tire piles.

Tire piles are difficult to distinguish in satellite imagery because of their low reflectance levels; they are often mistaken for shadows or deep water. The developers claim the methodology attempts to correct these misinterpretations using software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model incorporated lessons learned in previous research on detecting and mapping tire piles through various methods of analysis (manual/visual and/or computational) of aerial and satellite imagery.

Strengths
  • Reduces time spent surveying regions for tire sites
Weaknesses
  • This methodology still requires a trained analyst to go over the tentative findings and discriminate between tires, water, vegetation, etc.

Source: http://www.techbriefs.com/component/content/article/2486

Spatial Information Technologies in Critical Infrastructure Protection

Summary
This document, prepared by the National Consortia for Remote Sensing in Transportation (NCRST), addresses the necessity of remote sensing technologies in protecting critical transportation infrastructure. It discusses varying concepts for defining critical infrastructure, threats to infrastructure, protection of infrastructure, and disaster management needs, and then turns to the critical role of remote sensing technologies. While these have become a vital tool in critical infrastructure protection, the systems are limited by the users and their status.

Strengths
Remote sensing provides low-cost, multi-purpose, wide-area, "at-a-distance" data that can assist in modeling and in the identification of critical infrastructure. It can also provide powerful visualization to assist in shaping policy.

Limitations
While remote sensing data is useful, it may be unusable due to timeliness issues and the expertise required to convert the data. Some sensing may also be limited by its sensitive nature, restricting its use. Additional concerns stem from the interoperability of data sets and systems.

Source:http://www.ncgia.ucsb.edu/ncrst/research/cip/CIPAgenda.pdf

High-Resolution Satellite Imagery and the Conflict in Sri Lanka

In May 2009, the Science and Human Rights Program of the American Association for the Advancement of Science (AAAS) acquired and analyzed commercial high-resolution satellite imagery of the Civilian Safety Zone (CSZ) and surrounding area in northeastern Sri Lanka. The project was done at the request of Human Rights Watch and Amnesty International, who expressed concern over the status and safety of civilians due to the heavy fighting occurring 9-10 May 2009. Comparing the May 6 and May 10, 2009 images of the CSZ, AAAS found significant removal of IDP shelters. In addition, imagery showed evidence of bombshell craters, destroyed permanent structures, mortar positions, and 1,346 individual graves. AAAS's analysis was based on images from various publicly accessible commercial satellites, US Army Field Manuals, and open-source information from public statements and media reports.

Strengths - Satellite imagery analysis is a useful way to assess the situation on the ground during conflicts in which no outside parties are allowed in the area.

Weaknesses – None

Source: http://shr.aaas.org/geotech/srilanka/srilanka.shtml

Landsat Satellite Images Change Detection Methods

The Minnesota Department of Natural Resources (DNR) website describes its use of Landsat Thematic Mapper (TM) images to map Minnesota land cover. Landsat TM images are digital and differ from traditional digital images in their ability to express additional measures of brightness beyond simple RGB: Landsat records four additional sets of brightness from the near, middle, and thermal infrared portions of the electromagnetic spectrum. Landsat images are useful in showing changes between two images of the same area taken years apart. Image differencing, subtracting the original image from the new image to determine changes, is a simple technique in remote sensing.

Preparatory Steps

Both images should show the same season and must be accurately registered (matched up) to the ground and to each other. Remove areas with clouds and distinguish between forest and nonforest areas. The images must be radiometrically calibrated to minimize effects of instrument variations and atmospheric haze.

Analysis

Values in the difference image that are red or orange show loss of vegetation, while greens depict growth of vegetation. The example uses a simple bell curve in which minor changes in vegetation account for 80% of the total changes and significant changes fall on either side of that distribution.
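A numpy sketch of the differencing and thresholding, assuming two co-registered, radiometrically calibrated single-band images (synthetic here): cutting at about plus or minus 1.28 standard deviations leaves roughly 80% of a normal distribution in the "minor change" band, approximating the bell-curve idea above.

import numpy as np

rng = np.random.default_rng(1)
before = rng.normal(100, 10, size=(50, 50))        # earlier image
after = before + rng.normal(0, 5, size=(50, 50))   # later image, mostly minor change

diff = after - before                              # the difference image
cut = 1.28 * diff.std()                            # ~80% falls within +/- cut

loss = diff < -cut   # rendered red/orange in the DNR example: vegetation loss
gain = diff > cut    # rendered green: vegetation growth
print(f"loss: {loss.mean():.0%}, gain: {gain.mean():.0%}, "
      f"minor: {1 - loss.mean() - gain.mean():.0%}")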

Strengths

  • Relatively easy to perform
  • Objective in nature

Weaknesses

  • Chance for error if preparatory steps not followed
  • Does not account for other factors of vegetation change between the images
  • Unlike satellite image analysis, it is not possible to compare analysis units and determine which has shown the most vegetation loss in this case, since the statistical thresholds display roughly the same amount of change and non-change regardless of what has actually happened in the area.

Source: http://www.ra.dnr.state.mn.us/changeview/change_tech.html

Monday, April 12, 2010

Tsunami Satellite Image Analysis Reveals Dramatic Water Quality Changes

Satellite Imagery Analysis

Summary:
An article in XPress Press, via Applied Analysis Inc., describes how satellite imagery analysis was used to determine water quality levels after the tsunami that hit Sri Lanka and India. Applied Analysis Inc., an American company, used satellite imagery analysis processes originally designed for military use to determine the clarity of the water. The company analyzed IKONOS satellite imagery and believes the technology it used will be able to identify other problems with water supplies in the near future.

Strengths:
  • Can be used on multiple types of problems/issues.
  • Continually upgraded technology.
  • Produces relatively clear, easily visualized data.
Weaknesses:
  • None noted.

Link

http://www.xpresspress.com/news/AppliedAnalysis_011305.html

Satellite Image Analysis Reveals South Ossetian Damage

Satellite Imagery Analysis

Summary:
An article from ScienceDaily on October 9, 2008 discusses the use of satellite imagery analysis to determine the damage done in South Ossetia during the Russian-Georgian conflict. The article draws on analysis done by the American Association for the Advancement of Science at the request of Amnesty International USA. The AAAS study examined damage to 24 villages near the city of Tskhinvali, which is considered the main city in South Ossetia. The AAAS looked at the damage to structures on August 10th and compared it to the damage on August 19th. They determined that Tskhinvali sustained the greatest amount of damage (182 structures) between the 10th and 19th of August. The details of how they believe the buildings were destroyed corroborated the stories from the ground that fires were the main cause. The AAAS study combined eye-witness accounts of destruction with objectively interpreted satellite images purchased through three commercial vendors: GeoEye, DigitalGlobe and ImageSat International. They also used the software packages ERDAS Imagine and ArcView to leverage the power of remote sensing technology and Geographic Information Systems (GIS), respectively.

Strengths:
  • Used to corroborate reports on the ground.
  • Can be very detailed.
  • Easy to compare images from different dates.

Weaknesses:
  • Need for trained professionals.
  • Software can be expensive.

Source:

http://www.sciencedaily.com/releases/2008/10/081009144105.htm

Sunday, April 11, 2010

A Comparison of Aerial Photography, Landsat TM and SPOT Satellite Imagery

A study was done on the tropical wetland environments of northern Australia, where population growth and environmental problems are encroaching on and threatening the wetlands. Because of the remoteness and fragility of the area, remote sensing was an appealing method for obtaining information about the land. The purpose of the study was to investigate the utility of several image data sets to see which type of image produced the best resolution for mapping the environment.

Method:
1.) In the study, Landsat TM (satellite imagery), SPOT XS and PAN (remote sensing satellite imagery), and large-scale, true-color aerial photography were evaluated for mapping the vegetation.

2.) Five sample points were placed at 1km intervals. Each point was labeled and information about the location was recorded. To ensure the correct cover types were recorded, both photographic and written records were collected at each site. This procedure provided 12 observations per sample point and a total sample size of 240 observations for the entire swamp.

3.) Landsat TM, SPOT, and large-scale photography were used for each location.

4.) Images were evaluated.

Results:
The study concluded that aerial photography was superior to satellite imagery for detailed mapping of the vegetation in the environment studied.

The results suggest that either Landsat TM or SPOT XS imagery is adequate for mapping these generalized land cover classes. But the resolution needed for the classification of the vegetation was acquired through the aerial imagery. When comparing the two satellite imagery data sets, the broader spectral range of the Landsat TM data appeared to more than compensate for the superior spatial resolution of the SPOT imagery.

The researchers note that the most useful technique will depend on the application and is closely related to the physical characteristics of the features being mapped.

Study can be accessed through EBSCOhost:
Vegetation mapping of a tropical freshwater swamp in the Northern Territory, Australia: a comparison of aerial photography, Landsat TM and SPOT satellite imagery by K. R. HARVEY and G. J. E. HILL