Tuesday, March 23, 2010

Getting Smart About BI: Best Practices Deliver Real Value

Introduction
In 2006, BusinessWeek Research Services (BWRS) conducted a study to discover and analyze the implementation practices of companies that use business intelligence (BI) and other analytics systems. Additionally, the researchers sought to determine whether BI helped these companies achieve their business value goals. The results of this study allowed BWRS to compile a list of five Best Practices for BI.

Methodology
BWRS conducted an online survey of senior executives and managers at large companies who are members of BWRS' Market Advisory Board, as well as North American subscribers to BusinessWeek magazine and/or the BusinessWeek Web site; BWRS received 359 responses. In addition to the online survey, researchers conducted in-depth telephone interviews with 10 senior officials at large and mid-size companies known to be using BI and analytics. Finally, BWRS analyzed results of prior BWRS surveys about BI and general business trends and applied those results to the current study.

Results
The results of this study indicate that there are five best practices in business that result in achieving value from BI.
  1. Business information governance programs--programs that govern standards and corporate requirements for data management.
  2. Enterprise information strategy--a corporate-level strategy to organize, structure and leverage information assets.
  3. Information quality programs--formal procedures to identify, fix and prevent data quality problems such as inaccuracy and incompleteness. Most BI experts say data quality is the number one prerequisite for delivering BI business value. This is especially true at large companies because they are prone to decentralized systems.
  4. Enterprise data warehouse--central repositories of enterprise data for reporting and analytical purposes. These warehouses prevent inaccuracy and are less expensive to maintain than a series of smaller repositories throughout an organization.
  5. BI competency centers--a core team dedicated to managing BI within their organization/business.
Conclusion
There is a clear correlation between achieving business value and implementing these five data management techniques. The survey found that the companies that achieve or exceed their expected business value were much more likely to have adopted these techniques. However, using these techniques alone will not result in achieving value from BI. The study offers examples such as executive sponsorship, business and IT alignment, and encouraging the adoption of BI tools as techniques to use in addition to the five Best Practices to achieve the full value of BI.

Nobody Left Behind: Report on Exemplary and Best Practices in Disaster Preparedness and Emergency Response For People with Disabilities

Description
The Nobody Left Behind (NLB) study was developed to evaluate the preparedness of emergency management sites and their ability to assist persons with mobility impairments. A portion of the NLB study was designed to identify exemplary and emerging best practices. Data for the study came from telephone surveys of site emergency managers and from reviews of local emergency plans.

Exemplary Practices
Of the 30 sites surveyed, six were identified as having exemplary practices. These six sites were evenly divided between rural and urban settings. The sites were judged exemplary based on the cumulative effect of the policies and procedures in place at the facilities and in their communities.

Exemplary Practices:
1. Administering and maintaining a surveillance system, usually a self-identified registry system of persons needing assistance during a disaster or emergency.
2. Identifying accessible transportation vehicles and guidelines to evacuate persons with disabilities needing assistance.
3. Establishing a so-called special medical needs shelter.
4. Conducting training and exercises on evacuation of persons with disabilities.

Emerging Best Practices
Analysis of policies from the 30 surveyed sites revealed a number of evolving best practices. These were identified through research and through recommendations from policy experts, advisors, and consultants. The analysis of policies and practices yielded three emerging best practices.

Emerging Best Practices:
1. Comprehensive planning for persons with disabilities in the local emergency management plan.
2. Comprehensive planning tool using surveillance and consumer education.
3. Day to day surveillance and consumer education outreach.

Conclusion
This study attempts to understand best practices in emergency management relating to persons with disabilities. The findings were limited by the size of the surveyed population, and the identified practices may not be effective or feasible in every situation.

White Paper: Best Practices in Emergency Alerting

Summary
This White Paper, developed by Rave Wireless, is designed to assist colleges and universities in their emergency management practices. The paper provides an initial overview of the fundamentals of emergency management. After establishing these fundamentals, it outlines best practices for emergency preparedness and for establishing a communication/alert system for a college or university community. Additionally, the paper provides templates for internal assessment.

Emergency Preparedness Best Practices
The initial pages of the article provide a number of Best Practices in Emergency Management for colleges and universities. These best practices focus on internal assessment to improve the effectiveness of emergency preparedness programs. This assessment is formed from successful practices used by other colleges and universities.

Best Practices in Emergency Preparedness: Strategic Planning
-Assess and rank threats
-Identify prevention methods
-Define what constitutes an emergency
-Identify constituents and their roles
-Conduct assessments of situation and assess resource gaps
-Define scenarios and time lines for completing plans

Communication Plan Practices
Similar to the paper's presentation of preparedness best practices, the article also provides best practices for implementing an emergency communication plan. The practices outlined are based on both the company's technical recommendations and practices employed by other colleges and universities, and they cover everything from the initial implementation of a system to post-incident evaluation.

Implementation
-Create communication processes, templates and approval procedures
-Define acceptable terms for emergency mass communications
-Determine target audience(s) specifics
-Identify the appropriate mode of communication for each audience
-Consider coverage and capacity limitations of your available communication modes
-Consider implementation of an inbound notification infrastructure
-Define coordination around the approval process in detail
-Establish policies for the frequency and level of communication
-Create message templates
-Identify alternates and back up plans
-Document and make plans easily accessible
-Communicate the plan to the campus and local community
-Internalize via Practice, Practice, Practice

Response
-Communicate truthfully and promptly, and communicate succinctly
-Be comfortable not over communicating
-Expect to be forced to make decisions based on incomplete information
-Organize your first responders
-Plan to utilize response features in your alert solution

Recovery
-Determine decision makers
-Provide a smooth transition from crisis to normal mode

Evaluation and Assessment
-Perform post-mortem analysis
-Utilize system self reporting

Evaluation
The article itself acknowledges that best practices are context-specific and not always effective for the parties adopting them. Ideally, practices in emergency management are designed to target all threats to a community, but they often cannot keep pace with the rapidly changing nature of those threats.

Physician Explanations For Failing To Comply With "Best Practices"

Introduction
As the title indicates, this article is devoted to improving physician compliance with evidence-based guidelines. More specifically, the primary objective of the paper was to examine why physicians do not follow known "best practices" when treating patients with type 2 diabetes.

Method
The research design was a descriptive study based on self-assessed compliance. The research team consisted of 85 internists who volunteered to help conduct the study. Through a professional organization, 7,000 randomly generated survey requests were sent out; 800 of the 7,000 physicians expressed interest, and 137 went on to complete the task. To complete the survey, the physicians simply reviewed their own charts for type 2 diabetes patients and then reported open-ended comments on the reasons they did not comply with known "best practices".

Results/Conclusion
For diabetes care measures, "physician noncompliance was most common for screening urinalysis (26%) and screening microalbuminuria (46%)" among the five measures examined. In the physicians' open-ended comments, the main reasons cited for noncompliance were physician oversight, patient nonadherence, and systems issues. Physicians did admit that sometimes a conscious decision is made not to comply because of the patient's age, comorbid illness or other factors. The study concludes that "even among a self-selected group of physicians, noncompliance with best practices in diabetes is common" and that physicians often disagree about what constitutes "best practice".

Monday, March 22, 2010

Small Business Best Practice Benchmarking: How to Effectively Borrow Ideas, Strategies and Tactics

Darrell Zahorsky, a former About.com Guide, wrote an article about how to use Best Practices for small businesses. He defines the concept by saying, "A best practice is the process of finding and using ideas and strategies from outside your company and industry to improve performance in any given area." He also mentions that Best Practices has historically been used by big business and that it is similar to benchmarking.

Zahorsky then explains that Best Practices can help small businesses by allowing them to...
  • Reduce costs
  • Avoid mistakes
  • Find new ideas
  • Improve performance.
Zahorsky then describes that in order to successfully use Best Practices, a small business must first note the "ingredients" a successful competitor is using to improve. He notes that only by looking at the specific steps a competitor takes to succeed can you "bake the cake."

In order to apply Best Practices, Zahorsky gives a list of steps to use.
  • Identify one business process or service to improve. (Product delivery)
  • Look for one metric to measure. (Late Shipment %)
  • Find competitors and companies within your industry and outside your industry. (FedEx)
  • Collect information on the successful, best practices of other companies. (FedEx spoke and hub system)
  • Modify the best practice for your situation. (Have one retail store per city act as central hub for shipments.)
  • Implement the process then measure the results.
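The measurement step in Zahorsky's list can be made concrete with a short sketch. The shipment records below are made up purely for illustration; the only point is tracking the single chosen metric (late shipment %) before and after a borrowed practice is adopted:

```python
# Toy illustration of "look for one metric to measure" and
# "implement the process then measure the results".
# The shipment data is hypothetical, not from the article.

def late_shipment_pct(shipments):
    """Percentage of shipments delivered after their promised day."""
    late = sum(1 for promised, actual in shipments if actual > promised)
    return 100.0 * late / len(shipments)

# (promised_day, actual_day) pairs, before and after the process change
before = [(3, 5), (2, 2), (4, 6), (1, 1), (3, 4)]
after = [(3, 3), (2, 2), (4, 5), (1, 1), (3, 3)]

print(late_shipment_pct(before))  # 60.0
print(late_shipment_pct(after))   # 20.0
```

Comparing the two numbers is the benchmarking payoff: it shows whether the adapted practice actually moved the metric.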

Best Practice In Performance Reporting In Natural Resource Management

Introduction
Published in 1997 by the Department of Natural Resources and Environment - Victoria, this article is chiefly about the progress of "outcome based management" of natural resources in Australian park management agencies. According to the paper, the Treasury Board of Canada Secretariat (1996) provides a good definition for best practices sharing which is the "capture, dissemination and sharing of a work method, process or initiative to improve organizational effectiveness, service delivery and employee satisfaction".

Method
From here, the article explains that a literature review was performed and a questionnaire was created and distributed. The questionnaire was sent to an officer in each State and Territory protected area management agency as well as to the Australian National Parks and Wildlife Service. The purpose of the survey was five-fold: 1) examine the degree to which performance reporting was used by agencies, 2) determine the methods that were used, 3) assess the extent of ecological monitoring programs, 4) examine any "State Of the Environment (SOE)" reporting used for performance reporting, and 5) investigate best practices in activity-based monitoring.

As a result of the questionnaire and literature review, the researchers determined that there was very little information on best practices with regard to natural resource management in parks; however, interest in the subject area is growing.

Developing A Framework Of The Best Practice Model For Natural Resource Management
The last wholly relevant piece of this article was the focus on the development of a best practice model that fits in with the natural resource management focus. A number of criteria were laid out for the development of a best practice model based upon Australian and international approaches. These criteria included:

1)a clear nexus between an agency's legislative requirements and its strategic objectives for natural resource management
2)clearly stated management goals that are derived directly from strategic objectives
3)a plan of natural resource management programs and activities at both the agency and the park level for meeting the strategic objectives within a specified time-frame (both medium term and annual)
4)performance indicators and targets against which the degree to which goals were achieved can be assessed, at both the agency and the park level
5)natural resource monitoring programs that provide data for the assessment of performance indicators


In conclusion, none of the agencies assessed, in Australia or elsewhere, meets all of the aforementioned criteria for best practice in natural resource management.

The Study of Best Practices in Civil Service Reforms

Description

The concept of "best practice" is widely used to distinguish exemplary or improved performance in organizations. James Katorobo defines best practice research (BPR) as "the selective observation of a set of exemplars across different contexts in order to derive more generalizable principles and theories of management". The focus of analyses is on "success stories" in order to discover the chain of practices, or ways of doing things that achieve results. The fundamental objective of BPR is to discover best practices that can be adapted by other organizations where the level of achievement is low.

Civil Service Application

Levels of civil service reform performance differ from country to country: some do well, others poorly. Whereas focusing on problems and difficulties may encourage negativism and promote organizational paralysis, focusing on success stories and best practices may encourage a positive outlook and promote the diffusion of best practices.

Key Concepts

  • Best practice research is typified by a focus on successful cases. It is also typified by the objective of discovering the "causes" of that success, not merely to explain and understand (academic), but to guide action (pragmatic).
  • The concept of "best" implies an ability to evaluate several practices and their impact and to conclude that one or more practices are more successful; that is, these "best" practices lead to more desirable results. Underlying this evaluation is the notion that the practices can be graded on a scale of measurement.
  • For a set of practices to be useful, it should be holistic, or archetypal. A plethora of details can create unmanageable complexity.
  • A generic best practice is one that is applicable irrespective of institutional and structural differences.

Criticism

Critics of the "best practices" approach often ask why not study the bad cases in order to identify and avoid problems. Alas, focusing exclusively on problems, mistakes and failures provides few lessons on what to do. By focusing on successes, a manager can learn how to overcome problems.

SEO "Best Practices" Are Bunk

In an article on Search Engine Land, Adam Audette describes why the Best Practices model is not worth using in search engine optimization (SEO). Audette gives a brief background on how Best Practices started in the mid-1990s, then discusses how the methodology fails when applied to SEO. Specifically, Audette claims that the entire basis of Best Practices--a standardized rule set followed throughout an organization to achieve the "best" result--is incompatible with how SEO is practiced today.

Audette then explains the negatives of Best Practices for SEO:
"By their very nature, best practices are rule-sets that are standardized and formalized procedures. There is no competitive advantage in having best practices, at least in SEO. There is only a summation of basic webmastering (e.g., place relevant keywords in the title tag, make pages semantic and relevant, etc). That’s simply not cutting it anymore, because frankly, that stuff represents the basic price of admission. Best practices are neutered, stale and massively reproduced conventions that have been used (and sometimes abused) to the point of ubiquity. SEO and ubiquity don’t mix. By definition, a best practice:
1-Is a static rule-set
2-Is a standard to be followed that has worked in the past (read: is old)
3-Has been popularized (read: is average)
4-Limits judgement, evaluation, and strategy (cornerstones of quality search marketing)."


Strengths:
  • Is a set of rules that can be taught
  • Can be employed throughout an organization to improve multiple areas
Weaknesses:
  • Not dynamic
  • Doesn't work in overly competitive, constantly changing industries

Sunday, March 21, 2010

Differences Across First District Banks In Operational Efficiency (Using Best Practices Methodology)

In this article, Robert Tannenwald uses Best Practices methodology to determine which banks are most efficient in the First Federal Reserve District. He also analyzes trends in the efficiency of the banks in that New England district over the course of several years. For the purposes of the article, the author defines Best Practice as an ideal, or a business's "maximum attainable performance". This is impossible to calculate precisely, but scholars can approximate it by seeing which businesses operate most efficiently. To this end, the author employs two complex statistical methods, unique to the banking industry, to quantify best practices: the Stochastic Economic Frontier Approach (SEFA) and the Thick Frontier Approach (TFA).

Strength of the Model:
  • Quantifies the impact of superior practices

Weakness of the Model:

  • It is hard to identify what criterion to measure for efficiency
  • Statistically complex

Methodology:

The author uses SEFA, in which regression techniques yield a model where total cost is a function of variables including input prices and the mix of outputs. The resulting function represents best practice and can be used as a benchmark for evaluating the efficiency of individual banks. The study also uses TFA, which assumes that banks with relatively low average cost (total costs / total assets) set the standard of efficiency against which experts can compare all other banks of a comparable size.
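The two ideas can be illustrated with a deliberately simplified sketch. The bank figures below are invented, the "regression" is a one-variable ordinary least squares fit rather than a real stochastic frontier model, and the goal is only to show the shape of the logic (benchmark by lowest average cost; treat positive cost residuals as inefficiency):

```python
# Toy versions of the TFA and SEFA ideas described above.
# Bank data is hypothetical: name -> (total_cost, total_assets).
banks = {
    "A": (12.0, 200.0),
    "B": (30.0, 400.0),
    "C": (9.0, 180.0),
    "D": (50.0, 500.0),
}

# TFA flavor: banks with the lowest average cost (cost / assets)
# set the efficiency benchmark for comparable banks.
avg_cost = {name: c / a for name, (c, a) in banks.items()}
benchmark = min(avg_cost.values())
efficiency = {name: benchmark / ac for name, ac in avg_cost.items()}  # 1.0 = best

# SEFA flavor (greatly simplified): fit cost as a linear function of
# assets by least squares, then read each bank's positive residual
# (cost above the fitted line) as an inefficiency estimate.
xs = [a for (_, a) in banks.values()]
ys = [c for (c, _) in banks.values()]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx
residuals = {name: c - (intercept + slope * a) for name, (c, a) in banks.items()}
```

In this made-up data, bank C defines the benchmark under the average-cost rule, while bank D sits furthest above the fitted cost line; the real SEFA additionally separates random noise from inefficiency, which this sketch does not attempt.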

Results:

Assuming that the methodologies were sound, the banks in New England improved their practices to become more efficient between 1985-89 and 1990-93.

Critique:

These Best Practices methods have allure because of the hard numbers they yield. However, the results are only valid if the researcher chooses the right variables to measure in the first place. Also, the statistical complexity of this form of Best Practice analysis is daunting, which makes it an impractical choice for low-stakes analytic exercises.

Source:

This article came from the proprietary database Business Source Elite.

Saturday, March 20, 2010

Remote Area Indigenous Housing: Towards A Model Of Best Practice

This article by John Minney, Michelle Manicaros, and Michael Lindfield used a Best Practices model to evaluate the success of 26 rural Indigenous people housing programs in Australia, Canada, and the United States.

Manicaros defines Best Practice Methodology as "examples of action which could be recommended for further application whether in a similar or adopted form." In other words, the technique looks to similar programs or businesses to determine what works and what does not. Best Practice Methodology originated in a business context, but this article shows its adaptation to social science. The article mentions that the United Nations Centre for Human Settlements uses the technique to evaluate programs and discern their successful characteristics.

Model Strengths:
  • Dynamic
  • Holistic

Model Weaknesses:

  • Ignores the influence of geography, cultural norms, and institutional context
  • Must be confined to a specific time frame
  • Difficult to define which variables should be assessed

How to: The study built a framework representing best practices in a two-dimensional matrix. The authors then evaluated the degree of success in each area using either a check or a minus sign.

Dimension 1: Evaluates the 4 stages of housing provision

  • Needs assessment
  • Development and Design
  • Implementation
  • Post Construction

Dimension 2: Contains variables relevant to the stages of the housing process

  • Funding
  • Skills development and training
  • Technology
  • Organization
  • Cultural factors
  • Hard and soft infrastructure of assistance program
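The two-dimensional matrix can be sketched as a simple grid over the stages and variables listed above. The individual ratings below are placeholders, not the study's actual findings; the sketch only shows how tallying checks per variable surfaces systematic weaknesses:

```python
# Hypothetical sketch of the study's stage-by-variable evaluation matrix,
# with "+" standing in for a check and "-" for a minus sign.
stages = ["Needs assessment", "Development and Design",
          "Implementation", "Post Construction"]
variables = ["Funding", "Skills development and training", "Technology",
             "Organization", "Cultural factors", "Infrastructure"]

# Start every cell at "-" and mark a few illustrative successes.
matrix = {stage: {var: "-" for var in variables} for stage in stages}
matrix["Implementation"]["Funding"] = "+"
matrix["Post Construction"]["Infrastructure"] = "+"

# Tally checks per variable across all four stages: variables with few or
# no checks are the program's systematic weak spots.
success_counts = {var: sum(matrix[s][var] == "+" for s in stages)
                  for var in variables}
```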

Results: The exercise identified several areas of weakness in rural Indigenous housing programs. There needs to be more flexibility in the way funding can be spent. Land titles need to be clearly defined to give program staff maximum options to assist their clients. Also, there must be sufficient infrastructure within successful programs to maintain the houses after they are built.

This study lacked the robust nature of some Best Practices exercises as it did not quantify the rating of each factor. However, it proved that the model has useful applications in analyzing social programs.

Source:

This article came from the proprietary database Academic Search Complete.

Business Link Application of Best Practices

According to Business Link, the official UK government website for businesses, best practice means “finding – and using – the best ways of working to achieve your business objectives.” This involves monitoring successful businesses, including those outside your sector, and measuring how you work against how the market leaders operate. Evaluating how your operations compare with the most effective and profitable enterprises, and then using their most successful elements - the 'best practice' - in your own business, can make a big difference.

Best practice through benchmarking

Applying best practice means learning from and through the experience of others. One way of doing this is through benchmarking, which allows you to compare your business with other successful businesses to highlight areas where your business could improve.

Best practice through standards

Independent bodies such as the British Standards Institution (BSI) establish standards, which are fixed specifications or benchmarks. BSI develops both technical and management standards:

  • technical standards are precise specifications against which a business can measure the quality of its product, service or processes
  • management standards are models for achieving best business and organizational practice

Applying the appropriate standards to your business will enable you to apply best practice across the organization, and to work against objective criteria to achieve manufacturing or service quality.

Strengths

A best practice strategy can help your business to:

  • become more competitive
  • increase sales and develop new markets
  • reduce costs and become more efficient
  • improve the skills of your workforce
  • use technology more effectively
  • reduce waste and improve quality
  • respond more quickly to innovations in your sector

Applications

The guide goes on to provide in-depth applications of best practice to management, people management, and sales and marketing. It also provides useful information on how IT can improve the use of best practice for businesses.

Friday, March 19, 2010

Viperlink Application Of Best Practices Methodology

Viperlink – a Microsoft-certified Gold partner – is guided by a "Best Practice" methodology spanning systems design, pre-installation planning and the actual implementation work. The direct impact on your results-based, customized IT infrastructure is a fully functional and consistent setup: an implementation process systematically guided by the idea of deploying technology that works, no matter who installs the IT systems.
The term “Best Practice” is used frequently in such diverse fields as project management, hardware installation, software product development, government administration, the education system, and many other complex scenarios and human endeavors. In truth, a “Best Practice” is simply a technique or methodology that – through experience and research – has proven to be reliable and leads to an expected outcome or the desired result.


Effective Change Management

Throughout the IT outsourcing industry, several “Best Practices” are widely followed. Some of the more commonly used include an iterative development process, requirement management, quality control, and change control management.

An iterative or repetitive development process, which progresses in incremental stages, helps to maintain a focus on manageable tasks, and ensures that earlier stages are successful before the later stages are attempted.
Requirement management addresses the problem of shifting requirements, which is a situation in which the client requests additional changes to the service or product that are beyond the scope of what was originally planned. To guard against this common phenomenon, requirement management employs strategies such as documentation of requirements, sign-offs, and defining specific methodologies.
Quality control is a strategy that defines objective measures for assessing quality throughout the development process – in terms of the service offering or product functionality, reliability, and performance.
Change control is a strategy that seeks to closely monitor changes throughout the iterative process and implementation rollout. Here, prompt documentation ensures that records are intact for changes that have been made, and that unacceptable changes are not made or started.

A "Best Practice" tends to spread or be repeated throughout a field or industry after actual success has been demonstrated. However, even a demonstrated "Best Practice" mindset can be slow to spread, including within a dynamic organization.
The main barriers to adoption of a “Best Practice” methodology are: a lack of knowledge about current Best Practices; a lack of motivation to make changes involved in their adoption; and a lack of knowledge and skills required to do so.

Sunday, March 14, 2010

Difficulties with Best Practice Methodology

The author of this article, Derek Stockley, defines Best Practice Methodology as "a method where organizations identify their key business processes, and actively seek out and compare them with similar processes in organizations recognized for their exceptional customer service or outstanding business processes. The purpose of the comparison is to gather information and insight about better, more efficient and effective methods and approaches, with the view to identifying and implementing the 'best' practice/s. The comparison can be informal, through the analysis of competitor processes or systems, or done more formally through a co-operative venture (benchmarking)."

Stockley, the owner of a training and consulting firm in Australia, identifies difficulties with implementing Best Practice:

Ideas gained from other organizations may not be implemented successfully if the company's culture is not taken into consideration. It is important to not underestimate the strength of the company's culture when making decisions using another company's ideas.

Companies can make incorrect assumptions about best practices.

Outsourcing should be carefully considered and scrutinized as a means of cost reduction, paying considerable attention to what service or staff would be outsourced; it is important to look at the big picture when using Best Practice.

Best Practice is most useful when used in conjunction with other methodologies such as benchmarking.

GAO Application of Best Practices Methodology

Description

The "best practices methodology" is used to analyze organizations that are widely recognized for making large improvements in their products or processes. The organizations are then used as models for a "how-to" for increased efficiency and performance.

"Best Practices Methodology: A New Approach for Improving Government Operations" looks at how government operations, such as supply management, can be improved by implementing this private sector methodology. A best practices review can be applied to a variety of processes, such as payroll, travel administration, employee training, accounting and budgeting systems, procurement, transportation, maintenance services, repair services, and distribution. Best practices can be used to help streamline processes for cost savings.

Step-By-Step Actions

(1) Gaining an understanding of and documenting the process you want to improve.
(2) Researching industry trends and literature, and speaking with consultants, academics, and interest group officials on the subject matter.
(3) Selecting appropriate organizations for your review.
(4) Collecting data from these selected organizations.
(5) Identifying barriers to change.
(6) Comparing and contrasting processes to develop recommendations.

Here is an example of government use of best practices:

Defense Logistics: Observations on Private Sector Efforts to Improve
Operations (GAO/NSIAD-91-210, June 13, 1991).
Private sector firms have found that integrated logistics management can help reduce costs and increase their competitiveness. Major elements for successful implementation of integrated logistics management include total cost analysis and top management commitment. DOD may be able to benefit from private sector experiences in improving their logistics operations.

This article did not offer strengths and weaknesses for Best Practices, but it did offer many examples of its use in the private sector.

Wednesday, May 6, 2009

Summary Of Findings: Bayesian Analysis (4 out of 5 Stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 12 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 6 MAY 2009 regarding Bayesian analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description:
Bayesian analysis is a method that uses Bayesian statistics to assess the likelihood of an event in light of new evidence. It generates an estimate, and the use of Bayesian statistics in intelligence analysis allows the uncertainty inherent in traditional intelligence data to be handled in a scientifically valid manner.

Strengths:
*can limit analyst biases by reducing the weight given to evidence simply because it is new or vivid
*forces the analyst to reassess evidence and consider alternative possibilities
*adheres to rigid mathematical formulas
*provides a numerical likelihood
*provides an audit trail and the ability to reproduce results

Weaknesses:
*Probabilities are based largely on subjective judgment
*Susceptible to biases
*Highly complex problems require heavy computation
*Can be mathematically complex
*Not always useful as a stand-alone method (works well in tandem with methods like Delphi); may require SMEs to determine probability distributions
*Some reliance on ambiguous validities
*"Negative evidence"--the absence of positive evidence--is hard to incorporate

How-To:

This method loosely follows the guidance suggested by Gerd Gigerenzer's line of research into the use of natural frequencies in teaching and explaining Bayes to beginners.

1.) Create a 2x2 matrix. Label the quadrants so that they represent the true positive, false negative, false positive, and true negative cases.
2.) Combine the given base rate (for example, 100 war cases out of 1,000) with the new information (for example, a new document, judged 90% credible, saying that war is imminent). The true positive and false negative quadrants must sum to 100, and the false positive and true negative quadrants must sum to 900.
3.) To calculate the true positive quadrant, take 90% of the 100 base-rate cases (which equals 90).
4.) To calculate the false negative quadrant, take the numerator of the base rate (100) and subtract the true positive quadrant (90), leaving 10.
5.) To calculate the true negative quadrant, take 90% of your non-war cases (900), equalling 810.
6.) To calculate the false positive quadrant, subtract the sum of the three known quadrants from the total number of cases (1,000), which equals 90.
7.) To calculate the new probability, divide the true positive quadrant (90) by the new total of positive cases (90 + 90 = 180), which equals 50%.

In other words, given the new document, there is a 50% probability that countries X and Y will go to war.
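The quadrant arithmetic in the steps above can be sketched in a few lines of code, using the same numbers as the worked example:

```python
# Natural-frequency Bayes: base rate of 100 war cases out of 1,000,
# updated with a new document judged 90% credible.
total = 1000
war = 100                  # base-rate war cases
credibility = 0.9

true_pos = credibility * war               # war cases the document flags: 90
false_neg = war - true_pos                 # war cases it misses: 10
true_neg = credibility * (total - war)     # non-war cases correctly cleared: 810
false_pos = total - war - true_neg         # non-war cases wrongly flagged: 90

# New probability of war = true positives / all positive cases
posterior = true_pos / (true_pos + false_pos)
print(posterior)  # 0.5
```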



Experience:
To understand the basic mathematical principles behind Bayes, the class worked through some sample problems. One was based on a medical test with an 80% accuracy rate for a cancer with a 2% affliction rate in the general population, applied to a sample population of 1,000 cases. We established a matrix and assessed the true positive, false positive, false negative, and true negative quantities (16, 196, 4, and 784 respectively), and plugged these numbers into the appropriate matrix fields. We then divided the number of true positives (16) by the number of positive tests (212--the 16 true positives plus the 196 false positives). The result was a roughly 7.5% rate of cancer among those with positive tests--higher than the 2% base rate, but far lower than most people intuitively expect! This problem reflects actual breast cancer screening statistics from around two decades ago.
Note: see the matrix for a synopsis of another of the problems we worked through (a piece of evidence emerging suggesting a cause for war).

The class also used a Bayesian application to assess the likelihood we would contract swine flu. We started with two initial hypotheses--that we would contract swine flu or that we would not--and assigned an initial probability to each (the latter >95%). We then added weighted evidence, each item of which shifted the running probability. After all the evidence was entered, the class assessed the likelihood of contracting swine flu.
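The class's exact weighting scheme is not described in the post; a minimal sketch of this kind of sequential updating, with assumed and purely illustrative likelihood ratios, might look like:

```python
# Sequential Bayesian updating in odds form.
# The prior and the per-item likelihood ratios below are illustrative
# assumptions, not the figures the class actually used.
prior = 0.05                          # initial P(contract swine flu)
likelihood_ratios = [2.0, 0.5, 3.0]   # P(evidence | flu) / P(evidence | no flu)

odds = prior / (1 - prior)            # convert probability to odds
for lr in likelihood_ratios:
    odds *= lr                        # posterior odds = prior odds * likelihood ratio
posterior = odds / (1 + odds)         # convert back to a probability
print(round(posterior, 3))            # 0.136
```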

Tuesday, May 5, 2009

Using Search Engine Optimization For Intelligence Analysis

SEO Analysis Now - A Site For Using SEO in Intelligence Analysis
Rated: 4 Stars out of 5

As a final requirement of this course, I had to research and report findings on an analytical technique of my choosing. In addition, I had to test out and apply the technique in order to reveal its true strengths and shortcomings, as well as come up with a how-to guide for using it. For my project, I chose to conduct a Search Engine Optimization (SEO) Analysis on the Websites of two popular coffee shops, Starbucks (which is well established) and Caribou Coffee (which is slowly gaining popularity), and to reflect on how this process can be applied to Competitive Intelligence (CI), to Law Enforcement Intelligence (LEI), or to the National Security sector. I chose SEO not only because it is an emerging analytical process that can be useful to intelligence analysts, but also because I found it quite intriguing and fun.

SEO provides a way to gain insight into a Website’s audience.

If we know who an audience is (age, gender, ethnicity, education, affluence, etc.), how its members behave (what other Websites, or types of Websites, they visit), where they log on from, when they visit, and which other sites direct their traffic (as well as what their general interests are), then we can make assessments and predictions about how either to promote their behavior (steering them toward a particular site--useful for CI and marketing purposes) or to counter that behavior (keeping them away from sites--useful for LEI and National Security purposes).

For more detailed information about how SEO can be used for Intel analysis, please check out my project:

SEO Analysis Now - A Site For Using SEO in Intelligence Analysis


(The image below is a geographical search index comparison of online users searching for Caribou Coffee [left] and Starbucks [right]. Images provided by Google Insights for Search)

Monday, May 4, 2009

Bayesian statistics: principles and benefits

http://library.wur.nl/frontis/bayes/03_o_hagan.pdf

This article is meant to summarize the basics of Bayesian statistics for beginners.

In Bayesian statistics, the narrower the probability curve, the tighter the estimate of the parameters. The difference between frequentist methods and Bayesian analysis is the use of prior information, which is principally subjective. It is important for the prior information to be defensible and reasonable. The author believes subjectivity is a strength of the system because it allows for the examination of posterior distributions from different informed observers.

Until the 1990s, computational tools for conducting Bayesian analysis were nascent or non-existent. While there are tools for the specialist available today, the general practitioner of Bayesian analysis will find few user-friendly tools available.

The author enumerates a number of benefits for using Bayesian analysis. They are:
  • It provides meaningful and intuitive inferences.
  • It can answer complex questions cleanly and exactly.
  • It makes use of all available information.
  • It is well suited for decision-making.
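The article itself stays qualitative, but a standard Beta-Binomial conjugate update (a textbook example, not taken from the paper) shows the prior-to-posterior narrowing the author describes:

```python
# Beta-Binomial conjugate update: observing data narrows the posterior.
from math import sqrt

def beta_stats(a, b):
    """Mean and standard deviation of a Beta(a, b) distribution."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, sqrt(var)

a, b = 2, 2                       # weakly informative prior
successes, failures = 30, 10      # observed data

prior_mean, prior_sd = beta_stats(a, b)
post_mean, post_sd = beta_stats(a + successes, b + failures)

print(round(post_mean, 3))        # 0.727 -- pulled toward the data
print(post_sd < prior_sd)         # True -- the curve got narrower
```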

Introduction to Bayesian Analysis

http://www.mimuw.edu.pl/~pokar/StatystykaSemMgr/WalshBayesIntro.pdf

"Bayesian statistics is concerned with generating the posterior distribution of the unknown parameters given both the data and some prior density for these parameters. Bayesian statistics creates a much more complete picture of the uncertainty in the estimation of the unknown parameters."

The article's main usefulness lies in its suggestions for improving the Bayesian process. The authors' first suggestion is to remove the effect of non-critical (or at least uninteresting) parameters from the overall findings; they label these "nuisance" parameters.

Second, the authors note the lack of hypothesis testing in the scholarly works of their peers, implicitly advocating for such studies to be done.

The Logic of Intelligence Failure

http://www.cdi.org/blair/logic.cfm
By Bruce G. Blair, Ph.D.

The author of this article asserts that critics of the intelligence community concerning both of the major recent US intelligence failures--Iraq WMD and 9/11--ought to recognize that threats involving high levels of uncertainty are typically assessed inaccurately at first. Only successive, repeated assessments of updated data can narrow the gap between perception and reality, since data users and intelligence analysts tend initially to process information subjectively (against their own beliefs, judgments, and opinions) and to modify their assessments as new information or intelligence comes in.

Blair argues that decision making was the result of intelligence analysis that basically followed the laws of reason. He contends that, "applying a rule of logic known as Bayes' law to these cases (9/11 WMD) shows that the intelligence process produced conclusions that were not only plausible but reasonable."

Blair applied Bayes' formula in his study. (The formula graphic does not survive in this copy.)
While all of the probabilities come from the minds of people and are inherently subjective, the analysis itself depends on the product of successive analyses of real (objective) data, the probabilities of which are likely to converge with reality, so long as the individuals involved are thinking logically/rationally. Lower rates of error are likely to accelerate this convergent process.
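A sketch of the convergence described above: repeated updates on warnings that are informative but error-prone drive the probability toward the true state. The hit and false-alarm rates below are illustrative assumptions, not Blair's figures.

```python
# Successive Bayesian updates converging toward reality.
# Hit and false-alarm rates here are illustrative, not Blair's.
p = 0.10                       # prior P(attack is coming)
hit = 0.8                      # P(warning | attack really coming)
false_alarm = 0.3              # P(warning | no attack coming)

for _ in range(5):             # five consecutive warnings observed
    p = (hit * p) / (hit * p + false_alarm * (1 - p))

print(round(p, 3))             # 0.937 -- converging toward certainty
```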

Here are the two scenarios to which he applies Bayes' formula: WMD and a terrorist attack. (The accompanying figures are not reproduced here.)
The author then provides possible scenarios, based on Bayesian calculations and iterations, that determine the point at which warnings converge with the reality of the event.


Thus, concerning situations that involve preemption or preventive war, the author suggests that neither inductive reasoning nor even Bayesian analysis can truly clarify the validity of warnings, intelligence interpretations, or new information in certain situations--"two observers with different preexisting beliefs will often believe that the same bit of behavior confirms their beliefs - hawks seeing aggressive behavior and doves seeing evidence of conciliatory behavior".

What is Bayesian Analysis?

http://www.bayesian.org/bayesexp/bayesexp.html

This article is a summary of the International Society for Bayesian Analysis's (ISBA) definition of the basics of the Bayesian theorem and analysis. According to the ISBA website, the organization "was founded in 1992 to promote the development and application of Bayesian analysis useful in the solution of theoretical and applied problems in science, industry and government. By sponsoring and organizing meetings, publishing the electronic journal of Bayesian statistics Bayesian Analysis, and other activities ISBA provides a focal point for those interested in Bayesian analysis and its applications".

Summary
Bayesian analysis, a statistical tool for handling probability distributions, got its start in the mid-18th century. It was not until the 1980s, however, when modern computers became able to handle the complex computations that had made Bayesian implementation difficult, that Bayesian analysis gained more widespread acceptance. Since then, its use has grown across many different applications--from healthcare, to weather, to criminal justice. Despite its many nuanced manifestations, Bayesian analysis serves a common purpose: to analyze the probability of unknown and uncertain occurrences.



How To:

The left side of the equation is the posterior distribution of the unknown quantities--the "parameters"--given the observed data y. On the right are the "prior distribution," which expresses what is known about the parameters before the data are seen, and the "likelihood," proportional to the distribution of the observed data given the model parameters.

The equation is read: "the posterior is proportional to the prior times the likelihood."
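The equation the passage paraphrases (its image does not survive in this copy) is the standard proportional form of Bayes' rule, where θ stands for the parameters and y for the observed data:

```latex
\underbrace{p(\theta \mid y)}_{\text{posterior}} \;\propto\; \underbrace{p(\theta)}_{\text{prior}} \times \underbrace{p(y \mid \theta)}_{\text{likelihood}}
```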



Strengths:

  • Many diverse applications
  • "Philosophical consistency"
  • Avoids problems associated with 'frequentist' methods
  • Produces clear answers and products
  • Reformulates for each variable

Weaknesses:

  • Subjective nature of prior probabilities--"your prior information is different from mine"
  • More complex problems require more powerful computational tools

Saturday, May 2, 2009

Cruise Missile Proliferation: An Application of Bayesian Analysis To Intelligence Forecasting

Michael William Gannon

Summary:
The author applies Bayesian analysis to the problem of cruise missile proliferation. The author defines Bayesian analysis as, "a quantitative procedure in which alternative hypothetical outcomes are postulated and their prior probabilities estimated. As additional relevant events occur, the probabilities of their association with each hypothesis are used to calculate a revised probability for each alternative outcome." He notes that Bayesian analysis has been used by the CIA to provide Indicators & Warnings (I&W) as to the probability of outbreak of armed conflict. Any observed event has a probability associated with its actual occurrence, depending on initial causes. By observing and evaluating events that do occur, "posterior probabilities" can be assigned to each cause, creating a likelihood of an event that may occur in the future based on similar initial causes.

Strengths of the method, according to the author: "The principal advantage of the method is the establishment of a formal analytical framework which accommodates weighted inputs of all observed events, makes differing interpretations of a given event more explicit, and provides a readily available chronological record of the analytical process." A major weakness of the method, however, is that it "is limited to situations which can be expressed as a number of mutually exclusive outcomes. An ample flow of data which is logically related to the hypotheses to be tested must be available, and analysts must be qualified to assign realistic probabilities associating the observed events to their hypothetical causes."

In assessing the history of Bayesian analysis, the author notes that the CIA found Bayesian analysis and the Delphi method to be highly complementary. Furthermore, the CIA found that Bayesian analysis had distinct advantages over other methods. These advantages were:

(1) More information can be extracted from the available data...and probabilities are not at the mercy of the most recent or most visible item.
(2) The formal procedure has been shown to be less conservative than the analysts' informal opinions, and to drive the probabilities away from fifty-fifty faster and farther than the analysts' overall subjective judgments do....
(3) The procedure provides a reproducible sequence for arriving at the final figures....
(4) The formulation of the questions forces the analyst to consider alternative explanations of the evidence he sees...[and] to look at how well the evidence explains hypotheses other than the one he has already decided is the most likely.
(5) The use of quantified judgments allows the results of the analysis to be displayed on a numerical scale, rather than through the use of [subjective terms].

Limitations discovered by the CIA included:
(1) The question must lend itself to formulation in mutually exclusive categories.
(2) The question must be expressed as a specific set of hypothetical outcomes.
(3) There should be a fairly rich flow of data which is at least peripherally related to the question.
(4) The question must revolve around the type of activity that produces preliminary signs and is not largely a chance or random event.

Ultimately, Bayesian analysis is intended as a forecasting tool, but it has the added benefit of utilizing raw data (which can be "graded" for source reliability and assessed for explanatory hypotheses). This provides the analyst with a "quick reference" source for conducting "snap shot evaluations" of current programs, such as the state of a nation's missile program.

Author's Comment:
This paper is a master's thesis that used Bayesian Analysis to assess future cruise missile proliferation. I did not include any findings of the thesis in this summary; instead I summarized the sections that dealt with the method of Bayesian analysis itself.

Bayes' Theorem and Intelligence

Net Wars

Summary:
According to the author, "Beliefs are based on probabilistic information. Bayes Theorem says that our initial beliefs are updated to posterior beliefs after observing new conditions." As an analytic method, Bayesian analysis provides a formula that allows the analyst to update original assertions as new evidence is discovered and to assign likelihoods to events; the more we observe, the better we can predict the likelihood of a certain event. The formula used to update the analyst's initial beliefs to posterior beliefs is: p(C|O) = p(O|C)p(C) / [p(O|C)p(C) + p(O|¬C)p(¬C)]. According to Bayes, initial beliefs have a high margin of error; this is alleviated by incorporating new evidence through the formula, which "produces interesting results because it accounts for uncertainties created by False Positives and False Negatives."

The author provides the following example of using Bayesian analysis to update your beliefs:

"There is a case of this occurring. Europeans believed that swans were always White and there could be no Black Swans. They updated their probability of a Swan being White to 99% based on their limited experiences. As they explored the world, they found Black Swans in Australia. This reduced the probability of a swan being white and increased the probability of a swan being black. This process of inductive reasoning can be explained via Bayesian probability."

The author provides the following example of using Bayesian analysis as an intelligence methodology:

"There are 10,000 civilians. 1% of whom are insurgents pretending to be civilians. Police can investigate individuals and determine if they are an insurgent or civilian with 95% certainty.

Prior Probability is this: 0.01 (10,000) and 0.99(10,000). So
Group 1: 100 insurgents
Group 2: 9,900 Civilians

The Police investigate the entire population. This produces four groups:
Group 1: Insurgents - Positive test (0.95)
Group 2: Insurgents - False Negative test (0.05)
Group 3: Civilians - False Positive test (0.05)
Group 4: Civilians - Negative test (0.95)

How certain are the police that the men they captured are actually insurgents? The answer is 16%.
(0.95 x 0.01) / [(0.95 x 0.01) + (0.05 x 0.99)] =
0.0095/0.0590 = 0.161"

The 16% certainty rate stems from the uncertainty that always exists: some insurgents escape detection while some innocents test positive as insurgents. The author notes that this is an extremely oversimplified example; actual Bayesian analysis in this situation would require serious computing power, take many other factors into account, and involve multiple rounds of testing to ensure the most accurate results. Nonetheless, the example highlights the use of Bayesian analysis as a method of predictive analysis. Note, however, that the method predicts the probability of a particular event happening, not whether that event will actually occur.
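The quoted calculation is easy to verify directly with Bayes' rule:

```python
# Verifying the insurgent example: P(insurgent | positive test).
p_insurgent = 0.01              # 1% of the 10,000 civilians
p_pos_given_insurgent = 0.95    # test certainty
p_pos_given_civilian = 0.05     # false-positive rate

posterior = (p_pos_given_insurgent * p_insurgent) / (
    p_pos_given_insurgent * p_insurgent
    + p_pos_given_civilian * (1 - p_insurgent)
)
print(round(posterior, 3))      # 0.161 -- the 16% in the text
```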

The author returns to the "black swan" example: intelligence "black swans" are events that are possible but highly unlikely. Just because they haven't happened doesn't mean they won't, and Bayesian analysis provides a method for determining their likelihood. The author concludes by reiterating that intelligence analysis is not about predicting future events, but rather about predicting the likelihood of future events. "The inability to stop a Black Swan event, or a false prediction of a Black Swan event, does not always mean that the intelligence community 'failed'." Rather, the notion that intel failed comes from the distorted view of intel analysts as fortune tellers, rather than the reducers of uncertainty they truly are.

Friday, May 1, 2009

Bayesian Statistics In The Real World Of Intelligence Analysis: Lessons Learned

Bayesian Statistics In The Real World Of Intelligence Analysis: Lessons Learned
By Kristan Wheaton, Jennifer Lee, & Hemangini Deshmukh
Journal of Strategic Studies, vol. 2 n.1
February 2009

Summary:
In this article, Kris Wheaton, Jennifer Lee, and Hemangini Deshmukh argue that alternative, more structured methods should be applied to the intelligence process in order to improve intelligence analysis. Recognizing the potential that Bayesian statistics can bring to the field of intelligence, the authors question the ease with which entry-level intelligence analysts can apply and use this advanced statistical method.

Following the model of an experiment conducted by Gerd Gigerenzer to test the accuracy of diagnoses by two groups of doctors (one group using traditional statistical formulations [5% or .05], the other using natural frequencies [5 times out of 100]), the authors tested 67 senior Intelligence Studies students at Mercyhurst College. The findings of Wheaton's experiment were extremely similar to Gigerenzer's, showing that groups who receive natural frequencies are far more likely to be accurate when using Bayesian statistics (79% versus 18% accuracy). This accuracy is attributed to the power of 'framing' questions. The authors conclude, "natural frequencies are an effective method for encouraging Bayesian reasoning."

In addition to the experiment, the article offers a brief how-to and overview of Bayesian analysis. In short, Bayesian statistics is particularly useful for its ability to take into account the probability of one event affecting another, allowing the analyst to rationally update a prior assessment in light of new evidence. This process also helps reduce two very common cognitive biases, the vividness and recency biases. See the article for relevant examples of how Bayes' Theorem can be applied to intelligence-related issues.

The Bayes Theorem is illustrated in the original article. (The image is not reproduced here.)


Bayesian Analysis For Intelligence: Some Focus on the Middle East

Bayesian Analysis For Intelligence: Some Focus on the Middle East
By Nicholas Schweitzer
Approved For Release 1994
CIA Historical Review Program
02 July 96

Summary:
Nicholas Schweitzer suggests that advanced analytical methods, such as Bayesian analysis, should be used to aid analysts in an age where information flows continue to rise. To test Bayesian analysis as a tool for intelligence analysts, he used the technique with a group of intelligence analysts to assess complex political-military problems. The Middle East was chosen as a discussion point because of the level of regional complexity.

Schweitzer defines Bayesian Analysis as “a tool of statistical inference used to deduce the probabilities of various hypothetical causes from the observation of a real event. It also provides a convenient method for recalculating those probabilities in the light of a continuing flow of new events…the ‘rule of Bayes’ states that the probability of an underlying cause (hypothesis) equals its previous probability multiplied by the probability that the observed event was caused by that hypothesis.”

How to:

Because of limitations, the Bayesian technique can only be applied where certain criteria are met. First, the question to be answered must lend itself to formulation in mutually exclusive categories (i.e., war vs. no war); the insertion of overlapping possibilities reduces the accuracy of the Bayesian technique. Second, the question must be expressed as a specific set of hypothetical outcomes. Third, there should be a fairly rich flow of data that is at least peripherally related to the question. Lastly, the question must revolve around the type of activity that produces preliminary signs and is not largely a chance or random event. If these criteria are met, then:

1. Assign numeric probabilities to the hypotheses. The sum of the values must equal 1.0, or 100%. Because the examination of political/military affairs and events does not automatically yield quantified results, the possible outcomes (hypotheses) have to be quantified. Schweitzer asserts that implementing a Delphi method is the best way to quantify possible outcomes. He suggests the following procedure:
  • Use analysts who are experts on the subject matter (preferably ones who are working on the situation with you)
  • Establish a periodic routine for reporting
  • On the first day of the period, each of a number of participating analysts submits the items of evidence they have seen since the last round.
  • Submissions should be in the form of 1-2 sentences summarizing the item, along with the date, source, & classification.
  • The inclusion of relevant items and exclusion of irrelevant items is up to the discretion of the analyst.
  • A coordinator consolidates the items, resolving differences of wording, emphasis, and meaning, and returns the complete list of items to the participants.
  • On the following day, the analysts (working individually) evaluate the items and return their numerical assessments.
  • *the use of a group of analysts, as opposed to a single expert, is highly recommended*
2. Assess and quantify the evidence that supports/negates the hypotheses.
3. Calculate the new probabilities according to the rule of Bayes:

E is an event, an “item” of intelligence
H is a hypothesis, a hypothetical cause of events
Hi is one of a set of n mutually exclusive hypotheses
P(Hi) is the starting, or “prior” probability of a hypothesis
P(E/Hi) is the probability of an event occurring given Hi, a particular underlying cause
P(Hi/E) is the probability of a hypothesis given E, the “revised” probability of a hypothesis, given that a particular event has occurred.
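The rule of Bayes defined above can be sketched as a small function over a set of mutually exclusive hypotheses; the priors and event probabilities below are illustrative, not Schweitzer's:

```python
# Rule of Bayes over n mutually exclusive hypotheses:
# P(Hi|E) = P(Hi) * P(E|Hi) / sum over j of P(Hj) * P(E|Hj)
def bayes_update(priors, likelihoods):
    """priors: P(Hi), summing to 1; likelihoods: P(E|Hi). Returns P(Hi|E)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)                    # P(E), the normalizing constant
    return [j / total for j in joint]

priors = [0.2, 0.5, 0.3]                  # illustrative starting probabilities
likelihoods = [0.9, 0.3, 0.1]             # P(observed event | each hypothesis)
posteriors = bayes_update(priors, likelihoods)
print([round(p, 3) for p in posteriors])  # hypothesis 1 gains the most support
```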

Strengths (please see article for further explanation):
  • Allows for the weighting of evidence
  • Provides transparency in intelligence assessments
  • Forces the consideration of alternative possibilities
  • Quantifies analysis instead of using words of estimative probability
  • Displays the trend toward an outcome quicker than the analyst can typically realize it on their own
  • Incorporating the Delphi method adds credibility to the assessment when presented to managers and decision makers.
Weaknesses (please see article for further explanation):
  • Limited applicability
  • Data problems – can exist in deciding which information is relevant and should be included, as well as what weight values should be given to evidence.
  • Source reliability – what is the best practice to account for this
  • “Negative evidence”- the absence of any positive evidence may in itself be highly indicative of something
  • Problems over time – problems in using this method in a project continuing over many months
  • Problem with numbers – cannot use the probability of ‘zero’ (doesn’t work mathematically or analytically) therefore extremely low probabilities must be indicated by a very small number. Also, some people have difficulty thinking in, and assigning, probabilities.
  • Subject to bias and manipulation – this is one of the reasons for which the author suggests using a group of experts/analysts to assign probabilities.

Wednesday, April 29, 2009

A Bayesian Approach To Modeling Binary Data: The Case Of High-Intensity Crime Areas



Law, Jane & Haining, Robert
Geographical Analysis, Vol. 36. No 3 (July 2004) Ohio State University

Summary
The purpose of this paper is to apply Bayesian logistic models to binary data in order to explain how high-intensity crime areas (HIAs) are distributed in Sheffield, England.

The article defines an HIA as an urban area experiencing high levels of violent crime. Perpetrators of crimes in HIAs often reside in the neighborhoods where they commit crimes, resulting in high levels of witness intimidation and making it difficult to identify an HIA with full accuracy.

The article argues that Bayesian approaches are useful for spatial data analysis because of their emphasis on randomness and prior beliefs about values, and because WinBUGS software enables analysts to test hypotheses with spatial elements.
Prior to the Bayesian analysis, Sheffield was divided into Basic Command Units (BCUs). Within the BCUs were HIAs, and within both BCUs and HIAs were Enumeration Districts (EDs). EDs were then classified according to whether or not they were in an HIA. According to the article, several EDs were miscategorized because the original statistics used the standard logistic model, which did not analyze the spatial relationships between EDs. The article continues by discussing the use of logistic regression with WinBUGS software to analyze random effects and spatial relationships and to redesignate the EDs.




Figure 1 shows what is known as the logit function, including random effects. However, it does not include the spatial relationships of EDs.











In Figure 3A, spatial relationships are shown through the links to W, the contiguity matrix. The contiguity matrix is used to show the likelihood of EDs neighboring HIAs being at risk of spill-over crime from the HIAs, making those EDs likely to become HIAs themselves.


The paper's conclusion is that a three-stage Bayesian hierarchical model can give estimated probabilities of EDs being HIAs because it takes into account the spatial relationships between EDs. The paper also argues for "map decomposition" as another relevant way of analyzing spatial models alongside Bayesian hierarchical models.

Why I Don't Like Bayesian Statistics

Gelman, Andrew, Professor of statistics and political science and director of the Applied Statistics Center at Columbia University

Summary



Professor Gelman refers to Bayesian inference as a "coherent mathematical theory," but does not trust it for scientific applications. Gelman believes that it is too easy to apply subjective beliefs about a given situation to Bayesian theory, because people want to believe their own preconceived notions and to reject statistical results they do not agree with. Bayesian methods, according to Gelman, encourage this kind of thinking.



Gelman takes special issue with political scientists like himself adopting Bayesian methods. Bayesian approaches tend to assume exchangeability of variables; in political science, however, the 50 states cannot be treated as exchangeable--they cannot be used randomly or as samples.



Gelman continues by saying that he is not hostile to the mathematics of Bayesianism, but to its "philosophical foundations, that the goal of statistics is to make an optimal decision." Gelman believes that statistics are for doing "estimation and hypothesis testing," not to "minimize the average risk." He also faults the Bayesian philosophy of axiomatic reasoning because it implies that random sampling should not be done, which Gelman considers a "strike against the theory right there." He also accuses Bayesians of believing in "the irrelevance of stopping times," meaning that stopping an experiment early will not change your inference. Gelman concludes by saying "the p-value does change when you alter the stopping rule, and no amount of philosophical reasoning will get you around that point."

Bayes' Formula



Author's Note: This is a great video for teaching Bayes' Theorem in its simplest form.


Summary
In order to illustrate the utility of Bayes’ Theorem, the author draws upon two simple scenarios. First, suppose someone faces the decision of needing to choose between three doors. If the person making the decision does not have any prior knowledge about the situation, the scenario creates an unconditional probability. But, once the person receives new information about the scenario, the rational person should reconsider his/her decision and subsequent probabilities.

Bayes’ Theorem is about the introduction of new information used to adjust probabilities and create conditional probabilities. The notation P(G/U) denotes the probability that G will occur, given that U happens.

To illustrate the application of Bayes’ Theorem and conditional probabilities, the author illustrates a second scenario. Pretend that there is a 70% probability that the economy will grow and a 30% probability that the economy will slow (an unconditional probability). The author owns a stock that has an 80% chance of increasing if the economy grows. That same stock, however, only has a 30% chance of increasing if the economy slows. The 80% and the second 30% are conditional probabilities; they are based on the condition that the economy will grow or slow.

The author can then determine the scenario’s four joint probabilities:
1) What is the probability that the economy will grow and the stock will increase?
2) What is the probability that the economy will grow and the stock will decrease?
3) What is the probability that the economy will slow and the stock will increase?
4) What is the probability that the economy will slow and the stock will decrease?

To answer these questions and determine their probabilities, the author uses the multiplication rule: P(UG) = P(U/G)P(G), the probability that both U and G occur. Notice that this expression combines two pieces of information: the conditional probability that the stock increases or decreases given the state of the economy, and the unconditional probability that the economy grows or slows.
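The arithmetic of the author's stock example can be sketched in a few lines of Python (the numbers come from the text; the variable names are mine):

```python
# Joint probabilities for the economy/stock scenario via P(UG) = P(U/G)P(G).
p_grow, p_slow = 0.70, 0.30      # unconditional (prior) probabilities
p_up_given_grow = 0.80           # conditional: stock up if economy grows
p_up_given_slow = 0.30           # conditional: stock up if economy slows

# Multiplication rule: P(U and G) = P(U|G) * P(G)
p_grow_up   = p_up_given_grow * p_grow          # grow and increase
p_grow_down = (1 - p_up_given_grow) * p_grow    # grow and decrease
p_slow_up   = p_up_given_slow * p_slow          # slow and increase
p_slow_down = (1 - p_up_given_slow) * p_slow    # slow and decrease

# The four joint probabilities cover every outcome, so they sum to 1.
print(round(p_grow_up, 2), round(p_grow_down, 2),
      round(p_slow_up, 2), round(p_slow_down, 2))  # 0.56 0.14 0.09 0.21

# Bayes' Theorem then reverses the conditioning: given that the stock
# increased, how likely is it that the economy grew?
p_grow_given_up = p_grow_up / (p_grow_up + p_slow_up)
print(round(p_grow_given_up, 2))  # 0.86
```

Note how observing the stock's rise shifts the probability that the economy grew from the prior 70% to roughly 86% — exactly the kind of revision the theorem formalizes.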

Bayes' Theorem for Intelligence Analysis

Jack Zlotnick
CIA Historical Review Program


Author’s Note: Released by the CIA’s Historical Review Program in the early 1990s, this piece was written by Jack Zlotnick in the 1970s. At the time, the CIA was still in the process of testing Bayes’ Theorem. Because that testing was ongoing, Zlotnick does not offer a position on the utility and validity of the Bayesian method with regard to intelligence. In fact, Zlotnick spends a considerable amount of the article discussing the ways the theorem should continue to be tested.

Summary
Due to the very nature of intelligence, analysts should be naturally interested in Bayes’ Theorem. Intelligence is probabilistic in nature: analysts usually conduct their analysis on the basis of incomplete evidence and must express their conclusions in terms of probabilities (hence words of estimative probability, or WOEPs).

For intelligence applications, Bayes’ Theorem is represented by the equation R=PL. “R” is the revised estimate of the odds favoring one hypothesis over another competing hypothesis (the odds after new evidence is entered into the equation). “P” is the prior estimate of those odds (the odds before the new evidence is considered). The analyst must offer judgments about “L,” the likelihood ratio. This variable is the analyst’s evaluation of the “diagnosticity” of an item of evidence. For instance, if a foreign power mobilizes its troops, what are the chances that “X” will happen rather than “Y”?
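Zlotnick's R = PL update can be sketched in a few lines (the function name and the example numbers are my own illustration, not from the article):

```python
# Odds-form Bayesian update: R = P * L.
# P: prior odds favoring hypothesis H1 over competing hypothesis H2.
# L: likelihood ratio -- how much more probable the new evidence is
#    under H1 than under H2 (its "diagnosticity").
# R: revised odds after the evidence is considered.

def revise_odds(prior_odds, likelihood_ratio):
    """One Bayesian update in odds form: R = P * L."""
    return prior_odds * likelihood_ratio

# Hypothetical example: prior odds of 2:1 favoring "attack" over
# "no attack"; troop mobilization judged three times as likely if an
# attack is coming than if one is not.
r = revise_odds(2.0, 3.0)   # revised odds of 6:1
p = r / (1 + r)             # convert odds back to a probability
print(r, round(p, 3))       # 6.0 0.857
```

Because updates multiply, successive items of evidence can be entered one at a time — which is precisely why the analyst judges each bit of evidence separately and lets the mathematics do the summing up.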

The principal features of Bayes’ Theorem distinguish it from conventional intelligence analysis in three ways. First, it forces analysts to quantify judgments that are not ordinarily expressed in numeric terms. Second, the analyst does not take the available evidence as given and reason directly to a conclusion. And third, the analyst makes his/her own judgments about the individual bits and pieces of evidence. He/she does not sum up the evidence as he/she would when judging its meaning for a final conclusion; the mathematics does the summing up.

The author is skeptical that the complex tasks analysts are forced to consider can be reduced to numeric values. Bayes’ Theorem, however, may be useful for examining strategic warning by uncovering patterns of activity by foreign powers.

Summary Of Findings: Gap Analysis (3 Out Of 5 Stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 12 articles read in advance of class (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 29 APR 2009 regarding Gap Analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description:
Traditionally, "gap analysis" is a method used to conduct an internal operational analysis, in which the analysis identifies the "gap" between a current state and a desired endstate within a company or agency. From an intelligence analysis perspective, "gap analysis" can be used as a tool to identify the likely pathway or pathways a target may take to arrive at a given endstate from a known position. Thus, "gap analysis" does not necessarily provide an estimate, but rather provides the analyst with a list of actions the target is likely to take. As an analytic technique, gap analysis bears a striking resemblance to several other methods, such as Indicators & Warnings and Decision Trees.

Strengths:
* Helps the analyst think through the steps that lead a target to a decision
* Encourages open discussion and debate, aiding the critical thinking process
* Screens out peripheral influences not directly related to the question
* Simple and flexible enough to apply to a wide range of targets and topics

Weaknesses:
* No structured method to conduct the analysis
* May not leave you with a clear estimate
* Open to bias and other cognitive downfalls (satisficing, mirror imaging, etc.)
* Overlaps with the process of other methods and modifiers (e.g. decision trees, I&W, brainstorming, SWOT, etc.)
* Susceptible to deception
* Prone to the general pitfalls of unstructured analysis

How-To:
1) Identify the target
2) Characterize the current status of the target as well as the target's goals
3) Identify what you want to know about the target
4) List the pieces of information that you have
5) Use a systematic approach to infer what the target is likely to do in order for the target to reach its goals
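The five steps above can be sketched as a simple data structure; this is an illustrative outline only (all names here are hypothetical, not from the class):

```python
# Illustrative sketch of the gap-analysis steps (names are hypothetical).
from dataclasses import dataclass, field

@dataclass
class GapAnalysis:
    target: str                                            # 1) identify the target
    current_state: str                                     # 2) current status...
    goal: str                                              # 2) ...and the target's goals
    questions: list = field(default_factory=list)          # 3) what you want to know
    known: list = field(default_factory=list)              # 4) information you have
    inferred_actions: list = field(default_factory=list)   # 5) likely steps toward the goal

    def gaps(self):
        """Questions not yet answered by the information in hand."""
        return [q for q in self.questions if q not in self.known]

# The class exercise about "Mary" fits this frame directly:
ga = GapAnalysis(
    target="Mary (fictional first-year graduate student)",
    current_state="no thesis topic chosen",
    goal="a completed thesis",
    questions=["preferred intelligence track", "favorite class topic",
               "choice of primary reader", "reader's academic interests"],
)
print(ga.gaps())  # all four questions remain open until answered
```

As answers are gathered (step 4), they move into `known`, the gap list shrinks, and the remaining gaps drive the systematic inference in step 5.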

Experience:
For the first application, the group tried to determine what thesis topic Mary, a fictional first-year graduate student, would write about. Professor Wheaton acted in place of Mary, and the group role-played in a question-and-answer format. The group determined that the gaps needing to be filled were which intelligence track she was most interested in (national security, law enforcement, or competitive), which area or topic in her previous classes interested her the most, her choice of primary reader, and that reader's academic interests. Upon filling these gaps, the group ascertained a plausible topic for Mary's thesis.

For the second application, the group discussed Russia's long-held ambitions for a warm-water port, preferably on the Mediterranean. The first part of the discussion centered on hostilities between Georgia and Russia, and actions Russia could take against Georgia to maintain its sphere of influence in the Black Sea. Additionally, the group discussed what actions might be necessary in Russia's diplomatic relations with other nations bordering the sea. After discussing how Russia could potentially gain a Black Sea port by bringing Georgia into its sphere of influence, the group discussed how Russia could proceed toward gaining access to Mediterranean ports. The group determined that Turkey would be central to future Russian objectives for ports in the Mediterranean. The group put together lists of things the Russians could do that they are not doing now that would indicate their goals of extending their influence in Georgia and Turkey.

These applications illustrated the following:
* The technique helped in thinking through the steps that lead to a decision
* It allowed for open discussion and debate, aiding the critical thinking process
* At some points it felt like stabbing in the dark, but the estimates became clearer as the group discussed the options
* It eliminated peripheral influences not directly related to the topic, such as administrative processes