Tuesday, April 16, 2013

Bayesian Inference Analysis of the Uncertainty Linked to the Evaluation of Potential Flood Damage in Urban Areas

Fontanazza, Freni and Notaro explain that flood impacts on highly urbanized areas can be severe and are likely to increase with the effects of climate change. Decision-makers therefore prefer reduced uncertainty when planning flood mitigation and prevention. This analysis is valuable because uncertainty exists both in the physical processes that hydraulic models must simulate and in the limited data available for model calibration. Additionally, measurement errors can affect the data underlying depth-damage curves.

In this article, the authors applied Bayesian probability analysis to a case study of Palermo, Italy to determine whether uncertainty decreases as data are added. Bayesian analysis offers two benefits, "parameter estimation and uncertainty analysis," for both the hydraulic model parameters and the depth-damage curve coefficients. The authors build a probabilistic model via Bayes' theorem that combines a prior distribution for "the uncertainty of a generic model parameter" with "observed values" through a "likelihood function."
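To make the prior/likelihood/posterior machinery concrete, here is a minimal sketch of a Bayesian parameter update. This is not the authors' hydraulic model: it assumes a simple conjugate normal-normal setup (normal prior on a generic parameter, normally distributed observations), and all numbers are hypothetical.

```python
# Illustrative sketch (not the paper's model): conjugate normal-normal
# Bayesian update of a single generic model parameter theta.
# Prior: theta ~ N(mu0, tau0_sq); likelihood: each y_i ~ N(theta, sigma_sq).

def normal_update(mu0, tau0_sq, observations, sigma_sq):
    """Return the posterior mean and variance of theta given the data."""
    n = len(observations)
    post_var = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    post_mean = post_var * (mu0 / tau0_sq + sum(observations) / sigma_sq)
    return post_mean, post_var

# Start from a vague prior, then update with three hypothetical observations.
mu, var = normal_update(mu0=0.0, tau0_sq=100.0,
                        observations=[1.2, 0.8, 1.1], sigma_sq=0.25)
print(mu, var)  # posterior variance is far below the prior variance of 100.0
```

Even three observations collapse the posterior variance from 100 to well under 1, which is the basic mechanism the paper exploits: each new batch of observed damage data tightens the distribution on the model parameters.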

The authors split the historical data into three sections, from January 1994 to April 1999, from May 1999 to January 2003, and from February 2003 to December 2008, to determine whether uncertainty would decrease with each subsequent addition of a data group. Land use in the Palermo case study was mostly residential, with 88 percent of the area impervious. The following three images show the reduction in uncertainty as more data became available, demonstrating that Bayesian probability analysis did in fact reduce uncertainty. With the addition of the second data set alone (in the second image), uncertainty fell by about 40%, without any loss of reliability.
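The three-period design can be mimicked with the same toy conjugate update applied sequentially, where each period's posterior becomes the next period's prior. Again, this is a hypothetical sketch with made-up numbers, not the paper's model; it simply illustrates how uncertainty shrinks batch by batch, with the first batch doing most of the work.

```python
import random

# Illustrative sketch only: sequential conjugate normal-normal updates over
# three hypothetical batches of data, mirroring the paper's three periods.

def normal_update(mu0, tau0_sq, obs, sigma_sq):
    """Posterior mean/variance for theta ~ N(mu0, tau0_sq) given y_i ~ N(theta, sigma_sq)."""
    n = len(obs)
    post_var = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    post_mean = post_var * (mu0 / tau0_sq + sum(obs) / sigma_sq)
    return post_mean, post_var

random.seed(1)
batches = [[random.gauss(1.0, 0.5) for _ in range(20)] for _ in range(3)]

mu, var = 0.0, 25.0  # vague prior on the parameter
variances = []
for i, batch in enumerate(batches, 1):
    # Yesterday's posterior is today's prior.
    mu, var = normal_update(mu, var, batch, sigma_sq=0.25)
    variances.append(var)
    print(f"after batch {i}: posterior sd = {var ** 0.5:.4f}")
```

The posterior variance drops sharply after the first batch and only modestly thereafter, consistent with the roughly 40% reduction the authors report from a single additional data set and with the diminishing returns noted in the comments below.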

Bayesian analysis has some limitations: it relies on an initial hypothesis (the prior), which can be subjective, and the approach may lose objectivity if the parameter distributions are not based on physical observations. Nevertheless, I noticed many advantages of the methodology. The authors successfully demonstrated its effectiveness with a case study, showing with real, historical data that a significant reduction in uncertainty was possible. They also addressed the aforementioned limitations with additional probabilistic analyses of their parameter choices to ensure that these did not skew the results.

A decision-maker's interest in reducing uncertainty seems common to any profession. I would be curious to see how this could apply to a study of crime mapping, testing whether a decrease in uncertainty actually occurs as data increase. This could perhaps be applied to the "Newton-Swoope Buffers" in ATAC Workshop, which are intended to determine the location of an offender's home or business. These buffers change with each additional piece of information, seemingly becoming more accurate with more data. A Bayesian probability analysis could be applied to this tool to evaluate its effectiveness and, by extension, its application to law enforcement intelligence.

Fontanazza, C.M., Freni, G., & Notaro, V. (2012). Bayesian inference analysis of the uncertainty linked to the evaluation of potential flood damage in urban areas. Water Science and Technology, 1669-1677. doi: 10.2166/wst.2012.359


  1. Ana, this is an interesting read and seems to validate the use of Bayesian analysis to reduce uncertainty. I am particularly interested in the 40% reduction of uncertainty with the introduction of the second data set. That fits with what we have been learning in the HTMA book, that the first reduction in uncertainty is normally the biggest.

  2. I agree with both you and Ethan. This shows a practical application of Bayesian analysis that could save lives and property. Working as a crisis planning analyst could definitely be a future career path from our program. The first addition of data seemed to reduce uncertainty more than the second. This makes sense: as we discussed in class (and as Ethan mentioned), the first few pieces of information tend to reduce uncertainty the most, with later pieces reducing it only marginally (decreasing marginal utility).