2021
Global sensitivity analysis (GSA) has long been recognized as an indispensable tool for model analysis. GSA has been extensively used for model simplification, identifiability analysis, and diagnostic tests. Nevertheless, computationally efficient methodologies are needed for GSA, not only to reduce the computational overhead, but also to improve the quality and robustness of the results. This is especially the case for process-based hydrologic models, as their simulation time typically exceeds the computational resources available for a comprehensive GSA. To overcome this computational barrier, we propose a data-driven method called VISCOUS, variance-based sensitivity analysis using copulas. VISCOUS uses Gaussian mixture copulas to approximate the joint probability density function of a given set of input-output pairs for estimating the variance-based sensitivity indices. Our method identifies dominant hydrologic factors by recycling existing input-output data, and thus can deal with arbitrary sample sets drawn from the input-output space. We used two hydrologic models of increasing complexity (HBV and VIC) to assess the performance of VISCOUS. Our results confirm that VISCOUS and the conventional variance-based method can detect similar important and unimportant factors. Furthermore, the VISCOUS method can substantially reduce the computational cost required for sensitivity analysis. Our proposed method is particularly useful for process-based models with many uncertain parameters, large domain size, and high spatial and temporal resolution.
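The given-data idea behind VISCOUS (estimating variance-based indices from an existing input-output sample, with no new model runs) can be illustrated with a much simpler binning estimator of the first-order Sobol' index. This is a generic sketch for intuition only, not the copula-based VISCOUS algorithm; all names are illustrative:

```python
import numpy as np

def first_order_index(x, y, n_bins=20):
    """Given-data estimate of the first-order Sobol' index S_i =
    Var[E(y|x)] / Var(y), using equal-frequency bins over the input x."""
    y = np.asarray(y, dtype=float)
    # Equal-frequency bin edges, then assign each sample to a bin
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    present = [b for b in range(n_bins) if np.any(idx == b)]
    bin_means = np.array([y[idx == b].mean() for b in present])
    bin_wts = np.array([np.mean(idx == b) for b in present])
    # Variance of the conditional mean, normalized by total output variance
    cond_mean_var = np.sum(bin_wts * (bin_means - y.mean()) ** 2)
    return cond_mean_var / y.var()

# Toy check on the additive model y = x1 + 0.1*x2 (x1 should dominate):
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(size=50_000), rng.uniform(size=50_000)
y = x1 + 0.1 * x2
print(first_order_index(x1, y) > first_order_index(x2, y))  # True
```

Because the estimator only bins and averages an existing sample, it can be applied to arbitrary input-output pairs, which is the property the VISCOUS method exploits (with a Gaussian mixture copula replacing the crude binning).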
Peering into agricultural rebound phenomenon using a global sensitivity analysis approach
Mohammad Ghoreishi,
Razi Sheikholeslami,
Amin Elshorbagy,
Saman Razavi,
Kenneth Belcher
Journal of Hydrology, Volume 602
• Time-varying GSA offers a good understanding of coupled human-natural systems.
• Economy is the most influential factor in the rebound phenomenon of the BRB.
• Social interaction had a high total effect on the rebound phenomenon of the BRB.
• Raising farmers’ awareness through formal channels could avoid the rebound phenomenon.
• Switching to crops that need less water could prevent the rebound phenomenon.

Modernizing traditional irrigation systems has long been recognized as a means to reduce water losses. However, empirical evidence shows that this practice may not necessarily reduce water use in the long run; in fact, in many cases, the converse is true, a concept known as the rebound phenomenon. This phenomenon lies at the heart of a fundamental research gap: the explicit evaluation of co-evolutionary dynamics and interactions among socio-economic and hydrologic factors in agricultural systems. Addressing this gap calls for systems-based methods. To that end, we use a previously developed Agent-Based Agricultural Water Demand (ABAD) model, applied to the Bow River Basin (BRB) in Canada. We perform a time-varying variance-based global sensitivity analysis (GSA) on the ABAD model to examine the individual and joint effects of factors that may give rise to the rebound phenomenon in the BRB. Our results show that economic factors dominantly control possible rebounds. Although social interaction among farmers is found to be less influential than the irrigation-expansion factor, its interaction effect with other factors becomes more important, indicating the highly interactive nature of the underlying socio-hydrological system. Based on the insights gained via GSA, we discuss several strategies, including community participation and water restrictions, that can be adopted to avoid the rebound phenomenon in irrigation systems. This study demonstrates that time-varying variance-based GSA can provide a better understanding of the co-evolutionary dynamics of socio-hydrological systems and pave the way for better management of water resources.
The Future of Sensitivity Analysis: An essential discipline for systems modeling and policy support
Saman Razavi,
Anthony J. Jakeman,
Andrea Saltelli,
Clémentine Prieur,
Bertrand Iooss,
Emanuele Borgonovo,
Elmar Plischke,
Samuele Lo Piano,
Takuya Iwanaga,
William E. Becker,
Stefano Tarantola,
Joseph H. A. Guillaume,
John Jakeman,
Hoshin V. Gupta,
Nicola Melillo,
Giovanni Rabitti,
Vincent Chabridon,
Qingyun Duan,
Xifu Sun,
Stefán Thor Smith,
Razi Sheikholeslami,
Nasim Hosseini,
Masoud Asadzadeh,
Arnald Puy,
Sergei Kucherenko,
Holger R. Maier
Environmental Modelling & Software, Volume 137
Sensitivity analysis (SA) is en route to becoming an integral part of mathematical modeling. The tremendous potential benefits of SA are, however, yet to be fully realized, both for advancing mechanistic and data-driven modeling of human and natural systems and in support of decision making. In this perspective paper, a multidisciplinary group of researchers and practitioners revisits the current status of SA and outlines research challenges regarding both theoretical frameworks and their application to real-world problems. Six areas that warrant further attention are discussed: (1) structuring and standardizing SA as a discipline, (2) realizing the untapped potential of SA for systems modeling, (3) addressing the computational burden of SA, (4) progressing SA in the context of machine learning, (5) clarifying the relationship and role of SA with respect to uncertainty quantification, and (6) evolving the use of SA in support of decision making. An outlook for the future of SA underlines how SA must underpin a wide variety of activities to better serve science and society.
• Sensitivity analysis (SA) should be promoted as an independent discipline.
• Several grand challenges hinder full realization of the benefits of SA.
• The potential of SA for systems modeling and machine learning remains untapped.
• New prospects exist for SA to support uncertainty quantification and decision making.
• Coordination, rather than consensus, is key to cross-fertilizing new ideas.
2020
Sensitivity analysis in Earth and environmental systems modeling typically demands an onerous computational cost. This issue coexists with the reliance of these algorithms on ad hoc designs of experiments, which hampers making the most out of the existing data sets. We tackle this problem by introducing a method for sensitivity analysis, based on the theory of variogram analysis of response surfaces (VARS), that works on any sample of input-output data or pre-computed model evaluations. Called data-driven VARS (D-VARS), this method characterizes the relationship strength between inputs and outputs by investigating their covariograms. We also propose a method to assess “robustness” of the results against sampling variability and numerical methods' imperfectness. Using two hydrologic modeling case studies, we show that D-VARS is highly efficient and statistically robust, even when the sample size is small. Therefore, D-VARS can provide unique opportunities to investigate geophysical systems whose models are computationally expensive or available data is scarce.
2019
Complex, software-intensive, technically advanced, and computationally demanding models, presumably with ever-growing realism and fidelity, have been widely used to simulate and predict the dynamics of the Earth and environmental systems. The parameter-induced simulation crash (failure) problem is typical across most of these models, despite considerable efforts that modellers have directed at model development and implementation over the last few decades. A simulation failure mainly occurs due to the violation of numerical stability conditions, non-robust numerical implementations, or errors in programming. However, existing sampling-based analysis techniques such as global sensitivity analysis (GSA) methods, which require running these models under many configurations of parameter values, are ill-equipped to deal effectively with model failures. To tackle this problem, we propose a novel approach that allows users to cope with failed designs (samples) during GSA, without knowing where they took place and without re-running the entire experiment. This approach treats model crashes as missing data and uses strategies such as median substitution, single nearest neighbour, or response surface modelling to fill in for model crashes. We test the proposed approach on a 10-parameter HBV-SASK rainfall-runoff model and a 111-parameter MESH land surface-hydrology model. Our results show that response surface modelling is the superior strategy among those tested and scales well with the dimensionality of the model, the sample size, and the ratio of the number of failures to the sample size. Further, we conduct a "failure analysis" and discuss some possible causes of the MESH model failure.
VARS-TOOL is a software toolbox for sensitivity and uncertainty analysis. Developed primarily around the “Variogram Analysis of Response Surfaces” framework, VARS-TOOL adopts a multi-method approach that enables simultaneous generation of a range of sensitivity indices, including ones based on derivative, variance, and variogram concepts, from a single sample. Other special features of VARS-TOOL include (1) novel tools for time-varying and time-aggregate sensitivity analysis of dynamical systems models, (2) highly efficient sampling techniques, such as Progressive Latin Hypercube Sampling (PLHS), that maximize robustness and rapid convergence to stable sensitivity estimates, (3) factor grouping for dealing with high-dimensional problems, (4) visualization for monitoring stability and convergence, (5) model emulation for handling model crashes, and (6) an interface that allows working with any model in any programming language and operating system. As a test bed for training and research, VARS-TOOL provides a set of mathematical test functions and the (dynamical) HBV-SASK hydrologic model.
Complex, software-intensive, technically advanced, and computationally demanding models, presumably with ever-growing realism and fidelity, have been widely used to simulate and predict the dynamics of the Earth and environmental systems. The parameter-induced simulation crash (failure) problem is typical across most of these models despite considerable efforts that modellers have directed at model development and implementation over the last few decades. A simulation failure mainly occurs due to the violation of numerical stability conditions, non-robust numerical implementations, or errors in programming. However, the existing sampling-based analysis techniques such as global sensitivity analysis (GSA) methods, which require running these models under many configurations of parameter values, are ill equipped to effectively deal with model failures. To tackle this problem, we propose a new approach that allows users to cope with failed designs (samples) when performing GSA without rerunning the entire experiment. This approach deems model crashes as missing data and uses strategies such as median substitution, single nearest-neighbor, or response surface modeling to fill in for model crashes. We test the proposed approach on a 10-parameter HBV-SASK (Hydrologiska Byråns Vattenbalansavdelning modified by the second author for educational purposes) rainfall–runoff model and a 111-parameter Modélisation Environmentale–Surface et Hydrologie (MESH) land surface–hydrology model. Our results show that response surface modeling is a superior strategy, out of the data-filling strategies tested, and can comply with the dimensionality of the model, sample size, and the ratio of the number of failures to the sample size. Further, we conduct a “failure analysis” and discuss some possible causes of the MESH model failure that can be used for future model improvement.
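The crash-handling strategies above (treating failed runs as missing data) can be sketched in a few lines. The snippet below is an illustrative assumption, not the authors' code: failed simulations are flagged as NaN in the output vector and filled either by median substitution or by the single nearest neighbour in parameter space.

```python
import numpy as np

def fill_failures(X, y, strategy="median"):
    """Replace NaN outputs of crashed runs before running a GSA.
    X: (n, d) parameter sample; y: (n,) outputs, NaN where the model failed."""
    X, y = np.asarray(X, float), np.asarray(y, float).copy()
    failed = np.isnan(y)
    if not failed.any():
        return y
    if strategy == "median":
        # Median substitution: one value stands in for every crash
        y[failed] = np.median(y[~failed])
    elif strategy == "nearest":
        # Single nearest neighbour: borrow the output of the closest
        # successful run in parameter space (Euclidean distance)
        ok = np.where(~failed)[0]
        for i in np.where(failed)[0]:
            d = np.linalg.norm(X[ok] - X[i], axis=1)
            y[i] = y[ok[np.argmin(d)]]
    return y

# Example: two crashed runs out of six
X = np.array([[0.1], [0.2], [0.4], [0.5], [0.8], [0.9]])
y = np.array([1.0, np.nan, 2.0, np.nan, 4.0, 5.0])
print(fill_failures(X, y, "nearest"))  # [1. 1. 2. 2. 4. 5.]
```

The paper's third strategy, response surface modeling, would replace the nearest-neighbour lookup with a surrogate model fitted to the successful runs and evaluated at the failed parameter sets.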
Dynamical Earth and environmental systems models are typically computationally intensive and highly parameterized with many uncertain parameters. Together, these characteristics severely limit the applicability of Global Sensitivity Analysis (GSA) to high-dimensional models because very large numbers of model runs are typically required to achieve convergence and provide a robust assessment. Paradoxically, only 30 percent of GSA applications in the environmental modelling literature have investigated models with more than 20 parameters, suggesting that GSA is under-utilized on problems for which it should prove most useful. We develop a novel grouping strategy, based on bootstrap-based clustering, that enables efficient application of GSA to high-dimensional models. We also provide a new measure of robustness that assesses GSA stability and convergence. For two models, having 50 and 111 parameters, we show that grouping-enabled GSA provides results that are highly robust to sampling variability, while converging with a much smaller number of model runs.
2018
Anonymous review of scientific manuscripts was intended to encourage reviewers to speak freely, but other models may be better for accountability and inclusivity.
2017
Efficient sampling strategies that scale with the size of the problem, computational budget, and users’ needs are essential for various sampling-based analyses, such as sensitivity and uncertainty analysis. In this study, we propose a new strategy, called Progressive Latin Hypercube Sampling (PLHS), which sequentially generates sample points while progressively preserving the distributional properties of interest (Latin hypercube properties, space-filling, etc.) as the sample size grows. Unlike Latin hypercube sampling, PLHS generates a series of smaller sub-sets (slices) such that (1) the first slice is Latin hypercube, (2) the progressive union of slices remains Latin hypercube and achieves maximum stratification in any one-dimensional projection, and as such (3) the entire sample set is Latin hypercube. The performance of PLHS is compared with benchmark sampling strategies across multiple case studies for Monte Carlo simulation, sensitivity and uncertainty analysis. Our results indicate that PLHS leads to improved efficiency, convergence, and robustness of sampling-based analyses.
• A new sequential sampling strategy called PLHS is proposed for sampling-based analysis of simulation models.
• PLHS is evaluated across multiple case studies for Monte Carlo simulation, sensitivity and uncertainty analysis.
• PLHS provides better performance than the other sampling strategies in terms of convergence rate and robustness.
• PLHS can be used to monitor the performance of the associated sampling-based analysis and to avoid over- or under-sampling.
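For background, each PLHS slice union must satisfy the Latin hypercube property: with n points in d dimensions, every one of the n equal-width strata in each dimension contains exactly one point. Below is a minimal sketch of a plain (non-progressive) Latin hypercube sampler, with illustrative names; it is not the PLHS algorithm itself, which additionally arranges points into slices whose unions stay Latin hypercube.

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    """One Latin hypercube sample of n points in [0, 1)^d: each of the n
    equal-width strata in every dimension holds exactly one point."""
    rng = np.random.default_rng(rng)
    # One stratum index per point per dimension, shuffled independently
    # per dimension, then jittered uniformly inside each stratum
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.uniform(size=(n, d))) / n

sample = latin_hypercube(8, 2, rng=0)
# Each dimension has exactly one point per stratum [k/8, (k+1)/8):
print(np.all(np.sort((sample * 8).astype(int), axis=0) == np.arange(8)[:, None]))  # True
```

A sequential strategy like PLHS would emit this sample in several smaller slices so that an analysis can stop as soon as its estimates converge, instead of committing to the full n up front.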
AbstractThe high impact of river ice phenomena on the hydrology of cold regions has led to the extensive use of numerical models in simulating and predicting river ice processes. Consequently, ther...