Sensitivity analysis adapted to a mixture of epistemic and aleatory uncertainty 


Application to slope stability analysis

Let us consider a simple application of VBSA using a commonly-used model to assess landslide susceptibility, namely the infinite slope analytical model (e.g. [Hansen, 1984]). The stability of the infinite slope depicted in Fig. 2.1 is evaluated by deriving the factor of safety SF, which corresponds to the ratio between the resisting and the driving forces acting on the slope:

SF = [C + (γ − m·γw)·z·cos²(θ)·tan(φ)] / [γ·z·sin(θ)·cos(θ)]   (2.4)

If SF is lower than 1.0, the potential for failure is high.
The model parameters (as indicated in Fig. 2.1) correspond to C, the cohesion of the soil material; φ, the friction angle; θ, the slope angle; γ, the soil unit weight; z, the thickness of slope material above the slip plane; and m, the ratio between the thickness of superficial saturated slope material and z. The water unit weight γw is considered constant at 9.81 kN/m³.
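As a minimal sketch, Eq. 2.4 can be evaluated directly; the parameter values below are purely illustrative (they are not those of Fig. 2.1):

```python
import math

def safety_factor(C, phi, theta, gamma, z, m, gamma_w=9.81):
    """Factor of safety of the infinite slope model (Eq. 2.4).

    C: cohesion [kPa]; phi: friction angle [rad]; theta: slope angle [rad];
    gamma: soil unit weight [kN/m^3]; z: thickness above the slip plane [m];
    m: ratio of saturated thickness to z (0..1); gamma_w: water unit weight.
    """
    num = C + (gamma - m * gamma_w) * z * math.cos(theta) ** 2 * math.tan(phi)
    den = gamma * z * math.sin(theta) * math.cos(theta)
    return num / den

# Hypothetical values for illustration only:
sf = safety_factor(C=10.0, phi=math.radians(30), theta=math.radians(35),
                   gamma=19.0, z=2.0, m=0.5)
print(round(sf, 2))  # SF > 1 here: the slope is predicted stable under these values
```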

Global sensitivity analysis

Main and total effects are computed using the Monte-Carlo-based algorithm developed by [Saltelli, 2002], which requires N·(n+2) model runs (N is the number of Monte-Carlo samples and n is the number of input parameters). The sampling error, due to the Monte-Carlo evaluation of the variances in the definition of the Sobol’ indices (Eqs. 2.1 and 2.2), is estimated through a confidence interval calculated using 100 bootstrap samples, as proposed for instance by [Archer et al., 1997]. Preliminary convergence tests showed that N = 20,000 yields both satisfactory convergence of the sensitivity measures to one decimal place and non-overlapping confidence intervals defined at a level of 90 %. To illustrate, results are depicted in Fig. 2.4 for N = 2,500 and N = 20,000, respectively. In practice, the package “sensitivity” of the R software was used. Several observations can be made:
• the total variance on the safety factor reaches 0.045: this represents the whole uncertainty on SF given the uncertainty on the input parameters (Fig. 2.1).
• the total number of model runs reaches 20,000 × (6+2) = 160,000. Convergence was more difficult to achieve for the total effects than for the main effects.
• considering the value of the main effects, θ appears to be the dominant input parameter, with a sensitivity measure exceeding 60 %, meaning that more than 60 % of the uncertainty on SF (here its variance) is driven by this property.
• the ranking of the epistemic uncertainties is: θ, m, and φ. These are the input parameters which contribute the most to the variance of SF. Priority should be given to them in characterisation studies.
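The N·(n+2)-run scheme above can be sketched in a few lines of pure Python. This is only an illustration, not the R “sensitivity” package implementation: it uses the Saltelli-style main-effect estimator and the Jansen total-effect estimator, applied to a toy additive model whose indices are known analytically (variance shares 16:4:1, so S1 ≈ 0.76, S2 ≈ 0.19, S3 ≈ 0.05), rather than to the infinite slope model:

```python
import random

def saltelli_indices(model, n_inputs, n_samples, seed=0):
    """Estimate main and total Sobol' indices with a Saltelli-type scheme,
    costing n_samples * (n_inputs + 2) model runs.
    Inputs are sampled i.i.d. uniform on [0, 1]; rescale inside `model`
    if other distributions are needed."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_inputs)] for _ in range(n_samples)]
    B = [[rng.random() for _ in range(n_inputs)] for _ in range(n_samples)]
    fA = [model(x) for x in A]
    fB = [model(x) for x in B]
    mean = sum(fA + fB) / (2 * n_samples)
    var = sum((y - mean) ** 2 for y in fA + fB) / (2 * n_samples - 1)
    main, total = [], []
    for i in range(n_inputs):
        # A_B^(i): matrix A with its i-th column taken from B
        fABi = [model(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        main.append(sum(fb * (fab - fa) for fb, fab, fa in zip(fB, fABi, fA))
                    / (n_samples * var))
        total.append(sum((fa - fab) ** 2 for fa, fab in zip(fA, fABi))
                     / (2 * n_samples * var))
    return main, total

# Toy additive model with known indices (variance shares 16:4:1):
model = lambda x: 4 * x[0] + 2 * x[1] + x[2]
S, T = saltelli_indices(model, n_inputs=3, n_samples=20000)
print([round(s, 2) for s in S], [round(t, 2) for t in T])
```

For this additive model the main and total effects coincide; interactions in a real slope model would show up as total effects exceeding the main effects.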

Limitations and links with the subsequent chapters

The application of VBSA to the infinite slope analytical model outlines the richness of the information which can be provided through the calculation of the Sobol’ indices, both for importance ranking and for deeper insight into the model behaviour. Yet, to the author’s best knowledge, this kind of analysis has rarely been conducted in the field of geo-hazards, except for the study by [Hamm et al., 2006]. This can be explained by the specificities of the domain of geo-hazard assessments, which impose considering several constraints.
• Despite the extensive research work on the optimization of the computation algorithms (e.g., [Saltelli et al., 2010] and references therein), VBSA remains computationally intensive, as it requires running a large number of simulations. In the example described in Sect. 2.3, the number of model runs necessary for the Sobol’ indices to converge reaches 160,000. If a single model run had a low computation time (CPU time), say of 1 second, the application of VBSA would require about 44 hours (about 1.8 days) of calculation, which is achievable using a single processing unit (CPU). If the CPU time of a single model run was 1 minute, the application of VBSA would require more than 111 days of calculation, which is achievable using a computer cluster (e.g., [Boulahya et al., 2007]) with a limited number of CPUs (10 to 20). If the CPU time of a single model run was 1 hour, the application of VBSA would require more than 6,666 days (about 18 years) of calculation.
To achieve VBSA within one week of calculation, the computer cluster should be composed of at least 1,000 CPUs. Few research teams or engineering companies can afford such large computer clusters. Nevertheless, most numerical models supporting geo-hazard assessments fall into the third category, either because they are large-scale or because the underlying processes are difficult to solve numerically. The application of slope stability analysis at the spatial scale of a valley [Olivier et al., 2013, Baills et al., 2013] illustrates the first case, whereas the model of the La Frasse landslide illustrates the second (its CPU time is about 4 days, because it involves a complex elastoplastic model describing the complex behaviour of the slip surface; see further details in the next chapter). In those situations, the direct application of VBSA is obviously not achievable. Chapter 3 discusses this issue and proposes to rely on meta-modelling techniques, which basically consist in replacing the long-running simulator by a cheap-to-evaluate mathematical approximation (see an overview by [Storlie et al., 2009]) to overcome this difficulty.
• The second limitation is related to the nature of the parameters (input or output) that VBSA deals with: they are scalar. Yet, in the domain of geo-hazards, parameters are often functional, i.e. they are complex functions of time or space (or both). This means that parameters can be vectors of possibly high dimension (typically 100–1,000). In the infinite slope case, the input parameters describing the soil properties can be spatially varying, for instance due to the presence of heterogeneities at the scale of the slope thickness (like clay patches embedded within some sandy soil formation). Besides, the water table can vary in time, for instance due to time-varying rainfall infiltration. In the La Frasse case, the outputs are not scalar but temporal curves of the displacements (discretized in 300 time steps) at every node of the mesh, i.e. the outputs are vectors of size 300 at each location. Another example is the spatial distribution of hydraulic conductivities of a soil formation (see an example provided by [Tacher et al., 2005]). Chapter 4 further discusses this issue and describes a possible strategy to overcome both the computational burden and the high dimensionality of the model outputs. The case of functional inputs is also addressed.
• Finally, a third limitation is related to the way uncertainty is mathematically represented. By construction, VBSA is based on the assumption that the variance can capture the main features of the uncertainty. This assumption has been shown not to be valid in cases of heavy-tailed or multi-modal distributions [Auder and Iooss, 2008]. Besides, the emphasis is on a particular moment of the distribution, which may be too restrictive for efficient decision-making, because a decision-maker’s/analyst’s state of knowledge on a parameter or on a model output is represented by the entire uncertainty distribution [Borgonovo, 2007] or by a given probability of exceedance, as in stability analysis [Morio, 2011]. Alternatives to VBSA have thus been proposed in the statistical community, based either on the entire probability distribution [Borgonovo, 2007] or on the use of the statistical entropy [Auder and Iooss, 2008], in cases when the variables are deterministic but not known exactly. Yet, in the domain of geo-hazard assessments, data are often scarce, incomplete or imprecise, which adds further difficulties. In the infinite slope case, data can be derived either from the literature or from laboratory tests conducted on soil/rock samples (but these are usually few in number). In particular, the water table height is related to water circulations, which are known to be complex at the scale of a slope and suffer from a lack of information and studies (e.g., [Winckel et al., 2004] for the French Basque coast). Systematically resorting to the probabilistic framework in such situations can be debatable. In Chapter 5, an in-depth discussion is provided. The applicability of an alternative tool (namely fuzzy sets, originally introduced by [Zadeh, 1965]) is then explored to mathematically represent epistemic uncertainty in a more flexible manner. Its integration in a sensitivity analysis is explored in Chapter 6.
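The order-of-magnitude cost figures given in the first limitation above can be checked in a few lines:

```python
import math

n_runs = 20000 * (6 + 2)  # N*(n+2) model runs, as in Sect. 2.3

for unit_cpu_s in (1, 60, 3600):  # CPU time of one run: 1 s, 1 min, 1 h
    total_s = n_runs * unit_cpu_s
    print(f"{unit_cpu_s:>5} s/run -> {total_s / 3600:>9.1f} h"
          f" = {total_s / 86400:>8.1f} days")

# CPUs needed to fit the 1-hour-per-run case into one week of wall-clock time:
cpus = math.ceil(n_runs * 3600 / (7 * 24 * 3600))
print(cpus)  # 953, consistent with "at least 1,000 CPUs"
```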


Table of contents :

Extended abstract (in French)
0.1 Limitation 1: handling the computation time
0.2 Limitation 2: handling parameters varying in space and time
0.3 Limitation 3: handling the lack of knowledge
0.4 In summary…
1 Introduction 
1.1 Hazard, Risk, uncertainty and decision-making
1.2 Aleatory and Epistemic uncertainty
1.3 Epistemic uncertainty of type « parameter »
1.4 A real-case example
1.5 Objectives and structure of the manuscript
2 A probabilistic tool: variance-based global sensitivity analysis 
2.1 Global sensitivity analysis
2.2 Variance-based global sensitivity analysis
2.3 Application to slope stability analysis
2.3.1 Local sensitivity analysis
2.3.2 Global sensitivity analysis
2.4 Limitations and links with the subsequent chapters
3 Handling long-running simulators 
3.1 A motivating real case: the numerical model of the La Frasse landslide
3.1.1 Description of the model
3.1.2 Objective of the sensitivity analysis
3.2 A meta-model-based strategy
3.2.1 Principles
3.2.2 Step 1: setting the training data
3.2.3 Step 2: construction of the meta-model
3.2.4 Step 3: validation of the meta-model
3.3 A flexible meta-model: the kriging model
3.4 An additional source of uncertainty
3.5 Application to an analytical case
3.6 Application to the La Frasse case
3.7 Concluding remarks of Chapter 3
4 Handling functional variables 
4.1 Problem definition for functional outputs
4.2 Reducing the dimension
4.2.1 Principles
4.2.2 Principal Component Analysis
4.2.3 Interpreting the basis set expansion
4.3 Strategy description
4.3.1 Step 1: selecting the training samples
4.3.2 Step 2: reducing the model output dimensionality
4.3.3 Step 3: constructing the meta-model
4.3.4 Step 4: validating the meta-model
4.4 Application to the La Frasse case
4.4.1 Construction of the meta-model
4.4.2 Computation and analysis of the main effects
4.5 Towards dealing with functional inputs
4.5.1 Strategy description
4.5.2 Case study
4.5.3 Discussion
4.6 Concluding remarks of Chapter 4
5 A more flexible tool to represent epistemic uncertainties 
5.1 On the limitations of the systematic use of probabilities
5.2 Handling vagueness
5.2.1 A motivating real-case: hazard related to abandoned underground structures
5.2.2 Membership function
5.2.3 Application
5.3 Reasoning with vagueness
5.3.1 A motivating real-case: the inventory of assets at risk
5.3.2 Application of Fuzzy Logic
5.4 Handling imprecision
5.4.1 Possibility theory
5.4.2 A practical definition
5.4.3 Illustrative real-case application
5.5 Handling probabilistic laws with imprecise parameters
5.5.1 A motivating example: the Risk-UE (level 1) model
5.5.2 Problem definition
5.5.3 Use for an informed decision
5.6 Concluding remarks of Chapter 5
6 Sensitivity analysis adapted to a mixture of epistemic and aleatory uncertainty 
6.1 State of the art of sensitivity analysis accounting for hybrid uncertainty representations
6.2 A graphical-based approach
6.2.1 Motivation
6.2.2 Joint propagation of randomness and imprecision
6.2.3 Contribution to probability of failure sample plot
6.2.4 Adaptation to possibilistic information
6.3 Case studies
6.3.1 Simple example
6.3.2 Case study n°1: stability analysis of steep slopes
6.3.3 Case study n°2: stability analysis in post-mining
6.3.4 Case study n°3: numerical simulation for stability analysis in post-mining
6.4 Concluding remarks for Chapter 6
7 Conclusions 
7.1 Achieved results
7.2 Open questions and Future developments
7.2.1 Model uncertainty
7.2.2 Use of new uncertainty theories for practical decision-making
A Functional decomposition of the variance: the Sobol´ indices 
B Universal kriging equations 
C Key ingredients of a Bayesian treatment of kriging-based meta-modelling 
C.1 Principles of Bayesian Model Averaging
C.2 Monte-Carlo-based procedures
C.3 Bayesian kriging
C.4 Deriving a full posterior distribution for the sensitivity indices
D Brief introduction to the main uncertainty theories 
D.1 Probability
D.2 Imprecise probability
D.3 Evidence theory
D.4 Probability bound analysis
D.5 Possibility theory
E Fuzzy Random Variable 
Bibliography 

