Article

Global Sensitivity Analysis Based on Entropy: From Differential Entropy to Alternative Measures

Department of Structural Mechanics, Faculty of Civil Engineering, Brno University of Technology, 602 00 Brno, Czech Republic
Entropy 2021, 23(6), 778; https://doi.org/10.3390/e23060778
Submission received: 31 May 2021 / Revised: 16 June 2021 / Accepted: 17 June 2021 / Published: 19 June 2021

Abstract

Differential entropy can be negative, while discrete entropy is always non-negative. This article shows that negative entropy is a significant flaw when entropy is used as a sensitivity measure in global sensitivity analysis. Just as Sobol sensitivity analysis cannot yield negative variance, global sensitivity analysis based on differential entropy should not operate with negative entropy. Entropy is similar to variance but does not have the same properties. An alternative sensitivity measure based on the approximation of the differential entropy using dome-shaped functionals with non-negative values is proposed in this article. Case studies have shown that the new sensitivity measures lead to a rational structure of sensitivity indices, with a significantly lower proportion of higher-order sensitivity indices compared to other types of distributional sensitivity analysis. In terms of the concept of sensitivity analysis, a decrease in variance to zero means a transition from differential to discrete entropy. The form of this transition is an open question, which may be studied using other scientific disciplines. The search for new functionals for distributional sensitivity analysis is not closed, and other suitable sensitivity measures may be found.

1. Introduction

Sensitivity analysis (SA) based on entropy uses entropy to quantify uncertainty as Sobol SA [1,2] uses variance. Probability distributions with low variance have low entropy, while probability distributions with high variance have high entropy.
From a mathematical point of view, entropy is a certain additive functional on the probability distributions of possible states of a given system [3]. Entropy-based SA belongs to the category of distributional SA, which includes, for example, methods [4,5,6,7,8,9]. In these SAs, uncertainty is characterized by examining the entire distribution of model outputs, not just its variance.
There exist two popular indices based on entropy that have been used for SA. The first is entropy-based SA [10], which is based on the definition of Shannon’s entropy [11]. The second [12] is based on Kullback–Leibler entropy, which measures the difference between two probability distributions.
The use of entropy instead of variance is usually justified by the need to analyze the output random variable with heavy-tail or outliers [13]. SA based on entropy was used to study, for example, traffic flow [13], limit states of load-bearing structures [14,15], the seismic demand of concrete structures [16], and groundwater level [17].
Another group of tasks uses entropy to examine the state of a system in combination with certain types of SA, which may not be based on entropy. This group includes, for example, SA of the working process of heat exchangers [18], the hydraulic reliability of water distribution systems [19], shear stress distribution in a rectangular channel [20], creep of soft marine soil [21], air energy storage systems in coal-fired power plants [22], uncertainties of mathematical decision-making models [23,24,25,26,27], and many others.
Other types of SA include quantile-oriented methods [28,29,30,31,32,33], methods based on the probability of failure [34,35,36,37,38,39,40,41], and decision-making approaches [42,43,44,45]. The subject of interest of these types of SA is the reliability of the system, which cannot be examined using distribution-based types of SA.
The choice of SA method(s) remains an open question. A comparison of different types of sensitivity indices [39] shows that the sensitivity order from different types of SA can be the same or similar. Firm conclusions on the selection of the best SA method could not be reached in the majority of the studies pertaining to SA, which is understandable, because there are numerous ways to define which is the best one [46]. Sobol SA is very popular (see, e.g., [47,48,49,50,51,52,53,54]); on the contrary, less unified types of SA are less widespread [55,56,57,58,59,60,61,62,63,64]. In general, a global SA, which analyzes the influence of the variability of inputs throughout their distribution range and can describe the influence of interactions between input variables on the output, can be recommended.
This article aims to research global SA based on differential entropy using case studies. The concept of estimating sensitivity indices is described and the reasons why negative differential entropy is an undesirable part of SA are mentioned. The motivation for this work is to propose an alternative sensitivity measure based on the approximation of the differential entropy using functionals with non-negative values.

2. Entropy of a Random Variable

The concept of entropy for a discrete random variable was introduced by Claude Shannon [11] as a useful benchmark in information theory. The entropy of a discrete random variable Y with probability mass function P(yi) can be written using the equation:
$$H(Y) = -\sum_{i=1}^{n} P(y_i)\,\log_b\!\big(P(y_i)\big). \tag{1}$$
A valuable property of discrete entropy is that the entropy of a discrete random variable Y is zero or positive, because the probabilities P(yi) in Equation (1) lie in [0, 1]. This is also an important difference from differential entropy.
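As a minimal numerical illustration of Equation (1) (the probability values below are arbitrary), discrete entropy is non-negative and drops to zero for a deterministic variable:

```python
import numpy as np

def discrete_entropy(p, b=2.0):
    """Shannon entropy, Equation (1); terms with P(y_i) = 0 are taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(b)

print(discrete_entropy([0.5, 0.25, 0.25]))   # 1.5 bits
print(discrete_entropy([1.0]))               # 0.0 -- deterministic variable
```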
The differential (continuous) entropy can be defined using the following formula:
$$H(R) = -\int_{-\infty}^{\infty} f(r)\,\log_b\!\big(f(r)\big)\,dr. \tag{2}$$
R is a continuous random variable with probability density function (pdf) f(r) on the real line. The differential entropy is not the limit of the Shannon entropy for n → ∞, although Equation (2) resembles an intuitive extension of Equation (1). In particular, the differential entropy may be negative, because the integrand in Equation (2) contributes negative values wherever f(r) > 1. For a Gauss pdf, this occurs when the standard deviation σR of f(r) is very small, which is illustrated by the example with mean value μR = 0, where σR is a parameter of the graph—see Figure 1.
Entropy is a measure of uncertainty similar to variance: higher entropy indicates higher uncertainty, much as higher variance does. An important difference occurs for small uncertainties. If R has a Gauss pdf, then H(R) decreases into negative values as σR decreases, with the limit H(R) → −∞ as σR → 0—see Figure 1. The same holds for other classic pdfs. Unlike the discrete case, the entropy of a continuous system is not invariant under transformations of the coordinate system [65].
The right part of Figure 1 shows that the dependence of H(R) on ln(σR) is linear; the dependence of H(R) on ln(σR²) is then trivially linear as well. The linear dependence of H(R) on ln(σR) is observed for the Gauss pdf of R but does not occur for every pdf. For example, for a log-normal pdf of R, the dependence of H(R) on ln(σR) is linear if the coefficient of variation is held constant.
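The behavior shown in Figure 1 can be reproduced directly from the closed-form entropy of the Gauss pdf, H(R) = ½ ln(2πe σR²) for b = e; a minimal sketch:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Differential entropy of a Gauss pdf, Equation (2) with b = e. The closed
# form 0.5*ln(2*pi*e*sigma^2) is linear in ln(sigma), as in Figure 1.
def gauss_entropy_exact(sigma):
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)

def gauss_entropy_numeric(sigma, mu=0.0):
    f = norm(mu, sigma).pdf
    H, _ = quad(lambda r: -f(r) * np.log(f(r)), mu - 10 * sigma, mu + 10 * sigma)
    return H

for sigma in [10.0, 1.0, 0.1, 0.01]:
    print(sigma, round(gauss_entropy_exact(sigma), 4),
          round(gauss_entropy_numeric(sigma), 4))
# Negative for sigma < 1/sqrt(2*pi*e) ~ 0.242, and H(R) -> -inf as sigma -> 0.
```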

3. Entropy-Based Sensitivity Analysis

In SA, entropy is used as a measure of uncertainty in two types of sensitivity indices [10,12]. Both SAs analyze changes in the probability density function (pdf) of the model output under the condition that one or more input random variables are fixed. The first concept [10] is based on conditional entropy and directly uses the definition of Shannon's entropy. The second concept [12] is based on the conditional relative entropy, called Kullback–Leibler entropy.
This article builds on the first concept [10], but with the implementation of differential entropy according to Equation (2) and with the introduction of new alternative sensitivity measures.
Any computational model may be regarded as a function R = g(X), where R is a scalar model output, and X is a vector of M uncertain model inputs {X1, X2, … XM}, where statistical independence is assumed between inputs.

3.1. Sensitivity Indices Based on Differential Entropy H(R)

Global sensitivity indices based on entropy can be formulated analogously to Sobol sensitivity indices [1,2], with the difference that variance is replaced by entropy [10]. Can global sensitivity indices be formulated using Equation (2), despite the shortcoming that the differential entropy can be negative when the variance is small? The answer can be obtained by analyzing the sensitivity indices of the first and higher orders. The first-order entropy-based sensitivity index Ti can be written as:
$$T_i = \frac{H(R) - E\big(H(R \mid X_i)\big)}{H(R)}, \qquad T_i \in [0, 1], \tag{3}$$
where the mean value E [·] is taken across all likely values of Xi. H(R|Xi) is the conditional differential entropy, which represents the average loss of random variability on model output R when the input value of Xi is known.
The values of H(R) and H(R|Xi) must be such that Ti ∈ [0, 1]. In the limit case, if R|Xi loses all random variability (σR|Xi = 0), then the expected influence of Xi on R is 100%, which means Ti = 1. Therefore, H(R|Xi) must be equal to zero, not −∞ as given by Equation (2)—see Figure 1. Equation (2) has the drawback that it can give negative entropy, which permits a sensitivity greater than 100% (Ti > 1); this is not desired in the SA concept. On the other hand, Equation (2) satisfies the second extreme, H(R) = H(R|Xi) and Ti = 0, where fixing Xi does not influence the pdf of output R. From the point of view of the SA concept, the problematic cases are those where the variance of the output decreases to zero.
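A worked example (assuming a Gaussian output and b = e) makes the defect concrete. The differential entropy of a Gauss pdf is H(R) = ln(σR √(2πe)). With σR = 1, H(R) ≈ 1.419; if fixing Xi reduces the conditional standard deviation to σR|Xi = 0.05 for every value of Xi, then H(R|Xi) ≈ −1.577, and Equation (3) gives

$$T_i = \frac{1.419 - (-1.577)}{1.419} \approx 2.11 > 1,$$

i.e., a nominal sensitivity above 100% arising from an entirely legitimate reduction of uncertainty.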
The variance and entropy of R decrease further if two or more input variables are fixed. The second-order entropy-based sensitivity index Tij is computed by fixing the pair Xi, Xj:
$$T_{ij} = \frac{H(R) - E\big(H(R \mid X_i, X_j)\big)}{H(R)} - T_i - T_j, \tag{4}$$
where the mean value E(·) is taken across all likely values of Xi and Xj. The third-order sensitivity index Tijk is computed analogously:
$$T_{ijk} = \frac{H(R) - E\big(H(R \mid X_i, X_j, X_k)\big)}{H(R)} - T_i - T_j - T_k - T_{ij} - T_{ik} - T_{jk}. \tag{5}$$
All input random variables are assumed to be statistically independent. The sum of all indices must be equal to one:
$$\sum_i T_i + \sum_i \sum_{j>i} T_{ij} + \sum_i \sum_{j>i} \sum_{k>j} T_{ijk} + \cdots + T_{123 \ldots M} = 1. \tag{6}$$
The total entropy-based sensitivity index TTi can be written as:
$$T_{Ti} = 1 - \frac{H(R) - E\big(H(R \mid X_{\sim i})\big)}{H(R)}, \tag{7}$$
where X∼i denotes all inputs except Xi; the conditional entropy in the numerator is evaluated with the input random variable Xi left random and the remaining variables (X1, X2, …, Xi−1, Xi+1, …, XM) fixed.
The higher the order of the sensitivity index, the more input variables are fixed and the lower the entropy of the output R, including negative values. Each additional order of the sensitivity index has one additional random variable fixed until all inputs are fixed and the output becomes deterministic. During this process, both the entropy and the variance of R decrease. While the variance decreases to zero, the entropy decreases to negative values, which is particularly problematic when estimating higher-order sensitivity indices.
If all input random variables are fixed, then the output R is deterministic and the variance of the output R is zero. This occurs when the last-order sensitivity index is computed. Although the deterministic value of the output should have zero entropy according to Equation (1), the differential entropy according to Equation (2) extends to minus infinity. From the point of view of the SA concept, the entropy needs to decrease to zero when the variance of the output reaches zero.
This article seeks new forms of functionals as alternatives to Equation (2), rather than applying the Kullback–Leibler (K–L) (relative) entropy [66,67]. In terms of the concept of global SA, alternative functionals are sought such that the entropy is not negative when the variance of the output goes to zero.

3.2. Approximation of Differential Entropy by Functional H̃(R) for Sensitivity Indices

From an SA point of view, Equation (2) is a functional that assigns a real value to the function f(r). A modified Equation (2) should map the argument of the logarithm into the interval from zero to one, so that the output value (entropy) cannot be negative for small σR, while differing as little as possible from the differential entropy where f(r) < 1. The modifying function should therefore increase approximately as the identity in the unproblematic region, so that Equation (2) is fitted well there. One such function, useful in modifying Equation (2), is based on the hyperbolic tangent:
$$g(z) = \left[\tanh\!\left(z^{t}\right)\right]^{1/t}, \tag{8}$$
where z = f(r).
In Equation (8), the larger the exponent t, the better g(z) fits the bilinear limit function—see the left graph in Figure 2. The function g(z) never produces an output greater than one; the limit case for t → ∞ has the form g(z) = z for z ∈ [0, 1) and g(z) = 1 for z ∈ [1, ∞)—see Figure 2.
Substituting Equation (8) into Equation (2), we obtain Equation (9) in the form:
$$\tilde{H}(R) = -\int f(r)\,\log_b\!\big(g[f(r)]\big)\,dr = -\int f(r)\,\log_b\!\left(\left[\tanh\!\left(f(r)^{t}\right)\right]^{1/t}\right)dr, \tag{9}$$
where H̃(R) ≥ 0 and H̃(R) → 0 as σR → 0. The right side of Figure 2 shows examples of the plots of the integrand factor z·ln(g(z)) used in Equation (9) with the natural logarithm.
From the point of view of SA, Equation (9) is a functional that has the properties required in the decomposition into sensitivity indices. In the limit case t → ∞, the positive values of the logarithm are simply replaced by zero in the integral in Equation (9). It is apparent from the example shown in Figure 3 that the functional of Equation (9) differs from the differential entropy of Equation (2) only for small σR; otherwise H̃(R) ≈ H(R) according to Equation (2).
Equation (9) prevents the logarithm from returning a positive value within the integral when f(r) > 1. Let f(r) be a Gauss pdf and let the natural logarithm be used in Equation (9). The values of the functional H̃(R) are then plotted in Figure 3, including a comparison with the differential entropy H(R) computed according to Equation (2). Equation (9) approximates the differential entropy of Equation (2) very closely for large values of σR. It can be noted that there is little difference between the values obtained from Equation (9) for t = 4 (red curve) and t = ∞ (black curve)—see Figure 3.
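A minimal numerical sketch of Equation (9) for a Gauss pdf (natural logarithm; the parameter values are illustrative):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import simpson

def g(z, t=4.0):
    """Equation (8): g(z) = tanh(z^t)^(1/t); 0 <= g(z) <= 1 for z >= 0."""
    return np.tanh(z**t) ** (1.0 / t)

def H_tilde(sigma, mu=0.0, t=4.0, n=20001):
    """Equation (9) for a Gauss pdf with b = e. The range mu +/- 10*sigma
    keeps f(r)^t above floating-point underflow."""
    r = np.linspace(mu - 10.0 * sigma, mu + 10.0 * sigma, n)
    f = norm(mu, sigma).pdf(r)
    return simpson(-f * np.log(g(f, t)), x=r)

def H_exact(sigma):
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)

for sigma in [2.0, 0.5, 0.05, 0.005]:
    print(sigma, round(H_exact(sigma), 4), round(H_tilde(sigma), 4))
# For large sigma (f(r) << 1), H_tilde ~ H(R); for small sigma, H_tilde
# stays non-negative and tends to zero instead of minus infinity.
```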
The defect in Equation (2) could also be patched by an additional condition that replaces negative entropy values with zero whenever they occur. However, such a patch can assign zero differential entropy even when the random variable still has a non-zero standard deviation, σR > 0, so a gradual decrease to zero is more appropriate.
The sensitivity indices based on Equation (9) are computed according to Equations (3)–(7), with the difference that the differential entropy is replaced by the functional H̃(R) according to Equation (9). Sensitivity indices based on H̃(R) are denoted T̃i, T̃ij, T̃ijk, …; each index is in the interval [0, 1] and the sum of all sensitivity indices is equal to one.

3.3. Approximation of Differential Entropy by Functional Ĥ(R) and Sensitivity Indices

Let us consider another functional Ĥ(R) approximating Equation (2) such that Ĥ(R) ≥ 0:
$$\hat{H}(R) = c_1 \int f(r)\, e^{-2 f(r)^{4}}\, dr, \quad \text{where} \quad c_1 = \frac{0.794423}{\ln(b)}. \tag{10}$$
In order to approach Equation (2), c1 can be computed from the condition
$$c_1 \max_z\!\left(z\, e^{-2 z^{4}}\right) = \max_z\!\left(-z \log_b(z)\right), \tag{11}$$
where the maxima are sought over z ∈ [0, ∞) and are attained at the following arguments:
$$z_1 = \underset{z}{\operatorname{Argmax}}\left(z\, e^{-2 z^{4}}\right) = \frac{\sqrt[4]{2}}{2}, \qquad z_2 = \underset{z}{\operatorname{Argmax}}\left(-z \log_b(z)\right) = \frac{1}{e}. \tag{12}$$
Upon substituting z1 and z2 into Equation (11), c1 can be computed as
$$c_1 = \frac{-\frac{1}{e}\,\log_b\!\left(\frac{1}{e}\right)}{\frac{\sqrt[4]{2}}{2}\, e^{-2\left(\frac{\sqrt[4]{2}}{2}\right)^{4}}} = \frac{2^{3/4}}{e^{3/4}\,\ln(b)} \approx \frac{0.794423}{\ln(b)}. \tag{13}$$
The left part of Figure 4 shows the variants of the functions integrated in Equation (10). Figure 4 shows that Equation (10) does not approximate the differential entropy as closely as Equation (9); however, this does not present a problem. The function used in Equation (10) differs more or less from the function −z·logb(z) for both small and large values of z = f(r)—see the left part of Figure 4. Using a Gauss pdf for f(r), larger deviations in the approximation of the differential entropy are observed for both large and small values of σR—see the right part of Figure 4.
Although the approximation of the differential entropy using Equation (10) is not as close as that using Equation (9), this is not a shortcoming in the evaluation of the sensitivity indices, as shown in the case studies below.
Sensitivity indices based on Ĥ(R) are computed according to Equations (3)–(7), with the difference that the differential entropy is replaced by the functional Ĥ(R) according to Equation (10). Sensitivity indices based on Ĥ(R) are denoted T̂i, T̂ij, T̂ijk, …. The functional gives a non-negative output and is equal to zero when σR = 0; each index is in the interval [0, 1] and the sum of all sensitivity indices is equal to one.
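A minimal sketch of Equation (10), again for a Gauss pdf with the natural logarithm, including a numerical check of the matching condition in Equation (11):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import simpson

C1 = 2.0**0.75 * np.e**-0.75        # Equation (13) with b = e: ~0.794423

def H_hat(sigma, mu=0.0, n=20001):
    """Equation (10): H_hat(R) = c1 * integral of f(r) * exp(-2 f(r)^4) dr."""
    r = np.linspace(mu - 10.0 * sigma, mu + 10.0 * sigma, n)
    f = norm(mu, sigma).pdf(r)
    return C1 * simpson(f * np.exp(-2.0 * f**4), x=r)

# Matching condition, Equation (11): after scaling by c1, the dome-shaped
# curve z*exp(-2 z^4) has the same maximum as -z*ln(z), namely 1/e.
z = np.linspace(1e-9, 2.0, 200001)
print(C1 * np.max(z * np.exp(-2.0 * z**4)))   # ~0.367879 = 1/e
print(np.max(-z * np.log(z)))                 # ~0.367879 = 1/e

for sigma in [2.0, 0.5, 0.05]:
    print(sigma, round(H_hat(sigma), 4))      # H_hat >= 0; -> 0 as sigma -> 0
```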

4. Standard Distribution-Based Sensitivity Analyses

The sensitivity analysis described in the previous section is based on the probability distribution of all possible outcomes of the observed random phenomenon R. This type of SA can be categorized as distribution-based SA, because its result depends on the whole probability distribution of the random variable R and not only on one moment, as is the case, for example, with Sobol SA. Other types of distribution-based sensitivity analysis that are relevant for comparison are Cramér–von Mises SA and Borgonovo moment-independent SA.

4.1. Cramér-von Mises Sensitivity Indices

Let ΦR be the distribution function of R, where R is the model output and X is a vector of M uncertain model inputs {X1, X2, …, XM} under the assumption of statistical independence:
$$\Phi_R(p) = P(R \le p) = E\left(\mathbf{1}_{R \le p}\right), \quad \text{for } p \in \mathbb{R}. \tag{14}$$
Let Φ_R^i be the conditional distribution function of R given Xi:
$$\Phi_R^{i}(p) = P(R \le p \mid X_i) = E\left(\mathbf{1}_{R \le p} \mid X_i\right), \quad \text{for } p \in \mathbb{R}. \tag{15}$$
The first-order Cramér–von Mises index Gi is determined by measuring the distance between the probability ΦR(p) and the conditional probability Φ_R^i(p) when one input is fixed [68]:
$$G_i = \frac{\displaystyle\int E\left[\left(\Phi_R(p) - \Phi_R^{i}(p)\right)^{2}\right] d\Phi_R(p)}{\displaystyle\int \Phi_R(p)\left(1 - \Phi_R(p)\right) d\Phi_R(p)}, \tag{16}$$
The second-order Cramér–von Mises index Gij can be written using [68] as
$$G_{ij} = \frac{\displaystyle\int E\left[\left(\Phi_R(p) - \Phi_R^{ij}(p)\right)^{2}\right] d\Phi_R(p)}{\displaystyle\int \Phi_R(p)\left(1 - \Phi_R(p)\right) d\Phi_R(p)} - G_i - G_j, \tag{17}$$
Since sensitivity indices Gi, Gij, etc., are based on Hoeffding decomposition, the sum of all sensitivity indices is one [68]. Other characteristics of Cramér-von Mises indices and their behavior in engineering applications are mentioned in [39,69].
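For orientation, the sketch below estimates Gi by a plain double-loop Monte Carlo scheme for a product of three independent Gaussian inputs (the statistics anticipate Table 1 and are assumed here; this is an illustrative estimator, not the Pick–Freeze scheme of [68]):

```python
import numpy as np

rng = np.random.default_rng(1)

# Product of three independent Gaussian inputs; the statistics anticipate
# Table 1 (fy [MPa], t2 [mm], b [mm]) and are assumed here for the sketch.
MEANS = np.array([412.68, 12.0, 100.0])
STDS = np.array([27.941, 0.55, 1.0])

def model(x):
    return x[..., 0] * x[..., 1] * x[..., 2]

def cvm_first_order(i, n_outer=400, n_inner=400, n_grid=200):
    """Double-loop Monte Carlo estimate of the first-order Cramer-von Mises
    index G_i, Equation (16)."""
    # Grid p at equally spaced quantiles of the unconditional R, so that
    # integration with respect to dPhi_R(p) becomes a plain average.
    R = model(rng.normal(MEANS, STDS, size=(50000, 3)))
    levels = (np.arange(n_grid) + 0.5) / n_grid    # Phi_R(p) on the grid
    p = np.quantile(R, levels)
    sq = np.zeros(n_grid)
    for _ in range(n_outer):
        x = rng.normal(MEANS, STDS, size=(n_inner, 3))
        x[:, i] = rng.normal(MEANS[i], STDS[i])    # X_i fixed at one value
        phi_i = (model(x)[:, None] <= p[None, :]).mean(axis=0)
        sq += (levels - phi_i) ** 2
    numerator = np.mean(sq / n_outer)              # int E[(.)^2] dPhi_R
    denominator = np.mean(levels * (1.0 - levels)) # int Phi(1-Phi) dPhi ~ 1/6
    return numerator / denominator

print([round(cvm_first_order(i), 3) for i in range(3)])
```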

4.2. Borgonovo Moment-Independent Sensitivity Indices

Borgonovo moment-independent sensitivity indices [70] examine the whole distribution of inputs and outputs:
$$B_i = \frac{1}{2}\, E\!\left(\int \left| f(r) - f_{R|X_i}(r) \right| dr\right), \tag{18}$$
where f(r) is the pdf of R and fR│Xi(r) is the conditional pdf of R with fixed parameter Xi [70].
Upon fixing pairs Xi, Xj, we obtain the second-order index Bij, where i < j; upon fixing triplets Xi, Xj, Xk, we obtain the third-order index Bijk, where i < j < k, etc. The higher the order of the index, the greater its value, and the index of the last order is equal to one: 0 ≤ Bi ≤ Bij ≤ … ≤ B1,2,…,M = 1 [70]. Compared to SA types whose indices sum to one, Borgonovo indices are less practical, especially the last-order index with all inputs fixed, which always equals one and provides no new information. Identification of the influence of Xi using total indices is not possible for Borgonovo indices. One advantage of Borgonovo indices is that they evaluate the whole distribution of the output in a more transparent way than [68]—compare Equation (18) with Equation (16). A second advantage is that the input variables can be statistically correlated, which is difficult to ensure for other types of indices based on decomposition, such as Sobol indices.
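A minimal sketch of Equation (18) for an additive toy model R = X1 + X2 with independent Gaussian inputs, chosen only because every conditional pdf is then Gaussian in closed form (the standard deviations below are arbitrary):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import simpson

# Toy model R = X1 + X2 with independent Gaussian inputs: each conditional
# pdf is Gaussian in closed form, which keeps the sketch self-contained.
S1, S2 = 2.0, 1.0                  # assumed input standard deviations
SR = np.hypot(S1, S2)              # standard deviation of R

def borgonovo_first_order(s_fixed, s_rest, n_outer=2000, n_grid=4001):
    """B_i = 0.5 * E_v[ integral |f(r) - f_{R|Xi=v}(r)| dr ], Equation (18)."""
    rng = np.random.default_rng(0)
    r = np.linspace(-8.0 * SR, 8.0 * SR, n_grid)
    f = norm(0.0, SR).pdf(r)                      # unconditional pdf of R
    acc = 0.0
    for v in rng.normal(0.0, s_fixed, n_outer):   # sampled values of X_i
        f_cond = norm(v, s_rest).pdf(r)           # pdf of R given X_i = v
        acc += 0.5 * simpson(np.abs(f - f_cond), x=r)
    return acc / n_outer

print(round(borgonovo_first_order(S1, S2), 3))   # B_1 (dominant input X1)
print(round(borgonovo_first_order(S2, S1), 3))   # B_2
```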

5. Variance-Based Sensitivity Analysis

Sobol SA [1,2] is a variance-based SA, which decomposes the variance of the model output into segments that can be attributed to inputs or sets of inputs. Sobol SA is a classical method, which is an important part of the research of computational models in stochastic structural mechanics—see, e.g., [71,72,73]. Although Sobol SA is of a different type than the SA in the previous chapters, supplementing the case study with Sobol indices is useful for comparison.
Although the differential entropy depends on the entire shape of the pdf, the variance is an important characteristic in the computation of H(R): entropy is a measure of system uncertainty similar to variance, and entropy increases when variance increases. This dependence on the variance makes Equation (3) similar to Sobol's first-order sensitivity index:
$$S_i = \frac{V(R) - E\big(V(R \mid X_i)\big)}{V(R)}, \tag{19}$$
where V(R) is the variance and V(R|Xi) is the conditional variance of the model output R.
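Equation (19) is the law-of-total-variance form of the classic definition Si = V(E(R|Xi))/V(R), since

$$V(R) = V\big(E(R \mid X_i)\big) + E\big(V(R \mid X_i)\big).$$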
Higher-order Sobol sensitivity indices can be written analogously—see, e.g., [74].

6. The Case Studies

The resistance R of a steel member of a rectangular cross-section in tension is studied using six SA methods. Two new types of sensitivity indices, which are based on functionals H ˜ ( R ) and H ^ ( R ) , are compared with four classical types of sensitivity indices in the case study.

6.1. Computational Model

The resistance R is the product of three random variables: yield strength fy, thickness t2, and width b. Statistical characteristics of fy, t2 and b are taken into consideration using the results of experimental research [75,76], where steel grade S355 was studied for selected steel products—see Table 1.
The cross-sectional area is expressed as the product of the thickness and width. The resistance R is the product of yield strength fy and cross-sectional area t2·b—see Equation (20).
$$R = f_y\, t_2\, b. \tag{20}$$
The mean value μR of the product R can be obtained from Equation (21):
$$\mu_R = \mu_{f_y}\, \mu_{t_2}\, \mu_b. \tag{21}$$
The variance of the product R can be obtained from Equation (22):
$$\sigma_R^2 = \left(\mu_b^2 + \sigma_b^2\right)\left(\mu_{t_2}^2\, \sigma_{f_y}^2 + \sigma_{f_y}^2\, \sigma_{t_2}^2\right) + \mu_{f_y}^2\left(\sigma_{t_2}^2\left(\mu_b^2 + \sigma_b^2\right) + \mu_{t_2}^2\, \sigma_b^2\right), \tag{22}$$
where σR is the standard deviation.
Although all input random variables have Gauss pdf, their product has non-zero skewness aR—see Equation (23).
$$a_R = \frac{6\,\mu_R}{\sigma_R^3}\left(\mu_{f_y}^2 \sigma_{t_2}^2 \sigma_b^2 + \sigma_{f_y}^2 \mu_{t_2}^2 \sigma_b^2 + \sigma_{f_y}^2 \sigma_{t_2}^2 \mu_b^2 + 4\,\sigma_{f_y}^2 \sigma_{t_2}^2 \sigma_b^2\right). \tag{23}$$
The following statistical characteristics of output R: μR = 495.216 kN, σR = 40.822 kN, aR = 0.11 are obtained upon substituting the statistical characteristics of the input random variables from Table 1—see Figure 5.
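The stated characteristics can be checked directly from Equations (21)–(23) and Table 1; a minimal sketch (units: MPa and mm, so R comes out in N):

```python
import numpy as np

# Input statistics from Table 1: yield strength fy [MPa = N/mm^2],
# thickness t2 [mm], width b [mm].
mf, sf = 412.68, 27.941
mt, st = 12.0, 0.55
mb, sb = 100.0, 1.0

mu_R = mf * mt * mb                                   # Equation (21)
var_R = ((mb**2 + sb**2) * (mt**2 * sf**2 + sf**2 * st**2)
         + mf**2 * (st**2 * (mb**2 + sb**2) + mt**2 * sb**2))   # Equation (22)
sd_R = np.sqrt(var_R)
a_R = (6.0 * mu_R / sd_R**3) * (mf**2 * st**2 * sb**2
        + sf**2 * mt**2 * sb**2 + sf**2 * st**2 * mb**2
        + 4.0 * sf**2 * st**2 * sb**2)                # Equation (23)

print(mu_R / 1e3, sd_R / 1e3, a_R)   # ~495.216 kN, ~40.822 kN, ~0.11
```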
To evaluate global SA, the pdf of product R can be reliably approximated using a log-normal pdf with three parameters—μR, σR, and aR [69]. The three-parameter log-normal pdf can be used to estimate the sensitivity index, even if one of the three variables in Equation (20) is fixed as deterministic [32]. If two input variables are fixed, aR = 0 and product R has a Gauss pdf.

6.2. The Results of the Case Studies

The sensitivity indices are estimated using the Latin Hypercube Sampling (LHS) method [77,78] in combination with numerical integration. In equations where the arithmetic mean E(·) is used, its value is estimated using 1000 LHS runs. The three-parameter log-normal pdf of f(r) is used [69]. In all cases, integration is performed numerically by Simpson's rule, using more than 10,000 integration steps over the interval [μR − 10σR, μR + 10σR]. If the lower bound of the domain of f(r) is greater than μR − 10σR, then integration is performed from the lower bound of f(r). Numerical integration is not used for the Sobol sensitivity indices, which are computed analytically using Equation (22). The Cramér–von Mises sensitivity indices are estimated using the algorithm described in [39,69], with the difference that three input random variables are used in this article. The Borgonovo sensitivity indices were estimated according to the procedure in [39]. Further details of the numerical estimates of the sensitivity indices can be found in [39,69,72]. The results of the sensitivity analyses are shown in Figures 6–11.
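As an illustration of this procedure, the sketch below assembles the loop for a first-order index of Equation (3): the conditional moments follow in closed form from Equations (21)–(23) with one factor made deterministic, and the three-parameter log-normal fit enters only through the closed-form entropy of a (shifted) log-normal variable. The helper names and the skew-to-parameter inversion are our own assumptions, not code from the study:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm, qmc

INPUTS = {"fy": (412.68, 27.941), "t2": (12.0, 0.55), "b": (100.0, 1.0)}

def product_moments(factors):
    """(mu, sigma, skew) of a product of three independent Gaussian factors
    given as (mean, std) pairs; a deterministic factor has std = 0."""
    (m1, s1), (m2, s2), (m3, s3) = factors
    mu = m1 * m2 * m3
    var = (m1**2 + s1**2) * (m2**2 + s2**2) * (m3**2 + s3**2) - mu**2
    m3c = 6.0 * mu * (m1**2 * s2**2 * s3**2 + s1**2 * m2**2 * s3**2
                      + s1**2 * s2**2 * m3**2 + 4.0 * s1**2 * s2**2 * s3**2)
    return mu, np.sqrt(var), m3c / var**1.5

def entropy_lognormal3(mu, sigma, skew):
    """Differential entropy (b = e) of a three-parameter log-normal with the
    given moments; the location shift does not change the entropy."""
    if skew < 1e-9:                       # degenerates to a Gaussian
        return 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)
    # invert skew = (w + 2) * sqrt(w - 1), where w = exp(s^2)
    w = brentq(lambda w: (w + 2.0) * np.sqrt(w - 1.0) - skew, 1.0 + 1e-12, 50.0)
    s2 = np.log(w)                        # s^2 of the underlying normal
    m = 0.5 * np.log(sigma**2 / ((w - 1.0) * w))
    return m + 0.5 * np.log(2.0 * np.pi * np.e * s2)

def first_order_index(name, n_lhs=1000, seed=0):
    H = entropy_lognormal3(*product_moments(list(INPUTS.values())))
    u = qmc.LatinHypercube(d=1, seed=seed).random(n_lhs).ravel()
    mean_i, std_i = INPUTS[name]
    H_cond = []
    for v in norm.ppf(u, loc=mean_i, scale=std_i):   # LHS sample of X_i
        factors = [(v, 0.0) if k == name else INPUTS[k] for k in INPUTS]
        H_cond.append(entropy_lognormal3(*product_moments(factors)))
    return (H - np.mean(H_cond)) / H                  # Equation (3)

print({k: round(first_order_index(k), 3) for k in INPUTS})
```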
Sensitivity indices based on differential entropy H(R) are computed for b = e, but the same values are also obtained for b = 2 or b = 10—see Figure 6. The sensitivity index of the last (third) order was computed using the formal assumption that H(R|X1, X2, X3) = 0.
Sensitivity indices based on the functional H̃(R) are computed for b = e and t = 4, but practically the same values were obtained for t = 1, 2, 6—see Figure 7. The sensitivity index of the last (third) order was computed using H̃(R|X1, X2, X3) = 0.
The sensitivity indices based on the functional Ĥ(R) are computed for b = e, but practically the same values were obtained for b = 2 and 10—see Figure 8. The sensitivity index of the last (third) order was computed using Ĥ(R|X1, X2, X3) = 0.

7. Discussion

The case study presented the results of several types of SA using the input random variables listed in Table 1, which are typical of structural mechanics. For SA based on the differential entropy, the attained values of the pdf are especially important—see Figure 12. The combination of fixed and random input variables changes the variance of the output variable R, with the peak of the pdf ranging from approximately 0.01 to infinity—see Figure 12. Fixing all three inputs leads to zero variance of R and a theoretically infinite pdf value—see the red line in Figure 12. The pdf of R with all inputs random (full variance) is depicted in pink—see Figure 5. Figure 12 shows the pdfs obtained with the inputs fixed at their mean values.
Zero entropy replaces minus infinity when estimating the sensitivity index of the last (third) order—see the red line in Figure 12. The sum of all sensitivity indices is equal to one only if E(H(R|X1, X2, X3)) = 0—see Equation (6). This means that the entropy entering the sensitivity index of the last order (deterministic variables) must be calculated according to Equation (1).
In terms of the concept of sensitivity analysis, differential and discrete entropy are two related concepts, where the decrease of variance to zero (occurring gradually by fixing the input variables in all combinations) means a transition from differential to discrete entropy. The study suggests that global sensitivity analysis can help elucidate the nature of the transition between differential and discrete entropy.
The results depicted in Figure 6 and Figure 7 are practically the same because the estimates of the sensitivity indices are, in both cases, obtained using f(r) < 0.08 (relatively small values), i.e., in the region where H̃(R) is very precisely equal to H(R). Although the values of f(r) are relatively small, using the differential entropy without further modification does not provide a sufficiently general solution. The use of H̃(R) covers the more general possibility of attaining f(r) > 1, even during the computation of sensitivity indices of lower orders, not just the last one.
All of the utilized SA types identically identified the sensitivity of the output R to the inputs in the following descending order: fy, t2, b. This sensitivity order was determined using total indices, except for Borgonovo SA, where total indices do not exist. The large value of the sensitivity index of the last order causes the differences between the total indices of certain SA types to be very small—see Figure 6 and Figure 7. In contrast, Sobol SA (Figure 11) and SA based on Ĥ(R) (Figure 8), which clearly identify a strong influence of fy, provide a clear identification of the influential and non-influential inputs.
Although the sensitivity order is the same, the sizes of sensitivity indices of the same order based on Ĥ(R) differ from the sizes of indices based on H(R) or H̃(R). The results of SA based on Ĥ(R) have a smaller value of the index of the last (third) order, which does not provide any new useful information for determining the sensitivity order of input variables. Gamboa SA also has a large share of the sensitivity index of the last order. Borgonovo SA has a last-order sensitivity index implicitly equal to one. With a bit of exaggeration, the sensitivity index of the last order can be described as a “ballast” index, which does not provide useful information for determining the sensitivity order, either directly or through total indices.
The largest sum of first-order sensitivity indices (sum of all Si is 0.998) and very small higher-order sensitivity indices is given by Sobol SA—see Figure 11. This has also been observed for other tasks [31,47,71]. If the sum of all Si is equal to one, then the sensitivity order can be determined using only the Si indices, which are the same as the total indices, and higher order sensitivity indices do not have to be calculated. Easy interpretation of SA results, often carried out only with Si, is one of the features that makes Sobol SA popular.
A relatively large sum of all first-order sensitivity indices (0.34) was also obtained using SA based on Ĥ(R)—see Figure 8. The Gamboa sensitivity indices have a sum of all first-order sensitivity indices equal to 0.3; however, the last-order sensitivity index has a relatively high value of 0.65—see Figure 9.
New distribution-oriented sensitivity indices, which are an alternative to other types of distributional SA such as Cramér–von Mises SA [68] or Borgonovo moment-independent SA [70], have been proposed using the functional Ĥ(R). The case study showed that the sensitivity indices based on the functional Ĥ(R) have a good structure, which provides clear information about the sensitivity order of the input variables—see Figure 8. The properties of these indices are mainly influenced by the beginnings of the curves integrated in Equation (10), shown on the left side of Figure 4. Virtually the same indices were obtained when the curves integrated in Equation (10) were replaced by a semicircle for z ∈ [0, 1] and zero for z > 1—see the blue curve on the left side of Figure 4.
In addition to the semicircle, it is also possible to experiment with other dome-shaped curves, which can suitably replace the function integrated in Equation (10). Further qualitative research could study the influence of dome shapes on the structure of sensitivity indices to find suitable curves for specific types of tasks. In general, it is possible to aim to maximize the share of first- and low-order sensitivity indices similar to Sobol’s indices. The results can be used in the application of acquired knowledge in understanding similar cases. Based on the findings from the case studies, new functionals with properties usable in SA can be sought.

8. Conclusions

The presented article compared several types of sensitivity analyses and presented new distribution-oriented sensitivity indices, which were formulated on the basis of differential entropy research. The comparative studies have shown the rationality of the new sensitivity measures and their advantages and disadvantages in contrast to other types of SA.
The search for functionals suitable for distributional sensitivity analysis (SA) is not closed and other suitable sensitivity measures can be found. Sensitivity measures reflecting clear contrasts between sensitivity indices with the clear identification of influential and non-influential input variables ought to be sought. Preference may be given to a large proportion of sensitivity indices of the first and lower orders, similar to Sobol SA.
Entropy is an alternative measure to variance, but other similar measures are possible. From the point of view of the SA concept, a decrease in variance to zero means a transition from differential to discrete entropy; differential entropy alone is not enough. The basic ways of making this transition were formulated in this article. Further research may proceed with the analysis of the links between differential and discrete entropy in specific applications of sensitivity analysis.

Funding

The work has been supported and prepared within the project “probability-oriented global sensitivity measures of structural reliability” of The Czech Science Foundation (GACR, https://gacr.cz/) no. 20-01734S, Czechia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Sobol, I.M. Sensitivity estimates for non-linear mathematical models. Math. Model. Comput. Exp. 1993, 1, 407–414. [Google Scholar]
  2. Sobol, I.M. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates. Math. Comput. Simul. 2001, 55, 271–280. [Google Scholar] [CrossRef]
  3. Amigó, J.M.; Balogh, S.G.; Hernández, S. A brief review of generalized entropies. Entropy 2018, 20, 813. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Castaings, W.; Borgonovo, E.; Morris, M.D.; Tarantola, S. Sampling strategies in density-based sensitivity analysis. Environ. Model Softw. 2012, 38, 13–26. [Google Scholar] [CrossRef]
  5. Pianosi, F.; Wagener, T. A simple and efficient method for global sensitivity analysis based on cumulative distribution functions. Environ. Modell. Softw. 2015, 67, 1–11. [Google Scholar] [CrossRef] [Green Version]
  6. Borgonovo, E.; Plischke, E. Sensitivity analysis: A review of recent advances. Eur. J. Oper. Res. 2016, 248, 869–887. [Google Scholar] [CrossRef]
  7. Borgonovo, E.; Plischke, E.; Rakovec, O.; Hill, M.C. Making the most out of a hydrological model data set: Sensitivity analyses to open the model black-box. Water Resour. Res. 2017, 53, 7933–7950. [Google Scholar] [CrossRef]
  8. Pianosi, F.; Wagener, T. Distribution-based sensitivity analysis from a generic input-output sample. Environ. Model Softw. 2018, 108, 197–207. [Google Scholar] [CrossRef] [Green Version]
  9. Baroni, G.; Francke, T. An effective strategy for combining variance- and distribution-based global sensitivity analysis. Environ. Modell. Softw. 2020, 134, 104851. [Google Scholar] [CrossRef]
  10. Krykacz-Hausmann, B. Epistemic sensitivity analysis based on the concept of entropy. In Proceedings of the International Symposium on Sensitivity Analysis of Model Output, Madrid, Spain, 18–20 June 2001; pp. 31–35. [Google Scholar]
  11. Shannon, C.E. A Mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  12. Liu, H.; Sudjianto, A.; Chen, W. Relative entropy based method for probabilistic sensitivity analysis in engineering design. J. Mech. Des. 2006, 128, 326–336. [Google Scholar] [CrossRef]
  13. Zhong, R.X.; Fu, K.Y.; Sumalee, A.; Ngoduy, D.; Lam, W.H.K. A cross-entropy method and probabilistic sensitivity analysis framework for calibrating microscopic traffic models. Transp. Res. Part C Emerg. Technol. 2016, 63, 147–169. [Google Scholar] [CrossRef]
  14. Tang, Z.C.; Lu, Z.Z.; Pan, W.; Zhang, F. An entropy-based global sensitivity analysis for the structures with both fuzzy variables and random variables. Proc. Inst. Mech. Eng. C J. Mech. Eng. Sci. 2013, 227, 195–212. [Google Scholar]
  15. Shi, Y.; Lu, Z.; Zhou, Y. Global sensitivity analysis for fuzzy inputs based on the decomposition of fuzzy output entropy. Eng. Optim. 2018, 50, 1078–1096. [Google Scholar] [CrossRef]
  16. Yazdani, A.; Nicknam, A.; Dadras, E.Y.; Eftekhari, S.N. Entropy-based sensitivity analysis of global seismic demand of concrete structures. Eng. Struct. 2017, 146, 118–126. [Google Scholar] [CrossRef]
  17. Zeng, X.; Wang, D.; Wu, J. Sensitivity analysis of the probability distribution of groundwater level series based on information entropy. Stoch. Environ. Res. Risk Assess. 2012, 26, 345–356. [Google Scholar] [CrossRef]
  18. Zhu, G.R.; Wang, X.H.; Huang, H.B.; Chen, H. Sensitivity analysis for shell-and-tube heat exchangers based on entropy production. Adv. Mat. Res. 2012, 516–517, 419–424. [Google Scholar] [CrossRef]
  19. Tanyimboh, T.T.; Setiadi, Y. Sensitivity analysis of entropy-constrained designs of water distribution systems. Eng. Optim. 2008, 40, 439–457. [Google Scholar] [CrossRef]
  20. Lashkar-Ara, B.; Kalantari, N.; Sheikh Khozani, Z.; Mosavi, A. Assessing machine learning versus a mathematical model to estimate the transverse shear stress distribution in a rectangular channel. Mathematics 2021, 9, 596. [Google Scholar] [CrossRef]
  21. Zhou, C.; Cui, G.; Liang, W.; Liu, Z.; Zhang, L. A coupled macroscopic and mesoscopic creep model of soft marine soil using a directional probability entropy approach. J. Mar. Sci. Eng. 2021, 9, 224. [Google Scholar] [CrossRef]
  22. Pan, P.; Zhang, M.; Peng, W.; Chen, H.; Xu, G.; Liu, T. Thermodynamic evaluation and sensitivity analysis of a novel compressed air energy storage system incorporated with a coal-fired power plant. Entropy 2020, 22, 1316. [Google Scholar] [CrossRef] [PubMed]
  23. Lescauskiene, I.; Bausys, R.; Zavadskas, E.K.; Juodagalviene, B. VASMA weighting: Survey-based criteria weighting methodology that combines ENTROPY and WASPAS-SVNS to reflect the psychometric features of the VAS scales. Symmetry 2020, 12, 1641. [Google Scholar] [CrossRef]
  24. Hashemi, H.; Mousavi, S.M.; Zavadskas, E.K.; Chalekaee, A.; Turskis, Z. A New group decision model based on grey-intuitionistic fuzzy-ELECTRE and VIKOR for contractor assessment problem. Sustainability 2018, 10, 1635. [Google Scholar] [CrossRef] [Green Version]
  25. Cavallaro, F.; Zavadskas, E.K.; Raslanas, S. Evaluation of combined heat and power (CHP) systems using fuzzy shannon entropy and fuzzy TOPSIS. Sustainability 2016, 8, 556. [Google Scholar] [CrossRef] [Green Version]
  26. Ghorabaee, M.K.; Zavadskas, E.K.; Amiri, M.; Esmaeili, A. Multi-criteria evaluation of green suppliers using an extended WASPAS method with interval type-2 fuzzy sets. J. Clean. Prod. 2016, 137, 213–229. [Google Scholar] [CrossRef]
  27. Liu, D.; Luo, Y.; Liu, Z. The linguistic picture fuzzy set and its application in multi-criteria decision-making: An illustration to the TOPSIS and TODIM methods based on entropy weight. Symmetry 2020, 12, 1170. [Google Scholar] [CrossRef]
  28. Maume-Deschamps, V.; Niang, I. Estimation of quantile oriented sensitivity indices. Stat. Probab. Lett. 2018, 134, 122–127. [Google Scholar] [CrossRef] [Green Version]
  29. Kucherenko, S.; Song, S.; Wang, L. Quantile based global sensitivity measures. Reliab. Eng. Syst. Saf. 2019, 185, 35–48. [Google Scholar] [CrossRef] [Green Version]
  30. Kala, Z. Quantile-oriented global sensitivity analysis of design resistance. J. Civ. Eng. Manag. 2019, 25, 297–305. [Google Scholar] [CrossRef] [Green Version]
  31. Kala, Z. Quantile-based versus Sobol sensitivity analysis in limit state design. Structures 2020, 28, 2424–2430. [Google Scholar] [CrossRef]
  32. Kala, Z. From probabilistic to quantile-oriented sensitivity analysis: New indices of design quantiles. Symmetry 2020, 12, 1720. [Google Scholar] [CrossRef]
  33. Kala, Z. Global sensitivity analysis of quantiles: New importance measure based on superquantiles and subquantiles. Symmetry 2021, 13, 263. [Google Scholar] [CrossRef]
  34. Wei, P.; Lu, Z.; Hao, W.; Feng, J.; Wang, B. Efficient sampling methods for global reliability sensitivity analysis. Comput. Phys. Commun. 2012, 183, 1728–1743. [Google Scholar] [CrossRef]
  35. Zhao, J.; Zeng, S.; Guo, J.; Du, S. Global reliability sensitivity analysis based on maximum entropy and 2-Layer polynomial chaos expansion. Entropy 2018, 20, 202. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Zhang, X.; Liu, J.; Yan, Y.; Pandey, M. An effective approach for reliability-based sensitivity analysis with the principle of Maximum entropy and fractional moments. Entropy 2019, 21, 649. [Google Scholar] [CrossRef] [Green Version]
  37. Kala, Z. Global sensitivity analysis of reliability of structural bridge system. Eng. Struct. 2019, 194, 36–45. [Google Scholar] [CrossRef]
  38. Kala, Z. Estimating probability of fatigue failure of steel structures. Acta Comment. Univ. Tartu. Math. 2019, 23, 245–254. [Google Scholar] [CrossRef]
  39. Kala, Z. Sensitivity analysis in probabilistic structural design: A comparison of selected techniques. Sustainability 2020, 12, 4788. [Google Scholar] [CrossRef]
  40. Lei, J.; Lu, Z.; He, L. The single-loop Kriging model combined with Bayes’ formula for time-dependent failure probability based global sensitivity. Structures 2021, 32, 987–996. [Google Scholar] [CrossRef]
  41. Wang, P.; Li, H.; Huang, X.; Zhang, Z.; Xiao, S. Numerical decomposition for the reliability-oriented sensitivity with dependent variables using vine copulas. J. Mech. Des. 2021, 143, 081701. [Google Scholar] [CrossRef]
  42. Rani, P.; Mishra, A.R.; Mardani, A.; Cavallaro, F.; Štreimikienė, D.; Khan, S.A.R. Pythagorean Fuzzy SWARA–VIKOR Framework for Performance Evaluation of Solar Panel Selection. Sustainability 2020, 12, 4278. [Google Scholar] [CrossRef]
  43. Mitrović Simić, J.; Stević, Ž.; Zavadskas, E.K.; Bogdanović, V.; Subotić, M.; Mardani, A. A Novel CRITIC-Fuzzy FUCOM-DEA-Fuzzy MARCOS model for safety evaluation of road sections based on geometric parameters of road. Symmetry 2020, 12, 2006. [Google Scholar] [CrossRef]
  44. Rani, P.; Mishra, A.R.; Krishankumar, R.; Mardani, A.; Cavallaro, F.; Soundarapandian Ravichandran, K.; Balasubramanian, K. Hesitant fuzzy SWARA-complex proportional assessment approach for sustainable supplier selection (HF-SWARA-COPRAS). Symmetry 2020, 12, 1152. [Google Scholar] [CrossRef]
  45. Puška, A.; Nedeljković, M.; Hashemkhani Zolfani, S.; Pamučar, D. Application of interval fuzzy logic in selecting a sustainable supplier on the example of agricultural production. Symmetry 2021, 13, 774. [Google Scholar] [CrossRef]
  46. Wang, A.; Solomatine, D.P. Practical experience of sensitivity analysis: Comparing six methods, on three hydrological models, with three performance criteria. Water 2019, 11, 1062. [Google Scholar] [CrossRef] [Green Version]
  47. Štefaňák, J.; Kala, Z.; Miča, L.; Norkus, A. Global sensitivity analysis for transformation of Hoek-Brown failure criterion for rock mass. J. Civ. Eng. Manag. 2018, 24, 390–398. [Google Scholar] [CrossRef]
  48. Ching, D.S.; Safta, C.; Reichardt, T.A. Sensitivity-informed bayesian inference for home PLC network models with unknown parameters. Energies 2021, 14, 2402. [Google Scholar] [CrossRef]
  49. Rahn, S.; Gödel, M.; Fischer, R.; Köster, G. Dynamics of a simulated demonstration march: An efficient sensitivity analysis. Sustainability 2021, 13, 3455. [Google Scholar] [CrossRef]
  50. Martínez-Ruiz, A.; Ruiz-García, A.; Prado-Hernández, J.V.; López-Cruz, I.L.; Valencia-Islas, J.O.; Pineda-Pineda, J. Global sensitivity analysis and calibration by differential evolution algorithm of HORTSYST crop model for fertigation management. Water 2021, 13, 610. [Google Scholar] [CrossRef]
  51. Xu, N.; Luo, J.; Zuo, J.; Hu, X.; Dong, J.; Wu, T.; Wu, S.; Liu, H. Accurate suitability evaluation of large-scale roof greening based on RS and GIS methods. Sustainability 2020, 12, 4375. [Google Scholar] [CrossRef]
  52. Islam, A.B.M.; Karadoğan, E. Analysis of one-dimensional ivshin–pence shape memory alloy constitutive model for sensitivity and uncertainty. Materials 2020, 13, 1482. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  53. Gamannossi, A.; Amerini, A.; Mazzei, L.; Bacci, T.; Poggiali, M.; Andreini, A. Uncertainty quantification of film cooling performance of an industrial gas turbine vane. Entropy 2020, 22, 16. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  54. De Falco, A.; Resta, C.; Sevieri, G. Sensitivity analysis of frequency-based tie-rod axial load evaluation methods. Eng. Struct. 2021, 229, 111568. [Google Scholar] [CrossRef]
  55. Antucheviciene, J.; Kala, Z.; Marzouk, M.; Vaidogas, E.R. Solving civil engineering problems by means of fuzzy and stochastic MCDM methods: Current state and future research. Math. Probl. Eng. 2015, 2015, 362579. [Google Scholar] [CrossRef] [Green Version]
  56. Kala, Z.; Valeš, J. Sensitivity assessment and lateral-torsional buckling design of I-beams using solid finite elements. J. Constr. Steel Res. 2017, 139, 110–122. [Google Scholar] [CrossRef]
  57. Wen, Z.; Xia, Y.; Ji, Y.; Liu, Y.; Xiong, Z.; Lu, H. Study on risk control of water inrush in tunnel construction period considering uncertainty. J. Civ. Eng. Manag. 2019, 25, 757–772. [Google Scholar] [CrossRef] [Green Version]
  58. Strieška, M.; Koteš, P. Sensitivity of dose-response function for carbon steel under various conditions in Slovakia. Transp. Res. Procedia 2019, 40, 912–919. [Google Scholar] [CrossRef]
  59. Su, L.; Wang, T.; Li, H.; Chao, Y.; Wang, L. Multi-criteria decision making for identification of unbalanced bidding. J. Civ. Eng. Manag. 2020, 26, 43–52. [Google Scholar] [CrossRef]
  60. Rykov, V.; Kozyrev, D. On the reliability function of a double redundant system with general repair time distribution. Appl. Stoch. Models Bus Ind. 2019, 35, 191–197. [Google Scholar] [CrossRef]
  61. Luo, L.; Zhang, L.; Wu, G. Bayesian belief network-based project complexity measurement considering causal relationships. J. Civ. Eng. Manag. 2020, 26, 200–215. [Google Scholar] [CrossRef]
  62. Strauss, A.; Moser, T.; Honeger, C.; Spyridis, P.; Frangopol, D.M. Likelihood of impact events in transport networks considering road conditions, traffic and routing elements properties. J. Civ. Eng. Manag. 2020, 26, 95–112. [Google Scholar] [CrossRef] [Green Version]
  63. Rykov, V.V.; Sukharev, M.G.; Itkin, V.Y. Investigations of the potential application of k-out-of-n systems in oil and gas industry objects. J. Mar. Sci. Eng. 2020, 8, 928. [Google Scholar] [CrossRef]
  64. Pan, L.; Novák, L.; Lehký, D.; Novák, D.; Cao, M. Neural network ensemble-based sensitivity analysis in structural engineering: Comparison of selected methods and the influence of statistical correlation. Comput. Struct. 2021, 242, 106376. [Google Scholar] [CrossRef]
  65. Schroeder, M.J. An Alternative to entropy in the measurement of information. Entropy 2004, 6, 388–412. [Google Scholar] [CrossRef] [Green Version]
  66. Kullback, S.; Leibler, R. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86. [Google Scholar] [CrossRef]
  67. Kullback, S. Information Theory and Statistics; John Wiley and Sons: Hoboken, NJ, USA, 1959. [Google Scholar]
  68. Gamboa, F.; Klein, T.; Lagnoux, A. Sensitivity analysis based on Cramér-von Mises distance. SIAM/ASA J. Uncertain. Quantif. 2018, 6, 522–548. [Google Scholar] [CrossRef] [Green Version]
  69. Kala, Z. Limit states of structures and global sensitivity analysis based on Cramér-von Mises distance. Int. J. Mech. 2020, 14, 107–118. [Google Scholar]
  70. Borgonovo, E. A new uncertainty importance measure. Reliab. Eng. Syst. Saf. 2007, 92, 771–784. [Google Scholar] [CrossRef]
  71. Kala, Z. Sensitivity assessment of steel members under compression. Eng. Struct. 2009, 31, 1344–1348. [Google Scholar] [CrossRef]
  72. Kala, Z. Global sensitivity analysis in stability problems of steel frame structures. J. Civ. Eng. Manag. 2016, 22, 417–424. [Google Scholar] [CrossRef]
  73. Kala, Z.; Valeš, J. Imperfection sensitivity analysis of steel columns at ultimate limit state. Arch. Civ. Mech. Eng. 2018, 18, 1207–1218. [Google Scholar] [CrossRef]
  74. Saltelli, A.; Ratto, M.; Andres, T.; Campolongo, F.; Cariboni, J.; Gatelli, D.; Saisana, M.; Tarantola, S. Global Sensitivity Analysis: The Primer; John Wiley & Sons: West Sussex, UK, 2008. [Google Scholar]
  75. Melcher, J.; Kala, Z.; Holický, M.; Fajkus, M.; Rozlívka, L. Design characteristics of structural steels based on statistical analysis of metallurgical products. J. Constr. Steel Res. 2004, 60, 795–808. [Google Scholar] [CrossRef]
  76. Kala, Z.; Melcher, J.; Puklický, L. Material and geometrical characteristics of structural steels based on statistical analysis of metallurgical products. J. Civ. Eng. Manag. 2009, 15, 299–307. [Google Scholar] [CrossRef]
  77. McKay, M.D.; Beckman, R.J.; Conover, W.J. A comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics 1979, 21, 239–245. [Google Scholar]
  78. Iman, R.L.; Conover, W.J. Small sample sensitivity analysis techniques for computer models with an application to risk assessment. Commun. Stat. Theory Methods 1980, 9, 1749–1842. [Google Scholar] [CrossRef]
Figure 1. Differential entropy H(R) vs. σR from Equation (2) using Gauss pdf of f(r).
Figure 2. Examples of g(z) vs. z and of z·ln(g(z)) vs. z using the natural logarithm.
Figure 3. Approximation of H(R) by H̃(R) using the natural logarithm and a Gauss pdf of f(r).
Figure 4. Approximation of H(R) by Ĥ(R) using three types of logarithm and a Gauss pdf of f(r).
Figure 5. The probability density function of resistance R based on Table 1.
Figure 6. Sensitivity indices based on differential entropy H(R) with b = e.
Figure 7. Sensitivity indices based on the functional H̃(R) with b = e; t = 4.
Figure 8. Sensitivity indices based on the functional Ĥ(R) with b = e.
Figure 9. Gamboa sensitivity indices.
Figure 10. Borgonovo sensitivity indices.
Figure 11. Sobol sensitivity indices.
Figure 12. Probability density functions of resistance R with all combinations of input fixations.
Table 1. Input random variables.

Characteristic    Index   Symbol   Mean Value μ   Standard Deviation σ
Yield strength    1       fy       412.68 MPa     27.941 MPa
Thickness         2       t2       12 mm          0.55 mm
Width             3       b        100 mm         1 mm

1 All inputs have Gauss pdf and are uncorrelated.

