Article

On the Uncertainty Properties of the Conditional Distribution of the Past Life Time

Department of Statistics and Operations Research, College of Science, King Saud University, P.O. Box 2455, Riyadh 11451, Saudi Arabia
* Author to whom correspondence should be addressed.
Entropy 2023, 25(6), 895; https://doi.org/10.3390/e25060895
Submission received: 5 May 2023 / Revised: 27 May 2023 / Accepted: 31 May 2023 / Published: 2 June 2023
(This article belongs to the Special Issue Measures of Information III)

Abstract

For a given system observed at time t, the past entropy serves as a measure of uncertainty about its past life-time. We consider a coherent system in which all n components have failed at time t. To assess the predictability of the life-time of such a system, we use the signature vector to determine the entropy of its past life-time. We explore various analytical results, including expressions, bounds, and ordering properties, for this measure. Our results provide valuable insight into the predictability of the coherent system’s life-time, which may be useful in a number of practical applications.

1. Introduction

The process of quantifying and managing uncertainty about the random life-time of a system is a major task for engineers. As uncertainty increases, the reliability of a system decreases, so systems that have a longer life-time together with a lower level of uncertainty are preferable (see, e.g., Ebrahimi and Pellerey [1]). The concept of uncertainty has far-reaching applications, as highlighted in Shannon’s seminal work on information theory [2], which has provided valuable tools for evaluating and managing uncertainty in engineering systems. Let X be the life-time of a system or of a living organism, with an absolutely continuous cumulative distribution function (cdf) F(x) and probability density function (pdf) f(x). Shannon’s differential entropy is a well-known uncertainty measure and is given as follows:
H(X) = -\int_0^{\infty} f(x)\,\log f(x)\,dx,
where “log” stands for the natural logarithm. If X represents the life-time of a new system, then H(X) quantifies the uncertainty about that life-time. In certain scenarios, an operator may have only partial knowledge of the age of a system. For example, the operator may know that a system was in service at a specified time t and wish to quantify the uncertainty about the life of the system after age t, commonly referred to as the remaining life-time or residual life-time after t. According to Ebrahimi [3], the residual entropy of X is the entropy of X_t = [X - t | X > t]. Formally, for all t > 0, the residual life-time entropy of X is measured as
H(X_t) = -\int_t^{\infty} \frac{f(x)}{1-F(t)}\,\log\frac{f(x)}{1-F(t)}\,dx.
If we already know that an object has survived to time t, then H(X_t) quantifies the uncertainty contained in the distribution of its remaining life-time. Di Crescenzo and Longobardi [4] proposed a notion of past entropy over the interval (0, t) by analogy with the residual entropy given in Equation (2). The introduction of the past entropy is motivated by the observation that, in realistic scenarios, uncertainty need not be limited to the future but can also refer to the past. The authors also pointed out the importance of the past entropy and its relation to the residual entropy. Thus, if X is a random life-time, recall that the pdf of [X | X ≤ t] is f_t(x) = f(x)/F(t), 0 < x < t; the differential entropy of [X | X ≤ t] is called the past entropy at time t of X, and it is denoted by
H̄(X_t) = -\int_0^t \frac{f(x)}{F(t)}\,\log\frac{f(x)}{F(t)}\,dx = 1 - \frac{1}{F(t)}\int_0^t f(x)\,\log \tau(x)\,dx,
where τ(x) = f(x)/F(x) is known as the reversed hazard rate function.
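As a quick numerical illustration of these two representations, the following minimal Python sketch (our own illustration, not code from the paper; the function names are arbitrary) evaluates the past entropy of a standard exponential life-time both directly from the definition and through the reversed-hazard-rate identity above:

```python
import numpy as np
from scipy import integrate

# Illustrative check for a standard exponential life-time:
# f(x) = exp(-x), F(x) = 1 - exp(-x), tau(x) = f(x)/F(x).
f = lambda x: np.exp(-x)
F = lambda x: 1.0 - np.exp(-x)
tau = lambda x: f(x) / F(x)  # reversed hazard rate

def past_entropy_direct(t):
    """Past entropy from the definition: -int_0^t (f/F(t)) log(f/F(t)) dx."""
    integrand = lambda x: -(f(x) / F(t)) * np.log(f(x) / F(t))
    return integrate.quad(integrand, 0.0, t)[0]

def past_entropy_rhr(t):
    """Past entropy via the identity 1 - (1/F(t)) int_0^t f log(tau) dx."""
    integrand = lambda x: f(x) * np.log(tau(x))
    return 1.0 - integrate.quad(integrand, 0.0, t)[0] / F(t)

t = 2.0
print(past_entropy_direct(t), past_entropy_rhr(t))  # the two forms agree
```

Under these assumptions, both routines return the same value up to quadrature error, which is a convenient way to validate implementations of either formula.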
Various aspects and statistical perspectives of the past entropy have been treated in the literature, as can be seen in Di Crescenzo and Longobardi [4], Nair and Sunoj [5], Shangari and Chen [6], and Loperfido [7], as well as in the references of these papers. Gupta et al. [8] obtained results concerning the residual and past entropies of order statistics, presented several relevant stochastic ordering properties, and provided some characterization results; see also [9]. Recently, Toomaj et al. [10] applied the residual entropy to coherent systems and obtained several related properties. Kayid and Alshehri [11] have recently studied the uncertainty in coherent structures using the Tsallis entropy, and Mesfioui et al. [12] studied the uncertainty in the life-time of a coherent system using the Rényi entropy. In this paper, we consider a coherent structure in which all of the components have failed at time t; the system signature approach is utilized to compute the differential entropy of the past life-time.
The contents of this paper are organized as follows. In Section 2, we present a formula for the Shannon differential entropy of a coherent system when all components are inactive at time t; the method of system signatures is applicable when the random life-times of the components are independent and identically distributed (i.i.d.). In Section 3, some useful bounds are derived. In Section 4, the Jensen–Shannon divergence of the coherent framework is considered. Some concluding remarks are given in Section 5.

2. The Past Life-Time Uncertainty in Coherent Systems

Here, in order to define the past-life entropy of coherent structures, we apply the signature vector of the underlying structure. We assume that all of the components in the system have become inactive at time t. A coherent system is a system that has no irrelevant components and whose structure function is monotone. The vector p = (p_1, …, p_n), whose i-th element is p_i = P(T = X_{i:n}), i = 1, 2, …, n, is known as the signature vector (see [13]). We consider a coherent structure with components that have i.i.d. random life-times X_1, …, X_n and a specified signature p = (p_1, …, p_n). If T_t = [t - T | X_{n:n} ≤ t] stands for the past life-time of the system, given that at time t all components have become inactive, then from the results of Khaledi and Shaked [14], the survival function of T_t can be obtained as
P(T_t > x) = \sum_{i=1}^{n} p_i\, P(t - X_{i:n} > x \mid X_{n:n} \le t),
where
P(t - X_{i:n} > x \mid X_{n:n} \le t) = \sum_{k=i}^{n} \binom{n}{k} \left[\frac{F(t-x)}{F(t)}\right]^{k} \left[1 - \frac{F(t-x)}{F(t)}\right]^{n-k}, \quad 0 < x < t,
denotes the survival function of the past life-time of an i-out-of-n system, given that all of the components have failed by time t. It follows from (4) that
f_{T_t}(x) = \sum_{i=1}^{n} p_i\, f_{T_t^i}(x),
where
f_{T_t^i}(x) = \frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)} \left[\frac{F(t-x)}{F(t)}\right]^{i-1} \left[1 - \frac{F(t-x)}{F(t)}\right]^{n-i} \frac{f(t-x)}{F(t)}, \quad 0 < x < t,
where Γ(·) is the complete gamma function and T_t^i = [t - X_{i:n} | X_{n:n} ≤ t], i = 1, 2, …, n, is the time elapsed since the failure of the component with life-time X_{i:n}, given that the system has failed at or before time t. Note that t - T_t^i is distributed as the i-th order statistic of n i.i.d. random variables with cdf F_t(x) = F(x)/F(t), 0 < x < t. Now, we give a statement about the entropy of T_t. To this aim, the probability integral transformation V = F_t(t - T_t) plays a vital role in our study, and it is straightforward to see that U_{i:n} = F_t(t - T_t^i) follows the beta distribution with parameters i and n - i + 1, with pdf
g_i(u) = \frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)}\, u^{i-1} (1-u)^{n-i}, \quad 0 < u < 1,
for all i = 1 , , n . In the forthcoming theorem, we shall give a formula for the past entropy of T t using (6).
Theorem 1. 
The past entropy of T_t can be expressed as follows:
H̄(T_t) = H(V) - E[\log f_t(t - T_t)]
= H(V) - \sum_{i=1}^{n} p_i\, E[\log f_t(F_t^{-1}(U_{i:n}))],
where V has the pdf g_V(v) = \sum_{i=1}^{n} p_i g_i(v), and F_t^{-1}(u) = \inf\{x : F_t(x) \ge u\} is the quantile function of F_t(x) = F(x)/F(t), 0 < x ≤ t.
Proof. 
By (1) and (6), and by substituting z = t - x, we have
H̄(T_t) = -\int_0^t f_{T_t}(x)\,\log f_{T_t}(x)\,dx
= -\int_0^t \sum_{i=1}^{n} p_i \frac{\Gamma(n+1)}{\Gamma(i)\Gamma(n-i+1)} \left[\frac{F(t-x)}{F(t)}\right]^{i-1}\left[1-\frac{F(t-x)}{F(t)}\right]^{n-i}\frac{f(t-x)}{F(t)}\, \log\!\left(\sum_{i=1}^{n} p_i \frac{\Gamma(n+1)}{\Gamma(i)\Gamma(n-i+1)} \left[\frac{F(t-x)}{F(t)}\right]^{i-1}\left[1-\frac{F(t-x)}{F(t)}\right]^{n-i}\frac{f(t-x)}{F(t)}\right) dx
= -\int_0^t \sum_{i=1}^{n} p_i \frac{\Gamma(n+1)}{\Gamma(i)\Gamma(n-i+1)}\, F_t(z)^{i-1}\big(1-F_t(z)\big)^{n-i} f_t(z)\, \log\!\left(\sum_{i=1}^{n} p_i \frac{\Gamma(n+1)}{\Gamma(i)\Gamma(n-i+1)}\, F_t(z)^{i-1}\big(1-F_t(z)\big)^{n-i} f_t(z)\right) dz
= H(V) - \sum_{i=1}^{n} p_i \int_0^1 g_i(u)\,\log f_t(F_t^{-1}(u))\,du.
The last equality is obtained by the change of variable u = F_t(z), and the proof is then completed. □
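As a numerical sanity check of Theorem 1, the sketch below (our own illustration; the helper names are hypothetical) compares the direct entropy of the mixture density of T_t with the decomposition above, for uniform components, where f_t(F_t^{-1}(u)) = 1/t is constant:

```python
import numpy as np
from scipy import integrate
from scipy.stats import beta

# Toy case: n = 3 uniform components on (0,1), signature p = (0, 2/3, 1/3),
# inspection time t = 0.8.
n, p, t = 3, [0.0, 2/3, 1/3], 0.8
F = lambda x: x          # uniform cdf on (0,1)
Ft = lambda x: F(x) / F(t)
ft = lambda x: 1.0 / F(t)
betas = [beta(i, n - i + 1) for i in range(1, n + 1)]

# Direct route: f_{T_t}(x) = sum_i p_i g_i(F_t(t-x)) f_t(t-x) on (0, t).
mix = lambda x: sum(w * b.pdf(Ft(t - x)) * ft(t - x)
                    for w, b in zip(p, betas))
direct = -integrate.quad(lambda x: mix(x) * np.log(mix(x)), 0.0, t)[0]

# Decomposition route: the second term is just -log(1/t) here,
# and H(V) is the entropy of the beta mixture.
gV = lambda u: sum(w * b.pdf(u) for w, b in zip(p, betas))
HV = -integrate.quad(lambda u: gV(u) * np.log(gV(u)), 0.0, 1.0)[0]
decomposed = HV + np.log(t)

print(direct, decomposed)  # both should equal H(V) + log(t)
```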
It is important to keep in mind that Equation (8) expresses the entropy of T_t as the sum of two terms: the first term, H(V), does not depend on the distribution of the component life-times, while the second term does. If T_t = [t - T | X_{n:n} ≤ t] stands for the past life-time of the coherent system, under the condition that at time t all components of the system have failed, then H̄(T_t) measures the expected amount of uncertainty induced by the conditional density of t - T, given X_{n:n} ≤ t, on the predictability of the past life-time of the system. In particular, for an i-out-of-n system, whose signature is p = (0, …, 0, 1, 0, …, 0) with the 1 in the i-th position, i = 1, 2, …, n, Equation (9) reduces to
H̄(T_t^i) = H(U_{i:n}) - E[\log f_t(F_t^{-1}(U_{i:n}))],
for all t > 0. The next theorem is a direct consequence of Theorem 1, using the property that the reversed hazard rate of X is decreasing. Recall that the random life-time X belongs to the class of decreasing reversed hazard rate (DRHR) distributions if τ(x) is a decreasing function of x > 0.
Theorem 2. 
If X is DRHR, then H̄(T_t) is increasing in t.
Proof. 
Through the identity f_t(F_t^{-1}(x)) = x\,τ_t(F_t^{-1}(x)), where τ_t(x) = f_t(x)/F_t(x) denotes the reversed hazard rate of F_t, Equation (9) can be rewritten as
H̄(T_t) = H(V) - \sum_{i=1}^{n} p_i\,[\psi(i) - \psi(n+1)] - \sum_{i=1}^{n} p_i\, E[\log \tau_t(F_t^{-1}(U_{i:n}))],
where ψ(·) is the digamma function, upon using E[log U_{i:n}] = ψ(i) - ψ(n+1).
It is plain to verify that F_t^{-1}(u) = F^{-1}(u F(t)) for all 0 < u < 1, and hence,
\tau_t(F_t^{-1}(u)) = \tau(F^{-1}(u F(t))), \quad 0 < u < 1.
If t_1 ≤ t_2, then F^{-1}(u F(t_1)) ≤ F^{-1}(u F(t_2)). Consequently, when F is DRHR, then
E[\log \tau(F^{-1}(U_{i:n} F(t_1)))] \ge E[\log \tau(F^{-1}(U_{i:n} F(t_2)))].
Using (12), the proof is then completed. □
The next example deals with a situation where Theorems 1 and 2 are applied.
Example 1. 
Consider a coherent system with the signature p = (1/4, 1/4, 1/2, 0). It follows that H(V) = -0.05757. Given the distributions of the components’ life-times, Relation (9) can be used to determine the precise value of H̄(T_t). Let us assume the following life-time distributions for this purpose.
(a)
Let X be uniformly distributed on [0, 1], so that f_t(x) = 1/t for 0 < x < t. It holds that
E[\log f_t(F_t^{-1}(U_{i:n}))] = -\log t,
for all i = 1 , 2 , 3 , 4 . From (8), we immediately obtain
H̄(T_t) = -0.05757 + \log t.
It is seen that the entropy of T_t is an increasing function of time t. We note that the uniform distribution has the DRHR property, and therefore H̄(T_t) is an increasing function of time t, as we expected based on Theorem 2.
(b)
Let us assume that X follows the cdf
F(x) = e^{-x^{-k}}, \quad x > 0,\ k > 0.
One can see that
E[\log f_t(F_t^{-1}(U_{i:n}))] = \log k + E[\log U_{i:n}] + \frac{k+1}{k}\, E\!\left[\log\!\left(t^{-k} - \log U_{i:n}\right)\right],
for all i = 1 , 2 , 3 , 4 . Upon recalling (9), we obtain
H̄(T_t) = 1.0257 - \log k - \frac{k+1}{k} \sum_{i=1}^{n} p_i\, E\!\left[\log\!\left(t^{-k} - \log U_{i:n}\right)\right],
for all t > 0. For several choices of k, the exact value of H̄(T_t) is plotted as a function of time t in Figure 1. It is evident that H̄(T_t) is an increasing function of time t for all k > 0, since X is DRHR, as follows from Theorem 2.
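For readers who wish to reproduce the curves of Figure 1 numerically, the following minimal Python sketch (our own code; the function name past_entropy and the quadrature scheme are ours, not the paper’s) evaluates the expression above for the signature p = (1/4, 1/4, 1/2, 0):

```python
import numpy as np
from scipy import integrate
from scipy.stats import beta

# Past entropy for F(x) = exp(-x**(-k)) with n = 4 components and
# signature p = (1/4, 1/4, 1/2, 0), as in Example 1(b).
n = 4
p = [0.25, 0.25, 0.5, 0.0]

def past_entropy(t, k):
    total = 0.0
    for i, w in enumerate(p, start=1):
        if w == 0.0:
            continue
        g = beta(i, n - i + 1).pdf
        # E[log(t^{-k} - log U_{i:n})] by quadrature against Beta(i, n-i+1)
        expect = integrate.quad(
            lambda u: g(u) * np.log(t ** (-k) - np.log(u)), 0.0, 1.0)[0]
        total += w * expect
    return 1.0257 - np.log(k) - (k + 1.0) / k * total

for t in (0.5, 1.0, 2.0):
    print(t, past_entropy(t, k=2.0))  # increasing in t, per Theorem 2
```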
The duality of a system is a useful concept in technical reliability, as it roughly halves the computational effort of determining the signatures of all coherent systems of a given size. Kochar et al. [15] proposed a duality relation between the signature of a system and that of its dual. If p = (p_1, …, p_n) denotes the signature of a coherent system with life-time T, then the signature of its dual system, with life-time T^D, is given by p^D = (p_n, …, p_1). In the following theorem, we apply the duality property to simplify the calculation of the past entropy of coherent systems. First, we need the following lemma, well known as the Müntz–Szász theorem, which can be found in [16].
Lemma 1. 
If ϕ(x) is a continuous function on [0, 1] such that \int_0^1 x^n \phi(x)\,dx = 0 for all n ≥ 0, then ϕ(x) = 0 for every x ∈ [0, 1].
Theorem 3. 
Let T_t be the past life-time of a coherent system with signature p consisting of n i.i.d. components. Then, H̄(T_t) = H̄(T_t^D) for all p and all n if and only if f_t(F_t^{-1}(u)) = f_t(F_t^{-1}(1-u)) for all 0 < u < 1 and all t.
Proof. 
It is worth noting that Theorem 2.2 of Toomaj and Doostparast [17] asserts the equality of entropies between V and V^D, i.e., H(V) = H(V^D). To prove sufficiency, let us assume that f_t(F_t^{-1}(u)) = f_t(F_t^{-1}(1-u)) for all 0 < u < 1, and note that g_i(1-u) = g_{n-i+1}(u) for all i = 1, …, n and 0 < u < 1. Consequently, utilizing (9), we obtain
\int_0^1 g_{V^D}(u)\,\log f_t(F_t^{-1}(u))\,du = \int_0^1 \sum_{i=1}^{n} p_{n-i+1}\, g_i(u)\,\log f_t(F_t^{-1}(u))\,du
= \int_0^1 \sum_{r=1}^{n} p_r\, g_{n-r+1}(u)\,\log f_t(F_t^{-1}(u))\,du
= \int_0^1 \sum_{r=1}^{n} p_r\, g_r(1-u)\,\log f_t(F_t^{-1}(1-u))\,du
= \int_0^1 \sum_{r=1}^{n} p_r\, g_r(u)\,\log f_t(F_t^{-1}(u))\,du
= \int_0^1 g_V(u)\,\log f_t(F_t^{-1}(u))\,du,
where the third equality uses g_{n-r+1}(u) = g_r(1-u) together with the assumption, and the fourth follows by the change of variable u → 1 - u,
and this completes the proof of sufficiency by recalling Equation (8). For necessity, assume that H̄(T_t) = H̄(T_t^D) holds for all p and all n. Let p = (1, 0, …, 0), so that p^D = (0, …, 0, 1). It then follows from (9) that the assumption H̄(T_t) = H̄(T_t^D) is equivalent to
\int_0^1 g_n(u)\,\log f_t(F_t^{-1}(u))\,du = \int_0^1 g_1(u)\,\log f_t(F_t^{-1}(u))\,du = \int_0^1 g_n(1-u)\,\log f_t(F_t^{-1}(u))\,du,
where the last equality is obtained by noting that g_1(u) = g_n(1-u), 0 < u < 1. Putting v = 1 - u in the right-hand side of the above equation leads to
\int_0^1 g_n(u)\,\log f_t(F_t^{-1}(u))\,du = \int_0^1 g_n(u)\,\log f_t(F_t^{-1}(1-u))\,du.
Thus, we obtain
0 = \int_0^1 g_n(u)\left[\log f_t(F_t^{-1}(u)) - \log f_t(F_t^{-1}(1-u))\right] du = n \int_0^1 u^{n-1}\, \log\frac{f_t(F_t^{-1}(u))}{f_t(F_t^{-1}(1-u))}\,du,
for every n ≥ 1, since g_n(u) = n u^{n-1}.
Hence, f_t(F_t^{-1}(1-u)) = f_t(F_t^{-1}(u)) for all 0 < u < 1 due to Lemma 1, and this concludes the proof. □
An immediate consequence of the above theorem is given for the i-out-of-n systems.
Corollary 1. 
Let T_t^i be the life-time of an i-out-of-n system consisting of n i.i.d. components. Then, H̄(T_t^i) = H̄(T_t^{n-i+1}) for all n and for i = 1, 2, …, n/2 if n is even and i = 1, 2, …, (n-1)/2 if n is odd, if and only if f_t(F_t^{-1}(u)) = f_t(F_t^{-1}(1-u)) for all 0 < u < 1 and all t.

3. Bounds for the Past Entropy

Hereafter, we provide several useful bounds for H̄(T_t) by using the concept of the system signature. For the first bound, we use the notion of Kullback–Leibler (KL) discrimination information. Recall that the KL discrimination information between two random variables X and Y with pdfs f and g, respectively, is given by
K(X : Y) = \int_0^{\infty} f(x)\,\log\frac{f(x)}{g(x)}\,dx = -H(X) + H(X, Y),
where H(X, Y) = E[-\log g(X)] is known as the inaccuracy between f and g.
Theorem 4. 
Let T_t denote the past life-time of a coherent system consisting of n i.i.d. component life-times X_1, …, X_n with common pdf f, in which all components of the system have failed at time t. Then, we have
H̄_L(T_t) \le H̄(T_t) \le H̄_U(T_t),
where H̄_L(T_t) = \sum_{i=1}^{n} p_i H̄(T_t^i) and H̄_U(T_t) = H̄_L(T_t) + \sum_{i=1}^{n} p_i K(U_{i:n} : U_{j^*:n}) for all t > 0, the index j^* being defined in the proof below.
Proof.  
For the lower bound, since the differential entropy is a concave function of the density, the mixture representation (5) yields
H̄(T_t) \ge H̄_L(T_t) = \sum_{i=1}^{n} p_i H̄(T_t^i).
Moreover, the upper bound can be obtained by noting that the Kullback–Leibler (KL) discrimination information is a non-negative measure. Thus, we have
K(T_t : T_t^j) = -H̄(T_t) + H̄(T_t, T_t^j) \ge 0.
So, one can obtain
H̄(T_t) \le H̄_U(T_t) = \min_{1 \le j \le n} H̄(T_t, T_t^j).
The upper bound (16) can be rewritten as
H̄(T_t, T_t^j) = \sum_{i=1}^{n} p_i H̄(T_t^i, T_t^j) = H̄_L(T_t) + \sum_{i=1}^{n} p_i K(U_{i:n} : U_{j:n}),
where
K(U_{i:n} : U_{j:n}) = \log\frac{\Gamma(j)\,\Gamma(n-j+1)}{\Gamma(i)\,\Gamma(n-i+1)} + (i - j)\left[\psi(i) - \psi(n-i+1)\right],
denotes the Kullback–Leibler divergence between beta distributions (see [18] for details). The second equality in (17) is obtained by noting that the KL divergence is invariant under one-to-one transformations. If we set j^* = \arg\min_{1 \le j \le n} \sum_{i=1}^{n} p_i K(U_{i:n} : U_{j:n}), then
H̄_U(T_t) = H̄_L(T_t) + \sum_{i=1}^{n} p_i K(U_{i:n} : U_{j^*:n}),
and the proof is then completed. □
We remark that by recalling Equation (11), the lower bound can be rewritten as
H̄_L(T_t) = \sum_{i=1}^{n} p_i H(U_{i:n}) - \sum_{i=1}^{n} p_i E[\log f_t(F_t^{-1}(U_{i:n}))].
It is worth pointing out that using Equation (11), expression (9) can be rewritten as
H̄(T_t) = H(V) - \sum_{i=1}^{n} p_i H(U_{i:n}) + \sum_{i=1}^{n} p_i H̄(T_t^i) = H(V) - H_L(V) + H̄_L(T_t),
where H_L(V) = \sum_{i=1}^{n} p_i H(U_{i:n}). It is worth noting that the difference between the past entropy of T_t and its lower bound, i.e., H̄(T_t) - H̄_L(T_t) = H(V) - H_L(V), is distribution-free and depends only on the system signature. For further information about the bounds and how to obtain the optimal index j^*, we refer the reader to [10,19].
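The beta–beta KL divergence above and the minimizing index j^* are easy to compute; the short Python sketch below (our own helper names, not the paper’s) does both, and is used for the example that follows:

```python
import numpy as np
from scipy.special import gammaln, digamma

def kl_beta(i, j, n):
    """K(U_{i:n} : U_{j:n}) for Beta(i, n-i+1) versus Beta(j, n-j+1)."""
    log_ratio = (gammaln(j) + gammaln(n - j + 1)
                 - gammaln(i) - gammaln(n - i + 1))
    return log_ratio + (i - j) * (digamma(i) - digamma(n - i + 1))

def optimal_j(p):
    """Index j* minimizing sum_i p_i K(U_{i:n} : U_{j:n})."""
    n = len(p)
    costs = [sum(w * kl_beta(i, j, n) for i, w in enumerate(p, start=1))
             for j in range(1, n + 1)]
    return int(np.argmin(costs)) + 1, costs

# For the signature of Example 2 below, the minimizer is j* = 2.
jstar, costs = optimal_j([1/3, 2/3, 0.0])
print(jstar, costs)
```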
Numerous authors have investigated the characteristics of coherent systems with various component distributions, including Murthy and Jiang [20], Jiang et al. [21], Castet and Saleh [22], and Qiu et al. [23], as well as the references therein. To compare the bounds derived in Theorems 4 and 5, we present an example of a coherent system with power-distributed components.
Example 2. 
Consider a coherent system having the signature p = (1/3, 2/3, 0). It is easy to see that H(V) = -0.0874. Moreover, we can obtain j^* = 2 (see, e.g., [24]). The exact value of H̄(T_t) can be computed using Relation (9) when the component life-time distributions are given. Let us denote the life-time of each component by X. We assume that X is a power distribution random variable, with the pdf given by
f(x) = k\, x^{k-1}, \quad 0 < x < 1,
for all k > 0 . It is plain to observe that
E[\log f_t(F_t^{-1}(U_{i:n}))] = \log\frac{k}{t} + \frac{k-1}{k}\left[\psi(i) - \psi(n+1)\right],
for all i = 1 , 2 , 3 . From (8), we immediately obtain
H̄(T_t) = -0.0874 + 1.1666\,\frac{k-1}{k} + \log\frac{t}{k}.
Alternatively, from Equation (19), the lower bound is given as:
H̄_L(T_t) = -0.2273 + 1.1666\,\frac{k-1}{k} + \log\frac{t}{k}.
The upper bound can be obtained by recalling Equation (18) as follows:
H̄_U(T_t) = 0.0416 + 1.1666\,\frac{k-1}{k} + \log\frac{t}{k}.
The entropy of T_t is a monotonically increasing function of time t. We note that the power distribution possesses the DRHR property; thus, as expected from Theorem 2, H̄(T_t) is an increasing function of time t for all k > 0. Figure 2 displays the exact value of H̄(T_t) together with the lower and upper bounds computed as described above for various values of k.
Another useful lower bound can be obtained in the next theorem.
Theorem 5. 
By assuming that the conditions in Theorem 4 hold, one obtains
H̄(T_t) \ge H̄_L(T_t) - H_L(V),
for all t > 0 .
Proof. 
Due to Lemma 4.1 of Toomaj et al. [24], it holds that H(V) ≥ 0. Upon recalling Equation (20), the proof is then completed. □
The following theorem compares the past entropies of two coherent systems that have distinct structures and the same component life-times.
Theorem 6. 
Let T_{1,t} = [t - T_1 | X_{n:n} ≤ t] and T_{2,t} = [t - T_2 | X_{n:n} ≤ t] represent the past life-times of two coherent systems with signatures p_1 and p_2, respectively, such that p_1 ≤_st p_2. Let the systems’ components be i.i.d. with common cdf F. Then, for all t > 0,
(i) 
if H(V_1) ≥ H(V_2) and f_t(F_t^{-1}(u)) is increasing in u for all t > 0, then H̄(T_{1,t}) ≥ H̄(T_{2,t});
(ii) 
if H(V_1) ≤ H(V_2) and f_t(F_t^{-1}(u)) is decreasing in u for all t > 0, then H̄(T_{1,t}) ≤ H̄(T_{2,t}).
Proof. 
(i) First, note that Equation (9) can be rewritten as
H̄(T_{i,t}) - H(V_i) = -\int_0^1 g_{V_i}(u)\,\log f_t(F_t^{-1}(u))\,du, \quad i = 1, 2.
The assumption p_1 ≤_st p_2 implies V_1 ≤_st V_2. Since V_1 ≤_st V_2 implies E[π(V_1)] ≥ E[π(V_2)] for every decreasing function π, and π(u) = -\log f_t(F_t^{-1}(u)) is decreasing when f_t(F_t^{-1}(u)) is increasing in u, we obtain
-\int_0^1 g_{V_1}(u)\,\log f_t(F_t^{-1}(u))\,du \ge -\int_0^1 g_{V_2}(u)\,\log f_t(F_t^{-1}(u))\,du.
Therefore, Relation (25) gives
H̄(T_{1,t}) - H(V_1) \ge H̄(T_{2,t}) - H(V_2),
or equivalently,
H̄(T_{1,t}) - H̄(T_{2,t}) \ge H(V_1) - H(V_2) \ge 0,
where the last inequality is obtained from the assumption, and hence the result follows. Part (ii) is proven analogously. □
The following example provides a situation in which Theorem 6 applies.
Example 3. 
We take into account the two coherent systems with four components shown in Figure 3, with past life-times T_{1,t} = [t - T_1 | X_{4:4} ≤ t] (left panel) and T_{2,t} = [t - T_2 | X_{4:4} ≤ t] (right panel). It is easily identified that p_1 = (1/2, 1/2, 0, 0) and p_2 = (1/4, 1/4, 1/2, 0), respectively. Further, we can plainly see that H(V_1) = -0.2970 and H(V_2) = -0.0575; hence, H(V_1) ≤ H(V_2). Moreover, we have p_1 ≤_st p_2. Suppose that the component life-times are i.i.d. with the standard exponential distribution, with cdf F(t) = 1 - e^{-t}, t > 0. It is easily seen that
f_t(F_t^{-1}(u)) = \frac{1 - u\,(1 - e^{-t})}{1 - e^{-t}}, \quad t > 0,
for all 0 < u < 1. Obviously, f_t(F_t^{-1}(u)) is a decreasing function of u for all t > 0. Hence, due to part (ii) of Theorem 6, it holds that H̄(T_{1,t}) ≤ H̄(T_{2,t}) for all t > 0.
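The ordering in this example can also be checked numerically. The following Python sketch (our own code, with hypothetical helper names) evaluates both past entropies via Theorem 1 for several inspection times:

```python
import numpy as np
from scipy import integrate
from scipy.stats import beta

# Past entropies of the two four-component systems of Example 3,
# with standard exponential components.
n = 4

def past_entropy(p, t):
    Ft = 1.0 - np.exp(-t)
    f_quant = lambda u: (1.0 - u * Ft) / Ft   # f_t(F_t^{-1}(u)), exponential
    betas = [beta(i, n - i + 1) for i in range(1, n + 1)]
    gV = lambda u: sum(w * b.pdf(u) for w, b in zip(p, betas))
    HV = -integrate.quad(lambda u: gV(u) * np.log(gV(u)), 0.0, 1.0)[0]
    corr = sum(w * integrate.quad(
                   lambda u, b=b: b.pdf(u) * np.log(f_quant(u)), 0.0, 1.0)[0]
               for w, b in zip(p, betas) if w > 0)
    return HV - corr

for t in (0.5, 1.0, 2.0):
    h1 = past_entropy([0.5, 0.5, 0.0, 0.0], t)
    h2 = past_entropy([0.25, 0.25, 0.5, 0.0], t)
    print(t, round(h1, 4), round(h2, 4), h1 <= h2)  # Theorem 6(ii): h1 <= h2
```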
In the next corollary, we use the concept of duality to reduce the calculation of the past entropy of coherent systems. We recall that ≤_st stands for the usual stochastic order (see Shaked and Shanthikumar [25]).
Corollary 2. 
Let T_t = [t - T | X_{n:n} ≤ t] represent the past life-time of a coherent system with signature vector p, and let T_t^D = [t - T^D | X_{n:n} ≤ t] be its dual with signature p^D, consisting of i.i.d. component life-times with common cdf F. Let also p ≤_st p^D. Then,
(i) 
if f_t(F_t^{-1}(u)) is increasing in u for all t > 0, then H̄(T_t) ≥ H̄(T_t^D);
(ii) 
if f_t(F_t^{-1}(u)) is decreasing in u for all t > 0, then H̄(T_t) ≤ H̄(T_t^D).

4. Jensen–Shannon Divergence of the System

This section presents an analytical expression for the Jensen–Shannon (JS) divergence of the past life-time of a coherent system. Specifically, we demonstrate that the JS divergence of the proposed past life-time provides an information criterion for comparing systems based solely on their designs, independent of the parent distribution function F of the system. Drawing on earlier findings by Asadi et al. [18], the JS divergence of the mixture given by Equation (5) can be defined as follows:
JS(T_t : T_t^1, \ldots, T_t^n; \mathbf{p}) = JS(\mathbf{p}) = H̄(T_t) - \sum_{i=1}^{n} p_i H̄(T_t^i) = H(V) - \sum_{i=1}^{n} p_i H(U_{i:n}).
By recalling Equation (11), we easily obtain
\sum_{i=1}^{n} p_i H̄(T_t^i) = \sum_{i=1}^{n} p_i H(U_{i:n}) - \sum_{i=1}^{n} p_i E[\log f_t(F_t^{-1}(U_{i:n}))].
Upon recalling the above relation and (9), the third equality in (27) is easily obtained. It is worth pointing out that (27) depends neither on the time t nor on the common cdf F; it depends solely on the design of the system through its signature, which coincides with the results given by [18]. Therefore, all of the results given in that paper also hold here.
It is evident from (27) that the past life-time entropy concerning the coherent system can be written in terms of JS divergence as follows:
H̄(T_t) = JS(\mathbf{p}) + \sum_{i=1}^{n} p_i H̄(T_t^i),
for all t > 0. This representation is useful and interesting, since it relates the entropy of T_t to the JS divergence as well as to a weighted sum of the past entropies of order statistics. From the results of Toomaj et al. [10], we have the following useful result, whose proof is omitted.
Theorem 7. 
For a given coherent system with signature p and its dual system with signature p^D, we have
JS(\mathbf{p}) = JS(\mathbf{p}^D).
Bounds play a crucial role in many areas of research; therefore, much attention has been paid to their derivation in the literature. Asadi et al. [18] proposed an approach to computing upper bounds for the Jensen–Shannon (JS) divergence. Specifically, let N be a random variable with probability mass function p = (p_1, …, p_n), where p_i = P(N = i), i = 1, 2, …, n, and N represents the number of component failures that are fatal to the system. The Shannon entropy of the signature vector, H(N) = H(\mathbf{p}) = -\sum_{i=1}^{n} p_i \log p_i, measures the uncertainty associated with the failure of the system due to the failure of its components. Asadi et al. [18] derived a primitive upper bound for the JS divergence in terms of the Shannon entropy of the signature vector, given by:
0 \le JS(\mathbf{p}) \le H(\mathbf{p}).
The above representation allows us to obtain bounds for H̄(T_t) in terms of the entropy of the signature vector as follows:
H̄_L(T_t) \le H̄(T_t) \le H(\mathbf{p}) + H̄_L(T_t),
where H̄_L(T_t) = \sum_{i=1}^{n} p_i H̄(T_t^i). In an attempt to obtain an improved upper bound, Asadi et al. [18] obtained the following representation, which is applicable in the general case of the JS divergence of mixture distributions. Namely, it holds that
0 \le JS(\mathbf{p}) \le \sum_{i=1}^{n}\sum_{j=1}^{n} p_i\, p_j\, K(U_{i:n} : U_{j:n}).
Therefore, a second upper bound for H̄(T_t) can be obtained by substituting the right-hand side of (30) in place of H(\mathbf{p}) in (29).
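All of the quantities in this section are signature-only and straightforward to compute. The following Python sketch (our own helper names; a minimal illustration under the formulas above, not code from the paper) evaluates JS(p) together with the two upper bounds for a given signature:

```python
import numpy as np
from scipy import integrate
from scipy.stats import beta
from scipy.special import gammaln, digamma

def js_and_bounds(p):
    """Return (JS(p), H(p), double-sum KL bound) for a signature p."""
    n = len(p)
    betas = [beta(i, n - i + 1) for i in range(1, n + 1)]
    gV = lambda u: sum(w * b.pdf(u) for w, b in zip(p, betas))
    HV = -integrate.quad(lambda u: gV(u) * np.log(gV(u)), 0.0, 1.0)[0]
    js = HV - sum(w * b.entropy() for w, b in zip(p, betas))
    H_p = -sum(w * np.log(w) for w in p if w > 0)   # signature entropy H(p)
    kl = lambda i, j: (gammaln(j) + gammaln(n - j + 1) - gammaln(i)
                       - gammaln(n - i + 1)
                       + (i - j) * (digamma(i) - digamma(n - i + 1)))
    pair = sum(p[i - 1] * p[j - 1] * kl(i, j)
               for i in range(1, n + 1) for j in range(1, n + 1))
    return js, H_p, pair

print(js_and_bounds([0.25, 0.25, 0.5, 0.0]))  # 0 <= JS(p) <= both bounds
```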

5. Concluding Remarks

In recent decades, researchers in the field of information theory have become increasingly interested in developing measures that can be used to evaluate the degree of uncertainty in random variables. The uncertainty associated with the life-time of an engineering system is related to other aspects of the system. For example, imagine a situation in which an inspection by an operator at time t makes it clear that a number of components that were functioning in a system have become inactive. An event has occurred in the past, but there is still uncertainty about the exact time at which the system, or the components within it, failed. The ability to assess the predictability of the life-time of a system can be a valuable criterion in this regard, and the differential Shannon entropy has proven to be an attractive measure for quantifying uncertainty in such situations. Assuming that all system components have failed at time t, we have established in this work an expression for the entropy of the past life-time of a coherent system. We have also investigated various properties of this measure, including bounds and partial orders between the past life-times of two coherent systems in terms of their entropies, using the concept of the system signature. To demonstrate the effectiveness of our approach, we have given several examples of its application. Our results highlight the potential of this measure for assessing the predictability of system life-times and its usefulness in engineering applications.

Author Contributions

Conceptualization, M.K.; methodology, M.K.; software, M.S.; validation, M.S.; formal analysis, M.K.; investigation, M.S.; resources, M.S.; writing—original draft preparation, M.K.; writing—review and editing, M.S.; visualization, M.S.; supervision, M.S.; project administration, M.S.; funding acquisition, M.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Researchers Supporting Project number (RSP2023R464), King Saud University, Riyadh, Saudi Arabia.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors acknowledge the financial support from King Saud University. This work was supported by Researchers Supporting Project number (RSP2023R464), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ebrahimi, N.; Pellerey, F. New partial ordering of survival functions based on the notion of uncertainty. J. Appl. Probab. 1995, 32, 202–211. [Google Scholar] [CrossRef]
  2. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  3. Ebrahimi, N. How to measure uncertainty in the residual life time distribution. Sankhyā Indian J. Stat. Ser. A 1996, 58, 48–56. [Google Scholar]
  4. Di Crescenzo, A.; Longobardi, M. Entropy-based measure of uncertainty in past lifetime distributions. J. Appl. Probab. 2002, 39, 434–440. [Google Scholar] [CrossRef]
  5. Nair, N.U.; Sunoj, S. Some aspects of reversed hazard rate and past entropy. Commun. Stat. Theory Methods 2021, 32, 2106–2116. [Google Scholar] [CrossRef]
  6. Shangari, D.; Chen, J. Partial monotonicity of entropy measures. Stat. Probab. Lett. 2012, 82, 1935–1940. [Google Scholar] [CrossRef]
  7. Loperfido, N. Kurtosis-based projection pursuit for outlier detection in financial time series. Eur. J. Financ. 2020, 26, 142–164. [Google Scholar] [CrossRef]
  8. Gupta, R.C.; Taneja, H.; Thapliyal, R. Stochastic comparisons of residual entropy of order statistics and some characterization results. J. Stat. Theory Appl. 2014, 13, 27–37. [Google Scholar] [CrossRef] [Green Version]
  9. Thapliyal, R.; Taneja, H. Order statistics based measure of past entropy. Math. J. Interdiscip. Sci. 2013, 1, 63–70. [Google Scholar] [CrossRef] [Green Version]
  10. Toomaj, A.; Chahkandi, M.; Balakrishnan, N. On the information properties of working used systems using dynamic signature. Appl. Stoch. Model. Bus. Ind. 2021, 37, 318–341. [Google Scholar] [CrossRef]
  11. Kayid, M.; Alshehri, M.A. Tsallis Entropy of a Used Reliability System at the System Level. Entropy 2023, 25, 550. [Google Scholar] [CrossRef] [PubMed]
  12. Mesfioui, M.; Kayid, M.; Shrahili, M. Renyi Entropy of the Residual Lifetime of a Reliability System at the System Level. Axioms 2023, 12, 320. [Google Scholar] [CrossRef]
  13. Samaniego, F.J. System Signatures and Their Applications in Engineering Reliability; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2007; Volume 110. [Google Scholar]
  14. Khaledi, B.E.; Shaked, M. Ordering conditional lifetimes of coherent systems. J. Stat. Plan. Inference 2007, 137, 1173–1184. [Google Scholar] [CrossRef]
  15. Kochar, S.; Mukerjee, H.; Samaniego, F.J. The “signature” of a coherent system and its application to comparisons among systems. Nav. Res. Logist. 1999, 46, 507–523. [Google Scholar] [CrossRef]
  16. Hwang, J.; Lin, G. On a generalized moment problem. II. Proc. Am. Math. Soc. 1984, 91, 577–580. [Google Scholar] [CrossRef]
  17. Toomaj, A.; Doostparast, M. On the Kullback Leibler information for mixed systems. Int. J. Syst. Sci. 2016, 47, 2458–2465. [Google Scholar] [CrossRef]
  18. Asadi, M.; Ebrahimi, N.; Soofi, E.S.; Zohrevand, Y. Jensen–Shannon information of the coherent system lifetime. Reliab. Eng. Syst. Saf. 2016, 156, 244–255. [Google Scholar] [CrossRef]
  19. Toomaj, A.; Doostparast, M. A note on signature-based expressions for the entropy of mixed r-out-of-n systems. Nav. Res. Logist. 2014, 61, 202–206. [Google Scholar]
  20. Murthy, D.; Jiang, R. Parametric study of sectional models involving two Weibull distributions. Reliab. Eng. Syst. Saf. 1997, 56, 151–159. [Google Scholar] [CrossRef]
  21. Jiang, R.; Zuo, M.; Li, H.X. Weibull and inverse Weibull mixture models allowing negative weights. Reliab. Eng. Syst. Saf. 1999, 66, 227–234. [Google Scholar] [CrossRef]
  22. Castet, J.F.; Saleh, J.H. Single versus mixture Weibull distributions for nonparametric satellite reliability. Reliab. Eng. Syst. Saf. 2010, 95, 295–300. [Google Scholar] [CrossRef]
  23. Qiu, G.; Wang, L.; Wang, X. On extropy properties of mixed systems. Probab. Eng. Inf. Sci. 2019, 33, 471–486. [Google Scholar] [CrossRef]
  24. Toomaj, A.; Di Crescenzo, A.; Doostparast, M. Some results on information properties of coherent systems. Appl. Stoch. Model. Bus. Ind. 2018, 34, 128–143. [Google Scholar] [CrossRef]
  25. Shaked, M.; Shanthikumar, J.G. Stochastic Orders; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2007. [Google Scholar]
Figure 1. The exact value of H̄(T_t) for various values of k, as demonstrated in Part (b) of Example 1.
Figure 2. The exact value of H̄(T_t) given in Equation (21) (solid line), along with the lower bound given in Equation (22) (dashed line) and the upper bound given in Equation (23) (dotted line), for different values of k, as demonstrated in Example 2.
Figure 3. Two coherent systems whose signatures are ordered in the likelihood ratio order.

